ibbo14
Starting Member
10 Posts
Posted - 2011-06-14 : 22:29:31
G'day,

I have a quick question regarding best practices for rollbacks. I am in a predicament with another developer about how we should run this process. Our process consists of retrieving security swipe card data from various SQL Servers across the network. The problem is that sometimes the linked server connection fails due to a connection problem, and we wouldn't even know that the process has failed.

I suggested that we wrap our stored procedure in a transaction: roll back if an error is caught and write the error to a table, otherwise commit the transaction.

This is where I am clashing with the other developer, who believes that a SQL rollback on a large transaction is bad practice. He has advised that the best solution is to make a copy of the tables in question, run the process, and if an error is caught, copy the data back into the original tables. This sounds really inefficient to me and feels like reinventing the wheel.

I just need to confirm whether this is bad practice or not. Any help or ideas would be greatly appreciated.

Ibo
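Here is roughly what I had in mind, as a minimal sketch. dbo.ProcessErrorLog is a placeholder logging table I would create, and the extract itself is only indicated by a comment:

BEGIN TRY
    BEGIN TRANSACTION;

    -- pull the swipe card data from the linked servers here, e.g.
    -- INSERT INTO dbo.SwipeCards SELECT ... FROM LinkedServer.SwipeDB.dbo.Cards;

    COMMIT TRANSACTION;
END TRY
BEGIN CATCH
    -- undo whatever partially completed, then record what went wrong
    IF @@TRANCOUNT > 0
        ROLLBACK TRANSACTION;

    INSERT INTO dbo.ProcessErrorLog (ErrorNumber, ErrorMessage, ErrorDate)
    VALUES (ERROR_NUMBER(), ERROR_MESSAGE(), GETDATE());
END CATCH;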
tkizer
Almighty SQL Goddess
38200 Posts
ibbo14
Starting Member
10 Posts
Posted - 2011-06-14 : 23:21:54
Many thanks for the reply, Tara.

The catch is that I do not have any setup rights on their servers, so my solution was to set up a SQL Server Express instance on my work desktop, where I do have sa rights, and extract the data from each server in a nightly batch process that produces CSV reports. These are production servers on mine sites, so the connection is pretty flaky, and we started encountering connection problems. A bit of a long way around, but at least I don't have to deal with IT hurdles all the time.

I agree that large transactions are pretty bad too, but unfortunately that's the volume of the data coming in. Unless there is a better way to extract a large volume of data in one go?
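One thing I have been toying with is pulling the data in smaller batches rather than one giant transaction, along these lines. This is just a sketch; the table names, the CardID key column, and the batch size of 10000 are all placeholders for illustration:

DECLARE @LastID int = 0,
        @Rows   int = 1;

WHILE @Rows > 0
BEGIN
    -- each batch is its own small transaction, so a dropped connection
    -- only loses the batch in flight, not the whole extract
    INSERT INTO dbo.SwipeCards_Staging (CardID, CardNumber, EmployeeID)
    SELECT TOP (10000) CardID, CardNumber, EmployeeID
    FROM LinkedServer.SwipeDB.dbo.Cards
    WHERE CardID > @LastID
    ORDER BY CardID;

    SET @Rows = @@ROWCOUNT;

    IF @Rows > 0
        SELECT @LastID = MAX(CardID) FROM dbo.SwipeCards_Staging;
END;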
tkizer
Almighty SQL Goddess
38200 Posts
ibbo14
Starting Member
10 Posts
Posted - 2011-06-15 : 00:34:29
To cut a long story short, we are migrating our swipe system from one system to another for 13 mine sites and need to sustain site access during the downtime. It's a progressive rollout, which can take a couple of weeks for each site. Health and safety regulations demand that workers have their required courses, medicals, drug & alcohol tests and approvals prior to entering site.

When a site is about to roll out the new system, the existing swipe setup will be switched off, so we need to pull the card information from these systems, and the competencies of each worker from a single system which houses everyone's competencies. We then generate CSV files for a Windows Mobile application which pulls all the data from text files and has the ability to swipe offline. The files it brings in are added and removed records: basically a difference of the current run versus the previous day, producing "added" and "removed" files. I would personally have used a SQL CE or Access database for this purpose, but I'm stuck with this setup because someone else built it this way. A very 1980s approach, in my honest opinion.

So there is a daily stored procedure running from my local machine which extracts the card information from multiple systems and then extracts bio and competency data from a single system located somewhere else in the country. I have been given read-only access, which only really gives me SELECT capabilities and no more. I cannot create stored procedures, functions, or tables; nothing really but read. Also, it's a production server and I cannot run queries during the day. The real problem with that is that snapshot isolation is not enabled, and they will not enable it. This is why I need to copy the data locally; it also means I can query the data on my local machine during the day without affecting performance on the production server. It's more about convenience than speed in this situation. However, the data does need to be massaged prior to producing an output, as well as getting a daily difference from the previous run, which is why I have to do all this local processing.

But I am curious to know: how can I batch a SELECT statement without using a cursor?
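For what it's worth, the added/removed difference itself is just set logic once both snapshots are local. A rough sketch, where the snapshot table and column names are placeholders:

-- records added since the previous run
SELECT CardNumber, EmployeeID
FROM dbo.Snapshot_Today
EXCEPT
SELECT CardNumber, EmployeeID
FROM dbo.Snapshot_Yesterday;

-- records removed since the previous run
SELECT CardNumber, EmployeeID
FROM dbo.Snapshot_Yesterday
EXCEPT
SELECT CardNumber, EmployeeID
FROM dbo.Snapshot_Today;

As for the batching, a keyset loop like the one in my earlier post (WHILE with TOP (N) ... WHERE key > @LastID ORDER BY key) walks through the rows in chunks without ever opening a cursor.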