ceciliarenebaker
Starting Member
3 Posts |
Posted - 2005-08-25 : 14:31:59
I have a question that may have an obvious answer for some of you, but I've been grappling with the issue for a couple of hours, so I thought I'd give this forum a try. I am using SQL Server 2000, and I am sending out changes to office information from the central file to our offices out in the field. The stored procedure selects the transactions that are waiting to be sent, pulls related information (office capabilities, for example) into temporary tables, and then updates a transaction status record to "sent".

Selecting a number of transactions into a temp table, updating a transaction status table to "sent", and then selecting related information into temporary tables is a close approximation of a snapshot of the information that existed at the time the transactions were selected. But is there any way to "lock" all of these tables so nothing changes while I'm gathering all my information?

I'm also worried about someone accidentally executing this "feed" procedure while another instance of it is already running. Is there any way to force only one instance of a procedure to execute at a time?

I figure some of you have already solved this sort of problem, and I'm wondering how you dealt with these issues. Any high-level help you can give me would be greatly appreciated; I can handle the implementation details myself. Thanks again.

Cecilia Baker
X002548
Not Just a Number
15586 Posts |
Posted - 2005-08-25 : 14:36:43
I would use dry ice..... Or:

ALTER DATABASE [databasename] SET SINGLE_USER WITH ROLLBACK IMMEDIATE

This means no one but your process will have access to the database, and it will kick everyone else out. Sure you want to do that? Also, you need to set it back afterwards.... do you know what Books Online is?

If that's not what you want to do, then how about some more specific details? Read the hint link below.

Brett
8-)

Hint: Want your questions answered fast? Follow the directions in this link:
http://weblogs.sqlteam.com/brettk/archive/2005/05/25/5276.aspx
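For completeness, a sketch of the full round trip Brett is hinting at (the database name FieldFeed is hypothetical); note that SINGLE_USER mode locks out the whole database, not just the feed tables:

```sql
-- Kick everyone else out, rolling back their open transactions immediately.
ALTER DATABASE FieldFeed SET SINGLE_USER WITH ROLLBACK IMMEDIATE
GO

-- ... run the feed procedure here ...

-- Let everyone back in when the feed is done.
ALTER DATABASE FieldFeed SET MULTI_USER
GO
```

If the batch fails between the two statements, the database stays in single-user mode until someone sets it back, which is part of why this is a heavy hammer for this job.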
TG
Master Smack Fu Yak Hacker
6065 Posts |
Posted - 2005-08-25 : 14:50:04
Food for thought while you're browsing Brett's link. Here's a concept for you, off the top of my head. (In Rod Serling's voice) "Picture if you will..." a new meta table with one row and one bit column called [InUse].

The first thing your SP does is:

update myMetaTable set [InUse] = 1 where [InUse] = 0
if @@rowcount = 0
begin
    raiserror ('someone else is executing the procedure now', 17, 1)
    return -1
end

--After all processing
update myMetaTable set [InUse] = 0

That could handle preventing concurrent instances of your procedure. For locking records while you're working... hmmm... Do you want to actually "block" people from accessing these tables, or do you just want to prevent changes to the rows you have selected? Those would be two different solutions.

Be One with the Optimizer
TG
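Wrapped into a complete procedure, TG's flag pattern might look like this (the table and procedure names are hypothetical, and it assumes the flag table was created and seeded once with a single InUse = 0 row):

```sql
-- One-row flag table; create and seed it once, not per run.
CREATE TABLE myMetaTable (InUse bit NOT NULL)
INSERT INTO myMetaTable (InUse) VALUES (0)
GO

CREATE PROCEDURE FeedOffices AS
BEGIN
    -- Claim the flag atomically: the UPDATE matches a row only if no one holds it.
    UPDATE myMetaTable SET InUse = 1 WHERE InUse = 0
    IF @@ROWCOUNT = 0
    BEGIN
        RAISERROR('someone else is executing the procedure now', 17, 1)
        RETURN -1
    END

    -- ... select waiting transactions, build temp tables, mark them "sent" ...

    -- Release the flag so the next run can claim it.
    UPDATE myMetaTable SET InUse = 0
END
GO
```

One caveat with this pattern: if the procedure fails between the claim and the release, the flag stays set and has to be cleared by hand (or by an error-handling path that always resets it).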
rockmoose
SQL Natt Alfen
3279 Posts |
Posted - 2005-08-25 : 16:08:02
Another angle, if you want "singleton" procs. Nothing to do with locking the data, though.

create procedure oneInstance
as
if exists (select * from master.dbo.sysprocesses where context_info = 0x0666)
begin
    raiserror('I''m already running, sorry!', 16, 1)
    return 50000
end
set context_info 0x0666
waitfor delay '00:00:05' -- wait 5 seconds
select 'I''m done'
set context_info 0x00

rockmoose
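A related alternative available in SQL Server 2000, mentioned here for comparison with the context_info trick, is sp_getapplock: it takes a named application lock that the server releases automatically when the transaction ends, even if the procedure dies mid-run, so there is no flag to forget to clear:

```sql
CREATE PROCEDURE oneInstance AS
BEGIN
    DECLARE @rc int
    BEGIN TRAN
    -- Ask for an exclusive lock on an arbitrary name; @LockTimeout = 0 means
    -- fail immediately instead of waiting if another spid already holds it.
    EXEC @rc = sp_getapplock @Resource = 'oneInstance',
                             @LockMode = 'Exclusive',
                             @LockTimeout = 0
    IF @rc < 0
    BEGIN
        ROLLBACK TRAN
        RAISERROR('I''m already running, sorry!', 16, 1)
        RETURN 50000
    END

    -- ... do the work ...

    COMMIT TRAN  -- the transaction-owned lock is released here
END
GO
```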
TG
Master Smack Fu Yak Hacker
6065 Posts |
Posted - 2005-08-25 : 16:12:52
Ahhh, another cool thing we can use context_info for. Good one, Rocky!

EDIT: just noticed the 0x0666... nice touch.

Be One with the Optimizer
TG
rockmoose
SQL Natt Alfen
3279 Posts |
Posted - 2005-08-25 : 17:01:19
They passed around some sudoku problems at work today. That was fun; they were impressed by the accuracy and speed with which I delivered answers. (Using SQL is not technically cheating, is it?)

rockmoose
Kristen
Test
22859 Posts |
Posted - 2005-08-25 : 22:16:34
Hi ceciliarenebaker, welcome to SQL Team!

We tend to do this type of job by having a "batch number" column in the table. At the start of the operation we stamp all the rows we are going to process with the next batch number; this in turn prevents other processes from operating on the same rows. Then we process all of the rows that are in "our batch".

We write other processes to be aware of the batch number system, or use a status of "being processed" that prevents other operations from messing with the records (i.e. assuming the table has a Status field, use a status code to indicate the rows are only available for reading, or not at all).

If the batch goes wrong (email didn't get through, printer jammed, or somesuch) we can then "reprocess" that batch (which usually means just the re-print part, rather than the bit that pays out the money again!)

Kristen
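A minimal sketch of the batch-number idea, assuming a hypothetical OfficeTran table with a BatchNo column and a Status column ('W' = waiting, 'P' = being processed, 'S' = sent):

```sql
DECLARE @batch int

BEGIN TRAN
-- Take the next batch number under an update lock so two concurrent feeds
-- cannot pick the same number.
SELECT @batch = ISNULL(MAX(BatchNo), 0) + 1
FROM   OfficeTran WITH (UPDLOCK)

-- Stamp every waiting row in one statement; the rows claimed here belong
-- to this batch and no other.
UPDATE OfficeTran
SET    BatchNo = @batch, Status = 'P'
WHERE  Status = 'W'
COMMIT TRAN

-- Work only on "our" rows; related lookups can be joined off this set.
SELECT * FROM OfficeTran WHERE BatchNo = @batch

-- Mark the batch sent; a failed batch can be re-processed by its number.
UPDATE OfficeTran SET Status = 'S' WHERE BatchNo = @batch
```

Because the stamping happens in one transaction, a second feed that starts mid-run either sees no 'W' rows left or claims a fresh, disjoint batch.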
ceciliarenebaker
Starting Member
3 Posts |
Posted - 2005-09-02 : 13:22:28
Thanks Kristen! I ended up implementing something very similar to this; the idea of setting a batch number for "dibs" on the records is great. Thanks again.

Cecilia Baker