Author: AskSQLTeam (Ask SQLTeam Question, 0 Posts)
Posted - 2001-12-17 : 09:07:06
Emin writes: "Hi everyone, I have a long-running batch job that must run once every day, preferably at the beginning of the day. I want to lock my transaction table and the other parameter tables I use during this job, so no one can insert or update any row until the batch ends. How can I lock all the objects used in the batch? And is there any way of handling errors other than rolling back the transaction? I have to run this job 10,000 - 100,000 times for different accounts at once, so if one of them fails, all of them are rolled back, and committing per account takes much longer."
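One common way to hold an exclusive lock on a set of tables for the duration of a batch is the `TABLOCKX` and `HOLDLOCK` table hints inside an explicit transaction. A minimal sketch, assuming hypothetical table names (`dbo.Transactions`, `dbo.BatchParams` are placeholders, not from the post):

```sql
BEGIN TRANSACTION;

-- Take an exclusive table lock, held until COMMIT or ROLLBACK;
-- this blocks all other readers and writers on these tables.
SELECT TOP 0 1 FROM dbo.Transactions WITH (TABLOCKX, HOLDLOCK);
SELECT TOP 0 1  FROM dbo.BatchParams  WITH (TABLOCKX, HOLDLOCK);

-- ... run the batch work against the locked tables here ...

COMMIT TRANSACTION;
```

Note that holding exclusive table locks for a long-running job blocks everyone else entirely, which is exactly the behavior being asked for but should be scheduled for off-peak hours.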
robvolk (Most Valuable Yak, 15732 Posts)
Posted - 2001-12-17 : 10:07:26
If you can provide the code of your job, we might be able to provide a better solution. I'm concerned about the "10,000 - 100,000 times per account" part; if you are using a cursor to do this, we need to find a better solution for you.

You should consider setting the database to read-only before running the batch; it will allow only SELECT operations on all tables and will eliminate locking. While this is restrictive, it is usually better to prevent any modifications during an extensive database job that must succeed.

Error handling doesn't necessarily mean rollback, but if you must treat the entire batch as a single transaction, then a rollback at any point will cancel the entire thing. This is by design. You could create a progress-tracking system that commits the work in sub-batches. That allows you to complete 90% of the batch and only have to re-run the 10% that failed.
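The progress-tracking idea above can be sketched as a per-account commit loop: each account's work is its own transaction, and a progress table records what has finished so a failed run can be restarted without redoing completed accounts. All table and column names here (`dbo.Accounts`, `dbo.BatchProgress`, `AccountId`) are hypothetical placeholders:

```sql
DECLARE @AccountId int;

-- Pick the first account that has not yet been processed.
SELECT @AccountId = MIN(AccountId)
FROM dbo.Accounts
WHERE AccountId NOT IN (SELECT AccountId FROM dbo.BatchProgress);

WHILE @AccountId IS NOT NULL
BEGIN
    BEGIN TRANSACTION;

    -- ... per-account batch work goes here ...

    -- Record completion inside the same transaction.
    INSERT INTO dbo.BatchProgress (AccountId, CompletedAt)
    VALUES (@AccountId, GETDATE());

    IF @@ERROR <> 0
        ROLLBACK TRANSACTION;  -- only this account's work is lost
    ELSE
        COMMIT TRANSACTION;

    -- Move on to the next unprocessed account.
    SELECT @AccountId = MIN(AccountId)
    FROM dbo.Accounts
    WHERE AccountId > @AccountId
      AND AccountId NOT IN (SELECT AccountId FROM dbo.BatchProgress);
END;
```

This avoids a single giant transaction: a failure rolls back one account rather than the whole batch, and re-running the script naturally resumes from where it left off.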