sqllearner
Aged Yak Warrior
639 Posts

Posted - 2006-05-25 : 14:12:25
I finished a project where all the information from a flat file is processed into a staging database, and then stored procedures insert/update the information from staging into the production tables. Since the file was small in size, there were two options here:

1. Mass update and insert in a single transaction
2. Row-by-row processing using a cursor, committing the transaction at each row

We are using option 1 and it works fine, with no issues and very little processing time, though the error handling is hard.

Now there is a change to the file, and the number of records coming in will be huge: around half a million records each day. Now I am really confused. If I use the first option it may cause locking and blocking, since it is a mass update in one single transaction. If I choose option 2, the time taken to process that many records will be huge.

Is there any other way I can process these efficiently? Can I do it in batches? If so, how can I do it?... Please help me with this.
Kristen
Test
22859 Posts

Posted - 2006-05-25 : 15:01:31
You can do (1), but in batches of, say, 10,000 rows. Just loop round, process 10,000 rows at a time, and keep looping until you are done.

Kristen
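A minimal sketch of the loop described above, batching the staging-to-production load by key range. The schema here (dbo.Staging, dbo.Production, an integer key StagingID, and the two data columns) is a hypothetical stand-in, not taken from the post:

    DECLARE @BatchSize int, @LastID int, @MaxID int

    SELECT @BatchSize = 10000
    SELECT @LastID = 0, @MaxID = ISNULL(MAX(StagingID), 0) FROM dbo.Staging

    WHILE @LastID < @MaxID
    BEGIN
        BEGIN TRANSACTION

        -- Copy one slice of staging rows into production
        INSERT dbo.Production (CustomerID, Amount)
        SELECT CustomerID, Amount
        FROM   dbo.Staging
        WHERE  StagingID >  @LastID
          AND  StagingID <= @LastID + @BatchSize

        COMMIT TRANSACTION   -- commit per slice so locks are held only briefly

        SELECT @LastID = @LastID + @BatchSize
    END

Committing once per slice is the point of the exercise: each transaction stays small, so locks are released quickly and other sessions can interleave between batches.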
tkizer
Almighty SQL Goddess
38200 Posts

Posted - 2006-05-25 : 15:02:52
Use option 1, but do the updates and inserts in batches using SET ROWCOUNT nnn. Here is a controlled delete blog of mine:

http://weblogs.sqlteam.com/tarad/archive/2003/10/15/305.aspx

My example processes 10,000 rows at a time until there are no more rows to process.

Tara Kizer
aka tduggan
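Adapted from a delete to an update, the SET ROWCOUNT pattern in that post looks roughly like this. The table names, join column, and the Amount comparison are hypothetical; note that the WHERE clause must exclude already-updated rows, or the loop never finishes:

    SET ROWCOUNT 10000   -- cap each UPDATE at 10,000 rows

    WHILE 1 = 1
    BEGIN
        UPDATE P
        SET    P.Amount = S.Amount
        FROM   dbo.Production P
        INNER JOIN dbo.Staging S
                ON S.CustomerID = P.CustomerID
        WHERE  P.Amount <> S.Amount   -- only rows that still differ

        IF @@ROWCOUNT = 0 BREAK       -- no rows left to change
    END

    SET ROWCOUNT 0   -- reset, or every later statement stays capped

SET ROWCOUNT caps the rows each statement affects on SQL Server 2000/2005; Microsoft has since deprecated it for INSERT/UPDATE/DELETE in favor of TOP. Resetting it to 0 afterwards matters because the cap otherwise applies to every subsequent statement on the connection.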