Mondeo
Constraint Violating Yak Guru
287 Posts |
Posted - 2010-07-27 : 18:10:53
|
Hi, I've got a web application in ASP.NET which basically takes an uploaded CSV file with about 1 million rows, validates it, then inserts it into SQL. Each file takes about 10 minutes, and during that time the w3wp web server process really takes a performance hit; it virtually maxes out the CPU. Performance on the SQL box seems fine.

I wondered what the best way to do this is. Should I have a separate application running to grab the incoming CSV files and do the inserting, thereby offloading this task from the web server? Or is there another way? I'm using SQL 2008 Standard.

Thanks |
|
tkizer
Almighty SQL Goddess
38200 Posts |
Posted - 2010-07-27 : 18:21:59
|
It sounds like your application is doing singleton inserts. You should instead be using the BULK INSERT command or the Bulk Insert API available in the application code. At a minimum you should have a separate thread that does the bulk insert. You could also offload this entire thing to an SSIS package that runs via a SQL job or that you launch programmatically. You should be able to load millions of rows of data into SQL Server in a few seconds if you were to use bulk insert.

Tara Kizer
Microsoft MVP for Windows Server System - SQL Server
http://weblogs.sqlteam.com/tarad/
Subscribe to my blog |
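A minimal sketch of the BULK INSERT approach Tara describes (the file path, staging table, and column layout here are hypothetical, and this assumes a simple CSV with no quoted fields):

```sql
-- Hypothetical staging table matching the CSV layout
CREATE TABLE dbo.CsvStaging
(
    Col1 varchar(100),
    Col2 varchar(100),
    Col3 varchar(100)
);

-- Load the whole file in one set-based operation
BULK INSERT dbo.CsvStaging
FROM 'C:\incoming\upload.csv'
WITH
(
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '\n',
    FIRSTROW        = 2,        -- skip a header row, if the file has one
    BATCHSIZE       = 100000    -- commit in batches to keep the log manageable
);
```

The validation step can then be done set-based against the staging table before moving rows into the real table, instead of row by row in the application.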
|
|
Kristen
Test
22859 Posts |
Posted - 2010-07-28 : 01:05:23
|
Or possibly XML? The schemas are a nightmare to set up (rubbish docs) for anything multi-level, but if yours is flat that wouldn't be a problem.

The object used to be called SQLXMLBulkLoad; not sure what it's called these days, but it makes it easy to "push" the data, rather than having to "pull" it. The Bulk Insert API Tara refers to may do that too (it's not something I've used). |
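If the file were delivered as flat XML, one way to keep the load server-side without the schema files is T-SQL's OPENROWSET(BULK ...) with SINGLE_BLOB; a rough sketch (file path, element, and column names are all hypothetical):

```sql
-- Hypothetical: read a flat XML file and shred it into a table
DECLARE @x xml;

SELECT @x = BulkColumn
FROM OPENROWSET(BULK 'C:\incoming\rows.xml', SINGLE_BLOB) AS src;

INSERT INTO dbo.ImportTarget (Col1, Col2)
SELECT r.value('@col1', 'varchar(100)'),
       r.value('@col2', 'int')
FROM @x.nodes('/rows/row') AS t(r);
```

Note this shreds the XML with XQuery rather than a mapping schema, so it is only practical for flat, modestly sized files; SQLXMLBulkLoad streams the file instead of materializing it in a variable.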
|
|
Mondeo
Constraint Violating Yak Guru
287 Posts |
Posted - 2010-07-28 : 06:25:53
|
The 10 minute task is now 26 seconds using this bulk insert method. Thanks a lot for this. |
|
|
SwePeso
Patron Saint of Lost Yaks
30421 Posts |
Posted - 2010-07-28 : 06:49:33
|
Also try to use the TABLOCK hint when doing the BULK INSERT, if you can afford to.

N 56°04'39.26"E 12°55'05.63" |
|
|
Kristen
Test
22859 Posts |
Posted - 2010-07-28 : 06:51:30
|
... and the ORDER hint if the input file is pre-sorted in Clustered Index order. |
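Combining the two hints, the load might look like this (table, file, and key column are hypothetical; ORDER only helps if the file really is sorted on the clustered index key):

```sql
-- TABLOCK allows minimal logging (under bulk-logged or simple recovery);
-- ORDER avoids an extra sort when the file already matches the clustered index
BULK INSERT dbo.CsvStaging
FROM 'C:\incoming\upload.csv'
WITH
(
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '\n',
    TABLOCK,
    ORDER (Col1 ASC)   -- Col1 assumed to be the clustered index key
);
```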
|
|
madhivanan
Premature Yak Congratulator
22864 Posts |
Posted - 2010-07-28 : 06:53:05
|
This reminds me of some time back, when my co-worker told me that it took only a minute when he applied http://www.sqlteam.com/forums/topic.asp?TOPIC_ID=49926 while the application took more than 45 minutes.

Madhivanan
Failing to plan is planning to fail |
|
|
tkizer
Almighty SQL Goddess
38200 Posts |
Posted - 2010-07-28 : 12:50:45
|
quote: Originally posted by Mondeo — The 10 minute task is now 26 seconds using this bulk insert method. Thanks a lot for this.
Woohoo! I'm working through a similar issue on one of my applications. We recently convinced the project that BULK INSERT or the bulk load API through Java is the way to go. We weren't even aware that singleton inserts were being done when importing millions of rows from cell phone carrier files until the developer wanted some help with performance.

Developer: "Our bulk insert is taking several minutes to insert 500,000 rows."
Me: "Let me run a trace to see what's going on."
A few minutes later:
Me: "Ummm, your application is doing singleton inserts. You should instead be using BULK INSERT."
Developer: "Oh, let me test that."
Hours later:
Developer: "BULK INSERT is super fast, thanks!"

Tara Kizer
Microsoft MVP for Windows Server System - SQL Server
http://weblogs.sqlteam.com/tarad/
Subscribe to my blog |
|
|