Please start any new threads on our new site at https://forums.sqlteam.com. We've got lots of great SQL Server experts to answer whatever question you can come up with.

 Fastest way to import data.

Author  Topic 

ia72
Starting Member

5 Posts

Posted - 2008-12-02 : 12:44:23
Hi all,

Does anybody know the fastest option to import data (over 10 million records) into SQL Server 2005?

Thank you

sodeep
Master Smack Fu Yak Hacker

7174 Posts

Posted - 2008-12-02 : 12:48:28
Bulk insert, Export/Import.

ia72
Starting Member

5 Posts

Posted - 2008-12-02 : 13:41:30
Thank you sodeep,

I am actually doing it using bulk insert.
I am importing data that contains Unicode,
and my BULK INSERT statement is:
BULK INSERT tblinsert
FROM 'c:\Data\filename.txt'
WITH (
    CODEPAGE = 'RAW',
    DATAFILETYPE = 'widechar',
    FIELDTERMINATOR = '|',
    ROWTERMINATOR = '\n'
);

I am not happy with the speed of this operation.
My question is: is there any way to speed up this process?


sodeep
Master Smack Fu Yak Hacker

7174 Posts

Posted - 2008-12-02 : 13:45:58
Remove the indexes, bulk insert, and reapply the indexes once done.
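As a rough sketch of that approach (index and object names here are illustrative, not from the thread), you can disable the nonclustered indexes, load, then rebuild. Do not disable the clustered index, as that makes the table inaccessible:

```sql
-- Disable a nonclustered index before the load (hypothetical index name).
ALTER INDEX IX_tblinsert_col1 ON dbo.tblinsert DISABLE;

-- Load with no nonclustered indexes to maintain.
BULK INSERT dbo.tblinsert
FROM 'c:\Data\filename.txt'
WITH (DATAFILETYPE = 'widechar', FIELDTERMINATOR = '|', ROWTERMINATOR = '\n');

-- Rebuild once the load is done.
ALTER INDEX IX_tblinsert_col1 ON dbo.tblinsert REBUILD;
```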

ia72
Starting Member

5 Posts

Posted - 2008-12-02 : 13:57:28
I did that in my tests,
but reapplying indexes on a table with over 20 million rows does not improve things at all in my case.
Is there any information on typical running times for bulk insert?

darkdusky
Aged Yak Warrior

591 Posts

Posted - 2008-12-03 : 06:42:28
Partitions can be used as a means to import very large amounts of data almost instantaneously, by loading into a staging table and switching it into the partitioned table.

http://blog.sqlauthority.com/2008/01/25/sql-server-2005-database-table-partitioning-tutorial-how-to-horizontal-partition-database-table/
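A minimal sketch of the switch step, assuming a staging table with the same schema, constraints, and filegroup as the target partition (table names and partition number are hypothetical):

```sql
-- Load into an empty staging table first (no concurrent readers affected).
BULK INSERT dbo.tblStaging
FROM 'c:\Data\filename.txt'
WITH (DATAFILETYPE = 'widechar', FIELDTERMINATOR = '|', ROWTERMINATOR = '\n');

-- Switching is a metadata-only operation, so it is near-instant
-- regardless of row count.
ALTER TABLE dbo.tblStaging
SWITCH TO dbo.tblInsert PARTITION 2;
```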


SwePeso
Patron Saint of Lost Yaks

30421 Posts

Posted - 2008-12-03 : 06:44:50
Maybe the bottleneck is not BULK INSERT?
Maybe it is network speed, if the C: drive is not a local drive?


E 12°55'05.63"
N 56°04'39.26"

SwePeso
Patron Saint of Lost Yaks

30421 Posts

Posted - 2008-12-03 : 06:46:01
Also try adding the BATCHSIZE option. Set it to 10,000 records or so.
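Applied to the statement from earlier in the thread, that would look something like this (TABLOCK is an extra assumption on my part; it helps qualify for minimal logging when the recovery model allows it):

```sql
-- Commit in batches of ~10,000 rows instead of one giant transaction;
-- a failure then only rolls back the current batch.
BULK INSERT tblinsert
FROM 'c:\Data\filename.txt'
WITH (
    DATAFILETYPE = 'widechar',
    FIELDTERMINATOR = '|',
    ROWTERMINATOR = '\n',
    BATCHSIZE = 10000,
    TABLOCK
);
```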



SwePeso
Patron Saint of Lost Yaks

30421 Posts

Posted - 2008-12-03 : 06:56:04
I did a test on my laptop PC. It took about 12 seconds to import 10 million integers.



ia72
Starting Member

5 Posts

Posted - 2008-12-04 : 21:39:11
Thank you all for your help.

My table is partitioned, and it helps. (Thank you, darkdusky.)
I tried various batch sizes, but that did not provide much of an improvement in my case.

To Peso: when I ran my tests on a development server, my results were also good;
the problem started when I moved to the production environment. :-)

So my best guess at this point is that it is not a bulk insert problem.