
 Bulk Insert expansion rate


bcdudley
Starting Member

2 Posts

Posted - 2011-03-04 : 12:22:39
I am running MS SQL Server 2008 R2. I was recently handed a hard drive by one of our employees containing about 225 GB of data that he needed imported into SQL Server. It held about 100 txt files, varying in size from 100 MB to 7 GB.

I created the database and tables with a sample script that was included with the data. I then started to do my bulk insert and that is where I ran into some problems.

The data is organized into 6 different tables. The first table is built from 38.4 GB of txt files; once loaded into SQL Server, it has expanded to almost 500 GB. It is still importing after 18 hours, and the 500 GB partition I had dedicated to this is almost full.

The script I am using to import this is:


BULK INSERT dbo.BI_YYYY
FROM 'Z:\HLDI\BI_2002.txt'
WITH
(
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n'
);
GO

-- Check the contents of the table.
SELECT *
FROM dbo.BI_YYYY;
GO


A Sample of the data I am importing:

1996,2004,1GCCS144         ,8,70582,1949,U,U,S,110,,,
1996,2004,4T1BG12K ,8,29720,1947,U,U,S,243,,,
1996,2004,1N4BU31D ,8,22603,1976,F,M,S,120,1,1,200
1996,2004,1FALP52U ,8,61704,1973,M,M,S,154,,,
1996,2004,JNKCA21D ,8,81657,1967,M,S,N,100,,,
1996,2004,2C1MR529 ,8,65270,1987,F,U,N,24,,,


A 240 MB txt file contains 4.6 million rows of this data. I have nothing that can open the 7 GB file except SQL Server, so I have no clue how many rows are in it.

Any suggestions on what I can do to get this under control, or am I going to need more space?

Thank you.

dinakar
Master Smack Fu Yak Hacker

2507 Posts

Posted - 2011-03-04 : 12:26:28
I think even Notepad would crash if you tried to open a 7 GB txt file. Try opening it with WordPad and break the file into smaller chunks. Use BCP and specify a batch size so you don't have one huge transaction that explodes your log. You might also want to set your recovery model to bulk-logged for this import.
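A minimal sketch of the batched, minimally logged approach described above, reusing the table and file names from the original post (the database name HLDI and the batch size are illustrative assumptions, not from the thread):

```sql
-- Assumes the target database is named HLDI; substitute your own name.
-- Bulk-logged recovery keeps the transaction log from exploding during
-- the import (switch back to your normal model when finished).
ALTER DATABASE HLDI SET RECOVERY BULK_LOGGED;
GO

-- BATCHSIZE commits every N rows, so no single transaction holds
-- millions of rows; TABLOCK enables minimally logged bulk inserts.
BULK INSERT dbo.BI_YYYY
FROM 'Z:\HLDI\BI_2002.txt'
WITH
(
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n',
    BATCHSIZE = 100000,   -- illustrative value; tune to your log size
    TABLOCK
);
GO

ALTER DATABASE HLDI SET RECOVERY FULL;
GO
```

The command-line equivalent with bcp would be something like `bcp HLDI.dbo.BI_YYYY in Z:\HLDI\BI_2002.txt -S yourserver -T -c -t, -b 100000`, where `-b` sets the batch size.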

Dinakar Nethi
************************
Life is short. Enjoy it.
************************
http://weblogs.sqlteam.com/dinakar/

bcdudley
Starting Member

2 Posts

Posted - 2011-03-04 : 12:30:48
Thank you for the quick response. Can you please tell me what BCP is? Sorry, I am not a SQL person, so I don't really know all the terms.

I have a program called TextPad that would open the 7 GB file, but my computer does not have enough RAM. I only have 4 GB.

Thank you.

Edit: never mind, I found it. Thanks.