I am running SQL Server 2008 R2. I was recently handed a hard drive by one of our employees containing about 225 GB of data that he needed imported into the SQL Server: roughly 100 txt files, ranging in size from 100 MB to 7 GB. I created the database and tables with a sample script that was included with the data, then started my bulk insert, and that is where I ran into problems.

The data is organized into 6 different tables. The first table is built from 38.4 GB of txt files. Once in SQL Server, it has expanded to almost 500 GB. It is still importing after 18 hours, and the 500 GB partition I dedicated to this is almost full.

The script I am using to import this is:

    BULK INSERT dbo.BI_YYYY
    FROM 'Z:\HLDI\BI_2002.txt'
    WITH (
        FIELDTERMINATOR = ',',
        ROWTERMINATOR = '\n'
    )
    GO

    --Check the content of the table.
    SELECT *
    FROM BI_YYYY
    GO
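I have since seen BATCHSIZE and TABLOCK suggested for large loads, so that rows are committed in smaller batches instead of one huge transaction. I have not tried this yet, so this is only a sketch of what I had in mind (the batch size figure is a guess on my part):

    BULK INSERT dbo.BI_YYYY
    FROM 'Z:\HLDI\BI_2002.txt'
    WITH (
        FIELDTERMINATOR = ',',
        ROWTERMINATOR = '\n',
        BATCHSIZE = 500000,   -- commit every 500,000 rows; each batch runs as its own transaction
        TABLOCK               -- table-level lock; one of the requirements for a minimally logged bulk load
    )
    GO

Would that make any difference to the space problem, or does it only affect the log?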
A sample of the data I am importing:

    1996,2004,1GCCS144 ,8,70582,1949,U,U,S,110,,,
    1996,2004,4T1BG12K ,8,29720,1947,U,U,S,243,,,
    1996,2004,1N4BU31D ,8,22603,1976,F,M,S,120,1,1,200
    1996,2004,1FALP52U ,8,61704,1973,M,M,S,154,,,
    1996,2004,JNKCA21D ,8,81657,1967,M,S,N,100,,,
    1996,2004,2C1MR529 ,8,65270,1987,F,U,N,24,,,
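I also don't know yet whether the extra space is going into the data file or into the transaction log. Once the current load finishes (or I kill it), I was planning to run something like the following to check; sp_spaceused and sys.database_files are standard system objects, and the table name is the one from my script above:

    -- How much space the table itself is using (reserved vs. data vs. indexes)
    EXEC sp_spaceused 'dbo.BI_YYYY';
    GO

    -- Size of every data and log file in the current database, in MB
    SELECT name,
           type_desc,
           size / 128.0 AS size_mb   -- size is stored as a count of 8 KB pages
    FROM sys.database_files;
    GO

Is that the right way to tell whether it is the table or the log that is eating the space?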
A 240 MB txt file contains 4.6 million rows of this data. I have nothing that can open the 7 GB file except SQL Server, so I have no clue how many rows are in it.

Any suggestions on what I can do to get this under control, or am I going to need more space?

Thank you.