Author: AskSQLTeam (Ask SQLTeam Question)
Posted - 2003-07-15 : 08:25:16
Brendan writes: "I have a problem with bulk inserting data into SQL 2000. I have a fixed-length data file with possibly 10 million rows, and the data has over 300 columns. So as far as I know I can't use bcp or DTS to bring the data in; there is some limitation on the number of columns. I can bulk insert the data fine, but the columns that contain zero-length strings are imported as spaces. What I need is to bring that data in as nulls. Is there a workaround for this problem, or do I have to transform the data after I import it? I have built a query to do this, but it would take far too long, and I was hoping to avoid that if at all possible. Any help with this matter would be greatly appreciated. Thanks, Brendan Lane"
nr (SQLTeam MVY, 12543 Posts)
Posted - 2003-07-15 : 12:10:07
You have a fixed-width file, so you need some way of telling the import that spaces are really null.

The fastest way may well be to bulk insert into a staging table, bcp out from a query or view that trims the fields into a CSV file, then bulk insert again. The delimited empty fields will come in as null.

It is also worth trying an import into a staging table and then inserting from there into the real table, rather than updating in place; it will save wasted space too.

For either of these, don't update (especially) or insert the whole table at once. Batch things up into, say, 100,000 records at a time, depending on memory and disk space. It will have far less impact on the system.

==========================================
Cursors are useful if you don't know SQL.
DTS can be used in a similar way.
Beer is not cold and it isn't fizzy.
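A minimal T-SQL sketch of the staging-table approach above, for SQL Server 2000. The table names, column names, file paths, and the identity column ID are all hypothetical placeholders; NULLIF(RTRIM(col), '') is what turns the all-space fields into NULLs, and the loop copies 100,000 rows per batch:

```sql
-- Step 1: bulk insert the fixed-width file into a staging table.
-- (A format file maps the fixed-width positions to the staging columns;
-- both paths here are illustrative.)
BULK INSERT dbo.Staging
FROM 'C:\data\import.dat'
WITH (FORMATFILE = 'C:\data\import.fmt', TABLOCK)

-- Step 2: copy into the real table in 100,000-row batches,
-- converting all-space fields to NULL on the way.
-- Assumes Staging has an identity column ID for batching.
DECLARE @lo int, @hi int, @max int
SELECT @lo = MIN(ID), @max = MAX(ID) FROM dbo.Staging

WHILE @lo <= @max
BEGIN
    SET @hi = @lo + 99999

    INSERT INTO dbo.RealTable (Col1, Col2)      -- repeat for all 300+ columns
    SELECT NULLIF(RTRIM(Col1), ''),             -- spaces -> '' -> NULL
           NULLIF(RTRIM(Col2), '')
    FROM dbo.Staging
    WHERE ID BETWEEN @lo AND @hi

    SET @lo = @hi + 1
END
```

Batching by an ID range keeps each transaction (and its log usage) small; a bcp-out-to-CSV variant would instead put the NULLIF/RTRIM expressions in a view and export from that.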