Cedricsan
Starting Member
16 Posts
Posted - 2007-02-19 : 09:16:39
Dear all,

I am trying to insert csv files into a SQL table. The main issue is that the csv file rows contain a variable number of columns (up to 43). If I import the file manually using "Import and Export Data" it works fine, but when I try to use BULK INSERT it goes wrong: BULK INSERT tries to fill all 43 columns of the target SQL table every time, even if the csv file row contains only 18 columns, so the data load fails.

Do you know a way to work around that? Any help or idea welcomed.

Thanks in advance
nr
SQLTeam MVY
12543 Posts
Posted - 2007-02-22 : 04:04:30
You can import into a single column staging table then insert from that into your production table.

==========================================
Cursors are useful if you don't know sql.
DTS can be used in a similar way.
Beer is not cold and it isn't fizzy.
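[Editor's note: below is a minimal sketch of the single-column staging approach nr describes. The table names (StagingRaw, Production), the file path, and the two parsed columns are assumptions for illustration only; a real load would parse every field the production table needs.]

-- 1. One wide column: each csv line lands here whole, however many commas it has.
CREATE TABLE StagingRaw (RawLine varchar(8000))

-- 2. Load the file line by line. With a single target column, the whole line
--    goes into RawLine regardless of how many delimiters it contains.
BULK INSERT StagingRaw
FROM 'C:\import\data.csv'
WITH (ROWTERMINATOR = '\n')

-- 3. Parse only the columns you need into the production table.
--    Rows with fewer fields simply yield empty strings for the missing ones.
INSERT INTO Production (Col1, Col2)
SELECT
    LEFT(RawLine, CHARINDEX(',', RawLine + ',') - 1),
    SUBSTRING(RawLine,
              CHARINDEX(',', RawLine + ',') + 1,
              CHARINDEX(',', RawLine + ',,', CHARINDEX(',', RawLine + ',') + 1)
                  - CHARINDEX(',', RawLine + ',') - 1)
FROM StagingRaw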
chanec
Starting Member
1 Post
Posted - 2007-02-26 : 13:07:40
Hello,

I'm having the same problem and do not know how to get around this. I'm fairly new to SQL, but I do have some working knowledge of it. Do you have any information resources on how to create a staging table?

My issue is that I have a csv file that can vary from 11 to 40 columns. I need to be able to map the correct columns to the production table, but since the number of columns changes and the ordering may change, I do not know how to do a proper transform. How would using a single-column staging table solve this? Also, is there a way to dynamically map a column to the SQL column based on its name?

Regards,
Chanec
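[Editor's note: the thread does not answer this directly, but as an illustration of how nr's staging-table suggestion could cope with mapping by name, the sketch below (all object and column names are assumed) locates a named column's position in the csv header row; that position can then drive the CHARINDEX/SUBSTRING parsing shown earlier, or a dynamically built INSERT.]

-- Illustrative only, building on the staging sketch above. StagingRaw and
-- CustomerName are assumed names; it also assumes the csv's first line is a
-- header row that actually contains the name being looked up.

DECLARE @header varchar(8000), @name varchar(128), @pos int

-- The header row is loaded into the staging table like any other line
-- (in practice you would tag or order the rows to identify it reliably).
SELECT TOP 1 @header = RawLine FROM StagingRaw

SET @name = 'CustomerName'

-- Ordinal of the named column = number of commas before the name, plus one.
SET @pos = LEN(LEFT(@header, CHARINDEX(@name, @header) - 1))
         - LEN(REPLACE(LEFT(@header, CHARINDEX(@name, @header) - 1), ',', ''))
         + 1

-- @pos can then feed the field parsing (or a dynamic SQL statement), so each
-- production column is filled from whichever position the file puts it in,
-- and columns missing from a shorter file are simply left NULL.
PRINT @name + ' is field ' + CAST(@pos AS varchar(10)) + ' in this file'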