Scott (Posting Yak Master, 145 Posts)
Posted - 2003-07-23 : 04:08:50
I am creating a table called #LVE01, then I run the following:

BULK INSERT #LVE01 FROM 'D:\Data\DIMS\FTP\HR\LVE01a.CSV' WITH (FIELDTERMINATOR = ',')

The CSV opens perfectly in Excel, and if I use Enterprise Manager's import wizard it works fine. The problem is that the last few fields of each row do not always have data; the carriage return then comes straight after the last field with data, e.g. a row ends with "data" and not "data, , , ,". When I run the bulk insert it seems to randomly choose when to go onto a new line. My last field is varchar(255) and often gets filled with multiple fields from the next row, which gives a truncation error.

Anyone with any ideas?

thanks
Scott

PS: I think what will work here is if I can map the text columns to the table columns. I have tried to run a trace when I use the Import Data wizard but can't find anything useful.

Edited by - scott on 07/23/2003 05:56:56
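The column-to-column mapping mentioned in the PS is what bcp format files are for; BULK INSERT accepts one via the FORMATFILE option. A minimal sketch, assuming a SQL Server 2000-style non-XML format file; the .fmt path and its contents below are illustrative, not from the thread:

```sql
-- Sketch: map CSV columns to table columns with a bcp format file.
-- The format-file path here is hypothetical.
BULK INSERT #LVE01
FROM 'D:\Data\DIMS\FTP\HR\LVE01a.CSV'
WITH (FORMATFILE = 'D:\Data\DIMS\FTP\HR\LVE01a.fmt')

-- LVE01a.fmt might look like this for three comma-separated character
-- columns with a newline after the last one (fields: host column order,
-- data type, prefix length, max length, terminator, table column order,
-- column name, collation):
--
-- 8.0
-- 3
-- 1  SQLCHAR  0  255  ","   1  Field1  SQL_Latin1_General_CP1_CI_AS
-- 2  SQLCHAR  0  255  ","   2  Field2  SQL_Latin1_General_CP1_CI_AS
-- 3  SQLCHAR  0  255  "\n"  3  Field3  SQL_Latin1_General_CP1_CI_AS
```

A format file also lets the file have a different number of columns than the table, which is one way around mismatched layouts.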
dsdeming (479 Posts)
Posted - 2003-07-23 : 08:32:04
I've had the same problem with tab-delimited files created by Excel. Excel doesn't reliably create the delimiters when the last cell in the last row is empty/null. As a workaround, I modified the Excel format to ensure that the final column was not nullable. No problems after that.

Dennis
Scott (Posting Yak Master, 145 Posts)
Posted - 2003-07-23 : 09:06:36
The file is created on a Solaris box; I'm just using Excel to check the data. It looks fine in Excel and imports fine into the db using the wizard, but crashes when using bulk insert?

Thanks
Scott
robvolk (Most Valuable Yak, 15732 Posts)
Posted - 2003-07-23 : 19:34:30
Try adding a row terminator:

BULK INSERT #LVE01 FROM 'D:\Data\DIMS\FTP\HR\LVE01a.CSV' WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')

Solaris is a flavor of Unix, and Unix-type OSes use a single LF character as a line terminator, whereas DOS/Windows use CR-LF.
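A caveat not raised in the thread: some SQL Server versions translate ROWTERMINATOR = '\n' to CR-LF during bulk import, so a pure-LF Unix file can still misparse. Later versions accept the line-feed byte spelled out in hex, which sidesteps the translation; a sketch, assuming a version that supports hex terminators:

```sql
-- Sketch: '0x0a' names the bare LF byte directly, avoiding the
-- '\n' -> CR-LF substitution some versions apply on bulk import.
BULK INSERT #LVE01
FROM 'D:\Data\DIMS\FTP\HR\LVE01a.CSV'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '0x0a')
```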
Scott (Posting Yak Master, 145 Posts)
Posted - 2003-07-24 : 04:49:58
Thanks, that improves things, but it is still not error-free.

The problem as I see it is that the CSV is badly formed, i.e. some rows have fewer columns than the table I am inserting the data into. These rows do have a line feed at the end of the last field with data, but instead of inserting NULLs into the table where there is no data, the bulk insert gives an error.

E.g.:

Create Table #LVE01 (
    Field1 varchar(10),
    Field2 varchar(10),
    Field3 varchar(10)
)

E.g. CSV:

test,xyz,abc
abcd,xyz,
five,nine,
abc,xyz

Using bulk insert, row three gives an error.

??

Scott
Amethystium (Aged Yak Warrior, 701 Posts)
Posted - 2003-07-24 : 05:38:57
Interesting... The problem is not BULK INSERT but the .CSV file created through Excel. I experienced exactly the same problem the other day. I suggest that you add an extra column to your spreadsheet and populate it with a comma.

---------------
Shadow to Light
nr (SQLTeam MVY, 12543 Posts)
Posted - 2003-07-24 : 05:41:00
Insert into a staging table with a single 8000-character column; then you can split it into fields from there. Use an invalid character for the field terminator (e.g. '|'), i.e. one that never occurs in the data. Not very nice, but it's reliable and works for any file with rows under 8000 characters. If you build it with a parameterised delimiter you can use it for all your imports.

==========================================
Cursors are useful if you don't know SQL. DTS can be used in a similar way.
Beer is not cold and it isn't fizzy.
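The staging approach above can be sketched for Scott's three-column table as follows. This is a sketch, not code from the thread: the #Staging table name is invented, the split uses CHARINDEX/SUBSTRING (SQL 2000 era, so no string-split functions), and padding each line with one comma per column lets short rows come through as empty strings, turned into NULLs with NULLIF:

```sql
-- Load each whole line into one wide column; '|' never occurs in the
-- data, so every line arrives as a single field.
CREATE TABLE #Staging (Line varchar(8000))

BULK INSERT #Staging
FROM 'D:\Data\DIMS\FTP\HR\LVE01a.CSV'
WITH (FIELDTERMINATOR = '|', ROWTERMINATOR = '\n')

-- Split on the first three commas. Appending ',,,' guarantees every row
-- has a terminator for each of the three fields, so rows with missing
-- trailing fields yield empty strings (mapped to NULL) instead of errors.
INSERT INTO #LVE01 (Field1, Field2, Field3)
SELECT
    NULLIF(LEFT(padded, p1 - 1), ''),
    NULLIF(SUBSTRING(padded, p1 + 1, p2 - p1 - 1), ''),
    NULLIF(SUBSTRING(padded, p2 + 1,
           CHARINDEX(',', padded, p2 + 1) - p2 - 1), '')
FROM (
    SELECT padded, p1,
           CHARINDEX(',', padded, p1 + 1) AS p2
    FROM (
        SELECT Line + ',,,' AS padded,
               CHARINDEX(',', Line + ',,,') AS p1
        FROM #Staging
    ) AS a
) AS b
```

Fields longer than the target varchar(10) columns would still truncate; widen the columns or add LEFT(..., 10) if that matters.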