SamC
White Water Yakist
3467 Posts
Posted - 2003-10-02 : 11:35:08
It's really not that bad. I think catchy subject lines get more readers.

I have a bcp.fmt file as follows:

8.0
15
1   SQLCHAR 0 100 ","    1   Email        SQL_Latin1_General_CP1_CI_AS
2   SQLCHAR 0 100 ","    2   UserName     SQL_Latin1_General_CP1_CI_AS
3   SQLCHAR 0 100 ","    3   Password     SQL_Latin1_General_CP1_CI_AS
4   SQLCHAR 0 100 ","    4   FirstName    SQL_Latin1_General_CP1_CI_AS
5   SQLCHAR 0 100 ","    5   MiddleName   SQL_Latin1_General_CP1_CI_AS
6   SQLCHAR 0 100 ","    6   LastName     SQL_Latin1_General_CP1_CI_AS
7   SQLCHAR 0 100 ","    7   CenterName   SQL_Latin1_General_CP1_CI_AS
8   SQLCHAR 0 100 ","    8   BranchName   SQL_Latin1_General_CP1_CI_AS
9   SQLCHAR 0 100 ","    9   Offline      SQL_Latin1_General_CP1_CI_AS
10  SQLCHAR 0 100 ","    10  ISSO         SQL_Latin1_General_CP1_CI_AS
11  SQLCHAR 0 100 ","    11  EmployeeType SQL_Latin1_General_CP1_CI_AS
12  SQLCHAR 0 100 ","    12  Inactive     SQL_Latin1_General_CP1_CI_AS
13  SQLCHAR 0 100 ","    13  NonEmployee  SQL_Latin1_General_CP1_CI_AS
14  SQLCHAR 0 300 ","    14  Fullname     SQL_Latin1_General_CP1_CI_AS
15  SQLCHAR 0 512 "\r\n" 15  Assignments  SQL_Latin1_General_CP1_CI_AS

I'd like the last two columns to be optional in the imported comma-delimited data. One bcp.fmt file: just give me the columns you can, in order, and you can omit the last few if you need to. (This is a backward-compatibility thing.)

Unfortunately, when I eliminate the last CSV value in the imported text data, the entire next row is taken as the last CSV value (commas and all) and inserted into the target table.

What's up with that, and can I stop it?

Sam
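A plausible explanation of the symptom (a sketch of the idea, not bcp's actual code): bcp reads each field up to that field's terminator, so when a row is missing its last ","-separated value, the scan for the missing terminator simply runs past the row's line break into the next row. A small Python illustration of terminator-driven scanning, using hypothetical two-field data:

```python
def scan_fields(data, terminators):
    """Read fields sequentially, each up to its own terminator (bcp-style).

    Assumes every terminator actually occurs in the remaining data;
    this is an illustration, not a robust parser.
    """
    fields, pos = [], 0
    for term in terminators:
        end = data.find(term, pos)        # scan for THIS field's terminator
        fields.append(data[pos:end])
        pos = end + len(term)
    return fields, pos

# Two-column layout: field 1 ends at ",", field 2 at "\r\n".
fields, _ = scan_fields("a,b\r\nc,d\r\n", [",", "\r\n"])
# fields == ["a", "b"] -- well-formed row parses cleanly.

# Drop the last value (and its comma) from the first row:
fields, _ = scan_fields("a\r\nc,d\r\n", [",", "\r\n"])
# The search for "," runs past the line break into the next row:
# fields == ["a\r\nc", "d"]
```

With the full 15-field layout the effect is the same: the missing field's terminator is found somewhere in the following row, so that row's text (commas and all) is swallowed into the tail fields.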
robvolk
Most Valuable Yak
15732 Posts
Posted - 2003-10-02 : 13:04:39
Nope, you'll need two (or however many) format files to handle each type of file. Needless to say, if you have mixed line formats within the same data file, you'll have to find another workaround.

If you need that much flexibility, import it into a single varchar(8000) staging table and parse it from there. That even works for multiple line formats within the same file.
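For concreteness, a sketch of what the shorter companion format file could look like under the two-file approach (untested; it assumes the 13-column feed simply stops at NonEmployee, whose field terminator becomes the row terminator, and that the remaining table columns accept NULLs or defaults):

```
8.0
13
1   SQLCHAR 0 100 ","    1   Email        SQL_Latin1_General_CP1_CI_AS
2   SQLCHAR 0 100 ","    2   UserName     SQL_Latin1_General_CP1_CI_AS
3   SQLCHAR 0 100 ","    3   Password     SQL_Latin1_General_CP1_CI_AS
4   SQLCHAR 0 100 ","    4   FirstName    SQL_Latin1_General_CP1_CI_AS
5   SQLCHAR 0 100 ","    5   MiddleName   SQL_Latin1_General_CP1_CI_AS
6   SQLCHAR 0 100 ","    6   LastName     SQL_Latin1_General_CP1_CI_AS
7   SQLCHAR 0 100 ","    7   CenterName   SQL_Latin1_General_CP1_CI_AS
8   SQLCHAR 0 100 ","    8   BranchName   SQL_Latin1_General_CP1_CI_AS
9   SQLCHAR 0 100 ","    9   Offline      SQL_Latin1_General_CP1_CI_AS
10  SQLCHAR 0 100 ","    10  ISSO         SQL_Latin1_General_CP1_CI_AS
11  SQLCHAR 0 100 ","    11  EmployeeType SQL_Latin1_General_CP1_CI_AS
12  SQLCHAR 0 100 ","    12  Inactive     SQL_Latin1_General_CP1_CI_AS
13  SQLCHAR 0 100 "\r\n" 13  NonEmployee  SQL_Latin1_General_CP1_CI_AS
```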
SamC
White Water Yakist
3467 Posts
Posted - 2003-10-02 : 13:14:46
I hadn't thought of copying the entire row in as a single line item. And it may not be the end of the world to use multiple bcp format files.

I've got two options now. Thanks.

Have a Pepsi.
X002548
Not Just a Number
15586 Posts
Posted - 2003-10-03 : 12:54:35
Depending on how much data you have... if it's huge, I don't think (damn it... again) that it would be too efficient. It does give you a good amount of control over your data if it's suspect, though. You could implement all kinds of integrity checking, if you like, before inserting into a final table.

Search for "header and trailer" -- I remember lots of sample code being posted. And if you want some, let us know.

Brett

8-)

SELECT @@POST FROM Brain ORDER BY NewId()
That's correct! It's an AlphaNumeric!
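A minimal sketch of the parse-and-validate step Brett describes, shown in Python for illustration (in practice you'd parse the staged varchar rows in T-SQL; the function name and the pad-with-empty-string policy for the two optional columns are assumptions, not anything from the thread):

```python
EXPECTED = 15  # full column count; the last two (Fullname, Assignments) are optional

def parse_staged_row(line):
    """Split one staged CSV row, pad the optional trailing columns,
    and reject rows with the wrong field count."""
    fields = line.rstrip("\r\n").split(",")
    if EXPECTED - 2 <= len(fields) < EXPECTED:
        fields += [""] * (EXPECTED - len(fields))   # pad missing optional columns
    if len(fields) != EXPECTED:
        raise ValueError(f"bad row: {len(fields)} fields")
    return fields
```

Rows that pass could move on to the real 15-column table; rows that raise could be written off to an error table for inspection, which is the kind of control a staging table buys you.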
SamC
White Water Yakist
3467 Posts
Posted - 2003-10-03 : 13:52:48
Well, what I have is working now. But if a single CSV line is missing a column, or has one too many, then DTS will botch the next CSV line.

It's troublesome that when DTS encounters a bad character or column in one line (row), it will corrupt the import of the next line (row), which may have perfectly valid data.
X002548
Not Just a Number
15586 Posts
Posted - 2003-10-03 : 14:23:37
quote: Originally posted by SamC
But if a single CSV line is missing a column or has one two many, then DTS will botch the next CSV line.

What about one three many? Hey, bad data is bad data... bcp-ing in to a single column might help, and maybe you could isolate the rows, but it's still bad data.

Brett

8-)

SELECT @@POST FROM Brain ORDER BY NewId()
That's correct! It's an AlphaNumeric!