AskSQLTeam
Ask SQLTeam Question
0 Posts
Posted - 2003-04-18 : 09:17:44
Andrew Mobbs writes: "I am trying to import a set of data into my database. The dataset will be large and will run as a batch. The problem I am having is that some of the data I receive includes descriptive fields, such as a username (there are about 5 other such fields), in nvarchar format from a text file. For each of these fields I need to look up a reference to a primary key in another table so that I can store a userid instead. In other words, I am trying to normalise the data whilst still using a fast bulk insert method. Do you have any ideas?"
mfemenel
Professor Frink
1421 Posts
Posted - 2003-04-18 : 09:23:41
Coming from a text file, your best bet is to dump that information into a holding table on your server, and then either perform your lookups as you move it to the data's final resting place, or update the holding table first and then insert it into its final destination. There's no way that I know of to do the lookups while you're importing from a text file.

Mike
"oh, that monkey is going to pay"
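A minimal T-SQL sketch of the holding-table approach Mike describes. All table, column, and file names here (StagingImport, Users, FinalTable, the file path, and the delimiters) are illustrative assumptions, not from the original posts:

```sql
-- 1. Holding table whose columns match the text file's layout,
--    including the descriptive nvarchar fields such as UserName.
CREATE TABLE StagingImport (
    UserName   nvarchar(100),
    -- ...the other descriptive lookup fields from the file...
    ImportData nvarchar(255)
)

-- 2. Fast bulk load of the raw file into the holding table.
--    Adjust the path and terminators to match the actual file.
BULK INSERT StagingImport
FROM 'C:\import\datafile.txt'
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n')

-- 3. Resolve each username to its primary key with a join while
--    moving the rows to their final destination.
INSERT INTO FinalTable (UserID, ImportData)
SELECT u.UserID, s.ImportData
FROM StagingImport s
JOIN Users u ON u.UserName = s.UserName
```

An inner join silently drops rows whose username has no match in Users; a LEFT JOIN with a check for NULL UserID would let you catch and report those rows instead.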