shaneschmidt (Starting Member, 17 posts)
Posted - 2003-10-10 : 01:18:41
We have a DTS package doing a bulk load insert, but it is failing on a duplicate key. We are not concerned about duplicate keys being ignored. Is there a way to keep our package from failing when a duplicate key is found? Any assistance or guidance is appreciated.
nr (SQLTeam MVY, 12543 posts)
Posted - 2003-10-10 : 01:39:35
Import into a staging table without a unique index, then insert the distinct keys from there. You might also consider doing this in a stored procedure rather than in the DTS package.

==========================================
Cursors are useful if you don't know SQL.
DTS can be used in a similar way.
Beer is not cold and it isn't fizzy.
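A minimal T-SQL sketch of the staging-table approach described above. The table and column names (`dbo.Customers`, `dbo.Customers_Staging`, `CustomerID`) are hypothetical stand-ins for the poster's actual schema:

```sql
-- Staging table: same columns as the target, but no PRIMARY KEY
-- or unique index, so the bulk load can never fail on duplicates.
CREATE TABLE dbo.Customers_Staging (
    CustomerID   int          NOT NULL,
    CustomerName varchar(100) NULL
);

-- (The DTS package bulk-loads into dbo.Customers_Staging here.)

-- Move only rows whose key is not already in the target table,
-- collapsing any duplicates within the staged data itself.
INSERT INTO dbo.Customers (CustomerID, CustomerName)
SELECT s.CustomerID, MAX(s.CustomerName)
FROM dbo.Customers_Staging s
WHERE NOT EXISTS (SELECT 1
                  FROM dbo.Customers c
                  WHERE c.CustomerID = s.CustomerID)
GROUP BY s.CustomerID;

-- Clear the staging table for the next load.
TRUNCATE TABLE dbo.Customers_Staging;
```

The `GROUP BY` plus `MAX()` is one simple way to pick a single row per key when the staged file itself contains duplicates; which copy to keep is a business decision.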
tkizer (Almighty SQL Goddess, 38200 posts)
Posted - 2003-10-10 : 12:38:02
I recommend the staging table too. I typically do not create any constraints on the staging table; doing it this way at least gets the data into SQL Server. You can then transfer the data from the staging table to your table using T-SQL.

You could also drop your primary key prior to the bulk load, then add it back after the bulk load is done without having it check for duplicates. Or you could delete the dups prior to re-creating the constraint.

Tara
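A sketch of the drop-the-key alternatives, again using hypothetical names (`dbo.Customers`, `PK_Customers`, `CustomerID`). One option is to replace the primary key with a unique index created `WITH IGNORE_DUP_KEY` *before* the load, which turns duplicate-key inserts into warnings rather than errors; note it will not create over rows that are already duplicated, so the table must be clean first:

```sql
-- Option A: discard duplicate inserts silently during the load.
ALTER TABLE dbo.Customers DROP CONSTRAINT PK_Customers;

CREATE UNIQUE INDEX UX_Customers_CustomerID
ON dbo.Customers (CustomerID)
WITH IGNORE_DUP_KEY;   -- duplicate inserts become warnings, not errors

-- (Bulk load runs here; duplicate rows are simply not inserted.)

-- Option B: load with no key at all, delete the extra copies,
-- then re-create the primary key (classic SQL 2000-era pattern).
SET ROWCOUNT 1;        -- each DELETE removes one row per iteration
WHILE EXISTS (SELECT CustomerID FROM dbo.Customers
              GROUP BY CustomerID HAVING COUNT(*) > 1)
    DELETE FROM dbo.Customers
    WHERE CustomerID IN (SELECT CustomerID FROM dbo.Customers
                         GROUP BY CustomerID HAVING COUNT(*) > 1);
SET ROWCOUNT 0;

ALTER TABLE dbo.Customers
ADD CONSTRAINT PK_Customers PRIMARY KEY (CustomerID);
```

Option B's loop assumes the duplicate rows are interchangeable, since it keeps an arbitrary survivor for each key.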