Please start any new threads on our new
site at https://forums.sqlteam.com. We've got lots of great SQL Server
experts to answer whatever question you can come up with.
medtech26
Posting Yak Master
169 Posts

Posted - 2006-03-28 : 20:22:54
Hello all,

This is the case: as of now I've got all of my (online store) items neatly organized in a SQL Server 2K DB (2 tables, one just holds the description) — nearly 2K items with up-to-date information and the most current descriptions.

Now that we've decided to upgrade the store, we're leasing a new file with over 90K items that's going to be FTP'd to our site every week, and another file (inventory) that's going to be FTP'd every day.

A few facts:
* The files are text (comma-delimited) files.
* Some items exist both in our current DB and the text file.
* I would like to filter most of the 90K items and eventually have approximately 10K to work with.

How should I approach this issue?

I do hope that it's clear enough; if not, you're welcome to ask and I'll do my best to clarify as much as possible.
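A common way to handle a weekly comma-delimited feed like this in SQL Server 2000 is to load it into a staging table first and do the filtering there, before anything touches the live tables. A minimal sketch (the table names, column names, file path, and filter criteria below are all illustrative assumptions, not details from the post):

```sql
-- Sketch only: ImportStaging, ItemsToProcess, and all column names
-- are hypothetical; adjust to match your actual feed layout.
CREATE TABLE ImportStaging (
    ItemSKU  varchar(50)  NOT NULL,
    ItemName varchar(200) NULL,
    Category varchar(100) NULL,
    Price    money        NULL
)

-- Load the weekly comma-delimited file into the staging table.
BULK INSERT ImportStaging
FROM 'D:\ftp\weekly_items.txt'          -- assumed FTP drop location
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')

-- Filter the ~90K staged rows down to the ~10K you actually want,
-- before merging anything into the live tables.
SELECT *
INTO   ItemsToProcess
FROM   ImportStaging
WHERE  Category IN ('Widgets', 'Gadgets')   -- your filter criteria here
```

Truncating and reloading the staging table each week keeps the import repeatable; the daily inventory file could follow the same staging pattern with its own table.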
CorpDirect
Yak Posting Veteran
92 Posts

Posted - 2006-03-28 : 21:29:52
Though I've not done much of this myself, so I can't give specifics, you might want to look at using DTS to import your data from the text files. Here's a resource: SQL Server DTS Best Practices. There are some examples of how to do just what you're asking in there.

DTS can be scheduled to run via SQL Agent or using DTSRUN from a Scheduled Task in the OS, and you can define how imported data will be transformed or filtered in your DTS package. You can even set up stored procedures to manipulate your data after import, and execute those as a step in the package, if that gives you better control.

One suggestion: try this in a test environment, or at least on copy tables, while you work out the kinks!

Good luck,
Daniel
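For the "some items exist in both the DB and the file" part, the post-import stored procedure would typically do an update-then-insert, since SQL Server 2000 has no MERGE statement. A hedged sketch (the Items/ItemsToProcess tables and ItemSKU key are assumptions for illustration):

```sql
-- Sketch only: assumes a filtered staging table (ItemsToProcess) and a
-- live table (Items), both keyed on a hypothetical ItemSKU column.

-- 1) Update items that already exist in the live table.
UPDATE i
SET    i.ItemName = s.ItemName,
       i.Price    = s.Price
FROM   Items i
JOIN   ItemsToProcess s ON s.ItemSKU = i.ItemSKU

-- 2) Insert items that are new to the live table.
INSERT INTO Items (ItemSKU, ItemName, Price)
SELECT s.ItemSKU, s.ItemName, s.Price
FROM   ItemsToProcess s
WHERE  NOT EXISTS (SELECT 1 FROM Items i WHERE i.ItemSKU = s.ItemSKU)
```

Wrapped in a stored procedure, this pair of statements can run as an Execute SQL step at the end of the DTS package, ideally inside a transaction so a failed weekly load doesn't leave the live table half-updated.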