sqlserver_newbee
Starting Member
11 Posts

Posted - 2007-07-19 : 23:37:16
Hi All,

Each month we receive several flat data files in excess of 3-4 GB each, containing more than 5 million records apiece. I am working on a project to transfer these data files into a SQL Server database. I have designed the database and established the constraints on the tables, and I have created a DTS package with a Transform Data Task that reads these fixed-length files and loads them into the tables. Everything works fine up to this point.

Each month, however, the new files contain new records plus updates to the old data. Because of the primary key constraints, I obviously can't do a simple insert/append into the tables. I assume I need to load the new data into temp tables and then use T-SQL queries to check each record against the actual table: if a primary key already exists, update that row of the actual table; otherwise, insert the new record from the temp table.

Because of the size of the files, it already takes about 3 hours to load one table from a text file using the Transform Data Task, and this additional checking plus table-to-table transfer may take much longer. I have 20 such tables.

Can someone please suggest a suitable way of doing this operation? I am looking for a solution that is time-efficient and simple to implement, as I am a beginner. Any help is appreciated. Thanks a lot!
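For illustration, here is a minimal sketch of the temp-table upsert described above, as it might look in T-SQL. The staging and target tables (dbo.Staging_Customer, dbo.Customer), their columns, and the key CustomerID are hypothetical stand-ins, not names from the post; MERGE is deliberately avoided because it did not exist before SQL Server 2008, well after the DTS era.

-- Hypothetical tables: dbo.Staging_Customer holds this month's load;
-- dbo.Customer is the real table, keyed on CustomerID.
BEGIN TRANSACTION

-- Step 1: update rows whose primary key already exists in the target.
UPDATE c
SET    c.Name    = s.Name,
       c.Balance = s.Balance
FROM   dbo.Customer AS c
       JOIN dbo.Staging_Customer AS s
         ON s.CustomerID = c.CustomerID

-- Step 2: insert rows whose primary key is not yet in the target.
INSERT INTO dbo.Customer (CustomerID, Name, Balance)
SELECT s.CustomerID, s.Name, s.Balance
FROM   dbo.Staging_Customer AS s
WHERE  NOT EXISTS (SELECT 1
                   FROM   dbo.Customer AS c
                   WHERE  c.CustomerID = s.CustomerID)

COMMIT TRANSACTION

Running the UPDATE before the INSERT means freshly inserted rows are never touched twice in the same batch, and wrapping both statements in one transaction keeps the target table consistent if the job fails midway.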
dinakar
Master Smack Fu Yak Hacker
2507 Posts