Please start any new threads on our new
site at https://forums.sqlteam.com. We've got lots of great SQL Server
experts to answer whatever question you can come up with.
bershi95
Starting Member
5 Posts
Posted - 2003-11-05 : 12:52:35
Hello, I have several HUGE text files that I need to convert to tables. Using the DTS wizard is straightforward enough, but I need to know if it is possible to somehow filter the source data (perhaps using a SQL statement) before completing the wizard. Since I'm rather new at all this, my description of the problem may be a bit unclear. Please let me know if you need further clarification. Thanks a lot!

Bernard
tkizer
Almighty SQL Goddess
38200 Posts
Posted - 2003-11-05 : 12:55:34
It is recommended that you import all of the data into a staging table. This would be a permanent table that is only used by DTS. Then use a SQL query to move your data from the staging table into your other table(s).

Tara
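The staging-table approach Tara describes might look roughly like this in T-SQL. This is only a sketch: the staging table layout, target table, and filter condition are made-up placeholders, since the thread doesn't describe Bernard's actual file format.

```sql
-- Hypothetical staging table whose columns mirror the text file's layout.
-- DTS loads the raw file into this table with no filtering.
CREATE TABLE dbo.StagingImport (
    CustomerID INT,
    OrderDate  DATETIME,
    Amount     MONEY
);

-- After the DTS load, ordinary SQL does the filtering on the way
-- into the real table (the WHERE clause is just an example filter).
INSERT INTO dbo.Orders (CustomerID, OrderDate, Amount)
SELECT CustomerID, OrderDate, Amount
FROM dbo.StagingImport
WHERE OrderDate >= '2003-01-01';

-- Empty the staging table so it is ready for the next file.
TRUNCATE TABLE dbo.StagingImport;
```

Because the staging table is permanent (not a temp table), the filtering query runs in the user database rather than in tempdb.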
bershi95
Starting Member
5 Posts
Posted - 2003-11-05 : 13:03:50
Hi Tara,

Thanks for the reply! Unfortunately, part of the problem is that the text files are so huge (1.8GB each) that the TEMPDB fills up before all of the data is imported. I get the following message:

Could not allocate space for object '(SYSTEM table id: 291052263)' in database 'TEMPDB' because the 'DEFAULT' filegroup is full.

Is the only solution to break up the source file data into more manageable chunks? What do ya think?

Thanks again,
Bernard

quote: Originally posted by tduggan
It is recommended that you import all of the data into a staging table. This would be a permanent table that is only used by DTS. Then use a SQL query to move your data from the staging table into your other table(s).
Tara
tkizer
Almighty SQL Goddess
38200 Posts
Posted - 2003-11-05 : 13:07:10
1.8GB! Geez, that is a huge file. Yes, if you cannot add disk space to the server to support the growth of tempdb that is needed, then I would say definitely break up the files.

You might also consider the command line tool bcp, or the BULK INSERT statement. For such a large amount of data, you'll want the load to run as fast as possible, and both bcp and BULK INSERT are faster than DTS. BULK INSERT is run from Query Analyzer, while bcp is run from a cmd window.

Tara
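A BULK INSERT for this kind of load might be sketched as below. The file path, staging table name, and field/row terminators are assumptions (the thread never states the file's delimiter); BATCHSIZE makes SQL Server commit in chunks instead of in one enormous transaction, which helps keep the transaction log from filling during a 1.8GB load.

```sql
-- Hypothetical load of a tab-delimited file into the staging table.
-- Run from Query Analyzer; the path is as seen by the SQL Server machine.
BULK INSERT dbo.StagingImport
FROM 'C:\data\hugefile.txt'
WITH (
    FIELDTERMINATOR = '\t',   -- assumed tab-delimited
    ROWTERMINATOR   = '\n',
    BATCHSIZE       = 50000,  -- commit every 50,000 rows
    TABLOCK                   -- table lock enables faster minimally logged load
);
```

The bcp equivalent from a cmd window would be something along the lines of `bcp MyDb.dbo.StagingImport in C:\data\hugefile.txt -c -S myserver -T -b 50000`, where `-c` means character-mode data, `-T` uses a trusted connection, and `-b` sets the same kind of batch size; `MyDb` and `myserver` are placeholders.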
bershi95
Starting Member
5 Posts
Posted - 2003-11-05 : 13:25:17
Ok, Tara. Thanks again for all your help! I really appreciate it.

Bernard