ells
Starting Member
25 Posts
Posted - 2008-01-30 : 05:25:48
I have a data pump that selects data from a SQL Server instance on another server and transfers it into a target table on the local SQL Server. My issue appears to be the number of records coming over: it currently exceeds 565,000 records and is still growing.

The select statement is quite simple; the conditions are:

checkdate > CONVERT(DATETIME, '2003-03-31 00:00:00', 102)
and substring(purchaser,1,3) in ('2f2','4rr','5hh','7jj')

I really do need to improve the performance of this data pump somehow. I have considered changing this to a SQL task, or changing the select so that the logic is written into a view on the target server.

How do I find out what is causing this to run slowly?
How do I speed this up?
Can I improve the SQL, i.e. load all records and then delete the ones I don't want?
Would a SQL task be better?

I have a feeling I am just going to end up moving the date forward and not loading older data if there are no other options.

Many Thanks.
Mark
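One thing worth checking in the select itself: wrapping the column in substring() stops SQL Server from using any index on purchaser, because the predicate has to be evaluated against every row. Since the test is on a constant-length prefix, it can be rewritten as LIKE with a literal prefix, which an index can seek on. A minimal sketch, assuming a source table named Purchases (the real table name isn't given in the post):

```sql
-- Hypothetical sketch: the table name "Purchases" is an assumption.
-- substring(purchaser,1,3) IN (...) is non-sargable; the equivalent
-- LIKE 'xxx%' predicates below can use an index on purchaser.
SELECT *
FROM Purchases
WHERE checkdate > CONVERT(DATETIME, '2003-03-31 00:00:00', 102)
  AND ( purchaser LIKE '2f2%'
     OR purchaser LIKE '4rr%'
     OR purchaser LIKE '5hh%'
     OR purchaser LIKE '7jj%' )
```

The two forms match exactly the same rows (first three characters equal to one of the listed values), but the LIKE version gives the optimizer a chance to use an index on purchaser if one exists.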
ells
Starting Member
25 Posts
Posted - 2008-01-30 : 05:27:47
Oh, I forgot to add: there may be no indexes. How do I find out whether the source table would benefit from an index, etc.?

Thanks.
Mark.
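To see what indexes already exist on the source table, and how SQL Server plans to execute the pump's select, something like the following can be run on the source server (the table name here is an assumption for illustration):

```sql
-- List existing indexes on the source table (name is hypothetical):
EXEC sp_helpindex 'Purchases'

-- Show the estimated execution plan without actually running the query;
-- a table scan in the output suggests an index on checkdate or
-- purchaser could help:
SET SHOWPLAN_TEXT ON
GO
SELECT *
FROM Purchases
WHERE checkdate > CONVERT(DATETIME, '2003-03-31 00:00:00', 102)
  AND substring(purchaser,1,3) IN ('2f2','4rr','5hh','7jj')
GO
SET SHOWPLAN_TEXT OFF
GO
```

If the plan shows a full table scan on a large table, adding an index on checkdate (and rewriting the purchaser test so it is sargable) would be the first things to try.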
RickD
Slow But Sure Yak Herding Master
3608 Posts
Posted - 2008-01-30 : 07:05:58
Try using BCP instead; it will be much faster. Load the data into a load table first if you need to do anything with it before moving it into your end table.
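As a rough sketch of what that BCP approach could look like from the command line (server, database, table, and file names below are all assumptions, and the full WHERE clause from the original post would go in the queryout query):

```
rem Hypothetical sketch -- names are placeholders, not from the thread.
rem Step 1: export the filtered rows from the remote server to a flat file:
bcp "SELECT * FROM SourceDb.dbo.Purchases WHERE checkdate > '2003-03-31'" queryout purchases.dat -c -S RemoteServer -T

rem Step 2: bulk load the file into a load (staging) table on the local server:
bcp LocalDb.dbo.Purchases_Load in purchases.dat -c -S LocalServer -T
```

Any cleanup or transformation can then be done in the local load table with set-based SQL before inserting into the final target table, which keeps the slow cross-server part of the job as a simple bulk copy.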