Please start any new threads on our new
site at https://forums.sqlteam.com. We've got lots of great SQL Server
experts to answer whatever question you can come up with.
mirk_mee_jim
Starting Member
4 Posts

Posted - 2004-02-08 : 03:55:39
Hi there,

I have a DTS package which deletes records from tables (around half a million records in total). Data is fetched from a different SQL Server via a Data Pump task into local temp tables. The temp records imported from the other SQL Server are then copied into the production tables using INSERT INTO statements with different criteria. The problem is that the transaction log fills up during the first part, the deleting of records itself. The restrictions on the server are listed below.

Restrictions
------------
Can't go for BCP, as there is no file write permission.
There is no permission for the TRUNCATE command.
Can't make the transaction log autogrow; its max size is set to 40 MB.

I think that batch execution is a way out, but I have not used it and am not sure. I would be very grateful for any help on this.

Thanks,
Vinod
nr
SQLTeam MVY
12543 Posts

Posted - 2004-02-08 : 07:17:26
How often is the transaction log backed up/truncated?

Sounds like you will have to do your processing in bits, and make sure it runs slowly enough that the transaction log doesn't grow.

==========================================
Cursors are useful if you don't know sql.
DTS can be used in a similar way.
Beer is not cold and it isn't fizzy.
mirk_mee_jim
Starting Member
4 Posts

Posted - 2004-02-09 : 01:40:24
Thanks for the reply.

The transaction logs are backed up every 5 minutes, but as the process runs for so long, the maximum log size is reached before it finishes. If possible, please suggest how to run the process in batches from the DTS package.

Regards,
Vinod
nr
SQLTeam MVY
12543 Posts

Posted - 2004-02-09 : 05:05:59
The delete is easy. I assume it's an Execute SQL task. Instead of

delete tbl where ...

use

set rowcount 1000
select 1
while @@rowcount > 0
    delete tbl where ...
set rowcount 0

If necessary, put a delay (see WAITFOR) inside the loop.

For the imports you will need to import into staging tables in a database set to simple recovery - you can do this by utilising tempdb with global temp tables.

For the inserts, put an identity column on the staging table and loop through it:

declare @rows int, @maxid int, @id int
select @rows = 1000, @maxid = max(id) from ##tbl
select @id = 1
while @id < @maxid
begin
    insert ...
    from ##tbl
    where id between @id and @id + @rows
    select @id = @id + @rows + 1
end

==========================================
Cursors are useful if you don't know sql.
DTS can be used in a similar way.
Beer is not cold and it isn't fizzy.
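To expand on the WAITFOR suggestion, a minimal sketch of the batched delete with a pause between batches (table name, filter, and delay interval are placeholders, not from this thread):

```sql
-- Delete in 1000-row batches, pausing between batches so the
-- 5-minute log backups have a chance to truncate the log.
set rowcount 1000
select 1                              -- prime @@rowcount for the loop test
while @@rowcount > 0
begin
    delete tbl where somecol = 'x'    -- placeholder table and criteria
    waitfor delay '00:00:05'          -- tune the pause to your log backup cycle
end
set rowcount 0                        -- always reset, or later statements are capped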
mirk_mee_jim
Starting Member
4 Posts

Posted - 2004-02-09 : 06:14:40
I will try the suggestion on the DELETE statement, thanks!

I have an issue with importing the other data. The two database servers are in different geographical locations (Malaysia and the US), and I am running the DTS package from the US server. There is no trusted relationship for the login I have been provided with; it has restricted rights. So I can't use a single INSERT statement, i.e. a four-part name, to bring the data over. I'm using a Transform Data Task to import the data from the remote server, and I'm clueless about how to make it download in batches.

Kindly let me know whether it is possible to configure a DTS task to download data in batches, and how.

Regards,
Vinod
nr
SQLTeam MVY
12543 Posts

Posted - 2004-02-09 : 06:49:16
Instead of importing into tables in your database (which is logged), import into global temp tables, which will reside in tempdb and be non-logged (ish), then run a SQL task to insert in batches.

If you don't want to use a Transform Data Task, try OPENQUERY or OPENROWSET in T-SQL.

==========================================
Cursors are useful if you don't know sql.
DTS can be used in a similar way.
Beer is not cold and it isn't fizzy.
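As a sketch of the OPENROWSET approach with a SQL login (no trusted connection needed), where the server name, credentials, and table names are all placeholders and ad hoc queries must be allowed on the local server:

```sql
-- Pull remote rows straight into a global temp table without a linked server.
-- 'RemoteServer', 'someuser', 'somepwd' and RemoteDb.dbo.SourceTable are placeholders.
select *
into ##staging
from openrowset('SQLOLEDB',
                'RemoteServer';'someuser';'somepwd',
                'select col1, col2 from RemoteDb.dbo.SourceTable')
```

The third argument is a pass-through query, so the filtering runs on the remote server and only the matching rows cross the WAN link.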
mirk_mee_jim
Starting Member
4 Posts

Posted - 2004-02-10 : 02:47:31
I tried exporting into a global temp table (##table) in the tempdb database. Initially tempdb's transaction log size was 1 MB, and after the export it had grown to 22 MB. The same happened when I exported into a permanent table in tempdb. If this is the case, then I will get the same problem on the live server, am I right? Please note that I am using SQL Server 2000.

Regards,
Vinod
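For what it's worth, simple recovery (which tempdb always uses) does not switch logging off; it only lets log space be reused after a checkpoint, so a single large insert still has to fit in the log in one piece. Batching the export keeps each transaction small, which is what lets the space be reused. A standard way to watch log usage while testing:

```sql
-- Report log size (MB) and percent of log space used
-- for every database on the instance, including tempdb.
dbcc sqlperf(logspace)
```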