netwerkassist
Starting Member
13 Posts

Posted - 2006-04-23 : 01:59:29
I would like to reduce the size of the production database for development. We have some tables that have over a million rows. I'm trying to delete several years of data from them using a simple DELETE statement (WHERE the date column is less than the year in question). However, on the large tables the transaction log grows very large. Is there a more economical way of achieving this task without having to worry about disk space issues? Appreciate any input.
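For reference, a minimal sketch of the single-statement approach described above; the table and column names (SalesHistory, OrderDate) are hypothetical stand-ins. A DELETE like this runs as one transaction, so every deleted row must be held in the log until the whole statement commits, which is why the log balloons on large tables:

-- Hypothetical single-statement delete: the entire delete is
-- one transaction, so the log cannot be reused until it finishes
DELETE FROM SalesHistory
WHERE OrderDate < '20040101'   -- everything older than 2004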
Kristen
Test
22859 Posts

Posted - 2006-04-23 : 03:37:51
You could delete the rows in a loop, so that you only delete a manageable number of rows on each iteration.

The deleted data will still be logged, but if your database Recovery Model is set to SIMPLE then only the "uncommitted" transactions will be held, so, in effect, the log space will be reused on each iteration.

Change the Recovery Model back to FULL, if you need to, after the deletion (e.g. so you have the ability to recover to a point in time).

DECLARE @intRowsDelete int
SET @intRowsDelete = 1	-- Force first iteration

WHILE @intRowsDelete > 0
BEGIN
    SET ROWCOUNT 10000	-- Size of each batch

    DELETE FROM MyTable
    WHERE MyDate < '20060101'

    SELECT @intRowsDelete = @@ROWCOUNT	-- Actual number deleted

    SET ROWCOUNT 0	-- Reset batch size

    -- Optional delay here if you want to allow other users access to the database
    -- WAITFOR DELAY '000:00:05'
END

Make sure the whole thing isn't running in a transaction block! If so, put some COMMIT statements in there to force a commit on each iteration.

Kristen
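As a side note: on SQL Server 2005 and later, using SET ROWCOUNT with DELETE, INSERT, and UPDATE is deprecated, and the same batching can be written with DELETE TOP (n) instead. A minimal sketch of the equivalent loop, reusing the same hypothetical MyTable/MyDate names from the post above:

DECLARE @intRowsDelete int
SET @intRowsDelete = 1	-- Force first iteration

WHILE @intRowsDelete > 0
BEGIN
    -- TOP (n) caps the batch size; no SET ROWCOUNT needed
    DELETE TOP (10000) FROM MyTable
    WHERE MyDate < '20060101'

    SELECT @intRowsDelete = @@ROWCOUNT	-- Actual number deleted

    -- Optional delay, as above
    -- WAITFOR DELAY '000:00:05'
END

Each DELETE commits on its own (assuming no outer transaction), so under the SIMPLE recovery model the log space can be reused between batches, just as in the SET ROWCOUNT version.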