jezemine
Master Smack Fu Yak Hacker
2886 Posts
Posted - 2012-04-14 : 00:52:06
10 percent processed.
20 percent processed.
30 percent processed.
40 percent processed.
50 percent processed.
60 percent processed.
70 percent processed.
80 percent processed.
90 percent processed.
Processed 227540000 pages for database 'NASNA', file 'NASNA' on file 1.
100 percent processed.
Processed 14 pages for database 'NASNA', file 'NASNA_log' on file 1.
BACKUP DATABASE successfully processed 227540014 pages in 18353.542 seconds (96.856 MB/sec).

Usually I don't back up such large databases, because in my case they store results of calculations that I could repeat if need be. The input to the calculations IS backed up. But in this case I needed a backup. It took 5 hours!

elsasoft.org
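For reference, a command along these lines would produce output like the above. This is a sketch only: the path is made up, the database name comes from the log, and STATS = 10 is what prints a progress line every 10 percent.

-- Minimal sketch of a single-file full backup; path is illustrative.
BACKUP DATABASE [NASNA]
TO DISK = 'X:\Backups\NASNA.bak'
WITH INIT, STATS = 10;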
jezemine
Master Smack Fu Yak Hacker
2886 Posts
Posted - 2012-04-14 : 08:41:04
Wasn't aware of any perf params in the backup command. I'll be sad to hear that WITH GO_SLOW is the default though. :(

elsasoft.org
Michael Valentine Jones
Yak DBA Kernel (pronounced Colonel)
7020 Posts
Posted - 2012-04-16 : 09:03:21
If it is SQL Server 2008 Enterprise Edition, or 2008 R2 Standard or Enterprise Edition, you should set backup compression on. That will usually save a lot of time in the backups.

You should also consider backing up to multiple output files in parallel. I'm not exactly sure why, but they always seem to run faster.

backup database [MyDB]
to disk = '\\MyBackupServer\MyBackupShare\MyDB\MyDB_db_201204152100_001.BAK',
   disk = '\\MyBackupServer\MyBackupShare\MyDB\MyDB_db_201204152100_002.BAK',
   disk = '\\MyBackupServer\MyBackupShare\MyDB\MyDB_db_201204152100_003.BAK',
   disk = '\\MyBackupServer\MyBackupShare\MyDB\MyDB_db_201204152100_004.BAK',
   disk = '\\MyBackupServer\MyBackupShare\MyDB\MyDB_db_201204152100_005.BAK'
with compression, init, stats = 5

CODO ERGO SUM
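One caveat worth adding to the striped approach: every stripe is required at restore time, so the RESTORE has to list all of the files. A sketch, using the same file names as the sample above:

-- Restoring a striped backup must name every stripe.
restore database [MyDB]
from disk = '\\MyBackupServer\MyBackupShare\MyDB\MyDB_db_201204152100_001.BAK',
     disk = '\\MyBackupServer\MyBackupShare\MyDB\MyDB_db_201204152100_002.BAK',
     disk = '\\MyBackupServer\MyBackupShare\MyDB\MyDB_db_201204152100_003.BAK',
     disk = '\\MyBackupServer\MyBackupShare\MyDB\MyDB_db_201204152100_004.BAK',
     disk = '\\MyBackupServer\MyBackupShare\MyDB\MyDB_db_201204152100_005.BAK'
with stats = 5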
jezemine
Master Smack Fu Yak Hacker
2886 Posts
Posted - 2012-04-16 : 12:47:06
Great, thanks everyone. Maybe move this thread out of the corral. I wasn't looking for help, but I certainly got schooled! :)

elsasoft.org
jezemine
Master Smack Fu Yak Hacker
2886 Posts
Posted - 2012-06-06 : 15:02:36
Had to do this again with a somewhat larger db, and this time used WITH COMPRESSION and 8 bak files. The rate was about 2x faster (192 MB/sec vs 97 MB/sec before). The db is 2.6 TB; the backup finished in 3.5 hrs. So thanks for the tips!

BACKUP DATABASE successfully processed 310004810 pages in 12571.842 seconds (192.645 MB/sec).

elsasoft.org
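The exact command for that run isn't shown, but from the description it would have been shaped roughly like this; the paths are hypothetical, with the 8 stripes and COMPRESSION being the two changes from the first run:

-- Hypothetical reconstruction: 8 stripes plus compression, per the description above.
BACKUP DATABASE [NASNA]
TO DISK = 'X:\Backups\NASNA_01.bak', DISK = 'X:\Backups\NASNA_02.bak',
   DISK = 'X:\Backups\NASNA_03.bak', DISK = 'X:\Backups\NASNA_04.bak',
   DISK = 'X:\Backups\NASNA_05.bak', DISK = 'X:\Backups\NASNA_06.bak',
   DISK = 'X:\Backups\NASNA_07.bak', DISK = 'X:\Backups\NASNA_08.bak'
WITH COMPRESSION, INIT, STATS = 10;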
robvolk
Most Valuable Yak
15732 Posts
Posted - 2012-06-06 : 15:48:15
Did you try changing MAXTRANSFERSIZE or BUFFERCOUNT? Are you backing up to a local drive or a SAN?
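Both are real BACKUP options. A sketch with illustrative values follows; the right numbers depend on the I/O path, so treat these as starting points, not recommendations:

-- Illustrative values only. MAXTRANSFERSIZE is capped at 4 MB (4194304 bytes).
BACKUP DATABASE [NASNA]
TO DISK = 'X:\Backups\NASNA.bak'
WITH MAXTRANSFERSIZE = 4194304,  -- bytes per transfer to the backup device
     BUFFERCOUNT = 64,           -- total I/O buffers used for the backup
     INIT, STATS = 10;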
jezemine
Master Smack Fu Yak Hacker
2886 Posts
Posted - 2012-06-07 : 00:28:51
SAN. Don't have local drives that big. Did not specify MAXTRANSFERSIZE or BUFFERCOUNT, just went with the defaults.

elsasoft.org
nigelrivett
Master Smack Fu Yak Hacker
3385 Posts
Posted - 2012-06-07 : 04:21:24
Do you have a lot of static (old) data that can be placed on read-only filegroups?

==========================================
Cursors are useful if you don't know sql.
SSIS can be used in a similar way.
Beer is not cold and it isn't fizzy.
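For anyone unfamiliar with the mechanics, moving static data onto a read-only filegroup looks roughly like this; the filegroup and file names here are hypothetical:

-- Hypothetical names; the key step is marking the archive filegroup READ_ONLY.
ALTER DATABASE [NASNA] ADD FILEGROUP [Archive];
ALTER DATABASE [NASNA] ADD FILE
    (NAME = 'NASNA_Archive', FILENAME = 'X:\Data\NASNA_Archive.ndf')
    TO FILEGROUP [Archive];
-- Move the static tables and indexes onto [Archive] (e.g. rebuild them there),
-- then freeze the filegroup (this requires exclusive access to the database):
ALTER DATABASE [NASNA] MODIFY FILEGROUP [Archive] READ_ONLY;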
jezemine
Master Smack Fu Yak Hacker
2886 Posts
Posted - 2012-06-07 : 13:03:21
Yes, could do that with much of it. Would that matter for backup times?

elsasoft.org
robvolk
Most Valuable Yak
15732 Posts
Posted - 2012-06-07 : 13:12:06
Yeah, if the read-only portion is a decent percentage of the total, you could do filegroup backups of just the live data instead.
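Concretely, SQL Server calls this a partial backup: READ_WRITE_FILEGROUPS backs up PRIMARY plus the read/write filegroups and skips the read-only ones. A sketch, with a hypothetical path:

-- Partial backup: PRIMARY plus all read/write filegroups; read-only data is skipped.
BACKUP DATABASE [NASNA] READ_WRITE_FILEGROUPS
TO DISK = 'X:\Backups\NASNA_partial.bak'
WITH COMPRESSION, INIT, STATS = 10;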
jezemine
Master Smack Fu Yak Hacker
2886 Posts
Posted - 2012-06-07 : 22:25:05
Oh, I get it, yes. This was the first time the db was backed up though, and likely the last.

elsasoft.org