SFogli
Starting Member
3 Posts
Posted - 2010-10-06 : 18:26:11
I have a situation where I need to copy up to 14TB of data from a set of servers to one or two servers. The average file size is about 474KB. The file paths are stored in a SQL 2005 database, so we will be moving only those files we have paths for. We have been presented with a "solution" to use an in-house web service to copy the data, moving an estimated 8GB every 5 hours (which for 14TB works out to about 364.5 days at 24 hrs/day?). I have moved single files of 150GB+ across the network in about 90-120 minutes in the past, so I believe there has to be a faster way.

I would like to get some feedback on an approach, so here goes: since the paths are stored in the DB, I thought about looping through a table (in batches) and sending each copy command to xp_cmdshell for execution. The file servers are W2003 and we would only have read-only access. I can't install anything on the file servers, and no money will be spent on this effort. The SQL Server is W2008R2 with SQL 2005 SP3. Any ideas/thoughts are appreciated.
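As a rough illustration of the approach described above, here is a minimal T-SQL sketch of looping over the stored paths and shelling out one Windows copy per file. The table dbo.FilePaths, its FilePath column, and the destination share \\DestServer\Archive are placeholders rather than anything from the original post; batching, error handling, and logging are left out.

-- Minimal sketch: walk the path table and issue one copy per file via xp_cmdshell.
-- dbo.FilePaths(FilePath) and \\DestServer\Archive are hypothetical names.
DECLARE @path varchar(500), @cmd varchar(1000);

DECLARE path_cursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT FilePath FROM dbo.FilePaths;

OPEN path_cursor;
FETCH NEXT FROM path_cursor INTO @path;

WHILE @@FETCH_STATUS = 0
BEGIN
    SET @cmd = 'copy "' + @path + '" "\\DestServer\Archive\"';
    EXEC master..xp_cmdshell @cmd, no_output;   -- suppress the output rows
    FETCH NEXT FROM path_cursor INTO @path;
END

CLOSE path_cursor;
DEALLOCATE path_cursor;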
tkizer
Almighty SQL Goddess
38200 Posts
Posted - 2010-10-06 : 18:30:16
Using xp_cmdshell would be a good solution for this, although you will be limited to the speed of Windows copy. Many people suggest using other protocols (such as SSH), but we are limited to what is available on the database servers. 14TB is going to be slow with Windows copy, which goes over CIFS/SMB, but it'll be faster than taking a year to copy the files with the in-house solution.

To use xp_cmdshell, the SQL Server service account will need access to the path, or a proxy account will need to be used.

Tara Kizer
Microsoft MVP for Windows Server System - SQL Server
http://weblogs.sqlteam.com/tarad/
Subscribe to my blog
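For reference, a hedged sketch of the setup described above: enabling xp_cmdshell and, only if non-sysadmin logins will call it, creating a proxy account. The domain account name and password are placeholders.

-- xp_cmdshell is off by default; enable it.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'xp_cmdshell', 1;
RECONFIGURE;

-- Only needed for non-sysadmin callers; sysadmin calls run under the SQL Server
-- service account, which must have read access to the file shares.
-- 'DOMAIN\FileCopyUser' and its password are placeholders.
EXEC sp_xp_cmdshell_proxy_account 'DOMAIN\FileCopyUser', 'StrongPasswordHere';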
SFogli
Starting Member
3 Posts
Posted - 2010-10-06 : 18:46:46
Thanks Tara. I do have a concern that as the stored procedure loops through the table and sends the commands out to DOS, it may overwhelm the file server/SQL Server, since it would be about 100k requests over about 25 seconds. Should I include a pause between loops?
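One way such a pause could look, sketched below: process the table in batches and WAITFOR DELAY between them. The Copied flag column, batch size, and delay value are assumptions layered on the hypothetical dbo.FilePaths table from the earlier sketch.

-- Throttled batches: copy up to @batchSize files, pause, repeat.
DECLARE @batchSize int, @path varchar(500), @cmd varchar(1000);
SET @batchSize = 500;                               -- files per batch; tune as needed

WHILE EXISTS (SELECT 1 FROM dbo.FilePaths WHERE Copied = 0)
BEGIN
    DECLARE batch_cursor CURSOR LOCAL FAST_FORWARD FOR
        SELECT TOP (@batchSize) FilePath FROM dbo.FilePaths WHERE Copied = 0;

    OPEN batch_cursor;
    FETCH NEXT FROM batch_cursor INTO @path;
    WHILE @@FETCH_STATUS = 0
    BEGIN
        SET @cmd = 'copy "' + @path + '" "\\DestServer\Archive\"';
        EXEC master..xp_cmdshell @cmd, no_output;
        UPDATE dbo.FilePaths SET Copied = 1 WHERE FilePath = @path;
        FETCH NEXT FROM batch_cursor INTO @path;
    END
    CLOSE batch_cursor;
    DEALLOCATE batch_cursor;

    WAITFOR DELAY '00:00:05';                       -- breathing room between batches
END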
tkizer
Almighty SQL Goddess
38200 Posts
SFogli
Starting Member
3 Posts
Posted - 2010-10-06 : 18:58:37
Thanks again. I will update the post with the results.
tkizer
Almighty SQL Goddess
38200 Posts
robvolk
Most Valuable Yak
15732 Posts
Posted - 2010-10-06 : 19:29:34
I'd recommend Robocopy; I believe it's a standard tool with Win2K8. It is much faster than regular copy/xcopy and can do restartable copy operations. I've used it to copy 2 TB of data in about 2-1/2 days over a LAN (not a WAN).

I'd also suggest exporting the file names to a text file and processing them with a for loop from the command line, rather than using xp_cmdshell. That way you don't impact SQL Server with a pure file copy operation. With a little work you could concatenate the Robocopy statement, source and destination paths, and file name, and export that to a batch file you can run later.
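The batch-file idea above could be sketched in T-SQL like this: build one Robocopy line per file and export the result set (via bcp or SSMS "Results to File") into a .cmd script to run later. The dbo.FilePaths table and \\DestServer\Archive destination are the same placeholders as before; /Z asks for restartable mode, and /R and /W shorten the retry waits.

-- Generate one Robocopy command per file. Robocopy takes the source directory,
-- destination directory, and file name as separate arguments, so the full path
-- is split at its last backslash.
SELECT 'robocopy "'
       + LEFT(FilePath, LEN(FilePath) - CHARINDEX('\', REVERSE(FilePath)))
       + '" "\\DestServer\Archive" "'
       + RIGHT(FilePath, CHARINDEX('\', REVERSE(FilePath)) - 1)
       + '" /Z /R:2 /W:5'
FROM dbo.FilePaths;

Note that dropping everything into a single destination folder risks file-name collisions; in practice you would probably want to recreate the source folder structure under the destination instead.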
tkizer
Almighty SQL Goddess
38200 Posts