AskSQLTeam
Ask SQLTeam Question
0 Posts
Posted - 2006-07-21 : 09:42:21
Ahmad writes "Dear Sir, greetings. I would like an application, or built-in functionality in SQL Server, that lets me transfer specified table records daily from around 10 servers in different countries to a main server in another country, keeping the operation secure and preserving data integrity. The volume I am talking about is roughly 50,000 records per day at minimum and 500,000 at maximum. The table I will be sending records from is the same in all countries; it has just 11 fields, most of them char with a length of 10. I need your help with the best solution model for this situation, and the best technology to use. Thanks"
coolerbob
Aged Yak Warrior
841 Posts
Posted - 2006-07-21 : 12:57:29
Here's the best I have:

Use BCP to dump the tables to disk. This can be triggered by a scheduled SQL job in the SQL Agent if you wish. The next two steps can live in one EXE and can be called as the second step of the SQL job.
1) You could then use the latest encryption facilities in .NET to make those files secure.
2) Each of your servers can then copy the encrypted file to an FTP site on your main server.
Then simply decrypt the files and bulk insert them. Job done (bar a few glitches...).
(2005 Integration Services may help with some of this - but you can easily do it without.)
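A rough sketch of the export and import ends of that recipe, with the encryption and FTP steps happening in between. Every name here - database, table, file paths, the date filter - is an illustrative assumption, not something from the thread:

-- On each country server: dump the rows to a flat file with BCP.
-- (xp_cmdshell must be enabled; a CmdExec step in the SQL Agent job
-- works too. The date literal stands in for "rows not yet sent".)
EXEC master..xp_cmdshell
    'bcp "SELECT * FROM MyDb.dbo.CountryData WHERE BatchDate >= ''2006-07-20''" queryout C:\Export\CountryData.bcp -c -T'

-- On the main server, after the file has been FTP'd across and decrypted:
BULK INSERT MyDb.dbo.CountryData_Staging
FROM 'C:\Import\CountryData.bcp'
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n')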
Tech2007
Starting Member
2 Posts
Posted - 2006-07-24 : 03:53:08
Thanks man, but I will be doing this on a daily basis. What are your suggestions? Ahmad
Tech2007
Starting Member
2 Posts
Posted - 2006-07-24 : 03:58:40
Dear Sir, I need to ask whether this works on a daily basis. Can you also tell me what you mean by using BCP to dump tables to disk - do you mean the whole table each time, or just the rows a query selects? Thanks, Ahmad
jen
Master Smack Fu Yak Hacker
4110 Posts
Posted - 2006-07-24 : 04:35:19
+1 coolerbob

When you do the BCP, make sure the 'upload' time into the database is reasonable. I have had to handle a 500 MB to 1 GB file before, and it took more than 5 hours to upload the data - about 16 million records of mixed UPDATEs, DELETEs and INSERTs. The database got 'locked' because of the enormous client requests and the data upload happening at the same time - what a task. Remember to do this during off-peak hours if the main database table is in use.

--------------------
keeping it simple...
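One way to soften the locking jen describes is to commit the load in smaller batches, so no single transaction holds locks (or log space) for hours. A minimal sketch, assuming an invented staging table and file path:

-- BATCHSIZE commits every 10,000 rows, keeping each transaction and its
-- locks small; loading into a staging table rather than the live table
-- also keeps client queries out of the way while the file streams in.
BULK INSERT MyDb.dbo.CountryData_Staging
FROM 'C:\Import\CountryData.bcp'
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n', BATCHSIZE = 10000)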
Kristen
Test
22859 Posts
Posted - 2006-07-24 : 06:11:42
Couldn't you use replication?

If you are going to use BCP (which I think is a good idea), some thoughts:

You need to protect against the task failing to happen one day.

We tend to use "Staging" tables for this type of job. We insert "new/modified" rows into the staging table, with an additional "date/time" column, and then use the date/time as the cutoff for the transfer to the Main Server.

So we check the Main Server for the latest date it has, and then export everything with a newer date. If the file doesn't get processed tomorrow it will just export a bigger file until the Main Server catches up again.

We have also needed to do this in "small-ish" batches in the past, and for that we also included the PK columns. (An example is transferring orders to the Main Server; it runs four times an hour and takes about 10 minutes to do 500 orders, so we restrict the max number of orders per run to 500, and then on busy days the system "catches its tail" during the night.)

SELECT TOP 1000 *
FROM MyTable
WHERE MyBatchDate >= @MostRecentRemoteDate
  AND (   MyPK1 > @MostRecentRemotePK1
       OR (MyPK1 = @MostRecentRemotePK1 AND MyPK2 > @MostRecentRemotePK2 ...)
      )

Alternatively you could have a BatchNumber column on the table, and update that from NULL to the next available number at the start of the export - which you can restrict to a batch size if necessary:

UPDATE U
SET MyBatchNumber = @NextBatchNumber
FROM MyTable AS U
WHERE MyPK IN (SELECT TOP 1000 MyPK
               FROM MyTable
               WHERE MyBatchNumber IS NULL
               ORDER BY MyUpdateDateOrID)

then export any rows WHERE MyBatchNumber = @NextBatchNumber.

You then just need a way to retransmit a failed batch - which could be as simple as resetting that batch number to NULL:

UPDATE U
SET MyBatchNumber = NULL
FROM MyTable AS U
WHERE MyBatchNumber = 1234

so that it gets reallocated to the next batch number next time around. (You then have missing batch numbers at the far end, which might be a bad idea when trying to work out what happened.)

Kristen
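A minimal sketch of what the MyTable staging table in those queries might look like. The PK and data columns are placeholders chosen to match the names above, not a schema from the thread:

-- Staging table on each source server: MyBatchDate drives the cutoff
-- logic, and the nullable MyBatchNumber marks rows not yet exported.
CREATE TABLE dbo.MyTable
(
    MyPK1         int      NOT NULL,
    MyPK2         int      NOT NULL,
    SomeData      char(10) NULL,      -- placeholder for the 11 data fields
    MyBatchDate   datetime NOT NULL DEFAULT GETDATE(),
    MyBatchNumber int      NULL,      -- NULL = not yet exported
    CONSTRAINT PK_MyTable PRIMARY KEY (MyPK1, MyPK2)
)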
jen
Master Smack Fu Yak Hacker
4110 Posts
Posted - 2006-07-25 : 05:09:56
Replication may not be the better option: 10 servers in other countries --> main server, so he is trying to consolidate the data into one server only. Based on my limited experience, this is really tough to maintain, especially if publication fails on one publisher; most often I had to delete the entire table and resynchronize everything again, meaning a new snapshot has to be generated.

Unless...

source tables (tables that hold daily rows only) --> temp table in main server --> transfer to permanent table off-peak hours, then truncate the source tables --> cascaded down to the destination

Unless he needs a realtime reflection of today's and previous days' rows, which can be tricky, but that's another question altogether.

I like your approach Kristen, very proactive.

--------------------
keeping it simple...
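The consolidation flow jen sketches, written out as rough T-SQL on the main server. Both table names are assumptions for illustration:

-- Off-peak: move the day's consolidated rows from the holding table
-- into the permanent table, then clear the holding table for tomorrow.
BEGIN TRANSACTION

INSERT INTO dbo.CountryData_Main
SELECT *
FROM dbo.CountryData_Temp

TRUNCATE TABLE dbo.CountryData_Temp  -- TRUNCATE is transactional in SQL Server

COMMIT TRANSACTION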
Kristen
Test
22859 Posts
Posted - 2006-07-25 : 06:04:25
"i like your approach Kristen, very proactive"I shouldn't have posted it until a week after my "Try replication" posting, eh?!!Kristen |
jen
Master Smack Fu Yak Hacker
4110 Posts
Posted - 2006-07-25 : 08:54:37
--------------------
keeping it simple...