
 Copying an FTP'd 100 MB file, and running stored procedures


AskSQLTeam
Ask SQLTeam Question

0 Posts

Posted - 2004-05-25 : 08:43:49
Peeush writes "Hi,
I plan to use a batch file that will invoke several stored procedures.

The job at hand looks like this:

1) There is a 100 MB (350,000-record) flat file that will be FTP'd from the host to the SQL Server machine.

2) I plan to use BCP to load this large file into a temporary table (already created), running in batches (see the sketch after this list).

3) Write further stored procedures to pick up the data from the temporary table, manipulate it, and perform the insert/update operations.

4) Use the PRINT command between stored procedures to write regular log output while the job runs.

5) If I encounter an error during step 3, write the offending record back out to a flat file.
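
A minimal T-SQL sketch of steps 2, 3, and 5, assuming a staging table dbo.import_staging, a destination table dbo.target_table, and a reject table dbo.import_rejects; all object, column, and file names here are illustrative, and BULK INSERT is shown as the in-server equivalent of bcp:

-- Step 2: load the flat file in 10,000-row batches so each batch
-- commits separately instead of one 350,000-row transaction.
BULK INSERT dbo.import_staging
FROM 'D:\ftp\incoming\datafile.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', BATCHSIZE = 10000)

-- The equivalent bcp call from the batch file would look like:
--   bcp MyDb.dbo.import_staging in D:\ftp\incoming\datafile.txt
--       -S MyServer -T -c -t, -b 10000

-- Step 3: one of the ~12 insert/update passes. Update rows that
-- already exist, then insert the rest (SQL Server 2000 has no MERGE).
UPDATE t
SET    t.some_col = s.some_col
FROM   dbo.target_table t
JOIN   dbo.import_staging s ON s.key_col = t.key_col

INSERT dbo.target_table (key_col, some_col)
SELECT s.key_col, s.some_col
FROM   dbo.import_staging s
WHERE  NOT EXISTS (SELECT 1 FROM dbo.target_table t
                   WHERE t.key_col = s.key_col)

-- Step 5: rows that fail a validation rule go to a reject table,
-- which can be bcp'd back out to a flat file at the end of the run.
INSERT dbo.import_rejects (key_col, some_col, reject_reason)
SELECT s.key_col, s.some_col, 'key_col is null'
FROM   dbo.import_staging s
WHERE  s.key_col IS NULL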

Questions I am seeking answers to:
a) From a performance point of view, roughly how long would the entire operation take? Assume that step 3 involves around 12 insert/update operations. Even a very rough estimate would be helpful.

b) Would it be possible to start step 3 in the middle of step 2 so that time could be saved?

c) What would be more efficient?
- Writing a batch file that invokes all the stored procedures?

- Writing a master procedure that invokes all the stored procedures?



Also, the above is just my idea of how to do the work. If anybody has a suggestion that would help performance, such as using Visual Basic and a Recordset instead of a temporary table, please let me know.

Again, I am very thankful to such an educated crowd for reading my question and suggesting answers."

timmy
Master Smack Fu Yak Hacker

1242 Posts

Posted - 2004-05-25 : 19:48:25
Peeush,

a) The performance would depend on the specs of your server and, to a lesser extent, the speed of your network connection. It would also depend on the number of indexes (indices?) on the destination tables. But given the size of the file, I think half an hour is not an unreasonable timeframe.
b) You could; it depends on how you invoke the SPs (which leads to...)
c) I don't think the efficiency would differ much between the two, but using a master SP would offer more flexibility as far as question b is concerned. Have you also considered doing this with a DTS package?

Tim

derrickleggett
Pointy Haired Yak DBA

4184 Posts

Posted - 2004-05-25 : 20:11:00
b) Would it be possible to start step 3 in the middle of step 2 so that time could be saved?
--bcp is so fast for recordsets of the size you're talking about that I wouldn't consider this. Why take a chance?
--Also, you would run into contention issues.

c) What would be more efficient?
--I prefer to write everything in a master stored procedure. You can then just schedule this with a SQL Agent Job.
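
A skeleton of such a master procedure, with illustrative step-procedure names; SQL Server 2000 has no TRY/CATCH, so each step's return code is checked before moving on:

CREATE PROCEDURE dbo.usp_master_import
AS
BEGIN
    DECLARE @rc int

    -- Step 2: bulk load the staging table
    EXEC @rc = dbo.usp_load_staging
    IF @rc <> 0 GOTO step_failed

    -- Step 3: the insert/update passes
    EXEC @rc = dbo.usp_upsert_from_staging
    IF @rc <> 0 GOTO step_failed

    RETURN 0

step_failed:
    -- severity 16 makes the SQL Agent job step report failure
    RAISERROR ('usp_master_import: a step returned %d', 16, 1, @rc)
    RETURN @rc
END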

Instead of capturing the output of PRINT statements, create a log table and write to it throughout the process.
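
A minimal sketch of that approach, again with illustrative names; because the log rows are committed as the job runs, progress can be queried from another connection while the job is still executing:

CREATE TABLE dbo.job_log (
    log_id    int IDENTITY (1, 1) PRIMARY KEY,
    logged_at datetime NOT NULL DEFAULT GETDATE(),
    step_name varchar(128) NOT NULL,
    message   varchar(500) NOT NULL
)

-- Inside each stored procedure, instead of PRINT:
DECLARE @rows int

UPDATE dbo.target_table
SET    some_col = some_col           -- stands in for a real step-3 pass
SELECT @rows = @@ROWCOUNT            -- capture before it is reset

INSERT dbo.job_log (step_name, message)
VALUES ('usp_upsert_from_staging',
        'update pass finished, ' + CONVERT(varchar(10), @rows) + ' rows')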

MeanOldDBA
derrickleggett@hotmail.com

When life gives you a lemon, fire the DBA.