
 Performance degrades when stored proc executed repeatedly


AskSQLTeam
Ask SQLTeam Question

0 Posts

Posted - 2003-06-09 : 08:40:05
Mike writes "Using SQL Server 2000 and Win2000 SP2, I am using a stored proc attached to a set of triggers to insert a record into a table based on data from the tables containing the triggers. The procedure takes a single argument as input. Let's call this the 'refresh' proc.

This works well for a single record - now I need to populate the target table based on the current contents of the source tables.

So I created another stored proc that generates a cursor containing one record for each argument I want to pass to the 'refresh' proc. I then looped through this cursor, calling the 'refresh' proc for each of its rows.
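
A rough sketch of the sort of loop I mean (illustrative names only - dbo.SourceTable, KeyColumn and dbo.RefreshProc stand in for the real objects):

    -- Sketch only: object names are stand-ins for the real tables and proc
    DECLARE @key int

    DECLARE key_cursor CURSOR FOR
        SELECT KeyColumn FROM dbo.SourceTable

    OPEN key_cursor
    FETCH NEXT FROM key_cursor INTO @key

    WHILE @@FETCH_STATUS = 0
    BEGIN
        EXEC dbo.RefreshProc @key            -- one 'refresh' call per row
        FETCH NEXT FROM key_cursor INTO @key
    END

    CLOSE key_cursor
    DEALLOCATE key_cursor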

This was fine on the test table of about 1000 rows. However, the live one could contain up to a million. The first few thousand rows run quite quickly (300-500 per minute) but performance then degrades rapidly to the point where "rows per minute" threatens to become "minutes per row".

The rows are not large (10-20 columns, typically). I am not using indexing (intending to apply indexes after building the target table). Logging is on for the database at present; I can get it turned off if necessary.

Any ideas on what is causing the degradation and how to get round it would be very welcome. I've used SQL for a long time, but only recently got into stored procedures - so I suspect there may be a simpler, better way of doing things than my cursor/stored proc approach. It was just useful to be able to use the same logic for the populate and "refresh" processes.

Thanks in anticipation"

X002548
Not Just a Number

15586 Posts

Posted - 2003-06-09 : 11:12:13
quote:

I am using a stored proc attached to a set of triggers



Nope, it doesn't work that way...a trigger is attached to a table, and fires for the action (INSERT, UPDATE or DELETE) that occurs on it.

quote:

to insert a record into a table based on data from the tables containing the triggers.



Ok no big deal

quote:

The procedure takes a single argument as input. Let's call this the 'refresh' proc.

This works well for a single record - now I need to populate the target table based on the current contents of the source tables.



Triggers work on the affected rows via the inserted and deleted virtual tables...you can do pretty much whatever you want inside one, but how do you know which rows you want to refer to?
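
For example (made-up names, since we haven't seen your tables), an insert trigger would normally read the affected rows straight from inserted:

    -- Illustrative only: dbo.SourceTable and dbo.TargetTable are made-up names
    CREATE TRIGGER trg_SourceTable_ins ON dbo.SourceTable
    FOR INSERT
    AS
        -- inserted holds every row affected by the statement, not just one
        INSERT INTO dbo.TargetTable (KeyColumn, SomeValue)
        SELECT i.KeyColumn, i.SomeValue
        FROM inserted i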

quote:

So I created another stored proc that generates a cursor containing one record for each argument I want to pass to the 'refresh' proc. I then looped through this cursor, calling the 'refresh' proc for each of its rows.



Oh, the smell of cursors in the morning...smells like...like...victory...

NOT

quote:

This was fine on the test table of about 1000 rows. However, the live one could contain up to a million. The first few thousand rows run quite quickly (300-500 per minute) but performance then degrades rapidly to the point where "rows per minute" threatens to become "minutes per row".



Ahh the plot unfolds....

quote:

The rows are not large (10-20 columns, typically). I am not using indexing (intending to apply indexes after building the target table).



Everything is relative

quote:

Logging is on for the database at present; I can get it turned off if necessary.



ROLLBACK...we don't need no stinkin' ROLLBACK

quote:

Any ideas on what is causing the degradation and how to get round it would be very welcome.



You already answered it yourself

quote:

I've used SQL for a long time, but only recently got into stored procedures - so I suspect there may be a simpler, better way of doing things than my cursor/stored proc approach. It was just useful to be able to use the same logic for the populate and "refresh" processes.



Define "LONG". Post your Sproc and triggers so we can have a look. That would be the bset way.

quote:

Thanks in anticipation



A Carly Simon fan?




Brett

8-)

Edited by - x002548 on 06/09/2003 11:13:29

Edited by - x002548 on 06/09/2003 11:14:26

nr
SQLTeam MVY

12543 Posts

Posted - 2003-06-09 : 11:45:57
>> This was fine on the test table of about 1000 rows.
Most bad code does - that doesn't mean anything, as at that size it would all be done in memory.

>> I am not using indexing (intending to apply indexes after building the target table).
Read up in BOL (Books Online) about what indexes do and you will see why this works well with 1000 rows and gets progressively slower. This is probably the solution to the problem.
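
For example (guessing at names, since we haven't seen the proc), if the refresh proc looks rows up by a key column then indexes like these are what stop every lookup from scanning the whole table:

    -- Hypothetical names: index whatever the refresh proc actually joins or filters on
    CREATE INDEX IX_SourceTable_KeyColumn ON dbo.SourceTable (KeyColumn)
    CREATE INDEX IX_TargetTable_KeyColumn ON dbo.TargetTable (KeyColumn)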

>> Logging is on for the database at present; I can get it turned off if necessary.
No you can't - read about the transaction log in BOL.

>> there may be a simpler, better way of doing things than my cursor/stored proc approach.
See below.
Never using a cursor is a reasonable guideline.
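
Something set-based along these lines (made-up names again) populates the target in one statement instead of a million proc calls:

    -- Hypothetical sketch of a set-based populate, replacing the cursor loop
    INSERT INTO dbo.TargetTable (KeyColumn, SomeValue)
    SELECT s.KeyColumn, s.SomeValue
    FROM dbo.SourceTable s
    WHERE NOT EXISTS (SELECT * FROM dbo.TargetTable t
                      WHERE t.KeyColumn = s.KeyColumn)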

Try it with appropriate indexes and it will probably be ok. You should update statistics and maybe rebuild the indexes after it has been running for a time.
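
e.g. (table name is made up):

    -- Refresh the optimiser's view of the table once it has grown
    UPDATE STATISTICS dbo.TargetTable
    DBCC DBREINDEX ('dbo.TargetTable')       -- rebuilds all indexes on the table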

>> I've used SQL for a long time, but only recently got into stored procedures.
Hope that wasn't for querying SQL Server.

==========================================
Cursors are useful if you don't know sql.
DTS can be used in a similar way.
Beer is not cold and it isn't fizzy.
   
