
 Running a job continuously - performance concerns


bschweitzer
Starting Member

7 Posts

Posted - 2008-09-26 : 15:47:32
I have a job that I want to run when the server starts up; it contains an endless loop that checks a view for new records and updates them when it finds any. My question: are there any performance issues I need to be aware of if I just let this run?

Here's the heart of the code where it loops:
quote:

' Open a forward-only, read-only recordset over the view of unprocessed transactions
Set rsTransactions = CreateObject("ADODB.Recordset")
rsTransactions.ActiveConnection = MM_WREA_STRING
rsTransactions.Source = mySQL
rsTransactions.CursorType = 0      ' adOpenForwardOnly
rsTransactions.CursorLocation = 2  ' adUseServer
rsTransactions.LockType = 1        ' adLockReadOnly

j = 1
While j < 2   ' j never changes, so this loops forever
    rsTransactions.Open()
    rsTransactions_numRows = 0

    Repeat2__numRows = -1
    Repeat2__index = 0
    rsTransactions_numRows = rsTransactions_numRows + Repeat2__numRows

    ' Flag each unprocessed transaction as processed
    While Not rsTransactions.EOF
        tid = rsTransactions.Fields.Item("TransID").Value
        updatetable.CommandText = "Update t_Transactions Set Processed = 1 where TransID = " & tid
        updatetable.Execute()
        updatecount = updatecount + 1
        rsTransactions.MoveNext()
    Wend

    rsTransactions.Close()

    ' Empty counting loop used as a crude pause before polling again
    For x = 1 To 50000
    Next
Wend



Does anyone see a problem with this? I'd hate to hang our server.

Thanks,
Beth

tkizer
Almighty SQL Goddess

38200 Posts

Posted - 2008-09-26 : 15:51:00
Why don't you just use a trigger?

Tara Kizer
Microsoft MVP for Windows Server System - SQL Server
http://weblogs.sqlteam.com/tarad/

Subscribe to my blog

tkizer
Almighty SQL Goddess

38200 Posts

Posted - 2008-09-26 : 15:55:36
Just noticed the mySQL in your code. Is your question about MySQL or SQL Server?

Tara Kizer
Microsoft MVP for Windows Server System - SQL Server
http://weblogs.sqlteam.com/tarad/

Subscribe to my blog

bschweitzer
Starting Member

7 Posts

Posted - 2008-09-26 : 16:03:16
The mySQL in my code is just a variable name for the SQL statement string - I'm using SQL Server.

I'm kind of new and don't know about triggers. Can you point me to some help on that? That's the kind of thing I was looking for. Do you set them up as part of the job parameters?

Thanks,
Beth

quote:
Originally posted by tkizer

Just noticed the mySQL in your code. Is your question about MySQL or SQL Server?

Tara Kizer
Microsoft MVP for Windows Server System - SQL Server
http://weblogs.sqlteam.com/tarad/

Subscribe to my blog



tkizer
Almighty SQL Goddess

38200 Posts

Posted - 2008-09-26 : 16:06:42
Triggers go on a table. You can read up on triggers in SQL Server Books Online. There are insert, update, and delete triggers. Each time one of those happens and you've got a trigger for it, it'll fire and run the trigger code.

It sounds like you want an insert trigger.
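
For illustration only, a minimal sketch of what an insert trigger looks like. The table and column names are borrowed from the code posted above, and the body is just a placeholder for whatever work the trigger should actually do:

CREATE TRIGGER trg_Transactions_Insert
ON dbo.t_Transactions
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;
    -- "inserted" is the pseudo-table holding the rows that were just added
    UPDATE t
    SET t.Processed = 1
    FROM dbo.t_Transactions AS t
    INNER JOIN inserted AS i ON i.TransID = t.TransID;
END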

Tara Kizer
Microsoft MVP for Windows Server System - SQL Server
http://weblogs.sqlteam.com/tarad/

Subscribe to my blog

bschweitzer
Starting Member

7 Posts

Posted - 2008-09-26 : 16:22:41
Ah - I found the section on triggers. They seem like very useful things, but unfortunately won't work for me here. The problem is that the table with the new records is in a database on our external server, but the processing has to be done on our internal, behind the firewall server.

So the code I posted runs on our internal SQL Server, but checks a table on our external server. It then sets off some processes on our internal server (like launching InDesign with a data merge document), and finally sets the database record to "processed" when it's done (again on the external server).

While I'd love to use the trigger method, it doesn't seem possible, because SQL Server on the external machine would not be able to address the internal server in order to run the InDesign process.

Back to the original question - does anyone see any reason why my continuous loop method would cause a problem?

Thanks,
Beth

LoztInSpace
Aged Yak Warrior

940 Posts

Posted - 2008-09-28 : 11:21:41
If you can read the database from your internal server, then you can put a trigger on it.
I never thought I'd say this, but use a trigger. Code like this is bad on so many levels:
- you're pulling data in and processing it on yet another box, then spitting it out
- and then trying to slow it down with loops!
- it's crap because it uses dynamic SQL
- it's crap because it doesn't use sets but an update in a loop (try a single update statement: Update t_Transactions Set Processed = 1 where TransID in (...mySQL...) - see the sketch below)
- this update does not appear to be in any transaction with whatever set the status to 'processed' in the first place
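
A rough sketch of that set-based statement, assuming the view behind mySQL is called something like v_NewTransactions (the view name is made up for illustration):

-- One statement marks every waiting row, however many there are
UPDATE t_Transactions
SET Processed = 1
WHERE TransID IN (SELECT TransID FROM v_NewTransactions);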

All up, this is a very, very bad solution. Back to the original question - I suggest you take advice from the experts on this one.

saurabhsrivastava
Posting Yak Master

216 Posts

Posted - 2008-09-28 : 16:51:33
Performance depends entirely on resource utilization on the internal server, the external server, and the network. If you know how many rows are updated/inserted/deleted on the source server, then you can set up a baseline and start from there. There are several methods to perform this operation, and their suitability depends entirely on your environment.
Have you tried running the job once?
How long did it take to process the records?
How many records did it process in one execution?
Possible solutions:
SSIS
Log shipping
Replication (snapshot)
Database mirroring (SQL Server 2005 or SQL Server 2008)
Database snapshot (SQL Server 2005 or SQL Server 2008)

Evaluate them, compare the performance of each method, and go with the best performer.

LoztInSpace
Aged Yak Warrior

940 Posts

Posted - 2008-09-28 : 22:03:19
I think you've missed quite how bad this solution is.

bschweitzer
Starting Member

7 Posts

Posted - 2008-09-29 : 17:26:43
LoztInSpace: I posted this question because I thought what I was doing might have performance issues, but I don't know enough about SQL Server to know whether that's true. I have limited knowledge of SQL Server and there is no one else at my company I can ask about these things.

I appreciate your answer, but the problem is that, while I could create a trigger on the table in question, I can't update the database on the other server from that server. Maybe if I spell out the problem more clearly, you could think of a better solution:

a) We have a SQL table on an external web server that gets updated when a client requests a custom order via the web.

b) When that happens, we want to use our internal server to grab the client's information from the external database, create a custom file for that client (using software that only exists on the internal server), then mark that order as complete and send the customer an email.

So even though I could execute a trigger on the external server, I don't see how it could do anything because it can't get to the internal server. The processing has to be done on the internal server end and there's nothing to trigger from that end.

Does this make sense?

We're not talking about thousands or even hundreds of transactions in a day. More like 20-50 max. I've had it running since Friday and there doesn't seem to be any impact on the server otherwise, but I wanted to find out if this was a bad practice and if there is a better alternative.

I am open to suggestions and would appreciate any other ideas. Please keep in mind that I have very limited experience with SQL server and am really trying to find the right way to do this.

Oh, and by the way, I don't know what you mean by using a set. The code I posted is really dummy code representing the idea of what I'm doing, not all the nitty-gritty. I'm actually doing more processing than is shown here, and then I mark the record as processed so it doesn't get reprocessed. The code opens a recordset from a view of all records in the table with the processed flag turned off, so that inner while loop usually does nothing until a new record gets added.

I put the for loop in just to keep it from constantly trying to open the view without taking a breath, thinking that might monopolize the CPU.

Thanks for all your help.

AndrewMurphy
Master Smack Fu Yak Hacker

2916 Posts

Posted - 2008-10-01 : 04:08:59
If you do continue down this route, note that
"for x = 1 to 50000 next" will be a CPU hog. You should look to incorporate some WAIT processing.

Also, the reference to SET processing refers to the general technique of coding SQL Server to update loads of records in one go. If you have 100 records to be updated in a table, you have two basic choices: read one record from the table 100 times and update each single record 100 times, or else read the 100 records from the table as a SET just once and then update that SET of 100 records just once. The latter approach is far more efficient. Think of it as filling a tea-cup with sugar using tweezers versus using a spoon. Search here for more discussions/examples on SET BASED PROCESSING. It'll open you up to being a far more effective SQL Server programmer/DBA.
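
To make both suggestions concrete, here is a rough T-SQL sketch - the WHERE clause and the 30-second delay are only illustrative, not taken from the original job:

WHILE 1 = 1
BEGIN
    -- one set-based statement handles however many rows are waiting (0, 1 or 10)
    UPDATE t_Transactions SET Processed = 1 WHERE Processed = 0;

    -- pause instead of spinning through an empty FOR loop
    WAITFOR DELAY '00:00:30';
END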

bschweitzer
Starting Member

7 Posts

Posted - 2008-10-01 : 11:17:30
Thank you Andrew!

I don't think SET will help me, because there will usually be only one record at a time awaiting processing. The loop is there just in case another one slips in at the same time, but the absolute most I would expect to see is 10 at once. In addition, there is a lot more processing going on with each record than just setting the processed flag - I just omitted it to keep the example simple. But I can see where SET would be a useful method in the right circumstances.

Your comment on WAIT processing sent me searching again, however, and I found the WAITFOR statement in the SQL help files. That's what I had been looking for earlier, but I was looking for something similar in Visual Basic and couldn't find anything like it. If I change the for/next loop at the end to execute a command containing WAITFOR DELAY '000:00:30' (to wait 30 seconds) before looping back to check for new records, would that help performance?

Thanks!

Beth

AndrewMurphy
Master Smack Fu Yak Hacker

2916 Posts

Posted - 2008-10-02 : 11:23:37
Search here for other comments on WAITFOR... I haven't used it myself, but from what I've read it looks like a good function.

Also, a SET can contain just 1 element... but the good thing is that it can be 10 million as well, and the logic will work just the same.

Lumbago
Norsk Yak Master

3271 Posts

Posted - 2008-10-02 : 14:20:56
Can you initiate the internal work that needs to be done from within SQL Server? If so, you can create a stored procedure for the entire process, something like this:
CREATE PROCEDURE [dbo].[procedure]
AS
BEGIN
    WHILE (1 > 0)
    BEGIN
        IF EXISTS (SELECT * FROM linkedserver.dbname.dbo.tablename WHERE SomeFlag = 1)
        BEGIN
            --> Do whatever needs to be done
        END

        WAITFOR DELAY '00:00:30'
    END
END
I don't see how this can be so bad... the only problem is of course when this job crashes or the link goes down for some reason. I guess you could add a TRY...CATCH block in here as well - something like the sketch below.
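
A rough sketch of where the TRY...CATCH could go (the linked-server names are the same placeholders as above, and the error handling is only illustrative):

WHILE (1 > 0)
BEGIN
    BEGIN TRY
        -- the real work against the linked server goes here, e.g.
        UPDATE linkedserver.dbname.dbo.tablename SET SomeFlag = 0 WHERE SomeFlag = 1
    END TRY
    BEGIN CATCH
        -- log the error and keep looping so a dropped link doesn't kill the job
        PRINT ERROR_MESSAGE()
    END CATCH

    WAITFOR DELAY '00:00:30'
END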

- Lumbago

LoztInSpace
Aged Yak Warrior

940 Posts

Posted - 2008-10-03 : 08:18:29
You do not need the IF EXISTS...
Just do your work. If no rows match then you do nothing. There is no point in checking it first.

bschweitzer
Starting Member

7 Posts

Posted - 2008-10-03 : 08:52:52
Thank you everyone! I think I can make it work now, using the WAITFOR command instead of the loop.

I appreciate everyone's help in this - it's a great forum :-)

Beth
   
