PeterG
Posting Yak Master
156 Posts

Posted - 2004-04-16 : 17:07:56
I have a web application that uses ASP and SQL Server. In one of my web pages I send the parameters to the stored procedure as a CSV string, so everything is passed to the database in a single call. The reason I'm doing this is that I only have to call the stored procedure once instead of calling it several times.

Someone said that it's actually the connection to the database that takes the performance hit, so if I stay connected I can call the stored procedure multiple times and be fine. Since I put the parameters into a CSV string and then parse and process them in a loop anyway, it might be better to pass the parameters one at a time and call the stored procedure as many times as needed.

Will anybody shed light on this, please? Thanks.

Peter
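For reference, the CSV approach described above usually looks something like the following sketch. The procedure and column names are hypothetical (not from this thread), and the split loop uses only CHARINDEX/SUBSTRING since SQL Server 2000 has no built-in split function:

```sql
-- Hypothetical master procedure: parses a CSV parameter and calls a
-- worker procedure once per value. Names are illustrative only.
CREATE PROCEDURE myMasterProcedure
    @CSV varchar(8000)
AS
BEGIN
    DECLARE @Value varchar(100)
    DECLARE @Pos int

    WHILE LEN(@CSV) > 0
    BEGIN
        SET @Pos = CHARINDEX(',', @CSV)
        IF @Pos = 0
        BEGIN
            -- Last (or only) value in the string
            SET @Value = @CSV
            SET @CSV = ''
        END
        ELSE
        BEGIN
            -- Take the value before the comma, keep the remainder
            SET @Value = LEFT(@CSV, @Pos - 1)
            SET @CSV = SUBSTRING(@CSV, @Pos + 1, LEN(@CSV))
        END

        -- Process one value per iteration
        EXEC myRealProcedure @Field = @Value
    END
END
```

With this in place the web page makes a single round trip (`EXEC myMasterProcedure @CSV = '1,2,3'`) and the looping happens on the server.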
ValterBorges
Master Smack Fu Yak Hacker
1429 Posts

Posted - 2004-04-16 : 17:14:08
Need more detail on what you're trying to process: schema, stored procedure, requirements, and data volume.
PeterG
Posting Yak Master
156 Posts

Posted - 2004-04-16 : 18:13:01
The database design is complicated, so I'll try to explain it rather than sending you the schema. I'm executing against multiple tables with names like Element1, Element2, and so on. They all have the same fields; the data is just broken into multiple tables. I know the design is not the greatest, but I have to live with it.

So in my stored procedure I use dynamic SQL to select from an element table, store the result in a temp table, and then do a final select on the temp table. Hence the CSV string parameter.

The suggestion was that I create a stored procedure that gets executed depending on which table I want to get data from. If I have 12 element tables, then I call the stored procedure 12 times, each call accessing a particular table.

Data volume varies from one element table to another.
RobinCox
Starting Member
4 Posts

Posted - 2004-04-16 : 19:24:16
It might be useful to establish whether there is a real performance issue with the query execution in the first place. Have you tried using SQL Query Analyzer to look at the execution time, compilation time, and the execution plan, to see whether the query is optimized and whether there is in fact any measurable difference between one approach and the other?
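A quick way to take those measurements in Query Analyzer is to turn on the session statistics before running the procedure under test (the procedure name below is hypothetical):

```sql
-- Report parse/compile and execution CPU/elapsed times
SET STATISTICS TIME ON
-- Report logical/physical reads per table touched
SET STATISTICS IO ON

EXEC myMasterProcedure @CSV = '1,2,3'   -- hypothetical call under test

SET STATISTICS TIME OFF
SET STATISTICS IO OFF
```

Running each variant this way (and enabling the graphical execution plan with Ctrl+K) gives concrete numbers to compare instead of guessing which approach is faster.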
Lumbago
Norsk Yak Master
3271 Posts

Posted - 2004-04-16 : 20:04:54
As far as I can understand you have the following options:

```asp
<%
'Option 1: splits the string and exec's myRealProcedure
'for each value in the string
SQL = "EXEC myMasterProcedure @CSV = '1,2,3'"

'Option 2: exec's myRealProcedure right away
SQL = "EXEC myRealProcedure @Field = 1;" & _
      "EXEC myRealProcedure @Field = 2;" & _
      "EXEC myRealProcedure @Field = 3;"
%>
```

If this is the case, then I can't really say which works more efficiently (I would think option 2), but I at least think option 2 is cleaner, and it's something I use all the time.

Just my 2 cents worth, I guess...

--Lumbago
"Real programmers don't document; if it was hard to write, it should be hard to understand"