
 All Forums
 SQL Server 2000 Forums
 SQL Server Development (2000)
 Saving data to multiple tables using 1 sp


t1g312
Posting Yak Master

148 Posts

Posted - 2005-10-16 : 01:00:17
Dear All,

I have a header table (e.g. Order Details) and a transactions table (e.g. Items purchased). When a user saves the data in the front-end, ideally all the data should be passed to a single stored procedure, right? However, I can't find an elegant way of doing it. Currently, I'm using 2 methods -
1) I'm passing the items purchased as a csv string and splitting them in the stored procedure.
2) I call 1 sp to save the order details and if successful, I call another for each of the items.

Is there a better way of doing this?

Thanks in advance,

Adi

sachinsamuel
Constraint Violating Yak Guru

383 Posts

Posted - 2005-10-16 : 01:08:58
Hi Adi,

Why are you passing the data in CSV format?

How does it matter if there are 100 or more parameters? If you pass a CSV string there is also the cost of splitting it. Why take that cost when we can avoid it?

I would recommend sending the items as separate parameters and using them in the stored procedure.

Otherwise, method 2 is perfectly fine.

regards
Sachin

blindman
Master Smack Fu Yak Hacker

2365 Posts

Posted - 2005-10-16 : 01:45:45
You should have separate procedures for loading data into parent and child tables.
Your application should call the procedure to insert the parent record first, which should return the primary key assigned to the record if you are using surrogate keys.
The application should then loop through the child records, calling the insert procedure once for each record. Mess around with parsing CSV strings and trying to do this in a single call, and you are bound to run into trouble.
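A minimal sketch of that pattern (table and column names are hypothetical), with the parent procedure handing back the new surrogate key via an OUTPUT parameter:

```sql
-- Hypothetical parent-insert procedure: inserts one order header
-- and returns the newly assigned surrogate key to the application.
CREATE PROCEDURE dbo.InsertOrderHeader
    @CustomerID int,
    @OrderDate  datetime,
    @OrderID    int OUTPUT
AS
BEGIN
    INSERT INTO dbo.OrderHeader (CustomerID, OrderDate)
    VALUES (@CustomerID, @OrderDate)

    -- SCOPE_IDENTITY() is safer than @@IDENTITY here: it ignores
    -- identity values generated by any triggers on the table
    SET @OrderID = SCOPE_IDENTITY()
END
```

The application then calls a matching child-insert procedure (e.g. a hypothetical dbo.InsertOrderDetail) once per item, passing @OrderID with each call.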

Kristen
Test

22859 Posts

Posted - 2005-10-16 : 01:46:32
"How does it matters if there are 100 or more parameters"

Probably because there is an unlimited number of Order Items that can be presented to the SProc.

Why not insert into the Order Details table directly, rather than calling an SProc for each item?

INSERT INTO OrderDetails(ColA, ColB, ...)
SELECT ColA, ColB, ...
FROM dbo.MySplitFunction(@MyCSVList)

Create the table-returning function MySplitFunction to return a data set of all the delimited items - probably have a different delimiter for the Columns and Rows (rather than a fixed number of columns).

Make sure the Split Function is set based (rather than looping round getting each value) and it should be pretty efficient. If you are not already doing so use a Tally Table for the Split function - that's the fastest method I've seen.
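One possible shape for such a splitter, as a sketch: a single-column version (the separate row/column delimiter variant follows the same idea), assuming a dbo.Tally table of sequential integers 1..8000 already exists:

```sql
-- Hypothetical set-based splitter using a Tally (numbers) table.
-- Assumes dbo.Tally holds sequential integers N = 1..8000.
CREATE FUNCTION dbo.MySplitFunction (@List varchar(8000))
RETURNS @Items TABLE (Item varchar(8000))
AS
BEGIN
    -- Bound the list with delimiters so every element is delimited
    SET @List = ',' + @List + ','

    INSERT INTO @Items (Item)
    SELECT SUBSTRING(@List, N + 1,
                     CHARINDEX(',', @List, N + 1) - N - 1)
    FROM dbo.Tally
    WHERE N < LEN(@List)
      AND SUBSTRING(@List, N, 1) = ','   -- each N marks a delimiter
    RETURN
END
```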

Kristen

blindman
Master Smack Fu Yak Hacker

2365 Posts

Posted - 2005-10-16 : 02:09:53
quote:
Why not insert into the Order Details table directly, rather than calling an SProc for each item?

Because the interface should not have direct access to production tables. And if a developer insists upon having write access to tables, I take no responsibility for administering the database and I won't guarantee the integrity of their data.

t1g312
Posting Yak Master

148 Posts

Posted - 2005-10-16 : 02:52:55
Kristen,

This is exactly what I'm doing. But as the number of parent-child tables increases, it gets very tedious and boring :)

I was just wondering if there is a way of passing a "table variable" to the sp. Is it possible in SQL 2005?

Thanks


quote:
Originally posted by Kristen

"How does it matters if there are 100 or more parameters"

Probably because there is an unlimited number of Order Items that can be presented to the SProc.

Why not insert into the Order Details table directly, rather than calling an SProc for each item?

INSERT INTO OrderDetails(ColA, ColB, ...)
SELECT ColA, ColB, ...
FROM dbo.MySplitFunction(@MyCSVList)

Create the table-returning function MySplitFunction to return a data set of all the delimited items - probably have a different delimiter for the Columns and Rows (rather than a fixed number of columns).

Make sure the Split Function is set based (rather than looping round getting each value) and it should be pretty efficient. If you are not already doing so use a Tally Table for the Split function - that's the fastest method I've seen.

Kristen



Adi

Kristen
Test

22859 Posts

Posted - 2005-10-16 : 08:03:00
"Because the interface should not have direct access to production tables"

Sorry, poorly phrased on my end.

What I meant was rather than having

CREATE PROCEDURE CreateCompleteOrder
@CSVListOfItems varchar(8000)
AS
... Loop round the CSV list
EXEC AddRowToOrderDetailsTable @MyOrderItemStuff
... End Loop ...

it might be acceptable to have

CREATE PROCEDURE CreateCompleteOrder
@CSVListOfItems varchar(8000)
AS
INSERT INTO OrderDetailsTable(... column list ...)
SELECT ... Column List ...
FROM MyCSVSplitterFunction(@CSVListOfItems)

"But as the number of parent-child tables increase, it gets very tedious and boring"

How many Parent/Child relationships can there be in adding order lines?

If it's getting really hairy you might consider passing XML to the SProc - SQL Server has reasonable tools to "parse" the XML data into tables, based on the structure of the tables, so you could either parse it into temporary tables and from there into the real ones, or straight into the real tables
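A sketch of that XML route using OPENXML (element and column names are hypothetical), shredding the document into a temporary table first:

```sql
-- Hypothetical: shred an order-items XML document into a temp table
DECLARE @hDoc int
DECLARE @xml varchar(8000)
SET @xml = '<Order><Item ProductCode="A1" Qty="2"/><Item ProductCode="B2" Qty="5"/></Order>'

EXEC sp_xml_preparedocument @hDoc OUTPUT, @xml

SELECT ProductCode, Qty
INTO   #OrderItems
FROM   OPENXML(@hDoc, '/Order/Item', 1)   -- 1 = attribute-centric mapping
       WITH (ProductCode varchar(20), Qty int)

EXEC sp_xml_removedocument @hDoc   -- release the parsed document handle
```

From #OrderItems the rows can be validated and then inserted into the real tables.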

Kristen

blindman
Master Smack Fu Yak Hacker

2365 Posts

Posted - 2005-10-16 : 10:17:01
Well, an alternative would be to pass in a collapsed "flat-file" record for each record in the lowest child dataset, which would also include all the relevant data from each parent dataset in the chain. Your procedure would then take responsibility for parsing the record into the different table schemas and inserting/updating as appropriate.
Advantage: one insert for each record in the lowest child dataset.
Disadvantages: you have to submit a large record of probably redundant data for each insert/update, and it is difficult to insert data for just the top level of the hierarchy.
Not a method I would recommend. Really, your application layer should be sophisticated enough to handle looping through recordsets with ease. What are you programming this in?

Kristen
Test

22859 Posts

Posted - 2005-10-16 : 12:00:20
"your application layer should be sophisticated enough to handle looping through recordsets with ease"

But that would involve multiple calls from Client to Server - which might be worse [performance-wise] than the server looping round them, or doing a multi-row insert.

Kristen

blindman
Master Smack Fu Yak Hacker

2365 Posts

Posted - 2005-10-16 : 12:51:20
Only in the most extreme cases would I trade off clarity and maintainability of code for increased performance.

Kristen
Test

22859 Posts

Posted - 2005-10-16 : 15:16:26
blindman I reckon I'm missing something.

As I understand it t1g312 has a dataset from some frontend which gives him data for an Order Header record and a bunch of Order Item records.

He figures he would like to pass all the data to an Sproc in a single call which will then create an order header and all the order detail records.

Do you think it would be better for the client application to loop round the Order Detail records calling the SQL Server for each item? (i.e. a round-trip for each order item?)

Seems slower-than-necessary to me.

Kristen

blindman
Master Smack Fu Yak Hacker

2365 Posts

Posted - 2005-10-16 : 18:17:10
I don't think either of us is missing anything, except possibly more details in the issue at hand.
If it is a large existing dataset, with multiple OrderDetails and Items records being imported, then I'd agree it is more efficient to import it directly into a table, but I would import it into a STAGING table rather than into production tables. After loading into the staging table, have a sproc clean and verify the data and distribute it to the production tables.
If this is an order entry system and data is being generated on the fly, then I'd still use application-side loops and single-insert sprocs rather than direct inserts to the production tables.

Kristen
Test

22859 Posts

Posted - 2005-10-17 : 01:37:24
"then I'd still use application-side loops and single-insert sprocs rather than direct inserts to the production tables"

So just to be clear: you favour the Application calling some sort of Order Item Add SProc, for each line in the order, rather than calling a single SProc with a parameter containing the whole data set (perhaps as CSV)?

Kristen

DonAtWork
Master Smack Fu Yak Hacker

2167 Posts

Posted - 2005-10-17 : 08:25:20
So, if I order 50 different items, you want to make FIFTY round trips to the DB??

Help us help YOU!
Read this blog entry for more details: http://weblogs.sqlteam.com/brettk/archive/2005/05/25.aspx

*need more coffee*
SELECT * FROM Users WHERE CLUE > 0
(0 row(s) affected)

sachinsamuel
Constraint Violating Yak Guru

383 Posts

Posted - 2005-10-17 : 08:45:42
In that case, the best solution is to use XML.

You can use the OPENXML method to push the data into the database. In this case there is just one hit to the database.

Regards
Sachin Samuel

blindman
Master Smack Fu Yak Hacker

2365 Posts

Posted - 2005-10-17 : 09:51:38
Round trips? No. Who said they had to be round trips? 50 inserts, yes. I think that is MUCH preferable to submitting a single insert with the parent record and all the fields from all 50 detail records as one monstrous concatenated string to be parsed. If I saw that in an application I don't know whether I'd laugh, cry, or yell cha-ching! Money for me to fix the problem!
Now, my beef is with submitting multiple records as a concatenated string to a sproc. I wish there were a way to submit a table variable to a sproc, but there isn't. XML may be a solution, but I haven't used it so I can't comment on it.

blindman
Master Smack Fu Yak Hacker

2365 Posts

Posted - 2005-10-17 : 09:53:36
In short:
No direct table access for the application means: all inserts through stored procs.
Stored procs can't accept table variables so: single inserts.
Submitting multiple records as a single concatenated string: uhm...normalization anyone?

Kristen
Test

22859 Posts

Posted - 2005-10-17 : 11:13:11
OK, Gotcha blindman, thanks for the explanation.

"No direct table access for the application means: all inserts through stored procs."

I'm right behind you ...

"Stored procs can't accept table variables so: single inserts"

We often use CSV, and sometimes XML, for this. CSV most often where we want to do:
WHERE MyColumn IN (@MyParameter)
and XML where we have a large-ish dataset we need to process - this is usually a bulk import from a source that can easily provide it as XML (web services type stuff).

"Submitting multiple records as a single concatenated string: uhm...normalization anyone?"

Now on this one I disagree with you. The database is still normalised, and there are no dynamic INSERT statements. If there is a significant performance gain I am happy to "obfuscate" the process in favour of speed.

The Client Application can pass the Order Header details as normal, multiple, parameters to the "SaveMyOrder" Sproc, and then put the ProductCodes/Quantities in some sort of delimited list.

Within the SaveMyOrder Sproc I would use a UDF TableVar "splitter" function to virtualise that delimited data as a table, and INSERT rows directly into the OrderDetails table (or I might loop round and call an InsertOrderDetails Sproc for each item - depending on the other requirements of the application)
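That SaveMyOrder shape might be sketched like this (all names hypothetical; items passed as a 'ProductCode:Qty,ProductCode:Qty' list and split by a UDF such as the MySplitFunction mentioned earlier):

```sql
-- Hypothetical: header as ordinary parameters, order lines as a
-- delimited list, e.g. 'A1:2,B2:5'
CREATE PROCEDURE dbo.SaveMyOrder
    @CustomerID int,
    @OrderDate  datetime,
    @Items      varchar(8000)
AS
BEGIN
    DECLARE @OrderID int

    BEGIN TRANSACTION

    INSERT INTO dbo.OrderHeader (CustomerID, OrderDate)
    VALUES (@CustomerID, @OrderDate)
    SET @OrderID = SCOPE_IDENTITY()

    -- Each split item is 'ProductCode:Qty'; break it on the inner delimiter
    INSERT INTO dbo.OrderDetails (OrderID, ProductCode, Qty)
    SELECT @OrderID,
           LEFT(Item, CHARINDEX(':', Item) - 1),
           CAST(SUBSTRING(Item, CHARINDEX(':', Item) + 1, 10) AS int)
    FROM dbo.MySplitFunction(@Items)

    COMMIT TRANSACTION
END
```

Wrapping both inserts in one transaction keeps the header and its items atomic: either the whole order saves or none of it does.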

Kristen

t1g312
Posting Yak Master

148 Posts

Posted - 2005-10-18 : 04:34:08
quote:
Originally posted by Kristen
How many Parent/Child relationships can there be in adding order lines?



Order was just an example. My program has nothing to do with orders, and there are nearly a dozen parent-child tables.

quote:

If its getting really hairy you might consider passing XML to the SProc - SQL Server has reasonable tools to "parse" the XML data into tables, based on the structure of the tables, so you could either parse it into temporary tables, and from there into the real ones - or straight into the real tables
Kristen



It's a little surprising that a DBMS as powerful as SQL Server doesn't support passing a table/array as a parameter. I guess T-SQL is to blame.

Adi

t1g312
Posting Yak Master

148 Posts

Posted - 2005-10-18 : 04:35:45
quote:
Originally posted by Kristen

blindman I reckon I'm missing something.

As I understand it t1g312 has a dataset from some frontend which gives him data for an Order Header record and a bunch of Order Item records.

He figures he would like to pass all the data to an Sproc in a single call which will then create an order header and all the order detail records.

Do you think it would be better for the client application to loop round the Order Detail records calling the SQL Server for each item? (i.e. a round-trip for each order item?)

Seems slower-than-necessary to me.

Kristen



Exactly.

Adi
   
