michael.appleton
Posting Yak Master
160 Posts |
Posted - 2013-05-01 : 07:37:48
|
Hello, I'd like your opinions on how best to organise this.

At the moment I have two queries containing table functions. The first returns the profits generated year by year for each project. The second takes the information generated by the first, plus some other data, and returns a row of financial information for each project. These functions are intended to be used by an application in which several people will be accessing data at the same time, with several different variables.

The issue I have is that I call the first function from within the second, and in the application I want to show the results of both. This means I am effectively running the first function twice, and sometimes this can take a long time. I'd like to somehow pass the results of the first function to the second. Perhaps I could create a custom table type, save the results into it, and pass it as a parameter to the second function. Or should I be using sprocs? Or could I just save into a temp table and reference that in the second function? Or perhaps I shouldn't even be doing this in the database?! Any ideas? Thanks! |
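[Editor's note: the "custom table type" idea above can be sketched as follows. All object names (dbo.ProjectYearlyProfit, dbo.ProjectSummary, the column list) are hypothetical stand-ins for the poster's two functions; the point is that the caller materialises the first result once and passes it on, rather than the second function re-running the first.]

```sql
-- Hypothetical table type matching the first function's result shape.
CREATE TYPE dbo.YearlyProfitTable AS TABLE
(
    ProjectId  INT   NOT NULL,
    ProfitYear INT   NOT NULL,
    Profit     MONEY NOT NULL
);
GO
-- The second function takes the first function's result as a READONLY
-- table-valued parameter instead of calling dbo.ProjectYearlyProfit() itself.
CREATE FUNCTION dbo.ProjectSummary (@YearlyProfits dbo.YearlyProfitTable READONLY)
RETURNS TABLE
AS
RETURN
    SELECT ProjectId,
           SUM(Profit)               AS TotalProfit,
           COUNT(DISTINCT ProfitYear) AS YearsActive
    FROM @YearlyProfits
    GROUP BY ProjectId;
GO
-- Caller runs the expensive first function exactly once:
DECLARE @p dbo.YearlyProfitTable;
INSERT INTO @p (ProjectId, ProfitYear, Profit)
SELECT ProjectId, ProfitYear, Profit
FROM dbo.ProjectYearlyProfit();           -- hypothetical first function

SELECT * FROM @p;                          -- first result set for the app
SELECT * FROM dbo.ProjectSummary(@p);      -- second, without re-running the first
```

The READONLY restriction is the catch discussed later in the thread: the table parameter can be passed down but not filled in and passed back up.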
|
James K
Master Smack Fu Yak Hacker
3873 Posts |
Posted - 2013-05-01 : 09:06:56
|
There are a few different ways in which you could share the data. Take a look at Sommarskog's article here - it describes approaches including temp tables that you mentioned. You have to use stored procedures rather than functions if you want to populate temp tables: http://www.sommarskog.se/share_data.html |
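[Editor's note: a minimal sketch of the temp-table approach from the linked article, with hypothetical procedure names. A local temp table created in the calling scope is visible to every procedure that scope calls, so the first result is materialised once and read twice.]

```sql
-- Caller owns the temp table; both procedures below reference #YearlyProfit
-- directly by name (they cannot receive it as a parameter).
CREATE TABLE #YearlyProfit
(
    ProjectId  INT,
    ProfitYear INT,
    Profit     MONEY
);

EXEC dbo.LoadYearlyProfits;    -- hypothetical: INSERTs into #YearlyProfit
SELECT * FROM #YearlyProfit;   -- first result set, shown to the application
EXEC dbo.SummariseProjects;    -- hypothetical: SELECTs from #YearlyProfit,
                               -- returns one summary row per project
DROP TABLE #YearlyProfit;
```

The trade-off the article notes: the inner procedures take a compile-time dependency on a table they don't create, and there can be recompilation overhead.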
|
|
michael.appleton
Posting Yak Master
160 Posts |
Posted - 2013-05-02 : 05:00:56
|
Thanks for the link! There doesn't seem to be a method without its problems, which makes me feel SQL Server could be the wrong tool... but it's so well suited in other ways. Global temp tables will have their problems with multiple users. Maybe the Process-Keyed Table solution is the best from that article. It still doesn't seem the "proper" way to do things. Shame table-valued parameters aren't more flexible. |
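[Editor's note: for reference, the process-keyed table pattern mentioned above looks roughly like this. Names are hypothetical; the idea, as described in the article, is a permanent table keyed by the session id (@@SPID) so concurrent users don't collide.]

```sql
-- Permanent work table, shared by all sessions but keyed per process.
CREATE TABLE dbo.YearlyProfitWork
(
    Spid       INT   NOT NULL,
    ProjectId  INT   NOT NULL,
    ProfitYear INT   NOT NULL,
    Profit     MONEY NOT NULL
);
GO
-- Each procedure clears and fills only its own session's rows:
DELETE dbo.YearlyProfitWork WHERE Spid = @@SPID;

INSERT INTO dbo.YearlyProfitWork (Spid, ProjectId, ProfitYear, Profit)
SELECT @@SPID, ProjectId, ProfitYear, Profit
FROM dbo.ProjectYearlyProfit();   -- hypothetical first function

-- Later procedures read WHERE Spid = @@SPID, and the caller deletes its
-- rows again when finished (sessions that die mid-run leave orphans,
-- which is one of the pattern's known downsides).
```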
|
|
visakh16
Very Important crosS Applying yaK Herder
52326 Posts |
Posted - 2013-05-02 : 06:24:34
|
As far as I understand it, if I had control I would create a set of update logic that does these calculations on each row of a table and saves the final results into it. You could simply populate the table with the required rows first, and then the update would take care of the logic the functions used to do. The table can be destroyed once the calculations are all done and the results are back in the original table. The table name can also be built dynamically by appending user-related info, like a user id or username, so that concurrent execution doesn't cause any issues.

------------------------------------------------------------------------------------------------------
SQL Server MVP
http://visakhm.blogspot.com/
https://www.facebook.com/VmBlogs |
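[Editor's note: the per-user work-table idea above requires dynamic SQL, since table names cannot be parameterised. A hedged sketch, with hypothetical names; QUOTENAME guards the generated identifier against injection.]

```sql
-- Build a per-user work-table name, e.g. dbo.[ProjectWork_DOMAIN\alice].
DECLARE @table sysname = QUOTENAME(N'ProjectWork_' + SUSER_SNAME());
DECLARE @sql   nvarchar(max);

SET @sql = N'CREATE TABLE dbo.' + @table
         + N' (ProjectId INT NOT NULL, Profit MONEY NOT NULL);';
EXEC sp_executesql @sql;

-- ...populate the table, run the UPDATE logic in place, read the results...

SET @sql = N'DROP TABLE dbo.' + @table + N';';
EXEC sp_executesql @sql;
```

As the next post says, this works but pushes the whole workflow into dynamic SQL, which is why the thread keeps circling back to the other sharing techniques.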
|
|
michael.appleton
Posting Yak Master
160 Posts |
Posted - 2013-05-02 : 06:35:10
|
Yeah, I had a think about that solution too. Then you're relying on dynamic SQL, which can be messy... Again, for such a common requirement it doesn't seem a "proper" solution. I mean, it can work, but there really should be something properly designed for this kind of use. I haven't read this in detail, but something like what this guy's proposing would be what I would expect: http://www.sommarskog.se/tableparam.html. Well, for now I'll have to make do with one of the non-perfect solutions! Thanks! |
|
|
michael.appleton
Posting Yak Master
160 Posts |
Posted - 2013-05-07 : 05:50:38
|
Out of interest, does anyone know if using table parameters as an output is going to be supported in the future? Also, does PostgreSQL, Oracle or any other DBMS support a better way of sharing data between procedures or functions? Thanks! |
|
|
visakh16
Very Important crosS Applying yaK Herder
52326 Posts |
Posted - 2013-05-07 : 06:47:41
|
quote: Originally posted by michael.appleton Out of interest, does anyone know if using table parameters as an output is going to be supported in the future? Also, does PostgreSQL, Oracle or any other DBMS support a better way of sharing data between procedures or functions? Thanks!
One alternative you have in SQL Server is to pass result-set data as XML between procedures and functions. You can use nodes(), query() etc. to shred data from the XML, and FOR XML to build XML out of a result set.

------------------------------------------------------------------------------------------------------
SQL Server MVP
http://visakhm.blogspot.com/
https://www.facebook.com/VmBlogs |
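[Editor's note: a minimal sketch of the XML hand-off described above, using hypothetical object names. The first result is serialised with FOR XML and shredded back into rows with nodes()/value() on the receiving side.]

```sql
-- Serialise the first function's result set into an XML variable.
DECLARE @profits XML =
    (SELECT ProjectId, ProfitYear, Profit
     FROM dbo.ProjectYearlyProfit()           -- hypothetical first function
     FOR XML PATH('row'), ROOT('profits'), TYPE);

-- The receiving procedure (which would take @profits XML as a parameter)
-- shreds the XML back into a rowset:
SELECT r.value('(ProjectId)[1]',  'INT')   AS ProjectId,
       r.value('(ProfitYear)[1]', 'INT')   AS ProfitYear,
       r.value('(Profit)[1]',     'MONEY') AS Profit
FROM @profits.nodes('/profits/row') AS t(r);
```

Unlike a table-valued parameter, an XML parameter has no READONLY restriction, which is what makes this work in both directions; the cost is the serialise/shred overhead the next post worries about.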
|
|
michael.appleton
Posting Yak Master
160 Posts |
Posted - 2013-05-07 : 07:09:41
|
Good to know, but it really doesn't seem quite right to pass XML around internally in the database. My gut feeling is that it should only really be used as an input to or output from an application, as inside a database we expect data to be in tables. Not saying it doesn't work, but it seems a bit messy and I expect there are performance issues.

I see that Erland Sommarskog has actually proposed output table parameters to Microsoft:
[url]https://connect.microsoft.com/SQLServer/feedback/details/299296/relax-restriction-that-table-parameters-must-be-readonly-when-sps-call-each-other[/url]

Their answer from 2007:

quote: Hi Erland, Thanks for the feedback on this. We have received similar feedback from a large number of customers. Allowing table valued parameters to be read/write involves quite a bit of work on the SQL Engine side as well as client protocols. Due to time/resource constraints as well as other priorities, we will not be able to take up this work as part of the SQL Server 2008 release. However, we have investigated this issue and have this firmly on our radar to address as part of the next release of SQL Server. We appreciate and welcome the feedback here. -- Srini Acharya, Senior Program Manager, SQL Server Relational Engine

Still nothing in SQL Server 2012 from what I can tell, five years later! So much for it being on their radar... |
|
|
|