
All Forums > SQL Server 2000 Forums > Transact-SQL (2000)

Help with efficiency

Job
Yak Posting Veteran

69 Posts

Posted - 2006-01-19 : 11:47:28
I have a series of views that I create each time we receive a new set of data. However, I'm not sure it is the most efficient way of arriving at the answer. If I could get suggestions to help me learn to write these statements better, I would really appreciate it.

--------------------------Create the Txn level data with the F and NF TRVU numbers--------------------
drop view VW_Txn_UPC
GO

---For UPC we included a marker if there was a name in the--------------------------------------------
---InvResMnem field --------------------------------------------
create view VW_Txn_UPC
As
Select PtLstNme, PtCity, PtZip, PtMRN, PtSSN, PtDOB, InvNum, InvCrtDt, InvServDt,
InvGroupNme,InvDivNme,InvBilArea,InvCrtPdIDX, InvOrigFscNum, InvOrigFscRptCat2Nme, InvFscNum,
InvFscRptCat2Nme, InvLocCat1Nme, DX1, DX2, DX3, DX4, InvLocNme, InvProv,
InvRefProv, TxnCptCde,
TxnModAll, TxnPayCdeNum, TxnPayCdeCat, TxnPostDt, TxnBankDepDt,
TxnPostPdIDX, TxnServDt, TxnDXAll, TxnChrgAmt, TxnAdjAmt,
TxnPayAmt, TxnCreditAmt, TxnDebitAmt, TxnUnits, Fxd_Units, Refunds, Pay_wo_Refunds,
TMD.CPT_26_TC, Pos_Num,
'AlphaCode' = Case
When left(TxnCptCde,1) like '%[a-zA-Z]%'
Then left(TxnCptCde,1)
Else Null
END,
left(TMD.TxnPostPdIDX,4)as YR,TMBR.FAC_TRVU,TMBR.NFAC_TRVU,TMBR.WRVU,
'Adj_FAC_TRVU' =
Case
WHEN TxnModAll like '%80%' or TxnModAll like '%81%' or TxnModAll like '%82%'
THEN TMBR.FAC_TRVU *.25
ELSE TMBR.FAC_TRVU
END,
'Adj_NFAC_TRVU' =
Case
WHEN TxnModAll like '%80%' or TxnModAll like '%81%' or TxnModAll like '%82%'
THEN TMBR.NFAC_TRVU *.25
ELSE TMBR.NFAC_TRVU
END,
'Adj_WRVU' =
Case
WHEN TxnModAll like '%80%' or TxnModAll like '%81%' or TxnModAll like '%82%'
THEN TMBR.WRVU *.25
ELSE TMBR.WRVU
END

From Tbl_Txn_UPC TMD
Left JOIN Tbl_Master_Blnd_RVU_RestNJ_05 TMBR on TMD.CPT_26_TC = TMBR.CPT_26_TC

--------------------------Create the Temp B5 View------------------------------------------------------------------------------

Drop View VW_B5_UPC_tmp
GO

Create View VW_B5_UPC_tmp
As
SELECT TxnCptCde, TxnModAll,CPT_26_TC, AlphaCode,TxnPostDt, InvServDt,
InvLocNme, InvProv,InvGroupNme,InvDivNme,InvBilArea,
InvOrigFscRptCat2Nme, InvLocCat1Nme, YR, TxnPostPdIDX, SUM(TxnChrgAmt) AS Charges,
SUM(TxnAdjAmt + TxnCreditAmt - TxnDebitAmt + TxnChrgAmt + Refunds)
AS Adjustments, SUM(TxnPayAmt - Refunds) AS PaymentsLessRefunds,
SUM(Fxd_Units) AS Fxd_Units,WRVU, FAC_TRVU,
NFAC_TRVU,Pos_Num, Adj_FAC_TRVU, Adj_NFAC_TRVU, Adj_WRVU
FROM VW_Txn_UPC
GROUP BY VW_Txn_UPC.InvOrigFscRptCat2Nme,
YR,VW_Txn_UPC.TxnPostPdIDX, TxnModAll,InvLocNme,InvLocCat1Nme, TxnCptCde, TxnPostDt,
InvServDt, CPT_26_TC,AlphaCode, WRVU, FAC_TRVU,
VW_Txn_UPC.NFAC_TRVU,InvProv,InvGroupNme,InvDivNme,InvBilArea,Pos_Num, Adj_FAC_TRVU, Adj_NFAC_TRVU, Adj_WRVU




------------------------------------------------------------------------------------------------------------------
--------------------------Create the Actual B5 View--------------------------------------------------------------

Drop View VW_B5_UPC
GO


Create View VW_B5_UPC As
Select InvGroupNme,InvDivNme,InvBilArea,
InvOrigFscRptCat2Nme, InvLocNme,InvLocCat1Nme, TxnPostPdIDX,Charges,Adjustments,
PaymentsLessRefunds,Fxd_Units,
WRVU*Fxd_Units as WRVU_Tot,FAC_TRVU*Fxd_Units as F_TRVU_Tot, NFAC_TRVU*Fxd_Units as NF_TRVU_Tot,
Adj_FAC_TRVU*Fxd_Units as Adj_FAC_TRVU, Adj_NFAC_TRVU*Fxd_Units as Adj_NFAC_TRVU, Adj_WRVU*Fxd_Units as Adj_WRVU
From VW_B5_UPC_tmp

---------------------------------------END OF --------------------------------------------------------
--------------------------------------------------------------------------------------------------------

I use this last view as the basis of the analysis, and many times I will use only certain of the columns. Needless to say, running a view built on a view built on a view takes quite a while. All comments are appreciated.

Cheers,

Job

blindman
Master Smack Fu Yak Hacker

2365 Posts

Posted - 2006-01-19 : 13:51:24
At first glance, your views do not look like they have any glaring inefficiencies. My question to you would be: why are you creating new views for each new set of data? Perhaps a stored proc with a parameter that lets you filter the data you want would be better than an explosion of nearly identical views.

jsmith8858
Dr. Cross Join

7423 Posts

Posted - 2006-01-19 : 14:03:15
quote:
Originally posted by Job

I have a series of views that I create each time we recieve a new set of data.



Can you please explain this a little, with an example of your workflow? Creating database objects is not something that should be done as data comes and goes; it should only be required when major structural changes are made to your system.

Job
Yak Posting Veteran

69 Posts

Posted - 2006-01-20 : 10:00:00
Thank you for your responses. Regarding stored procedures, I did look those up, but I'm still not clear on how to use them. I think this would help me with overall efficiency. Regarding workflow: when we get a new project (client), we gather pretty much the same set of data from each, generally around 10-20 million lines of data. I have a saved .sql file that has all of the views, updates, make-tables, etc. that I use to analyze the data and to produce answers. I duplicate this each time, so each project has one of these sql files. All of your suggestions and help are greatly appreciated.

Cheers,
Job

Kristen
Test

22859 Posts

Posted - 2006-01-20 : 11:03:08
"Regarding stored procedures, I did look those up, but I'm still not clear on how to use them"

Basically you make a Stored procedure along the lines of:

CREATE PROCEDURE dbo.MySProc
(
@SomeParameter varchar(10)
)
AS
SELECT ... Column List ...
FROM dbo.MyTable
WHERE MyColumn = @SomeParameter
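
(It could then be called along these lines, using the hypothetical names from the sketch above:)

```sql
EXEC dbo.MySProc @SomeParameter = 'SomeValue'
```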


The first time you run it SQL Server prepares a query plan, and [usually!!] caches it.

When you run the SProc again it gets the query plan from cache, so saves all the compile time on each iteration.

If you have a number of "steps" in the job then there is no round-trip to the server for each one ... your application says "EXECUTE dbo.MySProc 'SomeData'", then SQL Server processes all the steps in the SProc, and the resultsets [and optionally return/output parameters] are sent back to the client. The reduction in round-trips can save time too.

The savings can be surprisingly significant (compared to dynamic SQL).

(There are also, potentially, security benefits, but they probably don't apply to what you are doing).

Kristen

Job
Yak Posting Veteran

69 Posts

Posted - 2006-01-20 : 12:58:31
As part of the stored procedure, can you specify that the output be saved to a file in a certain location and then continue with the next step in the procedure? Also, I create tables, and often the result is to create and populate a table in a different database; is this possible as well? Is it best to create one massive stored procedure that does one thing after another, or multiple stored procedures that you run one after another? Also, many times I will drop a view and then create the view, and if I try to run two sets of this together (i.e. --step 1: Drop view abc => create view abc as Select... --step 2: Drop view efg => create view efg as Select...) it gives an error...

Is there a location I can go or can someone drop some pseudo code to do the above examples?

Thanks for all the help and explanation.
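
(One likely cause of that error: CREATE VIEW must be the first statement in its batch, so consecutive drop/create pairs need a batch separator between them. A sketch in Query Analyzer, using the hypothetical view names from the question:)

```sql
Drop View abc
GO
Create View abc As Select ... Column List ...
GO
Drop View efg
GO
Create View efg As Select ... Column List ...
GO
```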

Kristen
Test

22859 Posts

Posted - 2006-01-21 : 05:22:13
I can only really make some suggestions for part of that, hopefully others have suggestions / opinions too.

One SProc can call another. So you COULD modularise your code by having a "Master SProc" call a bunch of child ones. Then there is no size limit, and each child-Sproc will have its own cached query plan.

(That may be important if there are conditional execution paths through your code as the cached query plan may not be suitable for all execution paths, but one-cached-query-plan-per-child-Sproc will fare better, of course)
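
(A minimal sketch of that master/child pattern; the procedure names here are hypothetical:)

```sql
CREATE PROCEDURE dbo.MasterSProc
(
@CLIENT varchar(50)
)
AS
-- Each child SProc gets its own cached query plan
EXEC dbo.ChildSProc_LoadData @CLIENT
EXEC dbo.ChildSProc_BuildSummary @CLIENT
EXEC dbo.ChildSProc_Export @CLIENT
```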

An Sproc just returns a Result Set, so it can't really output to a file. There are ways around that, but that probably isn't what you want.

Perhaps retain a SQL Script that you run in Query Analyser and save the stuff that is appropriate to files as you go?

I have loads of stuff like that - it's mostly for one-off, heavily amended, or string-and-chewing-gum usage!!

So my script looks something like this:

-- This script does SuchAndSuch

EXEC MySproc1 @CLIENT='Acme'
EXEC MySproc2 @PRODUCT='Widget'
GO

-- Run to here, then save results to \\MyServer\Acme\STEP1_yyyymmdd.TXT
...

If you need to run it loads of times with no/little modification [after the initial development] then you could make a BATCH file instead which called OSQL (or possibly BCP) to do the work [i.e. instead of Query Analyser]

The output from these can EASILY be stored in a file.

So then your BATCH file would, in pseudo code, be something like:

Run OSQL to execute MySProc1 and MySProc2 saving the output to "MyFile.TXT"
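
(As a concrete sketch of that pseudo code, the batch file line might look something like the following; the server, database, and file names are hypothetical, and -E uses a trusted connection:)

```bat
osql -S MyServer -d MyDatabase -E -Q "EXEC dbo.MySProc1 @CLIENT='Acme'" -o C:\Output\MyFile.TXT
```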

Kristen

Job
Yak Posting Veteran

69 Posts

Posted - 2006-01-22 : 01:41:27
Thanks for the input, Kristen. This is really great stuff. Basically, I run a bunch of these statements to ultimately get to two views, which I run and save to .txt files to be used by Excel or Access to pivot off of. The reason for the question about modularizing the SProc is that I was thinking of having a central location to define all of the variables for the whole lot; then it would run along and finally output these two .txt files to be picked up when it finished running. I suppose I need to find one of you SQL gurus on retainer or something and have you help set these things up on an hourly basis. I need to analyze the data, but not necessarily become a DBA. I spend a lot of my billable time trying to set this up for ease of use for those who know less than I do. I need help getting it set up correctly so we can analyze away at the data, but I know very little about correctly clustering indexes, efficient code, and such.

If any of you are interested or available on a consistent basis, at least initially, please let me know.

Job

Kristen
Test

22859 Posts

Posted - 2006-01-22 : 03:32:20
Sounds to me like you are doing OK so far ... but I get your point about maximising your time on your primary skill set!

Best to fill in your Profile with location etc. You might be just down the road from me, but seeing as I can't see any houses around here I somehow doubt it!

Kristen