 How SQL Server would respond....


MichaelP
Jedi Yak

2489 Posts

Posted - 2002-10-16 : 12:06:56
This is sort of a weird question, so bear with me.

Let's say I've got a few rather nasty queries (reports) that need to be run, possibly all at the same time, back-to-back.
If I run them all back-to-back, will SQL Server be "unresponsive" to users who are trying to do normal things on the server, like a quick lookup?

What I'm trying to do is build a system that can output large reports for large date ranges while the website that the database powers still runs quickly. Is this an unrealistic goal? Do I need two DB servers, one that does production work and one that does just reporting?

Michael

<Yoda>Use the Search page you must. Find the answer you will.</Yoda>

robvolk
Most Valuable Yak

15732 Posts

Posted - 2002-10-16 : 12:21:51
Assuming OLAP isn't an option for you, and you can't use two servers, can you perform some generalized summarization and put the results into summary tables, like a poor man's OLAP/Analysis Services?

I had some big summary reports that I CUBE'd and ROLLUP'd into a few summary tables, with just enough detail for some basic drill-down, and it worked great. It took maybe 5-10 minutes to generate and only needed to be done once a day.

Anything you can fold into a table of 1,000 rows or fewer is a pretty good candidate. You can then have these tables refreshed at regular intervals so that queries don't have to hit the detail table(s) constantly.
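A minimal sketch of that refresh, assuming hypothetical SalesDetail and SalesSummary tables (all names here are illustrative, not from the thread):

```sql
-- Summary refresh, scheduled once a day (e.g. as a SQL Agent job).
-- SalesSummary columns must be nullable, since WITH ROLLUP emits
-- NULLs in its subtotal rows.
TRUNCATE TABLE SalesSummary

INSERT INTO SalesSummary (SaleDay, Region, Product, TotalQty, TotalAmount)
SELECT CONVERT(char(8), SaleDate, 112),  -- day granularity (yyyymmdd)
       Region,
       Product,
       SUM(Qty),
       SUM(Amount)
FROM SalesDetail
GROUP BY CONVERT(char(8), SaleDate, 112), Region, Product
WITH ROLLUP   -- adds subtotal rows for basic drill-down
```

Report queries then read the small SalesSummary table instead of scanning the detail table.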


M.E.
Aged Yak Warrior

539 Posts

Posted - 2002-10-16 : 12:23:40
Heh, you'll need a powerful database just to do reports :D

I run some queries against my database, such as "find all wells and all related picks for this date range" (usually 2 days). The query takes about 45 minutes to run and increases other query times by about 125%: what took 10 seconds takes 22.5 seconds while this query is hitting the server.

My solution:

I scripted all these report tasks and made them into DTS packages. The DTS package runs a query that dumps all the information I need to report into a temp table. That temp table is then exported out of the database (text file, Excel spreadsheet, whatever's needed). I then schedule the DTS package to run at night during a low-use period. If there are multiple queries, I just have each one kick off after the previous one finishes.
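The "dump to a table, then export" step might look something like this. The wells/picks table and column names are guesses for illustration; in a DTS package this would be an Execute SQL task, with the export handled by a transform task or bcp:

```sql
-- Recreate the staging table the export task reads from
IF OBJECT_ID('ReportStaging') IS NOT NULL
    DROP TABLE ReportStaging

DECLARE @StartDate datetime, @EndDate datetime
SELECT @StartDate = DATEADD(day, -2, GETDATE()),  -- e.g. last 2 days
       @EndDate   = GETDATE()

-- SELECT ... INTO creates and populates the staging table in one step
SELECT w.WellID, p.PickName, p.PickDepth
INTO ReportStaging
FROM Wells w
JOIN Picks p ON p.WellID = w.WellID
WHERE p.PickDate >= @StartDate AND p.PickDate < @EndDate
```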

The question is: when you need a report, do you need it right away, or can it wait to be run at night so you'll have the report info in the morning?



-----------------------
SQL isn't just a hobby, it's an addiction

Edited by - m.e. on 10/16/2002 12:26:04

MichaelP
Jedi Yak

2489 Posts

Posted - 2002-10-16 : 12:37:39
I had considered "preparing" the reports. I do sort of need the functionality of "give me this report, for this insane date range, and I want it now." I wonder if there's some way I can "weight" a report and determine whether I can run it now or have to wait until night to run it. Example: a report for one week's worth of data can run right now, but one for a whole month has to wait until the evening.
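One way to sketch that "weighting" idea, assuming a hypothetical RunBigReport procedure and a ReportQueue table that a nightly job drains (none of these names come from the thread):

```sql
-- Run small date ranges immediately; queue large ones for the nightly job.
DECLARE @From datetime, @To datetime
SELECT @From = '2002-09-01', @To = '2002-10-01'

IF DATEDIFF(day, @From, @To) <= 7
    EXEC dbo.RunBigReport @From, @To           -- small range: run now
ELSE
    INSERT INTO ReportQueue (FromDate, ToDate, Requested)
    VALUES (@From, @To, GETDATE())             -- large range: wait for tonight
```

The 7-day threshold is arbitrary; in practice you'd tune it against how long the report actually takes at each range size.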

From what I can tell, all of my reports should be able to run rather quickly. I've written an ASP application in the past that allowed for the "feature" above. It worked great for a while, but now that there's a ton of data in the system, the reports / ASP page can time out. For this application I'm thinking ahead so that I don't run into that problem.

When you guys mention pre-caching the "denormalized" report data into a static table, the idea is to get the few rows you need into a smaller table, with a smaller index, so that you can read them back out faster?

Michael



<Yoda>Use the Search page you must. Find the answer you will.</Yoda>