dbuser123
Starting Member
20 Posts

Posted - 2006-02-10 : 10:09:06
I know this has been discussed many times. I have the following problem. I have 2 tables:

Table T1 (
    ID int PK,
    Subject nvarchar(120),
    Date1 datetime,
    ...
)

Table T2 (
    ID int FK,
    [index] int,
    Date2 datetime,
    ...
)

As you can see, T1 has a one-to-many relationship with T2. T2 has a PK of (ID, [index]). T1 has 1.2M records and T2 has 4.8M records.

I'm trying to get 1000 records in each page of data. The SQL would be:

select * from T1 LEFT JOIN T2 ON T1.ID = T2.ID WHERE T2.[index] = 0 ORDER BY Date2

Date2 has a non-clustered index. I have tried all the paging methods mentioned here: http://www.codeproject.com/aspnet/PagingLarge.asp

The ROWCOUNT method would not work if I specified 2 ORDER BY columns of different types, for example (Date2, T1.ID). If I just specify Date2 on the join, then the ROWCOUNT method takes 35 seconds for the 1000th page and the cursor method takes 3 minutes! But the first page takes 0 seconds with the ROWCOUNT method, while the cursor method still takes 3 minutes!

What is going on? Is there any good method at all to do paging, and is 35-50 seconds acceptable?
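For reference, the ROWCOUNT paging method from the linked article works in two steps: scan past the preceding pages to capture an anchor key, then fetch one page starting at that anchor. A sketch against the tables described above (variable names are illustrative, not the poster's actual code; ties on Date2 can cause rows to repeat across page boundaries):

```sql
DECLARE @PageSize int, @PageNum int, @Skip int, @Anchor datetime
SET @PageSize = 1000
SET @PageNum  = 1000
SET @Skip     = (@PageNum - 1) * @PageSize + 1

-- Step 1: scan up to the first row of the requested page, keeping its sort key.
SET ROWCOUNT @Skip
SELECT @Anchor = T2.Date2
FROM T1
JOIN T2 ON T1.ID = T2.ID
WHERE T2.[index] = 0
ORDER BY T2.Date2

-- Step 2: return one page starting at the anchor value.
SET ROWCOUNT @PageSize
SELECT T1.*, T2.Date2
FROM T1
JOIN T2 ON T1.ID = T2.ID
WHERE T2.[index] = 0
  AND T2.Date2 >= @Anchor
ORDER BY T2.Date2

SET ROWCOUNT 0  -- reset for the rest of the session
```

Step 1 still has to read through all the skipped rows, which is why the 1000th page is so much slower than the first.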
madhivanan
Premature Yak Congratulator
22864 Posts
dbuser123
Starting Member
20 Posts

Posted - 2006-02-10 : 10:29:59
Looked at all that. I'm looking at every single thing on the web. I want to know from your experience how you guys handle multiple joins and performance. What is an acceptable time to access the 50th page out of 100 pages?
TG
Master Smack Fu Yak Hacker
6065 Posts

Posted - 2006-02-10 : 11:07:21
The "best" method depends on some factors and what your priorities are. If pure speed is what you want and you're willing to sacrifice some data "timeliness", then denormalized, rolled-up, ranked reporting tables could be a solution. If you are willing to have fewer rows per page, that could make a significant difference.

Are these search queries where you could have widely varying results based on search parameters?

Table joins are not so much a factor as your data model and how optimized the queries are. No matter what paging strategy you use, if your query cannot return sorted data quickly, you'll never get fast results. Maybe you can post your table DDLs (with rowcounts) and your (un-paged) query.

A couple of questions:

Do you really need up to 1000 rows per page? Just getting that much data to a client can consume a lot of time.
Do you need to return the row numbers (of the entire result, not just for the page)?
Do data changes between page requests need to be reflected in subsequent page requests?
How many total rows could potentially be in a result?

Be One with the Optimizer

TG
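On the point that the query must return sorted data quickly: with the schema described in the first post, one option (a sketch, not something the thread itself proposes) is a composite index on T2 that serves both the [index] = 0 filter and the Date2 sort, so the plan can walk the index in order instead of sorting 1M+ joined rows:

```sql
-- Leading column matches the equality filter, second column matches the
-- ORDER BY, third covers the join key back to T1.
CREATE NONCLUSTERED INDEX IX_T2_index_Date2
ON T2 ([index], Date2, ID)
```

On SQL Server 2005 and later, ID could be an INCLUDEd column instead of a key column.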
dbuser123
Starting Member
20 Posts

Posted - 2006-02-10 : 11:19:40
It is difficult for me to post the DDLs; I'm not allowed to do that. I can give you any other details you want. All I care about is the speed of retrieval, of course based on simple search criteria and sorts.

> Do you really need up to 1000 rows per page? Just getting that much data to a client can consume a lot of time.

This is a Win32 client. Right now I'm more concerned about SQL taking the time. The time to reach the client from the server is very small.

> Do you need to return the row numbers (of the entire result, not just for the page)?

All I need to return is the total number of pages.

> Do data changes between page requests need to be reflected in subsequent page requests?

Nope. I'm using (NOLOCK) everywhere.

> How many total rows could potentially be in a result?

Total rows per page is 1000 because I'm choosing only the first index in T2. In total, as I said, there are 1.2M+ rows in T1 and 4M+ in T2.
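Since the only aggregate needed is the total page count, that can be computed once per search with a plain COUNT rather than as part of the paging query. A sketch against the tables from the first post (page size and hint usage taken from the thread):

```sql
DECLARE @TotalRows int, @PageSize int
SET @PageSize = 1000

SELECT @TotalRows = COUNT(*)
FROM T1 (NOLOCK)
JOIN T2 (NOLOCK) ON T1.ID = T2.ID
WHERE T2.[index] = 0

-- Integer ceiling division: pages = ceil(rows / pagesize)
SELECT (@TotalRows + @PageSize - 1) / @PageSize AS TotalPages
```

With NOLOCK in use, the count is only approximate while the tables are being written to, but that matches the "data changes need not be reflected" answer above.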
TG
Master Smack Fu Yak Hacker
6065 Posts

Posted - 2006-02-10 : 11:30:39
quote: How many total rows could potentially be in a result? Total rows per page is 1000 because I'm choosing only the first index in T2. In total, as I said, there are 1.2M+ rows in T1 and 4M+ in T2.

What I mean is: could someone make a request where they could page through 4000 pages of 1000-row pages (4 million rows)? Or can you enforce some "please narrow your search criteria" concept?

Be One with the Optimizer

TG
TG
Master Smack Fu Yak Hacker
6065 Posts

Posted - 2006-02-10 : 11:35:59
From a QA window, how long does it take to do:

select top 1000 <colList>
from <your tables>
order by <one of the possible sorts>

Be One with the Optimizer

TG
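Filled in with the tables from the original post, that timing test might look like the following (the column list is assumed; the point is to measure how fast the base query can produce sorted rows before any paging logic is layered on top):

```sql
-- Baseline: can the join return its first 1000 rows in sorted order quickly?
select top 1000 T1.ID, T1.Subject, T2.Date2
from T1
join T2 on T1.ID = T2.ID
where T2.[index] = 0
order by T2.Date2
```

If this is already slow, no paging method will be fast, since every method runs some variant of this query underneath.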