AskSQLTeam
Ask SQLTeam Question
0 Posts
Posted - 2002-11-20 : 08:30:55
tina writes "Hello,

I have paid a consultant to develop an ASP/SQL application [SQL Server 2000]. We recently discovered that when there are as few as 50,000 records in the database, it takes a long time to get back 'Search Results'. This is partly because each query contains about 10 smaller queries.

ISSUE 1: The slowest queries are the ones for which there are many search results. However, we can't set an upper limit on the number of items in the search results, because we have the features 'display oldest first' and 'display most popular first'.

POSSIBLE SOLUTION: My consultant suggests that we identify the X queries that take up most server time [i.e. (average server time per query) * (number of times per day the query is executed)]. Every 24 hours, we can generate the HTML 'search results' pages for each of these queries. These pages will take up about 20 GB of hard disk space on our dedicated server. When a surfer submits a query, we can then check whether it is one of the X queries for which we generated HTML search results pages earlier in the day. If it is, our code does not use the database to generate the search results page [which takes about 10-15 seconds]; instead, it displays the appropriate search results page generated earlier for this query.

May I please ask your advice about this? Do you believe this is a reasonably efficient solution? Will this indeed save server time? When each search results page takes 10 seconds to complete {10 queries are involved in creating this page - i.e. the categories in each of the 6 drop-down menus change to reflect the categories available to search within the search results}, and each surfer contributes about 10-20 queries, our server will not be able to accommodate more than 500-1000 surfers/day without crashing. I am hoping the method above will free up the server and allow us to handle higher traffic per day without crashing.

Thank you very much for your kind help"
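For reference, the weighting tina's consultant describes, (average server time per query) * (number of executions per day), could be computed with a query along these lines. This is only a minimal sketch: it assumes a hypothetical QueryLog table that the application (or a Profiler trace) fills with one row per executed search, none of these names come from the post, and TOP 50 stands in for the X most expensive queries.

-- Hypothetical log table, not from the original application:
-- CREATE TABLE QueryLog (QueryText varchar(1000), DurationMs int, ExecutedAt datetime)

-- Rank queries by (average duration) * (executions in the last 24 hours),
-- the weight described in the post.
SELECT TOP 50
    QueryText,
    AVG(DurationMs)            AS AvgDurationMs,
    COUNT(*)                   AS ExecutionsLast24h,
    AVG(DurationMs) * COUNT(*) AS TotalServerTimeMs
FROM QueryLog
WHERE ExecutedAt >= DATEADD(hour, -24, GETDATE())
GROUP BY QueryText
ORDER BY TotalServerTimeMs DESC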
ValterBorges
Master Smack Fu Yak Hacker
1429 Posts
Posted - 2002-11-20 : 09:08:41
Can you show us how you're storing those 50,000 records, along with some sample queries? Then we might be able to evaluate your consultant's work.

How big is the HTML file that one of your searches produces? If it contains a lot of data, then most likely the 10 seconds you're experiencing is your web browser loading all the data.

What is your network bandwidth? Are you accessing the ASP pages on a local network? Are you using a dedicated server? What are your server specs?

If your queries are written inline in ASP/ADO, then I would suggest moving them to stored procedures, as in the sketch below.

If it's not the size of the HTML file as described above, then I do believe that caching the results would indeed speed up your process; however, in doing so you are limiting the ability to perform real-time analysis and also creating a lot more work.
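As a minimal illustration of the stored procedure suggestion (the Items table and its columns are hypothetical stand-ins, not details from tina's application), an inline ASP/ADO search query moved into a procedure might look like this:

-- Hypothetical search procedure; all table and column names are assumptions.
CREATE PROCEDURE dbo.usp_SearchItems
    @CategoryID int,
    @SortBy     varchar(10)   -- 'oldest' or 'popular'
AS
SET NOCOUNT ON

SELECT ItemID, Title, CreatedDate, ViewCount
FROM dbo.Items
WHERE CategoryID = @CategoryID
ORDER BY
    -- 'oldest' sorts ascending by date; 'popular' descending by view count.
    CASE WHEN @SortBy = 'oldest'  THEN CreatedDate END ASC,
    CASE WHEN @SortBy = 'popular' THEN ViewCount   END DESC
GO

The ASP page would then execute the procedure through an ADO Command object instead of concatenating SQL strings, which lets SQL Server reuse a cached execution plan and keeps the query logic in one place.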
jackstow
Posting Yak Master
160 Posts
Posted - 2002-11-21 : 04:45:59
ASP.NET has fully featured caching capabilities built in. Maybe it would be worth porting your ASP search page to an ASP.NET page.
Bambola
Posting Yak Master
103 Posts
Posted - 2002-11-21 : 06:07:35
To me it sounds like something that can be solved with a good design, indexes, and query optimization. 50,000 records is very little, and I cannot see how it would create such big problems. How about posting some of the queries so we can help more?

quote: tina writes "Hello, I have paid a consultant to develop an ASP/SQL application [SQL Server 2000]. We recently discovered that when there are as few as 50,000 records in the database, it takes a long time to get back 'Search Results'. [...]"

Bambola.
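To make the indexing suggestion concrete, here is a minimal sketch using the same hypothetical Items schema as the stored procedure above (the column choices are assumptions, not details from the post):

-- Hypothetical indexes matching the two sort orders the site offers;
-- Items and its columns are assumed names.
CREATE INDEX IX_Items_Category_Created ON dbo.Items (CategoryID, CreatedDate)
CREATE INDEX IX_Items_Category_Views   ON dbo.Items (CategoryID, ViewCount)

With indexes like these, 'display oldest first' and 'display most popular first' can each be answered with an index seek and an ordered range scan instead of sorting the whole result set on every request.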