
 Persisting Search Data


Blastrix
Posting Yak Master

208 Posts

Posted - 2002-12-22 : 17:53:51
I have a lookup page which is pretty heavily used by different people, and generally has some pretty similar search criteria. The people in question generally look at the first 3-4 pages (even though there may be upwards of 100). Each time the search is performed, the query creates a temp table with an identity field, pumps it full of all records fitting the criteria, and then brings back the 20 or so records for that specific page.

This seems like a waste of resources to me. Is there a way that I can effectively search once, create a table (maybe with a unique naming scheme), fill it with the records, and then hit off of that table for a certain period of time, and then drop it? If there is a better way to do this than what I'm thinking of, please let me know.
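
For reference, the current per-search pattern looks roughly like this (a minimal sketch; the Products table, its columns, and the parameter values are placeholders):

-- In practice @SearchText and @Page are the search procedure's parameters
DECLARE @SearchText varchar(100), @Page int
SELECT  @SearchText = 'widget', @Page = 3

CREATE TABLE #Results
(
    RowNum    int IDENTITY(1,1) PRIMARY KEY,
    ProductID int NOT NULL
)

-- Fill the temp table with every record fitting the criteria
INSERT INTO #Results (ProductID)
SELECT ProductID
FROM   Products
WHERE  ProductName LIKE @SearchText + '%'
ORDER  BY ProductName

-- Bring back just the ~20 rows for the requested page
SELECT p.*
FROM   #Results r
       JOIN Products p ON p.ProductID = r.ProductID
WHERE  r.RowNum BETWEEN (@Page - 1) * 20 + 1 AND @Page * 20
ORDER  BY r.RowNum

DROP TABLE #Results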

Thanks,
Steve

nr
SQLTeam MVY

12543 Posts

Posted - 2002-12-22 : 18:11:59
How dynamic is your data?
You could keep a table with the search criteria and a table with the results.
Populate these whenever a search is made; then you only have to recreate the data when the underlying data changes.

You will be sending the data to a middle layer, so you could also save it there in a text file (maybe as XML) so that you don't have to get the data from the database at all - you just need some way of telling that the results are still valid.
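
A rough sketch of those two tables and the invalidation step (all object names here are assumptions):

CREATE TABLE SearchCriteria
(
    SearchID int IDENTITY(1,1) PRIMARY KEY,
    Criteria varchar(500) NOT NULL
)

CREATE TABLE SearchResult
(
    SearchID int NOT NULL REFERENCES SearchCriteria (SearchID),
    RowNum   int NOT NULL,
    RecordID int NOT NULL,    -- PK of the matching row in the data table
    PRIMARY KEY (SearchID, RowNum)
)

-- When the underlying data changes, throw the cached results away
TRUNCATE TABLE SearchResult   -- child table first; it is not referenced by any FK
DELETE FROM SearchCriteria    -- cannot TRUNCATE a table referenced by a FK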

==========================================
Cursors are useful if you don't know sql.
DTS can be used in a similar way.
Beer is not cold and it isn't fizzy.

Bambola
Posting Yak Master

103 Posts

Posted - 2002-12-23 : 02:43:52
I usually use this approach as well, but instead of using a temp table I use a table datatype that holds only an identity and the PK of the row I need. So the table is created in memory and holds the minimum data possible. Then I join the data table with this table datatype and take the corresponding page.
If you are talking about 100 pages, that should work fine. Check the execution plan and see if it works better for you.
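
Something along these lines (a minimal sketch; the table, columns, and parameter values are placeholders):

-- In practice @SearchText and @Page are the procedure's parameters
DECLARE @SearchText varchar(100), @Page int
SELECT  @SearchText = 'widget', @Page = 3

-- Hold only an identity and the PK of each matching row
DECLARE @Keys TABLE
(
    RowNum    int IDENTITY(1,1) PRIMARY KEY,
    ProductID int NOT NULL
)

INSERT INTO @Keys (ProductID)
SELECT ProductID
FROM   Products
WHERE  ProductName LIKE @SearchText + '%'
ORDER  BY ProductName

-- Join back to the data table and take the corresponding page
SELECT p.*
FROM   @Keys k
       JOIN Products p ON p.ProductID = k.ProductID
WHERE  k.RowNum BETWEEN (@Page - 1) * 20 + 1 AND @Page * 20
ORDER  BY k.RowNum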

If you choose the other approach of holding the data in a text file, you would have to write a trigger that deletes the file when the data changes.



Bambola.

GreatInca
Posting Yak Master

102 Posts

Posted - 2002-12-23 : 10:48:14
You can always exploit your search tracking, and/or stash the PKs in an array in a session variable and then query the details page by page.


jsmith8858
Dr. Cross Join

7423 Posts

Posted - 2002-12-23 : 12:41:54
How about something like this:

You have a table called "Search History":

SearchID, SearchText
1, blah blah
2, computers

When a user performs a search, you see if it is in the search history table. If not, you add it.

Then you have another table of "Search results":

SearchID, ForeignKey
1,2
1,5
1,10
2,12
2,25
2,63
..etc..

This keeps track of the IDs of the records returned by each search. If this is the first time you have searched on this search text, you append the results of the search to this table first.

The data you return is then based on linking this table to the table you are searching, filtered by the current searchID.

You empty your search results and search terms tables whenever your data changes.

This way, searches are "cached" and you can very quickly return the records you need to return. You could add a "Sequence" field or a "Matching%" field to your search results table as needed for paging or sorting.

If the search text is different or there are different fields, adjust the "Search text" field as needed to include all parameters they can search by. For example, you might have:

SearchID, City, State, Zip, ..etc...

Of course, the "Search results" table would probably get VERY big very quickly, so I have no idea if this will work; just throwing it out there! This would only work if a very high percentage of searches are truly redundant.
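
A minimal T-SQL sketch of the whole flow, with the Sequence field mentioned above for paging (table names follow the post; the Products table and the parameter values are assumptions):

CREATE TABLE SearchHistory
(
    SearchID   int IDENTITY(1,1) PRIMARY KEY,
    SearchText varchar(200) NOT NULL
)

CREATE TABLE SearchResults
(
    SearchID   int NOT NULL REFERENCES SearchHistory (SearchID),
    Sequence   int NOT NULL,    -- row number within this search, for paging
    ForeignKey int NOT NULL,    -- PK of the matching row in the data table
    PRIMARY KEY (SearchID, Sequence)
)
GO

-- In practice @SearchText and @Page are stored procedure parameters
DECLARE @SearchText varchar(200), @Page int, @SearchID int
SELECT  @SearchText = 'computers', @Page = 1

SELECT @SearchID = SearchID
FROM   SearchHistory
WHERE  SearchText = @SearchText

IF @SearchID IS NULL
BEGIN
    -- First time this text has been searched: run it and cache the hits
    INSERT INTO SearchHistory (SearchText) VALUES (@SearchText)
    SET @SearchID = SCOPE_IDENTITY()

    -- Use an identity in a temp table to number the rows (no ROW_NUMBER in 2000)
    CREATE TABLE #Hits (Sequence int IDENTITY(1,1), ForeignKey int)

    INSERT INTO #Hits (ForeignKey)
    SELECT ProductID
    FROM   Products
    WHERE  ProductName LIKE @SearchText + '%'
    ORDER  BY ProductName

    INSERT INTO SearchResults (SearchID, Sequence, ForeignKey)
    SELECT @SearchID, Sequence, ForeignKey FROM #Hits

    DROP TABLE #Hits
END

-- Every later page request is a cheap keyed lookup against the cache
SELECT p.*
FROM   SearchResults r
       JOIN Products p ON p.ProductID = r.ForeignKey
WHERE  r.SearchID = @SearchID
  AND  r.Sequence BETWEEN (@Page - 1) * 20 + 1 AND @Page * 20
ORDER  BY r.Sequence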

- Jeff

Blastrix
Posting Yak Master

208 Posts

Posted - 2002-12-23 : 23:19:32
Thanks guys, I'll try playing around with these different scenarios over the next couple of weeks and see how things go.

Steve


p2bl
Yak Posting Veteran

54 Posts

Posted - 2002-12-26 : 03:58:11
I don't think storing all results in a session variable is an ideal method, because if the result contains 2,000 rows and each row is 1 KB, each user will use up at least 2 MB of memory. So if you have many more search pages, or many simultaneous users, the web server will use up a lot of memory (and because the session has a timeout, that memory stays allocated until the session expires).

========================
look!
   
