
 90 million rows and poorly indexed


numfar
Starting Member

2 Posts

Posted - 2006-01-15 : 15:02:23
I'm tasked with retrieving specific account data from a 90 million row table. I don't own the db and have no rights other than select.

I do know there are a few indexed fields, two of which I believe I can use to my advantage: POST_DATE and DETAIL_TYPE. Account is not an indexed field.

The table contains over 5 years' worth of data, and the data I require could be in any or all of those years.

I've tried using date criteria to restrict the number of rows, but still have horrible performance. I'm also restricting records based on DETAIL_TYPE, as there are 4 of those we don't need.

I currently have my selects in a DTS package, transporting the rows to a delimited text file. The 6 selects span an entire year and have been running for almost 2 hours, returning only 40 rows so far for this particular account.

Is there anything else I could do to the WHERE to speed this up?

Adding an index to the table is a no-go, and I'm not even sure I could at this point due to its large size.

Any advice would be appreciated.

Thanks,
d

eyechart
Master Smack Fu Yak Hacker

3575 Posts

Posted - 2006-01-15 : 15:19:10
Seeing that you have not provided any information regarding your query, the table, or the indexes that are currently in place, we have no advice for you. You have given us cryptic little glimpses, but that is nowhere near enough data for us to help you.

Anyway, chances are you will need to have some indexes added to your tables to service these queries. Insist that indexes be added, because if they aren't, your queries will adversely affect every other query hitting that database.
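
Something along these lines is what I would ask for; the table name is a placeholder and the account column name is a guess based on your description, so treat it as a sketch rather than the exact statement:

-- Sketch only: table name is a placeholder, column names are guesses
-- from the post above. Leading on the account column lets the account
-- filter seek instead of scanning all 90 million rows.
CREATE INDEX IX_Detail_Acct_PostDate
    ON dbo.DETAIL_TABLE (ACCT_ID, POST_DATE)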

Does your organization have some kind of policy against adding indexes?



-ec

rockmoose
SQL Natt Alfen

3279 Posts

Posted - 2006-01-15 : 15:31:42
You could try to pump the data into an Analysis Services cube, with appropriate dimensions for each dimensional attribute of the table.
With any luck there are lookup tables joined to the 90m row table for Account, DETAIL_TYPE, etc., so building dimensions from those should be very quick.
Once you load the data into the cube (basically select all rows) with a low aggregation factor (e.g. 5%), you can analyze the data very quickly.
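
For example, the fact-table source fed to the cube could just be a select over the big table joined to those lookups. A rough sketch, where every table and column name is assumed since we haven't seen the real schema:

SELECT d.POST_DATE,
       a.ACCT_NAME,          -- assumed account lookup column
       t.DETAIL_TYPE_DESC,   -- assumed detail-type lookup column
       d.AMOUNT              -- assumed measure column
FROM   dbo.DETAIL_TABLE AS d                                 -- the 90m row table (name assumed)
       JOIN dbo.ACCOUNT     AS a ON a.ACCT_ID = d.ACCT_ID
       JOIN dbo.DETAIL_TYPE AS t ON t.DETAIL_TYPE = d.DETAIL_TYPE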

Apart from that, proper indexes are like night and day; tell them that until there are indexes it is going to be horribly slow.

rockmoose

numfar
Starting Member

2 Posts

Posted - 2006-01-15 : 16:05:46
The database is not mine, and I've been told our license (for the billing system) does not allow us to make changes to the tables, including the addition of indexes.

In regard to what I'm doing...

SELECT Field1, Field2, Field3
FROM TABLE
WHERE POST_DATE BETWEEN '07/31/2001' AND '08/14/2001'
AND DETAIL_TYPE NOT IN (50,51,60,61)
AND ACCT_ID = 123

This is a sample of what I have to do; however, I have to do it from 07/31/2001 to the present day for 4 different accounts.

If I take "AND ACCT_ID = 123" out of the criteria, it runs pretty well, except that I cannot use 99% of the records retrieved.
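
Combined, one pass over a date window for all 4 accounts would look roughly like this; the end date and the extra account IDs are placeholders, not the real values:

SELECT Field1, Field2, Field3
FROM TABLE
WHERE POST_DATE BETWEEN '07/31/2001' AND '12/31/2005'  -- placeholder end date standing in for "present day"
AND DETAIL_TYPE NOT IN (50,51,60,61)
AND ACCT_ID IN (123, 124, 125, 126)                    -- 4 accounts; only 123 is from the post, the rest are made up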

Thanks
   
