Please start any new threads on our new site at https://forums.sqlteam.com. We've got lots of great SQL Server experts to answer whatever question you can come up with.

 sub query or inner join


bobyliu
Starting Member

14 Posts

Posted - 2005-02-16 : 14:40:39
How should I choose which one to use for best performance?

jsmith8858
Dr. Cross Join

7423 Posts

Posted - 2005-02-16 : 15:14:09
brief question, brief answer: it depends

- Jeff
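To make the "it depends" concrete, here is a sketch of the same filter written both ways (table and column names are hypothetical; on SQL Server the optimizer will often produce the same plan for both forms):

```sql
-- Hypothetical tables: Orders (large) and ArchiveBatch (small key list).

-- Form 1: inner join
SELECT o.OrderID, o.OrderDate
FROM Orders o
INNER JOIN ArchiveBatch b ON b.OrderID = o.OrderID;

-- Form 2: subquery
SELECT o.OrderID, o.OrderDate
FROM Orders o
WHERE o.OrderID IN (SELECT OrderID FROM ArchiveBatch);
```

One semantic difference worth knowing: if the join key in ArchiveBatch is not unique, the join form can return duplicate Orders rows, while the IN form cannot.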

bobyliu
Starting Member

14 Posts

Posted - 2005-02-16 : 18:52:29
So... what are some scenarios where I should use an inner join, and some scenarios where I should use subqueries, for better performance? Are there some good online articles or good books you can recommend?

tkizer
Almighty SQL Goddess

38200 Posts

Posted - 2005-02-16 : 19:04:33
Is this a homework question? Have you even looked them up inside SQL Server Books Online? Do you understand the differences between them?

Tara

bobyliu
Starting Member

14 Posts

Posted - 2005-02-16 : 19:58:04
Basically I am trying to archive a table into a table in our offline DB. The table I am trying to archive has about 10 million rows, so OBVIOUSLY I can't just move them all at once since it would blow up the transaction log. Homework question? I don't think I ever got to deal with a data set this big when I was going to school.

So anyway, I am trying to move the data in chunks: preselect a small set of rows to be archived, copy them to the table in the offline DB, and then purge them from the original table. So I am just trying to figure out whether I should join the small set with the big set or do it a totally different way. Right now it is taking about 40 minutes for each chunk (using an inner join). As for using subqueries in this case, I am going to say it would be so much worse that it would be unbearable.

The reason I asked about subqueries and inner joins is that I read some optimization tips somewhere that said something about them, but I can't remember where. By the way, the smaller data set is 1000 rows and the big data set (the table I am trying to archive) is 10 million rows. The smaller set is stored in a table variable.
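A minimal sketch of the chunked archive-and-purge pattern described above (all table, column, and database names here are hypothetical, and the batch key is assumed to be a unique OrderID):

```sql
-- Assumed: Orders is the 10-million-row source table,
-- OfflineDB..OrdersArchive is the destination, and
-- OrderID uniquely identifies a row.
DECLARE @batch TABLE (OrderID int PRIMARY KEY);

-- 1. Preselect a small chunk of keys to archive.
INSERT INTO @batch (OrderID)
SELECT TOP 1000 OrderID
FROM Orders
WHERE OrderDate < '20040101'
ORDER BY OrderID;

-- 2. Copy the chunk to the offline database.
INSERT INTO OfflineDB..OrdersArchive
SELECT o.*
FROM Orders o
INNER JOIN @batch b ON b.OrderID = o.OrderID;

-- 3. Purge the copied rows from the source table.
DELETE o
FROM Orders o
INNER JOIN @batch b ON b.OrderID = o.OrderID;
```

One caveat: SQL Server keeps no statistics on table variables, so joining a @table against a 10-million-row table can get a poor plan; an indexed temp table (#batch) is often faster for this, provided the join column on the big table is indexed.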

spirit1
Cybernetic Yak Master

11752 Posts

Posted - 2005-02-17 : 06:29:52
inner join with 1000 rows is taking 40 minutes?? wow...


Go with the flow & have fun! Else fight the flow

jsmith8858
Dr. Cross Join

7423 Posts

Posted - 2005-02-17 : 09:00:51
Again, it all depends on how you are doing things. Do you copy every row every time you archive? Why not just the rows that are new or have changed (i.e., using a datestamp or something similar)? In many transactional systems rows CAN'T change, so all you need to do is add the new ones. In addition, all of this depends on what you are joining to and on which columns.

Since it sounds like you are dealing with just two tables, why don't you give us the DDL for the two tables and the methodology by which you would like to archive the data, and we can help you with efficient SQL statements that will achieve that.

But understand that if you are copying 10 million rows from point A to point B, it won't matter what SQL statement you are executing; it will take a long time. That's a lot of data to move across a network -- does that make sense? Chances are, no matter what SQL statement you write (even using a cursor, as you mention), the bottleneck will be the cost of moving all that data across the network, and the destination DB would also have a lot of work to do just to save all that data to its hard drive, write to its transaction log (every transaction!), build indexes, etc.

(There was a poster here about 6 months ago who wanted to optimize a SQL statement that returned 2 million rows; he complained that he needed them all for a particular report and it wasn't fast enough. That report would have been something like 20,000 pages long!! The SQL statement itself wasn't the issue...)

- Jeff
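The incremental approach suggested above (copy only new rows, identified by a datestamp) can be sketched like this -- column and database names are hypothetical, and it assumes rows never change once inserted:

```sql
-- Assumed: OrderDate records when each row arrived, rows are immutable,
-- and OfflineDB..OrdersArchive already holds everything archived so far.
DECLARE @lastArchived datetime;

SELECT @lastArchived = MAX(OrderDate)
FROM OfflineDB..OrdersArchive;

-- Copy only rows newer than the last archived datestamp.
INSERT INTO OfflineDB..OrdersArchive
SELECT *
FROM Orders
WHERE OrderDate > @lastArchived;
```

This avoids re-scanning or re-copying the bulk of the 10 million rows on every run; only the delta since the last archive moves across the network.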