
 All Forums
 SQL Server 2000 Forums
 SQL Server Development (2000)
 Derived table or not

Author  Topic 

Lumbago
Norsk Yak Master

3271 Posts

Posted - 2004-08-26 : 09:15:57
I'm comparing the sum of some volumes from one table (table1) with a fixed volume from another table (table2) and I need to select the rows where these two are different. table2 actually holds the aggregated data from table1 but in some cases they are different and I need to find the distinct MarketID where the volumes are different. This is the query I have which does the job:
SELECT DISTINCT MarketID
FROM (
    SELECT a.UserID, a.MarketID, a.CompID, a.Request, a.Price,
        Volume1 = CASE
                      WHEN a.Request = 1 THEN SUM(a.Volume)
                      ELSE SUM(a.Volume / (a.Price - 1))
                  END,
        b.Volume AS Volume2
    FROM table1 a
    INNER JOIN table2 b
        ON a.UserID = b.UserID
        AND a.MarketID = b.MarketID
        AND a.CompID = b.CompID
        AND a.Request = b.Request
        AND a.Price = b.Price
    WHERE a.UserID = @UserID
        AND a.Volume > 0
        AND b.Volume > 0
    GROUP BY a.UserID, a.MarketID, a.CompID, a.Request, a.Price, b.Volume
) AS dt1
WHERE dt1.Volume1 <> dt1.Volume2
This works fine but I suspect that the derived table workaround is killing performance and I really don't want that. I did it this way 'cause I couldn't figure out a proper HAVING-clause with the case and all...
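The kind of thing I was fumbling with looked roughly like this (an untested sketch, so no guarantees it even parses — the CASE has to be repeated in the HAVING clause since the column alias isn't visible there):

```sql
SELECT DISTINCT a.MarketID
FROM table1 a
INNER JOIN table2 b
    ON a.UserID = b.UserID
    AND a.MarketID = b.MarketID
    AND a.CompID = b.CompID
    AND a.Request = b.Request
    AND a.Price = b.Price
WHERE a.UserID = @UserID
    AND a.Volume > 0
    AND b.Volume > 0
GROUP BY a.UserID, a.MarketID, a.CompID, a.Request, a.Price, b.Volume
HAVING CASE
           WHEN a.Request = 1 THEN SUM(a.Volume)
           ELSE SUM(a.Volume / (a.Price - 1))
       END <> b.Volume
```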

Any tips or tricks are appreciated :)

--
Lumbago
"Real programmers don't document, if it was hard to write it should be hard to understand"

mohdowais
Sheikh of Yak Knowledge

1456 Posts

Posted - 2004-08-26 : 09:49:39
I would argue with that. Derived tables hardly ever hurt performance, and many queries would actually be so damn unreadable if you did not use a derived table. Have you looked at the execution plan to see if you can pinpoint any obvious bottlenecks? I'd imagine that a six-column JOIN is not going to be blazing fast on a few hundred thousand rows, and neither is a six-column GROUP BY. Could you remove any of these columns from the join or the group by?

OS

Lumbago
Norsk Yak Master

3271 Posts

Posted - 2004-08-26 : 10:08:00
The plan actually doesn't look too bad and the WHERE UserID = @UserID will limit the resultset quite substantially. The join will probably never be on more than a few thousand rows maximum as data will also be deleted quite frequently. Here's the plan:
|--Stream Aggregate(GROUP BY:([a].[MarketID]))
     |--Filter(WHERE:([Expr1004]<>[b].[Volume]))
          |--Compute Scalar(DEFINE:([Expr1004]=If ([a].[Request]=1) then ...
               |--Stream Aggregate(GROUP BY:([a].[MarketID], ...
                    |--Sort(ORDER BY:([a].[MarketID] ASC, ...
                         |--Nested Loops(Inner Join, OUTER REFERENCES:([a].[MarketID]...
                              |--Clustered Index Seek(OBJECT:...
                              |--Clustered Index Seek(OBJECT:...


--
Lumbago
"Real programmers don't document, if it was hard to write it should be hard to understand"

rockmoose
SQL Natt Alfen

3279 Posts

Posted - 2004-08-26 : 12:52:43
Hi,

Maybe you could avoid doing the join inside the derived table.

e.g.

SELECT DISTINCT a.MarketID
( SELECT UserID, MarketID, CompID, Request, Price,
Volume = CASE WHEN a.Request = 1 THEN SUM(a.Volume)
ELSE SUM(a.Volume/(a.Price-1)) END
FROM table1
WHERE Volume > 0
GROUP BY UserID, MarketID, CompID, Request, Price ) a
INNER JOIN table2 b
ON a.UserID = b.UserID
AND a.MarketID = b.MarketID
AND a.CompID = b.CompID
AND a.Request = b.Request
AND a.Price = b.Price
WHERE
a.Volume <> b.Volume


/rockmoose

X002548
Not Just a Number

15586 Posts

Posted - 2004-08-26 : 13:28:29
I think it might look more like..


SELECT DISTINCT MarketID
FROM (
SELECT UserID
, MarketID
, CompID
, Request
, Price
, CASE WHEN a.Request = 1
THEN SUM(a.Volume)
ELSE SUM(a.Volume/(a.Price-1))
END AS Volume
FROM Table 1
WHERE a.UserID = @UserID
AND a.Volume > 0
GROUP BY a.UserID, a.MarketID, a.CompID, a.Request, a.Price, b.Volume) AS a
INNER JOIN table2 b
ON a.UserID = b.UserID
AND a.MarketID = b.MarketID
AND a.CompID = b.CompID
AND a.Request = b.Request
AND a.Price = b.Price
WHERE b.Volume > 0
AND a.Volume <> b.Volume



Brett

8-)

rockmoose
SQL Natt Alfen

3279 Posts

Posted - 2004-08-26 : 14:30:51
The SQL parser would have caught the syntax error ;-), though not as fast as you.
Thanx002548 for the correction.


SELECT DISTINCT a.MarketID
FROM (
    SELECT UserID
        , MarketID
        , CompID
        , Request
        , Price
        , CASE WHEN Request = 1
               THEN SUM(Volume)
               ELSE SUM(Volume / (Price - 1))
          END AS Volume
    FROM table1
    WHERE UserID = @UserID
        AND Volume > 0
    GROUP BY UserID, MarketID, CompID, Request, Price
) AS a
INNER JOIN table2 b
    ON a.UserID = b.UserID
    AND a.MarketID = b.MarketID
    AND a.CompID = b.CompID
    AND a.Request = b.Request
    AND a.Price = b.Price
WHERE b.Volume > 0
    AND a.Volume <> b.Volume
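And if it ever does get slow once the tables fill up, indexes on the filter/join/grouping columns would be the first thing to try. Something like this (hypothetical index names, and assuming nothing similar exists already):

```sql
-- Hypothetical supporting indexes; UserID first to match the WHERE clause,
-- then the remaining GROUP BY / join columns.
CREATE INDEX IX_table1_UserMarket
    ON table1 (UserID, MarketID, CompID, Request, Price)

CREATE INDEX IX_table2_UserMarket
    ON table2 (UserID, MarketID, CompID, Request, Price)
```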


/rockmoose

Lumbago
Norsk Yak Master

3271 Posts

Posted - 2004-08-30 : 08:41:37
Sorry for replying so late but I took a few days off. I deserved it. The plan looks a little better I think; one of the clustered seeks was moved quite a bit forward, which is better, isn't it...? ->
|--Stream Aggregate(GROUP BY:([a].[MarketID]))
     |--Nested Loops(Inner Join, ...
          |--Compute Scalar(DEFINE:([Expr1003]=...
          |    |--Stream Aggregate(GROUP BY:...
          |         |--Sort(ORDER BY:([a].[MarketID] ASC, ...
          |              |--Clustered Index Seek(OBJECT:...
          |--Clustered Index Seek(OBJECT:...
I think I'll stick with this one, thanx a lot!

--
Lumbago
"Real programmers don't document, if it was hard to write it should be hard to understand"

rockmoose
SQL Natt Alfen

3279 Posts

Posted - 2004-08-30 : 08:50:14
Good, so is there a measurable difference in seconds?
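You can get hard numbers from Query Analyzer with the STATISTICS options — run each version of the query between them and compare the figures printed in the Messages pane:

```sql
-- Report elapsed/CPU time and logical reads per statement
SET STATISTICS TIME ON
SET STATISTICS IO ON

-- ... run the original derived-table query here, then the rewrite ...

SET STATISTICS TIME OFF
SET STATISTICS IO OFF
```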



rockmoose
/* Chaos is the nature of things...Order is a lesser state of chaos */

Lumbago
Norsk Yak Master

3271 Posts

Posted - 2004-08-30 : 08:56:28
Nah, hehe...I only have about 20 rows in the table so I seriously doubt there will be any noticeable difference as it is now. But as my awareness of properly formatted DML/indexes has increased quite drastically lately, I try to make the queries as efficient as possible.

--
Lumbago
"Real programmers don't document, if it was hard to write it should be hard to understand"

derrickleggett
Pointy Haired Yak DBA

4184 Posts

Posted - 2004-08-30 : 10:12:41
Lumbago, I was looking through some old posts. You've really come a long way in the last several months. Just wanted to let you know. Keep up the good work. It's a fun learning experience, isn't it? :)

MeanOldDBA
derrickleggett@hotmail.com

When life gives you a lemon, fire the DBA.

Lumbago
Norsk Yak Master

3271 Posts

Posted - 2004-08-30 : 10:24:24
Hehe, thanx Derrick...and it's true! I have learned sooo much since coming here you wouldn't believe it. And it's all because of quite a few pretty darn amazing guys and one darn amazing gal (there's only one, right??)! I have used forums for quite some time but never ever come across a forum with so many highly skilled and so VERY helpful people in one place. There is still quite a way to go to reach the levels of you ol'timers but I'm getting closer every passing day...

--
Lumbago
"Real programmers don't document, if it was hard to write it should be hard to understand"

rockmoose
SQL Natt Alfen

3279 Posts

Posted - 2004-08-30 : 14:21:57
quote:
Originally posted by Lumbago

Nah, hehe...I only have about 20 rows in the table so I seriously doubt there will be any noticeable difference as it is now. But as my awareness of properly formatted DML/indexes has increased quite drastically lately, I try to make the queries as efficient as possible.



All this for 20 records you !!!

*lol*


rockmoose
/* Chaos is the nature of things...Order is a lesser state of chaos */

Lumbago
Norsk Yak Master

3271 Posts

Posted - 2004-08-30 : 15:05:53
Hehe, don't worry my good neighbour...I'm designing this for a brand new part of my website so it's still only in development. When I put it into production, table1 will probably have around 100k records and table2 maybe around half of that. And for every single additional row in the table our turnover increases, so let's hope it grows *really* big :)

--
Lumbago
"Real programmers don't document, if it was hard to write it should be hard to understand"
   
