
Thanks Peso... great code. You can achieve a lot in a few lines in SQL...

I was sort of getting there myself, but would have taken me a few more hours to crack it I think!

Extremely fast for 100,000 rows and a small sample size (a few seconds), and still very performant for a sample size of 350 (5 minutes, as opposed to 35 minutes in my previous version!)

For the larger sample sizes (eg 350) could a procedural approach prove faster?

I thought I'd posted an update here... I adapted the procedural code to my needs and reduced the processing time to 20 seconds.

thanks for all the assistance!

So what does your final code look like?

--Jeff Moden
"Your lack of planning DOES constitute an emergency on my part... SO PLAN BETTER!"
"RBAR is pronounced 'ree-bar' and is a 'Modenism' for 'Row-By-Agonizing-Row'."

For MS SQL Server 2008, a moving average over the last 10 samples:

SELECT x.RowDate AS Date,
       AVG(y.RowValue) AS MovingAvg
FROM   (SELECT ROW_NUMBER() OVER (ORDER BY RowDate ASC) AS Row, RowDate FROM YourTable) AS x
INNER JOIN
       (SELECT ROW_NUMBER() OVER (ORDER BY RowDate ASC) AS Row, RowDate, RowValue FROM YourTable) AS y
       ON x.Row BETWEEN y.Row AND y.Row + 9
GROUP BY x.RowDate
ORDER BY x.RowDate ASC
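On SQL Server 2012 or later, the same trailing 10-sample average can be written more directly with a windowed AVG, avoiding the self-join entirely. This is a sketch against the same YourTable, and it assumes RowDate values are unique (as the ROW_NUMBER ordering above also implicitly does):

```sql
-- Trailing 10-row moving average: current row plus the 9 preceding rows.
-- For the first 9 rows the window is smaller, so AVG is taken over
-- whatever rows exist, matching the self-join version above.
SELECT RowDate,
       AVG(RowValue) OVER (ORDER BY RowDate
                           ROWS BETWEEN 9 PRECEDING AND CURRENT ROW) AS MovingAvg
FROM   YourTable
ORDER BY RowDate;
```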

I stumbled upon this query and find it very interesting and useful, especially because it does NOT use a lag function. I was wondering if you would care to explain it a bit:
1. First of all, what moving average are you calculating: the last 12 rows or 3 rows?
2. In the 3-way union, you have Actualdate as 0, 0 and 1. Did you mean it to be two zeroes, or 0, 1, 2?
3. Can you explain the overall logic of how this goes back a certain number of rows to get a moving average?

Thank you.

quote:Originally posted by SwePeso

I think the set-based code has the fastest time. And there's no need for a staging table either!

-- Prepare sample data
DECLARE @Sample TABLE (dt DATETIME, Rate FLOAT)
INSERT @Sample
SELECT CURRENT_TIMESTAMP - 10, 1 UNION ALL
SELECT CURRENT_TIMESTAMP - 9, 2 UNION ALL
SELECT CURRENT_TIMESTAMP - 8, 4 UNION ALL
SELECT CURRENT_TIMESTAMP - 7, 4 UNION ALL
SELECT CURRENT_TIMESTAMP - 6, 5 UNION ALL
SELECT CURRENT_TIMESTAMP - 5, 6 UNION ALL
SELECT CURRENT_TIMESTAMP - 4, 6 UNION ALL
SELECT CURRENT_TIMESTAMP - 3, 8 UNION ALL
SELECT CURRENT_TIMESTAMP - 2, 9 UNION ALL
SELECT CURRENT_TIMESTAMP - 1, 10 UNION ALL
SELECT CURRENT_TIMESTAMP - 0, 11 UNION ALL
SELECT CURRENT_TIMESTAMP + 1, 9
-- Peso 2: 3-row weighted moving average (weights 0.1, 0.2, 0.7)
SELECT  DATEADD(DAY, dt, '19000101') AS dt,
        AVG(Rate) AS SimpleMovingAvg,
        SUM(wr) AS WeightedMovingAvg
FROM    (
            -- Shifted forward 2 days: counts as the oldest row, weight 0.1
            SELECT  DATEDIFF(DAY, '18991230', dt) AS dt,
                    0.1 * Rate AS wr,
                    Rate,
                    0 AS Actualdate
            FROM    @Sample
            UNION ALL
            -- Shifted forward 1 day: counts as the middle row, weight 0.2
            SELECT  DATEDIFF(DAY, '18991231', dt),
                    0.2 * Rate,
                    Rate,
                    0
            FROM    @Sample
            UNION ALL
            -- Unshifted: counts as the current row, weight 0.7
            SELECT  DATEDIFF(DAY, '19000101', dt),
                    0.7 * Rate,
                    Rate,
                    1
            FROM    @Sample
        ) AS k
GROUP BY dt
-- Keep only dates that actually occur in the sample; the two phantom
-- dates created past the end of the series by the shifts are dropped.
HAVING  MAX(Actualdate) = 1
ORDER BY dt
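To answer the questions above: this is a 3-row weighted moving average. Each UNION branch re-emits every sample onto a later group date (by using an earlier DATEDIFF base date), so the group for date D collects Rate(D), Rate(D-1) and Rate(D-2), and SUM(wr) gives 0.7*Rate(D) + 0.2*Rate(D-1) + 0.1*Rate(D-2). On SQL Server 2012 or later, which does have LAG, the same result can be sketched more explicitly (assuming one row per day, as in the sample data):

```sql
-- 3-row weighted moving average via LAG (SQL Server 2012+).
-- Missing predecessors default to 0, matching the partial sums the
-- UNION version produces for the first two dates.
SELECT dt,
       0.7 * Rate
     + 0.2 * LAG(Rate, 1, 0) OVER (ORDER BY dt)
     + 0.1 * LAG(Rate, 2, 0) OVER (ORDER BY dt) AS WeightedMovingAvg
FROM   @Sample
ORDER BY dt;
```

The UNION ALL approach trades this readability for compatibility with SQL Server 2005 and earlier, which lack LAG.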