
 Can someone give me some ideas Please!!!


Nitu
Yak Posting Veteran

81 Posts

Posted - 2006-05-02 : 15:27:33
Hi,

I need some help with transferring data from one table to another.

The structures of the tables are like this:
Table1:
[UniqueID] [int] NOT NULL ,
[Yr1] [decimal](18, 6) NOT NULL ,
[Yr2] [decimal](18, 6) NOT NULL ,
[Yr3] [decimal](18, 6) NOT NULL ,
[Yr4] [decimal](18, 6) NOT NULL ,
[Yr5] [decimal](18, 6) NOT NULL ,
[Yr6] [decimal](18, 6) NOT NULL ,
[Yr7] [decimal](18, 6) NOT NULL ,
[Yr8] [decimal](18, 6) NOT NULL ,
[Yr9] [decimal](18, 6) NOT NULL ,
[Yr10] [decimal](18, 6) NOT NULL ,
[LastModified] [smalldatetime] NOT NULL

Table 2:
[UniqueID] [int] Not NULL ,
[price] [decimal](18, 6) NULL ,
[price_start_date] [smalldatetime] NULL ,
[price_stop_date] [smalldatetime] NULL ,
[price_replaced_date] [smalldatetime] NULL

Data has to move from Table1 to Table2. In the first table, the Yr1, Yr2, ... fields hold prices.
The way the data should go into Table2 is:
1. table2.price_start_date is table1.lastmodified.
2. If the prices in all 10 years in table1 are the same, then price_stop_date should be table1.lastmodified + 10 years, and table2.price should be the price given in all the year fields.
3. If the prices are not the same in all 10 years, then at the year where the price changes in table1, the price_stop_date should be that many years added on, and table2.price should be the price given in those year fields.
The next record should start from the next year, and this repeats until year 10 is reached.

I'd really appreciate anyone who can help me with this one.

Thanks a lot in advance,
Nitu


tkizer
Almighty SQL Goddess

38200 Posts

Posted - 2006-05-02 : 15:31:21
Duplicate thread:
http://www.sqlteam.com/forums/topic.asp?TOPIC_ID=65457

You are going to need to provide us some sample data to work with. Provide at least 10 rows of data for the source table. This should be in the form of INSERT INTO statements so that we can run things on our own machines. Then provide a data example for your 3 requirements, making sure to illustrate what the destination table would look like.

Tara Kizer
aka tduggan

Nitu
Yak Posting Veteran

81 Posts

Posted - 2006-05-02 : 16:05:06
Thanks for your reply,

Here is a sample source table with 10 rows,


UniqueID Yr1 Yr2 Yr3 Yr4 Yr5 Yr6 Yr7 Yr8 Yr9 Yr10 LastModified
1 278.68 270.32 262.21 262.21 262.21 262.21 262.21 262.21 262.21 262.21 10/10/2005
2 401.23 389.19 377.52 377.52 377.52 377.52 377.52 377.52 377.52 377.52 10/10/2005
3 472.37 458.2 444.45 444.45 444.45 444.45 444.45 444.45 444.45 444.45 10/10/2005
4 594.92 577.07 559.76 559.76 559.76 559.76 559.76 559.76 559.76 559.76 10/10/2005
5 672.32 652.15 632.59 632.59 632.59 632.59 632.59 632.59 632.59 632.59 10/10/2005
6 749.72 727.23 705.41 705.41 705.41 705.41 705.41 705.41 705.41 705.41 10/10/2005
7 827.12 802.31 778.24 778.24 778.24 778.24 778.24 778.24 778.24 778.24 10/10/2005
8 904.52 877.38 851.06 851.06 851.06 851.06 851.06 851.06 851.06 851.06 10/10/2005
9 957.89 929.15 901.28 901.28 901.28 901.28 901.28 901.28 901.28 901.28 10/10/2005
10 6600 6402 6209.94 6209.94 6209.94 6209.94 6209.94 6209.94 6209.94 6209.94 10/10/2005



After all the processing, the destination table should look like this:


UniqueID Price Price_Start_Date Price_Stop_Date Price_Replace_Date
1 278.68 10/10/2005 10/10/2006 <Not worried about this>
1 270.32 10/10/2006 10/10/2007
1 262.21 10/10/2007 10/10/2015
2 401.23 10/10/2005 10/10/2006
2 389.19 10/10/2006 10/10/2007
2 377.52 10/10/2007 10/10/2015
3 472.37 10/10/2005 10/10/2006
3 458.2 10/10/2006 10/10/2007
3 444.45 10/10/2007 10/10/2015
4 594.92 10/10/2005 10/10/2006
4 577.07 10/10/2006 10/10/2007
4 559.76 10/10/2007 10/10/2015
5 672.32 10/10/2005 10/10/2006
5 652.15 10/10/2006 10/10/2007
5 632.59 10/10/2007 10/10/2015
6 749.72 10/10/2005 10/10/2006
6 727.23 10/10/2006 10/10/2007
6 705.41 10/10/2007 10/10/2015
7 827.12 10/10/2005 10/10/2006
7 802.31 10/10/2006 10/10/2007
7 778.24 10/10/2007 10/10/2015
8 904.52 10/10/2005 10/10/2006
8 851.06 10/10/2006 10/10/2015
9 957.89 10/10/2005 10/10/2006
9 929.15 10/10/2006 10/10/2007
9 901.28 10/10/2007 10/10/2010
9 930.76 10/10/2010 10/10/2013
9 921.54 10/10/2013 10/10/2015
10 6600 10/10/2005 10/10/2006
10 6402 10/10/2006 10/10/2007
10 6209.94 10/10/2007 10/10/2015



Let's take record #1 from Table1:

  • The rates are not equal for the first 2 years, so the first record in table2 has the Yr1 price, with table1.lastmodified as the start date and one year later as the stop date.

  • The rates are also not equal between the 2nd and 3rd years, so the second record in table2 has the Yr2 price; its start date is where the previous record for this UniqueID stopped, and its stop date is one year after that start date.

  • Since all the remaining years have the same price, the stop date is lastmodified + 10 years (10/10/2015) and the price is the value shared by all the remaining years.
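[Editor's note: the walkthrough above is essentially a run-length encoding of the ten year columns into (price, start, stop) ranges. A minimal illustrative sketch of just that logic in Python — the function name and the use of bare years in place of the 10/10 dates are my own simplifications; the actual work would of course be done in T-SQL:]

```python
def price_segments(prices, start_year):
    """Collapse consecutive equal yearly prices into (price, start, stop) ranges."""
    segments = []
    seg_start = 0  # index of the year that opened the current segment
    for i in range(1, len(prices)):
        if prices[i] != prices[seg_start]:
            # price changed: close the current segment at year i
            segments.append((prices[seg_start], start_year + seg_start, start_year + i))
            seg_start = i
    # the last segment runs through the end of year 10
    segments.append((prices[seg_start], start_year + seg_start, start_year + len(prices)))
    return segments

# Record #1 from the sample data above:
row1 = [278.68, 270.32] + [262.21] * 8
print(price_segments(row1, 2005))
# → [(278.68, 2005, 2006), (270.32, 2006, 2007), (262.21, 2007, 2015)]
```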



I hope I explained everything I could; let me know if you need any more details.

Thanks a lot in advance,
Nitu

RyanRandall
Master Smack Fu Yak Hacker

1074 Posts

Posted - 2006-05-03 : 07:55:03
Hi Nitu,

I'm not sure the data in your example matches the result (e.g. for id 9), so I modified the data as I think you intended, and put this together...

--data
declare @t table (UniqueID int, Yr1 money, Yr2 money, Yr3 money, Yr4 money, Yr5 money, Yr6 money, Yr7 money, Yr8 money, Yr9 money, Yr10 money, LastModified datetime)
insert @t
select 1, 278.68, 270.32, 262.21, 262.21, 262.21, 262.21, 262.21, 262.21, 262.21, 262.21, '10/10/2005'
union all select 2, 401.23, 389.19, 377.52, 377.52, 377.52, 377.52, 377.52, 377.52, 377.52, 377.52, '10/10/2005'
union all select 3, 472.37, 458.2, 444.45, 444.45, 444.45, 444.45, 444.45, 444.45, 444.45, 444.45, '10/10/2005'
union all select 4, 594.92, 577.07, 559.76, 559.76, 559.76, 559.76, 559.76, 559.76, 559.76, 559.76, '10/10/2005'
union all select 5, 672.32, 652.15, 632.59, 632.59, 632.59, 632.59, 632.59, 632.59, 632.59, 632.59, '10/10/2005'
union all select 6, 749.72, 727.23, 705.41, 705.41, 705.41, 705.41, 705.41, 705.41, 705.41, 705.41, '10/10/2005'
union all select 7, 827.12, 802.31, 778.24, 778.24, 778.24, 778.24, 778.24, 778.24, 778.24, 778.24, '10/10/2005'
union all select 8, 904.52, 851.06, 851.06, 851.06, 851.06, 851.06, 851.06, 851.06, 851.06, 851.06, '10/10/2005'
union all select 9, 957.89, 929.15, 901.28, 901.28, 901.28, 930.76, 930.76, 930.76, 921.54, 921.54, '10/10/2005'
union all select 10, 6600, 6402, 6209.94, 6209.94, 6209.94, 6209.94, 6209.94, 6209.94, 6209.94, 6209.94, '10/10/2005'

--transposition
declare @u table (id int, value money, date datetime, isstart bit, isend bit)
insert @u (id, value, date)
select UniqueID, Yr1, LastModified from @t
union all select UniqueID, Yr2, dateadd(yyyy, 1, LastModified) from @t
union all select UniqueID, Yr3, dateadd(yyyy, 2, LastModified) from @t
union all select UniqueID, Yr4, dateadd(yyyy, 3, LastModified) from @t
union all select UniqueID, Yr5, dateadd(yyyy, 4, LastModified) from @t
union all select UniqueID, Yr6, dateadd(yyyy, 5, LastModified) from @t
union all select UniqueID, Yr7, dateadd(yyyy, 6, LastModified) from @t
union all select UniqueID, Yr8, dateadd(yyyy, 7, LastModified) from @t
union all select UniqueID, Yr9, dateadd(yyyy, 8, LastModified) from @t
union all select UniqueID, Yr10, dateadd(yyyy, 9, LastModified) from @t
order by UniqueID, LastModified

--calculation
update a set
isstart = case when a.value = b.value then 0 else 1 end,
isend = case when a.value = c.value then 0 else 1 end
from @u a
left outer join @u b on a.id = b.id and a.date = dateadd(yyyy, 1, b.date)
left outer join @u c on a.id = c.id and a.date = dateadd(yyyy, -1, c.date)

select
id,
a.value,
date as startdate,
(select top 1 dateadd(yyyy, 1, date) from @u
where id = a.id and date >= a.date and isend = 1 order by date) as enddate
from @u a
where isstart = 1


Ryan Randall
www.monsoonmalabar.com London-based IT consultancy

Solutions are easy. Understanding the problem, now, that's the hard part.

Nitu
Yak Posting Veteran

81 Posts

Posted - 2006-05-03 : 11:07:46
Thank you so much Ryan.

I have never used table variables until now. I am a newbie in SQL Server, so you gave me a new thing to learn today.

I went through some articles on table variables after seeing your code, and I have a question about them: I have to deal with roughly 150,000 records, and this number will most probably grow. For such a large amount of data, is it advisable to use table variables or temporary tables?

Thanks a lot for your help,
will be waiting for your answer,

--Nitu

RyanRandall
Master Smack Fu Yak Hacker

1074 Posts

Posted - 2006-05-03 : 12:30:39
Do you need to use table variables / temporary tables at all? I used them to demonstrate the technique only. Won't you just have your fixed Table1 and Table2?

In general, with speed comparisons you can never know for sure without trying things out, but as a rule of thumb I'd say: avoid 'interim' tables if you can; if you can't - and you have the option - lean towards table variables for smaller amounts of data and temporary tables for larger amounts of data.


Ryan Randall
www.monsoonmalabar.com London-based IT consultancy

Solutions are easy. Understanding the problem, now, that's the hard part.

tkizer
Almighty SQL Goddess

38200 Posts

Posted - 2006-05-03 : 12:32:11
You don't need a table variable or a temporary table. He is using them to represent your real tables, so that he can run things on his own machine. So ignore the declaring of both tables and the inserts, as that's just to set up your environment on his machine.

BTW, when I asked for INSERT INTO statements for your sample data, this is why I asked for them. Since you didn't provide the data in that format, I couldn't work on your problem, as I didn't have the time to write the statements up myself. Next time, write them up and you'll find lots of people willing to assist you with your problem.

Tara Kizer
aka tduggan

madhivanan
Premature Yak Congratulator

22864 Posts

Posted - 2006-05-04 : 03:09:16
Also read about Normalisation
http://www.datamodel.org/NormalizationRules.html

Madhivanan

Failing to plan is Planning to fail