Jimminy Cricket
Starting Member
3 Posts |
Posted - 2005-06-01 : 10:48:58
Hi,

I've had a search through various different posts and articles about separating out comma-delimited strings, but none seem to do what I want. I have a table which has one column containing a string of comma-delimited data:

0011192313,,,,,,Clno,245,248,48520,536299
0015989704,,,,,,Clno,246,142250,418348,659353
0011188294,2,,,,,Mailname,Mrs J Cox,Mr & Mrs J P Cox

I need to be able to separate the data out into a new table which will have a column for each separate value, but still keeping them in the same row. All the other articles I've seen split the data out into multiple rows, whereas I need to keep each row together.

Cheers,
James
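For illustration only (this table and its column names are not from the original post), the target shape James describes might be sketched as:

```sql
-- Hypothetical target: one column per delimited value, with all
-- values from a single source string staying on the same row.
Create Table MyTargetTable (
    Col01 varchar(50), Col02 varchar(50), Col03 varchar(50), Col04 varchar(50),
    Col05 varchar(50), Col06 varchar(50), Col07 varchar(50), Col08 varchar(50),
    Col09 varchar(50), Col10 varchar(50), Col11 varchar(50)
)
-- The first sample row would then become:
-- ('0011192313', '', '', '', '', '', 'Clno', '245', '248', '48520', '536299')
```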
rockmoose
SQL Natt Alfen
3279 Posts |
Posted - 2005-06-01 : 11:29:27
Maybe this is useful: [url]http://www.sqlteam.com/forums/topic.asp?TOPIC_ID=46188[/url]

rockmoose
AndyB13
Aged Yak Warrior
583 Posts |
Posted - 2005-06-01 : 13:29:11
Or this article: [url]http://www.sqlteam.com/item.asp?ItemID=2652[/url]

Beauty is in the eyes of the beerholder
KLang23
Posting Yak Master
115 Posts |
Posted - 2005-06-01 : 19:34:07
Hi,

This "table-specific" function will return a result set, each row containing the primary key of your table and one of the delimited values. It assumes that the length of your comma-delimited string column is not larger than 8K. You can join this result set back to the original table to get other columns. I haven't read the links in the previous posts, so if this solution is the same, please forgive me.

The function assumes that your table exists; below is an example table which the function references. Good luck!

Set NoCount ON

Create Table MyTable (MyKey int not null identity, MyString Varchar(256))

Insert Into MyTable (MyString) Values ('A,B,C,D,E,F,G,H,I')
Insert Into MyTable (MyString) Values ('ONE,TWO,THREE,FOUR')
Insert Into MyTable (MyString) Values ('Abra,Cadabra,Here,I,Am')
GO

Below is the function definition. Replace the table and column names of the source table to match your own.

Create FUNCTION DBO.fn_ParseMyTable ()
RETURNS @TableOut TABLE (MyKey int, MyCol varchar(50))
AS
BEGIN
    Declare @Start int
    Declare @Length int
    Declare @MyKey int
    Declare @MyString varchar(8000)
    Declare @MyCol Varchar(50)

    Set @Start = 1

    Declare C1 Cursor For Select MyKey, MyString From MyTable
    Open C1
    Fetch C1 Into @MyKey, @MyString

    While @@Fetch_Status = 0
    Begin
        -- Insert a terminating comma
        If Right(@MyString, 1) <> ','
        Begin
            Set @MyString = @MyString + ','
        End

        Set @Length = (Charindex(',', @MyString) - 1)

        -- Loop through (parse) values
        While Charindex(',', @MyString, @Start) > 0
        Begin
            Set @Length = (Charindex(',', @MyString, @Start) - @Start)
            Set @MyCol = Substring(@MyString, @Start, @Length)
            Set @Start = @Start + @Length + 1
            Insert into @TableOut (MyKey, MyCol) Values (@MyKey, @MyCol)
        End

        Fetch C1 Into @MyKey, @MyString
        Set @Start = 1
    End

    Close C1
    Deallocate C1
    RETURN
END
GO

This executes the function:

Select * From DBO.fn_ParseMyTable()

Returns:

MyKey       MyCol
----------- -----
1           A
1           B
1           C
1           D
1           E
1           F
1           G
1           H
1           I
2           ONE
2           TWO
2           THREE
2           FOUR
3           Abra
3           Cadabra
3           Here
3           I
3           Am
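As a sketch of the "join back to the original table" step mentioned above (assuming the example MyTable and fn_ParseMyTable from this post), the join might look like:

```sql
-- Recover the other columns of MyTable alongside each parsed value.
Select t.MyKey, t.MyString, p.MyCol
From MyTable t
Join DBO.fn_ParseMyTable() p On p.MyKey = t.MyKey
Order By t.MyKey
```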
AndyB13
Aged Yak Warrior
583 Posts |
Posted - 2005-06-01 : 19:48:36
quote: Originally posted by KLang23: I haven't read the links in the previous posts, so if this solution is the same, please forgive me.

Maybe you should have, because a £/$ to a piece of sh*t says the set-based method will be far superior to a cursor! Read the links.

Andy

Beauty is in the eyes of the beerholder
KLang23
Posting Yak Master
115 Posts |
Posted - 2005-06-02 : 16:20:28
quote: Maybe you should have because a £/$ to a piece of sh*t says the set based method will be far superior to a cursor! Read the links Andy
I don't like your tone, Andy. If you're not going to be constructive, you have no business posting here. I took the time to code and present a solution, and I guarantee that in this situation it will perform just as well as the solution in the link. Let's see your handiwork instead of a post to a link, the details of which I'm certain you don't comprehend. I'm certain of that because I didn't solve the problem, and neither did the link: the original post asked for the individual values of the delimited string to be generated horizontally, and my solution, as well as your "link", generates a vertical result set. We both missed the boat, although I demonstrated a little more class than you did.

I can't figure out the slang translation of "£/$" in your post. If your goal is to insult me, and your opinion is that I'm not too clever, you've got to make the insult simple enough for me to understand.

Kevin
Seventhnight
Master Smack Fu Yak Hacker
2878 Posts |
Posted - 2005-06-02 : 16:51:43
I think Andy's point was that a cursor solution is not a solution, as well as the fact that you admittedly did not read the links already posted. We have all expressed our thoughts (often quite crudely) on cursors, namely their performance for set-based operations. I don't think that posting a link means that the person doesn't understand it; it's simply not worth their time to rehash an example when one already exists.

Ultimately, it could be that the solution the person is looking for is not a rational solution, as in this case, where it is very hard to imagine what purpose this split would serve. That being said... here is an example.

Set NoCount ON

Create Table #n (n int)
Insert Into #n
Select n = number from admin.dbo.getSequence(1,100,1)

Create Table #MyTable (MyKey int not null identity, MyString Varchar(256))
Insert Into #MyTable (MyString) Values ('A,B,C,D,E,F,G,H,I')
Insert Into #MyTable (MyString) Values ('ONE,TWO,THREE,FOUR')
Insert Into #MyTable (MyString) Values ('Abra,Cadabra,Here,I,Am')

Select * From #myTable

Select myKey, B.n,
       val = substring(',' + A.MyString + ',', B.n + 1, charindex(',', ',' + A.MyString + ',', B.n + 1) - B.n - 1)
Into #prepWork
From #myTable A, #n B
Where len(',' + A.MyString) >= B.n
  and substring(',' + A.MyString, B.n, 1) = ','

--Select * From #prepWork

Select col01, col02, col03, col04, col05, col06, col07, col08, col09
From (Select myKey, col01 = val From #prepWork A
      Where (Select count(*) From #prepWork Where myKey = A.myKey and n <= A.n) = 1) col01
Left Join (Select myKey, col02 = val From #prepWork A
      Where (Select count(*) From #prepWork Where myKey = A.myKey and n <= A.n) = 2) col02
  On col01.myKey = col02.myKey
Left Join (Select myKey, col03 = val From #prepWork A
      Where (Select count(*) From #prepWork Where myKey = A.myKey and n <= A.n) = 3) col03
  On col01.myKey = col03.myKey
Left Join (Select myKey, col04 = val From #prepWork A
      Where (Select count(*) From #prepWork Where myKey = A.myKey and n <= A.n) = 4) col04
  On col01.myKey = col04.myKey
Left Join (Select myKey, col05 = val From #prepWork A
      Where (Select count(*) From #prepWork Where myKey = A.myKey and n <= A.n) = 5) col05
  On col01.myKey = col05.myKey
Left Join (Select myKey, col06 = val From #prepWork A
      Where (Select count(*) From #prepWork Where myKey = A.myKey and n <= A.n) = 6) col06
  On col01.myKey = col06.myKey
Left Join (Select myKey, col07 = val From #prepWork A
      Where (Select count(*) From #prepWork Where myKey = A.myKey and n <= A.n) = 7) col07
  On col01.myKey = col07.myKey
Left Join (Select myKey, col08 = val From #prepWork A
      Where (Select count(*) From #prepWork Where myKey = A.myKey and n <= A.n) = 8) col08
  On col01.myKey = col08.myKey
Left Join (Select myKey, col09 = val From #prepWork A
      Where (Select count(*) From #prepWork Where myKey = A.myKey and n <= A.n) = 9) col09
  On col01.myKey = col09.myKey

Drop Table #prepwork
Drop Table #n
Drop Table #myTable

Corey

Secret Service Agent: Mr. President, you're urinating on me.
President Lyndon Johnson: I know I am. It's my prerogative.
tkizer
Almighty SQL Goddess
38200 Posts |
Posted - 2005-06-02 : 16:54:16
quote: Originally posted by KLang23 I guarantee that in this situation it will perfrom just as well as the solution in the link.
I'd bet a lot of money that the cursor solution would not perform as well as a set-based solution in this situation. If I need to loop, which only happens for DBA-type routines, I use WHILE instead of a cursor. It gives you more control anyway.

Tara
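A minimal sketch of the cursor-free WHILE pattern Tara mentions, assuming a table with a unique ascending key (table and column names here are illustrative, not from the thread):

```sql
-- Walk a table row by row without a cursor: repeatedly fetch the
-- next-highest key value until no rows remain.
Declare @MyKey int, @MyString varchar(8000)
Set @MyKey = 0

While 1 = 1
Begin
    Select Top 1 @MyKey = MyKey, @MyString = MyString
    From MyTable
    Where MyKey > @MyKey
    Order By MyKey

    If @@RowCount = 0 Break

    -- ... process @MyString here ...
End
```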
rockmoose
SQL Natt Alfen
3279 Posts |
Posted - 2005-06-03 : 03:37:09
Guys,

I came up with the function because I had a similar requirement (split csv into columns instead of rows). Believe me, I tried to come up with a nice set-based method to do it - NO LUCK. The function is thousands of times faster than the set-based method presented here. (And easier, and more readable.)

PS. Soon I will have to create database "admin" with the "getSequence" function

SET STATISTICS TIME OFF
SET STATISTICS IO OFF
SET NOCOUNT ON

CREATE TABLE MyTable(MyKey INT IDENTITY PRIMARY KEY, MyString VARCHAR(8000))
GO

INSERT MyTable(MyString)
SELECT t FROM
(SELECT '0011192313,,,,,,Clno,245,248,48520,536299' AS t
 UNION ALL SELECT '0015989704,,,,,,Clno,246,142250,418348,659353'
 UNION ALL SELECT '0011188294,2,,,,,Mailname,Mrs J Cox,Mr & Mrs J P Cox') AS t
CROSS JOIN number WHERE n < 20

Select myKey, B.n,
       val = substring(',' + A.MyString + ',', B.n + 1, charindex(',', ',' + A.MyString + ',', B.n + 1) - B.n - 1)
Into #prepWork
From MyTable A, number B
Where len(',' + A.MyString) >= B.n
  and substring(',' + A.MyString, B.n, 1) = ','
  AND B.n < (SELECT MAX(LEN(MyString)) FROM MyTable)

SET STATISTICS TIME ON
SET STATISTICS IO ON

SELECT DISTINCT
       dbo.fnGetCsvPart(MyString, 0, default) AS Col01
      ,dbo.fnGetCsvPart(MyString, 1, default) AS Col02
      ,dbo.fnGetCsvPart(MyString, 2, default) AS Col03
      ,dbo.fnGetCsvPart(MyString, 3, default) AS Col04
      ,dbo.fnGetCsvPart(MyString, 4, default) AS Col05
      ,dbo.fnGetCsvPart(MyString, 5, default) AS Col06
      ,dbo.fnGetCsvPart(MyString, 6, default) AS Col07
      ,dbo.fnGetCsvPart(MyString, 7, default) AS Col08
      ,dbo.fnGetCsvPart(MyString, 8, default) AS Col09
      ,dbo.fnGetCsvPart(MyString, 9, default) AS Col10
      ,dbo.fnGetCsvPart(MyString, 10, default) AS Col11
FROM MyTable

Select DISTINCT col01, col02, col03, col04, col05, col06, col07, col08, col09, col10, col11
From (Select myKey, col01 = val From #prepWork A
      Where (Select count(*) From #prepWork Where myKey = A.myKey and n <= A.n) = 1) col01
Left Join (Select myKey, col02 = val From #prepWork A
      Where (Select count(*) From #prepWork Where myKey = A.myKey and n <= A.n) = 2) col02
  On col01.myKey = col02.myKey
Left Join (Select myKey, col03 = val From #prepWork A
      Where (Select count(*) From #prepWork Where myKey = A.myKey and n <= A.n) = 3) col03
  On col01.myKey = col03.myKey
Left Join (Select myKey, col04 = val From #prepWork A
      Where (Select count(*) From #prepWork Where myKey = A.myKey and n <= A.n) = 4) col04
  On col01.myKey = col04.myKey
Left Join (Select myKey, col05 = val From #prepWork A
      Where (Select count(*) From #prepWork Where myKey = A.myKey and n <= A.n) = 5) col05
  On col01.myKey = col05.myKey
Left Join (Select myKey, col06 = val From #prepWork A
      Where (Select count(*) From #prepWork Where myKey = A.myKey and n <= A.n) = 6) col06
  On col01.myKey = col06.myKey
Left Join (Select myKey, col07 = val From #prepWork A
      Where (Select count(*) From #prepWork Where myKey = A.myKey and n <= A.n) = 7) col07
  On col01.myKey = col07.myKey
Left Join (Select myKey, col08 = val From #prepWork A
      Where (Select count(*) From #prepWork Where myKey = A.myKey and n <= A.n) = 8) col08
  On col01.myKey = col08.myKey
Left Join (Select myKey, col09 = val From #prepWork A
      Where (Select count(*) From #prepWork Where myKey = A.myKey and n <= A.n) = 9) col09
  On col01.myKey = col09.myKey
Left Join (Select myKey, col10 = val From #prepWork A
      Where (Select count(*) From #prepWork Where myKey = A.myKey and n <= A.n) = 10) col10
  On col01.myKey = col10.myKey
Left Join (Select myKey, col11 = val From #prepWork A
      Where (Select count(*) From #prepWork Where myKey = A.myKey and n <= A.n) = 11) col11
  On col01.myKey = col11.myKey

SET STATISTICS TIME OFF
SET STATISTICS IO OFF

DROP TABLE #prepWork
GO
DROP TABLE MyTable
GO

rockmoose
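fnGetCsvPart is never defined in this thread (it presumably comes from the linked topic), so the following is only a guess at an implementation matching the call signature used above; treat every detail of it as an assumption:

```sql
-- Hypothetical reconstruction: return the zero-based @Part-th element
-- of a comma-delimited string, or @DefaultValue if it does not exist.
CREATE FUNCTION dbo.fnGetCsvPart
    (@Csv varchar(8000), @Part int, @DefaultValue varchar(8000) = '')
RETURNS varchar(8000)
AS
BEGIN
    DECLARE @Start int, @End int
    SET @Csv = ',' + @Csv + ','
    SET @Start = 1

    -- Advance past @Part commas to reach the requested element
    WHILE @Part > 0 AND CHARINDEX(',', @Csv, @Start + 1) > 0
    BEGIN
        SET @Start = CHARINDEX(',', @Csv, @Start + 1)
        SET @Part = @Part - 1
    END

    IF @Part > 0 RETURN @DefaultValue

    SET @End = CHARINDEX(',', @Csv, @Start + 1)
    IF @End = 0 RETURN @DefaultValue

    RETURN SUBSTRING(@Csv, @Start + 1, @End - @Start - 1)
END
```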
Seventhnight
Master Smack Fu Yak Hacker
2878 Posts |
Posted - 2005-06-03 : 07:22:10
Is the function really faster than an all-out set-based method? In my experience, a query that repeatedly uses a ton of functions is often very sluggish...

Corey

Secret Service Agent: Mr. President, you're urinating on me.
President Lyndon Johnson: I know I am. It's my prerogative.
rockmoose
SQL Natt Alfen
3279 Posts |
Posted - 2005-06-03 : 07:44:48
In this case, yes, it's faster; it's a very simple function, just manipulating the string passed to it. Performance suffers greatly if you call many (more complex) functions that each look up things in various tables, allocate temporary tables, and such. (Then it's better to work set-based with intermediate results or whatnot to get some speed.) I have tried to find a good set-based method for this requirement.

rockmoose
jsmith8858
Dr. Cross Join
7423 Posts |
Posted - 2005-06-03 : 09:28:37
We've been through this before. You cannot generalize about "set-based" solutions and assume they are always better if it requires MORE WORK (note: I did not say more code) to implement the set-based solution, or if the bottleneck has nothing to do with looping through the rows one by one.

For the classic CSV-to-multiple-rows function (1 CSV string -> multiple rows), it is MUCH faster to avoid joining to a numbers table just to say it is "set based" and to avoid a loop; it is more efficient to declare an int and loop through the string, building the table that way. That's what computers are designed to do and do better than anything: loop through values in memory! Why add the overhead of processing joins and retrieving a numbers table when you don't need it?

In the case mentioned in this thread, it would be much faster as well to use a UDF to parse this string (or traditional string functions) than to add the overhead of a numbers table and then the multiple left outer joins to derived tables! Think logically about the work required to implement each of these solutions.

And, again, for tasks such as looping through rows in a table to send emails or to generate text files or run batch files or things like that, by definition things must be done one at a time, and the bottleneck is NOT the SQL portion. In these cases you are not saving much work (if any -- you may be creating more!) by first moving things to a temp table (or table variable) and continually querying that table to get each row one by one, just to avoid the "stigma" of a cursor. (Hopefully, everyone reading this knows me and knows my overall feelings about cursors in general, so nothing I've written will be misunderstood.)

Remember, if you've ever used ADO or ADO.NET to retrieve values from SQL Server, you are using a cursor! If you've ever used Query Analyzer to display results from a SELECT, a cursor is used! That's right folks, we all use them all the time! Horrors!

- Jeff
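A sketch of the plain string-walking split function Jeff describes (the function name and shape are illustrative, not Jeff's actual code):

```sql
-- Split one CSV string into rows by walking it with CHARINDEX:
-- no cursor, no numbers table, just an int position and a loop.
CREATE FUNCTION dbo.fn_SplitCsv (@List varchar(8000))
RETURNS @Items TABLE (ItemNo int identity, Item varchar(8000))
AS
BEGIN
    DECLARE @Pos int
    SET @List = @List + ','

    WHILE LEN(@List) > 0
    BEGIN
        SET @Pos = CHARINDEX(',', @List)
        INSERT @Items (Item) VALUES (LEFT(@List, @Pos - 1))
        SET @List = SUBSTRING(@List, @Pos + 1, 8000)
    END

    RETURN
END
```

Usage would be along the lines of: Select Item From dbo.fn_SplitCsv('A,B,C')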
rockmoose
SQL Natt Alfen
3279 Posts |
Posted - 2005-06-03 : 13:08:46
Go to the pub, have a few cold ones, Jeff. In general, set-based processing in SQL Server is more efficient than looping/cursors and/or UDFs.

rockmoose
Seventhnight
Master Smack Fu Yak Hacker
2878 Posts |
Posted - 2005-06-03 : 13:16:53
I know what you mean, Jeff... I'm not mad.

Corey

Secret Service Agent: Mr. President, you're urinating on me.
President Lyndon Johnson: I know I am. It's my prerogative.
jsmith8858
Dr. Cross Join
7423 Posts |
Posted - 2005-06-03 : 13:56:59
quote: Originally posted by rockmoose Go to the pub, have a few cold ones Jeff In general set-based processing in sql-server is more efficient than looping/cursors and/or udf's.rockmoose
That's kind of like saying "in general, when you are doing something, wearing pants is much better than not wearing pants." Doing WHAT? The problem is the vagueness of the word "processing"; that's the key word in your statement, and it has a great influence on whether or not it is true.

- Jeff
KLang23
Posting Yak Master
115 Posts |
Posted - 2005-06-03 : 14:10:03
quote: We've been through this before.....The function is .......(And easier, and more readable)
It's good to see the problem viewed with perspective. There are many ways to solve a problem. It's good to know when to use a fly-swatter or an AK-47.
Seventhnight
Master Smack Fu Yak Hacker
2878 Posts |
Posted - 2005-06-03 : 14:10:43
If you are working on a conveyor system, or an escalator, or a car engine, or in Jon's office, I would definitely suggest wearing pants...

Corey

Secret Service Agent: Mr. President, you're urinating on me.
President Lyndon Johnson: I know I am. It's my prerogative.
jsmith8858
Dr. Cross Join
7423 Posts |
Posted - 2005-06-03 : 14:24:15
quote: Originally posted by Seventhnight if you are working on a conveyor system, or an escalator, or car enginge, or in Jon's office, I would definitely suggest wearing pants... 
Corey, you bastard, I fell off my chair at work reading that! What a way to start the weekend!

- Jeff
rockmoose
SQL Natt Alfen
3279 Posts |
Posted - 2005-06-03 : 15:45:42
quote: The problem is the vagueness of the word "processing"; that's the key word in your statement and it has a great influence on whether or not it is true.
Rant: The scope of the "processing" was, I thought, implied from the context: SQL Server. "In general" means the typical or common case. In SQL Server we typically process data that is stored in tables and columns, so I was referring to the typical processing that is done in SQL Server: inserting new data, retrieving data, modifying data, deleting data, doing calculations on the data, inferring new data from stored data. EOR

rockmoose
rockmoose
SQL Natt Alfen
3279 Posts |
Posted - 2005-06-03 : 16:08:58
quote: Originally posted by KLang23
quote: We've been through this before.....The function is .......(And easier, and more readable)
It's good to see the problem viewed with perspective. There are many ways to solve a problem. It's good to know when to use a fly-swatter or an AK-47.
Now you know??? Perspective and implementation are not the same thing. I was referring to a specific solution to a specific problem, as compared to other solutions. Jeff was talking about concepts.

rockmoose