AskSQLTeam
Ask SQLTeam Question
0 Posts
Posted - 2002-04-04 : 08:08:13
David writes "Here is what I have:
1. a point of sale table with no primary keys
2. columns = close date, cash, checks, visa, employee id, store_key, mc
3. none of these columns are unique values (for instance, there are multiple stores, so you could have many close dates, etc.)

I want to write a SQL query to display any duplicate records. This is what I had:

select * from table_name
where closedate IN (select closedate from table_name
                    GROUP BY closedate HAVING COUNT(*) > 1)
ORDER BY closedate

but this only gives the records with multiple instances of close dates. What should I do? Thanks.

Example data:

closedate  store_key  cash  checks  visa  employeeid
3-21-02    3          10    15      0     joe
3-22-01    1          15    25      5     sue
3-21-02    4          10    25      0     dave
3-20-02    4          15    15      3     dave"

Nazim
A custom title
1408 Posts
Posted - 2002-04-04 : 08:25:12
Is this a season of duplicate records? Found a couple of them since yesterday. Anyway, try along these lines:

select * from table_name t
inner join (select closedate, max(store_key) as MSTK
            from table_name
            GROUP BY closedate
            HAVING COUNT(*) > 1) k
on t.closedate = k.closedate and t.store_key = k.MSTK
--------------------------------------------------------------
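A runnable sketch of the join-back idea from this post, using Python's sqlite3 and the sample rows from the original question. Note one adjustment: with no primary key, joining only on closedate and store_key can flag rows that differ in other columns, so this sketch joins on every column. The table and column names are assumptions taken from the thread, not real DDL.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE table_name (
    closedate TEXT, store_key INTEGER, cash INTEGER,
    checks INTEGER, visa INTEGER, employeeid TEXT)""")
conn.executemany(
    "INSERT INTO table_name VALUES (?, ?, ?, ?, ?, ?)",
    [("3-21-02", 3, 10, 15, 0, "joe"),
     ("3-22-01", 1, 15, 25, 5, "sue"),
     ("3-21-02", 3, 10, 15, 0, "joe")])  # exact duplicate of the first row

# Join the table back to the groups that occur more than once, so every
# copy of each duplicated row comes back (not just one summary row per group).
rows = conn.execute("""
    SELECT t.*
    FROM table_name t
    INNER JOIN (SELECT closedate, store_key, cash, checks, visa, employeeid
                FROM table_name
                GROUP BY closedate, store_key, cash, checks, visa, employeeid
                HAVING COUNT(*) > 1) k
      ON t.closedate = k.closedate AND t.store_key = k.store_key
     AND t.cash = k.cash AND t.checks = k.checks
     AND t.visa = k.visa AND t.employeeid = k.employeeid
""").fetchall()
print(rows)  # both copies of the duplicated "joe" row
```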
 |

Jay99
468 Posts
Posted - 2002-04-04 : 09:19:15
If you have no primary key, the only way to guarantee uniqueness (or in this case, duplicity) is to look at all columns in the row . . .

select [close date], cash, checks, visa, [employee id], store_key, mc
from table_name
group by [close date], cash, checks, visa, [employee id], store_key, mc
having count(*) > 1

Jay<O>
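Jay's "group by every column" check can be demonstrated end to end with Python's sqlite3 and the sample rows from David's post, plus one deliberately duplicated row so the query has something to find. Table and column names follow the thread; they are assumptions, not DDL from a real system.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE table_name (
    closedate TEXT, store_key INTEGER, cash INTEGER,
    checks INTEGER, visa INTEGER, employeeid TEXT)""")
conn.executemany(
    "INSERT INTO table_name VALUES (?, ?, ?, ?, ?, ?)",
    [("3-21-02", 3, 10, 15, 0, "joe"),
     ("3-22-01", 1, 15, 25, 5, "sue"),
     ("3-21-02", 4, 10, 25, 0, "dave"),
     ("3-20-02", 4, 15, 15, 3, "dave"),
     ("3-21-02", 3, 10, 15, 0, "joe")])  # exact duplicate of the first row

# With no primary key, a row is a duplicate only if EVERY column matches,
# so group by all columns and keep the groups with more than one row.
dupes = conn.execute("""
    SELECT closedate, store_key, cash, checks, visa, employeeid, COUNT(*) AS n
    FROM table_name
    GROUP BY closedate, store_key, cash, checks, visa, employeeid
    HAVING COUNT(*) > 1
""").fetchall()
print(dupes)  # one group: the repeated "joe" row, counted twice
```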
 |

jessica_lee23
Starting Member
2 Posts
Posted - 2002-04-04 : 11:45:49
you can simplify this a little more :-)

SELECT *
FROM Invoice_Daily_Summary
GROUP BY CloseDate, Cash, Checks, Visa, EmployeeID, [Date]
HAVING (COUNT(CloseDate) > 1)

this should select ALL dupe records
 |

Jay99
468 Posts
Posted - 2002-04-04 : 12:00:43
First (original poster), I am just now noticing that your pseudo-DDL in the first post contains a different list of columns than the results of your select * . . .

Second (jessica_lee23), I don't see how that is 'simpler' than my solution. There is no [Date] field in either representation of the data from the original post (although I made a couple of spelling errors, so we'll call it even). Your use of select * doesn't simplify the statement. Sure, it may be fewer characters to type, but experience will point to select * being a poor construct for production code. Finally, using count([close date]) in the having clause will not affect query performance. It could be having count(47) and still work the same.

I welcome anyone improving my suggestions, as I am sure they are quite often non-optimal and sometimes flat-out wrong, but to come behind me, repost virtually the exact same code, and then claim that it is 'simpler' and will return 'ALL' the records (implying my suggestion will not) rubs me the wrong way. Please show me how I am wrong, and I will eat my words . . .

Jay<O>
 |

David_UF
Starting Member
2 Posts
Posted - 2002-04-04 : 15:54:15
Jay, I am David, the one so many people helped with my query. Thanks for the help; it worked great! Were you referring to me when you said that the first column list does not reflect what I have after my select *? If so, please explain.

David
 |

Jay99
468 Posts
Posted - 2002-04-04 : 16:08:13
quote: 2. columns = close date, cash, checks, visa, employee id, store_key, mc
quote: select * from table_name ... closedate store_key cash checks visa employeeid ... 3-21-02 3 10 15 0 joe ...

I am just saying the column list you supplied differed from the result set provided by your select statement . . . If you got it working, then don't worry about it.

Jay<O>
 |

jessica_lee23
Starting Member
2 Posts
Posted - 2002-04-04 : 16:10:22
so what, he forgot MasterCard... yours looked worse than that ;)
 |

Jay99
468 Posts
Posted - 2002-04-04 : 16:27:53
quote: ...yours looked worse than that;)
What the hell?

Usually it's just chalked up to typing errors, but in this case it makes all the difference. There is no PK; there are no unique constraints. Uniqueness, therefore, is only guaranteed by grouping by all columns in the table. In order to reveal the duplicates, you need to look for records that violate uniqueness. This is the point I was trying to get across in my initial post.

Are you trying to provoke me with your 'simplify' and 'looked worse than that'? We usually don't play like that here.

Jay<O>
 |

joldham
Wiseass Yak Posting Master
300 Posts
Posted - 2002-04-04 : 17:38:40
I hate to butt in on the fireworks, but I have a suggestion for jessica_lee23. Take it for what it is worth. I have noticed that you are a new member. I myself have only been a member for about a month, and I know that Jay always provides at least two things to this forum, if not more: 1) quick and accurate posts, and 2) some of the best suggestions anyone could make. He (as well as many others in this forum) consistently works to help people with running their databases here until the question is satisfied or they run out of knowledge (which I don't think I have seen yet). My suggestion: before taking it personally that Jay disagreed with your post, lurk for a while to see who knows what before you start flaming.

Jeremy
 |

setbasedisthetruepath
Used SQL Salesman
992 Posts
Posted - 2002-04-05 : 08:45:06
Jessica,

Jay's comment about 'select * being bad in production' is wise advice that you should take away from this. Your statement:

quote: SELECT *
FROM Invoice_Daily_Summary
GROUP BY CloseDate, Cash, Checks, Visa, EmployeeID, [Date]
HAVING (COUNT(CloseDate) > 1)

this should select ALL dupe records

breaks as soon as you add a column to Invoice_Daily_Summary. Jay's query, on the other hand, reliably and robustly shows duplicates among those columns no matter what you do to the rest of the table . . .

setBasedIsTheTruepath<O>
 |

David_UF
Starting Member
2 Posts
Posted - 2002-04-05 : 15:32:51
To Jay and Jessica: thank you very much for the help. You saved me a lot of time.
 |