
 The longest week ...


Kristen
Test

22859 Posts

Posted - 2005-11-17 : 01:35:41
http://sleeksoft.co.uk/public/techblog/articles/20051117_1.html

jen
Master Smack Fu Yak Hacker

4110 Posts

Posted - 2005-11-17 : 04:10:33
Though not the longest (3 days), I once had to reconstruct thousands of records because the developer used an "On Error Resume Next" statement.

All the header records were inserted but the details were gone, or vice versa. That was the time I stumbled across Lumigent and SQLTeam :)
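For reference, a sketch of the header/detail pattern that "On Error Resume Next" was papering over - one transaction with an @@ERROR check after each statement (the SQL 2000-era idiom; table and variable names here are invented, not from jen's system):

```sql
-- Hypothetical tables and variables; the point is the transaction plus
-- the @@ERROR check, so a failed detail insert rolls back the header too.
BEGIN TRAN

INSERT INTO OrderHeader (OrderID, CustomerID)
VALUES (@OrderID, @CustomerID)
IF @@ERROR <> 0
BEGIN
    ROLLBACK TRAN
    RETURN
END

INSERT INTO OrderDetail (OrderID, ProductID, Qty)
VALUES (@OrderID, @ProductID, @Qty)
IF @@ERROR <> 0
BEGIN
    ROLLBACK TRAN
    RETURN
END

COMMIT TRAN
```

With that in place you either get both rows or neither - no orphaned headers to reconstruct afterwards.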





--------------------
keeping it simple...

spirit1
Cybernetic Yak Master

11752 Posts

Posted - 2005-11-17 : 05:23:38
If there's one thing I hate in VB, it's On Error Resume Next... that is just not cool.
ARGHHHH!!!!!!

Go with the flow & have fun! Else fight the flow

jen
Master Smack Fu Yak Hacker

4110 Posts

Posted - 2005-11-18 : 03:28:38
And they actually believed there were some "freaky" processes in SQL Server that were mysteriously deleting records.





--------------------
keeping it simple...

mcrowley
Aged Yak Warrior

771 Posts

Posted - 2005-11-18 : 09:53:35
Drop table customer
go

...wait....am I on the QA server....? 8-[

tkizer
Almighty SQL Goddess

38200 Posts

Posted - 2005-11-18 : 13:51:23
quote:
Originally posted by mcrowley

Drop table customer
go

...wait....am I on the QA server....? 8-[



I actually did that in production a while back, except it was even worse...

DROP DATABASE ...
GO

I didn't mean to run it in the production environment; I meant to run it on the DR server so that I could re-set up log shipping. D'oh! The recovery was fairly fast, as I realized it as soon as it started executing. Why couldn't there have been at least one user connected to the database to prevent the drop!? I had so hoped there would be, so that I would get the error saying there were still connected users.
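A pre-flight check along these lines would have caught it. This is just a sketch - the database name is made up - but on SQL 2000 the connection list lives in master.dbo.sysprocesses, and a connected user is exactly what blocks DROP DATABASE:

```sql
-- Confirm which box you are actually on, then check for connections
-- before dropping anything ('MyAppDB' is a hypothetical name):
SELECT @@SERVERNAME   -- is this really the DR server?

IF EXISTS (SELECT * FROM master.dbo.sysprocesses
           WHERE dbid = DB_ID('MyAppDB') AND spid <> @@SPID)
    RAISERROR ('Users are still connected to MyAppDB', 16, 1)
ELSE
    PRINT 'No connections'
    -- DROP DATABASE MyAppDB   -- deliberately left commented out
```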

Tara Kizer
aka tduggan

tkizer
Almighty SQL Goddess

38200 Posts

Posted - 2005-11-18 : 13:54:47
Oh, and his story seems fake. He would have received errors during the execution of the script if his spid had been selected as the deadlock victim. If it hadn't been selected, he wouldn't have lost any data. If it had been selected, his script would have aborted (although only the last statement would have been rolled back, due to the lack of transactions in it), since a deadlock error is a severe error that aborts the batch.
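On SQL 2005 you can at least trap the deadlock and retry; on SQL 2000 the batch simply aborts as described above. A minimal sketch (the real work is elided):

```sql
-- Deadlock-retry sketch using SQL 2005 TRY/CATCH. Error 1205 means
-- this spid was chosen as the deadlock victim and its transaction was
-- rolled back, so the work can safely be reattempted.
DECLARE @Retry int
SET @Retry = 3

WHILE @Retry > 0
BEGIN
    BEGIN TRY
        BEGIN TRAN
        -- ... the real UPDATE/DELETE statements go here ...
        COMMIT TRAN
        BREAK   -- success
    END TRY
    BEGIN CATCH
        IF @@TRANCOUNT > 0 ROLLBACK TRAN
        IF ERROR_NUMBER() = 1205
            SET @Retry = @Retry - 1   -- deadlock victim: try again
        ELSE
            BREAK                     -- any other error: give up
    END CATCH
END
```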

Tara Kizer
aka tduggan

X002548
Not Just a Number

15586 Posts

Posted - 2005-11-18 : 14:50:30
BEGIN TRAN
DROP DATABASE
--COMMIT TRAN
GO



Brett

8-)

Hint: Want your questions answered fast? Follow the direction in this link
http://weblogs.sqlteam.com/brettk/archive/2005/05/25/5276.aspx

Add yourself!
http://www.frappr.com/sqlteam

Michael Valentine Jones
Yak DBA Kernel (pronounced Colonel)

7020 Posts

Posted - 2005-11-18 : 14:51:40
Maybe he was running a series of scripts using OSQL, or scripts with GO in them, so that it just went on to the next batch after a failure.



quote:
Originally posted by tkizer

Oh, and his story seems fake. He would have received errors during the execution of the script if his spid had been selected as the deadlock victim. If it hadn't been selected, he wouldn't have lost any data. If it had been selected, his script would have aborted (although only the last statement would have been rolled back, due to the lack of transactions in it), since a deadlock error is a severe error that aborts the batch.

Tara Kizer
aka tduggan



CODO ERGO SUM

tkizer
Almighty SQL Goddess

38200 Posts

Posted - 2005-11-18 : 14:57:12
Perhaps he was, but he still would have seen the errors if it was run from Query Analyzer. If it was run from osql, he would have seen them in the output file. If he didn't configure an output file, then there would be no way to know whether it was successful.
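The osql shape being discussed looks roughly like this - server and file names here are invented - and without -o the messages only appear in the console window:

```
osql -E -S MYSERVER -i script.sql -o script.out
```

-E is a trusted connection, -i is the input script, and -o names the file that captures the error and rowcount messages - the only evidence you get when nobody is watching the run.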

Tara Kizer
aka tduggan

Kristen
Test

22859 Posts

Posted - 2005-11-18 : 20:04:23
"BEGIN TRAN
DROP DATABASE [add DB name]
--COMMIT TRAN
GO
"

That ain't gonna work (DROP DATABASE isn't allowed inside an explicit transaction) - although BoL seems to be silent on the matter. It's great when you go read BoL because of a topic here; for example, I learnt that you can do

DROP DATABASE A,B,C

So

DECLARE @DatabaseList varchar(8000)
SELECT @DatabaseList = COALESCE(@DatabaseList+',', '') + name
FROM master.dbo.sysdatabases
EXEC ('DROP DATABASE ' + @DatabaseList)

has got to be so much better than using a cursor, no? !!

NOTE: This code has not been tested, please report any bugs
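A slightly safer variant of the same COALESCE trick - still untested, per the note above - skips the system databases and PRINTs the command rather than running it, until somebody deliberately uncomments the EXEC:

```sql
DECLARE @DatabaseList varchar(8000)
SELECT @DatabaseList = COALESCE(@DatabaseList + ',', '') + name
FROM master.dbo.sysdatabases
WHERE name NOT IN ('master', 'model', 'msdb', 'tempdb')
PRINT 'DROP DATABASE ' + @DatabaseList
-- EXEC ('DROP DATABASE ' + @DatabaseList)   -- only when you are sure
```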

Kristen

Michael Valentine Jones
Yak DBA Kernel (pronounced Colonel)

7020 Posts

Posted - 2005-11-18 : 20:20:47
They don't support this syntax?

DROP DATABASE
where
name in ( select name from master.dbo.sysdatabases )


CODO ERGO SUM

Kristen
Test

22859 Posts

Posted - 2005-11-18 : 20:29:15
Shocking isn't it. MS have taken all that time to build SQL2005 and still not included the obvious features, eh?!

Kristen

robvolk
Most Valuable Yak

15732 Posts

Posted - 2005-11-18 : 20:45:19
I still like the title of this one best:

http://weblogs.sqlteam.com/robv/


RoadWarrior
Starting Member

3 Posts

Posted - 2005-11-18 : 21:42:38
Tara,

>> Oh and his story seems fake. <<

From memory (this was 1998, a long time ago), I used Query Analyzer. The problem was that there were more than 60,000 updates in the complete script, and QA cut off the results list after just a few thousand. I haven't tried this with QA 2000, but I wouldn't be surprised if it had the same problem.

You do realise, I hope, that you can have a thought and not say it? This doesn't make you insular, it merely separates you from that mad woman who's always shouting at the pigeons outside the supermarket.

Kristen
Test

22859 Posts

Posted - 2005-11-20 : 02:20:13
Hi RoadWarrior, Welcome to SQL Team!

RoadWarrior
Starting Member

3 Posts

Posted - 2005-11-20 : 05:28:50
Kristen, thanks for the welcome. And thanks for the Googlejuice - my site traffic doubled on the day you linked to my blog entry!

Looking around SQL Team, there seems to be some clever and knowledgeable people here - a good place to learn new stuff and give back some of what you learned.

tkizer
Almighty SQL Goddess

38200 Posts

Posted - 2005-11-21 : 12:50:26
Glad to have you here, RoadWarrior! The story just seemed fake to me, or I should say it didn't sound plausible. I've never seen Query Analyzer cut off results before, but then again I've never had more than a thousand rows in it.

I get myself in trouble a lot by having thoughts and also saying them.

Tara Kizer
aka tduggan

Kristen
Test

22859 Posts

Posted - 2005-11-21 : 14:05:15
"I get myself in trouble a lot"

Don't belittle your best feature!!

"I've never had more than a thousand rows in it"

I in-lined a 30MB XML file in an ad-hoc sp_xml_preparedocument statement in Q.A. the other day. I figured that squirting the 30MB up from my client, along the 2Mb line, was going to be slow, so I Terminal Server'd into the server and ran it in Q.A. there.

Maybe I won't do that again ...

Prepare the XML - 28 seconds

INSERT INTO #MyTempTable
SELECT *
FROM OPENXML (@hDoc, 'root/MyTableTag', 3)
WITH #MyTempTable

for each of 9 tables ... wait for it .... 34,585 seconds

Call Sprocs using cursors to insert all the rows from the staging tables - 13,541 "root" level records = 5,300 seconds
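For anyone following along, the full OPENXML shape being timed above is roughly this (the tag and table names are as in the post; the rest is invented) - and the sp_xml_removedocument call matters, because a prepared document holds server memory until it is freed:

```sql
DECLARE @hDoc int
EXEC sp_xml_preparedocument @hDoc OUTPUT, @MyXmlText  -- the 30MB document

INSERT INTO #MyTempTable
SELECT *
FROM OPENXML (@hDoc, 'root/MyTableTag', 3)
WITH #MyTempTable

EXEC sp_xml_removedocument @hDoc  -- release the parsed document's memory
```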



Kristen

tkizer
Almighty SQL Goddess

38200 Posts

Posted - 2005-11-21 : 14:15:40
I've processed files that have had thousands and even millions of rows in them. When I said a thousand rows, I meant in the results pane. My results pane typically has very little data in it.

Tara Kizer
aka tduggan
   
