Please start any new threads on our new
site at https://forums.sqlteam.com. We've got lots of great SQL Server
experts to answer whatever question you can come up with.
label (Posting Yak Master, 197 Posts)
Posted - 2003-06-02 : 11:43:28
quote: Server: Msg 1101, Level 17, State 10, Procedure ap_get_sales_data, Line 10
Could not allocate new page for database 'TEMPDB'. There are no more pages available in filegroup DEFAULT. Space can be created by dropping objects, adding additional files, or allowing file growth.
The stored proc that's being run is:

quote:
CREATE PROCEDURE dbo.ap_get_sales_data
    @datefrom datetime,
    @dateto datetime
AS
Begin
    insert into temp_sales (partnumber, [total sales], [Customer Name], first_download, last_download, parentID)
    select distinct(partnumber),
           convert(money, sum(totalsalenet)) as [Total Sales],
           c.customer_name,
           min(et.requestdatetime) as [First hit],
           max(et.requestdatetime) as [Last Hit],
           se.etech_customer_id
    from sales_history s
    join customer_data c on s.accountnumber = c.accountnumber
    join smc_etech_key se on c.accountnumber = se.accountnumber
    left outer join smc_new_products.dbo.etechmodelrequests et
        on convert(varchar(50), se.etech_customer_id) = et.username
        and s.partnumber = et.configname
        and requestdatetime between @datefrom and @dateto
        and interfacename like '%down%'
    where convert(datetime, (substring(convert(varchar(8), purchase_date), 5, 2) + '/' + right(purchase_date, 2) + '/' + left(purchase_date, 4)), 101) between @datefrom and @dateto
    group by partnumber, customer_name, etech_customer_id
    order by c.customer_name

    select distinct(parentId),
           partnumber,
           max([customer Name]) as Customer,
           sum([Total Sales]) as [Total Sales],
           min([first_download]) as [First Download],
           max([last_download]) as [Last_Download]
    from temp_sales
    where [first_download] is not null
    group by parentid, partnumber
    order by parentID
End
GO
I have unrestricted file growth turned on for both the data and transaction log, and I've got plenty of disk space on the drive. I really don't know what could be causing the error.
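When tracking down error 1101, it can help to see how large tempdb's files actually are before the query fails. A minimal sketch using catalog queries available on SQL Server 7.0/2000 (file names will vary on your server):

```sql
-- Run in the tempdb context to see allocated vs. used space.
USE tempdb
GO
-- Overall database size and unallocated space
EXEC sp_spaceused
GO
-- Per-file sizes (size is stored in 8 KB pages)
SELECT name, filename, size * 8 / 1024 AS size_mb
FROM sysfiles
GO
```

Running this just before and during the stored proc shows whether tempdb growth (rather than the user database) is what exhausts the disk.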
Shastryv (Posting Yak Master, 145 Posts)
Posted - 2003-06-02 : 12:15:41
I got exactly the same error message in 6.5. I can't believe it's still the same in 7.0 or 2000. Check whether there are any objects in tempdb that aren't getting dropped after use. You may want to launch a trace, capture the statements, and see if there are any.
X002548 (Not Just a Number, 15586 Posts)
Posted - 2003-06-02 : 12:24:33
OK, I don't know about your error, but why are you inserting into a temp table? Why not collapse your SQL statements into one SELECT?

Brett
8-)

EDIT: And it P_sses me off that BOL doesn't give a possible-resolutions column...it just lists the error that you already see...

Edited by - x002548 on 06/02/2003 12:27:37
label (Posting Yak Master, 197 Posts)
Posted - 2003-06-02 : 12:31:08
quote: OK, I don't know about your error, but why are you inserting into a temp table? Why not collapse your SQL statements into one SELECT?

Hey Brett. Mostly because I'm not sure how to do just a select and get the type of rolled-up grouping I'm getting now in one statement.

More on this error: it turns out that the temp file is getting too big (more than 6 gigs) and that's why it's erroring out. Doesn't really help me though, as I need to run the query but apparently cannot unless I've got more drive space available or something. Guess I'm kind of stuck......
label (Posting Yak Master, 197 Posts)
Posted - 2003-06-02 : 12:35:54
While we're on the subject of disk space, how do I get rid of some of my really large transaction logs? I do a complete backup and then try to shrink them, and I've got a couple that won't let me shrink them below 2 gigs. Surely I don't need a 2 gig trans log? Isn't there any way to get rid of those large files?
X002548 (Not Just a Number, 15586 Posts)
Posted - 2003-06-02 : 12:42:44
quote: ...temp file is getting too big (more than 6 gigs)
Not for nothing...but that's huge. How big is your database?

I'll work on trying to collapse your query.

Brett
8-)
label (Posting Yak Master, 197 Posts)
Posted - 2003-06-02 : 12:46:43
quote: Not for nothing...but that's huge....how big is your database?
The database itself is only 248 megs. The trans log is 2.2 gigs. I have a similar problem with my "Etech_portal" database, where my trans log is 650 megs but the actual DB is only 90 megs.

quote: I'll work on trying to collapse your query

Thanks...much appreciated.
tkizer (Almighty SQL Goddess, 38200 Posts)
Posted - 2003-06-02 : 13:07:23
What is your database recovery model set to? If it's FULL, then you need to back up your transaction logs periodically. If you don't care about point-in-time recovery, change it to SIMPLE. If it's already set to SIMPLE, then try running DBCC SHRINKDATABASE and DBCC SHRINKFILE. See BOL for syntax on both. If that doesn't work, then you can detach the log and create a new one, but that will take the database down temporarily.

Tara
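A minimal sketch of the shrink steps Tara describes, using hypothetical database and logical file names (SMC_Sales / SMC_Sales_Log; find your real logical log name in sysfiles first). On SQL Server 2000, TRUNCATE_ONLY empties the log without writing a log backup, but it breaks the log backup chain, so take a full database backup afterwards:

```sql
-- Find the logical name and current size of each file
USE SMC_Sales
GO
SELECT name, size * 8 / 1024 AS size_mb FROM sysfiles
GO
-- Empty the log (SQL Server 2000 syntax; breaks the log backup chain)
BACKUP LOG SMC_Sales WITH TRUNCATE_ONLY
GO
-- Shrink the log file down toward a 100 MB target
DBCC SHRINKFILE (SMC_Sales_Log, 100)
GO
```

The shrink may not reach the target if active log records sit at the end of the file; running a few transactions and repeating the shrink sometimes frees that tail.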
X002548 (Not Just a Number, 15586 Posts)
Posted - 2003-06-02 : 13:11:42
Tara, what does that have to do with tempdb? It's 6 gig!

How many other databases are on the box?

Also, in your SELECT in the second query from the temp table, where do you get ParentId from?

Brett
8-)
tkizer (Almighty SQL Goddess, 38200 Posts)
Posted - 2003-06-02 : 13:14:36
quote: Tara, what does that have to do with tempdb? It's 6 gig!

Because label asked "While we're on the subject of disk space, how do I get rid of some of my really large transaction logs? I do a complete backup and then try to shrink them, and I've got a couple that won't let me shrink them below 2 gigs. Surely I don't need a 2 gig trans log? Isn't there any way to get rid of those large files?"

Tara
label (Posting Yak Master, 197 Posts)
Posted - 2003-06-02 : 13:17:29
quote: What is your database recovery model set to? If it's FULL, then you need to back up your transaction logs periodically. If you don't care about point-in-time recovery, change it to SIMPLE. If it's already set to SIMPLE, then try running DBCC SHRINKDATABASE and DBCC SHRINKFILE.

Ok...I'll give that a shot and try to shrink my oversized transaction logs. Thanks.
X002548 (Not Just a Number, 15586 Posts)
Posted - 2003-06-02 : 13:24:49
quote: Because label asked "While we're on the subject of disk space, how do I get rid of some of my really large transaction logs?"...

Sorry Tara, missed it...was working on the other probs....

Brett
8-)
label (Posting Yak Master, 197 Posts)
Posted - 2003-06-02 : 13:24:56
quote: How many other databases are on the box?
Around 7 or 8 other DBs are on the box. Most are much smaller, though the one I'm currently working with (SMC_Sales) is well over 2 million records (550 megs).

quote: Also, in your SELECT in the second query from the temp table, where do you get ParentId from?

I'm using the se.etech_customer_id as my parentID because a user can be associated with multiple branches of a single company. For instance, a user who is part of the 3M company would be associated with every instance of a 3M branch office based on the company name + '%'. Since I don't have a table that defines relationships between branch offices and parent companies, I'm using the user as the parent ID to aggregate all the data associated with each branch of a company (accountnumber) into one result per distinct instance of a company.

Edited by - label on 06/02/2003 13:25:27
X002548 (Not Just a Number, 15586 Posts)
Posted - 2003-06-02 : 13:48:27
Ok, this is certainly WITHOUT optimization, but:

select distinct(parentId),
       partnumber,
       max([customer Name]) as Customer,
       sum([Total Sales]) as [Total Sales],
       min([first_download]) as [First Download],
       max([last_download]) as [Last_Download]
FROM (select distinct(partnumber),
             convert(money, sum(totalsalenet)) as [Total Sales],
             c.customer_name,
             min(et.requestdatetime) as [First_Download],
             max(et.requestdatetime) as [Last_Download],
             se.etech_customer_id As ParentID
      from sales_history s
      join customer_data c on s.accountnumber=c.accountnumber
      join smc_etech_key se on c.accountnumber=se.accountnumber
      left outer join smc_new_products.dbo.etechmodelrequests et
          on convert(varchar(50), se.etech_customer_id)=et.username
          and s.partnumber=et.configname
          and requestdatetime between @datefrom and @dateto
          and interfacename like '%down%'
      where convert(datetime, (substring(convert(varchar(8), purchase_date), 5, 2) + '/' + right(purchase_date, 2) + '/' + left(purchase_date, 4)), 101) between @datefrom and @dateto
      group by partnumber, customer_name, etech_customer_id
     ) As xxx
where [first_download] is not null
group by parentid, partnumber
order by parentID

Couple of comments, though. First, if you can, lose the spaces in column names. Second (and I don't pretend to understand your statement), I think you can combine them. Maybe not. But if you can (because you'll eliminate a second grouping operation), look at using a HAVING clause. Third, you don't need a DISTINCT clause on a column you're grouping on...I was surprised it even worked...

USE Northwind
GO
SELECT Distinct(CustomerId), Max(OrderDate), OrderId
FROM Orders
GROUP BY CustomerId, OrderId
GO

Anyway, good luck...

Brett
8-)
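Brett's HAVING suggestion, sketched against the Northwind sample database he uses above: a HAVING clause filters groups after aggregation, which is what lets a single grouped query replace the stage-into-temp-table-then-filter pattern.

```sql
USE Northwind
GO
-- Filter the groups themselves instead of staging rows in a temp table
SELECT CustomerId,
       COUNT(*)      AS OrderCount,
       MAX(OrderDate) AS LastOrder
FROM Orders
GROUP BY CustomerId
HAVING COUNT(*) > 5   -- keeps only customers with more than five orders
GO
```

The WHERE clause still filters individual rows before grouping; HAVING runs on the aggregated result, so conditions on SUM, COUNT, MIN, or MAX belong there.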
label (Posting Yak Master, 197 Posts)
Posted - 2003-06-02 : 17:44:03
quote: Ok, this is certainly WITHOUT optimization, but: [query snipped]
Alright...I did a lot of fine-tuning on the tables. I ended up copying all of the ones I was using, removed superfluous fields, established good indexes and some relationships between them, and made sure all the data types and naming conventions matched, and it ran a lot quicker. Here's the final query:

quote:
Select parentId,
       partnumber,
       max([customer_Name]) as Customer,
       sum(Total_Sales) as Total_Sales,
       min([first_download]) as [First Download],
       max([last_download]) as [Last_Download]
FROM (select partnumber,
             convert(money, sum(totalsalenet)) as Total_Sales,
             c.customer_name as customer_name,
             min(et.requestdatetime) as First_Download,
             max(et.requestdatetime) as Last_Download,
             se.etech_customer_id As ParentID
      from sales_history s
      join customer_data c on s.accountnumber=c.accountnumber
      join smc_etech_key se on c.accountnumber=se.accountnumber
      left outer join etechmodelrequests et
          on se.etech_customer_id=et.username
          and s.partnumber=et.configname
          and requestdatetime between @datefrom and @dateto
      where purchase_date between 20020401 and 20030401
      group by partnumber, customer_name, etech_customer_id
     ) As MyResults
group by parentid, partnumber
order by parentID

Thanks for your help, guys!

(PS. Still haven't found a way to shrink my stupid transaction log files though....)