AskSQLTeam
Ask SQLTeam Question
0 Posts
Posted - 2003-06-05 : 08:42:02
Faheem writes: "Hello SQLTeam. MS SQL Server 2000. I am trying to retrieve data from 3 tables, each table having more than 1,000,000 records. After filtering, my requirement is to get around 300,000 records into a Recordset variable with a minimum of 10 fields. When I try to execute my stored procedure it takes more than 5 minutes. Is there any way to retrieve the records quicker than this? I am executing this procedure from VB using ADO. My requirement: based on an ID, I need to export all the data into an Excel file. Question: how do I dump around 300,000 records with over 10 columns from a database table into an Excel file? With regards, Mohd Abdul Faheem."
robvolk
Most Valuable Yak
15732 Posts
Posted - 2003-06-05 : 08:47:43
Well, you're gonna have a hell of a time, since Excel cannot hold more than 65,536 rows on a single sheet. You'd have to split the data into multiple worksheets. For this and other reasons, Excel is not a good tool for such volumes of data; you should consider MS Access if you need to export that kind of volume. Lastly, 300,000 rows is going to take time to transmit over the network no matter how well optimized the query is. It's extremely unusual to pass such a large result set to a client computer. What exactly is going to be done with these 300,000 rows after they are exported? If someone is just going to summarize them into a report, you can do that on SQL Server and pass only the summary results, or you can export into Access and let it run the report.
X002548
Not Just a Number
15586 Posts
Posted - 2003-06-05 : 09:31:08
Dumping 300,000 records doesn't sound like a user-"interfaceable" type of thing... are you passing it on to another system for processing? Why not use bcp? Brett 8-)
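For reference, a bcp export of the filtered rows to a comma-delimited file might look roughly like this. This is only a sketch: the server, database, table, column names, and the ID filter are placeholders for the poster's actual schema, and `-T` assumes a trusted (Windows) connection.

```shell
# -c  = character (text) mode
# -t, = use a comma as the field terminator
# -S  = server name, -T = trusted connection
bcp "SELECT id, col1, col2 FROM YourDb.dbo.YourTable WHERE id = 12345" \
    queryout export.csv -S YourServer -T -c -t,
```

Because bcp streams straight from the server to a file, it skips the ADO Recordset overhead entirely, which is exactly the bottleneck described below.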
mohdowais
Sheikh of Yak Knowledge
1456 Posts
Posted - 2003-06-05 : 09:56:02
VB and ADO are no match for 300,000 records. Your stored proc is probably taking a lot less time to retrieve the records; the bottleneck is getting all of them across the network into the Recordset. Like Brett suggested, just dump the contents into a CSV file using BCP... and Excel can read CSV files without any trouble. I only wonder how long it takes to open such a file... Owais
MichaelP
Jedi Yak
2489 Posts
Posted - 2003-06-05 : 12:49:48
Excel will probably be very upset when it tries to open such a large CSV file :) 65k rows per CSV. Michael <Yoda>Use the Search page you must. Find the answer you will.</Yoda>
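Splitting a BCP-produced CSV into Excel-sized pieces is easy to script. A minimal sketch (the file names `big.csv` and `chunk_NN.csv` are made up for the demo, which generates its own sample data) that keeps each chunk under the 65k-row limit and repeats the header row in every chunk:

```shell
# Demo data: 1 header line plus 150,000 data rows.
{ echo "id,name"; seq 1 150000 | sed 's/$/,row/'; } > big.csv

# Split into chunks of at most 65,000 data rows each,
# copying the header line into every chunk so Excel shows it.
awk -v max=65000 '
  NR == 1 { header = $0; next }        # remember the header row
  (NR - 2) % max == 0 {                # time to start a new chunk
      if (file) close(file)
      file = sprintf("chunk_%02d.csv", ++n)
      print header > file
  }
  { print > file }
' big.csv

wc -l chunk_*.csv   # 150,000 rows -> 3 chunks of 65000, 65000, 20000 data rows
```

Each chunk then opens in Excel (or imports into Access) on its own, at the cost of the data no longer living in a single sheet.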
X002548
Not Just a Number
15586 Posts
Posted - 2003-06-06 : 09:34:54
Is it me, or do the AskSQLTeam posts rarely get replied to by the originator? Brett 8-)
dsdeming
479 Posts
Posted - 2003-06-06 : 10:30:24
Even if you could cram that much data into a single Excel sheet, you still have to deal with:
- Users will forget to set column formats to text. Without that, numeric strings have their leading zeros trimmed.
- There is no real security here. Users will inevitably modify the data in Excel, and then people will make decisions based on that altered data which never had any business rules applied to it.
- If you drive a spreadsheet out into the country and abandon it there, it will somehow magically find its way back home to the printer. At 80 rows/page, this would be 3750 pages. There goes another acre of rain forest.
OK, I admit it. I have an attitude about spreadsheets, particularly when people attempt to use them as databases. Dennis