Please start any new threads on our new site at https://forums.sqlteam.com. We've got lots of great SQL Server experts to answer whatever question you can come up with.

 Inserting 40 million rows from AS400 table to SQL


imsganesh
Starting Member

28 Posts

Posted - 2006-01-26 : 17:50:37
Hi,
I need to copy around 40 million rows from an AS400 table (a legacy system) to SQL Server 2000. The AS400 table actually has 120 million rows, and I'm using some conditions to pick two years of data. I'm using the Transform Data task to copy from the AS400 table to SQL Server. This task is very slow and takes more than two days to complete.

Is there any way to copy big tables from a different source? Does Bulk Insert work only with files?

Thanks!
SG

Kristen
Test

22859 Posts

Posted - 2006-01-26 : 18:45:17
I reckon your best way would be to export the data from the AS400 as a text file (suitably delimited etc.) and use BCP to import it into SQL Server.

Make sure the text file is pre-sorted into primary key order, and use a hint to alert BCP to the fact that the file is ordered.

Pre-extend the SQL database to a size sufficient to contain all the data, to prevent it from auto-extending in small(er) increments.

Drop all indexes [except clustered index matching the import file's sort order] before importing, and recreate them afterwards.

Assuming half-decent hardware, I reckon importing a million rows should only take a few minutes, so multiplying "a few minutes" by 40 gives around about "a few hours" as a yardstick.
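As a rough sketch of the steps above (the server, database, table, file path, and key column names here are all made up for illustration), the bcp invocation with an ORDER hint and a table lock might be assembled like this:

```python
# Sketch: assemble a bcp command line for a bulk import, per the advice above.
# Table name, data file, server, and sort column are hypothetical placeholders.
def build_bcp_command(table, datafile, server, order_col, batch_size=100000):
    return [
        "bcp", table, "in", datafile,
        "-S", server,
        "-T",                      # trusted (Windows) authentication
        "-c",                      # character (text) format
        "-b", str(batch_size),     # commit every batch_size rows
        "-h", f"ORDER({order_col} ASC), TABLOCK",  # file pre-sorted; take a table lock
    ]

cmd = build_bcp_command("MyDB.dbo.Orders", r"C:\export\orders.txt",
                        "MYSERVER", "order_id")
print(" ".join(cmd))
```

The ORDER hint only helps if the file really is sorted to match the clustered index, and TABLOCK avoids per-row lock overhead during the load.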

Kristen

imsganesh
Starting Member

28 Posts

Posted - 2006-01-26 : 21:44:01
Thanks Kristen!
The problem is in exporting the data from the AS400 to a text file... I only have read-only permission on the AS400 tables, as it is the production system. Asking prod support to export the data to a text file is not feasible, at least for the time being! So is there any other way to do this? Also, when I run this DTS job it creates a lot of locks on the table (more than 22,000 locks, when I checked in Current Activity -> Process and Locks). I don't know why. Is that normal for a big source table?

Btw, I used a linked server to connect to the AS400, then used a select query to access the AS400 tables and inserted a few days' worth of data into SQL Server, and it looks like this is faster. I'm trying to insert three years' worth! I'll post the result if this is faster!
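One way to keep the linked-server route fast is to push the date filter to the AS400 side with a pass-through query (OPENQUERY) rather than four-part naming, so the remote system does the WHERE instead of shipping every row over. A sketch that builds such a statement — the linked server, table, and column names are invented:

```python
# Sketch: build an INSERT ... SELECT over a linked server using OPENQUERY,
# so the date filter runs on the AS400 rather than locally.
# Linked-server, table, and column names are hypothetical.
def linked_server_insert(linked_server, remote_table, local_table,
                         date_col, start_date, end_date):
    # OPENQUERY takes a literal pass-through query; single quotes inside
    # the T-SQL string literal must be doubled.
    remote_sql = (f"SELECT * FROM {remote_table} "
                  f"WHERE {date_col} BETWEEN ''{start_date}'' AND ''{end_date}''")
    return (f"INSERT INTO {local_table}\n"
            f"SELECT * FROM OPENQUERY({linked_server}, '{remote_sql}')")

stmt = linked_server_insert("AS400", "SCHEMA1.ORDERS", "dbo.Orders_Local",
                            "ORDER_DATE", "2004-01-01", "2004-06-30")
print(stmt)
```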


Thanks!
SG

byrmol
Shed Building SQL Farmer

1591 Posts

Posted - 2006-01-26 : 22:10:52
Ensure that in the DTS task "Fast Load" is set and an appropriate Batch Size is selected.

DavidM

Intelligent Design is NOT science.

A front-end is something that tries to violate a back-end.

imsganesh
Starting Member

28 Posts

Posted - 2006-01-27 : 12:46:01
Yes! Fast Load is enabled and the Batch Size defaulted to 0. Do I need to change the Batch Size?

Thanks!
SG

imsganesh
Starting Member

28 Posts

Posted - 2006-01-27 : 12:56:06
Anyway, I modified the package to use a simple Execute SQL Task, using a linked server to connect to the source. Now I have a simple SQL insert which selects the data from the source through the linked server into a local table. I ran it, and the job failed after 14 hours with this error:

"DTSStep_DTSExecuteSQLTask_1 DTSRun OnError: DTSStep_DTSExecuteSQLTask_1, Error = -2147217900 (80040E14) Error string: OLE DB provider 'MSDASQL' reported an error. Error source: Microsoft OLE DB Provider for SQL Server Help file: Help context: 0 Error Detail Records: Error: -2147217900 (80040E14); Provider Error: 7399 (1CE7) Error string: OLE DB provider 'MSDASQL' reported an error. Error source: Microsoft OLE DB Provider for SQL Server Help file: Help context: 0 Error: -2147217900 (80040E14); Provider Error: 7300 (1C84) Error string: OLE DB error trace [OLE/DB Provider 'MSDASQL' ICommandPrepare::Prepare returned 0x80004005: ]. Error source: Microsoft OLE DB Provider for SQL Server Help file: Help context: 0 Error: -2147217900 (80040E14); Provider Error: 7312 (1C90) Error string: [OLE/DB provider retu... Process Exit Code 1. The step failed."

Any suggestions please!

Btw, the insert query is simple. I didn't give any column names in the insert -
"insert into <table> select CAST(col1 AS VARCHAR(10)) + '' + CAST(col1 AS VARCHAR(10)) as order_id, ordernum, ...."

Thanks!
SG

Kristen
Test

22859 Posts

Posted - 2006-01-28 : 02:03:46
Well .... I expect you could solve that. One way might be to write your own loop that pulls just a day's worth of data at a go, and inserts that into your local table.

But you'll spend less time, overall, if you get the security permissions on the AS400 changed, or perhaps just SELECT from it onto your local disk and then BCP the file in. Or get the AS400 folk to send you a CD with a text file on it?
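The "a day's worth at a go" loop can be sketched as a driver that walks the date range in fixed-size chunks, issuing one transfer per chunk. A minimal sketch (chunk size and dates are just examples):

```python
from datetime import date, timedelta

# Sketch: split a large date range into fixed-size chunks so each
# transfer stays small. days=1 matches the one-day-at-a-time suggestion.
def date_chunks(start, end, days=1):
    """Yield (chunk_start, chunk_end) pairs covering [start, end] inclusive."""
    cur = start
    while cur <= end:
        nxt = min(cur + timedelta(days=days - 1), end)
        yield cur, nxt
        cur = nxt + timedelta(days=1)

# A whole (non-leap) year in daily chunks -> 365 separate transfers,
# each of which could drive one small INSERT ... SELECT.
chunks = list(date_chunks(date(2005, 1, 1), date(2005, 12, 31)))
```

Each (start, end) pair would parameterise the WHERE clause of the transfer query, so a failure part-way through only loses one chunk, not the whole run.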

Kristen

imsganesh
Starting Member

28 Posts

Posted - 2006-01-30 : 16:26:08
Yes! That's what I'm doing right now... I'm pulling 6 months of data and the same package runs properly, but it gives the above error when I pull the entire year! Do you know why? I can't understand why it fails when running for the whole year!

Thanks!
SG

Kristen
Test

22859 Posts

Posted - 2006-01-31 : 01:22:21
Well ... if you can do both lots of 6 months individually, but not the year as a whole, then it has to be that the volume of data is too great for something, in which case just have a loop that does the transfer in 6-monthly chunks.

If the second half of the year fails on its own then it's probably a data problem somewhere - a NULL column, a column too long, a data conversion failing, etc.

Kristen

mr_mist
Grunnio

1870 Posts

Posted - 2006-01-31 : 03:51:30
If you're using simple DTS transformations, then make sure that you delete all the individual transformations, select all the columns on both sides, and then create one single transformation containing all the columns. This will increase the speed of the DTS package many times for a multi-column table.

Also make sure constraints and indexes are removed before your large bulk import; you can add them back on again later. For this volume of data that would be far quicker than doing the import with the indexes in place.

-------
Moo. :)

Kristen
Test

22859 Posts

Posted - 2006-01-31 : 10:15:07
Best to have any clustered index in place though, AND pre-sort the import data to match the clustered index.

Kristen

imsganesh
Starting Member

28 Posts

Posted - 2006-01-31 : 11:55:29
Thanks to you all! For the time being I'll pull the data in 6-month chunks, though I still haven't got to the cause of the error... If it is a volume issue, is there any configuration/setting to avoid it?

Thanks!
SG

Kristen
Test

22859 Posts

Posted - 2006-01-31 : 13:46:29
"if it is a volume issue, is there any configuration/setting to avoid it?"

I expect it is a timeout (or possibly a failure due to a full disk, or the size of the LOG file, or somesuch), but it is better NOT to work around that by just increasing the limits, because the failure is indicating that the system is under stress!!

If you work with 6 month "batches" then the TEMPDB space, or whatever, will be released after each batch, putting much less strain on the system.

We have a daily "cleanup" routine. It deletes about 8 million rows. I expect we COULD do it in one strike, but we do it 100,000 rows at a time. We have found that this has minimal impact on the system and thus runs smoothly, whereas I'm sure that if we allowed it to just delete the lot we'd have "stress related" problems.
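The arithmetic of that cleanup routine (figures taken from the post; on SQL Server 2000 the per-batch limit would typically be enforced with SET ROWCOUNT and a WHILE @@ROWCOUNT > 0 loop) can be sketched like this:

```python
# Sketch: how a delete of ~8 million rows splits into 100,000-row batches.
# Each batch commits separately, so locks and log space are released
# between batches instead of accumulating across one huge transaction.
def count_batches(total_rows, batch_size):
    batches, remaining = 0, total_rows
    while remaining > 0:
        deleted = min(batch_size, remaining)  # rows removed this pass
        remaining -= deleted
        batches += 1
    return batches

print(count_batches(8_000_000, 100_000))  # 80 batches
```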

I think the reason you aren't seeing a useful error message is that the stuff that is logged is truncated - note this last bit at the end:

Error string: [OLE/DB provider retu... Process Exit Code 1. The step failed.

so there might be something useful/revealing in the missing section.

But either way, if it were me I would definitely do the task in "sensibly sized batches".

Kristen

X002548
Not Just a Number

15586 Posts

Posted - 2006-01-31 : 14:48:39
Here's a better solution: tell them that you can't do the job because the AS400 admins won't cooperate.

Your best bet is bcp

Brett

8-)

Hint: Want your questions answered fast? Follow the direction in this link
http://weblogs.sqlteam.com/brettk/archive/2005/05/25/5276.aspx

Add yourself!
http://www.frappr.com/sqlteam

imsganesh
Starting Member

28 Posts

Posted - 2006-01-31 : 15:01:24
Hmmm... I think that should solve my problem. Thanks Brett and Kristen! Both of your suggestions will work!
Anyway, I did one more thing... I ran the same insert query in Query Analyzer for 1 year and it finished without any error.
Isn't it odd that it runs properly in QA and not in DTS?

Thanks!
SG

Kristen
Test

22859 Posts

Posted - 2006-01-31 : 15:13:12
"Isn't it odd that it runs properly in QA and not in DTS?"

No, I don't think so. Q.A. will have no timeout ... DTS probably will ... but my point about stressing the system still stands!

Kristen

X002548
Not Just a Number

15586 Posts

Posted - 2006-01-31 : 16:04:32
quote:
Originally posted by mr_mist

If you're using simple DTS transformations, then make sure that you delete all the individual transformations, select all the columns on both sides, and then create one single transformation containing all the columns. This will increase the speed of the DTS package many times for a multi-column table.

Also make sure constraints and indexes are removed before your large bulk import; you can add them back on again later. For this volume of data that would be far quicker than doing the import with the indexes in place.

-------
Moo. :)



MOO indeed....I'd still do bcp with a drop of the indexes...

Where there's a mist, there's a way...

MOO



Brett

8-)


imsganesh
Starting Member

28 Posts

Posted - 2006-01-31 : 20:31:50
Thanks to all for your responses!
SG