INSERT INTO .... (millions of records)


Ophidian
Starting Member

3 Posts

Posted - 2004-06-04 : 00:43:47
Dear friends,

I have a query that inserts from a temp table into the real table.
The temp table holds a couple of million 'processed' records.

The insert fills up the log file, which causes problems.

Is there another, faster and more efficient way to insert data from one table into another, without using the INSERT INTO method?


Thanks,
Ophidian.

AndrewMurphy
Master Smack Fu Yak Hacker

2916 Posts

Posted - 2004-06-04 : 05:21:24
You could use a WHILE loop and process the records in batches, each wrapped in its own transaction, so that the batches get committed as you go and the log size is kept within a defined range.
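
For example, a minimal sketch of that loop, assuming #TempTable has a monotonically increasing integer key column ID and that PermanentTable's columns match (all the names here are placeholders):

DECLARE @minID int, @maxID int, @batch int
SET @batch = 50000

SELECT @minID = MIN(ID), @maxID = MAX(ID) FROM #TempTable

WHILE @minID <= @maxID
BEGIN
    BEGIN TRAN

    -- copy one key range per pass
    INSERT INTO PermanentTable (ID, Col1, Col2)
    SELECT ID, Col1, Col2
    FROM #TempTable
    WHERE ID >= @minID AND ID < @minID + @batch

    -- each commit lets committed log space be reused,
    -- given frequent log backups or the SIMPLE recovery model
    COMMIT TRAN

    SET @minID = @minID + @batch
END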

Kristen
Test

22859 Posts

Posted - 2004-06-04 : 05:22:33
Use DTS; it does its work in small-ish batches. If the data is already in a file, or if you want to copy it between remote servers, it may be more efficient to export to a file and then import it again; for that you can use BCP (the Bulk Copy Program) or the BULK INSERT statement.
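
For instance, a round trip in native format might look like this (the server, database, table, and file names are all placeholders):

-- export to a native-format file (run bcp from a command prompt)
bcp MyDB.dbo.SourceTable out C:\data\rows.bcp -n -S MYSERVER -T

-- re-import in batches of 50,000 rows; with BATCHSIZE each
-- batch commits as its own transaction, so the log can be
-- truncated between batches
BULK INSERT MyDB.dbo.PermanentTable
FROM 'C:\data\rows.bcp'
WITH (DATAFILETYPE = 'native', BATCHSIZE = 50000, TABLOCK)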

Kristen

kselvia
Aged Yak Warrior

526 Posts

Posted - 2004-06-04 : 12:36:45
How many rows are already in your permanent table?

You might be able to do something like this if there are comparatively few rows:

SELECT * INTO NEW_TABLE FROM #temptable
UNION ALL
SELECT * FROM PERMANENT_TABLE

DROP TABLE PERMANENT_TABLE
EXEC sp_rename 'NEW_TABLE', 'PERMANENT_TABLE'

Add CREATE INDEX statements afterwards if you need to rebuild the indexes on PERMANENT_TABLE.

SELECT INTO is minimally logged (under the SIMPLE or BULK_LOGGED recovery model), unlike INSERT INTO, so your log file should not fill up.

Alternatively, you might be able to drop most of the indexes on your permanent table before inserting into it, then rebuild them afterwards, as sketched below (again, depending on how many rows the table already has, this may or may not be more efficient).
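
A minimal sketch of that approach (the index name and column are placeholders):

-- drop a nonclustered index before the big insert
DROP INDEX PERMANENT_TABLE.IX_Permanent_Col1

-- ... run the batched insert here ...

-- rebuild it afterwards
CREATE INDEX IX_Permanent_Col1 ON PERMANENT_TABLE (Col1)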

Also run

EXEC sp_recompile 'PERMANENT_TABLE'

after recreating it, so that stored procedures and views that reference the table are recompiled; its object ID, statistics, and query plans will all have changed.


Ophidian
Starting Member

3 Posts

Posted - 2004-06-07 : 02:28:53
Thanks for all the comments. At the moment, breaking the insert into small batches seems to work fine.
Since it is a long-running query, and other users might be accessing the data at the same time, I'm not sure about dropping and recreating the indexes...
   
