Please start any new threads on our new
site at https://forums.sqlteam.com. We've got lots of great SQL Server
experts to answer whatever question you can come up with.
Author | Topic
Ophidian
Starting Member
3 Posts |
Posted - 2004-06-04 : 00:43:47
Dear friends,

I have a query that inserts from a temp table into the real table. The temp table holds a couple of million 'processed' records. Somehow, the log file fills up and causes problems. Is there a faster and more efficient way to insert data from one table into another, without using the INSERT INTO method?

Thanks,
Ophidian
AndrewMurphy
Master Smack Fu Yak Hacker
2916 Posts |
Posted - 2004-06-04 : 05:21:24
You could use a WHILE loop and process the records in batches, with each batch wrapped inside its own transaction, so that the batches get committed to the database as you go and the log size is controlled within a defined range.
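A minimal sketch of that batched approach, assuming the temp table has a sequential integer column (here called `RowId`) to walk through; the table and column names are illustrative, not from the original post:

```sql
-- Copy rows in fixed-size batches, committing each batch in its own
-- transaction so the log can be truncated/reused between batches.
DECLARE @batch int, @minId int, @maxId int
SET @batch = 50000

SELECT @minId = MIN(RowId), @maxId = MAX(RowId) FROM #SourceTemp

WHILE @minId <= @maxId
BEGIN
    BEGIN TRANSACTION

    INSERT INTO TargetTable (Col1, Col2)
    SELECT Col1, Col2
    FROM #SourceTemp
    WHERE RowId BETWEEN @minId AND @minId + @batch - 1

    COMMIT TRANSACTION

    SET @minId = @minId + @batch
END
```

With the database in SIMPLE recovery mode, the committed log space is freed at each checkpoint, so the log stays small regardless of the total row count.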
Kristen
Test
22859 Posts |
Posted - 2004-06-04 : 05:22:33
Use DTS; it does its work in small-ish batches. If the data is in a file, or if you want to copy the data between remote servers, it may be more efficient to export to a file and then import it; you can use BCP (the Bulk Copy Program) or the BULK INSERT statement.

Kristen
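A minimal sketch of the export/import route; the server, database, table names, and file path are assumptions for illustration:

```sql
-- Export with the bcp utility from the command line (outside SQL):
--   bcp MyDb.dbo.SourceTable out C:\data\rows.bcp -n -T -S MyServer
--
-- Then load the file with BULK INSERT. BATCHSIZE commits every N rows
-- so the log does not have to hold the whole load in one transaction;
-- TABLOCK helps qualify for minimally logged bulk operations.
BULK INSERT dbo.TargetTable
FROM 'C:\data\rows.bcp'
WITH (DATAFILETYPE = 'native', BATCHSIZE = 50000, TABLOCK)
```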
kselvia
Aged Yak Warrior
526 Posts |
Posted - 2004-06-04 : 12:36:45
How many rows are already in your permanent table? If there are comparatively few, you might be able to do something like this:

SELECT * INTO NEW_TABLE FROM #temptable
UNION ALL
SELECT * FROM PERMANENT_TABLE

DROP TABLE PERMANENT_TABLE
EXEC sp_rename 'NEW_TABLE', 'PERMANENT_TABLE'

Add CREATE INDEX statements if necessary to rebuild the indexes on PERMANENT_TABLE. SELECT INTO is not logged in the same way as INSERT INTO, so your log file should not fill up.

Alternatively, you might be able to drop most of the indexes from your permanent table before inserting into it and then rebuild them afterwards (again, depending on how many rows it already has, this may or may not be more efficient).

Also run

EXEC sp_recompile 'PERMANENT_TABLE'

after recreating it, so that stored procedures and views recognize that its object ID, statistics, and query plans have changed.
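The drop-and-rebuild-indexes variant mentioned above might look like this; the index and column names are assumed for illustration:

```sql
-- Dropping nonclustered indexes before a large insert avoids logging
-- per-row index maintenance; rebuilding them afterwards is usually
-- cheaper when the loaded rows dominate the table.
DROP INDEX PERMANENT_TABLE.IX_Permanent_Col1

INSERT INTO PERMANENT_TABLE (Col1, Col2)
SELECT Col1, Col2 FROM #temptable

CREATE INDEX IX_Permanent_Col1 ON PERMANENT_TABLE (Col1)

-- Force dependent procedures to recompile against the new statistics.
EXEC sp_recompile 'PERMANENT_TABLE'
```

Note that the inserts themselves are still fully logged; this only removes the index-maintenance portion of the log traffic.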
Ophidian
Starting Member
3 Posts |
Posted - 2004-06-07 : 02:28:53
Thanks for all the comments. At the moment, breaking the insert into small batches seems to work fine. As it is a long-running query, and other users might be accessing the data at the same time, I'm not sure about dropping the indexes and recreating them, or anything like that...