Please start any new threads on our new site at https://forums.sqlteam.com. We've got lots of great SQL Server experts to answer whatever question you can come up with.
Author: NewMedia42 (Starting Member, 35 Posts)
Posted - 2009-02-27 : 23:13:55
I have two tasks that run and update the same table - the table gets about 6 million new records per day from the two processes. Currently I check before each insert to verify that a matching item doesn't already exist - this is true probably > 80% of the time.

My question is: would it be faster to just always insert (regardless of whether a match already exists) and periodically run a task on the table to delete duplicates? This will obviously increase fragmentation, but is it worth exploring? Looking through the profiler, the code is spending 85% of its time checking for pre-existing entries, so eliminating that check could be a major speedup.
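One way to cut the per-row lookup cost without changing the dedupe strategy is to batch the work and do the existence check set-based. A minimal sketch, assuming a staging table for incoming rows and a single key column (`dbo.StagingTable`, `dbo.TargetTable`, and `ItemKey` are hypothetical names, not from the original post):

```sql
-- Insert only rows whose key is not already present, in one
-- set-based statement instead of a check per row.
INSERT INTO dbo.TargetTable (ItemKey, Payload)
SELECT s.ItemKey, s.Payload
FROM dbo.StagingTable AS s
WHERE NOT EXISTS (
    SELECT 1
    FROM dbo.TargetTable AS t
    WHERE t.ItemKey = s.ItemKey
);
```

With an index on `TargetTable.ItemKey`, the optimizer can satisfy the `NOT EXISTS` with a single join-style anti-semi-join rather than 6 million individual probes.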
Author: sodeep (Master Smack Fu Yak Hacker, 7174 Posts)
Posted - 2009-02-27 : 23:31:45
All three operations (update/delete/insert) deal with external and internal fragmentation. You might want to try SSIS with Lookup transformation tasks - there are plenty of examples under the SSIS topic.
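If the original poster does go the insert-everything-then-clean-up route, the periodic dedupe pass can be written with `ROW_NUMBER()` (SQL Server 2005 and later). A sketch, again with hypothetical table and column names (`ItemKey` as the duplicate key, `InsertedAt` as a tie-breaker):

```sql
-- Keep one copy of each ItemKey and delete the rest.
-- Deleting through the CTE removes rows from the underlying table.
;WITH Dupes AS (
    SELECT ROW_NUMBER() OVER (
               PARTITION BY ItemKey
               ORDER BY InsertedAt   -- keeps the earliest row per key
           ) AS rn
    FROM dbo.TargetTable
)
DELETE FROM Dupes
WHERE rn > 1;
```

On a table taking 6 million rows a day this delete should itself be batched (for example, looping with `TOP`) to keep the transaction log manageable.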
Author: darkdusky (Aged Yak Warrior, 591 Posts)