
 can BULK INSERT locate FIRSTROW using keywords?


steelkilt


Posted - 2005-02-01 : 13:43:38
Hi, here’s a snippet of the sproc I’m using to process a text file and then move it to a SQL table for cleaning. I use BULK INSERT as below:

SELECT @SQL = 'BULK INSERT new_table'
SELECT @SQL = @SQL + ' FROM '
SELECT @SQL = @SQL + '''\\server\folder\' + @FileChoice + ''' WITH (FIRSTROW=2, LASTROW=400, KEEPNULLS) '
EXEC (@SQL)  -- execute the dynamic statement

I use FIRSTROW=2 mainly because it has always worked with the “standard” form of this text file, which I show in Example 1 below. All the “junk” is just header stuff that I ignore. Article_ID marks the first valid row of the file, which contains the field names:

Example 1:

Junk
Junk
Junk
Junk
Junk
Junk
Junk
Junk
Article_ID field field etc…


Now the person who gives me the text files has thrown a curve. His new ones are formatted like Example 2 below, with less header junk. Not surprisingly, these files are crashing the sproc:

Example 2:

Junk
Junk
Article_ID field field etc…


My question: is there any way, using BULK INSERT, to tell the sproc to find “Article_ID” and treat that row as FIRSTROW every time? This would let me process a wider range of text file formats.

Note: I use LASTROW=400 because the row count between FIRSTROW and LASTROW is the same every time.

Thanks for any help you can provide.

dsdeming


Posted - 2005-02-01 : 14:05:04
BULK INSERT isn't that smart. The normal way around your problem is to use BULK INSERT to load all of the data into a single large varchar column in some sort of processing table, clean out the junk, and continue from there.
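A minimal sketch of that staging-table approach. The table, file, and LIKE patterns here are just examples; pick a FIELDTERMINATOR that never occurs in your file so each whole line lands in the one column:

```sql
-- Staging table with a single wide varchar column (names are examples)
CREATE TABLE dbo.raw_lines (line varchar(8000) NULL)

-- Load every line of the file as-is; no FIRSTROW guesswork needed.
-- The '|' terminator is an assumption: use any string absent from the file.
BULK INSERT dbo.raw_lines
FROM '\\server\folder\myfile.txt'
WITH (FIELDTERMINATOR = '|', ROWTERMINATOR = '\n')

-- Throw away the header junk, including the Article_ID field-name row,
-- assuming no real data row ever starts with these strings
DELETE FROM dbo.raw_lines
WHERE line LIKE 'Junk%'
   OR line LIKE 'Article_ID%'
```

From there you can split each surviving line into real columns with CHARINDEX/SUBSTRING and INSERT the results into new_table.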

Another option might be to use bcp, which will allow you to capture bad rows into an output file. Then you could scan the output file after processing to make sure it contains nothing but those junk header rows.
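One possible bcp invocation along those lines (database, server, and file names are hypothetical):

```
bcp MyDB.dbo.new_table in \\server\folder\myfile.txt -c -F 2 -L 400 -m 50 -e \\server\folder\errors.txt -S myserver -T
```

Here -c loads in character mode, -F/-L bound the rows as in your WITH clause, -e writes each rejected row to errors.txt, and -m raises the maximum error count so a few junk header rows don't abort the whole load; -T uses a trusted connection (swap in -U/-P for SQL logins).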

Dennis