Upgrading from SQL Server 7 to 2000 (Part 2)

By Mark Caldwell on 29 May 2002 | Tags: Installation

In Part 1, I introduced the basic options for upsizing a SQL Server from version 7.0 to 2000. One important part of that article was the list of what has to be moved. Here in part 2, I will take that list and explain how I accomplished all of this on 53 separate databases in under 2 hours total, from a remote location.


My client’s configuration was such that we wanted to make a clean install of SQL 2000 Standard Edition on new, more powerful hardware, and then migrate the databases from the old SQL 7 server. We had the luxury that our databases could be offline for the duration of the move, as these are not mission-critical, but of course we wanted to minimize the downtime as much as possible because they are used by our web site. Because of this luxury, I opted for the detach/attach approach so I could move all 53 databases in batch. If you do not have this luxury, you may need to use another method, such as Backup and Restore followed later by another restore of the log file.

Because I would be working after hours (actually starting at 10:00 PM) I elected to work remotely, from my home office. I have DSL and my client set up a VPN connection, so my Enterprise Manager and Query Analyzer worked just as if I were sitting at my desk inside their office. The databases I moved are not large. The biggest were about 100 MB, and the median size is under 10 MB, so "your mileage may vary" with regard to how long it takes to move and convert. Regardless, the following approach will work for you.

The Tasks

Using the list of what has to move from Part 1, I expanded this to a specific numbered list of tasks that I would follow. The expanded list is:

  1. Back up the databases and the server, and back up the backups again. By using the detach – copy to new machine – attach approach, it is very easy to recover from failure, because I can just reactivate the old server (with the databases reattached). But anything can go wrong, so do backups just in case.

  2. Transfer Logins from OldServer to NewServer (see Microsoft Knowledge Base article Q246133, How to Transfer Logins and Passwords Between SQL Servers).

  3. Generate script of Jobs on OldServer. Save for later use. To do this in Enterprise Manager, open the Management folder on your server. Then open the SQL Server Agent folder and right-click on Jobs. Choose the All Tasks menu and finally the Generate SQL Scripts option.

  4. Set all user databases on OldServer to DBO Use Only to prevent users from making any changes while I’m working on them.

  5. Kill any outstanding SPIDs (connections). A database cannot be detached while it still has active connections.

  6. Detach all databases (see scripts below)

  7. Copy database files at OS level (MDFs, LDFs, and NDFs)

  8. Reattach the databases to OldServer as a just-in-case fallback, and re-flag them as DBO Use Only so they are not used by mistake.

  9. Attach all databases to NewServer (the attach process automatically runs the upgrade that converts each database to SQL 2000 format, compatibility level 80).

  10. Reindex all tables in all databases on NewServer (see scripts below).

  11. Update Statistics in all databases on NewServer.

  12. Fix User-to-Login mappings in each database on NewServer (See chadmat’s script).

  13. Enable any Full-Text Indexing on NewServer and rebuild the catalogs. Use the sp_fulltext_database and sp_fulltext_catalog procedures or Enterprise Manager.

  14. Execute Job Script created in step 3 on NewServer.

  15. Swap IP Addresses between servers. Our web servers all have the SQL Client Network Utility installed and have aliases mapped to the server’s IP address. As long as the new SQL Server ends up on the same IP Address, no change needs to be made to the web server settings.

  16. Test, test, test.

  17. Celebrate Victory!
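Most of these steps are scripted in the next section, but step 5 (killing the outstanding connections) is not, so here is a sketch in the same script-building style. It is an illustration, not the exact script from the migration; adjust the filters for your environment:

SELECT 'KILL ' + CONVERT(varchar(5), spid)
FROM master.dbo.sysprocesses
WHERE dbid > 6		-- only connections to user databases
  AND spid <> @@SPID	-- never kill your own connection

Paste the results into a new Query Analyzer window and execute them, just like the generated scripts below.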

Scripting the Work

In order to minimize downtime, I pre-wrote SQL scripts to handle most of the activity. In fact I wrote scripts to build scripts for me. I would execute the script-building scripts ahead of time and save their output into new files, with names that refer back to the Task Step Number above that they relate to. So for example, I would name the Set DBO Use Only script 4_SetDBOUseOnly.sql.

Here are the key scripts I used:

DBO Use Only

This script:

SELECT 'sp_dboption ''' + Name + ''', ''dbo use only'', ''TRUE''
G' + 'O
'
FROM master.dbo.sysdatabases
WHERE dbid > 6
ORDER BY Name

when executed in Query Analyzer with results in Text (instead of Grid) produces a result that looks like:

sp_dboption 'UserDBName1', 'dbo use only', 'TRUE'
GO

sp_dboption 'UserDBName2', 'dbo use only', 'TRUE'
GO

Copy the results text and paste it into a new Query Analyzer window and you suddenly have the script to set all of your user databases into DBO Use Only mode. Setting them into this mode will prevent users from connecting to your databases while you are working on them.

A couple of notes about the original script: GO is the batch separator keyword. Query Analyzer detects it regardless of quotation marks, and having it in the middle of your script where you don’t want a batch break can really mess things up. That is why I had to separate the G and the O and concatenate them back together. It looks funny in the original script, but it produces the desired results. I also like my scripts spaced apart, so there are some carriage returns enclosed in the quotation marks, again to produce the desired output.

The WHERE dbid > 6 clause simply tells the script to skip the 6 built-in databases (master, tempdb, model, msdb, pubs, and northwind, numbered 1-6 in that order). These databases already exist on the new server in their new format, so there is no need to migrate them. If you attempt to migrate the master database, you will likely cause yourself no end of grief, so don’t do it! The ORDER BY Name clause is simply there for convenience, so my script is alphabetical.
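Once the migration is verified, you will want to take the databases on NewServer back out of DBO Use Only mode. A sketch, simply flipping the flag in the script above to FALSE:

SELECT 'sp_dboption ''' + Name + ''', ''dbo use only'', ''FALSE''
G' + 'O
'
FROM master.dbo.sysdatabases
WHERE dbid > 6
ORDER BY Name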

Detach Databases

The script to build the detach statements is very similar to the DBO Use Only script above. It is simply:

SELECT 'sp_detach_db ''' + Name + '''
G' + 'O
'
FROM master.dbo.sysdatabases
WHERE dbid > 6
ORDER BY Name

which produces:

sp_detach_db 'UserDBName1'
GO

sp_detach_db 'UserDBName2'
GO
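One optional tweak: sp_detach_db accepts a second parameter, @skipchecks, and passing 'true' skips the UPDATE STATISTICS pass that detach otherwise performs. Since step 11 rebuilds the statistics on NewServer anyway, a variation like this (a sketch; verify the parameter in Books Online for your build) can shave a little time off each detach:

SELECT 'sp_detach_db ''' + Name + ''', ''true''
G' + 'O
'
FROM master.dbo.sysdatabases
WHERE dbid > 6
ORDER BY Name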

Attach Databases

You can have a script create your Attach Databases script for you, as long as you execute it BEFORE you detach them from OldServer. Then you can use the same script to reattach to OldServer as to NewServer, with the potential exception of a file path change. If the path to your database files is going to be different on the new server, then you can do a global search & replace to change this in the final script.

The following script will query the server for the current file locations of the primary data file and build the attach statements:

SELECT 'sp_attach_db @dbname = N''' + Name + ''',
	@filename1 = N''' + filename + ''',
	@filename2 = N''' + 'D:\MSSQL7\data\' + Name + '.ldf''
G' + 'O
'
FROM master.dbo.sysdatabases
WHERE dbid > 6
ORDER BY Name

which produces the following script:

sp_attach_db @dbname = N'UserDBName1',
	@filename1 = N'd:\MSSQL7\data\UserDBName1.mdf',
	@filename2 = N'D:\MSSQL7\data\UserDBName1.ldf'
GO

sp_attach_db @dbname = N'UserDBName2',
	@filename1 = N'd:\MSSQL7\data\UserDBName2_Data.mdf',
	@filename2 = N'D:\MSSQL7\data\UserDBName2.ldf'
GO

WARNING: This script is not exactly correct for the log file portion. The actual filenames are stored in sysfiles inside each database; I just hand-edited the output of this script for my purposes, because most of our databases have only one data file and one log file. If you want the system to build an attach command that includes every file perfectly, run the following script in each database (one caveat: it labels every file after the first as @filename2, so if a database has more than two files you will need to renumber those parameters @filename3, @filename4, and so on by hand):

declare @FileList varchar(1000)

select @FileList = coalesce(@FileList + ', 
	@filename2 = N''', '@filename1 = N''') + rtrim(filename) + ''''
from sysfiles

select 'sp_attach_db @dbname = N''' + db_name() + ''', 
	' + @FileList

which produces:

sp_attach_db @dbname = N'UserDBName3', 
	@filename1 = N'd:\MSSQL7\data\UserDBName3.MDF', 
	@filename2 = N'D:\MSSQL7\data\UserDBName3.ldf'

The trick then, if you have multiple databases, is either to run this separately in each one or to find an ingenious way to execute it across all DBs in one shot; perhaps someone here at SQLTeam will. I was getting pressed for time, so I just did the hand edit described above.
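For what it is worth, here is one possible one-shot approach using the undocumented sp_MSforeachdb procedure, which substitutes each database name for the ? placeholder. Treat it as a sketch I did not use in the actual migration, and note how every quotation mark from the original script has to be doubled again inside the outer string:

EXEC sp_MSforeachdb 'USE [?]
IF DB_ID() > 6	-- skip the 6 built-in databases
BEGIN
	DECLARE @FileList varchar(1000)
	SELECT @FileList = COALESCE(@FileList + '', 
		@filename2 = N'''''', ''@filename1 = N'''''') + RTRIM(filename) + ''''''''
	FROM sysfiles
	SELECT ''sp_attach_db @dbname = N'''''' + DB_NAME() + '''''', 
		'' + @FileList
END'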

Reindex All Tables in All Databases

The following script was assembled in cooperation with ToddV in the SQLTeam forums. Thanks, Todd, for your help.

DECLARE @SQL  Varchar(8000)
SET @SQL = ''

SELECT @SQL = @SQL + 'EXEC [' + Name +
    ']..sp_MSforeachtable @command1=''DBCC DBREINDEX (''''?'''')''' + Char(13)
FROM MASTER..Sysdatabases
WHERE dbid > 6	-- Skip 6 built-in DBs.  Remove this line to include others in process

PRINT @SQL	-- To see the code you're about to execute.
EXEC (@SQL)	-- Run the reindex in every database

Unlike the previous scripts, whose results you had to copy and execute, this one does all the work for you. It uses the officially undocumented stored procedure sp_MSforeachtable. You can find a little information on this stored procedure in The Guru’s Guide to Transact-SQL by Ken Henderson. In short, it loops through every table in a database and executes the command in the @command1 parameter, substituting the name of the table currently being processed for the question mark.
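To see the substitution in isolation, here is what a single-database call looks like. Run it from inside the database whose tables you want to reindex, and the procedure replaces the ? with each table name in turn:

EXEC sp_MSforeachtable @command1 = 'DBCC DBREINDEX (''?'')'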

This script loops through all the databases and creates a Dynamic SQL statement to execute sp_MSforeachtable in every database. (Note: there is also an sp_MSforeachdb stored procedure, but because of the way they are written, you cannot nest sp_MSforeachtable inside sp_MSforeachdb ... argh!)

Update Statistics in All Databases

And finally, to update the statistics in all databases, similar to the scripts above, you can use:

SELECT 'Use ' + Name + '
exec sp_updatestats
G' + 'O
'
FROM master.dbo.sysdatabases
WHERE dbid > 6
ORDER BY Name

to produce the following:

Use UserDBName1
exec sp_updatestats
GO

Use UserDBName2
exec sp_updatestats
GO

Copy and paste the above into a new Query Analyzer window and execute it. This updates all the statistics so that the query optimizer has fresh, current information to use in processing.


That wraps it up. As I said earlier, I ran each of these scripts that build other scripts before I began the actual migration so that I was ready and could just execute each of the final scripts in quick succession. When I began the planning for this migration, I was a little intimidated, but as I dove into the process and started outlining the steps, like I have here, I soon realized that it was all pretty simple, albeit time-consuming. But then again, isn’t that the idea? Spend plenty of time in preparation so that when it comes time for the real performance, it all goes smoothly. It has been over a month since we did the conversion to our production server, and I have not heard of any code changes that were needed.

Good luck in your migrations! I hope this has been of help to you.
