Creating an ‘Attach all databases’ script

Finally, customers increasingly see the importance of a decent test environment before things are put into production. So as a DBA I get more and more questions about how to set up those test environments and how to keep them in sync with the production environment. One trick I pulled off lately for an environment with more than 100 databases was the following:

The database files on the production server were on a SAN and, lucky me, the files for the system and user databases were on separate (Windows) disks. So a test server was installed (MS SQL Server 2008 R2, while production was MS SQL Server 2000) with the new company defaults to do the upgrade tests. The only thing we had to watch out for was not assigning drive letters that were already in use on the production environment.

With the basic server setup in place it was time to get the user databases onto the server. At this point the SAN team came into the picture. They cloned the production disks within seconds and were able to present them to the test server in the same order as on the production server. Back on the test server, the newly presented disks were assigned the same drive letters as on production. Finally it was time to get the databases online.

Had there been only 5 databases, I could have attached them manually or generated the attach scripts one by one on the production server. But with more than 100 databases to attach, it was time to do some scripting.

First I created a temporary table on the production server to gather the file information of the user databases:

CREATE TABLE #DBFiles
(
 db sysname,
 fileid int,
 name sysname,
 filename nvarchar(260)
)

With the temp table in place I can start gathering the file information from the user databases. Using the (undocumented) sp_MSforeachdb stored procedure I can loop my SELECT statement over every database, filtering out the system databases:

INSERT INTO #DBFiles
 EXEC sp_MSforeachdb '
  USE [?]
  IF ''?'' NOT IN (''master'', ''model'', ''msdb'', ''tempdb'', ''distribution'')
  BEGIN
    SELECT ''?'', SF.fileid, SF.name, SF.filename FROM sysfiles SF
  END'
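
Before generating anything, it is worth taking a quick look at what came back. A simple count per database, for example, immediately shows which databases carry an unusual number of files:

SELECT db, COUNT(*) AS file_count
FROM #DBFiles
GROUP BY db
ORDER BY file_count DESC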

Running through the gathered data made clear that there were databases with, let's say, quite a history. Files were added and removed over the years, so the file_ids were not contiguous any more. Some databases had over 16 files, more than you can attach with 'sp_attach_db'. But while the production server was MS SQL Server 2000, the new test server was MS SQL Server 2008 R2. This means that as long as all the database files are still at the paths recorded in the primary data file (and they are, since the drive letters match production), I only need the location of that primary data file in a 'CREATE DATABASE … FOR ATTACH' statement. SQL Server will take care of the rest of the files. Having all the file information already in place in my temp table, I can simply run this query to generate a script that attaches all the user databases:

SET NOCOUNT ON

SELECT 'CREATE DATABASE ' + QUOTENAME(db) + ' ON (FILENAME = ''' + RTRIM(filename) + ''') FOR ATTACH;'
FROM #DBFiles
WHERE fileid = 1

Setting the query output to Text (Ctrl+T) results in a nice script that I can copy and paste to my new test server. Starting the script and waiting for 10 minutes is all I have to do to start testing on the new test environment.
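
For reference, each generated line looks something like this (the database name and path here are made up for the example):

CREATE DATABASE [Sales] ON (FILENAME = 'G:\Data\Sales.mdf') FOR ATTACH;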

BCP through CLR

While giving a training course last week, I had some time to play around with SQL Server. And what's better than exploring something new that might also be interesting for some of my customers? Because one of my customers does a lot of bcp ins and outs through Stored Procedures in SQL 2000 using xp_cmdshell, I decided to write a little demo that does a bulk copy of data using CLR, giving them another reason to upgrade their SQL Servers. So let's get started.

Before I can start I need, of course, a test environment. So I create a BulkIn and a BulkOut database. To get some data in, I use the Import/Export Wizard to copy the Person.Person table from the AdventureWorks2008R2 database into the BulkOut.dbo.Person table (a scripted alternative to the wizard follows the table script below). The dbo.Person table in database BulkIn is created with the following script:

USE [BulkIn]
GO

SET ANSI_NULLS ON
GO

SET QUOTED_IDENTIFIER ON
GO

CREATE TABLE [dbo].[Person](
 [BusinessEntityID] [int] NOT NULL,
 [Title] [nvarchar](8) NULL,
 [FirstName] [nvarchar](50) NOT NULL,
 [MiddleName] [nvarchar](50) NULL,
 [LastName] [nvarchar](50) NOT NULL,
 [Suffix] [nvarchar](10) NULL
) ON [PRIMARY]
GO
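
For completeness, the database creation and the wizard's copy step can also be scripted. This is a minimal sketch, assuming AdventureWorks2008R2 lives on the same instance and leaving all database options at their defaults:

CREATE DATABASE BulkOut;
CREATE DATABASE BulkIn;
GO

-- SELECT INTO creates BulkOut.dbo.Person and copies the rows in one go
SELECT BusinessEntityID, Title, FirstName, MiddleName, LastName, Suffix
INTO BulkOut.dbo.Person
FROM AdventureWorks2008R2.Person.Person;
GO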

Because I'm going to use the CLR, I need to enable it at server level. And because my bulk copy normally will not stay on my instance, I will create an Assembly with Permission Level External, which also means the database hosting it has to be marked as Trustworthy:

USE master
GO

EXEC sp_configure 'clr enabled', 1
RECONFIGURE
GO

ALTER DATABASE BulkOut SET TRUSTWORTHY ON
GO
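
If you want to double-check that the option is active, sys.configurations shows the running value:

SELECT name, value_in_use
FROM sys.configurations
WHERE name = 'clr enabled';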

Now it's time to open Visual Studio to do some programming. I choose to create a Visual Basic SQL CLR Database Project because I don't want @chrissie1 to write one of those posts again. And to be honest, my last programming experience was with VB6 a long time ago, so VB.NET is easier for me. When I create the project I must choose .NET Framework 3.5, otherwise I will get errors when I try to deploy my assembly to SQL Server:

The next thing I need to do is set up a database reference, and I choose to point it at the BulkOut database. So when I deploy the assembly later, it will be generated in the BulkOut database:

As I already mentioned, the Permission Level needs to be External because in normal circumstances the bulk copy will not stay on my SQL Server instance. This is set on the Database tab of the project's Properties screen:

The last thing to do is add a new Stored Procedure to my project, and then I can finally start writing some code:

The Stored Procedure is added and I can start writing code after the 'Add your code here' comment. The first thing to do is build up the connection strings to the databases so I can easily reference them further on:

        Dim srcConnStr As String = "Data Source = (local); Integrated Security = true; Initial Catalog = BulkOut"
        Dim dstConnStr As String = "Data Source = (local); Integrated Security = true; Initial Catalog = BulkIn"

The logic says I need to open a connection to the source database:

        Using SrcConn As SqlConnection = New SqlConnection(srcConnStr)
            SrcConn.Open()

Next I’m going to read the data from my source table using a SqlDataReader:

            Dim cmdSrcData As SqlCommand = New SqlCommand("SELECT BusinessEntityID, Title, FirstName, MiddleName, LastName, Suffix FROM Person;", SrcConn)
            Dim srcReader As SqlDataReader = cmdSrcData.ExecuteReader()

To be able to write to my destination database I first need a connection:

            Using dstConn As SqlConnection = New SqlConnection(dstConnStr)
                dstConn.Open()

And finally I can start the BulkCopy of my data:

                Using bulkCopy As SqlBulkCopy = New SqlBulkCopy(dstConn)
                    bulkCopy.DestinationTableName = "Person"
                    bulkCopy.WriteToServer(srcReader)

I finish by closing all the Using blocks, but the DataReader needs to be closed explicitly first:

                    srcReader.Close()
                End Using
            End Using
        End Using
    End Sub

This is all the coding I need. Now I can deploy the assembly to SQL Server using Visual Studio. When the deploy succeeds I can find the spd_BulkCopy Stored Procedure in my BulkOut database, and when I run it like I normally run a Stored Procedure, it starts executing. After the message 'The command(s) completed successfully.' I can find all my data in the Person table in database BulkIn.
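
Running and verifying it comes down to this (assuming the deploy created the procedure in the dbo schema):

USE BulkOut
GO

EXEC dbo.spd_BulkCopy
GO

-- The destination table should now hold the copied rows
SELECT COUNT(*) AS CopiedRows FROM BulkIn.dbo.Person;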
