Putting Books Online Offline

One of the early announcements about SQL Server 2012 I remember is that Books Online (BOL) would only be available online. And of course that makes sense: if the SQL Server team wants to keep the library up to date, it makes no sense to ship it on the install media. But as a consultant I need BOL installed on my personal laptop, because it often happens that I'm not allowed on the customer's network with an external machine. No network means no internet, and that means no Books Online. Luckily, it's easy to make BOL available offline.
This is how it works.
Open Management Studio and click Help > Manage Help Settings

In the Help Library Manager click Install content from online

Scroll down in the Install Content from Online screen to the SQL Server 2012 section and click Add in the Actions column. Add additional content if desired and click Update.

After a couple of minutes Books Online is available offline.

Interrupted Identity Insert behaviour

While teaching the Implementing & Maintaining a Microsoft SQL Server 2008 FastTrack course I got a very simple question. But it's often these simple questions that make you doubt. The question was: "What if you manually insert some values in an Identity column? Will SQL Server autonumber from the new values or…". I never had to worry about this on production servers, and I was pretty sure SQL Server would be able to cope with manual identity inserts, but I don't want to give my students answers that I'm not sure of. So it was demo time:

First of all a table with an Identity column was created:

CREATE TABLE IdentityTest (
Autoseed int IDENTITY (1,1),
Comment varchar (20)
)
GO

Next I insert some rows that automatically get a value in the Autoseed column (GO 75 executes the preceding batch 75 times):

INSERT INTO IdentityTest
VALUES ('autoinsert')
GO 75

If I try to insert values into the Autoseed column myself, I get an error message:

INSERT INTO IdentityTest(Autoseed,Comment)
VALUES (76,'manual'),(77,'manual')
GO

Msg 544, Level 16, State 1, Line 1
Cannot insert explicit value for identity column in table 'IdentityTest' when IDENTITY_INSERT is set to OFF.

To insert values manually I need to enable identity inserts on the table with the following statement:

SET IDENTITY_INSERT IdentityTest ON
GO

Now I can do the insert. Note that when doing an identity insert I need to specify the column list, so this will work:

INSERT INTO IdentityTest(Autoseed,Comment)
VALUES (76,'manual'),(77,'manual')
GO

But this will not work:

INSERT INTO IdentityTest
VALUES (76,'manual'),(77,'manual')
GO

EXTRA TIP:
Dragging the Columns folder from Object Explorer to the query window automatically inserts the column list of the table.

Now I have two manually inserted rows in my table. To continue my demo I need to turn identity inserts back off:

SET IDENTITY_INSERT IdentityTest OFF
GO

To make it even more complicated I also delete some rows from the table to create some gaps:

DELETE FROM IdentityTest
WHERE Autoseed IN (69,77)

Now it's time to add some rows. Note that 77 was the highest inserted number, but it was deleted in the previous script:

INSERT INTO IdentityTest VALUES ('autoinsert')
GO 10

Now querying the table will give the following results:

SELECT MAX(autoseed)
FROM IdentityTest

This returns 87, so the auto increment started at 78, meaning that 77 was not reused although it was deleted.

SELECT * FROM IdentityTest
WHERE Autoseed IN (69,77)

This gives no results, so the empty gaps are not filled up again.

SELECT * FROM IdentityTest

This shows what happens on heaps (tables with no clustered index): value 78 appears at position 69 because it fills the physical gap on the data page.
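To round off the demo, it's easy to verify where the identity counter stands after all these experiments. A minimal sketch (DBCC CHECKIDENT with NORESEED only reports the current value, it changes nothing):

```sql
-- Report the current identity value without changing it
DBCC CHECKIDENT ('IdentityTest', NORESEED);

-- Or query it directly
SELECT IDENT_CURRENT('IdentityTest') AS CurrentIdentity;

-- If needed, the counter can be reseeded explicitly, e.g. back to 77:
-- DBCC CHECKIDENT ('IdentityTest', RESEED, 77);
```

After the inserts above, both report 87 as the current identity value.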

Book review: Microsoft SQL Server 2008 Administration with Windows PowerShell

In almost every training I teach I get the question: "What about PowerShell?". Students want to know what they can do with PowerShell, how it's used and, most importantly, where they can learn it. I learned the (very) basics of PowerShell from articles on the internet, demos at events and so on. But now I think I found a winner for DBAs who want to start learning PowerShell. The book Microsoft SQL Server 2008 Administration with Windows PowerShell by Ananthakumar Muthusamy (blog) and Yan Pan (blog) is a perfect guide for DBAs who are starting with PowerShell, or who already know PowerShell but want to use it to administer their MS SQL Servers. Beginning DBAs can use the book too, because it's full of tips, tricks and explanations on how and why certain tasks are done.

The first 4 chapters bring the PowerShell 'dummy' up to speed with the very basics of PowerShell. Readers learn how to use commands, create scripts and work with functions, parameters and so on. Extensive code examples help to master the basics fast.

In chapters 5 to 8 the writers dig deeper into the use of PowerShell and how to get information and get things done at the Windows level, covering the file system, registry, event logs, services and the WMI provider.

Finally, in chapters 9 to 13, the reader learns what a DBA can do with Windows PowerShell, starting with the provider for Configuration Management, then the SQL Server PowerShell provider, policies and SMO.

After these chapters the book could have stopped. Everything needed to get a DBA started with PowerShell and SQL Server has been covered, but the real added value of the book is about to begin. First, chapter 14 gives advice on using and creating standards within SQL Server and PowerShell. Nothing new, nothing spectacular, but it's nice to see the writers care about standards and want to help and guide beginning coders to use them.

In chapters 15 to 21 all the knowledge of the previous chapters is put into practice. The chapters are no easy read, but they show the reader how to create a complete PowerShell solution to inventory, monitor and manage a complete SQL Server environment. And even if the reader doesn't want to create such a solution, the code examples are great for looking up how a certain task can be accomplished with PowerShell.

Conclusion

Microsoft SQL Server 2008 Administration with Windows PowerShell is not a book to read on holiday or in bed. But I think it's a must-have on every DBA's bookshelf, whether the DBA is a starter or a senior. Everybody will find value in this book.

SQL 2008 R2 upgrade error: Wait on the Database Engine recovery handle failed

While preparing for an upgrade of a complex SQL Server 2000 installation to SQL Server 2008 R2, I had to figure out a way to reuse the MSDB database of the original SQL Server 2000 installation on the new SQL Server 2008 R2 setup. The reason: the third-party job scheduling tool refers to the jobs by their job_id, and neither scripting the jobs nor using the SSIS 'Transfer Jobs Task' let me keep the job_id. The easiest way to accomplish this is:

  • Backup the original SQL Server 2000 MSDB database
  • Restore the SQL Server 2000 MSDB database on a temporary SQL Server 2000 with the same patch level as the original
  • Do an in-place upgrade of the temporary SQL Server 2000 instance to SQL Server 2008 R2
  • Backup the upgraded SQL Server 2008 R2 MSDB database
  • Restore the upgraded SQL Server 2008 R2 MSDB database on the newly installed SQL Server 2008 R2 server

Unfortunately, trying out this approach I stumbled upon the following error during the in-place upgrade from SQL Server 2000 to SQL Server 2008 R2:

The SQL Server error log during setup is stored in '%programfiles%\Microsoft SQL Server\100\Setup Bootstrap\Log'. In this folder a 'Summary.txt' is found with the summary of the last install attempt. The newest subfolder contains more detailed log files. Opening the 'SQLServer_ERRORLOG_somedatetime.txt' and scrolling down gave me this error:

The database owner SID recorded in the master database differs from the database owner SID recorded in database 'msdb'. You should correct this situation by resetting the owner of database 'msdb' using the ALTER AUTHORIZATION statement.

And of course: since the msdb database is owned by the SQL login 'sa', the SID of the 'sa' account on the originating server is not the same as the SID of the 'sa' account on the upgrade server. To correct the situation and let the upgrade continue, I ran the following commands from a command prompt:

Stop the running SQL Server Service:

net stop mssqlserver

Start the SQL Server service with these startup options (minimal configuration plus trace flag 3608) so the upgrade scripts don't start running again:

net start mssqlserver /f /T3608

Use a dedicated admin connection to connect to the SQL Server Instance and change the owner of the MSDB database back to the ‘sa’ login:

sqlcmd -E -A
USE msdb
GO
sp_changedbowner 'sa'
GO
EXIT

Use the net stop mssqlserver command again to stop the SQL Server Service.
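As the error message itself suggests, the ownership can also be reset with ALTER AUTHORIZATION, which is the recommended replacement for the older sp_changedbowner procedure. A sketch of the equivalent statement:

```sql
-- Reset the owner of msdb to the local sa login
-- (equivalent to the sp_changedbowner call above)
ALTER AUTHORIZATION ON DATABASE::msdb TO sa;
```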

Now I can restart the SQL Server 2008 R2 setup program, but instead of choosing a new installation I need to choose to REPAIR my latest installation. After the repair finishes I finally have my upgraded MSDB database. And when restoring the database to my target server I must remember to change the owner of the restored MSDB database to the 'sa' login of the new server.

I registered for SQL Server Days

For the fourth time in a row I will be visiting the Belgian SQL Server Day(s), organised by the Belgian SQL Server User Group and Microsoft Belux. The first 3 editions were 1-day events filled with SQL stuff for DBAs, developers and BI people, brought by a mixture of local SQL Server gurus working for the event partners, Microsoft employees and international SQL speakers. And looking at the calendar nothing has changed… except that the organisation has found so much content they are spreading the event over 2 days, and starting from this year the event isn't free anymore. But hey, where else can you find 2 days of in-depth SQL Server training for €79 (if you register before October 1st)?

And besides all the knowledge you can gain, it's always nice to go to a local event with only SQL people, to meet colleagues, co-workers, students and tweeps in person.

See you there?

Executing multiple SQL Scripts

A customer asked if it was possible to execute multiple .sql scripts from a single command. I knew the SQLCMD utility could work with input files and that a .bat file could be created executing SQLCMD commands one by one. Looking up the details of the SQLCMD utility I stumbled upon the :r <filename> command, which parses additional Transact-SQL statements and SQLCMD commands from the file specified by <filename> into the statement cache. This means I can create a 'master' script that calls the other scripts, and I only need to execute the master script with the SQLCMD utility. So I'm going to create a database, add some tables and insert some data from 3 different scripts.

First I create the CREATE_TABLES.sql script to create the tables (duh). Because the statements are 'parsed into the statement cache', it should be possible to declare a variable in one script and use it in another, as long as no GO separates the declaration from the use. So I declare and fill a variable to hold the table count:

PRINT 'Creating tables'
GO

USE Axel8s
GO

IF OBJECT_ID('Article') IS NOT NULL
DROP TABLE dbo.Article
GO
CREATE TABLE dbo.Article
 (
 Article_ID int IDENTITY (1,1) NOT NULL PRIMARY KEY,
 Name varchar (50),
 Remark varchar (50)
 )
GO

IF OBJECT_ID('Category') IS NOT NULL
DROP TABLE dbo.Category
GO
CREATE TABLE dbo.Category
 (
 Category_ID int IDENTITY (1,1) NOT NULL PRIMARY KEY,
 Article_ID int,
 DateAdded DATETIME
 )
GO

DECLARE @iTableCount int
SET @iTableCount = 2

The INSERT_DATA.sql script will insert some data and print the value of the variable from the CREATE_TABLES script:

PRINT 'TOTAL TABLES CREATED = ' + CAST(@iTableCount AS varchar)
GO
PRINT 'INSERTING DATA INTO Article'
GO

USE Axel8s
GO

INSERT INTO dbo.Article (Name, Remark)
 SELECT 'Orange','The fruit ones'
GO
INSERT INTO dbo.Article (Name, Remark)
 SELECT 'Coke','To drink'
GO
INSERT INTO dbo.Article (Name, Remark)
 SELECT 'Hamburger','No not the Germans'
GO

Now it’s time to create a MASTER.sql script that will create the database and will call the other 2 scripts:

SET NOCOUNT ON
GO

PRINT ‘CREATING DATABASE’
IF EXISTS (SELECT 1 FROM sys.databases WHERE name = 'Axel8s')
DROP DATABASE Axel8s
GO
CREATE DATABASE Axel8s
GO

:On Error exit

:r c:\Axel8s\CREATE_TABLES.sql
:r c:\Axel8s\INSERT_DATA.sql

PRINT 'DATABASE CREATION COMPLETED SUCCESSFULLY'

Almost everything is in place now for a double-click creation of my new database. The final piece is a .bat file with the following command:

SQLCMD -S -E -dmaster -ic:\Axel8s\master.sql
PAUSE
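A small variation worth considering is the -b option of SQLCMD, which makes the utility exit with a non-zero errorlevel when an error occurs, so the batch file can react to a failed script. A sketch (the server name is a placeholder for your own instance):

```shell
REM create_Axel8s.bat - run the master script and report failures
REM "servername" is a hypothetical placeholder; -b aborts on SQL errors
SQLCMD -S servername -E -d master -i c:\Axel8s\master.sql -b
IF ERRORLEVEL 1 ECHO Database creation failed!
PAUSE
```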

And with a double-click on the create_Axel8s.bat file my new database is created:

 

Creating an ‘Attach all databases’ script

Finally, customers are more and more seeing the importance of a decent test environment before things are put into production. So as a DBA I get more questions about how to set up those test environments and how to keep them in sync with the production environment. One trick I pulled off lately for a 100+ database environment was the following:

The database files on the production server were on a SAN and, lucky me, the files for system and user databases were on separate (Windows) disks. So a test server was installed (MS SQL Server 2008 R2, while production was MS SQL Server 2000) with the new company defaults to do the upgrade tests. The only thing we needed to take care of was not to assign drive letters that were already used on the current production environment.

With the basic server setup in place it was time to get the user databases on the server. At this point the SAN team came into the picture. They cloned the production disks within seconds and were able to present them to the test server in the same order as on the production server. Switching back to the test server, the newly presented disks were assigned the same drive letters as on production. Finally it was time to get the databases online.

If there were only 5 databases, I could have manually attached them or have generated the attach scripts one by one on the production server. But because I had more than 100 databases to attach it was time to do some scripting.

First I created a temporary table on the production server to gather the file information of the user databases:

CREATE TABLE #DBFiles
(
 db varchar(50),
 fileid int,
 name varchar (50),
 filename varchar (250)
)

With the temp table in place I can start gathering the file information from the user databases. Using the sp_MSforeachdb stored procedure I can loop my SELECT statement over every user database:

INSERT INTO #DBFiles
 EXEC sp_MSforeachdb '
  USE ?
  IF ''?'' <> ''master'' AND ''?'' <> ''model'' AND ''?'' <> ''msdb'' AND ''?'' <> ''tempdb'' AND ''?'' <> ''distribution''
  BEGIN
    SELECT ''?'', SF.fileid, SF.name, SF.filename FROM sysfiles SF
  END'

Running through the gathered data made clear that there were databases with, let's say, quite a history. Files were added and removed over the years, so the file IDs were not consistent any more. Some databases had over 16 files, more than you can attach with 'sp_attach_db'. But while the production server was MS SQL Server 2000, the new test server was MS SQL Server 2008 R2. This means that if my database files are all in the same location, I only need the location of the primary data file in a 'CREATE DATABASE … FOR ATTACH' statement; SQL Server will take care of the rest of the files. Having all the file information already in place in my temp table, I can simply run this query to generate a script to attach all the user databases:

SET NOCOUNT ON

SELECT 'CREATE DATABASE ' + db + ' ON (FILENAME = ''' + RTRIM(filename) + ''') FOR ATTACH;'
FROM #DBFiles
WHERE fileid = 1

Setting the output of the above query to Text (Ctrl+T) results in a nice script that I can copy and paste to my new test server. Running the script and waiting 10 minutes is all I have to do to start testing on the new test environment.
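As a side note: on SQL Server 2005 and later the same list can be generated without a temp table or sp_MSforeachdb, straight from the sys.master_files catalog view. A sketch of that variant (not the statement I used on the SQL Server 2000 box above):

```sql
SET NOCOUNT ON

-- Primary data file (file_id = 1) of every user database;
-- database_id > 4 skips the system databases
SELECT 'CREATE DATABASE ' + QUOTENAME(DB_NAME(database_id))
     + ' ON (FILENAME = ''' + physical_name + ''') FOR ATTACH;'
FROM sys.master_files
WHERE file_id = 1
  AND database_id > 4
```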
