“Get Unused Index Deletion Candidates” Tool for SQL Server 2005

Scenario: In the “Get Unused Indexes” Tool for SQL Server 2005 post, I presented a script to capture unused indexes for a given database. Specifically, I capture “unused index” information in the DBA.dbo.UnusedIndexes table. After we collect several days of data, we can perform a trend analysis. How can we accomplish this task?

Solution: Here’s a companion stored procedure that complements the “GetUnusedIndexes” stored procedure and assists in performing a trend analysis.

Conventions: One convention bears comment.

  1. It’s my standard practice to create a fixed-space “DBA” database on a given SQL Server instance. It’s a workspace where I’ll perform DBA analysis tasks and, optionally, persist related metrics (akin to SQL Server’s usage of its “tempdb” database). I’ll place DBA objects, like this script’s stored procedure and related table, in the DBA database.

Script Highlights: I schedule the “DBA.dbo.GetUnusedIndexes” stored procedure to run once daily for the given databases of interest. Therefore, since we ask the question once per day, an index is either used or unused for a particular day. Let’s call a day where SQL Server does NOT use an index a “hit” (and not a “miss”). So if a particular index has 19 hits out of our 21-day sample, it has a “score” of roughly 90%; that is, SQL Server did not use that index about 90% of the time. With those definitions behind us, I can describe the script’s three parameters (a small sketch of the score calculation follows the parameter list).

  1. @DBName: The name of the database against which you want to perform the “unused index” trend analysis
  2. @PercentDaysUnused: The “score” (see above) threshold at which you consider an index unused. I err on the conservative side and use a value of .9; in other words, if the index goes unused at least 90% of the time, I consider it virtually unused.
  3. @Debug: A bit to either enable or disable debug mode (showing intermediate dynamic T-SQL queries and so on)
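
To make the score arithmetic concrete, here’s a minimal standalone sketch of the calculation the stored procedure performs. It assumes the DBA.dbo.UnusedIndexes table described in the earlier post, uses 'AdventureWorks' purely as a placeholder database name, and (for brevity) skips the DaysSinceLastRestart filter the stored procedure applies:

-- Sketch only: one "hit" = one daily sample (BatchDate) in which the index was captured as unused
DECLARE @PercentDaysUnused DECIMAL (2, 2)
SET @PercentDaysUnused = .9
;WITH Batches AS
(
	SELECT COUNT(DISTINCT BatchDate) AS BatchCount
	FROM DBA.dbo.UnusedIndexes
	WHERE DatabaseName = 'AdventureWorks'	-- placeholder database name
),
Hits AS
(
	SELECT SchemaName, TableName, IndexName, COUNT(DISTINCT BatchDate) AS Hits
	FROM DBA.dbo.UnusedIndexes
	WHERE DatabaseName = 'AdventureWorks' AND IndexName IS NOT NULL
	GROUP BY SchemaName, TableName, IndexName
)
SELECT
	h.SchemaName,
	h.TableName,
	h.IndexName,
	h.Hits,
	b.BatchCount,
	CAST(h.Hits AS DECIMAL(12, 3)) / CAST(b.BatchCount AS DECIMAL(12, 3)) AS Score
FROM Hits				h
CROSS JOIN Batches		b
WHERE CAST(h.Hits AS DECIMAL(12, 3)) / CAST(b.BatchCount AS DECIMAL(12, 3)) >= @PercentDaysUnused
ORDER BY Score DESC;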

N.B. This stored procedure calls the “CreateObjectStatementBuilder” stored procedure to produce the correlated CREATE/ALTER statement for each DROP statement; these statements give us a roll-back path. If you choose to use this “GetUnusedIndexDeletionCandidates” stored procedure, make sure you also download the “CreateObjectStatementBuilder” stored procedure. See the links below.

The “GetUnusedIndexDeletionCandidates” stored procedure produces the following table:

GetUnusedIndexDeletionCandidates_DataDictionary

Once we copy this table into Excel, we can perform additional analysis and save it as a change-management document.

N.B. We have “IsPrimaryKey” and “IsUniqueConstraint” fields. While it occasionally makes sense to remove a primary key (when, perhaps, you can replace it with a superior one), use caution before indiscriminately removing one. Likewise, unless you have a good reason, you will probably not want to delete any indexes that enforce a unique constraint.
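
If you want to double-check an index yourself before acting on a generated DROP statement, a quick look at sys.indexes confirms both flags. Here’s a minimal sketch (run it in the target database; the index name is just a placeholder for illustration):

-- Sketch only: confirm whether an index backs a primary key or a unique constraint
SELECT
	s.name					AS SchemaName,
	o.name					AS TableName,
	i.name					AS IndexName,
	i.is_primary_key		AS IsPrimaryKey,
	i.is_unique_constraint	AS IsUniqueConstraint
FROM sys.indexes			i
INNER JOIN sys.objects		o ON i.object_id = o.object_id
INNER JOIN sys.schemas		s ON o.schema_id = s.schema_id
WHERE i.name = 'IX_YourIndexName';	-- placeholder index name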

The Script:

Note: I store these scripts on Box.Net. Click the related links to download the “Get Unused Index Deletion Candidates 2005 (MEI).sql” script and its dependent “Create Object Statement Builder 2005 (MEI).sql” script.

USE DBA
GO

IF OBJECT_ID('dbo.GetUnusedIndexDeletionCandidates', 'P') IS NOT NULL DROP PROCEDURE dbo.GetUnusedIndexDeletionCandidates
GO

CREATE PROCEDURE dbo.GetUnusedIndexDeletionCandidates

	@DBName									SYSNAME			= NULL,
	@PercentDaysUnused						DECIMAL (2, 2)	= 0,
	@Debug									BIT				= 0
AS

/*
* Author:	Mark Holahan
* Date:		Wednesday, 25-May-2011
*
* Changes:
*
* Version:	2
* Purpose:	Summarize unused index candidates for a given database based on
*			the @PercentDaysUnused threshold with related
*			DROP INDEX and CREATE INDEX/ALTER TABLE statements.
*
* Note:
*			This stored procedure assumes both the existence of the DBA.dbo.UnusedIndexes table
*			and that the "DBA.dbo.GetUnusedIndexes" stored procedure is populating said table.
*			Also, this script calls the "DBA.dbo.CreateObjectStatementBuilder" stored procedure.
*
* ToDo:
*/

SET NOCOUNT ON;

-- Some parameter checks

DECLARE @E VARCHAR(500)

IF NOT EXISTS (SELECT 1 FROM master.sys.databases WHERE name = @DBName)
	BEGIN
		SET @E = 'The [' + @DBName + '] database does not exist.'
		+ CHAR(10) + 'Please provide a valid database name.'

		RAISERROR(@E, 16, 1) WITH NOWAIT
		RETURN
	END

IF @PercentDaysUnused = 0
	BEGIN
		SET @E = 'Please provide a non-zero @PercentDaysUnused value.'
		+ CHAR(10) + ' We use this parameter to decide when an index is "unused."'
		+ CHAR(10) + 'An index is considered "unused" when it is not used [@PercentDaysUnused] % of the days in the sample period.'

		RAISERROR(@E, 16, 1) WITH NOWAIT
		RETURN
	END

-- Variables

-- For CreateObjectStatementBuilder stored procedure

DECLARE
	@DBIndexSubsetTableName SYSNAME,
	@AddCariageReturn		BIT,
	@RetVal					INT,

	@YES					BIT

-- Constants
SET @YES					= 1

-- Ephemeral Work tables

DECLARE @f TABLE
(
	ServerName				SYSNAME,
	DatabaseName			SYSNAME,
	SchemaName				SYSNAME,
	TableName				SYSNAME,
	IndexName				SYSNAME,
	IsPrimaryKey			BIT NULL,
	IsUniqueConstraint		BIT NULL,
	Hits					SMALLINT,
	BatchCount				SMALLINT,
	Score					DECIMAL (12, 3)
)

DECLARE @m TABLE
(
	PK						INT IDENTITY,
	DBName					SYSNAME,
	BatchCount				SMALLINT,
	BatchDate				DATETIME,
	ServerName				SYSNAME,
	DatabaseName			SYSNAME,
	SchemaName				SYSNAME,
	TableName				SYSNAME,
	IndexName				SYSNAME,
	Hit						SMALLINT DEFAULT (0)
)

DECLARE @r TABLE
(
	ServerName				SYSNAME NOT NULL,
	DatabaseName			SYSNAME NOT NULL,
	SchemaName				SYSNAME NOT NULL,
	TableName				SYSNAME NOT NULL,
	IndexName				SYSNAME NULL,
	IsPrimaryKey			BIT NOT NULL,
	IsUniqueConstraint		BIT NOT NULL
)

DECLARE @t TABLE
(
	DBName					SYSNAME,
	BatchDate				DATETIME,
	BatchCount				SMALLINT DEFAULT 0
)

DECLARE @u TABLE
(
	ServerName				SYSNAME,
	DatabaseName			SYSNAME,
	SchemaName				SYSNAME,
	TableName				SYSNAME,
	IndexName				SYSNAME
)

DECLARE @z TABLE
(
	DBName					SYSNAME,
	BatchCount				SMALLINT
)

DECLARE @fin TABLE
(
	DatabaseName			SYSNAME NOT NULL,
	SchemaName				SYSNAME NOT NULL,
	TableName				SYSNAME NOT NULL,
	IndexName				SYSNAME NOT NULL,
	IsPrimaryKey			BIT NOT NULL,
	IsUniqueConstraint		BIT NOT NULL,
	TargetInstance_Hits		SMALLINT NOT NULL,
	TargetInstance_Count	SMALLINT NOT NULL,
	TargetInstance_Score	DECIMAL (12, 3) NOT NULL,
	DropAlterStatement		VARCHAR(8000) NULL
)

-- Params for CreateObjectStatementBuilder stored procedure
SET @DBIndexSubsetTableName = NULL
SET @AddCariageReturn		= 0

-- Get unique batch dates for DBs
-- We want @ least 1 day after a server restart
INSERT @t (DBName, BatchDate)
	SELECT DataBaseName, BatchDate
	FROM DBA.dbo.UnusedIndexes
	WHERE
		DaysSinceLastRestart > 1
	AND
		IndexName IS NOT NULL
	AND
		DatabaseName = @DBName
	GROUP BY DataBaseName, BatchDate
	ORDER BY DataBaseName, BatchDate;

-- Get distinct list of databases
INSERT @z(DBName, BatchCount)
	SELECT DBName, COUNT(1) BatchCount
	FROM @t
	GROUP BY DBName

UPDATE @t
	SET t.BatchCount = z.BatchCount
FROM @t									t
INNER JOIN @z							z
	ON t.DBName = z.DBName

-- Get unique server, database, schema, table, and index names
INSERT @u( ServerName, DatabaseName, SchemaName, TableName, IndexName )
	SELECT ServerName, DatabaseName, SchemaName, TableName, IndexName
	FROM DBA.dbo.UnusedIndexes
	WHERE
		DaysSinceLastRestart > 1
	AND
		IndexName IS NOT NULL
	AND
		DatabaseName = @DBName
	GROUP BY ServerName, DatabaseName, SchemaName, TableName, IndexName

INSERT @m( DBName, BatchDate, BatchCount, ServerName, DatabaseName, SchemaName, TableName, IndexName )
	SELECT *
	FROM @t
	CROSS JOIN @u

-- Remove cross-join extra stuff
DELETE FROM @m WHERE DBName <> DatabaseName

-- Create way to count the event of an index missing...
UPDATE m
SET
	m.Hit = 1
FROM @m										m
INNER JOIN DBA.dbo.UnusedIndexes			u
	 ON
		m.BatchDate		= u.BatchDate
	 AND
		m.ServerName	= u.ServerName
	AND
		m.DBName		= u.DatabaseName
	AND
		m.SchemaName	= u.SchemaName
	AND
		m.TableName		= u.TableName
	AND
		m.IndexName		= u.IndexName

IF @Debug = @YES SELECT * FROM @m

-- Get a list of indexes and their IsPrimaryKey and IsUniqueConstraint attributes
INSERT @r( ServerName, DatabaseName, SchemaName, TableName, IndexName, IsPrimaryKey, IsUniqueConstraint )
SELECT
	ServerName,
	DatabaseName,
	SchemaName,
	TableName,
	IndexName,
	IsPrimaryKey,
	IsUniqueConstraint
FROM DBA.dbo.UnusedIndexes
WHERE DatabaseName = @DBName
GROUP BY ServerName, DatabaseName, SchemaName, TableName, IndexName, IsPrimaryKey, IsUniqueConstraint
ORDER BY ServerName, DatabaseName, SchemaName, TableName, IndexName, IsPrimaryKey, IsUniqueConstraint;

INSERT @f( ServerName, DatabaseName, SchemaName, TableName, IndexName, Hits, BatchCount, Score )
	SELECT
		ServerName,
		DatabaseName,
		SchemaName,
		TableName,
		IndexName,
		SUM(Hit) Hits,
		MAX(BatchCount) BatchCount,
		CAST( SUM(Hit) AS DECIMAL(12, 3) ) / CAST(MAX(BatchCount) AS DECIMAL(12, 3) ) Score
	FROM @m
	GROUP BY ServerName, DatabaseName, SchemaName, TableName, IndexName
	ORDER BY ServerName, DatabaseName, SchemaName, TableName, IndexName

-- Update the missing IsPrimaryKey and IsUniqueConstraint attributes

UPDATE f
SET
	f.IsPrimaryKey			= r.IsPrimaryKey,
	f.IsUniqueConstraint	= r.IsUniqueConstraint
FROM @f												f
INNER JOIN @r										r
ON
	f.ServerName	= r.ServerName
AND
	f.DatabaseName	= r.Databasename
AND
	f.SchemaName	= r.SchemaName
AND
	f.TableName		= r.TableName
AND
	f.IndexName		= r.IndexName

IF @Debug = @YES SELECT * FROM @f

-- Let's only consider indexes that have not been used on user-defined percentage of the time
INSERT @fin( DatabaseName, SchemaName, TableName, IndexName, IsPrimaryKey, IsUniqueConstraint, TargetInstance_Hits, TargetInstance_Count, TargetInstance_Score )
SELECT
	f.DatabaseName,
	f.SchemaName,
	f.TableName,
	f.IndexName,
	f.IsPrimaryKey,
	f.IsUniqueConstraint,
	f.Hits			TargetInstance_Hits,
	f.BatchCount	TargetInstance_Count,
	f.Score			TargetInstance_Score

FROM @f										f
WHERE
		f.Score >= @PercentDaysUnused

ORDER BY f.DatabaseName, f.SchemaName, f.TableName

-- Build either DROP INDEX or ALTER TABLE statement
UPDATE fin
SET DropAlterStatement =
	CASE WHEN ( IsPrimaryKey ^ IsUniqueConstraint) = 1
		THEN
			'USE' + SPACE(1) + fin.DatabaseName + ';'
			+ 'IF EXISTS (SELECT 1 FROM sys.indexes WHERE object_id = OBJECT_ID(N'
			+ CHAR(39) + '[' + fin.SchemaName + '].[' + fin.TableName + ']'
			+ CHAR(39) + ') AND name = N'
			+ CHAR(39) + fin.IndexName
			+ CHAR(39) + ')'
			+ SPACE(1) + 'ALTER TABLE [' + fin.SchemaName + '].[' + fin.TableName + ']'
			+ SPACE(1) + 'DROP CONSTRAINT [' + fin.IndexName + '];'
		ELSE
			'USE' + SPACE(1) + fin.DatabaseName + ';'
			+ 'IF EXISTS (SELECT 1 FROM sys.indexes WHERE object_id = OBJECT_ID(N'
			+ CHAR(39) + '[' + fin.SchemaName + '].[' + fin.TableName + ']'
			+ CHAR(39) + ') AND name = N'
			+ CHAR(39) + fin.IndexName
			+ CHAR(39) + ')'
			+ SPACE(1) + 'DROP INDEX [' + fin.IndexName + ']' + SPACE(1) + 'ON [' + fin.SchemaName + '].[' + fin.TableName + '];'
	END

FROM @fin fin

-- Execute the "DBA.dbo.CreateObjectStatementBuilder" stored procedure to create the correlating
-- "CREATE INDEX/ALTER" statements so we can rebuild indexes if need be

EXECUTE @RetVal = [DBA].[dbo].CreateObjectStatementBuilder
   @DBName
  ,@DBIndexSubsetTableName
  ,@AddCariageReturn
  ,@Debug

-- If stored procedure execution succeeded, combine tables
IF @RetVal = 0
	BEGIN
		SELECT
			f.DatabaseName,
			f.SchemaName,
			f.TableName,
			f.IndexName,
			f.IsPrimaryKey,
			f.IsUniqueConstraint,
			f.TargetInstance_Hits,
			f.TargetInstance_Count,
			f.TargetInstance_Score,
			f.DropAlterStatement,
			c.CreateAlterStatement
		FROM @fin									f
		INNER JOIN DBA.dbo.CreateObjectStatement	c
			ON
				f.DatabaseName	= c.DatabaseName
			AND
				f.SchemaName	= c.SchemaName
			AND
				f.TableName		= c.TableName
			AND
				f.IndexName		= c.IndexName
	END
ELSE
	BEGIN
		RAISERROR('The [CreateObjectStatementBuilder] stored procedure failed. We cannot create CREATE/ALTER object statements.', 16, 1)
		SELECT *
		FROM @fin;
	END

“Get Unused Indexes” Tool for SQL Server 2005

Execution Example: Here’s an example of how to execute this script:

DECLARE
	@DBName			SYSNAME,
	@PercentDaysUnused	DECIMAL(2,2),
	@Debug			BIT

SET @DBName				= 'AdventureWorks'
SET @PercentDaysUnused	= .9
SET @Debug				= 0

EXECUTE [DBA].[dbo].[GetUnusedIndexDeletionCandidates]
   @DBName
  ,@PercentDaysUnused
  ,@Debug

Conclusion: Beware of arbitrarily removing indexes. It’s best to

  1. Determine if they’re, in fact, superfluous (based on trend analysis)
  2. Have a script to remove the target indexes
  3. Have a script to replace the target indexes you removed

This second article completes my treatment of identifying and removing unused indexes.


“Get DBCC CHECKDB Output” Tool for SQL Server 2005

Scenario: You’d like to automate the invocation of DBCC CHECKDB to validate the structural integrity of your databases.

Solution: The below script

  • Executes DBCC CHECKDB against all SQL Server databases except TempDB
  • Captures the DBCC CHECKDB output into a work table
  • Identifies the summary line for the given DBCC CHECKDB invocation (for a given database)
  • Sends a Database Mail message with an attachment containing said summary lines

The DBA receiving the email can then react to the summary information (see “Database Mail Profiles” for more information; I’ve also created a tool to replicate or modify an existing Database Mail profile).

For a more general approach to sending a SQL Server Error Log summary message, see my “Email SQL Server Errors” Tool for SQL Server 2005 post.

Conventions: One convention bears comment.

  1. It’s my standard practice to create a fixed-space “DBA” database on a given SQL Server instance. It’s a workspace where I’ll perform DBA analysis tasks and, optionally, persist related metrics (akin to SQL Server’s usage of its “tempdb” database). I’ll place DBA objects, like this script’s stored procedure and related table, in the DBA database.

Script Highlights:

The script has four parameters that we will change:

  1. @ProfileName: The name of the profile from which to send the message. It must be the name of an existing Database Mail profile.
  2. @SendEmailOnly: A bit to avoid running DBCC CheckDB and just send the most recent DBCC CheckDB results.
  3. @RecipientsProxy: The semicolon-delimited list of e-mail addresses that will receive this message.
  4. @Debug: A bit to either show or hide verbose output

I divide the script into four sections:

  1. Parameter validation,
  2. Work table maintenance,
  3. DBCC CHECKDB loop, and
  4. Message dispatch

The Script:

Note: I store these scripts on Box.net. Click on this link to download the below script.

USE DBA
GO

IF OBJECT_ID('dbo.GetDBCCCHECKDBOutput', 'P') IS NOT NULL DROP PROCEDURE dbo.GetDBCCCHECKDBOutput
GO

CREATE PROCEDURE dbo.GetDBCCCHECKDBOutput

@ProfileName SYSNAME = NULL, -- Valid DB Mail Profile
@RecipientsProxy VARCHAR(MAX) = NULL, -- semicolon-delimited list of e-mail addresses
@SendEmailOnly BIT = 0, -- So we can send email of existing records in
-- the [DBA].[dbo].[DBCCCheckDBOutput] table
@Debug BIT = 0 -- Show or hide verbose output

AS

/*
* Author: Mark Holahan
* Date: Monday, 31-May-2011
*
* Version: 3
* Purpose: Run DBCC CHECKDB, with no data repair options, so we don't need to put the given
* database in single-user mode, and send the final summary line via email.
*
* Revised:
* Mon, 31-May-2011: Added @SendEmailOnly bit and concomitant logic
*
* See: http://www.lockergnome.com/sqlsquirrel/2009/03/11/inserting-dbcc-checkdb-results-into-a-table/
*/

SET NOCOUNT ON;

DECLARE

@BatchDate DATETIME,
@DBName SYSNAME,
@MachineName SYSNAME,
@InstanceName SYSNAME,
@PK TINYINT,
@SQL VARCHAR(8000),
@SQLServerInstanceName SYSNAME,
@YES BIT,

-- Error message vars

@ErrorMessage NVARCHAR(4000),
@ErrorNumber INT,
@ErrorSeverity INT,
@ErrorState INT,
@ErrorLine INT,
@ErrorProcedure NVARCHAR(200),

-- DB Mail Variables

@AttachQueryResultAsFileProxy BIT,
@Body NVARCHAR(MAX),
@BodyProxy NVARCHAR(MAX),
@NO BIT,
@Query VARCHAR(MAX),
@QueryProxy VARCHAR(MAX),
@Recipients VARCHAR(MAX),
@Subject NVARCHAR(255),
@SubjectProxy NVARCHAR(255)

-- Ephemeral Work tables

DECLARE @t TABLE
(
PK TINYINT IDENTITY,
DBName SYSNAME
)

-- Define Constants

SET @YES = 1
SET @NO = 0

-- Set DBMail variables
SET @AttachQueryResultAsFileProxy = @YES

-- Variables
SET @BatchDate = GETDATE()
SET @InstanceName = CAST(SERVERPROPERTY('InstanceName') AS SYSNAME)
SET @MachineName = CAST(SERVERPROPERTY('MachineName') AS SYSNAME)

IF @InstanceName IS NULL
SET @SQLServerInstanceName = @MachineName
ELSE
SET @SQLServerInstanceName = @MachineName + '\' + @InstanceName

SET @SubjectProxy = @SQLServerInstanceName + ': DBCC CheckDB Results'
SET @BodyProxy = 'Here are the summary DBCC CHECKDB output lines.'
SET @QueryProxy = 'SET NOCOUNT ON;SELECT CAST(DBName AS VARCHAR(25)) AS DBName, DTStamp, LEFT(MessageText, 200) MessageText FROM DBA.dbo.DBCCCheckDBOutput WHERE IsSummaryLine = 1 AND DTSTamp = ( SELECT MAX(DTStamp) FROM DBA.dbo.DBCCCheckDBOutput ) ORDER BY DBName;'

-- 1) Parameter validation

-- Parameter existence check:
IF @ProfileName IS NULL
BEGIN
RAISERROR( 'Please provide a DB Mail Profile Name', 16, 1 )
RETURN
END

IF @RecipientsProxy IS NULL
BEGIN
RAISERROR( 'Please provide a recipient email address', 16, 1 )
RETURN
END

-- Profile name validity check

BEGIN TRY
EXEC msdb.dbo.sysmail_help_profile_sp @profile_name = @ProfileName
END TRY

BEGIN CATCH
SELECT
@ErrorNumber = ERROR_NUMBER(),
@ErrorSeverity = ERROR_SEVERITY(),
@ErrorState = ERROR_STATE(),
@ErrorLine = ERROR_LINE(),
@ErrorProcedure = ISNULL(ERROR_PROCEDURE(), '-');

-- Build the message text; without this, the RAISERROR below would fire with a NULL message
SELECT @ErrorMessage =
N'Error %d, Level %d, State %d, Procedure %s, Line %d, ' + 'Message: ' + ERROR_MESSAGE();

RAISERROR
(
@ErrorMessage,
@ErrorSeverity,
1,
@ErrorNumber, -- parameter: original error number.
@ErrorSeverity, -- parameter: original error severity.
@ErrorState, -- parameter: original error state.
@ErrorProcedure, -- parameter: original error procedure name.
@ErrorLine -- parameter: original error line number.
);
RETURN

END CATCH

IF @SendEmailOnly = @NO

BEGIN

-- 2) Create or maintain work table

IF OBJECT_ID(N'DBA.dbo.DBCCCheckDBOutput', N'U') IS NOT NULL
TRUNCATE TABLE DBA.dbo.DBCCCheckDBOutput
ELSE

BEGIN

CREATE TABLE dbo.DBCCCheckDBOutput
(
PK INT IDENTITY CONSTRAINT PK_DBCCCheckDBOutput PRIMARY KEY,
ServerName SYSNAME COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
DBName SYSNAME COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
Error INT NULL,
[Level] INT NULL,
[State] INT NULL,
MessageText VARCHAR(7000) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
RepairLevel INT NULL,
[Status] INT NULL,
[DbID] INT NULL,
ID INT NULL,
IndID INT NULL,
PartitionID INT NULL,
AllocUnitID INT NULL,
[File] INT NULL,
Page INT NULL,
Slot INT NULL,
RefFile INT NULL,
RefPage INT NULL,
RefSlot INT NULL,
Allocation INT NULL,
DTStamp DATETIME NULL,
IsSummaryLine BIT CONSTRAINT DF_DBCCCheckDBOutput_IsSummaryLine DEFAULT (0)
)
END

-- 3) Create a loop to perform DBCC CHECKDB against all databases other than TempDB

INSERT @t( DBName )
SELECT name
FROM master.sys.databases
WHERE database_id <> 2

WHILE EXISTS ( SELECT TOP 1 1 FROM @t )
BEGIN

SELECT TOP 1 @PK = PK, @DBName = DBName FROM @t

SET @SQL = 'SET NOCOUNT ON;DBCC CHECKDB(' + @DBName + ') WITH TABLERESULTS, ALL_ERRORMSGS;'

IF @Debug = @YES SELECT @SQL;

INSERT INTO [DBA].[dbo].[DBCCCheckDBOutput]
(
[Error]
,[Level]
,[State]
,[MessageText]
,[RepairLevel]
,[Status]
,[DbID]
,[ID]
,[IndID]
,[PartitionID]
,[AllocUnitID]
,[File]
,[Page]
,[Slot]
,[RefFile]
,[RefPage]
,[RefSlot]
,[Allocation]
)

EXEC (@SQL)

-- Update BatchDate, ServerName, and DBName

UPDATE DBA.dbo.DBCCCheckDBOutput
SET
DTStamp = @BatchDate,
ServerName = @SQLServerInstanceName,
DBName = @DBName
WHERE ServerName IS NULL AND DBName IS NULL

-- Mark summary line for given database for email since we can't use local variables for the email session

UPDATE DBA.dbo.DBCCCheckDBOutput
SET IsSummaryLine = @YES
WHERE PK = ( SELECT MAX(PK) FROM DBA.dbo.DBCCCheckDBOutput WHERE DBName = @DBName )

-- Remove record from work table
DELETE @t WHERE PK = @PK;

END

END

-- 4) Send email having summary lines of DBCC CHECKDB

BEGIN TRY

IF @Debug = @YES SELECT @QueryProxy

EXECUTE msdb.dbo.sp_send_dbmail
@profile_name = @ProfileName,
@Recipients = @RecipientsProxy,
@Subject = @SubjectProxy,
@Body = @BodyProxy,
@body_format = 'TEXT',
@query_result_header = @YES,
@Query = @QueryProxy,
@attach_query_result_as_file = @AttachQueryResultAsFileProxy,
@query_result_width = 1000

END TRY

BEGIN CATCH
SELECT
@ErrorNumber = ERROR_NUMBER(),
@ErrorSeverity = ERROR_SEVERITY(),
@ErrorState = ERROR_STATE(),
@ErrorLine = ERROR_LINE(),
@ErrorProcedure = ISNULL(ERROR_PROCEDURE(), '-');

-- Build the message text; without this, the RAISERROR below would fire with a NULL message
SELECT @ErrorMessage =
N'Error %d, Level %d, State %d, Procedure %s, Line %d, ' + 'Message: ' + ERROR_MESSAGE();

RAISERROR
(
@ErrorMessage,
@ErrorSeverity,
1,
@ErrorNumber, -- parameter: original error number.
@ErrorSeverity, -- parameter: original error severity.
@ErrorState, -- parameter: original error state.
@ErrorProcedure, -- parameter: original error procedure name.
@ErrorLine -- parameter: original error line number.
);

END CATCH

Execution Example: Here’s an example of how to execute this script:

DECLARE
@ProfileName SYSNAME,
@RecipientsProxy VARCHAR(MAX),
@SendEmailOnly BIT,
@Debug BIT

SET @ProfileName = 'Studio'
SET @SendEmailOnly = 0
SET @RecipientsProxy = 'mark@heydba.net'
SET @Debug = 0

EXECUTE [DBA].[dbo].[GetDBCCCHECKDBOutput]
@ProfileName,
@RecipientsProxy,
@SendEmailOnly,
@Debug

I received an email having this attached text file. In this case, we have no errors.

DBCC CheckDB Results
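
If you’d rather eyeball the results directly in Management Studio instead of (or in addition to) the email, here’s a minimal query sketch; it mirrors the @QueryProxy text embedded in the stored procedure and pulls each database’s summary line from the most recent run:

SELECT
	CAST(DBName AS VARCHAR(25))	AS DBName,
	DTStamp,
	LEFT(MessageText, 200)		AS MessageText
FROM DBA.dbo.DBCCCheckDBOutput
WHERE IsSummaryLine = 1
AND DTStamp = ( SELECT MAX(DTStamp) FROM DBA.dbo.DBCCCheckDBOutput )
ORDER BY DBName;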

Conclusion: As a best practice, DBAs should periodically use DBCC CHECKDB to check the logical and physical integrity of all their databases. Use the above script to help automate this task.

“Get Worst T-SQL” Tool for SQL Server 2005 Part I

Scenario: You would like to identify your flagrantly poor queries. Once you have your worst offenders, you can begin to tune them for optimal performance.

Before SQL Server 2005, you typically found these offenders by taking SQL Profiler traces. While you can still take that approach, SQL Server 2005 introduced dynamic management views and functions, unleashing a wealth of server-state information accessible from within SQL Server. We can reference these DMVs within T-SQL statements. Therefore, we can easily write a stored procedure to capture snapshots, in a work table, of the DMVs that reveal query metrics. This DMV approach is a bit more straightforward than programmatically manipulating SQL Trace (SQL Profiler’s server-side sibling).

Solution: Use the below script to automate the collection of your worst-performing T-SQL. Note: the base script was originally written by Gregory Larsen. I revised it to fit my “DBA Tool” scripting standard.

Conventions: One convention bears comment.

  1. It’s my standard practice to create a fixed-space “DBA” database on a given SQL Server instance. It’s a workspace where I’ll perform DBA analysis tasks and, optionally, persist related metrics (akin to SQL Server’s usage of its “tempdb” database). I’ll place DBA objects, like this script’s stored procedure and related table, in the DBA database.

Script Highlights:

The procedure has four parameters:

  1. @DBName: Our target database in which we wish to identify the worst TSQL (optional)
  2. @Count: Our TOP x rows
  3. @OrderBy: See Larsen’s excellent embedded notes in the stored-procedure verbiage
  4. @DaysToRetain: The number of days for which you wish to retain worst TSQL metrics in the WorstTSQL work table. I have an arbitrary 14-day default

I divide this script into three sections:

  1. Removal of aged-out “worst TSQL” data
  2. @OrderBy validation check
  3. Load the DBA.dbo.WorstTSQL table

The Script:

Note: I store these scripts on Box.Net. Click on this link to download the below script.

USE DBA
GO

IF OBJECT_ID('[dbo].[GetWorstTSQL]', 'P') IS NOT NULL DROP PROCEDURE [dbo].[GetWorstTSQL]

SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO

CREATE PROC [dbo].[GetWorstTSQL]

/*
* Author:	Gregory A. Larsen (see http://www.sqlmag.com/article/tsql3/quickly-find-your-worst-performing-t-sql-statements.aspx)
*			Copyright © 2008 Gregory A. Larsen.  All rights reserved.
*			Amended by Mark Holahan
*
* Date:		Monday, 02-May-2011
*
* Revised:
*			Mon, 02-May-2011: Embellished error messaging
*
* Version:	3
* Purpose:	This stored procedure displays the top worst-performing queries based on CPU, ExecutionCount,
*			I/O, and Elapsed_Time as identified using DMV information.  It can display the worst-performing
*			queries from an instance or database perspective.  The number of records shown,
*			the database, and the sort order are identified by passing parameters.
*
* Parameters:
*
*			N.B.: [ObjectName] comes from sys.dm_exec_sql_text.objectid: This field is null for ad hoc and prepared SQL statements.
*
*			There are three different parameters that can be passed to this procedure: @DBName, @Count
*			and @OrderBy.
*			The @DBName parameter is used to constrain the output to a specific database.  If,
*			when calling this SP, this parameter is set to a specific database name, then only statements
*			that are associated with that database will be displayed.  If the @DBName parameter is not set,
*			then this SP will return rows associated with any database.
*			The @Count parameter allows you
*			to control the number of rows returned by this SP.  If this parameter is used, then only the
*			TOP x rows, where x is equal to @Count, will be returned, based on the @OrderBy parameter.
*			The @OrderBy parameter identifies the sort order of the rows returned, in descending order.
*			The @OrderBy parameter supports the following types: ACPU, TCPU, AE, TE, EC, AIO, TIO, ALR, TLR, ALW, TLW, APR, and TPR,
*			where:
*				"ACPU" represents Average CPU Usage
*				"TCPU" represents TotalCPU usage
*				"AE"   represents Average Elapsed Time
*				"TE"   represents Total Elapsed Time
*				"EC"   represents ExecutionCount
*				"AIO"  represents AverageIOs
*				"TIO"  represents TotalIOs
*				"ALR"  represents Average Logical Reads
*				"TLR"  represents TotalLogicalReads
*				"ALW"  represents Average Logical Writes
*				"TLW"  represents TotalLogicalWrites
*				"APR"  represents Average Physical Reads
*				"TPR"  represents Total Physical Read
*
*	Typical execution calls:
*
*	Top 6 statements in the AdventureWorks database base on Average CPU Usage:
*      EXEC GetWorstTSQL @DBName='AdventureWorks',@Count=6,@OrderBy='ACPU';
*
*   Top 100 statements order by Average Logical Read
*      EXEC GetWorstTSQL @Count=100,@OrderBy='ALR';
*
*   Show top 100 statements by Average IO
*      EXEC GetWorstTSQL;
*
*/

	(
	@DBName											SYSNAME		= '',
	@Count											INT			= 999999999,
	@OrderBy										VARCHAR(4)	= 'AIO',
	@DaysToRetain									TINYINT		= 14
	)

AS

SET NOCOUNT ON;

DECLARE

	@AgeOutDate										DATETIME,
	@BatchDate										DATETIME,
	@DBNameProxy									SYSNAME,

    @ErrorMessage									NVARCHAR(4000),
    @ErrorNumber									INT,
    @ErrorSeverity									INT,
    @ErrorState										INT,
    @ErrorLine										INT,
    @ErrorProcedure									NVARCHAR(200);

SET @DBNameProxy = '%' + @DBName + '%'
SET @BatchDate = GETDATE()

IF OBJECT_ID('dbo.WorstTSQL', 'U') IS NULL
	BEGIN
		CREATE TABLE dbo.WorstTSQL
		(
			[PK]						INT IDENTITY
			CONSTRAINT PK_WorstTSQL PRIMARY KEY,
			[DatabaseName]				SYSNAME,
			[Statement]					NVARCHAR(MAX),
			[SchemaName]				SYSNAME NULL,
			[ObjectName]				SYSNAME NULL,
			[CachedPlanObjType]			NVARCHAR(16),
			[ExecutionCount]			BIGINT,
			[AverageIOs]				BIGINT,
			[TotalIOs]					BIGINT,
			[AvgLogicalReads]			BIGINT,
			[TotalLogicalReads]			BIGINT,
			[AvgLogicalWrites]			BIGINT,
			[TotalLogicalWrites]		BIGINT,
			[AvgPhysicalReads]			BIGINT,
			[TotalPhysicalReads]		BIGINT,
			[AvgCPU]					BIGINT,
			[TotalCPU]					BIGINT,
			[AvgElapsedTime]			BIGINT,
			[TotalElaspedTime]			BIGINT,
			[LastExecutionTime]			DATETIME,
			BatchDate					DATETIME NULL
		)
	END

--1) Remove records older than user-provided number of days

BEGIN TRY

	-- By default, I'd like to keep 2 weeks of data
	SET @AgeOutDate = DATEADD(dd, (@DaysToRetain * -1), GETDATE())

	DELETE
	FROM DBA.dbo.WorstTSQL
	WHERE
			BatchDate < @AgeOutDate

END TRY

BEGIN CATCH
	SELECT
		@ErrorNumber		= ERROR_NUMBER(),
		@ErrorSeverity		= ERROR_SEVERITY(),
		@ErrorState			= ERROR_STATE(),
		@ErrorLine			= ERROR_LINE(),
		@ErrorProcedure		= ISNULL(ERROR_PROCEDURE(), '-');

	SELECT @ErrorMessage =
		N'Error %d, Level %d, State %d, Procedure %s, Line %d, ' + 'Message: '+ ERROR_MESSAGE();

	RAISERROR
		(
		@ErrorMessage,
		@ErrorSeverity,
		1,
		@ErrorNumber,    -- parameter: original error number.
		@ErrorSeverity,  -- parameter: original error severity.
		@ErrorState,     -- parameter: original error state.
		@ErrorProcedure, -- parameter: original error procedure name.
		@ErrorLine       -- parameter: original error line number.
		);

		RETURN
END CATCH

-- 2) Check for valid @OrderBy parameter:

IF ((SELECT CASE WHEN
          @OrderBy IN ('ACPU','TCPU','AE','TE','EC','AIO','TIO','ALR','TLR','ALW','TLW','APR','TPR')
             THEN 1 ELSE 0 END) = 0)
BEGIN
   -- abort if invalid @OrderBy parameter entered
   RAISERROR('The @OrderBy parameter is not ACPU, TCPU, AE, TE, EC, AIO, TIO, ALR, TLR, ALW, TLW, APR or TPR', 11, 1)
   RETURN
END

-- 3) Load table with worst T-SQL info

BEGIN TRY
	INSERT INTO [DBA].[dbo].[WorstTSQL]
	(
		[DatabaseName]
		,[Statement]
		,[SchemaName]
		,[ObjectName]
		,[CachedPlanObjType]
		,[ExecutionCount]
		,[AverageIOs]
		,[TotalIOs]
		,[AvgLogicalReads]
		,[TotalLogicalReads]
		,[AvgLogicalWrites]
		,[TotalLogicalWrites]
		,[AvgPhysicalReads]
		,[TotalPhysicalReads]
		,[AvgCPU]
		,[TotalCPU]
		,[AvgElapsedTime]
		,[TotalElaspedTime]
		,[LastExecutionTime]
		,[BatchDate]
	)

	SELECT TOP (@Count)
		COALESCE(DB_NAME(st.dbid), DB_NAME(CAST(pa.value AS INT))+'*', 'Resource') AS						[DatabaseName],
		-- find the offset of the actual statement being executed
		SUBSTRING( TEXT,
			CASE
				WHEN qs.statement_start_offset = 0 OR qs.statement_start_offset IS NULL
					THEN 1
				ELSE qs.statement_start_offset/2 + 1
			END,
			CASE
				WHEN qs.statement_end_offset = 0 OR qs.statement_end_offset = -1 OR qs.statement_end_offset IS NULL
					THEN LEN(TEXT)
				ELSE qs.statement_end_offset/2
			END -
			CASE
				WHEN qs.statement_start_offset = 0 OR qs.statement_start_offset IS NULL
					THEN 1
				ELSE qs.statement_start_offset/2
			END + 1
			)
		AS																									[Statement]
		,OBJECT_SCHEMA_NAME(st.objectid, dbid)																[SchemaName]
		,OBJECT_NAME(st.objectid, dbid	)																	[ObjectName]
		,cp.objtype																							[CachedPlanObjType]
		,qs.execution_count																					[ExecutionCount]
		,(qs.total_logical_reads + qs.total_logical_writes + qs.total_physical_reads ) / qs.execution_count	[AverageIOs]
		,qs.total_logical_reads + qs.total_logical_writes + qs.total_physical_reads							[TotalIOs]
		,qs.total_logical_reads / qs.execution_count														[AvgLogicalReads]
		,qs.total_logical_reads																				[TotalLogicalReads]
		,qs.total_logical_writes / qs.execution_count														[AvgLogicalWrites]
		,qs.total_logical_writes																			[TotalLogicalWrites]
		,qs.total_physical_reads / qs.execution_count														[AvgPhysicalReads]
		,qs.total_physical_reads																			[TotalPhysicalReads]
		,qs.total_worker_time / qs.execution_count															[AvgCPU]
		,total_worker_time																					[TotalCPU]
		,qs.total_elapsed_time / qs.execution_count															[AvgElapsedTime]
		,qs.total_elapsed_time																				[TotalElaspedTime]
		,qs.last_execution_time																				[LastExecutionTime]
		,@BatchDate

	FROM master.sys.dm_exec_query_stats																		qs
	JOIN master.sys.dm_exec_cached_plans																	cp
		ON qs.plan_handle = cp.plan_handle
	CROSS APPLY master.sys.dm_exec_sql_text(qs.plan_handle)													st
	OUTER APPLY master.sys.dm_exec_plan_attributes(qs.plan_handle)											pa
	WHERE
		pa.attribute = 'dbid'
	AND
		CASE
			WHEN @DBName = ''
				THEN ''
			ELSE COALESCE(DB_NAME(st.dbid), DB_NAME(CAST(pa.value AS INT)) + '*','Resource')
		END
		IN (RTRIM(@DBName), RTRIM(@DBName) + '*')
	ORDER BY
		CASE
			WHEN @OrderBy = 'ACPU' THEN qs.total_worker_time / qs.execution_count
			WHEN @OrderBy = 'TCPU'  THEN qs.total_worker_time
			WHEN @OrderBy = 'AE'   THEN qs.total_elapsed_time / qs.execution_count
			WHEN @OrderBy = 'TE'   THEN qs.total_elapsed_time
			WHEN @OrderBy = 'EC'   THEN qs.execution_count
			WHEN @OrderBy = 'AIO'  THEN (qs.total_logical_reads + qs.total_logical_writes + qs.total_physical_reads) / qs.execution_count
			WHEN @OrderBy = 'TIO'  THEN qs.total_logical_reads + qs.total_logical_writes + qs.total_physical_reads
			WHEN @OrderBy = 'ALR'  THEN qs.total_logical_reads  / qs.execution_count
			WHEN @OrderBy = 'TLR'  THEN qs.total_logical_reads
			WHEN @OrderBy = 'ALW'  THEN qs.total_logical_writes / qs.execution_count
			WHEN @OrderBy = 'TLW'  THEN qs.total_logical_writes
			WHEN @OrderBy = 'APR'  THEN qs.total_physical_reads / qs.execution_count
			WHEN @OrderBy = 'TPR'  THEN qs.total_physical_reads
		END
	DESC

END TRY

BEGIN CATCH
	SELECT
		@ErrorNumber		= ERROR_NUMBER(),
		@ErrorSeverity		= ERROR_SEVERITY(),
		@ErrorState			= ERROR_STATE(),
		@ErrorLine			= ERROR_LINE(),
		@ErrorProcedure		= ISNULL(ERROR_PROCEDURE(), '-');

	SELECT @ErrorMessage =
		N'Error %d, Level %d, State %d, Procedure %s, Line %d, ' + 'Message: '+ ERROR_MESSAGE();

	RAISERROR
		(
		@ErrorMessage,
		@ErrorSeverity,
		1,
		@ErrorNumber,    -- parameter: original error number.
		@ErrorSeverity,  -- parameter: original error severity.
		@ErrorState,     -- parameter: original error state.
		@ErrorProcedure, -- parameter: original error procedure name.
		@ErrorLine       -- parameter: original error line number.
		);

END CATCH

Execution Example: Here’s an example of how to execute this script:

DECLARE
    @DBName         SYSNAME,
    @Count			INT,
    @OrderBy        VARCHAR(4),
    @DaysToRetain   TINYINT

SET @DBName         = 'AdventureWorks'
SET @Count          = 20
SET @OrderBy		= 'TLR'    -- Total Logical Reads (see stored-procedure header)
SET @DaysToRetain   = 14

EXECUTE [DBA].[dbo].[GetWorstTSQL]
   @DBName
  ,@Count
  ,@OrderBy
  ,@DaysToRetain
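
Once the job has run, here’s a minimal sketch of a review query against the DBA.dbo.WorstTSQL work table defined above. It pulls the latest batch’s heaviest statements by total logical reads; swap the ORDER BY column for whichever metric you loaded on:

SELECT TOP (20)
	DatabaseName,
	SchemaName,
	ObjectName,
	ExecutionCount,
	AvgLogicalReads,
	TotalLogicalReads,
	TotalCPU,
	[Statement]
FROM DBA.dbo.WorstTSQL
WHERE BatchDate = ( SELECT MAX(BatchDate) FROM DBA.dbo.WorstTSQL )
ORDER BY TotalLogicalReads DESC;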

Conclusion: DBAs need to know their most expensive queries. It’s foundational to effective query tuning and, on a larger scale, an important piece of broader performance tuning. At this point, I find Larsen’s “Total Logical Reads” the most useful “Order By” metric (although, as you can see, there are several choices you can use to order your results). Try a few out to see what insight you can glean from analyzing your worst T-SQL.

In the second part of this two-part topic, I’ll provide a bcp-based tool to export an entire “worst TSQL” statement to a text file. In this way, you can manipulate and further analyze the query.


“Run SQL Trace” Tool for SQL Server 2005 Part I

Scenario: You want to leverage (and automate) the power of SQL Profiler but minimize the overhead on your target instance. And while SQL Profiler has several positive points:

  • it has an intuitive GUI
  • it’s a snap to configure,

it has its costs:

  • the GUI has intrinsic overhead,
  • it requires manual configuration,
  • it can send a lot of data over the network if you target a remote server

Solution: Use “SQL Trace,” the server-side equivalent of Profiler. SQL Trace is actually the GUI-less engine of Profiler. You need to become familiar with only four system stored procedures to create and run a SQL Trace session:

  1. sp_trace_create: Creates a trace definition. The new trace will be in a stopped state.
  2. sp_trace_setevent: Adds or removes an event or event column to a trace. sp_trace_setevent may be executed only on existing traces that are stopped (status is 0). An error is returned if this stored procedure is executed on a trace that does not exist or whose status is not 0.
  3. sp_trace_setfilter: Applies a filter to a trace. sp_trace_setfilter may be executed only on existing traces that are stopped (status is 0). SQL Server returns an error if this stored procedure is executed on a trace that does not exist or whose status is not 0.
  4. sp_trace_setstatus: Modifies the current state of the specified trace.

Rather than taking the time to explain each of the arguments of these four system stored procedures, it will be instructive, instead, to show you a shortcut that generates a script to execute SQL Trace (demonstrating the use of all four of the above procedures).

A shortcut is the longest distance between two points:

Just kidding.

Here’s a way to use SQL Server Profiler to create the equivalent SQL Trace script:

1. Open SQL Server Profiler

2. Choose to create a new trace

3. Use the “Tuning” template


4. Run the trace for, say, 5 seconds

5. Stop the trace

6. Go to File –> Export –> Script Trace Definition –> For SQL Server 2005 – 2008 R2


7. Save the file

Here’s the script that SQL Server Profiler produced:

/****************************************************/
/* Created by: SQL Server 2008 R2 Profiler          */
/* Date: 04/18/2011  00:53:57 AM         */
/****************************************************/

-- Create a Queue
declare @rc int
declare @TraceID int
declare @maxfilesize bigint
set @maxfilesize = 5 

-- Please replace the text InsertFileNameHere, with an appropriate
-- filename prefixed by a path, e.g., c:\MyFolder\MyTrace. The .trc extension
-- will be appended to the filename automatically. If you are writing from
-- remote server to local drive, please use UNC path and make sure server has
-- write access to your network share

exec @rc = sp_trace_create @TraceID output, 0, N'InsertFileNameHere', @maxfilesize, NULL
if (@rc != 0) goto error

-- Client side File and Table cannot be scripted

-- Set the events
declare @on bit
set @on = 1
exec sp_trace_setevent @TraceID, 10, 1, @on
exec sp_trace_setevent @TraceID, 10, 3, @on
exec sp_trace_setevent @TraceID, 10, 11, @on
exec sp_trace_setevent @TraceID, 10, 35, @on
exec sp_trace_setevent @TraceID, 10, 12, @on
exec sp_trace_setevent @TraceID, 10, 13, @on
exec sp_trace_setevent @TraceID, 45, 1, @on
exec sp_trace_setevent @TraceID, 45, 3, @on
exec sp_trace_setevent @TraceID, 45, 11, @on
exec sp_trace_setevent @TraceID, 45, 35, @on
exec sp_trace_setevent @TraceID, 45, 12, @on
exec sp_trace_setevent @TraceID, 45, 28, @on
exec sp_trace_setevent @TraceID, 45, 13, @on
exec sp_trace_setevent @TraceID, 12, 1, @on
exec sp_trace_setevent @TraceID, 12, 3, @on
exec sp_trace_setevent @TraceID, 12, 11, @on
exec sp_trace_setevent @TraceID, 12, 35, @on
exec sp_trace_setevent @TraceID, 12, 12, @on
exec sp_trace_setevent @TraceID, 12, 13, @on

-- Set the Filters
declare @intfilter int
declare @bigintfilter bigint

-- Set the trace status to start
exec sp_trace_setstatus @TraceID, 1

-- display trace id for future references
select TraceID=@TraceID
goto finish

error:
select ErrorCode=@rc

finish:
go

If you spend a few minutes studying the above script, you’ll note a few things:

  • that SQL Server Profiler just gave you the SQL Trace (the server-side “Profiler”) equivalent
  • the usage, and relationships between the four SQL Trace system stored procedures
  • that now there’s nothing stopping you from creating your own SQL Trace script

Using the foundation of the above SQL-Server-Profiler generated script, I created the below “Tuning” script. I can now:

  • schedule a job to execute this server side trace (having lower overhead)
  • use the trace output as input into the Database Engine Tuning Advisor

Conventions: One convention bears comment.

  1. It’s my standard practice to create a fixed-space “DBA” database on a given SQL Server instance. It’s a workspace where I’ll perform DBA analysis tasks and, optionally, persist related metrics (akin to SQL Server’s usage of its “tempdb” database). I’ll place DBA objects, like this script’s stored procedure and related table, in the DBA database.

Script Highlights:

The script has four parameters that we will change:

  1. @TraceFile: Specifies the location and file name to which the trace will be written
  2. @RunTimeMinutes: Duration of the trace in minutes (could be changed to smaller time units)
  3. @DatabaseFilter: I want to examine events on a per database basis
  4. @ObjectName: Allows flexibility to drill down to a specific object (if need be)

I divide the script into four sections (that correlate to the four SQL Trace system stored procedures):

  1. Trace definition,
  2. Trace event definition,
  3. Trace filter definition, and
  4. Trace invocation

The Script:

Note: I store these scripts on Box.net. Click on this link to download the below script.

USE DBA
GO

IF OBJECT_ID('dbo.RunSQLTrace', 'P') IS NOT NULL DROP PROCEDURE dbo.RunSQLTrace
GO

/*
* Author:    Mark Holahan
* Date:        Sunday, 18-April-2011
*
* Purpose:    Use SQL Trace and not Profiler (but use Profiler to create a script
*            that has all the events you want)
* See:        http://technet.microsoft.com/en-us/library/ms188662.aspx
*
* Note:        When we use the RPC:Completed Event Class & SQL:BatchCompleted Event Class
*            events, we can extract T-SQL events using SQL Profiler
*
*            This proc's currently configured like the "Tuning" SQL-Profiler template
*/

CREATE PROCEDURE dbo.RunSQLTrace

    @TraceFile            NVARCHAR(245),    -- Specifies the location and file name to which the trace will be written
    @RunTimeMinutes        INT,            -- Duration of the trace in minutes (could be changed to smaller time units)
    @DatabaseFilter        NVARCHAR(4000),    -- I want to examine events on a per database basis
    @ObjectName            SYSNAME = NULL    -- Allows flexibility to drill down to a specific object, if need be

AS

DECLARE

    @MaxFileSizeParamMB    BIGINT,
    @On                    BIT,
    @OptionsParam        INT,
    @RC                    INT,
    @StopTimeParam        DATETIME,
    @TraceID            INT

-- Constants:
SET @On                    = 1
SET @OptionsParam        = 2
SET @MaxFileSizeParamMB    = 50

-- Variables:
SET @StopTimeParam        = DATEADD(mi, @RunTimeMinutes, GETDATE())

-- 1) Define trace

EXEC @RC = master.dbo.sp_trace_create
    @TraceID OUTPUT,
    @options        = @OptionsParam,
    @tracefile        = @TraceFile,            -- N'D:\Trace\SQLTrace\FirstTrace',
    @maxfilesize    = @MaxFileSizeParamMB,    -- 50 MB
    @stoptime        = @StopTimeParam

-- Select the return code to see if the trace creation was successful.
IF @RC = 0 AND @TraceID > 0

    BEGIN

--2) Define events

        -- sp_trace_setevent
        --    [ @traceid = ] trace_id
        --    , [ @eventid = ] event_id
        --    , [ @columnid = ] column_id
        --    , [ @On = ] on

        -- 10: RPC:Completed
        -- Occurs when a remote procedure call (RPC) has completed.

        EXEC master.dbo.sp_trace_setevent @TraceID, 10, 1, @on    -- TextData
        EXEC master.dbo.sp_trace_setevent @TraceID, 10, 3, @on    -- DatabaseID
        EXEC master.dbo.sp_trace_setevent @TraceID, 10, 11, @on    -- LoginName
        EXEC master.dbo.sp_trace_setevent @TraceID, 10, 35, @on    -- Database name
        EXEC master.dbo.sp_trace_setevent @TraceID, 10, 12, @on    -- SPID
        EXEC master.dbo.sp_trace_setevent @TraceID, 10, 13, @on    -- Duration

        -- 12: SQL:BatchCompleted
        -- Occurs when a Transact-SQL batch has completed.

        EXEC master.dbo.sp_trace_setevent @TraceID, 12, 1, @on    -- TextData
        EXEC master.dbo.sp_trace_setevent @TraceID, 12, 3, @on    -- DatabaseID
        EXEC master.dbo.sp_trace_setevent @TraceID, 12, 11, @on    -- LoginName
        EXEC master.dbo.sp_trace_setevent @TraceID, 12, 35, @on    -- Database name
        EXEC master.dbo.sp_trace_setevent @TraceID, 12, 12, @on    -- SPID
        EXEC master.dbo.sp_trace_setevent @TraceID, 12, 13, @on    -- Duration

        -- 45: SP:StmtCompleted
        -- Indicates that a Transact-SQL statement within a stored procedure has finished executing.

        EXEC master.dbo.sp_trace_setevent @TraceID, 45, 1, @on    -- TextData
        EXEC master.dbo.sp_trace_setevent @TraceID, 45, 3, @on    -- DatabaseID
        EXEC master.dbo.sp_trace_setevent @TraceID, 45, 11, @on    -- LoginName
        EXEC master.dbo.sp_trace_setevent @TraceID, 45, 35, @on    -- Database name
        EXEC master.dbo.sp_trace_setevent @TraceID, 45, 12, @on    -- SPID
        EXEC master.dbo.sp_trace_setevent @TraceID, 45, 28, @on    -- ObjectType
        EXEC master.dbo.sp_trace_setevent @TraceID, 45, 13, @on    -- Duration

-- 3) Define Filter

    --    sp_trace_setfilter
    --          [ @traceid = ] trace_id
    --        , [ @columnid = ] column_id
    --        , [ @logical_operator = ] logical_operator
    --        , [ @comparison_operator = ] comparison_operator
    --        , [ @value = ] value

        -- 35 = DatabaseName column
        EXEC master.dbo.sp_trace_setfilter
            @TraceID,
            35,
            0,
            0,
            @DatabaseFilter -- N'AdventureWorks'

        IF @ObjectName IS NOT NULL
            BEGIN
                -- 34 = ObjectName
                 EXEC master.dbo.sp_trace_setfilter @TraceID, 34, 0, 0, @ObjectName
            END

-- 4) Start Trace (status 1 = start):

        EXEC @RC = sp_trace_setstatus @TraceID, 1

    END

Execution Example: Here’s an example of how to execute this script:

DECLARE
    @TraceFile        NVARCHAR(245),
    @RunTimeMinutes INT,
    @DatabaseFilter NVARCHAR(4000),
    @ObjectName        SYSNAME

SET @TraceFile        = 'C:\Temp\AdventureWorksTuningTrace1'
SET @RunTimeMinutes = 1
SET @DatabaseFilter = 'AdventureWorks'

EXECUTE [DBA].[dbo].[RunSQLTrace]
   @TraceFile
  ,@RunTimeMinutes
  ,@DatabaseFilter
  ,@ObjectName
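
Once the trace stops (at @StopTimeParam, or when you set its status to 0 and then 2 with sp_trace_setstatus), you can read the trace file back into a query window with the built-in sys.fn_trace_gettable function. Here’s a minimal sketch, assuming the example file name above (SQL Trace appends the .trc extension for you):

-- Sketch only: read the server-side trace file back as a rowset
SELECT EventClass, TextData, DatabaseName, LoginName, SPID, Duration
FROM sys.fn_trace_gettable(N'C:\Temp\AdventureWorksTuningTrace1.trc', DEFAULT);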

Conclusion: SQL Server Profiler is a staple tool that has its pros and cons. If you have the time to plan, create a set of preconfigured SQL Trace stored procedures to have at the ready. In the second post on this topic, I’ll provide a script to create 6 staggered SQL Agent jobs that execute the above “Tuning” SQL Trace stored procedure. We’ll then examine a few options on how to analyze or consume the trace data.


“Get File Stats” Tool for SQL Server 2005 Part I

Scenario: You’d like to keep your finger on pulse of your SQL Server 2005 instances. There are so many metrics to consider, where do you start? Which are the “bellwether” metrics?

Solution: Microsoft gave us a pile of metrics with SQL Server 2005’s “Dynamic Management Views and Functions” (if you ever felt slighted in the past due to the dearth of server state information, just spend a few minutes perusing this topic in SQL Server Books Online).

Under the heading of “SQL Server Operating System Related Dynamic Management Views” we find a view named “sys.dm_io_virtual_file_stats.” In the opinion of Andrew J. Kelly, SQL Server MVP, this is one we need to become familiar with and pay attention to.

Laconic Books Online imparts to us this odic nub,

“Returns I/O statistics for data and log files. This dynamic management view replaces the fn_virtualfilestats function.”

Ok…a bit light in the “why” department.

When in doubt, turn to our MVPs (they get the inside skinny). Randy Dyess tells us we use this “perfect little dynamic management view” to understand the disk I/O requests, at the file level, that come through our databases. In other words, we can use this view to identify file-performance issues.

We accomplish this task by analyzing six pieces of information for each of our database files:

  1. The number of reads on a file
  2. The amount of data read on a file
  3. The number of writes to a file
  4. The amount of data written to a file
  5. The number of disk I/O read stalls on a file
  6. The number of disk I/O write stalls on a file

Mr. Dyess tells us that, with these metrics, we can optimize our physical sub-system configuration (RAID level, spindle numbers, SAN configuration) or file placement.

But, again, at a higher level, we can use this view to determine if we’re having disk issues.
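
Before automating anything, you can also get a quick, ad-hoc feel for per-file latency straight from the view. Here’s a minimal sketch (the column names come straight from sys.dm_io_virtual_file_stats; keep in mind the counters are cumulative since the last server restart):

-- Sketch only: rough average stall per read and per write, for every file on the instance
SELECT
	DB_NAME(vfs.database_id)								AS DatabaseName,
	vfs.file_id												AS FileID,
	vfs.num_of_reads,
	vfs.num_of_bytes_read,
	vfs.num_of_writes,
	vfs.num_of_bytes_written,
	vfs.io_stall_read_ms / NULLIF(vfs.num_of_reads, 0)		AS AvgReadStallMS,
	vfs.io_stall_write_ms / NULLIF(vfs.num_of_writes, 0)	AS AvgWriteStallMS
FROM sys.dm_io_virtual_file_stats(NULL, NULL)				vfs
ORDER BY AvgReadStallMS DESC;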

Here’s a script that creates a stored procedure, which you could schedule as a daily job, to create and populate a “File Stats” work table.

Conventions: One convention bears comment.

  1. It’s my standard practice to create a fixed-space “DBA” database on a given SQL Server instance. It’s a workspace where I’ll perform DBA analysis tasks and, optionally, persist related metrics (akin to SQL Server’s usage of its “tempdb” database). I’ll place DBA objects, like this script’s stored procedure and related table, in the DBA database.

Script Highlights:

The script has one parameter that we will change:

  1. @DaysToRetain: The number of days of “file-stats” data we wish to retain. I have an arbitrary default of 712 days.

I divide the script into three sections:

  1. Removal of aged-out data
  2. Work table load
  3. Last-server-restart date update

The Script:

Note: I store these scripts on Box.net. Click on this link to download the below file.

USE [DBA]
GO

IF OBJECT_ID('dbo.GetFileStats', 'P') IS NOT NULL DROP PROCEDURE dbo.GetFileStats
GO

CREATE PROCEDURE dbo.GetFileStats

	@DaysToRetain			SMALLINT	= 712

AS

/*
* Author:	Mark Holahan (inspired from a Brian Kelly workshop)
* Date:		Friday, 16-April-2011
*
* Version:  2
* Purpose:	To measure the file stats
*
* ToDo:
*/

SET NOCOUNT ON;

-- Persisted work table

IF OBJECT_ID('DBA.dbo.FileStats', 'U') IS NULL

	BEGIN
		CREATE TABLE dbo.FileStats
		(
			PK										INT IDENTITY CONSTRAINT PK_FileStats PRIMARY KEY,
			ServerName								SYSNAME NULL,
			DatabaseID								SMALLINT,
			DatabaseName							SYSNAME NOT NULL,
			FileID									SMALLINT,
			SampleMS								INT,
			NumOfReads								BIGINT,
			NumOfBytesRead							BIGINT,
			IOStallReadMS							BIGINT,
			NumOfWrites								BIGINT,
			NumOfBytesWritten						BIGINT,
			IOStallWriteMS							BIGINT,
			IOStall									BIGINT,
			SizeonDiskBytes							BIGINT,
			FileHandle								VARBINARY(25),
			DTStamp									DATETIME NOT NULL,
			LastServerRestart						DATETIME NULL,
			DaysSinceLastRestart					AS (DATEDIFF(dd, LastServerRestart, GETDATE()) )
		) ON [PRIMARY]

		CREATE NONCLUSTERED INDEX NCndx_DTStamp
		ON dbo.FileStats (DTStamp)

	END

-- Vars
DECLARE

	@AgeOutDate										DATETIME,
	@BATCHDATE										DATETIME,
	@InstanceName									SYSNAME,
	@MachineName									SYSNAME,
	@NO												BIT,
	@SQLServerInstanceName							SYSNAME,
	@tempdbCreateDate								DATETIME,
	@YES											BIT

-- Constants

SET @BATCHDATE				= GETDATE()
SET @NO						= 0
SET @YES					= 1

-- Variable
SET @InstanceName			= CAST(SERVERPROPERTY('InstanceName') AS SYSNAME)
SET @MachineName			= CAST(SERVERPROPERTY('MachineName') AS SYSNAME)

-- 1) Remove records older than user-provided number of days

BEGIN TRY

	-- By default, I'd like to keep 2 years of data
	SET @AgeOutDate = DATEADD(dd, (@DaysToRetain * -1), GETDATE())

	DELETE dbo.FileStats
	WHERE DTStamp < @AgeOutDate

END TRY

BEGIN CATCH
	SELECT
		ERROR_NUMBER() AS ErrorNumber
		,ERROR_MESSAGE() AS ErrorMessage
		,ERROR_SEVERITY() AS ErrorSeverity
		,ERROR_STATE() AS ErrorState
		,ERROR_LINE() AS ErrorLine
		,ERROR_PROCEDURE() AS ErrorProcedure

		RETURN
END CATCH

-- 2) Load table with file-stats info

BEGIN TRY

	INSERT DBA.dbo.FileStats
	   (
	   ServerName
	   ,DatabaseName
	   ,DatabaseID
	   ,FileID
	   ,SampleMS
	   ,NumOfReads
	   ,NumOfBytesRead
	   ,IOStallReadMS
	   ,NumOfWrites
	   ,NumOfBytesWritten
	   ,IOStallWriteMS
	   ,IOStall
	   ,SizeonDiskBytes
	   ,FileHandle
	   ,DTStamp
	   )

	SELECT
		@MachineName						ServerName,
		DB_NAME(database_id)				DatabaseName,
		database_id							DatabaseID,
		file_id								FileID,
		sample_ms							SampleMS,
		num_of_reads						NumOfReads,
		num_of_bytes_read					NumOfBytesRead,
		io_stall_read_ms					IOStallReadMS,
		num_of_writes						NumOfWrites,
		num_of_bytes_written				NumOfBytesWritten,
		io_stall_write_ms					IOStallWriteMS,
		io_stall							IOStall,
		size_on_disk_bytes					SizeonDiskBytes,
		file_handle							FileHandle,
		@BATCHDATE							DTStamp
	FROM sys.dm_io_virtual_file_stats (NULL, NULL)

END TRY

BEGIN CATCH
	SELECT
		ERROR_NUMBER() AS ErrorNumber
		,ERROR_MESSAGE() AS ErrorMessage
		,ERROR_SEVERITY() AS ErrorSeverity
		,ERROR_STATE() AS ErrorState
		,ERROR_LINE() AS ErrorLine
		,ERROR_PROCEDURE() AS ErrorProcedure

END CATCH

-- 3) Ascertain the last server restart date, since the counters in
--	sys.dm_io_virtual_file_stats reset at each server restart

BEGIN
	BEGIN TRY

		SELECT @tempdbCreateDate = create_date
		FROM master.sys.databases
		WHERE name = 'tempdb'

		UPDATE DBA.dbo.FileStats
		SET LastServerRestart = @tempdbCreateDate
		WHERE LastServerRestart IS NULL;

	END TRY

	BEGIN CATCH

		SELECT
			ERROR_NUMBER() AS ErrorNumber
			,ERROR_MESSAGE() AS ErrorMessage
			,ERROR_SEVERITY() AS ErrorSeverity
			,ERROR_STATE() AS ErrorState
			,ERROR_LINE() AS ErrorLine
			,ERROR_PROCEDURE() AS ErrorProcedure

	END CATCH	

END

Execution Example: Here’s an example of how to execute this script:

DECLARE @DaysToRetain SMALLINT
SET @DaysToRetain = 712

EXECUTE [DBA].[dbo].[GetFileStats]
   @DaysToRetain

Here’s a output sample from an invocation of the [DBA].[dbo].[GetFileStats] stored procedure:

Get File Stats Sample Output

Conclusion: With the introduction of SQL Server 2005’s “Dynamic Management Views and Functions,” DBAs have much data to sift through. SQL Server MVPs Andrew J. Kelly and Randy Dyess assert that we need to use the “sys.dm_io_virtual_file_stats” view to discover disk-performance issues. The script I’ve provided will help you get started on that task.

In a follow-up post, I’ll discuss how to analyze your historical “file stats” data.
