
Allen White

ADO.NET in PowerShell

I've had to rework some of the logic for gathering database information in my serverstatus.ps1 script, because SMO doesn't do things the way you might expect: it makes a separate round-trip to the server for many of the properties it exposes, and the resulting performance is unacceptable when dealing with multi-terabyte databases.
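
For reference, the kind of SMO loop I'd been using looks roughly like this (a sketch, not my exact serverstatus.ps1 code; $inst holds the instance name):

[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMO') | out-null
$srv = new-object Microsoft.SqlServer.Management.Smo.Server ($inst)
foreach ($db in $srv.Databases) {
    # Each property read here can trigger its own query against the server,
    # and Size/SpaceAvailable get expensive on very large databases
    $db.Name
    $db.Size            # data file size, in MB
    $db.SpaceAvailable  # free space, in KB
}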

To gather the actual data, then, I resorted to good old-fashioned T-SQL via ADO.NET. (I have a follow-up script that loads the collected data using the Invoke-Sqlcmd cmdlet, but I don't want to force my clients to load the SQL 2008 tools just to run my data collection script.) To gather the information I'd previously gotten from the SMO Databases collection, I have to do things in multiple steps, and I use temporary tables to get there. (Yes, a stored procedure would be a better choice here, but I don't want to install anything on client systems.)
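
The follow-up load, for the curious, looks something like this sketch, run on a machine that does have the SQL 2008 tools ($outnm is the CSV path from the end of this post; the StatusDB database and dbo.DatabaseStatus table are hypothetical stand-ins for my real repository objects):

add-pssnapin SqlServerCmdletSnapin100   # the SQL 2008 snap-in that provides Invoke-Sqlcmd
import-csv $outnm | foreach-object {
    $iq = "insert dbo.DatabaseStatus (Name, Size, SpaceAvailable)"
    $iq = $iq + " values (N'" + $_.Name + "', " + $_.Size + ", " + $_.SpaceAvailable + ")"
    invoke-sqlcmd -ServerInstance $inst -Database StatusDB -Query $iq
}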

The first step is to gather everything about the databases using the following T-SQL command through ADO.NET via PowerShell:

$cn = new-object system.data.SqlClient.SqlConnection("Data Source=$inst;Integrated Security=SSPI;Initial Catalog=master");
$cn.Open()
$q = "SELECT  dtb.database_id as [db_id],"
$q = $q + "  dtb.name as [Name],"
$q = $q + "        dtb.collation_name AS [Collation],"
$q = $q + "        CASE dtb.compatibility_level"
$q = $q + "             when 80 then 'Version80'"
$q = $q + "             when 90 then 'Version90'"
$q = $q + "             when 100 then 'Version100'"
$q = $q + "             END AS [CompatibilityLevel],"
$q = $q + "        CASE when dtb.is_auto_shrink_on = 0 then 'False'"
$q = $q + "             else 'True'"
$q = $q + "             END AS [AutoShrink],"
$q = $q + "        CASE dtb.recovery_model"
$q = $q + "             when 1 then 'Full'"
$q = $q + "             when 2 then 'BulkLogged'"
$q = $q + "             when 3 then 'Simple'"
$q = $q + "             END AS [RecoveryModel],"
$q = $q + "        CAST(0 AS float) AS [Size],"
$q = $q + "        CAST(0 AS float) AS [SpaceAvailable]"
$q = $q + "   into tempdb.dbo.tmpdbs"
$q = $q + "   FROM    master.sys.databases AS dtb"
$cmd = new-object "System.Data.SqlClient.SqlCommand" ($q, $cn)
$dr = $cmd.ExecuteNonQuery()
$cn.Close()

This creates a persistent table called tmpdbs in tempdb, using the ExecuteNonQuery method of the SqlCommand object, which lets me do work without expecting results back. I create a persistent tempdb table rather than a #temp table because I close my connection after the query completes, and a #temp table would be dropped along with the connection. (Database best practice: open connections late, close them early.)
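
If you're on PowerShell V2, you can make the open-late/close-early pattern more robust with a try/finally guard; here's a minimal sketch (the listings in this post omit it for brevity):

$cn = new-object system.data.SqlClient.SqlConnection("Data Source=$inst;Integrated Security=SSPI;Initial Catalog=master")
try {
    $cn.Open()
    $cmd = new-object "System.Data.SqlClient.SqlCommand" ($q, $cn)
    $cmd.ExecuteNonQuery() | out-null
}
finally {
    $cn.Close()   # runs even if the command throws
}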

Next, while iterating through the databases to gather the file size details, I use the following commands to collect the numbers and update the matching row in the persistent tempdb table:

$cn = new-object system.data.SqlClient.SqlConnection("Data Source=$inst;Integrated Security=SSPI;Initial Catalog=$dbname");
$cn.Open()
$q = "create table #tmpspc"
$q = $q + "    ("
$q = $q + "      Fileid int,"
$q = $q + "      FileGroup int,"
$q = $q + "      TotalExtents int,"
$q = $q + "      UsedExtents int,"
$q = $q + "      Name sysname,"
$q = $q + "      FileName nchar(520)"
$q = $q + "    ) "
$q = $q + "insert #tmpspc EXEC ('dbcc showfilestats')"
$cmd = new-object "System.Data.SqlClient.SqlCommand" ($q, $cn)
$dr = $cmd.ExecuteNonQuery()

$q = " select (SUM(CAST(TotalExtents as float)) * 64) / 1024.0 AS [Size],"
$q = $q + "  ((SUM(CAST(TotalExtents as float)) - "
$q = $q + "  SUM(CAST(UsedExtents as float))) * 64) / 1024.0 AS [SpaceAvailable]"
$q = $q + " into #tmpdbz"
$q = $q + " from #tmpspc"
$cmd = new-object "System.Data.SqlClient.SqlCommand" ($q, $cn)
$dr = $cmd.ExecuteNonQuery()

$q = " UPDATE  d"
$q = $q + " SET d.[Size] = z.[Size]"
$q = $q + "   ,d.[SpaceAvailable] = z.[SpaceAvailable]"
$q = $q + " from #tmpdbz z, tempdb.dbo.tmpdbs d"
$q = $q + " WHERE d.db_id = " + [string]$db.ID + ""
$cmd = new-object "System.Data.SqlClient.SqlCommand" ($q, $cn)
$dr = $cmd.ExecuteNonQuery()
$cn.Close()
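
The enclosing loop isn't shown above. In my script, $db comes from iterating the databases and $dbname holds the current database name; if you wanted to stay SMO-free, one sketch is to read the list back from the table built in the first step:

$cn = new-object system.data.SqlClient.SqlConnection("Data Source=$inst;Integrated Security=SSPI;Initial Catalog=master")
$cn.Open()
$cmd = new-object "System.Data.SqlClient.SqlCommand" ("select [db_id], [Name] from tempdb.dbo.tmpdbs", $cn)
$rdr = $cmd.ExecuteReader()
$dbs = @()
while ($rdr.Read()) {
    $db = new-object PSObject
    $db | add-member -memberType NoteProperty -name ID -value $rdr["db_id"]
    $db | add-member -memberType NoteProperty -name Name -value $rdr["Name"]
    $dbs += $db
}
$rdr.Close()
$cn.Close()
foreach ($db in $dbs) {
    $dbname = $db.Name
    # ...run the #tmpspc/#tmpdbz/UPDATE batch above against $dbname...
}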

Lastly, I use a SqlDataAdapter to fill a DataSet, pull out the resulting DataTable, and send the rows to the output CSV file. The same batch drops the tempdb work table once it's been read.

$cn = new-object system.data.SqlClient.SqlConnection("Data Source=$inst;Integrated Security=SSPI;Initial Catalog=master");
$cn.Open()
$q = "select Name, Collation, CompatibilityLevel, AutoShrink, RecoveryModel, Size, SpaceAvailable"
$q = $q + " FROM tempdb.dbo.tmpdbs
"
$q = $q + " drop table tempdb.dbo.tmpdbs"
$dbsf = new-object "System.Data.SqlClient.SqlDataAdapter" ($q, $cn)
$dbsf.SelectCommand.CommandTimeout = 0
$dbsf.Fill($dbf)
$dtDatabase = new-object "System.Data.DataTable" "dtDatabaseData"
$dtDatabase = $dbf.Tables[0]
$outnm = ".\" + $svr + "\" + $instnm + "_GEN_Databases.csv"
$dtDatabase | select Name, Collation, CompatibilityLevel, AutoShrink,
	RecoveryModel, Size, SpaceAvailable | export-csv -path $outnm -noType
$cn.Close()

As with anything, there's more than one way to solve a problem, but I found this solution performs well even with very large databases, and avoids the insane number of round-trips to the server that the SMO object makes.

P.S. Thanks to Philip Bondi for asking about this.

Allen

Published Tuesday, September 22, 2009 2:03 PM by AllenMWhite

