Export SQL Query Results to Excel and Email to Multiple Recipients

I was recently tasked with creating a custom solution that could query a table (specifically 6 columns), one of which contained a query that needed to be executed. The results of that execution needed to be placed in Excel, formatted, and sent to a list of recipients (another of the columns). The actual request was a little more involved than this, but I thought the underpinnings might be useful to others, so I'm posting my solution here.

To begin with, I know there are many different possibilities for creating this type of solution. I opted for one that felt simple (to me) and that I knew I could build within a reasonable amount of time (disclaimer: I am not a programmer, so forgive my coding practices). I chose to build the solution in PowerShell (2.0 specifically) and tap into some existing cmdlets I was aware of to help speed things along.

To get started I conceptually laid out the steps I needed to take to get to the end result. 

  • I had a table called [Alerts] that contained 6 columns I needed to query.
  • I needed to execute an initial query to query the [Alerts] table to extract the results from the 6 columns:
    • [Query]
    • [Recipients]
    • [Subject]
    • [Body]
    • [From]
    • [FileLocation]

I think these column names are pretty self-explanatory as to what they are for. The [Query] column is the column that contains the query I needed to execute and then save the results to Excel. The [FileLocation] column is the file location that the Excel file should be saved to.

  • For each row in the [Alerts] table I needed to execute the query from the [Query] column.
  • Save the results to Excel (formatted nicely)
  • Email the file as an attachment to individuals in the [Recipients] column with [Subject], [Body], and [From] from their respective columns.
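To make those columns concrete, here is a hypothetical example of what a single row in [Alerts] might hold. The values below are made up purely for illustration, not taken from the real table:

  #Hypothetical example of one [Alerts] row - illustration only, not from the real table
 $exampleRow = New-Object PSObject -Property @{
     TSQL      = 'SELECT OrderID, OrderDate FROM dbo.Orders WHERE OrderDate >= DATEADD(day, -1, GETDATE())'
     FilePath  = 'C:\Reports\DailyOrders'   #no extension; .csv/.xls get appended by the script
     Recipient = 'Someone1@outlook.com,Someone2@outlook.com'
     Subject   = 'Daily Orders Report'
     Body      = 'Attached is the daily orders report.'
     From      = 'reports@mydomain.com'
 }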

One of the other requirements was that all of this needed to be called from a stored procedure, but for this post I'm just going to show what I did during testing and validation: calling a script with hard-coded values in it. If I get time I'll do another post on how I transplanted the rest into a sproc I could call with dynamic values passed in.

Exporting query results directly into Excel isn't a native capability of SQL Server, so I went through what I knew I could leverage and settled on PowerShell to assist me. I also wasn't aware of an easy way to go straight to Excel, so I came up with a workaround that works quite well for my situation. So without further ado, here is my solution.

First I created this PowerShell code. Note: I've changed some of the hard-coded values for security. I've commented a lot of the code chunks for clarity and will expand on areas that I feel warrant further explanation.

  #---------------
 # Function: Main
 #---------------
 #Server we are connecting to
 $ServerName = 'MySQLServer'
 
 #DB we are connecting to
 $DatabaseName = 'TestDB'
 
 #Initial query to extract data from [Alerts] table
 $Query = 'select TSQL, FilePath, Recipient, Subject, Body, [From] from [dbo].[Alerts]'

The following variables hold the hard-coded column names from the [Alerts] table. Hard-coding them isn't ideal, but at the time this is how I constructed the script to get the results I wanted.

  #Column names from [Alerts] table
 $RecipientsColName = 'Recipient'
 $TSQLColName = 'TSQL'
 $FilePathColName = 'FilePath'
 $SubjectColName = 'Subject'
 $BodyColName = 'Body'
 $FromColName = 'From'
 $smtpServer = 'mysmtphost.somewhere.mydomain.com'

I spin up an Excel COM object up front to avoid paying the load time later (as you'll see in a bit).

  #Create Excel COM object
 $Global:excel = New-Object -ComObject excel.application
 $Global:excel.visible = $False
 $Global:excel.displayalerts=$False
 
 #Pass in the query from above to query the [Alerts] table
 $sqlCMD = $Query
 
 #Create a SQL connection string
 $sqlConStr = "Server=$ServerName;Database=$DatabaseName;Integrated Security=True" 
 
 #Open the connection
 $sqlCon = New-Object System.Data.SQLClient.SqlConnection ($sqlConStr)
 
 #Create a Data adapter, pass in query and connection params
 $sqlDA = New-Object System.Data.SqlClient.SqlDataAdapter ($sqlCMD, $sqlCon) 
 
 #Create a Data Table
 $sqlDT = new-object System.Data.DataTable

In the error handling I have a function called CloseExcel (detailed later on). This function will dereference the Excel COM object should a failure occur during the connection attempt.

  #Error handling should the connection fail
 Try
 {
     #Fill table, catch connection error
     $sqlDA.Fill($sqlDT) | Out-Null
 }
 Catch [Exception]
 {
     CloseExcel
     Return $Error
 }
 Finally
 {
     #Close connection
     $sqlCon.Close()
 }

This variable holds the position of the files I create. Since each row produces a file, this is the index of each file in the arrays built below.

  #xls file slot in array for attachment to send
 [int] $Slot = 0

At this point I have all the results in a data table and I want to loop through each row, pulling out the values from each column. This is where the hard-coded column names come in handy. There is probably a better or different way of doing this using column position (I sketch one below), but this worked for me.

  #Loop through each row in the table
 $sqlDT | ForEach-Object { 
 [string] $Query = $_.$TSQLColName
 [string] $FilePath = $_.$FilePathColName
 [string] $Recipient = $_.$RecipientsColName
 [string] $Subject = $_.$SubjectColName
 [string] $Body = $_.$BodyColName
 [string] $From = $_.$FromColName
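As an aside, if you would rather not hard-code the column names, the same values can be pulled by ordinal position instead. This is a minimal sketch, assuming the columns come back in the exact order of the SELECT above; it is not what my script actually does:

  #Alternative (not used here): pull values by column position instead of name
 [string] $Query = $_.Item(0)      #TSQL
 [string] $FilePath = $_.Item(1)   #FilePath
 #...and so on for the remaining columns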

For each row I stuff the column values into their respective variables and then execute that row's query. I'm using the built-in PowerShell cmdlet Export-Csv to automatically convert the results into a csv file (at the location and with the name specified in the table columns). Nothing is being created dynamically here, since the requirement was to put the file in a location specified in the table, but one could easily change this to create a filename dynamically (I sketch one option after the next block).

  Try
 {
     #Execute $Query and output the results to a .csv file
     Invoke-Sqlcmd -Query $Query -ServerInstance $ServerName -Database $DatabaseName |
         Export-Csv -Path ($FilePath + '.csv') -NoTypeInformation
 }
 Catch [Exception]
 {
     CloseExcel
     Return $Error
 }
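Since I mentioned it above, here is what a dynamically generated file name could look like. This is just a sketch with a made-up folder, not part of the original script, which always honors the path stored in the table:

  #Hypothetical alternative: build a time-stamped file name instead of using the table value
 $stamp = Get-Date -Format 'yyyyMMdd_HHmmss'
 $FilePath = "C:\Reports\Alert_$stamp"   #the .csv/.xls extensions get appended the same way as before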

What I'm doing here is building an array of csv files (one per row's query output). I'll explain why I'm doing it this way in a moment.

  #Build csv file list for cleanup
 $Global:csvFileLists += $FilePath + '.csv'

Now here comes the magic. What I do is take the csv file and open it in Excel silently. Notice the use of the file list array and $Slot to pick the right file based on which iteration of the loop I'm in.

  #Format the file to .xls
 $workBook = $excel.Workbooks.Open($Global:csvFileLists[$Slot])
 $workSheet = $workBook.workSheets.Item(1)

$RowCount holds the number of rows in the file, which I use later to decide which subject line to email. If you expect more rows than an integer can hold, change the type accordingly.

  #Determine how many rows
 [int] $RowCount = $workSheet.UsedRange.Rows.Count

Here I'm using Excel's autofit function to resize the columns for me:

  #Autofit the text to the columns
 $resize = $workSheet.UsedRange
 $resize.EntireColumn.AutoFit() | Out-Null

Now I'm just formatting the entire first row (the csv header row) so it stands out.

  #Select the entire first row, bold, underline, center and shade it.
 $range = $workSheet.Cells.Item(1).EntireRow 
 $range.Font.Bold = $true
 $range.Font.Underline = $true
 $range.HorizontalAlignment = 3
 
 #light blue - https://msdn.microsoft.com/en-us/library/cc296089.aspx#xlDiscoveringColorIndex_ColorIndexProperty
 $range.Interior.ColorIndex = 37
 
 # Possible values from: https://msdn.microsoft.com/en-us/library/microsoft.office.interop.excel.xlfileformat.aspx
 # 51 = xlOpenXMLWorkbook (without macros in 2007-2010, xlsx)
 # 56 = xlExcel8 (97-2003 format in Excel 2007-2010, xls)
 # 52 = xlOpenXMLWorkbookMacroEnabled (with or without macros in 2007-2010, xlsm)
 # 50 = xlExcel12 (Excel Binary Workbook in 2007-2010 with or without macros, xlsb)
 $xlExcel8 = 56

To change this from a csv to an Excel file I make sure the file path carries no .csv extension and then save the file as a native .xls file.

  #Make sure the file path has no .csv extension before saving
 $FilePath = $FilePath.Replace(".csv", "")
 $workBook.SaveAs($FilePath, $xlExcel8)

I also track, in a different global array, all of the Excel files I've created so I can be sure to delete them at the end.

  #Add file to list for later cleanup
 $Global:xlsFileLists += $workBook.FullName
 
 #Close workbook
 $workBook.Close() 
 
 #deref sheets
 $workSheet = $null
 $workBook = $null

Now, to send the email I'm using PowerShell's built-in Send-MailMessage cmdlet. One of the tricky things about using it is handling multiple recipients. If you have multiple recipients entered in the table like this:

Someone1@outlook.com,Someone2@outlook.com

You can't just stuff the single string $Recipient into the Send-MailMessage parameter -To like this:

$Recipient = 'Someone1@outlook.com,Someone2@outlook.com'

Send-MailMessage -To $Recipient

It doesn't work, because Send-MailMessage expects the -To value to be a string array with one address per element, not one long comma-delimited string. So I built an el-cheapo parser for the case where there is more than one recipient in the column. The function is called . . . can you guess ... ParseRecipient, and I pass it the variable $Recipient. I admit I went a little cheeky and created a global variable (well, if I'm being honest, a few globals) simply to make things easier to work with.
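For reference, this is the shape Send-MailMessage is happy with: an actual array of addresses (made-up values here, obviously):

  #What -To actually wants: an array of addresses, one element per recipient
 $toList = @('Someone1@outlook.com', 'Someone2@outlook.com')
 Send-MailMessage -To $toList -Subject 'Test' -Body 'Test' -From 'me@mydomain.com' -SmtpServer $smtpServer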

The logic is: if $RowCount from the Excel file is greater than 1, I know there is data in the file, since the first row is just the headers. If it's 1 (or less), I know there wasn't any data in the file and my logic changes. In both cases I call the function ParseRecipient and pass in $Recipient for parsing.

  If ($RowCount -gt 1)
 {
     Try
     {
         #Parse Recipient
         ParseRecipient $Recipient

         #Send Email
         Send-MailMessage -To $Global:Recipients -Subject $Subject -Body $Body -Attachment $Global:xlsFileLists[$Slot] -From $From -SmtpServer $smtpServer
     }
     Catch [Exception]
     {
         CloseExcel
         Return $Error
     }
 }
 ELSE
 {
     $Subject = "No Reports Today - $Subject"
     $Body = "Reports were empty today."

     Try
     {
         #Parse Recipient
         ParseRecipient $Recipient

         #Send Email
         Send-MailMessage -To $Global:Recipients -Subject $Subject -Body $Body -From $From -SmtpServer $smtpServer
     }
     Catch [Exception]
     {
         CloseExcel
         Return $Error
     }
 }

 #increment for next attachment file
 $Slot += 1
 }
 #end of $sqlDT | ForEach-Object {

This point marks the end of the loop over $sqlDT | ForEach-Object. At this point, for every row in the table, we've executed its query, pumped the results into a .csv file (saving the file name and location in an array), converted that file from .csv to .xls (again saving the file name and location in a different array), and sent out an email for each file to the respective recipients. Now we need to do a little housekeeping and delete all the files.

Here I just loop through the arrays I built and delete all the .csv and .xls files.

  #Delete csv files
 ForEach ($csvFile in $Global:csvFileLists)
 {
     Remove-Item $csvFile
 }

 #Delete xls files
 ForEach ($xlsFile in $Global:xlsFileLists)
 {
     Remove-Item $xlsFile
 }

Finally I make sure to call the CloseExcel function to clean up the Excel COM object references.

 CloseExcel 

And there you have it. This is how I exported SQL query results (with column headers) to Excel and emailed them to the intended recipients. Here is the CloseExcel function (nothing special, and perhaps a little overkill).

 #----------------------------------
 # Function: CloseExcel
 # Dereferences the Excel COM object
 #----------------------------------
 Function CloseExcel
 {
     #Close Excel and cleanup
     $Global:excel.Quit()
     $Global:excel = $null
     [GC]::Collect()
 }
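One optional tweak, in case you ever find orphaned Excel.exe processes hanging around: Quit() alone doesn't always let the process exit, so you could also release the COM reference explicitly before nulling it out. This isn't something my script does; it's just a sketch of the idea:

  #Optional: explicitly release the COM reference so the Excel process can exit
 [System.Runtime.InteropServices.Marshal]::ReleaseComObject($Global:excel) | Out-Null
 $Global:excel = $null
 [GC]::Collect()
 [GC]::WaitForPendingFinalizers()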

And here is the recipient parsing logic:

  
 #---------------------------------------
 # Function: ParseRecipient
 # Parses Recipient into Recipients Array
 #---------------------------------------
 Function ParseRecipient
 {
     Param([string]$Recipient)

     #Clear out array
     $Global:Recipients = $null

     #Trim once up front so the comma positions below line up
     $Recipient = $Recipient.Trim()

     $commaIndex = 0
     DO
     {
         #First comma loc (0 based)
         $commaIndex = $Recipient.IndexOf(",")

         IF ($commaIndex -eq -1)
         {
             #Build Recipients Array
             $Global:Recipients += $Recipient.Trim()
         }
         ELSE
         {
             #Build Recipients Array from 0 to commaIndex
             $Global:Recipients += $Recipient.Substring(0, $commaIndex).Trim()

             #Rewrite Recipient list, start from commaIndex+1
             $Recipient = $Recipient.Substring($commaIndex + 1).Trim()
         }
     }
     UNTIL ($commaIndex -eq -1)
 }
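For what it's worth, PowerShell's -split operator would do the same job in a single line, so the whole function could be replaced with something like this (an alternative sketch, not what I actually shipped):

  #Alternative to ParseRecipient: -split returns a string array directly
 $Global:Recipients = $Recipient -split ',' | ForEach-Object { $_.Trim() }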

I forgot to mention the global variables I declared:

  #-------
 #Globals
 #-------
 #Collection of files to cleanup
 [string[]] $Global:xlsFileLists = @()
 [string[]] $Global:csvFileLists = @()
 $Global:excel = $null
 [string[]] $Global:Recipients = @()
 

To complete this I wired it up so I could call the script from SQL Server via xp_cmdshell (which has to be enabled on the instance), like this:

  DECLARE @pscmd varchar(200)
 DECLARE @result int
 SET @pscmd = 'C:\WINDOWS\system32\WindowsPowerShell\v1.0\powershell.exe -file "C:\ExportToExcel.ps1"'
 EXEC @result = xp_cmdshell @pscmd, NO_OUTPUT
 IF (@result = 0)
     PRINT 'Success'
 ELSE
     PRINT 'Failure'

I've attached the complete code below (ExportToExcel.zip) for you to enjoy. Remember, none of this code is production ready. Use it at your own risk; it is merely sample code for you to recycle and rework to fit your own needs.

Thanks for stopping by and I hope this little morsel has been helpful to you!!

 

ExportToExcel.zip