Azure Pipelines - logging commands from SQL script - azure-devops

I am trying to log some messages from a T-SQL script running via Azure Pipelines. For instance, before creating a table we check whether the table already exists; if it does, we simply print a message and skip the table creation.
There are good articles explaining how to use Azure Pipelines logging commands from Bash or PowerShell, for instance this one: https://learn.microsoft.com/en-us/azure/devops/pipelines/scripts/logging-commands?view=azure-devops&tabs=bash
But how do I output messages to the pipeline logs from within the T-SQL script itself?
I will try RAISERROR (e.g. RAISERROR('Table [dbo].[ReportHistory] already exists!', 0, 1) WITH NOWAIT;), which hopefully works better than PRINT. Has anyone had a similar issue, and how did you resolve it?

You can run your scripts through PowerShell's Invoke-Sqlcmd with the -Verbose switch. Here is a small example for a PowerShell task:
$server = "$(servername)"
$dbname = "$(dbname)"
$u = "$(username)"
$p = "$(password)"
$filename = "testfile.sql"
$filecontent = "RAISERROR('Table [dbo].[ReportHistory] already exists!', 0, 1) WITH NOWAIT;`r`nGO`r`n"
Set-Content -Path $filename -Value $filecontent
Write-Host '##[command] Executing file... ', $filename
# Execute the SQL file
try
{
    Invoke-Sqlcmd -InputFile "$filename" -ServerInstance $server -Database $dbname -Username "$u" -Password "$p" -QueryTimeout 36000 -Verbose
}
catch
{
    Write-Host "##[error]" $Error[0]
    Write-Host "##[error]----------`n"
}
Result:

You can use Azure Pipelines Logging Commands together with PRINT and RAISERROR commands.
The logging command syntax ##vso[task..] is a reserved keyword in Azure DevOps pipelines. When ##vso[task..] is found in a task's output stream, the pipeline executes the logging command.
So you can output messages to the pipeline logs from within a T-SQL statement by emitting logging commands with PRINT or RAISERROR. See the example below:
PRINT N'##vso[task.logissue type=warning]Table already exists.';
RAISERROR('##vso[task.logissue type=warning]Table [dbo].[ReportHistory] already exists!',0,1);
The output messages then appear in the pipeline log.
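Combining the two answers, below is a minimal sketch, assuming the connection variables ($server, $dbname, $u, $p) from the first answer and using [dbo].[ReportHistory] purely as a placeholder table:
# Sketch only: a T-SQL existence check whose RAISERROR message carries a ##vso
# logging command; the -Verbose switch on Invoke-Sqlcmd is what surfaces it in the log.
$sql = @"
IF OBJECT_ID(N'dbo.ReportHistory', N'U') IS NOT NULL
    RAISERROR('##vso[task.logissue type=warning]Table [dbo].[ReportHistory] already exists!', 0, 1) WITH NOWAIT;
ELSE
    PRINT N'Table [dbo].[ReportHistory] does not exist yet.';  -- CREATE TABLE would go here
"@
Invoke-Sqlcmd -Query $sql -ServerInstance $server -Database $dbname -Username $u -Password $p -Verbose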

Related

Azure synapse deployment failing

I am trying to deploy SQL files to an Azure Synapse Analytics dedicated SQL pool using a PowerShell script in an Azure DevOps pipeline.
I have a folder with SQL files; after building an array of files I run a foreach loop over the array and call Invoke-Sqlcmd to deploy them, but only the first SQL file gets deployed (the object is created) and then I get this error:
Msg 104309, Level 16, State 1, Line 1: There are no batches in the input script.
Below is my piece of code:
$files = '$(Build.SourcesDirectory)\folder1\'
foreach ($s in $files)
{
    Invoke-Sqlcmd -ServerInstance $(server) -Database $(db) -InputFile $s -Credential $(cred)
}
Azure Synapse Analytics dedicated SQL pool scripts are sensitive to empty batches; e.g. this script would generate the same error (Msg 104309, "There are no batches in the input script"):
-- xx
GO
However, the truth is you get the same error in Synapse Studio and SQL Server Management Studio (SSMS), so I suggest you go through your scripts and use the Parse button (Ctrl+F5) in SSMS, which parses the script but does not execute it. This will help you track down the error.
In summary: don't have batches that don't do anything.
I was able to get a simple local example working by using dir and the FullName property to get the full path:
$files = dir "D:\temp\Powershell\*.sql"
foreach ($s in $files) {
    #$s.Name
    Invoke-Sqlcmd -ServerInstance '.\sql2019x' -Database 'tempdb' -InputFile $s.FullName
}
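If you want to catch the problem before it reaches the pool, below is a rough pre-check sketch. It assumes GO on its own line is the only batch separator and only strips line comments (not /* */ block comments), so treat it as a starting point rather than a parser:
# Sketch: skip files that contain nothing executable, which is what triggers
# "There are no batches in the input script" on Synapse dedicated SQL pools.
$files = Get-ChildItem "D:\temp\Powershell\*.sql"
foreach ($s in $files) {
    $body = (Get-Content $s.FullName -Raw) -replace '(?m)--.*$', '' -replace '(?im)^\s*GO\s*$', ''
    if ($body.Trim() -eq '') {
        Write-Warning "$($s.Name) has no executable batches and was skipped."
        continue
    }
    Invoke-Sqlcmd -ServerInstance '.\sql2019x' -Database 'tempdb' -InputFile $s.FullName
}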

Capture errors within ForEach-Object -Parallel in Powershell 7

I am trying to execute a PowerShell (7.0) script file using PowerShell 7 Preview which iterates through all the databases, updates them using a DACPAC file, and executes other SQL Server scripts in all the DBs.
It works fine when there are no errors; however, in case of an error while executing the DACPAC file it gives the error below and stops executing the script further.
##[error] Could not deploy package.
##[error]PowerShell exited with code '1'.
Any pointers on how we can catch the errors gracefully in PowerShell within the -Parallel statement and let the script continue with the other databases? A try/catch block does not seem to work here.
I am new to PowerShell. This PowerShell script is part of a DevOps release pipeline.
#.. Variables are defined here ..#
[string[]]$DatabaseNames = Invoke-Sqlcmd @GetDatabasesParams | select -expand name
[int]$ErrorCount = 0
$DatabaseNames | ForEach-Object -Parallel {
    try
    {
        echo "$_ - Updating database using DACPAC."
        dir -Path "C:\Program Files (x86)\Microsoft Visual Studio*" -Filter "SqlPackage.exe" -Recurse -ErrorAction SilentlyContinue | %{$_.FullName} {$using:SqlPackagePath /Action:Publish /tu:$using:DatabaseUsername /tp:$using:DatabasePassword /tsn:$using:ServerInstance /tdn:"$_" /sf:$using:DacpacLocation /p:BlockOnPossibleDataLoss=False}
        echo "$_ - Updating the scripts."
        $OngoingChangesScriptParams = @{
            "Database" = "$_"
            "ServerInstance" = "$ServerInstance"
            "Username" = "$DatabaseUsername"
            "Password" = "$DatabasePassword"
            "InputFile" = "$SystemWorkingDir\$OngoingChangesScriptLocation"
            "OutputSqlErrors" = 1
            "QueryTimeout" = 9999
            "ConnectionTimeout" = 9999
        }
        Invoke-Sqlcmd @OngoingChangesScriptParams
    }
    catch {
        $ErrorCount++
        echo "Internal Error. The remaining databases will still be processed."
        echo $_.Exception | Format-List -Force
    }
}
Logs:
2020-09-17T19:21:59.3602523Z *** The column [dbo].[FileJob].[GMTOffset] on table [dbo].[FileJob] must be added, but the column has no default value and does not allow NULL values. If the table contains data, the ALTER script will not work. To avoid this issue you must either: add a default value to the column, mark it as allowing NULL values, or enable the generation of smart-defaults as a deployment option.
2020-09-17T19:21:59.4253722Z Updating database (Start)
2020-09-17T19:21:59.4274293Z An error occurred while the batch was being executed.
2020-09-17T19:21:59.4280337Z Updating database (Failed)
2020-09-17T19:21:59.4330894Z ##[error]*** Could not deploy package.
2020-09-17T19:22:00.3399607Z ##[error]PowerShell exited with code '1'.
2020-09-17T19:22:00.7303341Z ##[section]Finishing: Update All Companies
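SqlPackage.exe is a native executable, so a failed deployment does not raise a PowerShell exception that try/catch can see. A hedged sketch (variable names borrowed from the question) is to check $LASTEXITCODE after the call and throw your own error, which the catch block can then log before moving on:
$DatabaseNames | ForEach-Object -Parallel {
    $db     = $_
    $exe    = $using:SqlPackagePath
    $server = $using:ServerInstance
    $user   = $using:DatabaseUsername
    $pass   = $using:DatabasePassword
    $dacpac = $using:DacpacLocation
    try {
        & $exe /Action:Publish /tsn:$server /tdn:$db /tu:$user /tp:$pass /sf:$dacpac /p:BlockOnPossibleDataLoss=False
        if ($LASTEXITCODE -ne 0) {
            throw "SqlPackage failed for $db (exit code $LASTEXITCODE)."
        }
    }
    catch {
        # Log the failure as a pipeline warning and continue with the other databases.
        Write-Host "##vso[task.logissue type=warning]$db - $($_.Exception.Message)"
    }
}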

azure cli not stopping on error in PS script

Boiled down to the minimum, I have a PowerShell script that looks like this:
$ErrorActionPreference='Stop'
az group deployment create -g ....
# Error in az group
# More az cli commands
Even though there is an error in the az group deployment create, it continues to execute beyond the error. How do I stop the script from executing on error?
Normally, the first thing to try is to wrap everything in a try...catch block.
try {
    $ErrorActionPreference = 'Stop'
    az group deployment create -g ....
    # Error in az group
    # More az cli commands
}
catch {
    Write-Host "ERROR: $Error"
}
Aaaaand it doesn't work.
This is when you scratch your head and realize that we are dealing with Azure CLI commands, not Azure PowerShell. They are not native PowerShell commands that would honor $ErrorActionPreference; instead (as bad as it sounds), we have to treat each Azure CLI command independently, as if we were running individual programs (under the hood, the Azure CLI is basically a set of aliases that run Python commands; ironically, most Azure PowerShell commands are just PowerShell wrappers around Azure CLI commands ;-)).
Knowing that the Azure CLI will not throw a terminating error, we instead have to treat it like a program and look at the return code (stored in the variable $LASTEXITCODE) to see whether it was successful. Once we evaluate that, we can throw an error ourselves:
az group deployment create -g ....
if ($LASTEXITCODE) {
    Write-Host "ERROR: in Az Group"
    Throw "ERROR: in Az Group"
}
This can then be placed in a try...catch block to stop the subsequent commands from running:
try {
    az group deployment create -g ....
    if ($LASTEXITCODE) {
        Write-Host "ERROR: in Az Group"
        Throw "ERROR: in Az Group"
    }
    # Error in az group
    # More az cli commands
}
catch {
    Write-Host "ERROR: $Error"
}
Unfortunately this means you have to evaluate $LASTEXITCODE every single time you execute an Azure CLI command.
You may use the automatic variable $?. This contains the result of the last execution, i.e. True if it succeeded or False if it failed: https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.core/about/about_automatic_variables?view=powershell-5.1#section-1
Your code would look something like this:
az group deployment create -g ....
if (!$?) {
    Write-Error "Your error message"
    # Handle your error
}
Unfortunately, as with HAL9256's answer, you will need to add this code every single time you execute an Azure CLI command.
Update - PS Version 7 error trapping for Az CLI (or kubectl)
This behavior changed significantly in PowerShell v7:
https://github.com/PowerShell/PowerShell/issues/4002
https://github.com/PowerShell/PowerShell/pull/13361
I ended up using the following solution instead, which behaves consistently across PowerShell v5 and v7. It has the advantage of accepting pipeline input for commands like kubectl, where stdio can be used for applying configuration etc.
I used script blocks since this integrates nicely into existing scripts without breaking syntax:
# no error
# Invoke-Cli {cmd /c echo hi}
# throws terminating error
# (with psv7 you can now call Get-Error to see details)
# Invoke-Cli {az bad}
# outputs a json object
# Invoke-Cli {az account show} -AsJson
# applies input from pipeline to kubernetes cluster
# Read-Host "Enter some json" | Invoke-Cli {kubectl apply -f -}
function Invoke-Cli([scriptblock]$script, [switch]$AsJson)
{
    $ErrorActionPreference = "Continue"
    $jsonOutputArg = if ($AsJson)
    {
        "--output json"
    }
    $scriptBlock = [scriptblock]::Create("$script $jsonOutputArg 2>&1")
    if ($MyInvocation.ExpectingInput)
    {
        Write-Verbose "Invoking with input: $script"
        $output = $input | Invoke-Command $scriptBlock 2>&1
    }
    else
    {
        Write-Verbose "Invoking: $script"
        $output = Invoke-Command $scriptBlock
    }
    if ($LASTEXITCODE)
    {
        Write-Error "$Output" -ErrorAction Stop
    }
    else
    {
        if ($AsJson)
        {
            return $output | ConvertFrom-Json
        }
        else
        {
            return $output
        }
    }
}
Handling command shell errors in PowerShell <= v5
Use $ErrorActionPreference = 'Stop' and append 2>&1 to the end of the statement.
# this displays regular output:
az account show
# this also works as normal:
az account show 2>&1
# this demonstrates that regular output is unaffected / still works:
az account show -o json 2>&1 | ConvertFrom-Json
# this displays an error as normal console output (but unfortunately ignores $ErrorActionPreference):
az gibberish
# this throws a terminating error like the OP is asking:
$ErrorActionPreference = 'Stop'
az gibberish 2>&1
Background
PowerShell native and non-native streams, while similar, do not function identically. PowerShell offers extended functionality with streams and concepts that are not present in the Windows command shell (such as Write-Warning or Write-Progress).
Due to the way PowerShell handles Windows command shell output, the error stream of a native (non-PowerShell) process cannot by itself throw a terminating error in PowerShell. It will appear in the PowerShell runspace as regular output, even though in the context of the Windows command shell it is indeed writing to the error stream.
This can be demonstrated:
# error is displayed but appears as normal text
cmd /c "asdf"
# nothing is displayed since the stdio error stream is redirected to nul
cmd /c "asdf 2>nul"
# error is displayed on the PowerShell error stream as red text
(cmd /c asdf) 2>&1
# error is displayed as red text, and the script will terminate at this line
$ErrorActionPreference = 'Stop'
(cmd /c asdf) 2>&1
Explanation of 2>&1 workaround
Unless otherwise specified, PowerShell will redirect Windows command shell stdio errors to the console output stream by default. This happens outside the scope of PowerShell. The redirection applies before any errors reach the PowerShell error stream, making $ErrorActionPreference irrelevant.
The behavior changes when you explicitly redirect the Windows command shell error stream to another location in the PowerShell context. As a result, PowerShell is forced to remove the stdio error redirection, and the output becomes visible to the PowerShell error stream.
Once the output is on the PowerShell error stream, the $ErrorActionPreference setting will determine the outcome of how error messages are handled.
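For example, a sketch combining $ErrorActionPreference = 'Stop', the 2>&1 redirection, and try/catch (the resource group and template file names are made up, and this assumes the CLI writes its failure text to stderr):
# Sketch: a failing az command becomes a terminating error that the catch block receives.
$ErrorActionPreference = 'Stop'
try {
    az group deployment create -g myRg --template-file azuredeploy.json 2>&1
    # more az cli commands would go here
}
catch {
    Write-Host "##[error]$_"
    throw   # re-throw so the pipeline task fails
}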
Further info on redirection and PowerShell streams
https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.core/about/about_redirection?view=powershell-7.2
The sample from HAL9256 didn't work for me either, but I did find a workaround:
az group deployment create -g ....
if ($error) {
    Write-Host "ERROR: in Az Group"
    # throw error or take corrective action
    $error.Clear() # optional - clear the error
}
# More az cli commands
I used this to try to deploy a keyvault, but in my environment, soft delete is enabled, so the old key vault is kept for a number of days. If the deployment failed, I'd run a key vault purge and then try the deployment again.
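In code, that purge-and-retry flow looks roughly like the sketch below (the resource group, template, and vault names are made up):
az group deployment create -g myRg --template-file keyvault.json
if ($error) {
    Write-Host "Deployment failed; purging the soft-deleted key vault and retrying"
    $error.Clear()
    az keyvault purge --name myKeyVault
    az group deployment create -g myRg --template-file keyvault.json
}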
Building from HAL9256's and sam's answers, add this one-liner after each Azure CLI or PowerShell Az command in a PowerShell script to ensure that the catch block is hit when an error occurs:
if($error){ throw $error }
Then for the catch block, clear the error, e.g.
catch {
    $ErrorMessage = $_.Exception.Message
    $error.Clear()
    Write-Warning "-- Operation Failed: $ErrorMessage"
}

Retrieve something from different Azure subscriptions at the same time

I want to run a PowerShell cmdlet against multiple Azure subscriptions, so I thought about running it within a foreach loop, but it did not work:
$Subscriptions = (Get-AzureRmSubscription).SubscriptionId
foreach ($sub in $Subscriptions)
{
    Select-AzureRmSubscription -Subscription $sub
    # Do the task cmdlet here
}
Actually what it does is run the task against only the last subscription it was able to select.
Are there any better ways to work around this?
Unfortunately the result cannot be exported to a CSV file or a variable because it is displayed under the subscription info, as shown in the following figure.
Try using the Set-AzureRmContext cmdlet to set the subscription:
Get-AzureRmSubscription | ForEach-Object {
    $_ | Set-AzureRmContext
    # do your task
}
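If you also need the results in a variable or a CSV file, a sketch along the same lines is shown below, using Get-AzureRmVM as a stand-in for the actual task cmdlet (which the question doesn't name):
# Sketch: collect per-subscription results into one array and export them,
# instead of letting each iteration print under the subscription info.
$results = Get-AzureRmSubscription | ForEach-Object {
    $sub = $_
    $sub | Set-AzureRmContext | Out-Null
    Get-AzureRmVM | Select-Object @{ n = 'Subscription'; e = { $sub.Name } }, Name, ResourceGroupName
}
$results | Export-Csv -Path .\results.csv -NoTypeInformation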

Run SQL script file from powershell

I am trying to run queries stored in a text file from PowerShell. I use the following to do that:
Invoke-Expression 'sqlcmd -d TestDB -U $user -P $pw -i "E:\SQLQuery1.sql"'
If an error or exception occurs when executing the queries from the .sql file, how can I capture that in my PowerShell script? How can I get the script output?
NOTE: I cannot use Invoke-Sqlcmd.
To answer the question:
"If some error or exception occurred when executing the .sql file, how can I get that into my PowerShell script? How can I get the script output?"
Invoke-Expression returns the output of the expression executed. However, it may only capture STDOUT, not STDERR (I haven't tested, as I don't use this method), so you might not get the actual error message.
From the Help:
The Invoke-Expression cmdlet evaluates or runs a specified string as a command and returns the results of the expression or command
The better route is to use the PowerShell method you already have available - Invoke-SQLCmd is installed if you have installed any of the SQL Server 2008 (or newer) components/tools (like SSMS). If you've got SQL Server 2012, it's very easy: import-module sqlps. For 2008, you need to add a Snap-In, add-pssnapin SqlServerCmdletSnapin. And since you have sqlcmd.exe, the PowerShell components should be there already.
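If Invoke-Sqlcmd really is off the table, a hedged alternative is to call sqlcmd.exe directly instead of going through Invoke-Expression, merge its error stream into the output, and check the exit code; the -b switch makes sqlcmd return a non-zero exit code when a SQL error occurs:
# Sketch: capture both output and errors from sqlcmd.exe and fail on a bad exit code.
$output = & sqlcmd -d TestDB -U $user -P $pw -i "E:\SQLQuery1.sql" -b 2>&1
if ($LASTEXITCODE -ne 0) {
    throw "sqlcmd failed with exit code ${LASTEXITCODE}: $output"
}
$output   # the script output, if you need it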
If all else fails, go the System.Data.SQLClient route:
$Conn=New-Object System.Data.SQLClient.SQLConnection "Server=YOURSERVER;Database=TestDB;User Id=$user;password=$pw";
$Conn.Open();
$DataCmd = New-Object System.Data.SqlClient.SqlCommand;
$MyQuery = get-content "e:\SQLQuery1.sql";
$DataCmd.CommandText = $MyQuery;
$DataCmd.Connection = $Conn;
$DAadapter = New-Object System.Data.SqlClient.SqlDataAdapter;
$DAadapter.SelectCommand = $DataCmd;
$DTable = New-Object System.Data.DataTable;
$DAadapter.Fill($DTable)|Out-Null;
$Conn.Close();
$Conn.Dispose();
$DTable;
With both this and Invoke-SQLCmd, you'll be able to use try/catch to pick up and handle any error that occurs.
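For completeness, a sketch of what that try/catch might look like with Invoke-SQLCmd (server and file names are placeholders):
try {
    Invoke-Sqlcmd -ServerInstance $server -Database 'TestDB' -InputFile 'E:\SQLQuery1.sql' -ErrorAction Stop -Verbose
}
catch {
    Write-Warning "SQL script failed: $($_.Exception.Message)"
    throw   # or handle the error and continue
}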
As seen in this question's answers, there is a cmdlet available in PowerShell (from the SQL Server module) to invoke SQLCMD called, unsurprisingly, Invoke-Sqlcmd.
It's very easy to use for individual files:
Invoke-sqlcmd -ServerInstance $server -Database $db -InputFile $filename
Or groups:
$listOfFiles | % { Invoke-sqlcmd -ServerInstance $server -Database $db -InputFile $_ }
Use the invoke-sqlquery module, available at this website.