Azure Synapse deployment failing - PowerShell

I am trying to deploy SQL files to an Azure Synapse Analytics dedicated SQL pool using a PowerShell script in an Azure DevOps pipeline.
I have a folder with SQL files. After defining an array of files, I run a foreach loop over the array and call Invoke-Sqlcmd to deploy each file, but the first SQL file gets deployed (the object is created) and then I get this error:
Msg 104309, Level 16, State 1, Line 1 There are no batches in the input script.
Below is my piece of code:
$files='$(Build.SourcesDirectory)\folder1\'
foreach ($s in $files)
{
    Invoke-Sqlcmd -ServerInstance $(server) -Database $(db) -InputFile $s -Credential $(cred)
}

Azure Synapse Analytics dedicated SQL pool scripts are sensitive to empty batches; for example, this script:
-- xx
GO
would generate the same error:
Msg 104309, Level 16, State 1, Line 1 There are no batches in the input script.
However, the truth is you get the same error in Synapse Studio and SQL Server Management Studio (SSMS), so I suggest you run through your scripts and use the Parse button (Ctrl+F5) in SSMS, which parses the script but does not execute it. This will help you track down your error.
In summary: don't have batches that don't do anything.
I was able to get a simple local example working by using dir (an alias for Get-ChildItem) and the FullName property to get each file's full path:
$files = dir "D:\temp\Powershell\*.sql"
foreach ($s in $files) {
    #$s.Name
    Invoke-Sqlcmd -ServerInstance '.\sql2019x' -Database 'tempdb' -InputFile $s.FullName
}
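Applied back to the pipeline snippet from the question, a minimal sketch (assuming the same folder1 path and the server, db and cred pipeline variables) could look like this:
# Enumerate only the .sql files in the folder and deploy each one by its full path
$files = Get-ChildItem -Path '$(Build.SourcesDirectory)\folder1' -Filter '*.sql'
foreach ($s in $files)
{
    Invoke-Sqlcmd -ServerInstance $(server) -Database $(db) -InputFile $s.FullName -Credential $(cred)
}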

Related

How to catch unexpected error running Liquibase in PowerShell

I have a small CI/CD script written in PowerShell, but I don't know how to stop it if I get an unexpected error running Liquibase inside the script. All the scripts are in SQL and work (preconditions are in place where I need them), but I want more control over the CI/CD script. Right now, if the script gets an exception, it continues execution. The script updates several schemas, and some of them influence each other, so the order is important.
#update first scheme - ETL (tables, global temp tables, packages)
.\liquibase update --defaults-file=import.properties
#how to stop this script, if I get unexpected error running Liquibase?
#update second scheme - data (only tables and roles for data)
.\liquibase update --defaults-file=data.properties
#update third scheme - views, tables and other for export data
.\liquibase update --defaults-file=export.properties
Have you tried this?
$result = Start-Process -FilePath 'path to liquibase' -ArgumentList "your liquibase arguments go here" -Wait -PassThru
if ($result.ExitCode -ne 0) {
    Write-Host 'something went wrong'
}
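Alternatively, since liquibase is invoked as a native command, you can check $LASTEXITCODE after each call and stop before the dependent schemas run. A minimal sketch, assuming liquibase returns a nonzero exit code when an update fails:
#update first scheme - ETL (tables, global temp tables, packages)
.\liquibase update --defaults-file=import.properties
if ($LASTEXITCODE -ne 0) {
    Write-Host 'Liquibase update for import.properties failed - stopping.'
    exit $LASTEXITCODE
}

#update second scheme - data (only tables and roles for data)
.\liquibase update --defaults-file=data.properties
if ($LASTEXITCODE -ne 0) {
    Write-Host 'Liquibase update for data.properties failed - stopping.'
    exit $LASTEXITCODE
}

#update third scheme - views, tables and other for export data
.\liquibase update --defaults-file=export.properties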

Set-AzDataFactoryV2Trigger fails in an Azure PowerShell task in the Release pipeline but works fine in PowerShell on the frontend machine

I want to create all the triggers in ADF after the Release pipeline has run successfully. This is because ARM templates have a hard limit of 256 parameters.
The idea is that we will delete all the triggers in DEV, TEST, QA and PROD. In our published artifact we would have all the JSON trigger files, which we can use to create the triggers. The Release pipeline would run a PowerShell script and create the triggers using Set-AzDataFactoryV2Trigger.
I am able to run the below script correctly on my frontend machine -
$AllTriggers = Get-ChildItem -Path .
Write-Host $AllTriggers
$AllTriggers | ForEach-Object {
    Set-AzDataFactoryV2Trigger -ResourceGroupName "<MyResourceGroupName>" -DataFactoryName "<MyTargetDataFactoryName>" -Name "$_" -DefinitionFile ".\$_.json"
}
In the Azure PowerShell script, the first line has to be changed a little to read all the JSONs from the published artifact -
$AllTriggers = Get-ChildItem -Name -Path "./_TriggerCreations/drop/" -Include *.json
I receive an error (screenshot not included here) when trying to run this script via the Az PowerShell task in the release pipeline; you may note that the error text is gibberish, and the yellow blurred line in the screenshot is the name of the trigger.
Stuck on this for some time now. Any help would be highly appreciated.
Regards,
Sree
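For illustration only, a hedged variant of the loop that enumerates the trigger JSONs from the artifact folder and derives each trigger name from the file name (a sketch, not a confirmed fix for the error above):
# Enumerate the trigger definition files from the published artifact
$AllTriggers = Get-ChildItem -Path "./_TriggerCreations/drop/" -Filter *.json
$AllTriggers | ForEach-Object {
    # BaseName drops the .json extension so only the trigger name is passed to -Name
    Set-AzDataFactoryV2Trigger -ResourceGroupName "<MyResourceGroupName>" -DataFactoryName "<MyTargetDataFactoryName>" -Name $_.BaseName -DefinitionFile $_.FullName
}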

Invoke-Sqlcmd cannot find file when BULK INSERT is used

I'm having difficulty invoking the following PowerShell command, from the command line as well as in a script:
Invoke-Sqlcmd -HostName **HOST** -Database **DATABASE**
-Username **USER** -Password **PWD** -Query "TRUNCATE **THE_TABLE**
BULK INSERT **THE_TABLE** FROM 'D:\Folder\csvfiles\import.csv'
WITH ( FIRSTROW = 2, FIELDTERMINATOR = ',', ROWTERMINATOR = '\n',TABLOCK)"
-ServerInstance "tcp:**HOST**"
I've copied the CSV file to the parent folder, and then to the root. Each time, the command fails with the following error:
Invoke-Sqlcmd : Cannot bulk load. The file "D:\Folder\csvfiles\import.csv" does not exist.
At line:1 char:1
This is all part of a script that Task Scheduler runs on an hourly basis. Often it will run just fine. Today it has been erroring with more frequency.
I'm stumped. Why can't it find a file that is obviously there? Am I overlooking something?
I found out that there was a related, competing scheduled task that was deleting all of the CSV files in the folder at very close to the same time my script ran. This turned out to be a case of bad timing.
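If the competing task cannot be rescheduled, one defensive option is to test for the file before running the command. A minimal sketch, using the path from the question and assuming the script runs on the same machine that SQL Server reads the file from (BULK INSERT resolves the path on the server, not the client):
# Path used by the hourly import (from the question)
$csvPath = 'D:\Folder\csvfiles\import.csv'
if (Test-Path $csvPath) {
    Invoke-Sqlcmd -HostName **HOST** -Database **DATABASE** -Username **USER** -Password **PWD** -ServerInstance "tcp:**HOST**" `
        -Query "TRUNCATE TABLE **THE_TABLE**; BULK INSERT **THE_TABLE** FROM '$csvPath' WITH (FIRSTROW = 2, FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', TABLOCK)"
}
else {
    Write-Host "CSV file not found at $csvPath - skipping this run."
}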

Capture errors within ForEach-Object -Parallel in PowerShell 7

I am trying to execute a PowerShell (7.0) script file using PowerShell 7-Preview which iterates through all the databases, updates them using a DACPAC file, and executes other SQL Server scripts in all the DBs.
It works fine when there are no errors; however, in case of an error while executing the DACPAC file, it gives the error below and stops executing the script further.
##[error] Could not deploy package.
##[error]PowerShell exited with code '1'.
Any pointers on how we can catch the errors gracefully in PowerShell within the -Parallel block and let the script continue with the other databases? A try/catch block does not seem to be working here.
I am new to PowerShell. This PowerShell script is part of a DevOps release pipeline.
#.. Variables are defined here ..#
[string[]]$DatabaseNames = Invoke-Sqlcmd @GetDatabasesParams | select -expand name
[int]$ErrorCount = 0
$DatabaseNames | ForEach-Object -Parallel {
    try
    {
        echo "$_ - Updating database using DACPAC."
        # Locate SqlPackage.exe and publish the DACPAC to the current database
        $SqlPackagePath = dir -Path "C:\Program Files (x86)\Microsoft Visual Studio*" -Filter "SqlPackage.exe" -Recurse -ErrorAction SilentlyContinue | % { $_.FullName } | select -First 1
        & $SqlPackagePath /Action:Publish /tu:$using:DatabaseUsername /tp:$using:DatabasePassword /tsn:$using:ServerInstance /tdn:"$_" /sf:"$using:DacpacLocation" /p:BlockOnPossibleDataLoss=False
        echo "$_ - Updating the scripts."
        $OngoingChangesScriptParams = @{
            "Database" = "$_"
            "ServerInstance" = "$ServerInstance"
            "Username" = "$DatabaseUsername"
            "Password" = "$DatabasePassword"
            "InputFile" = "$SystemWorkingDir\$OngoingChangesScriptLocation"
            "OutputSqlErrors" = 1
            "QueryTimeout" = 9999
            "ConnectionTimeout" = 9999
        }
        Invoke-Sqlcmd @OngoingChangesScriptParams
    }
    catch {
        $ErrorCount++
        echo "Internal Error. The remaining databases will still be processed."
        echo $_.Exception | Format-List -Force
    }
}
Logs-
2020-09-17T19:21:59.3602523Z *** The column [dbo].[FileJob].[GMTOffset] on table [dbo].[FileJob] must be added, but the column has no default value and does not allow NULL values. If the table contains data, the ALTER script will not work. To avoid this issue you must either: add a default value to the column, mark it as allowing NULL values, or enable the generation of smart-defaults as a deployment option.
2020-09-17T19:21:59.4253722Z Updating database (Start)
2020-09-17T19:21:59.4274293Z An error occurred while the batch was being executed.
2020-09-17T19:21:59.4280337Z Updating database (Failed)
2020-09-17T19:21:59.4330894Z ##[error]*** Could not deploy package.
2020-09-17T19:22:00.3399607Z ##[error]PowerShell exited with code '1'.
2020-09-17T19:22:00.7303341Z ##[section]Finishing: Update All Companies
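One common pattern for this situation (a sketch only, not from an accepted answer; it reuses the question's $SqlPackagePath, $DacpacLocation and connection variables) is to remember that SqlPackage.exe is a native executable, so a failed deployment does not raise a PowerShell exception that try/catch can see. Checking $LASTEXITCODE and throwing converts the failure into a catchable error, and a thread-safe collection shared via $using: lets each parallel runspace record its failure without stopping the others:
# Thread-safe bag so the parallel runspaces can record failures independently
$FailedDatabases = [System.Collections.Concurrent.ConcurrentBag[string]]::new()

$DatabaseNames | ForEach-Object -Parallel {
    $dbName = $_   # capture now; inside catch, $_ becomes the error record
    $failed = $using:FailedDatabases
    try {
        & $using:SqlPackagePath /Action:Publish /tu:$using:DatabaseUsername /tp:$using:DatabasePassword /tsn:$using:ServerInstance /tdn:"$dbName" /sf:"$using:DacpacLocation" /p:BlockOnPossibleDataLoss=False
        if ($LASTEXITCODE -ne 0) {
            # SqlPackage.exe does not throw on failure; convert its exit code into a catchable error
            throw "SqlPackage failed for $dbName with exit code $LASTEXITCODE."
        }
    }
    catch {
        $failed.Add($dbName)
        Write-Host "$dbName - error: $($_.Exception.Message). Continuing with the remaining databases."
    }
}

Write-Host "Databases that failed: $($FailedDatabases.Count)"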

Azure Pipelines - logging commands from SQL script

I am trying to log some messages from a T-SQL script running via Azure Pipelines; for instance, before creating a table we check if the table already exists, and if so we simply print a message and skip the table creation.
There are good articles explaining how to access Azure Pipelines logging commands from Bash or PowerShell, for instance this one: https://learn.microsoft.com/en-us/azure/devops/pipelines/scripts/logging-commands?view=azure-devops&tabs=bash
But how do I output messages to the pipeline logs from within the T-SQL statement itself?
I will try RAISERROR (e.g. RAISERROR('Table [dbo].[ReportHistory] already exists!', 0, 1) WITH NOWAIT;), which hopefully works better than the PRINT command. Has anyone had a similar issue, and how did they resolve it?
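For concreteness, the check-and-skip pattern described above might look like the sketch below; the connection variables ($server, $dbname, $user, $pwd) and the ReportHistory column definition are placeholders, and the -Verbose switch is what makes Invoke-Sqlcmd surface the RAISERROR message (see the first answer that follows):
$query = @"
IF OBJECT_ID(N'dbo.ReportHistory', N'U') IS NOT NULL
BEGIN
    -- Severity 0 plus WITH NOWAIT sends the message immediately instead of buffering it
    RAISERROR('Table [dbo].[ReportHistory] already exists!', 0, 1) WITH NOWAIT;
END
ELSE
BEGIN
    CREATE TABLE dbo.ReportHistory (Id int NOT NULL);
END
"@
Invoke-Sqlcmd -Query $query -ServerInstance $server -Database $dbname -Username $user -Password $pwd -Verbose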
You can run your scripts through PowerShell's Invoke-Sqlcmd with the -Verbose switch. Here is a small example for a PowerShell task:
$server = "$(servername)"
$dbname = "$(dbname)"
$u = "$(username)"
$p = "$(password)"
$filename = "testfile.sql"
$filecontent = "RAISERROR('Table [dbo].[ReportHistory] already exists!', 0, 1) WITH NOWAIT;`r`nGO`r`n"
Set-Content -Path $filename -Value $filecontent
Write-Host '##[command] Executing file... ', $filename
#Execution of SQL packet
try
{
    Invoke-Sqlcmd -InputFile "$filename" -ServerInstance $server -Database $dbname -Username "$u" -Password "$p" -QueryTimeout 36000 -Verbose
}
catch
{
    Write-Host "##[error]" $Error[0]
    Write-Host "##[error]----------`n"
}
Result: (screenshot of the pipeline log output omitted)
You can use Azure Pipelines logging commands together with the PRINT and RAISERROR commands.
The logging command syntax ##vso[task..] is a reserved keyword in Azure DevOps pipelines. When ##vso[task..] is found in a task's output stream, the Azure DevOps pipeline will execute the logging command.
So you can output messages to the pipeline logs from within the T-SQL statement itself by wrapping logging commands in PRINT or RAISERROR. See the example below:
PRINT N'##vso[task.logissue type=warning]Table already exists.';
RAISERROR('##vso[task.logissue type=warning]Table [dbo].[ReportHistory] already exists!',0,1);
The messages then show up in the pipeline log (screenshot omitted).