Azure Data Factory continuous deployment using PowerShell

I need to do Continuous Integration and Deployment for my Azure Data Factory (ADF).
For this I have a Visual Studio solution with two projects:
one for the ADF JSON files (linked services, datasets, etc.),
one for the PowerShell script that deploys the ADF into an Azure subscription.
Steps followed
Ran MSBuild on the ADF project and used a Copy Files task to copy the output into $(Build.ArtifactStagingDirectory).
Used a Publish Artifacts task to publish it in VSTS.
Published the PowerShell script as an artifact from a separate build.
Release
In my release I have an Azure PowerShell script which takes these ADF files and deploys them into the Azure subscription. I'm using "Build.ArtifactStagingDirectory" to refer to my ADF files, but I'm getting the error below -
The term 'Build.ArtifactStagingDirectory' is not recognized as the name of a cmdlet, function, script file, or operable program
foreach($file in Get-ChildItem "$(Build.ArtifactStagingDirectory)" -filter "*LinkedService*")
{
New-AzureRmDataFactoryLinkedService -ResourceGroupName "ADFAutomationResource" -DataFactoryName "ADFCICD190218" -Name $file.BaseName -File $file.FullName -Force | Format-List
}
Let me know how to proceed in this case, as I haven't found sufficient documentation explaining it.

Try:
foreach($file in Get-ChildItem $Build.ArtifactStagingDirectory -filter "*LinkedService*")
{
New-AzureRmDataFactoryLinkedService -ResourceGroupName "ADFAutomationResource" -DataFactoryName "ADFCICD190218" -Name $file.BaseName -File $file.FullName -Force | Format-List
}

You're referencing a Build variable in a Release!
I assume you've added your artifacts to the release?
If so, you should be able to refer to them like so:
$(System.DefaultWorkingDirectory)/<Artifact Name>
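For example, adapting the loop from the question, a minimal sketch of the release-side script could look like the below. The artifact alias "ADFBuild" and the "drop" folder are placeholders for your own names; $env:SYSTEM_DEFAULTWORKINGDIRECTORY is the environment-variable form of $(System.DefaultWorkingDirectory), which avoids the macro-vs-PowerShell confusion behind the original error.
# "ADFBuild" and "drop" are placeholders for your artifact alias and folder name
$artifactPath = Join-Path $env:SYSTEM_DEFAULTWORKINGDIRECTORY "ADFBuild/drop"
foreach ($file in Get-ChildItem -Path $artifactPath -Filter "*LinkedService*")
{
    New-AzureRmDataFactoryLinkedService -ResourceGroupName "ADFAutomationResource" -DataFactoryName "ADFCICD190218" -Name $file.BaseName -File $file.FullName -Force | Format-List
}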

Related

Set-AzDataFactoryV2Trigger fails in Azure Powershell Task in Release pipeline but works fine on Powershell in frontend machine

I want to create all the triggers in ADF after the Release pipeline has run successfully. This is because there is a hard limit of 256 parameters for an ARM template.
The idea is that we will delete all the triggers in DEV, TEST, QA and PROD. Our published artifact would contain all the trigger JSON files, from which we can recreate the triggers. The Release pipeline would run a PowerShell script that creates the triggers using Set-AzDataFactoryV2Trigger.
I am able to run the below script correctly on my frontend -
$AllTriggers = Get-ChildItem -Path .
Write-Host $AllTriggers
$AllTriggers | ForEach-Object {
Set-AzDataFactoryV2Trigger -ResourceGroupName "<MyResourceGroupName>" -DataFactoryName "<MyTargetDataFactoryName>" -Name "$_" -DefinitionFile ".\$_.json"
}
In the Azure PowerShell script, the first line has to be changed a little to read all the JSONs from the published artifact -
$AllTriggers = Get-ChildItem -Name -Path "./_TriggerCreations/drop/" -Include *.json
I receive the error below when trying to run this script via the Azure PowerShell task in the release pipeline (you may note that the error text is gibberish) -
The yellow blurred line in the screenshot is the name of the trigger.
I have been stuck on this for some time now. Any help would be highly appreciated.
Regards,
Sree
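Not an exact diagnosis, but a minimal sketch of the release-side loop, assuming the trigger JSONs sit in ./_TriggerCreations/drop/ as described above and using the same placeholder resource group and data factory names. Working from FileInfo objects ($_.BaseName, $_.FullName) avoids any ambiguity about whether the names returned by Get-ChildItem -Name include the .json extension:
$AllTriggers = Get-ChildItem -Path "./_TriggerCreations/drop/" -Filter *.json
$AllTriggers | ForEach-Object {
    # BaseName is the trigger name without the .json extension; FullName is the full path to the definition file
    Set-AzDataFactoryV2Trigger -ResourceGroupName "<MyResourceGroupName>" -DataFactoryName "<MyTargetDataFactoryName>" -Name $_.BaseName -DefinitionFile $_.FullName
}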

Set-AzDataFactoryV2Pipeline cmdlet deploys new pipeline with modified JSON tag values

I am using PowerShell to deploy a pipeline to an Azure Data Factory V2. I am passing a file with the JSON of the pipeline to be deployed.
But, the new pipeline created in the target ADF has a slightly modified JSON compared to the one passed as input. The source type tag under Lookup Activity has the value 'CopySink' instead of the value 'AzureSqlSink' specified in the input JSON file. Thus, Data Factory finds the deployed pipeline invalid.
The pipeline runs alright when I manually correct the tag values using the GUI though.
I have tried the below cmdlets. Both of them seem to have the same outcome.
Set-AzDataFactoryV2Pipeline -ResourceGroupName $DataFactoryResourceGroup -Name $svc.name -DataFactoryName $DataFactoryName -File "$currentPipelinePath" -Force
New-AzDataFactoryV2Pipeline -ResourceGroupName $DataFactoryResourceGroup -Name $svc.name -DataFactoryName $DataFactoryName -File "$currentPipelinePath" -Force
Appreciate any help on this issue. My intention is to automate deployment of ADF pipelines using PowerShell.
This worked when I executed the command from an Admin PowerShell console after upgrading all Az modules to their latest versions.
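For reference, a minimal sketch of that module refresh, assuming the Az modules came from the PowerShell Gallery and the console is elevated:
# Update the Az rollup module (and the Az.* modules it depends on) to the latest version
Update-Module -Name Az -Force
# Confirm which Az.DataFactory version is now available
Get-Module -ListAvailable -Name Az.DataFactory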

How to build and deploy SSAS tabular from VSTS through CI/CD locally and to Azure Analysis services

I am working on an SSAS Tabular project in Visual Studio 2017, and I want to automate build and deployment (and testing) both locally and in Azure Analysis Services. The project is connected to an Azure DevOps project and that works.
But I am struggling a lot when it comes to building and deploying the project in Azure DevOps. I have followed a blog on this topic (https://notesfromthelifeboat.com/post/analysis-services-1-deployment/), and the author has created some PowerShell scripts that work on my computer when I run them locally in Windows PowerShell ISE. But when I tried to create a build pipeline in DevOps using the same PowerShell file in a PowerShell task, it fails. I created some variables and set a reference to the PowerShell file. So far so good. When I tried to run the build I got an error saying:
Import-Module : The specified module 'SqlServer' was not loaded
because no valid module file was found in any module
It seems that PowerShell in DevOps is not able to load the module via Import-Module -Name SqlServer. I have searched the net for a solution, but nothing has worked so far. I have discovered that there is a small difference in the $env:PSModulePath environment variable between PowerShell ISE and the PowerShell CI build task, but I am not sure if that is the problem.
If any of you have experience with this issue, or have a better solution for building and deploying an SSAS Tabular model locally and especially to Azure (maybe some of you have experience with automation) from a build/release pipeline, I would appreciate the help.
Build setup on DevOps
Error from running the build
PowerShell script:
Command: .\deploy_model.ps1 -workspace c:\develop\tabular-automation -environment validation -analysisServicesUsername test_ssas -analysisServicesPassword test_ssas
param(
[Parameter(Mandatory)]
[string]$workspace,
[Parameter(Mandatory)]
[string]$environment,
[Parameter(Mandatory)]
[string]$analysisServicesUsername,
[Parameter(Mandatory)]
[string]$analysisServicesPassword,
[string]$databaseServer = "localhost",
[string]$analysisServicesServer = "localhost"
)
Import-Module -Name SqlServer
$ErrorActionPreference = "Stop"
# Build the model
$msbuild = 'C:\Program Files (x86)\Microsoft Visual Studio\2017\Professional\MSBuild\15.0\Bin\MSBuild.exe'
& "$msbuild" TabularExample.smproj "/p:Configuration=$environment" /t:Clean,Build /p:VisualStudioVersion=14.0
# Copy build outputs and deployment options to deployment directory
$deploymentDir = ".\deployment"
mkdir -Force $deploymentDir
cp "bin\$environment\*.*" $deploymentDir
cp .\deploymentoptions\*.* $deploymentDir
# Update deployment targets with parameters
$template = Get-Content .\deploymentoptions\Model.deploymenttargets
$expandedTemplate = $ExecutionContext.InvokeCommand.ExpandString($template)
$expandedTemplate | Set-Content "$deploymentDir\Model.deploymenttargets"
# Create the deployment script
Microsoft.AnalysisServices.Deployment.exe "$deploymentDir\Model.asdatabase" /s:"$deploymentDir\deploy.log" /o:"$deploymentDir\deploy.xmla" | Out-Default
# Deploy the model
$SECURE_PASSWORD = ConvertTo-SecureString $analysisServicesPassword -AsPlainText -Force
$CREDENTIAL = New-Object System.Management.Automation.PSCredential ($analysisServicesUsername, $SECURE_PASSWORD)
Invoke-ASCmd -InputFile "$workspace\$deploymentDir\deploy.xmla" -Server $analysisServicesServer -Credential $CREDENTIAL
For third-party modules, our solution has been to modify $env:PSModulePath to point to a network location that holds the version of the module we want our build agents to run. I use code like the below. (We also add the relative path to our custom modules stored in the same repo to PSModulePath, but I removed that part of the code since you do not state that you have any custom modules.)
I like this better than constantly running Install-Module because I have better control over which versions of the modules are running, and I do not have to worry about our build boxes constantly communicating with the PowerShell Gallery.
try {
    # Try the standard import first
    Import-Module SQLServer -Force -ErrorAction Stop
}
catch {
    # Fall back to the published copy of the module on the network share
    $networkPath = "\\Network path to Modules\"
    if (!(Test-Path $networkPath)) {
        Write-Error "Can not set env:PSModulePath to the published location on the network" -ErrorAction Stop
    }
    else {
        if (!($env:PSModulePath -like "*;$networkPath*")) {
            $env:PSModulePath = $env:PSModulePath + ";$networkPath"
        }
    }
    # Import again now that the module path includes the network location
    Import-Module SQLServer -Force -DisableNameChecking -ErrorAction Stop
}
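If a network share is not available (for example on Microsoft-hosted agents), the Install-Module approach mentioned above is the usual fallback; a hedged sketch:
# Install SqlServer from the PowerShell Gallery for the agent's user if it is not already present
if (-not (Get-Module -ListAvailable -Name SqlServer)) {
    Install-Module -Name SqlServer -Scope CurrentUser -Force -AllowClobber
}
Import-Module -Name SqlServer -DisableNameChecking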

PowerShell script to deploy ASP.NET Core artifacts to Azure

I need a PowerShell script which will deploy an ASP.NET Core app's artifacts to an Azure website. Searching the internet, I managed to find this script:
param($websiteName, $packOutput)
$website = Get-AzureWebsite -Name $websiteName
Stop-AzureWebsite -Name $websiteName
# get the scm url to use with MSDeploy. By default this will be the second in the array
$msdeployurl = $website.EnabledHostNames[1]
$publishProperties = @{'WebPublishMethod'='MSDeploy';
'MSDeployServiceUrl'=$msdeployurl;
'DeployIisAppPath'=$website.Name;
'Username'=$website.PublishingUsername;
'Password'=$website.PublishingPassword
}
$publishScript = "${env:ProgramFiles(x86)}\Microsoft Visual Studio 14.0\Common7\IDE\Extensions\Microsoft\Web Tools\Publish\Scripts\1.2.0\default-publish.ps1"
. $publishScript -publishProperties $publishProperties -packOutput $packOutput
Start-AzureWebsite -Name $websiteName
I am using it the way it is shown on the screenshot:
But... nothing happens as a result of the msdeploy command execution: no errors, no data deployed...
So, what is the correct way of deploying ASP.NET Core artifacts with PowerShell?
Visual Studio can generate a Windows PowerShell publish script for deploying to a website. The publish script may look like this:
publish script
[cmdletbinding(SupportsShouldProcess=$true)]
param($publishProperties=@{}, $packOutput, $pubProfilePath)
# to learn more about this file visit https://go.microsoft.com/fwlink/?LinkId=524327
try{
if ($publishProperties['ProjectGuid'] -eq $null){
$publishProperties['ProjectGuid'] = 'xxxxxxxx-0260-4800-b864-e9afa92d7fc2'
}
$publishModulePath = Join-Path (Split-Path $MyInvocation.MyCommand.Path) 'publish-module.psm1'
Import-Module $publishModulePath -DisableNameChecking -Force
# call Publish-AspNet to perform the publish operation
Publish-AspNet -publishProperties $publishProperties -packOutput $packOutput -pubProfilePath $pubProfilePath
}
catch{
"An error occurred during publish.`n{0}" -f $_.Exception.Message | Write-Error
}
Alongside it there is a publish module that contains the functions used by the script. For more information about publish scripts for deploying to a website, please refer to this documentation.
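For reference, a hypothetical invocation of such a generated script; the script name, pack output folder and publish profile path below are placeholders rather than values from the question:
# Hypothetical example: the script name, output folder and profile path are placeholders
.\MyWebApp-publish.ps1 -packOutput "C:\build\publish" -pubProfilePath ".\Properties\PublishProfiles\Azure.pubxml"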

Azure Powershell script fails when run through task scheduler

I have a PowerShell script that I wrote to back up a local SQL Server to an Azure blob. It's based on one I took from MSDN, but I added an extra feature to delete any backups that are over 30 days old. When I run it as a user, it works fine. When I add it to Task Scheduler, set to run as me, and manually trigger it, it works fine. (All output is captured in a log file, so I can see that it's all working.) But when it runs from Task Scheduler at night while I'm not logged in (the task is set to run the script as me), it fails. Specifically, it claims my Azure subscription name is not known when I call Set-AzureSubscription. Then it fails when trying to delete the blob with:
Get-AzureStorageBlob : Can not find your azure storage credential. Please set current storage account using "Set-AzureSubscription" or set the "AZURE_STORAGE_CONNECTION_STRING" environment variable.
The script in question:
import-module sqlps
import-module azure
$storageAccount = "storageaccount"
$subscriptionName = "SubName"
$blobContainer = "backup"
$backupUrlContainer = "https://$storageAccount.blob.core.windows.net/$blobContainer/"
$credentialName = "creds"
Set-AzureSubscription -CurrentStorageAccountName $storageAccount -SubscriptionName $subscriptionName
$path = "sqlserver:\sql\servername\SQLEXPRESS\databases"
$alldatabases = get-childitem -Force -path $path | Where-object {$_.name -eq "DB0" -or $_.name -eq "DB1"}
foreach ($db in $alldatabases)
{
Backup-SqlDatabase -BackupContainer $backupUrlContainer -SqlCredential $credentialName $db
}
$oldblobs = Get-AzureStorageBlob -container backup | Where-object { $_.name.Contains("DB") -and (-((($_.LastModified) - $([DateTime]::Now)).TotalDays)) -gt $(New-TimeSpan -Days 30).TotalDays }
foreach($blob in $oldblobs)
{
Write-Output $blob.Name
Remove-AzureStorageBlob -Container "backup" -Blob $blob.Name
}
The backup part of the script works, just not the blob deletion part. It would appear that something is done to the environment when I log in that allows the Azure PowerShell cmdlets to work, but that isn't happening when the command runs at night while I'm not logged in.
Anyone have an idea what that might be?
Task Scheduler is set to run the command with:
Powershell -Command "C:\Scripts\BackupDatabases.ps1" 2>&1 >> "C:\Logs\backup.log"
The Azure PowerShell environment just needs to understand what Azure subscription to work with by default. You probably did this for your own environment, but the task scheduler is running in a different environment.
You just need to add an additional command to the beginning of your script to set the Azure subscription. Something like this:
Set-AzureSubscription -SubscriptionName "<your subscription name>"
The documentation for this command is here. You can also set by SubscriptionID etc. instead of SubscriptionName.
In addition, this article walks through how to connect your Azure subscription to the PowerShell environment.
UPDATE: I messed around and got it working. Try adding a "Select-AzureSubscription" before your Set-AzureSubscription command.
Select-AzureSubscription $subscriptionName
Set-AzureSubscription -SubscriptionName $subscriptionName -CurrentStorageAccountName $storageAccount
The documentation for Select-AzureSubscription is here. If you aren't relying on that storage account being set, you may be able to remove the Set-AzureSubscription command.
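Applied to the script in the question, the top would then look something like this (same placeholder values as in the question):
import-module sqlps
import-module azure
$storageAccount = "storageaccount"
$subscriptionName = "SubName"
# Select the subscription first, then set its current storage account
Select-AzureSubscription $subscriptionName
Set-AzureSubscription -SubscriptionName $subscriptionName -CurrentStorageAccountName $storageAccount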
I was never able to make the PowerShell script work. I assume I could have made it work if I had set the credentials in the environment variable, as the error said, but I instead wrote a little program to do the work for me.
Visit https://github.com/sillyotter/BackupDBToAzure if you need a tool to backup things to azure blobs and delete old leftover backups.
Thanks for the help!