Get datetime during deploy in YAML file - deployment

I need to get DateTime during the deployment in a YAML file. The DateTime should be shown as
"startTime": "2017-12-08T00:00:00"
I found this help, but I need to follow the exact DateTime format. I wonder if anyone can help in this case?
-- Added --
I am deploying Data Factory with a YAML file. This StartTime will be the DateTime for the trigger part of the Data Factory pipeline.
I have updated my build pipeline with a variable in build.yml:
variables:
  deployDate: $(Get-Date -Format "YYYYMMDDThhmmssZ")
and inside my deploy.yml file
- task: AzureResourceGroupDeployment@2
  displayName: "Deploy Azure Data Factory Content"
  inputs:
    azureSubscription: ...
    action: ...
    resourceGroupName: ..
    location: ...
    templateLocation: ...
    csmFile: ...
    csmParametersFile: ...
    overrideParameters: >-
      - ...
      -triggerStartTime "$(deployDate)"
    deploymentMode: 'Incremental'
and in adf.content.json, I added
"parameters": {
"triggerStartTime": {
"type": "string"
}
}
"name": "[concat(parameters('factoryName'), '/Trigger')]",
"type": "Microsoft.DataFactory/factories/triggers",
"apiVersion": "...",
"properties": {
"annotations": [],
"runtimeState": "Started",
"pipeline": {
"pipelineReference": {
"referenceName": "...",
"type": "PipelineReference"
},
"parameters": {}
},
"type": "TumblingWindowTrigger",
"typeProperties": {
"frequency": "Hour",
"interval": 1,
"startTime": "[parameters('triggerStartTime')]",
"delay": "00:00:00",
"maxConcurrency": 50,
"retryPolicy": {
"intervalInSeconds": 30
},
"dependsOn": []
}
},
"dependsOn": [
"[concat(variables('factoryId'), '/pipelines/...')]"
]

In the release stage there is a predefined variable named RELEASE_DEPLOYMENT_STARTTIME, which we can use in PowerShell via $(Release.Deployment.StartTime).
In addition, we can define a custom variable.
Note: I use the date format yyyy-MM-dd HH:mm:ss here; you can use other date formats.
Define variables
$date=$(Get-Date -Format "yyyy-MM-dd HH:mm:ss");
Write-Host ("##vso[task.setvariable variable=StartTime]$date")
Output the variable
Write-Host "The value of StartTime is : $($env:StartTime)"
Update 1
Please also check this ticket.

I managed to solve my problem. First of all, I removed all the changes I had made in the YAML files,
and then I updated only adf.content.json:
"parameters": {
"baseTime": {
"type": "string",
"defaultValue": "[utcNow('u')]",
"metadata": {
"description": "Schedule will start one hour from this time."
}
}
}
and updated the variables; I want the trigger to run 15 minutes after the deployment:
"variables": {
"startTime": "[dateTimeAdd(parameters('baseTime'), 'PT15M')]"
}
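For a rough sanity check of what those two expressions resolve to, this PowerShell snippet (not part of the deployment, just a local approximation) mirrors utcNow() plus the ISO 8601 duration PT15M:

    # Local approximation of the ARM expressions, for sanity-checking only.
    $baseTime  = (Get-Date).ToUniversalTime()             # ~ utcNow()
    $startTime = $baseTime.AddMinutes(15).ToString("s")   # ~ dateTimeAdd(baseTime, 'PT15M')
    Write-Host "The trigger startTime would resolve to roughly: $startTime"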


Pass array to ARM Template

I am trying to pass an array from an AzurePowerShell task to an ARM template, but I am not sure how to do this.
I get the error:
"Not valid json" when I attempt to pass it as a string and then use the json() function in ARM, although when I print the JSON out and lint it, it is valid JSON.
"Can't convert string to object" when the template expects an object.
Any help is appreciated.
I have a powershell script that grabs the access policies from my key vault:
$keyVault = Get-AzKeyVault -Name $keyVaultName -ResourceGroupName $resourceGroupName
$keyVaultAccessPolicies = $keyVault.AccessPolicies
$json = $keyVaultAccessPolicies | ConvertTo-Json -Compress
Write-Host "##vso[task.setvariable variable=SAUCE;isOutput=true;]$json"
EDIT
When debugging I can see the value is an array, as it is surrounded by square brackets. However, when it is output in the pipeline, the square brackets are omitted. I find this strange and don't understand why it does that.
However, how do I now pass this to a template?
- task: AzureResourceManagerTemplateDeployment@3
  displayName: 'Provision Key Vault'
  inputs:
    deploymentScope: 'Resource Group'
    azureResourceManagerConnection: ${{ parameters.azureSubscriptionName }}
    subscriptionId: ${{ parameters.azureSubscriptionId }}
    action: 'Create Or Update Resource Group'
    resourceGroupName: '$(Resource.Group.Name)'
    location: '$(Region)'
    templateLocation: 'Linked artifact'
    csmFile: '$(Provisioning.Package.Name)/Templates/key-vault-deploy.json'
    deploymentMode: 'Incremental'
    overrideParameters: >-
      -name "$(KeyVault.Name)"
      -foo "$env:accesspolicyreference_SAUCE"
This is the key vault template:
{
    "$schema": "http://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
        "name": {
            "type": "string"
        },
        "fooBar": {
            "type": "string"
        }
    },
    "variables": {
    },
    "resources": [
        {
            "apiVersion": "2018-02-14",
            "name": "[parameters('name')]",
            "location": "[parameters('location')]",
            "type": "Microsoft.KeyVault/vaults",
            "properties": {
                "accessPolicies": "[json(parameters('fooBar'))]",
                "enabledForDeployment": false,
                "enabledForTemplateDeployment": true,
                "enabledForDiskEncryption": false,
                "enableRbacAuthorization": false,
                "tenantId": "[parameters('tenant')]",
                "sku": {
                    "name": "Standard",
                    "family": "A"
                },
                "enableSoftDelete": false,
                "networkAcls": {
                    "defaultAction": "allow",
                    "bypass": "AzureServices",
                    "ipRules": [],
                    "virtualNetworkRules": []
                }
            },
            "tags": "[variables('tags')]",
            "dependsOn": []
        }
    ],
    "outputs": {}
}
There are several issues here:
Overriding an ARM template parameter with a JSON string can get you into trouble if it is not escaped correctly. I suggest you encode the JSON as a base64 string:
$keyVault = Get-AzKeyVault -Name 'vaultbyenar2htogee'
$keyVaultAccessPolicies = $keyVault.AccessPolicies
$json = $keyVaultAccessPolicies | ConvertTo-Json -Compress
$sauce64 = [Convert]::ToBase64String([System.Text.Encoding]::UTF8.GetBytes($json))
Write-Host "##vso[task.setvariable variable=SAUCE]$sauce64"
There is an ARM template function base64ToJson() that decodes base64-encoded JSON strings directly into objects:
"variables": {
"fooBar": "[base64ToJson(parameters('fooBar'))]"
}
The object that you get from $keyVault.AccessPolicies is not a valid value for the accessPolicies property in the ARM template. In order to make it work, you need to map the values into the correct structure with a copy loop:
{
    "$schema": "http://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
        "name": {
            "type": "string"
        },
        "location": {
            "type": "string"
        },
        "fooBar": {
            "type": "string"
        }
    },
    "variables": {
        "fooBar": "[base64ToJson(parameters('fooBar'))]"
    },
    "resources": [
        {
            "apiVersion": "2018-02-14",
            "name": "[parameters('name')]",
            "location": "[parameters('location')]",
            "type": "Microsoft.KeyVault/vaults",
            "properties": {
                "copy": [{
                    "name": "accessPolicies",
                    "count": "[length(variables('fooBar'))]",
                    "input": {
                        "tenantId": "[subscription().tenantId]",
                        "objectId": "[variables('fooBar')[copyIndex('accessPolicies')].ObjectId]",
                        "permissions": {
                            "keys": "[variables('fooBar')[copyIndex('accessPolicies')].PermissionsToKeys]",
                            "secrets": "[variables('fooBar')[copyIndex('accessPolicies')].PermissionsToSecrets]",
                            "certificates": "[variables('fooBar')[copyIndex('accessPolicies')].PermissionsToCertificates]"
                        }
                    }
                }],
                "enabledForDeployment": false,
                "enabledForTemplateDeployment": true,
                "enabledForDiskEncryption": false,
                "enableRbacAuthorization": false,
                "tenantId": "[subscription().tenantId]",
                "sku": {
                    "name": "Standard",
                    "family": "A"
                },
                "enableSoftDelete": false,
                "networkAcls": {
                    "defaultAction": "allow",
                    "bypass": "AzureServices",
                    "ipRules": [],
                    "virtualNetworkRules": []
                }
            },
            "dependsOn": []
        }
    ],
    "outputs": {}
}
No need to use ;isOutput=true if you will use the pipeline variable in a downstream task within the same job. (Otherwise you need to prefix the variable with the name of the task in which it was set, like: $(myTask.SAUCE)).
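For illustration, in the inline script of a downstream PowerShell task within the same job (assuming the setting task is named accessPolicyReference, which is what the $env:accesspolicyreference_SAUCE reference above suggests), the two forms would be referenced like this:

    # If the variable was set without isOutput=true: reference it by its plain name.
    Write-Host "Same job, no isOutput:   $(SAUCE)"
    # If it was set with isOutput=true: prefix it with the name of the task that set it.
    Write-Host "Same job, with isOutput: $(accessPolicyReference.SAUCE)"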

ADF Pipeline trigger deployment on DevOps

I'm doing some initial ADF deployment from my adf-dev to adf-staging environment. In the MS docs it says:
Deployment can fail if you try to update active triggers. To update active triggers, you need to manually stop them and then restart them after the deployment.
Does this mean I need to turn off my dev or staging triggers pre/post deployment?
Second issue: I need to schedule the same set of triggers to run on different days in dev (Saturday) vs staging (Sunday). Do I need to make a separate set of triggers for each environment, or can I rewrite the trigger schedules for the existing triggers during deployment?
You will need your staging triggers stopped before you start the deployment, and restarted after deployment is complete.
This page has a PowerShell script for stopping triggers: https://learn.microsoft.com/en-us/azure/data-factory/continuous-integration-deployment#updating-active-triggers
Also, you could use the custom parameters configuration file to update your trigger settings: https://learn.microsoft.com/en-us/azure/data-factory/continuous-integration-deployment#triggers
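A minimal PowerShell sketch of that stop/restart pattern, assuming the Az.DataFactory module and placeholder resource group and factory names that you would replace with your own:

    # Pre-deployment: stop every trigger in the target (staging) factory.
    $rg = "my-resource-group"        # assumption: replace with your resource group
    $df = "my-staging-datafactory"   # assumption: replace with your factory name
    $triggers = Get-AzDataFactoryV2Trigger -ResourceGroupName $rg -DataFactoryName $df
    $triggers | ForEach-Object {
        Stop-AzDataFactoryV2Trigger -ResourceGroupName $rg -DataFactoryName $df -Name $_.Name -Force
    }

    # ... run the ARM template deployment here ...

    # Post-deployment: start the triggers again.
    $triggers | ForEach-Object {
        Start-AzDataFactoryV2Trigger -ResourceGroupName $rg -DataFactoryName $df -Name $_.Name -Force
    }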
To parametrise the trigger deployment in the ARM template, here is first a sample weekly trigger that runs on a specific day:
{
    "name": "OnceAWeekTrigger",
    "properties": {
        "annotations": [],
        "runtimeState": "Stopped",
        "pipelines": [],
        "type": "ScheduleTrigger",
        "typeProperties": {
            "recurrence": {
                "frequency": "Week",
                "interval": 1,
                "startTime": "2021-05-25T22:59:00Z",
                "timeZone": "UTC",
                "schedule": {
                    "weekDays": [
                        "Sunday"
                    ]
                }
            }
        }
    }
}
Create an arm-template-parameters-definition.json file as follows:
{
    "Microsoft.DataFactory/factories/triggers": {
        "properties": {
            "typeProperties": {
                "recurrence": {
                    "schedule": {
                        "weekDays": "=:-weekDays:array"
                    }
                }
            }
        }
    }
}
This file specifies that you want to parametrise the schedule's weekDays property.
After running the ADFUtilities export function:
npm run build export c:\git\adf /subscriptions/<subscriptionid>/resourceGroups/datafactorydev/providers/Microsoft.DataFactory/factories/<datafactory_name> "ArmTemplate"
you now get the ARM template with the trigger properties parametrised as follows:
... {
    "name": "[concat(parameters('factoryName'), '/OnceAWeekTrigger')]",
    "type": "Microsoft.DataFactory/factories/triggers",
    "apiVersion": "2018-06-01",
    "properties": {
        "annotations": [],
        "runtimeState": "Stopped",
        "pipelines": [],
        "type": "ScheduleTrigger",
        "typeProperties": {
            "recurrence": {
                "frequency": "Week",
                "interval": 1,
                "startTime": "2021-05-25T22:59:00Z",
                "timeZone": "UTC",
                "schedule": {
                    "weekDays": "[parameters('OnceAWeekTrigger_weekDays')]"
                }
            }
        }
    }, ...
and the parameters file ArmTemplate\ARMTemplateParametersForFactory.json looks as follows:
{
    "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
        "factoryName": {
            "value": "factory_name"
        },
        "OnceAWeekTrigger_weekDays": {
            "value": [
                "Sunday"
            ]
        }
    }
}
You could then create different parameter files for dev and staging with different days of the week by modifying the array value for OnceAWeekTrigger_weekDays.
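For instance, deploying the exported template with an environment-specific parameter file from PowerShell could look like this (the .staging.json file name and the ARMTemplateForFactory.json template name are assumptions; the export generates ARMTemplateParametersForFactory.json, which you would copy once per environment):

    # Deploy the staging factory with the staging parameter file (weekDays = Sunday);
    # point -TemplateParameterFile at the dev copy (weekDays = Saturday) for dev.
    New-AzResourceGroupDeployment `
        -ResourceGroupName "datafactory-staging" `
        -TemplateFile "ArmTemplate\ARMTemplateForFactory.json" `
        -TemplateParameterFile "ArmTemplate\ARMTemplateParametersForFactory.staging.json" `
        -Mode Incremental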

Packer - Powershell pass variables

Currently we are successfully deploying images with Packer (in a build pipeline located in Azure DevOps) within our AWS domain. Now we want to take this a step further and configure a couple of users for future Ansible maintenance. We have written a script and also tried it as an inline PowerShell script, but neither option seems to pick up the variable that is set in the variable group in Azure DevOps; all the other variables are used with success. My code is as follows:
{
    "variables": {
        "build_version": "{{isotime \"2006.01.02.150405\"}}",
        "aws_access_key": "$(aws_access_key)",
        "aws_secret_key": "$(aws_secret_key)",
        "region": "$(region)",
        "vpc_id": "$(vpc_id)",
        "subnet_id": "$(subnet_id)",
        "security_group_id": "$(security_group_id)",
        "VagrantUserpassword": "$(VagrantUserPassword)"
    },
    "builders": [
        {
            "type": "amazon-ebs",
            "access_key": "{{user `aws_access_key`}}",
            "secret_key": "{{user `aws_secret_key`}}",
            "region": "{{user `region`}}",
            "vpc_id": "{{user `vpc_id`}}",
            "subnet_id": "{{user `subnet_id`}}",
            "security_group_id": "{{user `security_group_id`}}",
            "source_ami_filter": {
                "filters": {
                    "name": "Windows_Server-2016-English-Full-Base-*",
                    "root-device-type": "ebs",
                    "virtualization-type": "hvm"
                },
                "most_recent": true,
                "owners": [
                    "801119661308"
                ]
            },
            "ami_name": "WIN2016-CUSTOM-{{user `build_version`}}",
            "instance_type": "t3.xlarge",
            "user_data_file": "userdata.ps1",
            "associate_public_ip_address": true,
            "communicator": "winrm",
            "winrm_username": "Administrator",
            "winrm_timeout": "15m",
            "winrm_use_ssl": true,
            "winrm_insecure": true,
            "ssh_interface": "private_ip"
        }
    ],
    "provisioners": [
        {
            "type": "powershell",
            "environment_vars": ["VagrantUserPassword={{user `VagrantUserPassword`}}"],
            "inline": [
                "Install-WindowsFeature web-server,web-webserver,web-http-logging,web-stat-compression,web-dyn-compression,web-asp-net,web-mgmt-console,web-asp-net45",
                "New-LocalUser -UserName 'Vagrant' -Description 'User is responsible for Ansible connection.' -Password '$(VagrantUserPassword)'"
            ]
        },
        {
            "type": "powershell",
            "environment_vars": ["VagrantUserPassword={{user `VagrantUserPassword`}}"],
            "scripts": [
                "scripts/DisableUAC.ps1",
                "scripts/iiscompression.ps1",
                "scripts/ChocoPackages.ps1",
                "scripts/PrepareAnsibleUser.ps1"
            ]
        },
        {
            "type": "windows-restart",
            "restart_check_command": "powershell -command \"& {Write-Output 'Machine restarted.'}\""
        },
        {
            "type": "powershell",
            "inline": [
                "C:\\ProgramData\\Amazon\\EC2-Windows\\Launch\\Scripts\\InitializeInstance.ps1 -Schedule",
                "C:\\ProgramData\\Amazon\\EC2-Windows\\Launch\\Scripts\\SysprepInstance.ps1 -NoShutdown"
            ]
        }
    ]
}
The "VagrantUserpassword": "$(VagrantUserPassword)" is what is not working, we've tried multiple options but none of them seem to be working.
Any idea's?
Kind regards,
Rick.
Based on my test, the pipeline variables indeed cannot be passed to the PowerShell environment variables.
Workaround:
You could try to use the Replace Tokens task to pass the pipeline value into the JSON file.
Here are the steps:
1. Set the value in the JSON file.
{
    "variables": {
        ....
        "VagrantUserpassword": "#{VagrantUserPassword}#"
    },
2. Use the Replace Tokens task before the script task.
3. Set the value in the pipeline variables.
Then the value could be set successfully.
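If you'd rather not depend on a marketplace task, a minimal inline PowerShell step can do the same token replacement (the file name here is an assumption, and this presumes the variable is available as an environment variable, i.e. it is not a secret variable or has been mapped explicitly):

    # Replace the #{VagrantUserPassword}# token in the Packer template with the pipeline value.
    (Get-Content "packer-template.json" -Raw) `
        -replace "#\{VagrantUserPassword\}#", $env:VAGRANTUSERPASSWORD |
        Set-Content "packer-template.json"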
On the other hand, I also found some issues in your sample file.
In "environment_vars": ["VagrantUserPassword={{user `VagrantUserPassword`}}"], the VagrantUserPassword inside the {{user ...}} reference needs to be replaced with VagrantUserpassword (["VagrantUserPassword={{user `VagrantUserpassword`}}"]), since that is the name defined in the variables block.
Note: this is case sensitive.
You also need to use $Env:VagrantUserPassword in place of $(VagrantUserPassword) inside the provisioner scripts.
For example:
"inline": [
"Write-Host \"Automatically generated aws password is: $Env:VagrantUserPassword\"",
"Write-Host \"Automatically generated aws password is: $Env:VAR5\""
]
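On that last point, note that New-LocalUser takes the account name via -Name and expects a SecureString for -Password, so the first inline provisioner line would need something along these lines (a sketch, not tested against your image):

    # Create the Ansible service account from the value passed in via environment_vars.
    $securePwd = ConvertTo-SecureString $Env:VagrantUserPassword -AsPlainText -Force
    New-LocalUser -Name "Vagrant" -Description "User is responsible for Ansible connection." -Password $securePwd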

Azure Data Factory V2: Custom Activity inside a If Condition activity

I'm working on an Azure Data Factory V2 pipeline, but I am having a problem when I try to execute a "Custom Activity" inside an "If Condition" activity.
If I try to test my pipeline with the "Test Run" button in ADF's web interface, this error appears:
{"code":"BadRequest","message":"Activity PPL_ANYFBRF01 failed: Invalid linked service reference. Name: LNK_BATCH_AZURE","target"...}
I'm sure that there is no error in the linked service reference's name. If I create a "Custom Activity" directly in my pipeline, it's working.
I think it could be a syntax error in my activity, but I can't find it.
Here is my "If Condition" activity's JSON template (the expression "@equal(0,0)" is just for testing purposes):
{
    "name": "IfPointComptageNotExist",
    "type": "IfCondition",
    "dependsOn": [
        {
            "activity": "PointComptage",
            "dependencyConditions": [
                "Succeeded"
            ]
        },
        {
            "activity": "SousPointComptage",
            "dependencyConditions": [
                "Succeeded"
            ]
        }
    ],
    "typeProperties": {
        "expression": {
            "value": "@equal(0,0)",
            "type": "Expression"
        },
        "ifTrueActivities": [
            {
                "type": "Custom",
                "name": "CustomActivityTest",
                "linkedServiceName": {
                    "referenceName": "LNK_BATCH_AZURE",
                    "type": "LinkedServiceReference"
                },
                "typeProperties": {
                    "command": "Batch.exe",
                    "resourceLinkedService": {
                        "referenceName": "LNK_BLOB_STORAGE",
                        "type": "LinkedServiceReference"
                    },
                    "folderPath": "/test/app/"
                }
            }
        ]
    }
},
Thank you in advance for your help.
The problem is now solved. I recreated the pipeline and it is working now.
Regards,
Julien.

How to fix Data Lake Analytics script

I would like to use Azure Data Factory with Azure Data Lake Analytics as the action, but without success.
This is my pipeline script:
{
    "name": "UsageStatistivsPipeline",
    "properties": {
        "description": "Standardize JSON data into CSV, with friendly column names & consistent output for all event types. Creates one output (standardized) file per day.",
        "activities": [{
            "name": "UsageStatisticsActivity",
            "type": "DataLakeAnalyticsU-SQL",
            "linkedServiceName": {
                "referenceName": "DataLakeAnalytics",
                "type": "LinkedServiceReference"
            },
            "typeProperties": {
                "scriptLinkedService": {
                    "referenceName": "BlobStorage",
                    "type": "LinkedServiceReference"
                },
                "scriptPath": "adla-scripts/usage-statistics-adla-script.json",
                "degreeOfParallelism": 30,
                "priority": 100,
                "parameters": {
                    "sourcefile": "wasb://nameofblob.blob.core.windows.net/$$Text.Format('{0:yyyy}/{0:MM}/{0:dd}/0_647de4764587459ea9e0ce6a73e9ace7_2.json', SliceStart)",
                    "destinationfile": "$$Text.Format('wasb://nameofblob.blob.core.windows.net/{0:yyyy}/{0:MM}/{0:dd}/DailyResult.csv', SliceStart)"
                }
            },
            "inputs": [{
                "type": "DatasetReference",
                "referenceName": "DirectionsData"
            }],
            "outputs": [{
                "type": "DatasetReference",
                "referenceName": "OutputData"
            }],
            "policy": {
                "timeout": "06:00:00",
                "concurrency": 10,
                "executionPriorityOrder": "NewestFirst"
            }
        }],
        "start": "2018-01-08T00:00:00Z",
        "end": "2017-01-09T00:00:00Z",
        "isPaused": false,
        "pipelineMode": "Scheduled"
    }
}
I have two parameter variables, sourcefile and destinationfile, which are dynamic (the path is built from the date).
Then I have this ADLA script for execution.
REFERENCE ASSEMBLY master.[Newtonsoft.Json];
REFERENCE ASSEMBLY master.[Microsoft.Analytics.Samples.Formats];

USING Microsoft.Analytics.Samples.Formats.Json;

@Data =
    EXTRACT
        jsonstring string
    FROM @sourcefile
    USING Extractors.Tsv(quoting:false);

@CreateJSONTuple =
    SELECT
        JsonFunctions.JsonTuple(jsonstring) AS EventData
    FROM
        @Data;

@records =
    SELECT
        JsonFunctions.JsonTuple(EventData["records"], "[*].*") AS record
    FROM
        @CreateJSONTuple;

@properties =
    SELECT
        JsonFunctions.JsonTuple(record["[0].properties"]) AS prop,
        record["[0].time"] AS time
    FROM
        @records;

@result =
    SELECT
        ...
    FROM @properties;

OUTPUT @result
TO @destinationfile
USING Outputters.Csv(outputHeader:false, quoting:true);
Job execution fails with an error.
EDIT:
It seems that Text.Format is not evaluated and is passed into the script as a literal string. The Data Lake Analytics job detail then shows this:
DECLARE @sourcefile string = "$$Text.Format('wasb://nameofblob.blob.core.windows.net/{0:yyyy}/{0:MM}/{0:dd}/0_647de4764587459ea9e0ce6a73e9ace7_2.json', SliceStart)";
In your code sample, the sourcefile parameter is not defined the same way as destinationfile. The latter appears to be correct, while the former is not.
The whole string should be wrapped inside $$Text.Format() for both:
"paramName" : "$$Text.Format('...{0:pattern}...', param)"
Also consider passing only the formatted date like so:
"sliceStart": "$$Text.Format('{0:yyyy-MM-dd}', SliceStart)"
and then doing the rest in U-SQL:
DECLARE @sliceStartDate DateTime = DateTime.Parse(@sliceStart);
DECLARE @path string = String.Format("wasb://path/to/file/{0:yyyy}/{0:MM}/{0:dd}/file.csv", @sliceStartDate);
Hope this helps