Azure pipeline parameter split doesn't work - azure-devops

I have a multi-step Azure pipeline used to trigger the execution of a certain job based on keywords I have in Azure DevOps work items.
The first step executed is a PowerShell script that stores a comma-separated list of strings in a 'validTags' variable:
Write-Host "##vso[task.setvariable variable=validTags]$csTags"
After this step, I correctly see the list formatted as I expect:
string1,string2,string3
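For reference, the step looks roughly like this (the logic that actually builds $csTags from the work item keywords is simplified away here):
- powershell: |
    # Simplified: $csTags would really be assembled from work item keywords
    $csTags = 'string1,string2,string3'
    Write-Host "##vso[task.setvariable variable=validTags]$csTags"
  displayName: 'Collect valid tags'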
The 'validTags' variable is then passed as a parameter to another pipeline in which I should split this list and trigger separate jobs:
- template: run.yml
  parameters:
    tags: $(validTags)
    directory: 'path\to\tests'
    platforms: 'platform1,platform2'
In the 'run' pipeline I defined this 'tags' parameter:
parameters:
- name: tags
  type: string
  default: 'someDefaultValue'
and I try to split the parameter:
- ${{ each t in split(parameters.tags, ',') }}:
  - script: |
      echo 'starting job for ${{ t }}'
but when I execute the pipeline, 't' still contains the full string (string1,string2,string3), not the split values.
I have noticed that if I perform the split on the "platforms" parameter, which is passed to run.yml alongside "tags", it works. So the problem seems related to the fact that I am trying to split a string stored in a runtime variable.
Anyone with a similar issue? Any help on this is much appreciated.
Thanks

For those interested in the outcome of this issue:
I tested several possible alternate solutions, including the use of global variables and group variables, but without success.
I submitted a request to MSFT engineering support to get some insight on this, and their response was:
The pipeline does not support splitting the runtime variable with template syntax ${{ }} currently, and we are not able to find other workarounds to meet your request. Sorry for the inconvenience. Hope you can understand.
So, to overcome the issue I dropped the initially planned split at the pipeline level and instead passed the comma-separated string to the template, where I added the necessary processing in PowerShell.
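For illustration, the in-template processing looks roughly like this (step name and layout simplified):
- powershell: |
    # ${{ parameters.tags }} ends up as the runtime value of validTags
    # (via $(validTags) macro expansion in the script body)
    $tags = "${{ parameters.tags }}" -split ','
    foreach ($t in $tags) {
      Write-Host "starting job for $t"
      # per-tag processing goes here
    }
  displayName: 'Process tags'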
Another option would have been to perform all the operations from within the first PowerShell script step:
- transform the 'run.yml' template into a separate pipeline
- in the script, after getting the tags, loop over their values and trigger the 'run.yml' pipeline, passing the single tag as a parameter (see the sketch below)
I avoided this solution to keep the operations separate and have more control over the execution flow.
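For completeness, a rough sketch of that avoided alternative, calling the Runs REST API from the first script (organization, project, and pipeline id are placeholders; the separate pipeline is assumed to expose a 'tag' parameter, and the api-version may differ for your organization):
# Requires the job to map the OAuth token: env: SYSTEM_ACCESSTOKEN: $(System.AccessToken)
foreach ($tag in ($csTags -split ',')) {
  $body = @{ templateParameters = @{ tag = $tag } } | ConvertTo-Json -Depth 5
  Invoke-RestMethod `
    -Uri "https://dev.azure.com/{organization}/{project}/_apis/pipelines/{pipelineId}/runs?api-version=7.1" `
    -Method Post `
    -Headers @{ Authorization = "Bearer $env:SYSTEM_ACCESSTOKEN" } `
    -ContentType 'application/json' `
    -Body $body
}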

Related

How to select object attribute in ADF using variable

I'm trying to parametrize a pipeline in Azure Data Factory in order to enable certain functionality for multiple environments. The idea is that the current environment is always available through a global parameter. I'd like to use this parameter to look up an array of environments to process data to. Example:
targetEnvs = [{ "dev": ["dev"], "test": ["dev", "test"], "acc": [], "prod": ["acc", "prod"] }]
Then one should be able to select the targetEnv array with something like targetEnvs[environment] or targetEnvs.environment. Subsequently a ForEach is used to execute some logic on these target environments.
I tried setting this up with targetEnvs as a pipeline parameter (with a default value mapping each env directly to targetEnv, as follows: {"dev": ["dev"], "test": ["test"]}). Then I have a Set variable step that takes its value from the targetEnvs parameter.
I'm now looking for a way to use the current environment (stored in a global parameter) instead of hardcoding "dev" in the Set Variable expression, but I'm not sure how to do this.
Using an expression like targetEnvs[environment] there won't even start the pipeline.
Question: how do I select this attribute of the object? Any other suggestions on how to tackle this problem are welcome as well!
(The Python analogy would be to have a dictionary target_envs and take a value from it using the key current_env: target_envs[current_env].)
When I tried to access the object the same way you did, the same error occurred. I set up the parameter targetEnv (the given array) and a global parameter environment with the value dev.
You can use the following dynamic content to access the key value.
@pipeline().parameters.targetEnv[0][pipeline().globalParameters.environment]
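Putting it together (a sketch; the array variable name targetEnvList is assumed): give the Set variable activity the value
@pipeline().parameters.targetEnv[0][pipeline().globalParameters.environment]
and point the ForEach activity's Items at @variables('targetEnvList'). With environment set to test and the mapping from the question, the ForEach then iterates over ["dev", "test"].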

How to capture who ran the pipeline in Azure DevOps

How to capture who ran the build pipeline in Azure DevOps as a variable?
Is there any predefined variable to capture that?
From the docs:
Build.QueuedBy - See "How are the identity variables set?".
Note: This value can contain whitespace or other invalid label characters. In these cases, the label format will fail.
Build.QueuedById - See "How are the identity variables set?".
Yes, it's called "Build.QueuedBy":
https://learn.microsoft.com/en-us/azure/devops/pipelines/build/variables?view=azure-devops&tabs=yaml#build-variables
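A minimal sketch of reading it in a YAML step (the echoed text is illustrative):
steps:
- script: echo "Queued by $(Build.QueuedBy) (id $(Build.QueuedById))"
  displayName: 'Show who queued the run'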

Dynamically set Azure DevOps variable in scripts

I have n variables that I need to assign as Azure DevOps variables in a Release pipeline, and I can't seem to get the syntax right.
The variables may have different names at runtime, such as:
- {guid 1}
- {guid 2}
...
So I won't know them prior to runtime. The problem is that it seems all of the examples of vso[task.setvariable] use static variable names, but I need to set it dynamically.
Here's what should work but doesn't:
Write-Host "##vso[task.setvariable variable=$($myVariable)]$($myValue)"
I've also tried just using [Environment]::SetEnvironmentVariable (with user) and it doesn't seem to persist across two different tasks in the same job.
[Environment]::SetEnvironmentVariable($myVariable, $myValue, 'User')
(Is null in subsequent task)
Is there some way that I can dynamically create release variables that persist between tasks? I've tried to search and found one question on the developer community but no answer to it.
It actually looks like the issue isn't that the variable isn't set, but that after using task.setvariable, the variable will only be available in subsequent tasks (and not the current one).
So I would say this is the best way to set variables in Azure DevOps:
When needing to use variables in the same task/script step, use:
[Environment]::SetEnvironmentVariable(...)
Or just use a variable in PowerShell.
When needing to use variables with multiple steps, use:
$myVariable = "some name"
$myValue = "some value"
# Note that passing in $($variableName) should work with this call
Write-Host "##vso[task.setvariable variable=$($myVariable)]$($myValue)"
# Note that trying to use a Write-Host for $env:myVariable will return empty except in tasks after this one
Write-Host "Setting $($myVariable) to $($myValue)
It works. This is an example from my build task:
$myVariableNewValue = '##vso[task.setvariable variable=myVariable]' + $newValue
Write-Host $myVariableNewValue

Passing value from one task output to other task

I am unable to find the option to pass a value from one task's output to another task in an Azure DevOps pipeline.
I want to pass the value of 'Id', which is an output of one task, to the next task as an input.
You can do this through the Output variables section of the task.
1. Use outputs in the same job
In the Output variables section, give the producing task a reference name. Then, in a downstream step, you can use the form $(<ReferenceName>.<VariableName>) to refer to output variables.
2. Use outputs in a different job
You must use YAML to consume output variables in a different job.
For details, please refer to this document.
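As a sketch in YAML (task and variable names are illustrative), the same-job case:
steps:
- powershell: |
    # isOutput=true exposes the variable under the step's reference name
    Write-Host "##vso[task.setvariable variable=Id;isOutput=true]12345"
  name: Produce
- script: echo "Id from the previous task: $(Produce.Id)"
For a different job, the value is mapped in through dependencies, e.g. $[ dependencies.JobA.outputs['Produce.Id'] ] in the consuming job's variables section.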

How to reference a DAG's execution date inside of a `KubernetesPodOperator`?

I am writing an Airflow DAG to pull data from an API and store it in a database I own. Following best practices outlined in We're All Using Airflow Wrong, I'm writing the DAG as a sequence of KubernetesPodOperators that run pretty simple Python functions as the entry point to the Docker image.
The problem I'm trying to solve is that this DAG should only pull data for the execution_date.
If I were using a PythonOperator (doc), I could use the provide_context argument to make the execution date available to the function. But judging from the KubernetesPodOperator's documentation, it seems that the Kubernetes operator has no argument that does what provide_context does.
My best guess is that you could use the arguments command to pass in a date range, and since it's templated, you can reference it like this:
my_pod_operator = KubernetesPodOperator(
    # ... other args here
    arguments=['python', 'my_script.py', '{{ ds }}'],
    # arguments continue
)
And then you'd get the start date like you'd get any other argument provided to a Python file run as a script, by using sys.argv.
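Concretely, I imagine the receiving script would look roughly like this (my_script.py as above):
import sys

def main():
    # The rendered '{{ ds }}' arrives as the first CLI argument, e.g. '2020-01-01'
    execution_date = sys.argv[1]
    print(f"Pulling data for execution date {execution_date}")

if __name__ == '__main__':
    main()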
Is this the right way of doing it?
Thanks for the help.
Yes, that is the correct way of doing it.
Each Operator has template_fields. All the parameters listed in template_fields can render Jinja2 templates and Airflow macros.
For KubernetesPodOperator, if you check the docs, you will find:
template_fields = ['cmds', 'arguments', 'env_vars', 'config_file']
which means you can pass '{{ ds }}' to any of the four params listed above.
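For example, roughly (the import path varies across Airflow versions, and the image name is just an example):
from airflow.contrib.operators.kubernetes_pod_operator import KubernetesPodOperator

pull_data = KubernetesPodOperator(
    task_id='pull_data',
    name='pull-data',
    namespace='default',
    image='my-image:latest',  # example image
    cmds=['python', 'my_script.py'],
    # env_vars is templated, so Jinja renders it before the pod starts
    env_vars={'EXECUTION_DATE': '{{ ds }}'},
)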