I'm trying to find the best way to check whether a variable is null in a custom condition.
I tried comparing against null, but Azure Pipelines complains with an error if I configure it like this:
and(failed(), ne(variables['Some'], Null))
The following configuration doesn't throw an error; however, when 'Some' is null, the condition evaluates to false, since Null and the string 'Null' are different:
and(failed(), ne(variables['Some'], 'Null'))
I eventually came up with a workaround, though it is not an elegant one. I added a PowerShell task with this script:
if ($env:Some -eq $null) {
Write-Host "##vso[task.setvariable variable=SkipSome]True"
}
and then configured the custom condition:
and(failed(), ne(variables['SkipSome'], 'True'))
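Wired together in YAML, the workaround looks roughly like this (a sketch; the step layout and ./build.sh are illustrative, not my actual pipeline):

steps:
- powershell: |
    # Flag the follow-up step to be skipped when Some has no value
    if ($env:Some -eq $null) {
      Write-Host "##vso[task.setvariable variable=SkipSome]True"
    }
  displayName: 'Check Some'
- script: ./build.sh            # hypothetical step that may fail
  displayName: 'Build'
- script: echo "handling failure for Some"
  displayName: 'On failure'
  # Runs only when an earlier step failed AND Some was set
  condition: and(failed(), ne(variables['SkipSome'], 'True'))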
I expect there is a way to compare with null without the PowerShell task; however, I can't find it in the official documentation.
How do you deal with null in a custom condition in Azure Pipelines?
To deal with null in a custom condition, we should compare against the empty string '' instead of Null or 'Null'.
You can check the String section of the expressions documentation for the details.
So you can configure it like the following:
and(failed(), ne(variables['Some'], ''))
To test it more intuitively, I changed the ne to eq:
and(failed(), eq(variables['Some'], ''))
Then I set the variable to empty on the Variables tab and added an inline PowerShell task with the above condition, as sketched below.
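An equivalent YAML sketch of that test (the original used an inline task in the classic editor; the step name here is illustrative):

- powershell: Write-Host "Some is empty"
  displayName: 'Test condition'
  # Runs when an earlier step has failed AND 'Some' evaluates to empty
  condition: and(failed(), eq(variables['Some'], ''))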
In the log, we can see that the task is executed.
Hope this helps.
I have a multi-step Azure pipeline used to trigger the execution of a certain job based on keywords I have in Azure DevOps work items.
The first step executed is a PowerShell script that stores a comma-separated list of strings into a 'validTags' variable:
Write-Host "##vso[task.setvariable variable=validTags]$csTags"
After this step, I correctly see the list formatted as I expect:
string1,string2,string3
The 'validTags' variable is then passed as a parameter to another pipeline in which I should split this list and trigger separate jobs:
- template: run.yml
  parameters:
    tags: $(validTags)
    directory: 'path\to\tests'
    platforms: 'platform1,platform2'
In the 'run' pipeline I defined this 'tags' parameter:
parameters:
- name: tags
  type: string
  default: 'someDefaultValue'
and I try to split the parameter:
- ${{ each t in split(parameters.tags, ',') }}:
  - script: |
      echo 'starting job for ${{ t }}'
but when I execute the pipeline, 't' still contains the full string (string1,string2,string3), not the split values.
I have noticed that if I perform the split on the "platforms" parameter, which is passed along with "tags" to the run.yml template, it works. So it seems the problem is related to the fact that I am trying to split a string stored in a runtime variable?
Anyone with a similar issue? Any help on this is much appreciated.
Thanks
For those interested in the outcome of this issue:
I tested several possible alternate solutions, including the use of global variables and group variables, but without success.
I submitted a request to MSFT engineering support to get some insight on this and their response is:
The pipeline does not support splitting the runtime variable with template syntax ${{ }} currently, and we are not able to find other workarounds to meet your request. Sorry for the inconvenience. Hope you can understand.
So, to overcome the issue, I removed the split initially planned at the pipeline level and instead passed the comma-separated string to the template, adding the necessary processing there in PowerShell, as sketched below.
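A minimal sketch of that approach inside the template, assuming the comma-separated string arrives in the 'tags' parameter (the step is illustrative, not the exact code used):

- powershell: |
    # The compile-time expression below expands to $(validTags), which the
    # agent resolves at runtime before running this script
    $tags = "${{ parameters.tags }}".Split(',')
    foreach ($t in $tags) {
        Write-Host "starting job for $t"
    }
  displayName: 'Process tags'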
Another option would have been to perform all the operations from within the first PowerShell script step:
- transform the 'run.yml' template into a separate pipeline
- in the script, after getting the tags, loop over their values and trigger the 'run.yml' pipeline, passing the single tag as a parameter.
I avoided this solution to keep the operations separate and have more control over the execution flow.
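For completeness, a hedged sketch of that alternative, assuming 'run.yml' had been turned into a standalone pipeline named 'run' and the Azure DevOps CLI is already authenticated on the agent:

# Loop over the tags and queue one run of the 'run' pipeline per tag
$tags = $csTags.Split(',')
foreach ($t in $tags) {
    az pipelines run --name run --parameters "tags=$t"
}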
In an Argo workflow, I have a loop, and I need to keep running the loop if the output of a previous step has NOT been supplied yet. So far, I couldn't find any way of performing this simple empty/null check. The following "when" expression:
when: "'{{steps.wait-completion.outputs.parameters.result}}' == ''"
never evaluates as expected, because Argo returns the tag as-is (without substituting it) if the value has not been supplied, and then I get:
'{{steps.wait-completion.outputs.parameters.result}}' == ''' evaluated false
Is this a bug or a feature? Any ideas on how I can perform such a check from the "when" expression? I also tried using the "default" tag to set a default value, but it seems to be ignored by the suspend step (another bug, or another feature?).
I would really appreciate some ideas here. Thanks in advance!
What I tried:
when: "'{{steps.wait-completion.outputs.parameters.result}}' == ''"
What I expected:
The when expression above evaluates to true if the output parameter "result" has not been supplied yet.
What I got:
'{{steps.wait-completion.outputs.parameters.result}}' == ''' evaluated false
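For context, a minimal sketch of the steps section being described, with the step and parameter names taken from the question (the templates named here are hypothetical):

- - name: wait-completion
    template: wait-for-result      # hypothetical suspend template
- - name: continue-loop
    template: loop-body            # hypothetical
    # If 'result' was never supplied, the tag stays literal,
    # so the comparison with '' never matches
    when: "'{{steps.wait-completion.outputs.parameters.result}}' == ''"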
I have n variables that I need to assign as Azure DevOps variables in a Release pipeline, and it doesn't seem like I'm getting the syntax right.
The variable names may differ at runtime; for example, they could be:
- {guid 1}
- {guid 2}
...
So I won't know them prior to runtime. The problem is that all of the examples of vso[task.setvariable] seem to use static variable names, but I need to set the name dynamically.
Here's what should work but doesn't:
Write-Host "##vso[task.setvariable variable=$($myVariable)]$($myValue)"
I've also tried just using [Environment]::SetEnvironmentVariable (with the User scope), and it doesn't seem to persist across two different tasks in the same job.
[Environment]::SetEnvironmentVariable($myVariable, $myValue, 'User')
(It is null in the subsequent task.)
Is there some way that I can dynamically create release variables that persist between tasks? I've searched and found one question on the Developer Community, but no answer to it.
It actually looks like the issue isn't that the variable isn't set, but that after using task.setvariable, the variable will only be available in subsequent tasks (and not the current one).
So I would say this is the best way to set variables in Azure DevOps:
When needing to use variables in the same task/script step, use:
[Environment]::SetEnvironmentVariable(...)
Or just use a variable in PowerShell.
When needing to use variables with multiple steps, use:
$myVariable = "some name"
$myValue = "some value"
# Note that passing in $($variableName) should work with this call
Write-Host "##vso[task.setvariable variable=$($myVariable)]$($myValue)"
# Note that trying to use a Write-Host for $env:myVariable will return empty except in tasks after this one
Write-Host "Setting $($myVariable) to $($myValue)
It works. This is an example from my build task:
$myVariableNewValue = '##vso[task.setvariable variable=myVariable]' + $newValue
Write-Host $myVariableNewValue
I am using Kapacitor 1.3 and trying to use the following where node to keep measurements with an empty tag. Nothing passes through, and I get the same result with == ''.
| where(lambda: 'process-cpu__process-name' =~ /^$/)
I can work around this issue by using a default value for missing tags and filtering on this default tag in the following nodes, but I am wondering if there is a better way to structure the initial where statement and avoid the extra node.
| default()
.tag('process-cpu__process-name','system')
| where(lambda: \"process-cpu__process-name\" == 'system' )
Sure it doesn't pass, because this
'process-cpu__process-name'
is a string literal in TICKscript, not a reference to a tag or field, which would be
"process-cpu__process-name"
With the literal, the condition is obviously always false.
It's quite a common mistake, though, especially for someone with previous experience in languages that tolerate both single and double quotes for plain strings. :-)
Also, there is a function in TICKscript lambda expressions called strLength(); see the documentation for the details.
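With that fix, the original where node should work as intended; a sketch (untested, and strLength() may require a newer Kapacitor release than 1.3):

// Double quotes reference the tag, so empty values match the regex
| where(lambda: "process-cpu__process-name" =~ /^$/)
// Alternatively, using strLength() on the tag value
| where(lambda: strLength("process-cpu__process-name") == 0)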
I'm pulling data from a database (Access specifically, using ADO). Then I am preparing the variable for possible insertion into SQL later by replacing any existing single quotes with two single quotes.
Because the value coming from the database is very likely to be null, I test for it, and the replace is only supposed to happen if it is NOT null. The value I'm getting is not registering as null when I test it, but when I try the replace, I get the error:
Method invocation failed because [System.DBNull] does not contain a method named 'Replace'.
Here's the section of code:
$OldAbstract = $rs.Fields.Item("MetaAbstract").value
if($OldAbstract -ne $null) {$OldAbstract = $OldAbstract.Replace("'","''")}
After pulling the value, I print it, and it looks like nothing. I tested it for null, and it says it's not. I tested it for "" and it's not. I even checked to see if it was a space, and it's not. The length is reported as 1... I'm at a loss as to what's going on.
You're testing for the wrong type of null: a PowerShell/.NET null instead of a database null. Try this (untested):
$OldAbstract = $rs.Fields.Item("MetaAbstract").value
if($OldAbstract -isnot [System.DBNull]) {$OldAbstract = $OldAbstract.Replace("'","''")}
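If the value could also be a regular $null (for example, when a record has no field value at all), a combined guard extends the same idea (a sketch, untested):

$OldAbstract = $rs.Fields.Item("MetaAbstract").value
# Guard against both database nulls (DBNull) and regular .NET nulls
if ($null -ne $OldAbstract -and $OldAbstract -isnot [System.DBNull]) {
    $OldAbstract = $OldAbstract.Replace("'","''")
}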