When storing an array in an Azure Pipelines YAML parameter, I normally declare it with type object, something like this:
parameters:
- name: versions
  type: object
  default:
  - 3.0.1
  - 3.0.2
  - 3.0.3
But now the issue is that I don't know this set of versions ahead of time and have to get it from a command.
So I was thinking something like this:
parameters:
- name: versions
  type: object
  default: $npm view #get-versions
But it does not seem to work. Does anyone know how to get values from a command into a parameter in a YAML pipeline? I appreciate it so much!
You can't. Parameters need to be pre-defined and cannot be dynamically generated.
The ugly workaround would be to write an application that parses and updates the pipeline YAML whenever a new version of your package is published. An alternative is to just make the parameter a free-form text field with a reasonable default.
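For example, a free-form string parameter with a sensible default might look like the sketch below (the default value is just an illustration; whoever queues the run pastes in whatever the npm command returned):
parameters:
- name: versions
  type: string
  default: '3.0.3'   # overridden at queue time, e.g. with the output of "npm view"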
Is it possible to specify a conditional expression in the withParam of a template step? For example, I have a very typical parallelization case that works fine:
#...
templates:
- name: my-template
  steps:
  - - name: make-list
      template: makes-a-list
  - - name: consume-list
      template: process-list-element
      arguments:
        parameters:
        - name: param-name
          value: "{{item}}"
      withParam: "{{steps.make-list.outputs.result}}"
What I'd like is for consume-list to use a different list of "items" to process in parallel, if one is specified. I've tried several variations of this:
withParam: "{{= workflow.parameters.use-this-list == '' ? steps.make-list.outputs.result : workflow.parameters.use-this-list }}"
where the workflow-level parameter use-this-list can be given as a JSON list (e.g., '["item1","item2"]') or an empty string, in which case I'd like the template to use the output of the make-list step.
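For reference, this is roughly how that expression sits in the consume-list step (same step and template names as in the working example above; only the withParam line changes):
- - name: consume-list
    template: process-list-element
    arguments:
      parameters:
      - name: param-name
        value: "{{item}}"
    withParam: "{{= workflow.parameters.use-this-list == '' ? steps.make-list.outputs.result : workflow.parameters.use-this-list }}"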
I've tried that conditional expression many different ways -- using different quoting, wrapping different parts in toJson, etc. -- but Argo never seems to be able to understand it.
Is it possible to do something like this? FWIW I haven't found any examples anywhere that attempt this sort of thing.
I am implementing CI/CD for my application. I want to start the application automatically using pm2, but I am getting a syntax error on line 22.
This is my YAML file:
This is the error I am getting on GitHub:
The syntax problem here is related to how you used the - symbol.
With GitHub Actions, you need at least a run or uses field for each step inside your job, at the same level as the name field (which is not mandatory); otherwise the GitHub interpreter will return an error.
Here, from line 22, you used something like this:
- name: ...
- run: ...
- run: ...
- run: ...
So there are two problems:
First, the name and the run fields aren't at the same YAML level.
Second, your step with the name field doesn't have a run or uses field associated with it (you need at least one of them).
The correct syntax should be:
- name: ...
run: ...
- run: ...
- run: ...
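For example, a filled-in version of those steps might look like the sketch below (the runner label, app name, entry point, and pm2 commands are assumptions, adjust them to your app):
jobs:
  deploy:
    runs-on: self-hosted            # assumption: pm2 runs on your own server
    steps:
      - uses: actions/checkout@v4
      - name: Install dependencies
        run: npm ci
      - name: Start the application with pm2
        run: pm2 restart my-app || pm2 start app.js --name my-app   # hypothetical app name and entry point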
Reference about workflow syntax
I have a file called sftp.yaml that I downloaded through helm inspect.
There is a parameter in that sftp.yaml file:
sftp:
  allowedMACs: "hmac-sha2-512"
  allowedCiphers: aes256-ctr
Now, if I install the corresponding Helm chart after commenting out the entire "allowedMACs" line in the custom values file (sftp.yaml), Helm merges sftp.yaml with the chart's values.yaml and ends up using the "allowedMACs" value from values.yaml.
However, what I want is that if the "allowedMACs" line is commented out in the "sftp.yaml" custom values file, then the env variable should not be set at all, or should be set to null.
Presently my deployment file's env section looks like this:
- name: MACs
  value: {{ default "" .Values.sftp.allowedMACs | quote }}
You need to either override the value (with a new one) or unset it. If you only comment out the line, you are doing neither, so the default value is used.
Basically you are looking to unset a default value. As per the banzaicloud example, this can be done like so:
helm install stable/chart-name --set sftp.allowedMACs=null
You can also use an override values file in a similar way:
sftp:
  allowedMACs: null
  allowedCiphers: aes256-ctr
This has been available in Helm since version 2.6. If you'd like more in-depth information, you can review the issue and the subsequent PR that introduced the feature.
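Note that with your current template, a null value still renders the env entry, just with an empty string (because of default ""). If the goal is to omit the env var entirely and you can edit the chart template, one option is to guard the entry with an if; this is only a sketch, not your actual chart:
env:
{{- if .Values.sftp.allowedMACs }}
- name: MACs
  value: {{ .Values.sftp.allowedMACs | quote }}
{{- end }}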
Yeah, I think Helm retrieves values from all values files, so if allowedMACs is in one of them it'll get populated. If this parameter is only affected by the sftp.yaml file, should it really belong only there, and would it make sense to remove it from the main values.yaml?
I have an environment.yml, shown below. I would like to read out the content of the name variable (core-force) and set it as the value of a global variable in my azure-pipeline.yml file. How can I do it?
name: core-force
channels:
- conda-forge
dependencies:
- click
- Sphinx
- sphinx_rtd_theme
- numpy
- pylint
- azure-cosmos
- python=3
- flask
- pytest
- shapely
In my azure-pipeline.yml file I would like to have something like:
variables:
  tag: get the value of the name from the environment.yml aka 'core-force'
Please check this example:
File: vars.yml
variables:
  favoriteVeggie: 'brussels sprouts'
File: azure-pipelines.yml
variables:
- template: vars.yml # Template reference
steps:
- script: echo My favorite vegetable is ${{ variables.favoriteVeggie }}.
Please note that variables are simple strings, so if you want to use a list you may need a workaround in PowerShell at the place where you want to consume the values from that list.
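A minimal sketch of that idea, assuming you pack the list into a comma-separated string (the variable name and values here are just illustrative):
variables:
  versionsCsv: '3.0.1,3.0.2,3.0.3'   # hypothetical packed "list"

steps:
- powershell: |
    $versions = "$(versionsCsv)" -split ','
    foreach ($v in $versions) { Write-Host "Processing version $v" }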
If you don't want to use the template functionality shown above, you need to do the following:
- create a separate job/stage
- define a step there that reads the environment.yml file and sets the variable, for instance via the REST API or the Azure CLI (a script-based sketch follows this list)
- create another job/stage and move your current build definition there
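As a concrete illustration of the second bullet, here is a sketch that uses a logging command instead of the REST API; the job and step names are made up, and the sed one-liner assumes a single top-level name: key in environment.yml:
jobs:
- job: readEnv
  steps:
  - bash: |
      # pull "core-force" out of environment.yml
      TAG=$(sed -n 's/^name:[[:space:]]*//p' environment.yml)
      echo "Detected environment name: $TAG"
      # expose it as an output variable for later jobs
      echo "##vso[task.setvariable variable=tag;isOutput=true]$TAG"
    name: setTag

- job: build
  dependsOn: readEnv
  variables:
    tag: $[ dependencies.readEnv.outputs['setTag.tag'] ]
  steps:
  - script: echo Using tag $(tag)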
I found this topic on the Developer Community where you can read:
Yaml variables have always been string: string mappings. The doc appears to be currently correct, though we may have had a bug when last you visited.
We are preparing to release a feature in the near future to allow you to pass more complex structures. Stay tuned!
But I don't have any more info about this.
Global variables should be stored in a separate template file. Ideally this file would live in a separate repo that other repos can refer to.
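A sketch of that setup, assuming a hypothetical templates repository called pipeline-templates that contains the vars.yml shown above:
resources:
  repositories:
  - repository: templates
    type: git
    name: MyProject/pipeline-templates   # hypothetical project/repo

variables:
- template: vars.yml@templates

steps:
- script: echo My favorite vegetable is ${{ variables.favoriteVeggie }}.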
Here is another answer for this
I am having trouble getting a deployment job in a template to expand a variable it is given via a parameter. I've used some shorthand below.
If you want to see the code, there is a prototype that shows the problem at https://github.com/ausfestivus/azureDevOpsPrototypes
The pipeline looks like this:
stage00
  buildjob00
    task produces output vars (name: taskName.VAR_NAME)
  buildjob01
    task is able to reference the variable and retrieve/display its value via
    dependency notation: [dep.buildjob00.taskName.VAR_NAME]
  template:
    parameters:
      bunchOfVarsAsSequenceFormat:
        var1: [dep.buildjob00.taskName.VAR_NAME]
        var2: [dep.buildjob00.taskName.VAR_NAME]
  template contains:
    buildjob02
      this build job sees the variables' values fine
    deploymentjob00
      this deploy job sees the variable names, but their values are empty
Apologies if this is not well explained; hopefully the linked prototype illustrates it better than the pseudocode above.
It's a great help that you shared your YAML scripts here! Otherwise it would be too difficult to understand your structure. :-)
To display the variable in tmpl: deploy, you need to change its corresponding dependsOn to job00, rather than templateJob.
- deployment: templateDeploy
  displayName: 'tmpl: deploy'
  continueOnError: false
  dependsOn: job00
Then you would see the value displayed successfully.