Our team maintains multiple applications. Each application's code repository has a pipeline.yml file that extends the shared template common-pipeline.yml. The pipeline.yml file passes parameter values to common-pipeline.yml through its parameters section:
extends:
  template: common-pipeline.yml@common-pipeline-repo
  parameters:
    appName: 'application-name'
    enableTests: false
    # this value should not be directly configurable from this file but should come from a specific relative folder
In common-pipeline.yml the parameter is declared and accessed like this:
parameters:
- name: enableTests
  type: boolean
  default: true
  # I need the parameter value to come from a specific relative path
Is it possible to add a condition in common-pipeline.yml to ensure that the enableTests parameter value comes from a specific location in the code repository? Currently any developer can change the enableTests parameter value. We want to add a config file to each application's code repository and give edit rights on that subfolder to only a few people on the team. So, in common-pipeline.yml, can I verify that the enableTests value comes from a specific relative path such as the /pipeline/testconfig file?
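One way to sidestep compile-time parameters entirely is to have common-pipeline.yml read the flag from the checked-out repository at runtime instead of trusting the caller. A minimal sketch, assuming the file lives at /pipeline/testconfig and contains just `true` or `false` (the step name and file layout are assumptions, not the asker's actual setup):

```yaml
steps:
- bash: |
    # Read the flag from the protected config file instead of a template parameter
    ENABLE_TESTS=$(cat "$(Build.SourcesDirectory)/pipeline/testconfig")
    echo "##vso[task.setvariable variable=enableTests]$ENABLE_TESTS"
  displayName: 'Read enableTests from /pipeline/testconfig'
```

Because the value now comes from a file in the repository, who can change it is controlled by branch policies or path-level review requirements on that folder rather than by pipeline YAML.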
Is there a way to get the full S3 URL of a Metaflow artifact that was stored in a step? I looked at Metaflow's DataArtifact class but didn't see an obvious S3 path property.
Yep, you can do
Flow('MyFlow')[42]['foo'].task.artifacts.bar._object['location']
where MyFlow is the name of your flow, 42 is the run ID, foo is the step under consideration and bar is the artifact from that step.
Based on @Savin's answer, I've written a helper function to get the S3 URL of an artifact given a run ID and the artifact's name:

from typing import List, Optional, Union

from metaflow import Flow, Metaflow, Run, namespace


class DataArtifactNotFoundError(KeyError):
    """Raised when no artifact with the given name exists in the run."""


def get_artifact_s3url_from_run(
    run: Union[str, Run], name: str, legacy_names: Optional[List[str]] = None, missing_ok: bool = False
) -> str:
    """
    Given a Metaflow run and a key, scans the run's tasks and returns the S3 URL of the artifact with that key.

    NOTE: use get_artifact_from_run() if you want the artifact itself, not the S3 URL to the artifact.

    This allows us to find data artifacts even in flows that did not finish. If we change the name of an
    artifact, we can support backwards compatibility by also passing in the legacy keys. Note: we can avoid
    this by resuming a specific run and adding a node which re-maps the artifact to another key. This will
    assign the run a new ID.

    Args:
        run: a metaflow.Run() object, or a run ID such as "MyFlow/42"
        name: name of the attribute to look for in task.data
        legacy_names: backup names to check
        missing_ok: whether to allow an artifact to be missing

    Returns:
        the S3 URL of the artifact.

    Raises:
        DataArtifactNotFoundError: if the artifact is not found and missing_ok=False
        ValueError: if the flow is not found, or the flow is found but the run ID is not
    """
    namespace(None)  # allows us to access all runs in all namespaces
    legacy_names = legacy_names or []
    names_to_check = [name] + legacy_names
    if isinstance(run, str):
        try:
            run = Run(run)
        except Exception as e:
            # run ID not found; see if we can find the flow and list its runs
            flow_name = run.split(sep="/")[0]
            try:
                flow = Flow(flow_name)
            except Exception as e2:
                raise ValueError(f"Could not find flow {flow_name}. Available flows: {Metaflow().flows}") from e2
            raise ValueError(f"Could not find run ID {run}. Possible values: {flow.runs()}") from e
    for name_ in names_to_check:
        for step_ in run:
            for task in step_:
                if task.artifacts is not None and name_ in task.artifacts:
                    # https://stackoverflow.com/a/66361249/4212158
                    return getattr(task.artifacts, name_)._object["location"]
    if not missing_ok:
        raise DataArtifactNotFoundError(
            f"No data artifact with name {name} found in {run}. Also checked legacy names: {legacy_names}"
        )
I have an environment.yml file, shown below, and I would like to read out the content of the name field (core-force) and set it as the value of a global variable in my azure-pipeline.yml file. How can I do it?
name: core-force
channels:
- conda-forge
dependencies:
- click
- Sphinx
- sphinx_rtd_theme
- numpy
- pylint
- azure-cosmos
- python=3
- flask
- pytest
- shapely
In my azure-pipeline.yml file I would like to have something like:
variables:
  tag: # should get the value of 'name' from environment.yml, i.e. 'core-force'
Please check this example:
File: vars.yml
variables:
  favoriteVeggie: 'brussels sprouts'
File: azure-pipelines.yml
variables:
- template: vars.yml # Template reference
steps:
- script: echo My favorite vegetable is ${{ variables.favoriteVeggie }}.
Please note that variables are simple string:string mappings, so if you want to use a list you may need to do some workaround in PowerShell at the place where you want to use a value from that list.
If you don't want to use the template functionality shown above, you need to:
create a separate job/stage
define a step there to read the environment.yml file and set variables using the REST API or Azure CLI
create another job/stage and move your current build definition there
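The read-and-set steps above could look roughly like this minimal sketch, assuming environment.yml sits at the repository root (the job and step names are placeholders):

```yaml
jobs:
- job: readEnv
  steps:
  - bash: |
      # Grab the value after 'name:' in environment.yml
      TAG=$(awk '/^name:/ {print $2}' environment.yml)
      echo "##vso[task.setvariable variable=tag;isOutput=true]$TAG"
    name: setTag
- job: build
  dependsOn: readEnv
  variables:
    tag: $[ dependencies.readEnv.outputs['setTag.tag'] ]
  steps:
  - script: echo Image tag is $(tag)
```

The isOutput=true flag is what makes the variable visible to the downstream job through the dependencies expression.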
I found this topic on the Developer Community site, where you can read:
Yaml variables have always been string: string mappings. The doc appears to be currently correct, though we may have had a bug when last you visited.
We are preparing to release a feature in the near future to allow you to pass more complex structures. Stay tuned!
But I don't have more info about this.
Global variables should be stored in a separate template file. Ideally this file would live in a separate repo that other repos can reference.
I am having trouble getting a deployment job in a template to expand a variable it is given via a parameter. I've used some shorthand below.
If you want to see the code, there is a prototype that shows the problem at https://github.com/ausfestivus/azureDevOpsPrototypes
The pipeline looks like this:
stage00
  buildjob00
    task produces output vars (name: taskName.VAR_NAME)
  buildjob01
    task is able to reference the variable and retrieve/display its value via dependency notation: [dep.buildjob00.taskName.VAR_NAME]
  template:
    parameters:
      bunchOfVarsAsSequenceFormat:
        var1: [dep.buildjob00.taskName.VAR_NAME]
        var2: [dep.buildjob00.taskName.VAR_NAME]
  template contains:
    buildjob02
      this build job sees the variable values fine
    deploymentjob00
      this deploy job sees the variable names but they contain empty values
Apologies if this is not well explained; hopefully the prototype above illustrates it better than the pseudo code.
It's a great help that you shared your YAML scripts here! Otherwise it would be too difficult to understand your structure :-)
To display the variable in tmpl: deploy, you need to change its corresponding dependsOn to job00, rather than templateJob.
- deployment: templateDeploy
  displayName: 'tmpl: deploy'
  continueOnError: false
  dependsOn: job00
Then you would see that the value displays successfully.
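For reference, a deployment job can also map the upstream output variable explicitly into its own variables block. A minimal sketch, assuming job00 sets the output via a step named taskName (the environment name and variable names are placeholders):

```yaml
- deployment: templateDeploy
  dependsOn: job00
  variables:
    # runtime expression: resolved when the job starts, not at template expansion time
    myVar: $[ dependencies.job00.outputs['taskName.VAR_NAME'] ]
  environment: 'dev'
  strategy:
    runOnce:
      deploy:
        steps:
        - script: echo $(myVar)
```

The $[ ... ] runtime expression is what allows the value to survive into the deployment job, where compile-time ${{ }} expansion would come up empty.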
I am trying to replace variables in appsettings.json and e2e-appsettings.json, which are stored in two different artifacts, in the release pipeline.
With the code below, ONLY appsettings.json is updated; the second line gives an error:
error: NO JSON file matched with specific pattern: e2e/XXX.EndToEnd.Integration.Tests/e2e-appsettings.json
According to the documentation it should be the relative path to the root, but in this case I'm not sure what the root is, as there are two build artifacts.
Further, the download artifact logs say:
2019-04-09T02:33:55.9132583Z Downloaded e2e/XXX.EndToEnd.Integration.Tests/e2e-appsettings.json to D:\a\r1\a\EstimationCore\e2e\XXX.EndToEnd.Integration.Tests\e2e-appsettings.json
The artifact with the other appsettings.json file, which works fine, is a zip file. Logs: Downloading app/app.zip to D:\a\r1\a\EstimationCore\app\app.zip
These are the patterns I have already tried, which all gave the same error:
- NO JSON file matched with specific pattern: e2e/XXX.EndToEnd.Integration.Tests/e2e-appsettings.json.
- NO JSON file matched with specific pattern: **/*e2e-appsettings.json
- NO JSON file matched with specific pattern: d:\a\r1\a\EstimationCore\e2e\XXX.EndToEnd.Integration.Tests\e2e-appsettings.json.
- NO JSON file matched with specific pattern: d:\a\r1\a\EstimationCore\**\**\e2e-appsettings.json.
I'm stuck with a release variable substitution in an Angular project. I have a settings.json file in which I would like to replace some variables:
{
  "test": "variable to replace"
}
I tried to find a custom task on the Marketplace, but all of the tasks seem to work only with XML files such as web.config.
I use the "Replace Tokens" task from the Marketplace: https://marketplace.visualstudio.com/items?itemName=qetza.replacetokens
You define the desired values as variables in the release definition, then add the Replace Tokens task and configure a wildcard path for all target text files in your repository where you want to replace values (for example: **/*.json). The token that gets replaced has a configurable prefix and suffix (defaults are '#{' and '}#'). So if you have a variable named constr, you can put this in your config.json:
{
  "connectionstring": "#{constr}#"
}
and it will deploy the file like:
{
  "connectionstring": "server=localhost,user id=admin,password=secret"
}
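In a YAML pipeline, the task configuration could look roughly like this (input names as of replacetokens@3; treat the root directory and file pattern as assumptions and check the extension's docs for your version):

```yaml
- task: replacetokens@3
  inputs:
    rootDirectory: '$(System.DefaultWorkingDirectory)'
    targetFiles: '**/*.json'
    tokenPrefix: '#{'
    tokenSuffix: '}#'
```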
The IIS Web App Deploy task in VSTS releases has JSON variable substitution under File Transforms & Variable Substitution Options.
Provide a list of JSON files and JSONPath expressions for the variables that need replacing.
For example, to replace the value of ConnectionString in the sample below, you need to define a variable named Data.DefaultConnection.ConnectionString in the build/release definition (or the release definition's environment).
{
  "Data": {
    "DefaultConnection": {
      "ConnectionString": "Server=(localdb)\\SQLEXPRESS;Database=MyDB;Trusted_Connection=True"
    }
  }
}
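The matching variable definition flattens the JSON path with dots; in a YAML pipeline it could look like this (the value shown is a placeholder):

```yaml
variables:
  Data.DefaultConnection.ConnectionString: 'Server=prod-sql;Database=MyDB;Trusted_Connection=True'
```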
You can add a variable in the release Variables tab, and then use a PowerShell task to update the content of your settings.json.
Assume the original content is
{
  "test": "old"
}
and you want to change it to
{
  "test": "new"
}
So you can replace the variable in the JSON file with the steps below:
1. Add variable
Define a variable in the release Variables tab with the value you want to replace with (variable test with value new).
2. Add PowerShell task
Settings for the PowerShell task:
Type: Inline Script.
Inline Script:
# System.DefaultWorkingDirectory is a path like C:\_work\r1\a, so you need to specify where your appsettings.json is.
$path = "$(System.DefaultWorkingDirectory)\buildName\drop\WebApplication1\src\WebApplication1\appsettings.json"
(Get-Content $path) -replace "old", "$(test)" | Out-File $path