GitHub Actions Environment Variables

I'm sure this must be really simple, but I just can't see it and would appreciate any assistance.
Let's say I have a really simple pipeline like this:
name: Deploy to App Engine
on:
  push:
    branches: [main]
  pull_request:
    branches: [main]
env:
  PROJECT_ID: stankardian
  REGION: europe-west2
  APP_NAME: tootler
jobs:
  deploy:
    name: Test Deployments To App Engine
    runs-on: ubuntu-latest
    steps:
      # Checkout the repo code:
      - name: Checkout repository
        uses: actions/checkout@v3
        with:
          ref: main
What I am trying to do is re-use the same pipeline for multiple deployment scenarios where the deployment steps can be the same, but I need to be able to use different values in those steps.
For example, the APP_NAME above is 'tootler'. Let's say I need to deploy this to dev, test and preprod. For dev the app name would be 'dev-tootler', in test it would be 'test-tootler', but in preprod it might need to be 'preprod-tootler-v4' or some such.
Ideally I would like to set a single variable somewhere to control the environment I'm deploying into; then, depending on the value of that variable, load a range of other environment variables with the specific values pertaining to that environment. The example is grossly simplified, but I might need to load 40 variables for each environment, and each of those might have a different value (but the same env variable name).
In an ideal world I would like to package the env variables and values in the app directory and load the correct file based on the evaluation of the control variable. E.g.
|
|-- dev.envs
|-- test.envs
|-- preprod.envs
along with:
$env_to_load_for = $env:control_variable
load_env_variables_file($env_to_load_for)
In that pseudocode, the value of $env_to_load_for evaluates to the correct filename for the environment I need to work with, then the correct environment variables get loaded.
I have tried running a bash shell script which exports the variables I need, but I'm finding that those variables only exist for the specific step in the pipeline. By the time I list out the environment variables in the next step, they are gone.
Does that make sense? This kind of scenario must be very common, but I can't seem to locate any patterns that explain how to accomplish this. I don't want to go down the route of managing different YAML files per environment when the actions are identical.
Would be very grateful for any assistance.

After a lot of experimentation I think I came up with a good way to achieve this. Posting as an answer in case this information helps someone else in the future.
What I did was:
Add a bash step immediately after the checkout
Use that step to run a shell script, I called the script 'target_selector.sh'
In 'target_selector.sh' I evaluate an existing environment variable which I set at either the job or workflow scope. This is set to either 'dev', 'test' or 'preprod', and is used to set the context for everything in one single, easy-to-manage value.
I then used a case block to dot-source either 'dev.sh', 'test.sh' or 'preprod.sh', depending on the value that was evaluated. I put these files in a /targets folder.
This is where the magic happens, in those .sh files (in the /targets folder), I added the environment variables I need, for that context, using this syntax:
echo "DEPLOY_TARGET=dev" >> $GITHUB_ENV
echo "APP_NAME=dev_appname" >> $GITHUB_ENV
It turns out that this syntax writes the variable into the environment of every subsequent step in the job, which means later steps in the workflow can use those variables.
It's a little bit of a faff, but it works and is clearly extremely powerful.
Hope it helps someone, someday.
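Putting the steps above together, here is a minimal, self-contained sketch of the selector pattern (file names and variable values are illustrative; in a real workflow, GITHUB_ENV is provided by the Actions runner and DEPLOY_ENV would already be set at workflow or job scope, so the simulation lines would be unnecessary):

```shell
#!/usr/bin/env bash
set -euo pipefail

# Simulate the Actions runner so this runs anywhere: GITHUB_ENV is normally
# provided by the runner, and DEPLOY_ENV is set at workflow/job scope.
GITHUB_ENV="${GITHUB_ENV:-$(mktemp)}"
export GITHUB_ENV
DEPLOY_ENV="${DEPLOY_ENV:-dev}"

# Per-environment variable files, normally committed under ./targets/
mkdir -p targets
cat > targets/dev.sh <<'EOF'
echo "DEPLOY_TARGET=dev"    >> "$GITHUB_ENV"
echo "APP_NAME=dev-tootler" >> "$GITHUB_ENV"
EOF

# target_selector.sh logic: dot-source the file named by the control variable
case "$DEPLOY_ENV" in
  dev|test|preprod) source "./targets/${DEPLOY_ENV}.sh" ;;
  *) echo "Unknown DEPLOY_ENV: $DEPLOY_ENV" >&2; exit 1 ;;
esac

# Every KEY=value line appended to $GITHUB_ENV becomes an environment
# variable in all subsequent steps of the same job.
cat "$GITHUB_ENV"
```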

Related

Azure Pipeline root variables in templates

I am trying to use variables defined at the root level of a YAML pipeline inside Azure DevOps templates via the template syntax, but it seems that the variables are not available inside the templates; when adding the steps directly to the pipeline, the exact same thing works perfectly.
So with a pipeline snippet like that
variables:
- name: test
  value: asdf
stages:
- stage:
  jobs:
  - job: test_job
    steps:
    - script: echo "${{ variables.test }}"
    - template: ./test.yaml
And a test.yaml like that
jobs:
- job: test
  steps:
  - script: echo "${{ variables.test }}"
The script inside the test_job job writes out asdf while the job inside the template just resolves to echo "".
Since my understanding of pipeline templates is that they basically get inserted into the main pipeline, this seems like a bug. Any ideas on how to use the root variables via template syntax inside templates, or why this is not working? (Macro syntax is not an option as I need the variable inside a templated condition like ${{ if eq(variables['test'], 'asdf') }})
For security reasons, we only allow you to pass information into
templated code via explicit parameters.
https://docs.microsoft.com/en-us/azure/devops/pipelines/process/templates?view=azure-devops
This means the author of the pipeline using your template needs to
commit changes where they explicitly pass the needed info into your
template code.
There are some exceptions to this, where the variable is statically
defined in the same file or at pipeline compile time, but generally
speaking, it’s probably better to use parameters for everything that
does not involve system-defined read-only dynamic variable and
custom-defined dynamic output variables.
This behavior is by design, check this thread in the developer community
So you can either pass the variable as a parameter to the template or define a centralized variables file to include in the template like here
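A sketch of the parameter route (file and parameter names are illustrative): the root pipeline passes the value in explicitly, and the template declares it as a parameter, which is resolved at template-expansion time, so templated conditions work too.

```yaml
# --- azure-pipelines.yml ---
variables:
- name: test
  value: asdf
stages:
- stage:
  jobs:
  - template: ./test.yaml
    parameters:
      testValue: ${{ variables.test }}

# --- test.yaml ---
parameters:
- name: testValue
  type: string
  default: ''
jobs:
- job: test
  steps:
  - script: echo "${{ parameters.testValue }}"
```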

GitHub actions: env value in branches names (i.e. on push)

I need to run only on branches that end with a specified env variable.
Let's pretend that we are on the feat/project-name branch and this is my workflow.yml:
env:
  PROJECT: project-name
on:
  push:
    # only for branches that end with the given project name
    branches:
      - "**$PROJECT"
The above does not work. Below, hard-coded, it is OK:
env:
  PROJECT: project-name
on:
  push:
    # only for branches that end with the given project name
    branches:
      - "**project-name"
I tried "**${{ env.PROJECT }}" and other configurations, and nothing works.
You can configure env variables at the workflow level, but you can't use them at that level.
According to the documentation (reference 1 and reference 2):
Environment variables (at the workflow level) are available to the steps of all jobs in the workflow.
In your example, the environment variable is used at the workflow level (in the trigger on configuration), not inside a job steps, and the GitHub interpreter doesn't interpolate the value at that level.
You would need to hardcode the value at that level, or receive it as input (${{ inputs.value }}) from another workflow (or from the GitHub API).
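As a sketch of the input route (file and input names are illustrative): the branch filter itself must stay literal, but the project name can still be passed into a reusable workflow as an input and read there via ${{ inputs.project }}.

```yaml
# --- .github/workflows/caller.yml ---
name: Caller
on:
  push:
    branches:
      - "**project-name"   # trigger filters must be literal; env vars are not expanded here
jobs:
  build:
    uses: ./.github/workflows/build.yml
    with:
      project: project-name

# --- .github/workflows/build.yml ---
name: Build
on:
  workflow_call:
    inputs:
      project:
        type: string
        required: true
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - run: echo "Building ${{ inputs.project }}"
```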

How to share Azure Devops pipeline between multiple repos?

So, I have the following situation:
20 git repositories with a microservice in each
A repo with a template pipeline for the standard build process
Each of the 20 repos defines its own pipeline that uses this template with some parameters
On a PR build for any of the 20 repos, it will run its own pipeline as a build validation.
That's all working fine.
But now, I want to add an additional Optional Check to each of the 20 repos which would run a code analysis tool (eg. sonarqube) as part of the PR.
I don't want to add this to the main pipeline as I want it to appear in the PR as a separate optional check which can be skipped or toggled between optional/required.
The only way that I can find to achieve this is to add a CodeAnalysis.yml to each of the 20 repos and create 20 associated pipelines, which is an overhead I'd rather not deal with.
Is there a way that you can have a single pipeline that can be referenced as an optional check in all of these repos?
According to the docs, it should be possible for the shared pipeline to dynamically fetch the code from the right repo using something like this:
- checkout: git://ProjectName/$(Build.Repository.Name)@$(Build.SourceBranch)
But when I try this, the PR is unable to queue the pipeline (unhelpfully, it doesn't give a reason why).
Is there a solution to this?
You need to use templates to design a shared template that runs the code scanning. Essentially, templates are reusable YAML files that you can pass parameters to in order to customise options.
For example, for your use case you could have a template for code scanning, add a job to your pipelines that extends this template, pass any parameters you need (such as the repo to check out), and add conditions to decide when to run the code-scanning check.
I know what you mean; this is not possible (at least not in the PR interface), because when you press the 'Queue' button in the PR build section there isn't even a popup to select parameters; it just uses the default values.
- checkout: git://ProjectName/$(Build.Repository.Name)@$(Build.SourceBranch)
This is also not possible, because runtime variables are not accepted here; they will be read directly as literal strings.
One suggestion is that you specify the parameters manually on the pipeline page and then run the pipeline after setting them.
The reasons for this are:
1. The checkout section is expanded before everything else happens in a pipeline run.
2. The Queue button on the PR page doesn't provide a pop-up window to select parameters.
Your pipeline definition should be like this:
trigger:
- none
parameters:
- name: ProjectName
  default: xxx
- name: RepoName
  default: xxx
- name: BranchName
  default: xxx
pool:
  vmImage: windows-latest
steps:
- checkout: git://${{ parameters.ProjectName }}/${{ parameters.RepoName }}@${{ parameters.BranchName }}
- script: |
    dir
  displayName: 'Run a multi-line script'
Then manually select the parameters every time you run the pipeline.

Dev Ops is changing a variable

I have a DevOps build script that looks like this:
variables:
  MyConnectionString: ''
stages:
- stage: BuildApp
  . . .
  # build app
  . . .
- job: RunApp
  steps:
  - script: |
      echo ${{variables.MyConnectionString}}
The issue that I have is that the connection string is not coming through to the script - I've set it in the edit variables, but it comes through as blank. I tried setting it directly in the script, but when I did that, it appeared to truncate the value at the semi-colon.
I feel like I'm missing something fundamental around how these variables work. Please can someone point me in the right direction?
The way you are referencing the variable is actually the template-expression syntax, used for pipeline parameters or variable templates.
Referencing variables or variable-group values in tasks should use the macro syntax instead: $(MyConnectionString).
So change it to:
echo $(MyConnectionString)
Or use the other format of defining variables (usually used when you're also referring to variable groups)
variables:
- group: myVarGroup
- name: MyConnectionString
  value: connectioncreds
And also make sure to check this article for more info about DevOps variables.
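A minimal corrected sketch (the connection-string value is illustrative). Note the double quotes around the macro in the script: without them, the shell would treat the semicolons in the connection string as command separators, which is a likely cause of the value appearing truncated at the semicolon.

```yaml
variables:
  MyConnectionString: 'Server=myserver;Database=mydb'   # illustrative value
stages:
- stage: BuildApp
  jobs:
  - job: RunApp
    steps:
    - script: |
        echo "$(MyConnectionString)"   # macro syntax, expanded at runtime
```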

How to use complex variables in Azure DevOps pipelines

I'm trying to compose a version in variables and then use it in name: (screenshot omitted)
But it ends up unexpanded (screenshot omitted).
The reason I want it this way is to reuse the version variable later to stamp the assembly and tag a branch.
Is that possible at all?
You can update the pipeline name during the pipeline with a logging command:
- script: echo "##vso[build.updatebuildnumber]$(version)"
I think I found a better way to get a build number in later steps: