I have an Azure DevOps build pipeline that runs a Cypress test. In that Cypress test a test user logs in with an e-mail address and password. On my local system I keep the password in a cypress.env.json file.
On the Azure build pipeline I get the message that the password is undefined, which makes sense, since we added cypress.env.json to .gitignore so it is not exposed in the repo.
I've created an Azure pipeline variable to represent the password: $(ACCOUNT_PASSWORD)
So I think I need to create the cypress.env.json file in the build pipeline and populate it from Azure variables, but I can't figure out how to create a file during a build step.
I have this task:
- task: CmdLine@2
  displayName: 'run Cypress'
  inputs:
    script: |
      npm run ci
So I need to add a task before this that creates the cypress.env.json file with the variable that represents the password:
{
  "ACCOUNT_PASSWORD": "$(ACCOUNT_PASSWORD)"
}
You can add a simple PowerShell task that creates the file:
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      $json = '{
        "ACCOUNT_PASSWORD": "$(ACCOUNT_PASSWORD)"
      }'
      $json | Out-File cypress.env.json
    workingDirectory: '$(Build.SourcesDirectory)'
    pwsh: true # use PowerShell Core so the task also runs on Linux agents
In workingDirectory, set the path where you want the file to be created.
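As an alternative sketch (assuming the secret pipeline variable is named ACCOUNT_PASSWORD), you can skip the file entirely and pass the secret to Cypress through an environment variable: Cypress exposes any variable prefixed with CYPRESS_ via Cypress.env(), and secret pipeline variables are not mapped to the environment automatically, so the explicit env: mapping is needed:
- task: CmdLine@2
  displayName: 'run Cypress'
  inputs:
    script: |
      npm run ci
  env:
    # CYPRESS_ACCOUNT_PASSWORD becomes Cypress.env('ACCOUNT_PASSWORD') inside the test.
    # ACCOUNT_PASSWORD is the assumed name of the (secret) pipeline variable.
    CYPRESS_ACCOUNT_PASSWORD: $(ACCOUNT_PASSWORD)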
If you want to create a JSON file using Azure Pipelines' Bash@3 task for Linux environments, you can do:
steps:
  - task: Bash@3
    inputs:
      targetType: "inline"
      script: |
        echo '{"ACCOUNT_PASSWORD": "$(ACCOUNT_PASSWORD)"}' > server/cypress.env.json
        cd server
        echo $(ls)
        cat cypress.env.json
  - task: Docker@2
    inputs:
      command: buildAndPush
      ...
The cd, echo, and cat commands are not necessary; they are only there to log to the console so you can see where the file is and what it contains. This task may seem trivial to many, but for someone like me with little Bash experience, even something this simple took quite a while to debug and get right.
Azure Pipelines runs this Bash script in the Build.SourcesDirectory, which is the root of your project. In the example above I have a /server folder, which is where I want to put the .json file.
Later on, I run the Docker@2 task, which builds my server. The Docker@2 task looks at my Dockerfile, which has the instruction COPY . ./. I could be wrong here, being new to Docker, but my assumption is that the Azure VM running the pipeline executes the COPY instruction and copies the file from the build context (the Build.SourcesDirectory) into the Docker image.
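For reference, a hedged sketch of such a Docker@2 step (the service connection, repository, and path values are assumptions): as long as buildContext points at the directory where the JSON file was written, the COPY . ./ instruction in the Dockerfile picks it up, because COPY only sees files inside the build context.
- task: Docker@2
  displayName: Build and push server image
  inputs:
    containerRegistry: 'my-registry-connection'       # assumed service connection name
    repository: 'my-server'                           # assumed repository name
    command: buildAndPush
    Dockerfile: 'server/Dockerfile'                   # assumed Dockerfile location
    buildContext: '$(Build.SourcesDirectory)/server'  # contains the generated cypress.env.json
    tags: latest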
Related
I am trying to set a variable in one task and use it in another task, but it doesn't work.
The first task should set the variable "versionTag" so I can use it in the next task as $(versionTag).
Can anyone help me with this?
- task: Bash@3
  displayName: Create version tag
  inputs:
    targetType: 'inline'
    script: |
      versionTag=$(echo "$(Build.BuildNumber)" | tr '+' '-')
      echo "versionTag: ${versionTag}"
      echo "##vso[task.setvariable variable=versionTag]${versionTag}"
- task: Docker@2
  displayName: Create runtime docker image
  inputs:
    containerRegistry: '$(dockerRegistryServiceConnection)'
    repository: '$(imageRepository)'
    command: 'build'
    Dockerfile: '$(dockerfilePath)'
    buildContext: '$(Build.SourcesDirectory)'
    tags: |
      $(tags)
      $(versionTag)
There's a magic command string you can write to the log:
echo "##vso[task.prependpath]c:\my\directory\path"
The path is updated for the scope of the job. If your pipeline has multiple jobs, you need to issue the same command in those jobs as well.
The updated path is available in the next step, not in the step in which you issue the command.
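The same logging-command mechanism applies to task.setvariable: the value only resolves from the following step onwards. A minimal sketch (step names and the example value are assumptions):
- task: Bash@3
  displayName: Set the variable
  inputs:
    targetType: 'inline'
    script: |
      # Becomes $(versionTag) for the *next* steps in this job, not this one.
      echo "##vso[task.setvariable variable=versionTag]1.2.3-example"
- task: Bash@3
  displayName: Read the variable
  inputs:
    targetType: 'inline'
    script: |
      echo "versionTag is $(versionTag)"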
I am trying to use an Azure DevOps deployment job to create a ServiceNow Standard Change Request, register the sys_id with the pipeline, and then use it in subsequent phases of the deployment job. I have a Python utility that creates a Change Request and filters out the sys_id, which I register as a variable. I can access that variable in the same "phase" of the deployment job, but the next phase does not work as expected; the docs don't really cover a use like this other than contrived examples. I think I was following the "Set variables in scripts" documentation. See my pipeline below.
- task: Bash@3
  name: snow
  displayName: Create Standard RFC from Template
  # In this task I register the sys_id of the RFC being created. I want to use this
  # throughout the rest of the deployment job.
  inputs:
    targetType: inline
    script: |
      export sys_id=$(servicenow standard create template $(std_tmpl_sys_id) --query="result.sys_id.value")
      echo "##vso[task.setvariable variable=rfc_sys_id;isoutput=true]$sys_id"
  env:
    SNOW_USER: '$(SNOW_USER)'
    SNOW_PASS: '$(SNOW_PASS)'
- task: Bash@3
  displayName: Progress RFC to Scheduled
  # This works, for this one task.
  inputs:
    targetType: inline
    script: |
      servicenow standard update $(snow.rfc_sys_id) state=Scheduled
  env:
    SNOW_USER: '$(SNOW_USER)'
    SNOW_PASS: '$(SNOW_PASS)'
deploy:
  steps:
    - task: Bash@3
      displayName: Install ServiceNow
      inputs:
        targetType: inline
        script: |
          pip install snow --index-url=https://azure:$(System.AccessToken)@pkgs.dev.azure.com/$(ADO_ORG)/$(ADO_PROJ)/_packaging/python-azure-artifacts/pypi/simple/
    - task: Bash@3
      displayName: Progress RFC to Implement
      # Here I attempt to read the variable registered in the preDeploy "phase" and use it as a shell
      # variable, because otherwise Azure DevOps would try to execute it as a shell command.
      inputs:
        targetType: inline
        script: |
          servicenow standard update ${RFC_SYS_ID} state=Implement
      env:
        SNOW_USER: '$(SNOW_USER)'
        SNOW_PASS: '$(SNOW_PASS)'
        RFC_SYS_ID: $[ dependencies.BuildPythonApp.outputs['preDeploy.rfc_sys_id'] ]
Also found here:
https://gist.github.com/FilBot3/d8184b3c0b1c887e7e99884b051bd73c#file-azure-pipelines-yaml-L89-L131
Is it even possible to do this in Azure DevOps YAML Pipelines using a Deployment Job?
Can you try these two? I think only the step name is missing:
RFC_SYS_ID: $[ dependencies.BuildPythonApp.outputs['preDeploy.snow.rfc_sys_id'] ]
or
RFC_SYS_ID: $[ dependencies.BuildPythonApp.outputs['BuildPythonApp.snow.rfc_sys_id'] ]
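For context, a hedged sketch of how a separate downstream job might consume such an output variable, assuming the deployment job is called BuildPythonApp and the step that set it is named snow; as noted above, the exact output key may or may not need the lifecycle-hook prefix:
- job: UseRfc
  dependsOn: BuildPythonApp
  variables:
    # Assumed key; try with the 'preDeploy.' hook prefix or the job-name prefix, per the answer above.
    rfcSysId: $[ dependencies.BuildPythonApp.outputs['preDeploy.snow.rfc_sys_id'] ]
  steps:
    - script: echo "RFC sys_id is $(rfcSysId)"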
I put a shell script file in a folder at my repo root and tried to run it in my DevOps pipeline, but it says it cannot find the scriptPath:
[error]Not found scriptPath: /home/vsts/work/1/s/pipelines/databricks-cli-config.sh
I am simply creating a task to run the shell script, like this:
- task: ShellScript@2
  inputs:
    scriptPath: 'pipelines/databricks-cli-config.sh'
    args: '$(databricks_host) $(databricks_token)'
  displayName: "Install and configure the Databricks CLI"
Any idea?
Make sure you check out your code and that you are at the correct level. If you are in a regular job, please add the working directory:
- task: ShellScript@2
  inputs:
    scriptPath: 'pipelines/databricks-cli-config.sh'
    args: '$(databricks_host) $(databricks_token)'
    cwd: '$(System.DefaultWorkingDirectory)'
  displayName: "Install and configure the Databricks CLI"
If you use it in a deployment job, the code is not checked out there by default. So you either need to publish this script as an artifact and then download it in the deployment job (deployment jobs download artifacts by default), or add a
- checkout: self
step to download the code in the deployment job.
I assumed that you are using YAML.
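For the deployment-job case, a minimal sketch (the job and environment names are assumptions): adding checkout: self makes the repository, and with it the script, available on the agent:
- deployment: ConfigureDatabricks     # assumed job name
  environment: 'dev'                  # assumed environment
  strategy:
    runOnce:
      deploy:
        steps:
          - checkout: self            # deployment jobs skip checkout by default
          - task: ShellScript@2
            displayName: "Install and configure the Databricks CLI"
            inputs:
              # The checkout location in deployment jobs can differ from regular jobs,
              # so adjust the path or cwd if the task still reports "Not found scriptPath".
              scriptPath: 'pipelines/databricks-cli-config.sh'
              args: '$(databricks_host) $(databricks_token)'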
Currently I'm working on a pipeline script for Azure DevOps. I want to provide a Maven settings file as a secure file for the pipeline. The problem is that when I define a job only for providing the file, the file isn't there anymore when the next job starts.
I tried to define a job with a DownloadSecureFile task and a copy command to get the settings file, but when the next job starts the file is gone and therefore can't be used.
I already checked that by using pwd and ls in the pipeline.
This is part of my current YAML file (that actually works):
some variables
...
trigger:
  branches:
    include:
      - stable
      - master
jobs:
  - job: Latest_Release
    condition: eq(variables['Build.SourceBranchName'], 'master')
    steps:
      - task: DownloadSecureFile@1
        name: settingsxml
        displayName: Download maven settings xml
        inputs:
          secureFile: settings.xml
      - script: |
          cp $(settingsxml.secureFilePath) ./settings.xml
          docker login -u $(AzureRegistryUser) -p $(AzureRegistryPassword) $(AzureRegistryUrl)
          docker build -t $(AzureRegistryUrl)/$(projectName):$(projectVersionNumber-Latest) .
          docker push $(AzureRegistryUrl)/$(projectName):$(projectVersionNumber-Latest)
....
other jobs
I wanted to put the DownloadSecureFile task and the "cp $(settingsxml.secureFilePath) ./settings.xml" step into a job of their own, because there are more jobs that need this file for other branches/releases and I don't want to copy the exact same code into all of them.
This is the YAML file as I wanted it:
some variables
...
trigger:
  branches:
    include:
      - stable
      - master
jobs:
  - job: provide_maven_settings
    # no condition because all branches need the file
    steps:
      - task: DownloadSecureFile@1
        name: settingsxml
        displayName: Download maven settings xml
        inputs:
          secureFile: settings.xml
      - script: |
          cp $(settingsxml.secureFilePath) ./settings.xml
  - job: Latest_Release
    condition: eq(variables['Build.SourceBranchName'], 'master')
    steps:
      - script: |
          docker login -u $(AzureRegistryUser) -p $(AzureRegistryPassword) $(AzureRegistryUrl)
          docker build -t $(AzureRegistryUrl)/$(projectName):$(projectVersionNumber-Latest) .
          docker push $(AzureRegistryUrl)/$(projectName):$(projectVersionNumber-Latest)
....
other jobs
In my dockerfile the settings file is used like this:
FROM maven:3.6.1-jdk-8-alpine AS MAVEN_TOOL_CHAIN
COPY pom.xml /tmp/
COPY src /tmp/src/
# can't find the file when executing this COPY:
COPY settings.xml /root/.m2/
WORKDIR /tmp/
RUN mvn install
...
The error happens when docker build is started, because it can't find the settings file. It can find it, though, when I use my first YAML example. I have a feeling it has something to do with each job having a "Checkout" phase, but I'm not sure about that.
Each job in Azure DevOps runs on a different agent, so when you use Microsoft-hosted agents and split the pipeline into several jobs, copying the secure file in one job doesn't help: the second job runs on a fresh agent that of course doesn't have the file.
You can solve your issue by using a self-hosted agent (then you copy the file onto your machine and the second job runs on the same machine).
Or you can upload the file to somewhere else (secured) from which you can download it again in the second job (so why not just do it in the job that needs it from the start?).
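A minimal sketch of the simplest fix along those lines: keep the DownloadSecureFile step inside the same job that runs the docker build, so the file exists on the agent that needs it (values other than the secure-file handling are taken from the working example above):
- job: Latest_Release
  condition: eq(variables['Build.SourceBranchName'], 'master')
  steps:
    - task: DownloadSecureFile@1
      name: settingsxml
      displayName: Download maven settings xml
      inputs:
        secureFile: settings.xml
    - script: |
        # Copy next to the Dockerfile so COPY settings.xml /root/.m2/ can find it.
        cp $(settingsxml.secureFilePath) ./settings.xml
        docker build -t $(AzureRegistryUrl)/$(projectName):$(projectVersionNumber-Latest) .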
In the pipelines.yml file, the following is used:
steps:
  # Print buildId
  - script: |
      echo "BuildId = $(buildId)"
When looking at the build log in Azure DevOps, I see just "CmdLine".
Is there a way to give a step or a script a readable name which is visible in the build log?
You just need to add the parameter displayName:
steps:
  - script: 'echo "BuildId = $(Build.BuildId)"'
    displayName: Test1001
  - script: 'echo "BuildId = $(Build.BuildId)"'
    displayName: Test1002