Interpolating strings in a file path in a Dockerfile - azure-devops

I have a Dockerfile which starts like this:
ARG FILE_PATH
FROM mcr.microsoft.com/dotnet/aspnet:3.1 AS base
WORKDIR /app
EXPOSE 80
EXPOSE 443
FROM mcr.microsoft.com/dotnet/sdk:3.1 AS build
WORKDIR /src
COPY ["${FILE_PATH}/src/NuGet.config", "src/"]
I call it using the Azure CLI like this:
$pathToSrc = "$(Build.SourcesDirectory)/My folder"
az acr build --build-arg "FILE_PATH=$pathToSrc" ...
This always fails with the message:
COPY failed: file not found in build context or excluded by
.dockerignore: stat src/NuGet.config: file does not exist
I have tried variations such as:
COPY [$FILE_PATH/src/NuGet.config, "src/"]
COPY ["FILE_PATH/src/NuGet.config", "src/"]
and
az acr build --build-arg "FILE_PATH='$pathToSrc'" ...
but always end up with the same message.
Is there a way to do this? I am running on a hosted agent in an Azure DevOps pipeline. The task is AzureCLI@2 using a PowerShell Core script.

This may be related: https://stackoverflow.com/a/56748289/4424236
...after every FROM statement, all the ARGs get collected and are no longer available. Be careful with multi-stage builds.
Try this:
FROM mcr.microsoft.com/dotnet/aspnet:3.1 AS base
WORKDIR /app
EXPOSE 80
EXPOSE 443
FROM mcr.microsoft.com/dotnet/sdk:3.1 AS build
WORKDIR /src
ARG FILE_PATH
COPY ["${FILE_PATH}/src/NuGet.config", "src/"]
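Beyond the ARG placement, note that az acr build uploads the local folder as the build context, and COPY source paths are resolved relative to the root of that context, so an absolute agent path such as $(Build.SourcesDirectory) will never exist inside it. A sketch of a pipeline step that passes a context-relative path instead (the service connection and registry names are assumptions, and the Dockerfile is assumed to sit at the repo root):

```yaml
- task: AzureCLI@2
  inputs:
    azureSubscription: 'my-arm-connection'   # assumed service connection name
    scriptType: pscore
    scriptLocation: inlineScript
    inlineScript: |
      # Run from the repo root so "My folder" is inside the uploaded context,
      # then pass only the context-relative part of the path as the build arg.
      cd "$(Build.SourcesDirectory)"
      az acr build --registry myregistry --build-arg "FILE_PATH=My folder" --file Dockerfile .
```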

Related

Azure pipeline docker fails copy with multiple projects

Copying Data.csproj to Data/ fails when building my app in Azure DevOps, although the first copy command, Api.csproj to Api/, works fine. Note that I did not specify the buildContext in my azure-pipeline.yml file. But when I did add buildContext: '$(Build.Repository.LocalPath)', it failed even on the first copy.
Any input or suggestions on how to fix this? I tried searching; adding the buildContext or adding the folder to the csproj path doesn't seem to work. For example, COPY ["/Data/Data.csproj", "Data/"]
This is my folder structure (my azure-pipeline.yml file is outside the App folder):
App
- Api/
  - Api.csproj
  - Dockerfile
- Data/
  - Data.csproj
- Domain/
  - Domain.csproj
- App.sln
My dockerfile:
FROM mcr.microsoft.com/dotnet/core/aspnet:3.1-buster-slim AS base
WORKDIR /app
EXPOSE 80
EXPOSE 443
FROM mcr.microsoft.com/dotnet/core/sdk:3.1-buster AS build
WORKDIR /src
COPY ["Api.csproj", "Api/"]
COPY ["Data.csproj", "Data/"]
COPY ["Domain.csproj", "Domain/"]
RUN dotnet restore "Api/Api.csproj"
COPY . .
WORKDIR "/src/Api"
RUN dotnet build "Api.csproj" -c Release -o /app/build
FROM build AS publish
RUN dotnet publish "Api.csproj" -c Release -o /app/publish
FROM base AS final
WORKDIR /app
COPY --from=publish /app/publish .
ENTRYPOINT ["dotnet", "Api.dll"]
parts of my azure-pipeline.yml
stages:
- stage: Build
  displayName: Build stage
  jobs:
  - job: Build
    displayName: Build
    pool:
      vmImage: 'ubuntu-latest'
    steps:
    - task: Docker@2
      displayName: Build and push an image to container registry
      inputs:
        command: buildAndPush
        repository: 'App'
        dockerfile: '**/Dockerfile'
        tags: |
          $(tag)
Here's the error:
Step 6/28 : WORKDIR /src
---> Running in 266a78d293ee
Removing intermediate container 266a78d293ee
---> 2d899fafdf05
Step 7/28 : COPY ["Api.csproj", "Api/"]
---> 92c8c1450c3c
Step 8/28 : COPY ["Data.csproj", "Data/"]
COPY failed: stat /var/lib/docker/tmp/docker-builder764823890/Data.csproj: no such file or directory
##[error]COPY failed: stat /var/lib/docker/tmp/docker-builder764823890/Data.csproj: no such file or directory
##[error]The process '/usr/bin/docker' failed with exit code 1
Okay, after trying so many times, I was able to fix this by changing the Dockerfile and azure-pipelines.yml.
I think what fixed the issue was explicitly setting the buildContext to 'App/' instead of the variable '$(Build.Repository.LocalPath)', whose exact value I'm not sure of.
I'll just post the part that I made changes to.
Dockerfile
COPY ["Api/Api.csproj", "Api/"]
COPY ["Data/Data.csproj", "Data/"]
COPY ["Domain/Domain.csproj", "Domain/"]
azure-pipelines.yml
inputs:
  command: buildAndPush
  repository: $(imageRepository)
  dockerfile: $(dockerfilePath)
  buildContext: 'App/'

How to read env variables of a docker-compose file and package.json file from a GitHub Action?

From my docker-compose file I have to read an env variable. Locally, I can read that variable like this: ENV_FILE=.env docker-compose -f docker-compose.dev.prisma.yml up --build
But as the .env file is in .gitignore, the GitHub Action can't get that file. How can I read the variables?
It's almost the same issue in my package.json file. I need some env variables to be read from npm scripts:
"start:backend": "wait-port $API_HOST:API_PORT && yarn start"
What I have tried is adding those variables to the repository's GitHub secrets, but the scripts didn't pick them up. Except for those two files, envs are read perfectly from the GitHub Action.
Try creating your env file manually as a step in your workflow and pass in your repository secrets. Your docker-compose and package.json should be able to read your environment variables:
- name: create env file
  run: |
    touch .env
    echo VARIABLE=${{ secrets.VARIABLE }} >> .env
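For the package.json half of the question, the npm script reads $API_HOST and $API_PORT from the process environment, so an alternative sketch (step and secret names are assumptions) is to map the secrets straight onto the step that runs the script, without writing any file:

```yaml
- name: start backend
  run: yarn start:backend
  env:
    API_HOST: ${{ secrets.API_HOST }}   # assumed secret names
    API_PORT: ${{ secrets.API_PORT }}
```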

How to keep secure files after a job finishes in an Azure DevOps Pipeline?

Currently I'm working on a pipeline script for Azure DevOps. I want to provide a Maven settings file as a secure file for the pipeline. The problem is that when I define a job only for providing the file, the file isn't there anymore when the next job starts.
I tried to define a job with a DownloadSecureFile task and a copy command to get the settings file. But when the next job starts the file isn't there anymore and therefore can't be used.
I already checked that by using pwd and ls in the pipeline.
This is part of my current YAML file (that actually works):
some variables
...
trigger:
  branches:
    include:
    - stable
    - master
jobs:
- job: Latest_Release
  condition: eq(variables['Build.SourceBranchName'], 'master')
  steps:
  - task: DownloadSecureFile@1
    name: settingsxml
    displayName: Download maven settings xml
    inputs:
      secureFile: settings.xml
  - script: |
      cp $(settingsxml.secureFilePath) ./settings.xml
      docker login -u $(AzureRegistryUser) -p $(AzureRegistryPassword) $(AzureRegistryUrl)
      docker build -t $(AzureRegistryUrl)/$(projectName):$(projectVersionNumber-Latest) .
      docker push $(AzureRegistryUrl)/$(projectName):$(projectVersionNumber-Latest)
....
other jobs
I wanted to put the DownloadSecureFile task and the "cp $(settingsxml.secureFilePath) ./settings.xml" step into a job of their own, because more jobs need this file for other branches/releases and I don't want to copy the exact same code into all of them.
This is the YAML file as I wanted it:
some variables
...
trigger:
  branches:
    include:
    - stable
    - master
jobs:
- job: provide_maven_settings
  # no condition because all branches need the file
  steps:
  - task: DownloadSecureFile@1
    name: settingsxml
    displayName: Download maven settings xml
    inputs:
      secureFile: settings.xml
  - script: |
      cp $(settingsxml.secureFilePath) ./settings.xml
- job: Latest_Release
  condition: eq(variables['Build.SourceBranchName'], 'master')
  steps:
  - script: |
      docker login -u $(AzureRegistryUser) -p $(AzureRegistryPassword) $(AzureRegistryUrl)
      docker build -t $(AzureRegistryUrl)/$(projectName):$(projectVersionNumber-Latest) .
      docker push $(AzureRegistryUrl)/$(projectName):$(projectVersionNumber-Latest)
....
other jobs
In my dockerfile the settings file is used like this:
FROM maven:3.6.1-jdk-8-alpine AS MAVEN_TOOL_CHAIN
COPY pom.xml /tmp/
COPY src /tmp/src/
# can't find the file when executing this:
COPY settings.xml /root/.m2/
WORKDIR /tmp/
RUN mvn install
...
The error happens when docker build is started, because it can't find the settings file. It does find it, though, when I use my first YAML example. I have a feeling that it has something to do with each job having a "Checkout" phase, but I'm not sure about that.
Each job in Azure DevOps can run on a different agent. When you use Microsoft-hosted agents and split the pipeline into several jobs, and you copy the secure file in one job, the second job runs on a new, fresh agent that of course doesn't have the file.
You can solve the issue by using a self-hosted agent (then you copy the file to your machine, and the second job runs on the same machine).
Or you can upload the file to somewhere else (secured) and download it in the second job (but then, why not just download it from there in the first place?).
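If the goal is just to avoid repeating the same two steps in every job, a step template lets each job pull them in without copy-pasting; a sketch, assuming a template file named maven-settings-steps.yml next to the pipeline (each job still downloads its own copy of the secure file, since jobs don't share a workspace):

```yaml
# maven-settings-steps.yml (hypothetical file name)
steps:
- task: DownloadSecureFile@1
  name: settingsxml
  inputs:
    secureFile: settings.xml
- script: cp $(settingsxml.secureFilePath) ./settings.xml

# in azure-pipelines.yml, inside each job that needs the file:
#   steps:
#   - template: maven-settings-steps.yml
```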

How can I remove the Dockerfile and use only the CI file with a Kubernetes runner?

Right now I have a Dockerfile, a .gitlab-ci.yml, and a SHELL runner:
FROM node:latest
RUN cd /
RUN mkdir Brain
COPY . /Brain/
WORKDIR /Brain/
RUN npm install
ENV CASSANDRA_HOST_5="10.1.1.58:9042"
ENV IP="0.0.0.0"
ENV PORT=6282
EXPOSE 6282
CMD npm start
and the CI file:
before_script:
  - export newver="0.1.0.117"
build:
  image: node:latest
  stage: build
  script:
    - docker build -t Brain .
    - docker tag pro 10.1.1.134:5000/Brain:$newver
    - docker push 10.1.1.134:5000/Brain:$newver
deploy:
  stage: deploy
  script:
    - kubectl create -f brain-dep.yml
    - kubectl create -f brain-service.yml
I don't want to create an image for every small change; I only want to keep stable images in the local registry. Right now I have multiple versions of the Brain image. Also, how can I have other services besides Brain (elasticsearch and ...)?
Any suggestions?
Kubernetes has to be able to pull the image from somewhere. You can use an alternate repository for non-release builds, or use some kind of naming scheme and then clear out non-release builds more frequently.
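One way to sketch that naming-scheme idea in GitLab CI (the job layout mirrors the question; the tag-only rule is an assumption about your release flow) is to restrict the image build to Git tags, so ordinary commits never push an image:

```yaml
build:
  image: node:latest
  stage: build
  script:
    # Docker image names must be lowercase; $CI_COMMIT_TAG is the Git tag name
    - docker build -t 10.1.1.134:5000/brain:$CI_COMMIT_TAG .
    - docker push 10.1.1.134:5000/brain:$CI_COMMIT_TAG
  only:
    - tags
```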

How do I use PowerShell with GitLab CI in GitLab Pages?

How do I use PowerShell commands/scripts with GitLab CI in a .gitlab-ci.yml file which is used to deploy to GitLab Pages?
I am trying to execute the build.ps1 file from .gitlab-ci.yml, but when it reaches the build.ps1 line it gives an error saying:
/bin/bash: line 5: .build.ps1: command not found
I am trying to use the PowerShell script to convert a file in my repo and have the converted file deployed to GitLab Pages using .gitlab-ci.yml.
Here is my code:
.gitlab-ci.yml
pages:
  stage: deploy
  script:
    - mkdir .public
    - .\build.ps1
    - cp -r * .public
    - mv .public public
  artifacts:
    paths:
      - public
  only:
    - master
I have been able to figure out a solution to my own question.
Solution
To run a PowerShell command/script from a .gitlab-ci.yml file on gitlab.com using GitLab CI, you need to make sure that the contents of your .gitlab-ci.yml file are as shown below.
Note: The .gitlab-ci.yml below works without having to install a GitLab Runner on your own machine, and has been tested on the http://gitlab.com website.
image: philippheuer/docker-gitlab-powershell

pages:
  stage: deploy
  script:
    - mkdir .public
    # run PowerShell script
    - powershell -File build.ps1
    # run PowerShell command
    - powershell -Command "Get-Date"
    - cp -r * .public
    - mv .public public
  artifacts:
    paths:
      - public
  only:
    - master
The Docker image philippheuer/docker-gitlab-powershell is outdated; its source on GitHub has also been deleted.
In my gitlab-ci.yml I use the image mcr.microsoft.com/powershell:latest instead; more information is available here:
scriptjob:
  stage: script
  image:
    name: "mcr.microsoft.com/powershell:latest"
  script:
    - pwsh ./myscript.ps1
For anyone who is having trouble launching grunt within their GitLab CI/CD via a PowerShell file, add this line to the top of your file:
$env:path += ";" + (Get-Item "Env:AppData").Value + "\npm"