GitLab CI job fails but dependent job continues to run

In GitLab, I created a CI pipeline to build the project; in each stage I have two separate jobs:
Build BookProject:
  stage: build
  <<: *dotnetbuild_job
  when: manual
Build ShopProject:
  stage: build
  <<: *dotnetbuild_job
  when: manual
Deploy BookProject:
  stage: Deploy
  needs: ["Build BookProject"]
  <<: *dotnetdeploy_job
  when: on_success
Deploy ShopProject:
  stage: Deploy
  needs: ["Build ShopProject"]
  <<: *dotnetdeploy_job
  when: on_success
I find that when the Build BookProject job fails with ERROR: Job failed: exit code 1 (the job icon shows !), Deploy BookProject still continues to run, even though I set when: on_success. How can I prevent this?

When a job specifies when: manual, GitLab implies allow_failure: true, so a failed manual job is treated as a warning (the ! icon) rather than a failure, and downstream on_success jobs still run.
To avoid this behavior, set allow_failure: false explicitly on your manual build jobs.
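Applied to the jobs above, that would look like this (a minimal sketch reusing the question's anchors):

Build BookProject:
  stage: build
  <<: *dotnetbuild_job
  when: manual
  allow_failure: false
Build ShopProject:
  stage: build
  <<: *dotnetbuild_job
  when: manual
  allow_failure: false

With allow_failure: false, a failed manual build is reported as a real failure instead of a warning, so the on_success deploy jobs no longer start.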

Related

Azure YAML pipeline group variables not seen by task in a template file

I have a pipeline stage that is using a template as follows:
# Deploy to AKS
- stage: DeployTEST
  displayName: Test env for my-app
  condition: and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/master'))
  variables:
    - group: 'my-app-var-group-test'
    - group: 'package-variables'
    - template: templates/shared-template-vars.yml@templates
  jobs:
    - deployment: TestDeployment
      displayName: Deploy to AKS - Test
      pool:
        vmImage: $(vmImageName)
      environment: env-test
      strategy:
        runOnce:
          deploy:
            steps:
              - template: ./aks/deployment-steps.yml
...and the content of the template deployment-steps.yml is:
steps:
  - script: |
      echo AzureSubscription: '$(azureSubscription)'
      echo KubernetesServiceConnection: '$(kubernetesServiceConnection)' # this is working
  - task: KubernetesManifest@0
    displayName: Create imagePullSecret
    inputs:
      action: createSecret
      secretName: $(imagePullSecret)
      dockerRegistryEndpoint: $(dockerRegistryServiceConnection)
      kubernetesServiceConnection: $(kubernetesServiceConnection) # this is causing an error
I get an error like this:
There was a resource authorization issue: "The pipeline is not valid. Job TestDeployment: Step input kubernetesServiceConnection references service connection $(kubernetesServiceConnection) which could not be found. The service connection does not exist or has not been authorized for use. For authorization details, refer to https://aka.ms/yamlauthz."
and like this when I try to select individual stages prior to a manual pipeline run:
Encountered error(s) while parsing pipeline YAML:
Job TestDeployment: Step input kubernetesServiceConnection references service connection $(kubernetesServiceConnection) which could not be found. The service connection does not exist or has not been authorized for use. For authorization details, refer to https://aka.ms/yamlauthz.
The errors above are misleading, because it is not an authorization issue:
- the referenced K8s service connection is authorized
- when I hardcode the value of the $(kubernetesServiceConnection) variable, the pipeline runs just fine - no errors
- the variable group my-app-var-group-test is authorized - IMPORTANT: this is where the $(kubernetesServiceConnection) variable is defined
NOTE: The variable kubernetesServiceConnection is defined in the my-app-var-group-test variable group, and when I comment out the KubernetesManifest task, the value of $(kubernetesServiceConnection) is printed to the pipeline console output without any issue and the pipeline runs successfully!?
I know I could use parameters to pass values into the template (see the sketch below), but this setup is already used by all other pipelines (variable group vars are referenced in templates) and this issue appeared on a newly created pipeline. I compared the YAML of a working pipeline with this one using a file diff and failed to spot any difference.
I might be missing something obvious, but I have spent hours failing to resolve this error...
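For reference, the parameter-based workaround mentioned above would look roughly like this (a sketch; the literal connection name is hypothetical):

# aks/deployment-steps.yml
parameters:
  - name: kubernetesServiceConnection
    type: string

steps:
  - task: KubernetesManifest@0
    displayName: Create imagePullSecret
    inputs:
      action: createSecret
      secretName: $(imagePullSecret)
      dockerRegistryEndpoint: $(dockerRegistryServiceConnection)
      kubernetesServiceConnection: ${{ parameters.kubernetesServiceConnection }}

and in the calling stage:

- template: ./aks/deployment-steps.yml
  parameters:
    kubernetesServiceConnection: 'my-k8s-test-connection'  # hypothetical literal name

Because ${{ }} template parameters are expanded at compile time, the connection name is already known when resource authorization is checked, unlike the runtime $(...) macro.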

Azure Pipelines - Handling builds for Dependent downstream pipelines

We have a number of common upstream pipelines - pipeline-a, pipeline-b, pipeline-c, pipeline-d ... each in its own repository - repository-a, repository-b, repository-c, repository-d...
My target pipeline, say pipeline-y in repository-y, depends on the artifacts of these upstream pipelines, and it needs to build when there is a change to any of the upstream libraries and the corresponding upstream pipeline builds successfully.
In other words, target pipeline-y needs to be triggered if any of the upstream pipelines completes successfully due to changes in them (CI triggers for the upstream libraries work fine in their own pipelines).
We currently achieve this using the resources pipelines trigger in the target pipeline-y, as below:
Upstream Pipeline - pipeline-a.yml
trigger:
  - repository-a*
steps:
  - task: Maven@3
    inputs:
      mavenPomFile: 'pom.xml'
      publishJUnitResults: false
      javaHomeOption: 'JDKVersion'
      mavenVersionOption: 'Default'
      mavenAuthenticateFeed: true
      effectivePomSkip: false
      sonarQubeRunAnalysis: false
      goals: 'package deploy'
Target pipeline-y.yml resources section
resources:
  pipelines:
    - pipeline: pipeline-a
      source: pipeline-a
      trigger:
        branches:
          - 'pipeline-a-v1*'
    - pipeline: pipeline-b
      source: pipeline-b
      trigger:
        branches:
          - 'pipeline-b-v1*'
    - pipeline: pipeline-c
      source: pipeline-c
      trigger:
        branches:
          - 'pipeline-c-v1*'
    - pipeline: pipeline-d
      source: pipeline-d
      trigger:
        branches:
          - 'pipeline-d-v1*'
    - pipeline: pipeline-e
      source: pipeline-e
      trigger:
        branches:
          - 'pipeline-e-v1*'
This works fine.
My question is: as we add more upstream common libraries, we have to update the resources section in the target downstream pipeline. And when there are new versions of the upstream libraries, we have to modify the version in the resources-pipelines-pipeline trigger branches, from 'pipeline-a-v1*' to 'pipeline-a-v2*'.
Is there a better way to do this? Can a variable be used in the resources-pipelines-pipeline trigger branches, for example pipeline-a-$(version)? Can version be derived using build system variables as below?
I tried
variables:
  version: $[replace(variables['Build.SourceBranchName'], variables['Build.Repository.Name'], '')]
It did not seem to work.
It's not possible to dynamically specify resources in YAML.
A suggestion could be to use REST API hooks when new pipelines are added, and then trigger a program that generates a new YAML for pipeline-y.yml.
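One partial mitigation for the version bumps specifically (my suggestion, not part of the answer above): the trigger branch filters already accept wildcards, so a broader pattern avoids editing the resource on every version change:

resources:
  pipelines:
    - pipeline: pipeline-a
      source: pipeline-a
      trigger:
        branches:
          - 'pipeline-a-v*'   # matches -v1, -v2, ... so no per-version edit

Newly added upstream pipelines still need a new resources entry, which only the generated-YAML approach addresses.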

How to use the postgres db on the windows-latest agent used in the azure pipeline?

I have a Java Maven project that I am building with an Azure pipeline on the "windows-latest" host, as it contains the correct Java 13 version. However, for the integration tests I need a Postgres db, and the "windows-latest" agent contains a Postgres db, see: link. But how can I use it? I tried including its serviceName in the Maven task as a service:
services:
  postgres: postgresql-x64-13
But then I get an error that it cannot find a service by that name.
I tried defining the db properties through env settings (see the yml below), and then it shows the error:
Caused by: java.net.ConnectException: Connection refused
I also tried running it through a script task via the docker-compose.yml in the root of the project that I use during development, but docker-compose throws an error saying it can't find the compose file, and I doubt that is the correct way anyway.
So can I use the Postgres db on the Windows agent, and how?
My Azure pipeline snippet:
variables:
  MAVEN_CACHE_FOLDER: $(Pipeline.Workspace)/.m2/repository
  MAVEN_OPTS: "-Dmaven.repo.local=$(MAVEN_CACHE_FOLDER)"
  application_name: clearsky
  service_name: backend
  mygetUsername: myserUsername
  mygetPassword: mytoken
  SPRING_DATASOURCE_URL: jdbc:postgresql://localhost:5432/postgres
  SPRING_DATASOURCE_USER: postgres
  SPRING_DATASOURCE_PASSWORD: root
stages:
  - stage: create_artifact
    displayName: Create artifact
    jobs:
      - job: build
        displayName: Build, test and publish artifact
        steps:
          - task: Maven@3
            name: maven_package
            displayName: Maven package
            inputs:
              goals: "package"
              mavenPomFile: "backend/pom.xml"
              options: '--settings backend/.mvn/settings.xml -DmygetUsername=$(mygetUsername) -DmygetPassword=$(mygetPassword)'
              mavenOptions: "-Xmx3072m $(MAVEN_OPTS)"
              javaHomeOption: "JDKVersion"
              jdkVersionOption: "1.13"
              mavenAuthenticateFeed: true
On the Azure DevOps Windows agent, the postgresql service is stopped and disabled by default.
Here is the configuration doc:
Property          Value
ServiceName       postgresql-x64-13
Version           13.2
ServiceStatus     Stopped
ServiceStartType  Disabled
You could try the following command to start PostgreSQL:
"C:\Program Files\PostgreSQL\13\bin\pg_ctl.exe" start -D "C:\Program Files\PostgreSQL\13\data" -w

Different Build steps according to external variable in Drone CI

I use Drone CI for handling the CI/CD process.
I am working on a use case where I take input variables and run different pipelines according to the key-value pairs; these are the inputs to the deploy pipeline.
Currently in my pipeline I use the Ansible plugin to push the changes to the destination, something like this:
- name: pipeline1
  image: plugins/ansible:3
  environment:
    <<: *creds
  settings:
    playbook: .ci/.ansible/playbook.yml
    inventory: .ci/.ansible/inventory
    user: admin_user
    private_key:
      from_secret: admin_key
    become: true
    verbosity: 3
  when:
    KEY1 = True
- name: pipeline2
  image: plugins/ansible:3
  environment:
    <<: *creds
  settings:
    playbook: .ci/.ansible/playbook.yml
    inventory: .ci/.ansible/inventory
    user: admin_user
    private_key:
      from_secret: admin_key
    become: true
    verbosity: 3
  when:
    KEY2 = True
...
How can I deploy such a pipeline?
The when keyword does not have any example covering this use case.
As per the Drone conditions documentation (https://docs.drone.io/pipeline/conditions/), you can't use environment variables in the when block; only repository and promotion attributes can be used there.
In your case you can try to use dependencies between steps via the depends_on parameter (https://discourse.drone.io/t/how-to-setup-parallel-pipeline-steps-1-0/3251).
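A minimal sketch under those constraints (step names reused from the question; the promotion target name is illustrative):

steps:
  - name: pipeline1
    image: plugins/ansible:3
    settings:
      playbook: .ci/.ansible/playbook.yml
      inventory: .ci/.ansible/inventory
    # promotion attributes are allowed in conditions, so a deploy
    # target can stand in for the KEY1/KEY2 input:
    when:
      event:
        - promote
      target:
        - key1-deploy
  - name: pipeline2
    image: plugins/ansible:3
    settings:
      playbook: .ci/.ansible/playbook.yml
      inventory: .ci/.ansible/inventory
    depends_on:
      - pipeline1   # runs only after pipeline1 finishes

A step is then selected at promote time, e.g. drone build promote <repo> <build> key1-deploy.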

Azure DevOps connecting to remote Artifactory

I am trying to figure out the appropriate way to connect an Azure DevOps pipeline that executes a Maven build to a remote JFrog Artifactory Maven repository.
I first looked at Feeds and Upstream Sources. I didn't see anything in the documentation that shows how to do this; indeed, it looks as though you can't actually do it.
Then I looked at Service Endpoints. Here I was able to create a service endpoint that points at my Artifactory host. Great! I added a Maven Authenticate task, which is the only one I saw that allows me to reference the Maven service connection. But the pipeline still fails when trying to resolve artifacts, because it only looks at Maven Central.
# Docker
trigger:
  - master
resources:
  - repo: self
variables:
  tag: '$(Build.BuildId)'
stages:
  - stage: Build
    displayName: Build image
    jobs:
      - job: Build
        displayName: Build
        pool:
          vmImage: 'ubuntu-latest'
        steps:
          - task: MavenAuthenticate@0
            inputs:
              mavenServiceConnections: 'eti-libs-snapshots-local'
          - task: Maven@3
            inputs:
              options: '-X'
              mavenPomFile: 'pom.xml'
              publishJUnitResults: true
              testResultsFiles: '**/surefire-reports/TEST-*.xml'
              javaHomeOption: 'JDKVersion'
              mavenVersionOption: 'Default'
              mavenAuthenticateFeed: false
              effectivePomSkip: false
              sonarQubeRunAnalysis: true
              sqMavenPluginVersionChoice: 'latest'
          - task: Docker@2
            displayName: Build an image
            inputs:
              command: build
              dockerfile: '$(Build.SourcesDirectory)/Dockerfile'
              tags: |
                $(tag)
Edited: found some more relevant information.
So as Kontekst pointed out, I needed to add the JFrog extension for pipelines. I have done that and got it configured.
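With the extension in place, resolution is normally wired up through the extension's own Maven task instead of MavenAuthenticate. A rough sketch - the task and input names here are my recollection of the JFrog extension and should be verified against its docs; the service connection and repository names are illustrative:

- task: ArtifactoryMaven@2
  inputs:
    mavenPomFile: 'pom.xml'
    goals: 'package'
    artifactoryResolverService: 'my-artifactory'           # Artifactory service connection
    targetResolveReleaseRepo: 'libs-release'               # repo to resolve releases from
    targetResolveSnapshotRepo: 'eti-libs-snapshots-local'  # repo to resolve snapshots from

This makes Maven resolve dependencies through Artifactory instead of defaulting to Maven Central.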