Running Cypress test in Azure DevOps Pipelines via Dockerfile - azure-devops

I created a Dockerfile that builds a Cypress image, installs all dependencies, copies the necessary folders, and uses a CMD command to run the tests. I was able to build the Docker image locally, and the tests run when I run the image locally.
I am trying to run the tests in Azure DevOps Pipelines. I created a new pipeline using the Dockerfile, and in the pipeline the Cypress image builds, but the tests do not run after the image is built.
Am I missing something? After the image is built in the pipeline, do I need to run the image? If so, how would I do that in the YAML file?

The CMD instruction is only executed when the image is run; an image does not run automatically after it is built, so you have to use docker run.
You can use a PowerShell task to run your docker build and docker run commands instead of the Docker tasks.
In the example below, I run docker build to build my Dockerfile and then docker run to start the image. I can then view the test results in the PowerShell task's summary log.
- powershell: |
    cd $(system.defaultworkingdirectory)  # cd to the directory where the Dockerfile resides
    docker build -t myapp .
    docker run --rm myapp
If you want to use the Docker tasks to build your Dockerfile, you can instead use RUN to execute your Cypress tests: a RUN instruction executes at build time, whereas a CMD instruction only executes when the image is run.
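A minimal sketch of that approach, assuming the official cypress/included base image (the tag and file layout here are illustrative, not taken from the question):

```dockerfile
# Sketch only: base image tag and file layout are assumptions
FROM cypress/included:12.17.4
WORKDIR /e2e
COPY package.json package-lock.json ./
RUN npm ci
COPY cypress.config.js ./
COPY cypress ./cypress
# RUN executes at build time, so a failing test fails the docker build itself
RUN npx cypress run
```

With this layout, a plain Docker build task is enough: the pipeline fails on the build step whenever a test fails, and no separate docker run step is needed.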

Related

How to run script on AWS through Jenkins build

I am new to Jenkins and AWS. I have MongoDB scripts on an AWS EC2 instance. The first script needs to run before the Jenkins build and stores a snapshot of the DB; the second script needs to run post-build to restore that snapshot. The scripts are ready to be used; I just couldn't find an exact way to reach AWS from the build and implement this in a Jenkins job. Any help would be appreciated. Thanks
You can use Jenkins stages to do the pre-build operation, the build, and then the post-build operations. Within the stages you can use a plugin like SSH Pipeline Steps to remotely execute commands on your EC2 instance.
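A hedged sketch of what that Jenkinsfile could look like with the SSH Pipeline Steps plugin; the host, user, key path, and script names are all placeholders, not details from the question:

```groovy
// Sketch only: host, user, key path, and script names are assumptions
def remote = [:]
remote.name = 'ec2'
remote.host = 'ec2-xx-xx-xx-xx.compute.amazonaws.com'
remote.user = 'ec2-user'
remote.identityFile = '/var/lib/jenkins/.ssh/ec2_key.pem'
remote.allowAnyHosts = true

pipeline {
    agent any
    stages {
        stage('Pre-build') {
            steps {
                // Take the DB snapshot before the build
                sshCommand remote: remote, command: './backup-mongo.sh'
            }
        }
        stage('Build') {
            steps {
                sh 'make build'
            }
        }
        stage('Post-build') {
            steps {
                // Restore the snapshot after the build
                sshCommand remote: remote, command: './restore-mongo.sh'
            }
        }
    }
}
```

In practice the SSH key would come from the Jenkins credentials store rather than a path on disk.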

How to validate Ansible playbook from azure yaml build file?

How can we validate an Ansible playbook from a YAML build file? Yamllint?
yamllint is installed on the Ubuntu and Linux Azure agents, as stated here.
So you can run yamllint inside a Bash task to check your .yml files. Choose whatever parameters suit you.
You can run this pipeline as a build validation, so that no branch is merged into main without being validated.
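A sketch of such a Bash task; the playbook path and yamllint options are assumptions:

```yaml
# Sketch: playbook path and lint rules are assumptions
- task: Bash@3
  displayName: Lint Ansible playbooks
  inputs:
    targetType: 'inline'
    script: |
      yamllint -d "{extends: default, rules: {line-length: {max: 120}}}" playbooks/
```

Note that yamllint only checks YAML syntax and style; `ansible-playbook --syntax-check` can additionally validate the playbook structure itself.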

How to install Docker compose in Azure pipelines?

I would like to use the Docker Compose task in Azure Pipelines, but I am getting the following error:
##[error]Unhandled: Docker Compose was not found. You can provide the path to docker-compose via 'dockerComposePath'
How should I install docker compose? Is there a "nice way", something like Docker Installer task?
Unfortunately, there does not seem to be a "clean" step like the Docker Installer task for installing Docker Compose in Azure Pipelines.
However, I have been able to install Docker Compose via a shell script task that uses the agent's local package manager (e.g. sudo apt-get install -y docker-compose):
- task: ShellScript@2
  displayName: Install Docker-Compose
  inputs:
    scriptPath: 'docker-compose-install.sh'
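If you prefer an inline script over a separate file, the same install can be sketched with a Bash task (this assumes an Ubuntu-based agent with sudo available):

```yaml
# Sketch: assumes an Ubuntu agent with passwordless sudo
- task: Bash@3
  displayName: Install docker-compose
  inputs:
    targetType: 'inline'
    script: |
      sudo apt-get update
      sudo apt-get install -y docker-compose
      docker-compose --version   # verify the install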

In Azure DevOps, how do we access build artifacts and the build environment itself in future container job steps?

I want to have a pipeline that does a Maven build and then has a later step that uses a Docker container to perform some operation on the built artifact(s).
This page explains how to run a script in the context of a Docker container - great:
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/container-phases?view=azure-devops&tabs=yaml
What I'm not seeing is documentation on how to access, from the Docker container, artifacts from previous build steps, or for that matter, the build environment itself.
GitLab, for example, allows you to share artifacts between steps and exposes a whole slew of environment information to container jobs. How is this accomplished in Azure DevOps?
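The Azure DevOps mechanism that corresponds to GitLab's artifact sharing is pipeline artifacts: a publish step uploads files from one job, and a download step in a later job retrieves them under $(Pipeline.Workspace), which also works inside container jobs (predefined pipeline variables are likewise available there as environment variables). A sketch, with the job names, container image, and paths as assumptions:

```yaml
# Sketch: job names, container image, and paths are assumptions
jobs:
- job: Build
  steps:
  - script: mvn -B package
  - publish: $(System.DefaultWorkingDirectory)/target
    artifact: app

- job: Scan
  dependsOn: Build
  container: ubuntu:22.04
  steps:
  - download: current
    artifact: app
  # The artifact lands under $(Pipeline.Workspace)/app
  - script: ls -l $(Pipeline.Workspace)/app
```

The `publish` and `download` steps are shorthand for the PublishPipelineArtifact@1 and DownloadPipelineArtifact@2 tasks.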

Is there a way to "pull" and "up" with docker-compose without creating build folders in the testing environment?

I have a docker-compose file with a build command in each service. In development, docker-compose up works fine. In the test environment, I want to docker-compose pull the images and bring them up, which works, except that the folders referenced by the build commands must exist on the testing server.
Is that really necessary, or is there a way to pull and start the containers without creating the build folders on the testing server?
There is docker-compose up --no-build, which starts the containers without ever building, so the build context folders are not touched.
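A common pattern (an assumption on my part, beyond the --no-build flag itself) is to give each service both an image and a build key, so development builds locally while the test server only pulls:

```yaml
# Sketch: registry, image name, and build context are assumptions
services:
  web:
    image: registry.example.com/myapp/web:latest  # pulled on the test server
    build: ./web                                  # only used where you run docker-compose build
```

On the test server, `docker-compose pull` followed by `docker-compose up -d --no-build` then uses the pulled image and never needs the ./web folder.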