I'm writing a node application in TypeScript, and I need to copy some files from the src directory to the dist directory after it's built (for i18n). I can add a new "copy-i18n-files" task, but I want this task to be run as part of my normal yarn nx run-many --target=build --all execution step. Is there a way to have the build task call another task before completing? Or do I need to create my own custom executor?
I have a project with multiple packages, and I'm trying to write a testing workflow for it.
I'm trying to create an action that does some common setup (like installing dependencies and compiling the app) and then splits into multiple jobs with a matrix to run the tests on each package.
I have a working "setup" job, and I can get the "test" job running in parallel with a matrix, but I can't get the "test" job to run in the same environment as the "setup" job, so I need to reinstall and rebuild all the dependencies in each "test" job run.
I have a situation where I need to deploy different stylesheets to different environments (Dev/Test etc).
Is there a way I can edit the publish artefact so I can do a release pipeline for each environment?
So I would have a build pipeline that produces theused.css, dev.css and test.css.
I would have a release pipeline for Dev & Test
The dev pipeline would edit the artefact by deleting theused.css and then renaming dev.css to theused.css; likewise for Test.
Or is there a better way to do this?
If I wanted to rename a file, I would probably write a Bash/PowerShell script and execute it as a task in the pipeline.
Bash task
PowerShell task
Specifically, in your case I would copy/rename dev.css/test.css to theused.css during the deployment step.
I do not know how you deploy, but you could either rename the .css before deploying to an S3 bucket, for example, or, if you deploy to an on-premises server, copy and rename the file in one step.
cp /your/dev.css /your/deployed/path/to/theused.css  # copy the file
mv /your/dev.css /your/deployed/path/to/theused.css  # move/rename the file
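If the rename runs inside the pipeline itself, a minimal sketch of an inline Bash task in YAML could look like this (the drop folder and file paths are illustrative assumptions, not taken from the question):
steps:
- task: Bash@3
  displayName: Promote dev.css to theused.css
  inputs:
    targetType: inline
    script: |
      # illustrative paths; adjust to wherever your artifact is downloaded
      rm "$(System.DefaultWorkingDirectory)/drop/theused.css"
      mv "$(System.DefaultWorkingDirectory)/drop/dev.css" "$(System.DefaultWorkingDirectory)/drop/theused.css"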
Meant to add this...
What I ended up doing was the following (with different configs, rather than css, but same idea):
In my app I have configs for Dev, Test & Prod (config.json, config-test.json, config-prod.json)
The first thing the app does is load the config when it runs
I build and deploy to dev
The build folder contains the build files including these config files
I have releases for Test and Prod that do the following:
Task 1: delete the config.json
Task 2: copy the appropriate config file (e.g. config-test.json in the Test release pipeline) and rename it to config.json
Task 3: deploy build files to the appropriate environment with the new config
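As a rough sketch, tasks 1 and 2 could be a single inline script step in the release (the drop folder path and file names are illustrative; task 3 depends entirely on where you deploy):
steps:
- task: Bash@3
  displayName: Swap in the Test config
  inputs:
    targetType: inline
    script: |
      # Task 1: remove the default config; Task 2: promote the environment-specific one
      rm "$(System.DefaultWorkingDirectory)/drop/config.json"
      cp "$(System.DefaultWorkingDirectory)/drop/config-test.json" "$(System.DefaultWorkingDirectory)/drop/config.json"
# Task 3: deploy the build folder to the target environment with your usual deployment task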
I have an Azure DevOps build pipeline and I want to add a step to it which will run a solution Clean task. I'd like to achieve the same behavior as when I press Build -> Clean Solution in Visual Studio. The problem is that I haven't found a way to run Clean only, without a Build afterwards. I looked through the predefined build tasks (Visual Studio Build, MSBuild) without success.
How can I do this? I know that I can use a Command Line task to run MSBuild, but I wonder whether I'm missing some more straightforward solution.
There is a Clean option on the Get Sources tab, which can perform different kinds of cleaning of the working directory of your private agent before the build runs.
We could set the value to true to clean the working directory of the private agent, even if the previous build failed.
You could check the document Clean the local repo on the agent for some more details.
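In YAML, the same setting corresponds to the clean option on the checkout step; a minimal sketch:
steps:
- checkout: self
  clean: true  # clean/reset the sources directory before fetching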
If you are using YAML, you could try the snippet below.
jobs:
- job: string  # name of the job (A-Z, a-z, 0-9, and underscore)
  ...
  workspace:
    clean: outputs | resources | all  # what to clean up before the job runs
Check the document YAML schema reference for more details.
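For example, a concrete (illustrative) job that wipes its entire workspace before it runs:
jobs:
- job: build  # illustrative job name
  workspace:
    clean: all  # remove sources, outputs and resources before this job starts
  steps:
  - script: echo "running on a clean workspace"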
Update1
The default work folder is _work; if you open it, you will see numbered folders 1, 2, 3, and so on.
If you create a new pipeline, a new folder is created under _work. Each pipeline has its own work folder, and the clean option only cleans that pipeline's work folder; it does not clean the work folders of other pipelines.
I created a Dockerfile that builds a Cypress image, installs all dependencies, copies the necessary folders, and uses a CMD command to run the tests. I was able to build the Docker image locally, and the tests run when I run the image locally.
I am trying to run the tests in Azure DevOps pipelines. I created a new pipeline using the Dockerfile I created. In my pipeline I am able to get the Cypress image to build, but the tests do not run after the image is built.
Am I missing something? After the image is built in the pipeline, do I need to run the image? If so, how would I do that in the YAML file?
The CMD instruction is only executed when the image is run; the image is not run automatically after it is built, so you have to use docker run.
You can use a PowerShell task to run your docker build and docker run commands instead of the Docker tasks.
In the example below, I run the docker build command to build my Dockerfile and then the docker run command to start my image. I can then view the execution results in the PowerShell task's log.
- powershell: |
    cd $(system.defaultworkingdirectory)  # cd to the directory where the Dockerfile resides
    docker build -t myapp .
    docker run --rm myapp
If you want to use the Docker tasks to build your Dockerfile, you can also try using RUN to execute your Cypress tests instead of putting the test command in CMD, which is only executed when the image is run; that way the tests run as part of the image build, as sketched below.
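A rough sketch of that approach with the built-in Docker@2 task (the repository name, tag and Dockerfile path are illustrative); because the tests sit in a RUN step, they execute during this build step:
steps:
- task: Docker@2
  displayName: Build image (Cypress tests run in a RUN step of the Dockerfile)
  inputs:
    command: build
    repository: myapp
    Dockerfile: '**/Dockerfile'
    tags: '$(Build.BuildId)'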
I have a release pipeline that is specifically for running automated tests and has multiple agent phases.
Most of the phases require use of the same artifacts in the same location on the build machine.
If I had an artifact set for download in a previous agent phase, I used to be able to use that artifact again in subsequent agent phases without downloading it again.
Now the artifact folder contents appear to be wiped when moving into a new agent phase, so the tasks in the later phase that depend on that artifact existing fail.
Is there any way to prevent the build agent artifact folder from being deleted after an agent phase has finished and a new one starts?
Since phases can run in parallel and can move from one agent to another, and since it is unknown which job will run next on a given agent, jobs clean up after themselves once they finish.
The trick is to end each phase with a "Publish Pipeline Artifact" task and then download that artifact at the start of the next phase.
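A minimal YAML sketch of that publish/download pattern between two jobs (the artifact name and paths are illustrative assumptions):
jobs:
- job: Phase1
  steps:
  # ... tasks that produce or download the files the later phases need ...
  - task: PublishPipelineArtifact@1
    inputs:
      targetPath: '$(System.DefaultWorkingDirectory)/drop'  # illustrative path
      artifact: 'shared-drop'
- job: Phase2
  dependsOn: Phase1
  steps:
  - task: DownloadPipelineArtifact@2
    inputs:
      artifact: 'shared-drop'
      path: '$(System.DefaultWorkingDirectory)/drop'
  # ... test tasks that rely on the shared files ...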