Azure DevOps Copy Task to copy particular subfolder and files inside it - azure-devops

I have a nested folder structure and I just want to copy a specific folder and the files under it.
I need help understanding the copy task's Contents structure.
Here is the folder structure:
Scripts
  Bin
  obj
  Application
  Test
  App
    file1
    file2
I want to copy just the App folder and the files under it:
- task: CopyFiles@2
  inputs:
    SourceFolder: '$(build.artifactstagingdirectory)/Scripts'
    Contents: '**\app\**'
    TargetFolder: '$(build.artifactstagingdirectory)/Dev'

Your YAML definition only works on a Windows agent.
Please make sure the characters' casing is the same:
trigger:
- none

pool:
  # vmImage: windows-latest
  vmImage: ubuntu-latest

variables:
- name: system.debug
  value: true

steps:
- task: CopyFiles@2
  inputs:
    SourceFolder: '$(System.DefaultWorkingDirectory)/Scripts'
    Contents: |
      **/App/**
    TargetFolder: '$(System.DefaultWorkingDirectory)/targetfolder'
    flattenFolders: true
- script: |
    cd targetfolder
    dir
  displayName: 'Run a multi-line script'
Note the casing: 'App', not 'app'. And you need to set 'flattenFolders' to 'true'.
Refer to this official document:
https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/utility/copy-files?view=azure-devops&tabs=yaml
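The case-sensitivity point can be checked locally. A minimal sketch (bash with globstar; the directory layout is illustrative, not from the original post) showing that a lowercase pattern finds nothing when the folder is actually named App on Linux:

```shell
# Glob matching is case-sensitive on Linux, so '**/app/**'
# misses a folder named 'App', while '**/App/**' finds its files.
set -e
dir=$(mktemp -d)
mkdir -p "$dir/Scripts/App"
touch "$dir/Scripts/App/file1" "$dir/Scripts/App/file2"
cd "$dir/Scripts"
shopt -s globstar nullglob
lower=( **/app/** )   # wrong case: matches nothing
exact=( **/App/** )   # exact case: matches the files
echo "lowercase: ${#lower[@]} matches, exact case: ${#exact[@]} matches"
```

On a Windows agent both patterns would match, which is why the original pipeline only worked there.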

Related

Copy a selection of files to a server using Azure Devops

I am trying to copy a selection of files to a destination folder on a target machine.
In my first version, I can already copy all files to the destination. For that, I use the following task to build an artifact:
steps:
- task: CopyFiles@2
  displayName: 'copy files'
  inputs:
    SourceFolder: $(workingDirectory)
    Contents: '**/files/*'
    flattenFolders: true
    targetFolder: $(Build.ArtifactStagingDirectory)
- task: PublishBuildArtifacts@1
  inputs:
    pathToPublish: $(Build.ArtifactStagingDirectory)
    artifactName: files
Later I try to use that artifact for a deployment:
- stage: Deploy
  displayName: 'Deploy files to destination'
  jobs:
  - deployment: VMDeploy
    displayName: 'download artifacts'
    pool:
      vmImage: 'ubuntu-latest'
    environment:
      name: local_env
      resourceType: VirtualMachine
    strategy:
      runOnce:
        deploy:
          steps:
          - task: DownloadPipelineArtifact@2
            displayName: 'download files'
            inputs:
              artifact: dags
              downloadPath: /opt/myfolder/files
This works perfectly fine for all files.
But what I need is the following:
The 'local_env' environment contains multiple servers. The first three letters of each server name would be the perfect wildcard for the files I need.
Or in other words, if the environment contains names such as 'Capricorn', 'Aries' and 'Pisces', I would like to copy 'cap*.*', 'ari*.*' or 'pis*.*' to the corresponding server.
The way I fixed it for now was:
- task: Bash@3
  inputs:
    targetType: 'inline'
    script: |
      HN=$(hostname | head -c 3)
      cd /opt/myfolder/files/
      rm -r $(ls -I "$HN*.*")
It does its job, but I am open to marking a better solution as the resolution.
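A gentler variant of that cleanup can be sketched as follows (illustrative paths, not the poster's setup): `find` deletes only regular files that don't start with the host prefix, which avoids `rm -r` over an `ls` listing and the breakage that causes on filenames with spaces.

```shell
# Keep only files whose names start with the first three letters of the
# hostname. The directory and prefix below are stand-ins for the real ones.
set -e
dir=$(mktemp -d)                     # stand-in for /opt/myfolder/files
touch "$dir/cap01.txt" "$dir/ari01.txt" "$dir/pis01.txt"
HN="cap"                             # in the pipeline: HN=$(hostname | head -c 3)
find "$dir" -maxdepth 1 -type f ! -name "${HN}*" -delete
ls "$dir"                            # only cap01.txt is left
```

Depending on server naming, the hostname's case may need normalizing first, e.g. `HN=$(hostname | head -c 3 | tr '[:upper:]' '[:lower:]')`.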

Git Checkout fails due to long file path names in the Azure DevOps yaml build pipeline

I have the Unity project code in Azure DevOps Repos and configured the YAML pipeline below to build the Unity project.
trigger:
- none

stages:
- stage: Build
  displayName: Unity Build
  jobs:
  - job: 'UnityBuild'
    displayName: 'Build the Unity application'
    pool:
      name: XXXXXXXXX
    steps:
    - checkout: none
    - script: "git config system core.longpaths true"
    - checkout: self
    - task: UnityBuildTask@3
      inputs:
        buildTarget: 'standalone'
        unityProjectPath: 'XXXXXXXXXX'
        outputPath: '$(Build.BinariesDirectory)'
        outputFileName: 'Standalone'
    - task: UnityGetProjectVersionTask@1
      inputs:
        unityProjectPath: 'XXXXXXXXXX'
    - task: CopyFiles@2
      inputs:
        SourceFolder: '$(Build.BinariesDirectory)'
        Contents: '**'
        TargetFolder: '$(Build.ArtifactStagingDirectory)'
    - task: PublishBuildArtifacts@1
      inputs:
        PathtoPublish: '$(Build.ArtifactStagingDirectory)'
        ArtifactName: 'drop'
        publishLocation: 'Container'
Whenever I run the YAML build pipeline, the build fails before it even executes the Unity build tasks, due to file path length restrictions.
How do I fix the issue of file path names being too long in the Azure DevOps YAML pipeline?
You can run a script before "checkout" that tells Git.exe how to handle long paths (i.e. git config --system core.longpaths true).
See here.
If the agent is running on your own Windows server, then you'll need to configure the server to Enable Long Paths support.
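The workaround can be tried locally; a small sketch (assumes git >= 2.32 on PATH; GIT_CONFIG_GLOBAL points the global scope at a throwaway file so the demo doesn't touch real settings, whereas on the agent you'd use --system as shown above):

```shell
# Write core.longpaths into an isolated config file and read it back.
# GIT_CONFIG_GLOBAL redirects git's "global" config scope for this demo.
set -e
export GIT_CONFIG_GLOBAL=$(mktemp)
git config --global core.longpaths true
git config --global --get core.longpaths   # prints: true
```

Note the flag spelling matters: `git config system core.longpaths true` (no dashes, as in the pipeline above) is not the same command as `git config --system core.longpaths true`.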

Azure Pipeline Copy Secure file into build folder

I have a vite/svelte project which uses .env files for environment settings. I also have an Azure Pipeline which contains a secure file .env.staging; this is on the .gitignore list of the associated repo. I'd like to download this secure file, copy it to my build directory and then have its contents read when I run vite build --mode staging (well, npm run build:staging, which includes vite build...).
When run locally from my machine, npm run build:staging works as expected and reads the .env.staging file; however, it seems to get ignored when used in the pipeline. Am I doing anything wrong?
Here's my YAML:
trigger:
- main

pool:
  vmImage: 'ubuntu-latest'

steps:
- task: DownloadSecureFile@1
  name: "dotenvStaging"
  inputs:
    secureFile: '.env.staging'
  displayName: "Download .env.staging"
- task: NodeTool@0
  inputs:
    versionSpec: 14.15.4
  displayName: "Install Node.JS"
- task: CopyFiles@2
  inputs:
    contents: "$(Agent.TempDirectory)/.env.staging"
    targetFolder: "$(Agent.BuildDirectory)"
  displayName: "Import .env.staging"
- script: npm install
  displayName: "npm install"
- script: npm run build:staging
  displayName: "npm run build:staging"
- task: ArchiveFiles@2
  inputs:
    rootFolderOrFile: 'dist'
    archiveType: 'zip'
    archiveFile: '$(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip'
    #replaceExistingArchive: true
    #verbose: # Optional
    #quiet: # Optional
  displayName: "Create archive"
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip'
    ArtifactName: 'drop'
    publishLocation: 'Container'
  displayName: "Publish archive"
I'm not sure if CopyFiles@2 is doing what I expect, as it just copies whatever files match the contents parameter, which could be 0 files if I'm writing it wrong...
Another note: I also tried using $(dotenvStaging.secureFilePath) as the contents parameter, but it doesn't seem to do anything either.
Naturally, I figured it out as soon as I posted: I needed to update the CopyFiles part to specify sourceFolder; clearly it didn't like my absolute file path in contents.
- task: CopyFiles@2
  inputs:
    sourceFolder: "$(Agent.TempDirectory)"
    contents: ".env.staging"
    targetFolder: "$(Agent.BuildDirectory)"
  displayName: "Import .env.staging"

CopyFiles Task not picking up files

Using Azure DevOps YAML in a database project build and release pipeline.
This bit of code correctly picks up my four dacpac files; I can see them being copied in the console:
- task: CopyFiles@2
  displayName: Copy build output to artifacts staging
  inputs:
    SourceFolder: "$(Build.SourcesDirectory)"
    flattenFolders: true
    Contents: '**\bin\**\*.dacpac'
    TargetFolder: "$(Build.ArtifactStagingDirectory)"
This bit of code correctly picks up my publish files; I can see them being copied in the console:
- task: CopyFiles@2
  displayName: Copy build output to artifacts staging
  inputs:
    SourceFolder: "$(Build.SourcesDirectory)"
    flattenFolders: true
    Contents: '**\PublishProfile\*.publish.xml'
    TargetFolder: "$(Build.ArtifactStagingDirectory)"
This bit of code reports "zero files found":
- task: CopyFiles@2
  displayName: Copy build output to artifacts staging
  inputs:
    SourceFolder: "$(Build.SourcesDirectory)"
    flattenFolders: true
    Contents: |
      '**\bin\**\*.dacpac'
      '**\PublishProfile\*.publish.xml'
    TargetFolder: "$(Build.ArtifactStagingDirectory)"
This pipe/multiline syntax is all over the examples:
https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/utility/copy-files?view=azure-devops&tabs=yaml#examples
I've also used Get-ChildItem to doubly confirm that the files exist.
It seemed like | / multiline doesn't work as described.
As usual, as I wrote this I checked in detail, and the one difference between my code and the example was the single quotes.
So it works if you remove the single quotes.
Does anyone even QA this stuff?
- task: CopyFiles@2
  displayName: Copy build output to artifacts staging
  inputs:
    SourceFolder: "$(Build.SourcesDirectory)"
    flattenFolders: true
    Contents: |
      # NOTE THESE PATHS ARE NOT SURROUNDED BY SINGLE QUOTES
      # EVEN THOUGH THIS WORKS IN THE SINGLE-LINE VERSION
      **\bin\**\*.dacpac
      **\PublishProfile\*.publish.xml
    TargetFolder: "$(Build.ArtifactStagingDirectory)"
Other hot tips to save you hours:
Use this to list files, to help troubleshoot missing files:
- task: Bash@3
  inputs:
    targetType: inline
    workingDirectory: $(PIPELINE.WORKSPACE)
    script: ls -R
Remember Linux is CASE SENSITIVE - get the case wrong and it won't find your files.
As of right now, you can't parameterise service connections. Maybe that will change in the future.
It's possible to get indentation wrong in YAML, and it gives you no clues.
This code makes all the variables in the variable group TST available (these are under "Library", not "Environment" - go figure):
variables:
- group: TST
This code (with extra indentation) doesn't throw an error or give any clues; it just doesn't make any variables available, and all your variables like $(MyVariable) will be treated as literals:
  variables:
  - group: TST

Azure pipeline - unzip artefact, copy one directory into Azure blob store YAML file

I am getting stuck with Azure Pipelines.
I have an existing Node SPA project that needs to be built for each environment (TEST and PRODUCTION). This I can do, but I need a manual step when pushing to PROD. I am using Azure DevOps pipeline environments with Approvals and Checks to mandate this.
The issue is that, when using a 'deploy job' to take an artefact from a previous step, I am unable to find the right directory. This is the YAML file I have so far:
variables:
  # Agent VM image name
  vmImageName: 'ubuntu-latest'

trigger:
- master

# Don't run against PRs
pr: none

stages:
- stage: Development
  displayName: Development stage
  jobs:
  - job: install
    displayName: Install and test
    pool:
      vmImage: $(vmImageName)
    steps:
    - task: NodeTool@0
      inputs:
        versionSpec: '12.x'
      displayName: 'Install Node.js'
    - script: |
        npm install
      displayName: Install node modules
    - script: |
        npm run build
      displayName: 'Build it'
    # Build creates a ./dist folder. The contents will need to be copied to blob store
    - task: ArchiveFiles@2
      inputs:
        rootFolderOrFile: '$(Build.BinariesDirectory)'
        includeRootFolder: true
        archiveType: 'zip'
        archiveFile: '$(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip'
        replaceExistingArchive: true
        verbose: true
  - deployment: ToDev
    environment: development
    dependsOn: install
    strategy:
      runOnce:
        deploy:
          steps:
          - task: DownloadPipelineArtifact@2
            inputs:
              buildType: 'current'
              targetPath: '$(Pipeline.Workspace)'
          - task: ExtractFiles@1
            inputs:
              archiveFilePatterns: '**/*.zip'
              cleanDestinationFolder: true
              destinationFolder: './cpDist/'
          # Somehow within a deploy job retrieve the .zip artefact, unzip, copy the ./dist folder into the blob store
          - task: AzureCLI@2
            inputs:
              azureSubscription: MYTEST-Development
              scriptLocation: "inlineScript"
              scriptType: "bash"
              inlineScript: |
                az storage blob upload-batch -d \$web --account-name davey -s dist --connection-string 'DefaultEndpointsProtocol=https;AccountName=davey;AccountKey=xxxxxxx.yyyyyyyyy.zzzzzzzzzz;EndpointSuffix=core.windows.net'
            displayName: "Copy build files to Development blob storage davey"
          - script: |
              pwd
              ls
              cd cpDist/
              pwd
              ls -al
            displayName: 'list'
          - bash: echo "Done"
If you are confused by the folder paths, you could add a few debug steps that print the known system variables, to understand what is going on, using a PowerShell script as below:
- task: PowerShell@2
  displayName: 'Debug parameters'
  inputs:
    targetType: Inline
    script: |
      Write-Host "$(Build.ArtifactStagingDirectory)"
      Write-Host "$(System.DefaultWorkingDirectory)"
      Write-Host "$(System.ArtifactsDirectory)"
      Write-Host "$(Pipeline.Workspace)"
You should simply publish the build-generated artifacts to the drop folder.
Kindly check this official doc -- Artifact selection -- which explains that you can define the path to download the artifacts to with the following task:
steps:
- download: none
- task: DownloadPipelineArtifact@2
  displayName: 'Download Build Artifacts'
  inputs:
    patterns: '**/*.zip'
    path: '$(Build.ArtifactStagingDirectory)'
Please be aware that the download happens automatically to $(Pipeline.Workspace), so if you don't want your deployment to download the files twice, you need to specify "download: none" in your steps.