I have a Release Pipeline setup to extract my build artifact and then I use the RegEx Find & Replace task to update a few configuration files on my static website before uploading via AzureBlob File Copy task.
What I have currently works. However, I have to use the full source path to the "wwwroot" folder for the correct files to be uploaded.
My concern is that my builds may change the directory structure that gets dropped as a build artifact which will break my release tasks.
I've tried to use various wildcards for the source path such as:
**/wwwroot/*
*/*wwwroot/*
And many other variations to no avail.
I read in some of the docs that wildcards won't work for directory paths in this task but was hoping there was a way to work around this as I have long term concerns about the path changing over time. I looked at a few other tasks in the marketplace but nothing seems to fit the bill and I couldn't find any definitive guidance on the matter. Any suggestions?
1. You can use $(System.DefaultWorkingDirectory)/**/wwwroot/*.* (or *.json, *.xml, ...) in the Source input.
(This works if the wwwroot folder has no subfolders and every file within it matches the *.* pattern.)
2. Otherwise, I suggest you archive the wwwroot folder before uploading it:
Use $(System.DefaultWorkingDirectory) to get the working directory, then use the full path (wildcards are not recommended here) to archive the specific wwwroot folder. Then upload the zipped folder, which contains all the content of your wwwroot folder.
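If it helps to see the archive-then-upload idea written out, here is a rough YAML sketch of the two tasks. The artifact layout (drop/wwwroot), service connection name, and storage account are assumptions, not taken from your pipeline:

```yaml
# Sketch: zip the wwwroot folder found under the downloaded artifact,
# then upload just that zip to blob storage.
steps:
- task: ArchiveFiles@2
  inputs:
    rootFolderOrFile: '$(System.DefaultWorkingDirectory)/drop/wwwroot'  # assumed artifact layout
    includeRootFolder: false
    archiveType: zip
    archiveFile: '$(Build.ArtifactStagingDirectory)/wwwroot.zip'
- task: AzureFileCopy@4
  inputs:
    SourcePath: '$(Build.ArtifactStagingDirectory)/wwwroot.zip'
    azureSubscription: 'my-subscription'    # hypothetical service connection
    Destination: AzureBlob
    storage: 'mystaticsiteaccount'          # hypothetical storage account
    ContainerName: '$web'
```

Pinning the path in one ArchiveFiles step means only one place breaks (with a clear error) if the drop structure ever changes.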
I'm completely new to Azure DevOps Pipelines, so if I'm doing something incorrectly I'd appreciate a nod in the right direction. I set up a build pipeline and that seems to be working; now I'm trying to set up a release pipeline in order to run tests. It's mostly based on Microsoft's documentation:
https://learn.microsoft.com/en-us/azure/devops/test/run-automated-tests-from-test-hub?view=azure-devops
Before running tests I need to transform a config file to replace some variables like access keys, usernames, etc. What I set up is shown below, but for the life of me I can't figure out what the Package or folder text box refers to. The documentation is super helpful, as you can imagine:
File path to the package or a folder
but what package or what folder is this referring to? I've tried several different things, but everything errors with
##[error]Error: No package found with specified pattern D:\a\r1\a\**\*.zip
or pretty much whatever I specify for a value.
The File Transform task supports .zip files.
Testing with the default File Transform task settings, I could reproduce this issue.
In a release pipeline, the file path has one or more extra nodes in front of the build artifact's .zip file.
The format example:
$(System.DefaultWorkingDirectory)\{Source alias name}\{Artifacts name}\*.zip
So you could try setting $(System.DefaultWorkingDirectory)/**/**/*.zip in the Package or folder field
For example:
On the other hand, you can check the specific path in the release log, under the Download Artifacts step:
$(System.DefaultWorkingDirectory): D:\a\r1\a
You could also use this specific path in the task.
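For reference, the same setting in YAML form would look roughly like this; the source alias (_MyBuild), artifact name (drop), and target file name are placeholders you'd replace with your own:

```yaml
# Sketch: File Transform task pointed at the build artifact's zip.
# '_MyBuild' and 'drop' stand in for your source alias and artifact name.
- task: FileTransform@1
  inputs:
    folderPath: '$(System.DefaultWorkingDirectory)/_MyBuild/drop/*.zip'
    fileType: json
    targetFiles: '**/appsettings.json'   # assumed config file to transform
```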
Update:
If your file is in a project folder, refer to the following sample:
File structure:
Task Settings:
Note: you only need to specify the path down to the folder node.
You could also select the folder path via ... option.
I have a large (lots of dependencies, thousands of files) nodejs app that I am deploying with an Azure Devops YAML build and Azure Devops "classic editor" release pipeline.
Is there some way to copy JUST new and changed files during a file copy, not every single file? My goal is to reduce the time it takes to complete the copy files step of the deploy, as I deploy frequently, but usually with just changes to one or a few files.
About copying only the changed files into artifacts for releasing: if the changed files are in a specific folder, you can copy them by specifying the SourceFolder and Contents arguments in the Copy Files task.
If the changed files are distributed across different folders, I'm afraid there is no out-of-the-box way to pick only the changed files with the Copy Files or Publish Artifacts task.
As a workaround, we could add a PowerShell task that deletes all files (recursively) whose timestamp is older than (Now - x min); this way the artifact directory contains ONLY changed files. For more detailed info, please refer to this similar case.
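The answer above describes this as a PowerShell task; the same pruning logic can be sketched in Python (runnable from a script task), assuming the artifact directory path is passed in and a hypothetical 10-minute window:

```python
import os
import time

def keep_only_recent(root: str, max_age_minutes: int) -> None:
    """Delete files under root whose modification time is older than
    max_age_minutes, then prune any directories left empty."""
    cutoff = time.time() - max_age_minutes * 60
    # walk bottom-up so empty subfolders can be removed after their files
    for dirpath, dirnames, filenames in os.walk(root, topdown=False):
        for name in filenames:
            path = os.path.join(dirpath, name)
            if os.path.getmtime(path) < cutoff:
                os.remove(path)
        # remove the directory itself if it is now empty (keep the root)
        if dirpath != root and not os.listdir(dirpath):
            os.rmdir(dirpath)
```

Note this assumes the build actually rewrites only the changed files; if your build regenerates every file's timestamp, this approach keeps everything.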
Alternatively, you can call the Commits - Get Changes REST API through a script in a PowerShell task, retrieve the changed files from the response, and copy them to a specific target folder.
GET https://dev.azure.com/{organization}/{project}/_apis/git/repositories/{repositoryId}/commits/{commitId}/changes?api-version=5.0
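A sketch of that second approach in Python; the organization/repository parameters and the sample payload shape shown in the test are illustrative, and the real call needs a PAT with code-read scope:

```python
import base64
import json
import urllib.request

def get_changed_paths(org, project, repo_id, commit_id, pat):
    """Call Commits - Get Changes and return the changed file paths."""
    url = (f"https://dev.azure.com/{org}/{project}/_apis/git/repositories/"
           f"{repo_id}/commits/{commit_id}/changes?api-version=5.0")
    req = urllib.request.Request(url)
    # PAT auth is basic auth with an empty username
    token = base64.b64encode(f":{pat}".encode()).decode()
    req.add_header("Authorization", f"Basic {token}")
    with urllib.request.urlopen(req) as resp:
        return extract_changed_paths(json.load(resp))

def extract_changed_paths(response: dict) -> list:
    """Pull file paths out of the 'changes' array, skipping folder entries."""
    return [c["item"]["path"]
            for c in response.get("changes", [])
            if c.get("item", {}).get("gitObjectType") == "blob"]
```

The returned paths can then be fed to a copy step targeting your staging folder.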
Azure DevOps Build artifact drop contains following files:
Project1.zip
Project1.deploy.cmd
Project2.zip
Project2.deploy.cmd
These files are a result of build output of a single solution.
I want to deploy Project1 web app to Azure.
I set up a release definition and add a Deploy Azure Web Service task to it. The task's path to package file or folder field contains a value that matches exactly one file: Project1.zip. Nowhere do I specify any *.deploy.cmd files.
When the release executes it fails on that task with error:
More than one package matched with specified pattern: *.deploy.cmd. Please restrain the search pattern.
The only work-around I've found is to delete Project2.deploy.cmd so that the deploy task won't find multiple files using the specified search pattern. But I would prefer not to delete it, especially because I would like to eventually deploy Project2 too. How do I solve this problem?
I would suggest that you locate your artifacts in different folders.
You can use a Powershell task before you invoke your web deploy task to
create two folders
move the .cmd and .zip files for project 1 into one folder
move the .cmd and .zip files for project 2 into the second folder
You can then invoke your web deploy task against the folder path for project 1 and this should side-step your issue.
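The move step could be sketched like this (shown in Python rather than PowerShell for concreteness; the project names come from the artifact listing above, and the artifact directory path is assumed):

```python
import os
import shutil

def split_projects(artifact_dir: str, project_names) -> None:
    """Move each project's .zip and .deploy.cmd into its own subfolder,
    so a deploy task can target one project's folder unambiguously."""
    for project in project_names:
        dest = os.path.join(artifact_dir, project)
        os.makedirs(dest, exist_ok=True)
        for ext in (".zip", ".deploy.cmd"):
            src = os.path.join(artifact_dir, project + ext)
            if os.path.exists(src):
                shutil.move(src, os.path.join(dest, project + ext))
```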
You can also extract the zip file and then point the web deploy task at the folder instead which may also help you if the above is either too complex or fails for your specific use case.
Don't use a wildcard. Specify an exact path to the file you want to deploy. If you want to deploy multiple things, use multiple tasks.
I am new to ADO, so am probably misunderstanding this, or just doing it wrong, so please be patient.
I want to deploy my web site to my server via FTP. As far as I can see, the build creates a zip file, which I need to extract, and then upload the extracted files. Please correct me if I got this wrong, because if I did, then what follows is probably irrelevant.
I have set up a release pipeline (started with an empty one) and added an Extract Files agent job...
I then added an FTP Upload task to upload the extracted files to my server...
The problem is that my server ends up with all of the files in the root folder. The hierarchy has been lost completely...
There should be a wwwroot folder there with the static content. The folder doesn't exist, and the static files (such as the *.js and *.css files you see in the screenshot) are all in the root folder.
The zip file did contain this folder...
Anyone able to explain what I did wrong?
Azure DevOps Extract Files task doesn't preserve hierarchy
When you want to keep the folder structure with the FTP Upload task, you can select the Preserve file paths checkbox in the FTP Upload task settings.
If you didn't, please check it; then the folder structure of the files will be preserved:
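In YAML form, the checkbox corresponds to the preservePaths input; the service connection name and directories here are placeholders:

```yaml
# Sketch: FTP upload that keeps the extracted folder hierarchy.
- task: FtpUpload@2
  inputs:
    credentialsOption: serviceEndpoint
    serverEndpoint: 'MyFtpConnection'    # hypothetical service connection
    rootDirectory: '$(System.DefaultWorkingDirectory)/extracted'  # assumed extract target
    filePatterns: '**'
    remoteDirectory: '/site'
    preservePaths: true    # the "Preserve file paths" checkbox
```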
Hope this helps.
I'm trying to set up automatic build + deploy for a rather large solution. The single solution produces 2 zip folders in the "$(Build.ArtifactStagingDirectory)" location, and these contain all the right files. For the purposes of testing/troubleshooting I am only looking at one of these zip files, but eventually both sites will have to be deployed this way.
However, to get to the actual files, you have to pass through 14 unnecessary subfolders. To further complicate matters, about 8 of these are variable, based on certain elements of the build configuration (some are due to folder structure in the git repo).
I don't want any of these subfolders. The other problem is that I don't actually want a 100% flat layout; I need 2 folders with subfolders to be contained within the finally-extracted directory. (The final directory is a physical path for an IIS site.) Is there any way to do this?
I have tried:
Taking the generated zip file, extracting it to a temp directory, and repackaging it, all on the build machine.
To get this to work, I had to manually specify the 14 subdirectories. Also, I was unable to use "Publish Artifact" to upload the resulting zip to VSTS, so I'm not sure how to get it onto the server box.
Downloading the published zip file from VSTS, extracting it locally on the release machine, and then copying the contents to the correct directory.
This works only when I manually specify the 14 folders contained in the directory. I made an attempt to hide the 14 folders with wildcards but only succeeded in copying the excessive nesting structure - I'm guessing the "Source Folder" parameter doesn't support wildcards (I had to actually do it in the "Contents" section or the task failed).
Using the "Flatten Folders" advanced option in the copy dialog.
This removes ALL folder structure, which is not what I want. I need to end up with 2 folders containing subfolders in the final directory.
If it's not possible to only partially flatten the zip generated by the build step, I'd appreciate some help figuring out how much of this terribly convoluted path I can pull out using variables.
There is a very simple way to move only the contents and subdirectories of the PackageTmp folder to the build artifacts folder while shedding the unnecessary folder structure above it, without using the "Flatten Folders" option (since you likely want to keep the folder structure under PackageTmp intact):
First, in your Build Solution task, set the MS Build Arguments similar to the following:
/p:DeployOnBuild=true /p:WebPublishMethod=Package /p:PackageAsSingleFile=false /p:SkipInvalidConfigurations=true /p:PackageLocation="$(build.binariesdirectory)"
Notice that /p:PackageAsSingleFile is set to false; you don't want to zip the package up just yet. Also note the setting /p:PackageLocation="$(build.binariesdirectory)". You don't want the output to go directly to the artifact staging directory, as is configured by default.
Next, add a Powershell task, and add this inline script:
$folder = Get-ChildItem -Directory -Path '.\*' -Include 'PackageTmp' -Recurse
Write-Host "##vso[task.setvariable variable=PathToPackageTmpFolder]$($folder.FullName)"
This will store the fully qualified path to the PackageTmp folder in a variable named PathToPackageTmpFolder. Under Advanced options, set the Working Directory to $(build.binariesdirectory).
Now add a Copy Files task to move only the contents of PackageTmp and its subfolders to the artifact staging directory. Set Source Folder to $(PathToPackageTmpFolder), Contents to **, and Target Folder to $(build.artifactstagingdirectory). You're all set!
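If this pipeline ever moves to YAML, the three steps above would look roughly like the following. The variable name follows the answer; the solution pattern and task versions are assumptions:

```yaml
steps:
- task: VSBuild@1
  inputs:
    solution: '**/*.sln'    # assumed solution pattern
    msbuildArgs: >-
      /p:DeployOnBuild=true /p:WebPublishMethod=Package
      /p:PackageAsSingleFile=false /p:SkipInvalidConfigurations=true
      /p:PackageLocation="$(Build.BinariesDirectory)"
- powershell: |
    $folder = Get-ChildItem -Directory -Path '.\*' -Include 'PackageTmp' -Recurse
    Write-Host "##vso[task.setvariable variable=PathToPackageTmpFolder]$($folder.FullName)"
  workingDirectory: '$(Build.BinariesDirectory)'
- task: CopyFiles@2
  inputs:
    SourceFolder: '$(PathToPackageTmpFolder)'
    Contents: '**'
    TargetFolder: '$(Build.ArtifactStagingDirectory)'
```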
That is the MSBuild deploy/publish-to-package behavior, and the folder structure won't remain after deploying to the server.
You can specify the /p:PackageTempRootDir="" MSBuild argument to ignore the folder structure of the project path.
Another way is to publish the project through the FileSystem method, then archive the files with the Archive Files task.