Azure DevOps unzip as a deploy task - deployment

Does anyone know how to unzip a file as part of the deployment pipeline?
At the moment the deployment finishes with a zip archive at frontend/download/myfiles.zip.
I want to add a task which will take this zip file and extract it into e.g.
frontend/download/archive/...
Thank you

There is an Extract Archive task provided by Microsoft that you can use to extract archived files.
After you add it as a task that runs on your deployment group, you can configure it to do what you need. Make sure to set the Archive file patterns and Destination folder inputs correctly.
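For example, a minimal sketch for the scenario in the question (assuming the archive sits under the default working directory on the deployment agent; adjust the paths to your layout):
steps:
- task: ExtractFiles@1
  displayName: Unzip myfiles.zip
  inputs:
    # Path to the archive that the deployment produces (assumed location).
    archiveFilePatterns: '$(System.DefaultWorkingDirectory)/frontend/download/myfiles.zip'
    # Folder the contents should be extracted into.
    destinationFolder: '$(System.DefaultWorkingDirectory)/frontend/download/archive'
    cleanDestinationFolder: false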

Microsoft has provided documentation for this, but I would like to share some info related to the YAML snippet:
steps:
- task: ExtractFiles@1
  inputs:
    archiveFilePatterns: 'QtBinaries.rar'
    destinationFolder: '$(Build.SourcesDirectory)\bin'
    cleanDestinationFolder: false
- task: ExtractFiles@1
  inputs:
    archiveFilePatterns: 'AzureNeededDlls.rar'
    destinationFolder: '$(Build.SourcesDirectory)\bin'
    cleanDestinationFolder: false
Here I am extracting two RAR files (QtBinaries.rar and AzureNeededDlls.rar). We can also do it as one task, depending on need; see the sketch below.
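For reference, a minimal sketch of the single-task variant, with one pattern per line in archiveFilePatterns (same paths as above):
steps:
- task: ExtractFiles@1
  inputs:
    # One minimatch pattern per line; both archives are extracted by a single task.
    archiveFilePatterns: |
      QtBinaries.rar
      AzureNeededDlls.rar
    destinationFolder: '$(Build.SourcesDirectory)\bin'
    cleanDestinationFolder: false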

Related

Image version by BuildID when running the 'Build module images' task for IoT Edge - Azure pipeline

I am building a Docker image for Azure IoT Edge using a pipeline, then pushing it to Azure Container Registry. Everything worked, but:
Currently the image version is fixed (or needs to be set manually) in module.json:
However, I want the image version to update with the BuildID (or any unique ID). I tried the code below but it did not work:
This is the error from the pipeline log:
I tried reading the 'Build module images' documentation, but it doesn't have an option for tagging like building an image with the Docker v2 task.
Hope you can help me on it!
The pipeline variable $(Build.BuildId) will not be passed directly to the module.json file.
To meet your requirement, you can add the File transform task to pass the variable value to the module.json file.
Here is an example:
variables:
  image.tag.version: $(Build.BuildId)
steps:
- task: FileTransform@1
  inputs:
    folderPath: '$(System.DefaultWorkingDirectory)'
    fileType: 'json'
    targetFiles: 'module.json'
- task: Docker@2
Or you can also use the Replace Tokens task. Refer to this ticket: How to apply a variable in Azure Pipelines to a node app during build
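For context, the File transform task with fileType: 'json' matches pipeline variable names against key paths in the JSON file, so the image.tag.version variable above targets a module.json shaped roughly like this (a simplified sketch; real IoT Edge module.json files contain more fields, and the repository value here is a placeholder):
{
  "image": {
    "repository": "myregistry.azurecr.io/mymodule",
    "tag": {
      "version": "0.0.1",
      "platforms": {
        "amd64": "./Dockerfile.amd64"
      }
    }
  }
}
After the task runs, the value at image.tag.version is replaced with the build ID.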

Azure pipeline: trying to upload files at root of FTP instead of ./pipeline

I'm trying to upload project files to FTP using an Azure Pipeline. It works so far, but one problem remains: when Azure uploads the files, it puts them inside ./pipeline instead of directly in the remote directory.
Here is my YAML configuration:
steps:
- task: FtpUpload@2
  displayName: 'FTP Upload: ./dist'
  inputs:
    serverEndpoint: '<FTP Service>'
    rootDirectory: ./dist
    filePatterns: './**/*.*'
    remoteDirectory: .
    clean: true
    preservePaths: true
    trustSSL: true
I tried running FTP commands to move the files within the task, but that doesn't work at all.
Is there a way to upload directly to the remote directory without any superfluous folder?
Should I run something else afterwards?
In case it is of any use, I run two npm tasks beforehand to deploy a Vue.js app.
EDIT / SPECIFICATION
The deployment works. The problem is that the application cannot be accessed at myurl.net; it has to be accessed at myurl.net/pipeline.

Not found scriptPath in Azure DevOps

I put a shell script in a folder at my repo root and tried to run it in my DevOps pipeline, but it says it cannot find the scriptPath:
[error]Not found scriptPath: /home/vsts/work/1/s/pipelines/databricks-cli-config.sh
I am simply creating a task to run the shell script, like this:
- task: ShellScript@2
  inputs:
    scriptPath: 'pipelines/databricks-cli-config.sh'
    args: '$(databricks_host) $(databricks_token)'
  displayName: "Install and configure the Databricks CLI"
Any idea?
Make sure you check out your code and that you are at the correct level. If you are in a regular job, please add the working directory:
- task: ShellScript@2
  inputs:
    scriptPath: 'pipelines/databricks-cli-config.sh'
    args: '$(databricks_host) $(databricks_token)'
    cwd: '$(System.DefaultWorkingDirectory)'
  displayName: "Install and configure the Databricks CLI"
And if you use it in a deployment job: by default, code is not checked out there. So you need to publish this script as an artifact and then download it in the deployment job (deployment jobs download artifacts by default), or add a
- checkout: self
step to download the code in the deployment job (see the sketch below).
I assumed that you are using YAML.
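Putting that together, a minimal sketch of a deployment job that checks out the repo before running the script (the environment name is a placeholder; the variables are the ones from the question):
jobs:
- deployment: Deploy
  environment: 'my-environment'
  strategy:
    runOnce:
      deploy:
        steps:
        # Deployment jobs do not check out code by default.
        - checkout: self
        - task: ShellScript@2
          inputs:
            scriptPath: 'pipelines/databricks-cli-config.sh'
            args: '$(databricks_host) $(databricks_token)'
            # Assumes the checkout lands in the default working directory;
            # adjust if your agent lays out the workspace differently.
            cwd: '$(System.DefaultWorkingDirectory)'
          displayName: "Install and configure the Databricks CLI"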

Azure build pipelines - using the 'DownloadSecureFile' task within a template

I have an Azure DevOps project with a couple of YAML build pipelines that share a common template for some of their tasks.
I now want to add a DownloadSecureFile task to that template, but I can't find a way to get it to work.
The following snippet results in an error when added to the template, but works fine in the parent pipeline definition (assuming I also replace the ${{ xx }} syntax for the variable names with the $(xx) version):
- task: DownloadSecureFile@1
  name: securezip
  displayName: Download latest files
  inputs:
    secureFile: xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    retryCount: 5
- task: ExtractFiles@1
  displayName: Extract files
  inputs:
    archiveFilePatterns: ${{ variables.securezip.secureFilePath }}
    destinationFolder: '${{ parameters.sourcesDir }}\secure\'
    cleanDestinationFolder: true
The error occurs on the 'Extract files' step and is 'Input required: archiveFilePatterns', so it looks like it's just not finding the variable.
As a workaround, I could move the download task to the parent pipeline scripts and pass the file path as a parameter. However, that means duplicating the task, which seems like a bit of a hack.
Variables in the dollar-double-curly-brackets syntax (${{ }}) are resolved at template expansion time; they cannot hold the output of tasks.
Output variables from tasks are referenced with the dollar-single-parentheses syntax ($( )), and they don't need to start with the word "variables".
So I believe the line you're looking for is the following, and it isn't affected by the template mechanism:
archiveFilePatterns: $(securezip.secureFilePath)
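So the template snippet would become something like the following (same inputs as before; only the variable reference changes):
- task: DownloadSecureFile@1
  name: securezip
  displayName: Download latest files
  inputs:
    secureFile: xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    retryCount: 5
- task: ExtractFiles@1
  displayName: Extract files
  inputs:
    # Runtime expression: resolved when the job runs, after DownloadSecureFile
    # has produced its secureFilePath output variable.
    archiveFilePatterns: $(securezip.secureFilePath)
    destinationFolder: '${{ parameters.sourcesDir }}\secure\'
    cleanDestinationFolder: true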

Azure DevOps pipeline test step fails - incorrect path to data files

I have a Repo containing three solutions. Each solution has multiple projects, many of which are shared (including test projects).
I have a build pipeline along the following lines:
Retrieve NuGet packages
Build Solution 1
Build Solution 2
Build Solution 3
Execute all tests
The step to execute all tests is as follows:
- task: VSTest@2
  displayName: 'Test'
  inputs:
    testSelector: 'testAssemblies'
    testAssemblyVer2: |
      **\*test*.dll
      !**\*TestAdapter.dll
      !**\obj\**
    searchFolder: '$(System.DefaultWorkingDirectory)'
The vast majority of tests run perfectly. However, I get the odd and rather confusing error message for some tests:
[error]SetUp failed for test fixture TestProjectOne.A.B.GetSomethingTests
[error]SetUp : System.IO.DirectoryNotFoundException : Could not find a part of the path
'd:\a\1\s\Projects\TestProjectTwo\A\B\TestData\SomeFile.txt'.
So it's currently processing TestProjectOne but then saying it can't find a file under a path for TestProjectTwo.
The code within the test is as follows:
private const string RelativePath = @"..\..\A\B\TestData\";
...
var x = File.ReadAllText(RelativePath + "SomeFile.txt");
Needless to say, this works perfectly using Visual Studio 2019 using both the Visual Studio and ReSharper test runner.
Why would an Azure DevOps pipeline suffer this issue?
Why would an Azure DevOps pipeline suffer this issue?
That is because we use wildcards in the VS Test task:
- task: VSTest@2
  displayName: 'Test'
  inputs:
    testSelector: 'testAssemblies'
    testAssemblyVer2: |
      **\*test*.dll
This will grab all *test*.dll files in the $(System.DefaultWorkingDirectory) folder, including subfolders.
Obviously, the great convenience of this method is that we do not have to pick up each *test*.dll from its folder one by one. But one problem with it is that, since we are using wildcards, it loses the full path of each *test*.dll file. In this case, if we specify the relative path ..\..\A\B\TestData\ in the test code, it will not resolve to the correct path, because the current *test*.dll file has lost its full path.
That is the reason why it executes the test DLL from TestProjectOne.A.B.GetSomethingTests but resolves the path under TestProjectTwo.
To resolve this issue, we could specify the full path in the test code instead of the relative path (see the sketch below).
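Since hard-coding an agent path like d:\a\1\s\... is brittle, a variation on the same idea is to anchor the path to the test assembly's own directory. A minimal sketch with NUnit (an assumption based on the SetUp failure message), which assumes the TestData folder is copied to the build output:
using System.IO;
using NUnit.Framework;

[TestFixture]
public class GetSomethingTests
{
    // TestContext.CurrentContext.TestDirectory is the directory containing
    // the executing test assembly, so the path no longer depends on the
    // test runner's working directory.
    private static string TestDataPath =>
        Path.Combine(TestContext.CurrentContext.TestDirectory, "TestData");

    [Test]
    public void ReadsTestDataFile()
    {
        var x = File.ReadAllText(Path.Combine(TestDataPath, "SomeFile.txt"));
        Assert.That(x, Is.Not.Empty);
    }
}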
Hope this helps.