Azure DevOps deploy to function app with access restrictions - azure-devops

I am trying to deploy a function app via an Azure DevOps pipeline, however I am receiving the following error:
##[error]Failed to deploy web package to App Service.
##[error]To debug further please check Kudu stack trace URL : $URL_REMOVED
##[error]Error: Error: Failed to deploy web package to App Service. Ip Forbidden (CODE: 403)
From some googling, a suggested solution seems to be to whitelist the agent IP before the deployment and then remove it afterwards. I have added this to my pipeline, and I can see the agent IP get added to the access restrictions, however the deployment still fails.
Here is my pipeline file:
# Node.js Function App to Linux on Azure
# Build a Node.js function app and deploy it to Azure as a Linux function app.
# Add steps that analyze code, save build artifacts, deploy, and more:
# https://learn.microsoft.com/azure/devops/pipelines/languages/javascript
trigger:
- main

variables:
  # Azure Resource Manager connection created during pipeline creation
  azureSubscription: 'xxx'

  # Function app name
  functionAppName: 'xxx'

  # Environment name
  environmentName: 'xxx'

  # Agent VM image name
  vmImageName: 'ubuntu-latest'

stages:
- stage: Build
  displayName: Build stage
  jobs:
  - job: Build
    displayName: Build
    pool:
      vmImage: $(vmImageName)
    steps:
    - task: NodeTool@0
      inputs:
        versionSpec: '10.x'
      displayName: 'Install Node.js'

    - script: |
        if [ -f extensions.csproj ]
        then
            dotnet build extensions.csproj --runtime ubuntu.16.04-x64 --output ./bin
        fi
      displayName: 'Build extensions'

    - script: |
        npm install
        npm run build --if-present
        npm run test --if-present
      displayName: 'Prepare binaries'

    - task: ArchiveFiles@2
      displayName: 'Archive files'
      inputs:
        rootFolderOrFile: '$(System.DefaultWorkingDirectory)'
        includeRootFolder: false
        archiveType: zip
        archiveFile: $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip
        replaceExistingArchive: true

    - upload: $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip
      artifact: drop

- stage: Deploy
  displayName: Deploy stage
  dependsOn: Build
  condition: succeeded()
  jobs:
  - deployment: Deploy
    displayName: Deploy
    environment: $(environmentName)
    pool:
      vmImage: $(vmImageName)
    strategy:
      runOnce:
        deploy:
          steps:
          - task: AzureCLI@2
            inputs:
              azureSubscription: '$(azureSubscription)'
              scriptType: 'bash'
              scriptLocation: 'inlineScript'
              inlineScript: |
                agentIP=$(curl -s https://api.ipify.org/)
                az functionapp config access-restriction add -g xxx -n xxx --action Allow --ip-address $agentIP --priority 200

          - task: AzureFunctionApp@1
            displayName: 'Azure Functions App Deploy: xxx'
            inputs:
              azureSubscription: '$(azureSubscription)'
              appType: functionAppLinux
              appName: $(functionAppName)
              package: '$(Pipeline.Workspace)/drop/$(Build.BuildId).zip'
Is anyone able to advise where I am going wrong?

I've had a similar issue while adding the agent IP to the network restrictions of a storage account (using PowerShell, but you'll get the idea); we added a 60-second sleep to make sure the settings are taken into account by Azure.
$sa_name = "sapricer$env_prefix"
if ($null -ne (Get-AzStorageAccount -ResourceGroupName $sa_rg -AccountName $sa_name -ErrorAction Ignore)) {
Write-Output "Storage account '$sa_name' exists"
if ($enable) {
Write-Output "Add ip rule for $current_ip on $sa_name..."
Add-AzStorageAccountNetworkRule -ResourceGroupName $sa_rg -AccountName $sa_name -IPAddressOrRange $current_ip
}
else {
Write-Output "Remove ip rule for $current_ip on $sa_name..."
Remove-AzStorageAccountNetworkRule -ResourceGroupName $sa_rg -AccountName $sa_name -IPAddressOrRange $current_ip
}
}
Start-Sleep -Seconds 60
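For the function app case in the question, the same pattern would look roughly like this (a sketch; the resource group and app names are the question's placeholders, and 60 seconds is just the delay that worked for us):
- task: AzureCLI@2
  inputs:
    azureSubscription: '$(azureSubscription)'
    scriptType: 'bash'
    scriptLocation: 'inlineScript'
    inlineScript: |
      agentIP=$(curl -s https://api.ipify.org/)
      az functionapp config access-restriction add -g xxx -n xxx --action Allow --ip-address $agentIP --priority 200
      # Give the rule time to propagate before the deployment starts
      sleep 60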

I found the solution to this.
Function Apps have two IP Restriction sections, one for the App and one for the SCM site. The SCM site is the one that requires the IP to be whitelisted in order for the deployment to work:
az functionapp config access-restriction add --scm-site true -g xxx -n xxx --action Allow --ip-address $agentIP --priority 200
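For completeness, a minimal sketch of the full flow: allow the agent IP on the SCM site, deploy, then remove the rule again (resource names are placeholders; az functionapp config access-restriction remove is the counterpart of add, though I haven't verified whether the bare IP or the /32 form is needed on removal):
- task: AzureCLI@2
  displayName: 'Allow agent IP on SCM site'
  inputs:
    azureSubscription: '$(azureSubscription)'
    scriptType: 'bash'
    scriptLocation: 'inlineScript'
    inlineScript: |
      agentIP=$(curl -s https://api.ipify.org/)
      # Save the IP so the cleanup step removes the same rule
      echo "##vso[task.setvariable variable=agentIP]$agentIP"
      az functionapp config access-restriction add --scm-site true -g xxx -n xxx --action Allow --ip-address $agentIP --priority 200

- task: AzureFunctionApp@1
  inputs:
    azureSubscription: '$(azureSubscription)'
    appType: functionAppLinux
    appName: $(functionAppName)
    package: '$(Pipeline.Workspace)/drop/$(Build.BuildId).zip'

- task: AzureCLI@2
  displayName: 'Remove agent IP from SCM site'
  condition: always()
  inputs:
    azureSubscription: '$(azureSubscription)'
    scriptType: 'bash'
    scriptLocation: 'inlineScript'
    inlineScript: |
      az functionapp config access-restriction remove --scm-site true -g xxx -n xxx --ip-address $(agentIP)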

You can deploy an Azure function app from an Azure DevOps pipeline using the Azure Function App task.
Here is a sample snippet for deploying an Azure function app:
variables:
  azureSubscription: Contoso
  # To ignore SSL error, uncomment the below variable
  # VSTS_ARM_REST_IGNORE_SSL_ERRORS: true

steps:
- task: AzureFunctionApp@1
  displayName: Azure Function App Deploy
  inputs:
    azureSubscription: $(azureSubscription)
    appName: samplefunctionapp
    package: $(System.DefaultWorkingDirectory)/**/*.zip
Here is the Microsoft document for deploying an Azure function app.

Related

AzureCLI task in Azure DevOps pipeline cannot find the location of scripts

I am using an Azure DevOps pipeline, and in one of my tasks I need to run a bash script which contains some Azure CLI commands. I have put this script in a folder called scripts, and my pipeline runs in the pipelines folder. The pipelines and scripts folders are at the same level in the root directory. The following shows the part of my pipeline where I run the AzureCLI@2 task, but when the pipeline runs it raises an error that it cannot find the file!
I have already pushed everything to the repository and I can see the files. However, the pipeline cannot find them. I am using the AzureCLI@2 documentation to provide values for this task. The part of the pipeline that uses AzureCLI is as follows:
pool:
  vmImage: ubuntu-20.04

trigger:
  branches:
    include:
    - "feature/ORGTHDATAMA-4810"
    exclude:
    - "main"
    - "release"
  paths:
    include:
    - "dip-comma-poc/**"

variables:
- group: proj-comma-shared-vg

stages:
- stage: DownloadArtifact
  displayName: "Download python whl from artifactory"
  jobs:
  - job: DownloadArtifactJob
    steps:
    - checkout: self

    ## To download from devops artifactory with AZ CLI
    ## https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/azure-cli-v2?view=azure-pipelines
    - task: AzureCLI@2
      inputs:
        azureSubscription: "sc-arm-pa042-man"
        scriptType: 'bash'
        scriptLocation: 'scriptPath'
        scriptPath: 'dip-comma-poc/deployment-pipelines/scripts/sp-login.sh'
        arguments: '$(SVCApplicationID) $(SVCSecretKey) $(SVCDirectoryID)'
      displayName: "Download python whl from artifactory"
This caused a "No such file or directory" error.
To resolve the error I tried using a relative path in scriptPath as follows, but it caused the same error:
- task: AzureCLI@2
  inputs:
    azureSubscription: "sc-arm-pa042-man"
    scriptType: 'bash'
    scriptLocation: 'scriptPath'
    scriptPath: './scripts/sp-login.sh'
    arguments: '$(SVCApplicationID) $(SVCSecretKey) $(SVCDirectoryID)'
  displayName: "Download python whl from artifactory"
I also tried inlineScript but again it cannot find the file.
- task: AzureCLI@2
  inputs:
    azureSubscription: "sc-arm-pa042-man"
    scriptType: 'bash'
    scriptLocation: 'inlineScript'
    arguments: '$(SVCApplicationID) $(SVCSecretKey) $(SVCDirectoryID)'
    inlineScript: './scripts/sp-login.sh $1 $2 $3'
  displayName: "Download python whl from artifactory"
This also raised the same error.
How can I refer to my script in the pipeline YAML file so that it does not raise the "No such file or directory" error shown above? Thank you.
Open the Git repository in the web UI of your Azure DevOps and check its file structure. If the repository itself is named dip-comma-poc, so that deployment-pipelines sits at the repository root, then the checkout does not include the repository name in the path. In that case you need to change the file path set on the Azure CLI task to "deployment-pipelines/scripts/sp-login.sh" instead of "dip-comma-poc/deployment-pipelines/scripts/sp-login.sh".
- task: AzureCLI@2
  inputs:
    azureSubscription: "sc-arm-pa042-man"
    scriptType: 'bash'
    scriptLocation: 'scriptPath'
    scriptPath: 'deployment-pipelines/scripts/sp-login.sh'
    arguments: '$(SVCApplicationID) $(SVCSecretKey) $(SVCDirectoryID)'
  displayName: "Download python whl from artifactory"

Azure pipeline - unzip artefact, copy one directory into Azure blob store YAML file

I am getting stuck with Azure pipelines.
I have an existing node SPA project that needs to be built for each environment (TEST and PRODUCTION). This I can do, but I need a manual step when pushing to PROD. I am using Azure DevOps pipeline environments with Approvals and Checks to mandate this.
The issue is that when using a 'deploy job' to take an artefact from a previous step, I am unable to find the right directory. This is the YAML file I have so far:
variables:
  # Agent VM image name
  vmImageName: 'ubuntu-latest'

trigger:
- master

# Don't run against PRs
pr: none

stages:
- stage: Development
  displayName: Development stage
  jobs:
  - job: install
    displayName: Install and test
    pool:
      vmImage: $(vmImageName)
    steps:
    - task: NodeTool@0
      inputs:
        versionSpec: '12.x'
      displayName: 'Install Node.js'

    - script: |
        npm install
      displayName: Install node modules

    - script: |
        npm run build
      displayName: 'Build it'

    # Build creates a ./dist folder. The contents will need to be copied to blob store
    - task: ArchiveFiles@2
      inputs:
        rootFolderOrFile: '$(Build.BinariesDirectory)'
        includeRootFolder: true
        archiveType: 'zip'
        archiveFile: '$(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip'
        replaceExistingArchive: true
        verbose: true

  - deployment: ToDev
    environment: development
    dependsOn: install
    strategy:
      runOnce:
        deploy:
          steps:
          - task: DownloadPipelineArtifact@2
            inputs:
              buildType: 'current'
              targetPath: '$(Pipeline.Workspace)'

          - task: ExtractFiles@1
            inputs:
              archiveFilePatterns: '**/*.zip'
              cleanDestinationFolder: true
              destinationFolder: './cpDist/'

          # Somehow within a deploy job retrieve the .zip artefact, unzip, copy the ./dist folder into the blob store
          - task: AzureCLI@2
            inputs:
              azureSubscription: MYTEST-Development
              scriptLocation: "inlineScript"
              scriptType: "bash"
              inlineScript: |
                az storage blob upload-batch -d \$web --account-name davey -s dist --connection-string 'DefaultEndpointsProtocol=https;AccountName=davey;AccountKey=xxxxxxx.yyyyyyyyy.zzzzzzzzzz;EndpointSuffix=core.windows.net'
            displayName: "Copy build files to Development blob storage davey"

          - script: |
              pwd
              ls
              cd cpDist/
              pwd
              ls -al
            displayName: 'list'

          - bash: echo "Done"
If you are confused by the folder paths, you could add a few debug steps that print the values of the known system variables to understand what is going on, using a PowerShell script as below:
- task: PowerShell@2
  displayName: 'Debug parameters'
  inputs:
    targetType: Inline
    script: |
      Write-Host "$(Build.ArtifactStagingDirectory)"
      Write-Host "$(System.DefaultWorkingDirectory)"
      Write-Host "$(System.ArtifactsDirectory)"
      Write-Host "$(Pipeline.Workspace)"
      Write-Host "$(System.ArtifactsDirectory)"
You should simply publish the build-generated artifacts to the drop folder.
Kindly check this official doc -- Artifact selection -- which explains that you can define the path to download the artifacts to with the following task:
steps:
- download: none

- task: DownloadPipelineArtifact@2
  displayName: 'Download Build Artifacts'
  inputs:
    patterns: '**/*.zip'
    path: '$(Build.ArtifactStagingDirectory)'
Please be aware that the download happens automatically to $(Pipeline.Workspace), so if you don't want your deployment to download the files twice, you need to specify "download: none" in your steps.
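Putting this together for the question's deploy job, the steps could look roughly like this (a sketch: the unpacked/dist path and the SA_CONNECTION_STRING secret variable are assumptions, since the extracted layout depends on how the archive was created):
strategy:
  runOnce:
    deploy:
      steps:
      - download: none

      - task: DownloadPipelineArtifact@2
        inputs:
          buildType: 'current'
          patterns: '**/*.zip'
          path: '$(Pipeline.Workspace)'

      - task: ExtractFiles@1
        inputs:
          archiveFilePatterns: '$(Pipeline.Workspace)/**/*.zip'
          cleanDestinationFolder: true
          destinationFolder: '$(Pipeline.Workspace)/unpacked'

      - task: AzureCLI@2
        displayName: 'Copy dist to blob storage'
        inputs:
          azureSubscription: MYTEST-Development
          scriptType: 'bash'
          scriptLocation: 'inlineScript'
          inlineScript: |
            # 'unpacked/dist' assumes includeRootFolder: false when archiving
            az storage blob upload-batch -d '$web' --account-name davey \
              -s '$(Pipeline.Workspace)/unpacked/dist' \
              --connection-string "$(SA_CONNECTION_STRING)"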

How to generate EF Core migrations script when ConnectionString is only known after ARM template deployment?

I want to release an app to Azure and deploy migrations to a database before deploying the Web App. That sounds relatively simple, you can create a migrations.sql script with dotnet-ef in your Build pipeline and apply this script in your Release pipeline.
However, I cannot create a migrations.sql script in the Build pipeline, as I am using four different databases for a DTAP environment. Thus, I would need to generate a migrations.sql script per environment and run these separately against each of the databases (as I understand it).
In my Release pipeline I use an incremental ARM template to deploy resources and set the ConnectionString (which comes from an Azure Key Vault) in the Azure Web App application settings configuration.
How/where do I generate the migrations.sql script? Do I do this in a Release pipeline? Am I making a major mistake in my reasoning?
EDIT:
Thanks to Madej's answer, which shows that the environment doesn't matter, I tried implementing the creation of the migrations.sql script in my pipeline.
# ASP.NET Core (.NET Framework)
# Build and test ASP.NET Core projects targeting the full .NET Framework.
# Add steps that publish symbols, save build artifacts, and more:
# https://learn.microsoft.com/azure/devops/pipelines/languages/dotnet-core

trigger:
- master

pool:
  vmImage: 'windows-latest'

variables:
  projects: '**/*.csproj'
  buildPlatform: 'Any CPU'
  buildConfiguration: 'Release'

steps:
- task: DotNetCoreCLI@2
  displayName: "Install dotnet-ef"
  inputs:
    command: 'custom'
    custom: 'tool'
    arguments: 'install --global dotnet-ef'

- task: DotNetCoreCLI@2
  displayName: "Restore tools"
  inputs:
    command: 'custom'
    custom: 'tool'
    arguments: 'restore'

- task: DotNetCoreCLI@2
  displayName: "Restore"
  inputs:
    command: 'restore'
    projects: '$(projects)'
    feedsToUse: 'select'

- task: DotNetCoreCLI@2
  displayName: "Build"
  inputs:
    command: 'build'
    projects: '$(projects)'
    arguments: '--configuration $(BuildConfiguration)'

- task: DotNetCoreCLI@2
  displayName: "Create migrations.sql"
  inputs:
    command: 'custom'
    custom: 'ef'
    arguments: 'migrations script --configuration $(BuildConfiguration) --no-build --idempotent --output $(Build.ArtifactStagingDirectory)\migrations.sql'
    workingDirectory: 'WebApi.api'

- task: DotNetCoreCLI@2
  displayName: "Publish"
  inputs:
    command: 'publish'
    publishWebProjects: true
    arguments: '--configuration $(BuildConfiguration) --output $(Build.ArtifactStagingDirectory)'
    zipAfterPublish: false

- task: PublishBuildArtifacts@1
  displayName: "Publish to Azure Pipelines"
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'drop'
    publishLocation: 'Container'
My pipeline doesn't work: in the "Create migrations.sql" task I run into the following error:
An error occurred while accessing the Microsoft.Extensions.Hosting services. Continuing without the application service provider. Error: DefaultAzureCredential failed to retrieve a token from the included credentials.
- EnvironmentCredential authentication unavailable. Environment variables are not fully configured.
- ManagedIdentityCredential authentication unavailable. No Managed Identity endpoint found.
- Visual Studio Token provider can't be accessed at C:\Users\VssAdministrator\AppData\Local\.IdentityService\AzureServiceAuth\tokenprovider.json
- Stored credentials not found. Need to authenticate user in VSCode Azure Account.
- Please run 'az login' to set up account
This is because in my Program.cs I add a Key Vault and authenticate with the Azure.Identity DefaultAzureCredential as follows:
public static IHostBuilder CreateHostBuilder(string[] args) =>
    Host.CreateDefaultBuilder(args)
        .ConfigureWebHostDefaults(webBuilder =>
        {
            webBuilder.ConfigureAppConfiguration((hostingContext, config) =>
            {
                var settings = config.Build();
                var credentials = new DefaultAzureCredential(
                    new DefaultAzureCredentialOptions() {
                        ExcludeSharedTokenCacheCredential = true,
                        VisualStudioTenantId = settings["VisualStudioTenantId"],
                    }
                );
                config.AddAzureKeyVault(new Uri(settings["KeyVault:Endpoint"]), credentials).Build();
            })
            .UseStartup<Startup>();
        });
Azure Pipelines cannot get a token from DefaultAzureCredential. How do I authenticate Azure Pipelines?
I have figured out the solution to the problem in my edit. The primary way that the DefaultAzureCredential class gets credentials is via environment variables.
Thus, I had to define the environment variables somewhere. I didn't want to do this in the pipeline variables to avoid having to manage them as they should be available from the project in the form of a service connection to Azure.
I did the following:
In my pipeline I added an AzureCLI task to read out the service principal id, key and tenant id and set them as job variables as follows:
- task: AzureCLI@2
  inputs:
    azureSubscription: '<subscription>'
    scriptType: 'ps'
    scriptLocation: 'inlineScript'
    inlineScript: |
      Write-Host '##vso[task.setvariable variable=AZURE_CLIENT_ID]'$env:servicePrincipalId
      Write-Host '##vso[task.setvariable variable=AZURE_CLIENT_SECRET]'$env:servicePrincipalKey
      Write-Host '##vso[task.setvariable variable=AZURE_TENANT_ID]'$env:tenantId
    addSpnToEnvironment: true
In my "Create migrations.sql" task pass these variables as environment variables as follows:
- task: DotNetCoreCLI@2
  displayName: "Create migrations.sql"
  inputs:
    command: 'custom'
    custom: 'ef'
    arguments: 'migrations script --configuration $(BuildConfiguration) --no-build --idempotent --output $(Build.ArtifactStagingDirectory)\migrations.sql'
    workingDirectory: 'WebApi.api'
  env:
    AZURE_CLIENT_ID: $(AZURE_CLIENT_ID)
    AZURE_CLIENT_SECRET: $(AZURE_CLIENT_SECRET)
    AZURE_TENANT_ID: $(AZURE_TENANT_ID)
I added the service principal to the Azure Key Vault RBAC as a Key Vault Secrets User. I could only do this with az:
az role assignment create --role 'Key Vault Secrets User (preview)' --scope '/subscriptions/<subscription ID>/resourcegroups/<resource group name>/providers/Microsoft.KeyVault/vaults/<vault name>' --assignee '<service principal object id>'
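To fill in --assignee you need the service principal's object id. Assuming you know the application (client) id of the service connection, something like this should print it (the property is named id in recent az versions, objectId in older ones):
# Hypothetical client id placeholder; substitute your service connection's app id
az ad sp show --id <application-client-id> --query id --output tsv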
This absolutely solved my problems without having to manage any more secrets/variables as they are all contained in the pipeline itself and don't pose any security threats.
You can do this in a build pipeline because the migrations.sql script checks whether each specific migration has already been applied.
To create the migration script when you use Azure Key Vault in your configuration, the easiest way is to run the command from an Azure CLI task:
- task: AzureCLI@2
  inputs:
    azureSubscription: 'rg-tcm-si'
    scriptType: 'pscore'
    scriptLocation: 'inlineScript'
    inlineScript: 'dotnet ef migrations script --configuration $(BuildConfiguration) --no-build --idempotent --output $(Build.ArtifactStagingDirectory)\migrations.sql'
    workingDirectory: 'Itan.Database'
Before that you need to add get and list permissions to the service principal behind your service connection.
Then, even if you need to deploy the same script to different environments/databases, it is all fine as long as they haven't drifted. So if you make all changes through EF Core, you are good to go with a migrations.sql generated once and applied many times.
The database has a __EFMigrationsHistory table which contains the already-applied migrations, and in the script you will find:
IF NOT EXISTS(SELECT * FROM [__EFMigrationsHistory] WHERE [MigrationId] = N'20200101111512_InitialCreate')
BEGIN
    CREATE TABLE [SomeTable] (
        [Id] uniqueidentifier NOT NULL,
        [StorageDate] datetime2 NOT NULL,
        .....
    );
END;
GO
Thus you are safe to run it against multiple databases.
And then to deploy you can use:
steps:
- task: SqlAzureDacpacDeployment@1
  displayName: 'Azure SQL SqlTask'
  inputs:
    azureSubscription: 'YourSubscription'
    ServerName: 'YourServerName'
    DatabaseName: 'YourDatabaseName'
    SqlUsername: UserName
    SqlPassword: '$(SqlServerPassword)'
    deployType: SqlTask
    SqlFile: '$(System.DefaultWorkingDirectory)/staging/drop/migrations.sql'

Java Azure Function Deployed from Azure DevOps pipeline says success, but no Functions are deployed

I would like to create a pipeline which deploys a Java Azure Function. The pipeline job says it succeeded (1 artifact produced, 100% tests passed, 13 files uploaded), but I cannot see any Functions deployed in the Azure Portal.
Please advise me. I'm following this tutorial as a base, but I'm using a Git repo in Azure DevOps instead of GitHub: https://learn.microsoft.com/en-us/azure/devops/pipelines/ecosystems/java-function?view=azure-devops
My Visual Studio Code project (unzipped) is located in the Git repo at /MyProject/FunctionsJava/.
pom.xml is located (unzipped) at /MyProject/FunctionsJava/pom.xml.
My YAML is as follows:
trigger:
- master

pool:
  vmImage: 'ubuntu-latest'

# at the top of your YAML file
# set some variables that you'll need when you deploy
variables:
  # the name of the service connection that you created above
  serviceConnectionToAzure: connection-to-TestRG-rg
  # the name of your web app here is the same one you used above
  # when you created the web app using the Azure CLI
  appName: JavaFuncApp

# ...
steps:
# ...
# add these as the last steps
# to deploy to your app service
- task: CopyFiles@2
  displayName: Copy Files
  inputs:
    SourceFolder: $(system.defaultworkingdirectory)/MyProject/FunctionsJava/
    Contents: '**'
    TargetFolder: $(build.artifactstagingdirectory)

- task: Maven@3
  inputs:
    mavenPomFile: '$(System.DefaultWorkingDirectory)/MyProject/FunctionsJava/pom.xml'
    mavenOptions: '-Xmx3072m'
    javaHomeOption: 'JDKVersion'
    jdkVersionOption: '1.8'
    jdkArchitectureOption: 'x64'
    publishJUnitResults: true
    testResultsFiles: '**/surefire-reports/TEST-*.xml'
    goals: 'package'

- task: PublishBuildArtifacts@1
  displayName: Publish Artifact
  inputs:
    PathtoPublish: $(build.artifactstagingdirectory)

- task: AzureWebApp@1
  inputs:
    azureSubscription: 'TestRG-Conn'
    appType: 'webApp'
    appName: '$(appName)'
    package: $(build.artifactstagingdirectory)
    deploymentMethod: 'auto'
The problem should be caused by appType: 'webApp' in the AzureWebApp task; appType should be functionApp. A web app is deployed to App Service.
You can try the Azure Function App deploy task or the Azure App Service deploy task instead.
- task: AzureFunctionApp@1
  displayName: Azure Function App deploy
  inputs:
    azureSubscription: $(serviceConnectionToAzure)
    appType: functionApp
    appName: $(appName)
    package: $(build.artifactstagingdirectory)
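One more thing worth checking, as an assumption based on how the azure-functions-maven-plugin behaves: mvn package stages the deployable function app under target/azure-functions/<appName>, so pointing the package input at that staging folder instead of the raw copied sources may also be necessary, roughly like this:
- task: AzureFunctionApp@1
  displayName: Azure Function App deploy
  inputs:
    azureSubscription: $(serviceConnectionToAzure)
    appType: functionApp
    appName: $(appName)
    # Assumed staging path produced by the azure-functions-maven-plugin during 'package'
    package: $(System.DefaultWorkingDirectory)/MyProject/FunctionsJava/target/azure-functions/$(appName)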

How to capture and retain the artifact package version for Universal artifacts in Azure Pipelines for CD

I have an Azure DevOps CI/CD pipeline using YAML. My YAML has two stages, CI and CD. My CI stage has one job called BuildandDeploy. The CD stage has one deployment job. I am using Universal Artifacts to publish, and I download the same package later. In the CD phase I am using the UniversalPackages DevOps task to download the artifact. The task has an input called vstsPackageVersion, which is the package version shown in Universal Artifacts. I know of two other variables that could be used, $(Build.BuildId) and $(Build.BuildNumber). As a temporary workaround I am hard-coding the package version for the universal artifact.
I wasn't able to download the artifact with either of the built-in variables. Since the CI and CD are in the same pipeline, is there any way to store and retrieve the package version of the artifact? Is there an identifier like latest that I could use to get the latest artifact from the universal package feed?
# specific branch build with batching
trigger:
  batch: true
  branches:
    include:
    - master

stages:
- stage: CI
  jobs:
  - job: BuildAndPublish
    pool:
      vmImage: 'Ubuntu-16.04'
    steps:
    - script: |
        docker build -t $(dockerId).azurecr.io/$(imageName):$(version) .
        docker login -u $(dockerId) -p $(pswd) $(dockerId).azurecr.io
        docker push $(dockerId).azurecr.io/$(imageName):$(version)

    - task: Bash@3
      displayName: Initialize Helm Client - create local repo
      inputs:
        targetType: 'inline'
        script: '
          helm init --client-only
          '

    - task: HelmDeploy@0
      displayName: Package helm chart
      inputs:
        connectionType: 'Kubernetes Service Connection'
        command: 'package'
        chartPath: 'my-helm-dir'

    - task: UniversalPackages@0
      displayName: Publish helm package to my-company-artifacts
      inputs:
        command: 'publish'
        publishDirectory: '$(Build.ArtifactStagingDirectory)'
        feedsToUsePublish: 'internal'
        vstsFeedPublish: '$(my-feed-guid)'
        vstsFeedPackagePublish: 'my-artifact-name'
        versionOption: patch
        packagePublishDescription: 'My helm package description'

- stage: CD
  jobs:
  - deployment: DeployJob
    displayName: Deploy Job
    pool:
      vmImage: Ubuntu-16.04
    environment: dev
    strategy:
      runOnce:
        deploy:
          steps:
          - task: UniversalPackages@0
            displayName: 'Universal download'
            inputs:
              command: download
              vstsFeed: '$(my-feed-name)'
              vstsFeedPackage: 'my-artifact-name'
              vstsPackageVersion: 0.0.32

          - task: ExtractFiles@1
            displayName: 'Extract files '
            inputs:
              archiveFilePatterns: '*.tgz'
              destinationFolder: 'my-folder'
              cleanDestinationFolder: true
The Universal Packages task is based on the az artifacts universal CLI, which does not support a "latest" version, only a specific version (by the way, this CLI is in preview).
As a workaround, you can use the REST API to retrieve the latest version and set a new variable, then use this variable in the download task.
For example, add a PowerShell task that gets the version number and sets the variable:
- powershell: |
    $head = @{ Authorization = "Bearer $env:TOKEN" }
    $url = "https://feeds.dev.azure.com/{organization}/_apis/packaging/Feeds/{feed-name}/packages/{package-guid}?api-version=5.0-preview.1"
    $package = Invoke-RestMethod -Uri $url -Method Get -Headers $head -ContentType application/json
    $latestVersion = ($package.versions.Where({ $_.isLatest -eq $True })).version
    Write-Host "The latest version is $latestVersion"
    Write-Host "##vso[task.setvariable variable=latestVersion]$latestVersion"
  env:
    TOKEN: $(system.accesstoken)
Now, in the download task use it:
vstsPackageVersion: $(latestVersion)
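In full, the download task from the question would look like this (reusing the question's own inputs, with only the version input changed):
- task: UniversalPackages@0
  displayName: 'Universal download'
  inputs:
    command: download
    vstsFeed: '$(my-feed-name)'
    vstsFeedPackage: 'my-artifact-name'
    # Set by the PowerShell step above instead of a hard-coded version
    vstsPackageVersion: $(latestVersion)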