I need the list of available versions for a Universal Package stored in Azure DevOps. My thought is to call the Get Package Versions REST API to get a list of the versions for packages on a feed.
GET https://feeds.dev.azure.com/{organization}/{project}/_apis/packaging/Feeds/{feedId}/Packages/{packageId}/versions?api-version=5.1-preview.1
The problem is that it requires a packageId, which is a GUID, and I only know the name. The only way I've found so far to convert a package name to a GUID is "Get Packages", but that returns every package on the feed (which for me includes thousands of npm packages), making the download very large for the handful of items I need. Is there some way to look up the packageId for a given package name? Or is there a better way to get all the versions for a package?
Someone pointed out to me that the Get Packages API has includeAllVersions and packageNameQuery options that achieve what I want, rather than using Get Package Versions (a sketch of the call follows the URL below).
https://feeds.dev.azure.com/{organization}/{project}/_apis/packaging/Feeds/{feedId}/Packages?includeAllVersions=true&packageNameQuery={packageName}&protocolType=nuget
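For completeness, here is a minimal PowerShell sketch of that call. The response property names (versions, normalizedVersion) are my reading of the Get Packages docs, and protocolType=upack is assumed for Universal Packages, so verify both against the documentation:
# Sketch: list all versions of one package by name, without downloading the whole feed.
# Assumes a PAT with Packaging (read) scope; the versions/normalizedVersion property
# names are taken from the Get Packages documentation and may differ slightly.
$pat  = "{your PAT token}"
$auth = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$pat"))
$url  = "https://feeds.dev.azure.com/{organization}/{project}/_apis/packaging/Feeds/{feedId}/Packages?includeAllVersions=true&packageNameQuery={packageName}&protocolType=upack&api-version=5.1-preview.1"
$response = Invoke-RestMethod -Uri $url -Headers @{ Authorization = "Basic $auth" } -Method Get
foreach ($package in $response.value) {
    Write-Host "Package: $($package.name)"
    foreach ($version in $package.versions) { Write-Host "  $($version.normalizedVersion)" }
}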
I assume you have already checked the docs and found there is no direct API that returns the ID for a specific package, right? Also, as the docs say, the package name cannot be used in this API.
In fact, you are very close to the answer. Just add a filter when running Get Packages. This does require running a small script, and the PowerShell command line or PowerShell ISE is the most convenient place to do that. You can also run the script below in an Azure DevOps pipeline, but compared with running it from the command line that is a bit more cumbersome.
Run the script below in your PowerShell command line or PowerShell ISE:
$token = "{your PAT token}"
$url = 'https://feeds.dev.azure.com/{org name}/_apis/packaging/Feeds/{feed ID}/packages'
$token = [System.Convert]::ToBase64String([System.Text.Encoding]::ASCII.GetBytes(":$($token)"))
$response = Invoke-RestMethod -Uri $url -Headers #{Authorization = "Basic $token"} -Method Get
$results = $response.value | Where {$_.name -eq "{Package Name}"} #|
Write-Host "results = $($results | ConvertTo-Json -Depth 100)"
Note: In the above script, you only need to replace the PAT token, organization name, feed ID, and package name.
You will then get the info for the specified package, and you can copy the package ID from the command-line output and use it elsewhere.
Note: The above script can also be used in a PowerShell task in Azure DevOps without changing anything.
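Once you have the package ID, a small follow-up sketch (same placeholders as the script above; the version property name is my assumption from the Get Package Versions docs) can call the versions endpoint from your question directly:
# Follow-up sketch: feed the package ID found above into Get Package Versions.
# Reuses $results and the base64-encoded $token from the previous script.
$packageId   = $results.id   # the GUID returned by the previous script
$versionsUrl = "https://feeds.dev.azure.com/{org name}/_apis/packaging/Feeds/{feed ID}/Packages/$packageId/versions?api-version=5.1-preview.1"
$versions    = Invoke-RestMethod -Uri $versionsUrl -Headers @{Authorization = "Basic $token"} -Method Get
$versions.value | ForEach-Object { Write-Host $_.version }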
I am working on creating a QnA Maker-based chatbot, and for this I have prepared an ARM template for creating and deploying the resources.
PowerShell script is being used to create knowledge base once all the resources are created.
This process works just fine when I am executing PowerShell script from local PowerShell tool.
Now I want to set up CI/CD for this deployment process so that it can be automated.
Creating resources and deploying to Azure through a pipeline is quite possible with the Azure ARM template deployment task, but I am not sure how to execute the PowerShell script that creates the knowledge base once the resources (including the newly created QnA Maker service) exist.
Any help greatly appreciated
You can refer to Create a CI/CD pipeline for .NET with the Azure DevOps Project for a step-by-step lab on creating the pipelines.
If you need help with ARM templates (if you are stuck somewhere), you can check out A quick guide on writing your Azure Resource Manager (ARM) Templates and get started quickly.
To clarify: your issue is creating and training the QnA Maker KB after the actual resources have been created by ARM? There are probably multiple ways to do this, but here is what I have done. First of all, I am using the Azure CLI task version 2.*, which simplifies some of the initial steps. You could probably use regular PowerShell for some of the subsequent steps. I have selected PowerShell as the script type and inline scripts for all of them. I do each of these segments as a separate task, but you could probably do them all as one. I like being able to see which step fails if something happens, though.
First, you need to get the key for your cognitive service:
$QNAKEY= & az cognitiveservices account keys list -g "YOUR_RESOURCE_GROUP" --name "YOUR_QNA_RESOURCE_NAME" --query key1 -o tsv
Write-Output("##vso[task.setvariable variable=QNA_KEY;]$QNAKEY")
Next you need to create the KB. I have mine seeded with information via a json file in my project repository. You can do this from wherever, or just create a blank KB (I think). Note I'm checking here to see if the KB exists; KB name does not have to be unique, so if you don't do this you will end up creating a lot of duplicate KBs.
$header = @{
    "Content-Type" = "application/json"
    "Ocp-Apim-Subscription-Key" = "$(QNA_KEY)"
}
$res = Invoke-RestMethod -Uri "https://westus.api.cognitive.microsoft.com/qnamaker/v4.0/knowledgebases" -Method 'Get' -Headers $header
$kb = $res.knowledgebases | Where-Object name -eq "YOUR_KB_NAME"
Write-Host $res.knowledgebases
Write-Host $kb
if (!$kb) {
Write-Host "KB does not exist, so creating new KB"
$body = Get-Content '$(System.DefaultWorkingDirectory)/PATH_TO_MY_MODEL.json' | Out-String
Invoke-RestMethod -Uri "https://westus.api.cognitive.microsoft.com/qnamaker/v4.0/knowledgebases/create" -Method 'Post' -Body $body -Headers $header
}
Finally, you will likely want to publish your KB. With LUIS I think you need to train it first (via a separate CLI task), but with QnA Maker you should be able to publish directly. I do this as a dependent stage to ensure the KB is created before I try to publish.
$QNAKEY= & az cognitiveservices account keys list -g "YOUR_RESOURCE_GROUP" --name "YOUR_QNA_RESOURCE_NAME" --query key1 -o tsv
$header = @{
    "Content-Type" = "application/json"
    "Ocp-Apim-Subscription-Key" = "$QNAKEY"
}
$res = Invoke-RestMethod -Uri "https://westus.api.cognitive.microsoft.com/qnamaker/v4.0/knowledgebases" -Method 'Get' -Headers $header
$kb = $res.knowledgebases | Where-Object name -eq "YOUR_KB_NAME"
$qnaId = $kb.id
Invoke-RestMethod -Uri "https://westus.api.cognitive.microsoft.com/qnamaker/v4.0/knowledgebases/$qnaId" -Method 'Post' -Headers $header
And that's it! At this point your QnA Maker knowledgebase should be created, published, and ready to go.
I'm trying to automate the deployment of releases in Azure DevOps using the Azure DevOps API with PowerShell. I'm able to retrieve the last release definition, but I'm not able to extract a release for a specific environment.
I'm looking to retrieve the last 'prod' release in order to obtain the 'release ID', and then deploy the release to UAT or DEV using a HTTP 'PATCH' request.
I can hit a URL manually (example below), which gives me the last release for DEV (definitionEnvironmentId=1), and I know I could easily change the environment ID to '12' for production, but I'd like to get this via an API call.
https://{redacted url}/{redacted}/{redacted}/_apis/release/deployments?api-version=6.0&definitionId=1&definitionEnvironmentId=1&$top=1
Any ideas if this is possible as I can't find any info on it?
You were using the correct Deployments - List REST API. The reason you did not get any info from the API call via PowerShell is that you did not escape the special character $ in the $top parameter of the URL.
$ is a special character in PowerShell. You should escape it by adding a backtick (`) in the URL. See below:
#add ` to $top parameter
$url = "https://{redacted url}/_apis/release/deployments?definitionId=3&definitionEnvironmentId=5&`$top=1&api-version=6.0"
$connectionToken="PAT"
$base64AuthInfo= [System.Convert]::ToBase64String([System.Text.Encoding]::ASCII.GetBytes(":$($connectionToken)"))
$response= Invoke-RestMethod -Uri $url -Headers @{authorization = "Basic $base64AuthInfo"} -Method Get
$releaseId = $response.value[0].release.id
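To then actually trigger the deployment (the PATCH part of your question), a rough sketch follows. It uses what I understand to be the Release Environments - Update REST API, so double-check the route, api-version, and the target {environmentId} against the docs:
# Rough sketch: start deploying the retrieved release to a target environment (stage).
# The route and body follow the "Release Environments - Update" REST API as I understand it;
# verify the api-version and replace {environmentId} with the target stage's id.
$deployUrl = "https://{redacted url}/{redacted}/{redacted}/_apis/release/releases/$releaseId/environments/{environmentId}?api-version=6.0-preview.6"
$body = '{ "status": "inProgress" }'
Invoke-RestMethod -Uri $deployUrl -Headers @{authorization = "Basic $base64AuthInfo"} -Method Patch -Body $body -ContentType "application/json"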
Has anyone managed to integrate Azure DevOps with Sentry (sentry.io)? I'm stuck on "Associate commits with a Release" (see: https://docs.sentry.io/workflow/releases/?platform=browser#associate-commits-with-a-release).
I cannot figure out how to tell Sentry (via its API) which commit IDs are associated with the current release/deploy. How can I add a task to the pipeline that posts the commit IDs to the Sentry API? Or is there some other way to do it?
In Azure DevOps, the PowerShell task also supports curl, so you can call the API directly from a PowerShell task in your VSTS pipeline.
In a release pipeline there is a predefined release variable that stores the commit ID associated with the current release: $(Release.Artifacts.{alias}.SourceVersion). Here, alias is the artifact alias, which you can get from $(Release.PrimaryArtifactSourceAlias).
First, create a pipeline variable (for example, id) that maps to that release variable.
Then you can use the variable $(id) in that API and execute the call in a PowerShell task:
"refs": [{
"commit":"$(id)"
}]
Now the commit ID is packed into the body of this API call and sent to the Sentry server.
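Putting it together, a hedged sketch of the Sentry call from a PowerShell task might look like the following (the organization slug, auth token, project slug, repository name, and release version are all placeholders; the endpoint is the create-release API from the Sentry docs linked above):
# Sketch: create a Sentry release and attach the commit from the pipeline variable $(id).
# SENTRY_ORG, SENTRY_AUTH_TOKEN, the project slug, and the repository name are placeholders.
$headers = @{
    "Authorization" = "Bearer $(SENTRY_AUTH_TOKEN)"
    "Content-Type"  = "application/json"
}
$body = @{
    version  = "$(Build.BuildNumber)"
    projects = @("your-sentry-project")
    refs     = @(@{ repository = "your-repo-name"; commit = "$(id)" })
} | ConvertTo-Json -Depth 5
Invoke-RestMethod -Uri "https://sentry.io/api/0/organizations/{SENTRY_ORG}/releases/" -Method Post -Headers $headers -Body $body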
If there are multiple commits associated with the release, note that the variable $(Release.Artifacts.{alias}.SourceVersion) mentioned above only stores the latest commit, so you may need an additional script to get the full list by build ID.
In the release pipeline, $(Build.BuildId) gives you the build ID associated with the release. You can then get the commits (changes) using this API:
GET https://dev.azure.com/{organization}/{project}/_apis/build/changes?fromBuildId={fromBuildId}&toBuildId={toBuildId}&api-version=5.1-preview.2
You can drop this PowerShell script into your task without changing anything, because it works the same in PowerShell ISE, the PowerShell command line, and the PowerShell task in VSTS.
$token = "{PAT token}"
$url="https://dev.azure.com/{org name}/{project name}/_apis/build/changes?fromBuildId={id1}&toBuildId={id2}"
$token = [System.Convert]::ToBase64String([System.Text.Encoding]::ASCII.GetBytes(":$($token)"))
$response = Invoke-RestMethod -Uri $url -Headers @{Authorization = "Basic $token"} -Method Get
Write-Host "results = $($response.value.id | ConvertTo-Json -Depth 100)"
Now you can get the list of commits associated with the build and the corresponding release.
I have a requirement to automatically export specific Azure DevOps build/release definitions. I know the names of the definitions required. The process would run weekly to capture the information. I know the export can be done manually, but I want to automate the process, hopefully with a PowerShell script.
Thanks
Joe
If you want to export the build/release definitions automatically, your best option is a PowerShell task with the REST API. However, knowing just the build definition name is not enough.
Refer to these docs: Get Build Definition and Get Release Definition. You can see that a definitionId is required. In fact, this definitionId is very easy to get: just open the relevant pipeline you want to export, and the definitionId is shown in the URL.
To export the definition, you can use the following script in PowerShell:
$headers = @{ Authorization = "Bearer $env:SYSTEM_ACCESSTOKEN" }
$projectsUrl = "https://dev.azure.com/{org}/{project}/_apis/build/definitions/{build definitionid}?api-version=5.1"
$result = Invoke-RestMethod -Uri $projectsUrl -Method Get -Headers $headers
# Use the definition name as the file name and save the JSON locally
$filename = $result.name + ".json"
$filePath = "D:\"
$file = $filePath + $filename
$result | ConvertTo-Json -Depth 100 | Out-File -FilePath $file
In this script, I use the build definition name as the file name ($filename = $result.name + ".json") and convert the result to JSON to make the local file more readable.
Similarly, to get a release definition, just change the URL to the Get Release Definition endpoint:
$headers = @{ Authorization = "Bearer $env:SYSTEM_ACCESSTOKEN" }
$projectsUrl = "https://vsrm.dev.azure.com/{org}/{project}/_apis/release/definitions/{definitionId}?api-version=5.1"
$result = Invoke-RestMethod -Uri $projectsUrl -Method Get -Headers $headers
$filename = $result.name + ".json"
$filePath = "D:\"
$file = $filePath + $filename
$result | ConvertTo-Json -Depth 100 | Out-File -FilePath $file
Note: When using @{ Authorization = "Bearer $env:SYSTEM_ACCESSTOKEN" }, you need to enable Allow scripts to access the OAuth token so that the environment variable is available during the pipeline run.
In addition, since you want to capture the information weekly, you can schedule the pipeline that contains these export tasks.
The export pipeline will then run and export the definitions weekly.
You'll be looking at the az pipelines release and az pipelines build commands from the Azure DevOps CLI; see the sketch after the reference links below.
Commands Reference
Extension Reference
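For example, a rough sketch using the Azure DevOps CLI extension (definition IDs, organization, and project are placeholders; check az pipelines build definition show --help for the exact options in your CLI version):
# Sketch: export one build and one release definition to JSON via the az devops extension.
# Requires `az extension add --name azure-devops` and a login (or AZURE_DEVOPS_EXT_PAT).
az devops configure --defaults organization=https://dev.azure.com/{org} project={project}
az pipelines build definition show --id {build definitionId} -o json > build-definition.json
az pipelines release definition show --id {release definitionId} -o json > release-definition.json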
We have certain Chocolatey packages stored in a VSTS private Package Management feed. I'm trying to download a specific package from the feed in PowerShell. I'm using the commands below:
$user = "my-user-id"
$token = "my-pat-token"
$base64AuthInfo = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $user,$token)))
$url = "vsts-package-url"
$webClient = New-Object System.Net.WebClient
$webClient.Headers.Add('HttpRequestHeader.Authorization',"Basic $base64AuthInfo")
After this, I should be able to download the package when using $webClient.DownloadFile($url, $fileName).
However, the downloaded file contains dummy sign-in text and 401-related details instead of the package.
It's basically PowerShell code. Can anyone please advise how I should go about this? I do not want to install the package from the feed, only download it.
I was also wondering if there is a better way to do this, as this process still requires me to prepare the package URL first, which is different for each package version.
It looks like you're creating the auth header incorrectly. Basically, with a PAT, no user name is needed.
I use this snippet of code all the time:
$pat = 'yourPatHere'
$bytes = [System.Text.Encoding]::ASCII.GetBytes(":$($pat)")
$encodedText =[Convert]::ToBase64String($bytes)
$header = @{Authorization = "Basic $encodedText"}
You can then use Invoke-WebRequest to download the feed information:
Invoke-WebRequest -Uri 'https://uri' -Headers $header -Method Get
That looks roughly the same as your code, but Invoke-WebRequest is idiomatic PowerShell vs creating a WebClient. Shouldn't matter in the end.
Actually downloading a file from a feed is a different matter. The packages are exposed via standard mechanisms for that specific type of feed. See the documentation. Basically, if you need a NuGet package, you download it from the NuGet feed, not from package management.
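If you do want to pull the raw .nupkg with PowerShell rather than through a NuGet client, here is a hedged sketch based on what I believe is the NuGet - Download Package REST endpoint (verify the exact route and api-version in the docs):
# Sketch: download a .nupkg directly from an Azure DevOps (VSTS) feed.
# Feed, package name, and version are placeholders; the route follows the
# "NuGet - Download Package" REST API as I understand it.
$pat    = "my-pat-token"
$auth   = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$pat"))
$header = @{ Authorization = "Basic $auth" }
$url    = "https://pkgs.dev.azure.com/{organization}/_apis/packaging/feeds/{feedId}/nuget/packages/{packageName}/versions/{version}/content?api-version=5.1-preview.1"
Invoke-WebRequest -Uri $url -Headers $header -OutFile "{packageName}.{version}.nupkg"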