Azure build pipeline sporadic error, [CredentialProvider]Device flow authentication failed

This used to happen maybe once every two weeks, but lately the build fails a few times before it succeeds once.
GET https://api.nuget.org/v3-flatcontainer/xamarin.uitest/3.0.9/xamarin.uitest.3.0.9.nupkg
##[error]C:\Program Files (x86)\Microsoft Visual Studio\2019\Enterprise\Common7\IDE\CommonExtensions\Microsoft\NuGet\NuGet.targets(128,5): Error : [CredentialProvider]Device flow authentication failed. User was presented with device flow, but didn't react within 90 seconds.
----- Update ----
After further digging in the logs, I found out that it was due to a 401 (Unauthorized) response when accessing a private feed (in the same organization as the project where the pipeline fails).
A temporary workaround was to remove the specific *.csproj that needed this private feed from the solution build configuration, and also to remove the private feed from the pipeline. This got the pipeline working again.
---- More Pipeline infos ----
I can't post the entire code here.
pool:
  vmImage: 'windows-latest'
variables:
  configuration: NugetB
steps:
- task: MSBuild@1
  displayName: 'MSBuild'
  inputs:
    solution: '**\xxSolutionxx.sln'
    msbuildArchitecture: 'x64'
    configuration: '$(configuration)'
    msbuildArguments: '/t:restore;build;pack /p:PackageOutputPath=$(Build.ArtifactStagingDirectory) /p:RestoreAdditionalProjectSources=https://urlToPrivateFeedInTheSameOrganization /p:configuration=$(configuration) /p:NuGetInteractive="true"'

If you have a task or script accessing an Azure Artifacts feed in your pipeline, try using the System.AccessToken variable to authenticate the pipeline to the private Azure Artifacts feed, or try resetting your PAT.
You could check this document for more details.
If the above does not resolve the issue, please share how you access the artifact, along with your build pipeline configuration.
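A minimal sketch of the non-interactive authentication approach (the NuGetAuthenticate and MSBuild task names are real; the solution path is a hypothetical placeholder): running NuGetAuthenticate before MSBuild supplies credentials for in-organization Azure Artifacts feeds, so the credential provider should not fall back to device-flow prompts, and turning off NuGetInteractive prevents the 90-second wait on an agent with nobody watching:

```yaml
steps:
- task: NuGetAuthenticate@0        # provides credentials for Azure Artifacts feeds in this organization
- task: MSBuild@1
  displayName: 'MSBuild'
  inputs:
    solution: '**\MySolution.sln'  # hypothetical placeholder
    msbuildArguments: '/t:restore;build /p:NuGetInteractive="false"'
```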

Related

SonarCloud analysis task fails in Azure DevOps PR pipeline

I have a pipeline in Azure DevOps which is triggered by PR requests. There are three SonarCloud tasks in this pipeline - Prepare analysis on SonarCloud, (my project build step is here), Run Code Analysis and then Publish Quality Gate Result.
When this pipeline is triggered by a PR, it's all fine until it gets to the Run Code Analysis task, which then fails with the error messages below:
INFO: ------------------------------------------------------------------------
INFO: EXECUTION FAILURE
INFO: ------------------------------------------------------------------------
INFO: Total time: 9.173s
INFO: Final Memory: 7M/48M
INFO: ------------------------------------------------------------------------
##[error]ERROR: Error during SonarScanner execution
##[error]ERROR: Not authorized. Please check the properties sonar.login and sonar.password.
##[error]The SonarScanner did not complete successfully
##[error]18:32:43.506 Post-processing failed. Exit code: 1
Here's the thing: whenever this same pipeline is triggered NOT by a PR, either manually or automatically by a daily schedule, it runs and passes with no issues, on any branch. If I trigger the pipeline for the branch being pulled in (not via the PR), it runs and passes fine.
Why is authorization failing only when it's triggered by a PR?! Why isn't it using the same PAT token from SonarCloud that the SAME pipelines use when triggered manually??
According to the description, it seems the PAT does not have enough permissions.
You could try updating the PAT with Full access to test:
Update the PAT in your SonarCloud.
You could check this document for more details.
I've now found the answer: the issue was that the PAT issued by Azure DevOps wasn't set in the correct place on SonarCloud. I was putting it in [SonarCloud Project] -> Administration -> General Settings -> Pull Requests -> Personal access token, when in fact it should be stored in [SonarCloud Organisation] -> Administration -> Organisation settings -> Azure DevOps connectivity management -> Personal Access Token.

ADF Git Configure Disconnection After Publish with AzureResourceGroupDeployment & ARMTemplates

I’m following the new CICD guide for ADF https://learn.microsoft.com/en-us/azure/data-factory/continuous-integration-deployment-improvements
I am then publishing the ARMTemplates generated from the npm export pipeline to my ADF Dev using Azure Resource Group ARM Template deployment described here: https://learn.microsoft.com/en-us/azure/data-factory/continuous-integration-deployment#script
Looks like this:
- task: AzureResourceGroupDeployment@1
  displayName: 'Azure Deployment:Create Or Update Resource Group action on adf-dev-rg'
  inputs:
    ConnectedServiceName: 'guycarpenter-privatenonprod-Contributor'
    resourceGroupName: 'gc-adf-nasa-prinonprod-dev-rg'
    location: 'East US 2'
    csmFile: '$(Agent.BuildDirectory)/ARMTemplate/ARMTemplateForFactory.json'
    csmParametersFile: '$(Agent.BuildDirectory)/ARMTemplate/ARMTemplateParametersForFactory.json'
After I publish the new ARMTemplate to my ADF Dev, ADF git repo Configure gets disconnected.
How should I publish the new ARMTemplate to my ADF Dev without disconnecting the repo?
Edit:
I also found that setting includeFactoryTemplate=false solves the disconnection, but I need it set to true to parametrize ADF for other environments.
Edit #2:
This solved the problem: https://stackoverflow.com/a/56863897/13570809
How should I publish the new ARMTemplate to my ADF Dev without disconnecting the repo?
There is a known user voice about this:
Retain GIT configuration when deploying Data Factory ARM template
You could vote for this request and check the feedback.
And Jason replied:
This has been implemented by the repoConfiguration properties in the
Azure Resource Manager template for the Data Factory resource. See
here for reference -
https://learn.microsoft.com/en-us/azure/templates/microsoft.datafactory/2018-06-01/factories
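For illustration, a minimal sketch of what a repoConfiguration block might look like on the factory resource (the property shape follows the Microsoft.DataFactory/factories template reference linked above; all names below are hypothetical placeholders). Keeping this block in the deployed ARM template is what preserves the Git association:

```json
{
  "type": "Microsoft.DataFactory/factories",
  "apiVersion": "2018-06-01",
  "name": "[parameters('factoryName')]",
  "location": "[parameters('location')]",
  "properties": {
    "repoConfiguration": {
      "type": "FactoryVSTSConfiguration",
      "accountName": "my-devops-org",
      "projectName": "MyProject",
      "repositoryName": "adf-repo",
      "collaborationBranch": "main",
      "rootFolder": "/"
    }
  }
}
```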

How to upload a package with UniversalPackages to an external private noarch repo in an Azure DevOps pipeline

We save all our build packages (conda build) on our jFrog Artifactory server as compressed tar.bz2 files that can be accessed with a URL, username and password. Currently this is done by the cURLUploader task shown below:
- task: cURLUploader@2
  displayName: Upload to TTI Artifactory
  inputs:
    files: '$(Build.ArtifactStagingDirectory)/$(packageName)-$(Build.BuildId).tar.bz2'
    authType: 'userAndPass'
    username: 'admin'
    password: '123'
    url: 'https://artifactory.xyz.com/abc/local/'
    redirectStderr: true
One issue is that it is hard to control the versioning. It would be nice to have some sort of Major.Minor.Patch instead of the Build.BuildId as part of the compressed file name. I am thinking of using the UniversalPackages task to do the job. It requires saving the credential in a service connection, and that step ran into an issue.
- task: UniversalPackages@0
  inputs:
    command: publish
    feedsToUse: external
    externalFeedCredentials: jfrogConnect # Errors appear when creating the service connection
    publishDirectory: '$(Build.ArtifactStagingDirectory)'
    feedsToUsePublish: external
    feedPublishExternal: https://artifactory.xyz.com/abc/local/
    packagePublishExternal: $(packageName)
    versionOption: patch
    packagePublishDescription: Upload Package to xyz-local artifactory
When I try to create the service connection for the jFrog Artifactory repo (https://artifactory.xyz-local/noarch/), it returns the error
Failed to query service connection API: 'https://artifactory.xyz-local/noarch/api/plugins'. Error Message: 'An error occurred while sending the request.'
Update: the repo is in our private vNet, so the service connection has trouble accessing this URL; I think that's what causes the issue. The question then would be how to create a service connection that can access private resources. The pipeline task runs on our private agent, which is able to access the resource (as proved by the cURLUploader task).
Since your jFrog Artifactory repo is on a private network, it cannot be reached from the external network, which causes the verification to fail.
To resolve this issue, you could try creating the connection with the option Save without verification.
Note: you can only use this connection with a private agent on the same private network.
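As an aside on the Major.Minor.Patch versioning wish above, one hedged sketch (assuming the documented Azure Pipelines counter() expression; major/minor values are hypothetical) derives a patch number that increments on every run and resets to 0 whenever the major.minor prefix changes, which could replace Build.BuildId in the uploaded file name:

```yaml
variables:
  major: 1
  minor: 4   # hypothetical version prefix
  # counter() increments per run and resets when its key (major.minor) changes
  patch: $[counter(format('{0}.{1}', variables['major'], variables['minor']), 0)]

steps:
- task: cURLUploader@2
  inputs:
    files: '$(Build.ArtifactStagingDirectory)/$(packageName)-$(major).$(minor).$(patch).tar.bz2'
    authType: 'userAndPass'
    username: 'admin'
    password: '123'
    url: 'https://artifactory.xyz.com/abc/local/'
```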

Azure Artifacts - Universal Packages - Error: An unexpected error occurred while trying to push the package

Devops folks,
I am pushing the build pipeline output to Azure Artifacts - Universal Packages for a full-stack .NET application. The application builds successfully and produces output in $(Build.ArtifactStagingDirectory).
I would like to publish all these build outputs to Universal Packages and let the release pipeline take over from there.
I have checked the following:
1. Permissions - Project Collection Build Service has the Contributor role.
2. Task configuration - confirmed below ("UniversalPackages").
- task: UniversalPackages@0
  inputs:
    command: 'publish'
    publishDirectory: '$(Build.ArtifactStagingDirectory)/**/*.nupkg'
    feedsToUsePublish: 'internal'
    vstsFeedPublish: '123456fg-test-1234-1234-31161a66dc4d/b92b3313-ab41-4044-test-e94146618efb'
    vstsFeedPackagePublish: 'Text here-Services-Package'
    versionOption: 'minor'
    packagePublishDescription: 'Contains some text here)'
    verbosity: 'Trace'
Below is the pipeline log
2020-04-04T02:36:03.9835393Z Publishing package: test, version: 0.0.1 using feed id: 76a3991f-e6fc-767b-a0dc-90e38c54e558, project: 7813b7e3-bbf1-4355-9263-31161a66dc4d
2020-04-04T02:36:04.0147395Z [command]D:\ABCAgent\_work\_tool\artifacttool\0.2.151\x64\ArtifactTool.exe universal publish --feed 76a3991f-e6fc-767b-a0dc-90e38c54e558 --service https://dev.azure.com/QWERTY/ --package-name test --package-version 0.0.1 --path D:\AzureAgentBuild\_work\1\a --patvar UNIVERSAL_PUBLISH_PAT --verbosity None --description "" --project 7813b7e3-bbf1-4355-9263-31161a66dc4d
2020-04-04T02:36:09.2875733Z {"#t":"2020-04-04T02:36:08.7883701Z","#m":"[GetDedupManifestArtifactClientAsync] Try 1/5, non-retryable exception caught. Throwing. Details:\r\nNo LastRequestResponse on exception VssServiceResponseException: Forbidden","#i":"b2d31574","#l":"Warning","#x":"Microsoft.VisualStudio.Services.WebApi.VssServiceResponseException: Forbidden\r\n
Microsoft.VisualStudio.Services.WebApi.VssServiceResponseException: Forbidden
This permission error corresponds to a 403 status code, which means the account does not have enough permission to publish packages to Universal Packages.
You said you had assigned the 'Project Collection Build Service' account the 'Contributor' role. But this is not a solution for every scenario: it only works when the build pipeline uses the 'Project Collection Build Service' account, a collection-level service account. In the other scenario, the pipeline may be using a project-level service account.
You can try the methods I shared in this answer. Check this for another similar issue and explanation.
Method 1:
Go to Feed settings => Permissions, add your project-level build service account and assign it the Contributor role. Its account name should look like {Project Name} Build Service ({Org Name}).
Re-run your pipeline to see whether it can run successfully.
Method 2:
Go to Project settings => Settings, and make sure Limit job authorization scope to current project is disabled:
Only when it is disabled does the pipeline use the collection-level service account. In that case, your original permission configuration would work.
After playing around with permissions for the pipeline's build service account, the root cause was found to be a proxy blocking the universal package with a forbidden error.
We removed the proxy from the on-prem, self-hosted build agent and used Azure ExpressRoute to route the traffic. This simple change fixed the issue.
Also, you can double-check in the Billing options whether the free Artifacts storage is used up. I fixed it using this option.

Execute YAML templates from Azure DevOps classic pipeline

I will put my questions through the following points; hope that makes it clear:
The application source code is in application_code repo.
The pipeline code (YAMLs) is in the pipeline_code repo, because I'd like to version it and don't want to keep it in the application_code repo, to avoid giving the Dev team control over it.
Problem statement:
The pipeline YAML won't be triggered by events (PR, commit, etc.) unless it lives in the source code repository.
Can we trigger or execute a YAML file in the pipeline_code repo whenever an event fires in the application_code repo?
I've tried achieving the above using a classic pipeline and a YAML template, but they don't work together: I can execute a YAML template from a YAML pipeline only, not from a classic pipeline, like below:
#azure-pipeline.yaml
jobs:
- job: NewJob
- template: job-template-bd1.yaml
Any ideas or better solution than above?
The feature Multi-repository support for YAML pipelines will be available soon for Azure DevOps Service. It will support triggering pipelines based on changes made in one of multiple repositories. Please check the Azure DevOps Feature Timeline or here. This feature is expected to roll out in 2020 Q1 for Azure DevOps Service.
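Once rolled out, the multi-repository trigger might be sketched like this (the syntax is assumed from the repository-resource model; repo and branch names are placeholders): the pipeline YAML stays in pipeline_code, declares application_code as a repository resource, and triggers on its changes.

```yaml
resources:
  repositories:
  - repository: app_code              # alias used in checkout steps
    type: git                         # Azure Repos Git, same organization
    name: MyProject/application_code_repo
    trigger:
      branches:
        include:
        - master

steps:
- checkout: app_code                  # builds the application repo's sources
```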
Currently you can follow the workaround below using Build Completion (the pipeline is triggered on the completion of another build).
1. Set up the triggering pipeline
Create an empty classic pipeline for the application_code repo as the triggering pipeline, which will always succeed and do nothing.
Check Enable continuous integration under the Triggers tab and set up the Branch filters.
2. Set up the triggered pipeline
In the pipeline_code repo, use Checkout to check out multiple repositories in your pipeline. You can specifically check out the source code of the application_code repo to build. Please refer to the example below:
steps:
- checkout: git://MyProject/application_code_repo@refs/heads/master # Azure Repos Git repository in the same organization
- task: TaskName
  ...
Then, on the YAML pipeline edit page, click the three dots in the top right corner and click Triggers. Click +Add beside Build Completion and select the triggering pipeline created in step 1 as the triggering build.
After finishing the above two steps, when changes are made to the application_code repo, the triggering pipeline will execute and complete successfully; the triggered pipeline then runs to do the real build job.
Update:
Show the Azure DevOps build pipeline status in Bitbucket.
You can add a Python script task at the end of the YAML pipeline to update the Bitbucket build status. You need to set condition: always() so this task runs even if other tasks fail.
You can get the build status from the Agent.JobStatus variable (exposed to scripts as the environment variable AGENT_JOBSTATUS). See the example below.
For more information, please refer to the document Integrate your build system with Bitbucket Cloud, and also this thread.
- task: PythonScript@0
  condition: always()
  inputs:
    scriptSource: inline
    script: |
      import os
      import requests
      # Use environment variables that your CI server provides for the key, name,
      # and url parameters, as well as the commit hash. (The values below are used by
      # Jenkins; AGENT_JOBSTATUS is the Azure Pipelines Agent.JobStatus variable.)
      data = {
          'key': os.getenv('BUILD_ID'),
          'state': os.getenv('AGENT_JOBSTATUS'),
          'name': os.getenv('JOB_NAME'),
          'url': os.getenv('BUILD_URL'),
          'description': 'The build passed.'
      }
      # Construct the URL with the API endpoint where the commit status should be
      # posted (provide the appropriate owner and slug for your repo).
      api_url = ('https://api.bitbucket.org/2.0/repositories/'
                 '%(owner)s/%(repo_slug)s/commit/%(revision)s/statuses/build'
                 % {'owner': 'emmap1',
                    'repo_slug': 'MyRepo',
                    'revision': os.getenv('GIT_COMMIT')})
      # Post the status to Bitbucket. (Include valid credentials here for basic auth.
      # You could also use a team name and API key.)
      requests.post(api_url, auth=('auth_user', 'auth_password'), json=data)