How to show pipeline artifact in the artifacts section in Azure DevOps

Maybe there's a fundamental concept I don't understand, but it seems obvious to me that an artifact published by a pipeline should appear in the section called "ARTIFACTS", which is super-accessible directly from the main menu and highlighted with its own magenta icon. But no... it doesn't. This baffles me. I like DevOps... everything else works so beautifully, why is this SO complicated to do? Anyway, sorry for venting.
Question: Is there an easy way to publish an artifact from a pipeline directly to the "Artifacts" section in Azure DevOps?
image-1: Published pipeline artifact to artifact section
image-2: What I was expecting in the publishing task
As of now I retrieve the pipeline artifacts manually through Menu > Pipelines > myPipeline > theDesiredRun > relatedItems > publishedItem > clickArtifactName > downloadWithBrowser. It's a working workaround, but this procedure will be a pain to teach and document when onboarding new team members.
I googled it and it looks like you can achieve that by creating a feed, publishing to the feed, then subscribing to it. I didn't find any good guide for it; I tried and failed :(
Is there an easier way to do it? Did I miss something obvious? Maybe I'm just not using the right task in the pipeline? If the only way is through the feed approach, does anybody know of a link to a clear explanation of how to do it?
Thank you
--mo

So, there are two forms of artifacts within Azure DevOps:
Build Artifacts: Artifacts that are built and published for consumption during releases.
Azure Artifacts: Maven, npm, NuGet, and Python package feeds from public and private sources. These are designed to be consumed within your applications as shared modules. If you're familiar with .NET / .NET Core, this is where you would publish custom NuGet packages.
The "Artifacts" tab on the left-hand menu is explicitly for custom packages, not build artifacts. You'll never be able to publish build artifacts there.

After reading @MaxMorrow's answer, I now understand that the Artifacts section in Azure DevOps is not designed for what I intended. But for those who are still looking to achieve what I was asking in my original question…
Here's the workaround I put in place: commit the build artifact to the repository.
I know it's not an elegant solution, but it fully meets my original requirements: the artifact is easy to find and easy to download. It's one click away from the main Repos menu in DevOps, and it can also be downloaded by synchronising that repository folder (git) on any machine, by any team member, as needed.
HOW TO:
First, a task in the build pipeline writes the artifacts in
$(Build.SourcesDirectory)\BuildArtifacts\myBuildArtifact_run$(Build.BuildNumber).zip
Then the next task is a command line script that commits to the repo:
echo This is the current working directory : %cd%
echo get worktree list
git worktree list
echo commit all changes
git config user.email "joe@contoso.com"
git config user.name "Automatic Build"
git checkout master
git add --all
git commit -m "pipeline : $(Build.DefinitionName)"
echo push code to repo
git -c http.extraheader="AUTHORIZATION: bearer $(System.AccessToken)" push origin master
Finally, you’ll find the committed artifact in your repository, under
root > BuildArtifacts
Voilà !
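For reference, a rough YAML equivalent of those two tasks might look like the sketch below. ArchiveFiles@2 is an assumption (the steps above don't say which task produces the zip), and the push only succeeds if the build service account has Contribute permission on the repository:

steps:
- checkout: self

# Assumption: the zip is produced by ArchiveFiles@2; replace this with whatever
# task actually writes your build output.
- task: ArchiveFiles@2
  inputs:
    rootFolderOrFile: '$(Build.BinariesDirectory)'
    includeRootFolder: false
    archiveType: 'zip'
    archiveFile: '$(Build.SourcesDirectory)/BuildArtifacts/myBuildArtifact_run$(Build.BuildNumber).zip'

# Same commands as the command line script above.
- script: |
    git config user.email "joe@contoso.com"
    git config user.name "Automatic Build"
    git checkout master
    git add --all
    git commit -m "pipeline : $(Build.DefinitionName)"
    git -c http.extraheader="AUTHORIZATION: bearer $(System.AccessToken)" push origin master
  workingDirectory: '$(Build.SourcesDirectory)'
  displayName: 'Commit build artifact to the repo'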

Related

How to implement Git tag and merge on release?

The final stage of our release pipeline is a manual stage used to confirm the deployed release got its final acceptance. Among the tasks we would like to run in this stage:
Tag the develop branch with the release label. Say "1.2.3".
Merge the develop branch into the master branch.
(We're using Azure Git repositories)
Although it looks like the right moment to make these changes in Git, I'm not quite certain this is the intended usage of Azure release pipelines. I confess to being a bit new to Azure Pipelines, and there seems to be no evident pipeline task for making such changes.
However, I believe this kind of post-release SCM change is quite common.
My question is therefore: where and how is the proper way to apply these SCM changes in Azure DevOps?
EDIT: I got this working with a command line task that runs git commands. The configuration was passed in via a variable group.
You can just run the regular git commands in a Command Line script (first, clone the repo (or add it as an artifact), then tag/merge); a sketch follows below.
Or install the Tag Git on Release & Git Merge extensions and use them.
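As a concrete starting point, here is a minimal sketch of that Command Line approach written as a YAML step (the same git commands work in a classic Command Line task). The $(ReleaseLabel) variable and the build identity are hypothetical, and the build service account needs Contribute and Create Tag permissions on the repository:

steps:
- checkout: self
  persistCredentials: true   # keep the pipeline's Git credentials for the script below

- script: |
    git config user.email "build@contoso.com"   # hypothetical identity
    git config user.name "Automatic Build"
    git fetch origin develop master
    # Tag develop with the release label; $(ReleaseLabel) is a hypothetical variable, e.g. 1.2.3
    git checkout develop
    git tag $(ReleaseLabel)
    # Merge develop into master and push the merge commit plus the tag
    git checkout master
    git merge develop --no-ff -m "Release $(ReleaseLabel)"
    git push origin master --tags
  displayName: 'Tag develop and merge into master'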

Azure DevOps - Multiple repositories

I have code in two github repositories that I would like to build and run in the same build pipeline.
Does anybody know if it's possible to clone/pull more than one repo during the 'get sources' step?
In Azure DevOps, a pipeline is associated with a single repository by default. However, there are options to include code from other Git repositories in the build:
Add a Command Line task that executes git clone with a PAT in the pipeline (see the sketch after this list).
Add the 2nd repository as a submodule to your primary repository. Make sure to check the 'checkout submodules' checkbox under 'Get sources' in the classic editor.
Build each repository separately and use a RELEASE pipeline to bring them together, as below:
From the left menu, choose "Releases" under the "Pipeline" group. (as of 14th Oct 2019).
You will be able to add multiple artifacts to the pipeline by clicking on '+ Add'.
The screenshot below shows 3 different sources: a Docker image in Azure Container Registry, a build that has output artifacts, and a GitHub repository.
All the artifacts get copied to the build agent at run time in their own folders.
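As a sketch of the first option in the list above (clone with a PAT), a script step along these lines should work; the repository URL and the GH_PAT secret variable are hypothetical placeholders:

steps:
- checkout: self   # the pipeline's primary repository

# Clone the second GitHub repository next to the primary one.
# GH_PAT is a hypothetical secret variable holding a GitHub personal access token,
# and your-org/second-repo is a placeholder URL.
- script: |
    git clone https://$(GH_PAT)@github.com/your-org/second-repo.git $(Agent.BuildDirectory)/second-repo
  displayName: 'Clone second repository'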

How to download files from self hosted VM to VSTS

I have a Python solution that resides in a VSTS repository. Using a build pipeline and a private agent, the source code gets copied to a VM.
After executing the Python files, the output is stored in 3 different files at the source-directory level.
I want to download/copy these output files from the privately hosted VM to the VSTS repository.
How can this be achieved?
Thank you
The only way to get something into the repository is by checking it in via source control.
Maybe it's enough for you to just publish these files as a build artifact. You have the option to publish directly to VSTS or to any Windows file share.
If you really want these files in your repository, I'd suggest you publish them as build artifacts and check them in with a release pipeline. You could add a new stage to your existing release pipeline, or add a new release pipeline that triggers automatically every time your build completes.
You can call git commands to add and push the changes to the repository, for example:
Check the Allow scripts to access the OAuth token option
Add a Command Line task (Tool: git; Arguments: add [file path]; Working folder: $(System.DefaultWorkingDirectory))
Add a Command Line task (Tool: git; Arguments: commit -m "add build result"; Working folder: $(System.DefaultWorkingDirectory))
Add a Command Line task (Tool: git; Arguments: push https://test:$(System.AccessToken)@{account}.visualstudio.com/{project}/_git/{repository} HEAD:master)
Related article: Keep Git repository in sync between VSTS / TFS and Git
On the other hand, the better way is to publish the result files as a build artifact through the Publish Build Artifacts task, as in the sketch below.
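A minimal YAML sketch of that approach might look like this; the Contents pattern and the artifact name 'outputs' are assumptions to adapt to the three files your scripts actually produce:

steps:
# Copy the generated output files into the staging directory.
- task: CopyFiles@2
  inputs:
    SourceFolder: '$(Build.SourcesDirectory)'
    Contents: '*.txt'        # assumption: adjust to match your three output files
    TargetFolder: '$(Build.ArtifactStagingDirectory)'

# Publish them as a build artifact (downloadable from the run, or drop to a file share).
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'outputs'
    publishLocation: 'Container'   # 'FilePath' would target a Windows file share instead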

Jenkins - MultiBranch Pipeline : Could not fetch branches from source

I am trying to create a Multibranch Pipeline project in Jenkins with GitHub.
On the status page of the project there is a message saying that there are no branches with a Jenkinsfile, so the project was not built, as we can see in this image:
When I scan the repository, the log shows:
I configured the project with a GitHub source, as we can see in this image:
The URI of the repository (the Jenkinsfile is in the repository root) is:
https://github.com/AleGallagher/Prueba1
Could you help me please? I've spent many hours with this and I don't know what to do.
Thank you!
To use a Multibranch Pipeline, it is mandatory to have a Jenkinsfile in the repository branch.
How does it work?
The Multibranch Pipeline job first scans all of your repository branches and looks for a Jenkinsfile. If a branch meets that criterion, Jenkins executes the Jenkinsfile code and goes ahead with the build; if it can't find a Jenkinsfile, the console will show "criteria not met, jenkinsfile not found in branch".
For more on the Jenkinsfile, see https://jenkins.io/doc/book/pipeline/jenkinsfile/
Recommendations:
Choose git as the option for the Branch source.
Set credentials; prefer SSH, with the private key stored on the Jenkins side.
Make sure you have the correct access to the repository; if not, grant access by adding the same user's SSH public key to the repository.
Let me know if issue still persists.

Run CI build on pull request merge in TeamCity

I have a CI build that is setup in TeamCity that will trigger when a pull request is made in BitBucket (git). It currently builds against the source branch of the pull request but it would be more meaningful if it could build the merged pull request.
My research has left me with the following possible solutions:
Script run as part of build - rather not do it this way if possible
Server/agent plugin - not found enough documentation to figure out if this is possible
Has anyone done this before in TeamCity or have suggestions on how I can achieve it?
Update (based on John Hoerr's answer):
Alternate solution: forget about TeamCity doing the merge; use BitBucket webhooks to create a merged branch like GitHub does, and follow John Hoerr's answer.
Add a Branch Specification refs/pull-requests/*/merge to the project's VCS Root. This will cause TeamCity to monitor merged output of pull requests for the default branch.
It sounds to me like the functionality you're looking for is provided via the 'Remote Run' feature of TeamCity. This is basically a personal build with the merged sources and the target merge branch.
https://confluence.jetbrains.com/display/TCD8/Branch+Remote+Run+Trigger
"These branches are regular version control branches and TeamCity does not manage them (i.e. if you no longer need the branch you would need to delete the branch using regular version control means).
By default TeamCity triggers a personal build for the user detected in the last commit of the branch. You might also specify TeamCity user in the name of the branch. To do that use a placeholder TEAMCITY_USERNAME in the pattern and your TeamCity username in the name of the branch, for example pattern remote-run/TEAMCITY_USERNAME/* will match a branch remote-run/joe/my_feature and start a personal build for the TeamCity user joe (if such user exists)."
Then setup a custom "Pull Request Created" Webhook in Bitbucket.
https://confluence.atlassian.com/display/BITBUCKET/Tutorial%3A+Create+and+Trigger+a+Webhook
So for your particular use case with BitBucket integration, you could utilize the webhook you create and then have a shell/bash script (depending on your TeamCity server OS) that runs the remote-run git commands automatically, which will in turn trigger the TeamCity Remote Run CI build on your server. You'll then be able to go to the TeamCity UI, find the +HEAD:remote-run/my_feature branch, and view the Remote Run results on a per-feature basis, and be confident in the build results of the code you merge into your main line of code.
It seems that BitBucket/Stash creates branches for pull requests under:
refs/pull-requests/*/from
You should be able to set up a remote run for that location, either via the TeamCity run-from-branch feature or via an HTTP post-receive hook in BitBucket/Stash.
You can also use this plugin : https://github.com/ArcBees/teamcity-plugins/wiki/Configuring-Bitbucket-Pull-Requests-Plugin
(Full disclosure : I'm the main contributor :P, and I use it every day)