Hi!
I have a deployable artifact set that gets tested in a pipeline, and this information must be passed to a subsequent deploy pipeline. Right now, I'm writing this information to a file, but how can the next pipeline read it?
Thanks!
Behold the article that answers this question: http://support.thoughtworks.com/entries/23754976-Pass-variables-to-other-pipelines
This is how my colleague set up the stages,
and here is the release of changes.
The question is: how can I deploy to PROD (let's say only project C, Releases 194 and 196) without A accidentally being pushed to PROD along with it (because it hasn't been approved to go to PROD)?
Thank you,
your help is really appreciated.
In general, if you wish to take a release of anything to a specific environment, and you have the appropriate approval permissions, you can do the following:
Click the link for the Release-### you wish to deploy
The pipeline will appear, and in the ribbon above, click the Deploy dropdown. Depending on how your pipeline is configured, you'll have the ability to deploy to a single or to multiple environments. Choose where you'd like it to go.
Your question wasn't completely clear, so I've answered as best I can, but from what I read it seems you might be using a single pipeline to manage deployment of multiple projects? Am I correct? If so, you might wish to look at creating a pipeline for each project. The only way I can tell what is deployed where in your question is by looking at the annotations you added on the right; separate release definitions would give you better clarity into that. Best of luck.
I am wondering why my Azure DevOps build pipeline is picking up practically all the changes instead of just the most recent one?
I only made one change to one file, and it didn't have an associated work item. This is what the change looks like. As you can see, I have only changed one file. So why is it detecting all the changes?
I thought maybe it was because I didn't have this option enabled.
But even with that option enabled, on a pipeline set up the EXACT same way, all the changes are still pulled. Am I missing something subtle here?
There's an option on the pipeline to automatically link work items that are completed in a run.
You can edit the Pipeline and in the settings, tick Automatically link work items included in this run. You can pick * for all branches, or a specific branch.
I'm not sure how you set up your pipeline, but every time you change files in the repo behind the pipeline and re-run it, each build lists the changes it picked up; how many are listed depends on how many changes you made to the repo since the last run of that pipeline.
Then, click into the latest run: it is true, it will list all changes compared to the previous run.
Click into any of them and you can get the details of every single change.
Is it possible to add additional mapping to Get Sources at runtime?
Like in a prejobexecution task?
We are currently using a PowerShell script that determines which additional mappings to set up based on iteration, area, and various business requirements, maps them into the current workspace, and then runs tf get.
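Roughly, the script does something like this (the server path, local path, and authentication details are simplified placeholders):

# Simplified sketch of our script: map an extra TFVC folder into the build
# workspace the agent created, then fetch it. tf.exe ships with the agent
# under externals\tf; authentication details are omitted here.
$tf = Join-Path $Env:AGENT_HOMEDIRECTORY "externals\tf\tf.exe"
Set-Location $Env:BUILD_SOURCESDIRECTORY

# Add the extra mapping to the workspace of the current run
# (workspace name from the predefined Build.Repository.Tfvc.Workspace variable)
& $tf workfold /map '$/Project/SharedLibs' "$Env:BUILD_SOURCESDIRECTORY\SharedLibs" "/workspace:$Env:BUILD_REPOSITORY_TFVC_WORKSPACE"

# Fetch the newly mapped sources
& $tf get "$Env:BUILD_SOURCESDIRECTORY\SharedLibs" /recursive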
This works; however, the changesets and work items from the additional mappings are not linked to the run.
We have also tried a different approach, where a “starter”-pipeline runs the scripts and modifies another pipeline (updates the tfvcMapping) and then invokes it using a build completion trigger.
All changesets and work items are linked, however, the approach does not seem right.
I have encountered an issue very similar to yours before (I use Git). Personally, I prefer your second solution, which preserves all the linked information (changesets and work items) at the cost of an additional pipeline.
With the first approach, just as you found in your tests, we lose some of that relevant information, which is not what we expect. Although we can use a checkout command to get the latest changesets, we cannot simply do the same for work items, because that linking is done by Azure DevOps itself; it is difficult for us to resolve the associated work items from the changesets and attach them to our build.
The solution for me: create a pipeline (your "starter" pipeline) that invokes the Definitions - Update REST API to update the Get Sources settings of the other pipeline, and then add a build completion trigger:
PUT https://dev.azure.com/{organization}/{project}/_apis/build/definitions/{definitionId}?api-version=5.1
Check the similar request body here.
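A minimal PowerShell sketch of that flow might look like this (the organization, project, definition ID, and mapping are placeholders, and it assumes a PAT in $env:PAT; the tfvcMapping shape follows what Definitions - Get returns):

# Hedged sketch: fetch the target definition, adjust its TFVC mapping, and PUT it back.
# All names/IDs below are placeholders; assumes a PAT with build permissions in $env:PAT.
$org          = "https://dev.azure.com/yourorg"
$project      = "YourProject"
$definitionId = 42
$headers = @{
    Authorization = "Basic " + [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$($env:PAT)"))
}
$url = "$org/$project/_apis/build/definitions/${definitionId}?api-version=5.1"

# Definitions - Update requires the full definition body (including the current
# revision), so read it first, modify it, then send it back.
$definition = Invoke-RestMethod -Uri $url -Headers $headers -Method Get

# Assumed shape: the TFVC mapping lives in repository.properties.tfvcMapping as a JSON string.
$definition.repository.properties.tfvcMapping =
    '{"mappings":[{"serverPath":"$/Project/Main","mappingType":"map","localPath":"\\"}]}'

Invoke-RestMethod -Uri $url -Headers $headers -Method Put `
    -Body ($definition | ConvertTo-Json -Depth 100) -ContentType "application/json"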
Hope this helps.
I am using Build Pipeline: VSTS and Repo: VSTS
I am trying to create VSTS build pipelines for two branches, Dev and UAT. I can achieve this by creating two different pipelines, but since both have almost identical steps, I wanted to have only one pipeline and, depending on a condition, omit some steps. But I am not able to figure out how to pass the variable value (the branch which triggered the build) before queueing the build in VSTS.
Background: I tried mapping both branches in Get Sources, set triggers on both branches, and used the Build.SourceBranchName variable, but it gives the top-level (Project) value instead of the branch name. I have a structure like the one below:
Project
- Dev
- QA
The agenda is to trigger the build pipeline on check-in, run sanity checks, and publish the artifacts per environment (Dev, QA). I am not going for CD right now (it will be handled manually because of some constraints).
The solution might sound very awkward but I am a newbie and I want to learn it.
I have seen some people explain how to do this through the API, but is there any way to do it through the UI?
VSTS use API to set build parameters at queue time
Any help would be highly appreciated.

To add a variable whose value you can set when you queue the build, go to the Variables tab, add your variable, and check the Settable at queue time checkbox.
Now when you queue a build you can change the default value.
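A script step can then branch on that variable; here is a small sketch, assuming you named the variable Environment (queue-time variables surface to scripts as environment variables):

# Branch build behaviour on the queue-time variable 'Environment' (name assumed)
$environment = $Env:ENVIRONMENT
if ($environment -eq "Dev")
{
    Write-Host "Dev build: running sanity checks and publishing Dev artifacts"
}
else
{
    Write-Host "QA build: publishing QA artifacts"
}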
I'm trying to execute slightly different build steps in VSTS based on how the build was started: automatically or manually.
I'm especially interested in accessing that information from a PowerShell script, but so far I have not been able to find a suitable solution or workaround.
Has someone faced a similar requirement before? How did you solve it? I would appreciate your help!
It seems you want to know whether the build was triggered automatically by TFS or queued manually.
There is no such built-in feature for vNext builds for now. You could submit a suggestion about this on UserVoice; the TFS Product Team is listening to your voice there.
As a workaround, either use two build definitions with different version patterns, or manually add a specific tag after a manual build finishes; tags let you put labels on builds to distinguish manual from automatic ones. But this is a manual action; it would be better if we could do it automatically, for example as sketched below.
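For instance, a sketch of tagging the running build from a script step (assumes "Allow scripts to access the OAuth token" is enabled on the phase; the tag name "manual" is arbitrary):

# Tag the current build via the Tags - Add Build Tag REST API
$headers = @{ Authorization = "Bearer $Env:SYSTEM_ACCESSTOKEN" }
$url = "$($Env:SYSTEM_TEAMFOUNDATIONCOLLECTIONURI)$($Env:SYSTEM_TEAMPROJECT)" +
       "/_apis/build/builds/$($Env:BUILD_BUILDID)/tags/manual?api-version=5.1"
Invoke-RestMethod -Uri $url -Headers $headers -Method Put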
It seems I've managed to find an option that allows determining whether a build was triggered automatically or manually.
All builds started manually have the actual user in the $Env:BUILD_QUEUEDBY variable, while automatic builds have a system account there. My value was [********]\Project Collection Service Accounts.
I don't know how reliable it is, but for me, so far, the following code did the job:
# Identify who queued the build: manual runs carry the real user account,
# automatic builds carry the project collection service account.
$OwnerId = $Env:BUILD_QUEUEDBY
if ($OwnerId.ToUpper().EndsWith("PROJECT COLLECTION SERVICE ACCOUNTS"))
{
    Write-Host "Build was triggered automatically. Resulting files considered 'BETA'"
}
else
{
    Write-Host "Build was triggered manually. Resulting files considered 'STABLE'"
}