I would like to understand how the automatic git checkout happens in Azure DevOps build pipelines. How does the build agent authenticate to the Azure DevOps repo? Which user does the pipeline use to clone or check out the repo?
My concern is: when I trigger a pipeline, I can see that the repository is checked out onto the build agent. Which user does Azure DevOps use? When I try to push, it asks for AD authentication, so how do checkout and push differ from the pipeline's perspective?
When you set up your pipeline initially, you specified where your code resides, e.g. on GitHub. As part of that step a service connection was created (you were asked for it), and whatever you specified there will be used. You can inspect it in the project settings under "Service Connections".
It depends on which source code tool you use. For example, if you use GitHub, you have to set up a connection with it. This can be done in the project settings under GitHub connections. You can then use this service connection to check out your code.
This GitHub integration is made on behalf of a particular user, and by navigating on GitHub to Settings -> Integrations -> Applications you can see the exact permissions it was granted.
I imagine GitHub/Azure DevOps then uses this integration object for authentication.
For Azure Repos repositories, you do not need a service connection; the repository can be checked out automatically.
When you run the pipeline you can see the exact commands that were executed for authentication:
git remote add origin https://ORG.visualstudio.com/test-project/_git/test-project
git config gc.auto 0
git config --get-all http.https://ORG.visualstudio.com/test-project/_git/test-project.extraheader
git config --get-all http.extraheader
git config --get-regexp .*extraheader
git config --get-all http.proxy
git config http.version HTTP/1.1
git -c http.extraheader="AUTHORIZATION: bearer ***" fetch --force --tags --prune --prune-tags --progress --no-recurse-submodules origin
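The bearer token in that last fetch is the job's scoped OAuth access token (exposed to scripts as System.AccessToken), and it is injected per-invocation with `-c http.extraheader`, so it is never written to `.git/config`. That is why the agent's checkout succeeds silently while a push from your own machine prompts for AD credentials. A minimal local sketch of the extraheader mechanism (FAKE-TOKEN is a placeholder; here the header is persisted only to demonstrate that it is ordinary git configuration, whereas the agent passes it per command):

```shell
# http.extraheader is a plain git config value; the agent supplies it
# with -c on each git invocation instead of persisting it like this.
tmp=$(mktemp -d)
git -C "$tmp" init -q
git -C "$tmp" config http.extraheader "AUTHORIZATION: bearer FAKE-TOKEN"
header=$(git -C "$tmp" config --get http.extraheader)
echo "$header"   # AUTHORIZATION: bearer FAKE-TOKEN
```

In a real pipeline step you would likewise not persist the token, e.g. `git -c http.extraheader="AUTHORIZATION: bearer $SYSTEM_ACCESSTOKEN" push origin HEAD:some-branch`, provided the build service identity has Contribute permission on the repo.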
Microsoft-hosted agents run on a secure Azure platform. However, you must be aware of the following security considerations.
Although Microsoft-hosted agents run on Azure public network, they are not assigned public IP addresses. So, external entities cannot target Microsoft-hosted agents.
Microsoft-hosted agents are run in individual VMs, which are re-imaged after each run. Each agent is dedicated to a single organization, and each VM hosts only a single agent.
There are several benefits to running your pipeline on Microsoft-hosted agents, from a security perspective. If you run untrusted code in your pipeline, such as contributions from forks, it is safer to run the pipeline on Microsoft-hosted agents than on self-hosted agents that reside in your corporate network.
When a pipeline needs to access your corporate resources behind a firewall, you have to allow the IP address range for the Azure geography. This may increase your exposure, as the range of IP addresses is rather large and machines in this range can belong to other customers as well. The best way to prevent this is to avoid the need to access internal resources.
Hosted images do not conform to CIS hardening benchmarks. To use CIS-hardened images, you must create either self-hosted agents or scale-set agents.
Taken from Microsoft-hosted agents - Security.
The most important part is probably
Microsoft-hosted agents are run in individual VMs, which are re-imaged after each run. Each agent is dedicated to a single organization, and each VM hosts only a single agent.
Next to that, check Create and manage agent pools - Security of agent pools.
Deploy issue with Azure Release Pipeline
I am having trouble deploying files to my servers through the Release Pipelines.
I need to copy files to a Windows and a Linux server. I have tried the file copy and SSH file copy tasks, but they seem to be blocked because the Microsoft servers aren't in my firewall whitelist. Worse, I can't seem to get a reliable list of IPs that I need to whitelist, and even if I did, they seem to change over time.
So, any advice appreciated.
Also, I am a bit confused about the Azure agent. My understanding was that you install it on the servers so that you don't need to worry about firewall issues. I just have the feeling I am missing something. I have no idea what the agent is doing at the moment - it certainly doesn't seem to be helping with the file deploy.
Thanks in advance!
Self-hosted agent: an agent that you set up and manage on your own to run jobs.
To resolve this issue, you could create your own private agent and then add the IP address of the machine where the private agent is deployed to the firewall whitelist of your server machines.
In this case the Azure Release Pipeline runs on your private agent, and the IP of the machine hosting it is whitelisted, so it will not be blocked by the firewalls of the Windows and Linux servers.
You can refer to the document Self-hosted agents to create your private agent.
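For orientation, registering a Linux private agent looks roughly like this (a sketch, not a definitive recipe: the organization URL, PAT, pool name, and tarball path are all placeholders; the agent package is downloaded from Organization settings -> Agent pools -> New agent, and the PAT needs the Agent Pools (read, manage) scope):

```shell
# Unattended registration of a self-hosted (private) agent.
mkdir myagent && cd myagent
tar zxf ~/Downloads/vsts-agent-linux-x64-*.tar.gz
./config.sh --unattended \
  --url https://dev.azure.com/YOUR-ORG \
  --auth pat --token YOUR-PAT \
  --pool Default --agent my-private-agent
./run.sh   # or ./svc.sh install && ./svc.sh start to run as a service
```

Once the agent is online, point the release pipeline's jobs at its pool, and whitelist that machine's IP on the target servers.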
We have all our pipelines on Azure DevOps:
...but we are using a repo from another team that lives on a company-owned Bitbucket Server. We have our own branch on that Bitbucket repo, and I would like merged PRs to it to trigger the Azure (build) pipeline; currently I run it manually, which is very tedious. Note that this is a Bitbucket Server my company owns (not the public Bitbucket Cloud), so we have our own custom domain name, i.e. www.bitbucket.MY-COMPANY-NAME.com. I tried using the built-in feature of the Azure DevOps (CI) build pipeline to link to the Bitbucket repo and trigger scheduled builds at 12 AM, but it always fails with this message:
An exception occurred while polling the repository. Error:
Microsoft.TeamFoundation.Build2.Server.Extensions.ExternalConnectorException: No references received in the response from https://bitbucket.MY-COMPANY-NAME.com/scm/some-repo/some-repo.git/info/refs?service=git-upload-pack. Status: 502, Reason: Bad Gateway
   at Microsoft.TeamFoundation.Build2.Server.Extensions.GitConnector.ReadRefs(IVssRequestContext requestContext, HttpResponseMessage response) in D:\v2.0\P1\_work\1\s\Tfs\Service\Build2\Extensions\SourceProviders\Git\GitConnector.cs:line 318
   at Microsoft.TeamFoundation.Build2.Server.Extensions.GitConnector.GetBranches(IVssRequestContext requestContext, ExternalConnection connection, Int32 timeoutSeconds, Boolean useAnonymousAccess) in D:\v2.0\P1\_work\1\s\Tfs\Service\Build2\Extensions\SourceProviders\Git\GitConnector.cs:line 125
   at Microsoft.TeamFoundation.Build2.Server.Extensions.GitSourceProvider.GetMatchingBranchRefs(IVssRequestContext requestContext, BuildDefinition definition, IList`1 branchFilters) in D:\v2.0\P1\_work\1\s\Tfs\Service\Build2\Extensions\SourceProviders\Git\GitSourceProvider.cs:line 463
   at Microsoft.TeamFoundation.Build2.Server.Extensions.GitSourceProvider.GetSourceVersionsToBuild(IVssRequestContext requestContext, BuildDefinition definition, List`1 branchFilters, Boolean batchChanges, String previousVersionEvaluated, Dictionary`2& ciData, String& lastVersionEvaluated) in D:\v2.0\P1\_work\1\s\Tfs\Service\Build2\Extensions\SourceProviders\Git\GitSourceProvider.cs:line 369
   at Microsoft.TeamFoundation.Build2.Server.Extensions.BuildPollingJobExtension.Run(IVssRequestContext requestContext, TeamFoundationJobDefinition jobDefinition, DateTime queueTime, String& resultMessage) in D:\v2.0\P1\_work\1\s\Tfs\Service\Build2\Extensions\BuildPollingJobExtension.cs:line 98.
No logs available for this run
According to Microsoft's document Build on-premises Bitbucket repositories:
If your on-premises server is reachable from the servers that run the Azure Pipelines service, then:
you can set up classic build pipelines and configure CI triggers
If your on-premises server is not reachable from the servers that run the Azure Pipelines service, then:
you can set up classic build pipelines and start manual builds
you cannot configure CI triggers
YAML pipelines do not work with on-premises Bitbucket repositories.
PR triggers are not available with on-premises Bitbucket repositories.
And here is a document that provides some troubleshooting advice about failing triggers:
Is your Bitbucket server accessible from Azure Pipelines? Azure Pipelines periodically polls the Bitbucket server for changes. If the Bitbucket server is behind a firewall, this traffic may not reach your server. See Azure DevOps IP Addresses and verify that you have granted exceptions to all the required IP addresses. These IP addresses may have changed since you originally set up the exception rules. You can only start manual runs if you used an external Git connection and your server is not accessible from Azure Pipelines.
Is your pipeline paused or disabled? Open the editor for the pipeline, and then select Settings to check. If your pipeline is paused or disabled, triggers do not work.
Have you excluded the branches or paths to which you pushed your changes? Test by pushing a change to an included path in an included branch. Note that paths in triggers are case-sensitive. Make sure that you use the same case as the real folders when specifying the paths in triggers.
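To apply the first check, note that the endpoint Azure Pipelines polls is git's standard "smart HTTP" ref advertisement, the same URL that appears in the error above, so you can probe it yourself. A small sketch that just derives the polling URL from the clone URL and prints a curl probe (run the printed command from a network comparable to where Azure Pipelines sits; a 200 with refs means reachable, a 502 points at a proxy or gateway in front of Bitbucket):

```shell
# Build the info/refs polling URL from a clone URL and print a probe
# command; the host is the one from the error message above.
clone_url="https://bitbucket.MY-COMPANY-NAME.com/scm/some-repo/some-repo.git"
refs_url="${clone_url}/info/refs?service=git-upload-pack"
echo "curl -sSI '$refs_url'"
```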
I have my code in GitHub and build in Azure DevOps.
The build is executed nightly, but the service connection Usage history is always empty for GitHub:
Any ideas how to figure out which pipelines use which service connection?
Any ideas how to figure out which pipelines use which service connection?
In my experience, the usage history there records GitHub connection usage during the pipeline run, not during pipeline checkout. So if the service connection is not used in any task, the history stays empty.
Some details:
I have one pipeline which uses a private GitHub repo as its source:
I ran the pipeline three times, builds #1434, #1435 and #1436. #1434 only has one simple CMD task, while the next two runs (#1435, #1436) have extra GitHub-related tasks which use that GitHub connection as a task input.
#1434 (uses the GitHub connection in the Get Sources step but not in the actual run):
#1435 and #1436 (call the GitHub connection in the actual run):
The result after several minutes:
For now, the GitHub connection usage history does not display runs in which the connection is only used for Get Sources authorization. We have to check ourselves whether any pipeline uses the connection in its Get Sources step.
In addition: I think it would be a good idea if the usage history also displayed runs that use the GitHub connection for the Get Sources authentication step. Feel free to submit a feature request in the User Voice forum to share your idea with the product team; they will consider it seriously if it gets enough votes.
The Azure DevOps Pipelines Get API can be used to fetch metadata about the repo; if it's GitHub, compare the service connection id with the wanted one. Sample solution:
https://dev.azure.com/kagarlickij/_git/azuredevops-check-service-conn-usage
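A sketch of the same idea in shell (endpoint and property names per my understanding of the Definitions - List API: with `includeAllProperties=true` each definition carries a `repository` block, whose `properties.connectedServiceId` is the service connection used for checkout). The API response is mocked here with a sample payload; swap the `sample=` line for the commented `curl` to run it for real:

```shell
# List pipeline name, repo type, and checkout service connection id.
# Real call (PAT with Build read scope):
# sample=$(curl -su ":$AZDO_PAT" \
#   "https://dev.azure.com/ORG/PROJECT/_apis/build/definitions?api-version=6.0&includeAllProperties=true")
sample='{"value":[{"name":"nightly-build","repository":{"type":"GitHub","properties":{"connectedServiceId":"1111-2222"}}}]}'
row=$(echo "$sample" | jq -r \
  '.value[] | "\(.name) \(.repository.type) \(.repository.properties.connectedServiceId // "-")"')
echo "$row"   # nightly-build GitHub 1111-2222
```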
We are trying to start using the Azure Pipelines hosted agents instead of self-hosted ones. While converting our acceptance tests, I am running into an issue: the agent does not allow our test to connect to an API we spin up within the agent on port 44392. I noticed this post, How to open TCP port on localhost on VSTS build agent?, from a couple of years ago, which is pretty similar to how our test works. Just wondering if the answer is still accurate or not.
Since you are using the hosted agents, the machine is a shared resource between many Azure DevOps organizations (tenants) and is managed (and locked down) by Microsoft.
In other words, end users cannot open ports on these agents. The answer in your link is still valid.
You may have to install an agent on your own virtual machine and run the build there. The VM can be in the cloud or on-premises. You trade simplicity and low cost for full control.
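To be clear about the distinction (per the linked answer): what is locked down is exposing ports to the outside world; binding a port and calling it on localhost within the same job still works, which is usually all an acceptance test needs. A small sketch, assuming python3 is available and using the port from the question:

```shell
# Start a throwaway HTTP server on 44392, hit it from the same job,
# then shut it down; the traffic never leaves the VM.
python3 -m http.server 44392 --bind 127.0.0.1 >/dev/null 2>&1 &
srv=$!
sleep 1
code=$(curl -s -o /dev/null -w "%{http_code}" http://127.0.0.1:44392/)
kill "$srv"
echo "$code"
```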
In order for Jenkins to have access to multiple repositories on the same server, I set .ssh/config as follows:
Host a.github.com
HostName github.com
User git
IdentityFile ~/.ssh/project-a-id_rsa
# same for other repos
and set the Jenkins jobs' Source Code Management (SCM) to git and git@a.github.com:user/repo_a.git. It works fine.
Problem
I want those jobs to be triggered on push events, so I set up a webhook service on GitHub, i.e., Jenkins (GitHub plugin). The requests received from the webhook are POSTs for https://github.com/user/repo_a, which is a different host than the one set in the SCM, i.e., a.github.com.
Because they are different, the job does not build automatically.
Ugly Solution
I got something working by setting the SCM to github.com and overriding the remote URL in the project's git config with a.github.com once cloned. That way the SCM matches the webhook, and Jenkins, when running git push, uses the .ssh/config info.
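The remote rewrite itself is a one-liner; a local sketch of the mechanism (no network needed, using the repo names from above and simulating the clone with git init):

```shell
# The clone fetches via the plain host so the webhook URL matches;
# afterwards point origin at the ssh-config alias so pushes pick up
# the per-repo IdentityFile.
repo=$(mktemp -d)
git -C "$repo" init -q
git -C "$repo" remote add origin git@github.com:user/repo_a.git
git -C "$repo" remote set-url origin git@a.github.com:user/repo_a.git
git -C "$repo" remote get-url origin   # git@a.github.com:user/repo_a.git
```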
Question
What else can I do? Is there a better, easily automated way to achieve this?
I stopped using the deploy key and added my own account credentials on Jenkins to be able to deal with all the repositories without having to change the host via .ssh/config.