How do I give Azure Pipelines access to network resources?

I'm trying to set up a pipeline that includes steps such as deploying a DACPAC and running some end-to-end integration smoke tests. Unfortunately, some of the resources are inside an on-premises network. How can Azure Pipelines get access to those network resources?

You don't. If you need access to on-premises resources, install and configure a self-hosted (private) agent within your network and run those steps on it.
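To illustrate, here is a minimal sketch of a YAML pipeline that runs on such an agent. The pool name, SQL Server name, and DACPAC path are assumptions, not anything from the question:

```yaml
# Minimal sketch: run deployment and smoke tests on a self-hosted agent that
# sits inside the on-premises network. Pool, server, and file names are placeholders.
trigger:
- main

pool:
  name: OnPremPool        # assumed self-hosted agent pool registered from your network

steps:
# SqlPackage.exe is assumed to be installed on the agent (one common way to
# publish a DACPAC to an on-premises SQL Server from a script step).
- script: >
    SqlPackage.exe /Action:Publish
    /SourceFile:"$(Build.ArtifactStagingDirectory)\MyDb.dacpac"
    /TargetServerName:onprem-sql01
    /TargetDatabaseName:MyDb
  displayName: Deploy DACPAC to on-premises SQL Server

- script: npm run test:e2e
  displayName: Run e2e smoke tests against internal resources
```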

Related

Can we deploy an application to multiple VMs on multiple clouds with GitHub Actions?

I have an application that I want to deploy to a number of VMs on Azure and AWS. I was working with Azure DevOps before, and it provided very nice features to achieve this with deployment groups etc. Now I want to work with GitHub, and I am really having problems designing my CI/CD pipeline, since GitHub Actions does not have any built-in feature for deploying to a set of VMs. If there is one, please share your thoughts; any article would be appreciated. Thanks
You can first consider deploying the application to a single virtual machine with GitHub Actions.
In the Azure environment, all you need is to use GitHub Actions to build a virtual machine (VM) within Azure.
You can learn the detailed steps for deploying an application to one virtual machine with GitHub Actions in: How to use GitHub Actions to deploy an Azure Virtual Machine.
For multi-environment deployments to either Azure or AWS with GitHub Actions, I recommend using Octopus Deploy as a reference. You can refer to Multi-environment deployments with GitHub Actions and Octopus for deploying virtual machines on AWS.
For deploying the application to multiple VMs, we recommend Azure Batch to run parallel workloads. It lets you deploy the application to multiple VMs at one time, building on the single-VM deployment.
You can run the batch job using the Azure CLI by following the example: Run a Batch job with the Azure CLI.
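The answer above recommends Azure Batch or Octopus; as a plain-GitHub-Actions illustration of fanning the same deployment out to several VMs, a job matrix over SSH is a common pattern. This is only a rough sketch, and the host names, secret name, user, and paths are all assumptions:

```yaml
# Sketch only: repeat the single-VM deployment for several VMs (Azure or AWS)
# using a job matrix. Hosts, user, key secret, and paths are placeholders.
name: deploy-to-vms
on:
  push:
    branches: [main]

jobs:
  deploy:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        host: [azure-vm-1.example.com, aws-vm-1.example.com]   # assumed hosts
    steps:
      - uses: actions/checkout@v4
      - name: Build
        run: ./build.sh                                        # assumed build script
      - name: Copy artifact to ${{ matrix.host }}
        run: |
          echo "${{ secrets.DEPLOY_SSH_KEY }}" > key && chmod 600 key
          scp -i key -o StrictHostKeyChecking=no -r ./dist deploy@${{ matrix.host }}:/opt/app
      - name: Restart service on ${{ matrix.host }}
        run: ssh -i key -o StrictHostKeyChecking=no deploy@${{ matrix.host }} 'sudo systemctl restart myapp'
```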

Publishing remote test results to my Azure DevOps pipeline

I have a Node.js web application that I build in Azure Pipelines. I am planning to deploy the generated artifacts to an Azure VM (probably a DevTest Labs VM) as part of one of the pipeline steps.
I now want to run browser tests by pointing the browser at the URL hosted on the Azure VM. I want to use the Azure Windows and Linux VMs in a build pipeline to run the tests against this remote Azure VM and publish the results to the pipeline. These would essentially be Karma tests running on the Node.js server.
In my current design, the test results are going to be available on the Azure VM hosting the nodejs application.
What I don't understand is how I can get these test results back to the Azure pipeline so they can be published.
Is there a way I can architect this solution without having to set up my Azure VM as a pipeline agent in Azure DevOps?
Is there a standard pattern to design such continuous test infrastructure using Azure DevOps?
Thanks
According to your description, you just want to use a Microsoft-hosted agent to access a URL on your self-hosted machine (whether it is an Azure VM or your own physical machine makes no difference to the hosted agent).
That depends on whether the URL is accessible from the public internet.
The simplest solution here is to deploy your build agent on that Azure VM directly, then run the build and tests there. You can do this with the following script and tasks (see the sketch after this list):
run ng test or any other command to execute your tests
publish test results with the PublishTestResults task
publish code coverage results with the PublishCodeCoverageResults task
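A minimal sketch of those steps in YAML, assuming Karma is configured with a JUnit reporter and Cobertura coverage output (the reporter names and result-file paths are assumptions):

```yaml
# Sketch of the build/test steps that run on the agent installed on the Azure VM.
steps:
- script: npm ci
  displayName: Install dependencies

- script: npx ng test --watch=false --browsers ChromeHeadless
  displayName: Run Karma tests

- task: PublishTestResults@2
  inputs:
    testResultsFormat: JUnit
    testResultsFiles: '**/TESTS-*.xml'      # assumed karma-junit-reporter output
  condition: succeededOrFailed()

- task: PublishCodeCoverageResults@1
  inputs:
    codeCoverageTool: Cobertura
    summaryFileLocation: '$(System.DefaultWorkingDirectory)/coverage/cobertura-coverage.xml'   # assumed path
```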
The Microsoft-hosted agent pool will not work for every scenario. For many teams it is the simplest way to run jobs, so you can try it first and see whether it works for your build or deployment. If not, use a self-hosted agent; self-hosted agents give you more control over your builds, tests, and deployments.
In your scenario, setting up your Azure VM as a pipeline agent and running the build/tests on it should be the simplest and most convenient solution.

How to integrate OnPrem Azure DevOps Server with the cloud one?

My firm has the Azure DevOps online version, where we have all our projects and repos. We were not able to configure CI/CD for the repos because our internal server network does not have access to the internet.
To overcome this issue, we built a new server that has access to the internet and also to the internal network. On the new server, we installed and configured Azure DevOps Server 2019. We do not want to migrate our repos from the cloud version to the on-premises version.
I am trying to link the on-premises repo to the cloud repo, but it is not working. I issued a PAT on the cloud version and added it as a service connection under Pipelines in the on-premises version, but I am still not able to see and link the cloud repos.
I can clone the repo from the cloud to the on-premises server, but that will not get the latest code, as the code is being checked in to the cloud repos.
Can anyone please guide me on how to link the two?
Thanks!!!
I don't think there's a meaningful way to integrate Azure DevOps Services and Azure DevOps Server, as they are essentially the same product. I assume (but don't know) that you're looking to integrate Azure DevOps Services with on-premises builds and deployments, as you state that you want to keep the repos in Azure DevOps Services. So, in essence, you want to run build and deployment group agents in the on-premises environment.
Take a look at the agent documentation and especially the communication subsection:
https://learn.microsoft.com/en-us/azure/devops/pipelines/agents/agents?view=azure-devops
Or this old blog post, from which the communication section originates:
https://devblogs.microsoft.com/devops/deploying-to-on-premises-environments-with-visual-studio-team-services-or-team-foundation-server/
The ideal solution would probably be that you run self-hosted build agents in your server that's open to internet, and configure an agent pool for them in Azure DevOps Services. For deployments, you'll want to use Deployment Groups and install deployment group agents to target servers, where they'll just need outbound 443 access for communicating with Azure DevOps Services.
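If you use YAML pipelines rather than classic releases, the rough analogue of a deployment group is an environment with virtual machine resources. The following is only a sketch under assumed pool, environment, and script names (none of them come from the question):

```yaml
# Sketch: build on a self-hosted pool with internet access, then deploy through
# a deployment job targeting an environment whose VM resources were registered
# on the on-premises target servers. All names are placeholders.
stages:
- stage: Build
  jobs:
  - job: Build
    pool:
      name: SelfHostedBuild          # assumed self-hosted pool on the internet-facing server
    steps:
    - script: ./build.sh             # assumed build script that fills the staging directory
      displayName: Build the application
    - publish: $(Build.ArtifactStagingDirectory)
      artifact: drop

- stage: Deploy
  jobs:
  - deployment: DeployOnPrem
    environment:
      name: onprem-servers           # assumed environment with registered VM resources
      resourceType: VirtualMachine
    strategy:
      runOnce:
        deploy:
          steps:
          - script: ./install.sh $(Pipeline.Workspace)/drop   # assumed install script
            displayName: Install on the target server
```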
If that's not possible, you'd have to install deployment agents on the build machine, which then sees your other on-premises servers, but this is a rather unsatisfactory solution, since you'd either have to rely on WinRM capabilities for deployments or expose too much network between your build server and the other on-premises servers.

Azure pipeline - How do I deploy code to a preregistered application

Our DevOps team has created an application (e.g. athena) and registered it with AD. They have also given us a service principal.
The question I have is: how do I deploy my code in GitHub to the application (athena) that the DevOps team created for me, using the “Deploy to Kubernetes service” configuration template?
Apologies in advance, as I am not proficient in Azure and this shows my gap in understanding.
The Information I have are :
Repository - GitHub (I have done the appropriate authorisation and can see the repository)
Service principal (created by the DevOps team)
Application (created by the DevOps team)
I have created a Container Registry and Kubernetes service using azure portal
Now, I want to use the “Deploy to kubernetes service” configuration template.
Help much appreciated.
If you want to use this Deploy to Kubernetes service template, you must have two service connections: an Azure Resource Manager connection and a Kubernetes service connection.
So first, configure the connection between Azure Kubernetes, ARM, and Azure DevOps. Go to Project Settings -> Service connections, choose New Service Connection, and select Kubernetes. Fill in the configuration values for your Azure cluster.
Do the same for Azure Resource Manager; you can follow this doc to configure it.
Then, you can begin your build and release pipeline.
The Deploy to Kubernetes service task is used in the release pipeline. In the build pipeline, you must run the Docker build and push tasks to push the image to Azure Container Registry.
Then run the deploy task in the release. You can refer to this blog written by Azure DevOps Labs: Deploying a multi-container application to Azure Kubernetes Services. It has detailed steps you can follow.
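For orientation only, here is what the same build-then-deploy flow can look like in a YAML pipeline. The service connection names, repository/image names, and manifest path are placeholders, and the classic release designer described above achieves the same thing through the UI:

```yaml
# Sketch: build/push the image to Azure Container Registry, then deploy to AKS.
# "my-acr-connection", "my-aks-connection", and the image/manifest names are assumed.
stages:
- stage: Build
  jobs:
  - job: BuildAndPush
    pool:
      vmImage: ubuntu-latest
    steps:
    - task: Docker@2
      inputs:
        containerRegistry: my-acr-connection     # Docker registry service connection to ACR
        repository: athena
        command: buildAndPush
        Dockerfile: '**/Dockerfile'
        tags: $(Build.BuildId)

- stage: Deploy
  jobs:
  - job: DeployToAks
    pool:
      vmImage: ubuntu-latest
    steps:
    - task: KubernetesManifest@0
      inputs:
        action: deploy
        kubernetesServiceConnection: my-aks-connection
        manifests: manifests/deployment.yml       # assumed manifest path
        containers: myregistry.azurecr.io/athena:$(Build.BuildId)   # assumed registry URL
```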
In addition, there are two build source types. One is to import your GitHub repos into Azure DevOps Repos, so the build is triggered from Azure DevOps Repos. The other is to select GitHub as your build source; in that case the build is triggered directly by GitHub instead of by Azure DevOps Repos.
This also requires you to create a service connection to your GitHub account first, and then authorize it during build pipeline setup.

Using Azure DevOps to deploy to an offline server

I'm using azure devops pipeline to build my IIS application and deploy via release management to several different servers, and it works great. My issue though is that one of the servers I need to deploy to will always be offline, so I need to set up some sort of offline installer for that deployment. Is there a way to do this using the build and release management I already have that I'm not seeing?
Azure Pipelines assumes that the server is always available. The best I can think of is to generate some kind of drop on a file share and then add a Manual Intervention task to pause the pipeline and allow you to do your thing.
There is no air-gapped agent nor a way to run part of your pipeline on another system and import the results.
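A hedged sketch of that idea in YAML (in a classic release pipeline the agentless Manual Intervention task plays the same role as ManualValidation below); the pool name, share path, and notification address are all placeholders:

```yaml
# Sketch: stage the drop on a reachable file share, then pause the pipeline
# while someone carries the installer to the offline server by hand.
# \\fileserver\drops and the e-mail address are assumptions.
jobs:
- job: StageDrop
  pool:
    name: SelfHosted              # assumed self-hosted Windows agent that can reach the share
  steps:
  - download: current
    artifact: drop
  - task: CopyFiles@2
    inputs:
      SourceFolder: $(Pipeline.Workspace)/drop
      TargetFolder: \\fileserver\drops\$(Build.BuildNumber)

- job: WaitForOfflineInstall
  dependsOn: StageDrop
  pool: server                    # agentless job, required for ManualValidation
  steps:
  - task: ManualValidation@0
    inputs:
      notifyUsers: ops@example.com
      instructions: Copy the drop from the file share to the offline server, run the installer, then resume.
```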