I'm using an Azure DevOps pipeline to build my IIS application and deploy it via Release Management to several different servers, and it works great. My issue, though, is that one of the servers I need to deploy to will always be offline, so I need to set up some sort of offline installer for that deployment. Is there a way to do this with the build and release management I already have that I'm not seeing?
Azure Pipelines assumes that the server is always available. The best I can think of is to publish some kind of drop to a file share and then add a Manual Intervention task to pause the release while you carry the package over to the offline server and install it by hand.
There is no air-gapped agent, nor a way to run part of your pipeline on another system and import the results.
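For illustration, here's a minimal sketch of what the "drop to a file share" step could look like as an inline PowerShell script in the release, placed just before a Manual Intervention task; the artifact path and UNC share below are assumptions, not something your existing pipeline defines.

```powershell
# Hypothetical release step: stage the published drop on a share that can be
# carried across to the offline server (the paths below are placeholders).
param(
    [string]$ArtifactDir = "$env:SYSTEM_ARTIFACTSDIRECTORY\MyApp\drop",
    [string]$ShareRoot   = "\\fileserver\releases"
)

$stamp  = Get-Date -Format 'yyyyMMdd-HHmmss'
$target = Join-Path $ShareRoot "MyApp-$stamp"

New-Item -ItemType Directory -Path $target | Out-Null
Copy-Item -Path (Join-Path $ArtifactDir '*') -Destination $target -Recurse

Write-Host "Drop staged at $target; the Manual Intervention task pauses the release here."
```

The Manual Intervention task then holds the stage until someone confirms the offline install is done.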
Does anyone know the equivalent of MSDEPLOY_RENAME_LOCKED_FILES (used in Azure) in Azure DevOps? I want to set this when the pipeline runs in Azure DevOps to publish to a local IIS server.
Currently, this setting only applies to the Azure Web App service: it is applied through the Azure App Settings (cloud).
It does not apply to an on-premises IIS server, and as far as I know there is no workaround for this.
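For reference, on the Azure Web App side the flag is applied as an app setting; here's a hedged sketch using the Azure CLI from PowerShell, with a placeholder resource group and app name.

```powershell
# Sketch: apply MSDEPLOY_RENAME_LOCKED_FILES as an app setting on an Azure Web App.
# "my-rg" and "my-webapp" are hypothetical placeholders.
az webapp config appsettings set `
    --resource-group "my-rg" `
    --name "my-webapp" `
    --settings MSDEPLOY_RENAME_LOCKED_FILES=1
```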
Forgive me for asking a basic question. I come from an IT infrastructure background and have been asked to create CI/CD pipelines based on what I have recently learned about DevOps.
We have a couple of applications whose source code is currently in TFS 2013; both are written in ASP.NET (C#). The requirement is to migrate the source code from TFS to Azure Repos (Azure DevOps Services) and then create a CI/CD pipeline.
For demo purposes, the customer is asking us to run the deployment (i.e. the release pipeline) for both applications against a test server that is a plain Windows Server 2012 machine without SQL Server or IIS. Is that possible, and how could we confirm that the release pipeline is functioning properly?
In my opinion it won't work, as no application infrastructure or configuration exists for those applications on that plain test server. I suspect we actually need a ready dev/stage environment that is a replica of production to test the release pipeline for those applications. Am I correct?
I just need expert advice to confirm this so I can communicate the same to the customer.
Azure DevOps Pipelines use an agent to perform the deployments. You can run the agent entirely in the cloud when deploying to Azure resources. You can also install an agent locally. Follow this link and scroll down to read about self-hosted agents. This is how you can deploy to your test instance from the pipeline.
Now, what you deploy there may require additional software to be installed. You say it's an application in C#. Cool. Now, what does it do? Is it a Windows program? Then just having the server there, with an agent installed, is all you need. Is it a web program? Then yes, it's going to need an IIS (or whatever) instance available somewhere to deploy to. Is it a database program? Then yes, it's going to need a database instance to deploy to. There's nothing magical about having a VM or a machine somewhere; all the same rules apply. There has to be an OS, drives, memory and, yes, supporting services depending on the needs of the application.
However, using a local machine instead of a hosted one works fine; just follow the instructions in the link above.
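As a rough illustration (not an exact recipe), registering a self-hosted agent on that test server comes down to extracting the agent package and running its configuration script; the organization URL, PAT, and pool name below are placeholders.

```powershell
# Sketch: unattended registration of a self-hosted agent on the test server.
# Extract the agent zip (Organization settings > Agent pools > New agent) first;
# the URL, token, and pool below are placeholders.
cd C:\agent

.\config.cmd --unattended `
    --url "https://dev.azure.com/your-organization" `
    --auth pat --token "<personal-access-token>" `
    --pool "Default" `
    --agent "$env:COMPUTERNAME" `
    --runAsService
```

Once the agent shows up in the pool, the release pipeline can target that machine like any other environment.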
I'm asking for an opinion or direction on the following problem.
We are using Bitbucket Pipelines for CI/CD of web applications to Azure. What remains is the database, which is also hosted on Azure.
From my research, everything on SQL Database Project deployments usually relies on Azure DevOps pipelines (it connects to a GitHub repo, supports multiple environments, and has a built-in SQL deployment task that deploys the SQL database to the target server via a DACPAC file). It allows CI on every check-in, every time you push changes. Nice!
But what if we cannot (for some reason) use Azure DevOps and have to use Bitbucket Pipelines instead? Is that possible? How? Via scripting? A tool to call from the command line? Any help is highly appreciated.
It's true that it is easier to deploy an (Azure) SQL database from Azure DevOps, as Azure DevOps offers many tasks (including third-party custom tasks you can find in the Microsoft Marketplace).
However, no matter which tool you use, you should be able to do the same, as long as you understand how the deployment of that specific service works.
I don't know Bitbucket very well, but I bet it can execute commands, including PowerShell. If so, you need two steps in your pipeline to publish an Azure SQL database (a rough sketch of both follows the list):
1) Create the server and an (empty) database. Perhaps Bitbucket offers a task for creating services in Azure (from an ARM template or another way); if not, you can always use the Azure CLI or PowerShell to do so. More info: az cli server
2) Deploy the database, or the changes to it. This step always compares the DACPAC file (the compiled version of a SQL Server database project) against the target database on the server. The result is a differential T-SQL script that must be executed against the target database. There is only one way to do so: sqlpackage.exe, a tool provided by Microsoft. You can find the full documentation here and plenty of examples of how to use it on the Internet.
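Here's a rough sketch of both steps as a single PowerShell script, the kind of thing a Bitbucket Pipelines step could invoke; the server, database, credentials, and DACPAC path are placeholders, and it assumes the database project has already been built and sqlpackage.exe is on the PATH.

```powershell
# Step 1 (sketch): create the logical server and an empty database with the Azure CLI.
# All names, locations, and credentials below are placeholders.
az sql server create --name "my-sql-server" --resource-group "my-rg" `
    --location "westeurope" --admin-user "sqladmin" --admin-password $env:SQL_ADMIN_PASSWORD

az sql db create --resource-group "my-rg" --server "my-sql-server" `
    --name "MyDatabase" --service-objective "S0"

# Step 2 (sketch): compare the DACPAC to the target database and apply the diff script.
& sqlpackage.exe /Action:Publish `
    /SourceFile:"bin\Release\MyDatabase.dacpac" `
    /TargetServerName:"my-sql-server.database.windows.net" `
    /TargetDatabaseName:"MyDatabase" `
    /TargetUser:"sqladmin" `
    "/TargetPassword:$env:SQL_ADMIN_PASSWORD"
```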
Let me know if that helps.
What options are there to deploy a web application to a heavily locked-down machine without WinRM?
The situation is as follows:
The code is in the Azure DevOps cloud.
The release server is in a semi-secured area with access to download artifacts from Azure DevOps.
The target server is in a very locked-down zone.
If the release server can only copy files to a specific temporary folder on the target machine, is there a way to deploy to it without WinRM?
My initial thought is to have a script on the target machine watch for the artifact showing up and deploy it. I want to know whether there's a better way or whether that's my best option.
If the release server can only copy files to a specific temporary folder on the target machine, is there a way to deploy to it without WinRM?
If you've read the document Deploy your Web Deploy package to IIS servers using WinRM, you will have noticed this note below the title:
A simpler way to deploy web applications to IIS servers is by using deployment groups instead of WinRM.
So you can consider a deployment group as the simpler direction. Here are some discussions (#1, #2) that may help you choose between WinRM and deployment groups, depending on your needs.
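For illustration, registering a target server into a deployment group uses the same agent package with deployment-group switches; Azure DevOps generates an equivalent script for you when you create the group, and every value below is a placeholder.

```powershell
# Sketch: register a target machine into a deployment group (placeholders throughout).
# Azure DevOps generates a ready-made version of this under Pipelines > Deployment groups.
cd C:\azagent

.\config.cmd --deploymentgroup `
    --deploymentgroupname "Locked-Down-IIS" `
    --url "https://dev.azure.com/your-organization" `
    --projectname "YourProject" `
    --auth pat --token "<personal-access-token>" `
    --agent "$env:COMPUTERNAME" `
    --runAsService
```

Note that this does require the target machine to have outbound access to your Azure DevOps organization, which leads to the update below.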
Update 1:
My initial thought is to have a script on the target machine watch for the artifact showing up and deploy it. I want to know whether there's a better way or whether that's my best option.
In your specific scenario, that's a reasonable choice when the target server has no line of sight to the Azure DevOps/TFS server and you can't (or would rather not) use WinRM.
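If you do go down the watcher route, a minimal polling sketch of such a script might look like the following; the drop folder, the ".ready" marker convention, the site path, and the app pool name are all assumptions about your environment.

```powershell
# Sketch: poll a drop folder on the target machine and deploy when a new package
# plus its ".ready" marker appears. All paths and names below are placeholders.
Import-Module WebAdministration

$dropFolder = 'C:\DropZone'
$archive    = 'C:\DropZone\deployed'

New-Item -ItemType Directory -Path $archive -Force | Out-Null

while ($true) {
    # The release server copies <package>.zip first and <package>.zip.ready last,
    # so a half-copied archive is never picked up.
    Get-ChildItem -Path $dropFolder -Filter '*.zip.ready' | ForEach-Object {
        $package = $_.FullName -replace '\.ready$', ''

        # Placeholder deploy step: unpack into the site folder and recycle the pool.
        Expand-Archive -Path $package -DestinationPath 'C:\inetpub\wwwroot\MyApp' -Force
        Restart-WebAppPool -Name 'MyAppPool'

        Move-Item -Path $package, $_.FullName -Destination $archive -Force
    }
    Start-Sleep -Seconds 30
}
```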
What is the best way to go about adding status checks from Azure DevOps in a self-hosted scenario using a private GitHub Enterprise instance (not accessible externally)?
Presently our setup is self-hosted, with the agent installed on our build servers. We have a custom hook that tells Azure Pipelines to start a build; however, we aren't sure of the best way to deal with the status checks. It would be preferable not to have each build pipeline make custom REST calls for the status checks (roughly like the sketch below), but rather to have some means of registering the status check between the two systems.
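For context, the per-pipeline call we're trying to avoid looks roughly like a GitHub commit status API call against the Enterprise instance; this is a hedged sketch, with the host name, repo, and token variable as placeholders.

```powershell
# Sketch: post a commit status to a GitHub Enterprise instance from a pipeline step.
# The host, owner/repo, and token variable are placeholders for illustration.
$gheApi = 'https://github.example.com/api/v3'
$sha    = $env:BUILD_SOURCEVERSION      # commit being built (predefined pipeline variable)
$token  = $env:GHE_TOKEN                # PAT with repo:status scope, assumed secret variable

$body = @{
    state       = 'success'             # or 'pending' / 'failure' / 'error'
    target_url  = "$($env:SYSTEM_TEAMFOUNDATIONCOLLECTIONURI)$($env:SYSTEM_TEAMPROJECT)/_build/results?buildId=$($env:BUILD_BUILDID)"
    description = 'Azure Pipelines build'
    context     = 'continuous-integration/azure-pipelines'
} | ConvertTo-Json

Invoke-RestMethod -Method Post `
    -Uri "$gheApi/repos/my-org/my-repo/statuses/$sha" `
    -Headers @{ Authorization = "token $token"; Accept = 'application/vnd.github+json' } `
    -ContentType 'application/json' `
    -Body $body
```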
Has anyone configured for this scenario before?
Thanks!