Azure DevOps Repos API to update YAML pipeline parameters from external apps

We have a Java-based ticket management application where users can submit the input values "projectname", "accesskind", and "permissiontype". We have an Azure DevOps pipeline that automates permission assignment in JFrog Artifactory, but the pipeline is currently fed by manually copying these values from the tickets into the YAML parameter file.
Our ticket management tool can push these values out over API calls, so I am looking for an Azure DevOps Server Repos API that I can give the tool to update these values directly in the pipeline parameter YAML files; from there I can integrate my existing pipelines.
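For reference, the Git Pushes - Create endpoint is the usual REST route for committing a file change from an external system. Below is a minimal sketch of the two calls involved; the server URL, collection, project, repository, branch, file path, and PAT are all placeholders (Python is used purely for illustration, the same HTTP calls can be made from the Java tool):

```python
import base64
import requests

# All values below are placeholders -- substitute your own server,
# collection, project, repository, branch, file path, and PAT.
SERVER     = "https://devops.example.com"
COLLECTION = "DefaultCollection"
PROJECT    = "Permissions"
REPO       = "pipeline-configs"
BRANCH     = "main"
FILE_PATH  = "/params/jfrog-permissions.yml"
PAT        = "<personal-access-token>"   # needs Code (Read & Write) scope

headers = {"Authorization": "Basic " + base64.b64encode(f":{PAT}".encode()).decode()}
base = f"{SERVER}/{COLLECTION}/{PROJECT}/_apis/git/repositories/{REPO}"

# Step 1: look up the current tip commit of the branch; Pushes - Create
# requires it as oldObjectId so the push applies to a known state.
refs = requests.get(f"{base}/refs?filter=heads/{BRANCH}&api-version=6.0",
                    headers=headers).json()
old_object_id = refs["value"][0]["objectId"]

# Step 2: push one commit that rewrites the parameter file in place.
new_yaml = "projectname: sample\naccesskind: read\npermissiontype: developer\n"
push_body = {
    "refUpdates": [{"name": f"refs/heads/{BRANCH}", "oldObjectId": old_object_id}],
    "commits": [{
        "comment": "Update permission parameters from ticket tool",
        "changes": [{
            "changeType": "edit",
            "item": {"path": FILE_PATH},
            "newContent": {"content": new_yaml, "contentType": "rawtext"},
        }],
    }],
}
resp = requests.post(f"{base}/pushes?api-version=6.0",
                     json=push_body, headers=headers)
resp.raise_for_status()
print("Pushed commit:", resp.json()["commits"][0]["commitId"])
```

If the branch tip moves between the two calls the push is rejected with a conflict, so the tool should retry the read-then-push sequence; and if the file does not exist yet, the changeType would be add rather than edit.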

Related

How do I show multiple Azure Pipeline pipelines in GitHub?

We use Azure DevOps for our CI/CD pipelines, but our repositories are in GitHub. We currently trigger the CI pipelines on each push, but there is no link to GitHub, so we lose the ability to easily see the status of a build in pull requests or to automatically fail a check when a build fails.
Azure Pipelines has an app on GitHub Marketplace for integrating pipelines with repositories / pull requests. I installed this in our GitHub organization and configured it with the repository access it needs, which then had me authenticate with Azure DevOps, select the project and the pipeline yaml associated with the repository.
This works great and I can see the status directly in a pull request.
The issue is that I have multiple pipelines I would like to run and display the status of in the pull request. We have a monorepo but I only want to build an app if it was modified, so I utilize path filters in the Azure Pipelines yaml so the CI is only run when I need it to. GitHub does not discover/display the status of other CI pipelines I have in the project.
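For context, a path-filtered trigger of the kind described here might look like the following, where the folder names are placeholders for the monorepo layout:

```yaml
# azure-pipelines-app1.yml -- CI for app1 only runs when its folder changes
trigger:
  branches:
    include:
      - main
  paths:
    include:
      - apps/app1

pr:
  branches:
    include:
      - main
  paths:
    include:
      - apps/app1
```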
Initially, I tried just setting up another Azure Pipelines YAML that is triggered by pull requests. When I make a pull request, I see in Azure Pipelines that the CI was triggered by 'PR automated for {pr number}', but it does not display its status in GitHub.
I ended up going to the Azure Pipelines app settings in GitHub, 'revoking' access to the repository, and then immediately re-configuring it with access to the same project as before, but selecting a different pipeline YAML. This worked: it retained the first build I configured and added the second, and now multiple builds are shown in the pull request.
But this does not seem like the intended way to accomplish this. The GitHub app links to the entire documentation for Azure Pipelines, not specifically to docs about the app, and I have not been able to find any information there on how to do this.
Is there a way to add multiple pipelines with the Azure Pipelines app on GitHub, outside of this workaround?

Is it possible to update a file in an Azure DevOps repository using Power Automate?

I can't find any connector in Power Automate to update a file in an Azure DevOps repository.
I am trying to automate the process of checking in our deployment sheet to an Azure DevOps repository. My flow is triggered when the custom work item type is closed. An adaptive card is sent to Teams where the user can update the logs. Then I want to automatically update the log sheet present in Azure DevOps. Previously we would update the sheet in Excel and commit it again from Visual Studio.
For operations which are not supported by the default connector, you could always look into using the Azure DevOps REST API in combination with a Send an HTTP request to Azure DevOps action.
I would look into using, for example, the Pushes - Create method:
https://learn.microsoft.com/en-us/rest/api/azure/devops/git/pushes/create?view=azure-devops-rest-6.0&tabs=HTTP
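In that action, the request would be a POST to {organization}/{project}/_apis/git/repositories/{repositoryId}/pushes?api-version=6.0 with a body along the lines of the sketch below. The branch name, comment, file path, and contents are placeholders, and oldObjectId must be the branch's current tip commit (retrievable with the Refs - List endpoint first):

```json
{
  "refUpdates": [
    { "name": "refs/heads/main", "oldObjectId": "<current tip commit of main>" }
  ],
  "commits": [
    {
      "comment": "Update deployment log sheet from Power Automate",
      "changes": [
        {
          "changeType": "edit",
          "item": { "path": "/docs/deployment-log.xlsx" },
          "newContent": {
            "content": "<base64-encoded file contents>",
            "contentType": "base64encoded"
          }
        }
      ]
    }
  ]
}
```

For a binary file such as an Excel sheet the content must be base64-encoded as shown; for plain-text files you can use "contentType": "rawtext" instead.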
Have a look at this Stack Overflow thread as well; it might be useful:
How to use Azure DevOps REST API to Update File?

How to automate Azure data factory pipeline deployments

I want to automate Azure data factory pipeline deployments.
I have Self Hosted Integration runtimes with a different name in each environment (i.e. SHIR-{environment}).
I have different data sources and destinations for each environment. (i.e. different SQL server names or Hostnames)
How can I perform the automatic weekly deployments to promote changes from GitHub dev branch to stage and stage to production? I don't want to modify these database server names in linked services during the GitHub PR merge.
To set up automated deployment, start with an automation tool such as Azure DevOps. Azure DevOps provides various interfaces and tools to automate the entire process.
A development data factory is created and configured with Azure Repos Git. All developers should have permission to author Data Factory resources like pipelines and datasets.
A developer creates a feature branch to make a change. They debug their pipeline runs with their most recent changes. For more information on how to debug a pipeline run, see Iterative development and debugging with Azure Data Factory.
After a developer is satisfied with their changes, they create a pull request from their feature branch to the main or collaboration branch to get their changes reviewed by peers.
After a pull request is approved and changes are merged in the main branch, the changes get published to the development factory.
When the team is ready to deploy the changes to a test or UAT (User Acceptance Testing) factory, the team goes to their Azure Pipelines release and deploys the desired version of the development factory to UAT. This deployment takes place as part of an Azure Pipelines task and uses Resource Manager template parameters to apply the appropriate configuration, as sketched after these steps.
After the changes have been verified in the test factory, deploy to the production factory by using the next task of the pipelines release.
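As an illustration of that parameter-override step, the deployment task in the UAT stage might look roughly like this. The service connection, resource group, factory name, and linked-service parameter name are all placeholders that depend on how your exported ARM template is parameterized:

```yaml
# Sketch: deploy the exported Data Factory ARM template to the UAT factory,
# overriding the environment-specific values so linked services never have
# to be edited during the PR merge itself.
- task: AzureResourceManagerTemplateDeployment@3
  inputs:
    deploymentScope: 'Resource Group'
    azureResourceManagerConnection: 'uat-service-connection'   # placeholder
    subscriptionId: '$(uatSubscriptionId)'
    action: 'Create Or Update Resource Group'
    resourceGroupName: 'rg-adf-uat'                            # placeholder
    location: 'West Europe'
    templateLocation: 'Linked artifact'
    csmFile: '$(Pipeline.Workspace)/adf/ARMTemplateForFactory.json'
    csmParametersFile: '$(Pipeline.Workspace)/adf/ARMTemplateParametersForFactory.json'
    overrideParameters: >-
      -factoryName "adf-myproject-uat"
      -LS_AzureSql_connectionString "Server=sql-uat.example.com;Database=mydb;"
```

The same task is repeated in the production stage with production values, so the Git branches themselves never carry environment-specific server names.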
For more information, see the Continuous integration and delivery in Azure Data Factory documentation.

How to publish build artifacts to an external organisation using Azure DevOps?

I have an ASP web application that I am building in an Azure DevOps Build Pipeline. That is all fine.
I want an external organisation to be able to define their own Azure DevOps Release Pipeline to consume the build artifacts produced by our Build Pipeline. I need the access of that external organisation to be restricted with some sort of credentials (i.e. I don't want the project to be public to everyone). The external organisation should be able to deploy the latest version.
I thought this would be a relatively simple process using only Azure tools (particularly with reference to Feeds), but I have tried a number of different approaches based on the documentation and all have failed. I don't want to publish to GitHub - I just want to keep everything inside Azure. I have tried using Universal Packages with Feeds, but the Release Pipeline can only pull a specific version from the feed rather than LATEST.
Does anyone have any recommended approaches I should take?
There is a Latest option when you configure the feed artifact for your Release Pipeline, so the consuming release can always resolve the newest version rather than a pinned one.
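If the external organisation's consuming pipeline is YAML-based, the equivalent is the Universal Packages download task with a wildcard version; the feed, package, and service connection names below are placeholders:

```yaml
# Sketch: download the latest published version of a Universal Package
# from a feed in another Azure DevOps organisation.
steps:
  - task: UniversalPackages@0
    inputs:
      command: 'download'
      downloadDirectory: '$(Pipeline.Workspace)/drop'
      feedsToUse: 'external'                              # feed lives in the publishing org
      externalFeedCredentials: 'partner-feed-connection'  # placeholder service connection
      feedDownloadExternal: 'release-artifacts'           # placeholder feed name
      packageDownloadExternal: 'asp-web-app'              # placeholder package name
      versionDownloadExternal: '*'                        # '*' resolves to the latest version
```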

Publishing artifacts to an external server

We are using an Azure DevOps pipeline with an Azure-hosted build agent to build our application. At the end of this process, I would like to publish the artifacts to a shared directory on an on-premise server (which will feed into our company-mandated deployment process, Repliweb).
Is that possible?
Looking at the documentation, it looks like I can use a Publish Artifact or Copy Files step (or maybe even FTP).
Our IT organization needs to know the IP/port so that the firewall rules can be authored.
Where can I get that information?
I suppose the other possibility would be to have our server pull the artifact from Azure DevOps.
For this issue, you can try the FTP Upload task in the pipeline. Use this task in a build or release pipeline to upload files to a remote machine using the File Transfer Protocol (FTP), or securely with FTPS.
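A minimal sketch of that step, assuming a placeholder server URL and credentials stored as secret pipeline variables:

```yaml
# Sketch: push the staged build artifacts to the on-premise server over FTPS.
steps:
  - task: FtpUpload@2
    inputs:
      credsOption: 'inputs'
      serverUrl: 'ftps://deploy.example.com'          # placeholder server
      username: '$(ftpUser)'                          # secret pipeline variables
      password: '$(ftpPassword)'
      rootDirectory: '$(Build.ArtifactStagingDirectory)'
      filePatterns: '**'
      remoteDirectory: '/upload/$(Build.BuildId)/'
```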
For details, please refer to the FTP Upload task documentation.