At the moment if I update my API so that I include a query parameter on one of my endpoints, I also need to update the ARM template that is used during the Azure DevOps release pipeline to update APIM.
Is there a way to automatically update the ARM template json based on the updated action methods?
The end goal is to have APIM reflect the changes I made in my action methods, without me having to update things in multiple places (i.e. action method AND ARM template).
Although this doesn't directly answer your question, I'd like to invite you to explore the Azure API Management DevOps Resource Kit instead, an official tool maintained by the APIM team.
Related
I have been researching how I can create/update a work item in Azure DevOps from Logic Apps using the Azure DevOps REST APIs. We need this for an integration with another CRM tool. Please reply if this can be done, and how.
Thanks,
Gopal
You can use this REST API to create a work item. You will have to maintain a PAT (personal access token):
https://learn.microsoft.com/en-us/rest/api/azure/devops/wit/work-items/create?view=azure-devops-rest-7.0&tabs=HTTP
Update work item:
https://learn.microsoft.com/en-us/rest/api/azure/devops/wit/work-items/update?view=azure-devops-rest-7.0&tabs=HTTP
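To illustrate the PAT-based REST route, here is a minimal Python sketch that builds the create-work-item request. The organization, project, and PAT values are placeholders and the helper function name is my own; the endpoint URL and JSON Patch body shape follow the documentation linked above.

```python
import base64
import json

def build_create_request(org, project, work_item_type, title, pat):
    """Build URL, headers, and JSON Patch body for the create-work-item endpoint."""
    url = (f"https://dev.azure.com/{org}/{project}/_apis/wit/workitems/"
           f"${work_item_type}?api-version=7.0")
    # The API uses Basic auth with an empty username and the PAT as the password.
    auth = base64.b64encode(f":{pat}".encode()).decode()
    headers = {
        "Content-Type": "application/json-patch+json",  # JSON Patch, not plain JSON
        "Authorization": f"Basic {auth}",
    }
    # The create endpoint expects a JSON Patch document listing field operations.
    body = json.dumps([
        {"op": "add", "path": "/fields/System.Title", "value": title},
    ])
    return url, headers, body

# Placeholder values for illustration only.
url, headers, body = build_create_request(
    "my-org", "my-project", "Task", "Created from Logic App", "dummy-pat")
print(url)
# → https://dev.azure.com/my-org/my-project/_apis/wit/workitems/$Task?api-version=7.0
```

You would then POST `body` to `url` with those headers (for example with `urllib.request` or `requests`); the Logic App connector route below avoids writing any of this by hand.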
Or, what I would recommend, you can use the Azure DevOps connector (for Logic Apps it is part of the Standard class):
https://learn.microsoft.com/en-us/connectors/visualstudioteamservices/
Create a work item action:
https://learn.microsoft.com/en-us/connectors/visualstudioteamservices/#create-a-work-item
Update a work item action:
https://learn.microsoft.com/en-us/connectors/visualstudioteamservices/#update-a-work-item
A simple guide on how to set up such a task in a Logic App:
https://codegumbo.com/index.php/2021/05/27/using-an-azure-logic-app-to-create-azuredevops-work-items-from-a-sql-server-dataset/
After reproducing from my end, I have received the same error.
You are receiving this because, as soon as you try to hit the endpoint, it asks for a login, which requires a token, and hence you are getting redirected. One workaround is to provide the required scope as mentioned in this official documentation.
One approach that worked for me is to use the DevOps connector directly from Logic Apps. Below is the flow of my Logic App.
While going through the documentation for source control in ADF, I came across the following information in the introduction paragraph.
By default, the Azure Data Factory user interface experience (UX) authors directly against the data factory service. This experience has the following limitations:
The Data Factory service doesn't include a repository for storing the JSON entities for your changes. The only way to save changes is via the Publish All button, and all changes are published directly to the data factory service.
The Data Factory service isn't optimized for collaboration and version control.
The Azure Resource Manager template required to deploy Data Factory itself is not included.
From the highlighted phrase, what concerns me is how to understand the difference between the ADF service and the ADF UI. I couldn't find any relevant information via a Google search.
Would anyone please help me understand? I have attached the web link for the source of the document.
Thank you for your time and support
The ADF UI is the authoring tool. By default it isn’t connected to Azure DevOps or GitHub for source control. This is called “Live Mode” and changes are saved directly to ADF service (meaning “saved to the server”).
The preferred method of working in the ADF UI is to connect it to Azure DevOps or GitHub repos for source control. (The link you provided describes this.) This allows you to save intermediate progress (even if the pipeline won't validate because it isn't valid), and it allows you to use source-control features like branching, merging, pull request approvals, etc. Only when you merge to the main branch and then Publish do the changes get saved to the ADF service. Until then you can debug pipelines, including changes you made in your branch. But the version of the pipeline that is run from a trigger (like a schedule trigger) is the published version, not what's in the main branch or in your branch.
Is this possible?
We’ve got a monorepo that holds all the .sln files for our given project. The product requires these services to function.
Here’s my question
Is it possible to get a pipeline that will build everything but only release changes?
Eg.
Let’s say I’ve got a:
Submission api
User management api
MVC app.
If I make changes to the MVC app, then the pipeline needs to build everything but only indicate that the MVC app needs releasing.
Same if I made a change to user management
What if I make a change to all 3 in a single PR? I’d want to build and release everything.
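For what it's worth, the change-detection part of this can be sketched independently of any particular pipeline product. The service-to-folder mapping below is a made-up example layout, not your actual repo structure; in a real pipeline you would feed the function the output of `git diff --name-only` against the last released commit, then build everything but mark only the hit services for release.

```python
# Hypothetical mapping of service name -> source folder in the monorepo.
SERVICE_ROOTS = {
    "submission-api": "src/SubmissionApi/",
    "user-management-api": "src/UserManagementApi/",
    "mvc-app": "src/MvcApp/",
}

def services_to_release(changed_files):
    """Return the set of services whose folders contain a changed file."""
    hit = set()
    for path in changed_files:
        for service, root in SERVICE_ROOTS.items():
            if path.startswith(root):
                hit.add(service)
    return hit

# In a pipeline you'd feed this from `git diff --name-only HEAD~1` (or the
# diff against the last released commit).
print(sorted(services_to_release([
    "src/MvcApp/Views/Home/Index.cshtml",
    "src/UserManagementApi/Controllers/UsersController.cs",
])))
# → ['mvc-app', 'user-management-api']
```

A PR touching all three folders would naturally return all three services, which gives the "change everything, release everything" behavior you describe.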
I have a tool, aztr, that summarizes Azure Build/Release pipeline test results. A recent requirement has come up to save the incoming variable information and the summary information for consumption outside of the tool (say, a CSV file).
On the Release pipelines side, the Release API provides all the details about the variables passed into the release. I want the same functionality on the Build side as well, but the Build API does not provide it. Is there a different API I need to use to get the variables passed into the build?
Thank you for your responses.
We could list the custom build variables via this API:
GET https://dev.azure.com/{organization}/{project}/_apis/build/definitions/{definitionId}?api-version=6.0
In addition, there are some predefined variables among them; these are Azure DevOps system variables, and we cannot list them via this API.
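As a hedged illustration of consuming that endpoint, the snippet below pulls the custom variables out of a build definition response. The sample payload is hand-written to mirror the documented response shape (a `variables` map of name to `{value, ...}` objects), not a real API result; in aztr you would replace it with the JSON returned by the GET call above.

```python
import json

def definition_variables(definition_json):
    """Return {name: value} for variables defined on a build definition."""
    variables = definition_json.get("variables", {})
    return {name: spec.get("value") for name, spec in variables.items()}

# Hand-written sample mirroring the Build Definitions GET response shape.
sample = json.loads("""
{
  "id": 42,
  "variables": {
    "BuildConfiguration": { "value": "Release" },
    "DeployTarget": { "value": "staging", "allowOverride": true }
  }
}
""")
print(definition_variables(sample))
# → {'BuildConfiguration': 'Release', 'DeployTarget': 'staging'}
```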
I have an Azure DevOps instance where I am trying to control and monitor who is doing releases. So, who pressed the button to do the release and what releases have happened.
I tried in Audit options, but it is not satisfying my requirement.
What is the best way to get what I am looking for?
Thanks in advance…
It's a little unclear what you mean by CONTROL.
If you need to tighten control over which users are allowed to initiate releases outside of a CI/CD pipeline, use the built-in object permissions for release pipelines.
I've organized our release pipelines into folders. Each of these folders is treated as an object upon which permissions can be set.
For MONITORING the release pipelines
Again, I tend to just use the All Pipelines folder, which gives a list of the releases that happened, ordered by date. That view lists the pipeline and shows the user's avatar, which is enough to know who created the release.
Also
There are some out-of-the-box widgets that you can put on your dashboard, but I've found them to be unhelpful on the whole. If you have hundreds of pipelines, you will want something that reads your list of pipelines via the REST API and pushes those widgets onto the dashboard via the REST API, so that you don't need to manage them all "by hand" through the UI. And if you're going to use the REST API anyway, you might as well write your own tool to report the information you need (and possibly turn it into a widget others can consume from the marketplace). I haven't found anything very effective in the marketplace for reporting on or summarizing a collection of release pipelines, but there may be something squirreled away in there somewhere.
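As a rough sketch of that REST-based approach, here is how a "who pressed the button" summary could be pulled from a Releases list response (GET https://vsrm.dev.azure.com/{organization}/{project}/_apis/release/releases). The sample payload is hand-written for illustration and only keeps the fields the sketch uses; a real response carries many more.

```python
import json

def who_released(releases_json):
    """Return (release name, creator display name) pairs from a list response."""
    return [(r["name"], r["createdBy"]["displayName"])
            for r in releases_json["value"]]

# Hand-written sample mirroring the documented response shape.
sample = json.loads("""
{
  "count": 2,
  "value": [
    { "name": "Release-12", "createdBy": { "displayName": "Dana" } },
    { "name": "Release-11", "createdBy": { "displayName": "Sam" } }
  ]
}
""")
for name, creator in who_released(sample):
    print(f"{name}: {creator}")
# → Release-12: Dana
# → Release-11: Sam
```

From here it is a short step to writing the pairs out to CSV or pushing them into a custom dashboard widget.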