ADF Track Pipeline Changes - azure-data-factory

Is it possible to see the changes made to an ADF pipeline, and who made them? In QA some pipeline changes were made and the code has been published. Is it possible to track this?

To track changes in Azure Data Factory, configure a git repository with GitHub or Azure Repos.
This will help you track the changes made to pipelines.
You can also revert changes to the code and restore previous versions.
You cannot track who made changes directly in ADF. Instead, you can configure a code-review process: allow a few members to make changes to the code via git, and let only a limited set of members publish those changes to the data factory.
Reference: Microsoft document on Source control in Azure Data Factory
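Once the factory is git-backed, every save becomes a commit, so ordinary git tooling answers "who changed what". A minimal sketch of that idea, assuming the pipeline JSON files live under a pipeline/ folder in the repo (ADF's default layout); the repo path and pipeline name are hypothetical:

```python
import subprocess

def pipeline_history(repo_dir: str, pipeline_file: str) -> list[str]:
    """Return 'hash|author|date|subject' lines for commits touching one pipeline JSON."""
    result = subprocess.run(
        ["git", "log", "--format=%h|%an|%ad|%s", "--date=short", "--", pipeline_file],
        cwd=repo_dir, capture_output=True, text=True, check=True,
    )
    return [line for line in result.stdout.splitlines() if line]

# Example (paths are illustrative):
# for entry in pipeline_history("/src/my-adf-repo", "pipeline/MyPipeline.json"):
#     print(entry)
```

Each line identifies the commit, the author, the date, and the commit message for one change to that pipeline's JSON, which is exactly the "who changed what, and when" trail that Live Mode cannot give you.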

Related

What is the difference between the ADF UX and the ADF service?

While going through the documentation for 'source control in ADF', I came across the following information in the introduction paragraph.
By default, the Azure Data Factory user interface experience (UX)
authors directly against the data factory service. This
experience has the following limitations:
The Data Factory service doesn't include a repository for storing the
JSON entities for your changes. The only way to save changes is via
the Publish All button and all changes are published directly to the
data factory service.
The Data Factory service isn't optimized for collaboration and
version control.
The Azure Resource Manager template required to deploy Data Factory
itself is not included.
From the highlighted phrase, what concerns me is how to understand the difference between the ADF service and the ADF UI. I couldn't find any relevant information via a Google search.
Would anyone please help me understand? I have attached the web link for the source of the document.
Thank you for your time and support.
The ADF UI is the authoring tool. By default it isn't connected to Azure DevOps or GitHub for source control. This is called “Live Mode”, and changes are saved directly to the ADF service (meaning “saved to the server”).
The preferred way of working in the ADF UI is to connect it to Azure DevOps or GitHub repos for source control. (The link you provided describes this.) This allows you to save intermediate progress, even if the pipeline isn't yet valid, and to use source-control features like branching, merging, and pull-request approvals. Only when you merge to the main branch and then Publish do the changes get saved to the ADF service. Until then you can debug pipelines, including changes you made in your branch. But the version of the pipeline that is run from a trigger (like a schedule trigger) is the published version, not what's in the main branch or in your branch.

Where Is The History Of The Azure Classic Pipelines Kept?

I prefer the classic editor in Azure Pipelines, as I can see at a glance what is happening.
I see the benefit of versioning the pipeline, I just don't want to have to learn another DSL/schema if I don't have to.
The classic view has a history of changes and the content of the JSON it creates is very clear.
Where is this stored and is it possible to direct this to the same repo I would have used for the yaml pipelines?

How to audit Azure DevOps services and monitor releases?

I have an Azure DevOps instance where I am trying to control and monitor who is doing releases. So, who pressed the button to do the release and what releases have happened.
I tried in Audit options, but it is not satisfying my requirement.
What is the best way to get what I am looking for?
Thanks in advance…
It's a little unclear what you mean by CONTROL.
If you need to tighten control over which users are allowed to initiate releases outside of a CI/CD pipeline, use the built-in object permissions for release pipelines.
I've organized our release pipelines into folders
Each of these folders is treated as an object upon which permissions can be set.
For MONITORING the release pipelines:
Again, I tend to just use the All Pipelines folder, which gives a list of the releases that happened, ordered by date. That view lists the pipeline and shows each user's avatar, which is enough to know who created the release.
Also
There are some out-of-the-box widgets that you can put on your dashboard, but I've found them to be unhelpful on the whole. If you have hundreds of pipelines, you will want something that reads your list of pipelines via the REST API and pushes those widgets onto the dashboard via the REST API, so that you don't need to manage them all by hand through the UI. And if you're going to use the REST API anyway, you might as well write your own tool to report the information you need (and possibly turn it into a widget others can consume from the marketplace). I haven't found anything very effective in the marketplace for reporting on or summarizing a collection of release pipelines, but there may be something squirreled away in there somewhere.
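As a sketch of that REST-driven approach: the Azure DevOps Releases API returns, for each release, a createdBy identity and a createdOn timestamp, which is enough to report who triggered what. The organization and project names below are placeholders, and the summarizing helper is my own illustration, not part of the original answer:

```python
from collections import Counter

def releases_url(organization: str, project: str) -> str:
    """Build the Azure DevOps Releases list endpoint (api-version 7.0)."""
    return (f"https://vsrm.dev.azure.com/{organization}/{project}"
            f"/_apis/release/releases?api-version=7.0")

def who_released(releases: list[dict]) -> Counter:
    """Count releases per creator, given the API response's 'value' array."""
    return Counter(r["createdBy"]["displayName"] for r in releases)

# Fetching would look roughly like this (requires a PAT; not executed here):
#   import requests
#   resp = requests.get(releases_url("myorg", "myproject"), auth=("", pat))
#   counts = who_released(resp.json()["value"])

# Illustrative data in the shape the API returns:
sample = [
    {"createdBy": {"displayName": "Alice"}, "createdOn": "2023-01-02T10:00:00Z"},
    {"createdBy": {"displayName": "Bob"},   "createdOn": "2023-01-03T11:00:00Z"},
    {"createdBy": {"displayName": "Alice"}, "createdOn": "2023-01-04T12:00:00Z"},
]
print(who_released(sample))
```

From there it is a small step to push the summary into a dashboard widget or a scheduled report instead of eyeballing avatars in the All Pipelines view.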

Azure DevOps Reporting or Queries To Determine Which PBIs Have Open GitHub PRs

The problem I am facing is that some Product Backlog Items (PBIs) are marked as Done, but in some instances the Pull Request (PR) on the PBI was never merged.
We do have GitHub integrated with our pipeline and the PBIs, so we can see the current state of the PR on the PBI.
I would like the ability to create a query or run some type of report that helps me audit the state of all PBIs in Azure DevOps and find any that are marked with a specific state (Done in this case) but have a PR whose status is still Open instead of Merged.
Azure Boards doesn't yet have the capability to use GitHub links as query parameters. That said, it is on the backlog to enable scenarios such as this one.
It would be helpful if you created a feature suggestion for this on the Azure DevOps Developer Community. Doing so will allow us to link the suggestion to our backlog item, which will give updates when work is started/completed on that feature.
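Until such queries exist, one workaround is a small script: pull the Done work items and their relations from the Azure DevOps REST API, extract the linked GitHub PRs, then ask the GitHub API for each PR's state. The sketch below shows only the cross-checking step on pre-fetched, simplified data; the fetching code, record shapes, and PR keys are assumptions you would adapt to your own tenant:

```python
def done_items_with_open_prs(work_items: list[dict], pr_states: dict[str, str]) -> list[int]:
    """Return IDs of work items marked Done whose linked PR is still open.

    work_items: simplified records with 'id', 'state', and 'pr' (a PR key).
    pr_states: PR key -> GitHub PR state ('open', 'closed', ...).
    """
    return [
        item["id"]
        for item in work_items
        if item["state"] == "Done" and pr_states.get(item.get("pr"), "") == "open"
    ]

# Illustrative data (IDs and PR keys are made up):
items = [
    {"id": 101, "state": "Done",   "pr": "org/repo#12"},
    {"id": 102, "state": "Done",   "pr": "org/repo#13"},
    {"id": 103, "state": "Active", "pr": "org/repo#14"},
]
states = {"org/repo#12": "open", "org/repo#13": "closed", "org/repo#14": "open"}
print(done_items_with_open_prs(items, states))  # [101]
```

Run on a schedule, the resulting ID list is exactly the audit report asked for: Done PBIs whose PR never made it to Merged.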

Azure Data Factory development with multiple users without using Source Control

While working on a single Azure Data Factory solution with no source control, is it possible for a team of 3 or more developers to work in parallel without corrupting the main JSON?
Scenario:
All developers are accessing the same ADF and working on different pipelines at the same time. If one of the developers publishes his/her updates, does it somehow overwrite or ignore the changes other developers are publishing?
I tested and found that:
Multiple users can access the same data factory and work with different pipelines at the same time.
Publishing only affects the current pipeline that the user is developing and editing. It won't overwrite other pipelines.
For your question:
Is it possible for a team of 3 or more developers to work in parallel without corrupting the main JSON?
Yes, it's possible.
If one of the developers publishes his/her updates, does it somehow overwrite or ignore the changes other developers are publishing?
No, it doesn't. For example, if user A only develops pipeline A and then publishes, the publish only affects that pipeline; it won't overwrite or affect other pipelines.
You can test and verify this yourself.
Update:
Thanks to V_Singh for sharing the Microsoft suggestion:
Microsoft suggests using CI/CD; otherwise there can be disparities in the code.
Reply from Microsoft:
"In Live Mode you can hit unexpected errors when you try to publish, because you may not have the latest version (for example, user A publishes while user B is still using an old version that depends on an old resource and tries to publish). Please use Git, since it is intended for collaborative scenarios."
Hope this helps.