Self Hosted IR unavailable after ARM deployment

We are trying to use a self-hosted integration runtime to extract data from an on-premises file share. To implement CI/CD, I created ARM templates from the data factory where the IR is working successfully, and I enabled sharing for the data factory into which I am going to deploy my pipelines using ARM templates. I can successfully deploy the pipelines, the self-hosted IR, and the linked services, but the IR is not available under Connections in the new data factory.
Is this normal? My understanding is that with CI/CD for Data Factory, the pipelines should be ready to run as soon as the ARM template is deployed, without manual changes. If that is correct, can anyone explain why the IR isn't available in the new data factory, which makes the pipeline fail when I try to run it?

A self-hosted integration runtime is tied to the ADF it was created in.
To use CI/CD with a self-hosted IR, you need the following steps:
1. Create a new data factory, separate from the ones you are using in the CI/CD process, and create the self-hosted integration runtime there. (This ADF doesn't need to contain any of your pipelines or datasets.)
2. Go to the newly created integration runtime and click the edit (pencil) icon, then go to the Sharing tab of the window that opens.
3. Click "Grant permission to another Data Factory" and grant permission to every ADF involved in the CI/CD process.
4. Copy the resource ID that is displayed, then go to the DEV data factory and create a new self-hosted integration runtime of type "Linked".
5. Enter the resource ID when asked and click Create.
6. Then proceed to set up the CI/CD process from the DEV data factory.
Because the ARM template will create the linked self-hosted IR in all the other data factories, and you have already granted those factories permission, everything will work.
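For illustration, here is roughly what the linked ("sharer") runtime looks like in the exported ARM template; the factory name, runtime name, and parameter name are placeholders, and the exact layout generated by your factory may differ slightly:

```json
{
  "name": "[concat(parameters('factoryName'), '/SelfHostedIR')]",
  "type": "Microsoft.DataFactory/factories/integrationRuntimes",
  "apiVersion": "2018-06-01",
  "properties": {
    "type": "SelfHosted",
    "typeProperties": {
      "linkedInfo": {
        // Points at the IR owned by the dedicated "sharing" factory created in step 1
        "authorizationType": "Rbac",
        "resourceId": "[parameters('sharedIntegrationRuntimeResourceId')]"
      }
    }
  }
}
```

As long as the target factory was granted permission in step 3, deploying this resource is enough to make the runtime usable there.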

A self-hosted integration runtime is 'owned' by exactly one Data Factory instance, and the 'owner' and the 'sharer' factories define the IR differently. When you deployed one definition over the other, the type changed, and you ended up with either two 'owners' or two 'sharers'. Since there must be exactly one 'owner', with every 'sharer' pointing to that 'owner', things break.
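To make the difference concrete, the 'owner' factory's definition of the same runtime is a plain self-hosted IR with no linkedInfo block at all (names are placeholders again):

```json
{
  "name": "[concat(parameters('factoryName'), '/SelfHostedIR')]",
  "type": "Microsoft.DataFactory/factories/integrationRuntimes",
  "apiVersion": "2018-06-01",
  "properties": {
    // Owner definition: the on-premises nodes are registered against this factory
    "type": "SelfHosted"
  }
}
```

Deploying the owner shape into a factory that should be a sharer (or the other way round) is what produces the 'two owners' or 'two sharers' situation described above.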

Related

Deployment of ARM template to DevOps branch

As part of learning Bicep, I've created an ARM template for a linked service in an existing ADF. When I deploy the template to the ADF, it is deployed directly in live mode.
Would it be possible to deploy that template and have the newly created linked services appear in a branch of my DevOps repository, so that I can keep them in the repository?
When you deploy an ARM template to a data factory, it is automatically applied in live mode.
If you publish a new linked service from the data factory, it will appear the next time you deploy.
If you want more flexibility in deploying Data Factory, take a look at SQLPlayer, a library for flexible data factory deployment.
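For reference, a linked-service resource of the kind described ends up in the factory's live mode as soon as the deployment finishes. A minimal sketch (an Azure Blob Storage linked service here, purely as an assumed example) would look something like this in ARM JSON:

```json
{
  "name": "[concat(parameters('factoryName'), '/ExampleStorageLinkedService')]",
  "type": "Microsoft.DataFactory/factories/linkedServices",
  "apiVersion": "2018-06-01",
  "properties": {
    "type": "AzureBlobStorage",
    "typeProperties": {
      // Supplied at deployment time; not committed to the repository
      "connectionString": "[parameters('storageConnectionString')]"
    }
  }
}
```

The ARM deployment itself only affects the live factory, so getting the same definition into a repository branch requires committing it there separately.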

Cannot add new Integration Runtime for Azure Data Factory

I have the Owner role at the subscription level and I can create new pipelines in Azure Data Factory.
However, I don't see the "New" button under Integration Runtimes.
Is this a permission-related problem? If so, which permission is missing?
https://learn.microsoft.com/en-us/azure/data-factory/create-self-hosted-integration-runtime
A self-hosted integration runtime is used to connect to your on-premises source/linked server.
When creating the dataset or linked service, we can see the "New" button.
You can also create the integration runtime this way:
Manage --> Connections: Integration runtimes --> New --> Azure, Self-Hosted --> Self-Hosted.
Then you can choose the self-hosted integration runtime to connect to your on-premises source.
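Once the runtime exists, a linked service points at it through a connectVia reference. A hedged sketch of an on-premises file-share linked service in the factory's ARM template (the runtime name, host, and account are placeholders):

```json
{
  "name": "[concat(parameters('factoryName'), '/OnPremFileShare')]",
  "type": "Microsoft.DataFactory/factories/linkedServices",
  "apiVersion": "2018-06-01",
  "properties": {
    "type": "FileServer",
    "typeProperties": {
      "host": "\\\\fileserver\\share",
      "userId": "DOMAIN\\svc-adf",
      // Placeholder; in practice use an Azure Key Vault reference or a secure parameter
      "password": { "type": "SecureString", "value": "<set at deployment time>" }
    },
    "connectVia": {
      // Must match the name of the self-hosted integration runtime created above
      "referenceName": "MySelfHostedIR",
      "type": "IntegrationRuntimeReference"
    }
  }
}
```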

How to complete CI/CD pipelines with Azure DevOps for Azure API Management

I need help to better understand how to create a complete CI/CD process with Azure DevOps for APIM. I have already explored the tools and read the docs:
https://github.com/Azure/azure-api-management-devops-resource-kit
But I still have questions. My scenario:
An APIM dev instance with APIs, operations, and other settings created, as well as an APIM prod instance that exists but is empty.
I ran the extractor tool and got the templates (not all of them; I still need the master/linked templates), and this is where my doubt sits. I also already have two repos (dev and prod).
How can I create the master template, and how will my changes from the dev environment be automatically applied to prod?
I didn't use the policyXMLBaseUrl parameter and am not sure what path to insert there, although it seems #miaojiang inserted a folder from Azure Storage.
After some searching and several attempts, I deployed APIs and operations from one environment to another, but we still don't have a fully automated scenario where I make a change in one instance and it automatically becomes available in the other. It is still necessary to edit policies and definitions directly in the repositories or to run the extractor tool again.
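For what it's worth, the resource kit's extractor is driven by a JSON configuration file; the sketch below shows the general shape, but the file name and key names are assumptions based on the resource kit's documentation and should be checked against the version you are using. linkedTemplatesBaseUrl is what makes the extractor generate a master template that links the individual templates, and policyXMLBaseUrl is the base URL (for example an Azure Storage container) where the extracted policy XML files will be hosted so the deployment can reach them:

```json
{
  "sourceApimName": "apim-dev",
  "destinationApimName": "apim-prod",
  "resourceGroup": "rg-apim-dev",
  "fileFolder": "./extracted-templates",
  "linkedTemplatesBaseUrl": "https://<storage-account>.blob.core.windows.net/templates",
  "policyXMLBaseUrl": "https://<storage-account>.blob.core.windows.net/policies"
}
```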

Azure Data Factory designer in Visual Studio 2019 project

I've run into a release pipeline issue with the Azure Data Factory. In most types of Azure entities (such as a web app or sql project), I can develop the project in Visual Studio and use ADO to create a build and releases to deploy the project to Azure. Using variables and libraries, I can create the appropriate release definitions variables on a per-environment basis, and ensure that the build I deploy is the same for each step of the release pipeline (i.e dev -> tst -> stg -> prd). You know, the normal continuous integration/continuous deployment process?
However with Azure Data factories, it seems that I have to create the data factory in the Azure Portal directly for an environment and then create it again for another environment in the pipeline (I know I can export and reimport manually).
What I would like is to be able to create an Azure Data Factory project in Visual Studio (2019), maintain it in Visual Studio with a similar designer like the one in the Azure portal, check it into git and deploy it with a build and release in ADO.
Since creating an Azure Data Factory project doesn't seem possible (or am I missing something?), what is the recommended approach to working with Azure data factories in a continuous integration/continuous deployment ADO environment?
ADF v2 does not have a plugin for Visual Studio; most of the effort has gone into the UX side.
We recommend using the ADF UI as your development tool, where you can define your workflow easily and validate your changes.
For CI/CD, you can integrate your dev factory with Git and then set up CI/CD as described here:
https://learn.microsoft.com/en-us/azure/data-factory/continuous-integration-deployment.
Every publish to the dev factory will trigger your release pipeline, which can take the build and deploy it to the remaining stages.
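As a sketch of how the per-environment part usually works: publishing from the Git-connected factory generates ARMTemplateForFactory.json and a matching parameters file in the adf_publish branch, and the release pipeline deploys the same template to each stage with a different parameters file. The parameter names below are hypothetical; the real ones come from your factory's generated template:

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    // Target factory for this stage (a hypothetical "tst" environment)
    "factoryName": { "value": "adf-myproject-tst" },
    // Hypothetical connection-string parameter overridden per environment
    "OnPremFileShare_connectionString": { "value": "<stage-specific value or Key Vault reference>" }
  }
}
```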

CI/CD pipeline template for logic app and Azure Function

I have a logic app that connects to an SFTP server (a virtual machine that I created on Azure) and performs actions when a file is added to that SFTP server:
When a file is added, create a new blob in blob storage.
Delete the file from the SFTP server.
I have also created a blob-trigger-based Azure Function that, every time a blob is created, performs some actions (such as decrypting and parsing the blob content).
Next steps will be chaining some other Azure Function executions into my logic app (like sending an e-mail after executing an Azure Function, etc.).
Now, I have two main questions:
In order to have the CI/CD pipeline best suited to this workflow, should I create the logic app from the portal or from Visual Studio, and why?
Should I put the Azure Function and the logic app in the same solution/repo? The same project?
Then, how can I create the CI/CD pipeline (which template type and which steps)?
PS: I want to add unit tests to check that my logic app and Azure Function are working correctly, so I want to integrate a test step into my build definition.
For more details about the logic app, please see this Stack Overflow question, in which I detailed the process,
and here is the logic app.
Please find the points below:
I would recommend using Visual Studio. The main advantage is that it gives you the same designer experience, and you can make use of an ARM template and parameters to deploy your logic app reliably to multiple environments (dev, staging, prod, etc.), giving you a robust CI/CD pipeline. It also lets you use Azure Key Vault, through the ARM template and parameter syntax, to store any sensitive data (see the parameters sketch after these points).
Visual Studio also lets you connect to the cloud using Cloud Explorer, where you can mimic resubmits, view run history, etc.
If you are using the Azure Function for only one process, you can put it in the same solution, but keeping Azure Functions in a separate repo gives you more flexibility and reusability, so that other applications can also make use of it.
You can use SpecFlow to automate logic app testing; the linked article "Automated testing logic app with SpecFlow" explains it in detail.
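As an illustration of the Key Vault point above, an ARM template parameter file can pull a secret from Key Vault at deployment time instead of storing it in source control. The subscription, resource group, vault, secret, and parameter names here are all assumptions:

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "sftpPassword": {
      // Resolved by Azure Resource Manager at deployment time; the secret never lives in the repo
      "reference": {
        "keyVault": {
          "id": "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.KeyVault/vaults/<vault-name>"
        },
        "secretName": "sftp-password"
      }
    }
  }
}
```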