I have an Azure Data Factory in which I want to connect to Azure Synapse using the User-Assigned Managed Identity authentication type.
Three steps need to be done, but unfortunately I haven't found a way to set up step 1 programmatically:
1. In Data Factory (Settings -> Managed identities), assign the user-assigned managed identity
2. Create the credential
3. Create the linked service
If I deploy the ARM template with only the second and third steps implemented, I get the following exception:
"The referenced user assigned managed identity in the credential is not associated with the factory".
Do you know how I can assign a user-assigned managed identity to the Data Factory?
I had the same error message: "The referenced user assigned managed identity in the credential is not associated with the factory".
When doing CI/CD with the ARM template, I noticed that the auto-generated ARM template from the Data Factory only had the identity type SystemAssigned, even when I had manually added a user-assigned identity in the ADF GUI.
My solution was to modify arm-template-parameters-definition.json with this:
"Microsoft.DataFactory/factories": {
    "identity": "=:-identity"
}
Then, in the parameter file, you can pass in this:
"dataFactory_identity": {
    "value": {
        "type": "SystemAssigned,UserAssigned",
        "userAssignedIdentities": {
            "/subscriptions/<Insert_subscriptionId>/resourceGroups/<Insert_resourceGroupName>/providers/Microsoft.ManagedIdentity/userAssignedIdentities/<Insert_userAssignedIdentityName>": {}
        }
    }
}
Unfortunately, documentation for Azure Data Factory ARM templates is limited. I found a solution based on the documentation for the Azure Storage ARM template; below is the solution:
{
    "name": "[variables('dataFactoryName')]",
    "type": "Microsoft.DataFactory/factories",
    "apiVersion": "2018-06-01",
    "location": "[resourceGroup().location]",
    "identity": {
        "type": "SystemAssigned,UserAssigned",
        "userAssignedIdentities": {
            "[resourceId('Microsoft.ManagedIdentity/userAssignedIdentities', variables('uamiName'))]": {}
        }
    },
    "properties": {}
}
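Step 2 (creating the credential) can then reference this identity. Below is a minimal sketch of the credential sub-resource, assuming the same variables as above and a hypothetical credential name myUamiCredential:

```json
{
    "name": "[concat(variables('dataFactoryName'), '/myUamiCredential')]",
    "type": "Microsoft.DataFactory/factories/credentials",
    "apiVersion": "2018-06-01",
    "dependsOn": [
        "[resourceId('Microsoft.DataFactory/factories', variables('dataFactoryName'))]"
    ],
    "properties": {
        "type": "ManagedIdentity",
        "typeProperties": {
            "resourceId": "[resourceId('Microsoft.ManagedIdentity/userAssignedIdentities', variables('uamiName'))]"
        }
    }
}
```

The dependsOn entry matters here: the credential can only be created after the factory exists with the identity assigned, which is exactly why deploying steps 2 and 3 without step 1 fails with the "not associated with the factory" error.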
I am trying to create a linked service with the REST API in Git mode, but the linked service is still created in live mode. My API call was:
PUT https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DataFactory/factories/{factoryName}/linkedservices/{linkedServiceName}?api-version=2018-06-01&versionType=branch&version=test_branch
with the body:
{
    "properties": {
        "annotations": [],
        "type": "AzureKeyVault",
        "typeProperties": {
            "baseUrl": "https://xxxxxxxxx.vault.azure.net/"
        }
    }
}
Is there a way to reference the branch and create this service in Git mode?
As per the official documentation, changes made via PowerShell or an SDK (which includes the management REST API) are published directly to the Data Factory service and are not entered into Git.
Refer to https://learn.microsoft.com/en-us/azure/data-factory/source-control
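If the goal is to land the definition in Git mode anyway, a workaround is to commit the linked-service JSON straight to the collaboration branch with a normal git push or your Git provider's API, since ADF reads its Git-mode artifacts from the repository itself. As a sketch, assuming the default ADF repository layout and a hypothetical service name, the file would go to linkedService/myKeyVaultLS.json on the target branch:

```json
{
    "name": "myKeyVaultLS",
    "properties": {
        "annotations": [],
        "type": "AzureKeyVault",
        "typeProperties": {
            "baseUrl": "https://xxxxxxxxx.vault.azure.net/"
        }
    }
}
```

After the commit, the linked service should appear in the ADF authoring UI when that branch is selected.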
I need Azure DevOps CI/CD tips for Azure Data Factory.
I have an Azure Data Factory with a Web activity that posts JSON to Azure Logic Apps (ADF Web -> Logic Apps). I have made the URL a parameter in the Web activity.
Parameter in the ADF pipeline: MyReport_LogicAppURL
I have edited the ARM template parameter configuration (Manage -> ARM template -> Edit parameter configuration):
"Microsoft.DataFactory/factories/pipelines": {
    "properties": {
        "parameters": {
            "MyPremLoad_OnPremDb": {
                "defaultValue": "="
            },
            "MyReport_LogicAppURL": {
                "defaultValue": "="
            }
        }
    }
},
I save it and then publish.
However, I don't see any updates in ARMTemplateParametersForFactory.json in the Git repo.
Why does ARMTemplateParametersForFactory.json not get updated in Azure DevOps for Azure Data Factory? I have successfully done this in other environments, and another person has successfully deployed MyPremLoad_OnPremDb in the past.
If you have a non-standard Git hierarchy, you should find or place arm-template-parameters-definition.json in the Data Factory folder or the root folder.
Once you have made changes to the parameter definitions, save and refresh the browser to reload the configuration.
If you still don't see the changes, I would suggest manually deleting arm-template-parameters-definition.json from your publish branch, then saving the parameter definitions, refreshing the browser, and publishing from the ADF portal.
Make sure you have the right JSON structure after the edit; do share if you see any particular errors.
"Microsoft.DataFactory/factories/pipelines": {
    "properties": {
        "parameters": {
            "MyPremLoad_OnPremDb": {
                "defaultValue": "="
            },
            "MyReport_LogicAppURL": {
                "defaultValue": "="
            }
        }
    }
},
I added a default value to the parameter and it got fixed. Thank you KarthikBhyresh-MT for your support.
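For anyone hitting the same issue, adding the default value means editing the parameter in the pipeline definition itself; hypothetically, with a placeholder URL:

```json
"parameters": {
    "MyReport_LogicAppURL": {
        "type": "string",
        "defaultValue": "<logic-app-trigger-url>"
    }
}
```

With "defaultValue": "=" in the parameter configuration, the export parameterizes the pipeline parameter's default value, so a pipeline parameter with no default would have nothing to export, which would explain why the entry was missing from ARMTemplateParametersForFactory.json.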
I was searching the web for information about adding secrets and access policies, using ARM, to an existing Azure Key Vault that is shared by other applications.
I read this documentation.
What I'm worried about is whether anything existing will be overwritten or deleted, as I'm creating a new template and parameter file in my service's "solution", so to speak.
I know that my CI/CD pipelines in DevOps are set to "incremental" with regard to what they should be updating and creating.
Does anyone have a crystal-clear understanding of this?
Thanks in advance!
UPDATE:
So I think I managed to get it right after all.
I created a new Key Vault resource and added a couple of secrets and some access policies, to emulate an already created resource to which I want to add new secrets.
Then I created this template:
{
    "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
        "keyVault": {
            "type": "string"
        },
        "Credentials1": {
            "type": "secureString"
        },
        "SecretName1": {
            "type": "string"
        },
        "Credentials2": {
            "type": "secureString"
        },
        "SecretName2": {
            "type": "string"
        }
    },
    "variables": {},
    "resources": [
        {
            "type": "Microsoft.KeyVault/vaults/secrets",
            "name": "[concat(parameters('keyVault'), '/', parameters('SecretName1'))]",
            "apiVersion": "2015-06-01",
            "properties": {
                "contentType": "text/plain",
                "value": "[parameters('Credentials1')]"
            }
        },
        {
            "type": "Microsoft.KeyVault/vaults/secrets",
            "name": "[concat(parameters('keyVault'), '/', parameters('SecretName2'))]",
            "apiVersion": "2015-06-01",
            "properties": {
                "contentType": "text/plain",
                "value": "[parameters('Credentials2')]"
            }
        }
    ],
    "outputs": {}
}
What I've learned is that if there is an existing shared Key Vault to which I want to add some secrets, I only have to define the sub-resources, in this case the secrets to be added to the existing Key Vault.
So this worked, and it resulted in nothing else in the existing Key Vault being modified apart from the new secrets being added.
This is not a fully automated way of adding a whole new Key Vault setup for a new service, since it doesn't connect the new resources by adding their principal IDs (identities). Still, it's good for now, as I don't have to add each secret manually; I do, however, have to add the principal IDs manually.
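For completeness, a matching parameter file for the template above might look like the following; the names and values are placeholders, and in a real pipeline the secureString values should come from variable overrides or a Key Vault reference rather than being committed in plain text:

```json
{
    "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
        "keyVault": { "value": "my-shared-keyvault" },
        "SecretName1": { "value": "ServiceApiKey" },
        "Credentials1": { "value": "<secret-value-1>" },
        "SecretName2": { "value": "ServiceConnection" },
        "Credentials2": { "value": "<secret-value-2>" }
    }
}
```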
When using incremental mode to deploy the template, it should not overwrite the things in the Key Vault.
But to be foolproof, I recommend backing up your Key Vault keys, secrets, and certificates first. For the access policies, you can also export the template of the Key Vault first and save the accessPolicies section for restore, just in case.
If you redeploy the existing KeyVault in incremental mode any child properties, such as access policies, will be configured as they’re defined in the template. That could result in the loss of some access policies if you haven’t been careful to define them all in your template. The documentation linked to above will give you a full list of the properties that would be affected. As per the docs this can affect properties even if they’re not explicitly defined.
KeyVault Secrets aren’t a child property of the KeyVault resource so won’t get overwritten. They can be defined in ARM either as a separate resource in the same template or in a different template file. You can define some, all or none of the existing secrets in ARM. Any that aren’t defined in the ARM template will be left as is.
If you’re using CI/CD to manage your deployments it’s worth considering setting up a test environment to apply the changes to first so you can validate that the result is as expected before applying them to your production environment.
We use Synapse Git integration to deploy artifacts such as linked services generated by a data warehouse automation tool (JSON files).
This is different from deploying an ARM template in ADF.
We created one Azure Key Vault (AKV) per environment, so we have an Azure Key Vault linked service in each environment, and the linked services have the same name. But each AKV has its own URL, so we need to change the URL in the deployed linked services during the CI/CD process.
I read this https://learn.microsoft.com/en-us/azure/synapse-analytics/cicd/continuous-integration-deployment#use-custom-parameters-of-the-workspace-template
I think I need to create a template to change "Microsoft.Synapse/workspaces/linkedServices", but I didn't find any example of how to modify the Key Vault URL parameter.
Here is the linked service I want to modify; https://myKeyVaultDev.vault.azure.net has to be changed when deploying:
{
    "name": "myKeyVault",
    "properties": {
        "type": "AzureKeyVault",
        "typeProperties": {
            "baseUrl": "https://myKeyVaultDev.vault.azure.net"
        }
    }
}
I'm not very familiar with CI/CD and Azure DevOps yet, but I still need to do it...
I have done this using Azure DevOps. When you create the release pipeline in Azure DevOps, one of the options is to "override parameters". At this point you can specify the name of the Key Vault parameter and the corresponding value. The corresponding value is configured in a pipeline variable group, which itself can come from the same Key Vault.
You don't need to create the template. Synapse already does that and stores it in the publish branch (“workspace_publish”). If you look in that branch you will see the template along with the available parameters that you can override.
More info is available here:
https://www.drware.com/how-to-use-ci-cd-integration-to-automate-the-deploy-of-a-synapse-workspace-to-multiple-environments/
https://techcommunity.microsoft.com/t5/data-architecture-blog/ci-cd-in-azure-synapse-analytics-part-1/ba-p/1964172
From the Azure Key Vault side of things, I believe you're right: you have to change the linked services section within the template to point to the correct Key Vault base URL.
Azure Key Vault linked service
I don't know if you are still looking for the solution.
To parameterize a linked service property, and specifically the AKV reference, I think you should modify template-parameters-definition.json and add the following section:
"Microsoft.Synapse/workspaces/linkedServices": {
    "*": {
        "properties": {
            "typeProperties": {
                "baseUrl": "|:-connectionString:secureString"
            }
        }
    }
}
This will create a parameter for each linked service. The next step is to use overrideParameters on the SynapseWorkspaceDeployment task in Azure DevOps.
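With that definition in place, the generated workspace template in the workspace_publish branch should expose one parameter per linked service. The parameter name below is an assumption based on the linked service being called myKeyVault; check TemplateParametersForWorkspace.json for the exact generated name before overriding it per environment, for example:

```json
"myKeyVault_connectionString": {
    "value": "https://myKeyVaultProd.vault.azure.net"
}
```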
I have a requirement to process a table in ADFv2. For this I have to use a Web activity, but I don't know how to pass the OAuth credentials to it, because there is no OAuth-specific selection button. Below is the URL to which I am sending the request:
url - "https://northeurope.asazure.windows.net/servers/server123/models/testmodel1/refreshes"
Below is the request body:
"body": {
    "CommitMode": "transactional",
    "MaxParallelism": 2,
    "Objects": [
        {
            "table": "Customer"
        }
    ],
    "RetryCount": 2,
    "Type": "Full"
}
Authentication details used for posting the request to the API:
"authentication": {
    "audience": "https://*.asazure.windows.net",
    "clientId": "***",
    "secret": "***",
    "tenant": "***",
    "type": "ActiveDirectoryOAuth"
}
How can I do this?
You could use MSI authentication for your API in the Azure Data Factory Web activity. Please see this document:
Specify the resource URI for which the access token will be requested using the managed identity for the data factory. To call the Azure Resource Management API, use https://management.azure.com/. For more information about how managed identities work, see the managed identities for Azure resources overview page.
Based on this article, you can see that when a data factory is created, a service identity can be created along with it. The service identity is a managed application registered in Azure Active Directory, and it represents this specific data factory.
So just grant the factory's identity permission on the destination resource, and then your ADF activity will be able to access it.
You could refer to this case: Azure data factory web activity with MSI authentication
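As a sketch, the Web activity's authentication block would become the following with MSI, mirroring the audience from the question (note that the factory's managed identity also needs to be granted permissions on the Analysis Services server, e.g. as a server administrator):

```json
"authentication": {
    "type": "MSI",
    "resource": "https://*.asazure.windows.net"
}
```

The clientId, secret, and tenant fields fall away entirely, since the platform obtains the token for the factory's managed identity.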