How to create a custom image from a blob storage VHD in Azure DevTest Labs with PowerShell?

When I try creating a custom image using the ARM template from
https://github.com/Microsoft/azure-docs/blob/master/articles/devtest-lab/devtest-lab-create-custom-image-from-vhd-using-powershell.md
I get:
{
  "error": {
    "code": "InvalidStorageAccountForLab",
    "message": "Invalid storage account for lab"
  }
}
I have set the lab storage type to Premium, but when I deploy, only standard storage is created.
"apiVersion": "2016-05-15",
"type": "Microsoft.DevTestLab/labs",
"name": "[parameters('newLabName')]",
"location": "[resourceGroup().location]",
"tags": {
  "ResourceType": "Rig",
  "RigTemplate": "[parameters('customTag')]"
},
"properties": {
  "labStorageType": "Premium"
},

"I have added lab storage type as Premium, but when I deploy only standard storage is created."

A premium VHD can't be placed in a standard storage account. If you want to create the custom image in a premium storage account, create the premium storage account first, then replace the storage account name and key in the PowerShell script with those of the premium account.
Note:
The PowerShell script in the link you followed stores the image in the lab's default storage account, which is standard, so if you want to store the image in a premium storage account, you need to create one first.
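As an illustration, the premium storage account could be declared in the same ARM template; this is only a sketch, and the parameter name, SKU, and API version below are assumptions rather than values from the original template:

```json
{
  "apiVersion": "2016-12-01",
  "type": "Microsoft.Storage/storageAccounts",
  "name": "[parameters('premiumStorageAccountName')]",
  "location": "[resourceGroup().location]",
  "sku": {
    "name": "Premium_LRS"
  },
  "kind": "Storage"
}
```

The VHD would then be uploaded to this account, and its name and key passed to the PowerShell script in place of the lab's default standard account.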

Related

Google Cloud Storage permission denied

I set up a Cloud Run service which uses a bucket on Cloud Storage. Locally I run it in a Docker container; the credentials are passed via a JSON key file created and downloaded from IAM & Admin, and it works. When deployed, writing to the bucket throws an error:
500 unable to sign bytes: googleapi: Error 403: Permission 'iam.serviceAccounts.signBlob' denied on resource (or it may not exist).
Details:
[{
  "@type": "type.googleapis.com/google.rpc.ErrorInfo",
  "domain": "iam.googleapis.com",
  "metadata": {
    "permission": "iam.serviceAccounts.signBlob"
  },
  "reason": "IAM_PERMISSION_DENIED"
}]
Any idea?
I had to add the Service Account Token Creator role to the service account. I did that, but it still did not work, because a new version of the service also needs to be deployed. So:
1. Add the role Service Account Token Creator.
2. Deploy a new version of the service.
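The two steps above can also be scripted; a minimal sketch with gcloud, where the project, service account, service, and image names are hypothetical placeholders:

```shell
# Step 1: grant the Token Creator role (hypothetical project/service-account names)
gcloud projects add-iam-policy-binding my-project \
  --member="serviceAccount:my-run-sa@my-project.iam.gserviceaccount.com" \
  --role="roles/iam.serviceAccountTokenCreator"

# Step 2: deploy a new revision of the Cloud Run service so the change takes effect
gcloud run deploy my-service \
  --image=gcr.io/my-project/my-image \
  --region=us-central1
```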

How to programmatically assign user managed identity to Azure Data Factory

I have an Azure Data Factory in which I want to connect to Azure Synapse using the user-assigned managed identity authentication type.
Three steps need to be done, but unfortunately I haven't found a way to set up step 1 programmatically:
1. In Data Factory (Settings -> Managed identities), assign the user-assigned managed identity
2. Create credentials
3. Create linked services
If I deploy the ARM template with only the second and third steps implemented, I get the following exception:
"The referenced user assigned managed identity in the credential is not associated with the factory".
Do you know how I can assign User-Managed Identity to Data Factory?
I had the same error message: "The referenced user assigned managed identity in the credential is not associated with the factory".
When doing CI/CD for the ARM template, I noticed that the auto-generated ARM template from the Data Factory only had identity type SystemAssigned, even when I had manually added a user-assigned identity in the ADF GUI.
My solution was to modify arm-template-parameters-definition.json with this:
"Microsoft.DataFactory/factories": {
  "identity": "=:-identity"
}
Then in the parameter file you can pass in this:
"dataFactory_identity": {
  "value": {
    "type": "SystemAssigned,UserAssigned",
    "userAssignedIdentities": {
      "/subscriptions/<Insert_subscriptionId>/resourceGroups/<Insert_resourceGroupName>/providers/Microsoft.ManagedIdentity/userAssignedIdentities/<Insert_userAssignedIdentityName>": {}
    }
  }
}
Unfortunately, the documentation for the Azure Data Factory ARM resource is limited. I found a solution based on the documentation for the Azure Storage ARM resource; below is the solution:
{
  "name": "[variables('dataFactoryName')]",
  "type": "Microsoft.DataFactory/factories",
  "apiVersion": "2018-06-01",
  "location": "[resourceGroup().location]",
  "identity": {
    "type": "SystemAssigned,UserAssigned",
    "userAssignedIdentities": {
      "[resourceId('Microsoft.ManagedIdentity/userAssignedIdentities', variables('uamiName'))]": {}
    }
  },
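With the identity associated this way, step 2 (create credentials) can also be expressed in the same template. The following is only a sketch: the credential name myUamiCredential is an assumption, and it relies on the factories/credentials child resource with a ManagedIdentity credential type:

```json
{
  "name": "[concat(variables('dataFactoryName'), '/myUamiCredential')]",
  "type": "Microsoft.DataFactory/factories/credentials",
  "apiVersion": "2018-06-01",
  "dependsOn": [
    "[resourceId('Microsoft.DataFactory/factories', variables('dataFactoryName'))]"
  ],
  "properties": {
    "type": "ManagedIdentity",
    "typeProperties": {
      "resourceId": "[resourceId('Microsoft.ManagedIdentity/userAssignedIdentities', variables('uamiName'))]"
    }
  }
}
```

The dependsOn entry ensures the credential is only deployed after the factory (and its attached identity) exists, which is exactly what the error message was complaining about.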

Create Azure Data Factory linked service or integration runtime directly in Git mode with REST API

I am trying to create linked services with the REST API in Git mode, but the linked service is still created in live mode. My API call was:
PUT https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DataFactory/factories/{factoryName}/linkedservices/{linkedServiceName}?api-version=2018-06-01&versionType=branch&version=test_branch
with the body:
{
  "properties": {
    "annotations": [],
    "type": "AzureKeyVault",
    "typeProperties": {
      "baseUrl": "https://xxxxxxxxx.vault.azure.net/"
    }
  }
}
Is there a way to reference the branch and create this service in Git mode?
As per the official documentation, changes made via PowerShell or an SDK are published directly to the Data Factory service and are not entered into Git.
Refer - https://learn.microsoft.com/en-us/azure/data-factory/source-control
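If the definition has to end up in Git mode, one workaround is to commit the linked service JSON straight into the Git repository that backs the factory (under the configured root folder, typically linkedService/<name>.json in the target branch) instead of calling the REST API; ADF then picks it up from the branch. The file shape matches the REST body. A sketch, where the service name is a hypothetical example and the vault URL is the placeholder from the question:

```json
{
  "name": "MyKeyVaultLinkedService",
  "properties": {
    "annotations": [],
    "type": "AzureKeyVault",
    "typeProperties": {
      "baseUrl": "https://xxxxxxxxx.vault.azure.net/"
    }
  }
}
```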

How do I create additional buckets in Firebase Cloud Storage Emulator

I am trying to create additional storage buckets in the new Firebase Cloud Storage emulator. I see no option in the Emulator UI and have found no resources online. Is this even possible?
You can create buckets in the Firebase emulator by following these steps:
1. Set up deploy targets for storage:
firebase target:apply storage default myproject.appspot.com
firebase target:apply storage other other.appspot.com
2. Configure the firebase.json file.
Change the storage section to an array that looks like this:
{
  "storage": [
    {
      "target": "default",
      "rules": "storage.default.rules"
    },
    {
      "target": "other",
      "rules": "storage.other.rules"
    }
  ]
}
Note that a bucket won't be created in the Emulator Suite until you write something to it. Also, you need to change your project to the Blaze plan to configure multiple buckets in the Firebase console.
This URL has more details on the configuration:
https://firebase.google.com/docs/cli/targets#set-up-deploy-target-storage-database
This documentation page also has information on how to handle multiple buckets with the sdk:
https://firebase.google.com/docs/storage/web/start#use_multiple_storage_buckets
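With both targets applied and firebase.json updated, restarting the Storage emulator picks up both rule sets; this assumes the Firebase CLI is installed and an emulators section is already configured in firebase.json:

```shell
firebase emulators:start --only storage
```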

Azure DataLake Analytics U-SQL Pipeline Activities Error

Previously we used the Authorize (user credential) option for authentication, but it gets disabled after 14 days of inactivity, so I tried service principal authentication. However, it doesn't work and throws the error below while running the activity in Data Factory:
Cannot resolve DataLakeAnalyticsUri '', Please change or remove DataLakeAnalyticsUri and have a try.
{
  "name": "AzureDataLakeAnalyticsLinkedService",
  "properties": {
    "type": "AzureDataLakeAnalytics",
    "typeProperties": {
      "accountName": "accountName",
      "dataLakeAnalyticsUri": "azuredatalakeanalytics.net",
      "subscriptionId": "subscription Id",
      "resourceGroupName": "resource Group Name",
      "servicePrincipalId": "service Principal Id",
      "servicePrincipalKey": "service Principal Key",
      "tenant": "tenant id"
    }
  }
}
Updated answer:
Note: user accounts that are managed by Azure AD expire 14 days after the last slice run.
To avoid or resolve this error, reauthorize by selecting Authorize when the token expires, then redeploy the linked service.
To resolve the issue, make sure to pass "dataLakeAnalyticsUri": "<azure data lake analytics URI>".
Example: "dataLakeAnalyticsUri": "azuredatalakeanalytics.net"
For more details, refer to "Transform data by running U-SQL scripts on Azure Data Lake Analytics".
Hope this helps.