Pulumi: How to restore accidentally deleted state file stored in Azure Storage Account

Is there a way to restore accidentally deleted stacks/state files from backups or histories stored in Azure Storage Account?
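One possible recovery path, sketched below under the assumption that the stack uses the self-managed Azure Blob backend, which keeps timestamped checkpoint copies under a .pulumi/backups/<stack>/ prefix next to the live .pulumi/stacks/<stack>.json file (the exact prefix can vary by Pulumi version, and the account name "pulumistate", container "pulumi-state", and stack "dev" are hypothetical): download the most recent backup with Az PowerShell and re-import it with pulumi stack import. If blob soft delete or versioning is enabled on the storage account, the deleted state blob may also be recoverable directly (Containers -> Show deleted blobs in the portal).

# Sketch: restore a Pulumi stack from the backend's backup copies (paths and names are assumptions).
# Requires the Az.Storage module and the Pulumi CLI, signed in with an identity that can read the container.
$ctx = New-AzStorageContext -StorageAccountName "pulumistate" -UseConnectedAccount

# List the timestamped checkpoint backups Pulumi keeps alongside the live state file.
$backups = Get-AzStorageBlob -Container "pulumi-state" -Prefix ".pulumi/backups/dev/" -Context $ctx |
    Sort-Object LastModified -Descending

# Download the most recent backup locally.
Get-AzStorageBlobContent -Container "pulumi-state" -Blob $backups[0].Name `
    -Destination "./dev-checkpoint.json" -Context $ctx

# Re-create the stack (if it was removed) and import the recovered checkpoint.
pulumi stack init dev
pulumi stack import --stack dev --file ./dev-checkpoint.json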

Related

Delete a file in sharepoint using Azure Data Factory Delete Activity

I am trying to delete a file located in a SharePoint directory after a successful Copy activity. The Delete activity has the following properties:
Linked Service : HTTP
DataSet : Excel
Additional Header: #{concat('Authorization: Bearer ',activity('GetToken').output.access_token)}
Here, GetToken is the Web activity in ADF that generates an access token for SharePoint.
When I run the pipeline, I get the following error:
Invalid delete activity payload with 'folderPath' that is required and cannot be empty.
I have no clue how to tackle this.
As per my understanding, you are trying to delete a file in SharePoint Online using Azure Data Factory.
Currently, the Delete activity in ADF only supports the data stores listed below, not SharePoint Online, which is why you are receiving the above error.
Azure Blob storage
Azure Data Lake Storage Gen1
Azure Data Lake Storage Gen2
Azure Files
File System
FTP
SFTP
Amazon S3
Amazon S3 Compatible Storage
Google Cloud Storage
Oracle Cloud Storage
HDFS
Image: Delete activity Supported Data stores
Ref: Delete activity supported data sources
As a workaround, you may try exploring the HTTP connector, or you can use a Custom activity and write your own code to delete files from SharePoint.
Hope this info helps.
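If you go down the custom-code route, here is a minimal sketch of deleting the file through the SharePoint Online REST API; the site URL and server-relative file path are hypothetical, and the bearer token is assumed to be the same one the GetToken web activity retrieves.

# Sketch: delete a file from SharePoint Online via its REST API (site URL and file path are placeholders).
$token   = "<access token from the GetToken activity>"
$siteUrl = "https://contoso.sharepoint.com/sites/mysite"
$file    = "/sites/mysite/Shared Documents/report.xlsx"   # server-relative path of the file to delete

Invoke-RestMethod -Method Delete `
    -Uri "$siteUrl/_api/web/GetFileByServerRelativeUrl('$file')" `
    -Headers @{
        Authorization = "Bearer $token"
        "If-Match"    = "*"                                # delete regardless of the file's current ETag
        Accept        = "application/json;odata=verbose"
    }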

Error browsing directory under ADLS Gen2 container for Azure Data Factory

I am creating a dataset in Azure Data Factory. This dataset will be a Parquet file within a directory under a certain container in an ADLS Gen2 account. The container name is 'raw', and the directory that I want to place the file into is source/system1/FullLoad. When I click on Browse next to File path, I am able to access the container, but I cannot access the directory. When I hit folder 'source', I get the error shown below.
How can I drill down to the desired directory? As the error message indicates, I suspect it's something to do with permissions to access the data (the Parquet file doesn't exist yet, as it will be used as a sink in a copy activity that hasn't been run yet), but I don't know how to resolve it.
Thanks for confirming; putting the resolution here for others in case anyone faces this issue.
The user or managed identity you are using for your data factory should have Storage Blob Data Contributor access on the storage account. You can check it from the Azure portal: go to your storage account, navigate to the container and then the directory, click on Access Control (IAM) in the left panel, and check the role assignments. If it is missing, add a Storage Blob Data Contributor role assignment for your managed identity.
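For reference, here is a minimal sketch of granting that role with Az PowerShell; the object ID and resource ID below are placeholders for your data factory's managed identity and your ADLS Gen2 account, and the scope can also be narrowed to a single container.

# Sketch: grant the data factory's managed identity Storage Blob Data Contributor on the storage account.
# The object ID and resource ID are placeholders.
New-AzRoleAssignment `
    -ObjectId "<object id of the data factory's managed identity>" `
    -RoleDefinitionName "Storage Blob Data Contributor" `
    -Scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<adls-account>"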

How to grant access to Azure File Copy of Azure Pipeline to Azure Storage?

I would like to copy files with Azure File Copy in an Azure Pipeline.
I'm following the instructions at https://praveenkumarsreeram.com/2021/04/14/azure-devops-copy-files-from-git-repository-to-azure-storage-account/
I'm using the automatically created service connection named "My Sandbox (a1111e1-d30e-4e02-b047-ef6a5e901111)".
I'm getting an error with AzureBlob File Copy:
INFO: Authentication failed, it is either not correct, or expired, or does not have the correct permission ->
github.com/Azure/azure-storage-blob-go/azblob.newStorageError,
/home/vsts/go/pkg/mod/github.com/!azure/azure-storage-blob-go@v0.10.1-0.20201022074806-8d8fc11be726/azblob/zc_storage_error.go:42
RESPONSE Status: 403 This request is not authorized to perform this operation using this permission.
I'm assuming that the Azure Pipeline has no access to Azure Storage.
I wonder how to find the service principal which should be given access to Azure Storage.
I can also reproduce your issue on my side. Different Azure File Copy task versions use different versions of AzCopy behind the scenes, so they use different authentication methods to call the API.
There are two ways to fix the issue.
If you use the automatically created service connection, it should have the Contributor role on your storage account. In that case you can use Azure File Copy task version 3.* instead of 4.*, and it will work.
If you want to use Azure File Copy task version 4.*, navigate to your storage account -> Access Control (IAM) -> add the service principal used in the service connection to the Storage Blob Data Contributor role (see detailed steps here). It will also work.
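If you are not sure which service principal backs the service connection, you can open the service connection in Azure DevOps and follow its "Manage Service Principal" link, or search for it with Az PowerShell as sketched below; the display-name search string and scope are assumptions (the auto-created app registration's name typically contains your DevOps organization and project names).

# Sketch: locate the service principal behind the auto-created ARM service connection and grant it
# Storage Blob Data Contributor on the storage account. Names and IDs below are placeholders.
$sp = Get-AzADServicePrincipal -DisplayNameStartsWith "<your DevOps organization name>"

New-AzRoleAssignment `
    -ObjectId $sp[0].Id `
    -RoleDefinitionName "Storage Blob Data Contributor" `
    -Scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>"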

Creating an organization on Azure DevOps with an existing name

I have created an organization on Azure DevOps with my email ID (created by me), which is the same as the email ID associated with my Azure subscription.
I want to create an organization with the same name and URL as the one I created with my personal Microsoft account.
I deleted the one I created and tried creating it again while logged in with a work account; however, I get an error that the organization already exists.
How can I get it resolved?
Azure DevOps can be linked to an Azure Active Directory. In your situation, I strongly suggest you follow these steps to link it and transfer ownership to your work account:
Make sure you can fully control both your work account and your personal Microsoft account.
Link your existing Azure DevOps organization to your Azure Active Directory.
Add your work account as an administrator in your Azure DevOps organization.
Transfer the ownership of the organization to your work account.
Kick your personal account out.
Here are some tips:
You can link an existing Azure Active Directory from your organization settings.
You can also change the ownership of your Azure DevOps organization from the organization settings.
Backup solution
Of course, you can delete the entire Azure DevOps organization and recreate it. Before deleting it, make sure all your data is safe, then press the Delete button in the Overview settings.
After deleting it, you can re-create a new organization with the same name using your work account.

Script to copy files to Azure Blob Storage without a key

Is there a way to copy local files to an Azure storage account without the subscription string or account key/password, if the PowerShell script is run in an environment which has access to the account? I know it can be done if the account key is provided, but I'm not allowed access to the password.
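A minimal sketch of one way to do this with Azure AD authentication instead of an account key, assuming the identity running the script (a signed-in user or a managed identity) has been given a blob data role such as Storage Blob Data Contributor on the storage account; the account, container, and file names are placeholders.

# Sketch: upload a local file to Blob Storage using Azure AD auth instead of an account key.
# The caller needs a blob data role (e.g. Storage Blob Data Contributor) on the storage account.
Connect-AzAccount                       # or Connect-AzAccount -Identity when running under a managed identity

$ctx = New-AzStorageContext -StorageAccountName "mystorageaccount" -UseConnectedAccount

Set-AzStorageBlobContent -File "C:\data\report.csv" `
    -Container "uploads" `
    -Blob "report.csv" `
    -Context $ctx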