Script to copy files to Azure Blob Storage without a key - PowerShell

Is there a way to copy local files to an Azure storage account without the subscription string or the account key, if the PowerShell script is run in an environment that has access to the account? I know it can be done when the account key is provided, but I'm not allowed access to that key.
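If the environment can sign in to Azure (interactively, or via a managed identity), one option is to authenticate with Azure AD instead of an account key. A minimal sketch using the Az.Storage module, assuming your identity holds the Storage Blob Data Contributor role on the account; the account, container, and file names below are placeholders:

```powershell
# Sign in with the identity that has access (interactive, or a managed identity)
Connect-AzAccount

# Build a storage context that uses the signed-in Azure AD account, not a key
$ctx = New-AzStorageContext -StorageAccountName "mystorageacct" -UseConnectedAccount

# Upload a local file to a blob container
Set-AzStorageBlobContent -File "C:\data\report.csv" `
                         -Container "uploads" `
                         -Blob "report.csv" `
                         -Context $ctx
```

The same idea works with AzCopy (`azcopy login` followed by `azcopy copy`), since recent AzCopy versions also support Azure AD authentication.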

Related

How can I run pg_dump in Azure PowerShell via GitHub Actions

I need to automate pg_dump for a PostgreSQL server in Azure. I'd prefer to use GitHub Actions and Azure PowerShell to do this and store the file in an Azure storage account.
I can't seem to find any docs online that describe using GitHub Actions and Azure PowerShell to do this and save the exported PostgreSQL dump to an Azure storage account.
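One approach is a workflow step (for example, after an `azure/login` step) that runs a PowerShell script: dump the database to a local file, then push it to blob storage with the signed-in identity. A hedged sketch of such a script; the server, database, secret, and storage names are hypothetical, and it assumes pg_dump is on the runner's PATH and the workflow identity can write blobs:

```powershell
# Dump the PostgreSQL database to a local file
# (password supplied via an environment variable, e.g. injected from a GitHub secret)
$env:PGPASSWORD = $env:DB_PASSWORD
pg_dump --host=myserver.postgres.database.azure.com `
        --username=dumpuser --dbname=mydb `
        --file=backup.sql

# Upload the dump using the Azure AD identity the workflow is logged in with
$ctx = New-AzStorageContext -StorageAccountName "backupsacct" -UseConnectedAccount
Set-AzStorageBlobContent -File "backup.sql" -Container "pg-backups" `
                         -Blob ("mydb-{0:yyyyMMdd}.sql" -f (Get-Date)) -Context $ctx
```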

How to grant access to Azure File Copy of Azure Pipeline to Azure Storage?

I would like to copy files with Azure File Copy in an Azure Pipeline.
I'm following the instructions at https://praveenkumarsreeram.com/2021/04/14/azure-devops-copy-files-from-git-repository-to-azure-storage-account/
I'm using the automatically created Service Connection named "My Sandbox (a1111e1-d30e-4e02-b047-ef6a5e901111)"
I'm getting an error from AzureBlob File Copy:
INFO: Authentication failed, it is either not correct, or
expired, or does not have the correct permission ->
github.com/Azure/azure-storage-blob-go/azblob.newStorageError,
/home/vsts/go/pkg/mod/github.com/!azure/azure-storage-blob-
go#v0.10.1-0.20201022074806-
8d8fc11be726/azblob/zc_storage_error.go:42
RESPONSE Status: 403 This request is not authorized to perform
this operation using this permission.
I'm assuming that the Azure Pipeline has no access to the Azure Storage account.
I wonder how to find the service principal that should be granted access to Azure Storage.
I can reproduce your issue on my side as well. Different Azure File Copy task versions use different versions of AzCopy behind the scenes, so they use different authentication methods to call the API.
There are two ways to fix the issue.
If you use the automatically created service connection, it should have the Contributor role on your storage account, so you could use Azure File Copy task version 3.* instead of 4.*; then it will work.
If you want to use Azure File Copy task version 4.*, navigate to your storage account -> Access Control (IAM) -> add the service principal used in the service connection with the Storage Blob Data Contributor role (see detailed steps here). That will also work.
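The role assignment can also be scripted. A sketch using the Az module, assuming you know the service principal's application (client) ID from the service connection details; all bracketed values are placeholders you would substitute:

```powershell
# Look up the service connection's service principal by its application ID
$sp = Get-AzADServicePrincipal -ApplicationId "<appId>"

# Grant it data-plane write access to blobs at the storage-account scope
New-AzRoleAssignment -ObjectId $sp.Id `
    -RoleDefinitionName "Storage Blob Data Contributor" `
    -Scope "/subscriptions/<subId>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<account>"
```

Note that the Contributor role alone is not enough for AzCopy v10's Azure AD auth; the data-plane role (Storage Blob Data Contributor) is what the 4.* task needs.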

Access Azure DevOps Secure File properties

I've uploaded a PKCS#12 certificate as a secure file to my Azure DevOps project, and added the password as a property:
The problem is that while I can access the certificate file by using a Download Secure File task, I can't see any way to access the properties of the file.
In the meantime, I have what I want working by adding the password as a secret variable in a Variable Group instead, but I'm still curious how to access secure file properties.
Just encrypt the file with a key, then:
save the encrypted file in a shared place, e.g. a git repository or a storage account;
save the key in an Azure Key Vault or any other vault;
during the pipeline, decrypt the file with the key retrieved from AKV.
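The decrypt step above could look like this. A sketch only, assuming the passphrase is stored as a Key Vault secret named "cert-encryption-key" in a vault named "my-vault" (both hypothetical), and that the file was encrypted with the matching `openssl enc` settings:

```powershell
# Fetch the passphrase from Key Vault (requires Az.KeyVault and a signed-in session)
$secret = Get-AzKeyVaultSecret -VaultName "my-vault" -Name "cert-encryption-key" -AsPlainText

# Decrypt the certificate; assumes it was encrypted with:
#   openssl enc -aes-256-cbc -pbkdf2 -in cert.pfx -out cert.pfx.enc -pass pass:<key>
openssl enc -d -aes-256-cbc -pbkdf2 -in cert.pfx.enc -out cert.pfx -pass "pass:$secret"
```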

How to authorize clients in non-default directory to KeyVault

I created a KeyVault in my Azure subscription and a client application in one of my Azure AD directories. However, the client application is not registered in the default directory of the subscription.
When I run the following PowerShell cmdlet, it tries to look up the service principal in the default directory of the subscription and fails to find it.
PS > Set-AzureKeyVaultAccessPolicy -VaultName <vaultname>
-ServicePrincipalName <principal guid> -PermissionsToSecrets Get
I found an article describing how to change the default directory for a subscription in the management portal, but was wondering how to do the same using PowerShell.
The 'Select-AzureSubscription' cmdlet does not seem to support changing the default directory.
Nor does 'Set-AzureKeyVaultAccessPolicy' accept a parameter indicating which directory it should look in.
Key Vault can only authorize applications (clients) registered in the directory associated with the Azure subscription, and the only way (currently) to change the 'home' directory associated with a subscription is through the Azure management portal.
I would imagine this is by-design behaviour, and I can't see how or why it would change.
An Azure subscription has an Azure Active Directory attached to it; this is the directory it will use to authenticate whenever someone tries to access resources.
While you can create trusts to other Active Directories, simply creating a new AAD does not automatically make that directory trusted by Azure.
Key Vault is designed to only be accessible to authenticated users, it is designed to provide secure data to those users. Since there is no authentication mechanism between the multiple directories you have created, there is no mechanism for Key Vault to determine who those directories belong to.
Key Vault needs to resolve a service principal through the Active Directory attached to the subscription it is running under (whether directly through that directory, or through another directory that it trusts). Anything else would create additional attack vectors and considerably weaken the product.

How do I get the subscriptions of an Azure account via PowerShell when I am logged into Windows using a different account?

When I execute Get-AzureAccount, I see the Azure account of the domain account I am logged into Windows with. So, when I run Get-AzureSubscription, I see the associated subscriptions. I want to get the subscriptions associated with a different account (one I cannot log into Windows with), but I cannot figure out how this is done. Of course, Add-AzureAccount would seem to be the way to go, but despite reading the TechNet help page on it, I don't see how another account can be added.
Azure subscriptions are stored per user in "C:\Users\%username%\AppData\Roaming\Windows Azure Powershell" (i.e. "%AppData%\Windows Azure Powershell"). The contents of that directory is an XML file containing the user's subscriptions. Each subscription is linked to a certificate that needs to reside in the same user's certificate store in order to connect.
Anyway, using
Get-AzureSubscription -SubscriptionDataFile <path to the other user's xml file>
you should be able to read those subscriptions, provided you have access to the other user's profile folder (which would require local admin permissions on a normal system).
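Putting that together, a sketch; the file name varies by Azure PowerShell version (it is assumed here to be DefaultSubscriptionData.xml), and "otheruser" is a placeholder:

```powershell
# Point the cmdlet at the other user's subscription data file
# (file name may differ depending on your Azure PowerShell version)
$otherProfile = "C:\Users\otheruser\AppData\Roaming\Windows Azure Powershell\DefaultSubscriptionData.xml"
Get-AzureSubscription -SubscriptionDataFile $otherProfile

# Alternatively, add the other account to your own session;
# this prompts for that account's Azure credentials, no Windows logon needed
Add-AzureAccount
```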