I need to automate pg_dump for a Postgres server in Azure. I would prefer to use GitHub Actions and Azure PowerShell to do this and store the file in an Azure storage account.
I can't seem to find any docs online that document using GitHub Actions and Azure PowerShell to do this and save the exported Postgres file to an Azure storage account.
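Roughly, I imagine the workflow step would run something like the following PowerShell (just a sketch on my part; the server, database, and storage account names are placeholders, and I'm assuming the runner has already signed in via azure/login and has pg_dump plus the Az.Storage module available):

# Sketch: dump the database, then push the dump to blob storage.
$env:PGPASSWORD = $env:PG_PASSWORD    # supplied as a GitHub Actions secret
$dumpFile = "backup-$(Get-Date -Format 'yyyyMMdd-HHmmss').dump"
pg_dump -h myserver.postgres.database.azure.com -U myadminuser -d mydatabase -Fc -f $dumpFile
# -UseConnectedAccount uses the signed-in identity; it needs the
# Storage Blob Data Contributor role on the storage account.
$ctx = New-AzStorageContext -StorageAccountName 'mystorageaccount' -UseConnectedAccount
Set-AzStorageBlobContent -File $dumpFile -Container 'backups' -Blob $dumpFile -Context $ctx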
Related
I have a build pipeline which builds *.p12 files and publishes them as artifacts.
How do I copy files from the Azure DevOps pipeline workspace, or from published artifacts, to an Azure storage account (file share) path?
You could use the Azure File Copy task in an Azure Pipeline to copy files from the pipeline to an Azure storage account. Note that this task works only when run on Windows agents.
The "az storage file upload" Azure CLI command works to upload files from the Azure DevOps pipeline workspace to a file share in the storage account.
az storage file upload is what y'all are looking for. az storage file upload-batch can be used to upload whole directories instead of individual files. Despite what the docs say, a combination of connection string and share name will also work.
Since it took me a while to find it, here is the command to get the connection string and then reuse it later:
connectionString=$(az storage account show-connection-string -g <resource-group> -n <storage-account> --query connectionString -o tsv)
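And then reusing it later, for example (share name and source paths are placeholders):

# upload a single file from the pipeline workspace
az storage file upload --connection-string "$connectionString" --share-name myshare --source ./drop/mycert.p12
# or upload a whole directory at once
az storage file upload-batch --connection-string "$connectionString" --destination myshare --source ./drop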
We have automated scripts that we would like to build and test on Azure DevOps, but our pipeline cannot run our test scripts on Azure.
We have a database service account that we want to configure on Azure, but we don't know how to go about it. Please assist.
Here is a well-explained video (by Hassan Habib from Microsoft) on exactly how to run a console app (that you create) in an Azure Pipeline that securely gets credentials to immediately do stuff in Azure: https://youtu.be/ht0xhQyF1x4?t=1688
In a handful of minutes, he shows exactly how to:
Link pipeline variables to KeyVault secrets, so that when accessed, the variables do a get() from KeyVault and return that value.
Securely link pipeline variables to Azure environment variables.
Read those environment variables from the console app, as a step in the release pipeline, to get credentials for working with Azure.
In his case, the console app creates an Azure resource group.
In your case, if I'm understanding correctly, you could make a simple console app that runs in the pipeline, gets the credentials/connection strings for your database, does whatever you need in the DB, and tests your scripts.
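For instance, a PowerShell step along these lines could pick up the KeyVault-backed variable and run a test query (the variable name DB_CONNECTION_STRING is made up here, and Invoke-Sqlcmd requires the SqlServer module):

# DB_CONNECTION_STRING is a hypothetical pipeline variable linked to a
# KeyVault secret and mapped into the step's environment.
$connectionString = $env:DB_CONNECTION_STRING
if (-not $connectionString) { throw 'DB_CONNECTION_STRING was not set' }
Invoke-Sqlcmd -ConnectionString $connectionString -Query 'SELECT 1 AS ok'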
For example, I want to offboard Tiger from the Azure DevOps organization BigZoo (Tiger belongs to three organizations: BigZoo, SmallZoo, and MiddleZoo; I only want to remove Tiger from BigZoo).
This Azure CLI command works in Azure Cloud Shell (PowerShell):
az devops user remove --user tiger
[--org BigZoo]
[--yes]
But it does not work in an Azure Function: the function runs in a PowerShell environment, and while Cloud Shell PowerShell can run the Azure CLI, PowerShell in an Azure Function cannot run anything az-related.
So if I want to use only PowerShell, without any help from the Azure CLI, there are some modules, but every module I have tried removes the user from all organizations rather than from one specific organization.
Any suggestions for removing a user from a specific Azure DevOps organization? Using the REST API seems too complex. Any good ideas?
Thanks
I've not tried this, but there is the VSTeam PowerShell module in the PowerShell Gallery, which wraps the Azure DevOps API.
It has a function called Remove-VSTeamUserEntitlement.
Docs here:
https://methodsandpractices.github.io/vsteam-docs/docs/modules/vsteam/commands/Remove-VSTeamUserEntitlement
Module here:
https://www.powershellgallery.com/packages/VSTeam/
Add the module to the requirements.psd1 file of your Azure Functions project and you should be able to use it.
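I've not tested it, but usage would presumably look something like this (the PAT environment variable and email are placeholders):

# requirements.psd1 entry so the function app pulls VSTeam from the gallery:
# @{ 'VSTeam' = '7.*' }

# Authenticate against just the organization you want to change, then
# remove the user's entitlement from that organization only.
Set-VSTeamAccount -Account 'BigZoo' -PersonalAccessToken $env:ADO_PAT
Remove-VSTeamUserEntitlement -Email 'tiger@example.com' -Force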
I would like to copy files with the Azure File Copy task in an Azure Pipeline.
I'm following the instructions at https://praveenkumarsreeram.com/2021/04/14/azure-devops-copy-files-from-git-repository-to-azure-storage-account/
I'm using the automatically created service connection named "My Sandbox (a1111e1-d30e-4e02-b047-ef6a5e901111)".
I'm getting an error with AzureBlob File Copy:
INFO: Authentication failed, it is either not correct, or expired, or does not have the correct permission ->
github.com/Azure/azure-storage-blob-go/azblob.newStorageError,
/home/vsts/go/pkg/mod/github.com/!azure/azure-storage-blob-go@v0.10.1-0.20201022074806-8d8fc11be726/azblob/zc_storage_error.go:42
RESPONSE Status: 403 This request is not authorized to perform this operation using this permission.
I'm assuming that the Azure Pipeline has no access to Azure Storage.
I wonder how to find the service principal which should be granted access to Azure Storage.
I can also reproduce your issue on my side. Different versions of the Azure File Copy task use different versions of AzCopy behind the scenes, so they use different authentication methods when calling the API.
There are two ways to fix the issue.
If you use the automatically created service connection, it should have the Contributor role on your storage account; in that case, use Azure File Copy task version 3.* instead of 4.* and it will work.
If you want to use Azure File Copy task version 4.*, navigate to your storage account -> Access control (IAM) and assign the service principal used in the service connection the Storage Blob Data Contributor role. That will also work.
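For example, with Az PowerShell (the app ID and scope are placeholders):

# Grant the service connection's service principal data-plane access to blobs.
New-AzRoleAssignment -ApplicationId '<service-principal-app-id>' `
    -RoleDefinitionName 'Storage Blob Data Contributor' `
    -Scope '/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<account>'

You can find the service principal by opening the service connection under Project settings -> Service connections and following its Manage Service Principal link.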
I have created an Azure SQL database under a resource group using AzureResourceManager mode.
I am trying to import a bacpac file from an Azure blob to Azure SQL using the Start-AzureSqlDatabaseImport command, but it looks like this command is not available in AzureResourceManager mode (it is available in AzureServiceManagement mode).
Is any similar command available in AzureResourceManager mode?
At the moment there is not an Azure Resource Manager command for Azure SQL Database's import functionality. For a list of currently supported Azure Resource Manager commands for Azure SQL Database, see the Azure SQL Database PowerShell reference.
As of AzureRm PowerShell 5.7.0, there are dedicated cmdlets for importing and exporting bacpacs directly:
https://learn.microsoft.com/en-us/powershell/module/AzureRM.Sql/New-AzureRmSqlDatabaseImport?view=azurermps-5.7.0
Caveat: you have to store the .bacpacs in an Azure Storage Account.
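A sketch of the import call (all names are placeholders; the Az-module equivalent is New-AzSqlDatabaseImport):

# Kick off the import from a bacpac sitting in blob storage.
$request = New-AzureRmSqlDatabaseImport `
    -ResourceGroupName 'myResourceGroup' `
    -ServerName 'myserver' `
    -DatabaseName 'myImportedDb' `
    -StorageKeyType 'StorageAccessKey' `
    -StorageKey '<storage-account-key>' `
    -StorageUri 'https://mystorage.blob.core.windows.net/bacpacs/mydb.bacpac' `
    -AdministratorLogin 'sqladmin' `
    -AdministratorLoginPassword (Read-Host -AsSecureString 'SQL admin password') `
    -Edition Standard `
    -ServiceObjectiveName 'S0' `
    -DatabaseMaxSizeBytes 5GB

# The import runs asynchronously; poll its status until it completes.
Get-AzureRmSqlDatabaseImportExportStatus -OperationStatusLink $request.OperationStatusLink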