Using an Azure DevOps pipeline, I currently delete all files in the container and then upload everything again. Instead, I want to upload only the files that changed since the last run (like an incremental backup), using an az storage blob command or any other command except azcopy. Can anyone guide me on how to do this?
Since you want to use the az storage blob commands to upload files to an Azure Storage blob container, I suggest you use az storage blob sync, which only uploads files that have changed since the last sync.
For example:
az storage blob sync -c mycontainer -s "path/to/file" -d NewBlob
In an Azure DevOps pipeline, you can use the Azure CLI task to run the command.
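For example, the task's inline script could be a single sync command; a minimal sketch, where the storage account, container and source folder are placeholders and credentials (account key, SAS token, etc.) are passed the same way you do today:

az storage blob sync --account-name mystorageaccount -c mycontainer -s "$(Build.ArtifactStagingDirectory)/drop"

Blobs whose content has not changed since the last run are skipped, so the upload behaves like an incremental backup rather than a full re-upload.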
Related
I am trying to connect my PowerShell runbook to a storage account (blob) to read .sql files and execute them in my Azure SQL Database.
Connect to a blob container
Read the script from the .sql file
Execute the script against the database
When I try Invoke-Sqlcmd, it requires dedicated storage to hold the file. However, runbooks run serverless and, as far as I know, there is no storage I can use for the files.
Is there a way to only read the files (without moving them around) via a PowerShell runbook, or can I store the files somewhere the runbook can read them?
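In case it is useful: runbook sandboxes do expose a local temp folder, so one workaround is to download the blob there and point Invoke-Sqlcmd at that file. A minimal sketch, assuming the Automation account's managed identity can read the container and that all names below are placeholders:

# Sign in with the Automation account's managed identity (assumed to have blob read access)
Connect-AzAccount -Identity | Out-Null
$ctx = New-AzStorageContext -StorageAccountName "mystorageaccount" -UseConnectedAccount
# Download the .sql blob into the sandbox's temp folder
$file = Join-Path $env:TEMP "script.sql"
Get-AzStorageBlobContent -Container "scripts" -Blob "script.sql" -Destination $file -Context $ctx -Force | Out-Null
# Run the script against the database; "SqlCredential" is a hypothetical credential asset
$cred = Get-AutomationPSCredential -Name "SqlCredential"
Invoke-Sqlcmd -ServerInstance "myserver.database.windows.net" -Database "mydb" -Credential $cred -InputFile $file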
I have a PostgreSQL database in Kubernetes (AKS) and I need to back it up to an Azure blob container. Is there any way to do this?
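One common approach is to run pg_dump inside the PostgreSQL pod and push the dump to a blob container with the Azure CLI. A rough sketch, where the pod, database, storage account and container names are all placeholders:

kubectl exec my-postgres-pod -- pg_dump -U postgres -d mydb -F c -f /tmp/mydb.dump
kubectl cp my-postgres-pod:/tmp/mydb.dump ./mydb.dump
az storage blob upload --account-name mystorageaccount --container-name backups --name mydb.dump --file ./mydb.dump --auth-mode login

You could wrap these commands in a Kubernetes CronJob or a pipeline if the backup needs to run on a schedule.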
I currently have an Azure PostgreSQL server which I would like to permanently delete.
For obvious reasons I would like to take a snapshot/soft-delete copy before turning it off and deleting it. What is the simplest way to accomplish this? I know there is a Backup Center option, but that feels rather sophisticated for what I need.
Thanks
You can back up the Azure PostgreSQL server to an Azure Storage account, then download the backup file locally from the storage account. After that you can delete the storage account and the Azure PostgreSQL server.
Below is an article that shows how to take a backup to Blob Storage.
Backup Azure Database for PostgreSQL to a Blob Storage
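If you just need a one-off copy before deleting the server, a rough sketch with pg_dump and the Azure CLI could look like this (server, database, storage account and resource group names are placeholders, and the exact connection string depends on whether you run a single or flexible server):

pg_dump "host=myserver.postgres.database.azure.com port=5432 dbname=mydb user=myadmin sslmode=require" -F c -f mydb.dump
az storage blob upload --account-name mystorageaccount --container-name backups --name mydb.dump --file mydb.dump --auth-mode login
az postgres server delete --resource-group myresourcegroup --name myserver --yes

Keep the dump file (locally or in the container) until you are sure you no longer need the data.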
I have tried using the Export Data-tier Application option, but I get errors at every step of the output:
Exporting Database
Extracting Schema
Extracting Schema from database
and there are no details about why it failed.
Is there a way I can back it up on Azure and copy the .bacpac file, or do it with PowerShell?
You can use SqlPackage to create a backup (export a bacpac) of your Azure SQL Database to your local drive.
sqlpackage.exe /Action:Export /ssn:tcp:sqlftpbackupserver.database.windows.net /sdn:sqlftpbackupdb /su:alberto /tf:c:\sql\sqlftpbackup.bacpac /sp:yourpwd /p:Storage=File
In the above example, we run an Export against the Azure SQL server sqlftpbackupserver.database.windows.net. The source database is sqlftpbackupdb, the source user is alberto, and the target file we export to is c:\sql\sqlftpbackup.bacpac. /sp specifies the password of the Azure SQL user, and /p:Storage=File tells SqlPackage to back the schema model with a file instead of memory during the export.
Another example is:
SqlPackage /Action:Export /SourceServerName:SampleSQLServer.sample.net,1433
/SourceDatabaseName:SampleDatabase /TargetFile:"F:\Temp\SampleDatabase.bacpac"
You can also try backing up the database to Blob Storage from the Portal, then downloading it to your local disk.
For PowerShell, here's a command example that backs up an Azure SQL database to the local disk only:
C:\PS>Backup-Database -Name "database1" -DownloadLocation "D:\temp" -Server "mydatabaseserver" -UserName "username" -Password "password" -Verbose
For more details, reference Backup SQL Azure Database.
There is also a tutorial, How to backup Azure SQL Database to Local Machine, that covers almost all the ways to back up an Azure SQL database to a local disk.
Hope this helps.
How can I run a shell script from Azure Data Factory? Inside the shell script I am trying to execute an .hql file like below:
/usr/bin/hive -f "wasbs://hivetest@batstorpdnepetradev01.blob.core.windows.net/hivetest/extracttemp.hql" > wasbs://hivetest@batstorpdnepetradev01.blob.core.windows.net/hivetest/extracttemp.csv
My .hql file is stored in Blob Storage, and I want to execute it, collect the result into a .csv file, and store that back in Blob Storage. The whole command is in a shell script, which is also in Blob Storage. Now I want to execute this from Azure Data Factory using a Hive activity. Help will be appreciated.
You could use the Hadoop Hive activity in ADF. Please take a look at this doc. You can build the pipeline with the ADF V2 UI.
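For reference, a rough sketch of what the activity definition could look like in the pipeline JSON; the linked service names here are placeholders, and scriptPath points at the .hql file in your blob container:

{
    "name": "RunHiveScript",
    "type": "HDInsightHive",
    "linkedServiceName": {
        "referenceName": "HDInsightLinkedService",
        "type": "LinkedServiceReference"
    },
    "typeProperties": {
        "scriptPath": "hivetest/extracttemp.hql",
        "scriptLinkedService": {
            "referenceName": "AzureBlobStorageLinkedService",
            "type": "LinkedServiceReference"
        }
    }
}

The activity runs the script on an HDInsight cluster, so writing the result back to wasbs:// would typically be done inside the .hql itself (for example with INSERT OVERWRITE DIRECTORY) rather than with a shell redirect.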