Read .sql files with an Azure Automation runbook - PowerShell

I am trying to connect my PowerShell runbook to a storage account (blob) to read .sql files and execute them against my Azure SQL Database. The steps are:
Connect to a blob container
Read the script from the .sql file
Execute the script against the database
When I try Invoke-Sqlcmd, it needs the .sql file stored somewhere it can read it from. However, runbooks run serverless and, as far as I know, there is no storage I can use for the files.
Is there a way to just read the files (without moving them around) from a PowerShell runbook, or can I stage the files somewhere the runbook can read them?
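One workable pattern: the Automation sandbox does expose a temporary directory, so the runbook can pull the blob down with the Az.Storage cmdlets and hand the script text to Invoke-Sqlcmd. A minimal sketch, assuming placeholder storage/container/server names and that $storageKey, $sqlUser and $sqlPassword come from Automation variables or credentials:
# Connect to the storage account that holds the .sql files
$ctx = New-AzStorageContext -StorageAccountName "mystorageacct" -StorageAccountKey $storageKey
# Download the blob into the sandbox's temp directory (available even though the runbook is serverless)
$localPath = Join-Path $env:TEMP "deploy.sql"
Get-AzStorageBlobContent -Container "sqlscripts" -Blob "deploy.sql" -Destination $localPath -Context $ctx -Force | Out-Null
# Read the script text and run it against the Azure SQL database
$query = Get-Content -Path $localPath -Raw
Invoke-Sqlcmd -ServerInstance "myserver.database.windows.net" -Database "mydb" `
    -Username $sqlUser -Password $sqlPassword -Query $query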

Related

How do you run many SQL commands against an Azure SQL database using an Azure Automation PowerShell runbook

I'm using Azure Automation to move an Azure SQL database from one resource to another (from Prod to Dev, for example). After the database is copied, I would then like to run a SQL script that adds some users and permissions. This means I need to run a handful of commands like "CREATE USER ..." and "ALTER ROLE ...". Most examples I've found use PowerShell to execute a single SQL command, but using that code to run many commands seems like it would result in an excessively long PowerShell script. In the on-prem world, I would probably have a .sql file that gets executed. Any suggestions on how to achieve this easily using PowerShell in Azure Automation? Thanks!
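One way to keep the runbook short is to keep all of the statements in a single script (a here-string, or a .sql file read as in the previous question) and split it into batches on GO separators before passing each batch to Invoke-Sqlcmd. A minimal sketch with placeholder server, database and credential values:
# Placeholder connection details; in practice these come from Automation variables/credentials
$server   = "myserver.database.windows.net"
$database = "DevCopyDb"
# The whole post-copy script in one place (could equally be read from a .sql file in blob storage)
$script = @"
CREATE USER [app_user] WITH PASSWORD = 'Placeholder-Passw0rd!';
GO
ALTER ROLE db_datareader ADD MEMBER [app_user];
GO
"@
# Split on lines containing only GO (the batch separator) and run each batch in turn
$batches = $script -split '(?im)^\s*GO\s*$' | Where-Object { $_.Trim() }
foreach ($batch in $batches) {
    Invoke-Sqlcmd -ServerInstance $server -Database $database `
        -Username $sqlUser -Password $sqlPassword -Query $batch
}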

How to execute SQL scripts using Azure Databricks

I have one SQL script file.
The file contains a few SQL queries.
I want to upload it to DBFS, read it from Azure Databricks, and execute the queries from the script on Azure Databricks.
Databricks does not directly support the execution of .sql files. However, you could just read them into a string and execute them:
# Read the script from DBFS (note the /dbfs local-file prefix for the DBFS root)
with open("/dbfs/path/to/query.sql") as queryFile:
    queryText = queryFile.read()

# Execute the statement with Spark SQL; the result comes back as a DataFrame
results = spark.sql(queryText)

Google Cloud SQL Restore BAK file

I am new to Google Cloud. I created a Cloud SQL instance and I need to restore the data from a .bak file. I have the .bak file in a GCS bucket, and I am trying to restore it using Microsoft Management Studio -> Task -> Restore, but I'm not able to access the file.
Can anyone help me with the procedure on how to restore from a .bak file?
You need to give the Cloud SQL service account access to the bucket where the file is saved.
In Cloud Shell, run the following:
gcloud sql instances describe [INSTANCE_NAME]
In the output, look for the field "serviceAccountEmailAddress" and copy the service account email.
Then, again in Cloud Shell, run the following:
gsutil iam ch serviceAccount:[SERVICE_ACCOUNT_EMAIL]:legacyBucketWriter gs://[BUCKET_NAME]
gsutil iam ch serviceAccount:[SERVICE_ACCOUNT_EMAIL]:objectViewer gs://[BUCKET_NAME]
That should give the service account permission to access the bucket and retrieve the file; also, here is the guide on doing the import. Keep in mind that doing the import will overwrite all the data on the DB.
Also remember that:
You cannot import a database that was exported from a higher version of SQL Server. For example, if you exported a SQL Server 2017 Enterprise version, you cannot import it into a SQL Server 2017 Standard version.
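Once the permissions are in place, the import itself can also be run from Cloud Shell instead of through Management Studio; a sketch with placeholder instance, bucket, file and database names:
gcloud sql import bak [INSTANCE_NAME] gs://[BUCKET_NAME]/[FILE_NAME].bak --database=[DATABASE_NAME]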

How can I back up an Azure DB to local disk?

I have tried using the Export Data-tier Application option, but I get errors in all of the output steps:
Exporting Database
Extracting Schema
Extracting Schema from database
and there are no details about why it failed.
Is there a way I can back it up on Azure and copy the bacpac file, or do it with PowerShell?
You can use SqlPackage to create a backup (export a bacpac) of your Azure SQL Database on your local drive.
sqlpackage.exe /Action:Export /ssn:tcp:sqlftpbackupserver.database.windows.net /sdn:sqlftpbackupdb /su:alberto /tf:c:\sql\sqlftpbackup.bacpac /sp:yourpwd /p:Storage=File
In the example above, we export (/Action:Export) from the Azure SQL server named sqlftpbackupserver.database.windows.net, where the source database is sqlftpbackupdb and the source user is alberto. The target file we export to is c:\sql\sqlftpbackup.bacpac, /sp specifies the password of the Azure SQL user, and /p:Storage=File tells SqlPackage to use a file as backing storage while it runs.
Another example is:
SqlPackage /Action:Export /SourceServerName:SampleSQLServer.sample.net,1433
/SourceDatabaseName:SampleDatabase /TargetFile:"F:\Temp\SampleDatabase.bacpac"
You can also try backing up the database to Blob Storage from the Portal, then downloading it to your local disk.
For PowerShell, here's the command example from Backup SQL Azure database to local disk only:
C:\PS>Backup-Database -Name "database1" -DownloadLocation "D:\temp" -Server "mydatabaseserver" -UserName "username" -Password "password" -Verbose
For more details, see Backup SQL Azure Database.
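If you would rather stay with the standard Az cmdlets, a rough sketch of the export-to-blob-then-download route (resource group, server, storage account and credential values are placeholders):
# Kick off a bacpac export of the Azure SQL database into blob storage
$exportRequest = New-AzSqlDatabaseExport -ResourceGroupName "MyRg" -ServerName "myserver" `
    -DatabaseName "mydb" -StorageKeyType "StorageAccessKey" -StorageKey $storageKey `
    -StorageUri "https://mystorageacct.blob.core.windows.net/backups/mydb.bacpac" `
    -AdministratorLogin $sqlAdmin -AdministratorLoginPassword $sqlAdminPassword
# Check the export status (repeat until it reports Succeeded)
Get-AzSqlDatabaseImportExportStatus -OperationStatusLink $exportRequest.OperationStatusLink
# Download the finished bacpac to the local disk
$ctx = New-AzStorageContext -StorageAccountName "mystorageacct" -StorageAccountKey $storageKey
Get-AzStorageBlobContent -Container "backups" -Blob "mydb.bacpac" -Destination "D:\temp\mydb.bacpac" -Context $ctx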
Here's a tutorial, How to backup Azure SQL Database to Local Machine, that covers almost all of the ways to back up an Azure SQL database to local disk.
Hope this helps.

How to run a shell script from Azure Data Factory

How do I run a shell script from Azure Data Factory? Inside the shell script I am trying to execute an .hql file like below:
/usr/bin/hive -f "wasbs://hivetest@batstorpdnepetradev01.blob.core.windows.net/hivetest/extracttemp.hql" >
wasbs://hivetest@batstorpdnepetradev01.blob.core.windows.net/hivetest/extracttemp.csv
My .hql file is stored in Blob Storage, and I want to execute it, collect the result into a .csv file, and store that back in Blob Storage. The whole thing is wrapped in a shell script, which is also in Blob Storage. Now I want to execute it from Azure Data Factory in a Hive activity. Help will be appreciated.
You could use the Hadoop Hive activity in ADF. Please take a look at this doc, and you could build your pipeline with the ADF V2 UI.
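For orientation, a rough outline of what the Hive activity could look like in the pipeline JSON, with placeholder linked-service names for the HDInsight cluster and the blob account that holds the .hql file:
{
    "name": "RunExtractTempHql",
    "type": "HDInsightHive",
    "linkedServiceName": {
        "referenceName": "HDInsightLinkedService",
        "type": "LinkedServiceReference"
    },
    "typeProperties": {
        "scriptPath": "hivetest/extracttemp.hql",
        "scriptLinkedService": {
            "referenceName": "AzureBlobStorageLinkedService",
            "type": "LinkedServiceReference"
        }
    }
}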