Is there any way to automate key rotation in Azure PostgreSQL?

I want to automate the process of rotating the encryption key for Azure Database for PostgreSQL. There are no commands available in PowerShell or the CLI for rotating or updating the key.
Is there any other way to rotate the key with minimal manual intervention?

The encryption key should be stored in Azure Key Vault. You can then rotate the key using PowerShell, and you can also create an Azure Function in PowerShell to automate it (one way is to use a timer trigger to schedule the execution).
https://learn.microsoft.com/en-us/azure/postgresql/howto-data-encryption-portal
https://learn.microsoft.com/en-us/azure/key-vault/keys/quick-create-powershell

Related

Ways to import data into AzureSQL PaaS from Azure Blob Storage

All,
I have to BULK INSERT data into Azure SQL from an Azure Blob Storage account. I know one way is to use SAS keys, but are there more secure ways to load data from T-SQL?
For example, is there a way to use the user's AAD account to connect to the storage? Would Managed Identity work? I have not come across an example on the Internet that uses anything other than SAS keys.
Gopi
Azure Data Factory generally serves this purpose. You can build a pipeline that grabs data from Blob Storage and massages it / loads it into SQL; that is pretty much what it's designed for. However, if you do not wish to use that,
the recommended way is SAS, because a SAS can be temporary and revoked at any time. Why do you think SAS is less secure?
As per the documentation (https://learn.microsoft.com/en-us/sql/t-sql/statements/create-external-data-source-transact-sql?view=sql-server-ver15#credential--credential_name), if you create an external data source of type BLOB_STORAGE, the identity/credential MUST be a SAS, as no other authentication type is supported. That means you cannot use any other auth method to reach Blob Storage from T-SQL.
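To make the SAS-based approach concrete, here is a minimal T-SQL sketch. It is only an illustration, not code from the question: the credential, data source, target table, and file path are placeholder names, and it assumes a CSV file with a header row.

    -- Minimal sketch of a SAS-based BULK INSERT from Azure Blob Storage.
    -- All object names, the SAS token, and the file path are placeholders.

    -- A database master key is required once per database before creating a scoped credential.
    -- CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password>';

    CREATE DATABASE SCOPED CREDENTIAL BlobSasCredential
    WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
         SECRET   = '<sas-token-without-leading-?>';

    CREATE EXTERNAL DATA SOURCE MyBlobStorage
    WITH ( TYPE = BLOB_STORAGE,
           LOCATION = 'https://<account>.blob.core.windows.net/<container>',
           CREDENTIAL = BlobSasCredential );

    BULK INSERT dbo.TargetTable
    FROM 'data/myfile.csv'
    WITH ( DATA_SOURCE = 'MyBlobStorage',
           FORMAT = 'CSV',
           FIRSTROW = 2 );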

Connect ADF to ServiceNow URL

I am fairly new to Azure, but I have been doing ETL for quite some time now. I want to connect ADF to ServiceNow to bring lists into our SQL data warehouse. Does anyone have any good articles, or know what the settings are, to achieve this?
You could use the Copy activity in ADF, which supports ServiceNow as an input source and Azure Synapse Analytics (formerly Azure SQL Data Warehouse) as an output sink.
Since you are new to ADF, there are three elements you should get to know when you execute a Copy activity:
1. Linked Service: https://learn.microsoft.com/en-us/azure/data-factory/concepts-linked-services
2. Dataset: https://learn.microsoft.com/en-us/azure/data-factory/concepts-datasets-linked-services
3. Pipeline: https://learn.microsoft.com/en-us/azure/data-factory/concepts-pipeline-execution-triggers
If you want to execute the pipeline on a schedule, you can also add a time trigger to that specific pipeline.

Azure Data Factory and PowerShell

[Image: the three Azure Data Factory migration methods referred to below]
Hi Stack team,
Actually, I plan to migrate my SQL Server 2012 databases to Azure SQL Data Warehouse using the Azure Data Factory approach.
But the problems are:
1) My database size is 4.5 TB.
2) In this approach there are 3 methods; the details of those methods are in the attached image. I plan to use the 3rd method for the migration (3. Using Azure Data Factory and PowerShell (entire database - ADF)).
So please share links related to the above method, and tell me whether this migration is possible or not. If it is possible, please tell me how to do it.
Please refer to the link below. There is a UX solution in Azure Data Factory that specifically targets migration from SQL Server to Azure SQL Data Warehouse. It is a wizard-based UX that is very easy to follow. It will automatically create the tables in Azure SQL Data Warehouse and migrate the data using PolyBase, which is the most efficient way to load data into SQL DW.
https://learn.microsoft.com/en-us/azure/sql-data-warehouse/sql-data-warehouse-load-with-data-factory
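For context on what that wizard generates under the hood, here is a rough, hypothetical sketch of the equivalent hand-written PolyBase load in T-SQL; every object name, path, column, and the storage key are placeholders and not part of the original question.

    -- Hypothetical sketch of a PolyBase load into Azure SQL Data Warehouse.
    -- The ADF wizard creates the equivalent objects for you; all names below are placeholders.
    CREATE MASTER KEY;  -- required once per database (add ENCRYPTION BY PASSWORD where required)

    CREATE DATABASE SCOPED CREDENTIAL StageCredential
    WITH IDENTITY = 'user', SECRET = '<storage-account-key>';

    CREATE EXTERNAL DATA SOURCE StageBlob
    WITH ( TYPE = HADOOP,
           LOCATION = 'wasbs://<container>@<account>.blob.core.windows.net',
           CREDENTIAL = StageCredential );

    CREATE EXTERNAL FILE FORMAT CsvFormat
    WITH ( FORMAT_TYPE = DELIMITEDTEXT,
           FORMAT_OPTIONS ( FIELD_TERMINATOR = ',', FIRST_ROW = 2 ) );

    CREATE EXTERNAL TABLE dbo.SalesOrders_ext
    ( OrderId int, OrderDate date, Amount decimal(18,2) )
    WITH ( LOCATION = '/salesorders/', DATA_SOURCE = StageBlob, FILE_FORMAT = CsvFormat );

    -- CTAS loads the staged data into a distributed internal table in parallel.
    CREATE TABLE dbo.SalesOrders
    WITH ( DISTRIBUTION = HASH(OrderId) )
    AS SELECT * FROM dbo.SalesOrders_ext;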

PostgreSQL: Encrypt Column With pgcrypto

I need to encrypt some columns in a PostgreSQL 9.6 database. The data being encrypted is inherently sensitive; however, the data are not passwords or other authentication credentials. This data will need to be decrypted for statistical analysis and consumption by users.
After reading several questions and answers:
Storing encrypted data in Postgres
https://dba.stackexchange.com/questions/24370/how-to-use-aes-encryption-in-postgresql
https://dba.stackexchange.com/questions/59942/secure-postgresql-database-encryption
... and considering the comments on those posts, it seems the biggest problem with using the pgcrypto module is the storage of keys in the same database.
This raises the question:
Is it consistent with best practices to store the key in a different database and access it via a foreign data wrapper, such as postgres_fdw?
Secret storage is a common issue when using cryptographic mechanisms.
pgcrypto does not provide key storage; you are free to store the key wherever you want and protect it as best you can.
Storing the key in another database managed by the same DBA does not provide much security, since the DBA can access it the same way.
Ideally, you would store the key in a secure vault and have your application request it in order to construct the queries. It will still be visible to the DBA while the query is running, through select * from pg_stat_activity.
You can set the key for session-wide use with set session my.vars.cryptokey = 'secret'; and then use it in your queries with the syntax current_setting('my.vars.cryptokey')::text.
To be (almost) transparent from the application's point of view, PostgreSQL rules may help translate secure_column into a call to the decrypt function with the session-stored key. For inserts, a pre-insert trigger would be required. A short worked example follows.
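The sketch below assumes the pgcrypto extension is available and uses a hypothetical customer table with a bytea column ssn_enc; both names are made up for this illustration.

    -- Minimal sketch of the session-key pattern with pgcrypto.
    -- Assumes pgcrypto is installed; table and column names are hypothetical.
    CREATE EXTENSION IF NOT EXISTS pgcrypto;

    CREATE TABLE customer (
        id      integer PRIMARY KEY,
        ssn_enc bytea           -- ciphertext produced by pgp_sym_encrypt
    );

    -- Provide the key for this session only; it is not stored in the database.
    SET SESSION my.vars.cryptokey = 'secret';

    -- Encrypt on write using the session key.
    INSERT INTO customer (id, ssn_enc)
    VALUES (1, pgp_sym_encrypt('123-45-6789', current_setting('my.vars.cryptokey')::text));

    -- Decrypt on read using the same session key.
    SELECT id,
           pgp_sym_decrypt(ssn_enc, current_setting('my.vars.cryptokey')::text) AS ssn
    FROM customer;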

Check for the existence of an Azure Blob using TSQL

I currently have a T-SQL function called "FileExists" that checks for the existence of a file on disk. However, we are moving the database to Azure SQL Database and the files to Azure Blob Storage, so this function needs to be rewritten (if possible). How can I check the Blob Storage container for a particular SubBlob and FileName combination using T-SQL?
Of course, you cannot execute a T-SQL query directly against Azure Blob Storage. A possible workaround is to use xp_cmdshell to run a PowerShell script that calls Get-AzureStorageBlob to access the blob and get its data, but it is much easier to do the whole task in .NET code rather than in SQL.
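A rough, hypothetical sketch of that workaround is below. Note that xp_cmdshell is only available on a full SQL Server instance (it is not available in Azure SQL Database), and the account, key, container, and blob names are placeholders.

    -- Hypothetical sketch: check whether a blob exists via xp_cmdshell and PowerShell.
    -- Requires xp_cmdshell to be enabled and the Azure storage PowerShell module
    -- installed on the SQL Server host; all names and the key are placeholders.
    DECLARE @ps nvarchar(4000) = N'powershell.exe -NoProfile -Command "'
        + N'$ctx = New-AzureStorageContext -StorageAccountName ''<account>'' -StorageAccountKey ''<key>''; '
        + N'if (Get-AzureStorageBlob -Container ''<container>'' -Blob ''<subblob>/<filename>'' '
        + N'-Context $ctx -ErrorAction SilentlyContinue) { ''1'' } else { ''0'' }"';

    DECLARE @out TABLE (line nvarchar(4000));
    INSERT INTO @out EXEC master..xp_cmdshell @ps;

    SELECT CASE WHEN EXISTS (SELECT 1 FROM @out WHERE line = N'1')
                THEN 1 ELSE 0 END AS BlobExists;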