Connecting to Snowflake from Databricks through SSO (single sign-on)

We are planning to use Databricks as our compute platform and Snowflake as our DWH system, with SSO-based login for both and our corporate ADFS as the IdP. We are still in the planning phase.
We wanted to check whether having SSO enabled on Snowflake will restrict our ability to run jobs on Databricks that interact with Snowflake for reading/writing data. If so, what are our alternatives for better login security?
If this setup is actually possible, can someone please point to any documentation on connecting to Snowflake from Databricks through SSO? I didn't really find anything on the topic. The document below mentions that MFA, SSO, or any browser-based login won't work with Snowflake's Spark connector; I'm not sure whether that's relevant to this use case.
https://docs.snowflake.com/en/user-guide/spark-connector-use.html#authenticating-through-a-browser-is-not-supported

For the Spark connector, use OAuth for authentication.
It can be configured with Microsoft Azure AD; see here.
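As a rough illustration, here is a minimal sketch of what that can look like from a Databricks notebook once an OAuth access token has been obtained from Azure AD; the secret scope, account URL, and object names below are placeholders, not a verified configuration.

```python
# Hedged sketch: reading from Snowflake with the Spark connector using an
# External OAuth token instead of a username/password. Assumes the token was
# already issued by Azure AD (e.g. via the client-credentials flow) and stored
# in a Databricks secret scope -- the scope/key names here are illustrative.

oauth_token = dbutils.secrets.get(scope="snowflake", key="oauth-access-token")

sf_options = {
    "sfURL": "<account>.snowflakecomputing.com",  # placeholder account URL
    "sfDatabase": "ANALYTICS",
    "sfSchema": "PUBLIC",
    "sfWarehouse": "COMPUTE_WH",
    "sfRole": "ANALYST",
    "sfAuthenticator": "oauth",                   # tell the connector to use OAuth
    "sfToken": oauth_token,                       # access token from the IdP
}

df = (
    spark.read
    .format("snowflake")          # connector is bundled with Databricks runtimes
    .options(**sf_options)
    .option("dbtable", "MY_TABLE")
    .load()
)

df.show(5)
```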

Related

Google Cloud SQL Postgres Vs Self Hosted Postgres using GCP Compute instances: HIPAA Compliance

This question is about infosec, data privacy, specifically HIPAA compliance on GCP.
Are there any advantages to self-managing a Postgres server (built on GCP Compute instances using, let's say, Terraform) versus using the managed offering, i.e. Cloud SQL?
Thanks in advance
Google Cloud SQL Postgres is a fully managed option for deploying PostgreSQL to Google Cloud. The fully managed option is convenient, but is mainly suitable for cloud-native applications, or applications rebuilt for the cloud.
It has built-in encryption for database tables, temporary files, backups, and any data transferred over Google's internal networks, as well as secure connections via SSL/TLS or the Cloud SQL Proxy.
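As a hedged illustration of the secure-connection point, here is a minimal Python sketch using the Cloud SQL Python Connector (a library alternative to the standalone Cloud SQL Proxy); the instance name, credentials, and database are placeholders.

```python
# Hedged sketch: connecting to Cloud SQL for PostgreSQL through the
# Cloud SQL Python Connector, which wraps the connection in an encrypted
# tunnel so no client-side SSL certificates have to be managed by hand.
from google.cloud.sql.connector import Connector

connector = Connector()

conn = connector.connect(
    "my-project:us-central1:my-postgres-instance",  # placeholder instance connection name
    "pg8000",                                       # pure-Python Postgres driver
    user="app_user",
    password="app_password",
    db="app_db",
)

cur = conn.cursor()
cur.execute("SELECT version()")
print(cur.fetchone())

conn.close()
connector.close()
```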
Update1
Since you are referring to HIPAA, you can check this guide for HIPAA compliance on Google Cloud. Cloud SQL encrypts data at rest using the 256-bit Advanced Encryption Standard (AES-256), or better, with symmetric keys: the same key is used to encrypt the data when it is stored and to decrypt it when it is used. You can also use your own encryption keys with CMEK for Cloud SQL.
You also mentioned infosec. I have not completely understood the term; I assume you are referring to protecting information from vulnerabilities. You can use Cloud Armor, a network security service that provides defenses against DDoS and application attacks such as cross-site scripting (XSS) and SQL injection (SQLi).
Self-hosted Postgres gives you full control over your PostgreSQL database on GCP, letting you fine-tune server parameters, modify database configuration, and tune performance, just like in a local deployment.
Update2
As per this thread, it seems that PostgreSQL is not HIPAA compliant.
For encryption at rest on PostgreSQL, you can use PostgreSQL TDE and pgcrypto, as discussed in this similar thread.
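As a hedged sketch of the pgcrypto approach (the table, column, and key handling here are purely illustrative; in practice the key should come from a secrets manager, not the source code):

```python
# Hedged sketch: column-level encryption with pgcrypto, driven from Python
# via psycopg2. Table/column names and the passphrase are illustrative only.
import psycopg2

conn = psycopg2.connect(host="10.0.0.5", dbname="ehr", user="app", password="...")
cur = conn.cursor()

# One-time setup: enable the extension (requires sufficient privileges).
cur.execute("CREATE EXTENSION IF NOT EXISTS pgcrypto;")
cur.execute("CREATE TABLE IF NOT EXISTS patients (id serial PRIMARY KEY, name_enc bytea);")

# Encrypt on write ...
cur.execute(
    "INSERT INTO patients (name_enc) VALUES (pgp_sym_encrypt(%s, %s))",
    ("Jane Doe", "passphrase-from-secret-manager"),
)

# ... and decrypt on read.
cur.execute(
    "SELECT pgp_sym_decrypt(name_enc, %s) FROM patients",
    ("passphrase-from-secret-manager",),
)
print(cur.fetchall())

conn.commit()
conn.close()
```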
For self-hosted Postgres, you can also use Shielded VMs, which protect enterprise workloads from threats like remote attacks, privilege escalation, and malicious insiders.
I am not sure about your application requirements, but based on my understanding of both Cloud SQL and self-hosted Postgres, I would recommend Cloud SQL as the best option, as it is fully managed by Google and also supports HIPAA compliance and encryption.
For more information about the pros and cons of Google Cloud SQL Postgres and self-hosted Postgres, check this document.

Connecting to MS Forms connector using Service Principal within logic app

I am creating a logic app that will trigger when a form request is submitted.
The MS Form connector requires me to sign in. This is acceptable during development, but we have a lot of logic apps and so use DevOps to automate deployment.
With the current connector, after deployment we still have to:
manually open the logic app in the portal.
connect using authorized credentials.
save the logic app.
This manual process completely defeats the point of using DevOps with Logic Apps.
It's a similar issue when using the Outlook connector.
Is there a way to supply service principal credentials to these connectors, so that they are correct at deployment time and require no manual intervention?
It seems that signing in to the MS Forms connector with a service principal is not supported. Connectors that support service principal authentication have a "Connect with Service Principal" option, like Azure Data Explorer. You can give your voice on this feedback to promote the feature.
API connections with OAuth authentication, like the Office 365 and Microsoft Teams connectors, require manual consent. Unfortunately, at this point in time, authentication for those cannot be fully automated.
Here is a ticket you can refer to.

Connecting Looker to Snowflake with SSO enabled

Hi, we're enabling federated authentication in Snowflake, which means we'll no longer allow username/password logins.
Everything that connects to Snowflake can use a .pem certificate except Looker.
Looker has no such option; you can use either login/password or OAuth.
Snowflake support suggested an SSH tunnel, but I don't see how that would help.
Brandon with JumpCloud here — there's actually an ongoing discussion about integrating JumpCloud with Snowflake, in the #sso channel of our public Slack workspace. One admin has achieved success and is offering help. If that sounds relevant, feel free to join in — https://join.slack.com/t/jumpcloudlounge/shared_invite/zt-esobabj4-Ytqy4ZSTo6ZONoALoGHAKA.
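For what it's worth, the OAuth route mentioned in the question is typically enabled on the Snowflake side with a security integration. A hedged sketch of that setup, run here through the Snowflake Python connector (account, credentials, and the redirect URI are placeholders to adapt from Looker's and Snowflake's documentation):

```python
# Hedged sketch: creating a Snowflake OAuth security integration for Looker.
# Account, user, and the Looker redirect URI below are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",
    user="admin_user",
    password="...",
    role="ACCOUNTADMIN",   # creating security integrations needs a privileged role
)

conn.cursor().execute("""
    CREATE SECURITY INTEGRATION IF NOT EXISTS looker_oauth
      TYPE = OAUTH
      ENABLED = TRUE
      OAUTH_CLIENT = LOOKER
      OAUTH_REDIRECT_URI = 'https://<your-instance>.looker.com/external_oauth/redirect'
""")

conn.close()
```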

Do Amazon services support on-premise hosting?

We intend to develop an enterprise bot using Amazon Lex that will fetch responses from a SQL Server and display results along with a visual presentation. Does Lex support on-premise deployment?
Will there be any challenges in using Lex vs Google Dialogflow (formerly known as api.ai)?
Please suggest.
The bot agent you develop will reside on AWS; you can access it in the AWS Lex console, and you cannot host it on-premise.
You can, however, use webhooks, which you can host on-premise.
You can use Amazon Lex to understand the user's query and match the intent; once the intent is matched, you can perform the operations using if-else conditions and get data from your SQL Server.
This way none of your data will be on AWS.
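To make the "match the intent, then fetch from SQL Server" flow concrete, here is a hedged sketch of a fulfillment handler in the Lex V1 Lambda event shape, using pyodbc; the intent name, slot, table, and connection details are all placeholders.

```python
# Hedged sketch: a Lex (V1-style) fulfillment handler that branches on the
# matched intent and pulls data from SQL Server via pyodbc. Intent/slot/table
# names and the connection string are placeholders.
import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=onprem-sql.example.local;DATABASE=Sales;UID=bot;PWD=..."
)

def close_response(message):
    # Minimal Lex V1 "Close" response with a plain-text message.
    return {
        "dialogAction": {
            "type": "Close",
            "fulfillmentState": "Fulfilled",
            "message": {"contentType": "PlainText", "content": message},
        }
    }

def lambda_handler(event, context):
    intent = event["currentIntent"]["name"]
    slots = event["currentIntent"]["slots"]

    if intent == "GetOrderStatus":                      # hypothetical intent
        with pyodbc.connect(CONN_STR) as conn:
            row = conn.cursor().execute(
                "SELECT status FROM Orders WHERE order_id = ?",
                slots.get("OrderId"),                   # hypothetical slot
            ).fetchone()
        status = row[0] if row else "not found"
        return close_response(f"Order status: {status}")

    return close_response("Sorry, I can't handle that request yet.")
```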

Get metrics via API

I'm using Azure SQL, and there is a page in there with metrics.
From what I understood, the new Azure Management Portal consumes only public APIs. What I'm trying to find out is how to access these metrics via a REST or SOAP API. I've searched through the MSDN documentation but couldn't come up with anything.
Anyone have any ideas?
I presume Microsoft did not provide an Azure SQL Database monitoring REST or SOAP API because it would not be used much.
DBAs can connect to the Azure SQL Database and gather all the necessary statistics via dynamic management views, which are quite powerful.
However, I do not have any article/documentation confirming my presumption.
Try the new sys.event_log and sys.database_connection_stats DMVs instead.
See: Announcing: New System Views for Windows Azure SQL Database
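As a hedged illustration of querying those DMVs (they live in the logical server's master database; the server name, credentials, and driver are placeholders):

```python
# Hedged sketch: pulling connection statistics from the Azure SQL logical
# server's master database, where sys.database_connection_stats and
# sys.event_log are exposed. Server name and credentials are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver.database.windows.net;DATABASE=master;"
    "UID=admin_user;PWD=..."
)

cursor = conn.cursor()
cursor.execute(
    "SELECT TOP (10) * FROM sys.database_connection_stats ORDER BY start_time DESC"
)
for row in cursor.fetchall():
    print(row)

conn.close()
```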