Is there a way to use my google instance in big query? - google-cloud-sql

I have set up a Google Cloud SQL instance running SQL Server. I have a company laptop, and there seem to be some constraints on connecting to the SQL Server through the Google proxy, as I keep getting x509 certificate errors.
Does Google have its own version of SSMS, e.g. BigQuery, where I can use queries to create/alter tables as I normally would in SSMS?
I would like to perform something as simple as
CREATE TABLE [datam].[CashflowAgg_small] (
[JobID] INT,
[ReportingPeriod_ID] INT,
[EntityHierarchy_ID] INT
)
Is something like this possible, or do I need to connect with SSMS?

There isn't a built-in management console for MySQL like there is for BigQuery.
You can configure one of the following workbenches.
The x509 errors with the SQL proxy are "certificate signed by unknown authority" errors. They usually happen when the SSL configuration was reset: the reset generates a new CA certificate, so your existing client key no longer matches. In the Cloud SQL Console UI, under Connections, reset the SSL configuration and create a new client certificate.
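If you prefer the command line, the same reset can be sketched with the gcloud CLI. This assumes gcloud is installed and authenticated; `<INSTANCE>` and the `my-client` certificate name are placeholders, not values from the question:

```shell
# Reset the SSL configuration; this rotates the server CA.
gcloud sql instances reset-ssl-config <INSTANCE>

# Create a new client certificate; the private key is written to client-key.pem.
gcloud sql ssl client-certs create my-client client-key.pem --instance=<INSTANCE>

# Save the matching client certificate and the new server CA for your clients.
gcloud sql ssl client-certs describe my-client --instance=<INSTANCE> \
  --format='value(cert)' > client-cert.pem
gcloud sql instances describe <INSTANCE> \
  --format='value(serverCaCert.cert)' > server-ca.pem
```

After this, point the proxy or your client at the freshly downloaded `server-ca.pem`, `client-cert.pem`, and `client-key.pem`.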

Related

Azure Devops SQL DacpacTask failing for Azure Key Vault

I'm trying to deploy a dacpac to an Azure SQL Database with Always Encrypted enabled. The DevOps agent is running on a self-hosted VM with sqlpackage.exe version 19 (build 16.0.5400.1) installed on it.
I've been able to trace down the issues by adding /diagnostics as an argument to the task and the exception that is raised is:
Unexpected exception executing KeyVault extension 'Object reference not set to an instance of an object.' at Microsoft.SqlServer.Dac.KeyVault.DacKeyVaultAuthenticator.Validate(IList`1 keyVaultUrls, CancellationToken cancelToken)
Anybody have a suggestion on how to solve this?
Please check the points below:
1. Microsoft.SqlServer.Dac.KeyVault.DacKeyVaultService provides a service for discovering and configuring a Microsoft.SqlServer.Dac.KeyVault.KeyVaultAuthenticator to handle key vault access requests. These requests will occur during deployment if an encrypted table is being altered. It also supports initialization of general key vault support in an application.
2. If you store your column master keys in a key vault and you are using access policies for authorization:
Your application's identity needs the following access policy permissions on the key vault: get, unwrapKey, and verify.
A user managing keys for Always Encrypted needs the following access policy permissions on the key vault: create, get, list, sign, unwrapKey, wrapKey, and verify.
See Create & store column master keys for Always Encrypted - SQL Server | Microsoft Docs
3. To publish a DAC package when Always Encrypted is set up in the DACPAC and/or in the target database, you might need some or all of the permissions below, depending on the differences between the schema in the DACPAC and the target database schema:
ALTER ANY COLUMN MASTER KEY, ALTER ANY COLUMN ENCRYPTION KEY, VIEW ANY COLUMN MASTER KEY DEFINITION, VIEW ANY COLUMN ENCRYPTION KEY DEFINITION
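As a hedged sketch of the access-policy and database-permission steps above, the key vault policy can be granted with the Azure CLI and the database permissions with sqlcmd. The vault name, principal ID, server, database, and `deploy_user` login are all hypothetical placeholders:

```shell
# Grant the deployment identity the minimum Always Encrypted key permissions.
az keyvault set-policy --name <VAULT-NAME> --spn <APP-ID> \
  --key-permissions get unwrapKey verify

# Grant the deployment login the DACPAC-related permissions on the target DB.
# deploy_user is a hypothetical login name.
sqlcmd -S myserver.database.windows.net -d mydb -Q "
  GRANT ALTER ANY COLUMN MASTER KEY TO deploy_user;
  GRANT ALTER ANY COLUMN ENCRYPTION KEY TO deploy_user;
  GRANT VIEW ANY COLUMN MASTER KEY DEFINITION TO deploy_user;
  GRANT VIEW ANY COLUMN ENCRYPTION KEY DEFINITION TO deploy_user;"
```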
You also need to enable the Azure virtual machines check box in the key vault's access policies.
References:
Configure column encryption using Always Encrypted with a DAC package - SQL Server | Microsoft Docs
azure-sql-advanced-deployment-part4.
KeyVaultAuthenticator.Validate(IList, CancellationToken) >> Microsoft.SqlServer.Dac.KeyVault Namespace | Microsoft Docs
I managed to find a solution: I downgraded sqlpackage.exe. If I understand it correctly, version 19 targets SQL Server compatibility level 160, which ships with SQL Server 2022. Version 18 works with compatibility level 150, which my Azure database is currently set to.

Local Postgres database to Google Cloud PostgreSQL Github

I would like to build a Google Cloud PostgreSQL database using the instructions here
I was able to successfully create the Postgres databases with appropriate tables and views locally.
What do I need to do in order to get the data on Google Cloud PostgreSQL? My goal is to have remote access to this data.
You have two options. The first is to use the Cloud SQL Proxy, as described here. As the shared links say, the Cloud SQL Proxy provides secure access to your instances without the need for authorized networks or for configuring SSL.
The second option is to configure access to your instance under Authorized networks, with or without SSL. The complete steps are listed here.
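Beyond connecting, one common way to move the locally created data into the Cloud SQL instance is a pg_dump export followed by a Cloud Storage import. This is a sketch; `mydb`, `my-bucket`, and `my-instance` are placeholders, and the instance's service account needs read access to the bucket:

```shell
# 1. Dump the local database in plain SQL format.
pg_dump --no-owner --format=plain mydb > mydb.sql

# 2. Copy the dump to a Cloud Storage bucket.
gsutil cp mydb.sql gs://my-bucket/mydb.sql

# 3. Import the dump into the Cloud SQL instance.
gcloud sql import sql my-instance gs://my-bucket/mydb.sql --database=mydb
```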
You could connect to Cloud SQL from a local test environment using the Cloud SQL Proxy. See quickstart-proxy-test.
The workflow is:
Your Application(Running Locally) => cloud sql proxy (Running locally) => GCP remote Cloud SQL service
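That workflow might look like the following, assuming the v1 `cloud_sql_proxy` binary; the connection name (`PROJECT:REGION:INSTANCE`) and database names are placeholders:

```shell
# Start the proxy; it listens on localhost:5432 and forwards
# traffic to the remote instance over an encrypted tunnel.
./cloud_sql_proxy -instances=my-project:us-central1:my-instance=tcp:5432 &

# Your application (or psql) then connects as if the database were local.
psql "host=127.0.0.1 port=5432 user=postgres dbname=mydb"
```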

Is there an option to use a public, trusted CA for Secure Cloud SQL communication?

Currently, when you create a Cloud SQL instance, a self-signed certificate is created, per the Google Cloud SQL documentation.
Is there a way to choose a different Certificate Authority so my communications are signed by a trusted third party?
Below is a sample of the current cert created by Google when the instance is created.
Common Name: Google Cloud SQL Server CA
Organization: Google, Inc
Country: US
Valid From: November 26, 2018
Valid To: November 23, 2028
Issuer: Google Cloud SQL Server CA, Google, Inc
Serial Number: 0 (0x0)
You can connect to your instances using your own CA certificate.
If you have a MySQL instance, follow the steps in this page of the Google Cloud SQL documentation, where it says Connect to your Cloud SQL instance using SSL, but skip the creation of a new client certificate in the beginning.
You need to have the certificate in a text file (such as server-ca.pem), which you'll pass via the --ssl-ca=[CERTIFICATE-FILENAME] flag to the mysql command.
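For example (a sketch; the host IP and user are placeholders):

```shell
mysql --host=<INSTANCE-IP> --user=root -p --ssl-ca=server-ca.pem
```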
For PostgreSQL, follow these steps instead, again skipping the beginning and using your own CA certificate. You then point psql at the cert file by including sslrootcert=[CERTIFICATE-FILENAME] in its connection string.
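For example (a sketch; the host IP, user, and database are placeholders):

```shell
psql "sslmode=verify-ca sslrootcert=server-ca.pem host=<INSTANCE-IP> user=postgres dbname=postgres"
```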

pgAdmin access control to PostgreSQL

I am interested in barring pgAdmin access to my PostgreSQL server from any station other than the server itself. Is it possible to do this using pg_hba.conf? The PostgreSQL server should still allow my application to access the server from other stations.
No, this isn't possible. Nor is it sensible, since the client (mode of access) isn't the issue, but what you do on the connection.
If the user managed to trick your app into running arbitrary SQL via SQL injection or whatever, you'd be back in the same position.
Instead, set your application up to use a restricted user role that:
is not a superuser
does not own the tables it uses
has only the minimum permissions GRANTed to it that it needs
and preferably also add guards such as triggers to preserve data consistency within the DB. This will help mitigate the damage that can be done if someone extracts database credentials from the app and uses them directly via a SQL client.
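A minimal sketch of such a restricted role in Postgres; `app_user`, `mydb`, and the table names are hypothetical:

```sql
-- A plain login role with no special privileges; it does not own the tables.
CREATE ROLE app_user LOGIN PASSWORD 'change-me'
  NOSUPERUSER NOCREATEDB NOCREATEROLE;

-- Grant only what the application actually needs.
GRANT CONNECT ON DATABASE mydb TO app_user;
GRANT USAGE ON SCHEMA public TO app_user;
GRANT SELECT, INSERT, UPDATE ON customer, orders TO app_user;
```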
You can also make it harder for someone with your app's binary etc to extract the credentials and use them to connect to postgres directly by:
using md5 authentication
if you use a single db role shared between all users, either (a) don't do that or (b) store a well-obfuscated copy of the db password somewhere non-obvious in the configuration, preferably encrypted against the user's local credentials
using sslmode=verify-full and a server certificate
embedding a client certificate in your app and requiring that it be presented for the connection to be permitted by the server (see client certificates)
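The md5 and client-certificate points can be combined in pg_hba.conf. A sketch, with a hypothetical database and role (clientcert=1 is the pre-PostgreSQL-12 spelling; later versions use clientcert=verify-ca):

```
# Require TLS, a password, and a valid client certificate for the app role.
hostssl  mydb  app_user  0.0.0.0/0  md5  clientcert=1
```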
Really, though, if you can't trust your users not to be actively malicious and run DELETE FROM customer; and the like, you'll need middleware to guard the SQL connection and apply further limits: rate-limit access, disallow bulk updates, etc.

Microsoft Appfabric configuration to SQL server

We are getting the error below when trying to configure AppFabric.
We created an empty database.
We granted the user ID running the configuration sysadmin access on the SQL Server.
The last step of the configuration shows this error. It looks like the machine-specific SQL login is not getting created for the AppFabricCache database.
Could not set permissions on configuration Store: ErrorCode:Substatus : No such host is known. Refer to product documentation for manually configuring the store permissions