Is there an option to use a public, trusted CA for Secure Cloud SQL communication? - google-cloud-sql

Currently, when you create a Cloud SQL instance, a self-signed certificate is created, per the Google Cloud SQL documentation.
Is there a way to choose a different Certificate Authority so my communications are signed by a trusted third party?
Below is a sample of the current cert created by Google when the instance is created.
Common Name: Google Cloud SQL Server CA
Organization: Google, Inc
Country: US
Valid From: November 26, 2018
Valid To: November 23, 2028
Issuer: Google Cloud SQL Server CA, Google, Inc
Serial Number: 0 (0x0)

You can connect to your instances using your own CA certificate.
If you have a MySQL instance, follow the steps on this page of the Google Cloud SQL documentation, in the section "Connect to your Cloud SQL instance using SSL", but skip the creation of a new client certificate at the beginning.
You need to have the certificate in a text file (such as server-ca.pem) that you pass to the mysql command with the --ssl-ca=[CERTIFICATE-FILENAME] flag.
For PostgreSQL, follow these steps instead, again skipping the beginning and using your own CA cert. The cert file is then referenced when running psql, which takes a connection string as an argument; include sslrootcert=[CERTIFICATE-FILENAME] in that string.
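For illustration, a minimal sketch of those connection commands, assuming a hypothetical instance IP of 203.0.113.5 and the CA saved as server-ca.pem:
# MySQL: verify the server certificate against the CA file
mysql -u root -p -h 203.0.113.5 --ssl-ca=server-ca.pem
# PostgreSQL: put sslrootcert (and sslmode) in the connection string passed to psql
psql "host=203.0.113.5 port=5432 user=postgres dbname=postgres sslmode=verify-ca sslrootcert=server-ca.pem"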

Related

Azure DevOps on-premises cannot verify Kubernetes service connection

I am creating a new Kubernetes service connection in Azure DevOps Server 2020 Update 1 via KubeConfig.
When I click Verify the connection, it says Verification Failed with the generic error:
Failed to query service connection API: 'https://ekm.mpu.cz/k8s/clusters/c-qmcrb/api/v1/nodes'. Error Message: 'An error occurred while sending the request.'
Please note that the Kubernetes instance is in another domain.
I suspect the error is because the certs are not imported somewhere on the machine where Azure DevOps is hosted, but I am unsure where. The MS documentation is silent about that as well.
So far I've tried to:
Import CA certs to the MMC under trusted publishers.
Import CA certs under cacerts in JAVA_HOME via keytool.
Import CA certs into azureTrustsStore.jks in JAVA_HOME via keytool.
For all 3 I've checked that the CA certs are imported correctly (see the sketch below), but to no avail. Could you please advise, or point me to the right method for doing this?
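The keytool imports were along these lines (the alias, file name, and store password are placeholders; changeit is only the default for cacerts):
# import the CA cert into the default JVM trust store
keytool -importcert -trustcacerts -alias my-root-ca -file ca.crt -keystore "$env:JAVA_HOME\lib\security\cacerts" -storepass changeit
# same import into the azureTrustsStore.jks keystore
keytool -importcert -trustcacerts -alias my-root-ca -file ca.crt -keystore "$env:JAVA_HOME\lib\security\azureTrustsStore.jks" -storepass changeit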
Additional Info:
While I cannot Verify and Save the connection, I can still Save it and then use it in the pipeline, and it works OK (it successfully connects and executes the commands).
Connection issues can occur for many reasons, but the root cause is often related to an error with one of these items: Network, Authentication, Authorization. You may refer to Basic troubleshooting of cluster connection issues for detailed troubleshooting steps.

Service Fabric, Azure DevOps deployment fails: The specified network password is not correct

I was recently ordered by our IT team to disable the NAT pools on my Service Fabric cluster due to security risks. The only way I could do this was to deploy a new cluster with all its components.
Because this is a test environment, I opted to use a self-signed cert without a password for my cluster; the certificate is in my vault and the cluster is up and running.
The issue I have now is when I try to deploy my application from an Azure Devops Release Pipeline I get the following message:
An error occurred attempting to import the certificate. Ensure that your service endpoint is configured properly with a correct certificate value and, if the certificate is password-protected, a valid password. Error message: Exception calling "Import" with "3" argument(s): "The specified network password is not correct.
I generated the self-signed certificate in Key Vault, downloaded the certificate, and used PowerShell to get the Base64 string for the service connection.
Should I create the certificate myself, with a password?
With the direction of the two comments supplied, I ended up generating a certificate on my local machine using the PowerShell script included with Service Fabric's local runtime.
A small caveat here is to change the key size in the script to a larger key size than the default, because Key Vault does not support 1024-bit keys.
I then exported the PFX from my user certificates, added a password (this is required for the service connection), and imported the new PFX into my Key Vault.
Redeployed my cluster and it worked.
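An alternative, if you don't want to use the Service Fabric script, is a plain PowerShell sketch of the same idea; the DNS name, file name, and password below are placeholders:
# create an exportable self-signed cert with a 2048-bit key (Key Vault rejects 1024-bit keys)
$cert = New-SelfSignedCertificate -DnsName "mycluster.westeurope.cloudapp.azure.com" -CertStoreLocation Cert:\CurrentUser\My -KeyLength 2048 -KeyExportPolicy Exportable
# export it as a password-protected PFX (the service connection requires a password)
$pfxPassword = ConvertTo-SecureString -String "MyPfxPassword" -Force -AsPlainText
Export-PfxCertificate -Cert $cert -FilePath .\cluster.pfx -Password $pfxPassword
The resulting PFX can then be imported into Key Vault and used for both the cluster and the service connection.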

Is there a way to use my Google instance in BigQuery?

I have set up a Google Cloud SQL instance running SQL Server. I have a company laptop, and there seem to be some constraints on connecting to the SQL Server through the Google proxy, as I keep getting x509 certificate errors.
Does Google have its own version of SSMS, e.g. BigQuery, where I can use queries to create/alter tables as I would normally in SSMS?
I would like to perform something as simple as
CREATE TABLE [datam].[CashflowAgg_small] (
[JobID] INT,
[ReportingPeriod_ID] INT,
[EntityHierarchy_ID] INT
)
Is something like this possible, or do I need to connect via SSMS?
There isn't a built-in management console for MySQL like there is for BigQuery.
You can configure one of the external workbenches listed in the documentation instead.
The x509 errors with the SQL proxy are "certificate signed by unknown authority" errors; they usually happen when the SSL configuration was reset, generating a new CA certificate that no longer matches your client key. In the Cloud SQL Console UI, under Connections, reset the SSL configuration and create a new client certificate.
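If you prefer the command line, roughly the same thing with gcloud (instance and certificate names are placeholders):
# reset the SSL configuration: deletes existing client certs and issues a new server certificate
gcloud sql instances reset-ssl-config my-instance
# create a new client certificate; its private key is written to client-key.pem
gcloud sql ssl client-certs create my-client-cert client-key.pem --instance=my-instance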

Deploying a Service Fabric app from Team Services to Azure

I need some help with deploying a Service fabric app from Team Services to Azure.
I’m getting the following error from the Agent in Team Services (see screenshot below):
2018-06-22T13:17:13.3007613Z ##[error] An error occurred attempting to
import the certificate. Ensure that your service endpoint is
configured properly with a correct certificate value and, if the
certificate is password-protected, a valid password.
Error message: Exception calling "Import" with "3" argument(s):
"Cannot find the requested object.
Please advise.
Here is my Service Fabric Security page. I don't remember where I set up the password needed on the VSTS side, but I took note of it and believe it's correct.
Here is the Endpoint page on the VSTS side:
Issue resolved with the help of MS Support by creating a new certificate in the Key Vault and adding it to the Service Fabric cluster. Steps:
Azure Portal:
Home > Key vaults > YourKeyVault - Certificates: Generate/Import
Generate a new certificate with a CertificateName of your choosing and CN=CertificateName as the Subject.
Home > Key vaults > YourKeyVault - Certificates > CertificateName
Select the only version available and Download in PFX/PEM format.
PowerShell: convert to a Base64 string (CertificateBase64):
[System.Convert]::ToBase64String([System.IO.File]::ReadAllBytes("c:\YourCertificate.pfx"))
Home > YourServicefabric - Security: Add
Add the certificate you created as Admin Client by providing its thumbprint.
VSTS/TFS:
Build and release > Your pipeline: Edit
In the Deployment Process, under Service Fabric Environment, click Manage for the Cluster Connection and add a new connection. Besides the other information, paste the previous CertificateBase64 into the Client Certificate field.
Check the Service Endpoint in VSTS:
Whether it has a properly Base64-encoded certificate, with a private key.
Also, check if the provided passphrase is correct.
Also, check if the service endpoint is configured as tcp://mycluster.region.cloudapp.azure.com:19000.
Check if the thumbprint is correct.
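A quick PowerShell sketch for checking those points locally (the file name is a placeholder; Get-PfxCertificate prompts for the PFX password):
# load the PFX; you will be prompted for its password
$pfx = Get-PfxCertificate -FilePath .\cluster.pfx
# thumbprint should match the one configured on the cluster
$pfx.Thumbprint
# True means the PFX actually contains the private key
$pfx.HasPrivateKey
# the Base64 value to paste into the service endpoint
[System.Convert]::ToBase64String([System.IO.File]::ReadAllBytes(".\cluster.pfx"))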

ADFS: Error while establishing SSO connection on Windows Server 2012

When I access my sign-on URL (https://abcd.avcd.ac/adfs/ls/IdpInitiatedSignOn.aspx) from my code to establish a connection with ADFS, I get the error:
A WS-Trust endpoint that was configured could not be opened.
Additional Data
Address: https://win-3723jtvfe02.abcd.avcd.ac/adfs/services/trust/2005/windowstransport
Mode: WindowsTransport
Error:
MSIS0006: A Service Principal Name is not registered for the AD FS service account.
And I also get a warning:
The SSL certificate does not contain all UPN suffix values that exist in the enterprise.
Users with UPN suffix values not represented in the certificate will not be able to Workplace-Join their devices.
Please help me to figure out this issue.
For the SPN issue, you'll need to get that registered. There is a nice article about that on TechNet here: http://social.technet.microsoft.com/wiki/contents/articles/1427.ad-fs-2-0-how-to-configure-the-spn-serviceprincipalname-for-the-service-account.aspx
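A rough sketch of the registration with setspn, run as a domain admin; the domain and service account names are placeholders, and the host part should be your federation service name (abcd.avcd.ac in your sign-on URL):
# register the HOST SPN on the AD FS service account (-s checks for duplicates before adding)
setspn -s host/abcd.avcd.ac ABCD\svc-adfs
# list the SPNs now registered on the account to verify
setspn -l ABCD\svc-adfs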
If you're not using the Workplace-Join feature of ADFS 2012 R2, then you don't have to worry about that warning. If you do want to address it, though, check out the docs here: https://technet.microsoft.com/en-us/library/dn614658.aspx