Vault Transit migrate to another server - hashicorp-vault

I have a Vault server with the Transit engine enabled, and now I want to migrate the server to another location/hosting.
How do I export/import the existing keys?

You can first migrate the seal from transit to Shamir (your recovery keys will become your unseal keys), and then migrate the seal to transit again, this time pointing at your second server. This is the only way, and at the time of migrating to Shamir your first server must be up and running.
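For reference, the unseal-with-migrate step can also be driven over Vault's HTTP API (the CLI equivalent is vault operator unseal -migrate). Below is a minimal Python sketch with a placeholder address and keys, assuming the seal stanza in the server configuration has already been updated as described in the seal-migration docs:

# Minimal sketch: submit recovery keys with the migrate flag to move the seal
# from transit (auto-unseal) to Shamir. Address and keys are placeholders.
import requests

VAULT_ADDR = "https://vault-old.example.com:8200"   # placeholder address
recovery_keys = ["<recovery-key-1>", "<recovery-key-2>", "<recovery-key-3>"]

for key in recovery_keys:
    resp = requests.put(
        f"{VAULT_ADDR}/v1/sys/unseal",
        json={"key": key, "migrate": True},          # migrate the seal to Shamir
    )
    resp.raise_for_status()
    if not resp.json().get("sealed", True):
        print("Unsealed; seal migration to Shamir is complete.")
        break

Once you are on Shamir, point the seal stanza at the transit engine on your second server and run the unseal with -migrate again to move back to transit.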

Related

Azure Devops SQL DacpacTask failing for Azure Key Vault

I'm trying to deploy a DACPAC to an Azure SQL Database with Always Encrypted enabled. The DevOps agent is running on a self-hosted VM with sqlpackage.exe version 19, build 16.0.5400.1, installed on it.
I've been able to trace the issue down by adding /diagnostics as an argument to the task, and the exception that is raised is:
Unexpected exception executing KeyVault extension 'Object reference not set to an instance of an object.' at Microsoft.SqlServer.Dac.KeyVault.DacKeyVaultAuthenticator.Validate(IList`1 keyVaultUrls, CancellationToken cancelToken)
Anybody have a suggestion on how to solve this?
Please check the points below.
Microsoft.SqlServer.Dac.KeyVault.DacKeyVaultService provides a service for discovering and configuring a Microsoft.SqlServer.Dac.KeyVault.KeyVaultAuthenticator to handle key vault access requests.
These requests occur during deployment if an encrypted table is being altered. It also supports initialization of general key vault support in an application.
If you store your column master keys in a key vault and you are using access policies for authorization:
Your application's identity needs the following access policy permissions on the key vault: get, unwrapKey, and verify.
A user managing keys for Always Encrypted needs the following access policy permissions on the key vault: create, get, list, sign, unwrapKey, wrapKey, verify.
See Create & store column master keys for Always Encrypted - SQL Server | Microsoft Docs.
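A quick way to sanity-check the first of those requirements from the build agent is to try to read the column master key with the application's identity. This is only a sketch (the vault URL and key name below are made up, and it exercises only the get permission, not unwrapKey or verify):

# Sketch: confirm the deployment identity can at least read the column master key.
# Requires the azure-identity and azure-keyvault-keys packages.
from azure.identity import DefaultAzureCredential
from azure.keyvault.keys import KeyClient

VAULT_URL = "https://my-keyvault.vault.azure.net"   # made-up vault URL
CMK_NAME = "AlwaysEncryptedCMK"                     # made-up key name

credential = DefaultAzureCredential()
client = KeyClient(vault_url=VAULT_URL, credential=credential)

key = client.get_key(CMK_NAME)                      # needs the "get" key permission
print(f"OK: found key {key.name} (enabled={key.properties.enabled})")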
To publish a DAC package, if Always Encrypted is set up in the DACPAC and/or in the target database, you might need some or all of the permissions below, depending on the differences between the schema in the DACPAC and the target database schema:
ALTER ANY COLUMN MASTER KEY, ALTER ANY COLUMN ENCRYPTION KEY, VIEW ANY COLUMN MASTER KEY DEFINITION, VIEW ANY COLUMN ENCRYPTION KEY DEFINITION
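If some of those permissions are missing, they can be granted to the deployment principal up front. A rough sketch using pyodbc (server, database, credentials and the deploy_user name are all placeholders):

# Sketch: grant the Always Encrypted related permissions listed above to the
# deployment principal. Connection details and user name are placeholders.
import pyodbc

conn_str = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver.database.windows.net;DATABASE=mydb;"
    "UID=admin_user;PWD=<password>"
)
deploy_user = "[deploy_user]"   # placeholder deployment principal

permissions = [
    "ALTER ANY COLUMN MASTER KEY",
    "ALTER ANY COLUMN ENCRYPTION KEY",
    "VIEW ANY COLUMN MASTER KEY DEFINITION",
    "VIEW ANY COLUMN ENCRYPTION KEY DEFINITION",
]

with pyodbc.connect(conn_str, autocommit=True) as conn:
    cursor = conn.cursor()
    for permission in permissions:
        cursor.execute(f"GRANT {permission} TO {deploy_user};")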
We also need to enable the "Azure Virtual Machines for deployment" check box in the key vault's access policy settings.
References:
Configure column encryption using Always Encrypted with a DAC package - SQL Server | Microsoft Docs
azure-sql-advanced-deployment-part4.
KeyVaultAuthenticator.Validate(IList, CancellationToken) >> Microsoft.SqlServer.Dac.KeyVault Namespace | Microsoft Docs
I managed to find a solution: I downgraded the sqlpackage.exe version. If I understand it correctly, version 19 apparently targets SQL Server compatibility level 160, which ships with SQL Server 2022. Version 18 works with the compatibility level 150 that my Azure SQL database is currently set to.

Mongodb: Client side Field Level encryption - integration with Hashicorp vault

We plan to use client-side field-level encryption for some confidential fields in our product. To generate and manage the customer master key, we want to use HashiCorp Vault. The only KMS providers currently supported are Amazon Web Services KMS and a locally managed key file.
To work with HashiCorp Vault, it seems we need to choose the locally managed key file as the KMS provider. This means the master key will be fetched from Vault into memory and then used in the code to encrypt/decrypt the DEK (data encryption key). Ideally, as a best practice, the decryption of the DEK should happen inside Vault itself, and the master key should not be brought out of Vault.
Is there a way to achieve this? There are numerous articles around encryption at rest and integration with HashiCorp Vault, but none of them covers CSFLE. Need help if anyone is using CSFLE.
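Roughly, this is what the locally managed key file route looks like for us today. It is only a sketch; the Vault mount, secret path and field names are made up, and it assumes the hvac package and pymongo with the encryption extra installed:

# Sketch: fetch the 96-byte master key from Vault, then hand it to the driver
# as a "local" KMS provider so the DEK is wrapped/unwrapped on the client.
import base64
import hvac
from pymongo import MongoClient
from pymongo.encryption import ClientEncryption
from bson.codec_options import CodecOptions
from bson.binary import STANDARD

# 1. Pull the master key out of Vault (KV v2 path "csfle/master-key" is made up).
vault = hvac.Client(url="https://vault.example.com:8200", token="<token>")
secret = vault.secrets.kv.v2.read_secret_version(path="csfle/master-key")
local_master_key = base64.b64decode(secret["data"]["data"]["key"])  # must be 96 bytes

# 2. Configure the driver with the "local" KMS provider.
kms_providers = {"local": {"key": local_master_key}}
client = MongoClient("mongodb://localhost:27017")
client_encryption = ClientEncryption(
    kms_providers,
    "encryption.__keyVault",                 # key vault namespace for DEKs
    client,
    CodecOptions(uuid_representation=STANDARD),
)

# 3. Create (or reuse) a data encryption key wrapped by the local master key.
dek_id = client_encryption.create_data_key("local", key_alt_names=["demo-key"])

The problem with this is exactly what is described above: the master key leaves Vault and lives in application memory while wrapping/unwrapping the DEK.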
Thanks

Is there a way to use my google instance in big query?

I have set up a Google Cloud SQL instance running SQL Server. I have a company laptop, and there seem to be some constraints on connecting to the SQL Server instance through the Google proxy, as I keep getting x509 certificate errors.
Does Google have its own version of SSMS, e.g. BigQuery, where I can use queries to create/alter tables as I would normally in SSMS?
I would like to perform something as simple as
CREATE TABLE [datam].[CashflowAgg_small] (
[JobID] INT,
[ReportingPeriod_ID] INT,
[EntityHierarchy_ID] INT
)
Is something like this possible, or do I need to connect with SSMS?
There isn't a built-in management console for Cloud SQL like there is for BigQuery.
You can instead configure an external workbench or client (SSMS, for example) to connect to the instance.
The x509 errors with the Cloud SQL proxy are "certificate signed by unknown authority" errors. They usually happen when the SSL configuration was reset: a new CA certificate is generated and your existing client key no longer matches it. In the Cloud SQL console UI, under Connections, reset the SSL configuration and create a new client certificate.
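Once the proxy connection works with the new client certificate, the DDL from the question can be run from any SQL Server client. For example, a sketch with pyodbc, assuming the Cloud SQL Auth proxy is listening on 127.0.0.1:1433 and that the database name, credentials and the [datam] schema are placeholders that exist on your side:

# Sketch: run the CREATE TABLE statement through the Cloud SQL Auth proxy.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=127.0.0.1,1433;DATABASE=mydb;"   # proxy endpoint and placeholder DB
    "UID=sqlserver;PWD=<password>",
    autocommit=True,
)

conn.cursor().execute("""
CREATE TABLE [datam].[CashflowAgg_small] (
    [JobID] INT,
    [ReportingPeriod_ID] INT,
    [EntityHierarchy_ID] INT
)
""")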

Verify MongoDB encryption based on Local Key Management

I have configured MongoDB 3.4.16 Enterprise for native encryption following the Local Key Management method described in the MongoDB documentation.
As mentioned in the tutorial, I get the success message on the command prompt after the operation completes:
[initandlisten] Encryption key manager initialized with key file:
My question is: how can I demonstrate to other people that, with just these configuration changes, the encryption has actually happened? Ideally I would show the DB data files before and after applying the encryption configuration.
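Something along these lines is the kind of check I have in mind. It is only a rough sketch; the dbPath and the database/collection names are made up:

# Sketch: insert a distinctive marker document, flush to disk, then scan the
# data files under dbPath for the marker in plaintext. With encryption at rest
# enabled, the marker should not appear in any file.
import os
from pymongo import MongoClient

DB_PATH = "/var/lib/mongodb"                 # made-up dbPath
MARKER = b"PLAINTEXT_CANARY_123456"

client = MongoClient("mongodb://localhost:27017")
client.test.canary.insert_one({"marker": MARKER.decode()})
client.admin.command("fsync")                # flush data files to disk

hits = []
for root, _dirs, files in os.walk(DB_PATH):
    for name in files:
        path = os.path.join(root, name)
        try:
            with open(path, "rb") as f:
                if MARKER in f.read():
                    hits.append(path)
        except OSError:
            pass

print("Marker found in:", hits or "no files (data appears encrypted)")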
I don't have an answer, rather a comment. Be sure to take note of the notice at the top of the Local Key Management page.
IMPORTANT
Using the keyfile method does not meet most regulatory key
management guidelines and requires users to securely manage their own
keys.
The safe management of the keyfile is critical.
Without a dedicated key manager to store and manage keys, it is like leaving the keys to your house under your welcome mat. Since you are on Enterprise edition, use KMIP and deploy an encryption key manager. More on encryption key management for MongoDB here: https://info.townsendsecurity.com/mongodb-encryption-key-management-definitive-guide

Production Environment for Spring Cloud Config using Git/Vault

Spring Boot - 2.0.0.M3
Spring cloud - Finchley.M1
I want to know if anyone is using Spring Cloud Config Server with both Vault and Git support in a production setup using a database storage backend.
I have evaluated Spring Cloud Config with Vault and am contemplating whether to go for Oracle JCE to encrypt usernames/passwords or to use Vault, and I am seeking suggestions on the same. We are working on Spring Boot microservices.
Following are my findings:
Vault will introduce an additional layer, and thus additional use cases around security and auditing while communicating with Vault.
The Spring Cloud Config actuator endpoints for generating encrypted values are broken in the milestone release at this point, and /encrypt and /decrypt may not work if we go for Oracle JCE support, so we generate encrypted values with stable versions.
We do not wish to use a Consul server and are trying to use Cassandra as the storage backend.
I used the Vault AppRole authentication backend and generated a token (different from the root token, as it is unsafe to use that) with read permissions. However, Spring Cloud Config at the moment supports only token-based authentication from the client side. That means we first generate a token from Vault and then pass it as a command-line/environment variable.
Some additional points of concern are token expiry (though we can use a non-expiring token, I am not sure about the pros/cons), restarts, safety issues, and instantiating new microservices. There is no provision for dynamic tokens/authentication on the Cloud Config side.
For the milestone release I found that client-side encryption/decryption is not working at the moment using the recommended inclusion of the RSA jar. Here is the ticket I opened:
https://github.com/spring-cloud/spring-cloud-config/issues/805#issuecomment-332491536
These are some of my observations; please share your thoughts if there is any case study/whitepaper that addresses Spring Cloud Config + Vault use cases, setup, and challenges for a production microservices environment.
Thanks
Thanks for reaching out to me. One thing I would state is that the AppRole backend utilizes two distinct tokens, and spring-cloud-config-vault does indeed support this functionality; see: http://cloud.spring.io/spring-cloud-vault/single/spring-cloud-vault.html#_approle_authentication. I leverage Vault in the same way I leverage Config Server, as per the documentation. I don't encrypt any values in my config; I just don't put them there. I put the secret values in Vault and let it serve config. As long as keys don't collide, you don't have to mess with anything; otherwise you may need to adjust the priority so Vault wins, again see the documentation I pointed to above. I wouldn't mess with encryption/decryption in spring-cloud-config personally. Because you have to check the keys into SCM or distribute them to your teams for local development, you lose the value of having those keys, IMO.
Thanks. Spring Cloud Vault does support it, but Spring Cloud Config with Vault does not. The only way seems to be passing the X-Config-Token header from the microservice to the Config Server. We are a bit skeptical about this part of generating tokens manually or through a script, especially with containerization and when new microservice instances are spawned. Not sure about this approach, especially in a production setup.