How to export Vault secrets as a zip file - hashicorp-vault

Is there a way to export Vault secret data from one Vault instance and then import it into another?
Example:
Export secrets from the source instance at the path secret/vault/path and import them into an empty destination Vault instance.

I am not aware of a 'native' way to do this. You will need to iterate over the requested secrets and export them to a file, and import them the same way; a minimal shell sketch follows the list below.
You can also try one of the following projects that attempt to do this:
Vault backup
Vault backup migrator
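If you end up scripting it yourself, a minimal sketch using the stock vault CLI could look like the following (assumptions: a KV v2 engine, jq installed, and VAULT_ADDR/VAULT_TOKEN already set; all paths and file names are just examples):

# Export: recursively walk a KV path and dump each secret to a JSON file.
# Nested secrets with the same leaf name would collide - this is only a sketch.
export_path() {
  local path="$1"
  vault kv list -format=json "$path" | jq -r '.[]' | while read -r entry; do
    if [[ "$entry" == */ ]]; then
      export_path "$path/${entry%/}"   # a sub-folder: recurse into it
    else
      vault kv get -format=json "$path/$entry" \
        | jq '.data.data' > "export_${entry}.json"
    fi
  done
}
export_path "secret/vault/path"

# Import: point the CLI at the destination instance and write each file back.
# 'vault kv put' can read its key/value pairs from a JSON file via @file.
export VAULT_ADDR=https://destination-vault:8200
vault kv put secret/vault/path/mysecret @export_mysecret.json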

Medusa is an open-source CLI tool that can export and import your Vault secrets across different Vault instances.
The tool can handle a full tree structure on both import and export. It also supports end-to-end encryption of your secrets between export and import, so your secrets stay secure while moving between Vault instances.
https://github.com/jonasvinther/medusa
export VAULT_ADDR=https://192.168.86.41:8201
export VAULT_SKIP_VERIFY=true
export VAULT_TOKEN=00000000-0000-0000-0000-000000000000
./medusa export kv/path/to/secret --format="yaml" --output="my-secrets.txt"
./medusa import kv/path/to/new/secret ./my-secrets.txt

Related

Where to store credentials in a terraform-gcp-github project?

I have a Terraform project that accesses a Google Cloud bucket. All pull requests are done through GitHub. However, I'm not sure where I'm supposed to securely store my bucket credentials. Of course I don't want to upload them to GitHub, but I'm not sure where else they should be kept. This is my variables file:
variable "credentials_filepath" {
default = "../../../creds.json"
}
And this is my main file:
provider "google" {
  credentials = file(var.credentials_filepath)
  project     = var.project
  region      = "europe-west2"
  zone        = "europe-west2-a"
}
The main idea is to store the secrets securely in a secrets manager and to use a wrapper that makes the secrets available as environment variables only for the duration of the wrapper process.
Among the tools I can recommend: pass, gopass, summon.
For example, once the secrets are stored in GPG and you have the gpg-agent configured, you can run:
TF_VAR_secret=$(pass gc/myproject) terraform ...
This will tell the shell to set the environment variable TF_VAR_secret to the output of pass gc/myproject.
That command tells pass to use gpg and gpg-agent to read the value of the secret stored at gc/myproject.
secret is a Terraform variable, and the TF_VAR_secret environment variable tells Terraform to fill that variable from the environment (see the Terraform documentation on input variables).
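A minimal end-to-end sketch, assuming pass is already initialized with your GPG key (the secret name gc/myproject and the variable name secret are just examples):

# Store the service-account key in pass (encrypted with your GPG key).
pass insert -m gc/myproject < creds.json

# Declare the matching variable in Terraform, with no default so nothing
# sensitive is ever committed:
#   variable "secret" {}

# Run Terraform with the secret injected only for this one process:
TF_VAR_secret=$(pass gc/myproject) terraform plan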

How do I modify a Spring application.yml file during deployment in Azure DevOps?

My Spring app uses the standard application.yml file to define a connection to a database. I don't want to check any production credentials into the source code or resulting WAR file, and so need a way to modify the YAML file during deployment to include these production secrets.
Is this something I can do with Azure DevOps, short of scripting the extraction of the WAR file, doing a find-and-replace on it, and repackaging it?
The most straightforward way I can recall is to:
use the Extract Files task to unpack the .war file
use the Replace Tokens task to substitute your DB connection string
use the Archive Files task to package it back into a .war
You have two options for storing your production secrets:
Use variables or variable groups to store your secrets and mark them as secret
Use Azure Key Vault to store your secrets and inject them into the pipeline
The Replace Tokens task uses a prefix and postfix to find your tokens, and the name inside the token must match a variable name. E.g. for a token "#{db_connection}#" in your YAML, you should create a variable named db_connection in your ADO variables.
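For illustration, assuming the Replace Tokens task's default #{ }# token pattern, the application.yml you commit would contain only placeholders (the property names here are hypothetical):

# application.yml as committed - tokens are substituted during the release
spring:
  datasource:
    url: "#{db_connection}#"
    username: "#{db_username}#"
    password: "#{db_password}#"   # matches a secret pipeline variable db_password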

How to securely export and import data between different Vault instances?

What if:
I want to replicate the data contained in the dev Vault to my local Vault?
I want to export my local Vault data to the dev Vault, and on up to the prod Vault?
What is the ideal way of doing this securely and with versioning? I am thinking of some export/import mechanism. I am fairly new to Vault.
Medusa is an open-source CLI tool that does exactly what you need.
The tool can handle a full tree structure on both import and export. It also supports end-to-end encryption of your secrets between export and import, so your secrets stay secure while moving between Vault instances.
https://github.com/jonasvinther/medusa
export VAULT_ADDR=https://192.168.86.41:8201
export VAULT_SKIP_VERIFY=true
export VAULT_TOKEN=00000000-0000-0000-0000-000000000000
./medusa export kv/path/to/secret --format="yaml" --output="my-secrets.txt"
./medusa import kv/path/to/new/secret ./my-secrets.txt
Vault stores everything in its storage backend and encrypts that data with keys protected by the unseal keys.
If you wanted, you could copy the data elsewhere and then 'import' it into the next environment (and by copy, I mean a DB dump if you are using a database to store things, copying S3 buckets if you are using S3, etc.).
That would require downtime, as you would need to seal your cluster to make sure all writes have happened before you copy the data.
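For example, if your cluster runs on the integrated (Raft) storage backend, the stock CLI can snapshot and restore the whole backend; a sketch (file and cluster names are examples, and restoring onto a different cluster needs the source cluster's unseal/recovery keys):

# On the source cluster: take a point-in-time snapshot of the storage backend.
vault operator raft snapshot save backup.snap

# On the destination cluster: restore it.
# -force is needed when the snapshot comes from a different cluster; this
# replaces ALL existing data on the destination.
vault operator raft snapshot restore -force backup.snap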
If you want something more automatic, you could upgrade to the Enterprise version and use replication; there are several different replication modes.

How to make Keycloak read its database password from a key vault instead of an env file

Currently I am using Keycloak with a Postgres DB, and the DB creds are provided via environment files. I wanted to know how I can make Keycloak obtain the DB creds from a key vault, something like Azure Key Vault. Is there any documentation/guidance around it?
As per the official documentation, some parts are already done, but it looks like this is still a work in progress:
To use a vault, a vault provider must be registered within Keycloak. It is possible to either use a built-in provider described below or implement your own provider. See the Server Developer Guide for more information.
To obtain a secret from a vault instead of entering it directly, enter the following specially crafted string into the appropriate field: ${vault.entry-name}, where you replace entry-name with the name of the secret as recognized by the vault.
https://www.keycloak.org/docs/latest/server_admin/#_vault-administration
https://issues.redhat.com/browse/KEYCLOAK-3205
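As a concrete illustration, here is roughly what the built-in files-plaintext provider looks like on the Quarkus distribution (directory, realm, and key names are examples; note that, per the issue linked above, the vault SPI covers realm-level secrets such as SMTP or LDAP credentials rather than Keycloak's own DB password):

# Each secret is a plain file named <realm>_<key> inside the vault directory.
mkdir -p /opt/keycloak-vault
echo -n "s3cr3t" > /opt/keycloak-vault/myrealm_smtp-password

# Start Keycloak with the file-based vault provider enabled.
bin/kc.sh start --vault=file --vault-dir=/opt/keycloak-vault

# In the admin console the secret can then be referenced as ${vault.smtp-password}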

How to upload a credential.json file to a Hasura cluster without adding it to a remote git repository

I am supposed to use a credvalue.json file in an API used by the program, but I don't want to upload these credentials to the GitHub repository while still using them in the microservice.
I tried adding it to .gitignore and copying it into the src folder in Docker, but that results in a file-not-found error; if I remove it from .gitignore it works fine.
I can't use Hasura secrets directly; the library requires the credvalue.json file itself.
Also, the API requires me to specify the path of this JSON file as an environment variable, so what should the path of the uploaded JSON file be?
You should be able to use Hasura secrets with files as well, using the -f flag.
Here is the link to the docs: https://docs.hasura.io/0.15/manual/project/secrets/mounting-secret-as-file.html
You can basically create a secret from a file and then mount that secret as a file in your microservice container.
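Under the hood this is the standard Kubernetes secret-mounted-as-a-volume mechanism, so a rough equivalent with plain kubectl looks like this (deployment, mount path, and env var names are hypothetical):

# Create a Kubernetes secret from the local file (the file itself never enters git).
kubectl create secret generic cred-file --from-file=credvalue.json

# In the deployment spec, mount the secret as a volume at e.g. /secrets so the
# container sees /secrets/credvalue.json, then point the library's env var there:
kubectl set env deployment/my-microservice CREDENTIALS_PATH=/secrets/credvalue.json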