docker-compose mongodb access env variables in /docker-entrypoint-initdb.d/ script - mongodb

This question is based on the top answer to a previous question on the same topic.
My question is, in my custom /docker-entrypoint-initdb.d/ init script, how can I reference env variables that are declared in docker-compose's .env file? Meaning, env variables besides MONGO_INITDB_ROOT_USERNAME and MONGO_INITDB_ROOT_PASSWORD.
for example:
mongo --eval "db.getSiblingDB('sibling').createUser({user: '$SIBLING_USER', pwd: '$SIBLING_PASSWORD', roles: [{ role: 'readWrite', db: 'sibling' }]})"

I did the following for a reverse proxy using NGINX, where it loads a different config file based on an env variable.
docker-compose.yml:
https-proxy:
  build:
    context: ./https-proxy
    dockerfile: ./Dockerfile
    args:
      - MY_VAR=TRUE
Dockerfile:
FROM nginx
ARG MY_VAR
ENV MY_VAR=${MY_VAR}
RUN bash ./etc/nginx/config.sh
config.sh:
#!/bin/bash
if [ "$MY_VAR" == "TRUE" ]; then
  echo 'My Var is True'
else
  echo 'My Var is False'
fi
You could also define an .env file alongside your docker-compose.yml, so you don't have to change that file and can define the values in a separate place where docker-compose will look for them.
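Applied to the original MongoDB question, a minimal sketch could look like the following (file names and values are assumptions; the key point is that values from the .env file are mapped into the container's environment, and a shell init script in /docker-entrypoint-initdb.d/ inherits that environment):

# .env (next to docker-compose.yml)
SIBLING_USER=sibling_user
SIBLING_PASSWORD=change-me

# docker-compose.yml (excerpt)
mongo:
  image: mongo
  environment:
    - SIBLING_USER=${SIBLING_USER}
    - SIBLING_PASSWORD=${SIBLING_PASSWORD}
  volumes:
    - ./init-sibling.sh:/docker-entrypoint-initdb.d/init-sibling.sh

# init-sibling.sh (runs inside the container, so the variables are visible to the shell)
mongo --eval "db.getSiblingDB('sibling').createUser({user: '$SIBLING_USER', pwd: '$SIBLING_PASSWORD', roles: [{ role: 'readWrite', db: 'sibling' }]})"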

Related

How to Pass a Pipeline Secret to DockerCompose@0

I'm writing a testing pipeline and dynamically creating a series of containers to set up the pieces.
I need to be able to pass a secret variable from the Pipeline into the Docker Compose construct to enable the container to connect to the database server.
I have a number of non-secret variables in the pipeline and they are all being passed successfully.
I have mapped the $env:addressdatabase_password in a powershell test to verify that my variable is available.
#To Verify if mapped secret variables are coming through (reverse the string)
- powershell: |
    $a = $env:addressdatabase_password
    Write-Host "$a"
    $b = $a.ToCharArray()
    [Array]::Reverse($b)
    $c = -join($b)
    Write-Host "$c"
  env:
    addressdatabase_password: $(database-address-password) # pipeline secret
The task in my azure-pipelines.yml looks like this (not all arguments are shown)
- task: DockerCompose@0
  displayName: 'Container Start'
  inputs:
    containerregistrytype: 'Azure Container Registry'
    azureSubscription: '$(containerSubscription)'
    azureContainerRegistry: '{"loginServer":"$(containerLoginServer)", "id" : "$(containerRegistryId)"}'
    dockerComposeFile: '**/docker-compose.yml'
    action: 'Run a Docker Compose command'
    dockerComposeCommand: 'up -d'
    arguments: mycontainer
    containerName: 'cf_$(CreateDb.buildidentifier)'
    detached: true
    dockerComposeFileArgs: |
      addressdatabase_name=$(database-address-name)
      addressdatabase_user=$(database-address-user)
      addressdatabase_pass=$(addressdatabase_password)
  env:
    addressdatabase_password: $(database-address-password) # pipeline secret
The relevant parts of the docker-compose.yml file
mycontainer:
  image: mycontainer-runtime:latest
  ports:
    - "80:80"
  volumes:
    - ${mount_1}:C:/mount1
    - ${mount_2}:C:/mount2
  environment:
    ADDRESS_DATABASE_NAME: ${addressdatabase_name}
    ADDRESS_DATABASE_USERNAME: ${addressdatabase_user}
    ADDRESS_DATABASE_PASSWORD: ${addressdatabase_pass} # pipeline secret
The container starts up successfully, but when I examine the environment variables inside the container I see:
ADDRESS_DATABASE_NAME=pr_address
ADDRESS_DATABASE_USER=test-addressuser
ADDRESS_DATABASE_PASSWORD=$(addressdatabase_password)
I'm looking for a way to get this value securely to my container without exposing it in the Pipeline.

GitHub Actions: stored .env file content in GitHub secrets and want to write the secret content to a .env file in the pipeline

I stored the production .env file content in GitHub secrets (in a single variable), and I want to create the .env file in the pipeline and put the secret content into it.
I tried the following methods:
...
env:
  ENV_CONTENT: ${{ secrets.ENV_DEV }}
...
run: |
  touch .env
  echo $ENV_CONTENT
  echo $ENV_CONTENT >> .env
  cat .env
...
run: |
  echo ${{ secrets.ENV_DEV }} >> .env
  cat .env
...
Output: the variables in the .env file are not getting defined.
> demo@1.0.0 deploy:dev /home/runner/work/lvld-api/lvld-api
> NODE_ENV=dev serverless deploy --stage dev
Serverless: Deprecation warning: Detected ".env" files. Note that Framework now supports loading variables from those files when "useDotenv: true" is set (and that will be the default from next major release)
More Info: https://www.serverless.com/framework/docs/deprecations/#LOAD_VARIABLES_FROM_ENV_FILES
Serverless: DOTENV: Loading environment variables from .env:
Serverless: - STAGE
Serverless Warning --------------------------------------
A valid environment variable to satisfy the declaration 'env:REGION' could not be found.
Serverless Warning --------------------------------------
Main.yml: https://drive.google.com/file/d/1PK4SlyXkC7xRn_eM2SQO1rkWjoJaOYaZ/view?usp=sharing
GithubAction Log: https://drive.google.com/file/d/1YvBfdle1GpomJpyuqneQt0PK-OYShXZC/view?usp=sharing

How to read env variables of a docker-compose file and a package.json file from a GitHub Action?

From my docker-compose file I have to read an env variable. Locally, I can read that variable like this: ENV_FILE=.env docker-compose -f docker-compose.dev.prisma.yml up --build
But as the .env file is in .gitignore, the GitHub action can't get that file. How can I read the variables?
I have almost the same issue in my package.json file. I need some env variables to be read from npm scripts:
"start:backend": "wait-port $API_HOST:API_PORT && yarn start"
What I have tried is adding those variables to GitHub secrets, but it didn't get those variables. Except for those 2 files, the envs are read perfectly from the GitHub action.
Try creating your env file manually as a step in your workflow and passing in your repository secrets. Your docker-compose and package.json should then be able to read your environment variables:
- name: create env file
  run: |
    touch .env
    echo VARIABLE=${{ secrets.VARIABLE }} >> .env
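A later step in the same workflow can then run the compose command from the question against that freshly created file; this is only a sketch, reusing the command as posted:

- name: run docker-compose
  run: ENV_FILE=.env docker-compose -f docker-compose.dev.prisma.yml up --build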

How do I use an env file with GitHub Actions?

I have multiple environments (dev, qa, prod) and I'm using .env files to store secrets etc. Now I'm switching to GitHub Actions, and I want to use my .env files and declare them in the env section of the GitHub Actions YAML.
But from what I've seen so far, it seems that I can not set a file path and I have to manually re-declare all variables.
How should I proceed as best practice?
A quick solution here could be having a step to manually create the .env file before you need it.
- name: 'Create env file'
  run: |
    touch .env
    echo API_ENDPOINT="https://xxx.execute-api.us-west-2.amazonaws.com" >> .env
    echo API_KEY=${{ secrets.API_KEY }} >> .env
    cat .env
Better method for multiple variables
If you have a lot of env variables, simply paste the whole file into a GitHub secret named ENV_FILE and just echo the whole file:
- name: 'Create env file'
  run: |
    echo "${{ secrets.ENV_FILE }}" > .env
The easiest way to do this is to create the .env file as a github secret and then create the .env file in your action.
So step 1 is to create the .env files as a secret in github as a base64 encoded string:
openssl base64 -A -in qa.env -out qa.txt
or
cat qa.env | base64 -w 0 > qa.txt
Then in your action you can do something like
- name: Do Something with env files
  env:
    QA_ENV_FILE: ${{ secrets.QA_ENV_FILE }}
    PROD_ENV_FILE: ${{ secrets.PROD_ENV_FILE }}
  run: |
    [ "$YOUR_ENVIRONMENT" = qa ] && echo $QA_ENV_FILE | base64 --decode > .env
    [ "$YOUR_ENVIRONMENT" = prod ] && echo $PROD_ENV_FILE | base64 --decode > .env
There are a number of ways of determining $YOUR_ENVIRONMENT, but usually this can be extracted from the GITHUB_REF object. Your applications should then be able to read from the .env file as needed.
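A possible sketch for deriving $YOUR_ENVIRONMENT in an earlier step (the branch-to-environment mapping here is an assumption):

- name: Determine environment
  run: |
    # Assumed mapping: main branch -> prod, everything else -> qa
    if [ "$GITHUB_REF" = "refs/heads/main" ]; then
      echo "YOUR_ENVIRONMENT=prod" >> "$GITHUB_ENV"
    else
      echo "YOUR_ENVIRONMENT=qa" >> "$GITHUB_ENV"
    fi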
I would suggest 3 pretty simple ways to engage your .env file variables in the GitHub Actions workflow. They differ based on whether you store the file in your repository (the worst practice) or keep it out of it (the best practice).
1. You keep your .env file in the repository: there are some ready-made actions that allow reading the .env variables (e.g. Dotenv Action, Simple Dotenv).
2. (simple, manual, annoying when updating .env variables) You keep your file out of your repository: you manually copy the content of the respective .env files (say .env.stage, .env.production) into the respective GitHub Actions secret variables (say WEBSITE_ENV_STAGE, WEBSITE_ENV_PRODUCTION). Then in your GitHub Actions workflow script you create the .env file from the desired variable like this: echo "${{ secrets.WEBSITE_ENV_STAGE }}" > .env, and use it in the workflow.
3. (a bit more involved, though you prepare it once, then change your .env variables on the local machine and sync them to GitHub with one click) As in item 2 above, the file is out of the repository. Now you use the GitHub Actions API to create or update the secrets. On your local machine, in the dev environment, you write a NodeJS script that calls the API endpoint and writes the .env files to the desired GitHub Actions secret variables (say, as above, into WEBSITE_ENV_STAGE, or to both stage and production variables at once).
This is a pretty wide choice of ways to engage the .env file's variables in the workflow. Use whichever matches your preference and circumstances.
Just for information, there is a 4th way which engages some 3rd-party services like Dotenv Vault or HashiCorp Vault (there are more of the kind), where you keep your secret variables and read them to create the .env file at build time with your CI/CD pipeline. Read there for details.
Edit:
You were using CircleCI Contexts, so with that you had a set of secrets for each env. I know they are working to bring secrets to the org level, and maybe the team level... there is no info on whether they will create a sort of contexts like we have in CCI.
I have thought about adding the env as a prefix of the secret name, like STAGE_GITHUB_KEY or INTEGRATION_GITHUB_KEY, and using ${env}_GITHUB_KEY in the yml as a workaround for now... What do you think?
--- Original answer:
If I understand you well, you already have the dotenv files stored somewhere and you want to inject all those secrets into the steps, without having to manually add them to github secrets and do the mapping in each workflow you migrate... right?
There is an action made by someone that reads a dotenv file and puts its values into outputs, so you can use them in further steps. Here is the link: https://github.com/marketplace/actions/dotenv-action
Whatever is present in the .env file will be converted into an output variable. For example .env file with content:
VERSION=1.0
AUTHOR=Mickey Mouse
You do:
- id: dotenv
  uses: ./.github/actions/dotenv-action
Then later you can refer to the version like this: ${{ steps.dotenv.outputs.version }}
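For example, a later step could read one of those outputs (the step id dotenv and the VERSION key come from the sample .env above):

- name: Print version from .env
  run: echo "Version is ${{ steps.dotenv.outputs.version }}"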
You can also use a dedicated github action from github-marketplace to create .env files.
Example usage:
name: Create envfile
on: [push]

jobs:
  create-envfile:
    runs-on: ubuntu-18.04
    steps:
      - name: Make envfile
        uses: SpicyPizza/create-envfile@v1
        with:
          envkey_DEBUG: false
          envkey_SOME_API_KEY: "123456abcdef"
          envkey_SECRET_KEY: ${{ secrets.SECRET_KEY }}
          file_name: .env
Depending on the values defined for secrets in your GitHub repo, this will create a .env file like below:
DEBUG: false
SOME_API_KEY: "123456abcdef"
SECRET_KEY: password123
More info: https://github.com/marketplace/actions/create-env-file
Another alternative is to use the Environments feature from GitHub, although that isn't available on private repos in the free plan.
You can have scoped variables at the repository, profile/organization, and environment level. The configuration variables closer to the repository take precedence over the others.
I tried using the accepted solution but GitHub actions was complaining about the shell commands. I kept getting this error: line 3: unexpected EOF while looking for matching ``'
Instead of referencing the secrets directly in the shell script, I had to pass them in separately.
- name: Create env file
  run: |
    touch .env
    echo POSTGRES_USER=${POSTGRES_USER} >> .env
    echo POSTGRES_PASSWORD=${POSTGRES_PASSWORD} >> .env
    cat .env
  env:
    POSTGRES_USER: ${{ secrets.POSTGRES_USER }}
    POSTGRES_PASSWORD: ${{ secrets.POSTGRES_PASSWORD }}
You can export all secrets to environment variables and do everything from a script.
I created an action exactly for that: it takes all the secrets and exports them to environment variables.
An example would be:
- run: echo "Value of MY_SECRET1: $MY_SECRET1"
env:
MY_SECRET1: ${{ secrets.MY_SECRET1 }}
MY_SECRET2: ${{ secrets.MY_SECRET2 }}
MY_SECRET3: ${{ secrets.MY_SECRET3 }}
MY_SECRET4: ${{ secrets.MY_SECRET4 }}
MY_SECRET5: ${{ secrets.MY_SECRET5 }}
MY_SECRET6: ${{ secrets.MY_SECRET6 }}
...
You could convert it to:
- uses: oNaiPs/secrets-to-env-action@v1
  with:
    secrets: ${{ toJSON(secrets) }}
- run: echo "Value of MY_SECRET1: $MY_SECRET1"
Link to the action, which contains more documentation about configuration: https://github.com/oNaiPs/secrets-to-env-action
I was having the same issue. What I wanted was to upload a .env file to my server instead of defining the env variables in my GitHub repo. Since I was not tracking my .env file, every time my workflow ran the .env file got deleted. So what I did was:
Added the .env file in the project root directory on my server.
Added clean: false under the with key of actions/checkout@v2 in my workflow.
e.g.:
jobs:
  build:
    runs-on: self-hosted
    strategy:
      matrix:
        node-version: [14.x]
    steps:
      - uses: actions/checkout@v2
        with:
          clean: 'false'
This prevents git from deleting untracked files like .env. For more info see: actions/checkout
One more approach would be doing something as described in https://docs.github.com/en/actions/security-guides/encrypted-secrets#limits-for-secrets
So basically treating your .env file as a "large secret". In this case, the encrypted .env file is kept committed in your repo, which should be fine. Then in your action have a step to decrypt the .env file.
This removes the overhead of having to create each individual secret inside your .env as a Github secret. The only Github secret to maintain in this case, is one for the encryption password. If you have multiple .env files such as qa.env, prod.env, etc... I would strongly suggest using a different encryption password for each, and then store each encryption passwords as an "environment secret" in Github instead of "repo secret" (if using Github environments is your thing. See https://docs.github.com/en/actions/deployment/targeting-different-environments/using-environments-for-deployment).
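A minimal sketch of that approach, roughly following the commands in the linked GitHub docs (file names and the secret name are placeholders):

# locally: encrypt the env file and commit only the .gpg file
gpg --symmetric --cipher-algo AES256 qa.env

# in the workflow: decrypt it using the passphrase stored as a secret
- name: Decrypt env file
  run: gpg --quiet --batch --yes --decrypt --passphrase="$ENV_PASSPHRASE" --output .env qa.env.gpg
  env:
    ENV_PASSPHRASE: ${{ secrets.ENV_PASSPHRASE }}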
If you don't want to commit the (encrypted) .env file in your repo, then I would go with the base64 approach described in https://stackoverflow.com/a/64452700/1806782 (which is similar to what's in https://docs.github.com/en/actions/security-guides/encrypted-secrets#storing-base64-binary-blobs-as-secrets) and then create a new GitHub secret to host the encoded contents.
For those like me with an aversion to manual repetitive tasks, GitHub secret creation can these days easily be scripted with the GitHub CLI tool. See
https://cli.github.com/manual/gh_secret_set. It also supports 'batch' creation of secrets from env files (see the -f, --env-file flag).
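For example (secret names and the file are placeholders):

# one secret at a time
gh secret set API_KEY --body "some-value"
# or batch-create secrets from an env file
gh secret set -f qa.env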
Inspired by Valentine Shi's answer above, I created a GitHub Action for this use case and the one I had at the time while reading this thread.
GitHub Action: next-env
GitHub Action to read .env.[development|test|production][.local] files in Next.js (but also non Next.js) projects and add variables as secrets to GITHUB_ENV.
Despite the name, it also works in non-Next.js projects as it uses a decoupled package of the Next ecosystem.
You need to define your environment variables in the "Secrets" section of your repository. Then you can simply use your secrets in your workflow.
Example usage:
- uses: some-action@v1
  env:
    API_KEY: ${{ secrets.API_KEY }}
    SECRET_ID: ${{ secrets.SECRET_ID }}
  with:
    password: ${{ secrets.MY_PASSWORD }}
Here is the documentation:
https://help.github.com/en/actions/configuring-and-managing-workflows/creating-and-storing-encrypted-secrets

setting environment variables in ansible permanently

I am using Ansible to add permanent environment variables to the Ubuntu .bashrc.
I have these settings defined in a prod_vars file:
enviornment_variables:
  PRODUCTION:
    MONGO_IP: 0.0.0.0
    MONGO_PORT: 27017
    ELASTIC_IP: localhost
    ELASTIC_PORT: 9200
How can I export these using a task? I know about the lineinfile module, but I do not want to repeat it for every env var:
- name: set env in the bashrc files
  lineinfile: dest=/home/user/.bashrc line='export MONGO_IP=enviornment_variables[PRODUCTION][MONGO_IP]'
Also, the above command gives a syntax error.
Instead of using the lineinfile module, use the blockinfile module.
So something like this should work:
- name: Adding to environment variables for user
  blockinfile:
    path: /home/user/.bashrc
    insertafter: EOF
    block: |
      export {{ item.key }}={{ item.value }}
    marker: "# {mark} {{ item.key }}"
  with_dict:
    "{{ enviornment_variables['PRODUCTION'] }}"
PS: The spelling error in "environment" literally took me 20+ minutes to identify!