User Environment Variable with airflow.webserver.defaultUser.password - kubernetes

I am using the Helm chart for Apache Airflow and trying to set the password of the default user to the value of an environment variable:
airflow:
  env:
    - name: PASSWORD
      value: Hello, World!
  webserver:
    defaultUser:
      password: $PASSWORD
However, this sets the password to the literal string $PASSWORD instead of Hello, World!.
I have tried other forms like password: ${PASSWORD} to no avail.

Use the following syntax, as described in the official examples:
$(PASSWORD)
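In context, the values might look like this (a sketch assuming the chart resolves $(NAME) references against entries defined in airflow.env, as the answer describes):

```yaml
airflow:
  env:
    - name: PASSWORD
      value: Hello, World!
  webserver:
    defaultUser:
      # $(PASSWORD) is resolved from the env entry above;
      # $PASSWORD and ${PASSWORD} are passed through literally
      password: $(PASSWORD)
```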

Using AWS CLI from Azure pipeline

I'm trying to use AWS cli within a script section of an Azure pipeline. The script section is in a template file and it's accessed from the main pipeline.
steps:
  - bash: |
      step_function_state=`aws stepfunctions list-executions --state-machine-arn $(stateMachineArn) --status-filter RUNNING | jq -r '.executions[]|.status' | head -1`
      echo "State machine RUNNING status: ${step_function_state}"
      # Rest of the script
    displayName: "Test Script"
    env:
      AWS_ACCESS_KEY_ID: $(AWS_ACCESS_KEY_ID)
      AWS_DEFAULT_REGION: $(AWS_DEFAULT_REGION)
      AWS_SECRET_ACCESS_KEY: $(AWS_SECRET_ACCESS_KEY)
stateMachineArn, AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY and AWS_DEFAULT_REGION are stored in a variable group. When I run the pipeline, it gives the following error:
An error occurred (UnrecognizedClientException) when calling the ListExecutions operation: The security token included in the request is invalid.
Using the same credentials I am able to run the CLI locally and get results.
I tried the printenv command, and all the AWS variables are present in the environment too. What could I possibly be doing wrong?
I realized that this issue occurred due to a credential mismatch.
After adding the correct credentials (the same ones as the local CLI), the pipeline CLI also started to work.
Based on the error log it felt like aws_session_token could be the issue, but the actual problem was in aws_access_key_id and aws_secret_access_key.
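A quick way to debug this kind of mismatch is to add a step that asks STS which identity the injected credentials actually resolve to, and compare that against what your local CLI reports. The step below is illustrative, not from the original pipeline:

```yaml
steps:
  - bash: |
      # Prints the account ID and ARN that these credentials map to
      aws sts get-caller-identity
    displayName: "Check AWS identity"
    env:
      AWS_ACCESS_KEY_ID: $(AWS_ACCESS_KEY_ID)
      AWS_SECRET_ACCESS_KEY: $(AWS_SECRET_ACCESS_KEY)
      AWS_DEFAULT_REGION: $(AWS_DEFAULT_REGION)
```

If this step fails with the same UnrecognizedClientException, the key pair itself is wrong; if it succeeds but shows an unexpected account, the variable group holds credentials for a different account.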

azure pipelines : accessing secret variables

I am trying to access a secret variable to pass it to another script.
I expect the following code in the pipeline to print the value, but it prints the text 'xxx' regardless of the secret variable's value:
echo xxx
Pipeline Snippet
steps:
  - bash: echo This script could use $SYSTEM_ACCESSTOKEN
    env:
      SYSTEM_ACCESSTOKEN: $(System.AccessToken)
If you want to access a secret variable, you could print it to a file. Check the example below:
steps:
  - powershell: |
      $env:var1 | Out-File C:\Users\xxx\Desktop\Newfolder\debug.txt
    displayName: 'PowerShell Script'
    env:
      var1: $(System.AccessToken)
But System.AccessToken is a PAT token generated for the service identity “Project Collection Build Service (account)”, so there is normally no need to verify its value. In addition, if you want to print the value of System.AccessToken to a file, you need to check Allow scripts to access the OAuth token in the agent job settings:
Azure Pipelines will scan the output and mask the secret, but you can simply split it up and print it in two parts.
Here is a bash example:
echo "${MY_SECRET:0:10}"   # Print the first 10 characters
echo "${MY_SECRET:10:100}" # Print up to 100 characters starting from character 11
You should of course only do it for debugging purposes and not leave it in your pipeline.
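The splitting trick relies on plain bash substring expansion, ${VAR:offset:length}. A self-contained sketch with a made-up secret value (the masking scanner only matches the full secret string, so neither half triggers it):

```shell
# Hypothetical secret value, purely to demonstrate the substring expansion
MY_SECRET="abcdefghijklmnopqrstuvwxyz"

# Print the secret in two halves so the log scanner never sees the full string
echo "${MY_SECRET:0:10}"  # first 10 characters
echo "${MY_SECRET:10}"    # everything from character 11 onward
```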
Updates:
If I save the secret value to a file and publish that file as an artifact, the secret is visible in cleartext.
After speaking to my colleagues I have realized that any text in the logs that contains a secret value will be masked.
It will be interesting to see what happens if I have two variables, viz.
OPEN_VAR='something' # not a secret
and
SECRET_VAR='something' # same value as above, but a secret
If I print $OPEN_VAR, does it mask the value because "something" is also the value of SECRET_VAR?
This is because SYSTEM_ACCESSTOKEN is a secret. If you do the same with a variable that is not a secret, you will be able to see its value.

Ansible uri module

I have a web URL with two options, enable/disable. I am now writing an Ansible playbook to do this, which currently exists as a PowerShell script.
My ansible playbook
---
- hosts: localhost
  vars:
    applauncher: "xweb"
    jobState: "Disable"
    serverName: "NETBATCH"
    jobName: "Loan1"
  tasks:
    - name: edit app jobs
      uri:
        url: 'http://{{ applauncher }}/Edit/{{ jobState }}?Server={{ serverName }}&JobName={{ jobName }}'
        method: POST
        user: xxxx
        password: xxx
I am passing extra vars for the disable/enable option. Is it possible to use this option in the uri module, and which method should I use to select disable/enable in the URL?
I think this is what you are after, but it took me a while to figure out what you are asking for. Read about the vars_prompt playbook keyword: https://docs.ansible.com/ansible/latest/user_guide/playbooks_prompts.html
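A minimal sketch of that approach (prompt text and default are illustrative; the other variables are as in the playbook above):

```yaml
---
- hosts: localhost
  vars_prompt:
    - name: jobState
      prompt: "Enable or Disable the job?"
      default: "Disable"
      private: no
  vars:
    applauncher: "xweb"
    serverName: "NETBATCH"
    jobName: "Loan1"
  tasks:
    - name: edit app jobs
      uri:
        url: 'http://{{ applauncher }}/Edit/{{ jobState }}?Server={{ serverName }}&JobName={{ jobName }}'
        method: POST
        user: xxxx
        password: xxx
```

Alternatively, since extra vars override everything else, a plain `ansible-playbook play.yml -e jobState=Enable` would also switch the URL between the two options without prompting.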

docker-compose mongodb access env variables in /docker-entrypoint-initdb.d/ script

This question is based on the top answer to a previous question on the same topic.
My question is: in my custom /docker-entrypoint-initdb.d/ init script, how can I reference env variables that are declared in docker-compose's .env file? Meaning, env variables besides MONGO_INITDB_ROOT_USERNAME and MONGO_INITDB_ROOT_PASSWORD.
For example:
mongo --eval "db.getSiblingDB('sibling').createUser({user: '$SIBLING_USER', pwd: '$SIBLING_PASSWORD', roles: [{ role: 'readWrite', db: 'sibling' }]})"
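One approach (a sketch, not taken from the answer below): docker-compose substitutes ${VAR} references from the .env file next to the compose file, and any variable declared under the service's environment: key is visible to the init scripts the mongo image runs. Service name and the SIBLING_* variable names are illustrative:

```yaml
services:
  mongo:
    image: mongo
    environment:
      MONGO_INITDB_ROOT_USERNAME: ${MONGO_INITDB_ROOT_USERNAME}
      MONGO_INITDB_ROOT_PASSWORD: ${MONGO_INITDB_ROOT_PASSWORD}
      # Extra variables from .env, made available to init scripts
      SIBLING_USER: ${SIBLING_USER}
      SIBLING_PASSWORD: ${SIBLING_PASSWORD}
    volumes:
      - ./init.sh:/docker-entrypoint-initdb.d/init.sh
```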
I did the following for a reverse proxy using NGINX, where it loads a different config file based on an env variable.
docker-compose.yml:
https-proxy:
  build:
    context: ./https-proxy
    dockerfile: ./Dockerfile
    args:
      - MY_VAR=TRUE
Dockerfile:
FROM nginx
ARG MY_VAR
ENV MY_VAR=${MY_VAR}
RUN bash ./etc/nginx/config.sh
config.sh:
#!/bin/bash
if [ "$MY_VAR" == true ]; then
  echo 'My Var is True'
else
  echo 'My Var is False'
fi
You could also define an .env file alongside your docker-compose.yml, so you don't have to change that file and can define the values in a separate place where docker-compose will look for them.
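The branch in config.sh can be sanity-checked locally without a container build (a sketch; the MY_VAR value is illustrative):

```shell
# Simulate the value the Dockerfile bakes in via ARG/ENV
MY_VAR=true

if [ "$MY_VAR" = "true" ]; then
  echo 'My Var is True'
else
  echo 'My Var is False'
fi
```

Note that the compose file above passes MY_VAR=TRUE in uppercase while the script compares against lowercase true; the case must match, or the else branch runs.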

setting environment variables in ansible permanently

I am using Ansible to add permanent environment variables to the Ubuntu .bashrc.
I have these settings defined in a prod_vars file:
enviornment_variables:
  PRODUCTION:
    MONGO_IP: 0.0.0.0
    MONGO_PORT: 27017
    ELASTIC_IP: localhost
    ELASTIC_PORT: 9200
How can I export them using a task? I know about the lineinfile module, but I do not want to repeat it for every env var.
- name: set env in the bashrc files
  lineinfile: dest=/home/user/.bashrc line='export MONGO_IP=enviornment_variables[PRODUCTION][MONGO_IP]'
Also, why does the above command give a syntax error?
Instead of using the lineinfile module, use the blockinfile module.
Something like this should work:
- name: Adding to environment variables for user
  blockinfile:
    path: /home/user/.bashrc
    insertafter: EOF
    block: |
      export {{ item.key }}={{ item.value }}
    marker: "# {mark} {{ item.key }}"
  with_dict: "{{ enviornment_variables['PRODUCTION'] }}"
PS: The misspelling of "environment" (as "enviornment") literally took me 20+ minutes to identify!
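For reference, the task above renders one marked block per dict entry into the .bashrc. This is a hand-written sketch of what the MONGO_IP block would look like and how sourcing the file exports the variable (the file path is a stand-in for /home/user/.bashrc):

```shell
# Sketch of the block that blockinfile appends for one dict entry;
# the BEGIN/END lines come from marker: "# {mark} {{ item.key }}"
cat > /tmp/bashrc_demo <<'EOF'
# BEGIN MONGO_IP
export MONGO_IP=0.0.0.0
# END MONGO_IP
EOF

# Sourcing the file makes the variable available in the shell
. /tmp/bashrc_demo
echo "$MONGO_IP"
```

The per-key marker matters: with the default marker, every dict entry would overwrite the same managed block, leaving only the last variable in the file.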