How do I use Ansible's "include_vars" with a GitHub file? - github

I have a YML file stored in a private GitHub repository and am trying to use Ansible's include_vars module to load these variables from a copy of the file downloaded into the playbook_dir.
The tasks responsible for this behavior all pass; however, a later job fails because one of the variables is undefined, which makes me think that I am not correctly adding the values to the directory.
My task.yml file looks like:
---
- name: Ensure the application directory exists
  win_file:
    path: '{{ playbook_dir }}'
    state: directory

- name: Download vars file from VCS
  win_get_url:
    url: 'https://raw.githubusercontent.com/<org>/<repo>/main/<path-to-file>/DEV1.yml?token=<temp raw file token>'
    dest: '{{ playbook_dir }}/DEV1.yml'
    force: true
  register: result

- name: Include vars file for deployment
  include_vars:
    dir: '{{ playbook_dir }}'
    files_matching: DEV1.yml
  register: varfile

- debug:
    msg: '{{ varfile }}'
After running this, the console output for the debug task is:
12:35:27 TASK [get_vars : debug] ********************************************************
12:35:27
12:35:27 {"failed":false,"ansible_included_var_files":[],"ansible_facts":{},"changed":false}
Am I not able to download files from GitHub through Ansible?
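For what it's worth, here is a minimal sketch of one alternative, assuming the underlying issue is that win_get_url saves the file on the Windows target while include_vars reads files from the Ansible control node; the module names are standard Ansible, and the URL placeholders are copied from the question:
- name: Download vars file to the control node
  delegate_to: localhost
  ansible.builtin.get_url:
    url: 'https://raw.githubusercontent.com/<org>/<repo>/main/<path-to-file>/DEV1.yml?token=<temp raw file token>'
    dest: '{{ playbook_dir }}/DEV1.yml'
    force: true

- name: Include vars file for deployment
  ansible.builtin.include_vars:
    # load the single downloaded file directly instead of scanning a directory
    file: '{{ playbook_dir }}/DEV1.yml'
If the file also needs to exist on the Windows host, the original win_get_url task can stay; the sketch only changes where the file is placed so that include_vars can find it.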

Related

Github action: stored .env file content into github secrets and in pipeline want to put secret content in .env file

I stored the production .env file content in GitHub secrets (in a single variable) and want to create the .env file in the pipeline and put the secret content into it.
I tried the following methods:
...
env:
  ENV_CONTENT: ${{ secrets.ENV_DEV }}
...
run: |
  touch .env
  echo $ENV_CONTENT
  echo $ENV_CONTENT >> .env
  cat .env
...
run: |
  echo ${{ secrets.ENV_DEV }} >> .env
  cat .env
...
Output: the variable in the .env file does not get defined.
> demo#1.0.0 deploy:dev /home/runner/work/lvld-api/lvld-api
> NODE_ENV=dev serverless deploy --stage dev
Serverless: Deprecation warning: Detected ".env" files. Note that Framework now supports loading variables from those files when "useDotenv: true" is set (and that will be the default from next major release)
More Info: https://www.serverless.com/framework/docs/deprecations/#LOAD_VARIABLES_FROM_ENV_FILES
Serverless: DOTENV: Loading environment variables from .env:
Serverless: - STAGE
Serverless Warning --------------------------------------
A valid environment variable to satisfy the declaration 'env:REGION' could not be found.
Serverless Warning --------------------------------------
Main.yml: https://drive.google.com/file/d/1PK4SlyXkC7xRn_eM2SQO1rkWjoJaOYaZ/view?usp=sharing
GithubAction Log: https://drive.google.com/file/d/1YvBfdle1GpomJpyuqneQt0PK-OYShXZC/view?usp=sharing
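One common cause of this is that an unquoted echo collapses a multi-line secret onto a single line; here is a minimal sketch of a step that avoids it (the step name is made up, ENV_DEV is the secret from the question):
- name: Create .env from secret
  env:
    ENV_CONTENT: ${{ secrets.ENV_DEV }}
  run: |
    # quoting preserves the newlines of the multi-line secret
    printf '%s\n' "$ENV_CONTENT" > .env
Quoting the variable (or using a heredoc) keeps each KEY=VALUE pair on its own line in the resulting .env file.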

Pass build directory (/dist) from a job to next job in concourse

I know it's not quite simple to do this; I have tried to explore many approaches, but either I couldn't understand them properly or they didn't work for me.
I have a Concourse job which runs an Angular build (ng build) and creates a /dist folder. This works well.
jobs:
- name: cache
  plan:
  - get: source
    trigger: true
  - get: npm-cache

- name: build
  plan:
  - get: source
    trigger: true
    passed: [cache]
  - get: npm-cache
    passed: [cache]
  - task: run build
    file: source/ci/build.yml
build.yml
---
platform: linux
image_resource:
  type: docker-image
  source: { repository: alexsuch/angular-cli, tag: '7.3' }
inputs:
- name: source
- name: npm-cache
  path: /cache
outputs:
- name: artifact
run:
  path: source/ci/build.sh
build.sh
#!/bin/sh
mv cache/node_modules source
cd source
npm rebuild node-saas # temporary fix
npm run build_prod
cp -R dist ../artifact/
I have declared an output named artifact where I am storing the dist content.
But when I try to use this in the next job, it doesn't work; it fails with a missing input error.
Here is the next job that is supposed to consume this dist folder:
jobs:
...
...
- name: list
  plan:
  - get: npm-cache
    passed: [cache, test, build]
    trigger: true
  - task: list-files
    config:
      platform: linux
      image_resource:
        type: registry-image
        source: { repository: busybox }
      inputs:
      - name: artifact
      run:
        path: ls
        args: ['-la', 'artifact/']
Can anyone please help me with this? How can I use the dist folder in the job above?
I'm not quite sure why you would want different plan definitions for each task, but here is the simplest way of doing what you want:
jobs:
- name: deploying-my-app
  plan:
  - get: source
    trigger: true
    passed: []
  - get: npm-cache
    passed: []
  - task: run build
    file: source/ci/build.yml
  - task: list-files
    file: source/ci/list-files.yml
build.yml
---
platform: linux
image_resource:
  type: docker-image
  source: { repository: alexsuch/angular-cli, tag: '7.3' }
inputs:
- name: source
- name: npm-cache
  path: /cache
outputs:
- name: artifact
run:
  path: source/ci/build.sh
list-files.yml
---
platform: linux
image_resource:
  type: registry-image
  source: { repository: busybox }
inputs:
- name: artifact
run:
  path: ls
  args: ['-la', 'artifact/']
build.sh
#!/bin/sh
mv cache/node_modules source
cd source
npm rebuild node-saas # temporary fix
npm run build_prod
cp -R dist ../artifact/
Typically you would pass folders as inputs and outputs between TASKS instead of JOBS (although there are some alternatives).
Concourse is stateless by design, and that is the idea behind it. If you want to pass something between jobs, the only way to do that is to use a Concourse resource; depending on the nature of the project, that could be anything from a git repo to an S3 bucket, a Docker image, etc. You can also create your own custom resources.
Using something like the s3 Concourse resource, for example, you can push your artifact to external storage and then fetch it again in the next job's get step. But that may just add unnecessary complexity, given that what you want to do is pretty straightforward.
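For illustration, a minimal sketch of that approach (the bucket, regexp, and credential names are placeholders, and the build task would additionally have to archive its artifact output so that it matches the regexp):
resources:
- name: build-artifact
  type: s3
  source:
    bucket: my-artifacts-bucket          # placeholder bucket
    regexp: builds/dist-(.*).tar.gz      # the build step would tar dist/ into this pattern
    region_name: eu-west-1
    access_key_id: ((aws_access_key_id))
    secret_access_key: ((aws_secret_access_key))

jobs:
- name: build
  plan:
  - get: source
    trigger: true
  - task: run build
    file: source/ci/build.yml
  - put: build-artifact
    params:
      file: artifact/dist-*.tar.gz

- name: deploy
  plan:
  - get: build-artifact
    passed: [build]
    trigger: true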
In my experience, the visual layout of a job plan in the Concourse dashboard sometimes gives the impression that a job plan should be task-atomic, which is not always needed.
Hope that helps.

How to add a callback-plugin to AWX docker

I installed the AWX Docker setup from here - https://github.com/ansible/awx. I am trying to add a callback plugin for a specific project as described here - https://docs.ansible.com/ansible-tower/latest/html/administration/tipsandtricks.html#using-callback-plugins-with-tower. It does not work. I added the following lines to Template -> EXTRA VARIABLES:
---
bin_ansible_callbacks: true
callback_plugins: /callback_plugins
stdout_callback: selective
That does not work.
I also added the directory /var/lib/awx/projects/test/callback_plugins/ to SETTINGS -> JOBS -> ANSIBLE CALLBACK PLUGINS; that doesn't work either.
Please tell me how to do this correctly, so that a custom plugin is picked up and used.
I'm hitting the same problem; after some debugging I opened an issue on the AWX project: https://github.com/ansible/awx/issues/4149
In the meantime I've applied a workaround that consists of creating a symlink for each callback plugin you want to use in the callback_plugins folder of your roles project.
For example, if you are using the ara project:
- name: Research for callbacks in virtualenv libs
  find:
    path: '{{ ansible_playbook_python|dirname|dirname }}/{{ item }}'
    file_type: file
    depth: 1
    patterns: '*.py'
    excludes: '__init__*'
  register: _internal__callbacks
  with_items:
    - lib/python3.6/site-packages/ara/plugins/callbacks

# TODO : prevent existing callbacks to be overwritten
- name: Create symlinks from virtualenv lib directory to local callback_plugins/
  file:
    src: '{{ item }}'
    dest: '{{ playbook_dir }}/callback_plugins/{{ item|basename }}'
    state: link
  with_items: "{{ _internal__callbacks.results|map(attribute='files')|flatten|map(attribute='path')|list }}"
It seems like you have to use callbacks_enabled (called callback_whitelist on older Ansible, as below) instead of callback_plugins. Put this example configuration in the /var/lib/awx/ansible.cfg file:
[defaults]
callback_whitelist = profile_tasks
--- works on AWX 17.x

Combining ansible tasks to one task with one definition of "changed"

As part of deployment there is a bit of compilation I want to do on the host. This consists of moving the source files, compiling the program, and removing the source files. I would want it to work in such a way that this results in just ok rather than changed if the program did not change.
This would accurately describe the situation: if the program did not change, running the playbook creates an (assumedly) non-existent directory, runs a command that produces a file which is then moved to where an identical copy used to be, and finally removes the source files, so the created directory is once again non-existent.
Concretely, the tasks would be something like:
- copy:
    src: src/
    dest: /tmp/my_program_src/

- shell: my_compiler -o /usr/local/bin/my_program /tmp/my_program_src/main.file
  become: true

- file:
    path: /tmp/my_program_src/
    state: absent
Of course, what actually happens is that all three report "changed": for shell I would have to define changed_when myself, and copy as well as file each change something, even though their effects cancel each other out.
Can I group this together into one task which reports ok if /usr/local/bin/my_program did not change? If yes, then how? If no, then what would be the 'right' way to do something like this?
IMHO, I recommend doing it the Ansible way, like this. The other option is generating a script, calling it with command:, and then checking the SHA1 with Ansible, but I don't like that option.
---
- name: Example
  hosts: localhost
  gather_facts: False
  connection: local
  tasks:
    - name: Get cksum of my program
      stat:
        path: "/usr/local/bin/my_program"
      register: myprogram1

    - name: Current SHA1
      set_fact:
        mp1sha1: "{{ myprogram1.stat.checksum }}"

    - name: Copy File
      copy:
        src: src/
        dest: /tmp/my_program_src/
      changed_when: False

    - name: Compile
      shell: my_compiler -o /usr/local/bin/my_program /tmp/my_program_src/main.file
      become: true
      changed_when: False

    - name: Remove src
      file:
        path: /tmp/my_program_src/
        state: absent
      changed_when: False

    - name: Get cksum of my program
      stat:
        path: "/usr/local/bin/my_program"
      register: myprogram2

    - name: Current SHA1
      set_fact:
        mp2sha1: "{{ myprogram2.stat.checksum }}"
    - name: Compilation Changed
      debug:
        msg: "Check Compilation"
      # report "changed" only when the binary's checksum actually differs
      changed_when: mp2sha1 != mp1sha1
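Another option, not from the answer above, is to compile to a temporary path and let copy with remote_src decide "changed" by comparing checksums, so only the install step can ever report a change. A minimal sketch (the intermediate /tmp/my_program_src/my_program path is made up; the other paths come from the question):
- name: Copy sources to the host
  copy:
    src: src/
    dest: /tmp/my_program_src/
  changed_when: False

- name: Compile to a temporary location
  shell: my_compiler -o /tmp/my_program_src/my_program /tmp/my_program_src/main.file
  changed_when: False

- name: Install the binary only if its content changed
  copy:
    # remote_src copies a file already on the host and only reports "changed"
    # when the destination content actually differs
    src: /tmp/my_program_src/my_program
    dest: /usr/local/bin/my_program
    remote_src: true
    mode: '0755'
  become: true

- name: Remove src
  file:
    path: /tmp/my_program_src/
    state: absent
  changed_when: False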

Concourse task input folder is empty

I'm experimenting with building a gradle based java app. My pipeline looks like this:
---
resources:
- name: hello-concourse-repo
  type: git
  source:
    uri: https://github.com/ractive/hello-concourse.git

jobs:
- name: gradle-build
  public: true
  plan:
  - get: hello-concourse-repo
    trigger: true
  - task: build
    file: hello-concourse-repo/ci/build.yml
  - task: find
    file: hello-concourse-repo/ci/find.yml
The build.yml looks like:
---
platform: linux
image_resource:
  type: docker-image
  source:
    repository: java
    tag: openjdk-8
inputs:
- name: hello-concourse-repo
outputs:
- name: output
run:
  path: hello-concourse-repo/ci/build.sh
caches:
- path: .gradle/
And the build.sh:
#!/bin/bash
export ROOT_FOLDER=$( pwd )
export GRADLE_USER_HOME="${ROOT_FOLDER}/.gradle"
export TERM=${TERM:-dumb}
cd hello-concourse-repo
./gradlew --no-daemon build
mkdir -p output
cp build/libs/*.jar output
cp src/main/docker/* output
ls -l output
And finally find.yml
---
platform: linux
image_resource:
  type: docker-image
  source: {repository: busybox}
inputs:
- name: output
run:
  path: ls
  args: ['-alR']
The output of ls at the end of the build.sh script shows me that the output folder contains the expected files, but the find task only shows empty folders.
What am I doing wrong that makes the output folder, which I'm using as an input in the find task, end up empty?
The complete example can be found here with the concourse files in the ci subfolder.
You need to remember some things:
There is an initial working directory for your tasks; let's call it '.' (unless you specify 'dir'). In this initial directory you will find a directory for each of your inputs and outputs, i.e.:
./hello-concourse-repo
./output
When you declare an output, there's no need to create the 'output' folder from your script; it will be created automatically.
If you navigate to a different folder in your script, you need to return to the initial working directory or use relative paths to find other folders.
Below you will find the updated script with some comments to fix the problem:
#!/bin/bash
export ROOT_FOLDER=$( pwd )
export GRADLE_USER_HOME="${ROOT_FOLDER}/.gradle"
export TERM=${TERM:-dumb}
cd hello-concourse-repo # You changed directory here, so your 'output' folder is now at ../output
./gradlew --no-daemon build
# Either cd back to "$ROOT_FOLDER" here, or refer to the output folder as ../output ($ROOT_FOLDER/output) below.
# mkdir -p output  <- this line is not required; you already declared an output with this name
cp build/libs/*.jar ../output
cp src/main/docker/* ../output
ls -l ../output
Since you are defining the ROOT_FOLDER variable, you can use it to navigate.
After the cd you are still inside hello-concourse-repo, so you need to go up one level to reach output.