Currently working on an Ansible playbook for PostgreSQL installation.
I'm trying to insert multiple values into shared_preload_libraries. The idea is to be able to insert them conditionally, and different roles must be able to append to this line without overwriting it.
Current solution:
- name: Check whether postgresql.conf contains "pg_stat_statements"
  command: grep "pg_stat_statements" "{{ pg_data }}/postgresql.conf"
  register: check_shared_libs
  check_mode: no
  ignore_errors: yes
  changed_when: no

- name: Modify postgresql.conf add pg_stat_statements shared library.
  replace:
    backup: yes
    dest: "{{ pg_data }}/postgresql.conf"
    regexp: "^#?(shared_preload_libraries = '.*)(')"
    replace: '\1,pg_stat_statements\2'
  when: check_shared_libs.rc != 0
  notify: restart postgres
which gives me something like this: shared_preload_libraries = ',pg_stat_statements,powa,pg_stat_kcache,pg_qualstats,pg_wait_sampling'
It works fine, but I wonder if there's a better way to do it.
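One alternative I've been toying with (only a sketch; pg_shared_preload_libraries would be a hypothetical list variable that the roles append to) is to render the whole line in a single lineinfile task, which would also avoid the leading comma shown above:
- name: Set shared_preload_libraries from a list
  lineinfile:
    dest: "{{ pg_data }}/postgresql.conf"
    # matches both the commented-out default and a previously written setting
    regexp: '^#?shared_preload_libraries\s*='
    line: "shared_preload_libraries = '{{ pg_shared_preload_libraries | join(',') }}'"
    backup: yes
  notify: restart postgres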
One way to avoid using the command task is shown below. It consists of two tasks, because reading a remote file into a variable can't be done with a lookup plugin.
So, in step 1 we read the whole file into a variable, and in step 2 we check whether that variable contains pg_stat_statements:
code:
---
- name: test play
  hosts: localhost
  connection: local
  gather_facts: false
  become: yes
  vars:
    pg_stat_statements_exists: false
  tasks:
    - name: read file
      slurp:
        src: /tmp/postgresql.conf
      register: postgresql_file

    - name: check pg_stat_statements exists
      set_fact:
        pg_stat_statements_exists: true
      when: postgresql_file['content'] | b64decode is search('pg_stat_statements')

    - name: print variable that will control the replace task that follows
      debug:
        var: pg_stat_statements_exists
and adding your replace task with the modified when condition:
- name: Modify postgresql.conf add pg_stat_statements shared library.
  replace:
    backup: yes
    dest: "{{ pg_data }}/postgresql.conf"
    regexp: "^#?(shared_preload_libraries = '.*)(')"
    replace: '\1,pg_stat_statements\2'
  when: not pg_stat_statements_exists
  notify: restart postgres
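A slightly more compact variant (just a sketch) skips the intermediate fact and puts the search test directly on the replace task:
- name: Modify postgresql.conf add pg_stat_statements shared library.
  replace:
    backup: yes
    dest: "{{ pg_data }}/postgresql.conf"
    regexp: "^#?(shared_preload_libraries = '.*)(')"
    replace: '\1,pg_stat_statements\2'
  when: postgresql_file['content'] | b64decode is not search('pg_stat_statements')
  notify: restart postgres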
hope it helps
I have a YML file stored in a private GitHub repository and am trying to use Ansible's include_vars module to load these variables from the playbook_dir.
The tasks responsible for this behavior all pass; however, a later job fails because one of the variables is undefined, which makes me think I'm not correctly adding the values to the directory.
My task.yml file looks like:
---
- name: Ensure the application directory exists
  win_file:
    path: '{{ playbook_dir }}'
    state: directory

- name: Download vars file from VCS
  win_get_url:
    url: 'https://raw.githubusercontent.com/<org>/<repo>/main/<path-to-file>/DEV1.yml?token=<temp raw file token>'
    dest: '{{ playbook_dir }}/DEV1.yml'
    force: true
  register: result

- name: Include vars file for deployment
  include_vars:
    dir: '{{ playbook_dir }}'
    files_matching: DEV1.yml
  register: varfile

- debug:
    msg: '{{ varfile }}'
After running this, the console output for the debug task is:
12:35:27 TASK [get_vars : debug] ********************************************************
12:35:27
12:35:27 {"failed":false,"ansible_included_var_files":[],"ansible_facts":{},"changed":false}
Am I not able to download files from GitHub through Ansible?
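Before assuming the download itself is impossible, one quick check (just a sketch, reusing the paths exactly as above) is to slurp the downloaded file and print it, to confirm it really contains the expected YAML rather than, say, an HTML error page served for the tokenised raw URL:
- name: Show what was actually downloaded
  slurp:
    src: '{{ playbook_dir }}/DEV1.yml'
  register: dev1_raw

- debug:
    msg: '{{ dev1_raw.content | b64decode }}'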
I installed AWX with Docker from here - https://github.com/ansible/awx. I am trying to add a callback plugin for a specific project as described here - https://docs.ansible.com/ansible-tower/latest/html/administration/tipsandtricks.html#using-callback-plugins-with-tower. It does not work. I add the following lines to Template -> EXTRA VARIABLES:
---
bin_ansible_callbacks: true
callback_plugins: /callback_plugins
stdout_callback: selective
It does not work.
I also added the directory /var/lib/awx/projects/test/callback_plugins/ to SETTINGS -> JOBS -> ANSIBLE CALLBACK PLUGINS - that doesn't work either.
Please tell me how to do this correctly, so that a custom callback plugin gets picked up and actually runs.
I'm hitting the same problem; after some debugging I opened an issue on the AWX project: https://github.com/ansible/awx/issues/4149
In the meantime I've applied a workaround that consists in creating a symlink, for each callback plugin you want to use, in the callback_plugins folder of your roles project.
For example, if you are using the ara project:
- name: Research for callbacks in virtualenv libs
  find:
    path: '{{ ansible_playbook_python|dirname|dirname }}/{{ item }}'
    file_type: file
    depth: 1
    patterns: '*.py'
    excludes: '__init__*'
  register: _internal__callbacks
  with_items:
    - lib/python3.6/site-packages/ara/plugins/callbacks

# TODO : prevent existing callbacks to be overwritten
- name: Create symlinks from virtualenv lib directory to local callback_plugins/
  file:
    src: '{{ item }}'
    dest: '{{ playbook_dir }}/callback_plugins/{{ item|basename }}'
    state: link
  with_items: "{{ _internal__callbacks.results|map(attribute='files')|flatten|map(attribute='path')|list }}"
It seems like you have to enable the callback (callbacks_enabled, or callback_whitelist on older Ansible) rather than just setting callback_plugins. Put this example configuration in the /var/lib/awx/ansible.cfg file:
[defaults]
callback_whitelist = profile_tasks
(This works on AWX 17.x.)
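For a custom plugin, the equivalent configuration would look roughly like this (a sketch; my_custom_callback stands in for the name of your plugin, and the plugin path is just the project directory from the question):
[defaults]
callback_plugins = /var/lib/awx/projects/test/callback_plugins
callback_whitelist = my_custom_callback
stdout_callback = my_custom_callback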
As part of a deployment there is a bit of compilation I want to do on the host. This consists of moving the source files over, compiling the program, and removing the source files. I would want it to work in such a way that the result is just ok rather than changed if the program did not change.
That would accurately describe the situation, because if the program did not change, then running the playbook creates a (presumably) non-existent directory, runs a command producing a file that is then moved to where an identical copy used to be, and finally removes the source files, so the created directory is once again non-existent.
Concretely, the tasks would be something like:
- copy:
    src: src/
    dest: /tmp/my_program_src/

- shell: my_compiler -o /usr/local/bin/my_program /tmp/my_program_src/main.file
  become: true

- file:
    path: /tmp/my_program_src/
    state: absent
Of course what actually happens is that all three report "changed": for shell I would have to define changed_when myself, and copy as well as file do change something, even though those changes cancel each other out.
Can I group this together into one task which reports ok if /usr/local/bin/my_program did not change? If yes, then how? If no, then what would be the 'right' way to do something like this?
IMHO, I recommend doing it the Ansible way, like this. The other option is generating a script and calling it with command:, then checking the SHA1 with Ansible, but I don't like that option.
---
- name: Example
  hosts: localhost
  gather_facts: False
  connection: local
  tasks:
    - name: Get cksum of my program
      stat:
        path: "/usr/local/bin/my_program"
      register: myprogram1

    - name: Current SHA1
      set_fact:
        # default('') covers the first run, when the binary does not exist yet
        mp1sha1: "{{ myprogram1.stat.checksum | default('') }}"

    - name: Copy File
      copy:
        src: src/
        dest: /tmp/my_program_src/
      changed_when: False

    - name: Compile
      shell: my_compiler -o /usr/local/bin/my_program /tmp/my_program_src/main.file
      become: true
      changed_when: False

    - name: Remove src
      file:
        path: /tmp/my_program_src/
        state: absent
      changed_when: False

    - name: Get cksum of my program
      stat:
        path: "/usr/local/bin/my_program"
      register: myprogram2

    - name: Current SHA1
      set_fact:
        mp2sha1: "{{ myprogram2.stat.checksum }}"

    - name: Compilation Changed
      debug:
        msg: "Check Compilation"
      # report changed only when the binary's checksum actually changed
      changed_when: mp2sha1 != mp1sha1
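Another way to get the same effect without the manual checksum bookkeeping (a sketch, not the approach above): compile into the temporary directory and let copy with remote_src decide whether the installed binary changed, since copy only reports changed when the content differs:
- name: Copy sources
  copy:
    src: src/
    dest: /tmp/my_program_src/
  changed_when: False

- name: Compile into the build directory
  shell: my_compiler -o /tmp/my_program_src/my_program /tmp/my_program_src/main.file
  changed_when: False

- name: Install the binary only if its content changed
  copy:
    src: /tmp/my_program_src/my_program
    dest: /usr/local/bin/my_program
    remote_src: True
    mode: '0755'
  become: true

- name: Remove src
  file:
    path: /tmp/my_program_src/
    state: absent
  changed_when: False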
I'll try to make this brief... I'm setting up Ansible to write a PostgreSQL pg_hba.conf file, and what I want to do is permit any db server to replicate to any other db server, so that I don't have to reconfigure in the event of a failure. I want Ansible to insert lines for each host listed in the group "db". These entries must be CIDR types. So far I've only succeeded in getting each system to show its own CIDR in the file. I've looked extensively with no joy, but here's what I'm trying to use:
- name: Update the pg_hba.conf file
  lineinfile:
    path: '{{ pg_data }}/{{ pg_cluster_name }}/pg_hba.conf'
    regexp: 'hostssl replication'
    insertafter: 'hostssl replication'
    line: "hostssl replication rplctn_usr {{ hostvars[ '{{ item }}' ]['ansible_default_ipv4']['address'] }}/32 md5"
  with_items: groups['db']
  tags:
    - "pg_hba.conf"
Nothing I've done gets the {{ item }} variable to expand properly. Anyone?
Firstly, you need to reference the var to iterate through with braces:
with_items: "{{ groups['db'] }}"
Second, item is the var representing the value of each iteration. Inside {{ }} you can reference any vars directly without extra braces:
{{ hostvars[item]['ansible_default_ipv4']['address'] }}
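Putting both fixes together, the task would look something like this (note the regexp is dropped here; with a regexp, lineinfile keeps rewriting a single matching line instead of adding one line per host):
- name: Update the pg_hba.conf file
  lineinfile:
    path: '{{ pg_data }}/{{ pg_cluster_name }}/pg_hba.conf'
    insertafter: 'hostssl replication'
    line: "hostssl replication rplctn_usr {{ hostvars[item]['ansible_default_ipv4']['address'] }}/32 md5"
  with_items: "{{ groups['db'] }}"
  tags:
    - "pg_hba.conf"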
I'm sending a config file to thousands of nodes. Because of some customisation there are maybe 5 or 6 possible paths for that file (there's only one file per host, but the path can vary), and there isn't an easy way to figure out the default location with facts.
Based on this, I'm looking for some way to set the dest of the copy module the way we can set the src, with a with_first_found loop.
Something like this:
- copy: src=/foo/{{ ansible_hostname }}/nrpe.cfg dest="{{ item }}"
  with_items:
    - "/etc/nagios/nrpe.cfg"
    - "/usr/local/nagios/etc/nrpe.cfg"
    - "/usr/lib64/nagios/etc/nrpe.cfg"
    - "/usr/lib/nagios/etc/nrpe.cfg"
    - "/opt/nagios/etc/nrpe.cfg"
PS: I'm sending nrpe.cfg, so if someone knows a better way to find where the default nrpe.cfg lives, that would make this a lot easier.
EDIT 1: I've managed to make it work with the help of @ydaetskcoR, like this:
- name: find nrpe.cfg
  stat:
    path: "{{ item }}"
  with_items:
    - "/etc/nagios/nrpe.cfg"
    - "/usr/local/nagios/etc/nrpe.cfg"
    - "/usr/lib64/nagios/etc/nrpe.cfg"
    - "/usr/lib/nagios/etc/nrpe.cfg"
    - "/opt/nagios/etc/nrpe.cfg"
  register: nrpe_stat
  no_log: True

- name: Copy nrpe.cfg
  copy: src=/foo/{{ ansible_hostname }}/nrpe.cfg dest="{{ item.stat.path }}"
  when: item.stat.exists
  no_log: True
  with_items:
    - "{{ nrpe_stat.results }}"
One option could be to simply search for the already existing nrpe.cfg file and then register that location as a variable to be used for the copy task.
You could do that either through a shell/command task that just uses find or loop through a bunch of locations with stat to check if they exist.
So you might have something like this:
- name: find nrpe.cfg
  shell: find / -name nrpe.cfg
  register: nrpe_path

- name: overwrite nrpe.cfg
  copy: src=/foo/{{ ansible_hostname }}/nrpe.cfg dest="{{ item }}"
  with_items:
    - "{{ nrpe_path.stdout_lines }}"
  when: nrpe_path.stdout != ""
  register: nrpe_copied
- name: copy nrpe.cfg to box if not already there
  copy: src=/foo/{{ ansible_hostname }}/nrpe.cfg dest="{{ default_nrpe_path }}"
  when: nrpe_path.stdout == ""
As Mxx pointed out in the comments, we have a third task to fall back to copying to some default path (potentially /etc/nagios/ or any other path really) if the nrpe.cfg file hasn't been found by find.
To use stat rather than a shell/command task you could do something like this:
- name: find nrpe.cfg
  stat:
    path: "{{ item }}"
  with_items:
    - "/etc/nagios/nrpe.cfg"
    - "/usr/local/nagios/etc/nrpe.cfg"
    - "/usr/lib64/nagios/etc/nrpe.cfg"
    - "/usr/lib/nagios/etc/nrpe.cfg"
    - "/opt/nagios/etc/nrpe.cfg"
  register: nrpe_stat

- name: overwrite nrpe.cfg
  copy: src=/foo/{{ ansible_hostname }}/nrpe.cfg dest="{{ item.stat.path }}"
  when: item.stat.exists
  with_items:
    - "{{ nrpe_stat.results }}"