How to forward an Ansible AWX credential with ssh-agent

The AWX template prompts for a credential to connect via SSH to a large list of hosts.
This credential should also be used to push to a Git repository via SSH, because I don't want to create or share an SSH key on every host.
This is possible with ansible-playbook in combination with ssh-agent.
Is there also a way to use ForwardAgent with AWX?
$ cat ~/.ssh/config
Host *
    ForwardAgent yes
$ cat playbook_ssh_forward.yml
---
- name: test ForwardAgent
  hosts: all
  gather_facts: no
  tasks:
    - name: check ssh-key
      shell: ssh-add -lE md5
      register: output
    - name: show usable keys
      debug:
        var: output.stdout
...
$ ansible-playbook -i localhost, playbook_ssh_forward.yml
PLAY [test ForwardAgent] *******************************************************************************************************
TASK [check ssh-key] ***********************************************************************************************************
changed: [localhost]
TASK [show usable keys] ********************************************************************************************************
ok: [localhost] => {
"output.stdout": "2048 MD5:d8:4a:52:41:a7:9e:a3:0d:ff:ff:ff:ff:b5:a3:89:1f rsa-key-20190501 (RSA)"
}
PLAY RECAP *********************************************************************************************************************
localhost : ok=2 changed=1 unreachable=0 failed=0
AWX, however, returns "stderr": "Could not open a connection to your authentication agent."
If possible, I need the ForwardAgent option per AWX credential, not a global change to ansible.cfg:
[ssh_connection]
ssh_args = -o ForwardAgent=yes -o ControlMaster=auto -o ControlPersist=15m -q
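A possible middle ground, sketched here as an assumption rather than a tested AWX recipe: `ansible_ssh_common_args` is a standard Ansible connection variable, so scoping the same options to one inventory or group avoids touching ansible.cfg globally (whether AWX actually passes an agent socket through to the job environment is a separate question):

```yaml
# group_vars/all.yml (hypothetical placement) - the same options as the
# ansible.cfg snippet above, but scoped to this inventory only
ansible_ssh_common_args: '-o ForwardAgent=yes -o ControlMaster=auto -o ControlPersist=15m'
```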

Related

Ansible hangs on setup or playbook but SSH works fine

I am having an odd issue with Ansible connecting to any host, and I'm hoping someone can see something I'm not. I can SSH directly to the host without any issue, and I can run -m ping without issue. But that's where success ends: if I run -m setup, it appears to connect and gather some info, but subsequent connections fail.
This is a server spun up on Proxmox (7.2.11). I've done this hundreds of times without issue, which is why I can't seem to identify what has changed. I typically spin up a container and set it up with an SSH key (requiring a passphrase) for root. If it's a VM, I simply copy the public key to the root user's authorized_keys. Then I run an Ansible playbook to add the user(s) and services and lock down SSH. So my playbooks initially run as the root user. Ansible has always prompted for the passphrase and then gone along its merry way.
I'm using pipelining, but I've set it to false in testing.
I appreciate any insight you may have. Thank you.
Here's the output of a simple gather-facts run. You can see that the first two SSH: EXEC calls return a result, but the third connection hangs.
➜ ansible git:(main) ✗ ansible all -vvv -i ./inventory.yml -m setup
ansible 2.10.8
config file = /home/johndoe/NAS1-Mounts/Code/ansible/ansible.cfg
configured module search path = ['/home/johndoe/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python3/dist-packages/ansible
executable location = /usr/bin/ansible
python version = 3.10.6 (main, Nov 14 2022, 16:10:14) [GCC 11.3.0]
Using /home/johndoe/NAS1-Mounts/Code/ansible/ansible.cfg as config file
host_list declined parsing /home/johndoe/NAS1-Mounts/Code/ansible/inventory.yml as it did not pass its verify_file() method
script declined parsing /home/johndoe/NAS1-Mounts/Code/ansible/inventory.yml as it did not pass its verify_file() method
Parsed /home/johndoe/NAS1-Mounts/Code/ansible/inventory.yml inventory source with ini plugin
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
META: ran handlers
<target_server> Attempting python interpreter discovery
<10.2.0.27> ESTABLISH SSH CONNECTION FOR USER: root
<10.2.0.27> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="root"' -o ConnectTimeout=10 -o ControlPath=/home/johndoe/.dotfiles/ansible/.ansible/cp/27e670244a 10.2.0.27 '/bin/sh -c '"'"'echo PLATFORM; uname; echo FOUND; command -v '"'"'"'"'"'"'"'"'/usr/bin/python'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'python3.9'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'python3.8'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'python3.7'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'python3.6'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'python3.5'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'python2.7'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'python2.6'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'/usr/libexec/platform-python'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'/usr/bin/python3'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'python'"'"'"'"'"'"'"'"'; echo ENDFOUND && sleep 0'"'"''
<10.2.0.27> (0, b'PLATFORM\nLinux\nFOUND\n/usr/bin/python3\nENDFOUND\n', b'')
<10.2.0.27> ESTABLISH SSH CONNECTION FOR USER: root
<10.2.0.27> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="root"' -o ConnectTimeout=10 -o ControlPath=/home/johndoe/.dotfiles/ansible/.ansible/cp/27e670244a 10.2.0.27 '/bin/sh -c '"'"'/usr/bin/python3 && sleep 0'"'"''
<10.2.0.27> (0, b'{"platform_dist_result": [], "osrelease_content": "PRETTY_NAME=\\"Ubuntu 22.04.1 LTS\\"\\nNAME=\\"Ubuntu\\"\\nVERSION_ID=\\"22.04\\"\\nVERSION=\\"22.04.1 LTS (Jammy Jellyfish)\\"\\nVERSION_CODENAME=jammy\\nID=ubuntu\\nID_LIKE=debian\\nHOME_URL=\\"https://www.ubuntu.com/\\"\\nSUPPORT_URL=\\"https://help.ubuntu.com/\\"\\nBUG_REPORT_URL=\\"https://bugs.launchpad.net/ubuntu/\\"\\nPRIVACY_POLICY_URL=\\"https://www.ubuntu.com/legal/terms-and-policies/privacy-policy\\"\\nUBUNTU_CODENAME=jammy\\n"}\n', b'')
Using module file /usr/lib/python3/dist-packages/ansible/modules/setup.py
Pipelining is enabled.
<10.2.0.27> ESTABLISH SSH CONNECTION FOR USER: root
<10.2.0.27> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="root"' -o ConnectTimeout=10 -o ControlPath=/home/johndoe/.dotfiles/ansible/.ansible/cp/27e670244a 10.2.0.27 '/bin/sh -c '"'"'/usr/bin/python3 && sleep 0'"'"''
^C [ERROR]: User interrupted execution
➜ ansible git:(main) ✗
-m ping
➜ ansible git:(main) ✗ ansible all -vvv -i ./inventory.yml -m ping
ansible 2.10.8
config file = /home/johndoe/NAS1-Mounts/Code/ansible/ansible.cfg
configured module search path = ['/home/johndoe/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python3/dist-packages/ansible
executable location = /usr/bin/ansible
python version = 3.10.6 (main, Nov 14 2022, 16:10:14) [GCC 11.3.0]
Using /home/johndoe/NAS1-Mounts/Code/ansible/ansible.cfg as config file
host_list declined parsing /home/johndoe/NAS1-Mounts/Code/ansible/inventory.yml as it did not pass its verify_file() method
script declined parsing /home/johndoe/NAS1-Mounts/Code/ansible/inventory.yml as it did not pass its verify_file() method
Parsed /home/johndoe/NAS1-Mounts/Code/ansible/inventory.yml inventory source with ini plugin
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
META: ran handlers
<target_server> Attempting python interpreter discovery
<10.2.0.27> ESTABLISH SSH CONNECTION FOR USER: root
<10.2.0.27> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="root"' -o ConnectTimeout=10 -o ControlPath=/home/johndoe/.dotfiles/ansible/.ansible/cp/27e670244a 10.2.0.27 '/bin/sh -c '"'"'echo PLATFORM; uname; echo FOUND; command -v '"'"'"'"'"'"'"'"'/usr/bin/python'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'python3.9'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'python3.8'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'python3.7'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'python3.6'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'python3.5'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'python2.7'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'python2.6'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'/usr/libexec/platform-python'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'/usr/bin/python3'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'python'"'"'"'"'"'"'"'"'; echo ENDFOUND && sleep 0'"'"''
<10.2.0.27> (0, b'PLATFORM\nLinux\nFOUND\n/usr/bin/python3\nENDFOUND\n', b'')
<10.2.0.27> ESTABLISH SSH CONNECTION FOR USER: root
<10.2.0.27> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="root"' -o ConnectTimeout=10 -o ControlPath=/home/johndoe/.dotfiles/ansible/.ansible/cp/27e670244a 10.2.0.27 '/bin/sh -c '"'"'/usr/bin/python3 && sleep 0'"'"''
<10.2.0.27> (0, b'{"platform_dist_result": [], "osrelease_content": "PRETTY_NAME=\\"Ubuntu 22.04.1 LTS\\"\\nNAME=\\"Ubuntu\\"\\nVERSION_ID=\\"22.04\\"\\nVERSION=\\"22.04.1 LTS (Jammy Jellyfish)\\"\\nVERSION_CODENAME=jammy\\nID=ubuntu\\nID_LIKE=debian\\nHOME_URL=\\"https://www.ubuntu.com/\\"\\nSUPPORT_URL=\\"https://help.ubuntu.com/\\"\\nBUG_REPORT_URL=\\"https://bugs.launchpad.net/ubuntu/\\"\\nPRIVACY_POLICY_URL=\\"https://www.ubuntu.com/legal/terms-and-policies/privacy-policy\\"\\nUBUNTU_CODENAME=jammy\\n"}\n', b'')
Using module file /usr/lib/python3/dist-packages/ansible/modules/ping.py
Pipelining is enabled.
<10.2.0.27> ESTABLISH SSH CONNECTION FOR USER: root
<10.2.0.27> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="root"' -o ConnectTimeout=10 -o ControlPath=/home/johndoe/.dotfiles/ansible/.ansible/cp/27e670244a 10.2.0.27 '/bin/sh -c '"'"'/usr/bin/python3 && sleep 0'"'"''
<10.2.0.27> (0, b'\n{"ping": "pong", "invocation": {"module_args": {"data": "pong"}}}\n', b'')
target_server | SUCCESS => {
    "ansible_facts": {
        "discovered_interpreter_python": "/usr/bin/python3"
    },
    "changed": false,
    "invocation": {
        "module_args": {
            "data": "pong"
        }
    },
    "ping": "pong"
}
META: ran handlers
META: ran handlers
ansible.cfg
➜ ansible git:(main) ✗ cat ansible.cfg
[default]
inventory = /home/johndoe/NAS1-Mounts/Code/ansible/inventory.yml
# Use the Beautiful Output callback plugin.
stdout_callback = beautiful_output
# Use specific ssh key and user
# ed25519 w/ passphrase
private_key = /home/johndoe/.ssh/johndoe_default
host_key_checking = False
# For updates/maintenance as sudo user
remote_user = johndoe
# Set remote host working directory
remote_tmp = ~/.ansible/tmp
# Misc
allow_world_readable_tmpfiles = True
display_skipped_hosts = False
# display_args_to_stdout = True
# stdout_callback = full_skip
transport = ssh
[ssh_connection]
pipelining = True
timeout = 30
[connection]
pipelining = True
my inventory.yml
➜ ansible git:(main) ✗ cat inventory.yml
# Vagrant Host
#default
[workstation]
[server]
target_server ansible_user=root ansible_host=10.2.0.27 install_docker=true
[pve_container]
my .ssh/config file
➜ ansible git:(main) ✗ cat ~/.ssh/config
# Defaults
Host *
    # Default ed25519 keypair for all connections - unless otherwise specified
    IdentityFile ~/.ssh/johndoe_default
    IdentitiesOnly yes
    # Always use multiplexed sessions - unless otherwise specified in a host definition below
    ControlMaster auto
    ControlPersist yes
    ControlPath /tmp/ssh-%r@%h:%p
    ControlPersist 10m
ssh directly to host
➜ ansible git:(main) ✗ ssh root@10.2.0.27
Welcome to Ubuntu 22.04.1 LTS (GNU/Linux 5.15.0-56-generic x86_64)
* Documentation: https://help.ubuntu.com
* Management: https://landscape.canonical.com
* Support: https://ubuntu.com/advantage
System information as of Wed Dec 14 05:06:20 PM UTC 2022
System load: 0.0 Processes: 117
Usage of /: 31.8% of 14.66GB Users logged in: 1
Memory usage: 11% IPv4 address for ens18: 10.2.0.27
Swap usage: 0%
50 updates can be applied immediately.
To see these additional updates run: apt list --upgradable
Last login: Wed Dec 14 17:05:53 2022 from 10.0.2.5
root@ubuntu-ansible-test:~#
My apologies for wasting anyone's time. My issue turned out to be an MTU problem with the tunnel to the remote site: someone had set the WireGuard tunnel MTU to 1500. A packet capture showed a bunch of TCP Out-of-Order, TCP Dup ACK and TCP Retransmission segments. Setting it back to 1420 cured the issue.
Best
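As a sanity check on those numbers (a sketch; the 80-byte figure is WireGuard's worst-case IPv6 encapsulation overhead, and the host address is the one from the question):

```shell
# A 1500-byte physical MTU minus WireGuard's encapsulation overhead
# leaves at most 1420 bytes for the tunnel itself.
PHYS_MTU=1500
WG_OVERHEAD=80
TUNNEL_MTU=$((PHYS_MTU - WG_OVERHEAD))
echo "safe tunnel MTU: $TUNNEL_MTU"

# Probe the real path MTU with fragmentation disallowed; 28 bytes of
# IP+ICMP headers ride on top of the ping payload size:
# ping -M do -s $((TUNNEL_MTU - 28)) 10.2.0.27
```

A payload larger than the real path MTU would make that ping fail with "Message too long", which is the same symptom the retransmissions were hinting at.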

Host key verification failed bitbucket pipeline

Hi, I have a problem configuring a Bitbucket pipeline with SSH login on my remote server.
The output of the error is:
ssh_askpass: exec(/usr/bin/ssh-askpass): No such file or directory
Host key verification failed
These are the steps I follow:
generate private and public keys (without a password) on my server using this command: ssh-keygen -t rsa -b 4096
add the base64-encoded private key under Repository Settings->Pipelines->Deployments->Staging environments
push the file "my_known_hosts", created with ssh-keyscan -t rsa myserverip > my_known_hosts, to the repository
I also tried another test:
generate keys from Repository Settings
copy the public key to the authorized_keys file on my remote server
type the IP of my remote server in "Known hosts", click fetch, and add
chmod 700 ~/.ssh
chmod 600 ~/.ssh/authorized_keys
This is how I configure the pipeline SSH connection:
image: atlassian/default-image:latest
pipelines:
  default:
    - step:
        name: Deploy to staging
        deployment: staging
        script:
          - echo "Deploying to staging environment"
          - mkdir -p ~/.ssh
          - cat ./my_known_hosts >> ~/.ssh/known_hosts
          - (umask 077 ; echo $SSH_KEY | base64 --decode > ~/.ssh/id_rsa)
          - ssh $USER@$SERVER -p$PORT 'echo "connected to remote host as $USER"'
I'm trying all possible things but still can't connect.
Can anyone help me?
This happens when you SSH to the server for the first time. You can disable host key checking with the option StrictHostKeyChecking=no; below is the complete command for your reference.
ssh -o StrictHostKeyChecking=no $USER@$SERVER -p$PORT 'echo "connected to remote host as $USER"'
PS: disabling host checking is not a secure way to do this. Instead, you can add the server's key to your ~/.ssh/known_hosts: run ssh-keyscan host1, replacing host1 with the host you want to connect to.
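Put into the pipeline from the question, the safer variant might look like this (a sketch using the question's own variables; the ssh-keyscan step replaces the pre-baked my_known_hosts file):

```yaml
script:
  - mkdir -p ~/.ssh
  - ssh-keyscan -t rsa -p "$PORT" "$SERVER" >> ~/.ssh/known_hosts
  - (umask 077 ; echo $SSH_KEY | base64 --decode > ~/.ssh/id_rsa)
  - ssh $USER@$SERVER -p$PORT 'echo "connected to remote host as $USER"'
```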

How do I pass my username and password into a perl script from an ansible role?

I have a Perl script for creating SSL certificates on an IBM MQ queue manager. The script needs a username and password to work.
I have an Ansible role that calls one ready-made Perl script to create an MQ queue manager and another to create an SSL KDB. Like this:
- name: Create MQ Queue Manager
  shell: "./CreateQmgr.sh -m {{MQ_QMGR1}}"
  args:
    chdir: /opt/wmqinf/utilities

- name: SSL the new Qmgr
  shell: "./renewSSL.pl -S {{SSL_PEER1}} -U -m {{MQ_QMGR1}} -G {{GBGF}}"
  args:
    chdir: /opt/wmqinf/utilities
The playbook/role fails when it can't create the SSL KDB because no username is entered.
Is there a way I can pass the .pl script my username and password for it to work?
I'm sure there is a better way to do this, such as modifying your Perl script to accept extra command-line arguments. However, this should/might work (note the process substitutions must be inside the shell command):
shell: './renewSSL.pl -S {{SSL_PEER1}} -U -m {{MQ_QMGR1}} -G {{GBGF}} <(echo -n "str2") <(echo -n "str3")'
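An alternative sketch, assuming the Perl script can be changed to read credentials from its environment: Ansible's `environment:` keyword passes them without putting secrets on the command line, and `no_log` keeps them out of the job output. The variable names `mq_user`/`mq_pass` and `MQ_USER`/`MQ_PASS` are hypothetical.

```yaml
- name: SSL the new Qmgr
  shell: "./renewSSL.pl -S {{SSL_PEER1}} -U -m {{MQ_QMGR1}} -G {{GBGF}}"
  args:
    chdir: /opt/wmqinf/utilities
  environment:
    MQ_USER: "{{ mq_user }}"   # e.g. from Ansible Vault
    MQ_PASS: "{{ mq_pass }}"
  no_log: true                 # keep the credentials out of the task output
```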

How to deploy with .gitlab-ci.yml in runner used by docker?

I installed Docker and GitLab plus a runner using this tutorial: https://frenchco.de/article/Add-un-Runner-Gitlab-CE-Docker
The problem is that when I try to modify the .gitlab-ci.yml to deploy to my host machine, I cannot do it.
My .yml:
stages:
  - deploy

deploy_develop:
  stage: deploy
  before_script:
    - apk update && apk add bash && apk add openssh && apk add rsync
    - apk add --no-cache bash
  script:
    - mkdir -p ~/.ssh
    - ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa
    - cat ~/.ssh/id_rsa.pub
    - rsync -hrvz ~/ root@172.16.1.97:~/web_dev/www/test/
  environment:
    name: develop
And the problem is that with ssh or rsync I always get the same error message in my job:
$ rsync -hrvz ~/ root@172.16.1.97:~/web_dev/www/test/
Host key verification failed.
rsync: connection unexpectedly closed (0 bytes received so far) [sender]
rsync error: unexplained error (code 255) at io.c(226) [sender=3.1.3]
I tried to copy the SSH id_rsa and id_rsa.pub to the host; it's the same.
Could it be a problem because my runner is in a Docker container? It is strange because I can ping my host (172.16.1.97) from within the .yml's execution. Any ideas about my problem?
Looks like you did not add the public key to the authorized_keys of the deploy user on the host server?
For example, I use gitlab-ci to deploy my webapp: I added a gitlab user on my host machine and added the public key to its authorized_keys, and then I can connect to that server with ssh gitlab@IP -i PRIVATE_KEY.
My gitlab-ci.yml looks like this:
deploy-app:
  stage: deploy
  image: ubuntu
  before_script:
    - apt-get update -qq
    - 'which ssh-agent || ( apt-get install -qq openssh-client )'
    - eval $(ssh-agent -s)
    - ssh-add <(cat "$DEPLOY_SERVER_PRIVATE_KEY")
    - mkdir -p ~/.ssh
    - '[[ -f /.dockerenv ]] && echo -e "Host *\n\tStrictHostKeyChecking no\n\n" > ~/.ssh/config'
    - chmod 755 ./deploy.sh
  script:
    - ./deploy.sh
where I added the private key's content as a variable on my GitLab instance (see https://docs.gitlab.com/ee/ci/variables/).
The deploy.sh looks like this:
#!/bin/bash
set -eo pipefail
scp app/docker-compose.yml gitlab@"${DEPLOY_SERVER_IP}":~/apps/${NGINX_SERVER_NAME}/
ssh gitlab@$DEPLOY_SERVER_IP "apps/${NGINX_SERVER_NAME}/app.sh update" # this is just doing docker-compose pull && docker-compose up in the app's directory.
Maybe this helps? It's working fine for me, and scp/ssh give more intuitive error messages than what you got from rsync in this particular case.
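If you'd rather not disable StrictHostKeyChecking entirely, a hedged alternative for the before_script is to pin the host key instead (this assumes the runner can already reach the server at that point):

```yaml
before_script:
  - mkdir -p ~/.ssh
  - ssh-keyscan "$DEPLOY_SERVER_IP" >> ~/.ssh/known_hosts   # pin the key instead of ignoring it
```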

Run Python 2.7 by default in a Dotcloud custom service

I need to make Python 2.7 the default version of Python for running a Jenkins build server. I'm trying to use python_version to do this, but Python 2.6 remains the default version. I'm probably missing something really simple. Any suggestions?
dotcloud.yml
jenkins:
  type: custom
  buildscript: jenkins/builder
  ports:
    www: http
  config:
    python_version: v2.7
  processes:
    sshagent: ssh-agent /bin/bash
    jenkins: ~/run
db:
  type: postgresql
builder
#!/bin/bash
if [ -f ~/jenkins.war ]
then
    echo 'Found jenkins installation.'
else
    echo 'Installing jenkins.'
    wget -O ~/jenkins.war http://mirrors.jenkins-ci.org/war/latest/jenkins.war
fi
echo 'Installing dotCloud scaffolding.'
cp -a jenkins/. ~
echo 'Setting up SSH.'
mkdir -p ~/.ssh
cp jenkins_id ~/.ssh/id_rsa
chmod 0600 ~/.ssh/id_rsa
ssh-keygen -R bitbucket.org
ssh-keyscan -H bitbucket.org >> ~/.ssh/known_hosts
I'm still not sure why my build file didn't solve the problem, but I was able to work around it by using the --python=/usr/bin/python2.7 option for virtualenv in my Jenkins build script.
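For anyone hitting the same thing, the workaround amounts to pinning the interpreter explicitly instead of trusting the platform default. A minimal sketch (the virtualenv path is an assumption; the check below generalizes to whatever interpreter is present):

```shell
# Pin the build's interpreter explicitly rather than relying on the default:
#   virtualenv --python=/usr/bin/python2.7 ~/env && . ~/env/bin/activate
# The check below just demonstrates verifying which interpreter you got.
PY=$(command -v python3 || command -v python)
MAJOR=$("$PY" -c 'import sys; print(sys.version_info[0])')
echo "build interpreter: $PY (major version $MAJOR)"
```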