Ansible Role delegate_to Failed to connect to the host via ssh: Permission denied

Problem:

Hey guys, I'm trying to get the following Ansible role from elnappo to work with Foreman.

Unfortunately, one of the tasks always throws a "permission denied" error when I execute the following playbook via a job template.

When I execute the playbook via the command line (ansible-playbook playbookname.yml), everything works as expected.

Does anyone have an idea why it's not working via Foreman?

Link to ansible role: https://github.com/elnappo/ansible-role-check-mk-agent

Error message in the Foreman UI:

TASK [elnappo.check_mk_agent : Scan SSH host pubkey] ***************************
fatal: [HOSTNAME_HERE]: UNREACHABLE! => {"changed": false, "msg": "Failed to connect to the host via ssh: Permission denied (publickey,gssapi-keyex,gssapi-with-mic,password).", "unreachable": true}

Playbook:

- hosts: all
  gather_facts: true
  vars:
    check_mk_agent_over_ssh: True
    check_mk_agent_with_sudo: True
    check_mk_agent_add_host_pubkey: True
    check_mk_agent_monitoring_host: HOST_HERE
    check_mk_agent_monitoring_user: USER_HERE
    check_mk_agent_pubkey_file: URL_HERE
    check_mk_agent_add_to_wato: True
    check_mk_agent_monitoring_host_discovery_mode: new
    check_mk_agent_monitoring_host_url: URL_HERE
    check_mk_agent_monitoring_host_wato_username: automation
    check_mk_agent_monitoring_host_wato_secret: PASSWORD_HERE
    check_mk_agent_monitoring_host_folder: FOLDER_HERE
    check_mk_agent_setup_firewall: False
    check_mk_agent_manual_install: False
    check_mk_agent_local_checks:
      count_users:
        src: files/localchecks/count_users
        cache_time: 600
      count_zombie_procs:
        src: files/localchecks/count_zombie_procs
        cache_time: 600
    check_mk_agent_plugins:
      lvm:
        src: files/plugins/lvm
      mk_inventory.linux:
        src: files/plugins/mk_inventory.linux
      mk_sshd_config:
        src: files/plugins/mk_sshd_config
      netstat.linux:
        src: files/plugins/netstat.linux
      mk_logwatch:
        src: files/plugins/mk_logwatch
  roles:
    - role: elnappo.check_mk_agent

Expected outcome:

Foreman and Proxy versions:

foreman-1.18.0.40-1.el7sat.noarch

Foreman and Proxy plugin versions:

ansible-2.6.13-1.el7ae.noarch
tfm-rubygem-foreman_ansible-2.2.9-8.el7sat.noarch
ansiblerole-insights-client-1.5-1.el7sat.noarch
tfm-rubygem-hammer_cli_foreman_ansible-0.1.1-1.el7sat.noarch
tfm-rubygem-foreman_ansible_core-2.1.1-1.el7sat.noarch

I reckon there might be some problem with the "delegate_to" parameter when the playbook gets executed via Foreman. The variables have the correct values assigned, though.
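
A minimal way to isolate this might be a throwaway task that delegates to the same host; if delegation itself is the culprit, it should fail the same way when run from Foreman (a sketch, reusing the monitoring-host variable from the playbook above):

- name: Diagnostic task, delegation only (hypothetical)
  command: hostname
  delegate_to: "{{ check_mk_agent_monitoring_host }}"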

Don't know why I can't edit my post. Anyway, I just wanted to add that the SSH public keys are deployed accordingly.

Is the SSH key that you use for the delegated connection readable by the foreman-proxy user? I guess it is, based on what you mentioned in your previous post, but I wanted to check. I haven't tried using delegation in my scenarios yet, so I can't confirm whether it works or not.

I have removed the "become_user" parameter in the Ansible role, but this didn't change anything. :confused:

The ssh-key is owned by the foreman-proxy user.

-rw-------. 1 foreman-proxy foreman-proxy 1679 Mar 23 2017 id_rsa_foreman_proxy
-rw-r--r--. 1 foreman-proxy foreman-proxy 426 Mar 23 2017 id_rsa_foreman_proxy.pub

Remote execution is working as expected.
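
For reference, one way to verify that the foreman-proxy user can actually open the delegated connection is to try it by hand (a sketch; the key path is the usual foreman-proxy default, and USER_HERE/HOST_HERE are the placeholders from the playbook above):

# Run on the Foreman/proxy machine; should log in without a password prompt
sudo -u foreman-proxy ssh -i /usr/share/foreman-proxy/.ssh/id_rsa_foreman_proxy USER_HERE@HOST_HERE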

The permission denied message appears as soon as the execution hits the following block.

- block:
    - name: Scan SSH host pubkey
      shell: ssh-keyscan -v -T 10 {{ inventory_hostname }}
      changed_when: False
      register: check_mk_agent_host_ssh_pubkey

    - name: Add known_host entry to monitoring instance
      known_hosts:
        name: "{{ inventory_hostname }}"
        key: "{{ item }}"
        state: present
      with_items: "{{ check_mk_agent_host_ssh_pubkey.stdout_lines }}"
  when: check_mk_agent_over_ssh and check_mk_agent_add_host_pubkey
  delegate_to: "{{ check_mk_agent_monitoring_host }}"
  become_user: "{{ check_mk_agent_monitoring_user }}"
  become: yes

Unfortunately, I can't get more verbose output. The "Default verbosity level" setting seems to have no effect.

After removing the "delegate_to" parameter, the task runs successfully.
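
Side note: since the block is guarded by the when: condition shown above, it can also be skipped without editing the role, by turning the corresponding variable off (a sketch; this of course means the known_hosts entry on the monitoring host has to be maintained some other way):

vars:
  check_mk_agent_add_host_pubkey: False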

Hi,
I'm facing a similar issue. Were you able to use delegate_to somehow? I have a role which installs Bacula, and it requires connecting to the main backup server; without delegate_to I am not able to do that.

Hey pdzionek,

Unfortunately not - I just removed the "delegate_to" line to get it working.

regards

brotaxt

@pdzionek, in case you are still facing the issue, here is the solution.

When you delegate a task in Foreman, the identity file and remote_user are unset for the delegated connection, which causes it to fail.

So you must set them explicitly in your play:

- name: Delegate a task to {{ host_ip }}
  copy:
    content: "Test content"
    dest: /tmp/test.txt
  delegate_to: "{{ host_ip }}"
  vars:
    ansible_ssh_user: myuser
    ansible_ssh_private_key_file: "/usr/share/foreman-proxy/.ssh/id_rsa_foreman_proxy"
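
Since delegation uses the connection variables of the delegated host, an alternative is to define them on that host itself instead of on every task (a sketch; the host_vars path and HOST_HERE are assumptions standing in for the monitoring host's inventory name):

# host_vars/HOST_HERE.yml (hypothetical file)
ansible_ssh_user: myuser
ansible_ssh_private_key_file: "/usr/share/foreman-proxy/.ssh/id_rsa_foreman_proxy"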