Foreman + Ansible + winrm


#1

Hi Folks,

I am currently trying to run an Ansible playbook over WinRM.
From the command line everything works, so WinRM seems to be running, but I cannot get Foreman to run this playbook over WinRM.

Foreman always tries to run the playbook over SSH. What am I missing? Is there some kind of tutorial or similar?

Thanks for your help.

Versions:
Foreman 1.21.0
Debian


#2

Hi,
there is a 'Connection type' setting in the Settings -> Ansible tab, which is set to ssh by default, so changing it might achieve what you need, but I am not sure if anyone has tried to run playbooks from Foreman using winrm before.


#3

Hello @bstorp and welcome to the _fore_um!

You'll have to set the ansible_connection parameter (it can be set in the global settings but also per host). From there on, you'll also have to provide the password as a parameter.

Next to this, you'll also have to decide on the ansible_winrm_transport. If you're using basic (which is, by the way, the only transport I can confirm works; I didn't test the others), you'll also have to configure remote_execution_ssh_password (I know, weird name, but it seems to do the trick) plus ansible_user.

Finally, I’ve also configured my ansible_become_method, ansible_become_password and ansible_become_user parameters.

I'd also like to link you to the Ansible page on this topic, as it has a very detailed explanation of the considerations and options:
https://docs.ansible.com/ansible/latest/user_guide/windows_winrm.html
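Putting the parameters above together, the host (or host group) parameters would look roughly like this. All values are placeholders, and as noted above only the basic transport is confirmed to work:

```yaml
# Example host/host group parameters for Ansible over WinRM
# (placeholder values; adjust user and passwords for your environment)
ansible_connection: winrm
ansible_user: Administrator
remote_execution_ssh_password: changeme   # despite the name, used as the WinRM password
ansible_winrm_transport: basic
# optional privilege escalation:
ansible_become_method: runas
ansible_become_user: Administrator
ansible_become_password: changeme
```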

Have fun!
Arend


#4

Hi Ondrej,

thanks for your answer. I have already tried changing this, but it has no effect.
I also added some host group parameters:

ansible_connection: winrm
ansible_pass
ansible_port
ansible_user
ansible_winrm_scheme: https
ansible_winrm_server_cert_validation: ignore
ansible_winrm_transport: ntlm

But these parameters seem to have no effect at all.

If I use these settings in a Python script, it all works fine …

Any suggestions?


#5

Hi Arend,

thanks a lot, but as described in my latest post, I have already tried to configure this as you described.
This behavior looks quite strange to me.

Any suggestions are welcome, and if I can provide any further information to debug this, please let me know.

I really appreciate your help.

Thanks


#6

The mentioned parameters are all configured with valid values; for security reasons I cannot post them here …


#7

Hi!
What is the error message you’re getting?

I had to define the following parameters to get this working:
ansible_become = false
ansible_ssh_pass
ansible_connection = winrm
ansible_user = Administrator
ansible_winrm_port

To debug this outside Foreman, you can use the generated Ansible playbooks in /tmp/foreman-playbook-*
together with the inventories in /tmp/foreman-inventories/. Please note that the inventories must be the result of a dynamic inventory call, i.e. you need to write a simple script that writes the content of the inventory file to stdout.
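A minimal sketch of the dynamic-inventory trick described above: Ansible treats any executable inventory source as "dynamic" and expects it to print the inventory JSON to stdout. The inventory contents and paths below are made-up examples; in practice the wrapper would cat one of the saved files from /tmp/foreman-inventories/.

```shell
# Fake a saved inventory file (example content; a real one comes from
# /tmp/foreman-inventories/).
inv=$(mktemp)
printf '{"all": {"hosts": ["testnode1.example.com"]}}' > "$inv"

# Build a tiny executable wrapper that dumps that file to stdout, which is
# all a dynamic inventory has to do.
wrapper=$(mktemp)
cat > "$wrapper" <<EOF
#!/bin/sh
cat "$inv"
EOF
chmod +x "$wrapper"

# The wrapper now behaves like a dynamic inventory and could be passed to
# ansible-playbook, e.g.: ansible-playbook -i "$wrapper" /tmp/foreman-playbook-<id>.yml
"$wrapper" --list
```

If the playbook then runs fine from the command line, the problem is on the Foreman side rather than in the Ansible/WinRM configuration itself.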
Mark


#8

Hi,

I get: Error initializing command: Net::SSH::ConnectionTimeout - Net::SSH::ConnectionTimeout
Exit status: EXCEPTION

If I run ansible -c winrm -u Administrator -e "ansible_winrm_transport=ntlm ansible_winrm_server_cert_validation=ignore" -k -m setup
it works.

I just checked the /tmp folder. There are no playbooks …
Maybe I defined the parameters the wrong way? I configured them on the Parameters tab of the host group my server is part of …


#9

Hi @hlawatschek,

on which version of Foreman did you get your Ansible + WinRM setup running?
1.21.x?
I just did a "clean" reinstall in my lab with 1.21.0 and there is no way to get this running; it always uses SSH, no matter what options I configure via host group parameters.

Or is this my mistake, and I need to configure Ansible to use Foreman? I only activated the foreman-ansible plugin and the foreman-ansible smart proxy plugin.

Packages on Debian Stretch are:
ii ansible 2.7.8-1ppa~trusty all Ansible IT Automation
ii ruby-foreman-ansible 2.3.1-1 all Foreman Ansible plugin
ii ruby-smart-proxy-ansible 2.1.0-1 all Ansible support for Foreman smart proxy


#10

Hello,
I have the same problem. I have a fresh install of Foreman 1.21 and my host parameters are all ignored. Why should I configure an SSH password if I want to use WinRM? I have an Ubuntu server with all components installed, and all services run correctly. But from what I can see, the Dynflow port 8008 is not open.
Additionally, I don't have a configuration file for Dynflow core under /etc. Is that normal?
My Ansible roles are correctly imported into Foreman. I have a smart proxy configured with all necessary features. Do I need remote execution configured to trigger Ansible roles? Nowhere have I found good documentation on how Foreman communicates with the Foreman proxy to trigger Ansible roles. To this day I don't know how Foreman and Dynflow work together. For years we have used Foreman with Puppet and the documentation was very good, but for Ansible it is very poor.


#11

I configured:
ansible_user = ansible
ansible_pass = xxxx
ansible_become = no/false
ansible_connection = winrm

And Foreman tries to use SSH.
This configuration works perfectly on my Ansible server on the command line.

How does the Foreman proxy use Ansible to execute roles?


#12

So we are mixing multiple things in this thread. The first problem is that you can see Net::SSH::ConnectionTimeout. That means Ansible is not used at all; it uses plain REX SSH. Can you look at your smart proxy features? Does the list include Ansible? Did you install the smart_proxy_ansible plugin? Was foreman-installer involved?

The second problem we see is that port 8008 is not open. That is expected on Debian systems, where the smart proxy Dynflow core runs as part of the smart proxy process.


#13

Hi Marek,

yes, the Ansible feature is listed on the smart proxy.
Yes, the smart proxy plugin is installed.
Yes, setup was done by foreman-installer.

Packages are:
ii ansible 2.7.8-1ppa~trusty all Ansible IT Automation
ii ruby-foreman-ansible 2.3.1-1 all Foreman Ansible plugin
ii ruby-smart-proxy-ansible 2.1.0-1 all Ansible support for Foreman smart proxy

Do you need any logs / further information?


#14

Just to be sure, I upgraded to the newer smart proxy plugin 2.1.1:

ii ansible 2.7.8-1ppa~trusty all Ansible IT Automation
ii ruby-foreman-ansible 2.3.1-1 all Foreman Ansible plugin
ii ruby-smart-proxy-ansible 2.1.1-1 all Ansible support for Foreman smart proxy

But that did not change the behavior.


#15

Hi @bstorp
I’m using CentOS7 and the following packages:

  • foreman-1.18.3-41006.orcharhino.noarch
  • tfm-rubygem-foreman_ansible-2.2.9-1.fm1_18.el7.noarch
  • tfm-rubygem-foreman_ansible_core-2.1.1-1.fm1_18.el7.noarch

#16

thanks - I will try these versions in my lab and see if they work.


#17

@bstorp could you please upload a screenshot of the list of smart proxies? Feel free to hide their names if they contain your domain. Are these proxies assigned to the same organization/location that you use for your host? It might also help to see /var/log/foreman-proxy/proxy.log from when you ran the job. Ideally, set the log level to DEBUG in /etc/foreman-proxy/settings.yml and restart the proxy before you capture the log.
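For reference, raising the proxy log level is a one-line change in /etc/foreman-proxy/settings.yml (excerpt only; all other settings omitted):

```yaml
# /etc/foreman-proxy/settings.yml (excerpt)
:log_level: DEBUG
```

followed by a proxy restart, e.g. `systemctl restart foreman-proxy`.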


#18

Hi Marek,

as it is a lab, I only have a one-node installation, i.e. all smart proxies run on the same server.

Yes, hosts and Foreman are in the same location and organization.
proxy.log:

2019-03-06T10:18:34 [D] Rack::Handler::WEBrick is mounted on /.
2019-03-06T10:18:34 [I] WEBrick::HTTPServer#start: pid=1685 port=8443
2019-03-06T10:18:34 [I] Smart proxy has launched on 1 socket(s), waiting for requests
2019-03-06T10:18:36 [D] Initializing puppet class cache for ‘development’ environment
2019-03-06T10:18:36 [D] Initializing puppet class cache for ‘common’ environment
2019-03-06T10:18:36 [D] Initializing puppet class cache for ‘production’ environment
2019-03-06T10:18:36 [I] Finished puppet class cache initialization
2019-03-06T10:19:21 [D] accept: :51306
2019-03-06T10:19:21 [D] Rack::Handler::WEBrick is invoked.
W, [2019-03-06T10:19:21.274127 #1685] WARN – : Could not open DB for dynflow at ‘’, will keep data in memory. Restart will drop all dynflow data.
2019-03-06T10:19:21 [I] Execution plan cleaner removing 0 execution plans.
2019-03-06T10:19:21 113bf36f [I] Started GET /tasks/count state=running
2019-03-06T10:19:21 113bf36f [I] Finished GET /tasks/count with 200 (6.62 ms)
2019-03-06T10:19:21 [D] close: :51306
2019-03-06T10:19:22 [D] accept: :51308
2019-03-06T10:19:22 [D] Rack::Handler::WEBrick is invoked.
2019-03-06T10:19:22 2c4d8e9b [I] Started POST /tasks/
2019-03-06T10:19:22 2c4d8e9b [D] ExecutionPlan 92fa7867-e6d1-4e00-b158-0b1011ca0e82 pending >> planning
2019-03-06T10:19:22 2c4d8e9b [D] Step 92fa7867-e6d1-4e00-b158-0b1011ca0e82: 1 pending >> running in phase Plan ForemanRemoteExecutionCore::Actions::RunScript
2019-03-06T10:19:22 2c4d8e9b [D] Step 92fa7867-e6d1-4e00-b158-0b1011ca0e82: 4 pending >> running in phase Plan SmartProxyDynflowCore::Callback::Action
2019-03-06T10:19:22 2c4d8e9b [D] Step 92fa7867-e6d1-4e00-b158-0b1011ca0e82: 4 running >> success in phase Plan SmartProxyDynflowCore::Callback::Action
2019-03-06T10:19:22 2c4d8e9b [D] Step 92fa7867-e6d1-4e00-b158-0b1011ca0e82: 1 running >> success in phase Plan ForemanRemoteExecutionCore::Actions::RunScript
2019-03-06T10:19:22 2c4d8e9b [D] ExecutionPlan 92fa7867-e6d1-4e00-b158-0b1011ca0e82 planning >> planned
2019-03-06T10:19:22 2c4d8e9b [I] Finished POST /tasks/ with 200 (24.64 ms)
2019-03-06T10:19:22 [D] ExecutionPlan 92fa7867-e6d1-4e00-b158-0b1011ca0e82 planned >> running
2019-03-06T10:19:22 [D] Step 92fa7867-e6d1-4e00-b158-0b1011ca0e82: 2 pending >> running in phase Run ForemanRemoteExecutionCore::Actions::RunScript
2019-03-06T10:19:22 [D] Step 92fa7867-e6d1-4e00-b158-0b1011ca0e82: 2 running >> suspended in phase Run ForemanRemoteExecutionCore::Actions::RunScript
2019-03-06T10:19:22 [D] start runner df8815b4-7237-4351-93d6-250667203407
2019-03-06T10:19:22 [D] opening session to root@testnode1.DOMAINNAME
2019-03-06T10:19:23 [D] close: :51308
2019-03-06T10:19:36 [D] Executor heartbeat
2019-03-06T10:19:51 [D] Executor heartbeat
2019-03-06T10:20:06 [D] Executor heartbeat
2019-03-06T10:20:21 [D] Executor heartbeat
2019-03-06T10:20:40 [D] Executor heartbeat
2019-03-06T10:20:55 [D] Executor heartbeat
2019-03-06T10:21:10 [D] Executor heartbeat
2019-03-06T10:21:25 [D] Executor heartbeat
2019-03-06T10:21:36 [E] error while initalizing command Net::SSH::ConnectionTimeout Net::SSH::ConnectionTimeout:
/usr/lib/ruby/vendor_ruby/net/ssh/transport/session.rb:90:in `rescue in initialize'
/usr/lib/ruby/vendor_ruby/net/ssh/transport/session.rb:57:in `initialize'
/usr/lib/ruby/vendor_ruby/net/ssh.rb:232:in `new'
/usr/lib/ruby/vendor_ruby/net/ssh.rb:232:in `start'
/usr/lib/ruby/vendor_ruby/foreman_remote_execution_core/script_runner.rb:262:in `session'
/usr/lib/ruby/vendor_ruby/foreman_remote_execution_core/script_runner.rb:333:in `run_sync'
/usr/lib/ruby/vendor_ruby/foreman_remote_execution_core/script_runner.rb:420:in `ensure_remote_directory'
/usr/lib/ruby/vendor_ruby/foreman_remote_execution_core/script_runner.rb:397:in `upload_data'
/usr/lib/ruby/vendor_ruby/foreman_remote_execution_core/script_runner.rb:393:in `cp_script_to_remote'
/usr/lib/ruby/vendor_ruby/foreman_remote_execution_core/script_runner.rb:167:in `prepare_start'
/usr/lib/ruby/vendor_ruby/foreman_remote_execution_core/script_runner.rb:153:in `start'
/usr/lib/ruby/vendor_ruby/foreman_tasks_core/runner/dispatcher.rb:32:in `start_runner'
/usr/lib/ruby/vendor_ruby/dynflow/actor.rb:6:in `on_message'
/usr/lib/ruby/vendor_ruby/concurrent/actor/context.rb:46:in `on_envelope'
/usr/lib/ruby/vendor_ruby/foreman_tasks_core/runner/dispatcher.rb:24:in `on_envelope'
/usr/lib/ruby/vendor_ruby/concurrent/actor/behaviour/executes_context.rb:7:in `on_envelope'
/usr/lib/ruby/vendor_ruby/concurrent/actor/behaviour/abstract.rb:25:in `pass'
/usr/lib/ruby/vendor_ruby/dynflow/actor.rb:26:in `on_envelope'
/usr/lib/ruby/vendor_ruby/concurrent/actor/behaviour/abstract.rb:25:in `pass'
/usr/lib/ruby/vendor_ruby/concurrent/actor/behaviour/awaits.rb:15:in `on_envelope'
/usr/lib/ruby/vendor_ruby/concurrent/actor/behaviour/abstract.rb:25:in `pass'
/usr/lib/ruby/vendor_ruby/concurrent/actor/behaviour/sets_results.rb:14:in `on_envelope'
/usr/lib/ruby/vendor_ruby/concurrent/actor/behaviour/abstract.rb:25:in `pass'
/usr/lib/ruby/vendor_ruby/concurrent/actor/behaviour/buffer.rb:38:in `process_envelope'
/usr/lib/ruby/vendor_ruby/concurrent/actor/behaviour/buffer.rb:31:in `process_envelopes?'
/usr/lib/ruby/vendor_ruby/concurrent/actor/behaviour/buffer.rb:20:in `on_envelope'
/usr/lib/ruby/vendor_ruby/concurrent/actor/behaviour/abstract.rb:25:in `pass'
/usr/lib/ruby/vendor_ruby/concurrent/actor/behaviour/termination.rb:55:in `on_envelope'
/usr/lib/ruby/vendor_ruby/concurrent/actor/behaviour/abstract.rb:25:in `pass'
/usr/lib/ruby/vendor_ruby/concurrent/actor/behaviour/removes_child.rb:10:in `on_envelope'
/usr/lib/ruby/vendor_ruby/concurrent/actor/behaviour/abstract.rb:25:in `pass'
/usr/lib/ruby/vendor_ruby/concurrent/actor/behaviour/sets_results.rb:14:in `on_envelope'
/usr/lib/ruby/vendor_ruby/concurrent/actor/core.rb:161:in `process_envelope'
/usr/lib/ruby/vendor_ruby/concurrent/actor/core.rb:95:in `block in on_envelope'
/usr/lib/ruby/vendor_ruby/concurrent/actor/core.rb:118:in `block (2 levels) in schedule_execution'
/usr/lib/ruby/vendor_ruby/concurrent/synchronization/mri_lockable_object.rb:38:in `block in synchronize'
/usr/lib/ruby/vendor_ruby/concurrent/synchronization/mri_lockable_object.rb:38:in `synchronize'
/usr/lib/ruby/vendor_ruby/concurrent/synchronization/mri_lockable_object.rb:38:in `synchronize'
/usr/lib/ruby/vendor_ruby/concurrent/actor/core.rb:115:in `block in schedule_execution'
/usr/lib/ruby/vendor_ruby/concurrent/executor/serialized_execution.rb:18:in `call'
/usr/lib/ruby/vendor_ruby/concurrent/executor/serialized_execution.rb:96:in `work'
/usr/lib/ruby/vendor_ruby/concurrent/executor/serialized_execution.rb:77:in `block in call_job'
/usr/lib/ruby/vendor_ruby/concurrent/executor/ruby_thread_pool_executor.rb:333:in `run_task'
/usr/lib/ruby/vendor_ruby/concurrent/executor/ruby_thread_pool_executor.rb:322:in `block (3 levels) in create_worker'
/usr/lib/ruby/vendor_ruby/concurrent/executor/ruby_thread_pool_executor.rb:305:in `loop'
/usr/lib/ruby/vendor_ruby/concurrent/executor/ruby_thread_pool_executor.rb:305:in `block (2 levels) in create_worker'
/usr/lib/ruby/vendor_ruby/concurrent/executor/ruby_thread_pool_executor.rb:304:in `catch'
/usr/lib/ruby/vendor_ruby/concurrent/executor/ruby_thread_pool_executor.rb:304:in `block in create_worker'
/usr/lib/ruby/vendor_ruby/logging/diagnostic_context.rb:448:in `block in create_with_logging_context'
2019-03-06T10:21:36 [E] Error initializing command - Net::SSH::ConnectionTimeout Net::SSH::ConnectionTimeout:
/usr/lib/ruby/vendor_ruby/net/ssh/transport/session.rb:90:in `rescue in initialize'
2019-03-06T10:21:36 [D] refresh runner df8815b4-7237-4351-93d6-250667203407
2019-03-06T10:21:36 [D] refreshing runner
2019-03-06T10:21:36 [D] finish runner df8815b4-7237-4351-93d6-250667203407
2019-03-06T10:21:36 [D] closing session for command [df8815b4-7237-4351-93d6-250667203407],0 actors left
2019-03-06T10:21:36 [D] terminate df8815b4-7237-4351-93d6-250667203407
2019-03-06T10:21:36 [D] Step 92fa7867-e6d1-4e00-b158-0b1011ca0e82: 2 got event #&lt;ForemanTasksCore::Runner::Update:0x0055b1133004e8&gt;
2019-03-06T10:21:36 [D] Step 92fa7867-e6d1-4e00-b158-0b1011ca0e82: 2 suspended >> running in phase Run ForemanRemoteExecutionCore::Actions::RunScript
2019-03-06T10:21:36 [D] Step 92fa7867-e6d1-4e00-b158-0b1011ca0e82: 2 running >> success in phase Run ForemanRemoteExecutionCore::Actions::RunScript
2019-03-06T10:21:36 [D] Step 92fa7867-e6d1-4e00-b158-0b1011ca0e82: 5 pending >> running in phase Run SmartProxyDynflowCore::Callback::Action
2019-03-06T10:21:39 [D] Step 92fa7867-e6d1-4e00-b158-0b1011ca0e82: 5 running >> success in phase Run SmartProxyDynflowCore::Callback::Action
2019-03-06T10:21:39 [D] Step 92fa7867-e6d1-4e00-b158-0b1011ca0e82: 3 pending >> running in phase Finalize ForemanRemoteExecutionCore::Actions::RunScript
2019-03-06T10:21:39 [E] Script execution failed
2019-03-06T10:21:39 [D] Step 92fa7867-e6d1-4e00-b158-0b1011ca0e82: 3 running >> error in phase Finalize ForemanRemoteExecutionCore::Actions::RunScript
2019-03-06T10:21:39 [D] ExecutionPlan 92fa7867-e6d1-4e00-b158-0b1011ca0e82 running >> paused
2019-03-06T10:21:39 [D] ExecutionPlan 92fa7867-e6d1-4e00-b158-0b1011ca0e82 paused >> stopped
2019-03-06T10:21:40 [D] Executor heartbeat
2019-03-06T10:21:55 [D] Executor heartbeat
2019-03-06T10:22:10 [D] Executor heartbeat
2019-03-06T10:22:25 [D] Executor heartbeat
2019-03-06T10:22:40 [D] Executor heartbeat
2019-03-06T10:22:55 [D] Executor heartbeat
2019-03-06T10:23:10 [D] Executor heartbeat
2019-03-06T10:23:25 [D] Executor heartbeat
2019-03-06T10:23:40 [D] Executor heartbeat


#19

Hey Folks,

just to keep you informed: I installed Foreman 1.20.2 with Ansible etc. on a clean CentOS 7 today.
WinRM works without any problems. As a next step I will upgrade to 1.21.0 on CentOS 7 and keep you informed.

Regards,
Bastian


#20

An upgrade to 1.21.0 on CentOS 7 also works with WinRM. Maybe the problem with WinRM is specific to Debian?