Puppet - 3 questions

Problem:
Question #1
When I execute Schedule Remote Job with multiple hosts, it’s really hard to see the results by clicking each host. Is there any location on the (Linux) server where the results are stored? (We installed Foreman on a Linux server; I’ve searched /opt/theforeman/tfm/root/usr/share/gems/gems/foreman_remote_execution-1.4.5/app/views/template_invocations
but did not find the execution results there. Currently I can only view them, which is time consuming, through the URL https:///job_invocations/309.)
Question #2
When I execute against multiple hosts, it runs sequentially, so it takes longer to complete. Is there a way to run Schedule Remote Job in parallel (run the command on multiple hosts simultaneously)?
Question #3
Where can I find the Schedule Remote Job history (the last 2-3 months of jobs)?
Expected outcome:

Foreman and Proxy versions:
foreman_remote_execution-1.4.5

Foreman and Proxy plugin versions:

Other relevant data:
Centos_7_CentOS-7-Base CentOS-7-Base 9,591
Centos_7_Centos-7-Updates Centos-7-Updates 740
centos7-base centos7-base 9,007
epel7 epel7 11,993
katello-client-el7 Katello Client 3.4 18
mysql56-community mysql56-community 361

Linux 3.10.0-327.el7.x86_64 #1 SMP Thu Nov 19 22:10:57 UTC 2015 x86_64 x86_64 x86_64 GNU/Linux

Hello,

in fact, these questions are unrelated to Puppet.

Answer 1)
Go to the job invocation detail page; normally you are redirected there after
submitting the form. In the table of hosts, you can see the overall status for
each one. Clicking a host’s name takes you to the job output.

Answer 2)
It should always run in parallel unless you set the concurrency level job
parameter to 1.

Answer 3)
Monitor -> Jobs

Hope that helps

Thanks Marek for addressing these, and sorry for posting these questions under Puppet.

Question 1)
Yes, I was already clicking each host’s name, but it takes a lot of time to view the output when I execute on 50 hosts (I need to click 50 times to view all the output). So I’m looking at two possibilities: is the output stored somewhere on the Linux server where I can view it easily, or can the URL (https://server123/job_invocations/317) display the output for all hosts (without clicking each host)?

Question 2)
I’ve changed the value from “N” to 3 for Concurrency Level, however it still runs sequentially. In fact, I’ve run it multiple times after setting different values, like 3 (passed 3 hosts) and then 10 (passed 10 hosts).

Question 3)
Yes, I can view all previous jobs. Thanks a lot!

Thanks,
Naga

Hello,

good, number 3 is resolved :slight_smile: As for 1), the output is stored in the
database, so in the worst case you could fetch it from there. But we have a
REST API which should make it possible to fetch the outputs for the hosts of a
given job. There’s also a nice CLI built on top of it called Hammer. With a
little scripting, it should work.

As for 2), how did you install Foreman? This works in the production setups I
work with. If Foreman was installed using the installer (from packages), I
think it should work like this out of the box.

Thank you! We can mark the 2nd question as resolved too :slight_smile: since, as you indicated, it could be an installation issue (our systems team installed it; I’ll have to check with them).
On question 1, could you please provide a query to fetch it? Also, please share the steps or documentation for how to use the REST API to fetch the outputs for the hosts of a given job.

Try looking at your Foreman instance, which contains the API documentation. This should help:

https://$fqdn/apidoc/v2/job_invocations.html

where $fqdn is your Foreman host’s FQDN. Most likely you’ll be interested in

GET /api/job_invocations/:id 
GET /api/job_invocations/:id/hosts/:host_id
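
For example, these two endpoints can be called from a short Python script. This is only a minimal sketch using the standard library: the FQDN and credentials shown are placeholders, and a Foreman instance with a self-signed certificate would additionally need SSL context handling.

```python
import base64
import json
import urllib.request


def output_path(job_id, host_id=None):
    """Build the API path for a job invocation, or for one host's output."""
    path = f"/api/job_invocations/{job_id}"
    return path if host_id is None else f"{path}/hosts/{host_id}"


def api_get(base, path, user, password):
    """GET a Foreman API path with HTTP basic auth and return parsed JSON."""
    req = urllib.request.Request(base + path)
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req.add_header("Authorization", "Basic " + token)
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


# Example usage (placeholder host and credentials):
# job = api_get("https://foreman.example.com", output_path(330), "admin", "secret")
# out = api_get("https://foreman.example.com", output_path(330, 67), "admin", "secret")
```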

Thanks for the documentation link. After going through the documentation, it still looks like I did not get the results part (the output). All I can see is host information, input information, and a few other details, but what I’m actually expecting is the results (meaning the output of the given command) in one location (one page or file).
Sample output (what I got):
{
  "id": 330,
  "description": "Run ls -ltr /var/spool/cron/root",
  "job_category": "Commands",
  "targeting_id": 330,
  "status": 0,
  "start_at": "2018-10-10 10:47:19 -0700",
  "status_label": "succeeded",
  "dynflow_task": {"id": "c3799723-cf5e-4446-b0b2-70a32147f226", "state": "stopped"},
  "succeeded": 5,
  "failed": 0,
  "pending": 0,
  "total": 5,
  "targeting": {
    "bookmark_id": null,
    "search_query": "name ^ (host1, host2, host3, host4, host5)",
    "targeting_type": "static_query",
    "user_id": 24,
    "hosts": [
      {"name": "host2", "id": 67},
      {"name": "host4", "id": 66},
      {"name": "host1", "id": 29},
      {"name": "host5", "id": 65},
      {"name": "host3", "id": 68}
    ]
  },
  "task": {"id": "c3799723-cf5e-4446-b0b2-70a32147f226", "state": "stopped"},
  "template_invocations": [
    {"template_id": 103, "template_name": "Run Command - SSH Default", "template_invocation_input_values": [{"template_input_name": "command", "template_input_id": 7, "value": "ls -ltr /var/spool/cron/root"}]},
    {"template_id": 103, "template_name": "Run Command - SSH Default", "template_invocation_input_values": [{"template_input_name": "command", "template_input_id": 7, "value": "ls -ltr /var/spool/cron/root"}]},
    {"template_id": 103, "template_name": "Run Command - SSH Default", "template_invocation_input_values": [{"template_input_name": "command", "template_input_id": 7, "value": "ls -ltr /var/spool/cron/root"}]},
    {"template_id": 103, "template_name": "Run Command - SSH Default", "template_invocation_input_values": [{"template_input_name": "command", "template_input_id": 7, "value": "ls -ltr /var/spool/cron/root"}]},
    {"template_id": 103, "template_name": "Run Command - SSH Default", "template_invocation_input_values": [{"template_input_name": "command", "template_input_id": 7, "value": "ls -ltr /var/spool/cron/root"}]}
  ]
}

I’m still waiting for a reply; any help is much appreciated!

The example you’ve posted is the first API call, right? That gives you the host ids for a particular job. Then use these ids in the second API call I suggested; it should give you the output for each host. This can be easily scripted, e.g. in Ruby/Python/shell, to download all outputs for a job.
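
The two-step flow described here could be sketched in Python roughly as follows. This is an illustrative sketch, not a definitive implementation: it reuses a basic-auth helper, pulls the host ids from the "targeting" -> "hosts" list seen in the sample JSON above, and returns the per-host responses as-is, since the exact field layout of the per-host endpoint may vary between plugin versions.

```python
import base64
import json
import urllib.request


def api_get(base, path, user, password):
    """GET a Foreman API path with HTTP basic auth and return parsed JSON."""
    req = urllib.request.Request(base + path)
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req.add_header("Authorization", "Basic " + token)
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


def extract_host_ids(job):
    """Pull (name, id) pairs from the first call's JSON ("targeting" -> "hosts")."""
    return [(h["name"], h["id"]) for h in job["targeting"]["hosts"]]


def collect_outputs(base, job_id, user, password):
    """Fetch the job invocation, then the per-host detail for every host."""
    job = api_get(base, f"/api/job_invocations/{job_id}", user, password)
    return {name: api_get(base, f"/api/job_invocations/{job_id}/hosts/{hid}",
                          user, password)
            for name, hid in extract_host_ids(job)}


# Example usage (placeholder host and credentials):
# for host, detail in collect_outputs("https://foreman.example.com", 330,
#                                     "admin", "secret").items():
#     print(host, json.dumps(detail, indent=2))
```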

Thanks a lot, Marek, much appreciated; it works. You can close this thread.