Katello Upgrade from 3.14.1 to 3.15 fails

Problem:

I followed the upgrade documentation up to the “foreman-installer --scenario katello --upgrade” step, which fails with:

Execution of '/usr/sbin/foreman-rake -- config -k 'remote_execution_cockpit_url' -v '/webcon/=%{host}'' returned 1: rake aborted!
PG::ConnectionBad: could not connect to server: No such file or directory
Is the server running locally and accepting connections on Unix domain socket "/var/run/postgresql/.s.PGSQL.5432"?

Expected outcome:
The installer run completes and the system is upgraded to Katello 3.15.

Foreman and Proxy versions:
Foreman 1.24.2
Katello 3.14.1

Foreman and Proxy plugin versions:
foreman-tasks 0.17.5
foreman_ansible 4.0.4
foreman_openscap 2.0.2
foreman_remote_execution 2.0.7
foreman_snapshot_management 1.6.0
foreman_templates 7.0.6
foreman_virt_who_configure 0.4.5
katello 3.14.1

Distribution and version:
CentOS 7.7.1908

Other relevant data:

[ERROR 2020-05-26T10:26:36 main] Errors encountered during run:
[ERROR 2020-05-26T10:26:36 main]  Execution of '/usr/sbin/foreman-rake -- config -k 'remote_execution_cockpit_url' -v '/webcon/=%{host}'' returned 1: rake aborted!
[ERROR 2020-05-26T10:26:36 main] PG::ConnectionBad: could not connect to server: No such file or directory
[ERROR 2020-05-26T10:26:36 main] 	Is the server running locally and accepting
[ERROR 2020-05-26T10:26:36 main] 	connections on Unix domain socket "/var/run/postgresql/.s.PGSQL.5432"?
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/pg-1.1.4/lib/pg.rb:56:in `initialize'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/pg-1.1.4/lib/pg.rb:56:in `new'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/pg-1.1.4/lib/pg.rb:56:in `connect'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/activerecord-5.2.1/lib/active_record/connection_adapters/postgresql_adapter.rb:684:in `connect'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/activerecord-5.2.1/lib/active_record/connection_adapters/postgresql_adapter.rb:215:in `initialize'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/activerecord-5.2.1/lib/active_record/connection_adapters/postgresql_adapter.rb:40:in `new'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/activerecord-5.2.1/lib/active_record/connection_adapters/postgresql_adapter.rb:40:in `postgresql_connection'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/activerecord-5.2.1/lib/active_record/connection_adapters/abstract/connection_pool.rb:809:in `new_connection'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/activerecord-5.2.1/lib/active_record/connection_adapters/abstract/connection_pool.rb:853:in `checkout_new_connection'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/activerecord-5.2.1/lib/active_record/connection_adapters/abstract/connection_pool.rb:832:in `try_to_checkout_new_connection'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/activerecord-5.2.1/lib/active_record/connection_adapters/abstract/connection_pool.rb:793:in `acquire_connection'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/activerecord-5.2.1/lib/active_record/connection_adapters/abstract/connection_pool.rb:521:in `checkout'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/activerecord-5.2.1/lib/active_record/connection_adapters/abstract/connection_pool.rb:380:in `connection'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/activerecord-5.2.1/lib/active_record/connection_adapters/abstract/connection_pool.rb:1008:in `retrieve_connection'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/activerecord-5.2.1/lib/active_record/connection_handling.rb:118:in `retrieve_connection'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/activerecord-5.2.1/lib/active_record/connection_handling.rb:90:in `connection'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/activerecord-5.2.1/lib/active_record/schema_migration.rb:22:in `table_exists?'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/activerecord-5.2.1/lib/active_record/schema_migration.rb:26:in `create_table'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/activerecord-5.2.1/lib/active_record/migration.rb:1187:in `initialize'
[ERROR 2020-05-26T10:26:36 main] /usr/share/foreman/app/registries/foreman/plugin.rb:343:in `new'
[ERROR 2020-05-26T10:26:36 main] /usr/share/foreman/app/registries/foreman/plugin.rb:343:in `pending_migrations'
[ERROR 2020-05-26T10:26:36 main] /usr/share/foreman/app/registries/foreman/plugin.rb:284:in `permission'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/foreman-tasks-1.1.1/lib/foreman_tasks/engine.rb:52:in `block (3 levels) in <class:Engine>'
[ERROR 2020-05-26T10:26:36 main] /usr/share/foreman/app/registries/foreman/plugin.rb:268:in `instance_eval'
[ERROR 2020-05-26T10:26:36 main] /usr/share/foreman/app/registries/foreman/plugin.rb:268:in `security_block'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/foreman-tasks-1.1.1/lib/foreman_tasks/engine.rb:51:in `block (2 levels) in <class:Engine>'
[ERROR 2020-05-26T10:26:36 main] /usr/share/foreman/app/registries/foreman/plugin.rb:76:in `instance_eval'
[ERROR 2020-05-26T10:26:36 main] /usr/share/foreman/app/registries/foreman/plugin.rb:76:in `register'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/foreman-tasks-1.1.1/lib/foreman_tasks/engine.rb:36:in `block in <class:Engine>'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/railties-5.2.1/lib/rails/initializable.rb:32:in `instance_exec'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/railties-5.2.1/lib/rails/initializable.rb:32:in `run'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/railties-5.2.1/lib/rails/initializable.rb:61:in `block in run_initializers'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/railties-5.2.1/lib/rails/initializable.rb:60:in `run_initializers'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/railties-5.2.1/lib/rails/application.rb:361:in `initialize!'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/railties-5.2.1/lib/rails/railtie.rb:190:in `public_send'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/railties-5.2.1/lib/rails/railtie.rb:190:in `method_missing'
[ERROR 2020-05-26T10:26:36 main] /usr/share/foreman/config/environment.rb:5:in `<top (required)>'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/polyglot-0.3.5/lib/polyglot.rb:65:in `require'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/activesupport-5.2.1/lib/active_support/dependencies.rb:287:in `block in require'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/activesupport-5.2.1/lib/active_support/dependencies.rb:253:in `load_dependency'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/activesupport-5.2.1/lib/active_support/dependencies.rb:287:in `require'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/railties-5.2.1/lib/rails/application.rb:337:in `require_environment!'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/railties-5.2.1/lib/rails/application.rb:520:in `block in run_tasks_blocks'
[ERROR 2020-05-26T10:26:36 main] /opt/rh/rh-ruby25/root/usr/share/gems/gems/rake-12.3.0/exe/rake:27:in `<top (required)>'
[ERROR 2020-05-26T10:26:36 main] Tasks: TOP => config => environment
[ERROR 2020-05-26T10:26:36 main] (See full trace by running task with --trace)
[ERROR 2020-05-26T10:26:36 main] Rubocop not loaded.
[ERROR 2020-05-26T10:26:36 main] /opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/util/execution.rb:297:in `execute'
[ERROR 2020-05-26T10:26:36 main] /usr/share/foreman-installer/modules/foreman/lib/puppet/provider/foreman_config_entry/cli.rb:12:in `block in run_foreman_config'
[ERROR 2020-05-26T10:26:36 main] /usr/share/foreman-installer/modules/foreman/lib/puppet/provider/foreman_config_entry/cli.rb:10:in `chdir'
[ERROR 2020-05-26T10:26:36 main] /usr/share/foreman-installer/modules/foreman/lib/puppet/provider/foreman_config_entry/cli.rb:10:in `run_foreman_config'
[ERROR 2020-05-26T10:26:36 main] /usr/share/foreman-installer/modules/foreman/lib/puppet/provider/foreman_config_entry/cli.rb:25:in `run_foreman_config'
[ERROR 2020-05-26T10:26:36 main] /usr/share/foreman-installer/modules/foreman/lib/puppet/provider/foreman_config_entry/cli.rb:68:in `value='
[ERROR 2020-05-26T10:26:36 main] /opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/property.rb:193:in `call_provider'
[ERROR 2020-05-26T10:26:36 main] /opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/property.rb:507:in `set'
[ERROR 2020-05-26T10:26:36 main] /opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/property.rb:570:in `sync'
[ERROR 2020-05-26T10:26:36 main] /opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/transaction/resource_harness.rb:241:in `sync'
[ERROR 2020-05-26T10:26:36 main] /opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/transaction/resource_harness.rb:136:in `sync_if_needed'
[ERROR 2020-05-26T10:26:36 main] /opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/transaction/resource_harness.rb:90:in `block in perform_changes'
[ERROR 2020-05-26T10:26:36 main] /opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/transaction/resource_harness.rb:89:in `each'
[ERROR 2020-05-26T10:26:36 main] /opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/transaction/resource_harness.rb:89:in `perform_changes'
[ERROR 2020-05-26T10:26:36 main] /opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/transaction/resource_harness.rb:21:in `evaluate'
[ERROR 2020-05-26T10:26:36 main] /opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/transaction.rb:267:in `apply'
[ERROR 2020-05-26T10:26:36 main] /opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/transaction.rb:287:in `eval_resource'
[ERROR 2020-05-26T10:26:36 main] /opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/transaction.rb:191:in `call'
[ERROR 2020-05-26T10:26:36 main] /opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/transaction.rb:191:in `block (2 levels) in evaluate'
[ERROR 2020-05-26T10:26:36 main] /opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/util.rb:521:in `block in thinmark'
[ERROR 2020-05-26T10:26:36 main] /opt/puppetlabs/puppet/lib/ruby/2.5.0/benchmark.rb:308:in `realtime'
[ERROR 2020-05-26T10:26:36 main] /opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/util.rb:520:in `thinmark'
[ERROR 2020-05-26T10:26:36 main] /opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/transaction.rb:191:in `block in evaluate'
[ERROR 2020-05-26T10:26:36 main] /opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/graph/relationship_graph.rb:122:in `traverse'
[ERROR 2020-05-26T10:26:36 main] /opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/transaction.rb:178:in `evaluate'
[ERROR 2020-05-26T10:26:36 main] /opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/resource/catalog.rb:240:in `block (2 levels) in apply'
[ERROR 2020-05-26T10:26:36 main] /opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/util.rb:521:in `block in thinmark'
[ERROR 2020-05-26T10:26:36 main] /opt/puppetlabs/puppet/lib/ruby/2.5.0/benchmark.rb:308:in `realtime'
[ERROR 2020-05-26T10:26:36 main] /opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/util.rb:520:in `thinmark'
[ERROR 2020-05-26T10:26:36 main] /opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/resource/catalog.rb:239:in `block in apply'
[ERROR 2020-05-26T10:26:36 main] /opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/util/log.rb:161:in `with_destination'
[ERROR 2020-05-26T10:26:36 main] /opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/transaction/report.rb:146:in `as_logging_destination'
[ERROR 2020-05-26T10:26:36 main] /opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/resource/catalog.rb:238:in `apply'
[ERROR 2020-05-26T10:26:36 main] /opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/configurer.rb:185:in `block (2 levels) in apply_catalog'
[ERROR 2020-05-26T10:26:36 main] /opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/util.rb:521:in `block in thinmark'
[ERROR 2020-05-26T10:26:36 main] /opt/puppetlabs/puppet/lib/ruby/2.5.0/benchmark.rb:308:in `realtime'
[ERROR 2020-05-26T10:26:36 main] /opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/util.rb:520:in `thinmark'
[ERROR 2020-05-26T10:26:36 main] /opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/configurer.rb:184:in `block in apply_catalog'
[ERROR 2020-05-26T10:26:36 main] /opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/util.rb:233:in `block in benchmark'
[ERROR 2020-05-26T10:26:36 main] /opt/puppetlabs/puppet/lib/ruby/2.5.0/benchmark.rb:308:in `realtime'
[ERROR 2020-05-26T10:26:36 main] /opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/util.rb:232:in `benchmark'
[ERROR 2020-05-26T10:26:36 main] /opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/configurer.rb:183:in `apply_catalog'
[ERROR 2020-05-26T10:26:36 main] /opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/configurer.rb:391:in `run_internal'
[ERROR 2020-05-26T10:26:36 main] /opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/configurer.rb:227:in `block in run'
[ERROR 2020-05-26T10:26:36 main] /opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/context.rb:62:in `override'
[ERROR 2020-05-26T10:26:36 main] /opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet.rb:290:in `override'
[ERROR 2020-05-26T10:26:36 main] /opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/configurer.rb:210:in `run'
[ERROR 2020-05-26T10:26:36 main] /opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/application/apply.rb:343:in `apply_catalog'
[ERROR 2020-05-26T10:26:36 main] /opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/application/apply.rb:260:in `block (2 levels) in main'
[ERROR 2020-05-26T10:26:36 main] /opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/context.rb:62:in `override'
[ERROR 2020-05-26T10:26:36 main] /opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet.rb:290:in `override'
[ERROR 2020-05-26T10:26:36 main] /opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/application/apply.rb:243:in `block in main'
[ERROR 2020-05-26T10:26:36 main] /opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/context.rb:62:in `override'
[ERROR 2020-05-26T10:26:36 main] /opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet.rb:290:in `override'
[ERROR 2020-05-26T10:26:36 main] /opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/application/apply.rb:207:in `main'
[ERROR 2020-05-26T10:26:36 main] /opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/application/apply.rb:177:in `run_command'
[ERROR 2020-05-26T10:26:36 main] /opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/application.rb:382:in `block in run'
[ERROR 2020-05-26T10:26:36 main] /opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/util.rb:710:in `exit_on_fail'
[ERROR 2020-05-26T10:26:36 main] /opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/application.rb:382:in `run'
[ERROR 2020-05-26T10:26:36 main] /opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/util/command_line.rb:143:in `run'
[ERROR 2020-05-26T10:26:36 main] /opt/puppetlabs/puppet/lib/ruby/vendor_ruby/puppet/util/command_line.rb:77:in `execute'
[ERROR 2020-05-26T10:26:36 main] /opt/puppetlabs/puppet/bin/puppet:5:in `<main>'
[ERROR 2020-05-26T10:26:36 main]  /Stage[main]/Foreman::Plugin::Remote_execution::Cockpit/Foreman_config_entry[remote_execution_cockpit_url]/value: change from '' to '/webcon/=%{host}' failed: Execution of '/usr/sbin/foreman-rake -- config -k 'remote_execution_cockpit_url' -v '/webcon/=%{host}'' returned 1: rake aborted!
[ERROR 2020-05-26T10:26:36 main] PG::ConnectionBad: could not connect to server: No such file or directory
[ERROR 2020-05-26T10:26:36 main] 	Is the server running locally and accepting
[ERROR 2020-05-26T10:26:36 main] 	connections on Unix domain socket "/var/run/postgresql/.s.PGSQL.5432"?
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/pg-1.1.4/lib/pg.rb:56:in `initialize'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/pg-1.1.4/lib/pg.rb:56:in `new'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/pg-1.1.4/lib/pg.rb:56:in `connect'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/activerecord-5.2.1/lib/active_record/connection_adapters/postgresql_adapter.rb:684:in `connect'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/activerecord-5.2.1/lib/active_record/connection_adapters/postgresql_adapter.rb:215:in `initialize'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/activerecord-5.2.1/lib/active_record/connection_adapters/postgresql_adapter.rb:40:in `new'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/activerecord-5.2.1/lib/active_record/connection_adapters/postgresql_adapter.rb:40:in `postgresql_connection'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/activerecord-5.2.1/lib/active_record/connection_adapters/abstract/connection_pool.rb:809:in `new_connection'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/activerecord-5.2.1/lib/active_record/connection_adapters/abstract/connection_pool.rb:853:in `checkout_new_connection'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/activerecord-5.2.1/lib/active_record/connection_adapters/abstract/connection_pool.rb:832:in `try_to_checkout_new_connection'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/activerecord-5.2.1/lib/active_record/connection_adapters/abstract/connection_pool.rb:793:in `acquire_connection'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/activerecord-5.2.1/lib/active_record/connection_adapters/abstract/connection_pool.rb:521:in `checkout'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/activerecord-5.2.1/lib/active_record/connection_adapters/abstract/connection_pool.rb:380:in `connection'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/activerecord-5.2.1/lib/active_record/connection_adapters/abstract/connection_pool.rb:1008:in `retrieve_connection'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/activerecord-5.2.1/lib/active_record/connection_handling.rb:118:in `retrieve_connection'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/activerecord-5.2.1/lib/active_record/connection_handling.rb:90:in `connection'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/activerecord-5.2.1/lib/active_record/schema_migration.rb:22:in `table_exists?'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/activerecord-5.2.1/lib/active_record/schema_migration.rb:26:in `create_table'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/activerecord-5.2.1/lib/active_record/migration.rb:1187:in `initialize'
[ERROR 2020-05-26T10:26:36 main] /usr/share/foreman/app/registries/foreman/plugin.rb:343:in `new'
[ERROR 2020-05-26T10:26:36 main] /usr/share/foreman/app/registries/foreman/plugin.rb:343:in `pending_migrations'
[ERROR 2020-05-26T10:26:36 main] /usr/share/foreman/app/registries/foreman/plugin.rb:284:in `permission'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/foreman-tasks-1.1.1/lib/foreman_tasks/engine.rb:52:in `block (3 levels) in <class:Engine>'
[ERROR 2020-05-26T10:26:36 main] /usr/share/foreman/app/registries/foreman/plugin.rb:268:in `instance_eval'
[ERROR 2020-05-26T10:26:36 main] /usr/share/foreman/app/registries/foreman/plugin.rb:268:in `security_block'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/foreman-tasks-1.1.1/lib/foreman_tasks/engine.rb:51:in `block (2 levels) in <class:Engine>'
[ERROR 2020-05-26T10:26:36 main] /usr/share/foreman/app/registries/foreman/plugin.rb:76:in `instance_eval'
[ERROR 2020-05-26T10:26:36 main] /usr/share/foreman/app/registries/foreman/plugin.rb:76:in `register'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/foreman-tasks-1.1.1/lib/foreman_tasks/engine.rb:36:in `block in <class:Engine>'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/railties-5.2.1/lib/rails/initializable.rb:32:in `instance_exec'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/railties-5.2.1/lib/rails/initializable.rb:32:in `run'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/railties-5.2.1/lib/rails/initializable.rb:61:in `block in run_initializers'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/railties-5.2.1/lib/rails/initializable.rb:60:in `run_initializers'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/railties-5.2.1/lib/rails/application.rb:361:in `initialize!'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/railties-5.2.1/lib/rails/railtie.rb:190:in `public_send'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/railties-5.2.1/lib/rails/railtie.rb:190:in `method_missing'
[ERROR 2020-05-26T10:26:36 main] /usr/share/foreman/config/environment.rb:5:in `<top (required)>'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/polyglot-0.3.5/lib/polyglot.rb:65:in `require'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/activesupport-5.2.1/lib/active_support/dependencies.rb:287:in `block in require'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/activesupport-5.2.1/lib/active_support/dependencies.rb:253:in `load_dependency'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/activesupport-5.2.1/lib/active_support/dependencies.rb:287:in `require'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/railties-5.2.1/lib/rails/application.rb:337:in `require_environment!'
[ERROR 2020-05-26T10:26:36 main] /opt/theforeman/tfm/root/usr/share/gems/gems/railties-5.2.1/lib/rails/application.rb:520:in `block in run_tasks_blocks'
[ERROR 2020-05-26T10:26:36 main] /opt/rh/rh-ruby25/root/usr/share/gems/gems/rake-12.3.0/exe/rake:27:in `<top (required)>'
[ERROR 2020-05-26T10:26:36 main] Tasks: TOP => config => environment
[ERROR 2020-05-26T10:26:36 main] (See full trace by running task with --trace)
[ERROR 2020-05-26T10:26:36 main] Rubocop not loaded.
[DEBUG 2020-05-26T10:26:36 main] Cleaning /tmp/kafo_installation20200526-20620-j6homf
[DEBUG 2020-05-26T10:26:36 main] Cleaning /tmp/kafo_installation20200526-20620-z66tpc
[DEBUG 2020-05-26T10:26:36 main] Cleaning /tmp/default_values.yaml
[DEBUG 2020-05-26T10:26:36 main] Exit with status code: 130 (signal was 130)
[DEBUG 2020-05-26T10:26:36 main] Cleaning /tmp/kafo_installation20200526-20620-j6homf
[DEBUG 2020-05-26T10:26:36 main] Cleaning /tmp/kafo_installation20200526-20620-z66tpc
[DEBUG 2020-05-26T10:26:36 main] Cleaning /tmp/default_values.yaml

@mschau the error gives the impression that PostgreSQL isn’t running. Is it? You can check with foreman-maintain service status. Did you take a backup of your system? You may want to restore the backup and check the service status before performing the upgrade again, to make sure everything was OK at that point.
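For example, a quick way to double-check that PostgreSQL is actually up and that the Unix socket from the error message exists (a rough sketch; adjust to your setup):

# Overall service status as seen by foreman-maintain
foreman-maintain service status

# Is PostgreSQL running, and is the socket the installer complains about present?
systemctl status postgresql
ls -l /var/run/postgresql/.s.PGSQL.5432

# Can we connect over that socket at all?
sudo -u postgres psql -c 'SELECT version();'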

Hi!

Yes, I have a snapshot. Katello itself works fine; here’s the output:

[root@srv ~]# foreman-maintain service status
Running Status Services 


Get status of applicable services:
Displaying the following service(s):

rh-mongodb34-mongod, postgresql, qdrouterd, qpidd, squid, pulp_celerybeat, 
pulp_resource_manager, pulp_streamer, pulp_workers, smart_proxy_dynflow_core, tomcat, 
dynflowd, httpd, puppetserver, foreman-proxy
| displaying rh-mongodb34-mongod
● rh-mongodb34-mongod.service - High-performance, schema-free document-oriented database
   Loaded: loaded (/usr/lib/systemd/system/rh-mongodb34-mongod.service; enabled; vendor preset: disabled)
   Active: active (running) since Tue 2020-05-26 10:39:44 CEST; 1 day 21h ago
  Process: 1988 ExecStart=/opt/rh/rh-mongodb34/root/usr/libexec/mongodb-scl-helper enable $RH_MONGODB34_SCLS_ENABLED -- /opt/rh/rh-mongodb34/root/usr/bin/mongod $OPTIONS run (code=exited, status=0/SUCCESS)
 Main PID: 2435 (mongod)
    Tasks: 95
   CGroup: /system.slice/rh-mongodb34-mongod.service
           └─2435 /opt/rh/rh-mongodb34/root/usr/bin/mongod -f /etc/opt/rh/rh-mongodb34/mongod.conf run

    May 28 02:03:29 srv.domain.local mongod.27017[2435]: [conn76] command pulp_database.units_rpm command: aggregate { aggregate: "units_rpm", pipeline: [ { $sort: { release: 1, epoch: 1, version: 1, arch: 1, name: 1 } }, { $project: { release: 1, epoch: 1, version: 1, arch: 1, name: 1 } } ], cursor: { batchSize: 5 }, allowDiskUse: true } planSummary: COLLSCAN cursorid:226516573496 keysExamined:0 docsExamined:76110 hasSortStage:1 numYields:596 nreturned:5 reslen:902 locks:{ Global: { acquireCount: { r: 1214 } }, Database: { acquireCount: { r: 607 }, acquireWaitCount: { r: 96 }, timeAcquiringMicros: { r: 5624 } }, Collection: { acquireCount: { r: 606 } } } protocol:op_query 463ms
    May 28 02:03:39 srv.domain.local mongod.27017[2435]: [conn83] command pulp_database.units_rpm command: aggregate { aggregate: "units_rpm", pipeline: [ { $sort: { release: 1, epoch: 1, version: 1, arch: 1, name: 1 } }, { $project: { release: 1, epoch: 1, version: 1, arch: 1, name: 1 } } ], cursor: { batchSize: 5 }, allowDiskUse: true } planSummary: COLLSCAN cursorid:224006523470 keysExamined:0 docsExamined:76110 hasSortStage:1 numYields:595 nreturned:5 reslen:902 locks:{ Global: { acquireCount: { r: 1212 } }, Database: { acquireCount: { r: 606 }, acquireWaitCount: { r: 103 }, timeAcquiringMicros: { r: 5577 } }, Collection: { acquireCount: { r: 605 } } } protocol:op_query 437ms
    May 28 02:03:47 srv.domain.local mongod.27017[2435]: [conn120] command pulp_database.units_rpm command: aggregate { aggregate: "units_rpm", pipeline: [ { $sort: { release: 1, epoch: 1, version: 1, arch: 1, name: 1 } }, { $project: { release: 1, epoch: 1, version: 1, arch: 1, name: 1 } } ], cursor: { batchSize: 5 }, allowDiskUse: true } planSummary: COLLSCAN cursorid:226694200350 keysExamined:0 docsExamined:76116 hasSortStage:1 numYields:596 nreturned:5 reslen:902 locks:{ Global: { acquireCount: { r: 1214 } }, Database: { acquireCount: { r: 607 }, acquireWaitCount: { r: 105 }, timeAcquiringMicros: { r: 5149 } }, Collection: { acquireCount: { r: 606 } } } protocol:op_query 482ms
    May 28 02:03:51 srv.domain.local mongod.27017[2435]: [conn78] command pulp_database.units_rpm command: aggregate { aggregate: "units_rpm", pipeline: [ { $sort: { release: 1, epoch: 1, version: 1, arch: 1, name: 1 } }, { $project: { release: 1, epoch: 1, version: 1, arch: 1, name: 1 } } ], cursor: { batchSize: 5 }, allowDiskUse: true } planSummary: COLLSCAN cursorid:226098160477 keysExamined:0 docsExamined:76121 hasSortStage:1 numYields:595 nreturned:5 reslen:902 locks:{ Global: { acquireCount: { r: 1212 } }, Database: { acquireCount: { r: 606 }, acquireWaitCount: { r: 72 }, timeAcquiringMicros: { r: 5142 } }, Collection: { acquireCount: { r: 605 } } } protocol:op_query 486ms
    May 28 02:03:53 srv.domain.local mongod.27017[2435]: [conn76] command pulp_database.units_rpm command: aggregate { aggregate: "units_rpm", pipeline: [ { $sort: { release: 1, epoch: 1, version: 1, arch: 1, name: 1 } }, { $project: { release: 1, epoch: 1, version: 1, arch: 1, name: 1 } } ], cursor: { batchSize: 5 }, allowDiskUse: true } planSummary: COLLSCAN cursorid:227128982716 keysExamined:0 docsExamined:76128 hasSortStage:1 numYields:596 nreturned:5 reslen:902 locks:{ Global: { acquireCount: { r: 1214 } }, Database: { acquireCount: { r: 607 }, acquireWaitCount: { r: 105 }, timeAcquiringMicros: { r: 5719 } }, Collection: { acquireCount: { r: 606 } } } protocol:op_query 470ms
    May 28 02:04:06 srv.domain.local mongod.27017[2435]: [conn119] command pulp_database.units_rpm command: aggregate { aggregate: "units_rpm", pipeline: [ { $sort: { release: 1, epoch: 1, version: 1, arch: 1, name: 1 } }, { $project: { release: 1, epoch: 1, version: 1, arch: 1, name: 1 } } ], cursor: { batchSize: 5 }, allowDiskUse: true } planSummary: COLLSCAN cursorid:225404856800 keysExamined:0 docsExamined:76151 hasSortStage:1 numYields:596 nreturned:5 reslen:902 locks:{ Global: { acquireCount: { r: 1214 } }, Database: { acquireCount: { r: 607 }, acquireWaitCount: { r: 88 }, timeAcquiringMicros: { r: 6031 } }, Collection: { acquireCount: { r: 606 } } } protocol:op_query 448ms
May 28 02:04:23 srv.domain.local mongod.27017[2435]: [conn80] command pulp_database.units_rpm command: aggregate { aggregate: "units_rpm", pipeline: [ { $sort: { release: 1, epoch: 1, version: 1, arch: 1, name: 1 } }, { $project: { release: 1, epoch: 1, version: 1, arch: 1, name: 1 } } ], cursor: { batchSize: 5 }, allowDiskUse: true } planSummary: COLLSCAN cursorid:225705288470 keysExamined:0 docsExamined:76151 hasSortStage:1 numYields:596 nreturned:5 reslen:902 locks:{ Global: { acquireCount: { r: 1214 } }, Database: { acquireCount: { r: 607 }, acquireWaitCount: { r: 102 }, timeAcquiringMicros: { r: 6161 } }, Collection: { acquireCount: { r: 606 } } } protocol:op_query 475ms
May 28 02:04:23 srv.domain.local mongod.27017[2435]: [conn121] command pulp_database.units_rpm command: aggregate { aggregate: "units_rpm", pipeline: [ { $sort: { release: 1, epoch: 1, version: 1, arch: 1, name: 1 } }, { $project: { release: 1, epoch: 1, version: 1, arch: 1, name: 1 } } ], cursor: { batchSize: 5 }, allowDiskUse: true } planSummary: COLLSCAN cursorid:224474827642 keysExamined:0 docsExamined:76151 hasSortStage:1 numYields:597 nreturned:5 reslen:902 locks:{ Global: { acquireCount: { r: 1216 } }, Database: { acquireCount: { r: 608 }, acquireWaitCount: { r: 94 }, timeAcquiringMicros: { r: 5539 } }, Collection: { acquireCount: { r: 607 } } } protocol:op_query 427ms
May 28 02:07:44 srv.domain.local mongod.27017[2435]: [conn88] command pulp_database.units_rpm command: aggregate { aggregate: "units_rpm", pipeline: [ { $sort: { release: 1, epoch: 1, version: 1, arch: 1, name: 1 } }, { $project: { release: 1, epoch: 1, version: 1, arch: 1, name: 1 } } ], cursor: { batchSize: 5 }, allowDiskUse: true } planSummary: COLLSCAN cursorid:223514879017 keysExamined:0 docsExamined:76162 hasSortStage:1 numYields:597 nreturned:5 reslen:902 locks:{ Global: { acquireCount: { r: 1216 } }, Database: { acquireCount: { r: 608 } }, Collection: { acquireCount: { r: 607 } } } protocol:op_query 568ms
May 28 07:39:53 srv.domain.local mongod.27017[2435]: [conn122] received client metadata from 127.0.0.1:34020 conn122: { application: { name: "MongoDB Shell" }, driver: { name: "MongoDB Internal Client", version: "3.4.9" }, os: { type: "Linux", name: "CentOS Linux release 7.7.1908 (Core)", architecture: "x86_64", version: "Kernel 3.10.0-1062.9.1.el7.x86_64" } }
| displaying postgresql
● postgresql.service - PostgreSQL database server
   Loaded: loaded (/etc/systemd/system/postgresql.service; enabled; vendor preset: disabled)
   Active: active (running) since Tue 2020-05-26 10:39:45 CEST; 1 day 21h ago
  Process: 2083 ExecStart=/usr/bin/pg_ctl start -D ${PGDATA} -s -o -p ${PGPORT} -w -t 300 (code=exited, status=0/SUCCESS)
  Process: 2025 ExecStartPre=/usr/bin/postgresql-check-db-dir ${PGDATA} (code=exited, status=0/SUCCESS)
 Main PID: 2175 (postgres)
    Tasks: 31
   CGroup: /system.slice/postgresql.service
       ├─ 1165 postgres: foreman foreman [local] idle
       ├─ 1168 postgres: foreman foreman [local] idle
       ├─ 1170 postgres: foreman foreman [local] idle
       ├─ 2140 postgres: foreman foreman [local] idle
       ├─ 2175 /usr/bin/postgres -D /var/lib/pgsql/data -p 5432
       ├─ 2297 postgres: logger process
       ├─ 2382 postgres: foreman foreman [local] idle
       ├─ 2453 postgres: foreman foreman [local] idle
       ├─ 2837 postgres: checkpointer process
       ├─ 2838 postgres: writer process
       ├─ 2839 postgres: wal writer process
       ├─ 2840 postgres: autovacuum launcher process
       ├─ 2841 postgres: stats collector process
       ├─ 2978 postgres: foreman foreman [local] idle
       ├─ 3667 postgres: candlepin candlepin 127.0.0.1(33040) idl
       ├─ 3668 postgres: candlepin candlepin 127.0.0.1(33042) idl
       ├─ 3669 postgres: candlepin candlepin 127.0.0.1(33044) idl
       ├─ 3675 postgres: candlepin candlepin 127.0.0.1(33046) idl
       ├─ 3676 postgres: candlepin candlepin 127.0.0.1(33048) idl
       ├─ 3738 postgres: foreman foreman [local] idle
       ├─ 3755 postgres: foreman foreman [local] idle
       ├─ 3756 postgres: foreman foreman [local] idle
       ├─ 3757 postgres: foreman foreman [local] idle
       ├─ 3993 postgres: foreman foreman [local] idle
       ├─ 4124 postgres: candlepin candlepin 127.0.0.1(57854) idl
       ├─ 4666 postgres: candlepin candlepin 127.0.0.1(58022) idl
       ├─ 4779 postgres: candlepin candlepin 127.0.0.1(58066) idl
       ├─ 5154 postgres: candlepin candlepin 127.0.0.1(58206) idl
       ├─ 5266 postgres: candlepin candlepin 127.0.0.1(58250) idl
       ├─ 7144 postgres: foreman foreman [local] idle
       └─10565 postgres: foreman foreman [local] idle

May 26 10:39:43 srv.domain.local systemd[1]: Starting PostgreSQL database server...
May 26 10:39:43 srv.domain.local pg_ctl[2083]: pg_ctl: another server might be running; trying to start server anyway
May 26 10:39:45 srv.domain.local systemd[1]: Started PostgreSQL database server.
| displaying qdrouterd
● qdrouterd.service - Qpid Dispatch router daemon
Loaded: loaded (/usr/lib/systemd/system/qdrouterd.service; enabled; vendor preset: disabled)
  Drop-In: /etc/systemd/system/qdrouterd.service.d
           └─90-limits.conf
   Active: active (running) since Tue 2020-05-26 10:39:43 CEST; 1 day 21h ago
 Main PID: 1990 (qdrouterd)
    Tasks: 13
   CGroup: /system.slice/qdrouterd.service
           └─1990 /usr/sbin/qdrouterd -c /etc/qpid-dispatch/qdrouterd.conf

May 26 10:40:17 srv.domain.local qdrouterd[1990]: 2020-05-26 10:40:17.347055 +0200 SERVER (info) [C15] Accepted connection to :5647 from 10.1.64.62:51150
May 26 10:40:17 srv.domain.local qdrouterd[1990]: 2020-05-26 10:40:17.355680 +0200 SERVER (info) [C16] Accepted connection to :5647 from 10.1.64.61:43496
May 26 10:40:17 srv.domain.local qdrouterd[1990]: 2020-05-26 10:40:17.360858 +0200 ROUTER (info) [C13] Connection Opened: dir=in host=10.1.64.51:36960 vhost= encrypted=TLSv1/SSLv3 auth=ANONYMOUS user=anonymous container_id=8704a7e0-5579-4652-92f8-d47e8d9e19c0 props=
May 26 10:40:17 srv.domain.local qdrouterd[1990]: 2020-05-26 10:40:17.361948 +0200 ROUTER (info) [C14] Connection Opened: dir=in host=10.1.64.52:58360 vhost= encrypted=TLSv1/SSLv3 auth=ANONYMOUS user=anonymous container_id=b256d14f-5949-4ff6-821e-0e0bd305b09e props=
May 26 10:40:17 srv.domain.local qdrouterd[1990]: 2020-05-26 10:40:17.363407 +0200 ROUTER (info) [C13][L20] Link attached: dir=out source={pulp.agent.9579e839-ae79-49ae-8b70-7302c6337ec6 expire:sess} target={<none> expire:sess}
May 26 10:40:17 srv.domain.local qdrouterd[1990]: 2020-05-26 10:40:17.364314 +0200 ROUTER (info) [C14][L22] Link attached: dir=out source={pulp.agent.39a54b50-cc2b-4a1d-b0ae-7eb3ac999184 expire:sess} target={<none> expire:sess}
May 26 10:40:17 srv.domain.local qdrouterd[1990]: 2020-05-26 10:40:17.364992 +0200 ROUTER (info) [C15] Connection Opened: dir=in host=10.1.64.62:51150 vhost= encrypted=TLSv1/SSLv3 auth=ANONYMOUS user=anonymous container_id=89b4bb34-02e7-4716-a69f-4317d64031ea props=
May 26 10:40:17 srv.domain.local qdrouterd[1990]: 2020-05-26 10:40:17.367284 +0200 ROUTER (info) [C15][L24] Link attached: dir=out source={pulp.agent.29108fce-38b7-485d-942f-3f4fb25ee3da expire:sess} target={<none> expire:sess}
May 26 10:40:17 srv.domain.local qdrouterd[1990]: 2020-05-26 10:40:17.375140 +0200 ROUTER (info) [C16] Connection Opened: dir=in host=10.1.64.61:43496 vhost= encrypted=TLSv1/SSLv3 auth=ANONYMOUS user=anonymous container_id=43bda148-4384-4ef4-94cb-04c7b2a94520 props=
May 26 10:40:17 srv.domain.local qdrouterd[1990]: 2020-05-26 10:40:17.377618 +0200 ROUTER (info) [C16][L26] Link attached: dir=out source={pulp.agent.4ac5c3ae-d78f-4d36-8a11-e9519f9508c6 expire:sess} target={<none> expire:sess}
| displaying qpidd
● qpidd.service - An AMQP message broker daemon.
   Loaded: loaded (/usr/lib/systemd/system/qpidd.service; enabled; vendor preset: disabled)
  Drop-In: /etc/systemd/system/qpidd.service.d
           └─90-limits.conf, wait-for-port.conf
   Active: active (running) since Tue 2020-05-26 10:39:46 CEST; 1 day 21h ago
     Docs: man:qpidd(1)
           http://qpid.apache.org/
  Process: 1986 ExecStartPost=/bin/bash -c while ! nc -z 127.0.0.1 5671; do sleep 1; done (code=exited, status=0/SUCCESS)
 Main PID: 1985 (qpidd)
    Tasks: 14
   CGroup: /system.slice/qpidd.service
           └─1985 /usr/sbin/qpidd --config /etc/qpid/qpidd.conf

May 26 10:39:43 srv.domain.local systemd[1]: Starting An AMQP message broker daemon....
May 26 10:39:46 srv.domain.local qpidd[1985]: 2020-05-26 10:39:46 [System] error Error reading socket: Encountered end of file [-5938]
May 26 10:39:46 srv.domain.local systemd[1]: Started An AMQP message broker daemon..
May 26 10:39:46 srv.domain.local qpidd[1985]: 2020-05-26 10:39:46 [System] error Error reading socket: Encountered end of file [-5938]
| displaying squid
● squid.service - Squid caching proxy
   Loaded: loaded (/usr/lib/systemd/system/squid.service; enabled; vendor preset: disabled)
   Active: active (running) since Tue 2020-05-26 10:39:43 CEST; 1 day 21h ago
  Process: 2073 ExecStart=/usr/sbin/squid $SQUID_OPTS -f $SQUID_CONF (code=exited, status=0/SUCCESS)
  Process: 1999 ExecStartPre=/usr/libexec/squid/cache_swap.sh (code=exited, status=0/SUCCESS)
 Main PID: 2113 (squid)
    Tasks: 3
   CGroup: /system.slice/squid.service
           ├─2113 /usr/sbin/squid -f /etc/squid/squid.conf
           ├─2116 (squid-1) -f /etc/squid/squid.conf
           └─2176 (logfile-daemon) /var/log/squid/access.log

May 26 10:39:43 srv.domain.local systemd[1]: Starting Squid caching proxy...
May 26 10:39:43 srv.domain.local squid[2113]: Squid Parent: will start 1 kids
May 26 10:39:43 srv.domain.local squid[2113]: Squid Parent: (squid-1) process 2116 started
May 26 10:39:43 srv.domain.local systemd[1]: Started Squid caching proxy.
/ displaying pulp_celerybeat
● pulp_celerybeat.service - Pulp's Celerybeat
   Loaded: loaded (/usr/lib/systemd/system/pulp_celerybeat.service; enabled; vendor preset: disabled)
   Active: active (running) since Tue 2020-05-26 10:39:43 CEST; 1 day 21h ago
 Main PID: 2049 (celery)
    Tasks: 5
   CGroup: /system.slice/pulp_celerybeat.service
           └─2049 /usr/bin/python /usr/bin/celery beat --app=pulp.server.async.celery_instance.celery --scheduler=pulp.server.async.scheduler.Scheduler

May 28 06:09:47 srv.domain.local pulp[2049]: celery.beat:INFO: Scheduler: Sending due task download_deferred_content (pulp.server.controllers.repository.queue_download_deferred)
May 28 06:19:47 srv.domain.local pulp[2049]: celery.beat:INFO: Scheduler: Sending due task download_deferred_content (pulp.server.controllers.repository.queue_download_deferred)
May 28 06:29:47 srv.domain.local pulp[2049]: celery.beat:INFO: Scheduler: Sending due task download_deferred_content (pulp.server.controllers.repository.queue_download_deferred)
May 28 06:39:47 srv.domain.local pulp[2049]: celery.beat:INFO: Scheduler: Sending due task download_deferred_content (pulp.server.controllers.repository.queue_download_deferred)
May 28 06:49:47 srv.domain.local pulp[2049]: celery.beat:INFO: Scheduler: Sending due task download_deferred_content (pulp.server.controllers.repository.queue_download_deferred)
May 28 06:59:47 srv.domain.local pulp[2049]: celery.beat:INFO: Scheduler: Sending due task download_deferred_content (pulp.server.controllers.repository.queue_download_deferred)
May 28 07:09:47 srv.domain.local pulp[2049]: celery.beat:INFO: Scheduler: Sending due task download_deferred_content (pulp.server.controllers.repository.queue_download_deferred)
May 28 07:19:47 srv.domain.local pulp[2049]: celery.beat:INFO: Scheduler: Sending due task download_deferred_content (pulp.server.controllers.repository.queue_download_deferred)
May 28 07:29:47 srv.domain.local pulp[2049]: celery.beat:INFO: Scheduler: Sending due task download_deferred_content (pulp.server.controllers.repository.queue_download_deferred)
May 28 07:39:47 srv.domain.local pulp[2049]: celery.beat:INFO: Scheduler: Sending due task download_deferred_content (pulp.server.controllers.repository.queue_download_deferred)
/ displaying pulp_resource_manager
● pulp_resource_manager.service - Pulp Resource Manager
   Loaded: loaded (/usr/lib/systemd/system/pulp_resource_manager.service; enabled; vendor preset: disabled)
   Active: active (running) since Tue 2020-05-26 10:39:43 CEST; 1 day 21h ago
 Main PID: 2044 (celery)
    Tasks: 10
   CGroup: /system.slice/pulp_resource_manager.service
           ├─2044 /usr/bin/python /usr/bin/celery worker -A pulp.server.async.app -n resource_manager@%h -Q resource_manager -c 1 --events --umask 18 --pidfile=/var/run/pulp/resource_manager.pid
           └─3272 /usr/bin/python /usr/bin/celery worker -A pulp.server.async.app -n resource_manager@%h -Q resource_manager -c 1 --events --umask 18 --pidfile=/var/run/pulp/resource_manager.pid

May 28 02:04:13 srv.domain.local pulp[3272]: celery.app.trace:INFO: [7c016969] Task pulp.server.async.tasks._queue_reserved_task[7c016969-7ddc-410c-80d7-6f0e3bbb12d4] succeeded in 0.0165078940045s: None
May 28 02:04:14 srv.domain.local pulp[3272]: celery.app.trace:INFO: [dafdb6e2] Task pulp.server.async.tasks._queue_reserved_task[dafdb6e2-2a11-48b7-bb47-b5c871fd4b1c] succeeded in 0.788449515007s: None
May 28 02:04:29 srv.domain.local pulp[2044]: celery.worker.strategy:INFO: Received task: pulp.server.async.tasks._queue_reserved_task[95b12184-44f4-4449-a1de-d208a30fdd27]
May 28 02:04:29 srv.domain.local pulp[3272]: celery.app.trace:INFO: [95b12184] Task pulp.server.async.tasks._queue_reserved_task[95b12184-44f4-4449-a1de-d208a30fdd27] succeeded in 0.0147795150115s: None
May 28 02:04:31 srv.domain.local pulp[2044]: celery.worker.strategy:INFO: Received task: pulp.server.async.tasks._queue_reserved_task[63cf1dbb-c407-471f-bbc8-7966ed815f07]
May 28 02:04:31 srv.domain.local pulp[3272]: celery.app.trace:INFO: [63cf1dbb] Task pulp.server.async.tasks._queue_reserved_task[63cf1dbb-c407-471f-bbc8-7966ed815f07] succeeded in 0.0197392689879s: None
May 28 02:06:36 srv.domain.local pulp[2044]: celery.worker.strategy:INFO: Received task: pulp.server.async.tasks._queue_reserved_task[6f77eafa-cfc5-4376-b3ed-eef498b0b985]
May 28 02:06:36 srv.domain.local pulp[3272]: celery.app.trace:INFO: [6f77eafa] Task pulp.server.async.tasks._queue_reserved_task[6f77eafa-cfc5-4376-b3ed-eef498b0b985] succeeded in 0.0176256179984s: None
May 28 02:07:51 srv.domain.local pulp[2044]: celery.worker.strategy:INFO: Received task: pulp.server.async.tasks._queue_reserved_task[d27bbf25-723d-47f3-8104-2b0feaf35cdc]
May 28 02:07:51 srv.domain.local pulp[3272]: celery.app.trace:INFO: [d27bbf25] Task pulp.server.async.tasks._queue_reserved_task[d27bbf25-723d-47f3-8104-2b0feaf35cdc] succeeded in 0.0165938240243s: None
/ displaying pulp_streamer
● pulp_streamer.service - The Pulp lazy content loading streamer
   Loaded: loaded (/usr/lib/systemd/system/pulp_streamer.service; enabled; vendor preset: disabled)
   Active: active (running) since Tue 2020-05-26 10:39:43 CEST; 1 day 21h ago
 Main PID: 2035 (pulp_streamer)
    Tasks: 3
   CGroup: /system.slice/pulp_streamer.service
           └─2035 /usr/bin/python /usr/bin/pulp_streamer --nodaemon --syslog --prefix=pulp_streamer --pidfile= --python /usr/share/pulp/wsgi/streamer.tac

May 26 10:39:46 srv.domain.local pulp_streamer[2035]: pulp.plugins.loader.manager:INFO: Loaded plugin deb_importer for types: deb,deb_component,deb_release
May 26 10:39:46 srv.domain.local pulp_streamer[2035]: pulp.plugins.loader.manager:INFO: Loaded plugin puppet_whole_repo_profiler for types: puppet_module
May 26 10:39:46 srv.domain.local pulp_streamer[2035]: pulp.plugins.loader.manager:INFO: Loaded plugin yum_profiler for types: rpm,erratum,modulemd
May 26 10:39:46 srv.domain.local pulp_streamer[2035]: pulp.plugins.loader.manager:INFO: Loaded plugin yum for types: rpm
May 26 10:39:46 srv.domain.local pulp_streamer[2035]: pulp.plugins.loader.manager:INFO: Loaded plugin rhui for types: rpm
May 26 10:39:46 srv.domain.local pulp_streamer[2035]: [-] Log opened.
May 26 10:39:46 srv.domain.local pulp_streamer[2035]: [-] twistd 12.2.0 (/usr/bin/python 2.7.5) starting up.
May 26 10:39:46 srv.domain.local pulp_streamer[2035]: [-] reactor class: twisted.internet.epollreactor.EPollReactor.
May 26 10:39:46 srv.domain.local pulp_streamer[2035]: [-] Site starting on 8751
May 26 10:39:46 srv.domain.local pulp_streamer[2035]: [-] Starting factory <twisted.web.server.Site instance at 0x7f5fb7b56e18>
/ displaying pulp_workers
● pulp_workers.service - Pulp Celery Workers
   Loaded: loaded (/usr/lib/systemd/system/pulp_workers.service; enabled; vendor preset: disabled)
   Active: active (exited) since Tue 2020-05-26 10:39:43 CEST; 1 day 21h ago
  Process: 2045 ExecStart=/usr/bin/python -m pulp.server.async.manage_workers start (code=exited, status=0/SUCCESS)
 Main PID: 2045 (code=exited, status=0/SUCCESS)
    Tasks: 0
   CGroup: /system.slice/pulp_workers.service

May 26 10:39:43 srv.domain.local systemd[1]: Starting Pulp Celery Workers...
May 26 10:39:43 srv.domain.local systemd[1]: Started Pulp Celery Workers.
/ displaying smart_proxy_dynflow_core
● smart_proxy_dynflow_core.service - Foreman smart proxy dynflow core service
   Loaded: loaded (/usr/lib/systemd/system/smart_proxy_dynflow_core.service; enabled; vendor preset: disabled)
  Drop-In: /etc/systemd/system/smart_proxy_dynflow_core.service.d
           └─90-limits.conf
   Active: active (running) since Tue 2020-05-26 10:39:46 CEST; 1 day 21h ago
     Docs: https://github.com/theforeman/smart_proxy_dynflow
  Process: 2023 ExecStart=/usr/bin/smart_proxy_dynflow_core -d -p /var/run/foreman-proxy/smart_proxy_dynflow_core.pid (code=exited, status=0/SUCCESS)
 Main PID: 3267 (ruby)
    Tasks: 8
   CGroup: /system.slice/smart_proxy_dynflow_core.service
           └─3267 ruby /usr/bin/smart_proxy_dynflow_core -d -p /var/run/foreman-proxy/smart_proxy_dynflow_core.pid

May 26 10:39:43 srv.domain.local systemd[1]: Starting Foreman smart proxy dynflow core service...
May 26 10:39:46 srv.domain.local systemd[1]: Started Foreman smart proxy dynflow core service.
/ displaying tomcat
● tomcat.service - Apache Tomcat Web Application Container
   Loaded: loaded (/usr/lib/systemd/system/tomcat.service; enabled; vendor preset: disabled)
   Active: active (running) since Tue 2020-05-26 10:39:43 CEST; 1 day 21h ago
 Main PID: 1993 (java)
    Tasks: 114
   CGroup: /system.slice/tomcat.service
           └─1993 /usr/lib/jvm/jre/bin/java -Xms1024m -Xmx4096m -classpath /usr/share/tomcat/bin/bootstrap.jar:/usr/share/tomcat/bin/tomcat-juli.jar:/usr/share/java/commons-daemon.jar -Dcatalina.base=/usr/share/tomcat -Dcatalina.home=/usr/share/tomcat -Djava.endorsed.dirs= -Djava.io.tmpdir=/var/cache/tomcat/temp -Djava.util.logging.config.file=/usr/share/tomcat/conf/logging.properties -Djava.util.logging.manager=org.apache.juli.ClassLoaderLogManager org.apache.catalina.startup.Bootstrap start

May 26 10:39:53 srv.domain.local server[1993]: May 26, 2020 10:39:53 AM com.google.inject.internal.ProxyFactory <init>
May 26 10:39:53 srv.domain.local server[1993]: WARNING: Method [public org.candlepin.model.Persisted org.candlepin.model.RulesCurator.create(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@39b7adb1]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
May 26 10:39:53 srv.domain.local server[1993]: May 26, 2020 10:39:53 AM com.google.inject.internal.ProxyFactory <init>
May 26 10:39:53 srv.domain.local server[1993]: WARNING: Method [public void org.candlepin.model.EntitlementCertificateCurator.delete(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@39b7adb1]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
May 26 10:40:00 srv.domain.local server[1993]: May 26, 2020 10:40:00 AM org.apache.catalina.startup.HostConfig deployDirectory
May 26 10:40:00 srv.domain.local server[1993]: INFO: Deployment of web application directory /var/lib/tomcat/webapps/candlepin has finished in 14,455 ms
May 26 10:40:00 srv.domain.local server[1993]: May 26, 2020 10:40:00 AM org.apache.coyote.AbstractProtocol start
May 26 10:40:00 srv.domain.local server[1993]: INFO: Starting ProtocolHandler ["http-bio-8443"]
May 26 10:40:00 srv.domain.local server[1993]: May 26, 2020 10:40:00 AM org.apache.catalina.startup.Catalina start
May 26 10:40:00 srv.domain.local server[1993]: INFO: Server startup in 14527 ms
/ displaying dynflowd
● dynflowd.service - Foreman jobs daemon
   Loaded: loaded (/usr/lib/systemd/system/dynflowd.service; enabled; vendor preset: disabled)
   Active: active (running) since Tue 2020-05-26 10:39:50 CEST; 1 day 21h ago
     Docs: https://theforeman.org
  Process: 2020 ExecStart=/usr/sbin/dynflowd start (code=exited, status=0/SUCCESS)
    Tasks: 12
   CGroup: /system.slice/dynflowd.service
           ├─3502 dynflow_executor_monitor
           └─3505 dynflow_executor

May 26 10:39:43 srv.domain.local systemd[1]: Starting Foreman jobs daemon...
May 26 10:39:49 srv.domain.local dynflowd[2020]: /usr/share/foreman/lib/foreman.rb:8: warning: already initialized constant Foreman::UUID_REGEXP
May 26 10:39:49 srv.domain.local dynflowd[2020]: /usr/share/foreman/lib/foreman.rb:8: warning: previous definition of UUID_REGEXP was here
May 26 10:39:49 srv.domain.local dynflowd[2020]: Dynflow Executor: start in progress
May 26 10:39:50 srv.domain.local dynflowd[2020]: /opt/theforeman/tfm/root/usr/share/gems/gems/daemons-1.2.3/lib/daemons/daemonize.rb:108: warning: conflicting chdir during another chdir block
May 26 10:39:50 srv.domain.local dynflowd[2020]: /opt/theforeman/tfm/root/usr/share/gems/gems/daemons-1.2.3/lib/daemons/daemonize.rb:75: warning: conflicting chdir during another chdir block
May 26 10:39:50 srv.domain.local dynflowd[2020]: dynflow_executor: process with pid 3505 started.
May 26 10:39:50 srv.domain.local systemd[1]: Started Foreman jobs daemon.
/ displaying httpd
● httpd.service - The Apache HTTP Server
   Loaded: loaded (/usr/lib/systemd/system/httpd.service; enabled; vendor preset: disabled)
   Active: active (running) since Tue 2020-05-26 10:39:43 CEST; 1 day 21h ago
     Docs: man:httpd(8)
           man:apachectl(8)
 Main PID: 2033 (httpd)
   Status: "Total requests: 0; Current requests/sec: 0; Current traffic:   0 B/sec"
    Tasks: 169
   CGroup: /system.slice/httpd.service
           ├─1157 Passenger RackApp: /usr/share/foreman
           ├─2033 /usr/sbin/httpd -DFOREGROUND
           ├─2498 (wsgi:pulp)     -DFOREGROUND
           ├─2499 (wsgi:pulp)     -DFOREGROUND
           ├─2500 (wsgi:pulp)     -DFOREGROUND
           ├─2501 (wsgi:pulp-cont -DFOREGROUND
           ├─2503 (wsgi:pulp-cont -DFOREGROUND
           ├─2505 (wsgi:pulp-cont -DFOREGROUND
           ├─2506 PassengerWatchdog
           ├─2524 PassengerHelperAgent
           ├─2538 PassengerLoggingAgent
           ├─2549 /usr/sbin/httpd -DFOREGROUND
           ├─2550 /usr/sbin/httpd -DFOREGROUND
           ├─2551 /usr/sbin/httpd -DFOREGROUND
           ├─2553 /usr/sbin/httpd -DFOREGROUND
           ├─2556 /usr/sbin/httpd -DFOREGROUND
           ├─2559 /usr/sbin/httpd -DFOREGROUND
           ├─2561 /usr/sbin/httpd -DFOREGROUND
           ├─2562 /usr/sbin/httpd -DFOREGROUND
           ├─3148 Passenger AppPreloader: /usr/share/foreman
           ├─3838 /usr/sbin/httpd -DFOREGROUND
           ├─3964 /usr/sbin/httpd -DFOREGROUND
           ├─3984 /usr/sbin/httpd -DFOREGROUND
           └─3985 /usr/sbin/httpd -DFOREGROUND

May 28 02:00:50 srv.domain.local pulp[2498]: kombu.transport.qpid:INFO: Connected to qpid with SASL mechanism ANONYMOUS
May 28 02:00:51 srv.domain.local pulp[2499]: kombu.transport.qpid:INFO: Connected to qpid with SASL mechanism ANONYMOUS
May 28 02:00:52 srv.domain.local pulp[2498]: kombu.transport.qpid:INFO: Connected to qpid with SASL mechanism ANONYMOUS
May 28 02:00:53 srv.domain.local pulp[2499]: kombu.transport.qpid:INFO: Connected to qpid with SASL mechanism ANONYMOUS
May 28 02:00:53 srv.domain.local pulp[2500]: kombu.transport.qpid:INFO: Connected to qpid with SASL mechanism ANONYMOUS
May 28 02:00:55 srv.domain.local pulp[2499]: kombu.transport.qpid:INFO: Connected to qpid with SASL mechanism ANONYMOUS
May 28 02:00:56 srv.domain.local pulp[2500]: kombu.transport.qpid:INFO: Connected to qpid with SASL mechanism ANONYMOUS
May 28 02:00:56 srv.domain.local pulp[2500]: kombu.transport.qpid:INFO: Connected to qpid with SASL mechanism ANONYMOUS
May 28 02:00:57 srv.domain.local pulp[2499]: kombu.transport.qpid:INFO: Connected to qpid with SASL mechanism ANONYMOUS
May 28 02:00:58 srv.domain.local pulp[2500]: kombu.transport.qpid:INFO: Connected to qpid with SASL mechanism ANONYMOUS
/ displaying puppetserver
● puppetserver.service - puppetserver Service
   Loaded: loaded (/usr/lib/systemd/system/puppetserver.service; enabled; vendor preset: disabled)
   Active: active (running) since Tue 2020-05-26 10:40:01 CEST; 1 day 20h ago
  Process: 1991 ExecStart=/opt/puppetlabs/server/apps/puppetserver/bin/puppetserver start (code=exited, status=0/SUCCESS)
 Main PID: 2095 (java)
    Tasks: 65 (limit: 4915)
   CGroup: /system.slice/puppetserver.service
           └─2095 /usr/bin/java -Xms2G -Xmx2G -Djruby.logger.class=com.puppetlabs.jruby_utils.jruby.Slf4jLogger -XX:OnOutOfMemoryError=kill -9 %p -cp /opt/puppetlabs/server/apps/puppetserver/puppet-server-release.jar:/opt/puppetlabs/puppet/lib/ruby/vendor_ruby/facter.jar:/opt/puppetlabs/server/data/puppetserver/jars/* clojure.main -m puppetlabs.trapperkeeper.main --config /etc/puppetlabs/puppetserver/conf.d --bootstrap-config /etc/puppetlabs/puppetserver/services.d/,/opt/puppetlabs/server/apps/puppetserver/config/services.d/ --restart-file /opt/puppetlabs/server/data/puppetserver/restartcounter

May 26 10:39:43 srv.domain.local systemd[1]: Starting puppetserver Service...
May 26 10:40:01 srv.domain.local systemd[1]: Started puppetserver Service.
/ displaying foreman-proxy
● foreman-proxy.service - Foreman Proxy
   Loaded: loaded (/usr/lib/systemd/system/foreman-proxy.service; enabled; vendor preset: disabled)
   Active: active (running) since Tue 2020-05-26 10:39:45 CEST; 1 day 21h ago
 Main PID: 2021 (ruby)
    Tasks: 8
   CGroup: /system.slice/foreman-proxy.service
           └─2021 ruby /usr/share/foreman-proxy/bin/smart-proxy --no-daemonize

May 26 10:40:36 srv.domain.local smart-proxy[2021]: srv.domain.local - - [26/May/2020:10:40:36 CEST] "GET /version HTTP/1.1" 200 247
May 26 10:40:36 srv.domain.local smart-proxy[2021]: - -> /version
May 26 17:01:06 srv.domain.local smart-proxy[2021]: srv.domain.local - - [26/May/2020:17:01:06 CEST] "GET /pulp/status/disk_usage HTTP/1.1" 200 423
May 26 17:01:06 srv.domain.local smart-proxy[2021]: - -> /pulp/status/disk_usage
May 27 05:01:18 srv.domain.local smart-proxy[2021]: srv.domain.local - - [27/May/2020:05:01:18 CEST] "GET /pulp/status/disk_usage HTTP/1.1" 200 423
May 27 05:01:18 srv.domain.local smart-proxy[2021]: - -> /pulp/status/disk_usage
May 27 17:01:29 srv.domain.local smart-proxy[2021]: srv.domain.local - - [27/May/2020:17:01:29 CEST] "GET /pulp/status/disk_usage HTTP/1.1" 200 423
May 27 17:01:29 srv.domain.local smart-proxy[2021]: - -> /pulp/status/disk_usage
May 28 05:01:42 srv.domain.local smart-proxy[2021]: srv.domain.local - - [28/May/2020:05:01:42 CEST] "GET /pulp/status/disk_usage HTTP/1.1" 200 423
May 28 05:01:42 srv.domain.local smart-proxy[2021]: - -> /pulp/status/disk_usage
/ All services are running                                            [OK]
--------------------------------------------------------------------------------

Thanks @mschau

@ekohl I think our user is hitting this: https://github.com/theforeman/puppet-foreman/commit/81a68c923a9e52a1579c599f95828042d0c95470, which you’ve fixed. It seems like it needs a backport. What do you think?

We just released 1.24.3, and with 2.1 around the corner (currently RC1) I think it’s unlikely we’ll release a 1.24.4.

These ordering issues can often be resolved by rerunning the installer, so that’s what I recommend for now. It’s not pretty, but the proper fix will be there in a future version.
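For example, the same command used for the original upgrade; on a rerun Puppet should reapply the previously failed foreman_config_entry resource once PostgreSQL is reachable:

foreman-installer --scenario katello --upgrade

If that one resource keeps failing, one option (a workaround sketch, not an official fix) is to run the exact rake call from the error output by hand once PostgreSQL is confirmed up, and then rerun the installer:

/usr/sbin/foreman-rake -- config -k 'remote_execution_cockpit_url' -v '/webcon/=%{host}'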

Makes sense to me.

@mschau As Ewoud recommended: try running the installer again (maybe a few more times) and see if that gets things in order for you.

I tried it several times, but it made no difference.
Any other suggestions?

@ekohl can the above patch make it into the 2.0 stream, which corresponds to Katello 3.15?