Frequent web service timeouts requiring restart

Problem:
Sometime after Foreman has started, the web service stops responding correctly.
Additionally, subscription-manager on client systems is not able to manage or check subscriptions (it also times out).
“foreman-maintain health check” usually reports all services as running, but sometimes hangs while pinging services.

The error mainly manifests in the log as an Apache timeout.

==> /var/log/httpd/foreman-ssl_access_ssl.log <==
10.3.2.20 - - [19/Nov/2020:23:25:15 +1030] "GET /rhsm/consumers/88b98b35-6dae-4b09-92b7-9d89ff770ae7/compliance HTTP/1.1" 502 503 "-" "RHSM/1.0 (cmd=subscription-manager)"

==> /var/log/httpd/foreman-ssl_error_ssl.log <==
[Thu Nov 19 23:26:15.256693 2020] [proxy_http:error] [pid 19328] (70007)The timeout specified has expired: [client 10.3.2.20:54798] AH01102: error reading status line from remote server 127.0.0.1:3000
[Thu Nov 19 23:26:15.256772 2020] [proxy:error] [pid 19328] [client 10.3.2.20:54798] AH00898: Error reading from remote server returned by /rhsm/consumers/88b98b35-6dae-4b09-92b7-9d89ff770ae7/compliance

==> /var/log/httpd/foreman-ssl_error_ssl.log <==
[Fri Nov 20 10:39:02.257312 2020] [proxy_http:error] [pid 18375] (70007)The timeout specified has expired: [client 10.3.2.120:37498] AH01102: error reading status line from remote server 127.0.0.1:3000
[Fri Nov 20 10:39:02.257560 2020] [proxy:error] [pid 18375] [client 10.3.2.120:37498] AH00898: Error reading from remote server returned by /rhsm/consumers/ed3826b3-910e-4c0f-bfb6-9e5b0144e930/certificates/serials
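Those AH01102/AH00898 errors mean Apache’s reverse proxy gave up waiting for the Foreman (Puma) backend it forwards to on 127.0.0.1:3000. For reference, the installer-managed vhost stanza looks roughly like this (sketch only — the file name and timeout value here are assumptions, so check the actual config on the affected box):

# /etc/httpd/conf.d/05-foreman-ssl.conf (approximate contents)
ProxyPass / http://127.0.0.1:3000/ retry=0 timeout=900
ProxyPassReverse / http://127.0.0.1:3000/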

The issue does not appear to recover on its own and must be resolved by running “foreman-maintain service restart” or rebooting the system.
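A quick way to narrow down whether Apache or the backend is at fault when it hangs (rough sketch — the /api/ping path and backend port are assumptions based on a default Foreman 2.x layout):

# What clients see, going through the Apache reverse proxy
curl -sk -o /dev/null -w '%{http_code}\n' https://localhost/api/ping
# Straight to the Puma backend that Apache proxies to
curl -s -o /dev/null -w '%{http_code}\n' http://127.0.0.1:3000/api/ping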

Expected outcome:
The services should respond correctly.

Foreman and Proxy versions:
The issue occurred on Foreman 2.0 and Katello 3.15, so I upgraded to Foreman 2.2.1.1 and Katello 3.17 to try to resolve it, but it still occurs.

Foreman and Proxy plugin versions:

candlepin-3.1.22-1.el7.noarch
candlepin-selinux-3.1.22-1.el7.noarch
foreman-2.2.1-1.el7.noarch
foreman-cli-2.2.1-1.el7.noarch
foreman-debug-2.2.1-1.el7.noarch
foreman-dynflow-sidekiq-2.2.1-1.el7.noarch
foreman-installer-2.2.1-1.el7.noarch
foreman-installer-katello-2.2.1-1.el7.noarch
foreman-postgresql-2.2.1-1.el7.noarch
foreman-proxy-2.2.1-1.el7.noarch
foreman-release-2.2.1-1.el7.noarch
foreman-selinux-2.2.1-1.el7.noarch
foreman-service-2.2.1-1.el7.noarch
foreman-vmware-2.2.1-1.el7.noarch
foreman.sub.domain.com-apache-1.0-1.noarch
foreman.sub.domain.com-foreman-client-1.0-1.noarch
foreman.sub.domain.com-foreman-proxy-1.0-1.noarch
foreman.sub.domain.com-foreman-proxy-client-1.0-1.noarch
foreman.sub.domain.com-puppet-client-1.0-1.noarch
foreman.sub.domain.com-qpid-broker-1.0-1.noarch
foreman.sub.domain.com-qpid-client-cert-1.0-1.noarch
foreman.sub.domain.com-qpid-router-client-1.0-1.noarch
foreman.sub.domain.com-qpid-router-server-1.0-1.noarch
foreman.sub.domain.com-tomcat-1.0-1.noarch
katello-3.17.0-1.el7.noarch
katello-ca-consumer-foreman.sub.domain.com-1.0-1.noarch
katello-certs-tools-2.7.1-2.el7.noarch
katello-client-bootstrap-1.7.5-1.el7.noarch
katello-common-3.17.0-1.el7.noarch
katello-debug-3.17.0-1.el7.noarch
katello-default-ca-1.0-1.noarch
katello-repos-3.17.0-1.el7.noarch
katello-selinux-3.4.0-1.el7.noarch
katello-server-ca-1.0-1.noarch
pulp-client-1.0-1.noarch
pulp-deb-plugins-1.10.2-1.el7.noarch
pulp-docker-plugins-3.2.8-1.el7.noarch
pulp-katello-1.0.3-1.el7.noarch
pulp-puppet-plugins-2.21.4-1.el7.noarch
pulp-puppet-tools-2.21.4-1.el7.noarch
pulp-rpm-plugins-2.21.4-1.el7.noarch
pulp-selinux-2.21.4-2.el7.noarch
pulp-server-2.21.4-2.el7.noarch
pulpcore-selinux-1.0.0-2.el7.x86_64
python-gofer-qpid-2.12.5-3.el7.noarch
python-pulp-common-2.21.4-2.el7.noarch
python-pulp-deb-common-1.10.2-1.el7.noarch
python-pulp-docker-common-3.2.8-1.el7.noarch
python-pulp-oid_validation-2.21.4-2.el7.noarch
python-pulp-puppet-common-2.21.4-1.el7.noarch
python-pulp-repoauth-2.21.4-2.el7.noarch
python-pulp-rpm-common-2.21.4-1.el7.noarch
python-pulp-streamer-2.21.4-2.el7.noarch
python2-qpid-1.37.0-4.el7.noarch
python2-qpid-proton-0.32.0-2.el7.x86_64
python2-qpid-qmf-1.39.0-1.el7.x86_64
python3-pulp-2to3-migration-0.5.0-1.el7.noarch
python3-pulp-certguard-1.0.2-1.el7.noarch
python3-pulp-container-2.0.1-1.el7.noarch
python3-pulp-file-1.2.0-1.el7.noarch
python3-pulp-rpm-3.6.2-1.el7.noarch
python3-pulpcore-3.6.3-2.el7.noarch
qpid-cpp-client-1.39.0-1.el7.x86_64
qpid-cpp-client-devel-1.39.0-1.el7.x86_64
qpid-cpp-server-1.39.0-1.el7.x86_64
qpid-cpp-server-linearstore-1.39.0-1.el7.x86_64
qpid-dispatch-router-1.14.0-1.el7.x86_64
qpid-proton-c-0.32.0-2.el7.x86_64
qpid-qmf-1.39.0-1.el7.x86_64
qpid-tools-1.39.0-1.el7.noarch
rubygem-foreman_maintain-0.6.13-1.el7.noarch
tfm-rubygem-actioncable-6.0.3.1-1.el7.noarch
tfm-rubygem-actionmailbox-6.0.3.1-1.el7.noarch
tfm-rubygem-actionmailer-6.0.3.1-1.el7.noarch
tfm-rubygem-actionpack-6.0.3.1-1.el7.noarch
tfm-rubygem-actiontext-6.0.3.1-1.el7.noarch
tfm-rubygem-actionview-6.0.3.1-1.el7.noarch
tfm-rubygem-activejob-6.0.3.1-1.el7.noarch
tfm-rubygem-activemodel-6.0.3.1-1.el7.noarch
tfm-rubygem-activerecord-6.0.3.1-1.el7.noarch
tfm-rubygem-activerecord-import-1.0.0-2.el7.noarch
tfm-rubygem-activerecord-session_store-1.1.1-4.el7.noarch
tfm-rubygem-activestorage-6.0.3.1-1.el7.noarch
tfm-rubygem-activesupport-6.0.3.1-1.el7.noarch
tfm-rubygem-addressable-2.6.0-2.el7.noarch
tfm-rubygem-algebrick-0.7.3-7.el7.noarch
tfm-rubygem-amazing_print-1.1.0-1.el7.noarch
tfm-rubygem-ancestry-3.0.7-1.el7.noarch
tfm-rubygem-anemone-0.7.2-17.el7.noarch
tfm-rubygem-angular-rails-templates-1.1.0-1.el7.noarch
tfm-rubygem-ansi-1.5.0-2.el7.noarch
tfm-rubygem-apipie-bindings-0.4.0-1.el7.noarch
tfm-rubygem-apipie-dsl-2.2.9-1.el7.noarch
tfm-rubygem-apipie-params-0.0.5-4.el7.noarch
tfm-rubygem-apipie-rails-0.5.17-3.el7.noarch
tfm-rubygem-arel-9.0.0-3.el7.noarch
tfm-rubygem-audited-4.9.0-3.el7.noarch
tfm-rubygem-awesome_print-1.8.0-5.el7.noarch
tfm-rubygem-bcrypt-3.1.12-3.el7.x86_64
tfm-rubygem-builder-3.2.4-1.el7.noarch
tfm-rubygem-bundler_ext-0.4.1-5.el7.noarch
tfm-rubygem-clamp-1.1.2-6.el7.noarch
tfm-rubygem-concurrent-ruby-1.1.6-2.el7.noarch
tfm-rubygem-concurrent-ruby-edge-0.6.0-2.fm2_1.el7.noarch
tfm-rubygem-connection_pool-2.2.2-2.el7.noarch
tfm-rubygem-crass-1.0.6-1.el7.noarch
tfm-rubygem-css_parser-1.4.7-4.el7.noarch
tfm-rubygem-daemons-1.2.3-6.el7.noarch
tfm-rubygem-deacon-1.0.0-4.el7.noarch
tfm-rubygem-deep_cloneable-3.0.0-3.el7.noarch
tfm-rubygem-deface-1.5.3-2.el7.noarch
tfm-rubygem-domain_name-0.5.20160310-4.el7.noarch
tfm-rubygem-dynflow-1.4.7-1.fm2_2.el7.noarch
tfm-rubygem-erubi-1.9.0-1.el7.noarch
tfm-rubygem-ethon-0.12.0-1.el7.noarch
tfm-rubygem-excon-0.76.0-1.el7.noarch
tfm-rubygem-facter-2.4.0-7.el7.x86_64
tfm-rubygem-faraday-0.15.4-2.el7.noarch
tfm-rubygem-fast_gettext-1.4.1-4.el7.noarch
tfm-rubygem-ffi-1.12.2-1.el7.x86_64
tfm-rubygem-fog-core-2.1.0-3.el7.noarch
tfm-rubygem-fog-vsphere-3.4.0-1.el7.noarch
tfm-rubygem-foreman-tasks-3.0.1-1.fm2_2.el7.noarch
tfm-rubygem-foreman-tasks-core-0.3.4-1.fm2_1.el7.noarch
tfm-rubygem-foreman_remote_execution-4.1.0-1.fm2_2.el7.noarch
tfm-rubygem-foreman_remote_execution_core-1.3.1-1.el7.noarch
tfm-rubygem-formatador-0.2.1-12.el7.noarch
tfm-rubygem-friendly_id-5.3.0-1.el7.noarch
tfm-rubygem-fx-0.5.0-1.el7.noarch
tfm-rubygem-get_process_mem-0.2.1-4.el7.noarch
tfm-rubygem-gettext_i18n_rails-1.8.0-2.el7.noarch
tfm-rubygem-gitlab-sidekiq-fetcher-0.6.0-1.el7.noarch
tfm-rubygem-globalid-0.4.2-1.el7.noarch
tfm-rubygem-graphql-1.8.14-2.el7.noarch
tfm-rubygem-graphql-batch-0.3.10-2.el7.noarch
tfm-rubygem-gssapi-1.2.0-7.el7.noarch
tfm-rubygem-hammer_cli-2.2.1-1.el7.noarch
tfm-rubygem-hammer_cli_foreman-2.2.0-1.el7.noarch
tfm-rubygem-hammer_cli_foreman_bootdisk-0.3.0-1.el7.noarch
tfm-rubygem-hammer_cli_foreman_docker-0.0.7-1.el7.noarch
tfm-rubygem-hammer_cli_foreman_tasks-0.0.15-1.fm2_2.el7.noarch
tfm-rubygem-hammer_cli_katello-0.23.2-1.el7.noarch
tfm-rubygem-hashie-3.6.0-2.el7.noarch
tfm-rubygem-highline-1.7.8-5.el7.noarch
tfm-rubygem-http-cookie-1.0.2-4.el7.noarch
tfm-rubygem-i18n-1.8.2-1.el7.noarch
tfm-rubygem-jwt-2.2.1-2.el7.noarch
tfm-rubygem-kafo-5.0.1-1.el7.noarch
tfm-rubygem-kafo_parsers-1.1.0-3.el7.noarch
tfm-rubygem-kafo_wizards-0.0.1-4.el7.noarch
tfm-rubygem-katello-3.17.0-1.el7.noarch
tfm-rubygem-ldap_fluff-0.4.7-5.el7.noarch
tfm-rubygem-little-plugger-1.1.4-2.el7.noarch
tfm-rubygem-locale-2.0.9-14.el7.noarch
tfm-rubygem-logging-2.2.2-5.el7.noarch
tfm-rubygem-loofah-2.4.0-1.el7.noarch
tfm-rubygem-mail-2.7.1-1.el7.noarch
tfm-rubygem-marcel-0.3.3-1.el7.noarch
tfm-rubygem-method_source-0.9.2-1.el7.noarch
tfm-rubygem-mime-types-3.2.2-4.el7.noarch
tfm-rubygem-mime-types-data-3.2018.0812-4.el7.noarch
tfm-rubygem-mimemagic-0.3.5-1.el7.noarch
tfm-rubygem-mini_mime-1.0.2-1.el7.noarch
tfm-rubygem-mini_portile2-2.4.0-1.el7.noarch
tfm-rubygem-multi_json-1.14.1-2.el7.noarch
tfm-rubygem-multipart-post-2.0.0-2.el7.noarch
tfm-rubygem-mustermann-1.0.2-4.el7.noarch
tfm-rubygem-net-ldap-0.16.1-2.el7.noarch
tfm-rubygem-net-ping-2.0.1-4.el7.noarch
tfm-rubygem-net-scp-1.2.1-4.el7.noarch
tfm-rubygem-net-ssh-4.2.0-2.el7.noarch
tfm-rubygem-netrc-0.11.0-5.el7.noarch
tfm-rubygem-nio4r-2.5.2-1.el7.x86_64
tfm-rubygem-nokogiri-1.10.9-1.el7.x86_64
tfm-rubygem-oauth-0.5.4-4.el7.noarch
tfm-rubygem-optimist-3.0.0-2.el7.noarch
tfm-rubygem-parse-cron-0.1.4-4.fm2_1.el7.noarch
tfm-rubygem-passenger-4.0.18-10.13.el7.x86_64
tfm-rubygem-passenger-native-4.0.18-10.13.el7.x86_64
tfm-rubygem-passenger-native-libs-4.0.18-10.13.el7.x86_64
tfm-rubygem-pg-1.1.4-3.el7.x86_64
tfm-rubygem-polyglot-0.3.5-2.el7.noarch
tfm-rubygem-powerbar-2.0.1-2.el7.noarch
tfm-rubygem-promise.rb-0.7.4-2.el7.noarch
tfm-rubygem-public_suffix-3.0.3-2.el7.noarch
tfm-rubygem-pulp_2to3_migration_client-0.5.0-1.el7.noarch
tfm-rubygem-pulp_ansible_client-0.3.0-1.el7.noarch
tfm-rubygem-pulp_certguard_client-1.0.2-1.el7.noarch
tfm-rubygem-pulp_container_client-2.0.0-1.el7.noarch
tfm-rubygem-pulp_file_client-1.2.0-1.el7.noarch
tfm-rubygem-pulp_rpm_client-3.6.2-1.el7.noarch
tfm-rubygem-pulpcore_client-3.6.0-1.el7.noarch
tfm-rubygem-puma-4.3.3-4.el7.x86_64
tfm-rubygem-qpid_messaging-1.38.0-1.el7.x86_64
tfm-rubygem-rabl-0.14.3-1.el7.noarch
tfm-rubygem-rack-2.2.3-1.el7.noarch
tfm-rubygem-rack-cors-1.0.2-2.el7.noarch
tfm-rubygem-rack-jsonp-1.3.1-9.el7.noarch
tfm-rubygem-rack-protection-2.0.3-4.el7.noarch
tfm-rubygem-rack-test-1.1.0-4.el7.noarch
tfm-rubygem-rails-6.0.3.1-1.el7.noarch
tfm-rubygem-rails-dom-testing-2.0.3-6.el7.noarch
tfm-rubygem-rails-html-sanitizer-1.3.0-1.el7.noarch
tfm-rubygem-rails-i18n-6.0.0-2.el7.noarch
tfm-rubygem-railties-6.0.3.1-1.el7.noarch
tfm-rubygem-rainbow-2.2.1-3.el7.noarch
tfm-rubygem-rake-compiler-1.0.7-3.el7.noarch
tfm-rubygem-rb-inotify-0.9.7-5.el7.noarch
tfm-rubygem-rbvmomi-2.2.0-3.el7.noarch
tfm-rubygem-record_tag_helper-1.0.1-3.el7.noarch
tfm-rubygem-redis-4.1.2-2.el7.noarch
tfm-rubygem-responders-3.0.0-3.el7.noarch
tfm-rubygem-rest-client-2.0.2-3.el7.noarch
tfm-rubygem-rkerberos-0.1.5-18.el7.x86_64
tfm-rubygem-roadie-3.4.0-3.el7.noarch
tfm-rubygem-roadie-rails-2.1.1-2.el7.noarch
tfm-rubygem-robotex-1.0.0-21.el7.noarch
tfm-rubygem-rsec-0.4.3-4.el7.noarch
tfm-rubygem-ruby-libvirt-0.7.1-1.el7.x86_64
tfm-rubygem-ruby2ruby-2.4.2-3.el7.noarch
tfm-rubygem-ruby_parser-3.10.1-3.el7.noarch
tfm-rubygem-rubyipmi-0.10.0-6.el7.noarch
tfm-rubygem-runcible-2.13.1-1.el7.noarch
tfm-rubygem-safemode-1.3.5-3.el7.noarch
tfm-rubygem-scoped_search-4.1.9-1.el7.noarch
tfm-rubygem-secure_headers-6.3.0-2.el7.noarch
tfm-rubygem-sequel-5.7.1-3.el7.noarch
tfm-rubygem-sexp_processor-4.10.0-6.el7.noarch
tfm-rubygem-sidekiq-5.2.7-3.el7.noarch
tfm-rubygem-sinatra-2.0.3-4.el7.noarch
tfm-rubygem-smart_proxy_pulp-2.1.0-3.fm2_2.el7.noarch
tfm-rubygem-sprockets-4.0.2-1.el7.noarch
tfm-rubygem-sprockets-rails-3.2.1-6.el7.noarch
tfm-rubygem-sshkey-1.9.0-4.el7.noarch
tfm-rubygem-statsd-instrument-2.1.4-3.el7.noarch
tfm-rubygem-stomp-1.4.9-1.el7.noarch
tfm-rubygem-thor-1.0.1-2.el7.noarch
tfm-rubygem-thread_safe-0.3.6-5.el7.noarch
tfm-rubygem-tilt-2.0.8-4.el7.noarch
tfm-rubygem-typhoeus-1.3.1-1.el7.noarch
tfm-rubygem-tzinfo-1.2.6-1.el7.noarch
tfm-rubygem-unf-0.1.3-8.el7.noarch
tfm-rubygem-unf_ext-0.0.7.2-3.el7.x86_64
tfm-rubygem-unicode-0.4.4.4-3.el7.x86_64
tfm-rubygem-unicode-display_width-1.0.5-4.el7.noarch
tfm-rubygem-validates_lengths_from_database-0.5.0-7.el7.noarch
tfm-rubygem-webpack-rails-0.9.8-5.el7.noarch
tfm-rubygem-websocket-driver-0.7.1-1.el7.x86_64
tfm-rubygem-websocket-extensions-0.1.5-1.el7.noarch
tfm-rubygem-will_paginate-3.1.7-3.el7.noarch
tfm-rubygem-x-editable-rails-1.5.5-5.el7.noarch
tfm-rubygem-xmlrpc-0.3.0-2.el7.noarch
tfm-rubygem-zeitwerk-2.2.2-1.el7.noarch
tfm-runtime-6.1-3.el7.x86_64

Distribution and version:
CentOS 7.9.2009

Other relevant data:

The issue seems similar to “Tomcat stops responding after several hours. 502 Errors”, but I don’t get the same error output in the catalina logs.

I just upgraded from Katello 3.16 to Katello 3.17 and am having the exact same issues. Everything seemed to be working on Katello 3.16.

How much memory and free disk space does the environment have? Also, do you see any logs in /var/log/candlepin when this happens?

@sajha Here is the output of the ‘free’ command:

[root@satellite kcadmin]# free
              total        used        free      shared  buff/cache   available
Mem:       12137444     6735312     3650652      394728     1751480     4724812
Swap:       7812092           0     7812092

Here is some info from the candlepin log:

2020-11-23 06:19:29,585 [thread=localhost-startStop-1] [=, org=, csid=] WARN  org.hibernate.metamodel.internal.MetadataContext - HHH015011: Unable to locate static metamodel field : org.candlepin.model.KeyPair_#privateKey; this may or may not indicate a problem with the static metamodel

I did recently update my wildcard cert on the server, but everything was working up until I attempted the upgrade.

@kcgaisford, could you also provide the df -h output and the candlepin logs from when this happens?

[root@satellite kcadmin]# df -h
Filesystem      Size  Used Avail Use% Mounted on
devtmpfs        5.8G     0  5.8G   0% /dev
tmpfs           5.8G   40K  5.8G   1% /dev/shm
tmpfs           5.8G  354M  5.5G   6% /run
tmpfs           5.8G     0  5.8G   0% /sys/fs/cgroup
/dev/dm-1        20G  6.6G   14G  33% /
/dev/sda1       697M  233M  465M  34% /boot
/dev/dm-2        20G   16G  4.6G  78% /var
/dev/dm-3       150G  1.9G  149G   2% /var/lib/pgsql
/dev/dm-4       500G   95G  406G  19% /var/lib/pulp
tmpfs           1.2G     0  1.2G   0% /run/user/1002

Nothing really changed in the log output after that, but I did also spot this log entry:

/var/log/candlepin/cpinit.log

2020-11-23 06:19:32,280 [thread=localhost-startStop-1] [=, org=, csid=] WARN  org.apache.activemq.artemis.core.server - AMQ222166: No Expiry Address configured for queue katello_candlepin_event_monitor.candlepin_events in AddressSettings

I also found this in my Katello log from the update:
/var/log/foreman-installer/katello.log

[DEBUG 2020-11-20T17:04:25 main] Hook /usr/share/foreman-installer/katello/hooks/post/99-version_locking.rb returned nil
[ INFO 2020-11-20T17:04:25 main] All hooks in group post finished
[DEBUG 2020-11-20T17:04:25 main] Exit with status code: 6 (signal was 6)
[ERROR 2020-11-20T17:04:25 main] Errors encountered during run:
[ERROR 2020-11-20T17:04:25 main] foreman-maintain packages is-locked --assumeyes failed! Check the output for error!
[ERROR 2020-11-20T17:04:25 main]  /Stage[main]/Pulpcore::Static/Pulpcore::Admin[collectstatic --noinput]/Exec[pulpcore-manager collectstatic --noinput]: Failed to call refresh: 'pulpcore-manager collectstatic --noinput' returned 1 instead of one of [0]
[ERROR 2020-11-20T17:04:25 main]  /Stage[main]/Pulpcore::Static/Pulpcore::Admin[collectstatic --noinput]/Exec[pulpcore-manager collectstatic --noinput]: 'pulpcore-manager collectstatic --noinput' returned 1 instead of one of [0]
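The failing collectstatic step from that log can be re-run by hand to surface the underlying error (rough sketch — the settings path and the pulp user are assumptions based on a default Katello 3.17 install):

sudo -u pulp PULP_SETTINGS=/etc/pulp/settings.py pulpcore-manager collectstatic --noinput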

What does your foreman-maintain service status show?

I’ve included the entire command output below, but at the very bottom it says:

\ All services are running                                            [OK]      

Entire foreman-maintain service status

sudo foreman-maintain service status
[sudo] password for kcadmin: 
Running Status Services
================================================================================
Get status of applicable services: 

Displaying the following service(s):
rh-mongodb34-mongod, rh-redis5-redis, postgresql, pulpcore-api, pulpcore-content, pulpcore-resource-manager, qdrouterd, qpidd, rh-redis5-redis, squid, pulp_celerybeat, pulp_resource_manager, pulp_streamer, pulp_workers, pulpcore-worker@*, tomcat, dynflow-sidekiq@orchestrator, foreman, httpd, puppetserver, dynflow-sidekiq@worker, dynflow-sidekiq@worker-hosts-queue, foreman-proxy
| displaying rh-mongodb34-mongod                                                
● rh-mongodb34-mongod.service - High-performance, schema-free document-oriented database
   Loaded: loaded (/usr/lib/systemd/system/rh-mongodb34-mongod.service; enabled; vendor preset: disabled)
   Active: active (running) since Mon 2020-11-23 06:18:56 MST; 8h ago
  Process: 1779 ExecStart=/opt/rh/rh-mongodb34/root/usr/libexec/mongodb-scl-helper enable $RH_MONGODB34_SCLS_ENABLED -- /opt/rh/rh-mongodb34/root/usr/bin/mongod $OPTIONS run (code=exited, status=0/SUCCESS)
 Main PID: 2461 (mongod)
    Tasks: 55
   CGroup: /system.slice/rh-mongodb34-mongod.service
           └─2461 /opt/rh/rh-mongodb34/root/usr/bin/mongod -f /etc/opt/rh/rh-mongodb34/mongod.conf run

Nov 23 06:18:54 satellite.truhearing.com mongod.27017[2461]: [initandlisten] **        We suggest setting it to 'never'
Nov 23 06:18:54 satellite.truhearing.com mongod.27017[2461]: [initandlisten]
Nov 23 06:18:54 satellite.truhearing.com mongod.27017[2461]: [initandlisten] ** WARNING: /sys/kernel/mm/transparent_hugepage/defrag is 'always'.
Nov 23 06:18:54 satellite.truhearing.com mongod.27017[2461]: [initandlisten] **        We suggest setting it to 'never'
Nov 23 06:18:54 satellite.truhearing.com mongod.27017[2461]: [initandlisten]
Nov 23 06:18:56 satellite.truhearing.com mongod.27017[2461]: [initandlisten] Initializing full-time diagnostic data capture with directory '/var/lib/mongodb/diagnostic.data'
Nov 23 06:18:56 satellite.truhearing.com mongod.27017[2461]: [thread1] waiting for connections on port 27017
Nov 23 06:18:56 satellite.truhearing.com mongodb-scl-helper[1779]: child process started successfully, parent exiting
Nov 23 06:18:56 satellite.truhearing.com systemd[1]: Started High-performance, schema-free document-oriented database.
Nov 23 14:49:13 satellite.truhearing.com mongod.27017[2461]: [conn74] received client metadata from 127.0.0.1:54120 conn74: { application: { name: "MongoDB Shell" }, driver: { name: "MongoDB Internal Client", version: "3.4.9" }, os: { type: "Linux", name: "CentOS Linux release 7.9.2009 (Core)", architecture: "x86_64", version: "Kernel 3.10.0-1160.6.1.el7.x86_64" } }
| displaying rh-redis5-redis                                                    
● rh-redis5-redis.service - Redis persistent key-value database
   Loaded: loaded (/usr/lib/systemd/system/rh-redis5-redis.service; enabled; vendor preset: disabled)
  Drop-In: /etc/systemd/system/rh-redis5-redis.service.d
           └─limit.conf
   Active: active (running) since Mon 2020-11-23 06:18:51 MST; 8h ago
 Main PID: 1816 (redis-server)
    Tasks: 4
   CGroup: /system.slice/rh-redis5-redis.service
           └─1816 /opt/rh/rh-redis5/root/usr/bin/redis-server 127.0.0.1:6379                                       

Nov 23 06:18:50 satellite.truhearing.com systemd[1]: Starting Redis persistent key-value database...
Nov 23 06:18:51 satellite.truhearing.com systemd[1]: Started Redis persistent key-value database.
/ displaying postgresql                                                         
● postgresql.service - PostgreSQL database server
   Loaded: loaded (/etc/systemd/system/postgresql.service; enabled; vendor preset: disabled)
   Active: active (running) since Mon 2020-11-23 06:18:53 MST; 8h ago
  Process: 1780 ExecStartPre=/opt/rh/rh-postgresql12/root/usr/libexec/postgresql-check-db-dir %N (code=exited, status=0/SUCCESS)
 Main PID: 1869 (postmaster)
    Tasks: 24
   CGroup: /system.slice/postgresql.service
           ├─ 1869 postmaster -D /var/opt/rh/rh-postgresql12/lib/pgsql/data
           ├─ 2207 postgres: logger                                        
           ├─ 2308 postgres: checkpointer                                  
           ├─ 2309 postgres: background writer                             
           ├─ 2310 postgres: walwriter                                     
           ├─ 2311 postgres: autovacuum launcher                           
           ├─ 2312 postgres: stats collector                               
           ├─ 2313 postgres: logical replication launcher                  
           ├─ 3189 postgres: pulp pulpcore ::1(41970) idle                 
           ├─ 3190 postgres: pulp pulpcore ::1(41972) idle                 
           ├─ 3191 postgres: pulp pulpcore ::1(41974) idle                 
           ├─ 3195 postgres: pulp pulpcore ::1(41976) idle                 
           ├─ 3197 postgres: pulp pulpcore ::1(41978) idle                 
           ├─ 3321 postgres: candlepin candlepin 127.0.0.1(59188) idle     
           ├─ 3324 postgres: candlepin candlepin 127.0.0.1(59190) idle     
           ├─ 3326 postgres: candlepin candlepin 127.0.0.1(59192) idle     
           ├─27532 postgres: candlepin candlepin 127.0.0.1(48022) idle     
           ├─27533 postgres: candlepin candlepin 127.0.0.1(48024) idle     
           ├─27534 postgres: candlepin candlepin 127.0.0.1(48026) idle     
           ├─27535 postgres: candlepin candlepin 127.0.0.1(48028) idle     
           ├─28793 postgres: candlepin candlepin 127.0.0.1(48330) idle     
           ├─30759 postgres: foreman foreman [local] idle                  
           ├─30805 postgres: foreman foreman [local] idle                  
           └─30858 postgres: foreman foreman [local] idle                  

Nov 23 06:18:50 satellite.truhearing.com systemd[1]: Starting PostgreSQL database server...
Nov 23 06:18:52 satellite.truhearing.com sh[1869]: 2020-11-23 06:18:52 MST LOG:  starting PostgreSQL 12.1 on x86_64-redhat-linux-gnu, compiled by gcc (GCC) 4.8.5 20150623 (Red Hat 4.8.5-39), 64-bit
Nov 23 06:18:52 satellite.truhearing.com sh[1869]: 2020-11-23 06:18:52 MST LOG:  listening on IPv6 address "::1", port 5432
Nov 23 06:18:52 satellite.truhearing.com sh[1869]: 2020-11-23 06:18:52 MST LOG:  listening on IPv4 address "127.0.0.1", port 5432
Nov 23 06:18:52 satellite.truhearing.com sh[1869]: 2020-11-23 06:18:52 MST LOG:  listening on Unix socket "/var/run/postgresql/.s.PGSQL.5432"
Nov 23 06:18:52 satellite.truhearing.com sh[1869]: 2020-11-23 06:18:52 MST LOG:  listening on Unix socket "/tmp/.s.PGSQL.5432"
Nov 23 06:18:52 satellite.truhearing.com sh[1869]: 2020-11-23 06:18:52 MST LOG:  redirecting log output to logging collector process
Nov 23 06:18:52 satellite.truhearing.com sh[1869]: 2020-11-23 06:18:52 MST HINT:  Future log output will appear in directory "log".
Nov 23 06:18:53 satellite.truhearing.com systemd[1]: Started PostgreSQL database server.
/ displaying pulpcore-api                                                       
● pulpcore-api.service - Pulp WSGI Server
   Loaded: loaded (/etc/systemd/system/pulpcore-api.service; enabled; vendor preset: disabled)
   Active: active (running) since Mon 2020-11-23 06:18:50 MST; 8h ago
 Main PID: 1803 (gunicorn)
    Tasks: 2
   CGroup: /system.slice/pulpcore-api.service
           ├─1803 /usr/bin/python3 /usr/bin/gunicorn pulpcore.app.wsgi:application --bind 127.0.0.1:24817 --access-logfile -
           └─2612 /usr/bin/python3 /usr/bin/gunicorn pulpcore.app.wsgi:application --bind 127.0.0.1:24817 --access-logfile -

Nov 23 06:18:50 satellite.truhearing.com systemd[1]: Started Pulp WSGI Server.
Nov 23 06:18:56 satellite.truhearing.com pulpcore-api[1803]: [2020-11-23 06:18:56 -0700] [1803] [INFO] Starting gunicorn 20.0.4
Nov 23 06:18:56 satellite.truhearing.com pulpcore-api[1803]: [2020-11-23 06:18:56 -0700] [1803] [INFO] Listening at: http://127.0.0.1:24817 (1803)
Nov 23 06:18:56 satellite.truhearing.com pulpcore-api[1803]: [2020-11-23 06:18:56 -0700] [1803] [INFO] Using worker: sync
Nov 23 06:18:56 satellite.truhearing.com pulpcore-api[1803]: [2020-11-23 06:18:56 -0700] [2612] [INFO] Booting worker with pid: 2612
/ displaying pulpcore-content                                                   
● pulpcore-content.service - Pulp Content App
   Loaded: loaded (/etc/systemd/system/pulpcore-content.service; enabled; vendor preset: disabled)
   Active: active (running) since Mon 2020-11-23 06:18:50 MST; 8h ago
 Main PID: 1815 (gunicorn)
    Tasks: 3
   CGroup: /system.slice/pulpcore-content.service
           ├─1815 /usr/bin/python3 /usr/bin/gunicorn pulpcore.content:server --bind 127.0.0.1:24816 --worker-class aiohttp.GunicornWebWorker -w 2 --access-logfile -
           ├─2923 /usr/bin/python3 /usr/bin/gunicorn pulpcore.content:server --bind 127.0.0.1:24816 --worker-class aiohttp.GunicornWebWorker -w 2 --access-logfile -
           └─2924 /usr/bin/python3 /usr/bin/gunicorn pulpcore.content:server --bind 127.0.0.1:24816 --worker-class aiohttp.GunicornWebWorker -w 2 --access-logfile -

Nov 23 06:18:50 satellite.truhearing.com systemd[1]: Started Pulp Content App.
Nov 23 06:18:59 satellite.truhearing.com pulpcore-content[1815]: [2020-11-23 06:18:59 -0700] [1815] [INFO] Starting gunicorn 20.0.4
Nov 23 06:18:59 satellite.truhearing.com pulpcore-content[1815]: [2020-11-23 06:18:59 -0700] [1815] [INFO] Listening at: http://127.0.0.1:24816 (1815)
Nov 23 06:18:59 satellite.truhearing.com pulpcore-content[1815]: [2020-11-23 06:18:59 -0700] [1815] [INFO] Using worker: aiohttp.GunicornWebWorker
Nov 23 06:18:59 satellite.truhearing.com pulpcore-content[1815]: [2020-11-23 06:18:59 -0700] [2923] [INFO] Booting worker with pid: 2923
Nov 23 06:18:59 satellite.truhearing.com pulpcore-content[1815]: [2020-11-23 06:18:59 -0700] [2924] [INFO] Booting worker with pid: 2924
/ displaying pulpcore-resource-manager                                          
● pulpcore-resource-manager.service - Pulp Resource Manager
   Loaded: loaded (/etc/systemd/system/pulpcore-resource-manager.service; enabled; vendor preset: disabled)
   Active: active (running) since Mon 2020-11-23 06:18:50 MST; 8h ago
 Main PID: 1794 (rq)
    Tasks: 1
   CGroup: /system.slice/pulpcore-resource-manager.service
           └─1794 /usr/bin/python3 /usr/bin/rq worker -w pulpcore.tasking.worker.PulpWorker -n resource-manager --pid=/var/run/pulpcore-resource-manager/resource-manager.pid -c pulpcore.rqconfig --disable-job-desc-logging

Nov 23 11:40:37 satellite.truhearing.com pulpcore-resource-manager[1794]: pulp: rq.worker:INFO: Cleaning registries for queue: resource-manager
Nov 23 12:00:42 satellite.truhearing.com pulpcore-resource-manager[1794]: pulp: rq.worker:INFO: Cleaning registries for queue: resource-manager
Nov 23 12:20:47 satellite.truhearing.com pulpcore-resource-manager[1794]: pulp: rq.worker:INFO: Cleaning registries for queue: resource-manager
Nov 23 12:40:51 satellite.truhearing.com pulpcore-resource-manager[1794]: pulp: rq.worker:INFO: Cleaning registries for queue: resource-manager
Nov 23 13:00:56 satellite.truhearing.com pulpcore-resource-manager[1794]: pulp: rq.worker:INFO: Cleaning registries for queue: resource-manager
Nov 23 13:21:01 satellite.truhearing.com pulpcore-resource-manager[1794]: pulp: rq.worker:INFO: Cleaning registries for queue: resource-manager
Nov 23 13:41:06 satellite.truhearing.com pulpcore-resource-manager[1794]: pulp: rq.worker:INFO: Cleaning registries for queue: resource-manager
Nov 23 14:01:11 satellite.truhearing.com pulpcore-resource-manager[1794]: pulp: rq.worker:INFO: Cleaning registries for queue: resource-manager
Nov 23 14:21:15 satellite.truhearing.com pulpcore-resource-manager[1794]: pulp: rq.worker:INFO: Cleaning registries for queue: resource-manager
Nov 23 14:41:21 satellite.truhearing.com pulpcore-resource-manager[1794]: pulp: rq.worker:INFO: Cleaning registries for queue: resource-manager
/ displaying qdrouterd                                                          
● qdrouterd.service - Qpid Dispatch router daemon
   Loaded: loaded (/usr/lib/systemd/system/qdrouterd.service; enabled; vendor preset: disabled)
  Drop-In: /etc/systemd/system/qdrouterd.service.d
           └─90-limits.conf
   Active: active (running) since Mon 2020-11-23 06:18:50 MST; 8h ago
 Main PID: 1725 (qdrouterd)
    Tasks: 7
   CGroup: /system.slice/qdrouterd.service
           └─1725 /usr/sbin/qdrouterd -c /etc/qpid-dispatch/qdrouterd.conf

Nov 23 06:19:29 satellite.truhearing.com qdrouterd[1725]: 2020-11-23 06:19:29.717781 -0700 ROUTER_CORE (info) [C63][L110] Link attached: dir=out source={pulp.agent.10e5e82e-aa20-4f95-836a-9f2b0bcbfbc9 expire:sess} target={<none> expire:sess}
Nov 23 06:19:29 satellite.truhearing.com qdrouterd[1725]: 2020-11-23 06:19:29.723014 -0700 ROUTER_CORE (info) [C62][L114] Link attached: dir=out source={pulp.agent.a026a3a1-8cea-467c-888d-647c910f42bc expire:sess} target={<none> expire:sess}
Nov 23 06:19:29 satellite.truhearing.com qdrouterd[1725]: 2020-11-23 06:19:29.734022 -0700 SERVER (info) [C65] Accepted connection to :5647 from 10.2.1.169:42738
Nov 23 06:19:29 satellite.truhearing.com qdrouterd[1725]: 2020-11-23 06:19:29.798981 -0700 ROUTER (info) [C65] Connection Opened: dir=in host=10.2.1.169:42738 vhost= encrypted=TLSv1/SSLv3 auth=ANONYMOUS user=anonymous container_id=0f8ff364-0451-49d1-bb5f-cfb7d4b7ce99 props=
Nov 23 06:19:29 satellite.truhearing.com qdrouterd[1725]: 2020-11-23 06:19:29.811073 -0700 ROUTER_CORE (info) [C65][L116] Link attached: dir=out source={pulp.agent.6d5ac000-c1d9-4cfb-ae1e-3dc04a5c2be2 expire:sess} target={<none> expire:sess}
Nov 23 06:19:29 satellite.truhearing.com qdrouterd[1725]: 2020-11-23 06:19:29.866830 -0700 ROUTER (info) [C64] Connection Opened: dir=in host=10.66.101.117:37390 vhost= encrypted=TLSv1/SSLv3 auth=ANONYMOUS user=anonymous container_id=10236fe0-4fee-4045-9ff0-4cec01f2560c props=
Nov 23 06:19:29 satellite.truhearing.com qdrouterd[1725]: 2020-11-23 06:19:29.916585 -0700 ROUTER_CORE (info) [C64][L118] Link attached: dir=out source={pulp.agent.7c0075e2-ede7-4d99-8ce8-863b55d954f8 expire:sess} target={<none> expire:sess}
Nov 23 06:19:29 satellite.truhearing.com qdrouterd[1725]: 2020-11-23 06:19:29.944784 -0700 SERVER (info) [C66] Accepted connection to :5647 from 10.2.1.42:57622
Nov 23 06:19:30 satellite.truhearing.com qdrouterd[1725]: 2020-11-23 06:19:30.004327 -0700 ROUTER (info) [C66] Connection Opened: dir=in host=10.2.1.42:57622 vhost= encrypted=TLSv1/SSLv3 auth=ANONYMOUS user=anonymous container_id=8fa9ef0b-4402-4f8d-b7c2-ae700baccef4 props=
Nov 23 06:19:30 satellite.truhearing.com qdrouterd[1725]: 2020-11-23 06:19:30.012404 -0700 ROUTER_CORE (info) [C66][L120] Link attached: dir=out source={pulp.agent.44edac25-aac2-4881-bd38-4ef994804958 expire:sess} target={<none> expire:sess}
/ displaying qpidd                                                              
● qpidd.service - An AMQP message broker daemon.
   Loaded: loaded (/usr/lib/systemd/system/qpidd.service; enabled; vendor preset: disabled)
  Drop-In: /etc/systemd/system/qpidd.service.d
           └─90-limits.conf, wait-for-port.conf
   Active: active (running) since Mon 2020-11-23 06:18:58 MST; 8h ago
     Docs: man:qpidd(1)
           http://qpid.apache.org/
  Process: 1737 ExecStartPost=/bin/bash -c while ! nc -z localhost 5671; do sleep 1; done (code=exited, status=0/SUCCESS)
 Main PID: 1736 (qpidd)
    Tasks: 8
   CGroup: /system.slice/qpidd.service
           └─1736 /usr/sbin/qpidd --config /etc/qpid/qpidd.conf

Nov 23 06:18:50 satellite.truhearing.com systemd[1]: Starting An AMQP message broker daemon....
Nov 23 06:18:58 satellite.truhearing.com qpidd[1736]: 2020-11-23 06:18:58 [System] error Error reading socket: Encountered end of file [-5938]
Nov 23 06:18:58 satellite.truhearing.com qpidd[1736]: 2020-11-23 06:18:58 [System] error Error reading socket: Encountered end of file [-5938]
Nov 23 06:18:58 satellite.truhearing.com systemd[1]: Started An AMQP message broker daemon..
/ displaying rh-redis5-redis                                                    
● rh-redis5-redis.service - Redis persistent key-value database
   Loaded: loaded (/usr/lib/systemd/system/rh-redis5-redis.service; enabled; vendor preset: disabled)
  Drop-In: /etc/systemd/system/rh-redis5-redis.service.d
           └─limit.conf
   Active: active (running) since Mon 2020-11-23 06:18:51 MST; 8h ago
 Main PID: 1816 (redis-server)
    Tasks: 4
   CGroup: /system.slice/rh-redis5-redis.service
           └─1816 /opt/rh/rh-redis5/root/usr/bin/redis-server 127.0.0.1:6379                                       

Nov 23 06:18:50 satellite.truhearing.com systemd[1]: Starting Redis persistent key-value database...
Nov 23 06:18:51 satellite.truhearing.com systemd[1]: Started Redis persistent key-value database.
- displaying squid                                                              
● squid.service - Squid caching proxy
   Loaded: loaded (/usr/lib/systemd/system/squid.service; enabled; vendor preset: disabled)
   Active: active (running) since Mon 2020-11-23 06:18:51 MST; 8h ago
  Process: 1859 ExecStart=/usr/sbin/squid $SQUID_OPTS -f $SQUID_CONF (code=exited, status=0/SUCCESS)
  Process: 1721 ExecStartPre=/usr/libexec/squid/cache_swap.sh (code=exited, status=0/SUCCESS)
 Main PID: 1882 (squid)
    Tasks: 3
   CGroup: /system.slice/squid.service
           ├─1882 /usr/sbin/squid -f /etc/squid/squid.conf
           ├─1884 (squid-1) -f /etc/squid/squid.conf
           └─1937 (logfile-daemon) /var/log/squid/access.log

Nov 23 06:18:50 satellite.truhearing.com systemd[1]: Starting Squid caching proxy...
Nov 23 06:18:51 satellite.truhearing.com squid[1882]: Squid Parent: will start 1 kids
Nov 23 06:18:51 satellite.truhearing.com squid[1882]: Squid Parent: (squid-1) process 1884 started
Nov 23 06:18:51 satellite.truhearing.com systemd[1]: Started Squid caching proxy.
- displaying pulp_celerybeat                                                    
● pulp_celerybeat.service - Pulp's Celerybeat
   Loaded: loaded (/usr/lib/systemd/system/pulp_celerybeat.service; enabled; vendor preset: disabled)
   Active: active (running) since Mon 2020-11-23 06:18:50 MST; 8h ago
 Main PID: 1814 (celery)
    Tasks: 5
   CGroup: /system.slice/pulp_celerybeat.service
           └─1814 /usr/bin/python /usr/bin/celery beat --app=pulp.server.async.celery_instance.celery --scheduler=pulp.server.async.scheduler.Scheduler

Nov 23 13:18:59 satellite.truhearing.com pulp[1814]: celery.beat:INFO: Scheduler: Sending due task download_deferred_content (pulp.server.controllers.repository.queue_download_deferred)
Nov 23 13:28:59 satellite.truhearing.com pulp[1814]: celery.beat:INFO: Scheduler: Sending due task download_deferred_content (pulp.server.controllers.repository.queue_download_deferred)
Nov 23 13:38:59 satellite.truhearing.com pulp[1814]: celery.beat:INFO: Scheduler: Sending due task download_deferred_content (pulp.server.controllers.repository.queue_download_deferred)
Nov 23 13:48:59 satellite.truhearing.com pulp[1814]: celery.beat:INFO: Scheduler: Sending due task download_deferred_content (pulp.server.controllers.repository.queue_download_deferred)
Nov 23 13:58:59 satellite.truhearing.com pulp[1814]: celery.beat:INFO: Scheduler: Sending due task download_deferred_content (pulp.server.controllers.repository.queue_download_deferred)
Nov 23 14:08:59 satellite.truhearing.com pulp[1814]: celery.beat:INFO: Scheduler: Sending due task download_deferred_content (pulp.server.controllers.repository.queue_download_deferred)
Nov 23 14:18:59 satellite.truhearing.com pulp[1814]: celery.beat:INFO: Scheduler: Sending due task download_deferred_content (pulp.server.controllers.repository.queue_download_deferred)
Nov 23 14:28:59 satellite.truhearing.com pulp[1814]: celery.beat:INFO: Scheduler: Sending due task download_deferred_content (pulp.server.controllers.repository.queue_download_deferred)
Nov 23 14:38:59 satellite.truhearing.com pulp[1814]: celery.beat:INFO: Scheduler: Sending due task download_deferred_content (pulp.server.controllers.repository.queue_download_deferred)
Nov 23 14:48:59 satellite.truhearing.com pulp[1814]: celery.beat:INFO: Scheduler: Sending due task download_deferred_content (pulp.server.controllers.repository.queue_download_deferred)
- displaying pulp_resource_manager                                              
● pulp_resource_manager.service - Pulp Resource Manager
   Loaded: loaded (/usr/lib/systemd/system/pulp_resource_manager.service; enabled; vendor preset: disabled)
   Active: active (running) since Mon 2020-11-23 06:18:50 MST; 8h ago
 Main PID: 1808 (celery)
    Tasks: 7
   CGroup: /system.slice/pulp_resource_manager.service
           ├─1808 /usr/bin/python /usr/bin/celery worker -A pulp.server.async.app -n resource_manager@%h -Q resource_manager -c 1 --events --umask 18 --pidfile=/var/run/pulp/resource_manager.pid
           └─2986 /usr/bin/python /usr/bin/celery worker -A pulp.server.async.app -n resource_manager@%h -Q resource_manager -c 1 --events --umask 18 --pidfile=/var/run/pulp/resource_manager.pid

Nov 23 06:19:03 satellite.truhearing.com celery[1808]: -------------- [queues]
Nov 23 06:19:03 satellite.truhearing.com celery[1808]: .> resource_manager exchange=resource_manager(direct) key=resource_manager
Nov 23 06:19:03 satellite.truhearing.com celery[1808]: .> resource_manager@satellite.truhearing.com.dq2 exchange=C.dq2(direct) key=resource_manager@satellite.truhearing.com
Nov 23 06:19:03 satellite.truhearing.com pulp[2986]: pulp.server.db.connection:INFO: Attempting to connect to localhost:27017
Nov 23 06:19:03 satellite.truhearing.com pulp[1808]: kombu.transport.qpid:INFO: Connected to qpid with SASL mechanism ANONYMOUS
Nov 23 06:19:03 satellite.truhearing.com pulp[1808]: celery.worker.consumer.connection:INFO: Connected to qpid://localhost:5671//
Nov 23 06:19:03 satellite.truhearing.com pulp[1808]: kombu.transport.qpid:INFO: Connected to qpid with SASL mechanism ANONYMOUS
Nov 23 06:19:03 satellite.truhearing.com pulp[2986]: pulp.server.db.connection:INFO: Attempting to connect to localhost:27017
Nov 23 06:19:04 satellite.truhearing.com pulp[1808]: celery.apps.worker:INFO: resource_manager@satellite.truhearing.com ready.
Nov 23 06:19:04 satellite.truhearing.com pulp[2986]: pulp.server.db.connection:INFO: Write concern for Mongo connection: {}
- displaying pulp_streamer                                                      
● pulp_streamer.service - The Pulp lazy content loading streamer
   Loaded: loaded (/usr/lib/systemd/system/pulp_streamer.service; enabled; vendor preset: disabled)
   Active: active (running) since Mon 2020-11-23 06:18:50 MST; 8h ago
 Main PID: 1790 (pulp_streamer)
    Tasks: 3
   CGroup: /system.slice/pulp_streamer.service
           └─1790 /usr/bin/python /usr/bin/pulp_streamer --nodaemon --syslog --prefix=pulp_streamer --pidfile= --python /usr/share/pulp/wsgi/streamer.tac

Nov 23 06:19:02 satellite.truhearing.com pulp_streamer[1790]: pulp.plugins.loader.manager:INFO: Loaded plugin iso_importer for types: iso
Nov 23 06:19:02 satellite.truhearing.com pulp_streamer[1790]: pulp.plugins.loader.manager:INFO: Loaded plugin puppet_whole_repo_profiler for types: puppet_module
Nov 23 06:19:02 satellite.truhearing.com pulp_streamer[1790]: pulp.plugins.loader.manager:INFO: Loaded plugin yum_profiler for types: rpm,erratum,modulemd
Nov 23 06:19:02 satellite.truhearing.com pulp_streamer[1790]: pulp.plugins.loader.manager:INFO: Loaded plugin yum for types: rpm
Nov 23 06:19:02 satellite.truhearing.com pulp_streamer[1790]: pulp.plugins.loader.manager:INFO: Loaded plugin rhui for types: rpm
Nov 23 06:19:03 satellite.truhearing.com pulp_streamer[1790]: [-] Log opened.
Nov 23 06:19:03 satellite.truhearing.com pulp_streamer[1790]: [-] twistd 12.2.0 (/usr/bin/python 2.7.5) starting up.
Nov 23 06:19:03 satellite.truhearing.com pulp_streamer[1790]: [-] reactor class: twisted.internet.epollreactor.EPollReactor.
Nov 23 06:19:03 satellite.truhearing.com pulp_streamer[1790]: [-] Site starting on 8751
Nov 23 06:19:03 satellite.truhearing.com pulp_streamer[1790]: [-] Starting factory <twisted.web.server.Site instance at 0x7f163ad92c68>
- displaying pulp_workers                                                       
● pulp_workers.service - Pulp Celery Workers
   Loaded: loaded (/usr/lib/systemd/system/pulp_workers.service; enabled; vendor preset: disabled)
   Active: active (exited) since Mon 2020-11-23 06:18:51 MST; 8h ago
  Process: 1811 ExecStart=/usr/bin/python -m pulp.server.async.manage_workers start (code=exited, status=0/SUCCESS)
 Main PID: 1811 (code=exited, status=0/SUCCESS)
    Tasks: 0
   CGroup: /system.slice/pulp_workers.service

Nov 23 06:18:50 satellite.truhearing.com systemd[1]: Starting Pulp Celery Workers...
Nov 23 06:18:51 satellite.truhearing.com systemd[1]: Started Pulp Celery Workers.
- displaying pulpcore-worker@*                                                  
● pulpcore-worker@2.service - Pulp RQ Worker
   Loaded: loaded (/etc/systemd/system/pulpcore-worker@.service; enabled; vendor preset: disabled)
   Active: active (running) since Mon 2020-11-23 06:18:50 MST; 8h ago
 Main PID: 1832 (rq)
   CGroup: /system.slice/system-pulpcore\x2dworker.slice/pulpcore-worker@2.service
           └─1832 /usr/bin/python3 /usr/bin/rq worker -w pulpcore.tasking.worker.PulpWorker --pid=/var/run/pulpcore-worker-2/reserved-resource-worker-2.pid -c pulpcore.rqconfig --disable-job-desc-logging

Nov 23 11:40:34 satellite.truhearing.com pulpcore-worker-2[1832]: pulp: rq.worker:INFO: Cleaning registries for queue: 1832@satellite.truhearing.com
Nov 23 12:00:39 satellite.truhearing.com pulpcore-worker-2[1832]: pulp: rq.worker:INFO: Cleaning registries for queue: 1832@satellite.truhearing.com
Nov 23 12:20:44 satellite.truhearing.com pulpcore-worker-2[1832]: pulp: rq.worker:INFO: Cleaning registries for queue: 1832@satellite.truhearing.com
Nov 23 12:40:48 satellite.truhearing.com pulpcore-worker-2[1832]: pulp: rq.worker:INFO: Cleaning registries for queue: 1832@satellite.truhearing.com
Nov 23 13:00:54 satellite.truhearing.com pulpcore-worker-2[1832]: pulp: rq.worker:INFO: Cleaning registries for queue: 1832@satellite.truhearing.com
Nov 23 13:20:59 satellite.truhearing.com pulpcore-worker-2[1832]: pulp: rq.worker:INFO: Cleaning registries for queue: 1832@satellite.truhearing.com
Nov 23 13:41:03 satellite.truhearing.com pulpcore-worker-2[1832]: pulp: rq.worker:INFO: Cleaning registries for queue: 1832@satellite.truhearing.com
Nov 23 14:01:08 satellite.truhearing.com pulpcore-worker-2[1832]: pulp: rq.worker:INFO: Cleaning registries for queue: 1832@satellite.truhearing.com
Nov 23 14:21:13 satellite.truhearing.com pulpcore-worker-2[1832]: pulp: rq.worker:INFO: Cleaning registries for queue: 1832@satellite.truhearing.com
Nov 23 14:41:18 satellite.truhearing.com pulpcore-worker-2[1832]: pulp: rq.worker:INFO: Cleaning registries for queue: 1832@satellite.truhearing.com

● pulpcore-worker@1.service - Pulp RQ Worker
   Loaded: loaded (/etc/systemd/system/pulpcore-worker@.service; enabled; vendor preset: disabled)
   Active: active (running) since Mon 2020-11-23 06:18:50 MST; 8h ago
 Main PID: 1813 (rq)
   CGroup: /system.slice/system-pulpcore\x2dworker.slice/pulpcore-worker@1.service
           └─1813 /usr/bin/python3 /usr/bin/rq worker -w pulpcore.tasking.worker.PulpWorker --pid=/var/run/pulpcore-worker-1/reserved-resource-worker-1.pid -c pulpcore.rqconfig --disable-job-desc-logging

Nov 23 11:40:37 satellite.truhearing.com pulpcore-worker-1[1813]: pulp: rq.worker:INFO: Cleaning registries for queue: 1813@satellite.truhearing.com
Nov 23 12:00:42 satellite.truhearing.com pulpcore-worker-1[1813]: pulp: rq.worker:INFO: Cleaning registries for queue: 1813@satellite.truhearing.com
Nov 23 12:20:48 satellite.truhearing.com pulpcore-worker-1[1813]: pulp: rq.worker:INFO: Cleaning registries for queue: 1813@satellite.truhearing.com
Nov 23 12:40:53 satellite.truhearing.com pulpcore-worker-1[1813]: pulp: rq.worker:INFO: Cleaning registries for queue: 1813@satellite.truhearing.com
Nov 23 13:00:58 satellite.truhearing.com pulpcore-worker-1[1813]: pulp: rq.worker:INFO: Cleaning registries for queue: 1813@satellite.truhearing.com
Nov 23 13:21:02 satellite.truhearing.com pulpcore-worker-1[1813]: pulp: rq.worker:INFO: Cleaning registries for queue: 1813@satellite.truhearing.com
Nov 23 13:41:07 satellite.truhearing.com pulpcore-worker-1[1813]: pulp: rq.worker:INFO: Cleaning registries for queue: 1813@satellite.truhearing.com
Nov 23 14:01:12 satellite.truhearing.com pulpcore-worker-1[1813]: pulp: rq.worker:INFO: Cleaning registries for queue: 1813@satellite.truhearing.com
Nov 23 14:21:17 satellite.truhearing.com pulpcore-worker-1[1813]: pulp: rq.worker:INFO: Cleaning registries for queue: 1813@satellite.truhearing.com
Nov 23 14:41:22 satellite.truhearing.com pulpcore-worker-1[1813]: pulp: rq.worker:INFO: Cleaning registries for queue: 1813@satellite.truhearing.com
- displaying tomcat                                                             
● tomcat.service - Apache Tomcat Web Application Container
   Loaded: loaded (/usr/lib/systemd/system/tomcat.service; enabled; vendor preset: disabled)
   Active: active (running) since Mon 2020-11-23 06:18:50 MST; 8h ago
 Main PID: 1784 (java)
    Tasks: 75
   CGroup: /system.slice/tomcat.service
           └─1784 /usr/lib/jvm/jre/bin/java -Xms1024m -Xmx4096m -Djava.security.auth.login.config=/usr/share/tomcat/conf/login.config -classpath /usr/share/tomcat/bin/bootstrap.jar:/usr/share/tomcat/bin/tomcat-juli.jar:/usr/share/java/commons-daemon.jar -Dcatalina.base=/usr/share/tomcat -Dcatalina.home=/usr/share/tomcat -Djava.endorsed.dirs= -Djava.io.tmpdir=/var/cache/tomcat/temp -Djava.util.logging.config.file=/usr/share/tomcat/conf/logging.properties -Djava.util.logging.manager=org.apache.juli.ClassLoaderLogManager org.apache.catalina.startup.Bootstrap start

Nov 23 06:19:22 satellite.truhearing.com server[1784]: Nov 23, 2020 6:19:22 AM com.google.inject.internal.ProxyFactory <init>
Nov 23 06:19:22 satellite.truhearing.com server[1784]: WARNING: Method [public org.candlepin.model.Persisted org.candlepin.model.RulesCurator.create(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@3b145b54]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
Nov 23 06:19:22 satellite.truhearing.com server[1784]: Nov 23, 2020 6:19:22 AM com.google.inject.internal.ProxyFactory <init>
Nov 23 06:19:22 satellite.truhearing.com server[1784]: WARNING: Method [public void org.candlepin.model.EntitlementCertificateCurator.delete(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@3b145b54]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
Nov 23 06:19:35 satellite.truhearing.com server[1784]: Nov 23, 2020 6:19:35 AM org.apache.catalina.startup.HostConfig deployDirectory
Nov 23 06:19:35 satellite.truhearing.com server[1784]: INFO: Deployment of web application directory /var/lib/tomcat/webapps/candlepin has finished in 32,697 ms
Nov 23 06:19:35 satellite.truhearing.com server[1784]: Nov 23, 2020 6:19:35 AM org.apache.coyote.AbstractProtocol start
Nov 23 06:19:35 satellite.truhearing.com server[1784]: INFO: Starting ProtocolHandler ["http-bio-127.0.0.1-8443"]
Nov 23 06:19:35 satellite.truhearing.com server[1784]: Nov 23, 2020 6:19:35 AM org.apache.catalina.startup.Catalina start
Nov 23 06:19:35 satellite.truhearing.com server[1784]: INFO: Server startup in 32947 ms
- displaying dynflow-sidekiq@orchestrator                                       
● dynflow-sidekiq@orchestrator.service - Foreman jobs daemon - orchestrator on sidekiq
   Loaded: loaded (/usr/lib/systemd/system/dynflow-sidekiq@.service; enabled; vendor preset: disabled)
   Active: active (running) since Mon 2020-11-23 14:49:07 MST; 7s ago
     Docs: https://theforeman.org
 Main PID: 30498 (sidekiq)
   CGroup: /system.slice/system-dynflow\x2dsidekiq.slice/dynflow-sidekiq@orchestrator.service
           └─30498 /opt/rh/rh-ruby25/root/usr/bin/ruby /opt/theforeman/tfm/root/usr/bin/sidekiq -e production -r /usr/share/foreman/extras/dynflow-sidekiq.rb -C /etc/foreman/dynflow/orchestrator.yml

Nov 23 14:49:07 satellite.truhearing.com systemd[1]: Started Foreman jobs daemon - orchestrator on sidekiq.
Nov 23 14:49:09 satellite.truhearing.com dynflow-sidekiq@orchestrator[30498]: 2020-11-23T21:49:09.473Z 30498 TID-62cpe INFO: GitLab reliable fetch activated!
Nov 23 14:49:09 satellite.truhearing.com dynflow-sidekiq@orchestrator[30498]: 2020-11-23T21:49:09.475Z 30498 TID-hadh6 INFO: Booting Sidekiq 5.2.7 with redis options {:id=>"Sidekiq-server-PID-30498", :url=>"redis://localhost:6379/0"}
Nov 23 14:49:13 satellite.truhearing.com dynflow-sidekiq@orchestrator[30498]: /usr/share/foreman/lib/foreman.rb:8: warning: already initialized constant Foreman::UUID_REGEXP
Nov 23 14:49:13 satellite.truhearing.com dynflow-sidekiq@orchestrator[30498]: /usr/share/foreman/lib/foreman.rb:8: warning: previous definition of UUID_REGEXP was here
\ displaying foreman                                                            
● foreman.service - Foreman
   Loaded: loaded (/usr/lib/systemd/system/foreman.service; enabled; vendor preset: disabled)
  Drop-In: /etc/systemd/system/foreman.service.d
           └─installer.conf
   Active: active (running) since Mon 2020-11-23 14:49:08 MST; 6s ago
     Docs: https://theforeman.org
 Main PID: 30530 (ruby)
    Tasks: 4
   CGroup: /system.slice/foreman.service
           └─30530 ruby /usr/share/foreman/bin/rails server --environment production

Nov 23 14:49:08 satellite.truhearing.com systemd[1]: Started Foreman.
Nov 23 14:49:13 satellite.truhearing.com foreman[30530]: /usr/share/foreman/lib/foreman.rb:8: warning: already initialized constant Foreman::UUID_REGEXP
Nov 23 14:49:13 satellite.truhearing.com foreman[30530]: /usr/share/foreman/lib/foreman.rb:8: warning: previous definition of UUID_REGEXP was here
Nov 23 14:49:14 satellite.truhearing.com foreman[30530]: => Booting Puma
Nov 23 14:49:14 satellite.truhearing.com foreman[30530]: => Rails 6.0.3.1 application starting in production
Nov 23 14:49:14 satellite.truhearing.com foreman[30530]: => Run `rails server --help` for more startup options
\ displaying httpd                                                              
● httpd.service - The Apache HTTP Server
   Loaded: loaded (/usr/lib/systemd/system/httpd.service; enabled; vendor preset: disabled)
   Active: active (running) since Mon 2020-11-23 06:18:52 MST; 8h ago
     Docs: man:httpd(8)
           man:apachectl(8)
 Main PID: 1754 (httpd)
   Status: "Total requests: 0; Current requests/sec: 0; Current traffic:   0 B/sec"
    Tasks: 165
   CGroup: /system.slice/httpd.service
           ├─ 1754 /usr/sbin/httpd -DFOREGROUND
           ├─ 2202 (wsgi:pulp)     -DFOREGROUND
           ├─ 2206 (wsgi:pulp)     -DFOREGROUND
           ├─ 2209 (wsgi:pulp)     -DFOREGROUND
           ├─ 2210 (wsgi:pulp-cont -DFOREGROUND
           ├─ 2212 (wsgi:pulp-cont -DFOREGROUND
           ├─ 2214 (wsgi:pulp-cont -DFOREGROUND
           ├─ 2215 (wsgi:pulp_forg -DFOREGROUND
           ├─21091 /usr/sbin/httpd -DFOREGROUND
           ├─21092 /usr/sbin/httpd -DFOREGROUND
           ├─21151 /usr/sbin/httpd -DFOREGROUND
           ├─21156 /usr/sbin/httpd -DFOREGROUND
           ├─21158 /usr/sbin/httpd -DFOREGROUND
           ├─28465 /usr/sbin/httpd -DFOREGROUND
           ├─28466 /usr/sbin/httpd -DFOREGROUND
           ├─28467 /usr/sbin/httpd -DFOREGROUND
           ├─28468 /usr/sbin/httpd -DFOREGROUND
           ├─28472 /usr/sbin/httpd -DFOREGROUND
           ├─28475 /usr/sbin/httpd -DFOREGROUND
           ├─28479 /usr/sbin/httpd -DFOREGROUND
           ├─28486 /usr/sbin/httpd -DFOREGROUND
           ├─28487 /usr/sbin/httpd -DFOREGROUND
           ├─28488 /usr/sbin/httpd -DFOREGROUND
           ├─28489 /usr/sbin/httpd -DFOREGROUND
           ├─28495 /usr/sbin/httpd -DFOREGROUND
           ├─28501 /usr/sbin/httpd -DFOREGROUND
           ├─28519 /usr/sbin/httpd -DFOREGROUND
           ├─28521 /usr/sbin/httpd -DFOREGROUND
           ├─28526 /usr/sbin/httpd -DFOREGROUND
           ├─28528 /usr/sbin/httpd -DFOREGROUND
           ├─28529 /usr/sbin/httpd -DFOREGROUND
           └─28531 /usr/sbin/httpd -DFOREGROUND

Nov 23 06:19:03 satellite.truhearing.com pulp[2206]: pulp.server.agent.direct.services:INFO: AMQP reply handler started
Nov 23 06:19:03 satellite.truhearing.com pulp[2206]: pulp.server.webservices.application:INFO: *************************************************************
Nov 23 06:19:03 satellite.truhearing.com pulp[2206]: pulp.server.webservices.application:INFO: The Pulp server has been successfully initialized
Nov 23 06:19:03 satellite.truhearing.com pulp[2206]: pulp.server.webservices.application:INFO: *************************************************************
Nov 23 06:19:03 satellite.truhearing.com pulp[2202]: gofer.messaging.adapter.qpid.connection:INFO: opened: qpid+ssl://localhost:5671
Nov 23 06:19:03 satellite.truhearing.com pulp[2202]: gofer.messaging.adapter.connect:INFO: connected: qpid+ssl://localhost:5671
Nov 23 06:19:03 satellite.truhearing.com pulp[2209]: gofer.messaging.adapter.qpid.connection:INFO: opened: qpid+ssl://localhost:5671
Nov 23 06:19:03 satellite.truhearing.com pulp[2209]: gofer.messaging.adapter.connect:INFO: connected: qpid+ssl://localhost:5671
Nov 23 06:19:03 satellite.truhearing.com pulp[2206]: gofer.messaging.adapter.qpid.connection:INFO: opened: qpid+ssl://localhost:5671
Nov 23 06:19:03 satellite.truhearing.com pulp[2206]: gofer.messaging.adapter.connect:INFO: connected: qpid+ssl://localhost:5671
\ displaying puppetserver                                                       
● puppetserver.service - puppetserver Service
   Loaded: loaded (/usr/lib/systemd/system/puppetserver.service; enabled; vendor preset: disabled)
   Active: active (running) since Mon 2020-11-23 06:19:37 MST; 8h ago
  Process: 1747 ExecStart=/opt/puppetlabs/server/apps/puppetserver/bin/puppetserver start (code=exited, status=0/SUCCESS)
 Main PID: 1855 (java)
    Tasks: 113 (limit: 4915)
   CGroup: /system.slice/puppetserver.service
           ├─ 1855 /usr/bin/java -Xms2G -Xmx2G -Djruby.logger.class=com.puppetlabs.jruby_utils.jruby.Slf4jLogger -Djava.security.egd=file:/dev/urandom -XX:OnOutOfMemoryError=kill -9 %p -cp /opt/puppetlabs/server/apps/puppetserver/puppet-server-release.jar:/opt/puppetlabs/server/apps/puppetserver/jruby-1_7.jar:/opt/puppetlabs/server/data/puppetserver/jars/* clojure.main -m puppetlabs.trapperkeeper.main --config /etc/puppetlabs/puppetserver/conf.d --bootstrap-config /etc/puppetlabs/puppetserver/services.d/,/opt/puppetlabs/server/apps/puppetserver/config/services.d/ --restart-file /opt/puppetlabs/server/data/puppetserver/restartcounter
           ├─30489 ruby /etc/puppetlabs/puppet/node.rb echo.truhearing.com
           └─30616 ruby /etc/puppetlabs/puppet/node.rb mysql2.truhearing.com

Nov 23 06:18:50 satellite.truhearing.com systemd[1]: Starting puppetserver Service...
Nov 23 06:19:37 satellite.truhearing.com systemd[1]: Started puppetserver Service.
\ displaying dynflow-sidekiq@worker                                             
● dynflow-sidekiq@worker.service - Foreman jobs daemon - worker on sidekiq
   Loaded: loaded (/usr/lib/systemd/system/dynflow-sidekiq@.service; enabled; vendor preset: disabled)
   Active: active (running) since Mon 2020-11-23 14:49:11 MST; 3s ago
     Docs: https://theforeman.org
 Main PID: 30655 (sidekiq)
   CGroup: /system.slice/system-dynflow\x2dsidekiq.slice/dynflow-sidekiq@worker.service
           └─30655 /opt/rh/rh-ruby25/root/usr/bin/ruby /opt/theforeman/tfm/root/usr/bin/sidekiq -e production -r /usr/share/foreman/extras/dynflow-sidekiq.rb -C /etc/foreman/dynflow/worker.yml

Nov 23 14:49:11 satellite.truhearing.com systemd[1]: Started Foreman jobs daemon - worker on sidekiq.
Nov 23 14:49:13 satellite.truhearing.com dynflow-sidekiq@worker[30655]: 2020-11-23T21:49:13.015Z 30655 TID-9foej INFO: GitLab reliable fetch activated!
Nov 23 14:49:13 satellite.truhearing.com dynflow-sidekiq@worker[30655]: 2020-11-23T21:49:13.023Z 30655 TID-kp5r7 INFO: Booting Sidekiq 5.2.7 with redis options {:id=>"Sidekiq-server-PID-30655", :url=>"redis://localhost:6379/0"}
\ displaying dynflow-sidekiq@worker-hosts-queue                                 
● dynflow-sidekiq@worker-hosts-queue.service - Foreman jobs daemon - worker-hosts-queue on sidekiq
   Loaded: loaded (/usr/lib/systemd/system/dynflow-sidekiq@.service; enabled; vendor preset: disabled)
   Active: active (running) since Mon 2020-11-23 14:49:08 MST; 6s ago
     Docs: https://theforeman.org
 Main PID: 30513 (sidekiq)
   CGroup: /system.slice/system-dynflow\x2dsidekiq.slice/dynflow-sidekiq@worker-hosts-queue.service
           └─30513 /opt/rh/rh-ruby25/root/usr/bin/ruby /opt/theforeman/tfm/root/usr/bin/sidekiq -e production -r /usr/share/foreman/extras/dynflow-sidekiq.rb -C /etc/foreman/dynflow/worker-hosts-queue.yml

Nov 23 14:49:08 satellite.truhearing.com systemd[1]: Started Foreman jobs daemon - worker-hosts-queue on sidekiq.
Nov 23 14:49:09 satellite.truhearing.com dynflow-sidekiq@worker-hosts-queue[30513]: 2020-11-23T21:49:09.946Z 30513 TID-8cp8h INFO: GitLab reliable fetch activated!
Nov 23 14:49:09 satellite.truhearing.com dynflow-sidekiq@worker-hosts-queue[30513]: 2020-11-23T21:49:09.947Z 30513 TID-hm9s1 INFO: Booting Sidekiq 5.2.7 with redis options {:id=>"Sidekiq-server-PID-30513", :url=>"redis://localhost:6379/0"}
Nov 23 14:49:13 satellite.truhearing.com dynflow-sidekiq@worker-hosts-queue[30513]: /usr/share/foreman/lib/foreman.rb:8: warning: already initialized constant Foreman::UUID_REGEXP
Nov 23 14:49:13 satellite.truhearing.com dynflow-sidekiq@worker-hosts-queue[30513]: /usr/share/foreman/lib/foreman.rb:8: warning: previous definition of UUID_REGEXP was here
\ displaying foreman-proxy                                                      
● foreman-proxy.service - Foreman Proxy
   Loaded: loaded (/usr/lib/systemd/system/foreman-proxy.service; enabled; vendor preset: disabled)
   Active: active (running) since Mon 2020-11-23 06:19:04 MST; 8h ago
 Main PID: 1755 (ruby)
    Tasks: 8
   CGroup: /system.slice/foreman-proxy.service
           └─1755 ruby /usr/share/foreman-proxy/bin/smart-proxy --no-daemonize

Nov 23 06:18:50 satellite.truhearing.com systemd[1]: Starting Foreman Proxy...
Nov 23 06:19:04 satellite.truhearing.com systemd[1]: Started Foreman Proxy.
\ All services are running                                            [OK]      

Reran the installer:
It threw errors right at 100% ->

[root@satellite kcadmin]# foreman-installer --scenario katello
 /Stage[main]/Foreman_proxy::Register/Foreman_smartproxy[satellite.truhearing.com]: Could not evaluate: Proxy satellite.truhearing.com cannot be retrieved: unknown error (response 502)
Installing             Done                                               [100%] [............................................................]
  Something went wrong! Check the log for ERROR-level output
  The full log is at /var/log/foreman-installer/katello.log
[root@satellite kcadmin]# tail /var/log/foreman-installer/katello.log
[DEBUG 2020-11-23T15:02:50 main] Hook /usr/share/foreman-installer/katello/hooks/post/99-version_locking.rb returned nil
[ INFO 2020-11-23T15:02:50 main] All hooks in group post finished
[DEBUG 2020-11-23T15:02:50 main] Exit with status code: 6 (signal was 6)
[ERROR 2020-11-23T15:02:50 main] Errors encountered during run:
[ERROR 2020-11-23T15:02:50 main] foreman-maintain packages is-locked --assumeyes failed! Check the output for error!
[ERROR 2020-11-23T15:02:50 main]  /Stage[main]/Foreman_proxy::Register/Foreman_smartproxy[satellite.truhearing.com]: Could not evaluate: Proxy satellite.truhearing.com cannot be retrieved: unknown error (response 502)
[DEBUG 2020-11-23T15:02:50 main] Cleaning /tmp/kafo_installation20201123-4193-hxmys8
[DEBUG 2020-11-23T15:02:50 main] Cleaning /tmp/kafo_installation20201123-4193-1xrf1c1
[DEBUG 2020-11-23T15:02:50 main] Cleaning /tmp/default_values.yaml
[ INFO 2020-11-23T15:02:50 main] Installer finished in 232.395580196 seconds

I checked my install this morning after not touching it for a few days. The web service gave me the login prompt. I tried logging in, and at the same time tried to register one of my problem nodes via subscription-manager. The registration timed out, and the login failed with a timeout as well.

I did get a stack trace:

2020-11-24T12:02:02 [I|app|d2dd9cc0] Started GET "/node/foreman.internal.domain.com?format=yml" for 127.0.0.1 at 2020-11-24 12:02:02 +1030
2020-11-24T12:02:02 [I|app|d2dd9cc0] Processing by HostsController#externalNodes as YML
2020-11-24T12:02:02 [I|app|d2dd9cc0]   Parameters: {"name"=>"foreman.internal.domain.com"}
2020-11-24T12:02:02 [I|app|d2dd9cc0]   Rendering text template
2020-11-24T12:02:02 [I|app|d2dd9cc0]   Rendered text template (Duration: 0.0ms | Allocations: 3)
2020-11-24T12:02:02 [I|app|d2dd9cc0] Completed 200 OK in 175ms (Views: 1.0ms | ActiveRecord: 32.6ms | Allocations: 26359)
2020-11-24T12:02:03 [I|app|f25b6993] Started POST "/api/hosts/facts" for 127.0.0.1 at 2020-11-24 12:02:03 +1030
2020-11-24T12:02:03 [I|app|f25b6993] Processing by Api::V2::HostsController#facts as JSON
2020-11-24T12:02:03 [I|app|f25b6993]   Parameters: {"facts"=>"[FILTERED]", "name"=>"foreman.internal.domain.com", "certname"=>"foreman.internal.domain.com", "apiv"=>"v2", "host"=>{"certname"=>"foreman.internal.domain.com", "name"=>"foreman.internal.domain.com"}}
2020-11-24T12:02:04 [I|app|33f6e407] Started GET "/rhsm/status" for 127.0.0.1 at 2020-11-24 12:02:04 +1030
2020-11-24T12:02:04 [I|app|33f6e407] Processing by Katello::Api::Rhsm::CandlepinProxiesController#server_status as JSON
2020-11-24T12:02:04 [I|app|f25b6993] Import facts for 'foreman.internal.domain.com' completed. Added: 0, Updated: 17, Deleted 0 facts
2020-11-24T12:02:04 [I|app|f25b6993] Completed 201 Created in 1192ms (Views: 14.4ms | ActiveRecord: 275.9ms | Allocations: 180540)
2020-11-24T12:02:04 [I|app|663ca1ec] Started GET "/node/foreman.internal.domain.com?format=yml" for 127.0.0.1 at 2020-11-24 12:02:04 +1030
2020-11-24T12:02:04 [I|app|663ca1ec] Processing by HostsController#externalNodes as YML
2020-11-24T12:02:04 [I|app|663ca1ec]   Parameters: {"name"=>"foreman.internal.domain.com"}
2020-11-24T12:02:04 [I|app|663ca1ec]   Rendering text template
2020-11-24T12:02:04 [I|app|663ca1ec]   Rendered text template (Duration: 0.0ms | Allocations: 3)
2020-11-24T12:02:04 [I|app|663ca1ec] Completed 200 OK in 158ms (Views: 1.1ms | ActiveRecord: 29.6ms | Allocations: 26405)
2020-11-24T12:02:05 [I|app|2eafd8f9] Started POST "/api/config_reports" for 127.0.0.1 at 2020-11-24 12:02:05 +1030
2020-11-24T12:02:05 [I|app|2eafd8f9] Processing by Api::V2::ConfigReportsController#create as JSON
2020-11-24T12:02:05 [I|app|2eafd8f9]   Parameters: {"config_report"=>"[FILTERED]", "apiv"=>"v2"}
2020-11-24T12:02:05 [I|app|2eafd8f9] Scanning report with: Foreman::PuppetReportScanner
2020-11-24T12:02:05 [I|app|2eafd8f9] Imported report for foreman.internal.domain.com in 73.1 ms, status refreshed in 27.1 ms
2020-11-24T12:02:05 [I|app|2eafd8f9]   Rendering api/v2/config_reports/create.json.rabl
2020-11-24T12:02:05 [I|app|2eafd8f9]   Rendered api/v2/config_reports/create.json.rabl (Duration: 22.8ms | Allocations: 7779)
2020-11-24T12:02:05 [I|app|2eafd8f9] Completed 201 Created in 152ms (Views: 22.8ms | ActiveRecord: 41.3ms | Allocations: 23839)
2020-11-24T12:02:15 [I|app|4ddec9f9] Started POST "/rhsm/consumers?owner=Default_Organization&activation_keys=Enrol%20existing%20hosts" for 127.0.0.1 at 2020-11-24 12:02:15 +1030
2020-11-24T12:02:15 [I|app|4ddec9f9] Processing by Katello::Api::Rhsm::CandlepinProxiesController#consumer_activate as JSON
2020-11-24T12:02:15 [I|app|4ddec9f9]   Parameters: {"name"=>"plex.domain.com", "contentTags"=>[], "serviceLevel"=>"", "usage"=>"", "role"=>"", "facts"=>"[FILTERED]", "addOns"=>[], "type"=>"system", "owner"=>"Default_Organization", "activation_keys"=>"Enrol existing hosts"}
2020-11-24T12:02:22 [I|app|29f3285a] Started POST "/users/login" for 127.0.0.1 at 2020-11-24 12:02:22 +1030
2020-11-24T12:02:22 [I|app|29f3285a] Processing by UsersController#login as HTML
2020-11-24T12:02:22 [I|app|29f3285a]   Parameters: {"login"=>{"login"=>"admin", "password"=>"[FILTERED]"}, "authenticity_token"=>"NrGjz+wWYtczPaLUT+faJjL61rRoGXE3iTgB1huBjyB1mHxRe9Uz3pvPcma8LMXlK6mieEIbo1wfJGzeUJPsBw=="}
2020-11-24T12:02:22 [I|app|29f3285a] User 'admin' logged in from '10.0.0.10'
2020-11-24T12:02:22 [I|app|29f3285a] Redirected to https://foreman.domain.com/
2020-11-24T12:02:22 [I|app|29f3285a] Completed 302 Found in 83ms (ActiveRecord: 19.1ms | Allocations: 6374)
2020-11-24T12:02:22 [I|app|22f7577c] Started GET "/" for 127.0.0.1 at 2020-11-24 12:02:22 +1030
2020-11-24T12:02:22 [I|app|22f7577c] Processing by DashboardController#index as HTML
2020-11-24T12:02:22 [I|app|22f7577c]   Rendering dashboard/index.html.erb within layouts/application

==> /var/log/httpd/foreman-ssl_access_ssl.log <==
10.3.2.20 - - [24/Nov/2020:12:02:04 +1030] "GET /rhsm/status HTTP/1.1" 502 401 "-" "RHSM/1.0 (cmd=subscription-manager)"

==> /var/log/httpd/foreman-ssl_error_ssl.log <==
[Tue Nov 24 12:03:04.581924 2020] [proxy_http:error] [pid 5108] (70007)The timeout specified has expired: [client 10.3.2.20:55752] AH01102: error reading status line from remote server 127.0.0.1:3000
[Tue Nov 24 12:03:04.582104 2020] [proxy:error] [pid 5108] [client 10.3.2.20:55752] AH00898: Error reading from remote server returned by /rhsm/status

==> /var/log/httpd/foreman-ssl_access_ssl.log <==
10.3.2.20 - - [24/Nov/2020:12:02:15 +1030] "POST /rhsm/consumers?owner=Default_Organization&activation_keys=Enrol%20existing%20hosts HTTP/1.1" 502 408 "-" "RHSM/1.0 (cmd=subscription-manager)"
10.3.2.120 - admin [24/Nov/2020:12:03:15 +1030] "GET /pulp/api/v2/users/ HTTP/1.1" 200 140 "-" "rest-client/2.0.2 (linux-gnu x86_64) ruby/2.5.5p157"

==> /var/log/httpd/foreman-ssl_error_ssl.log <==
[Tue Nov 24 12:03:15.334355 2020] [proxy_http:error] [pid 5110] (70007)The timeout specified has expired: [client 10.3.2.20:59010] AH01102: error reading status line from remote server 127.0.0.1:3000
[Tue Nov 24 12:03:15.334475 2020] [proxy:error] [pid 5110] [client 10.3.2.20:59010] AH00898: Error reading from remote server returned by /rhsm/consumers

==> /var/log/foreman/production.log <==
2020-11-24T12:03:15 [W|app|4ddec9f9] Timed out connecting to server
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/rest-client-2.0.2/lib/restclient/request.rb:731:in `rescue in transmit'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/rest-client-2.0.2/lib/restclient/request.rb:642:in `transmit'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/rest-client-2.0.2/lib/restclient/request.rb:145:in `execute'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/rest-client-2.0.2/lib/restclient/request.rb:52:in `execute'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/rest-client-2.0.2/lib/restclient/resource.rb:51:in `get'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.17.0/app/models/katello/ping.rb:230:in `backend_status'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.17.0/app/models/katello/ping.rb:90:in `block in ping_candlepin_without_auth'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.17.0/app/models/katello/ping.rb:139:in `exception_watch'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.17.0/app/models/katello/ping.rb:89:in `ping_candlepin_without_auth'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.17.0/app/models/katello/ping.rb:29:in `ping'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.17.0/app/services/katello/registration_manager.rb:172:in `block in check_registration_services'
 4ddec9f9 | /usr/share/foreman/app/models/concerns/foreman/thread_session.rb:108:in `as'
 4ddec9f9 | /usr/share/foreman/app/models/concerns/foreman/thread_session.rb:114:in `as_anonymous_admin'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.17.0/app/services/katello/registration_manager.rb:171:in `check_registration_services'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.17.0/app/controllers/katello/api/rhsm/candlepin_proxies_controller.rb:461:in `check_registration_services'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/activesupport-6.0.3.1/lib/active_support/callbacks.rb:428:in `block in make_lambda'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/activesupport-6.0.3.1/lib/active_support/callbacks.rb:181:in `block (2 levels) in halting_and_conditional'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/abstract_controller/callbacks.rb:34:in `block (2 levels) in <module:Callbacks>'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/activesupport-6.0.3.1/lib/active_support/callbacks.rb:182:in `block in halting_and_conditional'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/activesupport-6.0.3.1/lib/active_support/callbacks.rb:513:in `block in invoke_before'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/activesupport-6.0.3.1/lib/active_support/callbacks.rb:513:in `each'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/activesupport-6.0.3.1/lib/active_support/callbacks.rb:513:in `invoke_before'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/activesupport-6.0.3.1/lib/active_support/callbacks.rb:110:in `block in run_callbacks'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.17.0/app/controllers/katello/api/rhsm/candlepin_proxies_controller.rb:35:in `repackage_message'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/activesupport-6.0.3.1/lib/active_support/callbacks.rb:121:in `block in run_callbacks'
 4ddec9f9 | /usr/share/foreman/app/controllers/api/v2/base_controller.rb:181:in `disable_json_root'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/activesupport-6.0.3.1/lib/active_support/callbacks.rb:121:in `block in run_callbacks'
 4ddec9f9 | /usr/share/foreman/app/controllers/concerns/foreman/controller/timezone.rb:10:in `set_timezone'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/activesupport-6.0.3.1/lib/active_support/callbacks.rb:121:in `block in run_callbacks'
 4ddec9f9 | /usr/share/foreman/app/models/concerns/foreman/thread_session.rb:32:in `clear_thread'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/activesupport-6.0.3.1/lib/active_support/callbacks.rb:121:in `block in run_callbacks'
 4ddec9f9 | /usr/share/foreman/app/controllers/concerns/foreman/controller/topbar_sweeper.rb:12:in `set_topbar_sweeper_controller'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/activesupport-6.0.3.1/lib/active_support/callbacks.rb:121:in `block in run_callbacks'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/audited-4.9.0/lib/audited/sweeper.rb:14:in `around'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/activesupport-6.0.3.1/lib/active_support/callbacks.rb:121:in `block in run_callbacks'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/audited-4.9.0/lib/audited/sweeper.rb:14:in `around'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/activesupport-6.0.3.1/lib/active_support/callbacks.rb:121:in `block in run_callbacks'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/activesupport-6.0.3.1/lib/active_support/callbacks.rb:139:in `run_callbacks'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/abstract_controller/callbacks.rb:41:in `process_action'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_controller/metal/rescue.rb:22:in `process_action'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_controller/metal/instrumentation.rb:33:in `block in process_action'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/activesupport-6.0.3.1/lib/active_support/notifications.rb:180:in `block in instrument'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/activesupport-6.0.3.1/lib/active_support/notifications/instrumenter.rb:24:in `instrument'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/activesupport-6.0.3.1/lib/active_support/notifications.rb:180:in `instrument'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_controller/metal/instrumentation.rb:32:in `process_action'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_controller/metal/params_wrapper.rb:245:in `process_action'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/activerecord-6.0.3.1/lib/active_record/railties/controller_runtime.rb:27:in `process_action'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.17.0/app/controllers/katello/concerns/api/api_controller.rb:40:in `process_action'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/abstract_controller/base.rb:136:in `process'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/actionview-6.0.3.1/lib/action_view/rendering.rb:39:in `process'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_controller/metal.rb:190:in `dispatch'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_controller/metal.rb:254:in `dispatch'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_dispatch/routing/route_set.rb:50:in `dispatch'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_dispatch/routing/route_set.rb:33:in `serve'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_dispatch/routing/mapper.rb:18:in `block in <class:Constraints>'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_dispatch/routing/mapper.rb:48:in `serve'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_dispatch/journey/router.rb:49:in `block in serve'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_dispatch/journey/router.rb:32:in `each'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_dispatch/journey/router.rb:32:in `serve'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_dispatch/routing/route_set.rb:834:in `call'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/railties-6.0.3.1/lib/rails/engine.rb:527:in `call'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/railties-6.0.3.1/lib/rails/railtie.rb:190:in `public_send'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/railties-6.0.3.1/lib/rails/railtie.rb:190:in `method_missing'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_dispatch/routing/mapper.rb:19:in `block in <class:Constraints>'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_dispatch/routing/mapper.rb:48:in `serve'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_dispatch/journey/router.rb:49:in `block in serve'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_dispatch/journey/router.rb:32:in `each'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_dispatch/journey/router.rb:32:in `serve'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_dispatch/routing/route_set.rb:834:in `call'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.17.0/lib/katello/middleware/event_daemon.rb:10:in `call'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_dispatch/middleware/static.rb:126:in `call'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_dispatch/middleware/static.rb:126:in `call'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/apipie-dsl-2.2.9/lib/apipie_dsl/static_dispatcher.rb:67:in `call'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/apipie-rails-0.5.17/lib/apipie/static_dispatcher.rb:66:in `call'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/apipie-rails-0.5.17/lib/apipie/extractor/recorder.rb:137:in `call'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_dispatch/middleware/static.rb:126:in `call'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_dispatch/middleware/static.rb:126:in `call'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_dispatch/middleware/static.rb:126:in `call'
 4ddec9f9 | /usr/share/foreman/lib/foreman/middleware/telemetry.rb:10:in `call'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/apipie-rails-0.5.17/lib/apipie/middleware/checksum_in_headers.rb:27:in `call'
 4ddec9f9 | /usr/share/foreman/lib/foreman/middleware/catch_json_parse_errors.rb:9:in `call'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/rack-2.2.3/lib/rack/tempfile_reaper.rb:15:in `call'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/rack-2.2.3/lib/rack/etag.rb:27:in `call'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/rack-2.2.3/lib/rack/conditional_get.rb:40:in `call'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/rack-2.2.3/lib/rack/head.rb:12:in `call'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_dispatch/http/content_security_policy.rb:18:in `call'
 4ddec9f9 | /usr/share/foreman/lib/foreman/middleware/logging_context_session.rb:22:in `call'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/rack-2.2.3/lib/rack/session/abstract/id.rb:266:in `context'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/rack-2.2.3/lib/rack/session/abstract/id.rb:260:in `call'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_dispatch/middleware/cookies.rb:648:in `call'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_dispatch/middleware/callbacks.rb:27:in `block in call'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/activesupport-6.0.3.1/lib/active_support/callbacks.rb:101:in `run_callbacks'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_dispatch/middleware/callbacks.rb:26:in `call'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_dispatch/middleware/actionable_exceptions.rb:17:in `call'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_dispatch/middleware/debug_exceptions.rb:32:in `call'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_dispatch/middleware/show_exceptions.rb:33:in `call'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/railties-6.0.3.1/lib/rails/rack/logger.rb:37:in `call_app'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/railties-6.0.3.1/lib/rails/rack/logger.rb:28:in `call'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/sprockets-rails-3.2.1/lib/sprockets/rails/quiet_assets.rb:13:in `call'
 4ddec9f9 | /usr/share/foreman/lib/foreman/middleware/logging_context_request.rb:11:in `call'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_dispatch/middleware/remote_ip.rb:81:in `call'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_dispatch/middleware/request_id.rb:27:in `call'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.17.0/lib/katello/prevent_json_parsing.rb:12:in `call'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/rack-2.2.3/lib/rack/method_override.rb:24:in `call'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/rack-2.2.3/lib/rack/runtime.rb:22:in `call'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/activesupport-6.0.3.1/lib/active_support/cache/strategy/local_cache_middleware.rb:29:in `call'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_dispatch/middleware/executor.rb:14:in `call'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_dispatch/middleware/static.rb:126:in `call'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/rack-2.2.3/lib/rack/sendfile.rb:110:in `call'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_dispatch/middleware/host_authorization.rb:76:in `call'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/secure_headers-6.3.0/lib/secure_headers/middleware.rb:11:in `call'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/railties-6.0.3.1/lib/rails/engine.rb:527:in `call'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/railties-6.0.3.1/lib/rails/railtie.rb:190:in `public_send'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/railties-6.0.3.1/lib/rails/railtie.rb:190:in `method_missing'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/rack-2.2.3/lib/rack/urlmap.rb:74:in `block in call'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/rack-2.2.3/lib/rack/urlmap.rb:58:in `each'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/rack-2.2.3/lib/rack/urlmap.rb:58:in `call'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/puma-4.3.3/lib/puma/configuration.rb:228:in `call'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/puma-4.3.3/lib/puma/server.rb:682:in `handle_request'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/puma-4.3.3/lib/puma/server.rb:472:in `process_client'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/puma-4.3.3/lib/puma/server.rb:328:in `block in run'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/puma-4.3.3/lib/puma/thread_pool.rb:134:in `block in spawn_thread'
 4ddec9f9 | /opt/theforeman/tfm/root/usr/share/gems/gems/logging-2.2.2/lib/logging/diagnostic_context.rb:474:in `block in create_with_logging_context'

==> /var/log/httpd/foreman-ssl_access_ssl.log <==
10.3.4.10 - - [24/Nov/2020:12:02:22 +1030] "GET / HTTP/1.0" 502 379 "https://foreman.domain.com/users/login" "Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:82.0) Gecko/20100101 Firefox/82.0"

==> /var/log/httpd/foreman-ssl_error_ssl.log <==
[Tue Nov 24 12:03:22.882958 2020] [proxy_http:error] [pid 5111] (70007)The timeout specified has expired: [client 10.3.4.10:39596] AH01102: error reading status line from remote server 127.0.0.1:3000, referer: https://foreman.domain.com/users/login
[Tue Nov 24 12:03:22.883123 2020] [proxy:error] [pid 5111] [client 10.3.4.10:39596] AH00898: Error reading from remote server returned by /, referer: https://foreman.domain.com/users/login

Disk resources:

root@foreman:~# df -h
Filesystem               Size  Used Avail Use% Mounted on
devtmpfs                 6.8G     0  6.8G   0% /dev
tmpfs                    6.8G   40K  6.8G   1% /dev/shm
tmpfs                    6.8G   42M  6.8G   1% /run
tmpfs                    6.8G     0  6.8G   0% /sys/fs/cgroup
/dev/mapper/centos-root   14G   13G  1.1G  93% /
/dev/sda1                497M  276M  222M  56% /boot
tmpfs                    1.4G     0  1.4G   0% /run/user/0

Memory resources:

root@foreman:~# free -h
               total        used        free      shared  buff/cache   available
Mem:            13G         10G        680M        144M        2.6G        2.8G
Swap:          1.6G        520K        1.6G

I can’t even get to the log-in page… I just get the 502 error.

Can you grep for oom in your /var/log/messages to see if you’re running low on memory?

Here you go:

[root@satellite kcadmin]# cat /var/log/messages | grep oom
Nov 22 08:42:35 satellite dynflow-sidekiq@worker-hosts-queue: 2020-11-22T15:42:35.309Z 4802 TID-7loom INFO: GitLab reliable fetch activated!
Nov 24 05:31:48 satellite dynflow-sidekiq@orchestrator: 2020-11-24T12:31:48.560Z 20126 TID-jwoom INFO: Booting Sidekiq 5.2.7 with redis options {:id=>"Sidekiq-server-PID-20126", :url=>"redis://localhost:6379/0"}
[root@satellite kcadmin]# 
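
Note that both matches above are just Sidekiq thread IDs that happen to contain "oom", not kernel OOM events. A stricter pattern for actual OOM-killer activity would be something like:

grep -iE 'out of memory|oom-killer' /var/log/messages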

@therokh: Could you also check if you’re seeing any oom messages on your setup?

No OOM logs on my setup

HTTP 502 errors show up when Apache times out talking to a backend. I think the relevant logs are in /var/log/httpd. Could you have a look there?
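
For example, tailing both httpd logs while reproducing the failure should show the matching access and proxy errors together (paths taken from the excerpts above):

tail -f /var/log/httpd/foreman-ssl_access_ssl.log /var/log/httpd/foreman-ssl_error_ssl.log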

I can only see the timeouts logged, and occasionally an error about setting r->user (I think these were generated when I tried to log in and got a timeout):

[Wed Nov 25 21:51:37.963219 2020] [proxy_http:error] [pid 5111] (70007)The timeout specified has expired: [client 10.3.2.120:59904] AH01102: error reading status line from remote server 127.0.0.1:3000
[Wed Nov 25 21:51:37.963309 2020] [proxy:error] [pid 5111] [client 10.3.2.120:59904] AH00898: Error reading from remote server returned by /rhsm/consumers/ed3826b3-910e-4c0f-bfb6-9e5b0144e930/content_overrides
[Wed Nov 25 21:52:38.785568 2020] [proxy_http:error] [pid 30500] (70007)The timeout specified has expired: [client 10.3.2.120:59912] AH01102: error reading status line from remote server 127.0.0.1:3000
[Wed Nov 25 21:52:38.785660 2020] [proxy:error] [pid 30500] [client 10.3.2.120:59912] AH00898: Error reading from remote server returned by /rhsm/consumers/ed3826b3-910e-4c0f-bfb6-9e5b0144e930/certificates/serials
[Wed Nov 25 21:53:38.858600 2020] [proxy_http:error] [pid 5112] (70007)The timeout specified has expired: [client 10.3.2.120:59920] AH01102: error reading status line from remote server 127.0.0.1:3000
[Wed Nov 25 21:53:38.858748 2020] [proxy:error] [pid 5112] [client 10.3.2.120:59920] AH00898: Error reading from remote server returned by /rhsm/status
[Wed Nov 25 22:11:30.942425 2020] [proxy_http:error] [pid 5108] (70007)The timeout specified has expired: [client 10.3.2.253:39488] AH01102: error reading status line from remote server 127.0.0.1:3000
[Wed Nov 25 22:11:30.942509 2020] [proxy:error] [pid 5108] [client 10.3.2.253:39488] AH00898: Error reading from remote server returned by /rhsm/consumers/122df4bf-2f59-4208-9342-fe025256eabf/certificates/serials
[Wed Nov 25 23:00:34.301416 2020] [ssl:warn] [pid 5107] [client 10.3.2.120:60546] AH02227: Failed to set r->user to 'SSL_CLIENT_S_DN_CN'
[Wed Nov 25 23:00:34.320479 2020] [ssl:warn] [pid 5106] [client 10.3.2.120:60550] AH02227: Failed to set r->user to 'SSL_CLIENT_S_DN_CN'
[Wed Nov 25 23:00:34.384587 2020] [ssl:warn] [pid 5108] [client 10.3.2.120:60556] AH02227: Failed to set r->user to 'SSL_CLIENT_S_DN_CN'
[Wed Nov 25 23:00:34.413455 2020] [ssl:warn] [pid 5110] [client 10.3.2.120:60558] AH02227: Failed to set r->user to 'SSL_CLIENT_S_DN_CN'
[Wed Nov 25 23:12:44.938158 2020] [proxy_http:error] [pid 5108] (70007)The timeout specified has expired: [client 10.3.2.253:39552] AH01102: error reading status line from remote server 127.0.0.1:3000
[Wed Nov 25 23:12:44.938296 2020] [proxy:error] [pid 5108] [client 10.3.2.253:39552] AH00898: Error reading from remote server returned by /rhsm/consumers/122df4bf-2f59-4208-9342-fe025256eabf/certificates/serials
[Thu Nov 26 00:13:48.844340 2020] [proxy_http:error] [pid 5107] (70007)The timeout specified has expired: [client 10.3.2.253:39590] AH01102: error reading status line from remote server 127.0.0.1:3000
[Thu Nov 26 00:13:48.844474 2020] [proxy:error] [pid 5107] [client 10.3.2.253:39590] AH00898: Error reading from remote server returned by /rhsm/consumers/122df4bf-2f59-4208-9342-fe025256eabf/certificates/serials

Looks like requests to Candlepin take too long. I wonder why, but don’t know a lot about Candlepin.

I am not a Candlepin expert, but I have seen this when OS entropy was low and Candlepin had to generate many X.509 certificates. Some older versions of Candlepin may still use the blocking /dev/random. Just check the OS entropy to be sure.
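
For example, the kernel exposes the available entropy directly; a value that stays in the low hundreds would point at entropy starvation:

cat /proc/sys/kernel/random/entropy_avail

If it is low, installing a userspace entropy daemon such as rngd (rng-tools) or haveged usually helps.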