Katello 4.0 installation on RHEL 8 failed

Problem:
Fresh installation of Katello 4.0 on a RHEL 8 system failed with '/usr/sbin/foreman-rake db:seed' returned 1 instead of one of [0].

Expected outcome:
Successful installation of Katello.

Foreman and Proxy versions:
Foreman 2.4 and Katello 4.0

Foreman and Proxy plugin versions:

Distribution and version:
Red Hat Enterprise Linux 8.3 (Ootpa)

Other relevant data:
I followed the instructions from https://community.theforeman.org/t/what-is-the-state-of-katello-support-on-el8/22186/6

[root@scotty ~]# foreman-installer -v --scenario katello 
2021-05-03 08:37:24 [NOTICE] [root] Loading default values from puppet modules...
2021-05-03 08:37:36 [NOTICE] [root] ... finished
2021-05-03 08:37:47 [NOTICE] [root] Running validation checks
2021-05-03 08:38:17 [NOTICE] [configure] Starting system configuration.
The total number of configuration tasks may increase during the run.
Observe logs or specify --verbose-log-level to see individual configuration tasks.
2021-05-03 08:38:53 [NOTICE] [configure] 100 out of 1944 done.
2021-05-03 08:38:53 [NOTICE] [configure] 200 out of 1944 done.
2021-05-03 08:39:08 [NOTICE] [configure] 300 out of 1944 done.
2021-05-03 08:39:23 [NOTICE] [configure] 400 out of 1944 done.
2021-05-03 08:39:27 [NOTICE] [configure] 500 out of 1944 done.
2021-05-03 08:39:28 [NOTICE] [configure] 600 out of 1946 done.
2021-05-03 08:39:28 [NOTICE] [configure] 700 out of 1946 done.
2021-05-03 08:39:29 [NOTICE] [configure] 800 out of 1948 done.
2021-05-03 08:39:32 [NOTICE] [configure] 900 out of 1949 done.
2021-05-03 08:39:33 [NOTICE] [configure] 1000 out of 1951 done.
2021-05-03 08:39:33 [NOTICE] [configure] 1100 out of 1954 done.
2021-05-03 08:39:35 [NOTICE] [configure] 1200 out of 1955 done.
2021-05-03 08:39:39 [NOTICE] [configure] 1300 out of 1956 done.
2021-05-03 08:42:23 [ERROR ] [configure] /Stage[main]/Foreman::Database/Foreman::Rake[db:seed]/Exec[foreman-rake-db:seed]: Failed to call refresh: '/usr/sbin/foreman-rake db:seed' returned 1 instead of one of [0]
2021-05-03 08:42:23 [ERROR ] [configure] /Stage[main]/Foreman::Database/Foreman::Rake[db:seed]/Exec[foreman-rake-db:seed]: '/usr/sbin/foreman-rake db:seed' returned 1 instead of one of [0]
2021-05-03 08:42:23 [NOTICE] [configure] 1400 out of 1956 done.
2021-05-03 08:42:24 [NOTICE] [configure] 1500 out of 1956 done.
2021-05-03 08:42:24 [NOTICE] [configure] 1600 out of 1956 done.
2021-05-03 08:43:17 [NOTICE] [configure] 1700 out of 1956 done.
2021-05-03 08:43:18 [NOTICE] [configure] 1800 out of 1956 done.
2021-05-03 08:43:42 [NOTICE] [configure] 1900 out of 1956 done.
2021-05-03 08:43:55 [NOTICE] [configure] System configuration has finished.

There were errors detected during install.
Please address the errors and re-run the installer to ensure the system is properly configured.
Failing to do so is likely to result in broken functionality.

The full log is at /var/log/foreman-installer/katello.log
[root@scotty ~]#

From katello.log:

2021-05-03 08:39:41 [DEBUG ] [configure] Foreman::Rake[db:migrate]: Starting to evaluate the resource (1384 of 1956)
2021-05-03 08:39:41 [DEBUG ] [configure] Foreman::Rake[db:migrate]: Evaluated in 0.00 seconds
2021-05-03 08:39:41 [DEBUG ] [configure] /Stage[main]/Foreman::Database/Foreman::Rake[db:migrate]/Exec[foreman-rake-db:migrate]: Starting to evaluate the resource (1385 of 1956)
2021-05-03 08:39:41 [DEBUG ] [configure] Exec[foreman-rake-db:migrate](provider=posix): Executing check '/usr/sbin/foreman-rake db:abort_if_pending_migrations'
2021-05-03 08:39:41 [DEBUG ] [configure] Executing with uid=foreman: '/usr/sbin/foreman-rake db:abort_if_pending_migrations'
2021-05-03 08:40:36 [DEBUG ] [configure] /Stage[main]/Foreman::Database/Foreman::Rake[db:migrate]/Exec[foreman-rake-db:migrate]: '/usr/sbin/foreman-rake db:migrate' won't be executed because of failed check 'unless'
2021-05-03 08:40:36 [DEBUG ] [configure] /Stage[main]/Foreman::Database/Foreman::Rake[db:migrate]/Exec[foreman-rake-db:migrate]: Evaluated in 55.11 seconds
2021-05-03 08:40:36 [DEBUG ] [configure] Foreman::Rake[db:migrate]: Starting to evaluate the resource (1386 of 1956)
2021-05-03 08:40:36 [DEBUG ] [configure] Foreman::Rake[db:migrate]: Evaluated in 0.00 seconds
2021-05-03 08:40:36 [DEBUG ] [configure] Prefetching cli resources for foreman_config_entry
2021-05-03 08:40:36 [DEBUG ] [configure] Executing with uid=foreman gid=foreman: '/usr/sbin/foreman-rake -- config '
2021-05-03 08:41:24 [DEBUG ] [configure] /Stage[main]/Foreman::Database/Foreman_config_entry[db_pending_seed]: Starting to evaluate the resource (1387 of 1956)
2021-05-03 08:41:24 [INFO  ] [configure] /Stage[main]/Foreman::Database/Foreman_config_entry[db_pending_seed]/value: value changed 'true' to 'false'
2021-05-03 08:41:24 [DEBUG ] [configure] /Stage[main]/Foreman::Database/Foreman_config_entry[db_pending_seed]: The container Class[Foreman::Database] will propagate my refresh event
2021-05-03 08:41:24 [DEBUG ] [configure] /Stage[main]/Foreman::Database/Foreman_config_entry[db_pending_seed]: Scheduling refresh of Foreman::Rake[db:seed]
2021-05-03 08:41:24 [DEBUG ] [configure] /Stage[main]/Foreman::Database/Foreman_config_entry[db_pending_seed]: Evaluated in 0.06 seconds
2021-05-03 08:41:24 [DEBUG ] [configure] Foreman::Rake[db:seed]: Starting to evaluate the resource (1388 of 1956)
2021-05-03 08:41:24 [DEBUG ] [configure] Foreman::Rake[db:seed]: Scheduling refresh of Exec[foreman-rake-db:seed]
2021-05-03 08:41:24 [DEBUG ] [configure] Foreman::Rake[db:seed]: Evaluated in 0.00 seconds
2021-05-03 08:41:24 [DEBUG ] [configure] /Stage[main]/Foreman::Database/Foreman::Rake[db:seed]/Exec[foreman-rake-db:seed]: Starting to evaluate the resource (1389 of 1956)
2021-05-03 08:41:24 [DEBUG ] [configure] /Stage[main]/Foreman::Database/Foreman::Rake[db:seed]/Exec[foreman-rake-db:seed]: '/usr/sbin/foreman-rake db:seed' won't be executed because of failed check 'refreshonly'
2021-05-03 08:41:24 [DEBUG ] [configure] Exec[foreman-rake-db:seed](provider=posix): Executing '/usr/sbin/foreman-rake db:seed'
2021-05-03 08:41:24 [DEBUG ] [configure] Executing with uid=foreman: '/usr/sbin/foreman-rake db:seed'
2021-05-03 08:42:23 [INFO  ] [configure] /Stage[main]/Foreman::Database/Foreman::Rake[db:seed]/Exec[foreman-rake-db:seed]/returns: rake aborted!
2021-05-03 08:42:23 [INFO  ] [configure] /Stage[main]/Foreman::Database/Foreman::Rake[db:seed]/Exec[foreman-rake-db:seed]/returns: There was an issue with the backend service candlepin: 404 Not Found
2021-05-03 08:42:23 [INFO  ] [configure] /Stage[main]/Foreman::Database/Foreman::Rake[db:seed]/Exec[foreman-rake-db:seed]/returns: /usr/share/gems/gems/katello-4.0.0/app/lib/actions/middleware/backend_services_check.rb:17:in `block in plan'
.....

Got the same problem here. I am doing a fresh installation on AlmaLinux 8.3 (as a test to replace CentOS) using external PostgreSQL 12.6 databases for Candlepin, Pulpcore and Foreman.
Foreman is running on a dedicated AlmaLinux server (VM).
Product versions are Katello 4.0 and Foreman 2.4, and I am following the instructions in the Foreman 2.4 and Katello 4.0 documentation.
Please find below my installation report.

2021-05-04 09:34:14 [NOTICE] [root] Loading default values from puppet modules...
2021-05-04 09:34:23 [NOTICE] [root] ... finished
2021-05-04 09:34:28 [NOTICE] [root] Running validation checks
2021-05-04 09:34:46 [NOTICE] [configure] Starting system configuration.
  The total number of configuration tasks may increase during the run.
  Observe logs or specify --verbose-log-level to see individual configuration tasks.
2021-05-04 09:35:12 [NOTICE] [configure] 100 out of 1769 done.
2021-05-04 09:35:13 [NOTICE] [configure] 200 out of 1769 done.
2021-05-04 09:35:21 [NOTICE] [configure] 300 out of 1769 done.
2021-05-04 09:35:22 [NOTICE] [configure] 400 out of 1769 done.
2021-05-04 09:35:25 [NOTICE] [configure] 500 out of 1770 done.
2021-05-04 09:35:25 [NOTICE] [configure] 600 out of 1771 done.
2021-05-04 09:35:26 [NOTICE] [configure] 700 out of 1771 done.
2021-05-04 09:35:31 [NOTICE] [configure] 800 out of 1772 done.
2021-05-04 09:35:32 [NOTICE] [configure] 900 out of 1773 done.
2021-05-04 09:35:32 [NOTICE] [configure] 1000 out of 1776 done.
2021-05-04 09:35:35 [NOTICE] [configure] 1100 out of 1778 done.
2021-05-04 09:35:37 [NOTICE] [configure] 1200 out of 1779 done.
2021-05-04 09:38:24 [ERROR ] [configure] /Stage[main]/Foreman::Database/Foreman::Rake[db:seed]/Exec[foreman-rake-db:seed]: Failed to call refresh: '/usr/sbin/foreman-rake db:seed' returned 1 instead of one of [0]
2021-05-04 09:38:24 [ERROR ] [configure] /Stage[main]/Foreman::Database/Foreman::Rake[db:seed]/Exec[foreman-rake-db:seed]: '/usr/sbin/foreman-rake db:seed' returned 1 instead of one of [0]
2021-05-04 09:38:24 [NOTICE] [configure] 1300 out of 1779 done.
2021-05-04 09:38:25 [NOTICE] [configure] 1400 out of 1779 done.
2021-05-04 09:38:25 [NOTICE] [configure] 1500 out of 1779 done.
2021-05-04 09:38:25 [NOTICE] [configure] 1600 out of 1779 done.
2021-05-04 09:39:32 [NOTICE] [configure] 1700 out of 1779 done.
2021-05-04 09:39:45 [NOTICE] [configure] System configuration has finished.

  There were errors detected during install.
  Please address the errors and re-run the installer to ensure the system is properly configured.
  Failing to do so is likely to result in broken functionality.

  The full log is at /var/log/foreman-installer/katello.log

I have been looking everywhere for a solution but unfortunately have not found one, and I was wondering whether support provided you with a solution.

rgds,
-gw

Please use code blocks when pasting log output. Fenced code blocks are typically easiest and are supported.

The critical part is this:

2021-05-03 08:42:23 [INFO  ] [configure] /Stage[main]/Foreman::Database/Foreman::Rake[db:seed]/Exec[foreman-rake-db:seed]/returns: rake aborted!
2021-05-03 08:42:23 [INFO  ] [configure] /Stage[main]/Foreman::Database/Foreman::Rake[db:seed]/Exec[foreman-rake-db:seed]/returns: There was an issue with the backend service candlepin: 404 Not Found
2021-05-03 08:42:23 [INFO  ] [configure] /Stage[main]/Foreman::Database/Foreman::Rake[db:seed]/Exec[foreman-rake-db:seed]/returns: /usr/share/gems/gems/katello-4.0.0/app/lib/actions/middleware/backend_services_check.rb:17:in `block in plan'

For some reason Candlepin wasn’t working well. You should really look into that. Note that you can also run the command manually (/usr/sbin/foreman-rake db:seed) to see the exact output, but I think Candlepin logs are in /var/log/candlepin and should tell you more. The output of journalctl -u tomcat can also be informative.
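
Concretely, something along these lines should surface the relevant information (a rough sketch; adjust paths to your setup, and note the Candlepin port is taken from the Tomcat log later in this thread):

/usr/sbin/foreman-rake db:seed --trace             # re-run the failing step by hand with a full backtrace
less /var/log/candlepin/candlepin.log              # Candlepin's own logs
less /var/log/candlepin/error.log
journalctl -u tomcat --since "1 hour ago"          # Candlepin runs inside Tomcat, so check its journal too
curl -k https://localhost:23443/candlepin/status   # liveness check; may need the Katello client certificates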


I already ran the db:seed command manually, including the trace option, but the output was the same as above.
In /var/log/candlepin I have two files:
1.) error.log

[root@scotty candlepin]# cat error.log
2021-05-04 10:57:16,125 [thread=main] [=, org=, csid=] WARN  org.hibernate.id.UUIDHexGenerator - HHH000409: Using org.hibernate.id.UUIDHexGenerator which does not generate IETF RFC 4122 compliant UUID values; consider using org.hibernate.id.UUIDGenerator instead
2021-05-04 10:57:16,321 [thread=main] [=, org=, csid=] WARN  org.hibernate.mapping.RootClass - HHH000038: Composite-id class does not override equals(): org.candlepin.model.PoolAttribute
2021-05-04 10:57:16,322 [thread=main] [=, org=, csid=] WARN  org.hibernate.mapping.RootClass - HHH000039: Composite-id class does not override hashCode(): org.candlepin.model.PoolAttribute
2021-05-04 10:57:21,400 [thread=main] [=, org=, csid=] WARN  org.hibernate.metamodel.internal.MetadataContext - HHH015011: Unable to locate static metamodel field : org.candlepin.model.KeyPair_#publicKey; this may or maynot indicate a problem with the static metamodel
2021-05-04 10:57:21,402 [thread=main] [=, org=, csid=] WARN  org.hibernate.metamodel.internal.MetadataContext - HHH015011: Unable to locate static metamodel field : org.candlepin.model.KeyPair_#privateKey; this may or may not indicate a problem with the static metamodel
2021-05-04 10:57:27,025 [thread=main] [=, org=, csid=] WARN  org.apache.activemq.artemis.core.server - AMQ222165: No Dead Letter Address configured for queue event.org.candlepin.audit.LoggingListener in AddressSettings
2021-05-04 10:57:27,026 [thread=main] [=, org=, csid=] WARN  org.apache.activemq.artemis.core.server - AMQ222166: No Expiry Address configured for queue event.org.candlepin.audit.LoggingListener in AddressSettings
2021-05-04 10:57:27,108 [thread=main] [=, org=, csid=] WARN  org.apache.activemq.artemis.core.server - AMQ222165: No Dead Letter Address configured for queue event.org.candlepin.audit.ActivationListener in AddressSettings
2021-05-04 10:57:27,109 [thread=main] [=, org=, csid=] WARN  org.apache.activemq.artemis.core.server - AMQ222166: No Expiry Address configured for queue event.org.candlepin.audit.ActivationListener in AddressSettings
2021-05-04 10:57:27,135 [thread=main] [=, org=, csid=] WARN  org.apache.activemq.artemis.core.server - AMQ222165: No Dead Letter Address configured for queue katello_candlepin_event_monitor.candlepin_events in AddressSettings
2021-05-04 10:57:27,137 [thread=main] [=, org=, csid=] WARN  org.apache.activemq.artemis.core.server - AMQ222166: No Expiry Address configured for queue katello_candlepin_event_monitor.candlepin_events in AddressSettings
2021-05-04 10:57:27,154 [thread=main] [=, org=, csid=] WARN  org.apache.activemq.artemis.core.server - AMQ222165: No Dead Letter Address configured for queue jobs in AddressSettings
2021-05-04 10:57:27,154 [thread=main] [=, org=, csid=] WARN  org.apache.activemq.artemis.core.server - AMQ222166: No Expiry Address configured for queue jobs in AddressSettings
[root@scotty candlepin]#

2.) candlepin.log

[root@scotty candlepin]# cat candlepin.log
2021-05-04 10:56:57,117 [thread=main] [=, org=, csid=] INFO  org.candlepin.guice.CandlepinContextListener - Candlepin initializing context.
2021-05-04 10:56:57,151 [thread=main] [=, org=, csid=] INFO  org.candlepin.pki.impl.JSSProviderLoader - Using JSS version 4.7.3
2021-05-04 10:56:57,402 [thread=main] [=, org=, csid=] INFO  org.candlepin.guice.CandlepinContextListener - Candlepin reading configuration.
2021-05-04 10:56:57,413 [thread=main] [=, org=, csid=] INFO  org.candlepin.common.config.EncryptedConfiguration - No secret file provided.
2021-05-04 10:56:57,469 [thread=main] [=, org=, csid=] INFO  org.candlepin.guice.CandlepinContextListener - Running under postgresql
2021-05-04 10:56:57,504 [thread=main] [=, org=, csid=] INFO  org.candlepin.guice.CandlepinContextListener - Candlepin will show support for the following capabilities: [instance_multiplier, derived_product, vcpu, cert_v3, hypervisors_heartbeat, remove_by_pool_id, syspurpose, insights_auto_register, storage_band, cores, hypervisors_async, org_level_content_access, guest_limit, ram, batch_bind]
2021-05-04 10:56:58,322 [thread=main] [=, org=, csid=] INFO  org.candlepin.guice.CustomizableModules - Found custom module module.config.adapter_module
2021-05-04 10:56:59,231 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: ActiveEntitlementJob: org.candlepin.async.tasks.ActiveEntitlementJob
2021-05-04 10:56:59,233 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: CRLUpdateJob: org.candlepin.async.tasks.CRLUpdateJob
2021-05-04 10:56:59,234 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: EntitlerJob: org.candlepin.async.tasks.EntitlerJob
2021-05-04 10:56:59,236 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: EntitleByProductsJob: org.candlepin.async.tasks.EntitleByProductsJob
2021-05-04 10:56:59,237 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: ExpiredPoolsCleanupJob: org.candlepin.async.tasks.ExpiredPoolsCleanupJob
2021-05-04 10:56:59,239 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: ExportJob: org.candlepin.async.tasks.ExportJob
2021-05-04 10:56:59,240 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: HealEntireOrgJob: org.candlepin.async.tasks.HealEntireOrgJob
2021-05-04 10:56:59,242 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: HypervisorHeartbeatUpdateJob: org.candlepin.async.tasks.HypervisorHeartbeatUpdateJob
2021-05-04 10:56:59,243 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: HypervisorUpdateJob: org.candlepin.async.tasks.HypervisorUpdateJob
2021-05-04 10:56:59,245 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: ImportJob: org.candlepin.async.tasks.ImportJob
2021-05-04 10:56:59,246 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: ImportRecordCleanerJob: org.candlepin.async.tasks.ImportRecordCleanerJob
2021-05-04 10:56:59,248 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: JobCleaner: org.candlepin.async.tasks.JobCleaner
2021-05-04 10:56:59,249 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: ManifestCleanerJob: org.candlepin.async.tasks.ManifestCleanerJob
2021-05-04 10:56:59,251 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: OrphanCleanupJob: org.candlepin.async.tasks.OrphanCleanupJob
2021-05-04 10:56:59,252 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: RefreshPoolsForProductJob: org.candlepin.async.tasks.RefreshPoolsForProductJob
2021-05-04 10:56:59,254 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: RefreshPoolsJob: org.candlepin.async.tasks.RefreshPoolsJob
2021-05-04 10:56:59,255 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: RegenEnvEntitlementCertsJob: org.candlepin.async.tasks.RegenEnvEntitlementCertsJob
2021-05-04 10:56:59,257 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: RegenProductEntitlementCertsJob: org.candlepin.async.tasks.RegenProductEntitlementCertsJob
2021-05-04 10:56:59,259 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: UndoImportsJob: org.candlepin.async.tasks.UndoImportsJob
2021-05-04 10:56:59,260 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: UnmappedGuestEntitlementCleanerJob: org.candlepin.async.tasks.UnmappedGuestEntitlementCleanerJob
2021-05-04 10:57:16,125 [thread=main] [=, org=, csid=] WARN  org.hibernate.id.UUIDHexGenerator - HHH000409: Using org.hibernate.id.UUIDHexGenerator which does not generate IETF RFC 4122 compliant UUID values; consider using org.hibernate.id.UUIDGenerator instead
2021-05-04 10:57:16,321 [thread=main] [=, org=, csid=] WARN  org.hibernate.mapping.RootClass - HHH000038: Composite-id class does not override equals(): org.candlepin.model.PoolAttribute
2021-05-04 10:57:16,322 [thread=main] [=, org=, csid=] WARN  org.hibernate.mapping.RootClass - HHH000039: Composite-id class does not override hashCode(): org.candlepin.model.PoolAttribute
2021-05-04 10:57:21,400 [thread=main] [=, org=, csid=] WARN  org.hibernate.metamodel.internal.MetadataContext - HHH015011: Unable to locate static metamodel field : org.candlepin.model.KeyPair_#publicKey; this may or maynot indicate a problem with the static metamodel
2021-05-04 10:57:21,402 [thread=main] [=, org=, csid=] WARN  org.hibernate.metamodel.internal.MetadataContext - HHH015011: Unable to locate static metamodel field : org.candlepin.model.KeyPair_#privateKey; this may or may not indicate a problem with the static metamodel
2021-05-04 10:57:22,466 [thread=main] [=, org=, csid=] INFO  org.candlepin.policy.js.JsRunnerProvider - Recompiling rules with timestamp: 2021-05-04 10:57:22.35
2021-05-04 10:57:24,913 [thread=main] [=, org=, csid=] INFO  org.candlepin.messaging.impl.artemis.ArtemisContextListener - Initializing embedded Artemis server...
2021-05-04 10:57:24,923 [thread=main] [=, org=, csid=] INFO  org.candlepin.messaging.impl.artemis.ArtemisContextListener - Loading Artemis config file: /etc/candlepin/broker.xml
2021-05-04 10:57:27,025 [thread=main] [=, org=, csid=] WARN  org.apache.activemq.artemis.core.server - AMQ222165: No Dead Letter Address configured for queue event.org.candlepin.audit.LoggingListener in AddressSettings
2021-05-04 10:57:27,026 [thread=main] [=, org=, csid=] WARN  org.apache.activemq.artemis.core.server - AMQ222166: No Expiry Address configured for queue event.org.candlepin.audit.LoggingListener in AddressSettings
2021-05-04 10:57:27,108 [thread=main] [=, org=, csid=] WARN  org.apache.activemq.artemis.core.server - AMQ222165: No Dead Letter Address configured for queue event.org.candlepin.audit.ActivationListener in AddressSettings
2021-05-04 10:57:27,109 [thread=main] [=, org=, csid=] WARN  org.apache.activemq.artemis.core.server - AMQ222166: No Expiry Address configured for queue event.org.candlepin.audit.ActivationListener in AddressSettings
2021-05-04 10:57:27,135 [thread=main] [=, org=, csid=] WARN  org.apache.activemq.artemis.core.server - AMQ222165: No Dead Letter Address configured for queue katello_candlepin_event_monitor.candlepin_events in AddressSettings
2021-05-04 10:57:27,137 [thread=main] [=, org=, csid=] WARN  org.apache.activemq.artemis.core.server - AMQ222166: No Expiry Address configured for queue katello_candlepin_event_monitor.candlepin_events in AddressSettings
2021-05-04 10:57:27,154 [thread=main] [=, org=, csid=] WARN  org.apache.activemq.artemis.core.server - AMQ222165: No Dead Letter Address configured for queue jobs in AddressSettings
2021-05-04 10:57:27,154 [thread=main] [=, org=, csid=] WARN  org.apache.activemq.artemis.core.server - AMQ222166: No Expiry Address configured for queue jobs in AddressSettings
2021-05-04 10:57:27,805 [thread=main] [=, org=, csid=] INFO  org.candlepin.messaging.impl.artemis.ArtemisContextListener - Embedded Artemis server started successfully
2021-05-04 10:57:27,806 [thread=main] [=, org=, csid=] INFO  org.candlepin.messaging.impl.artemis.ArtemisSessionFactory - Connecting to Artemis server at vm://0
2021-05-04 10:57:27,874 [thread=main] [=, org=, csid=] INFO  org.candlepin.messaging.impl.artemis.ArtemisSessionFactory - Artemis session factory initialized
2021-05-04 10:57:28,130 [thread=main] [=, org=, csid=] INFO  org.candlepin.controller.ActiveMQStatusMonitor - Connection to ActiveMQ is available.
2021-05-04 10:57:28,133 [thread=main] [=, org=, csid=] INFO  org.candlepin.audit.ArtemisMessageSource - ActiveMQ status has been updated: UNKNOWN:CONNECTED
2021-05-04 10:57:28,134 [thread=main] [=, org=, csid=] INFO  org.candlepin.audit.ArtemisMessageSource - Connecting to message broker and initializing all message listeners.
2021-05-04 10:57:28,416 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Initializing job manager
2021-05-04 10:57:28,911 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Scheduled job "OrphanCleanupJob" with cron schedule: 0 0 3 ? * 1
2021-05-04 10:57:28,927 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Scheduled job "ActiveEntitlementJob" with cron schedule: 0 0 0/1 * * ?
2021-05-04 10:57:28,944 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Scheduled job "CRLUpdateJob" with cron schedule: 0 0 12 * * ?
2021-05-04 10:57:28,972 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Scheduled job "ExpiredPoolsCleanupJob" with cron schedule: 0 0 0/1 * * ?
2021-05-04 10:57:29,004 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Scheduled job "ImportRecordCleanerJob" with cron schedule: 0 0 12 * * ?
2021-05-04 10:57:29,022 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Scheduled job "JobCleaner" with cron schedule: 0 0 12 * * ?
2021-05-04 10:57:29,039 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Scheduled job "ManifestCleanerJob" with cron schedule: 0 0 12 * * ?
2021-05-04 10:57:29,055 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Scheduled job "UnmappedGuestEntitlementCleanerJob" with cron schedule: 0 0 3/12 * * ?
2021-05-04 10:57:29,060 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobMessageReceiver - Creating 10 job receiver threads with filter: null
2021-05-04 10:57:29,369 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Job manager initialization complete
2021-05-04 10:57:29,369 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Job manager started
[root@scotty candlepin]#

So, there are “only” warnings and I cannot see any blocker.

The Tomcat log seems suspicious, I think:

May 04 10:56:40 scotty.home.petersen20.de systemd[1]: Started Apache Tomcat Web Application Container.
May 04 10:56:40 scotty.home.petersen20.de server[638582]: Java virtual machine used: /usr/lib/jvm/jre-11/bin/java
May 04 10:56:40 scotty.home.petersen20.de server[638582]: classpath used: /usr/share/tomcat/bin/bootstrap.jar:/usr/share/tomcat/bin/tomcat-juli.jar:/usr/share/java/ant.jar:/usr/share/java/ant-launcher.jar:/usr/lib/jvm/java/lib/tools.jar
May 04 10:56:40 scotty.home.petersen20.de server[638582]: main class used: org.apache.catalina.startup.Bootstrap
May 04 10:56:40 scotty.home.petersen20.de server[638582]: flags used: -Xms1024m -Xmx4096m -Djava.security.auth.login.config=/usr/share/tomcat/conf/login.config
May 04 10:56:40 scotty.home.petersen20.de server[638582]: options used: -Dcatalina.base=/usr/share/tomcat -Dcatalina.home=/usr/share/tomcat -Djava.endorsed.dirs= -Djava.io.tmpdir=/var/cache/tomcat/temp -Djava.util.logging.config.file=/usr/share/tomcat/conf/logging.properties -Djava.util.logging.manager=org.apache.juli.ClassLoaderLogManager
May 04 10:56:40 scotty.home.petersen20.de server[638582]: arguments used: start
May 04 10:56:41 scotty.home.petersen20.de server[638582]: 04-May-2021 10:56:41.767 WARNING [main] org.apache.catalina.startup.SetAllPropertiesRule.begin [SetAllPropertiesRule]{Server/Service/Connector} Setting property 'sslProtocols' to 'TLSv1.2' did not find a matching property.
May 04 10:56:41 scotty.home.petersen20.de server[638582]: 04-May-2021 10:56:41.980 WARNING [main] org.apache.tomcat.util.digester.SetPropertiesRule.begin Match [Server/Service/Engine/Host] failed to set property [xmlValidation] to [false]
May 04 10:56:41 scotty.home.petersen20.de server[638582]: 04-May-2021 10:56:41.981 WARNING [main] org.apache.tomcat.util.digester.SetPropertiesRule.begin Match [Server/Service/Engine/Host] failed to set property [xmlNamespaceAware] to [false]
May 04 10:56:41 scotty.home.petersen20.de server[638582]: 04-May-2021 10:56:41.995 INFO [main] org.apache.catalina.core.AprLifecycleListener.lifecycleEvent The APR based Apache Tomcat Native library which allows optimal performance in production environments was not found on the java.library.path: [/usr/java/packages/lib:/usr/lib64:/lib64:/lib:/usr/lib]
May 04 10:56:43 scotty.home.petersen20.de server[638582]: 04-May-2021 10:56:43.194 INFO [main] org.apache.coyote.AbstractProtocol.init Initializing ProtocolHandler ["https-jsse-nio-127.0.0.1-23443"]
May 04 10:56:44 scotty.home.petersen20.de server[638582]: 04-May-2021 10:56:44.562 INFO [main] org.apache.catalina.startup.Catalina.load Server initialization in [3,439] milliseconds
May 04 10:56:44 scotty.home.petersen20.de server[638582]: 04-May-2021 10:56:44.774 INFO [main] org.apache.catalina.core.StandardService.startInternal Starting service [Catalina]
May 04 10:56:44 scotty.home.petersen20.de server[638582]: 04-May-2021 10:56:44.775 INFO [main] org.apache.catalina.core.StandardEngine.startInternal Starting Servlet engine: [Apache Tomcat/9.0.30]
May 04 10:56:44 scotty.home.petersen20.de server[638582]: 04-May-2021 10:56:44.797 INFO [main] org.apache.catalina.startup.HostConfig.deployDirectory Deploying web application directory [/var/lib/tomcat/webapps/candlepin]
May 04 10:56:57 scotty.home.petersen20.de server[638582]: 04-May-2021 10:56:57.060 INFO [main] org.apache.jasper.servlet.TldScanner.scanJars At least one JAR was scanned for TLDs yet contained no TLDs. Enable debug logging for this logger for a complete list of JARs that were scanned but no TLDs were found in them. Skipping unneeded JARs during scanning can improve startup time and JSP compilation time.
May 04 10:56:57 scotty.home.petersen20.de server[638582]: WARNING: An illegal reflective access operation has occurred
May 04 10:56:57 scotty.home.petersen20.de server[638582]: WARNING: Illegal reflective access by org.candlepin.pki.impl.JSSProviderLoader (file:/var/lib/tomcat/webapps/candlepin/WEB-INF/classes/) to field java.lang.ClassLoader.usr_paths
May 04 10:56:57 scotty.home.petersen20.de server[638582]: WARNING: Please consider reporting this to the maintainers of org.candlepin.pki.impl.JSSProviderLoader
May 04 10:56:57 scotty.home.petersen20.de server[638582]: WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
May 04 10:56:57 scotty.home.petersen20.de server[638582]: WARNING: All illegal access operations will be denied in a future release
May 04 10:57:06 scotty.home.petersen20.de server[638582]: 04-May-2021 10:57:06.235 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public org.candlepin.model.Persisted org.candlepin.model.OwnerCurator.create(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@55571f5e]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
May 04 10:57:06 scotty.home.petersen20.de server[638582]: 04-May-2021 10:57:06.488 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public org.candlepin.model.Persisted org.candlepin.model.ProductCurator.merge(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@55571f5e]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
May 04 10:57:06 scotty.home.petersen20.de server[638582]: 04-May-2021 10:57:06.489 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public void org.candlepin.model.ProductCurator.delete(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@55571f5e]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
May 04 10:57:06 scotty.home.petersen20.de server[638582]: 04-May-2021 10:57:06.490 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public org.candlepin.model.Persisted org.candlepin.model.ProductCurator.create(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@55571f5e]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
May 04 10:57:06 scotty.home.petersen20.de server[638582]: 04-May-2021 10:57:06.698 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public void org.candlepin.model.EntitlementCurator.delete(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@55571f5e]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
May 04 10:57:06 scotty.home.petersen20.de server[638582]: 04-May-2021 10:57:06.766 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public void org.candlepin.model.ConsumerCurator.delete(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@55571f5e]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
May 04 10:57:06 scotty.home.petersen20.de server[638582]: 04-May-2021 10:57:06.768 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public org.candlepin.model.Persisted org.candlepin.model.ConsumerCurator.create(org.candlepin.model.Persisted,boolean)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@55571f5e]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
May 04 10:57:07 scotty.home.petersen20.de server[638582]: 04-May-2021 10:57:07.107 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public void org.candlepin.model.CdnCurator.delete(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@55571f5e]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
May 04 10:57:07 scotty.home.petersen20.de server[638582]: 04-May-2021 10:57:07.175 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public void org.candlepin.model.PoolCurator.delete(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@55571f5e]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
May 04 10:57:07 scotty.home.petersen20.de server[638582]: 04-May-2021 10:57:07.558 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public void org.candlepin.model.RulesCurator.delete(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@55571f5e]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
May 04 10:57:07 scotty.home.petersen20.de server[638582]: 04-May-2021 10:57:07.559 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public org.candlepin.model.Persisted org.candlepin.model.RulesCurator.create(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@55571f5e]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
May 04 10:57:07 scotty.home.petersen20.de server[638582]: 04-May-2021 10:57:07.624 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public void org.candlepin.model.ContentCurator.delete(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@55571f5e]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
May 04 10:57:08 scotty.home.petersen20.de server[638582]: 04-May-2021 10:57:08.110 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public void org.candlepin.model.EntitlementCertificateCurator.delete(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@55571f5e]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
May 04 10:57:29 scotty.home.petersen20.de server[638582]: 04-May-2021 10:57:29.901 SEVERE [main] org.apache.catalina.core.StandardContext.startInternal One or more listeners failed to start. Full details will be found in the appropriate container log file
May 04 10:57:29 scotty.home.petersen20.de server[638582]: 04-May-2021 10:57:29.911 SEVERE [main] org.apache.catalina.core.StandardContext.startInternal Context [/candlepin] startup failed due to previous errors
May 04 10:57:30 scotty.home.petersen20.de server[638582]: 04-May-2021 10:57:30.123 WARNING [main] org.apache.catalina.loader.WebappClassLoaderBase.clearReferencesJdbc The web application [candlepin] registered the JDBC driver [org.postgresql.Driver] but failed to unregister it when the web application was stopped. To prevent a memory leak, the JDBC Driver has been forcibly unregistered.
May 04 10:57:30 scotty.home.petersen20.de server[638582]: 04-May-2021 10:57:30.127 WARNING [main] org.apache.catalina.loader.WebappClassLoaderBase.clearReferencesThreads The web application [candlepin] appears to have started a thread named [C3P0PooledConnectionPoolManager[identityToken->1hgf027ahf5c9ch1bx3l4o|4bc1bcb8]-AdminTaskTimer] but has failed to stop it. This is very likely to create a memory leak. Stack trace of thread:
May 04 10:57:30 scotty.home.petersen20.de server[638582]:  java.base@11.0.11/java.lang.Object.wait(Native Method)
May 04 10:57:30 scotty.home.petersen20.de server[638582]:  java.base@11.0.11/java.util.TimerThread.mainLoop(Timer.java:553)
May 04 10:57:30 scotty.home.petersen20.de server[638582]:  java.base@11.0.11/java.util.TimerThread.run(Timer.java:506)
May 04 10:57:30 scotty.home.petersen20.de server[638582]: 04-May-2021 10:57:30.130 WARNING [main] org.apache.catalina.loader.WebappClassLoaderBase.clearReferencesThreads The web application [candlepin] appears to have started a thread named [C3P0PooledConnectionPoolManager[identityToken->1hgf027ahf5c9ch1bx3l4o|4bc1bcb8]-HelperThread-#0] but has failed to stop it. This is very likely to create a memory leak. Stack trace of thread:

Unfortunately I cannot interpret these messages ;-(
Perhaps someone can give a hint?
Best regards

Does starting tomcat manually result in an error?
Do you see any additional errors in any of the tomcat logs?
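
If in doubt, restarting Tomcat while following its journal usually shows where startup stops (a quick sketch, assuming the standard tomcat systemd unit):

systemctl restart tomcat
journalctl -u tomcat -f                  # follow the startup messages live
ls -l /var/log/tomcat/                   # the container log files (catalina, localhost) live here
less /var/log/tomcat/localhost.*.log     # the "full details" from a SEVERE listener failure end up here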

Starting tomcat manually on the CLI gave no error:

[root@scotty tomcat]# tomcat start
[root@scotty tomcat]# pwd
/var/log/tomcat
[root@scotty tomcat]#

A manual start leads to the same error entries in the log (see above).
In localhost.log there is this entry:

04-May-2021 14:13:00.284 SEVERE [main] org.apache.catalina.core.StandardContext.listenerStart Exception sending context initialized event to listener instance of class [org.candlepin.guice.CandlepinContextListener]
java.lang.NoSuchFieldError: SERVER_SENT_EVENTS_TYPE
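
As an aside: if I read that right, the missing field (SERVER_SENT_EVENTS_TYPE) comes from javax.ws.rs.core.MediaType, so it might be worth checking which jars on the Candlepin classpath provide the JAX-RS API. A rough sketch (the webapp path is taken from the Tomcat log above):

ls /var/lib/tomcat/webapps/candlepin/WEB-INF/lib/ | grep -iE 'resteasy|ws.rs|jaxrs'   # jars bundled with the webapp
ls /usr/share/tomcat/lib/ /usr/share/java/ | grep -iE 'resteasy|ws.rs|jaxrs'          # jars on the shared classpath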

After Tomcat is started, this message appears repeatedly every minute:

May 04 14:47:05 scotty.home.petersen20.de server[671974]: Exception in thread "Thread-57 (ActiveMQ-server-org.apache.activemq.artemis.core.server.impl.ActiveMQServerImpl$6@2a505beb)" java.lang.NoClassDefFoundError: ch/qos/logback/classic/spi/ThrowableProxy
May 04 14:47:05 scotty.home.petersen20.de server[671974]:         at ch.qos.logback.classic.spi.LoggingEvent.<init>(LoggingEvent.java:119)
May 04 14:47:05 scotty.home.petersen20.de server[671974]:         at ch.qos.logback.classic.Logger.buildLoggingEventAndAppend(Logger.java:419)
May 04 14:47:05 scotty.home.petersen20.de server[671974]:         at ch.qos.logback.classic.Logger.filterAndLog_0_Or3Plus(Logger.java:383)
May 04 14:47:05 scotty.home.petersen20.de server[671974]:         at ch.qos.logback.classic.Logger.log(Logger.java:765)
May 04 14:47:05 scotty.home.petersen20.de server[671974]:         at jdk.internal.reflect.GeneratedMethodAccessor52.invoke(Unknown Source)
May 04 14:47:05 scotty.home.petersen20.de server[671974]:         at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
May 04 14:47:05 scotty.home.petersen20.de server[671974]:         at java.base/java.lang.reflect.Method.invoke(Method.java:566)
May 04 14:47:05 scotty.home.petersen20.de server[671974]:         at org.jboss.logging.Slf4jLocationAwareLogger.doLog(Slf4jLocationAwareLogger.java:89)
May 04 14:47:05 scotty.home.petersen20.de server[671974]:         at org.jboss.logging.Slf4jLocationAwareLogger.doLog(Slf4jLocationAwareLogger.java:75)
May 04 14:47:05 scotty.home.petersen20.de server[671974]:         at org.jboss.logging.Logger.warn(Logger.java:1236)
May 04 14:47:05 scotty.home.petersen20.de server[671974]:         at org.apache.activemq.artemis.utils.actors.OrderedExecutor.doTask(OrderedExecutor.java:47)
May 04 14:47:05 scotty.home.petersen20.de server[671974]:         at org.apache.activemq.artemis.utils.actors.OrderedExecutor.doTask(OrderedExecutor.java:31)
May 04 14:47:05 scotty.home.petersen20.de server[671974]:         at org.apache.activemq.artemis.utils.actors.ProcessorBase.executePendingTasks(ProcessorBase.java:65)
May 04 14:47:05 scotty.home.petersen20.de server[671974]:         at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
May 04 14:47:05 scotty.home.petersen20.de server[671974]:         at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
May 04 14:47:05 scotty.home.petersen20.de server[671974]:         at org.apache.activemq.artemis.utils.ActiveMQThreadFactory$1.run(ActiveMQThreadFactory.java:118)

This makes me wonder if it installed correctly. Can you share the output of:

rpm -qv tomcatjss
rpm -qV tomcatjss
rpm -qv candlepin
rpm -qV candlepin
[mpetersen@scotty ~]$ rpm -qv tomcatjss
tomcatjss-7.5.0-1.module+el8.3.0+7355+c59bcbd9.noarch
[mpetersen@scotty ~]$ rpm -qV tomcatjss
[mpetersen@scotty ~]$ rpm -qv candlepin
candlepin-3.2.11-1.el8.noarch
[mpetersen@scotty ~]$ rpm -qV candlepin
S.?....T.  c /etc/candlepin/candlepin.conf
.M...UG..  g /etc/candlepin/certs/candlepin-ca.crt
[mpetersen@scotty ~]$

In the installation docs it is mentioned for CentOS 8 that Ruby 2.7 must be enabled. Is this also true for RHEL 8?

In nightly it is, but since you're on 4.0 it shouldn't be needed. Also, Candlepin is Java-based and doesn't use Ruby, so that shouldn't affect it.
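
If you want to double-check, the enabled module streams can be listed like this (just a sanity check; 4.0 should not need it):

dnf module list --enabled    # module streams currently enabled on the system
dnf module list ruby         # available Ruby streams, for comparison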

However, my knowledge of Java is lacking here. You may want to verify your JRE is correct. It should be using version 11.
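
A quick way to confirm which JVM is actually in use (a sketch):

java -version                                                  # default JVM on the PATH
alternatives --display java | head -n 3                        # what the alternatives system points at
journalctl -u tomcat | grep -m1 'Java virtual machine used'    # the JVM Tomcat logs at startup (see above)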

My Java version is 11 and Tomcat is 9.0.30.

I started from scratch: wiped the disk, did a fresh installation of the OS (RHEL 8) and started the Katello installation process as described above. The result is exactly the same as before: Candlepin will not start.
Are there any findings on whether this combination (RHEL 8 & Katello 4.0) generally works? Would it be worthwhile to switch from RHEL 8 to AlmaLinux or Rocky Linux?

@ekohl Should we check with someone on the Candlepin team about the failure @martux69 posted?

We do not test against RHEL 8 (and we currently do not have instructions for it), so you are out in uncharted waters. What we do test against is CentOS 8 and, more recently with the Katello 4.1 RCs, CentOS 8 Stream. I would recommend using something we test against to help ensure compatibility.

The problem could be caused by a discrepancy between something that is in RHEL but not yet in CentOS, as there is a lag between them.

I will try to run through a 4.0 installation locally on RHEL 8, see if I run into the same issue, and report back any findings.

Some more information about my setup:

  • My setup is only intended for development/evaluation/training purposes, so it is very small:
    4 CPUs and 8 GB memory
  • Running only Foreman on this system worked fine for months
  • For comparison I did an installation in a virtual system (KVM) with the same specs as my physical system. There it works!
  • The only obvious difference I can see so far is the disk layout: in the VM it is all one partition, on the physical system it looks like this
[root@scotty ~]# df -h
Filesystem                   Size  Used Avail Use% Mounted on
devtmpfs                     3.8G     0  3.8G   0% /dev
tmpfs                        3.8G   28K  3.8G   1% /dev/shm
tmpfs                        3.8G  8.9M  3.8G   1% /run
tmpfs                        3.8G     0  3.8G   0% /sys/fs/cgroup
/dev/mapper/vgsys-lvroot      16G  3.1G   12G  21% /
/dev/mapper/vgsys-lvpgsql    7.9G  138M  7.3G   2% /var/lib/pgsql
/dev/mapper/vgsys-lvforeman  976M  385M  525M  43% /var/lib/foreman
/dev/mapper/vgsys-lvhome     976M  2.6M  907M   1% /home
/dev/mapper/vgsys-lvlog      976M   55M  855M   7% /var/log
/dev/mapper/vgsys-lvtmp      2.0G   13M  1.8G   1% /tmp
/dev/sdb2                    976M  221M  689M  25% /boot
/dev/mapper/vgsys-lvopt      976M  236M  674M  26% /opt
/dev/sdb1                    599M  6.9M  592M   2% /boot/efi
/dev/mapper/vgdata-lvtftp    976M  4.4M  905M   1% /var/lib/tftpboot
/dev/mapper/vgdata-lvwiki    976M  2.6M  907M   1% /var/www/dokuwiki
/dev/mapper/vgdata-lvpuppet  976M  2.6M  907M   1% /etc/puppetlabs/code
/dev/mapper/vgdata-lvpulp    9.8G   40M  9.3G   1% /var/lib/pulp
/dev/mapper/vgdata-lvqpid    976M   15M  894M   2% /var/lib/qpidd
tmpfs                        767M     0  767M   0% /run/user/0
[root@scotty ~]#
  • Differing from the instructions at What is the state of Katello support on EL8 - #8 by ehelms: for RHEL it is not powertools but codeready-builder
  • I saw some hints in the logs regarding SELinux, so I tried setting SELinux to permissive (with no luck; see the command sketch after this list)
  • No SELinux profiles are enabled
  • Cockpit is disabled on the system because of the port conflict with the smart proxy (9090)
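
For anyone verifying the same two points, the SELinux state and the 9090 port situation can be checked roughly like this (a sketch, not an exact transcript):

getenforce                                       # current SELinux mode
ausearch -m avc -ts recent 2>/dev/null | tail    # recent SELinux denials, if any
ss -tlnp | grep ':9090'                          # what is listening on the smart proxy / Cockpit port
systemctl is-enabled cockpit.socket              # confirm Cockpit is really disabled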

The current Candlepin logs are:
error.log

[root@scotty candlepin]# cat error.log 
2021-05-11 15:20:05,007 [thread=main] [=, org=, csid=] WARN  org.hibernate.id.UUIDHexGenerator - HHH000409: Using org.hibernate.id.UUIDHexGenerator which does not generate IETF RFC 4122 compliant UUID values; consider using org.hibernate.id.UUIDGenerator instead
2021-05-11 15:20:05,179 [thread=main] [=, org=, csid=] WARN  org.hibernate.mapping.RootClass - HHH000038: Composite-id class does not override equals(): org.candlepin.model.PoolAttribute
2021-05-11 15:20:05,180 [thread=main] [=, org=, csid=] WARN  org.hibernate.mapping.RootClass - HHH000039: Composite-id class does not override hashCode(): org.candlepin.model.PoolAttribute
2021-05-11 15:20:09,809 [thread=main] [=, org=, csid=] WARN  org.hibernate.metamodel.internal.MetadataContext - HHH015011: Unable to locate static metamodel field : org.candlepin.model.KeyPair_#privateKey; this may or may not indicate a problem with the static metamodel
2021-05-11 15:20:09,810 [thread=main] [=, org=, csid=] WARN  org.hibernate.metamodel.internal.MetadataContext - HHH015011: Unable to locate static metamodel field : org.candlepin.model.KeyPair_#publicKey; this may or may not indicate a problem with the static metamodel
2021-05-11 16:33:04,020 [thread=main] [=, org=, csid=] WARN  org.hibernate.id.UUIDHexGenerator - HHH000409: Using org.hibernate.id.UUIDHexGenerator which does not generate IETF RFC 4122 compliant UUID values; consider using org.hibernate.id.UUIDGenerator instead
2021-05-11 16:33:04,274 [thread=main] [=, org=, csid=] WARN  org.hibernate.mapping.RootClass - HHH000038: Composite-id class does not override equals(): org.candlepin.model.PoolAttribute
2021-05-11 16:33:04,275 [thread=main] [=, org=, csid=] WARN  org.hibernate.mapping.RootClass - HHH000039: Composite-id class does not override hashCode(): org.candlepin.model.PoolAttribute
2021-05-11 16:33:11,637 [thread=main] [=, org=, csid=] WARN  org.hibernate.metamodel.internal.MetadataContext - HHH015011: Unable to locate static metamodel field : org.candlepin.model.KeyPair_#privateKey; this may or may not indicate a problem with the static metamodel
2021-05-11 16:33:11,637 [thread=main] [=, org=, csid=] WARN  org.hibernate.metamodel.internal.MetadataContext - HHH015011: Unable to locate static metamodel field : org.candlepin.model.KeyPair_#publicKey; this may or may not indicate a problem with the static metamodel
[root@scotty candlepin]#

candlepin.log:

[root@scotty candlepin]# cat candlepin.log 
2021-05-11 15:19:47,470 [thread=main] [=, org=, csid=] INFO  org.candlepin.guice.CandlepinContextListener - Candlepin initializing context.
2021-05-11 15:19:47,491 [thread=main] [=, org=, csid=] INFO  org.candlepin.pki.impl.JSSProviderLoader - Using JSS version 4.7.3
2021-05-11 15:19:47,752 [thread=main] [=, org=, csid=] INFO  org.candlepin.guice.CandlepinContextListener - Candlepin reading configuration.
2021-05-11 15:19:47,765 [thread=main] [=, org=, csid=] INFO  org.candlepin.common.config.EncryptedConfiguration - No secret file provided.
2021-05-11 15:19:47,822 [thread=main] [=, org=, csid=] INFO  org.candlepin.guice.CandlepinContextListener - Running under postgresql
2021-05-11 15:19:47,859 [thread=main] [=, org=, csid=] INFO  org.candlepin.guice.CandlepinContextListener - Candlepin will show support for the following capabilities: [instance_multiplier, derived_product, vcpu, cert_v3, hypervisors_heartbeat, remove_by_pool_id, syspurpose, insights_auto_register, storage_band, cores, hypervisors_async, org_level_content_access, guest_limit, ram, batch_bind]
2021-05-11 15:19:47,860 [thread=main] [=, org=, csid=] DEBUG org.candlepin.guice.CandlepinContextListener - Candlepin stored config on context.
2021-05-11 15:19:48,685 [thread=main] [=, org=, csid=] INFO  org.candlepin.guice.CustomizableModules - Found custom module module.config.adapter_module
2021-05-11 15:19:49,502 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: ActiveEntitlementJob: org.candlepin.async.tasks.ActiveEntitlementJob
2021-05-11 15:19:49,503 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: CRLUpdateJob: org.candlepin.async.tasks.CRLUpdateJob
2021-05-11 15:19:49,505 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: EntitlerJob: org.candlepin.async.tasks.EntitlerJob
2021-05-11 15:19:49,506 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: EntitleByProductsJob: org.candlepin.async.tasks.EntitleByProductsJob
2021-05-11 15:19:49,508 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: ExpiredPoolsCleanupJob: org.candlepin.async.tasks.ExpiredPoolsCleanupJob
2021-05-11 15:19:49,509 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: ExportJob: org.candlepin.async.tasks.ExportJob
2021-05-11 15:19:49,511 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: HealEntireOrgJob: org.candlepin.async.tasks.HealEntireOrgJob
2021-05-11 15:19:49,512 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: HypervisorHeartbeatUpdateJob: org.candlepin.async.tasks.HypervisorHeartbeatUpdateJob
2021-05-11 15:19:49,513 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: HypervisorUpdateJob: org.candlepin.async.tasks.HypervisorUpdateJob
2021-05-11 15:19:49,515 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: ImportJob: org.candlepin.async.tasks.ImportJob
2021-05-11 15:19:49,516 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: ImportRecordCleanerJob: org.candlepin.async.tasks.ImportRecordCleanerJob
2021-05-11 15:19:49,518 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: JobCleaner: org.candlepin.async.tasks.JobCleaner
2021-05-11 15:19:49,519 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: ManifestCleanerJob: org.candlepin.async.tasks.ManifestCleanerJob
2021-05-11 15:19:49,521 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: OrphanCleanupJob: org.candlepin.async.tasks.OrphanCleanupJob
2021-05-11 15:19:49,522 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: RefreshPoolsForProductJob: org.candlepin.async.tasks.RefreshPoolsForProductJob
2021-05-11 15:19:49,523 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: RefreshPoolsJob: org.candlepin.async.tasks.RefreshPoolsJob
2021-05-11 15:19:49,525 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: RegenEnvEntitlementCertsJob: org.candlepin.async.tasks.RegenEnvEntitlementCertsJob
2021-05-11 15:19:49,526 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: RegenProductEntitlementCertsJob: org.candlepin.async.tasks.RegenProductEntitlementCertsJob
2021-05-11 15:19:49,528 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: UndoImportsJob: org.candlepin.async.tasks.UndoImportsJob
2021-05-11 15:19:49,529 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: UnmappedGuestEntitlementCleanerJob: org.candlepin.async.tasks.UnmappedGuestEntitlementCleanerJob
2021-05-11 15:19:58,584 [thread=main] [=, org=, csid=] DEBUG org.candlepin.audit.ArtemisMessageSourceReceiverFactory - Registering event listener for queue: event.org.candlepin.audit.LoggingListener
2021-05-11 15:19:58,598 [thread=main] [=, org=, csid=] DEBUG org.candlepin.guice.I18nProvider - Getting i18n engine for locale en_US
2021-05-11 15:19:58,640 [thread=main] [=, org=, csid=] DEBUG org.candlepin.audit.ArtemisMessageSourceReceiverFactory - Registering event listener for queue: event.org.candlepin.audit.ActivationListener
2021-05-11 15:20:05,007 [thread=main] [=, org=, csid=] WARN  org.hibernate.id.UUIDHexGenerator - HHH000409: Using org.hibernate.id.UUIDHexGenerator which does not generate IETF RFC 4122 compliant UUID values; consider using org.hibernate.id.UUIDGenerator instead
2021-05-11 15:20:05,179 [thread=main] [=, org=, csid=] WARN  org.hibernate.mapping.RootClass - HHH000038: Composite-id class does not override equals(): org.candlepin.model.PoolAttribute
2021-05-11 15:20:05,180 [thread=main] [=, org=, csid=] WARN  org.hibernate.mapping.RootClass - HHH000039: Composite-id class does not override hashCode(): org.candlepin.model.PoolAttribute
2021-05-11 15:20:09,809 [thread=main] [=, org=, csid=] WARN  org.hibernate.metamodel.internal.MetadataContext - HHH015011: Unable to locate static metamodel field : org.candlepin.model.KeyPair_#privateKey; this may or may not indicate a problem with the static metamodel
2021-05-11 15:20:09,810 [thread=main] [=, org=, csid=] WARN  org.hibernate.metamodel.internal.MetadataContext - HHH015011: Unable to locate static metamodel field : org.candlepin.model.KeyPair_#publicKey; this may or may not indicate a problem with the static metamodel
2021-05-11 15:20:10,017 [thread=main] [=, org=, csid=] DEBUG org.candlepin.policy.js.JsRunnerProvider - Compiling rules for initial load
2021-05-11 16:32:29,725 [thread=main] [=, org=, csid=] INFO  org.candlepin.guice.CandlepinContextListener - Candlepin initializing context.
2021-05-11 16:32:29,822 [thread=main] [=, org=, csid=] INFO  org.candlepin.pki.impl.JSSProviderLoader - Using JSS version 4.7.3
2021-05-11 16:32:30,469 [thread=main] [=, org=, csid=] INFO  org.candlepin.guice.CandlepinContextListener - Candlepin reading configuration.
2021-05-11 16:32:30,531 [thread=main] [=, org=, csid=] INFO  org.candlepin.common.config.EncryptedConfiguration - No secret file provided.
2021-05-11 16:32:30,777 [thread=main] [=, org=, csid=] INFO  org.candlepin.guice.CandlepinContextListener - Running under postgresql
2021-05-11 16:32:30,875 [thread=main] [=, org=, csid=] INFO  org.candlepin.guice.CandlepinContextListener - Candlepin will show support for the following capabilities: [instance_multiplier, derived_product, vcpu, cert_v3, hypervisors_heartbeat, remove_by_pool_id, syspurpose, insights_auto_register, storage_band, cores, hypervisors_async, org_level_content_access, guest_limit, ram, batch_bind]
2021-05-11 16:32:30,887 [thread=main] [=, org=, csid=] DEBUG org.candlepin.guice.CandlepinContextListener - Candlepin stored config on context.
2021-05-11 16:32:33,260 [thread=main] [=, org=, csid=] INFO  org.candlepin.guice.CustomizableModules - Found custom module module.config.adapter_module
2021-05-11 16:32:35,804 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: ActiveEntitlementJob: org.candlepin.async.tasks.ActiveEntitlementJob
2021-05-11 16:32:35,806 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: CRLUpdateJob: org.candlepin.async.tasks.CRLUpdateJob
2021-05-11 16:32:35,808 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: EntitlerJob: org.candlepin.async.tasks.EntitlerJob
2021-05-11 16:32:35,809 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: EntitleByProductsJob: org.candlepin.async.tasks.EntitleByProductsJob
2021-05-11 16:32:35,811 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: ExpiredPoolsCleanupJob: org.candlepin.async.tasks.ExpiredPoolsCleanupJob
2021-05-11 16:32:35,812 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: ExportJob: org.candlepin.async.tasks.ExportJob
2021-05-11 16:32:35,836 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: HealEntireOrgJob: org.candlepin.async.tasks.HealEntireOrgJob
2021-05-11 16:32:35,837 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: HypervisorHeartbeatUpdateJob: org.candlepin.async.tasks.HypervisorHeartbeatUpdateJob
2021-05-11 16:32:35,839 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: HypervisorUpdateJob: org.candlepin.async.tasks.HypervisorUpdateJob
2021-05-11 16:32:35,841 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: ImportJob: org.candlepin.async.tasks.ImportJob
2021-05-11 16:32:35,842 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: ImportRecordCleanerJob: org.candlepin.async.tasks.ImportRecordCleanerJob
2021-05-11 16:32:35,844 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: JobCleaner: org.candlepin.async.tasks.JobCleaner
2021-05-11 16:32:35,868 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: ManifestCleanerJob: org.candlepin.async.tasks.ManifestCleanerJob
2021-05-11 16:32:35,870 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: OrphanCleanupJob: org.candlepin.async.tasks.OrphanCleanupJob
2021-05-11 16:32:35,871 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: RefreshPoolsForProductJob: org.candlepin.async.tasks.RefreshPoolsForProductJob
2021-05-11 16:32:35,873 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: RefreshPoolsJob: org.candlepin.async.tasks.RefreshPoolsJob
2021-05-11 16:32:35,874 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: RegenEnvEntitlementCertsJob: org.candlepin.async.tasks.RegenEnvEntitlementCertsJob
2021-05-11 16:32:35,876 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: RegenProductEntitlementCertsJob: org.candlepin.async.tasks.RegenProductEntitlementCertsJob
2021-05-11 16:32:35,877 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: UndoImportsJob: org.candlepin.async.tasks.UndoImportsJob
2021-05-11 16:32:35,901 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: UnmappedGuestEntitlementCleanerJob: org.candlepin.async.tasks.UnmappedGuestEntitlementCleanerJob
2021-05-11 16:32:53,577 [thread=main] [=, org=, csid=] DEBUG org.candlepin.audit.ArtemisMessageSourceReceiverFactory - Registering event listener for queue: event.org.candlepin.audit.LoggingListener
2021-05-11 16:32:53,595 [thread=main] [=, org=, csid=] DEBUG org.candlepin.guice.I18nProvider - Getting i18n engine for locale en_US
2021-05-11 16:32:53,649 [thread=main] [=, org=, csid=] DEBUG org.candlepin.audit.ArtemisMessageSourceReceiverFactory - Registering event listener for queue: event.org.candlepin.audit.ActivationListener
2021-05-11 16:33:04,020 [thread=main] [=, org=, csid=] WARN  org.hibernate.id.UUIDHexGenerator - HHH000409: Using org.hibernate.id.UUIDHexGenerator which does not generate IETF RFC 4122 compliant UUID values; consider using org.hibernate.id.UUIDGenerator instead
2021-05-11 16:33:04,274 [thread=main] [=, org=, csid=] WARN  org.hibernate.mapping.RootClass - HHH000038: Composite-id class does not override equals(): org.candlepin.model.PoolAttribute
2021-05-11 16:33:04,275 [thread=main] [=, org=, csid=] WARN  org.hibernate.mapping.RootClass - HHH000039: Composite-id class does not override hashCode(): org.candlepin.model.PoolAttribute
2021-05-11 16:33:11,637 [thread=main] [=, org=, csid=] WARN  org.hibernate.metamodel.internal.MetadataContext - HHH015011: Unable to locate static metamodel field : org.candlepin.model.KeyPair_#privateKey; this may or may not indicate a problem with the static metamodel
2021-05-11 16:33:11,637 [thread=main] [=, org=, csid=] WARN  org.hibernate.metamodel.internal.MetadataContext - HHH015011: Unable to locate static metamodel field : org.candlepin.model.KeyPair_#publicKey; this may or may not indicate a problem with the static metamodel
2021-05-11 16:33:11,995 [thread=main] [=, org=, csid=] DEBUG org.candlepin.policy.js.JsRunnerProvider - Compiling rules for initial load
[root@scotty candlepin]#
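
Both startup attempts recorded in candlepin.log stop right after "Compiling rules for initial load", so whatever goes wrong next is probably only visible in Candlepin's own error log. A rough way to watch it during a Tomcat restart (log path assumed to be the default on a Katello install):

# in one terminal (assumed default Candlepin log location):
tail -f /var/log/candlepin/error.log
# in another terminal:
systemctl restart tomcat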

localhost.log from tomcat:

[root@scotty tomcat]# cat localhost.2021-05-11.log 
11-May-2021 19:48:54.510 SEVERE [main] org.apache.catalina.core.StandardContext.listenerStop Exception sending context destroyed event to listener instance of class [org.candlepin.guice.CandlepinContextListener]
        java.lang.NullPointerException
                at org.jboss.resteasy.plugins.guice.GuiceResteasyBootstrapServletContextListener.triggerAnnotatedMethods(GuiceResteasyBootstrapServletContextListener.java:151)
                at org.jboss.resteasy.plugins.guice.GuiceResteasyBootstrapServletContextListener.contextDestroyed(GuiceResteasyBootstrapServletContextListener.java:146)
                at org.candlepin.guice.CandlepinContextListener.contextDestroyed(CandlepinContextListener.java:249)
                at org.apache.catalina.core.StandardContext.listenerStop(StandardContext.java:4732)
                at org.apache.catalina.core.StandardContext.stopInternal(StandardContext.java:5396)
                at org.apache.catalina.util.LifecycleBase.stop(LifecycleBase.java:257)
                at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:187)
                at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:717)
                at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:690)
                at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:705)
                at org.apache.catalina.startup.HostConfig.deployDirectory(HostConfig.java:1133)
                at org.apache.catalina.startup.HostConfig$DeployDirectory.run(HostConfig.java:1867)
                at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
                at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
                at org.apache.tomcat.util.threads.InlineExecutorService.execute(InlineExecutorService.java:75)
                at java.base/java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:118)
                at org.apache.catalina.startup.HostConfig.deployDirectories(HostConfig.java:1045)
                at org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:429)
                at org.apache.catalina.startup.HostConfig.start(HostConfig.java:1576)
                at org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:309)
                at org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:123)
                at org.apache.catalina.util.LifecycleBase.setStateInternal(LifecycleBase.java:423)
                at org.apache.catalina.util.LifecycleBase.setState(LifecycleBase.java:366)
                at org.apache.catalina.core.ContainerBase.startInternal(ContainerBase.java:936)
                at org.apache.catalina.core.StandardHost.startInternal(StandardHost.java:841)
                at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:183)
                at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1384)
                at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1374)
                at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
                at org.apache.tomcat.util.threads.InlineExecutorService.execute(InlineExecutorService.java:75)
                at java.base/java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:140)
                at org.apache.catalina.core.ContainerBase.startInternal(ContainerBase.java:909)
                at org.apache.catalina.core.StandardEngine.startInternal(StandardEngine.java:262)
                at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:183)
                at org.apache.catalina.core.StandardService.startInternal(StandardService.java:421)
                at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:183)
                at org.apache.catalina.core.StandardServer.startInternal(StandardServer.java:930)
                at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:183)
                at org.apache.catalina.startup.Catalina.start(Catalina.java:633)
                at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
                at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
                at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
                at java.base/java.lang.reflect.Method.invoke(Method.java:566)
                at org.apache.catalina.startup.Bootstrap.start(Bootstrap.java:343)
                at org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:474)
[root@scotty tomcat]#

catalina.log:

[root@scotty tomcat]# cat catalina.2021-05-11.log 
11-May-2021 19:48:19.027 WARNING [main] org.apache.catalina.startup.SetAllPropertiesRule.begin [SetAllPropertiesRule]{Server/Service/Connector} Setting property 'sslProtocols' to 'TLSv1.2' did not find a matching property.
11-May-2021 19:48:19.198 WARNING [main] org.apache.tomcat.util.digester.SetPropertiesRule.begin Match [Server/Service/Engine/Host] failed to set property [xmlValidation] to [false]
11-May-2021 19:48:19.198 WARNING [main] org.apache.tomcat.util.digester.SetPropertiesRule.begin Match [Server/Service/Engine/Host] failed to set property [xmlNamespaceAware] to [false]
11-May-2021 19:48:19.211 INFO [main] org.apache.catalina.core.AprLifecycleListener.lifecycleEvent The APR based Apache Tomcat Native library which allows optimal performance in production environments was not found on the java.library.path: [/usr/java/packages/lib:/usr/lib64:/lib64:/lib:/usr/lib]
11-May-2021 19:48:20.131 INFO [main] org.apache.coyote.AbstractProtocol.init Initializing ProtocolHandler ["https-jsse-nio-127.0.0.1-23443"]
11-May-2021 19:48:21.456 INFO [main] org.apache.catalina.startup.Catalina.load Server initialization in [2,934] milliseconds
11-May-2021 19:48:21.656 INFO [main] org.apache.catalina.core.StandardService.startInternal Starting service [Catalina]
11-May-2021 19:48:21.657 INFO [main] org.apache.catalina.core.StandardEngine.startInternal Starting Servlet engine: [Apache Tomcat/9.0.30]
11-May-2021 19:48:21.675 INFO [main] org.apache.catalina.startup.HostConfig.deployDirectory Deploying web application directory [/var/lib/tomcat/webapps/candlepin]
11-May-2021 19:48:32.251 INFO [main] org.apache.jasper.servlet.TldScanner.scanJars At least one JAR was scanned for TLDs yet contained no TLDs. Enable debug logging for this logger for a complete list of JARs that were scanned but no TLDs were found in them. Skipping unneeded JARs during scanning can improve startup time and JSP compilation time.
11-May-2021 19:48:40.316 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public org.candlepin.model.Persisted org.candlepin.model.OwnerCurator.create(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@69ff5aca]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
11-May-2021 19:48:40.520 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public org.candlepin.model.Persisted org.candlepin.model.ProductCurator.merge(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@69ff5aca]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
11-May-2021 19:48:40.521 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public void org.candlepin.model.ProductCurator.delete(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@69ff5aca]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
11-May-2021 19:48:40.523 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public org.candlepin.model.Persisted org.candlepin.model.ProductCurator.create(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@69ff5aca]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
11-May-2021 19:48:40.734 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public void org.candlepin.model.EntitlementCurator.delete(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@69ff5aca]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
11-May-2021 19:48:40.787 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public void org.candlepin.model.ConsumerCurator.delete(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@69ff5aca]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
11-May-2021 19:48:40.788 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public org.candlepin.model.Persisted org.candlepin.model.ConsumerCurator.create(org.candlepin.model.Persisted,boolean)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@69ff5aca]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
11-May-2021 19:48:41.063 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public void org.candlepin.model.CdnCurator.delete(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@69ff5aca]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
11-May-2021 19:48:41.111 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public void org.candlepin.model.PoolCurator.delete(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@69ff5aca]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
11-May-2021 19:48:41.464 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public void org.candlepin.model.RulesCurator.delete(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@69ff5aca]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
11-May-2021 19:48:41.465 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public org.candlepin.model.Persisted org.candlepin.model.RulesCurator.create(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@69ff5aca]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
11-May-2021 19:48:41.531 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public void org.candlepin.model.ContentCurator.delete(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@69ff5aca]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
11-May-2021 19:48:42.051 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public void org.candlepin.model.EntitlementCertificateCurator.delete(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@69ff5aca]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
11-May-2021 19:48:54.440 SEVERE [main] org.apache.catalina.core.StandardContext.startInternal One or more listeners failed to start. Full details will be found in the appropriate container log file
11-May-2021 19:48:54.452 SEVERE [main] org.apache.catalina.core.StandardContext.startInternal Context [/candlepin] startup failed due to previous errors
11-May-2021 19:48:54.517 WARNING [main] org.apache.catalina.loader.WebappClassLoaderBase.clearReferencesJdbc The web application [candlepin] registered the JDBC driver [org.postgresql.Driver] but failed to unregister it when the web application was stopped. To prevent a memory leak, the JDBC Driver has been forcibly unregistered.
11-May-2021 19:48:54.583 INFO [main] org.apache.catalina.startup.HostConfig.deployDirectory Deployment of web application directory [/var/lib/tomcat/webapps/candlepin] has finished in [32,908] ms
11-May-2021 19:48:54.597 INFO [main] org.apache.coyote.AbstractProtocol.start Starting ProtocolHandler ["https-jsse-nio-127.0.0.1-23443"]
11-May-2021 19:48:54.619 INFO [main] org.apache.catalina.startup.Catalina.start Server startup in [33,162] milliseconds
[root@scotty tomcat]#
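
The two SEVERE lines above are the key part: Tomcat itself comes up and listens on 23443, but the /candlepin context never deploys, so every request Katello sends to it comes back 404. A quick confirmation sketch (on a healthy install the status endpoint normally answers without authentication):

curl -k https://localhost:23443/candlepin/status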

Trying to run the installer step db:seed manually ends up with:

[root@scotty tomcat]# /usr/sbin/foreman-rake db:seed   
User with login admin already exists, not seeding as admin.
rake aborted!
There was an issue with the backend service candlepin: 404 Not Found
/usr/share/gems/gems/katello-4.0.0/app/lib/actions/middleware/backend_services_check.rb:17:in `block in plan'
/usr/share/gems/gems/katello-4.0.0/app/lib/actions/middleware/backend_services_check.rb:15:in `each'
/usr/share/gems/gems/katello-4.0.0/app/lib/actions/middleware/backend_services_check.rb:15:in `plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:27:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:19:in `pass'
/usr/share/gems/gems/katello-4.0.0/app/lib/actions/middleware/propagate_candlepin_errors.rb:5:in `block in plan'
/usr/share/gems/gems/katello-4.0.0/app/lib/actions/middleware/propagate_candlepin_errors.rb:19:in `propagate_candlepin_errors'
/usr/share/gems/gems/katello-4.0.0/app/lib/actions/middleware/propagate_candlepin_errors.rb:5:in `plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:27:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:19:in `pass'
/usr/share/gems/gems/katello-4.0.0/app/lib/actions/middleware/remote_action.rb:9:in `plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:27:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:19:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:36:in `plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:27:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:19:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:36:in `plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:27:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:19:in `pass'
/usr/share/gems/gems/katello-4.0.0/app/lib/actions/middleware/keep_locale.rb:7:in `plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:27:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:19:in `pass'
/usr/share/gems/gems/foreman-tasks-4.0.1/app/lib/actions/middleware/keep_current_request_id.rb:10:in `block in plan'
/usr/share/gems/gems/foreman-tasks-4.0.1/app/lib/actions/middleware/keep_current_request_id.rb:34:in `with_current_request_id'
/usr/share/gems/gems/foreman-tasks-4.0.1/app/lib/actions/middleware/keep_current_request_id.rb:9:in `plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:27:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:19:in `pass'
/usr/share/gems/gems/foreman-tasks-4.0.1/app/lib/actions/middleware/keep_current_timezone.rb:10:in `block in plan'
/usr/share/gems/gems/foreman-tasks-4.0.1/app/lib/actions/middleware/keep_current_timezone.rb:31:in `with_current_timezone'
/usr/share/gems/gems/foreman-tasks-4.0.1/app/lib/actions/middleware/keep_current_timezone.rb:9:in `plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:27:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:19:in `pass'
/usr/share/gems/gems/foreman-tasks-4.0.1/app/lib/actions/middleware/keep_current_taxonomies.rb:10:in `block in plan'
/usr/share/gems/gems/foreman-tasks-4.0.1/app/lib/actions/middleware/keep_current_taxonomies.rb:30:in `with_current_taxonomies'
/usr/share/gems/gems/foreman-tasks-4.0.1/app/lib/actions/middleware/keep_current_taxonomies.rb:9:in `plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:27:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:19:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:36:in `plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:27:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:19:in `pass'
/usr/share/gems/gems/foreman-tasks-4.0.1/app/lib/actions/middleware/keep_current_user.rb:10:in `block in plan'
/usr/share/gems/gems/foreman-tasks-4.0.1/app/lib/actions/middleware/keep_current_user.rb:41:in `with_current_user'
/usr/share/gems/gems/foreman-tasks-4.0.1/app/lib/actions/middleware/keep_current_user.rb:9:in `plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/world.rb:31:in `execute'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/action.rb:513:in `block (2 levels) in execute_plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/execution_plan.rb:384:in `switch_flow'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/action.rb:417:in `concurrence'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/action.rb:512:in `block in execute_plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/action.rb:472:in `block in with_error_handling'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/action.rb:472:in `catch'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/action.rb:472:in `with_error_handling'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/action.rb:511:in `execute_plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/action.rb:285:in `execute'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/execution_plan/steps/plan_step.rb:55:in `block in execute'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/execution_plan/steps/abstract.rb:167:in `with_meta_calculation'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/execution_plan/steps/plan_step.rb:54:in `execute'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/action.rb:445:in `plan_action'
/usr/share/gems/gems/katello-4.0.0/app/lib/actions/katello/organization/create.rb:15:in `block in plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/execution_plan.rb:384:in `switch_flow'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/action.rb:422:in `sequence'
/usr/share/gems/gems/katello-4.0.0/app/lib/actions/katello/organization/create.rb:14:in `plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/action.rb:514:in `block (3 levels) in execute_plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:27:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:19:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:36:in `plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:27:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:19:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:36:in `plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:27:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:19:in `pass'
/usr/share/gems/gems/katello-4.0.0/app/lib/actions/middleware/keep_locale.rb:7:in `plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:27:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:19:in `pass'
/usr/share/gems/gems/foreman-tasks-4.0.1/app/lib/actions/middleware/keep_current_request_id.rb:10:in `block in plan'
/usr/share/gems/gems/foreman-tasks-4.0.1/app/lib/actions/middleware/keep_current_request_id.rb:34:in `with_current_request_id'
/usr/share/gems/gems/foreman-tasks-4.0.1/app/lib/actions/middleware/keep_current_request_id.rb:9:in `plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:27:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:19:in `pass'
/usr/share/gems/gems/foreman-tasks-4.0.1/app/lib/actions/middleware/keep_current_timezone.rb:10:in `block in plan'
/usr/share/gems/gems/foreman-tasks-4.0.1/app/lib/actions/middleware/keep_current_timezone.rb:31:in `with_current_timezone'
/usr/share/gems/gems/foreman-tasks-4.0.1/app/lib/actions/middleware/keep_current_timezone.rb:9:in `plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:27:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:19:in `pass'
/usr/share/gems/gems/foreman-tasks-4.0.1/app/lib/actions/middleware/keep_current_taxonomies.rb:10:in `block in plan'
/usr/share/gems/gems/foreman-tasks-4.0.1/app/lib/actions/middleware/keep_current_taxonomies.rb:30:in `with_current_taxonomies'
/usr/share/gems/gems/foreman-tasks-4.0.1/app/lib/actions/middleware/keep_current_taxonomies.rb:9:in `plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:27:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:19:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:36:in `plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:27:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:19:in `pass'
/usr/share/gems/gems/foreman-tasks-4.0.1/app/lib/actions/middleware/keep_current_user.rb:10:in `block in plan'
/usr/share/gems/gems/foreman-tasks-4.0.1/app/lib/actions/middleware/keep_current_user.rb:41:in `with_current_user'
/usr/share/gems/gems/foreman-tasks-4.0.1/app/lib/actions/middleware/keep_current_user.rb:9:in `plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/world.rb:31:in `execute'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/action.rb:513:in `block (2 levels) in execute_plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/execution_plan.rb:384:in `switch_flow'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/action.rb:417:in `concurrence'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/action.rb:512:in `block in execute_plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/action.rb:472:in `block in with_error_handling'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/action.rb:472:in `catch'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/action.rb:472:in `with_error_handling'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/action.rb:511:in `execute_plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/action.rb:285:in `execute'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/execution_plan/steps/plan_step.rb:55:in `block in execute'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/execution_plan/steps/abstract.rb:167:in `with_meta_calculation'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/execution_plan/steps/plan_step.rb:54:in `execute'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/execution_plan.rb:286:in `block (2 levels) in plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/execution_plan.rb:384:in `switch_flow'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/execution_plan.rb:374:in `with_planning_scope'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/execution_plan.rb:285:in `block in plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:27:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:19:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:44:in `plan_phase'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:27:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:19:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:44:in `plan_phase'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:27:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:19:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:44:in `plan_phase'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:27:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:19:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:44:in `plan_phase'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:27:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:19:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:44:in `plan_phase'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:27:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:19:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:44:in `plan_phase'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:27:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:19:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/common/transaction.rb:17:in `block in rollback_on_error'
/usr/share/gems/gems/activerecord-6.0.3.4/lib/active_record/connection_adapters/abstract/database_statements.rb:280:in `block in transaction'
/usr/share/gems/gems/activerecord-6.0.3.4/lib/active_record/connection_adapters/abstract/transaction.rb:280:in `block in within_new_transaction'
/usr/share/gems/gems/activesupport-6.0.3.4/lib/active_support/concurrency/load_interlock_aware_monitor.rb:26:in `block (2 levels) in synchronize'
/usr/share/gems/gems/activesupport-6.0.3.4/lib/active_support/concurrency/load_interlock_aware_monitor.rb:25:in `handle_interrupt'
/usr/share/gems/gems/activesupport-6.0.3.4/lib/active_support/concurrency/load_interlock_aware_monitor.rb:25:in `block in synchronize'
/usr/share/gems/gems/activesupport-6.0.3.4/lib/active_support/concurrency/load_interlock_aware_monitor.rb:21:in `handle_interrupt'
/usr/share/gems/gems/activesupport-6.0.3.4/lib/active_support/concurrency/load_interlock_aware_monitor.rb:21:in `synchronize'
/usr/share/gems/gems/activerecord-6.0.3.4/lib/active_record/connection_adapters/abstract/transaction.rb:278:in `within_new_transaction'
/usr/share/gems/gems/activerecord-6.0.3.4/lib/active_record/connection_adapters/abstract/database_statements.rb:280:in `transaction'
/usr/share/gems/gems/activerecord-6.0.3.4/lib/active_record/transactions.rb:212:in `transaction'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/transaction_adapters/active_record.rb:6:in `transaction'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/common/transaction.rb:16:in `rollback_on_error'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/common/transaction.rb:6:in `plan_phase'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:27:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:19:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:44:in `plan_phase'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/world.rb:31:in `execute'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/execution_plan.rb:284:in `plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/world.rb:211:in `block (2 levels) in plan_with_options'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/coordinator.rb:326:in `acquire'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/world.rb:209:in `block in plan_with_options'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/world.rb:208:in `tap'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/world.rb:208:in `plan_with_options'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/world.rb:204:in `plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/world.rb:180:in `trigger'
/usr/share/gems/gems/foreman-tasks-4.0.1/lib/foreman_tasks.rb:23:in `trigger'
/usr/share/gems/gems/foreman-tasks-4.0.1/lib/foreman_tasks.rb:29:in `block in trigger_task'
/usr/share/gems/gems/foreman-tasks-4.0.1/lib/foreman_tasks.rb:49:in `block in rails_safe_trigger_task'
/usr/share/gems/gems/activesupport-6.0.3.4/lib/active_support/dependencies/interlock.rb:48:in `block in permit_concurrent_loads'
/usr/share/gems/gems/activesupport-6.0.3.4/lib/active_support/concurrency/share_lock.rb:187:in `yield_shares'
/usr/share/gems/gems/activesupport-6.0.3.4/lib/active_support/dependencies/interlock.rb:47:in `permit_concurrent_loads'
/usr/share/gems/gems/foreman-tasks-4.0.1/lib/foreman_tasks.rb:48:in `rails_safe_trigger_task'
/usr/share/gems/gems/foreman-tasks-4.0.1/lib/foreman_tasks.rb:27:in `trigger_task'
/usr/share/gems/gems/foreman-tasks-4.0.1/lib/foreman_tasks.rb:58:in `sync_task'
/usr/share/gems/gems/katello-4.0.0/db/seeds.d/102-organizations.rb:10:in `block (2 levels) in <top (required)>'
/usr/share/foreman/app/models/concerns/foreman/thread_session.rb:108:in `as'
/usr/share/gems/gems/katello-4.0.0/db/seeds.d/102-organizations.rb:9:in `block in <top (required)>'
/usr/share/gems/gems/activerecord-6.0.3.4/lib/active_record/relation/delegation.rb:87:in `each'
/usr/share/gems/gems/activerecord-6.0.3.4/lib/active_record/relation/delegation.rb:87:in `each'
/usr/share/gems/gems/katello-4.0.0/db/seeds.d/102-organizations.rb:8:in `<top (required)>'
/usr/share/gems/gems/activesupport-6.0.3.4/lib/active_support/dependencies.rb:318:in `load'
/usr/share/gems/gems/activesupport-6.0.3.4/lib/active_support/dependencies.rb:318:in `block in load'
/usr/share/gems/gems/activesupport-6.0.3.4/lib/active_support/dependencies.rb:291:in `load_dependency'
/usr/share/gems/gems/activesupport-6.0.3.4/lib/active_support/dependencies.rb:318:in `load'
/usr/share/foreman/app/services/foreman_seeder.rb:46:in `block (2 levels) in execute'
/usr/share/foreman/app/models/concerns/foreman/thread_session.rb:108:in `as'
/usr/share/foreman/app/models/concerns/foreman/thread_session.rb:114:in `as_anonymous_admin'
/usr/share/foreman/app/services/foreman_seeder.rb:45:in `block in execute'
/usr/share/foreman/app/services/foreman_seeder.rb:39:in `each'
/usr/share/foreman/app/services/foreman_seeder.rb:39:in `execute'
/usr/share/foreman/db/seeds.rb:14:in `<top (required)>'
/usr/share/gems/gems/activesupport-6.0.3.4/lib/active_support/dependencies.rb:318:in `load'
/usr/share/gems/gems/activesupport-6.0.3.4/lib/active_support/dependencies.rb:318:in `block in load'
/usr/share/gems/gems/activesupport-6.0.3.4/lib/active_support/dependencies.rb:291:in `load_dependency'
/usr/share/gems/gems/activesupport-6.0.3.4/lib/active_support/dependencies.rb:318:in `load'
/usr/share/gems/gems/railties-6.0.3.4/lib/rails/engine.rb:559:in `load_seed'
/usr/share/gems/gems/activerecord-6.0.3.4/lib/active_record/tasks/database_tasks.rb:440:in `load_seed'
/usr/share/gems/gems/activerecord-6.0.3.4/lib/active_record/railties/databases.rake:331:in `block (2 levels) in <top (required)>'
/usr/share/gems/gems/rake-12.3.0/exe/rake:27:in `<top (required)>'
Tasks: TOP => db:seed
(See full trace by running task with --trace)
[root@scotty tomcat]#
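
If hammer happens to be configured already (it may not be this early in a broken install), hammer ping is a quicker way to see which backend is down; it reports the state of the candlepin, pulp and foreman_tasks services in one go:

hammer ping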

It is the same message as in the foreman-installer run.
These are the ports currently listening (a small cross-check follows the listing):

[root@scotty tomcat]# ss -tlpn
State               Recv-Q              Send-Q                                 Local Address:Port                              Peer Address:Port                                                                   
LISTEN              0                   128                                        127.0.0.1:5432                                   0.0.0.0:*                  users:(("postmaster",pid=1122,fd=4))                
LISTEN              0                   128                                          0.0.0.0:9090                                   0.0.0.0:*                  users:(("smart-proxy",pid=1095,fd=10))                               
LISTEN              0                   10                                         127.0.0.1:5671                                   0.0.0.0:*                  users:(("qpidd",pid=1096,fd=26))                    
LISTEN              0                   10                                         127.0.0.1:5672                                   0.0.0.0:*                  users:(("qpidd",pid=1096,fd=29))                                      
LISTEN              0                   128                                        127.0.0.1:6379                                   0.0.0.0:*                  users:(("redis-server",pid=1101,fd=6))              
LISTEN              0                   50                                           0.0.0.0:5646                                   0.0.0.0:*                  users:(("qdrouterd",pid=1081,fd=13))                
LISTEN              0                   50                                           0.0.0.0:5647                                   0.0.0.0:*                  users:(("qdrouterd",pid=1081,fd=11))                
LISTEN              0                   128                                          0.0.0.0:22                                     0.0.0.0:*                  users:(("sshd",pid=1090,fd=5))                      
LISTEN              0                   128                                            [::1]:5432                                      [::]:*                  users:(("postmaster",pid=1122,fd=3))                
LISTEN              0                   128                                             [::]:9090                                      [::]:*                  users:(("smart-proxy",pid=1095,fd=11))              
LISTEN              0                   1                                 [::ffff:127.0.0.1]:8005                                         *:*                  users:(("java",pid=3797,fd=58))                     
LISTEN              0                   10                                             [::1]:5671                                      [::]:*                  users:(("qpidd",pid=1096,fd=28))                    
LISTEN              0                   10                                             [::1]:5672                                      [::]:*                  users:(("qpidd",pid=1096,fd=30))                    
LISTEN              0                   50                                                 *:8140                                         *:*                  users:(("java",pid=1185,fd=39))                     
LISTEN              0                   50                                              [::]:5646                                      [::]:*                  users:(("qdrouterd",pid=1081,fd=14))                
LISTEN              0                   50                                              [::]:5647                                      [::]:*                  users:(("qdrouterd",pid=1081,fd=12))                
LISTEN              0                   100                               [::ffff:127.0.0.1]:23443                                        *:*                  users:(("java",pid=3797,fd=46))                     
LISTEN              0                   128                                             [::]:22                                        [::]:*                  users:(("sshd",pid=1090,fd=7))                      
[root@scotty tomcat]# 
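
So Tomcat is listening on 23443, which is where Katello expects Candlepin by default; the 404 therefore points at the failed /candlepin deployment rather than a missing listener. A cross-check of the URL Katello is actually configured with (file location assumed to be the default on a Katello install):

grep -i -A 3 candlepin /etc/foreman/plugins/katello.yaml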

Which version of RHEL 8 are you running? Still 8.3? So far I have not been able to replicate this. And you said there don't appear to be any SELinux issues?

Yes. RHEL 8.3:

[root@scotty ~]# cat /etc/os-release 
NAME="Red Hat Enterprise Linux"
VERSION="8.3 (Ootpa)"
ID="rhel"
ID_LIKE="fedora"
VERSION_ID="8.3"
PLATFORM_ID="platform:el8"
PRETTY_NAME="Red Hat Enterprise Linux 8.3 (Ootpa)"
ANSI_COLOR="0;31"
CPE_NAME="cpe:/o:redhat:enterprise_linux:8.3:GA"
HOME_URL="https://www.redhat.com/"
BUG_REPORT_URL="https://bugzilla.redhat.com/"

REDHAT_BUGZILLA_PRODUCT="Red Hat Enterprise Linux 8"
REDHAT_BUGZILLA_PRODUCT_VERSION=8.3
REDHAT_SUPPORT_PRODUCT="Red Hat Enterprise Linux"
REDHAT_SUPPORT_PRODUCT_VERSION="8.3"
[root@scotty ~]#

I’m no SELinux expert, but I cannot see any SELinux-related messages anymore.
The VM runs out of the box, so I’m really confused …
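
Standard checks for SELinux denials would look something like this (nothing Katello-specific; assumes auditd is running):

getenforce
ausearch -m AVC -ts recent
grep -i denied /var/log/audit/audit.log | tail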

I will now try a foreman-installer run with SELinux completely disabled …
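
For completeness, "completely disabled" means roughly the following; permissive mode is usually enough for a test, a full disable needs the config change plus a reboot:

setenforce 0     # permissive until the next reboot
getenforce
# persistent: set SELINUX=disabled in /etc/selinux/config and reboot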

No success :pensive:

[root@scotty ~]# foreman-installer --certs-reset --scenario katello
2021-05-11 20:07:53 [NOTICE] [root] Loading default values from puppet modules...
2021-05-11 20:08:07 [NOTICE] [root] ... finished
2021-05-11 20:08:17 [NOTICE] [root] Running validation checks
Marking certificate /root/ssl-build/scotty.home.petersen20.de/pulp-client for update
Marking certificate /root/ssl-build/scotty.home.petersen20.de/scotty.home.petersen20.de-apache for update
Marking certificate /root/ssl-build/scotty.home.petersen20.de/scotty.home.petersen20.de-qpid-broker for update
Marking certificate /root/ssl-build/scotty.home.petersen20.de/scotty.home.petersen20.de-foreman-proxy-client for update
Marking certificate /root/ssl-build/scotty.home.petersen20.de/scotty.home.petersen20.de-foreman-client for update
Marking certificate /root/ssl-build/scotty.home.petersen20.de/scotty.home.petersen20.de-qpid-router-server for update
Marking certificate /root/ssl-build/scotty.home.petersen20.de/scotty.home.petersen20.de-foreman-proxy for update
Marking certificate /root/ssl-build/scotty.home.petersen20.de/scotty.home.petersen20.de-puppet-client for update
Marking certificate /root/ssl-build/katello-server-ca for update
2021-05-11 20:08:22 [NOTICE] [configure] Starting system configuration.
  The total number of configuration tasks may increase during the run.
  Observe logs or specify --verbose-log-level to see individual configuration tasks.
2021-05-11 20:08:54 [NOTICE] [configure] 100 out of 1939 done.
2021-05-11 20:08:55 [NOTICE] [configure] 200 out of 1939 done.
2021-05-11 20:09:09 [NOTICE] [configure] 300 out of 1939 done.
2021-05-11 20:09:27 [NOTICE] [configure] 400 out of 1939 done.
2021-05-11 20:10:38 [NOTICE] [configure] 500 out of 1939 done.
2021-05-11 20:10:46 [NOTICE] [configure] 600 out of 1941 done.
2021-05-11 20:10:46 [NOTICE] [configure] 700 out of 1941 done.
2021-05-11 20:10:50 [NOTICE] [configure] 800 out of 1943 done.
2021-05-11 20:10:53 [NOTICE] [configure] 900 out of 1944 done.
2021-05-11 20:10:53 [NOTICE] [configure] 1000 out of 1946 done.
2021-05-11 20:10:54 [NOTICE] [configure] 1100 out of 1949 done.
2021-05-11 20:10:55 [NOTICE] [configure] 1200 out of 1950 done.
2021-05-11 20:11:02 [NOTICE] [configure] 1300 out of 1951 done.
2021-05-11 20:13:43 [ERROR ] [configure] /Stage[main]/Foreman::Database/Foreman::Rake[db:seed]/Exec[foreman-rake-db:seed]: Failed to call refresh: '/usr/sbin/foreman-rake db:seed' returned 1 instead of one of [0]
2021-05-11 20:13:43 [ERROR ] [configure] /Stage[main]/Foreman::Database/Foreman::Rake[db:seed]/Exec[foreman-rake-db:seed]: '/usr/sbin/foreman-rake db:seed' returned 1 instead of one of [0]
2021-05-11 20:13:43 [NOTICE] [configure] 1400 out of 1951 done.
2021-05-11 20:13:43 [NOTICE] [configure] 1500 out of 1951 done.
2021-05-11 20:13:43 [NOTICE] [configure] 1600 out of 1951 done.
2021-05-11 20:13:44 [NOTICE] [configure] 1700 out of 1951 done.
2021-05-11 20:13:44 [NOTICE] [configure] 1800 out of 1951 done.
2021-05-11 20:14:02 [NOTICE] [configure] 1900 out of 1951 done.
2021-05-11 20:14:12 [NOTICE] [configure] System configuration has finished.

  There were errors detected during install.
  Please address the errors and re-run the installer to ensure the system is properly configured.
  Failing to do so is likely to result in broken functionality.

  The full log is at /var/log/foreman-installer/katello.log
[root@scotty ~]#

From the katello.log:

2021-05-11 20:11:04 [DEBUG ] [configure] Class[Foreman::Database::Postgresql]: Starting to evaluate the resource (1378 of 1951)
2021-05-11 20:11:04 [DEBUG ] [configure] Class[Foreman::Database::Postgresql]: Evaluated in 0.00 seconds
2021-05-11 20:11:04 [DEBUG ] [configure] Foreman::Rake[db:migrate]: Starting to evaluate the resource (1379 of 1951)
2021-05-11 20:11:04 [DEBUG ] [configure] Foreman::Rake[db:migrate]: Evaluated in 0.00 seconds
2021-05-11 20:11:04 [DEBUG ] [configure] /Stage[main]/Foreman::Database/Foreman::Rake[db:migrate]/Exec[foreman-rake-db:migrate]: Starting to evaluate the resource (1380 of 1951)
2021-05-11 20:11:04 [DEBUG ] [configure] Exec[foreman-rake-db:migrate](provider=posix): Executing check '/usr/sbin/foreman-rake db:abort_if_pending_migrations'
2021-05-11 20:11:04 [DEBUG ] [configure] Executing with uid=foreman: '/usr/sbin/foreman-rake db:abort_if_pending_migrations'
2021-05-11 20:11:57 [DEBUG ] [configure] /Stage[main]/Foreman::Database/Foreman::Rake[db:migrate]/Exec[foreman-rake-db:migrate]: '/usr/sbin/foreman-rake db:migrate' won't be executed because of failed check 'unless'
2021-05-11 20:11:57 [DEBUG ] [configure] /Stage[main]/Foreman::Database/Foreman::Rake[db:migrate]/Exec[foreman-rake-db:migrate]: Evaluated in 53.08 seconds
2021-05-11 20:11:57 [DEBUG ] [configure] Foreman::Rake[db:migrate]: Starting to evaluate the resource (1381 of 1951)
2021-05-11 20:11:57 [DEBUG ] [configure] Foreman::Rake[db:migrate]: Evaluated in 0.00 seconds
2021-05-11 20:11:57 [DEBUG ] [configure] Prefetching cli resources for foreman_config_entry
2021-05-11 20:11:57 [DEBUG ] [configure] Executing with uid=foreman gid=foreman: '/usr/sbin/foreman-rake -- config '
2021-05-11 20:12:44 [DEBUG ] [configure] /Stage[main]/Foreman::Database/Foreman_config_entry[db_pending_seed]: Starting to evaluate the resource (1382 of 1951)
2021-05-11 20:12:44 [INFO  ] [configure] /Stage[main]/Foreman::Database/Foreman_config_entry[db_pending_seed]/value: value changed 'true' to 'false'
2021-05-11 20:12:44 [DEBUG ] [configure] /Stage[main]/Foreman::Database/Foreman_config_entry[db_pending_seed]: The container Class[Foreman::Database] will propagate my refresh event
2021-05-11 20:12:44 [DEBUG ] [configure] /Stage[main]/Foreman::Database/Foreman_config_entry[db_pending_seed]: Scheduling refresh of Foreman::Rake[db:seed]
2021-05-11 20:12:44 [DEBUG ] [configure] /Stage[main]/Foreman::Database/Foreman_config_entry[db_pending_seed]: Evaluated in 0.01 seconds
2021-05-11 20:12:44 [DEBUG ] [configure] Foreman::Rake[db:seed]: Starting to evaluate the resource (1383 of 1951)
2021-05-11 20:12:44 [DEBUG ] [configure] Foreman::Rake[db:seed]: Scheduling refresh of Exec[foreman-rake-db:seed]
2021-05-11 20:12:44 [DEBUG ] [configure] Foreman::Rake[db:seed]: Evaluated in 0.00 seconds
2021-05-11 20:12:44 [DEBUG ] [configure] /Stage[main]/Foreman::Database/Foreman::Rake[db:seed]/Exec[foreman-rake-db:seed]: Starting to evaluate the resource (1384 of 1951)
2021-05-11 20:12:44 [DEBUG ] [configure] /Stage[main]/Foreman::Database/Foreman::Rake[db:seed]/Exec[foreman-rake-db:seed]: '/usr/sbin/foreman-rake db:seed' won't be executed because of failed check 'refreshonly'
2021-05-11 20:12:44 [DEBUG ] [configure] Exec[foreman-rake-db:seed](provider=posix): Executing '/usr/sbin/foreman-rake db:seed'
2021-05-11 20:12:44 [DEBUG ] [configure] Executing with uid=foreman: '/usr/sbin/foreman-rake db:seed'
2021-05-11 20:13:43 [INFO  ] [configure] /Stage[main]/Foreman::Database/Foreman::Rake[db:seed]/Exec[foreman-rake-db:seed]/returns: rake aborted!
2021-05-11 20:13:43 [INFO  ] [configure] /Stage[main]/Foreman::Database/Foreman::Rake[db:seed]/Exec[foreman-rake-db:seed]/returns: There was an issue with the backend service candlepin: 404 Not Found
2021-05-11 20:13:43 [INFO  ] [configure] /Stage[main]/Foreman::Database/Foreman::Rake[db:seed]/Exec[foreman-rake-db:seed]/returns: /usr/share/gems/gems/katello-4.0.0/app/lib/actions/middleware/backend_services_check.rb:17:in `block in plan'
2021-05-11 20:13:43 [INFO  ] [configure] /Stage[main]/Foreman::Database/Foreman::Rake[db:seed]/Exec[foreman-rake-db:seed]/returns: /usr/share/gems/gems/katello-4.0.0/app/lib/actions/middleware/backend_services_check.rb:15:in `each'