Katello 4.0 installation on RHEL 8 failed

In nightly it is, but since you’re on 4.0 it shouldn’t be. Also, Candlepin is Java-based and doesn’t use Ruby, so that shouldn’t affect it.

However, my knowledge of Java is lacking here. You may want to verify that your JRE is correct; it should be version 11.

My Java version is 11 and my Tomcat is 9.0.30.
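For reference, the versions can be verified like this (a sketch; the sed pattern assumes the usual `openjdk version "11.0.x"` format of `java -version` output):

```shell
# On the real system you would take the line from:
#   java -version 2>&1 | head -1    # e.g.: openjdk version "11.0.11" 2021-04-20
#   rpm -q tomcat                   # confirms the installed Tomcat package

# Extract the major version from a java -version style line:
ver_line='openjdk version "11.0.11" 2021-04-20'   # sample line for illustration
major=$(echo "$ver_line" | sed -E 's/[^"]*"([0-9]+)\..*/\1/')
echo "Java major version: $major"                 # Candlepin 4.0 expects 11
```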

I started from scratch: wiped the disk, did a fresh installation of the OS (RHEL 8), and started the Katello installation process as described above. The result is exactly the same as before: Candlepin will not start.
Are there any findings on whether this combination (RHEL 8 & Katello 4.0) generally works? Would it be worthwhile to switch from RHEL 8 to AlmaLinux or Rocky Linux?

@ekohl Should we check with someone on the Candlepin team about the failure @martux69 posted?

We do not test against RHEL 8 (and we currently do not have instructions for it), so you are in uncharted waters. What we do test against is CentOS 8 and, more recently with the Katello 4.1 RCs, CentOS 8 Stream. I would recommend using something we test against to help ensure compatibility.

The problem could be caused by a discrepancy: a change that is in RHEL but not yet in CentOS, since CentOS lags behind RHEL.

I will try to run through an installation of 4.0 locally on RHEL 8, see if I run into the same issue, and report back any findings.

Some more information about my setup:

  • My setup is only intended for development/evaluation/training purposes, so it is very small:
    4 CPUs and 8 GB memory
  • Running only Foreman on this system worked fine for months.
  • For comparison I did an installation in a virtual system (KVM) with the same specs as my physical system. There it works!
  • The only obvious difference I can see so far is the disk layout: in the VM it is all one partition; on the physical system it looks like this:
[root@scotty ~]# df -h
Filesystem                   Size  Used Avail Use% Mounted on
devtmpfs                     3.8G     0  3.8G   0% /dev
tmpfs                        3.8G   28K  3.8G   1% /dev/shm
tmpfs                        3.8G  8.9M  3.8G   1% /run
tmpfs                        3.8G     0  3.8G   0% /sys/fs/cgroup
/dev/mapper/vgsys-lvroot      16G  3.1G   12G  21% /
/dev/mapper/vgsys-lvpgsql    7.9G  138M  7.3G   2% /var/lib/pgsql
/dev/mapper/vgsys-lvforeman  976M  385M  525M  43% /var/lib/foreman
/dev/mapper/vgsys-lvhome     976M  2.6M  907M   1% /home
/dev/mapper/vgsys-lvlog      976M   55M  855M   7% /var/log
/dev/mapper/vgsys-lvtmp      2.0G   13M  1.8G   1% /tmp
/dev/sdb2                    976M  221M  689M  25% /boot
/dev/mapper/vgsys-lvopt      976M  236M  674M  26% /opt
/dev/sdb1                    599M  6.9M  592M   2% /boot/efi
/dev/mapper/vgdata-lvtftp    976M  4.4M  905M   1% /var/lib/tftpboot
/dev/mapper/vgdata-lvwiki    976M  2.6M  907M   1% /var/www/dokuwiki
/dev/mapper/vgdata-lvpuppet  976M  2.6M  907M   1% /etc/puppetlabs/code
/dev/mapper/vgdata-lvpulp    9.8G   40M  9.3G   1% /var/lib/pulp
/dev/mapper/vgdata-lvqpid    976M   15M  894M   2% /var/lib/qpidd
tmpfs                        767M     0  767M   0% /run/user/0
[root@scotty ~]#
  • Differing from the instructions at What is the state of Katello support on EL8 - #8 by ehelms: on RHEL the required repository is not powertools but codeready-builder.
  • I saw some hints in the logs regarding SELinux, so I tried setting SELinux to permissive (with no luck).
  • No SELinux profiles are enabled.
  • Cockpit is disabled on the system because of a port conflict with the smart-proxy (9090).
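Since the disk layout is the main suspect, it may be worth checking free space on the mount points a Katello install writes to heavily (a sketch; the list of paths is my guess at the usual hot spots, not an official requirement):

```shell
# Report free space on mount points Katello/Candlepin write to;
# paths that do not exist on a given box are simply skipped
checked=0
for d in /var/lib/pgsql /var/lib/pulp /var/log /var/cache /tmp; do
  if [ -d "$d" ]; then
    df -h "$d" | tail -1
    checked=$((checked + 1))
  fi
done
echo "checked $checked mount points"
```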

The current Candlepin logs are:
error.log

[root@scotty candlepin]# cat error.log 
2021-05-11 15:20:05,007 [thread=main] [=, org=, csid=] WARN  org.hibernate.id.UUIDHexGenerator - HHH000409: Using org.hibernate.id.UUIDHexGenerator which does not generate IETF RFC 4122 compliant UUID values; consider using org.hibernate.id.UUIDGenerator instead
2021-05-11 15:20:05,179 [thread=main] [=, org=, csid=] WARN  org.hibernate.mapping.RootClass - HHH000038: Composite-id class does not override equals(): org.candlepin.model.PoolAttribute
2021-05-11 15:20:05,180 [thread=main] [=, org=, csid=] WARN  org.hibernate.mapping.RootClass - HHH000039: Composite-id class does not override hashCode(): org.candlepin.model.PoolAttribute
2021-05-11 15:20:09,809 [thread=main] [=, org=, csid=] WARN  org.hibernate.metamodel.internal.MetadataContext - HHH015011: Unable to locate static metamodel field : org.candlepin.model.KeyPair_#privateKey; this may or may not indicate a problem with the static metamodel
2021-05-11 15:20:09,810 [thread=main] [=, org=, csid=] WARN  org.hibernate.metamodel.internal.MetadataContext - HHH015011: Unable to locate static metamodel field : org.candlepin.model.KeyPair_#publicKey; this may or may not indicate a problem with the static metamodel
2021-05-11 16:33:04,020 [thread=main] [=, org=, csid=] WARN  org.hibernate.id.UUIDHexGenerator - HHH000409: Using org.hibernate.id.UUIDHexGenerator which does not generate IETF RFC 4122 compliant UUID values; consider using org.hibernate.id.UUIDGenerator instead
2021-05-11 16:33:04,274 [thread=main] [=, org=, csid=] WARN  org.hibernate.mapping.RootClass - HHH000038: Composite-id class does not override equals(): org.candlepin.model.PoolAttribute
2021-05-11 16:33:04,275 [thread=main] [=, org=, csid=] WARN  org.hibernate.mapping.RootClass - HHH000039: Composite-id class does not override hashCode(): org.candlepin.model.PoolAttribute
2021-05-11 16:33:11,637 [thread=main] [=, org=, csid=] WARN  org.hibernate.metamodel.internal.MetadataContext - HHH015011: Unable to locate static metamodel field : org.candlepin.model.KeyPair_#privateKey; this may or may not indicate a problem with the static metamodel
2021-05-11 16:33:11,637 [thread=main] [=, org=, csid=] WARN  org.hibernate.metamodel.internal.MetadataContext - HHH015011: Unable to locate static metamodel field : org.candlepin.model.KeyPair_#publicKey; this may or may not indicate a problem with the static metamodel
[root@scotty candlepin]#

candlepin.log:

[root@scotty candlepin]# cat candlepin.log 
2021-05-11 15:19:47,470 [thread=main] [=, org=, csid=] INFO  org.candlepin.guice.CandlepinContextListener - Candlepin initializing context.
2021-05-11 15:19:47,491 [thread=main] [=, org=, csid=] INFO  org.candlepin.pki.impl.JSSProviderLoader - Using JSS version 4.7.3
2021-05-11 15:19:47,752 [thread=main] [=, org=, csid=] INFO  org.candlepin.guice.CandlepinContextListener - Candlepin reading configuration.
2021-05-11 15:19:47,765 [thread=main] [=, org=, csid=] INFO  org.candlepin.common.config.EncryptedConfiguration - No secret file provided.
2021-05-11 15:19:47,822 [thread=main] [=, org=, csid=] INFO  org.candlepin.guice.CandlepinContextListener - Running under postgresql
2021-05-11 15:19:47,859 [thread=main] [=, org=, csid=] INFO  org.candlepin.guice.CandlepinContextListener - Candlepin will show support for the following capabilities: [instance_multiplier, derived_product, vcpu, cert_v3, hypervisors_heartbeat, remove_by_pool_id, syspurpose, insights_auto_register, storage_band, cores, hypervisors_async, org_level_content_access, guest_limit, ram, batch_bind]
2021-05-11 15:19:47,860 [thread=main] [=, org=, csid=] DEBUG org.candlepin.guice.CandlepinContextListener - Candlepin stored config on context.
2021-05-11 15:19:48,685 [thread=main] [=, org=, csid=] INFO  org.candlepin.guice.CustomizableModules - Found custom module module.config.adapter_module
2021-05-11 15:19:49,502 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: ActiveEntitlementJob: org.candlepin.async.tasks.ActiveEntitlementJob
2021-05-11 15:19:49,503 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: CRLUpdateJob: org.candlepin.async.tasks.CRLUpdateJob
2021-05-11 15:19:49,505 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: EntitlerJob: org.candlepin.async.tasks.EntitlerJob
2021-05-11 15:19:49,506 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: EntitleByProductsJob: org.candlepin.async.tasks.EntitleByProductsJob
2021-05-11 15:19:49,508 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: ExpiredPoolsCleanupJob: org.candlepin.async.tasks.ExpiredPoolsCleanupJob
2021-05-11 15:19:49,509 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: ExportJob: org.candlepin.async.tasks.ExportJob
2021-05-11 15:19:49,511 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: HealEntireOrgJob: org.candlepin.async.tasks.HealEntireOrgJob
2021-05-11 15:19:49,512 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: HypervisorHeartbeatUpdateJob: org.candlepin.async.tasks.HypervisorHeartbeatUpdateJob
2021-05-11 15:19:49,513 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: HypervisorUpdateJob: org.candlepin.async.tasks.HypervisorUpdateJob
2021-05-11 15:19:49,515 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: ImportJob: org.candlepin.async.tasks.ImportJob
2021-05-11 15:19:49,516 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: ImportRecordCleanerJob: org.candlepin.async.tasks.ImportRecordCleanerJob
2021-05-11 15:19:49,518 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: JobCleaner: org.candlepin.async.tasks.JobCleaner
2021-05-11 15:19:49,519 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: ManifestCleanerJob: org.candlepin.async.tasks.ManifestCleanerJob
2021-05-11 15:19:49,521 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: OrphanCleanupJob: org.candlepin.async.tasks.OrphanCleanupJob
2021-05-11 15:19:49,522 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: RefreshPoolsForProductJob: org.candlepin.async.tasks.RefreshPoolsForProductJob
2021-05-11 15:19:49,523 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: RefreshPoolsJob: org.candlepin.async.tasks.RefreshPoolsJob
2021-05-11 15:19:49,525 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: RegenEnvEntitlementCertsJob: org.candlepin.async.tasks.RegenEnvEntitlementCertsJob
2021-05-11 15:19:49,526 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: RegenProductEntitlementCertsJob: org.candlepin.async.tasks.RegenProductEntitlementCertsJob
2021-05-11 15:19:49,528 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: UndoImportsJob: org.candlepin.async.tasks.UndoImportsJob
2021-05-11 15:19:49,529 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: UnmappedGuestEntitlementCleanerJob: org.candlepin.async.tasks.UnmappedGuestEntitlementCleanerJob
2021-05-11 15:19:58,584 [thread=main] [=, org=, csid=] DEBUG org.candlepin.audit.ArtemisMessageSourceReceiverFactory - Registering event listener for queue: event.org.candlepin.audit.LoggingListener
2021-05-11 15:19:58,598 [thread=main] [=, org=, csid=] DEBUG org.candlepin.guice.I18nProvider - Getting i18n engine for locale en_US
2021-05-11 15:19:58,640 [thread=main] [=, org=, csid=] DEBUG org.candlepin.audit.ArtemisMessageSourceReceiverFactory - Registering event listener for queue: event.org.candlepin.audit.ActivationListener
2021-05-11 15:20:05,007 [thread=main] [=, org=, csid=] WARN  org.hibernate.id.UUIDHexGenerator - HHH000409: Using org.hibernate.id.UUIDHexGenerator which does not generate IETF RFC 4122 compliant UUID values; consider using org.hibernate.id.UUIDGenerator instead
2021-05-11 15:20:05,179 [thread=main] [=, org=, csid=] WARN  org.hibernate.mapping.RootClass - HHH000038: Composite-id class does not override equals(): org.candlepin.model.PoolAttribute
2021-05-11 15:20:05,180 [thread=main] [=, org=, csid=] WARN  org.hibernate.mapping.RootClass - HHH000039: Composite-id class does not override hashCode(): org.candlepin.model.PoolAttribute
2021-05-11 15:20:09,809 [thread=main] [=, org=, csid=] WARN  org.hibernate.metamodel.internal.MetadataContext - HHH015011: Unable to locate static metamodel field : org.candlepin.model.KeyPair_#privateKey; this may or may not indicate a problem with the static metamodel
2021-05-11 15:20:09,810 [thread=main] [=, org=, csid=] WARN  org.hibernate.metamodel.internal.MetadataContext - HHH015011: Unable to locate static metamodel field : org.candlepin.model.KeyPair_#publicKey; this may or may not indicate a problem with the static metamodel
2021-05-11 15:20:10,017 [thread=main] [=, org=, csid=] DEBUG org.candlepin.policy.js.JsRunnerProvider - Compiling rules for initial load
2021-05-11 16:32:29,725 [thread=main] [=, org=, csid=] INFO  org.candlepin.guice.CandlepinContextListener - Candlepin initializing context.
2021-05-11 16:32:29,822 [thread=main] [=, org=, csid=] INFO  org.candlepin.pki.impl.JSSProviderLoader - Using JSS version 4.7.3
2021-05-11 16:32:30,469 [thread=main] [=, org=, csid=] INFO  org.candlepin.guice.CandlepinContextListener - Candlepin reading configuration.
2021-05-11 16:32:30,531 [thread=main] [=, org=, csid=] INFO  org.candlepin.common.config.EncryptedConfiguration - No secret file provided.
2021-05-11 16:32:30,777 [thread=main] [=, org=, csid=] INFO  org.candlepin.guice.CandlepinContextListener - Running under postgresql
2021-05-11 16:32:30,875 [thread=main] [=, org=, csid=] INFO  org.candlepin.guice.CandlepinContextListener - Candlepin will show support for the following capabilities: [instance_multiplier, derived_product, vcpu, cert_v3, hypervisors_heartbeat, remove_by_pool_id, syspurpose, insights_auto_register, storage_band, cores, hypervisors_async, org_level_content_access, guest_limit, ram, batch_bind]
2021-05-11 16:32:30,887 [thread=main] [=, org=, csid=] DEBUG org.candlepin.guice.CandlepinContextListener - Candlepin stored config on context.
2021-05-11 16:32:33,260 [thread=main] [=, org=, csid=] INFO  org.candlepin.guice.CustomizableModules - Found custom module module.config.adapter_module
2021-05-11 16:32:35,804 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: ActiveEntitlementJob: org.candlepin.async.tasks.ActiveEntitlementJob
2021-05-11 16:32:35,806 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: CRLUpdateJob: org.candlepin.async.tasks.CRLUpdateJob
2021-05-11 16:32:35,808 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: EntitlerJob: org.candlepin.async.tasks.EntitlerJob
2021-05-11 16:32:35,809 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: EntitleByProductsJob: org.candlepin.async.tasks.EntitleByProductsJob
2021-05-11 16:32:35,811 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: ExpiredPoolsCleanupJob: org.candlepin.async.tasks.ExpiredPoolsCleanupJob
2021-05-11 16:32:35,812 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: ExportJob: org.candlepin.async.tasks.ExportJob
2021-05-11 16:32:35,836 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: HealEntireOrgJob: org.candlepin.async.tasks.HealEntireOrgJob
2021-05-11 16:32:35,837 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: HypervisorHeartbeatUpdateJob: org.candlepin.async.tasks.HypervisorHeartbeatUpdateJob
2021-05-11 16:32:35,839 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: HypervisorUpdateJob: org.candlepin.async.tasks.HypervisorUpdateJob
2021-05-11 16:32:35,841 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: ImportJob: org.candlepin.async.tasks.ImportJob
2021-05-11 16:32:35,842 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: ImportRecordCleanerJob: org.candlepin.async.tasks.ImportRecordCleanerJob
2021-05-11 16:32:35,844 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: JobCleaner: org.candlepin.async.tasks.JobCleaner
2021-05-11 16:32:35,868 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: ManifestCleanerJob: org.candlepin.async.tasks.ManifestCleanerJob
2021-05-11 16:32:35,870 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: OrphanCleanupJob: org.candlepin.async.tasks.OrphanCleanupJob
2021-05-11 16:32:35,871 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: RefreshPoolsForProductJob: org.candlepin.async.tasks.RefreshPoolsForProductJob
2021-05-11 16:32:35,873 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: RefreshPoolsJob: org.candlepin.async.tasks.RefreshPoolsJob
2021-05-11 16:32:35,874 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: RegenEnvEntitlementCertsJob: org.candlepin.async.tasks.RegenEnvEntitlementCertsJob
2021-05-11 16:32:35,876 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: RegenProductEntitlementCertsJob: org.candlepin.async.tasks.RegenProductEntitlementCertsJob
2021-05-11 16:32:35,877 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: UndoImportsJob: org.candlepin.async.tasks.UndoImportsJob
2021-05-11 16:32:35,901 [thread=main] [=, org=, csid=] INFO  org.candlepin.async.JobManager - Registering job: UnmappedGuestEntitlementCleanerJob: org.candlepin.async.tasks.UnmappedGuestEntitlementCleanerJob
2021-05-11 16:32:53,577 [thread=main] [=, org=, csid=] DEBUG org.candlepin.audit.ArtemisMessageSourceReceiverFactory - Registering event listener for queue: event.org.candlepin.audit.LoggingListener
2021-05-11 16:32:53,595 [thread=main] [=, org=, csid=] DEBUG org.candlepin.guice.I18nProvider - Getting i18n engine for locale en_US
2021-05-11 16:32:53,649 [thread=main] [=, org=, csid=] DEBUG org.candlepin.audit.ArtemisMessageSourceReceiverFactory - Registering event listener for queue: event.org.candlepin.audit.ActivationListener
2021-05-11 16:33:04,020 [thread=main] [=, org=, csid=] WARN  org.hibernate.id.UUIDHexGenerator - HHH000409: Using org.hibernate.id.UUIDHexGenerator which does not generate IETF RFC 4122 compliant UUID values; consider using org.hibernate.id.UUIDGenerator instead
2021-05-11 16:33:04,274 [thread=main] [=, org=, csid=] WARN  org.hibernate.mapping.RootClass - HHH000038: Composite-id class does not override equals(): org.candlepin.model.PoolAttribute
2021-05-11 16:33:04,275 [thread=main] [=, org=, csid=] WARN  org.hibernate.mapping.RootClass - HHH000039: Composite-id class does not override hashCode(): org.candlepin.model.PoolAttribute
2021-05-11 16:33:11,637 [thread=main] [=, org=, csid=] WARN  org.hibernate.metamodel.internal.MetadataContext - HHH015011: Unable to locate static metamodel field : org.candlepin.model.KeyPair_#privateKey; this may or may not indicate a problem with the static metamodel
2021-05-11 16:33:11,637 [thread=main] [=, org=, csid=] WARN  org.hibernate.metamodel.internal.MetadataContext - HHH015011: Unable to locate static metamodel field : org.candlepin.model.KeyPair_#publicKey; this may or may not indicate a problem with the static metamodel
2021-05-11 16:33:11,995 [thread=main] [=, org=, csid=] DEBUG org.candlepin.policy.js.JsRunnerProvider - Compiling rules for initial load
[root@scotty candlepin]#

localhost.log from Tomcat:

[root@scotty tomcat]# cat localhost.2021-05-11.log 
11-May-2021 19:48:54.510 SEVERE [main] org.apache.catalina.core.StandardContext.listenerStop Exception sending context destroyed event to listener instance of class [org.candlepin.guice.CandlepinContextListener]
        java.lang.NullPointerException
                at org.jboss.resteasy.plugins.guice.GuiceResteasyBootstrapServletContextListener.triggerAnnotatedMethods(GuiceResteasyBootstrapServletContextListener.java:151)
                at org.jboss.resteasy.plugins.guice.GuiceResteasyBootstrapServletContextListener.contextDestroyed(GuiceResteasyBootstrapServletContextListener.java:146)
                at org.candlepin.guice.CandlepinContextListener.contextDestroyed(CandlepinContextListener.java:249)
                at org.apache.catalina.core.StandardContext.listenerStop(StandardContext.java:4732)
                at org.apache.catalina.core.StandardContext.stopInternal(StandardContext.java:5396)
                at org.apache.catalina.util.LifecycleBase.stop(LifecycleBase.java:257)
                at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:187)
                at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:717)
                at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:690)
                at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:705)
                at org.apache.catalina.startup.HostConfig.deployDirectory(HostConfig.java:1133)
                at org.apache.catalina.startup.HostConfig$DeployDirectory.run(HostConfig.java:1867)
                at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
                at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
                at org.apache.tomcat.util.threads.InlineExecutorService.execute(InlineExecutorService.java:75)
                at java.base/java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:118)
                at org.apache.catalina.startup.HostConfig.deployDirectories(HostConfig.java:1045)
                at org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:429)
                at org.apache.catalina.startup.HostConfig.start(HostConfig.java:1576)
                at org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:309)
                at org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:123)
                at org.apache.catalina.util.LifecycleBase.setStateInternal(LifecycleBase.java:423)
                at org.apache.catalina.util.LifecycleBase.setState(LifecycleBase.java:366)
                at org.apache.catalina.core.ContainerBase.startInternal(ContainerBase.java:936)
                at org.apache.catalina.core.StandardHost.startInternal(StandardHost.java:841)
                at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:183)
                at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1384)
                at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1374)
                at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
                at org.apache.tomcat.util.threads.InlineExecutorService.execute(InlineExecutorService.java:75)
                at java.base/java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:140)
                at org.apache.catalina.core.ContainerBase.startInternal(ContainerBase.java:909)
                at org.apache.catalina.core.StandardEngine.startInternal(StandardEngine.java:262)
                at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:183)
                at org.apache.catalina.core.StandardService.startInternal(StandardService.java:421)
                at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:183)
                at org.apache.catalina.core.StandardServer.startInternal(StandardServer.java:930)
                at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:183)
                at org.apache.catalina.startup.Catalina.start(Catalina.java:633)
                at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
                at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
                at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
                at java.base/java.lang.reflect.Method.invoke(Method.java:566)
                at org.apache.catalina.startup.Bootstrap.start(Bootstrap.java:343)
                at org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:474)
[root@scotty tomcat]#
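The NullPointerException above is thrown while Tomcat tears the context down again; the original startup failure ("One or more listeners failed to start") is logged separately. A sketch for pulling the surrounding lines out of the Tomcat logs (paths are the EL defaults; adjust to your setup):

```shell
# Collect context lines around SEVERE / listenerStart entries from the
# Tomcat logs; fall back to a message if no logs are present on this box
out=$(grep -B2 -A10 -E 'SEVERE|listenerStart' /var/log/tomcat/*.log 2>/dev/null)
if [ -z "$out" ]; then
  out="no matching entries found under /var/log/tomcat"
fi
echo "$out"
```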

catalina.log:

[root@scotty tomcat]# cat catalina.2021-05-11.log 
11-May-2021 19:48:19.027 WARNING [main] org.apache.catalina.startup.SetAllPropertiesRule.begin [SetAllPropertiesRule]{Server/Service/Connector} Setting property 'sslProtocols' to 'TLSv1.2' did not find a matching property.
11-May-2021 19:48:19.198 WARNING [main] org.apache.tomcat.util.digester.SetPropertiesRule.begin Match [Server/Service/Engine/Host] failed to set property [xmlValidation] to [false]
11-May-2021 19:48:19.198 WARNING [main] org.apache.tomcat.util.digester.SetPropertiesRule.begin Match [Server/Service/Engine/Host] failed to set property [xmlNamespaceAware] to [false]
11-May-2021 19:48:19.211 INFO [main] org.apache.catalina.core.AprLifecycleListener.lifecycleEvent The APR based Apache Tomcat Native library which allows optimal performance in production environments was not found on the java.library.path: [/usr/java/packages/lib:/usr/lib64:/lib64:/lib:/usr/lib]
11-May-2021 19:48:20.131 INFO [main] org.apache.coyote.AbstractProtocol.init Initializing ProtocolHandler ["https-jsse-nio-127.0.0.1-23443"]
11-May-2021 19:48:21.456 INFO [main] org.apache.catalina.startup.Catalina.load Server initialization in [2,934] milliseconds
11-May-2021 19:48:21.656 INFO [main] org.apache.catalina.core.StandardService.startInternal Starting service [Catalina]
11-May-2021 19:48:21.657 INFO [main] org.apache.catalina.core.StandardEngine.startInternal Starting Servlet engine: [Apache Tomcat/9.0.30]
11-May-2021 19:48:21.675 INFO [main] org.apache.catalina.startup.HostConfig.deployDirectory Deploying web application directory [/var/lib/tomcat/webapps/candlepin]
11-May-2021 19:48:32.251 INFO [main] org.apache.jasper.servlet.TldScanner.scanJars At least one JAR was scanned for TLDs yet contained no TLDs. Enable debug logging for this logger for a complete list of JARs that were scanned but no TLDs were found in them. Skipping unneeded JARs during scanning can improve startup time and JSP compilation time.
11-May-2021 19:48:40.316 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public org.candlepin.model.Persisted org.candlepin.model.OwnerCurator.create(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@69ff5aca]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
11-May-2021 19:48:40.520 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public org.candlepin.model.Persisted org.candlepin.model.ProductCurator.merge(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@69ff5aca]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
11-May-2021 19:48:40.521 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public void org.candlepin.model.ProductCurator.delete(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@69ff5aca]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
11-May-2021 19:48:40.523 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public org.candlepin.model.Persisted org.candlepin.model.ProductCurator.create(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@69ff5aca]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
11-May-2021 19:48:40.734 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public void org.candlepin.model.EntitlementCurator.delete(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@69ff5aca]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
11-May-2021 19:48:40.787 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public void org.candlepin.model.ConsumerCurator.delete(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@69ff5aca]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
11-May-2021 19:48:40.788 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public org.candlepin.model.Persisted org.candlepin.model.ConsumerCurator.create(org.candlepin.model.Persisted,boolean)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@69ff5aca]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
11-May-2021 19:48:41.063 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public void org.candlepin.model.CdnCurator.delete(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@69ff5aca]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
11-May-2021 19:48:41.111 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public void org.candlepin.model.PoolCurator.delete(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@69ff5aca]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
11-May-2021 19:48:41.464 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public void org.candlepin.model.RulesCurator.delete(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@69ff5aca]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
11-May-2021 19:48:41.465 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public org.candlepin.model.Persisted org.candlepin.model.RulesCurator.create(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@69ff5aca]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
11-May-2021 19:48:41.531 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public void org.candlepin.model.ContentCurator.delete(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@69ff5aca]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
11-May-2021 19:48:42.051 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public void org.candlepin.model.EntitlementCertificateCurator.delete(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@69ff5aca]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
11-May-2021 19:48:54.440 SEVERE [main] org.apache.catalina.core.StandardContext.startInternal One or more listeners failed to start. Full details will be found in the appropriate container log file
11-May-2021 19:48:54.452 SEVERE [main] org.apache.catalina.core.StandardContext.startInternal Context [/candlepin] startup failed due to previous errors
11-May-2021 19:48:54.517 WARNING [main] org.apache.catalina.loader.WebappClassLoaderBase.clearReferencesJdbc The web application [candlepin] registered the JDBC driver [org.postgresql.Driver] but failed to unregister it when the web application was stopped. To prevent a memory leak, the JDBC Driver has been forcibly unregistered.
11-May-2021 19:48:54.583 INFO [main] org.apache.catalina.startup.HostConfig.deployDirectory Deployment of web application directory [/var/lib/tomcat/webapps/candlepin] has finished in [32,908] ms
11-May-2021 19:48:54.597 INFO [main] org.apache.coyote.AbstractProtocol.start Starting ProtocolHandler ["https-jsse-nio-127.0.0.1-23443"]
11-May-2021 19:48:54.619 INFO [main] org.apache.catalina.startup.Catalina.start Server startup in [33,162] milliseconds
[root@scotty tomcat]#
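The "appropriate container log file" mentioned in the SEVERE line above should contain the actual listener exception. A quick way to surface it (a sketch only; the log paths assume a default EL8 Tomcat/Candlepin layout) would be:

```shell
# Scan the usual Tomcat and Candlepin logs for the root-cause exception
# behind "One or more listeners failed to start" (paths are assumptions
# based on a default EL8 install).
for f in /var/log/tomcat/catalina.*.log /var/log/candlepin/candlepin.log; do
  if [ -f "$f" ]; then
    echo "== $f =="
    grep -iE 'severe|error|caused by' "$f" | tail -n 20
  fi
done
echo "log scan done"
```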

Running the installer step db:seed by hand ends up with:

[root@scotty tomcat]# /usr/sbin/foreman-rake db:seed   
User with login admin already exists, not seeding as admin.
rake aborted!
There was an issue with the backend service candlepin: 404 Not Found
/usr/share/gems/gems/katello-4.0.0/app/lib/actions/middleware/backend_services_check.rb:17:in `block in plan'
/usr/share/gems/gems/katello-4.0.0/app/lib/actions/middleware/backend_services_check.rb:15:in `each'
/usr/share/gems/gems/katello-4.0.0/app/lib/actions/middleware/backend_services_check.rb:15:in `plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:27:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:19:in `pass'
/usr/share/gems/gems/katello-4.0.0/app/lib/actions/middleware/propagate_candlepin_errors.rb:5:in `block in plan'
/usr/share/gems/gems/katello-4.0.0/app/lib/actions/middleware/propagate_candlepin_errors.rb:19:in `propagate_candlepin_errors'
/usr/share/gems/gems/katello-4.0.0/app/lib/actions/middleware/propagate_candlepin_errors.rb:5:in `plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:27:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:19:in `pass'
/usr/share/gems/gems/katello-4.0.0/app/lib/actions/middleware/remote_action.rb:9:in `plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:27:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:19:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:36:in `plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:27:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:19:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:36:in `plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:27:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:19:in `pass'
/usr/share/gems/gems/katello-4.0.0/app/lib/actions/middleware/keep_locale.rb:7:in `plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:27:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:19:in `pass'
/usr/share/gems/gems/foreman-tasks-4.0.1/app/lib/actions/middleware/keep_current_request_id.rb:10:in `block in plan'
/usr/share/gems/gems/foreman-tasks-4.0.1/app/lib/actions/middleware/keep_current_request_id.rb:34:in `with_current_request_id'
/usr/share/gems/gems/foreman-tasks-4.0.1/app/lib/actions/middleware/keep_current_request_id.rb:9:in `plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:27:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:19:in `pass'
/usr/share/gems/gems/foreman-tasks-4.0.1/app/lib/actions/middleware/keep_current_timezone.rb:10:in `block in plan'
/usr/share/gems/gems/foreman-tasks-4.0.1/app/lib/actions/middleware/keep_current_timezone.rb:31:in `with_current_timezone'
/usr/share/gems/gems/foreman-tasks-4.0.1/app/lib/actions/middleware/keep_current_timezone.rb:9:in `plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:27:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:19:in `pass'
/usr/share/gems/gems/foreman-tasks-4.0.1/app/lib/actions/middleware/keep_current_taxonomies.rb:10:in `block in plan'
/usr/share/gems/gems/foreman-tasks-4.0.1/app/lib/actions/middleware/keep_current_taxonomies.rb:30:in `with_current_taxonomies'
/usr/share/gems/gems/foreman-tasks-4.0.1/app/lib/actions/middleware/keep_current_taxonomies.rb:9:in `plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:27:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:19:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:36:in `plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:27:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:19:in `pass'
/usr/share/gems/gems/foreman-tasks-4.0.1/app/lib/actions/middleware/keep_current_user.rb:10:in `block in plan'
/usr/share/gems/gems/foreman-tasks-4.0.1/app/lib/actions/middleware/keep_current_user.rb:41:in `with_current_user'
/usr/share/gems/gems/foreman-tasks-4.0.1/app/lib/actions/middleware/keep_current_user.rb:9:in `plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/world.rb:31:in `execute'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/action.rb:513:in `block (2 levels) in execute_plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/execution_plan.rb:384:in `switch_flow'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/action.rb:417:in `concurrence'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/action.rb:512:in `block in execute_plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/action.rb:472:in `block in with_error_handling'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/action.rb:472:in `catch'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/action.rb:472:in `with_error_handling'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/action.rb:511:in `execute_plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/action.rb:285:in `execute'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/execution_plan/steps/plan_step.rb:55:in `block in execute'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/execution_plan/steps/abstract.rb:167:in `with_meta_calculation'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/execution_plan/steps/plan_step.rb:54:in `execute'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/action.rb:445:in `plan_action'
/usr/share/gems/gems/katello-4.0.0/app/lib/actions/katello/organization/create.rb:15:in `block in plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/execution_plan.rb:384:in `switch_flow'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/action.rb:422:in `sequence'
/usr/share/gems/gems/katello-4.0.0/app/lib/actions/katello/organization/create.rb:14:in `plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/action.rb:514:in `block (3 levels) in execute_plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:27:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:19:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:36:in `plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:27:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:19:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:36:in `plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:27:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:19:in `pass'
/usr/share/gems/gems/katello-4.0.0/app/lib/actions/middleware/keep_locale.rb:7:in `plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:27:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:19:in `pass'
/usr/share/gems/gems/foreman-tasks-4.0.1/app/lib/actions/middleware/keep_current_request_id.rb:10:in `block in plan'
/usr/share/gems/gems/foreman-tasks-4.0.1/app/lib/actions/middleware/keep_current_request_id.rb:34:in `with_current_request_id'
/usr/share/gems/gems/foreman-tasks-4.0.1/app/lib/actions/middleware/keep_current_request_id.rb:9:in `plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:27:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:19:in `pass'
/usr/share/gems/gems/foreman-tasks-4.0.1/app/lib/actions/middleware/keep_current_timezone.rb:10:in `block in plan'
/usr/share/gems/gems/foreman-tasks-4.0.1/app/lib/actions/middleware/keep_current_timezone.rb:31:in `with_current_timezone'
/usr/share/gems/gems/foreman-tasks-4.0.1/app/lib/actions/middleware/keep_current_timezone.rb:9:in `plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:27:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:19:in `pass'
/usr/share/gems/gems/foreman-tasks-4.0.1/app/lib/actions/middleware/keep_current_taxonomies.rb:10:in `block in plan'
/usr/share/gems/gems/foreman-tasks-4.0.1/app/lib/actions/middleware/keep_current_taxonomies.rb:30:in `with_current_taxonomies'
/usr/share/gems/gems/foreman-tasks-4.0.1/app/lib/actions/middleware/keep_current_taxonomies.rb:9:in `plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:27:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:19:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:36:in `plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:27:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:19:in `pass'
/usr/share/gems/gems/foreman-tasks-4.0.1/app/lib/actions/middleware/keep_current_user.rb:10:in `block in plan'
/usr/share/gems/gems/foreman-tasks-4.0.1/app/lib/actions/middleware/keep_current_user.rb:41:in `with_current_user'
/usr/share/gems/gems/foreman-tasks-4.0.1/app/lib/actions/middleware/keep_current_user.rb:9:in `plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/world.rb:31:in `execute'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/action.rb:513:in `block (2 levels) in execute_plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/execution_plan.rb:384:in `switch_flow'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/action.rb:417:in `concurrence'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/action.rb:512:in `block in execute_plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/action.rb:472:in `block in with_error_handling'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/action.rb:472:in `catch'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/action.rb:472:in `with_error_handling'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/action.rb:511:in `execute_plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/action.rb:285:in `execute'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/execution_plan/steps/plan_step.rb:55:in `block in execute'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/execution_plan/steps/abstract.rb:167:in `with_meta_calculation'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/execution_plan/steps/plan_step.rb:54:in `execute'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/execution_plan.rb:286:in `block (2 levels) in plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/execution_plan.rb:384:in `switch_flow'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/execution_plan.rb:374:in `with_planning_scope'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/execution_plan.rb:285:in `block in plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:27:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:19:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:44:in `plan_phase'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:27:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:19:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:44:in `plan_phase'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:27:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:19:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:44:in `plan_phase'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:27:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:19:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:44:in `plan_phase'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:27:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:19:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:44:in `plan_phase'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:27:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:19:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:44:in `plan_phase'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:27:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:19:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/common/transaction.rb:17:in `block in rollback_on_error'
/usr/share/gems/gems/activerecord-6.0.3.4/lib/active_record/connection_adapters/abstract/database_statements.rb:280:in `block in transaction'
/usr/share/gems/gems/activerecord-6.0.3.4/lib/active_record/connection_adapters/abstract/transaction.rb:280:in `block in within_new_transaction'
/usr/share/gems/gems/activesupport-6.0.3.4/lib/active_support/concurrency/load_interlock_aware_monitor.rb:26:in `block (2 levels) in synchronize'
/usr/share/gems/gems/activesupport-6.0.3.4/lib/active_support/concurrency/load_interlock_aware_monitor.rb:25:in `handle_interrupt'
/usr/share/gems/gems/activesupport-6.0.3.4/lib/active_support/concurrency/load_interlock_aware_monitor.rb:25:in `block in synchronize'
/usr/share/gems/gems/activesupport-6.0.3.4/lib/active_support/concurrency/load_interlock_aware_monitor.rb:21:in `handle_interrupt'
/usr/share/gems/gems/activesupport-6.0.3.4/lib/active_support/concurrency/load_interlock_aware_monitor.rb:21:in `synchronize'
/usr/share/gems/gems/activerecord-6.0.3.4/lib/active_record/connection_adapters/abstract/transaction.rb:278:in `within_new_transaction'
/usr/share/gems/gems/activerecord-6.0.3.4/lib/active_record/connection_adapters/abstract/database_statements.rb:280:in `transaction'
/usr/share/gems/gems/activerecord-6.0.3.4/lib/active_record/transactions.rb:212:in `transaction'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/transaction_adapters/active_record.rb:6:in `transaction'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/common/transaction.rb:16:in `rollback_on_error'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/common/transaction.rb:6:in `plan_phase'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:27:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:19:in `pass'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware.rb:44:in `plan_phase'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/stack.rb:23:in `call'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/middleware/world.rb:31:in `execute'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/execution_plan.rb:284:in `plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/world.rb:211:in `block (2 levels) in plan_with_options'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/coordinator.rb:326:in `acquire'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/world.rb:209:in `block in plan_with_options'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/world.rb:208:in `tap'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/world.rb:208:in `plan_with_options'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/world.rb:204:in `plan'
/usr/share/gems/gems/dynflow-1.4.7/lib/dynflow/world.rb:180:in `trigger'
/usr/share/gems/gems/foreman-tasks-4.0.1/lib/foreman_tasks.rb:23:in `trigger'
/usr/share/gems/gems/foreman-tasks-4.0.1/lib/foreman_tasks.rb:29:in `block in trigger_task'
/usr/share/gems/gems/foreman-tasks-4.0.1/lib/foreman_tasks.rb:49:in `block in rails_safe_trigger_task'
/usr/share/gems/gems/activesupport-6.0.3.4/lib/active_support/dependencies/interlock.rb:48:in `block in permit_concurrent_loads'
/usr/share/gems/gems/activesupport-6.0.3.4/lib/active_support/concurrency/share_lock.rb:187:in `yield_shares'
/usr/share/gems/gems/activesupport-6.0.3.4/lib/active_support/dependencies/interlock.rb:47:in `permit_concurrent_loads'
/usr/share/gems/gems/foreman-tasks-4.0.1/lib/foreman_tasks.rb:48:in `rails_safe_trigger_task'
/usr/share/gems/gems/foreman-tasks-4.0.1/lib/foreman_tasks.rb:27:in `trigger_task'
/usr/share/gems/gems/foreman-tasks-4.0.1/lib/foreman_tasks.rb:58:in `sync_task'
/usr/share/gems/gems/katello-4.0.0/db/seeds.d/102-organizations.rb:10:in `block (2 levels) in <top (required)>'
/usr/share/foreman/app/models/concerns/foreman/thread_session.rb:108:in `as'
/usr/share/gems/gems/katello-4.0.0/db/seeds.d/102-organizations.rb:9:in `block in <top (required)>'
/usr/share/gems/gems/activerecord-6.0.3.4/lib/active_record/relation/delegation.rb:87:in `each'
/usr/share/gems/gems/activerecord-6.0.3.4/lib/active_record/relation/delegation.rb:87:in `each'
/usr/share/gems/gems/katello-4.0.0/db/seeds.d/102-organizations.rb:8:in `<top (required)>'
/usr/share/gems/gems/activesupport-6.0.3.4/lib/active_support/dependencies.rb:318:in `load'
/usr/share/gems/gems/activesupport-6.0.3.4/lib/active_support/dependencies.rb:318:in `block in load'
/usr/share/gems/gems/activesupport-6.0.3.4/lib/active_support/dependencies.rb:291:in `load_dependency'
/usr/share/gems/gems/activesupport-6.0.3.4/lib/active_support/dependencies.rb:318:in `load'
/usr/share/foreman/app/services/foreman_seeder.rb:46:in `block (2 levels) in execute'
/usr/share/foreman/app/models/concerns/foreman/thread_session.rb:108:in `as'
/usr/share/foreman/app/models/concerns/foreman/thread_session.rb:114:in `as_anonymous_admin'
/usr/share/foreman/app/services/foreman_seeder.rb:45:in `block in execute'
/usr/share/foreman/app/services/foreman_seeder.rb:39:in `each'
/usr/share/foreman/app/services/foreman_seeder.rb:39:in `execute'
/usr/share/foreman/db/seeds.rb:14:in `<top (required)>'
/usr/share/gems/gems/activesupport-6.0.3.4/lib/active_support/dependencies.rb:318:in `load'
/usr/share/gems/gems/activesupport-6.0.3.4/lib/active_support/dependencies.rb:318:in `block in load'
/usr/share/gems/gems/activesupport-6.0.3.4/lib/active_support/dependencies.rb:291:in `load_dependency'
/usr/share/gems/gems/activesupport-6.0.3.4/lib/active_support/dependencies.rb:318:in `load'
/usr/share/gems/gems/railties-6.0.3.4/lib/rails/engine.rb:559:in `load_seed'
/usr/share/gems/gems/activerecord-6.0.3.4/lib/active_record/tasks/database_tasks.rb:440:in `load_seed'
/usr/share/gems/gems/activerecord-6.0.3.4/lib/active_record/railties/databases.rake:331:in `block (2 levels) in <top (required)>'
/usr/share/gems/gems/rake-12.3.0/exe/rake:27:in `<top (required)>'
Tasks: TOP => db:seed
(See full trace by running task with --trace)
[root@scotty tomcat]#

It is the same message as in the foreman-installer run.
The listening ports are:

[root@scotty tomcat]# ss -tlpn
State               Recv-Q              Send-Q                                 Local Address:Port                              Peer Address:Port                                                                   
LISTEN              0                   128                                        127.0.0.1:5432                                   0.0.0.0:*                  users:(("postmaster",pid=1122,fd=4))                
LISTEN              0                   128                                          0.0.0.0:9090                                   0.0.0.0:*                  users:(("smart-proxy",pid=1095,fd=10))                               
LISTEN              0                   10                                         127.0.0.1:5671                                   0.0.0.0:*                  users:(("qpidd",pid=1096,fd=26))                    
LISTEN              0                   10                                         127.0.0.1:5672                                   0.0.0.0:*                  users:(("qpidd",pid=1096,fd=29))                                      
LISTEN              0                   128                                        127.0.0.1:6379                                   0.0.0.0:*                  users:(("redis-server",pid=1101,fd=6))              
LISTEN              0                   50                                           0.0.0.0:5646                                   0.0.0.0:*                  users:(("qdrouterd",pid=1081,fd=13))                
LISTEN              0                   50                                           0.0.0.0:5647                                   0.0.0.0:*                  users:(("qdrouterd",pid=1081,fd=11))                
LISTEN              0                   128                                          0.0.0.0:22                                     0.0.0.0:*                  users:(("sshd",pid=1090,fd=5))                      
LISTEN              0                   128                                            [::1]:5432                                      [::]:*                  users:(("postmaster",pid=1122,fd=3))                
LISTEN              0                   128                                             [::]:9090                                      [::]:*                  users:(("smart-proxy",pid=1095,fd=11))              
LISTEN              0                   1                                 [::ffff:127.0.0.1]:8005                                         *:*                  users:(("java",pid=3797,fd=58))                     
LISTEN              0                   10                                             [::1]:5671                                      [::]:*                  users:(("qpidd",pid=1096,fd=28))                    
LISTEN              0                   10                                             [::1]:5672                                      [::]:*                  users:(("qpidd",pid=1096,fd=30))                    
LISTEN              0                   50                                                 *:8140                                         *:*                  users:(("java",pid=1185,fd=39))                     
LISTEN              0                   50                                              [::]:5646                                      [::]:*                  users:(("qdrouterd",pid=1081,fd=14))                
LISTEN              0                   50                                              [::]:5647                                      [::]:*                  users:(("qdrouterd",pid=1081,fd=12))                
LISTEN              0                   100                               [::ffff:127.0.0.1]:23443                                        *:*                  users:(("java",pid=3797,fd=46))                     
LISTEN              0                   128                                             [::]:22                                        [::]:*                  users:(("sshd",pid=1090,fd=7))                      
[root@scotty tomcat]# 
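As the ss output shows, Candlepin's Tomcat connector is listening only on loopback (127.0.0.1:23443), so the service can be probed directly. A minimal check (assuming the standard /candlepin/status path) would be:

```shell
# Probe the Candlepin status endpoint on the loopback HTTPS port seen in the
# ss output above. A healthy deployment returns 200; a failed webapp context
# typically yields 404. (Port and path are taken from this setup, not universal.)
code=$(curl -ks -o /dev/null -w '%{http_code}' https://127.0.0.1:23443/candlepin/status)
echo "GET /candlepin/status -> HTTP ${code}"
```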

What version of RHEL 8 are you running? Still 8.3? So far I have not been able to replicate this. And you confirmed there don't appear to be any SELinux issues?

Yes. RHEL 8.3:

[root@scotty ~]# cat /etc/os-release 
NAME="Red Hat Enterprise Linux"
VERSION="8.3 (Ootpa)"
ID="rhel"
ID_LIKE="fedora"
VERSION_ID="8.3"
PLATFORM_ID="platform:el8"
PRETTY_NAME="Red Hat Enterprise Linux 8.3 (Ootpa)"
ANSI_COLOR="0;31"
CPE_NAME="cpe:/o:redhat:enterprise_linux:8.3:GA"
HOME_URL="https://www.redhat.com/"
BUG_REPORT_URL="https://bugzilla.redhat.com/"

REDHAT_BUGZILLA_PRODUCT="Red Hat Enterprise Linux 8"
REDHAT_BUGZILLA_PRODUCT_VERSION=8.3
REDHAT_SUPPORT_PRODUCT="Red Hat Enterprise Linux"
REDHAT_SUPPORT_PRODUCT_VERSION="8.3"
[root@scotty ~]#

I’m no SELinux expert, but I no longer see any SELinux-related messages.
The VM works out of the box; I’m really confused …

I will now try a foreman-installer run with SELinux completely disabled …

No success :pensive:

[root@scotty ~]# foreman-installer --certs-reset --scenario katello
2021-05-11 20:07:53 [NOTICE] [root] Loading default values from puppet modules...
2021-05-11 20:08:07 [NOTICE] [root] ... finished
2021-05-11 20:08:17 [NOTICE] [root] Running validation checks
Marking certificate /root/ssl-build/scotty.home.petersen20.de/pulp-client for update
Marking certificate /root/ssl-build/scotty.home.petersen20.de/scotty.home.petersen20.de-apache for update
Marking certificate /root/ssl-build/scotty.home.petersen20.de/scotty.home.petersen20.de-qpid-broker for update
Marking certificate /root/ssl-build/scotty.home.petersen20.de/scotty.home.petersen20.de-foreman-proxy-client for update
Marking certificate /root/ssl-build/scotty.home.petersen20.de/scotty.home.petersen20.de-foreman-client for update
Marking certificate /root/ssl-build/scotty.home.petersen20.de/scotty.home.petersen20.de-qpid-router-server for update
Marking certificate /root/ssl-build/scotty.home.petersen20.de/scotty.home.petersen20.de-foreman-proxy for update
Marking certificate /root/ssl-build/scotty.home.petersen20.de/scotty.home.petersen20.de-puppet-client for update
Marking certificate /root/ssl-build/katello-server-ca for update
2021-05-11 20:08:22 [NOTICE] [configure] Starting system configuration.
  The total number of configuration tasks may increase during the run.
  Observe logs or specify --verbose-log-level to see individual configuration tasks.
2021-05-11 20:08:54 [NOTICE] [configure] 100 out of 1939 done.
2021-05-11 20:08:55 [NOTICE] [configure] 200 out of 1939 done.
2021-05-11 20:09:09 [NOTICE] [configure] 300 out of 1939 done.
2021-05-11 20:09:27 [NOTICE] [configure] 400 out of 1939 done.
2021-05-11 20:10:38 [NOTICE] [configure] 500 out of 1939 done.
2021-05-11 20:10:46 [NOTICE] [configure] 600 out of 1941 done.
2021-05-11 20:10:46 [NOTICE] [configure] 700 out of 1941 done.
2021-05-11 20:10:50 [NOTICE] [configure] 800 out of 1943 done.
2021-05-11 20:10:53 [NOTICE] [configure] 900 out of 1944 done.
2021-05-11 20:10:53 [NOTICE] [configure] 1000 out of 1946 done.
2021-05-11 20:10:54 [NOTICE] [configure] 1100 out of 1949 done.
2021-05-11 20:10:55 [NOTICE] [configure] 1200 out of 1950 done.
2021-05-11 20:11:02 [NOTICE] [configure] 1300 out of 1951 done.
2021-05-11 20:13:43 [ERROR ] [configure] /Stage[main]/Foreman::Database/Foreman::Rake[db:seed]/Exec[foreman-rake-db:seed]: Failed to call refresh: '/usr/sbin/foreman-rake db:seed' returned 1 instead of one of [0]
2021-05-11 20:13:43 [ERROR ] [configure] /Stage[main]/Foreman::Database/Foreman::Rake[db:seed]/Exec[foreman-rake-db:seed]: '/usr/sbin/foreman-rake db:seed' returned 1 instead of one of [0]
2021-05-11 20:13:43 [NOTICE] [configure] 1400 out of 1951 done.
2021-05-11 20:13:43 [NOTICE] [configure] 1500 out of 1951 done.
2021-05-11 20:13:43 [NOTICE] [configure] 1600 out of 1951 done.
2021-05-11 20:13:44 [NOTICE] [configure] 1700 out of 1951 done.
2021-05-11 20:13:44 [NOTICE] [configure] 1800 out of 1951 done.
2021-05-11 20:14:02 [NOTICE] [configure] 1900 out of 1951 done.
2021-05-11 20:14:12 [NOTICE] [configure] System configuration has finished.

  There were errors detected during install.
  Please address the errors and re-run the installer to ensure the system is properly configured.
  Failing to do so is likely to result in broken functionality.

  The full log is at /var/log/foreman-installer/katello.log
[root@scotty ~]#

From the katello.log:

2021-05-11 20:11:04 [DEBUG ] [configure] Class[Foreman::Database::Postgresql]: Starting to evaluate the resource (1378 of 1951)
2021-05-11 20:11:04 [DEBUG ] [configure] Class[Foreman::Database::Postgresql]: Evaluated in 0.00 seconds
2021-05-11 20:11:04 [DEBUG ] [configure] Foreman::Rake[db:migrate]: Starting to evaluate the resource (1379 of 1951)
2021-05-11 20:11:04 [DEBUG ] [configure] Foreman::Rake[db:migrate]: Evaluated in 0.00 seconds
2021-05-11 20:11:04 [DEBUG ] [configure] /Stage[main]/Foreman::Database/Foreman::Rake[db:migrate]/Exec[foreman-rake-db:migrate]: Starting to evaluate the resource (1380 of 1951)
2021-05-11 20:11:04 [DEBUG ] [configure] Exec[foreman-rake-db:migrate](provider=posix): Executing check '/usr/sbin/foreman-rake db:abort_if_pending_migrations'
2021-05-11 20:11:04 [DEBUG ] [configure] Executing with uid=foreman: '/usr/sbin/foreman-rake db:abort_if_pending_migrations'
2021-05-11 20:11:57 [DEBUG ] [configure] /Stage[main]/Foreman::Database/Foreman::Rake[db:migrate]/Exec[foreman-rake-db:migrate]: '/usr/sbin/foreman-rake db:migrate' won't be executed because of failed check 'unless'
2021-05-11 20:11:57 [DEBUG ] [configure] /Stage[main]/Foreman::Database/Foreman::Rake[db:migrate]/Exec[foreman-rake-db:migrate]: Evaluated in 53.08 seconds
2021-05-11 20:11:57 [DEBUG ] [configure] Foreman::Rake[db:migrate]: Starting to evaluate the resource (1381 of 1951)
2021-05-11 20:11:57 [DEBUG ] [configure] Foreman::Rake[db:migrate]: Evaluated in 0.00 seconds
2021-05-11 20:11:57 [DEBUG ] [configure] Prefetching cli resources for foreman_config_entry
2021-05-11 20:11:57 [DEBUG ] [configure] Executing with uid=foreman gid=foreman: '/usr/sbin/foreman-rake -- config '
2021-05-11 20:12:44 [DEBUG ] [configure] /Stage[main]/Foreman::Database/Foreman_config_entry[db_pending_seed]: Starting to evaluate the resource (1382 of 1951)
2021-05-11 20:12:44 [INFO  ] [configure] /Stage[main]/Foreman::Database/Foreman_config_entry[db_pending_seed]/value: value changed 'true' to 'false'
2021-05-11 20:12:44 [DEBUG ] [configure] /Stage[main]/Foreman::Database/Foreman_config_entry[db_pending_seed]: The container Class[Foreman::Database] will propagate my refresh event
2021-05-11 20:12:44 [DEBUG ] [configure] /Stage[main]/Foreman::Database/Foreman_config_entry[db_pending_seed]: Scheduling refresh of Foreman::Rake[db:seed]
2021-05-11 20:12:44 [DEBUG ] [configure] /Stage[main]/Foreman::Database/Foreman_config_entry[db_pending_seed]: Evaluated in 0.01 seconds
2021-05-11 20:12:44 [DEBUG ] [configure] Foreman::Rake[db:seed]: Starting to evaluate the resource (1383 of 1951)
2021-05-11 20:12:44 [DEBUG ] [configure] Foreman::Rake[db:seed]: Scheduling refresh of Exec[foreman-rake-db:seed]
2021-05-11 20:12:44 [DEBUG ] [configure] Foreman::Rake[db:seed]: Evaluated in 0.00 seconds
2021-05-11 20:12:44 [DEBUG ] [configure] /Stage[main]/Foreman::Database/Foreman::Rake[db:seed]/Exec[foreman-rake-db:seed]: Starting to evaluate the resource (1384 of 1951)
2021-05-11 20:12:44 [DEBUG ] [configure] /Stage[main]/Foreman::Database/Foreman::Rake[db:seed]/Exec[foreman-rake-db:seed]: '/usr/sbin/foreman-rake db:seed' won't be executed because of failed check 'refreshonly'
2021-05-11 20:12:44 [DEBUG ] [configure] Exec[foreman-rake-db:seed](provider=posix): Executing '/usr/sbin/foreman-rake db:seed'
2021-05-11 20:12:44 [DEBUG ] [configure] Executing with uid=foreman: '/usr/sbin/foreman-rake db:seed'
2021-05-11 20:13:43 [INFO  ] [configure] /Stage[main]/Foreman::Database/Foreman::Rake[db:seed]/Exec[foreman-rake-db:seed]/returns: rake aborted!
2021-05-11 20:13:43 [INFO  ] [configure] /Stage[main]/Foreman::Database/Foreman::Rake[db:seed]/Exec[foreman-rake-db:seed]/returns: There was an issue with the backend service candlepin: 404 Not Found
2021-05-11 20:13:43 [INFO  ] [configure] /Stage[main]/Foreman::Database/Foreman::Rake[db:seed]/Exec[foreman-rake-db:seed]/returns: /usr/share/gems/gems/katello-4.0.0/app/lib/actions/middleware/backend_services_check.rb:17:in `block in plan'
2021-05-11 20:13:43 [INFO  ] [configure] /Stage[main]/Foreman::Database/Foreman::Rake[db:seed]/Exec[foreman-rake-db:seed]/returns: /usr/share/gems/gems/katello-4.0.0/app/lib/actions/middleware/backend_services_check.rb:15:in `each'

Adding a note that might be relevant, @ekohl / @wbclark: we are seeing that same 404 in our puppet-candlepin spec testing, but only on EL8, e.g.

  1) candlepin works Command "curl -k -s -o /dev/null -w '%{http_code}' https://localhost:8443/candlepin/status" stdout is expected to eq "200"
     Failure/Error: its(:stdout) { should eq "200" }
       
       expected: "200"
            got: "404"
       
       (compared using ==)
       
     # ./spec/acceptance/basic_candlepin_spec.rb:13:in `block (3 levels) in <top (required)>'

This is also a case where we see it in one environment but cannot reproduce it in others.
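To check whether the 404 is a startup-timing issue or a permanent deployment failure, one option is to poll the status endpoint for a while after restarting tomcat. This is a rough sketch; port 8443 and the `/candlepin/status` path are the Katello defaults and may differ on your system:

```shell
#!/bin/sh
# Poll the Candlepin status endpoint after a tomcat restart to see whether
# the 404 eventually turns into a 200 (slow startup) or never does
# (the /candlepin context failed to deploy).
check_candlepin() {
  curl -k -s -o /dev/null -w '%{http_code}' https://localhost:8443/candlepin/status
}

for i in $(seq 1 30); do
  code=$(check_candlepin)
  echo "attempt $i: HTTP $code"
  [ "$code" = "200" ] && exit 0
  sleep 10
done
echo "candlepin never reached HTTP 200" >&2
exit 1
```

If the code flips from 404 to 200 after a minute or two, it is a startup race; if it stays 404 forever, the webapp never deployed and the container logs are the place to look.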

Could it be a connection problem between the postgres DB and Candlepin during the tomcat/candlepin startup (e.g. a timeout)? How could I debug it?
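One way to rule out a DB connectivity problem is to verify each link in the chain by hand. This is a sketch; the config path, key names, and the `candlepin` database/user are the usual Candlepin defaults and may differ on your install:

```shell
# Is PostgreSQL running and accepting local connections?
systemctl is-active postgresql
sudo -u postgres psql -c 'SELECT version();'

# Candlepin reads its JDBC settings from /etc/candlepin/candlepin.conf;
# verify the connection URL and user look sane:
grep -i 'jpa.config.hibernate.connection' /etc/candlepin/candlepin.conf

# Try connecting as the candlepin DB user to rule out authentication issues:
psql -h localhost -U candlepin -d candlepin -c '\conninfo'
```

If all of these succeed, a pure DB-connectivity problem is unlikely and the listener failure probably has another cause.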

Two questions that may help us debug:

  1. How much RAM does the system have?
  2. What version of postgres is running?
  1. Both systems (VM and bare metal) have 8GB memory. On bare metal it currently looks like this:
[root@scotty ~]# free -m
              total        used        free      shared  buff/cache   available
Mem:           7664        3895        2123          83        1645        3387
Swap:          4095           0        4095
[root@scotty ~]#
  2. PostgreSQL 12 is running:
[postgres@scotty ~]$ psql
psql (12.5)
Type "help" for help.
postgres=#

Bear with me, I am trying to replicate this and gather enough info to figure out what is happening here.

Could you try restarting tomcat, then look at the most recent output and paste it here? It should all show up via:

systemctl restart tomcat
journalctl -xef -t server

Just a few hours ago I once again did a completely new installation (wipe everything, install the OS, install Katello) with the same result: tomcat/candlepin does not start.
For this run I set SELinux to permissive, so I think we can eliminate that as an error source.
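Even in permissive mode SELinux still logs would-be denials, so confirming the mode and checking for AVCs is cheap insurance. A sketch, assuming the audit tools are installed:

```shell
# Confirm the running SELinux mode (should print "Permissive"):
getenforce

# AVC denials are still logged in permissive mode; any hits here would
# point at a labeling problem even though nothing is actually blocked:
ausearch -m avc -ts recent 2>/dev/null | tail -n 20
```

If `ausearch` comes back empty, SELinux can be ruled out with more confidence than the mode change alone provides.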
Here is the output:

[root@scotty tomcat]# systemctl stop tomcat
[root@scotty tomcat]# rm -f *log
[root@scotty tomcat]# systemctl start tomcat &&  journalctl -xef -t server
-- Logs begin at Wed 2021-05-12 21:47:01 CEST. --
May 12 21:47:18 scotty.home.petersen20.de server[1082]: Java virtual machine used: /usr/lib/jvm/jre-11/bin/java
May 12 21:47:18 scotty.home.petersen20.de server[1082]: classpath used: /usr/share/tomcat/bin/bootstrap.jar:/usr/share/tomcat/bin/tomcat-juli.jar:/usr/share/java/ant.jar:/usr/share/java/ant-launcher.jar:/usr/lib/jvm/java/lib/tools.jar
May 12 21:47:18 scotty.home.petersen20.de server[1082]: main class used: org.apache.catalina.startup.Bootstrap
May 12 21:47:18 scotty.home.petersen20.de server[1082]: flags used: -Xms1024m -Xmx4096m -Djava.security.auth.login.config=/usr/share/tomcat/conf/login.config
May 12 21:47:18 scotty.home.petersen20.de server[1082]: options used: -Dcatalina.base=/usr/share/tomcat -Dcatalina.home=/usr/share/tomcat -Djava.endorsed.dirs= -Djava.io.tmpdir=/var/cache/tomcat/temp -Djava.util.logging.config.file=/usr/share/tomcat/conf/logging.properties -Djava.util.logging.manager=org.apache.juli.ClassLoaderLogManager
May 12 21:47:18 scotty.home.petersen20.de server[1082]: arguments used: start
May 12 21:47:24 scotty.home.petersen20.de server[1082]: 12-May-2021 21:47:24.266 WARNING [main] org.apache.catalina.startup.SetAllPropertiesRule.begin [SetAllPropertiesRule]{Server/Service/Connector} Setting property 'sslProtocols' to 'TLSv1.2' did not find a matching property.
May 12 21:47:24 scotty.home.petersen20.de server[1082]: 12-May-2021 21:47:24.804 WARNING [main] org.apache.tomcat.util.digester.SetPropertiesRule.begin Match [Server/Service/Engine/Host] failed to set property [xmlValidation] to [false]
May 12 21:47:24 scotty.home.petersen20.de server[1082]: 12-May-2021 21:47:24.805 WARNING [main] org.apache.tomcat.util.digester.SetPropertiesRule.begin Match [Server/Service/Engine/Host] failed to set property [xmlNamespaceAware] to [false]
May 12 21:47:24 scotty.home.petersen20.de server[1082]: 12-May-2021 21:47:24.850 INFO [main] org.apache.catalina.core.AprLifecycleListener.lifecycleEvent The APR based Apache Tomcat Native library which allows optimal performance in production environments was not found on the java.library.path: [/usr/java/packages/lib:/usr/lib64:/lib64:/lib:/usr/lib]
May 12 21:47:31 scotty.home.petersen20.de server[1082]: 12-May-2021 21:47:31.227 INFO [main] org.apache.coyote.AbstractProtocol.init Initializing ProtocolHandler ["https-jsse-nio-127.0.0.1-23443"]
May 12 21:47:40 scotty.home.petersen20.de server[1082]: 12-May-2021 21:47:40.728 INFO [main] org.apache.catalina.startup.Catalina.load Server initialization in [18,578] milliseconds
May 12 21:47:41 scotty.home.petersen20.de server[1082]: 12-May-2021 21:47:41.882 INFO [main] org.apache.catalina.core.StandardService.startInternal Starting service [Catalina]
May 12 21:47:41 scotty.home.petersen20.de server[1082]: 12-May-2021 21:47:41.883 INFO [main] org.apache.catalina.core.StandardEngine.startInternal Starting Servlet engine: [Apache Tomcat/9.0.30]
May 12 21:47:42 scotty.home.petersen20.de server[1082]: 12-May-2021 21:47:42.053 INFO [main] org.apache.catalina.startup.HostConfig.deployDirectory Deploying web application directory [/var/lib/tomcat/webapps/candlepin]
May 12 21:48:31 scotty.home.petersen20.de server[1082]: 12-May-2021 21:48:31.611 INFO [main] org.apache.jasper.servlet.TldScanner.scanJars At least one JAR was scanned for TLDs yet contained no TLDs. Enable debug logging for this logger for a complete list of JARs that were scanned but no TLDs were found in them. Skipping unneeded JARs during scanning can improve startup time and JSP compilation time.
May 12 21:48:31 scotty.home.petersen20.de server[1082]: WARNING: An illegal reflective access operation has occurred
May 12 21:48:31 scotty.home.petersen20.de server[1082]: WARNING: Illegal reflective access by org.candlepin.pki.impl.JSSProviderLoader (file:/var/lib/tomcat/webapps/candlepin/WEB-INF/classes/) to field java.lang.ClassLoader.usr_paths
May 12 21:48:31 scotty.home.petersen20.de server[1082]: WARNING: Please consider reporting this to the maintainers of org.candlepin.pki.impl.JSSProviderLoader
May 12 21:48:31 scotty.home.petersen20.de server[1082]: WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
May 12 21:48:31 scotty.home.petersen20.de server[1082]: WARNING: All illegal access operations will be denied in a future release
May 12 21:48:43 scotty.home.petersen20.de server[1082]: 12-May-2021 21:48:43.430 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public org.candlepin.model.Persisted org.candlepin.model.OwnerCurator.create(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@3ae3dfd2]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
May 12 21:48:43 scotty.home.petersen20.de server[1082]: 12-May-2021 21:48:43.802 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public org.candlepin.model.Persisted org.candlepin.model.ProductCurator.merge(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@3ae3dfd2]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
May 12 21:48:43 scotty.home.petersen20.de server[1082]: 12-May-2021 21:48:43.803 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public void org.candlepin.model.ProductCurator.delete(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@3ae3dfd2]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
May 12 21:48:43 scotty.home.petersen20.de server[1082]: 12-May-2021 21:48:43.804 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public org.candlepin.model.Persisted org.candlepin.model.ProductCurator.create(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@3ae3dfd2]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
May 12 21:48:44 scotty.home.petersen20.de server[1082]: 12-May-2021 21:48:44.031 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public void org.candlepin.model.EntitlementCurator.delete(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@3ae3dfd2]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
May 12 21:48:44 scotty.home.petersen20.de server[1082]: 12-May-2021 21:48:44.083 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public void org.candlepin.model.ConsumerCurator.delete(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@3ae3dfd2]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
May 12 21:48:44 scotty.home.petersen20.de server[1082]: 12-May-2021 21:48:44.085 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public org.candlepin.model.Persisted org.candlepin.model.ConsumerCurator.create(org.candlepin.model.Persisted,boolean)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@3ae3dfd2]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
May 12 21:48:44 scotty.home.petersen20.de server[1082]: 12-May-2021 21:48:44.382 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public void org.candlepin.model.CdnCurator.delete(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@3ae3dfd2]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
May 12 21:48:44 scotty.home.petersen20.de server[1082]: 12-May-2021 21:48:44.448 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public void org.candlepin.model.PoolCurator.delete(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@3ae3dfd2]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
May 12 21:48:44 scotty.home.petersen20.de server[1082]: 12-May-2021 21:48:44.839 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public void org.candlepin.model.RulesCurator.delete(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@3ae3dfd2]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
May 12 21:48:44 scotty.home.petersen20.de server[1082]: 12-May-2021 21:48:44.840 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public org.candlepin.model.Persisted org.candlepin.model.RulesCurator.create(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@3ae3dfd2]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
May 12 21:48:44 scotty.home.petersen20.de server[1082]: 12-May-2021 21:48:44.926 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public void org.candlepin.model.ContentCurator.delete(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@3ae3dfd2]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
May 12 21:48:45 scotty.home.petersen20.de server[1082]: 12-May-2021 21:48:45.560 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public void org.candlepin.model.EntitlementCertificateCurator.delete(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@3ae3dfd2]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
May 12 21:49:18 scotty.home.petersen20.de server[1082]: 12-May-2021 21:49:18.538 SEVERE [main] org.apache.catalina.core.StandardContext.startInternal One or more listeners failed to start. Full details will be found in the appropriate container log file
May 12 21:49:18 scotty.home.petersen20.de server[1082]: 12-May-2021 21:49:18.567 SEVERE [main] org.apache.catalina.core.StandardContext.startInternal Context [/candlepin] startup failed due to previous errors
May 12 21:49:18 scotty.home.petersen20.de server[1082]: 12-May-2021 21:49:18.743 WARNING [main] org.apache.catalina.loader.WebappClassLoaderBase.clearReferencesJdbc The web application [candlepin] registered the JDBC driver [org.postgresql.Driver] but failed to unregister it when the web application was stopped. To prevent a memory leak, the JDBC Driver has been forcibly unregistered.
May 12 21:49:18 scotty.home.petersen20.de server[1082]: 12-May-2021 21:49:18.774 WARNING [main] org.apache.catalina.loader.WebappClassLoaderBase.clearReferencesThreads The web application [candlepin] appears to have started a thread named [C3P0PooledConnectionPoolManager[identityToken->1hgf027ahr854qk1annxnp|1aff6dc1]-AdminTaskTimer] but has failed to stop it. This is very likely to create a memory leak. Stack trace of thread:
May 12 21:49:18 scotty.home.petersen20.de server[1082]:  java.base@11.0.11/java.lang.Object.wait(Native Method)
May 12 21:49:18 scotty.home.petersen20.de server[1082]:  java.base@11.0.11/java.util.TimerThread.mainLoop(Timer.java:553)
May 12 21:49:18 scotty.home.petersen20.de server[1082]:  java.base@11.0.11/java.util.TimerThread.run(Timer.java:506)
May 12 21:49:18 scotty.home.petersen20.de server[1082]: 12-May-2021 21:49:18.792 WARNING [main] org.apache.catalina.loader.WebappClassLoaderBase.clearReferencesThreads The web application [candlepin] appears to have started a thread named [C3P0PooledConnectionPoolManager[identityToken->1hgf027ahr854qk1annxnp|1aff6dc1]-HelperThread-#0] but has failed to stop it. This is very likely to create a memory leak. Stack trace of thread:
May 12 21:49:18 scotty.home.petersen20.de server[1082]:  java.base@11.0.11/java.lang.Object.wait(Native Method)
May 12 21:49:18 scotty.home.petersen20.de server[1082]:  com.mchange.v2.async.ThreadPoolAsynchronousRunner$PoolThread.run(ThreadPoolAsynchronousRunner.java:683)
May 12 21:49:18 scotty.home.petersen20.de server[1082]: 12-May-2021 21:49:18.793 WARNING [main] org.apache.catalina.loader.WebappClassLoaderBase.clearReferencesThreads The web application [candlepin] appears to have started a thread named [C3P0PooledConnectionPoolManager[identityToken->1hgf027ahr854qk1annxnp|1aff6dc1]-HelperThread-#1] but has failed to stop it. This is very likely to create a memory leak. Stack trace of thread:
May 12 21:49:18 scotty.home.petersen20.de server[1082]:  java.base@11.0.11/java.lang.Object.wait(Native Method)
May 12 21:49:18 scotty.home.petersen20.de server[1082]:  com.mchange.v2.async.ThreadPoolAsynchronousRunner$PoolThread.run(ThreadPoolAsynchronousRunner.java:683)
May 12 21:49:18 scotty.home.petersen20.de server[1082]: 12-May-2021 21:49:18.795 WARNING [main] org.apache.catalina.loader.WebappClassLoaderBase.clearReferencesThreads The web application [candlepin] appears to have started a thread named [C3P0PooledConnectionPoolManager[identityToken->1hgf027ahr854qk1annxnp|1aff6dc1]-HelperThread-#2] but has failed to stop it. This is very likely to create a memory leak. Stack trace of thread:
May 12 21:49:18 scotty.home.petersen20.de server[1082]:  java.base@11.0.11/java.lang.Object.wait(Native Method)
May 12 21:49:18 scotty.home.petersen20.de server[1082]:  com.mchange.v2.async.ThreadPoolAsynchronousRunner$PoolThread.run(ThreadPoolAsynchronousRunner.java:683)
May 12 21:49:18 scotty.home.petersen20.de server[1082]: 12-May-2021 21:49:18.796 WARNING [main] org.apache.catalina.loader.WebappClassLoaderBase.clearReferencesThreads The web application [candlepin] appears to have started a thread named [Thread-0 (-scheduled-threads)] but has failed to stop it. This is very likely to create a memory leak. Stack trace of thread:
......
May 12 21:51:21 scotty.home.petersen20.de server[2098]:         at org.apache.activemq.artemis.utils.ActiveMQThreadFactory$1.run(ActiveMQThreadFactory.java:118)
May 12 21:51:21 scotty.home.petersen20.de server[2098]: Caused by: java.lang.ClassNotFoundException: Illegal access: this web application instance has been stopped already. Could not load [ch.qos.logback.classic.spi.ThrowableProxy]. The following stack trace is thrown for debugging purposes as well as to attempt to terminate the thread which caused the illegal access.
May 12 21:51:21 scotty.home.petersen20.de server[2098]:         at org.apache.catalina.loader.WebappClassLoaderBase.checkStateForClassLoading(WebappClassLoaderBase.java:1375)
May 12 21:51:21 scotty.home.petersen20.de server[2098]:         at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1226)
May 12 21:51:21 scotty.home.petersen20.de server[2098]:         at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1188)
May 12 21:51:21 scotty.home.petersen20.de server[2098]:         ... 16 more
May 12 21:51:21 scotty.home.petersen20.de server[2098]: Caused by: java.lang.IllegalStateException: Illegal access: this web application instance has been stopped already. Could not load [ch.qos.logback.classic.spi.ThrowableProxy]. The following stack trace is thrown for debugging purposes as well as to attempt to terminate the thread which caused the illegal access.
May 12 21:51:21 scotty.home.petersen20.de server[2098]:         at org.apache.catalina.loader.WebappClassLoaderBase.checkStateForResourceLoading(WebappClassLoaderBase.java:1385)
May 12 21:51:21 scotty.home.petersen20.de server[2098]:         at org.apache.catalina.loader.WebappClassLoaderBase.checkStateForClassLoading(WebappClassLoaderBase.java:1373)
May 12 21:51:21 scotty.home.petersen20.de server[2098]:         ... 18 more
May 12 21:51:26 scotty.home.petersen20.de server[2098]: Exception in thread "Thread-2 (ActiveMQ-server-org.apache.activemq.artemis.core.server.impl.ActiveMQServerImpl$6@52cfd82b)" java.lang.NoClassDefFoundError: ch/qos/logback/classic/spi/ThrowableProxy
May 12 21:51:26 scotty.home.petersen20.de server[2098]:         at ch.qos.logback.classic.spi.LoggingEvent.<init>(LoggingEvent.java:119)
May 12 21:51:26 scotty.home.petersen20.de server[2098]:         at ch.qos.logback.classic.Logger.buildLoggingEventAndAppend(Logger.java:419)
May 12 21:51:26 scotty.home.petersen20.de server[2098]:         at ch.qos.logback.classic.Logger.filterAndLog_0_Or3Plus(Logger.java:383)
May 12 21:51:26 scotty.home.petersen20.de server[2098]:         at ch.qos.logback.classic.Logger.log(Logger.java:765)
May 12 21:51:26 scotty.home.petersen20.de server[2098]:         at jdk.internal.reflect.GeneratedMethodAccessor49.invoke(Unknown Source)
May 12 21:51:26 scotty.home.petersen20.de server[2098]:         at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
May 12 21:51:26 scotty.home.petersen20.de server[2098]:         at java.base/java.lang.reflect.Method.invoke(Method.java:566)
May 12 21:51:26 scotty.home.petersen20.de server[2098]:         at org.jboss.logging.Slf4jLocationAwareLogger.doLog(Slf4jLocationAwareLogger.java:89)
May 12 21:51:26 scotty.home.petersen20.de server[2098]:         at org.jboss.logging.Slf4jLocationAwareLogger.doLog(Slf4jLocationAwareLogger.java:75)
May 12 21:51:26 scotty.home.petersen20.de server[2098]:         at org.jboss.logging.Logger.warn(Logger.java:1236)
May 12 21:51:26 scotty.home.petersen20.de server[2098]:         at org.apache.activemq.artemis.utils.actors.OrderedExecutor.doTask(OrderedExecutor.java:47)
May 12 21:51:26 scotty.home.petersen20.de server[2098]:         at org.apache.activemq.artemis.utils.actors.OrderedExecutor.doTask(OrderedExecutor.java:31)
May 12 21:51:26 scotty.home.petersen20.de server[2098]:         at org.apache.activemq.artemis.utils.actors.ProcessorBase.executePendingTasks(ProcessorBase.java:65)
May 12 21:51:26 scotty.home.petersen20.de server[2098]:         at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
May 12 21:51:26 scotty.home.petersen20.de server[2098]:         at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
May 12 21:51:26 scotty.home.petersen20.de server[2098]:         at org.apache.activemq.artemis.utils.ActiveMQThreadFactory$1.run(ActiveMQThreadFactory.java:118)
^C
[root@scotty tomcat]# 

(The complete logs are too long to paste, so I attached them as files.) candlepin.log (48.3 KB) catalina.2021-05-12.log (69.2 KB) error.log (14.0 KB) localhost.2021-05-12.log (8.8 KB)
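The "One or more listeners failed to start. Full details will be found in the appropriate container log file" line in the journal means the real stack trace landed in the container/webapp logs rather than journalctl. Something like this should surface it; the paths are the usual Katello/EL8 defaults and may differ:

```shell
# Tomcat's container log usually names the failing listener and exception:
grep -n -A 20 'SEVERE' "/var/log/tomcat/catalina.$(date +%Y-%m-%d).log" | head -n 60

# Candlepin's own logs often carry the root cause (e.g. a database-migration
# or JSS/crypto failure) in more detail:
tail -n 100 /var/log/candlepin/error.log
grep -n -i -B 2 -A 10 'error\|exception' /var/log/candlepin/candlepin.log | head -n 60
```

The first `Caused by:` in that output is usually the actual root cause; everything after it in the journal (memory-leak warnings, ClassNotFoundException noise) is just fallout from the failed shutdown.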

To help level-set the environment, what do you have for:

/usr/lib/jvm/jre-11/bin/java -version
rpm -q candlepin
[root@scotty ~]# /usr/lib/jvm/jre-11/bin/java -version
openjdk version "11.0.11" 2021-04-20 LTS
OpenJDK Runtime Environment 18.9 (build 11.0.11+9-LTS)
OpenJDK 64-Bit Server VM 18.9 (build 11.0.11+9-LTS, mixed mode, sharing)
[root@scotty ~]# rpm -q candlepin
candlepin-3.2.11-1.el8.noarch
[root@scotty ~]#

Hello, gentlemen.

While I have absolutely nothing meaningful to contribute, I have been experiencing this issue as well. I have noticed the following.

Scenario:
I have an automated build (kickstart) with additional build scripts that execute after the node has been rebooted to ensure a consistent build. 4 out of 5 builds result in the same issue with the “Candlepin 404” error, with logs similar to those shown above.

On the 5th build, the entire Foreman/Katello system builds out perfectly fine.

The system is an 8-core, 32 GB RAM RHEL 8.3 VM. The hypervisor it runs on is extremely underutilized.

Happy to contribute with any troubleshooting, log collection, etc.