Status code: 403 for Error: Failed to download metadata for repo after upgrade to 3.11

Hi, can you check the logs under /var/log/tomcat/? Specifically catalina.log and localhost.log.
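For example, something along these lines should surface the relevant entries (log file names are date-stamped on a stock Tomcat install, so adjust the globs and paths to your system):

# recent errors on the Tomcat/Candlepin side
sudo grep -iE 'severe|error|exception' /var/log/tomcat/catalina.*.log | tail -n 50
sudo tail -n 100 /var/log/tomcat/localhost.*.log

Since the client is reporting a 403, it may also be worth checking the Apache access log on the server for the failing metadata requests (the path below assumes a standard Katello layout):

# look for recent 403 responses served to the client
sudo grep ' 403 ' /var/log/httpd/foreman-ssl_access_ssl.log | tail -n 20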

25-Jul-2024 15:28:46.837 INFO [Catalina-utility-2] org.apache.catalina.startup.HostConfig.reload Reloading context [/candlepin]
25-Jul-2024 15:28:46.847 INFO [Catalina-utility-2] org.apache.catalina.core.StandardContext.reload Reloading Context with name [/candlepin] has started
25-Jul-2024 15:28:47.948 SEVERE [Catalina-utility-2] org.apache.catalina.loader.WebappClassLoaderBase.checkThreadLocalMapForLeaks The web application [candlepin] created a ThreadLocal with key of type [java.lang.ThreadLocal] (value [java.lang.ThreadLocal@32eaf4b8]) and a value of type [io.netty.util.internal.InternalThreadLocalMap] (value [io.netty.util.internal.InternalThreadLocalMap@9fb9d8b]) but failed to remove it when the web application was stopped. Threads are going to be renewed over time to try and avoid a probable memory leak.
25-Jul-2024 15:28:47.953 SEVERE [Catalina-utility-2] org.apache.catalina.loader.WebappClassLoaderBase.checkThreadLocalMapForLeaks The web application [candlepin] created a ThreadLocal with key of type [java.lang.ThreadLocal] (value [java.lang.ThreadLocal@7636ef14]) and a value of type [org.hibernate.internal.SessionImpl] (value [SessionImpl(684982996)]) but failed to remove it when the web application was stopped. Threads are going to be renewed over time to try and avoid a probable memory leak.
25-Jul-2024 15:28:47.954 SEVERE [Catalina-utility-2] org.apache.catalina.loader.WebappClassLoaderBase.checkThreadLocalMapForLeaks The web application [candlepin] created a ThreadLocal with key of type [java.lang.ThreadLocal] (value [java.lang.ThreadLocal@4d7e6c42]) and a value of type [liquibase.SingletonScopeManager] (value [liquibase.SingletonScopeManager@183cb7a3]) but failed to remove it when the web application was stopped. Threads are going to be renewed over time to try and avoid a probable memory leak.
25-Jul-2024 15:28:47.956 SEVERE [Catalina-utility-2] org.apache.catalina.loader.WebappClassLoaderBase.checkThreadLocalMapForLeaks The web application [candlepin] created a ThreadLocal with key of type [java.lang.ThreadLocal] (value [java.lang.ThreadLocal@32eaf4b8]) and a value of type [io.netty.util.internal.InternalThreadLocalMap] (value [io.netty.util.internal.InternalThreadLocalMap@46cba80a]) but failed to remove it when the web application was stopped. Threads are going to be renewed over time to try and avoid a probable memory leak.
25-Jul-2024 15:28:47.990 SEVERE [Catalina-utility-2] org.apache.catalina.loader.WebappClassLoaderBase.checkThreadLocalMapForLeaks The web application [candlepin] created a ThreadLocal with key of type [java.lang.ThreadLocal] (value [java.lang.ThreadLocal@697765e8]) and a value of type [org.apache.activemq.artemis.core.persistence.impl.journal.OperationContextImpl] (value [OperationContextImpl [1251510525] [minimalStore=9223372036854775807, storeLineUp=1, stored=1, minimalReplicated=9223372036854775807, replicationLineUp=0, replicated=0, paged=0, minimalPage=9223372036854775807, pageLineUp=0, errorCode=-1, errorMessage=null, executorsPending=0]]) but failed to remove it when the web application was stopped. Threads are going to be renewed over time to try and avoid a probable memory leak.
25-Jul-2024 15:28:47.990 SEVERE [Catalina-utility-2] org.apache.catalina.loader.WebappClassLoaderBase.checkThreadLocalMapForLeaks The web application [candlepin] created a ThreadLocal with key of type [java.lang.ThreadLocal] (value [java.lang.ThreadLocal@32eaf4b8]) and a value of type [io.netty.util.internal.InternalThreadLocalMap] (value [io.netty.util.internal.InternalThreadLocalMap@7efe4a1f]) but failed to remove it when the web application was stopped. Threads are going to be renewed over time to try and avoid a probable memory leak.
25-Jul-2024 15:28:47.990 SEVERE [Catalina-utility-2] org.apache.catalina.loader.WebappClassLoaderBase.checkThreadLocalMapForLeaks The web application [candlepin] created a ThreadLocal with key of type [java.lang.ThreadLocal] (value [java.lang.ThreadLocal@32eaf4b8]) and a value of type [io.netty.util.internal.InternalThreadLocalMap] (value [io.netty.util.internal.InternalThreadLocalMap@1aaf578e]) but failed to remove it when the web application was stopped. Threads are going to be renewed over time to try and avoid a probable memory leak.
25-Jul-2024 15:28:47.990 SEVERE [Catalina-utility-2] org.apache.catalina.loader.WebappClassLoaderBase.checkThreadLocalMapForLeaks The web application [candlepin] created a ThreadLocal with key of type [java.lang.ThreadLocal] (value [java.lang.ThreadLocal@32eaf4b8]) and a value of type [io.netty.util.internal.InternalThreadLocalMap] (value [io.netty.util.internal.InternalThreadLocalMap@293facdb]) but failed to remove it when the web application was stopped. Threads are going to be renewed over time to try and avoid a probable memory leak.
25-Jul-2024 15:28:47.991 SEVERE [Catalina-utility-2] org.apache.catalina.loader.WebappClassLoaderBase.checkThreadLocalMapForLeaks The web application [candlepin] created a ThreadLocal with key of type [java.lang.ThreadLocal] (value [java.lang.ThreadLocal@32eaf4b8]) and a value of type [io.netty.util.internal.InternalThreadLocalMap] (value [io.netty.util.internal.InternalThreadLocalMap@5c059a8a]) but failed to remove it when the web application was stopped. Threads are going to be renewed over time to try and avoid a probable memory leak.
25-Jul-2024 15:28:47.991 SEVERE [Catalina-utility-2] org.apache.catalina.loader.WebappClassLoaderBase.checkThreadLocalMapForLeaks The web application [candlepin] created a ThreadLocal with key of type [java.lang.ThreadLocal] (value [java.lang.ThreadLocal@32eaf4b8]) and a value of type [io.netty.util.internal.InternalThreadLocalMap] (value [io.netty.util.internal.InternalThreadLocalMap@50279da7]) but failed to remove it when the web application was stopped. Threads are going to be renewed over time to try and avoid a probable memory leak.
25-Jul-2024 15:28:47.991 SEVERE [Catalina-utility-2] org.apache.catalina.loader.WebappClassLoaderBase.checkThreadLocalMapForLeaks The web application [candlepin] created a ThreadLocal with key of type [java.lang.ThreadLocal] (value [java.lang.ThreadLocal@32eaf4b8]) and a value of type [io.netty.util.internal.InternalThreadLocalMap] (value [io.netty.util.internal.InternalThreadLocalMap@30d4a30a]) but failed to remove it when the web application was stopped. Threads are going to be renewed over time to try and avoid a probable memory leak.
25-Jul-2024 15:28:47.991 SEVERE [Catalina-utility-2] org.apache.catalina.loader.WebappClassLoaderBase.checkThreadLocalMapForLeaks The web application [candlepin] created a ThreadLocal with key of type [java.lang.ThreadLocal] (value [java.lang.ThreadLocal@32eaf4b8]) and a value of type [io.netty.util.internal.InternalThreadLocalMap] (value [io.netty.util.internal.InternalThreadLocalMap@645998bf]) but failed to remove it when the web application was stopped. Threads are going to be renewed over time to try and avoid a probable memory leak.
25-Jul-2024 15:28:47.991 SEVERE [Catalina-utility-2] org.apache.catalina.loader.WebappClassLoaderBase.checkThreadLocalMapForLeaks The web application [candlepin] created a ThreadLocal with key of type [java.lang.ThreadLocal] (value [java.lang.ThreadLocal@32eaf4b8]) and a value of type [io.netty.util.internal.InternalThreadLocalMap] (value [io.netty.util.internal.InternalThreadLocalMap@14788f74]) but failed to remove it when the web application was stopped. Threads are going to be renewed over time to try and avoid a probable memory leak.
25-Jul-2024 15:28:47.992 SEVERE [Catalina-utility-2] org.apache.catalina.loader.WebappClassLoaderBase.checkThreadLocalMapForLeaks The web application [candlepin] created a ThreadLocal with key of type [java.lang.ThreadLocal] (value [java.lang.ThreadLocal@32eaf4b8]) and a value of type [io.netty.util.internal.InternalThreadLocalMap] (value [io.netty.util.internal.InternalThreadLocalMap@78def2f9]) but failed to remove it when the web application was stopped. Threads are going to be renewed over time to try and avoid a probable memory leak.
25-Jul-2024 15:28:47.992 SEVERE [Catalina-utility-2] org.apache.catalina.loader.WebappClassLoaderBase.checkThreadLocalMapForLeaks The web application [candlepin] created a ThreadLocal with key of type [java.lang.ThreadLocal] (value [java.lang.ThreadLocal@32eaf4b8]) and a value of type [io.netty.util.internal.InternalThreadLocalMap] (value [io.netty.util.internal.InternalThreadLocalMap@6d07d21b]) but failed to remove it when the web application was stopped. Threads are going to be renewed over time to try and avoid a probable memory leak.
25-Jul-2024 15:28:49.771 INFO [Finalizer] org.apache.catalina.loader.WebappClassLoaderBase.checkStateForResourceLoading Illegal access: this web application instance has been stopped already. Could not load [io.netty.util.internal.shaded.org.jctools.queues.IndexedQueueSizeUtil]. The following stack trace is thrown for debugging purposes as well as to attempt to terminate the thread which caused the illegal access.
java.lang.IllegalStateException: Illegal access: this web application instance has been stopped already. Could not load [io.netty.util.internal.shaded.org.jctools.queues.IndexedQueueSizeUtil]. The following stack trace is thrown for debugging purposes as well as to attempt to terminate the thread which caused the illegal access.
at org.apache.catalina.loader.WebappClassLoaderBase.checkStateForResourceLoading(WebappClassLoaderBase.java:1349)
at org.apache.catalina.loader.WebappClassLoaderBase.checkStateForClassLoading(WebappClassLoaderBase.java:1337)
at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1174)
at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1141)
at io.netty.util.internal.shaded.org.jctools.queues.ConcurrentCircularArrayQueue.isEmpty(ConcurrentCircularArrayQueue.java:71)
at io.netty.util.internal.shaded.org.jctools.queues.MpscArrayQueue.isEmpty(MpscArrayQueue.java:207)
at io.netty.buffer.PoolThreadCache.checkCacheMayLeak(PoolThreadCache.java:236)
at io.netty.buffer.PoolThreadCache.free(PoolThreadCache.java:227)
at io.netty.buffer.PoolThreadCache$FreeOnFinalize.finalize(PoolThreadCache.java:500)
at java.base/java.lang.System$2.invokeFinalize(System.java:2301)
at java.base/java.lang.ref.Finalizer.runFinalizer(Finalizer.java:88)
at java.base/java.lang.ref.Finalizer$FinalizerThread.run(Finalizer.java:173)
25-Jul-2024 15:28:53.432 INFO [Catalina-utility-2] org.apache.jasper.servlet.TldScanner.scanJars At least one JAR was scanned for TLDs yet contained no TLDs. Enable debug logging for this logger for a complete list of JARs that were scanned but no TLDs were found in them. Skipping unneeded JARs during scanning can improve startup time and JSP compilation time.
25-Jul-2024 17:41:31.073 INFO [Thread-10] org.apache.coyote.AbstractProtocol.pause Pausing ProtocolHandler ["https-openssl-nio-127.0.0.1-23443"]
25-Jul-2024 17:41:31.078 INFO [Thread-10] org.apache.catalina.core.StandardService.stopInternal Stopping service [Catalina]
25-Jul-2024 17:43:01.575 INFO [main] org.apache.catalina.core.AprLifecycleListener.lifecycleEvent An older version [1.2.35] of the Apache Tomcat Native library is installed, while Tomcat recommends a minimum version of [1.2.38]
25-Jul-2024 17:43:01.578 INFO [main] org.apache.catalina.core.AprLifecycleListener.lifecycleEvent Loaded Apache Tomcat Native library [1.2.35] using APR version [1.6.3].
25-Jul-2024 17:43:01.578 INFO [main] org.apache.catalina.core.AprLifecycleListener.lifecycleEvent APR capabilities: IPv6 [true], sendfile [true], accept filters [false], random [true], UDS [true].
25-Jul-2024 17:43:01.579 INFO [main] org.apache.catalina.core.AprLifecycleListener.lifecycleEvent APR/OpenSSL configuration: useAprConnector [false], useOpenSSL [true]
25-Jul-2024 17:43:01.581 INFO [main] org.apache.catalina.core.AprLifecycleListener.initializeSSL OpenSSL successfully initialized [OpenSSL 1.1.1k FIPS 25 Mar 2021]
25-Jul-2024 17:43:01.810 INFO [main] org.apache.coyote.AbstractProtocol.init Initializing ProtocolHandler ["https-openssl-nio-127.0.0.1-23443"]
25-Jul-2024 17:43:01.855 WARNING [main] org.apache.tomcat.util.net.SSLUtilBase.getEnabled Tomcat interprets the [ciphers] attribute in a manner consistent with the latest OpenSSL development branch. Some of the specified [ciphers] are not supported by the configured SSL engine for this connector (which may use JSSE or an older OpenSSL version) and have been skipped: [[TLS_ECDH_ECDSA_WITH_AES_256_GCM_SHA384, TLS_ECDH_RSA_WITH_AES_256_GCM_SHA384, TLS_ECDH_ECDSA_WITH_AES_128_GCM_SHA256, TLS_ECDH_RSA_WITH_AES_128_GCM_SHA256]]
25-Jul-2024 17:43:02.309 INFO [main] org.apache.tomcat.util.net.AbstractEndpoint.logCertificate Connector [https-openssl-nio-127.0.0.1-23443], TLS virtual host [_default_], certificate type [UNDEFINED] configured from keystore [/etc/candlepin/certs/keystore] using alias [tomcat] with trust store [/etc/candlepin/certs/truststore]
25-Jul-2024 17:43:02.319 INFO [main] org.apache.catalina.startup.Catalina.load Server initialization in [918] milliseconds
25-Jul-2024 17:43:02.354 INFO [main] org.apache.catalina.core.StandardService.startInternal Starting service [Catalina]
25-Jul-2024 17:43:02.354 INFO [main] org.apache.catalina.core.StandardEngine.startInternal Starting Servlet engine: [Apache Tomcat/9.0.87]
25-Jul-2024 17:43:02.360 INFO [main] org.apache.catalina.startup.HostConfig.deployDirectory Deploying web application directory [/var/lib/tomcat/webapps/candlepin]
25-Jul-2024 17:43:06.023 INFO [main] org.apache.jasper.servlet.TldScanner.scanJars At least one JAR was scanned for TLDs yet contained no TLDs. Enable debug logging for this logger for a complete list of JARs that were scanned but no TLDs were found in them. Skipping unneeded JARs during scanning can improve startup time and JSP compilation time.
25-Jul-2024 18:06:24.892 INFO [main] org.apache.catalina.core.AprLifecycleListener.lifecycleEvent An older version [1.2.35] of the Apache Tomcat Native library is installed, while Tomcat recommends a minimum version of [1.2.38]
25-Jul-2024 18:06:24.925 INFO [main] org.apache.catalina.core.AprLifecycleListener.lifecycleEvent Loaded Apache Tomcat Native library [1.2.35] using APR version [1.6.3].
25-Jul-2024 18:06:24.925 INFO [main] org.apache.catalina.core.AprLifecycleListener.lifecycleEvent APR capabilities: IPv6 [true], sendfile [true], accept filters [false], random [true], UDS [true].
25-Jul-2024 18:06:24.925 INFO [main] org.apache.catalina.core.AprLifecycleListener.lifecycleEvent APR/OpenSSL configuration: useAprConnector [false], useOpenSSL [true]
25-Jul-2024 18:06:24.928 INFO [main] org.apache.catalina.core.AprLifecycleListener.initializeSSL OpenSSL successfully initialized [OpenSSL 1.1.1k FIPS 25 Mar 2021]
25-Jul-2024 18:06:25.430 INFO [main] org.apache.coyote.AbstractProtocol.init Initializing ProtocolHandler ["https-openssl-nio-127.0.0.1-23443"]
25-Jul-2024 18:06:25.487 WARNING [main] org.apache.tomcat.util.net.SSLUtilBase.getEnabled Tomcat interprets the [ciphers] attribute in a manner consistent with the latest OpenSSL development branch. Some of the specified [ciphers] are not supported by the configured SSL engine for this connector (which may use JSSE or an older OpenSSL version) and have been skipped: [[TLS_ECDH_ECDSA_WITH_AES_256_GCM_SHA384, TLS_ECDH_RSA_WITH_AES_256_GCM_SHA384, TLS_ECDH_ECDSA_WITH_AES_128_GCM_SHA256, TLS_ECDH_RSA_WITH_AES_128_GCM_SHA256]]
25-Jul-2024 18:06:26.125 INFO [main] org.apache.tomcat.util.net.AbstractEndpoint.logCertificate Connector [https-openssl-nio-127.0.0.1-23443], TLS virtual host [_default_], certificate type [UNDEFINED] configured from keystore [/etc/candlepin/certs/keystore] using alias [tomcat] with trust store [/etc/candlepin/certs/truststore]
25-Jul-2024 18:06:26.159 INFO [main] org.apache.catalina.startup.Catalina.load Server initialization in [1811] milliseconds
25-Jul-2024 18:06:26.237 INFO [main] org.apache.catalina.core.StandardService.startInternal Starting service [Catalina]
25-Jul-2024 18:06:26.237 INFO [main] org.apache.catalina.core.StandardEngine.startInternal Starting Servlet engine: [Apache Tomcat/9.0.87]
25-Jul-2024 18:06:26.250 INFO [main] org.apache.catalina.startup.HostConfig.deployDirectory Deploying web application directory [/var/lib/tomcat/webapps/candlepin]
25-Jul-2024 18:06:33.147 INFO [main] org.apache.jasper.servlet.TldScanner.scanJars At least one JAR was scanned for TLDs yet contained no TLDs. Enable debug logging for this logger for a complete list of JARs that were scanned but no TLDs were found in them. Skipping unneeded JARs during scanning can improve startup time and JSP compilation time.
25-Jul-2024 18:55:15.423 INFO [main] org.apache.catalina.core.AprLifecycleListener.lifecycleEvent An older version [1.2.35] of the Apache Tomcat Native library is installed, while Tomcat recommends a minimum version of [1.2.38]
25-Jul-2024 18:55:15.426 INFO [main] org.apache.catalina.core.AprLifecycleListener.lifecycleEvent Loaded Apache Tomcat Native library [1.2.35] using APR version [1.6.3].
25-Jul-2024 18:55:15.427 INFO [main] org.apache.catalina.core.AprLifecycleListener.lifecycleEvent APR capabilities: IPv6 [true], sendfile [true], accept filters [false], random [true], UDS [true].
25-Jul-2024 18:55:15.427 INFO [main] org.apache.catalina.core.AprLifecycleListener.lifecycleEvent APR/OpenSSL configuration: useAprConnector [false], useOpenSSL [true]
25-Jul-2024 18:55:15.439 INFO [main] org.apache.catalina.core.AprLifecycleListener.initializeSSL OpenSSL successfully initialized [OpenSSL 1.1.1k FIPS 25 Mar 2021]
25-Jul-2024 18:55:15.680 INFO [main] org.apache.coyote.AbstractProtocol.init Initializing ProtocolHandler ["https-openssl-nio-127.0.0.1-23443"]
25-Jul-2024 18:55:15.702 WARNING [main] org.apache.tomcat.util.net.SSLUtilBase.getEnabled Tomcat interprets the [ciphers] attribute in a manner consistent with the latest OpenSSL development branch. Some of the specified [ciphers] are not supported by the configured SSL engine for this connector (which may use JSSE or an older OpenSSL version) and have been skipped: [[TLS_ECDH_ECDSA_WITH_AES_256_GCM_SHA384, TLS_ECDH_RSA_WITH_AES_256_GCM_SHA384, TLS_ECDH_ECDSA_WITH_AES_128_GCM_SHA256, TLS_ECDH_RSA_WITH_AES_128_GCM_SHA256]]
25-Jul-2024 18:55:16.094 INFO [main] org.apache.tomcat.util.net.AbstractEndpoint.logCertificate Connector [https-openssl-nio-127.0.0.1-23443], TLS virtual host [_default_], certificate type [UNDEFINED] configured from keystore [/etc/candlepin/certs/keystore] using alias [tomcat] with trust store [/etc/candlepin/certs/truststore]
25-Jul-2024 18:55:16.102 INFO [main] org.apache.catalina.startup.Catalina.load Server initialization in [846] milliseconds
25-Jul-2024 18:55:16.134 INFO [main] org.apache.catalina.core.StandardService.startInternal Starting service [Catalina]
25-Jul-2024 18:55:16.134 INFO [main] org.apache.catalina.core.StandardEngine.startInternal Starting Servlet engine: [Apache Tomcat/9.0.87]
25-Jul-2024 18:55:16.139 INFO [main] org.apache.catalina.startup.HostConfig.deployDirectory Deploying web application directory [/var/lib/tomcat/webapps/candlepin]
25-Jul-2024 18:55:19.679 INFO [main] org.apache.jasper.servlet.TldScanner.scanJars At least one JAR was scanned for TLDs yet contained no TLDs. Enable debug logging for this logger for a complete list of JARs that were scanned but no TLDs were found in them. Skipping unneeded JARs during scanning can improve startup time and JSP compilation time.
25-Jul-2024 19:22:17.738 INFO [main] org.apache.catalina.core.AprLifecycleListener.lifecycleEvent An older version [1.2.35] of the Apache Tomcat Native library is installed, while Tomcat recommends a minimum version of [1.2.38]
25-Jul-2024 19:22:17.742 INFO [main] org.apache.catalina.core.AprLifecycleListener.lifecycleEvent Loaded Apache Tomcat Native library [1.2.35] using APR version [1.6.3].
25-Jul-2024 19:22:17.742 INFO [main] org.apache.catalina.core.AprLifecycleListener.lifecycleEvent APR capabilities: IPv6 [true], sendfile [true], accept filters [false], random [true], UDS [true].
25-Jul-2024 19:22:17.742 INFO [main] org.apache.catalina.core.AprLifecycleListener.lifecycleEvent APR/OpenSSL configuration: useAprConnector [false], useOpenSSL [true]
25-Jul-2024 19:22:17.746 INFO [main] org.apache.catalina.core.AprLifecycleListener.initializeSSL OpenSSL successfully initialized [OpenSSL 1.1.1k FIPS 25 Mar 2021]
25-Jul-2024 19:22:17.977 INFO [main] org.apache.coyote.AbstractProtocol.init Initializing ProtocolHandler ["https-openssl-nio-127.0.0.1-23443"]
25-Jul-2024 19:22:17.996 WARNING [main] org.apache.tomcat.util.net.SSLUtilBase.getEnabled Tomcat interprets the [ciphers] attribute in a manner consistent with the latest OpenSSL development branch. Some of the specified [ciphers] are not supported by the configured SSL engine for this connector (which may use JSSE or an older OpenSSL version) and have been skipped: [[TLS_ECDH_ECDSA_WITH_AES_256_GCM_SHA384, TLS_ECDH_RSA_WITH_AES_256_GCM_SHA384, TLS_ECDH_ECDSA_WITH_AES_128_GCM_SHA256, TLS_ECDH_RSA_WITH_AES_128_GCM_SHA256]]
25-Jul-2024 19:22:18.395 INFO [main] org.apache.tomcat.util.net.AbstractEndpoint.logCertificate Connector [https-openssl-nio-127.0.0.1-23443], TLS virtual host [_default_], certificate type [UNDEFINED] configured from keystore [/etc/candlepin/certs/keystore] using alias [tomcat] with trust store [/etc/candlepin/certs/truststore]
25-Jul-2024 19:22:18.403 INFO [main] org.apache.catalina.startup.Catalina.load Server initialization in [850] milliseconds
25-Jul-2024 19:22:18.438 INFO [main] org.apache.catalina.core.StandardService.startInternal Starting service [Catalina]
25-Jul-2024 19:22:18.438 INFO [main] org.apache.catalina.core.StandardEngine.startInternal Starting Servlet engine: [Apache Tomcat/9.0.87]
25-Jul-2024 19:22:18.444 INFO [main] org.apache.catalina.startup.HostConfig.deployDirectory Deploying web application directory [/var/lib/tomcat/webapps/candlepin]
25-Jul-2024 19:22:21.955 INFO [main] org.apache.jasper.servlet.TldScanner.scanJars At least one JAR was scanned for TLDs yet contained no TLDs. Enable debug logging for this logger for a complete list of JARs that were scanned but no TLDs were found in them. Skipping unneeded JARs during scanning can improve startup time and JSP compilation time.
25-Jul-2024 19:26:03.591 INFO [main] org.apache.catalina.core.AprLifecycleListener.lifecycleEvent An older version [1.2.35] of the Apache Tomcat Native library is installed, while Tomcat recommends a minimum version of [1.2.38]
25-Jul-2024 19:26:03.596 INFO [main] org.apache.catalina.core.AprLifecycleListener.lifecycleEvent Loaded Apache Tomcat Native library [1.2.35] using APR version [1.6.3].
25-Jul-2024 19:26:03.596 INFO [main] org.apache.catalina.core.AprLifecycleListener.lifecycleEvent APR capabilities: IPv6 [true], sendfile [true], accept filters [false], random [true], UDS [true].
25-Jul-2024 19:26:03.596 INFO [main] org.apache.catalina.core.AprLifecycleListener.lifecycleEvent APR/OpenSSL configuration: useAprConnector [false], useOpenSSL [true]
25-Jul-2024 19:26:03.600 INFO [main] org.apache.catalina.core.AprLifecycleListener.initializeSSL OpenSSL successfully initialized [OpenSSL 1.1.1k FIPS 25 Mar 2021]
25-Jul-2024 19:26:03.876 INFO [main] org.apache.coyote.AbstractProtocol.init Initializing ProtocolHandler ["https-openssl-nio-127.0.0.1-23443"]
25-Jul-2024 19:26:03.898 WARNING [main] org.apache.tomcat.util.net.SSLUtilBase.getEnabled Tomcat interprets the [ciphers] attribute in a manner consistent with the latest OpenSSL development branch. Some of the specified [ciphers] are not supported by the configured SSL engine for this connector (which may use JSSE or an older OpenSSL version) and have been skipped: [[TLS_ECDH_ECDSA_WITH_AES_256_GCM_SHA384, TLS_ECDH_RSA_WITH_AES_256_GCM_SHA384, TLS_ECDH_ECDSA_WITH_AES_128_GCM_SHA256, TLS_ECDH_RSA_WITH_AES_128_GCM_SHA256]]
25-Jul-2024 19:26:04.293 INFO [main] org.apache.tomcat.util.net.AbstractEndpoint.logCertificate Connector [https-openssl-nio-127.0.0.1-23443], TLS virtual host [_default_], certificate type [UNDEFINED] configured from keystore [/etc/candlepin/certs/keystore] using alias [tomcat] with trust store [/etc/candlepin/certs/truststore]
25-Jul-2024 19:26:04.314 INFO [main] org.apache.catalina.startup.Catalina.load Server initialization in [902] milliseconds
25-Jul-2024 19:26:04.347 INFO [main] org.apache.catalina.core.StandardService.startInternal Starting service [Catalina]
25-Jul-2024 19:26:04.347 INFO [main] org.apache.catalina.core.StandardEngine.startInternal Starting Servlet engine: [Apache Tomcat/9.0.87]
25-Jul-2024 19:26:04.353 INFO [main] org.apache.catalina.startup.HostConfig.deployDirectory Deploying web application directory [/var/lib/tomcat/webapps/candlepin]
25-Jul-2024 19:26:07.756 INFO [main] org.apache.jasper.servlet.TldScanner.scanJars At least one JAR was scanned for TLDs yet contained no TLDs. Enable debug logging for this logger for a complete list of JARs that were scanned but no TLDs were found in them. Skipping unneeded JARs during scanning can improve startup time and JSP compilation time.
25-Jul-2024 19:51:06.172 INFO [main] org.apache.catalina.core.AprLifecycleListener.lifecycleEvent An older version [1.2.35] of the Apache Tomcat Native library is installed, while Tomcat recommends a minimum version of [1.2.38]
25-Jul-2024 19:51:06.176 INFO [main] org.apache.catalina.core.AprLifecycleListener.lifecycleEvent Loaded Apache Tomcat Native library [1.2.35] using APR version [1.6.3].
25-Jul-2024 19:51:06.176 INFO [main] org.apache.catalina.core.AprLifecycleListener.lifecycleEvent APR capabilities: IPv6 [true], sendfile [true], accept filters [false], random [true], UDS [true].
25-Jul-2024 19:51:06.176 INFO [main] org.apache.catalina.core.AprLifecycleListener.lifecycleEvent APR/OpenSSL configuration: useAprConnector [false], useOpenSSL [true]
25-Jul-2024 19:51:06.179 INFO [main] org.apache.catalina.core.AprLifecycleListener.initializeSSL OpenSSL successfully initialized [OpenSSL 1.1.1k FIPS 25 Mar 2021]
25-Jul-2024 19:51:06.382 INFO [main] org.apache.coyote.AbstractProtocol.init Initializing ProtocolHandler ["https-openssl-nio-127.0.0.1-23443"]
25-Jul-2024 19:51:06.399 WARNING [main] org.apache.tomcat.util.net.SSLUtilBase.getEnabled Tomcat interprets the [ciphers] attribute in a manner consistent with the latest OpenSSL development branch. Some of the specified [ciphers] are not supported by the configured SSL engine for this connector (which may use JSSE or an older OpenSSL version) and have been skipped: [[TLS_ECDH_ECDSA_WITH_AES_256_GCM_SHA384, TLS_ECDH_RSA_WITH_AES_256_GCM_SHA384, TLS_ECDH_ECDSA_WITH_AES_128_GCM_SHA256, TLS_ECDH_RSA_WITH_AES_128_GCM_SHA256]]
25-Jul-2024 19:51:06.765 INFO [main] org.apache.tomcat.util.net.AbstractEndpoint.logCertificate Connector [https-openssl-nio-127.0.0.1-23443], TLS virtual host [_default_], certificate type [UNDEFINED] configured from keystore [/etc/candlepin/certs/keystore] using alias [tomcat] with trust store [/etc/candlepin/certs/truststore]
25-Jul-2024 19:51:06.772 INFO [main] org.apache.catalina.startup.Catalina.load Server initialization in [766] milliseconds
25-Jul-2024 19:51:06.802 INFO [main] org.apache.catalina.core.StandardService.startInternal Starting service [Catalina]
25-Jul-2024 19:51:06.802 INFO [main] org.apache.catalina.core.StandardEngine.startInternal Starting Servlet engine: [Apache Tomcat/9.0.87]
25-Jul-2024 19:51:06.807 INFO [main] org.apache.catalina.startup.HostConfig.deployDirectory Deploying web application directory [/var/lib/tomcat/webapps/candlepin]
25-Jul-2024 19:51:10.127 INFO [main] org.apache.jasper.servlet.TldScanner.scanJars At least one JAR was scanned for TLDs yet contained no TLDs. Enable debug logging for this logger for a complete list of JARs that were scanned but no TLDs were found in them. Skipping unneeded JARs during scanning can improve startup time and JSP compilation time.
25-Jul-2024 20:32:30.850 INFO [main] org.apache.catalina.core.AprLifecycleListener.lifecycleEvent An older version [1.2.35] of the Apache Tomcat Native library is installed, while Tomcat recommends a minimum version of [1.2.38]
25-Jul-2024 20:32:30.854 INFO [main] org.apache.catalina.core.AprLifecycleListener.lifecycleEvent Loaded Apache Tomcat Native library [1.2.35] using APR version [1.6.3].
25-Jul-2024 20:32:30.854 INFO [main] org.apache.catalina.core.AprLifecycleListener.lifecycleEvent APR capabilities: IPv6 [true], sendfile [true], accept filters [false], random [true], UDS [true].
25-Jul-2024 20:32:30.854 INFO [main] org.apache.catalina.core.AprLifecycleListener.lifecycleEvent APR/OpenSSL configuration: useAprConnector [false], useOpenSSL [true]
25-Jul-2024 20:32:30.857 INFO [main] org.apache.catalina.core.AprLifecycleListener.initializeSSL OpenSSL successfully initialized [OpenSSL 1.1.1k FIPS 25 Mar 2021]
25-Jul-2024 20:32:31.090 INFO [main] org.apache.coyote.AbstractProtocol.init Initializing ProtocolHandler ["https-openssl-nio-127.0.0.1-23443"]
25-Jul-2024 20:32:31.109 WARNING [main] org.apache.tomcat.util.net.SSLUtilBase.getEnabled Tomcat interprets the [ciphers] attribute in a manner consistent with the latest OpenSSL development branch. Some of the specified [ciphers] are not supported by the configured SSL engine for this connector (which may use JSSE or an older OpenSSL version) and have been skipped: [[TLS_ECDH_ECDSA_WITH_AES_256_GCM_SHA384, TLS_ECDH_RSA_WITH_AES_256_GCM_SHA384, TLS_ECDH_ECDSA_WITH_AES_128_GCM_SHA256, TLS_ECDH_RSA_WITH_AES_128_GCM_SHA256]]
25-Jul-2024 20:32:31.484 INFO [main] org.apache.tomcat.util.net.AbstractEndpoint.logCertificate Connector [https-openssl-nio-127.0.0.1-23443], TLS virtual host [_default_], certificate type [UNDEFINED] configured from keystore [/etc/candlepin/certs/keystore] using alias [tomcat] with trust store [/etc/candlepin/certs/truststore]
25-Jul-2024 20:32:31.493 INFO [main] org.apache.catalina.startup.Catalina.load Server initialization in [815] milliseconds
25-Jul-2024 20:32:31.524 INFO [main] org.apache.catalina.core.StandardService.startInternal Starting service [Catalina]
25-Jul-2024 20:32:31.525 INFO [main] org.apache.catalina.core.StandardEngine.startInternal Starting Servlet engine: [Apache Tomcat/9.0.87]
25-Jul-2024 20:32:31.530 INFO [main] org.apache.catalina.startup.HostConfig.deployDirectory Deploying web application directory [/var/lib/tomcat/webapps/candlepin]
25-Jul-2024 20:32:34.961 INFO [main] org.apache.jasper.servlet.TldScanner.scanJars At least one JAR was scanned for TLDs yet contained no TLDs. Enable debug logging for this logger for a complete list of JARs that were scanned but no TLDs were found in them. Skipping unneeded JARs during scanning can improve startup time and JSP compilation time.

And is there nothing in /var/log/tomcat/localhost.log?

Nothing is in localhost.log.

Here is the foreman service status log.
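For reference, this is from the stock maintenance CLI (assuming the usual foreman-maintain packaging):

# show the status of all Foreman-related services
sudo foreman-maintain service status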

Running Status Services
================================================================================
Get status of applicable services: 

Displaying the following service(s):
redis, postgresql (candlepin), postgresql (foreman), postgresql (pulpcore), pulpcore-api, pulpcore-content, pulpcore-worker@1.service, pulpcore-worker@2.service, pulpcore-worker@3.service, pulpcore-worker@4.service, pulpcore-worker@5.service, pulpcore-worker@6.service, pulpcore-worker@7.service, pulpcore-worker@8.service, tomcat, dynflow-sidekiq@orchestrator, foreman, httpd, puppetserver, dynflow-sidekiq@worker-1, dynflow-sidekiq@worker-hosts-queue-1, foreman-proxy, foreman-cockpit

                                                                                
displaying redis
● redis.service - Redis persistent key-value database
   Loaded: loaded (/usr/lib/systemd/system/redis.service; enabled; vendor preset: disabled)
  Drop-In: /etc/systemd/system/redis.service.d
           └─redis-90-limits.conf
   Active: active (running) since Fri 2024-07-26 11:00:21 EDT; 6min ago
  Process: 529589 ExecStop=/usr/libexec/redis-shutdown (code=exited, status=0/SUCCESS)
 Main PID: 533020 (redis-server)
   Status: "Ready to accept connections"
    Tasks: 5 (limit: 821331)
   Memory: 7.3M
   CGroup: /system.slice/redis.service
           └─533020 /usr/bin/redis-server 127.0.0.1:6379

Jul 26 11:00:21 parton systemd[1]: Starting Redis persistent key-value database...
Jul 26 11:00:21 parton systemd[1]: Started Redis persistent key-value database.

                                                                                
displaying postgresql (candlepin)
postgresql (candlepin) is remote and is UP.

                                                                                
displaying postgresql (foreman)
postgresql (foreman) is remote and is UP.

                                                                                
displaying postgresql (pulpcore)
postgresql (pulpcore) is remote and is UP.

                                                                                
displaying pulpcore-api
● pulpcore-api.service - Pulp API Server
   Loaded: loaded (/etc/systemd/system/pulpcore-api.service; enabled; vendor preset: disabled)
   Active: active (running) since Fri 2024-07-26 11:00:24 EDT; 6min ago
 Main PID: 533041 (pulpcore-api)
   Status: "Gunicorn arbiter booted"
    Tasks: 6 (limit: 821331)
   Memory: 462.8M
   CGroup: /system.slice/pulpcore-api.service
           ├─533041 /usr/bin/python3.11 /usr/bin/pulpcore-api --preload --timeout 90 --workers 5 --max-requests 800 --max-requests-jitter 100 --access-logfile - --access-logformat pulp [%({correlation-id}o)s]: %(h)s %(l)s %(u)s %(t)s "%(r)s" %(s)s %(b)s "%(f)s" "%(a)s"
           ├─533056 /usr/bin/python3.11 /usr/bin/pulpcore-api --preload --timeout 90 --workers 5 --max-requests 800 --max-requests-jitter 100 --access-logfile - --access-logformat pulp [%({correlation-id}o)s]: %(h)s %(l)s %(u)s %(t)s "%(r)s" %(s)s %(b)s "%(f)s" "%(a)s"
           ├─533075 /usr/bin/python3.11 /usr/bin/pulpcore-api --preload --timeout 90 --workers 5 --max-requests 800 --max-requests-jitter 100 --access-logfile - --access-logformat pulp [%({correlation-id}o)s]: %(h)s %(l)s %(u)s %(t)s "%(r)s" %(s)s %(b)s "%(f)s" "%(a)s"
           ├─533101 /usr/bin/python3.11 /usr/bin/pulpcore-api --preload --timeout 90 --workers 5 --max-requests 800 --max-requests-jitter 100 --access-logfile - --access-logformat pulp [%({correlation-id}o)s]: %(h)s %(l)s %(u)s %(t)s "%(r)s" %(s)s %(b)s "%(f)s" "%(a)s"
           ├─533227 /usr/bin/python3.11 /usr/bin/pulpcore-api --preload --timeout 90 --workers 5 --max-requests 800 --max-requests-jitter 100 --access-logfile - --access-logformat pulp [%({correlation-id}o)s]: %(h)s %(l)s %(u)s %(t)s "%(r)s" %(s)s %(b)s "%(f)s" "%(a)s"
           └─533266 /usr/bin/python3.11 /usr/bin/pulpcore-api --preload --timeout 90 --workers 5 --max-requests 800 --max-requests-jitter 100 --access-logfile - --access-logformat pulp [%({correlation-id}o)s]: %(h)s %(l)s %(u)s %(t)s "%(r)s" %(s)s %(b)s "%(f)s" "%(a)s"

Jul 26 11:02:24 parton pulpcore-api[533101]: pulp [102c4b45-bbf3-45b9-885e-9ac4ef028c5e]:  - - [26/Jul/2024:15:02:24 +0000] "GET /pulp/api/v3/publications/deb/apt/0190ef90-5a4a-7618-89c9-5fd33ff65796/ HTTP/1.1" 200 495 "-" "OpenAPI-Generator/3.2.0/ruby"
Jul 26 11:02:25 parton pulpcore-api[533101]: pulp [102c4b45-bbf3-45b9-885e-9ac4ef028c5e]: pulpcore.tasking.tasks:INFO: Starting task 0190ef90-886f-77df-b97b-1308b4c102b3
Jul 26 11:02:25 parton pulpcore-api[533101]: pulp [102c4b45-bbf3-45b9-885e-9ac4ef028c5e]: pulpcore.tasking.tasks:INFO: Task completed 0190ef90-886f-77df-b97b-1308b4c102b3
Jul 26 11:02:25 parton pulpcore-api[533101]: pulp [102c4b45-bbf3-45b9-885e-9ac4ef028c5e]:  - - [26/Jul/2024:15:02:25 +0000] "PATCH /pulp/api/v3/distributions/deb/apt/0190ae46-6963-799d-ae16-2685dfcd9260/ HTTP/1.1" 202 67 "-" "OpenAPI-Generator/3.2.0/ruby"
Jul 26 11:02:25 parton pulpcore-api[533227]: pulp [102c4b45-bbf3-45b9-885e-9ac4ef028c5e]:  - - [26/Jul/2024:15:02:25 +0000] "GET /pulp/api/v3/tasks/0190ef90-886f-77df-b97b-1308b4c102b3/ HTTP/1.1" 200 706 "-" "OpenAPI-Generator/3.49.6/ruby"
Jul 26 11:02:25 parton pulpcore-api[533075]: pulp [102c4b45-bbf3-45b9-885e-9ac4ef028c5e]:  - - [26/Jul/2024:15:02:25 +0000] "GET /pulp/api/v3/publications/deb/apt/0190ef90-5a4a-7618-89c9-5fd33ff65796/ HTTP/1.1" 200 495 "-" "OpenAPI-Generator/3.2.0/ruby"
Jul 26 11:02:25 parton pulpcore-api[533266]: pulp [102c4b45-bbf3-45b9-885e-9ac4ef028c5e]: pulpcore.tasking.tasks:INFO: Starting task 0190ef90-8a6f-77b5-80b0-09854d62911b
Jul 26 11:02:25 parton pulpcore-api[533266]: pulp [102c4b45-bbf3-45b9-885e-9ac4ef028c5e]: pulpcore.tasking.tasks:INFO: Task completed 0190ef90-8a6f-77b5-80b0-09854d62911b
Jul 26 11:02:25 parton pulpcore-api[533266]: pulp [102c4b45-bbf3-45b9-885e-9ac4ef028c5e]:  - - [26/Jul/2024:15:02:25 +0000] "PATCH /pulp/api/v3/distributions/deb/apt/0190ae46-6963-799d-ae16-2685dfcd9260/ HTTP/1.1" 202 67 "-" "OpenAPI-Generator/3.2.0/ruby"
Jul 26 11:02:26 parton pulpcore-api[533266]: pulp [102c4b45-bbf3-45b9-885e-9ac4ef028c5e]:  - - [26/Jul/2024:15:02:26 +0000] "GET /pulp/api/v3/content/deb/packages/?limit=5000&offset=0&repository_version=%2Fpulp%2Fapi%2Fv3%2Frepositories%2Fdeb%2Fapt%2F0190ae46-5990-7114-bbde-69b6b20f376d%2Fversions%2F9%2F HTTP/1.1" 200 704418 "-" "OpenAPI-Generator/3.2.0/ruby"

                                                                                
displaying pulpcore-content
● pulpcore-content.service - Pulp Content App
   Loaded: loaded (/etc/systemd/system/pulpcore-content.service; enabled; vendor preset: disabled)
   Active: active (running) since Fri 2024-07-26 11:00:24 EDT; 6min ago
 Main PID: 533044 (pulpcore-conten)
   Status: "Gunicorn arbiter booted"
    Tasks: 35 (limit: 821331)
   Memory: 505.2M
   CGroup: /system.slice/pulpcore-content.service
           ├─533044 /usr/bin/python3.11 /usr/bin/pulpcore-content --preload --timeout 90 --workers 17 --access-logfile -
           ├─533062 /usr/bin/python3.11 /usr/bin/pulpcore-content --preload --timeout 90 --workers 17 --access-logfile -
           ├─533088 /usr/bin/python3.11 /usr/bin/pulpcore-content --preload --timeout 90 --workers 17 --access-logfile -
           ├─533123 /usr/bin/python3.11 /usr/bin/pulpcore-content --preload --timeout 90 --workers 17 --access-logfile -
           ├─533235 /usr/bin/python3.11 /usr/bin/pulpcore-content --preload --timeout 90 --workers 17 --access-logfile -
           ├─533296 /usr/bin/python3.11 /usr/bin/pulpcore-content --preload --timeout 90 --workers 17 --access-logfile -
           ├─533350 /usr/bin/python3.11 /usr/bin/pulpcore-content --preload --timeout 90 --workers 17 --access-logfile -
           ├─533360 /usr/bin/python3.11 /usr/bin/pulpcore-content --preload --timeout 90 --workers 17 --access-logfile -
           ├─533363 /usr/bin/python3.11 /usr/bin/pulpcore-content --preload --timeout 90 --workers 17 --access-logfile -
           ├─533366 /usr/bin/python3.11 /usr/bin/pulpcore-content --preload --timeout 90 --workers 17 --access-logfile -
           ├─533368 /usr/bin/python3.11 /usr/bin/pulpcore-content --preload --timeout 90 --workers 17 --access-logfile -
           ├─533370 /usr/bin/python3.11 /usr/bin/pulpcore-content --preload --timeout 90 --workers 17 --access-logfile -
           ├─533371 /usr/bin/python3.11 /usr/bin/pulpcore-content --preload --timeout 90 --workers 17 --access-logfile -
           ├─533374 /usr/bin/python3.11 /usr/bin/pulpcore-content --preload --timeout 90 --workers 17 --access-logfile -
           ├─533376 /usr/bin/python3.11 /usr/bin/pulpcore-content --preload --timeout 90 --workers 17 --access-logfile -
           ├─533380 /usr/bin/python3.11 /usr/bin/pulpcore-content --preload --timeout 90 --workers 17 --access-logfile -
           ├─533384 /usr/bin/python3.11 /usr/bin/pulpcore-content --preload --timeout 90 --workers 17 --access-logfile -
           └─533385 /usr/bin/python3.11 /usr/bin/pulpcore-content --preload --timeout 90 --workers 17 --access-logfile -

Jul 26 11:00:25 parton pulpcore-content[533363]: [2024-07-26 15:00:25 +0000] [533363] [INFO] Booting worker with pid: 533363
Jul 26 11:00:25 parton pulpcore-content[533366]: [2024-07-26 15:00:25 +0000] [533366] [INFO] Booting worker with pid: 533366
Jul 26 11:00:25 parton pulpcore-content[533368]: [2024-07-26 15:00:25 +0000] [533368] [INFO] Booting worker with pid: 533368
Jul 26 11:00:25 parton pulpcore-content[533370]: [2024-07-26 15:00:25 +0000] [533370] [INFO] Booting worker with pid: 533370
Jul 26 11:00:25 parton pulpcore-content[533371]: [2024-07-26 15:00:25 +0000] [533371] [INFO] Booting worker with pid: 533371
Jul 26 11:00:25 parton pulpcore-content[533374]: [2024-07-26 15:00:25 +0000] [533374] [INFO] Booting worker with pid: 533374
Jul 26 11:00:25 parton pulpcore-content[533376]: [2024-07-26 15:00:25 +0000] [533376] [INFO] Booting worker with pid: 533376
Jul 26 11:00:25 parton pulpcore-content[533380]: [2024-07-26 15:00:25 +0000] [533380] [INFO] Booting worker with pid: 533380
Jul 26 11:00:25 parton pulpcore-content[533384]: [2024-07-26 15:00:25 +0000] [533384] [INFO] Booting worker with pid: 533384
Jul 26 11:00:25 parton pulpcore-content[533385]: [2024-07-26 15:00:25 +0000] [533385] [INFO] Booting worker with pid: 533385

                                                                                
displaying pulpcore-worker@1.service
● pulpcore-worker@1.service - Pulp Worker
   Loaded: loaded (/etc/systemd/system/pulpcore-worker@.service; enabled; vendor preset: disabled)
   Active: active (running) since Fri 2024-07-26 11:00:24 EDT; 6min ago
 Main PID: 533067 (pulpcore-worker)
    Tasks: 2 (limit: 821331)
   Memory: 118.0M
   CGroup: /system.slice/system-pulpcore\x2dworker.slice/pulpcore-worker@1.service
           ├─533067 /usr/bin/python3.11 /usr/bin/pulpcore-worker
           └─535548 gpg-agent --homedir /var/lib/pulp/.gnupg --use-standard-socket --daemon

Jul 26 11:00:24 parton systemd[1]: Started Pulp Worker.
Jul 26 11:00:27 parton pulpcore-worker-1[533067]: pulp [None]: pulpcore.tasking.entrypoint:INFO: Starting distributed type worker
Jul 26 11:00:27 parton pulpcore-worker-1[533067]: pulp [None]: pulpcore.tasking.worker:INFO: New worker '533067@parton.subdomain.domain.com' discovered
Jul 26 11:02:13 parton pulpcore-worker-1[535477]: pulp [102c4b45-bbf3-45b9-885e-9ac4ef028c5e]: pulpcore.tasking.tasks:INFO: Starting task 0190ef90-59d2-7a79-850c-bbc03953fb24
Jul 26 11:02:13 parton pulpcore-worker-1[535477]: pulp [102c4b45-bbf3-45b9-885e-9ac4ef028c5e]: pulp_deb.app.tasks.publishing:WARNING: Your pulp instance is configured to prohibit use of the MD5 checksum algorithm!
Jul 26 11:02:13 parton pulpcore-worker-1[535477]: Processing MD5 IN ADDITION to a secure hash like SHA-256 is "highly recommended".
Jul 26 11:02:13 parton pulpcore-worker-1[535477]: See https://docs.pulpproject.org/pulp_deb/workflows/checksums.html for more info.
Jul 26 11:02:13 parton pulpcore-worker-1[535477]: pulp [102c4b45-bbf3-45b9-885e-9ac4ef028c5e]: pulp_deb.app.tasks.publishing:INFO: Publishing: repository=Microsoft_VSCode_Deb-73484, version=9, simple=True, structured=True
Jul 26 11:02:22 parton pulpcore-worker-1[535477]: pulp [102c4b45-bbf3-45b9-885e-9ac4ef028c5e]: pulp_deb.app.tasks.publishing:INFO: Publication: 0190ef90-5a4a-7618-89c9-5fd33ff65796 created
Jul 26 11:02:22 parton pulpcore-worker-1[535477]: pulp [102c4b45-bbf3-45b9-885e-9ac4ef028c5e]: pulpcore.tasking.tasks:INFO: Task completed 0190ef90-59d2-7a79-850c-bbc03953fb24

                                                                                
displaying pulpcore-worker@2.service
● pulpcore-worker@2.service - Pulp Worker
   Loaded: loaded (/etc/systemd/system/pulpcore-worker@.service; enabled; vendor preset: disabled)
   Active: active (running) since Fri 2024-07-26 11:00:24 EDT; 6min ago
 Main PID: 533070 (pulpcore-worker)
    Tasks: 1 (limit: 821331)
   Memory: 116.4M
   CGroup: /system.slice/system-pulpcore\x2dworker.slice/pulpcore-worker@2.service
           └─533070 /usr/bin/python3.11 /usr/bin/pulpcore-worker

Jul 26 11:00:24 parton systemd[1]: Started Pulp Worker.
Jul 26 11:00:27 parton pulpcore-worker-2[533070]: pulp [None]: pulpcore.tasking.entrypoint:INFO: Starting distributed type worker
Jul 26 11:00:27 parton pulpcore-worker-2[533070]: pulp [None]: pulpcore.tasking.worker:INFO: New worker '533070@parton.subdomain.domain.com' discovered

                                                                                
displaying pulpcore-worker@3.service
● pulpcore-worker@3.service - Pulp Worker
   Loaded: loaded (/etc/systemd/system/pulpcore-worker@.service; enabled; vendor preset: disabled)
   Active: active (running) since Fri 2024-07-26 11:00:24 EDT; 6min ago
 Main PID: 533074 (pulpcore-worker)
    Tasks: 1 (limit: 821331)
   Memory: 116.6M
   CGroup: /system.slice/system-pulpcore\x2dworker.slice/pulpcore-worker@3.service
           └─533074 /usr/bin/python3.11 /usr/bin/pulpcore-worker

Jul 26 11:00:24 parton systemd[1]: Started Pulp Worker.
Jul 26 11:00:27 parton pulpcore-worker-3[533074]: pulp [None]: pulpcore.tasking.entrypoint:INFO: Starting distributed type worker
Jul 26 11:00:27 parton pulpcore-worker-3[533074]: pulp [None]: pulpcore.tasking.worker:INFO: New worker '533074@parton.subdomain.domain.com' discovered
Jul 26 11:02:35 parton pulpcore-worker-3[533074]: pulp [None]: pulpcore.tasking.worker:INFO: Clean missing pulp worker 2778632@parton.subdomain.domain.com.
Jul 26 11:02:35 parton pulpcore-worker-3[533074]: pulp [None]: pulpcore.tasking.worker:INFO: Clean missing pulp worker 1931281@parton.subdomain.domain.com.

                                                                                
displaying pulpcore-worker@4.service
● pulpcore-worker@4.service - Pulp Worker
   Loaded: loaded (/etc/systemd/system/pulpcore-worker@.service; enabled; vendor preset: disabled)
   Active: active (running) since Fri 2024-07-26 11:00:24 EDT; 6min ago
 Main PID: 533081 (pulpcore-worker)
    Tasks: 1 (limit: 821331)
   Memory: 116.5M
   CGroup: /system.slice/system-pulpcore\x2dworker.slice/pulpcore-worker@4.service
           └─533081 /usr/bin/python3.11 /usr/bin/pulpcore-worker

Jul 26 11:00:24 parton systemd[1]: Started Pulp Worker.
Jul 26 11:00:27 parton pulpcore-worker-4[533081]: pulp [None]: pulpcore.tasking.entrypoint:INFO: Starting distributed type worker
Jul 26 11:00:27 parton pulpcore-worker-4[533081]: pulp [None]: pulpcore.tasking.worker:INFO: New worker '533081@parton.subdomain.domain.com' discovered

                                                                                
displaying pulpcore-worker@5.service
● pulpcore-worker@5.service - Pulp Worker
   Loaded: loaded (/etc/systemd/system/pulpcore-worker@.service; enabled; vendor preset: disabled)
   Active: active (running) since Fri 2024-07-26 11:00:24 EDT; 6min ago
 Main PID: 533078 (pulpcore-worker)
    Tasks: 1 (limit: 821331)
   Memory: 116.5M
   CGroup: /system.slice/system-pulpcore\x2dworker.slice/pulpcore-worker@5.service
           └─533078 /usr/bin/python3.11 /usr/bin/pulpcore-worker

Jul 26 11:00:24 parton systemd[1]: Started Pulp Worker.
Jul 26 11:00:27 parton pulpcore-worker-5[533078]: pulp [None]: pulpcore.tasking.entrypoint:INFO: Starting distributed type worker
Jul 26 11:00:27 parton pulpcore-worker-5[533078]: pulp [None]: pulpcore.tasking.worker:INFO: New worker '533078@parton.subdomain.domain.com' discovered

                                                                                
displaying pulpcore-worker@6.service
● pulpcore-worker@6.service - Pulp Worker
   Loaded: loaded (/etc/systemd/system/pulpcore-worker@.service; enabled; vendor preset: disabled)
   Active: active (running) since Fri 2024-07-26 11:00:24 EDT; 6min ago
 Main PID: 533089 (pulpcore-worker)
    Tasks: 1 (limit: 821331)
   Memory: 116.5M
   CGroup: /system.slice/system-pulpcore\x2dworker.slice/pulpcore-worker@6.service
           └─533089 /usr/bin/python3.11 /usr/bin/pulpcore-worker

Jul 26 11:00:24 parton systemd[1]: Started Pulp Worker.
Jul 26 11:00:27 parton pulpcore-worker-6[533089]: pulp [None]: pulpcore.tasking.entrypoint:INFO: Starting distributed type worker
Jul 26 11:00:27 parton pulpcore-worker-6[533089]: pulp [None]: pulpcore.tasking.worker:INFO: New worker '533089@parton.subdomain.domain.com' discovered

                                                                                
displaying pulpcore-worker@7.service
● pulpcore-worker@7.service - Pulp Worker
   Loaded: loaded (/etc/systemd/system/pulpcore-worker@.service; enabled; vendor preset: disabled)
   Active: active (running) since Fri 2024-07-26 11:00:24 EDT; 6min ago
 Main PID: 533086 (pulpcore-worker)
    Tasks: 1 (limit: 821331)
   Memory: 116.5M
   CGroup: /system.slice/system-pulpcore\x2dworker.slice/pulpcore-worker@7.service
           └─533086 /usr/bin/python3.11 /usr/bin/pulpcore-worker

Jul 26 11:00:24 parton systemd[1]: Started Pulp Worker.
Jul 26 11:00:27 parton pulpcore-worker-7[533086]: pulp [None]: pulpcore.tasking.entrypoint:INFO: Starting distributed type worker
Jul 26 11:00:27 parton pulpcore-worker-7[533086]: pulp [None]: pulpcore.tasking.worker:INFO: New worker '533086@parton.subdomain.domain.com' discovered

                                                                                
displaying pulpcore-worker@8.service
● pulpcore-worker@8.service - Pulp Worker
   Loaded: loaded (/etc/systemd/system/pulpcore-worker@.service; enabled; vendor preset: disabled)
   Active: active (running) since Fri 2024-07-26 11:00:24 EDT; 6min ago
 Main PID: 533090 (pulpcore-worker)
    Tasks: 1 (limit: 821331)
   Memory: 410.2M
   CGroup: /system.slice/system-pulpcore\x2dworker.slice/pulpcore-worker@8.service
           └─533090 /usr/bin/python3.11 /usr/bin/pulpcore-worker

Jul 26 11:01:44 parton pulpcore-worker-8[535364]: pulp [102c4b45-bbf3-45b9-885e-9ac4ef028c5e]: pulp_deb.app.tasks.synchronizing:WARNING: Your pulp instance is configured to prohibit use of the MD5 checksum algorithm!
Jul 26 11:01:44 parton pulpcore-worker-8[535364]: Processing MD5 IN ADDITION to a secure hash like SHA-256 is "highly recommended".
Jul 26 11:01:44 parton pulpcore-worker-8[535364]: See https://docs.pulpproject.org/pulp_deb/workflows/checksums.html for more info.
Jul 26 11:01:44 parton pulpcore-worker-8[535364]: pulp [102c4b45-bbf3-45b9-885e-9ac4ef028c5e]: pulp_deb.app.tasks.synchronizing:INFO: Downloading Release file for distribution: "stable"
Jul 26 11:01:45 parton pulpcore-worker-8[535364]: pulp [102c4b45-bbf3-45b9-885e-9ac4ef028c5e]: pulp_deb.app.tasks.synchronizing:INFO: Parsing Release file at distribution="stable"
Jul 26 11:01:45 parton pulpcore-worker-8[535364]: pulp [102c4b45-bbf3-45b9-885e-9ac4ef028c5e]: pulp_deb.app.tasks.synchronizing:INFO: Creating PackageIndex unit with relative_path="dists/stable/main/binary-all/Packages".
Jul 26 11:01:45 parton pulpcore-worker-8[535364]: pulp [102c4b45-bbf3-45b9-885e-9ac4ef028c5e]: pulp_deb.app.tasks.synchronizing:INFO: Creating PackageIndex unit with relative_path="dists/stable/main/binary-amd64/Packages".
Jul 26 11:01:45 parton pulpcore-worker-8[535364]: pulp [102c4b45-bbf3-45b9-885e-9ac4ef028c5e]: pulp_deb.app.tasks.synchronizing:INFO: Creating PackageIndex unit with relative_path="dists/stable/main/binary-arm64/Packages".
Jul 26 11:01:45 parton pulpcore-worker-8[535364]: pulp [102c4b45-bbf3-45b9-885e-9ac4ef028c5e]: pulp_deb.app.tasks.synchronizing:INFO: Creating PackageIndex unit with relative_path="dists/stable/main/binary-armhf/Packages".
Jul 26 11:02:12 parton pulpcore-worker-8[535364]: pulp [102c4b45-bbf3-45b9-885e-9ac4ef028c5e]: pulpcore.tasking.tasks:INFO: Task completed 0190ef8f-e873-73d3-9472-bc9a9d510e4f

                                                                                
displaying tomcat
● tomcat.service - Apache Tomcat Web Application Container
   Loaded: loaded (/usr/lib/systemd/system/tomcat.service; enabled; vendor preset: disabled)
   Active: active (running) since Fri 2024-07-26 11:00:24 EDT; 6min ago
 Main PID: 533091 (java)
    Tasks: 50 (limit: 821331)
   Memory: 847.6M
   CGroup: /system.slice/tomcat.service
           └─533091 /usr/lib/jvm/jre-17/bin/java -Xms1024m -Xmx8192m -Dcom.redhat.fips=false -Djava.security.auth.login.config=/usr/share/tomcat/conf/login.config -classpath /usr/share/tomcat/bin/bootstrap.jar:/usr/share/tomcat/bin/tomcat-juli.jar: -Dcatalina.base=/usr/share/tomcat -Dcatalina.home=/usr/share/tomcat -Djava.endorsed.dirs= -Djava.io.tmpdir=/var/cache/tomcat/temp -Djava.util.logging.config.file=/usr/share/tomcat/conf/logging.properties -Djava.util.logging.manager=org.apache.juli.ClassLoaderLogManager org.apache.catalina.startup.Bootstrap start

Jul 26 11:00:25 parton server[533091]: 26-Jul-2024 11:00:25.148 INFO [main] org.apache.catalina.core.AprLifecycleListener.lifecycleEvent APR/OpenSSL configuration: useAprConnector [false], useOpenSSL [true]
Jul 26 11:00:25 parton server[533091]: 26-Jul-2024 11:00:25.151 INFO [main] org.apache.catalina.core.AprLifecycleListener.initializeSSL OpenSSL successfully initialized [OpenSSL 1.1.1k  FIPS 25 Mar 2021]
Jul 26 11:00:25 parton server[533091]: 26-Jul-2024 11:00:25.449 INFO [main] org.apache.coyote.AbstractProtocol.init Initializing ProtocolHandler ["https-openssl-nio-127.0.0.1-23443"]
Jul 26 11:00:25 parton server[533091]: 26-Jul-2024 11:00:25.485 WARNING [main] org.apache.tomcat.util.net.SSLUtilBase.getEnabled Tomcat interprets the [ciphers] attribute in a manner consistent with the latest OpenSSL development branch. Some of the specified [ciphers] are not supported by the configured SSL engine for this connector (which may use JSSE or an older OpenSSL version) and have been skipped: [[TLS_ECDH_ECDSA_WITH_AES_256_GCM_SHA384, TLS_ECDH_RSA_WITH_AES_256_GCM_SHA384, TLS_ECDH_ECDSA_WITH_AES_128_GCM_SHA256, TLS_ECDH_RSA_WITH_AES_128_GCM_SHA256]]
Jul 26 11:00:25 parton server[533091]: 26-Jul-2024 11:00:25.937 INFO [main] org.apache.tomcat.util.net.AbstractEndpoint.logCertificate Connector [https-openssl-nio-127.0.0.1-23443], TLS virtual host [_default_], certificate type [UNDEFINED] configured from keystore [/etc/candlepin/certs/keystore] using alias [tomcat] with trust store [/etc/candlepin/certs/truststore]
Jul 26 11:00:25 parton server[533091]: 26-Jul-2024 11:00:25.950 INFO [main] org.apache.catalina.startup.Catalina.load Server initialization in [994] milliseconds
Jul 26 11:00:26 parton server[533091]: 26-Jul-2024 11:00:26.002 INFO [main] org.apache.catalina.core.StandardService.startInternal Starting service [Catalina]
Jul 26 11:00:26 parton server[533091]: 26-Jul-2024 11:00:26.003 INFO [main] org.apache.catalina.core.StandardEngine.startInternal Starting Servlet engine: [Apache Tomcat/9.0.87]
Jul 26 11:00:26 parton server[533091]: 26-Jul-2024 11:00:26.010 INFO [main] org.apache.catalina.startup.HostConfig.deployDirectory Deploying web application directory [/var/lib/tomcat/webapps/candlepin]
Jul 26 11:00:29 parton server[533091]: 26-Jul-2024 11:00:29.779 INFO [main] org.apache.jasper.servlet.TldScanner.scanJars At least one JAR was scanned for TLDs yet contained no TLDs. Enable debug logging for this logger for a complete list of JARs that were scanned but no TLDs were found in them. Skipping unneeded JARs during scanning can improve startup time and JSP compilation time.

                                                                                
displaying dynflow-sidekiq@orchestrator
● dynflow-sidekiq@orchestrator.service - Foreman jobs daemon - orchestrator on sidekiq
   Loaded: loaded (/usr/lib/systemd/system/dynflow-sidekiq@.service; enabled; vendor preset: disabled)
   Active: active (running) since Fri 2024-07-26 11:01:03 EDT; 5min ago
     Docs: https://theforeman.org
 Main PID: 533102 (sidekiq)
   Status: "Everything ready for world: c05cb0f1-0356-42c8-93b6-2d49b47c1f99"
    Tasks: 12 (limit: 821331)
   Memory: 522.1M
   CGroup: /system.slice/system-dynflow\x2dsidekiq.slice/dynflow-sidekiq@orchestrator.service
           └─533102 sidekiq 6.5.12  [0 of 1 busy]

Jul 26 11:00:24 parton systemd[1]: Starting Foreman jobs daemon - orchestrator on sidekiq...
Jul 26 11:00:25 parton dynflow-sidekiq@orchestrator[533102]: 2024-07-26T15:00:25.025Z pid=533102 tid=bg8i INFO: Enabling systemd notification integration
Jul 26 11:00:25 parton dynflow-sidekiq@orchestrator[533102]: 2024-07-26T15:00:25.915Z pid=533102 tid=bg8i INFO: Booting Sidekiq 6.5.12 with Sidekiq::RedisConnection::RedisAdapter options {:url=>"redis://localhost:6379/6"}
Jul 26 11:00:25 parton dynflow-sidekiq@orchestrator[533102]: 2024-07-26T15:00:25.916Z pid=533102 tid=bg8i INFO: GitLab reliable fetch activated!
Jul 26 11:00:56 parton dynflow-sidekiq@orchestrator[533102]: User with login admin already exists, not seeding as admin.
Jul 26 11:01:03 parton systemd[1]: Started Foreman jobs daemon - orchestrator on sidekiq.

/ displaying foreman
● foreman.service - Foreman
   Loaded: loaded (/usr/lib/systemd/system/foreman.service; enabled; vendor preset: disabled)
  Drop-In: /etc/systemd/system/foreman.service.d
           └─installer.conf
   Active: active (running) since Fri 2024-07-26 11:01:06 EDT; 5min ago
     Docs: https://theforeman.org
 Main PID: 533114 (rails)
   Status: "Puma 6.4.2: cluster: 54/54, worker_status: [{ 5/5 threads, 5 available, 0 backlog },{ 5/5 threads, 5 available, 0 backlog },{ 5/5 threads, 5 available, 0 backlog },{ 5/5 threads, 5 available, 0 backlog },{ 5/5 threads, 5 available, 0 backlog },{ 5/5 threads, 5 available, 0 backlog },{ 5/5 threads, 5 available, 0 backlog },{ 5/5 threads, 5 available, 0 backlog },{ 5/5 threads, 5 available, 0 backlog },{ 5/5 threads, 4 available, 0 backlog },{ 5/5 threads, 5 available, 0 backlog },{ 5/5 threads, 5 available, 0 backlog },{ 5/5 threads, 5 available, 0 backlog },{ 5/5 threads, 5 available, 0 backlog },{ 5/5 threads, 5 available, 0 backlog },{ 5/5 threads, 5 available, 0 backlog },{ 5/5 threads, 5 available, 0 backlog },{ 5/5 threads, 5 available, 0 backlog },{ 5/5 threads, 5 available, 0 backlog },{ 5/5 threads, 5 available, 0 backlog },{ 5/5 threads, 5 available, 0 backlog },{ 5/5 threads, 5 available, 0 backlog },{ 5/5 threads, 5 available, 0 backlog },{ 5/5 threads, 5 available, 0 backlog },{ 5/5 threads, 5 available, 0 backlog },{ 5/5 threads, 5 available, 0 backlog },{ 5/5 threads, 5 available, 0 backlog },{ 5/5 threads, 5 available, 0 backlog },{ 5/5 threads, 5 available, 0 backlog },{ 5/5 threads, 4 available, 0 backlog },{ 5/5 threads, 5 available, 0 backlog },{ 5/5 threads, 5 available, 0 backlog },{ 5/5 threads, 5 available, 0 backlog },{ 5/5 threads, 5 available, 0 backlog },{ 5/5 threads, 5 available, 0 backlog },{ 5/5 threads, 5 available, 0 backlog },{ 5/5 threads, 5 available, 0 backlog },{ 5/5 threads, 5 available, 0 backlog },{ 5/5 threads, 5 available, 0 backlog },{ 5/5 threads, 5 available, 0 backlog },{ 5/5 threads, 5 available, 0 backlog },{ 5/5 threads, 5 available, 0 backlog },{ 5/5 threads, 5 available, 0 backlog },{ 5/5 threads, 5 available, 0 backlog },{ 5/5 threads, 5 available, 0 backlog },{ 5/5 threads, 5 available, 0 backlog },{ 5/5 threads, 5 available, 0 backlog },{ 5/5 threads, 5 available, 0 backlog },{ 5/5 threads, 5 available, 0 backlog },{ 5/5 threads, 5 available, 0 backlog },{ 5/5 threads, 5 available, 0 backlog },{ 5/5 threads, 5 available, 0 backlog },{ 5/5 threads, 5 available, 0 backlog },{ 5/5 threads, 5 available, 0 backlog }]"
    Tasks: 980 (limit: 821331)
   Memory: 5.1G
   CGroup: /system.slice/foreman.service
           ├─533114 puma 6.4.2 (unix:///run/foreman.sock) [foreman]
           ├─534203 puma: cluster worker 0: 533114 [foreman]
           ├─534208 puma: cluster worker 1: 533114 [foreman]
           ├─534214 puma: cluster worker 2: 533114 [foreman]
           ├─534220 puma: cluster worker 3: 533114 [foreman]
           ├─534226 puma: cluster worker 4: 533114 [foreman]
           ├─534232 puma: cluster worker 5: 533114 [foreman]
           ├─534238 puma: cluster worker 6: 533114 [foreman]
           ├─534244 puma: cluster worker 7: 533114 [foreman]
           ├─534250 puma: cluster worker 8: 533114 [foreman]
           ├─534256 puma: cluster worker 9: 533114 [foreman]
           ├─534265 puma: cluster worker 10: 533114 [foreman]
           ├─534273 puma: cluster worker 11: 533114 [foreman]
           ├─534281 puma: cluster worker 12: 533114 [foreman]
           ├─534292 puma: cluster worker 13: 533114 [foreman]
           ├─534298 puma: cluster worker 14: 533114 [foreman]
           ├─534308 puma: cluster worker 15: 533114 [foreman]
           ├─534316 puma: cluster worker 16: 533114 [foreman]
           ├─534325 puma: cluster worker 17: 533114 [foreman]
           ├─534336 puma: cluster worker 18: 533114 [foreman]
           ├─534343 puma: cluster worker 19: 533114 [foreman]
           ├─534356 puma: cluster worker 20: 533114 [foreman]
           ├─534369 puma: cluster worker 21: 533114 [foreman]
           ├─534380 puma: cluster worker 22: 533114 [foreman]
           ├─534390 puma: cluster worker 23: 533114 [foreman]
           ├─534395 puma: cluster worker 24: 533114 [foreman]
           ├─534409 puma: cluster worker 25: 533114 [foreman]
           ├─534430 puma: cluster worker 26: 533114 [foreman]
           ├─534460 puma: cluster worker 27: 533114 [foreman]
           ├─534472 puma: cluster worker 28: 533114 [foreman]
           ├─534481 puma: cluster worker 29: 533114 [foreman]
           ├─534485 puma: cluster worker 30: 533114 [foreman]
           ├─534494 puma: cluster worker 31: 533114 [foreman]
           ├─534500 puma: cluster worker 32: 533114 [foreman]
           ├─534506 puma: cluster worker 33: 533114 [foreman]
           ├─534510 puma: cluster worker 34: 533114 [foreman]
           ├─534521 puma: cluster worker 35: 533114 [foreman]
           ├─534525 puma: cluster worker 36: 533114 [foreman]
           ├─534539 puma: cluster worker 37: 533114 [foreman]
           ├─534548 puma: cluster worker 38: 533114 [foreman]
           ├─534555 puma: cluster worker 39: 533114 [foreman]
           ├─534567 puma: cluster worker 40: 533114 [foreman]
           ├─534573 puma: cluster worker 41: 533114 [foreman]
           ├─534582 puma: cluster worker 42: 533114 [foreman]
           ├─534588 puma: cluster worker 43: 533114 [foreman]
           ├─534594 puma: cluster worker 44: 533114 [foreman]
           ├─534605 puma: cluster worker 45: 533114 [foreman]
           ├─534616 puma: cluster worker 46: 533114 [foreman]
           ├─534622 puma: cluster worker 47: 533114 [foreman]
           ├─534628 puma: cluster worker 48: 533114 [foreman]
           ├─534634 puma: cluster worker 49: 533114 [foreman]
           ├─534640 puma: cluster worker 50: 533114 [foreman]
           ├─534646 puma: cluster worker 51: 533114 [foreman]
           ├─534661 puma: cluster worker 52: 533114 [foreman]
           └─534673 puma: cluster worker 53: 533114 [foreman]

Jul 26 11:01:06 parton foreman[533114]: [533114] - Worker 45 (PID: 534605) booted in 2.35s, phase: 0
Jul 26 11:01:06 parton foreman[533114]: [533114] - Worker 29 (PID: 534481) booted in 2.53s, phase: 0
Jul 26 11:01:06 parton foreman[533114]: [533114] - Worker 19 (PID: 534343) booted in 2.66s, phase: 0
Jul 26 11:01:06 parton foreman[533114]: [533114] - Worker 52 (PID: 534661) booted in 2.3s, phase: 0
Jul 26 11:01:06 parton foreman[533114]: [533114] - Worker 26 (PID: 534430) booted in 2.61s, phase: 0
Jul 26 11:01:06 parton foreman[533114]: [533114] - Worker 53 (PID: 534673) booted in 2.32s, phase: 0
Jul 26 11:01:06 parton foreman[533114]: [533114] - Worker 16 (PID: 534316) booted in 2.74s, phase: 0
Jul 26 11:01:06 parton foreman[533114]: [533114] - Worker 50 (PID: 534640) booted in 2.35s, phase: 0
Jul 26 11:01:06 parton foreman[533114]: [533114] - Worker 46 (PID: 534616) booted in 2.41s, phase: 0
Jul 26 11:01:06 parton systemd[1]: Started Foreman.

/ displaying httpd
● httpd.service - The Apache HTTP Server
   Loaded: loaded (/usr/lib/systemd/system/httpd.service; enabled; vendor preset: disabled)
  Drop-In: /etc/systemd/system/httpd.service.d
           └─limits.conf
   Active: active (running) since Fri 2024-07-26 11:00:24 EDT; 6min ago
     Docs: man:httpd.service(8)
 Main PID: 533116 (httpd)
   Status: "Total requests: 110; Idle/Busy workers 93/6;Requests/sec: 0.306; Bytes served/sec: 2.4KB/sec"
    Tasks: 85 (limit: 821331)
   Memory: 28.5M
   CGroup: /system.slice/httpd.service
           ├─533116 /usr/sbin/httpd -DFOREGROUND
           ├─533141 /usr/sbin/httpd -DFOREGROUND
           └─533142 /usr/sbin/httpd -DFOREGROUND

Jul 26 11:00:24 parton systemd[1]: Starting The Apache HTTP Server...
Jul 26 11:00:24 parton systemd[1]: Started The Apache HTTP Server.
Jul 26 11:00:24 parton httpd[533116]: Server configured, listening on: port 80, port 443

/ displaying puppetserver
● puppetserver.service - puppetserver Service
   Loaded: loaded (/usr/lib/systemd/system/puppetserver.service; enabled; vendor preset: disabled)
  Drop-In: /etc/systemd/system/puppetserver.service.d
           └─privatetmp.conf
   Active: active (running) since Fri 2024-07-26 11:00:40 EDT; 5min ago
  Process: 529087 ExecStop=/opt/puppetlabs/server/apps/puppetserver/bin/puppetserver stop (code=exited, status=0/SUCCESS)
  Process: 533120 ExecStart=/opt/puppetlabs/server/apps/puppetserver/bin/puppetserver start (code=exited, status=0/SUCCESS)
 Main PID: 533292 (java)
    Tasks: 118 (limit: 4915)
   Memory: 1.6G
   CGroup: /system.slice/puppetserver.service
           ├─533292 /usr/bin/java -Xms2G -Xmx2G -Dcom.redhat.fips=false -Djruby.logger.class=com.puppetlabs.jruby_utils.jruby.Slf4jLogger -Dlogappender=F1 -XX:OnOutOfMemoryError=kill -9 %p -XX:ErrorFile=/var/log/puppetlabs/puppetserver/puppetserver_err_pid%p.log -cp /opt/puppetlabs/server/apps/puppetserver/puppet-server-release.jar:/opt/puppetlabs/puppet/lib/ruby/vendor_ruby/facter.jar:/opt/puppetlabs/server/data/puppetserver/jars/* clojure.main -m puppetlabs.trapperkeeper.main --config /etc/puppetlabs/puppetserver/conf.d --bootstrap-config /etc/puppetlabs/puppetserver/services.d/,/opt/puppetlabs/server/apps/puppetserver/config/services.d/ --restart-file /opt/puppetlabs/server/data/puppetserver/restartcounter
           └─536648 ruby /etc/puppetlabs/puppet/node.rb orlnet00007238.subdomain.domain.com

Jul 26 11:00:24 parton systemd[1]: Starting puppetserver Service...
Jul 26 11:00:28 parton puppetserver[533292]: WARNING: abs already refers to: #'clojure.core/abs in namespace: medley.core, being replaced by: #'medley.core/abs
Jul 26 11:00:40 parton systemd[1]: Started puppetserver Service.

/ displaying dynflow-sidekiq@worker-1
● dynflow-sidekiq@worker-1.service - Foreman jobs daemon - worker-1 on sidekiq
   Loaded: loaded (/usr/lib/systemd/system/dynflow-sidekiq@.service; enabled; vendor preset: disabled)
   Active: active (running) since Fri 2024-07-26 11:01:39 EDT; 4min 45s ago
     Docs: https://theforeman.org
 Main PID: 535241 (sidekiq)
   Status: "Everything ready for world: 71d62dcc-7e94-47e1-9dfe-04d466602826"
    Tasks: 15 (limit: 821331)
   Memory: 639.2M
   CGroup: /system.slice/system-dynflow\x2dsidekiq.slice/dynflow-sidekiq@worker-1.service
           └─535241 sidekiq 6.5.12  [4 of 5 busy]

Jul 26 11:01:06 parton systemd[1]: Starting Foreman jobs daemon - worker-1 on sidekiq...
Jul 26 11:01:06 parton dynflow-sidekiq@worker-1[535241]: 2024-07-26T15:01:06.679Z pid=535241 tid=bhp1 INFO: Enabling systemd notification integration
Jul 26 11:01:07 parton dynflow-sidekiq@worker-1[535241]: 2024-07-26T15:01:07.485Z pid=535241 tid=bhp1 INFO: Booting Sidekiq 6.5.12 with Sidekiq::RedisConnection::RedisAdapter options {:url=>"redis://localhost:6379/6"}
Jul 26 11:01:07 parton dynflow-sidekiq@worker-1[535241]: 2024-07-26T15:01:07.486Z pid=535241 tid=bhp1 INFO: GitLab reliable fetch activated!
Jul 26 11:01:39 parton systemd[1]: Started Foreman jobs daemon - worker-1 on sidekiq.

/ displaying dynflow-sidekiq@worker-hosts-queue-1
● dynflow-sidekiq@worker-hosts-queue-1.service - Foreman jobs daemon - worker-hosts-queue-1 on sidekiq
   Loaded: loaded (/usr/lib/systemd/system/dynflow-sidekiq@.service; enabled; vendor preset: disabled)
   Active: active (running) since Fri 2024-07-26 11:01:40 EDT; 4min 45s ago
     Docs: https://theforeman.org
 Main PID: 535242 (sidekiq)
   Status: "Everything ready for world: c7b98078-dda9-47cd-9616-e23e52cd5435"
    Tasks: 15 (limit: 821331)
   Memory: 507.0M
   CGroup: /system.slice/system-dynflow\x2dsidekiq.slice/dynflow-sidekiq@worker-hosts-queue-1.service
           └─535242 sidekiq 6.5.12  [0 of 5 busy]

Jul 26 11:01:06 parton systemd[1]: Starting Foreman jobs daemon - worker-hosts-queue-1 on sidekiq...
Jul 26 11:01:06 parton dynflow-sidekiq@worker-hosts-queue-1[535242]: 2024-07-26T15:01:06.672Z pid=535242 tid=bhp2 INFO: Enabling systemd notification integration
Jul 26 11:01:07 parton dynflow-sidekiq@worker-hosts-queue-1[535242]: 2024-07-26T15:01:07.430Z pid=535242 tid=bhp2 INFO: Booting Sidekiq 6.5.12 with Sidekiq::RedisConnection::RedisAdapter options {:url=>"redis://localhost:6379/6"}
Jul 26 11:01:07 parton dynflow-sidekiq@worker-hosts-queue-1[535242]: 2024-07-26T15:01:07.432Z pid=535242 tid=bhp2 INFO: GitLab reliable fetch activated!
Jul 26 11:01:40 parton systemd[1]: Started Foreman jobs daemon - worker-hosts-queue-1 on sidekiq.

/ displaying foreman-proxy
● foreman-proxy.service - Foreman Proxy
   Loaded: loaded (/usr/lib/systemd/system/foreman-proxy.service; enabled; vendor preset: disabled)
  Drop-In: /etc/systemd/system/foreman-proxy.service.d
           └─90-limits.conf
   Active: active (running) since Fri 2024-07-26 11:01:41 EDT; 4min 43s ago
 Main PID: 535317 (smart-proxy)
    Tasks: 7 (limit: 821331)
   Memory: 71.4M
   CGroup: /system.slice/foreman-proxy.service
           └─535317 /usr/bin/ruby /usr/share/foreman-proxy/bin/smart-proxy

Jul 26 11:01:40 parton systemd[1]: Starting Foreman Proxy...
Jul 26 11:01:41 parton systemd[1]: Started Foreman Proxy.

/ displaying foreman-cockpit
● foreman-cockpit.service - Foreman authentication service for Cockpit
   Loaded: loaded (/usr/lib/systemd/system/foreman-cockpit.service; enabled; vendor preset: disabled)
   Active: active (running) since Fri 2024-07-26 11:01:41 EDT; 4min 43s ago
     Docs: https://theforeman.org
 Main PID: 535333 (cockpit-ws)
    Tasks: 2 (limit: 821331)
   Memory: 1.2M
   CGroup: /system.slice/foreman-cockpit.service
           └─535333 /usr/libexec/cockpit-ws --no-tls --address 127.0.0.1 --port 19090

Jul 26 11:01:41 parton systemd[1]: Started Foreman authentication service for Cockpit.

/ All services displayed
/ All services are running [OK]
--------------------------------------------------------------------------------


Downgraded back to 4.4.10 and it looks like it is back up and working
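
For anyone needing the same rollback, the downgrade presumably amounts to something like the following (package names per the rpm -qa output later in this thread; adjust the versions to whatever your repos carry):

# dnf downgrade candlepin-4.4.10 candlepin-selinux-4.4.10
# systemctl restart tomcat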

I have created 2300323 – candlepin-4.4.12-1 DB upgrade gets stuck to track this issue. I am not able to reproduce it on a fresh candlepin installation; it likely has to do with the shape/size of existing data.

Hi, could anyone who has hit the candlepin-4.4.12-1 upgrade issue send us a DB dump to help reproduce it? We cannot reproduce it locally, and we think the DB data may play a big role.

Just to add my contribution: I tried to upgrade from 3.8 to 3.11 (via 3.9 and 3.10).
3.8 to 3.9 and 3.9 to 3.10: OK

3.10 to 3.11: some errors during the first steps:

dnf module disable pulpcore => some errors.

So I rebooted => same error.
Continue with:

dnf -y module switch-to postgresql:13 => OK

Test again

dnf module disable pulpcore => still some errors, but less serious ones

Continue procedure.
Launch:

foreman-installer

foreman-rake upgrade:run => took a very long time to finish, and ended with:
Upgrade Step 2/2: katello:clean_backend_objects. This may take a long while.
Failed upgrade task: katello:clean_backend_objects, see logs for more information.
Success!

The full log is at /var/log/foreman-installer/katello.log

After that, the foreman server was responding, but it took about 5 minutes to respond to each command.

I tried modifying the candlepin.conf file (changing Manage to None) => the server no longer responded. I’m currently testing “cpdb --update-schema”.
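
(Side note: “changing Manage to None” presumably refers to candlepin’s database-management-on-startup option in /etc/candlepin/candlepin.conf, which decides whether liquibase changesets are applied automatically when tomcat starts. A sketch, with the key name being an assumption to verify against your candlepin version:)

# /etc/candlepin/candlepin.conf (assumed key name)
candlepin.db.database_manage_on_start=None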

But if needed I could try to create the DB dump, just let me know which command I have to run.
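
For reference, such a dump can typically be produced with pg_dump, assuming the default candlepin database name and the local PostgreSQL of a standard Katello install:

# sudo -u postgres pg_dump -Fc candlepin -f /tmp/candlepin.dump

The -Fc custom format can later be restored on the developers’ side with pg_restore.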

I tried to attach the complete log file of the upgrade, but: “new users are not allowed to post files!”

Please see Feedback for Foreman 3.11 & Katello 4.13 - #68 by Ceiu

Thanks for the link. I missed this important information.
So I will stay on the previous version and wait for the candlepin patch, unless someone can confirm that candlepin 4.4.10 is not affected by the problem.

Hi, I confirm that 4.4.10 is not affected by the problem.

Thanks, I will test

candlepin-4.4.13-1 is now available, which contains the fix for the issue
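
Installing it should be a plain package upgrade plus a tomcat restart, along the lines of:

# dnf --refresh upgrade candlepin candlepin-selinux
# systemctl restart tomcat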


Hi,

My Foreman/Katello instance was upgraded last Friday from 3.8/4.10 up to 3.11/4.13, and I had to downgrade the candlepin packages to 4.4.10 as 4.4.12 was not working at that time.
I reported this in the ticket Feedback for Foreman 3.11 & Katello 4.13 and forgot to check updates on the ticket until today…

I had a new RHEL8 VM registered today and was not able to install anything or refresh the metadata cache: the exact same 403 error. After checking with the rct cat-cert /etc/pki/entitlement/XXX.pem command, the authorized content URL was not OK, i.e. it contained %2F after the organization name.
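
To spot this quickly, the content URLs in the entitlement certificate can be filtered, for example:

# rct cat-cert /etc/pki/entitlement/XXX.pem | grep -i url

A broken certificate shows a literal %2F (a URL-encoded “/”) after the organization name instead of a real path separator.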

Following the details provided on the other ticket, I installed rng-tools as a workaround, enabled the rngd service, and then upgraded candlepin again from 4.4.10 to 4.4.12. It was successful this time, as liquibase ran correctly at startup (as seen in the catalina log file).
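
The workaround itself boils down to two commands:

# dnf install rng-tools
# systemctl enable --now rngd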

After re-registering the RHEL8 host in Katello, the certificate was fully valid this time, and package installation finally worked.
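
(A full re-register may not always be necessary: subscription-manager refresh also regenerates the entitlement certificates from the server, which might be enough once candlepin is fixed; this is an untested assumption.)

# subscription-manager refresh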

So it seems that in some cases it works “out of the box” after upgrading to the latest version, while in other cases it fails, whether with an existing database or an empty one.

Hope this can help.

Best Regards,
Nicolas.

Hmm … interesting.

I had another Foreman/Katello instance that was updated from 3.9/4.11 to 3.11/4.13 on 7/23, and at that time it was candlepin 4.4.10 that got installed (no downgrade needed then: either 4.4.12 was not yet available, or my cache was too old and had not expired), and everything worked as expected.

Then I refreshed the cache and launched the update to get the latest packages, and the update is stuck right now at candlepin startup.
It is the exact same issue I had with the other instance updated on 7/19… no liquibase logs in the catalina logs at candlepin startup, and production.log is filled with errors:

[...]
2024-07-31T18:42:08 [I|app|323e5e42] Started GET "/" for 10.140.0.10 at 2024-07-31 18:42:08 +0200
2024-07-31T18:42:21 [E|app|cb18b8c7] Error occurred while starting Katello::CandlepinEventListener
2024-07-31T18:42:21 [E|app|cb18b8c7] Connection refused - connect(2) for "localhost" port 61613
2024-07-31T18:42:21 [E|app|cb18b8c7] /usr/share/gems/gems/stomp-1.4.10/lib/connection/netio.rb:461:in `initialize'
[...]
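
The “Connection refused” on port 61613 matches Katello’s event listener failing to reach the STOMP endpoint that candlepin exposes once it is fully up; whether anything is listening there can be checked with, e.g.:

# ss -tlnp | grep 61613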

Checked again to be sure:

# rpm -qa |grep candlepin
candlepin-4.4.13-1.el8.noarch
candlepin-selinux-4.4.13-1.el8.noarch

I just installed the rng-tools package and enabled rngd.service, and without doing anything else, while foreman-installer was still trying to run foreman-rake upgrade:run, the candlepin startup finished!

As you can see in the following catalina log file, there is a gap between the last startup log at 18:28:25 and the start of liquibase at 18:45:59:

31-Jul-2024 18:28:19.483 INFO [main] org.apache.catalina.core.AprLifecycleListener.lifecycleEvent An older version [1.2.35] of the Apache Tomcat Native library based on APR is installed, while Tomcat recommends a minimum version of [1.2.38]
31-Jul-2024 18:28:19.490 INFO [main] org.apache.catalina.core.AprLifecycleListener.lifecycleEvent Loaded Apache Tomcat Native library [1.2.35] using APR version [1.6.3]
31-Jul-2024 18:28:19.491 INFO [main] org.apache.catalina.core.AprLifecycleListener.lifecycleEvent APR capabilities: IPv6 [true], sendfile [true], accept filters [false], random [true], UDS [true]
31-Jul-2024 18:28:19.491 INFO [main] org.apache.catalina.core.AprLifecycleListener.lifecycleEvent APR/OpenSSL configuration: useAprConnector [false], useOpenSSL [true]
31-Jul-2024 18:28:19.496 INFO [main] org.apache.catalina.core.AprLifecycleListener.initializeSSL OpenSSL successfully initialized [OpenSSL 1.1.1k  FIPS 25 Mar 2021]
31-Jul-2024 18:28:19.932 INFO [main] org.apache.coyote.AbstractProtocol.init Initializing ProtocolHandler ["https-openssl-nio-127.0.0.1-23443"]
31-Jul-2024 18:28:19.963 WARNING [main] org.apache.tomcat.util.net.SSLUtilBase.getEnabled Tomcat interprets the [ciphers] attribute in a manner consistent with the latest OpenSSL development branch. Some of the specified [ciphers] are not supported by the configured SSL engine for this connector (which may use JSSE or an older OpenSSL version) and have been skipped: [[TLS_ECDH_ECDSA_WITH_AES_256_GCM_SHA384, TLS_ECDH_RSA_WITH_AES_256_GCM_SHA384, TLS_ECDH_ECDSA_WITH_AES_128_GCM_SHA256, TLS_ECDH_RSA_WITH_AES_128_GCM_SHA256]]
31-Jul-2024 18:28:20.689 INFO [main] org.apache.tomcat.util.net.AbstractEndpoint.logCertificate Connector [https-openssl-nio-127.0.0.1-23443], TLS virtual host [_default_], certificate type [UNDEFINED] configured from keystore [/etc/candlepin/certs/keystore] using alias [tomcat] with trust store [/etc/candlepin/certs/truststore]
31-Jul-2024 18:28:20.701 INFO [main] org.apache.catalina.startup.Catalina.load Server initialization in [1513] milliseconds
31-Jul-2024 18:28:20.744 INFO [main] org.apache.catalina.core.StandardService.startInternal Starting service [Catalina]
31-Jul-2024 18:28:20.744 INFO [main] org.apache.catalina.core.StandardEngine.startInternal Starting Servlet engine: [Apache Tomcat/9.0.87]
31-Jul-2024 18:28:20.751 INFO [main] org.apache.catalina.startup.HostConfig.deployDirectory Deploying web application directory [/var/lib/tomcat/webapps/candlepin]
31-Jul-2024 18:28:25.525 INFO [main] org.apache.jasper.servlet.TldScanner.scanJars At least one JAR was scanned for TLDs yet contained no TLDs. Enable debug logging for this logger for a complete list of JARs that were scanned but no TLDs were found in them. Skipping unneeded JARs during scanning can improve startup time and JSP compilation time.
31-Jul-2024 18:45:59.814 INFO [main] liquibase.database.null Set default schema name to public
31-Jul-2024 18:45:59.843 INFO [main] liquibase.changelog.null Reading from public.databasechangelog
31-Jul-2024 18:46:01.555 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public void org.candlepin.model.EntitlementCurator.delete(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@1d57e7e5]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
31-Jul-2024 18:46:01.591 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public void org.candlepin.model.EntitlementCertificateCurator.delete(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@1d57e7e5]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
31-Jul-2024 18:46:01.597 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public org.candlepin.model.Persisted org.candlepin.model.OwnerCurator.create(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@1d57e7e5]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
31-Jul-2024 18:46:01.625 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public org.candlepin.model.Persisted org.candlepin.model.ProductCurator.create(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@1d57e7e5]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
31-Jul-2024 18:46:01.625 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public void org.candlepin.model.ProductCurator.delete(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@1d57e7e5]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
31-Jul-2024 18:46:01.625 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public org.candlepin.model.Persisted org.candlepin.model.ProductCurator.merge(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@1d57e7e5]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
31-Jul-2024 18:46:01.631 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public void org.candlepin.model.ContentCurator.delete(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@1d57e7e5]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
31-Jul-2024 18:46:01.646 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public org.candlepin.model.Persisted org.candlepin.model.ConsumerCurator.create(org.candlepin.model.Persisted,boolean)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@1d57e7e5]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
31-Jul-2024 18:46:01.647 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public void org.candlepin.model.ConsumerCurator.delete(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@1d57e7e5]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
31-Jul-2024 18:46:01.683 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public void org.candlepin.model.CdnCurator.delete(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@1d57e7e5]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
31-Jul-2024 18:46:01.690 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public void org.candlepin.model.PoolCurator.delete(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@1d57e7e5]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
31-Jul-2024 18:46:01.745 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public org.candlepin.model.Persisted org.candlepin.model.RulesCurator.create(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@1d57e7e5]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
31-Jul-2024 18:46:01.745 WARNING [main] com.google.inject.internal.ProxyFactory.<init> Method [public void org.candlepin.model.RulesCurator.delete(org.candlepin.model.Persisted)] is synthetic and is being intercepted by [com.google.inject.persist.jpa.JpaLocalTxnInterceptor@1d57e7e5]. This could indicate a bug.  The method may be intercepted twice, or may not be intercepted at all.
31-Jul-2024 18:46:12.504 INFO [main] org.apache.catalina.startup.HostConfig.deployDirectory Deployment of web application directory [/var/lib/tomcat/webapps/candlepin] has finished in [1,071,753] ms
31-Jul-2024 18:46:12.509 INFO [main] org.apache.coyote.AbstractProtocol.start Starting ProtocolHandler ["https-openssl-nio-127.0.0.1-23443"]
31-Jul-2024 18:46:12.521 INFO [main] org.apache.catalina.startup.Catalina.start Server startup in [1071819] milliseconds

The liquibase part started right after the rngd.service launch:

● rngd.service - Hardware RNG Entropy Gatherer Daemon
   Loaded: loaded (/usr/lib/systemd/system/rngd.service; enabled; vendor preset: enabled)
   Active: active (running) since Wed 2024-07-31 18:45:53 CEST; 7min ago
 Main PID: 1290235 (rngd)

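The timing strongly suggests candlepin startup was blocked waiting on the kernel entropy pool (plausibly SecureRandom reading /dev/random under the FIPS-enabled OpenSSL seen in the logs above), and that rngd unblocked it. The available entropy can be checked with:

# cat /proc/sys/kernel/random/entropy_avail
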
It seems there is something more going on in some setups, and the new candlepin release did not fully fix it :pensive:

Hello,

I upgraded my Foreman with Katello from 3.11 to 3.12, following the documentation Upgrading Foreman to 3.12, and it is still not working.

Does anyone have a working solution that doesn’t require a new installation?

Best regards
T. Reineck

Ah OK, by setting the correct OS release it is working now:

subscription-manager release --set 9.2
subscription-manager release --show