Problem:
After running dnf update (with a reboot), Foreman no longer starts. Apache httpd fails to start because of a problem in its configuration. /var/log/messages shows:
AH00526: Syntax error on line 5 of /etc/httpd/conf.d/ssl.conf:
Cannot define multiple Listeners on the same IP:port
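The AH00526 error means two Listen directives claim the same IP:port. A quick way to locate the duplicates is to grep every Listen statement across the httpd configuration and to validate the config without starting the daemon (standard EL8 paths assumed):

```shell
# List every Listen directive in the main config and the conf.d drop-ins;
# two hits for the same IP:port produce AH00526.
grep -RnH '^[[:space:]]*Listen' /etc/httpd/conf/httpd.conf /etc/httpd/conf.d/

# Check the configuration syntax without starting httpd.
httpd -t
```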
Expected outcome:
System updates should not break the Foreman application.
Ideally, it should be possible to run system updates independently of Foreman application updates.
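One possible way to decouple the two, sketched here as a workaround rather than a recommendation, is to keep the Apache module packages out of routine OS updates so that only a deliberate Foreman maintenance run touches them (the versionlock plugin is an assumption; the package names are taken from the transaction output below):

```shell
# One-off: update the OS while skipping the Apache packages.
dnf update --exclude='httpd*' --exclude='mod_ssl'

# Persistent: pin the packages until the next planned Foreman update
# (requires the dnf versionlock plugin).
dnf install python3-dnf-plugin-versionlock
dnf versionlock add httpd httpd-filesystem httpd-tools mod_ssl
```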
Foreman and Proxy versions:
3.5.1
Foreman and Proxy plugin versions:
katello 4.7.2
foreman-tasks 7.1.1
foreman_remote_execution 8.2.0
Distribution and version:
Rocky Linux 8.7
Kernel 4.18.0-425.10.1.el8_7.x86_64
Other relevant data:
I made a snapshot of the Foreman VM before running dnf update.
When I revert to that snapshot and reboot, there is no problem, so I can reproduce the issue at will.
First, I stopped the Foreman application cleanly:
# foreman-maintain service stop
Running Stop Services
================================================================================
Check if command is run as root user: [OK]
--------------------------------------------------------------------------------
Stop applicable services:
Stopping the following service(s):
redis, postgresql, pulpcore-api, pulpcore-content, pulpcore-api.socket, pulpcore-content.socket, pulpcore-worker@1.service, pulpcore-worker@2.service, pulpcore-worker@3.service, pulpcore-worker@4.service, tomcat, dynflow-sidekiq@orchestrator, foreman, httpd, foreman.socket, dynflow-sidekiq@worker-1, dynflow-sidekiq@worker-hosts-queue-1, foreman-proxy
/ All services stopped [OK]
--------------------------------------------------------------------------------
Then I ran dnf update:
# dnf clean all
69 files removed
# dnf update
Rocky Linux 8 - AppStream 12 MB/s | 9.9 MB 00:00
Rocky Linux 8 - BaseOS 14 MB/s | 4.9 MB 00:00
Rocky Linux 8 - Extras 36 kB/s | 12 kB 00:00
Rocky Linux 8 - PowerTools 10 MB/s | 2.7 MB 00:00
Foreman 3.5 1.8 MB/s | 1.8 MB 00:00
Foreman plugins 3.5 7.0 MB/s | 2.0 MB 00:00
Katello 4.7 2.4 MB/s | 635 kB 00:00
Candlepin: an open source entitlement management system. 137 kB/s | 30 kB 00:00
pulpcore: Fetch, Upload, Organize, and Distribute Software Packages. 1.1 MB/s | 277 kB 00:00
Puppet 7 Repository el 8 - x86_64 11 MB/s | 16 MB 00:01
Dependencies resolved.
=====================================================================================================================================================================================================================================
Package Architecture Version Repository Size
=====================================================================================================================================================================================================================================
Installing:
kernel x86_64 4.18.0-425.13.1.el8_7 baseos 8.8 M
Upgrading:
bpftool x86_64 4.18.0-425.13.1.el8_7 baseos 9.6 M
candlepin noarch 4.2.13-1.el8 katello-candlepin 67 M
candlepin-selinux noarch 4.2.13-1.el8 katello-candlepin 497 k
curl x86_64 7.61.1-25.el8_7.2 baseos 351 k
grub2-common noarch 1:2.02-142.el8_7.3.rocky.0.2 baseos 894 k
grub2-pc x86_64 1:2.02-142.el8_7.3.rocky.0.2 baseos 45 k
grub2-pc-modules noarch 1:2.02-142.el8_7.3.rocky.0.2 baseos 921 k
grub2-tools x86_64 1:2.02-142.el8_7.3.rocky.0.2 baseos 2.0 M
grub2-tools-efi x86_64 1:2.02-142.el8_7.3.rocky.0.2 baseos 478 k
grub2-tools-extra x86_64 1:2.02-142.el8_7.3.rocky.0.2 baseos 1.1 M
grub2-tools-minimal x86_64 1:2.02-142.el8_7.3.rocky.0.2 baseos 212 k
httpd x86_64 2.4.37-51.module+el8.7.0+1155+5163394a.1 appstream 1.4 M
httpd-filesystem noarch 2.4.37-51.module+el8.7.0+1155+5163394a.1 appstream 41 k
httpd-tools x86_64 2.4.37-51.module+el8.7.0+1155+5163394a.1 appstream 109 k
iptables x86_64 1.8.4-23.el8_7.1 baseos 585 k
iptables-ebtables x86_64 1.8.4-23.el8_7.1 baseos 72 k
iptables-libs x86_64 1.8.4-23.el8_7.1 baseos 108 k
katello noarch 4.7.3-1.el8 katello 16 k
katello-common noarch 4.7.3-1.el8 katello 24 k
katello-debug noarch 4.7.3-1.el8 katello 17 k
katello-repos noarch 4.7.3-1.el8 katello 17 k
kernel-tools x86_64 4.18.0-425.13.1.el8_7 baseos 9.1 M
kernel-tools-libs x86_64 4.18.0-425.13.1.el8_7 baseos 8.9 M
kmod-kvdo x86_64 6.2.7.17-88.el8_7 baseos 347 k
libcurl x86_64 7.61.1-25.el8_7.2 baseos 301 k
libnfsidmap x86_64 1:2.3.3-57.el8_7.1 baseos 121 k
libsmbclient x86_64 4.16.4-4.el8_7 baseos 151 k
libvirt-libs x86_64 8.0.0-10.2.module+el8.7.0+1151+ecbb9390 appstream 4.7 M
libwbclient x86_64 4.16.4-4.el8_7 baseos 123 k
mod_ssl x86_64 1:2.4.37-51.module+el8.7.0+1155+5163394a.1 appstream 138 k
openssh x86_64 8.0p1-17.el8_7 baseos 522 k
openssh-clients x86_64 8.0p1-17.el8_7 baseos 668 k
openssh-server x86_64 8.0p1-17.el8_7 baseos 492 k
platform-python x86_64 3.6.8-48.el8_7.1.rocky.0 baseos 86 k
platform-python-setuptools noarch 39.2.0-6.el8_7.1 baseos 631 k
python3-libs x86_64 3.6.8-48.el8_7.1.rocky.0 baseos 7.8 M
python3-perf x86_64 4.18.0-425.13.1.el8_7 baseos 9.0 M
python3-setuptools noarch 39.2.0-6.el8_7.1 baseos 162 k
python3-setuptools-wheel noarch 39.2.0-6.el8_7.1 baseos 288 k
rubygem-katello noarch 4.7.3-1.el8 katello 10 M
samba-client-libs x86_64 4.16.4-4.el8_7 baseos 5.0 M
samba-common noarch 4.16.4-4.el8_7 baseos 226 k
samba-common-libs x86_64 4.16.4-4.el8_7 baseos 178 k
systemd x86_64 239-68.el8_7.4 baseos 3.6 M
systemd-libs x86_64 239-68.el8_7.4 baseos 1.1 M
systemd-pam x86_64 239-68.el8_7.4 baseos 491 k
systemd-udev x86_64 239-68.el8_7.4 baseos 1.6 M
tar x86_64 2:1.30-6.el8_7.1 baseos 837 k
Installing dependencies:
kernel-core x86_64 4.18.0-425.13.1.el8_7 baseos 41 M
kernel-modules x86_64 4.18.0-425.13.1.el8_7 baseos 33 M
Removing:
kernel x86_64 4.18.0-348.el8.0.2 @anaconda 0
kernel-core x86_64 4.18.0-348.el8.0.2 @anaconda 68 M
kernel-modules x86_64 4.18.0-348.el8.0.2 @anaconda 22 M
Transaction Summary
=====================================================================================================================================================================================================================================
Install 3 Packages
Upgrade 48 Packages
Remove 3 Packages
Total download size: 235 M
Is this ok [y/N]: y
Downloading Packages:
<< / snip / >>
-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Total 42 MB/s | 235 MB 00:05
Running transaction check
Transaction check succeeded.
Running transaction test
Transaction test succeeded.
Running transaction
<< - snip - >>
Running scriptlet: httpd-filesystem-2.4.37-51.module+el8.7.0+1155+5163394a.1.noarch 17/102
Upgrading : httpd-filesystem-2.4.37-51.module+el8.7.0+1155+5163394a.1.noarch 17/102
<< - snip - >>
Upgrading : httpd-tools-2.4.37-51.module+el8.7.0+1155+5163394a.1.x86_64 31/102
Upgrading : httpd-2.4.37-51.module+el8.7.0+1155+5163394a.1.x86_64 32/102
Running scriptlet: httpd-2.4.37-51.module+el8.7.0+1155+5163394a.1.x86_64 32/102
<< - snip - >>
Running scriptlet: httpd-2.4.37-51.module+el8.7.0+1059+126e9251.x86_64 63/102
Cleanup : httpd-2.4.37-51.module+el8.7.0+1059+126e9251.x86_64 63/102
Running scriptlet: httpd-2.4.37-51.module+el8.7.0+1059+126e9251.x86_64 63/102
<< - snip - >>
Cleanup : httpd-tools-2.4.37-51.module+el8.7.0+1059+126e9251.x86_64 98/102
<< - snip - >>
Running scriptlet: httpd-2.4.37-51.module+el8.7.0+1155+5163394a.1.x86_64 102/102
<< - snip - >>
Upgraded:
bpftool-4.18.0-425.13.1.el8_7.x86_64 candlepin-4.2.13-1.el8.noarch candlepin-selinux-4.2.13-1.el8.noarch
curl-7.61.1-25.el8_7.2.x86_64 grub2-common-1:2.02-142.el8_7.3.rocky.0.2.noarch grub2-pc-1:2.02-142.el8_7.3.rocky.0.2.x86_64
grub2-pc-modules-1:2.02-142.el8_7.3.rocky.0.2.noarch grub2-tools-1:2.02-142.el8_7.3.rocky.0.2.x86_64 grub2-tools-efi-1:2.02-142.el8_7.3.rocky.0.2.x86_64
grub2-tools-extra-1:2.02-142.el8_7.3.rocky.0.2.x86_64 grub2-tools-minimal-1:2.02-142.el8_7.3.rocky.0.2.x86_64 httpd-2.4.37-51.module+el8.7.0+1155+5163394a.1.x86_64
httpd-filesystem-2.4.37-51.module+el8.7.0+1155+5163394a.1.noarch httpd-tools-2.4.37-51.module+el8.7.0+1155+5163394a.1.x86_64 iptables-1.8.4-23.el8_7.1.x86_64
iptables-ebtables-1.8.4-23.el8_7.1.x86_64 iptables-libs-1.8.4-23.el8_7.1.x86_64 katello-4.7.3-1.el8.noarch
katello-common-4.7.3-1.el8.noarch katello-debug-4.7.3-1.el8.noarch katello-repos-4.7.3-1.el8.noarch
kernel-tools-4.18.0-425.13.1.el8_7.x86_64 kernel-tools-libs-4.18.0-425.13.1.el8_7.x86_64 kmod-kvdo-6.2.7.17-88.el8_7.x86_64
libcurl-7.61.1-25.el8_7.2.x86_64 libnfsidmap-1:2.3.3-57.el8_7.1.x86_64 libsmbclient-4.16.4-4.el8_7.x86_64
libvirt-libs-8.0.0-10.2.module+el8.7.0+1151+ecbb9390.x86_64 libwbclient-4.16.4-4.el8_7.x86_64 mod_ssl-1:2.4.37-51.module+el8.7.0+1155+5163394a.1.x86_64
openssh-8.0p1-17.el8_7.x86_64 openssh-clients-8.0p1-17.el8_7.x86_64 openssh-server-8.0p1-17.el8_7.x86_64
platform-python-3.6.8-48.el8_7.1.rocky.0.x86_64 platform-python-setuptools-39.2.0-6.el8_7.1.noarch python3-libs-3.6.8-48.el8_7.1.rocky.0.x86_64
python3-perf-4.18.0-425.13.1.el8_7.x86_64 python3-setuptools-39.2.0-6.el8_7.1.noarch python3-setuptools-wheel-39.2.0-6.el8_7.1.noarch
rubygem-katello-4.7.3-1.el8.noarch samba-client-libs-4.16.4-4.el8_7.x86_64 samba-common-4.16.4-4.el8_7.noarch
samba-common-libs-4.16.4-4.el8_7.x86_64 systemd-239-68.el8_7.4.x86_64 systemd-libs-239-68.el8_7.4.x86_64
systemd-pam-239-68.el8_7.4.x86_64 systemd-udev-239-68.el8_7.4.x86_64 tar-2:1.30-6.el8_7.1.x86_64
Installed:
kernel-4.18.0-425.13.1.el8_7.x86_64 kernel-core-4.18.0-425.13.1.el8_7.x86_64 kernel-modules-4.18.0-425.13.1.el8_7.x86_64
Removed:
kernel-4.18.0-348.el8.0.2.x86_64 kernel-core-4.18.0-348.el8.0.2.x86_64 kernel-modules-4.18.0-348.el8.0.2.x86_64
Complete!
After rebooting, Foreman isn't running:
# foreman-maintain service status
Running Status Services
================================================================================
Get status of applicable services:
Displaying the following service(s):
redis, postgresql, pulpcore-api, pulpcore-content, pulpcore-worker@1.service, pulpcore-worker@2.service, pulpcore-worker@3.service, pulpcore-worker@4.service, tomcat, dynflow-sidekiq@orchestrator, foreman, httpd, dynflow-sidekiq@worker-1, dynflow-sidekiq@worker-hosts-queue-1, foreman-proxy
| displaying redis
● redis.service - Redis persistent key-value database
Loaded: loaded (/usr/lib/systemd/system/redis.service; enabled; vendor preset: disabled)
Drop-In: /etc/systemd/system/redis.service.d
└─90-limits.conf
Active: active (running) since Thu 2023-02-23 13:38:39 CET; 2min 50s ago
Main PID: 1185 (redis-server)
Status: "Ready to accept connections"
Tasks: 5 (limit: 126860)
Memory: 8.8M
CGroup: /system.slice/redis.service
└─1185 /usr/bin/redis-server 127.0.0.1:6379
Feb 23 13:38:39 foreman.my.org systemd[1]: Starting Redis persistent key-value database...
Feb 23 13:38:39 foreman.my.org systemd[1]: Started Redis persistent key-value database.
/ displaying postgresql
● postgresql.service - PostgreSQL database server
Loaded: loaded (/usr/lib/systemd/system/postgresql.service; enabled; vendor preset: disabled)
Drop-In: /etc/systemd/system/postgresql.service.d
└─postgresql.conf
Active: active (running) since Thu 2023-02-23 13:38:39 CET; 2min 50s ago
Process: 1167 ExecStartPre=/usr/libexec/postgresql-check-db-dir postgresql (code=exited, status=0/SUCCESS)
Main PID: 1196 (postmaster)
Tasks: 46 (limit: 126860)
Memory: 254.2M
CGroup: /system.slice/postgresql.service
├─1196 /usr/bin/postmaster -D /var/lib/pgsql/data
├─1214 postgres: logger
├─1219 postgres: checkpointer
├─1220 postgres: background writer
├─1221 postgres: walwriter
├─1222 postgres: autovacuum launcher
├─1223 postgres: stats collector
├─1224 postgres: logical replication launcher
├─1970 postgres: pulp pulpcore ::1(56478) idle
├─1973 postgres: pulp pulpcore ::1(56484) idle
├─1974 postgres: pulp pulpcore ::1(56496) idle
├─1987 postgres: pulp pulpcore ::1(56512) idle
├─2002 postgres: pulp pulpcore ::1(56518) idle
├─2007 postgres: pulp pulpcore ::1(56534) idle
├─2009 postgres: pulp pulpcore ::1(56540) idle
├─2010 postgres: pulp pulpcore ::1(56546) idle
├─2011 postgres: pulp pulpcore ::1(56550) idle
├─2012 postgres: pulp pulpcore ::1(56562) idle
├─2013 postgres: pulp pulpcore ::1(56572) idle
├─2015 postgres: pulp pulpcore ::1(56588) idle
├─2018 postgres: pulp pulpcore ::1(56590) idle
├─2659 postgres: candlepin candlepin 127.0.0.1(47076) idle in transaction
├─4079 postgres: foreman foreman [local] idle
├─4284 postgres: foreman foreman [local] idle
├─4293 postgres: foreman foreman [local] idle
├─4295 postgres: foreman foreman [local] idle
├─4301 postgres: foreman foreman [local] idle
├─4320 postgres: foreman foreman [local] idle
├─4329 postgres: foreman foreman [local] idle
├─4330 postgres: foreman foreman [local] idle
├─4431 postgres: foreman foreman [local] idle
├─4432 postgres: foreman foreman [local] idle
├─4433 postgres: foreman foreman [local] idle
├─4434 postgres: foreman foreman [local] idle
├─4436 postgres: foreman foreman [local] idle
├─4441 postgres: foreman foreman [local] idle
├─4448 postgres: foreman foreman [local] idle
├─4449 postgres: foreman foreman [local] idle
├─4452 postgres: foreman foreman [local] idle
├─4453 postgres: foreman foreman [local] idle
├─4454 postgres: foreman foreman [local] idle
├─4455 postgres: foreman foreman [local] idle
├─4457 postgres: foreman foreman [local] idle
├─4471 postgres: foreman foreman [local] idle
├─4484 postgres: foreman foreman [local] idle
└─4489 postgres: foreman foreman [local] idle
Feb 23 13:38:39 foreman.my.org systemd[1]: Starting PostgreSQL database server...
Feb 23 13:38:39 foreman.my.org postmaster[1196]: 2023-02-23 13:38:39 CET LOG: starting PostgreSQL 12.12 on x86_64-redhat-linux-gnu, compiled by gcc (GCC) 8.5.0 20210514 (Red Hat 8.5.0-10), 64-bit
Feb 23 13:38:39 foreman.my.org postmaster[1196]: 2023-02-23 13:38:39 CET LOG: listening on IPv6 address "::1", port 5432
Feb 23 13:38:39 foreman.my.org postmaster[1196]: 2023-02-23 13:38:39 CET LOG: listening on IPv4 address "127.0.0.1", port 5432
Feb 23 13:38:39 foreman.my.org postmaster[1196]: 2023-02-23 13:38:39 CET LOG: listening on Unix socket "/var/run/postgresql/.s.PGSQL.5432"
Feb 23 13:38:39 foreman.my.org postmaster[1196]: 2023-02-23 13:38:39 CET LOG: listening on Unix socket "/tmp/.s.PGSQL.5432"
Feb 23 13:38:39 foreman.my.org postmaster[1196]: 2023-02-23 13:38:39 CET LOG: redirecting log output to logging collector process
Feb 23 13:38:39 foreman.my.org postmaster[1196]: 2023-02-23 13:38:39 CET HINT: Future log output will appear in directory "log".
Feb 23 13:38:39 foreman.my.org systemd[1]: Started PostgreSQL database server.
/ displaying pulpcore-api
● pulpcore-api.service - Pulp API Server
Loaded: loaded (/etc/systemd/system/pulpcore-api.service; enabled; vendor preset: disabled)
Active: active (running) since Thu 2023-02-23 13:38:40 CET; 2min 49s ago
Main PID: 1181 (gunicorn)
Status: "Gunicorn arbiter booted"
Tasks: 6 (limit: 126860)
Memory: 503.7M
CGroup: /system.slice/pulpcore-api.service
├─1181 /usr/bin/python3.9 /usr/bin/gunicorn pulpcore.app.wsgi:application --timeout 90 -w 5 --access-logfile - --access-logformat pulp [%({correlation-id}o)s]: %(h)s %(l)s %(u)s %(t)s "%(r)s" %(s)s %(b)s "%(f)s" "%(a)s"
├─1373 /usr/bin/python3.9 /usr/bin/gunicorn pulpcore.app.wsgi:application --timeout 90 -w 5 --access-logfile - --access-logformat pulp [%({correlation-id}o)s]: %(h)s %(l)s %(u)s %(t)s "%(r)s" %(s)s %(b)s "%(f)s" "%(a)s"
├─1380 /usr/bin/python3.9 /usr/bin/gunicorn pulpcore.app.wsgi:application --timeout 90 -w 5 --access-logfile - --access-logformat pulp [%({correlation-id}o)s]: %(h)s %(l)s %(u)s %(t)s "%(r)s" %(s)s %(b)s "%(f)s" "%(a)s"
├─1398 /usr/bin/python3.9 /usr/bin/gunicorn pulpcore.app.wsgi:application --timeout 90 -w 5 --access-logfile - --access-logformat pulp [%({correlation-id}o)s]: %(h)s %(l)s %(u)s %(t)s "%(r)s" %(s)s %(b)s "%(f)s" "%(a)s"
├─1413 /usr/bin/python3.9 /usr/bin/gunicorn pulpcore.app.wsgi:application --timeout 90 -w 5 --access-logfile - --access-logformat pulp [%({correlation-id}o)s]: %(h)s %(l)s %(u)s %(t)s "%(r)s" %(s)s %(b)s "%(f)s" "%(a)s"
└─1418 /usr/bin/python3.9 /usr/bin/gunicorn pulpcore.app.wsgi:application --timeout 90 -w 5 --access-logfile - --access-logformat pulp [%({correlation-id}o)s]: %(h)s %(l)s %(u)s %(t)s "%(r)s" %(s)s %(b)s "%(f)s" "%(a)s"
Feb 23 13:38:39 foreman.my.org systemd[1]: Starting Pulp API Server...
Feb 23 13:38:40 foreman.my.org pulpcore-api[1181]: [2023-02-23 13:38:40 +0100] [1181] [INFO] Starting gunicorn 20.1.0
Feb 23 13:38:40 foreman.my.org pulpcore-api[1181]: [2023-02-23 13:38:40 +0100] [1181] [INFO] Listening at: unix:/run/pulpcore-api.sock (1181)
Feb 23 13:38:40 foreman.my.org pulpcore-api[1181]: [2023-02-23 13:38:40 +0100] [1181] [INFO] Using worker: sync
Feb 23 13:38:40 foreman.my.org systemd[1]: Started Pulp API Server.
Feb 23 13:38:40 foreman.my.org pulpcore-api[1373]: [2023-02-23 13:38:40 +0100] [1373] [INFO] Booting worker with pid: 1373
Feb 23 13:38:40 foreman.my.org pulpcore-api[1380]: [2023-02-23 13:38:40 +0100] [1380] [INFO] Booting worker with pid: 1380
Feb 23 13:38:40 foreman.my.org pulpcore-api[1398]: [2023-02-23 13:38:40 +0100] [1398] [INFO] Booting worker with pid: 1398
Feb 23 13:38:40 foreman.my.org pulpcore-api[1413]: [2023-02-23 13:38:40 +0100] [1413] [INFO] Booting worker with pid: 1413
Feb 23 13:38:40 foreman.my.org pulpcore-api[1418]: [2023-02-23 13:38:40 +0100] [1418] [INFO] Booting worker with pid: 1418
/ displaying pulpcore-content
● pulpcore-content.service - Pulp Content App
Loaded: loaded (/etc/systemd/system/pulpcore-content.service; enabled; vendor preset: disabled)
Active: active (running) since Thu 2023-02-23 13:38:40 CET; 2min 48s ago
Main PID: 1168 (gunicorn)
Status: "Gunicorn arbiter booted"
Tasks: 19 (limit: 126860)
Memory: 924.2M
CGroup: /system.slice/pulpcore-content.service
├─1168 /usr/bin/python3.9 /usr/bin/gunicorn pulpcore.content:server --timeout 90 --worker-class aiohttp.GunicornWebWorker -w 9 --access-logfile -
├─1470 /usr/bin/python3.9 /usr/bin/gunicorn pulpcore.content:server --timeout 90 --worker-class aiohttp.GunicornWebWorker -w 9 --access-logfile -
├─1474 /usr/bin/python3.9 /usr/bin/gunicorn pulpcore.content:server --timeout 90 --worker-class aiohttp.GunicornWebWorker -w 9 --access-logfile -
├─1480 /usr/bin/python3.9 /usr/bin/gunicorn pulpcore.content:server --timeout 90 --worker-class aiohttp.GunicornWebWorker -w 9 --access-logfile -
├─1493 /usr/bin/python3.9 /usr/bin/gunicorn pulpcore.content:server --timeout 90 --worker-class aiohttp.GunicornWebWorker -w 9 --access-logfile -
├─1514 /usr/bin/python3.9 /usr/bin/gunicorn pulpcore.content:server --timeout 90 --worker-class aiohttp.GunicornWebWorker -w 9 --access-logfile -
├─1516 /usr/bin/python3.9 /usr/bin/gunicorn pulpcore.content:server --timeout 90 --worker-class aiohttp.GunicornWebWorker -w 9 --access-logfile -
├─1532 /usr/bin/python3.9 /usr/bin/gunicorn pulpcore.content:server --timeout 90 --worker-class aiohttp.GunicornWebWorker -w 9 --access-logfile -
├─1534 /usr/bin/python3.9 /usr/bin/gunicorn pulpcore.content:server --timeout 90 --worker-class aiohttp.GunicornWebWorker -w 9 --access-logfile -
└─1550 /usr/bin/python3.9 /usr/bin/gunicorn pulpcore.content:server --timeout 90 --worker-class aiohttp.GunicornWebWorker -w 9 --access-logfile -
Feb 23 13:38:40 foreman.my.org systemd[1]: Started Pulp Content App.
Feb 23 13:38:40 foreman.my.org pulpcore-content[1470]: [2023-02-23 13:38:40 +0100] [1470] [INFO] Booting worker with pid: 1470
Feb 23 13:38:40 foreman.my.org pulpcore-content[1474]: [2023-02-23 13:38:40 +0100] [1474] [INFO] Booting worker with pid: 1474
Feb 23 13:38:40 foreman.my.org pulpcore-content[1480]: [2023-02-23 13:38:40 +0100] [1480] [INFO] Booting worker with pid: 1480
Feb 23 13:38:41 foreman.my.org pulpcore-content[1493]: [2023-02-23 13:38:41 +0100] [1493] [INFO] Booting worker with pid: 1493
Feb 23 13:38:41 foreman.my.org pulpcore-content[1514]: [2023-02-23 13:38:41 +0100] [1514] [INFO] Booting worker with pid: 1514
Feb 23 13:38:41 foreman.my.org pulpcore-content[1516]: [2023-02-23 13:38:41 +0100] [1516] [INFO] Booting worker with pid: 1516
Feb 23 13:38:41 foreman.my.org pulpcore-content[1532]: [2023-02-23 13:38:41 +0100] [1532] [INFO] Booting worker with pid: 1532
Feb 23 13:38:41 foreman.my.org pulpcore-content[1534]: [2023-02-23 13:38:41 +0100] [1534] [INFO] Booting worker with pid: 1534
Feb 23 13:38:41 foreman.my.org pulpcore-content[1550]: [2023-02-23 13:38:41 +0100] [1550] [INFO] Booting worker with pid: 1550
/ displaying pulpcore-worker@1.service
● pulpcore-worker@1.service - Pulp Worker
Loaded: loaded (/etc/systemd/system/pulpcore-worker@.service; enabled; vendor preset: disabled)
Active: active (running) since Thu 2023-02-23 13:38:40 CET; 2min 49s ago
Main PID: 1387 (pulpcore-worker)
Tasks: 1 (limit: 126860)
Memory: 97.9M
CGroup: /system.slice/system-pulpcore\x2dworker.slice/pulpcore-worker@1.service
└─1387 /usr/bin/python3.9 /usr/bin/pulpcore-worker
Feb 23 13:38:40 foreman.my.org systemd[1]: Started Pulp Worker.
Feb 23 13:38:56 foreman.my.org pulpcore-worker-1[1387]: pulp [None]: pulpcore.tasking.entrypoint:INFO: Starting distributed type worker
Feb 23 13:38:56 foreman.my.org pulpcore-worker-1[1387]: pulp [None]: pulpcore.tasking.pulpcore_worker:INFO: New worker '1387@foreman.my.org' discovered
/ displaying pulpcore-worker@2.service
● pulpcore-worker@2.service - Pulp Worker
Loaded: loaded (/etc/systemd/system/pulpcore-worker@.service; enabled; vendor preset: disabled)
Active: active (running) since Thu 2023-02-23 13:38:40 CET; 2min 49s ago
Main PID: 1390 (pulpcore-worker)
Tasks: 1 (limit: 126860)
Memory: 97.8M
CGroup: /system.slice/system-pulpcore\x2dworker.slice/pulpcore-worker@2.service
└─1390 /usr/bin/python3.9 /usr/bin/pulpcore-worker
Feb 23 13:38:40 foreman.my.org systemd[1]: Started Pulp Worker.
Feb 23 13:38:56 foreman.my.org pulpcore-worker-2[1390]: pulp [None]: pulpcore.tasking.entrypoint:INFO: Starting distributed type worker
Feb 23 13:38:56 foreman.my.org pulpcore-worker-2[1390]: pulp [None]: pulpcore.tasking.pulpcore_worker:INFO: New worker '1390@foreman.my.org' discovered
/ displaying pulpcore-worker@3.service
● pulpcore-worker@3.service - Pulp Worker
Loaded: loaded (/etc/systemd/system/pulpcore-worker@.service; enabled; vendor preset: disabled)
Active: active (running) since Thu 2023-02-23 13:38:40 CET; 2min 49s ago
Main PID: 1389 (pulpcore-worker)
Tasks: 1 (limit: 126860)
Memory: 98.1M
CGroup: /system.slice/system-pulpcore\x2dworker.slice/pulpcore-worker@3.service
└─1389 /usr/bin/python3.9 /usr/bin/pulpcore-worker
Feb 23 13:38:40 foreman.my.org systemd[1]: Started Pulp Worker.
Feb 23 13:38:56 foreman.my.org pulpcore-worker-3[1389]: pulp [None]: pulpcore.tasking.entrypoint:INFO: Starting distributed type worker
Feb 23 13:38:56 foreman.my.org pulpcore-worker-3[1389]: pulp [None]: pulpcore.tasking.pulpcore_worker:INFO: New worker '1389@foreman.my.org' discovered
/ displaying pulpcore-worker@4.service
● pulpcore-worker@4.service - Pulp Worker
Loaded: loaded (/etc/systemd/system/pulpcore-worker@.service; enabled; vendor preset: disabled)
Active: active (running) since Thu 2023-02-23 13:38:40 CET; 2min 49s ago
Main PID: 1391 (pulpcore-worker)
Tasks: 1 (limit: 126860)
Memory: 98.0M
CGroup: /system.slice/system-pulpcore\x2dworker.slice/pulpcore-worker@4.service
└─1391 /usr/bin/python3.9 /usr/bin/pulpcore-worker
Feb 23 13:38:40 foreman.my.org systemd[1]: Started Pulp Worker.
Feb 23 13:38:56 foreman.my.org pulpcore-worker-4[1391]: pulp [None]: pulpcore.tasking.entrypoint:INFO: Starting distributed type worker
Feb 23 13:38:56 foreman.my.org pulpcore-worker-4[1391]: pulp [None]: pulpcore.tasking.pulpcore_worker:INFO: New worker '1391@foreman.my.org' discovered
/ displaying tomcat
● tomcat.service - Apache Tomcat Web Application Container
Loaded: loaded (/usr/lib/systemd/system/tomcat.service; enabled; vendor preset: disabled)
Active: active (running) since Thu 2023-02-23 13:38:39 CET; 2min 50s ago
Main PID: 1179 (java)
Tasks: 37 (limit: 126860)
Memory: 960.8M
CGroup: /system.slice/tomcat.service
└─1179 /usr/lib/jvm/jre-11/bin/java -Xms1024m -Xmx4096m -Dcom.redhat.fips=false -Djava.security.auth.login.config=/usr/share/tomcat/conf/login.config -classpath /usr/share/tomcat/bin/bootstrap.jar:/usr/share/tomcat/bin/tomcat-juli.jar:/usr/share/java/ant.jar:/usr/share/java/ant-launcher.jar:/usr/lib/jvm/java/lib/tools.jar -Dcatalina.base=/usr/share/tomcat -Dcatalina.home=/usr/share/tomcat -Djava.endorsed.dirs= -Djava.io.tmpdir=/var/cache/tomcat/temp -Djava.util.logging.config.file=/usr/share/tomcat/conf/logging.properties -Djava.util.logging.manager=org.apache.juli.ClassLoaderLogManager org.apache.catalina.startup.Bootstrap start
Feb 23 13:39:04 foreman.my.org server[1179]: at org.apache.catalina.startup.Catalina.start(Catalina.java:633)
Feb 23 13:39:04 foreman.my.org server[1179]: at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
Feb 23 13:39:04 foreman.my.org server[1179]: at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
Feb 23 13:39:04 foreman.my.org server[1179]: at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
Feb 23 13:39:04 foreman.my.org server[1179]: at java.base/java.lang.reflect.Method.invoke(Method.java:566)
Feb 23 13:39:04 foreman.my.org server[1179]: at org.apache.catalina.startup.Bootstrap.start(Bootstrap.java:343)
Feb 23 13:39:04 foreman.my.org server[1179]: at org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:474)
Feb 23 13:39:04 foreman.my.org server[1179]: 23-Feb-2023 13:39:04.167 INFO [main] org.apache.catalina.startup.HostConfig.deployDirectory Deployment of web application directory [/var/lib/tomcat/webapps/candlepin] has finished in [17,890] ms
Feb 23 13:39:04 foreman.my.org server[1179]: 23-Feb-2023 13:39:04.182 INFO [main] org.apache.coyote.AbstractProtocol.start Starting ProtocolHandler ["https-jsse-nio-127.0.0.1-23443"]
Feb 23 13:39:04 foreman.my.org server[1179]: 23-Feb-2023 13:39:04.253 INFO [main] org.apache.catalina.startup.Catalina.start Server startup in [18,208] milliseconds
/ displaying dynflow-sidekiq@orchestrator
● dynflow-sidekiq@orchestrator.service - Foreman jobs daemon - orchestrator on sidekiq
Loaded: loaded (/usr/lib/systemd/system/dynflow-sidekiq@.service; enabled; vendor preset: disabled)
Active: active (running) since Thu 2023-02-23 13:39:09 CET; 2min 19s ago
Docs: https://theforeman.org
Main PID: 1177 (sidekiq)
Status: "Everything ready for world: 9c499056-cd3e-4b18-ae82-920240200f2d"
Tasks: 12 (limit: 126860)
Memory: 381.3M
CGroup: /system.slice/system-dynflow\x2dsidekiq.slice/dynflow-sidekiq@orchestrator.service
└─1177 sidekiq 6.3.1 [0 of 1 busy]
Feb 23 13:38:39 foreman.my.org systemd[1]: Starting Foreman jobs daemon - orchestrator on sidekiq...
Feb 23 13:38:40 foreman.my.org dynflow-sidekiq@orchestrator[1177]: 2023-02-23T12:38:40.462Z pid=1177 tid=5 INFO: Enabling systemd notification integration
Feb 23 13:38:44 foreman.my.org dynflow-sidekiq@orchestrator[1177]: 2023-02-23T12:38:44.467Z pid=1177 tid=5 INFO: Booting Sidekiq 6.3.1 with redis options {:url=>"redis://localhost:6379/0"}
Feb 23 13:38:44 foreman.my.org dynflow-sidekiq@orchestrator[1177]: 2023-02-23T12:38:44.489Z pid=1177 tid=5 INFO: GitLab reliable fetch activated!
Feb 23 13:39:09 foreman.my.org systemd[1]: Started Foreman jobs daemon - orchestrator on sidekiq.
/ displaying foreman
● foreman.service - Foreman
Loaded: loaded (/usr/lib/systemd/system/foreman.service; enabled; vendor preset: disabled)
Drop-In: /etc/systemd/system/foreman.service.d
└─installer.conf
Active: active (running) since Thu 2023-02-23 13:39:10 CET; 2min 19s ago
Docs: https://theforeman.org
Main PID: 1169 (rails)
Tasks: 113 (limit: 126860)
Memory: 687.0M
CGroup: /system.slice/foreman.service
├─1169 puma 5.6.5 (unix:///run/foreman.sock) [foreman]
├─4390 puma: cluster worker 0: 1169 [foreman]
├─4394 puma: cluster worker 1: 1169 [foreman]
├─4396 puma: cluster worker 2: 1169 [foreman]
├─4399 puma: cluster worker 3: 1169 [foreman]
├─4402 puma: cluster worker 4: 1169 [foreman]
└─4406 puma: cluster worker 5: 1169 [foreman]
Feb 23 13:39:09 foreman.my.org foreman[1169]: [1169] * Activated unix:///run/foreman.sock
Feb 23 13:39:09 foreman.my.org foreman[1169]: [1169] Use Ctrl-C to stop
Feb 23 13:39:09 foreman.my.org foreman[1169]: [1169] * Starting control server on unix:///usr/share/foreman/tmp/sockets/pumactl.sock
Feb 23 13:39:10 foreman.my.org foreman[1169]: [1169] - Worker 0 (PID: 4390) booted in 0.13s, phase: 0
Feb 23 13:39:10 foreman.my.org foreman[1169]: [1169] - Worker 3 (PID: 4399) booted in 0.13s, phase: 0
Feb 23 13:39:10 foreman.my.org foreman[1169]: [1169] - Worker 5 (PID: 4406) booted in 0.13s, phase: 0
Feb 23 13:39:10 foreman.my.org foreman[1169]: [1169] - Worker 2 (PID: 4396) booted in 0.15s, phase: 0
Feb 23 13:39:10 foreman.my.org foreman[1169]: [1169] - Worker 1 (PID: 4394) booted in 0.17s, phase: 0
Feb 23 13:39:10 foreman.my.org foreman[1169]: [1169] - Worker 4 (PID: 4402) booted in 0.17s, phase: 0
Feb 23 13:39:10 foreman.my.org systemd[1]: Started Foreman.
/ displaying httpd
● httpd.service - The Apache HTTP Server
Loaded: loaded (/usr/lib/systemd/system/httpd.service; enabled; vendor preset: disabled)
Active: failed (Result: exit-code) since Thu 2023-02-23 13:40:32 CET; 57s ago
Docs: man:httpd.service(8)
Process: 6981 ExecStart=/usr/sbin/httpd $OPTIONS -DFOREGROUND (code=exited, status=1/FAILURE)
Main PID: 6981 (code=exited, status=1/FAILURE)
Status: "Reading configuration..."
Feb 23 13:40:32 foreman.my.org httpd[6981]: [Thu Feb 23 13:40:32.444676 2023] [so:warn] [pid 6981:tid 140329318956480] AH01574: module proxy_http_module is already loaded, skipping
Feb 23 13:40:32 foreman.my.org httpd[6981]: [Thu Feb 23 13:40:32.444945 2023] [so:warn] [pid 6981:tid 140329318956480] AH01574: module proxy_wstunnel_module is already loaded, skipping
Feb 23 13:40:32 foreman.my.org httpd[6981]: [Thu Feb 23 13:40:32.444976 2023] [so:warn] [pid 6981:tid 140329318956480] AH01574: module ssl_module is already loaded, skipping
Feb 23 13:40:32 foreman.my.org httpd[6981]: [Thu Feb 23 13:40:32.444992 2023] [so:warn] [pid 6981:tid 140329318956480] AH01574: module systemd_module is already loaded, skipping
Feb 23 13:40:32 foreman.my.org httpd[6981]: [Thu Feb 23 13:40:32.451735 2023] [alias:warn] [pid 6981:tid 140329318956480] AH00671: The Alias directive in /etc/httpd/conf.d/autoindex.conf at line 21 will probably never match because it overlaps an earlier Alias.
Feb 23 13:40:32 foreman.my.org httpd[6981]: AH00526: Syntax error on line 5 of /etc/httpd/conf.d/ssl.conf:
Feb 23 13:40:32 foreman.my.org httpd[6981]: Cannot define multiple Listeners on the same IP:port
Feb 23 13:40:32 foreman.my.org systemd[1]: httpd.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 13:40:32 foreman.my.org systemd[1]: httpd.service: Failed with result 'exit-code'.
Feb 23 13:40:32 foreman.my.org systemd[1]: Failed to start The Apache HTTP Server.
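My assumption about the cause (not confirmed from the logs alone): the httpd/mod_ssl RPM upgrade restored the stock /etc/httpd/conf.d/ssl.conf with its own "Listen 443 https", which collides with the Listen directive in the installer-managed Foreman vhost. If that is the case, re-applying the installer-managed configuration should clear the duplicate:

```shell
# Keep a copy of the restored stock file, then let the installer
# re-assert its own httpd configuration.
cp -a /etc/httpd/conf.d/ssl.conf /root/ssl.conf.from-rpm
foreman-installer

# Bring everything back up and re-check.
foreman-maintain service restart
systemctl status httpd
```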
/ displaying dynflow-sidekiq@worker-1
● dynflow-sidekiq@worker-1.service - Foreman jobs daemon - worker-1 on sidekiq
Loaded: loaded (/usr/lib/systemd/system/dynflow-sidekiq@.service; enabled; vendor preset: disabled)
Active: active (running) since Thu 2023-02-23 13:39:10 CET; 2min 19s ago
Docs: https://theforeman.org
Main PID: 1182 (sidekiq)
Status: "Everything ready for world: 8c5ea08c-4392-4d28-86e5-8f07ce377b5d"
Tasks: 15 (limit: 126860)
Memory: 395.2M
CGroup: /system.slice/system-dynflow\x2dsidekiq.slice/dynflow-sidekiq@worker-1.service
└─1182 sidekiq 6.3.1 [0 of 5 busy]
Feb 23 13:38:39 foreman.my.org systemd[1]: Starting Foreman jobs daemon - worker-1 on sidekiq...
Feb 23 13:38:40 foreman.my.org dynflow-sidekiq@worker-1[1182]: 2023-02-23T12:38:40.368Z pid=1182 tid=2 INFO: Enabling systemd notification integration
Feb 23 13:38:44 foreman.my.org dynflow-sidekiq@worker-1[1182]: 2023-02-23T12:38:44.461Z pid=1182 tid=2 INFO: Booting Sidekiq 6.3.1 with redis options {:url=>"redis://localhost:6379/0"}
Feb 23 13:38:44 foreman.my.org dynflow-sidekiq@worker-1[1182]: 2023-02-23T12:38:44.464Z pid=1182 tid=2 INFO: GitLab reliable fetch activated!
Feb 23 13:39:10 foreman.my.org systemd[1]: Started Foreman jobs daemon - worker-1 on sidekiq.
/ displaying dynflow-sidekiq@worker-hosts-queue-1
● dynflow-sidekiq@worker-hosts-queue-1.service - Foreman jobs daemon - worker-hosts-queue-1 on sidekiq
Loaded: loaded (/usr/lib/systemd/system/dynflow-sidekiq@.service; enabled; vendor preset: disabled)
Active: active (running) since Thu 2023-02-23 13:39:10 CET; 2min 19s ago
Docs: https://theforeman.org
Main PID: 1172 (sidekiq)
Status: "Everything ready for world: e9e6af63-e514-4e37-9819-18d749b94f29"
Tasks: 15 (limit: 126860)
Memory: 398.3M
CGroup: /system.slice/system-dynflow\x2dsidekiq.slice/dynflow-sidekiq@worker-hosts-queue-1.service
└─1172 sidekiq 6.3.1 [0 of 5 busy]
Feb 23 13:38:39 foreman.my.org systemd[1]: Starting Foreman jobs daemon - worker-hosts-queue-1 on sidekiq...
Feb 23 13:38:40 foreman.my.org dynflow-sidekiq@worker-hosts-queue-1[1172]: 2023-02-23T12:38:40.459Z pid=1172 tid=8 INFO: Enabling systemd notification integration
Feb 23 13:38:44 foreman.my.org dynflow-sidekiq@worker-hosts-queue-1[1172]: 2023-02-23T12:38:44.465Z pid=1172 tid=8 INFO: Booting Sidekiq 6.3.1 with redis options {:url=>"redis://localhost:6379/0"}
Feb 23 13:38:44 foreman.my.org dynflow-sidekiq@worker-hosts-queue-1[1172]: 2023-02-23T12:38:44.478Z pid=1172 tid=8 INFO: GitLab reliable fetch activated!
Feb 23 13:39:10 foreman.my.org systemd[1]: Started Foreman jobs daemon - worker-hosts-queue-1 on sidekiq.
/ displaying foreman-proxy
● foreman-proxy.service - Foreman Proxy
Loaded: loaded (/usr/lib/systemd/system/foreman-proxy.service; enabled; vendor preset: disabled)
Drop-In: /etc/systemd/system/foreman-proxy.service.d
└─90-limits.conf
Active: active (running) since Thu 2023-02-23 13:38:47 CET; 2min 42s ago
Main PID: 1173 (smart-proxy)
Tasks: 7 (limit: 126860)
Memory: 83.1M
CGroup: /system.slice/foreman-proxy.service
└─1173 /usr/bin/ruby /usr/share/foreman-proxy/bin/smart-proxy
Feb 23 13:38:39 foreman.my.org systemd[1]: Starting Foreman Proxy...
Feb 23 13:38:47 foreman.my.org systemd[1]: Started Foreman Proxy.
Feb 23 13:40:31 foreman.my.org smart-proxy[1173]: 10.11.12.13 - - [23/Feb/2023:13:40:31 CET] "GET /features HTTP/1.1" 200 38
Feb 23 13:40:31 foreman.my.org smart-proxy[1173]: - -> /features
/ All services displayed [FAIL]
Some services are not running (httpd)
--------------------------------------------------------------------------------
Scenario [Status Services] failed.
The following steps ended up in failing state:
[service-status]
Resolve the failed steps and rerun the command.
In case the failures are false positives, use
--whitelist="service-status"
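Since the breakage appeared immediately after `dnf update`, it is also worth checking whether the package manager replaced a modified config file or left `*.rpmnew` / `*.rpmsave` copies behind. Sketched here against a scratch directory with hypothetical leftovers; on the real host, point `find` at `/etc/httpd`:

```shell
# Sketch: look for package-manager leftovers next to the httpd config.
# On the real host, replace "$tmpdir" with /etc/httpd.
tmpdir=$(mktemp -d)
touch "$tmpdir/ssl.conf" "$tmpdir/ssl.conf.rpmnew"   # hypothetical leftover
find "$tmpdir" -name '*.rpmnew' -o -name '*.rpmsave'
rm -rf "$tmpdir"
```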
Starting foreman manually:
# foreman-maintain service start
Running Start Services
================================================================================
Check if command is run as root user: [OK]
--------------------------------------------------------------------------------
Start applicable services:
Starting the following service(s):
redis, postgresql, pulpcore-api, pulpcore-content, pulpcore-worker@1.service, pulpcore-worker@2.service, pulpcore-worker@3.service, pulpcore-worker@4.service, tomcat, dynflow-sidekiq@orchestrator, foreman, httpd, dynflow-sidekiq@worker-1, dynflow-sidekiq@worker-hosts-queue-1, foreman-proxy
- starting httpd
Job for httpd.service failed because the control process exited with error code.
See "systemctl status httpd.service" and "journalctl -xe" for details.
\ All services started [OK]
--------------------------------------------------------------------------------
Note that foreman-maintain reports [OK] even though the start was not OK (Job for httpd.service failed).
Starting httpd in the foreground:
# /sbin/httpd
[Thu Feb 23 13:56:47.085942 2023] [so:warn] [pid 8654:tid 140209832304064] AH01574: module alias_module is already loaded, skipping
[Thu Feb 23 13:56:47.086348 2023] [so:warn] [pid 8654:tid 140209832304064] AH01574: module authz_core_module is already loaded, skipping
[Thu Feb 23 13:56:47.086494 2023] [so:warn] [pid 8654:tid 140209832304064] AH01574: module authz_host_module is already loaded, skipping
[Thu Feb 23 13:56:47.086591 2023] [so:warn] [pid 8654:tid 140209832304064] AH01574: module autoindex_module is already loaded, skipping
[Thu Feb 23 13:56:47.087379 2023] [so:warn] [pid 8654:tid 140209832304064] AH01574: module dir_module is already loaded, skipping
[Thu Feb 23 13:56:47.087483 2023] [so:warn] [pid 8654:tid 140209832304064] AH01574: module env_module is already loaded, skipping
[Thu Feb 23 13:56:47.087604 2023] [so:warn] [pid 8654:tid 140209832304064] AH01574: module filter_module is already loaded, skipping
[Thu Feb 23 13:56:47.087614 2023] [so:warn] [pid 8654:tid 140209832304064] AH01574: module headers_module is already loaded, skipping
[Thu Feb 23 13:56:47.087736 2023] [so:warn] [pid 8654:tid 140209832304064] AH01574: module log_config_module is already loaded, skipping
[Thu Feb 23 13:56:47.087891 2023] [so:warn] [pid 8654:tid 140209832304064] AH01574: module mime_module is already loaded, skipping
[Thu Feb 23 13:56:47.088101 2023] [so:warn] [pid 8654:tid 140209832304064] AH01574: module rewrite_module is already loaded, skipping
[Thu Feb 23 13:56:47.088113 2023] [so:warn] [pid 8654:tid 140209832304064] AH01574: module setenvif_module is already loaded, skipping
[Thu Feb 23 13:56:47.088304 2023] [so:warn] [pid 8654:tid 140209832304064] AH01574: module socache_shmcb_module is already loaded, skipping
[Thu Feb 23 13:56:47.088508 2023] [so:warn] [pid 8654:tid 140209832304064] AH01574: module unixd_module is already loaded, skipping
[Thu Feb 23 13:56:47.089434 2023] [so:warn] [pid 8654:tid 140209832304064] AH01574: module mpm_event_module is already loaded, skipping
[Thu Feb 23 13:56:47.089470 2023] [so:warn] [pid 8654:tid 140209832304064] AH01574: module proxy_module is already loaded, skipping
[Thu Feb 23 13:56:47.090162 2023] [so:warn] [pid 8654:tid 140209832304064] AH01574: module proxy_http_module is already loaded, skipping
[Thu Feb 23 13:56:47.090378 2023] [so:warn] [pid 8654:tid 140209832304064] AH01574: module proxy_wstunnel_module is already loaded, skipping
[Thu Feb 23 13:56:47.090405 2023] [so:warn] [pid 8654:tid 140209832304064] AH01574: module ssl_module is already loaded, skipping
[Thu Feb 23 13:56:47.090423 2023] [so:warn] [pid 8654:tid 140209832304064] AH01574: module systemd_module is already loaded, skipping
[Thu Feb 23 13:56:47.094426 2023] [alias:warn] [pid 8654:tid 140209832304064] AH00671: The Alias directive in /etc/httpd/conf.d/autoindex.conf at line 21 will probably never match because it overlaps an earlier Alias.
AH00526: Syntax error on line 5 of /etc/httpd/conf.d/ssl.conf:
Cannot define multiple Listeners on the same IP:port
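As a possible workaround (my assumption, not a verified fix), the duplicate `Listen` in the distro `ssl.conf` could be commented out so that Foreman's own vhost config keeps the port, followed by a config check with `httpd -t` before restarting. Sketched here on a scratch copy rather than the live file:

```shell
# Sketch on a scratch copy; on the real host this would be
# /etc/httpd/conf.d/ssl.conf, followed by `httpd -t` and a service start.
conf=$(mktemp)
printf 'Listen 443 https\nSSLPassPhraseDialog builtin\n' > "$conf"
sed -i 's/^Listen /#Listen /' "$conf"   # disable the duplicate Listen
grep -n '^#Listen' "$conf"              # confirm the directive is commented out
rm -f "$conf"
```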