@gvde and just to clarify, you restarted all services? ‘foreman-maintain service restart’
Exactly. Restarted after the update.
@gvde would you be able to send a new copy of the journalctl output to @Partha_Aji after the issue has occurred?
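For example, something along these lines should capture the relevant window (just a sketch — adjust --since so it covers the period of the hang):
# journalctl --since "1 hour ago" > /tmp/journal-hang.txt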
@gvde also, can you provide the output of: sudo ps aux
I guess you want that while it’s hanging? I’ll have to wait then. Nothing has happened in the last two days. Strange…
It just happened again 20 minutes ago. I have sent the latest journal to @Partha_Aji
Currently running processes:
USER PID %CPU %MEM VSZ RSS TTY STAT START TIME COMMAND
root 1 0.0 0.0 194224 7268 ? Ss Jul30 1:22 /usr/lib/systemd/systemd --switched-root --system --deserialize 22
root 2 0.0 0.0 0 0 ? S Jul30 0:00 [kthreadd]
root 4 0.0 0.0 0 0 ? S< Jul30 0:00 [kworker/0:0H]
root 6 0.0 0.0 0 0 ? S Jul30 0:00 [ksoftirqd/0]
root 7 0.0 0.0 0 0 ? S Jul30 0:00 [migration/0]
root 8 0.0 0.0 0 0 ? S Jul30 0:00 [rcu_bh]
root 9 0.1 0.0 0 0 ? S Jul30 5:16 [rcu_sched]
root 10 0.0 0.0 0 0 ? S< Jul30 0:00 [lru-add-drain]
root 11 0.0 0.0 0 0 ? S Jul30 0:01 [watchdog/0]
root 12 0.0 0.0 0 0 ? S Jul30 0:01 [watchdog/1]
root 13 0.0 0.0 0 0 ? S Jul30 0:00 [migration/1]
root 14 0.0 0.0 0 0 ? S Jul30 0:00 [ksoftirqd/1]
root 16 0.0 0.0 0 0 ? S< Jul30 0:00 [kworker/1:0H]
root 17 0.0 0.0 0 0 ? S Jul30 0:01 [watchdog/2]
root 18 0.0 0.0 0 0 ? S Jul30 0:00 [migration/2]
root 19 0.0 0.0 0 0 ? S Jul30 0:00 [ksoftirqd/2]
root 21 0.0 0.0 0 0 ? S< Jul30 0:00 [kworker/2:0H]
root 22 0.0 0.0 0 0 ? S Jul30 0:01 [watchdog/3]
root 23 0.0 0.0 0 0 ? S Jul30 0:00 [migration/3]
root 24 0.0 0.0 0 0 ? S Jul30 0:00 [ksoftirqd/3]
root 26 0.0 0.0 0 0 ? S< Jul30 0:00 [kworker/3:0H]
root 27 0.0 0.0 0 0 ? S Jul30 0:01 [watchdog/4]
root 28 0.0 0.0 0 0 ? S Jul30 0:00 [migration/4]
root 29 0.0 0.0 0 0 ? S Jul30 0:00 [ksoftirqd/4]
root 31 0.0 0.0 0 0 ? S< Jul30 0:00 [kworker/4:0H]
root 32 0.0 0.0 0 0 ? S Jul30 0:01 [watchdog/5]
root 33 0.0 0.0 0 0 ? S Jul30 0:00 [migration/5]
root 34 0.0 0.0 0 0 ? S Jul30 0:00 [ksoftirqd/5]
root 36 0.0 0.0 0 0 ? S< Jul30 0:00 [kworker/5:0H]
root 37 0.0 0.0 0 0 ? S Jul30 0:01 [watchdog/6]
root 38 0.0 0.0 0 0 ? S Jul30 0:00 [migration/6]
root 39 0.0 0.0 0 0 ? S Jul30 0:00 [ksoftirqd/6]
root 41 0.0 0.0 0 0 ? S< Jul30 0:00 [kworker/6:0H]
root 42 0.0 0.0 0 0 ? S Jul30 0:01 [watchdog/7]
root 43 0.0 0.0 0 0 ? S Jul30 0:00 [migration/7]
root 44 0.0 0.0 0 0 ? S Jul30 0:00 [ksoftirqd/7]
root 46 0.0 0.0 0 0 ? S< Jul30 0:00 [kworker/7:0H]
root 47 0.0 0.0 0 0 ? S Jul30 0:01 [watchdog/8]
root 48 0.0 0.0 0 0 ? S Jul30 0:00 [migration/8]
root 49 0.0 0.0 0 0 ? S Jul30 0:00 [ksoftirqd/8]
root 51 0.0 0.0 0 0 ? S< Jul30 0:00 [kworker/8:0H]
root 52 0.0 0.0 0 0 ? S Jul30 0:01 [watchdog/9]
root 53 0.0 0.0 0 0 ? S Jul30 0:00 [migration/9]
root 54 0.0 0.0 0 0 ? S Jul30 0:00 [ksoftirqd/9]
root 56 0.0 0.0 0 0 ? S< Jul30 0:00 [kworker/9:0H]
root 57 0.0 0.0 0 0 ? S Jul30 0:01 [watchdog/10]
root 58 0.0 0.0 0 0 ? S Jul30 0:00 [migration/10]
root 59 0.0 0.0 0 0 ? S Jul30 0:00 [ksoftirqd/10]
root 61 0.0 0.0 0 0 ? S< Jul30 0:00 [kworker/10:0H]
root 62 0.0 0.0 0 0 ? S Jul30 0:01 [watchdog/11]
root 63 0.0 0.0 0 0 ? S Jul30 0:00 [migration/11]
root 64 0.0 0.0 0 0 ? S Jul30 0:00 [ksoftirqd/11]
root 66 0.0 0.0 0 0 ? S< Jul30 0:00 [kworker/11:0H]
root 67 0.0 0.0 0 0 ? S Jul30 0:01 [watchdog/12]
root 68 0.0 0.0 0 0 ? S Jul30 0:00 [migration/12]
root 69 0.0 0.0 0 0 ? S Jul30 0:00 [ksoftirqd/12]
root 71 0.0 0.0 0 0 ? S< Jul30 0:00 [kworker/12:0H]
root 72 0.0 0.0 0 0 ? S Jul30 0:01 [watchdog/13]
root 73 0.0 0.0 0 0 ? S Jul30 0:00 [migration/13]
root 74 0.0 0.0 0 0 ? S Jul30 0:00 [ksoftirqd/13]
root 76 0.0 0.0 0 0 ? S< Jul30 0:00 [kworker/13:0H]
root 77 0.0 0.0 0 0 ? S Jul30 0:01 [watchdog/14]
root 78 0.0 0.0 0 0 ? S Jul30 0:00 [migration/14]
root 79 0.0 0.0 0 0 ? S Jul30 0:00 [ksoftirqd/14]
root 81 0.0 0.0 0 0 ? S< Jul30 0:00 [kworker/14:0H]
root 82 0.0 0.0 0 0 ? S Jul30 0:01 [watchdog/15]
root 83 0.0 0.0 0 0 ? S Jul30 0:00 [migration/15]
root 84 0.0 0.0 0 0 ? S Jul30 0:00 [ksoftirqd/15]
root 86 0.0 0.0 0 0 ? S< Jul30 0:00 [kworker/15:0H]
root 88 0.0 0.0 0 0 ? S Jul30 0:00 [kdevtmpfs]
root 89 0.0 0.0 0 0 ? S< Jul30 0:00 [netns]
root 90 0.0 0.0 0 0 ? S Jul30 0:00 [xenwatch]
root 91 0.0 0.0 0 0 ? S Jul30 0:00 [xenbus]
root 93 0.0 0.0 0 0 ? S Jul30 0:00 [khungtaskd]
root 94 0.0 0.0 0 0 ? S< Jul30 0:00 [writeback]
root 95 0.0 0.0 0 0 ? S< Jul30 0:00 [kintegrityd]
root 96 0.0 0.0 0 0 ? S< Jul30 0:00 [bioset]
root 97 0.0 0.0 0 0 ? S< Jul30 0:00 [bioset]
root 98 0.0 0.0 0 0 ? S< Jul30 0:00 [bioset]
root 99 0.0 0.0 0 0 ? S< Jul30 0:00 [kblockd]
root 101 0.0 0.0 0 0 ? S< Jul30 0:00 [md]
root 102 0.0 0.0 0 0 ? S< Jul30 0:00 [edac-poller]
root 103 0.0 0.0 0 0 ? S< Jul30 0:00 [watchdogd]
root 122 0.0 0.0 0 0 ? S Jul30 0:00 [kswapd0]
root 123 0.0 0.0 0 0 ? SN Jul30 0:00 [ksmd]
root 124 0.0 0.0 0 0 ? SN Jul30 0:12 [khugepaged]
root 125 0.0 0.0 0 0 ? S< Jul30 0:00 [crypto]
root 133 0.0 0.0 0 0 ? S< Jul30 0:00 [kthrotld]
root 135 0.0 0.0 0 0 ? S Jul30 0:00 [khvcd]
root 136 0.0 0.0 0 0 ? S< Jul30 0:00 [kmpath_rdacd]
root 137 0.0 0.0 0 0 ? S< Jul30 0:00 [kaluad]
root 138 0.0 0.0 0 0 ? S< Jul30 0:00 [kpsmoused]
root 139 0.0 0.0 0 0 ? S< Jul30 0:00 [ipv6_addrconf]
root 152 0.0 0.0 0 0 ? S< Jul30 0:00 [deferwq]
root 214 0.0 0.0 0 0 ? S Jul30 0:01 [kauditd]
root 389 0.0 0.0 0 0 ? S< Jul30 0:00 [ata_sff]
root 391 0.0 0.0 0 0 ? S Jul30 0:00 [scsi_eh_0]
root 392 0.0 0.0 0 0 ? S< Jul30 0:00 [scsi_tmf_0]
root 399 0.0 0.0 0 0 ? S Jul30 0:00 [scsi_eh_1]
root 401 0.0 0.0 0 0 ? S< Jul30 0:00 [scsi_tmf_1]
root 475 0.0 0.0 0 0 ? S< Jul30 0:01 [kworker/0:1H]
root 514 0.0 0.0 0 0 ? S< Jul30 0:00 [kdmflush]
root 515 0.0 0.0 0 0 ? S< Jul30 0:00 [bioset]
root 522 0.0 0.0 0 0 ? S< Jul30 0:00 [kdmflush]
root 523 0.0 0.0 0 0 ? S< Jul30 0:00 [bioset]
root 540 0.0 0.0 0 0 ? S< Jul30 0:00 [bioset]
root 541 0.0 0.0 0 0 ? S< Jul30 0:00 [xfsalloc]
root 542 0.0 0.0 0 0 ? S< Jul30 0:00 [xfs_mru_cache]
root 543 0.0 0.0 0 0 ? S< Jul30 0:00 [xfs-buf/dm-1]
root 544 0.0 0.0 0 0 ? S< Jul30 0:00 [xfs-data/dm-1]
root 545 0.0 0.0 0 0 ? S< Jul30 0:00 [xfs-conv/dm-1]
root 546 0.0 0.0 0 0 ? S< Jul30 0:00 [xfs-cil/dm-1]
root 547 0.0 0.0 0 0 ? S< Jul30 0:00 [xfs-reclaim/dm-]
root 548 0.0 0.0 0 0 ? S< Jul30 0:00 [xfs-log/dm-1]
root 549 0.0 0.0 0 0 ? S< Jul30 0:00 [xfs-eofblocks/d]
root 550 0.0 0.0 0 0 ? S Jul30 1:30 [xfsaild/dm-1]
root 632 0.0 0.0 48456 12528 ? Ss Jul30 1:39 /usr/lib/systemd/systemd-journald
root 645 0.0 0.0 0 0 ? S< Jul30 0:00 [rpciod]
root 646 0.0 0.0 0 0 ? S< Jul30 0:00 [xprtiod]
root 654 0.0 0.0 348812 7228 ? Ss Jul30 0:08 /usr/sbin/lvmetad -f
root 676 0.0 0.0 49320 6328 ? Ss Jul30 0:01 /usr/lib/systemd/systemd-udevd
root 736 0.0 0.0 0 0 ? S< Jul30 0:00 [kworker/11:1H]
root 754 0.0 0.0 0 0 ? S< Jul30 0:00 [nfit]
root 940 0.0 0.0 0 0 ? S< Jul30 0:00 [kworker/1:1H]
root 945 0.0 0.0 0 0 ? S< Jul30 0:00 [xfs-buf/xvda1]
root 946 0.0 0.0 0 0 ? S< Jul30 0:00 [xfs-data/xvda1]
root 947 0.0 0.0 0 0 ? S< Jul30 0:00 [xfs-conv/xvda1]
root 948 0.0 0.0 0 0 ? S< Jul30 0:00 [xfs-cil/xvda1]
root 949 0.0 0.0 0 0 ? S< Jul30 0:00 [xfs-reclaim/xvd]
root 950 0.0 0.0 0 0 ? S< Jul30 0:00 [xfs-log/xvda1]
root 951 0.0 0.0 0 0 ? S< Jul30 0:00 [xfs-eofblocks/x]
root 952 0.0 0.0 0 0 ? S Jul30 0:00 [xfsaild/xvda1]
root 954 0.0 0.0 0 0 ? S< Jul30 0:00 [kdmflush]
root 955 0.0 0.0 0 0 ? S< Jul30 0:00 [bioset]
root 967 0.0 0.0 0 0 ? S< Jul30 0:00 [xfs-buf/dm-2]
root 968 0.0 0.0 0 0 ? S< Jul30 0:00 [xfs-data/dm-2]
root 969 0.0 0.0 0 0 ? S< Jul30 0:00 [xfs-conv/dm-2]
root 970 0.0 0.0 0 0 ? S< Jul30 0:00 [xfs-cil/dm-2]
root 971 0.0 0.0 0 0 ? S< Jul30 0:00 [xfs-reclaim/dm-]
root 972 0.0 0.0 0 0 ? S< Jul30 0:00 [xfs-log/dm-2]
root 973 0.0 0.0 0 0 ? S< Jul30 0:00 [xfs-eofblocks/d]
root 975 0.0 0.0 0 0 ? S Jul30 0:46 [xfsaild/dm-2]
root 979 0.0 0.0 0 0 ? S< Jul30 0:00 [kdmflush]
root 980 0.0 0.0 0 0 ? S< Jul30 0:00 [bioset]
root 989 0.0 0.0 0 0 ? S< Jul30 0:00 [xfs-buf/dm-3]
root 990 0.0 0.0 0 0 ? S< Jul30 0:00 [xfs-data/dm-3]
root 991 0.0 0.0 0 0 ? S< Jul30 0:00 [xfs-conv/dm-3]
root 992 0.0 0.0 0 0 ? S< Jul30 0:00 [xfs-cil/dm-3]
root 993 0.0 0.0 0 0 ? S< Jul30 0:00 [xfs-reclaim/dm-]
root 994 0.0 0.0 0 0 ? S< Jul30 0:00 [xfs-log/dm-3]
root 995 0.0 0.0 0 0 ? S< Jul30 0:00 [xfs-eofblocks/d]
root 997 0.0 0.0 0 0 ? S Jul30 1:11 [xfsaild/dm-3]
root 1023 0.0 0.0 55532 1064 ? S<sl Jul30 0:04 /sbin/auditd
polkitd 1047 0.0 0.0 624496 14184 ? Ssl Jul30 0:06 /usr/lib/polkit-1/polkitd --no-debug
dbus 1051 0.0 0.0 68556 2628 ? Ssl Jul30 0:35 /usr/bin/dbus-daemon --system --address=systemd: --nofork --nopidfile --systemd-activation
rpc 1055 0.0 0.0 69280 1000 ? Ss Jul30 0:00 /sbin/rpcbind -w
root 1059 0.0 0.0 269132 1552 ? Ssl Jul30 0:00 /usr/sbin/gssproxy -D
root 1069 0.0 0.0 21804 1496 ? Ss Jul30 0:32 /usr/sbin/irqbalance --foreground
ntp 1087 0.0 0.0 59544 2496 ? Ss Jul30 0:01 /usr/sbin/ntpd -u ntp:ntp -g
root 1097 0.0 0.0 270576 5596 ? Ss Jul30 0:00 /usr/sbin/sssd -i --logger=files
root 1114 0.0 0.0 106580 11816 ? Ssl Jul30 0:34 /usr/sbin/xe-daemon
root 1116 0.0 0.0 40340 980 ? Ss Jul30 0:00 /usr/sbin/rpc.gssd
root 1125 0.0 0.0 108060 692 ? S Jul30 0:00 logger -t xe-daemon -p debug
root 1138 0.0 0.0 476428 8588 ? Ssl Jul30 0:14 /usr/sbin/NetworkManager --no-daemon
root 1141 0.0 0.0 410872 15292 ? S Jul30 0:06 /usr/libexec/sssd/sssd_be --domain example.com --uid 0 --gid 0 --logger=files
root 1162 0.0 0.0 278404 29956 ? S Jul30 0:05 /usr/libexec/sssd/sssd_nss --uid 0 --gid 0 --logger=files
root 1163 0.0 0.0 251464 5032 ? S Jul30 0:02 /usr/libexec/sssd/sssd_sudo --uid 0 --gid 0 --logger=files
root 1164 0.0 0.0 258072 5100 ? S Jul30 0:05 /usr/libexec/sssd/sssd_pam --uid 0 --gid 0 --logger=files
root 1165 0.0 0.0 259240 6168 ? S Jul30 0:04 /usr/libexec/sssd/sssd_ssh --uid 0 --gid 0 --logger=files
root 1166 0.0 0.0 298220 5896 ? S Jul30 0:02 /usr/libexec/sssd/sssd_pac --uid 0 --gid 0 --logger=files
root 1187 0.0 0.0 36680 1852 ? Ss Jul30 0:24 /usr/lib/systemd/systemd-logind
root 1208 0.0 0.0 126388 1684 ? Ss Jul30 0:05 /usr/sbin/crond -n
root 1241 0.0 0.0 110208 844 tty1 Ss+ Jul30 0:00 /sbin/agetty --noclear tty1 linux
root 1243 0.0 0.0 110208 848 hvc0 Ss+ Jul30 0:00 /sbin/agetty --keep-baud 115200,38400,9600 hvc0 vt220
root 1244 0.0 0.0 110208 856 ttyS0 Ss+ Jul30 0:00 /sbin/agetty --keep-baud 115200,38400,9600 ttyS0 vt220
foreman 1385 0.0 0.0 4356 592 ? Ss Jul30 0:00 /usr/bin/scl enable tfm sidekiq -e production -r /usr/share/foreman/extras/dynflow-sidekiq.rb -C /etc/foreman/dynflow/worker.yml
pulp 1387 0.0 0.1 347056 66664 ? Ss Jul30 1:15 /usr/bin/python3 /usr/bin/rq worker -w pulpcore.tasking.worker.PulpWorker --pid=/var/run/pulpcore-worker-1/reserved-resource-worker-1.pid -c pulpcore.rqconfig --disable-job-desc-logging
apache 1388 0.0 0.1 675912 59084 ? Ssl Jul30 3:21 /usr/bin/python /usr/bin/celery beat --app=pulp.server.async.celery_instance.celery --scheduler=pulp.server.async.scheduler.Scheduler
pulp 1389 0.0 0.1 347096 66664 ? Ss Jul30 1:16 /usr/bin/python3 /usr/bin/rq worker -w pulpcore.tasking.worker.PulpWorker -n resource-manager --pid=/var/run/pulpcore-resource-manager/resource-manager.pid -c pulpcore.rqconfig --disable-job-desc-logging
apache 1390 0.0 0.1 677560 75832 ? Ssl Jul30 2:04 /usr/bin/python /usr/bin/pulp_streamer --nodaemon --syslog --prefix=pulp_streamer --pidfile= --python /usr/share/pulp/wsgi/streamer.tac
pulp 1391 0.0 0.1 347056 66656 ? Ss Jul30 1:15 /usr/bin/python3 /usr/bin/rq worker -w pulpcore.tasking.worker.PulpWorker --pid=/var/run/pulpcore-worker-2/reserved-resource-worker-2.pid -c pulpcore.rqconfig --disable-job-desc-logging
pulp 1392 0.0 0.0 223260 20200 ? Ss Jul30 0:41 /usr/bin/python3 /usr/bin/gunicorn pulpcore.app.wsgi:application --bind 127.0.0.1:24817 --access-logfile -
pulp 1396 0.0 0.0 260704 29888 ? Ss Jul30 0:42 /usr/bin/python3 /usr/bin/gunicorn pulpcore.content:server --bind 127.0.0.1:24816 --worker-class aiohttp.GunicornWebWorker -w 2 --access-logfile -
foreman 1397 0.0 0.0 113284 1468 ? S Jul30 0:00 /bin/bash /var/tmp/sclL7hYiX
apache 1400 0.2 0.1 751480 86172 ? Ssl Jul30 10:20 /usr/bin/python /usr/bin/celery worker -A pulp.server.async.app -n resource_manager@%h -Q resource_manager -c 1 --events --umask 18 --pidfile=/var/run/pulp/resource_manager.pid
root 1406 0.0 0.1 384764 56968 ? Ssl Jul30 0:15 /opt/puppetlabs/puppet/bin/ruby /opt/puppetlabs/puppet/bin/puppet agent --no-daemonize
root 1415 0.0 0.0 199064 9460 ? Ss Jul30 0:24 /usr/sbin/httpd -DFOREGROUND
foreman+ 1416 0.0 0.1 949916 59812 ? Ssl Jul30 0:02 ruby /usr/share/foreman-proxy/bin/smart-proxy --no-daemonize
foreman 1427 0.0 0.0 4356 588 ? Ss Jul30 0:00 /usr/bin/scl enable tfm sidekiq -e production -r /usr/share/foreman/extras/dynflow-sidekiq.rb -C /etc/foreman/dynflow/worker-hosts-queue.yml
root 1430 0.0 0.0 52732 2220 ? Ss Jul30 0:00 /usr/sbin/oddjobd -n -p /var/run/oddjobd.pid -t 300
root 1431 0.0 0.0 586444 20108 ? Ssl Jul30 0:28 /usr/bin/python2 -Es /usr/sbin/tuned -l -P
foreman 1437 0.0 0.0 113284 1468 ? S Jul30 0:00 /bin/bash /var/tmp/sclxxyK22
root 1457 0.0 0.0 112924 4356 ? Ss Jul30 0:07 /usr/sbin/sshd -D
qdroute+ 1460 0.1 0.0 1424796 31264 ? Ssl Jul30 5:41 /usr/sbin/qdrouterd -c /etc/qpid-dispatch/qdrouterd.conf
root 1477 0.0 0.0 115812 664 ? Ss Jul30 0:00 /usr/bin/rhsmcertd
puppet 1485 10.4 7.6 12805104 3791900 ? Sl Jul30 449:18 /usr/bin/java -Xms2G -Xmx2G -Djruby.logger.class=com.puppetlabs.jruby_utils.jruby.Slf4jLogger -XX:OnOutOfMemoryError=kill -9 %p -cp /opt/puppetlabs/server/apps/puppetserver/puppet-server-release.jar:/opt/puppetlabs/puppet/lib/ruby/vendor_ruby/facter.jar:/opt/puppetlabs/server/data/puppetserver/jars/* clojure.main -m puppetlabs.trapperkeeper.main --config /etc/puppetlabs/puppetserver/conf.d --bootstrap-config /etc/puppetlabs/puppetserver/services.d/,/opt/puppetlabs/server/apps/puppetserver/config/services.d/ --restart-file /opt/puppetlabs/server/data/puppetserver/restartcounter
tomcat 1491 0.5 3.4 13171596 1699904 ? Ssl Jul30 25:31 /usr/lib/jvm/jre/bin/java -Xms1024m -Xmx4096m -classpath /usr/share/tomcat/bin/bootstrap.jar:/usr/share/tomcat/bin/tomcat-juli.jar:/usr/share/java/commons-daemon.jar -Dcatalina.base=/usr/share/tomcat -Dcatalina.home=/usr/share/tomcat -Djava.endorsed.dirs= -Djava.io.tmpdir=/var/cache/tomcat/temp -Djava.util.logging.config.file=/usr/share/tomcat/conf/logging.properties -Djava.util.logging.manager=org.apache.juli.ClassLoaderLogManager org.apache.catalina.startup.Bootstrap start
foreman 1497 0.0 0.0 4356 584 ? Ss Jul30 0:00 /usr/bin/scl enable tfm sidekiq -e production -r /usr/share/foreman/extras/dynflow-sidekiq.rb -C /etc/foreman/dynflow/orchestrator.yml
root 1502 0.0 0.0 323096 9040 ? Ssl Jul30 0:43 /usr/sbin/rsyslogd -n
foreman 1503 0.0 0.0 113284 1464 ? S Jul30 0:00 /bin/bash /var/tmp/sclfYYrm6
redis 1510 0.1 0.0 156616 7924 ? Ssl Jul30 8:10 /opt/rh/rh-redis5/root/usr/bin/redis-server 127.0.0.1:6379
qpidd 1513 0.4 0.1 1380496 52284 ? Ssl Jul30 18:37 /usr/sbin/qpidd --config /etc/qpid/qpidd.conf
root 1529 0.0 0.0 27168 1036 ? Ss Jul30 0:00 /usr/sbin/xinetd -stayalive -pidfile /var/run/xinetd.pid
postgres 1534 0.0 0.1 834256 52132 ? Ss Jul30 0:19 postmaster -D /var/opt/rh/rh-postgresql12/lib/pgsql/data
root 1571 0.0 0.0 96056 3576 ? Ss Jul30 0:00 /usr/sbin/squid -f /etc/squid/squid.conf
squid 1577 0.0 0.0 110068 16380 ? S Jul30 0:25 (squid-1) -f /etc/squid/squid.conf
squid 1689 0.0 0.0 27440 1240 ? S Jul30 0:00 (logfile-daemon) /var/log/squid/access.log
apache 1703 0.2 0.1 748120 79492 ? Ssl Jul30 9:39 /usr/bin/python /usr/bin/celery worker -n reserved_resource_worker-0@%h -A pulp.server.async.app -c 1 --events --umask 18 --pidfile=/var/run/pulp/reserved_resource_worker-0.pid
apache 1725 0.2 0.1 748068 79468 ? Ssl Jul30 9:40 /usr/bin/python /usr/bin/celery worker -n reserved_resource_worker-1@%h -A pulp.server.async.app -c 1 --events --umask 18 --pidfile=/var/run/pulp/reserved_resource_worker-1.pid
apache 1736 0.2 0.1 748064 79472 ? Ssl Jul30 9:40 /usr/bin/python /usr/bin/celery worker -n reserved_resource_worker-2@%h -A pulp.server.async.app -c 1 --events --umask 18 --pidfile=/var/run/pulp/reserved_resource_worker-2.pid
apache 1748 0.2 0.1 748040 79468 ? Ssl Jul30 9:40 /usr/bin/python /usr/bin/celery worker -n reserved_resource_worker-3@%h -A pulp.server.async.app -c 1 --events --umask 18 --pidfile=/var/run/pulp/reserved_resource_worker-3.pid
apache 1753 0.2 0.1 748120 79480 ? Ssl Jul30 9:43 /usr/bin/python /usr/bin/celery worker -n reserved_resource_worker-4@%h -A pulp.server.async.app -c 1 --events --umask 18 --pidfile=/var/run/pulp/reserved_resource_worker-4.pid
apache 1758 0.2 0.1 748120 83708 ? Ssl Jul30 9:42 /usr/bin/python /usr/bin/celery worker -n reserved_resource_worker-5@%h -A pulp.server.async.app -c 1 --events --umask 18 --pidfile=/var/run/pulp/reserved_resource_worker-5.pid
apache 1763 0.2 0.1 748596 82292 ? Ssl Jul30 9:49 /usr/bin/python /usr/bin/celery worker -n reserved_resource_worker-6@%h -A pulp.server.async.app -c 1 --events --umask 18 --pidfile=/var/run/pulp/reserved_resource_worker-6.pid
apache 1767 0.2 0.1 748340 79972 ? Ssl Jul30 9:47 /usr/bin/python /usr/bin/celery worker -n reserved_resource_worker-7@%h -A pulp.server.async.app -c 1 --events --umask 18 --pidfile=/var/run/pulp/reserved_resource_worker-7.pid
foreman 1883 0.2 1.3 1028664 684384 ? Sl Jul30 11:52 sidekiq 5.2.7 [0 of 1 busy]
foreman 1884 0.1 0.8 741028 394936 ? Sl Jul30 5:02 sidekiq 5.2.7 [0 of 5 busy]
foreman 1888 0.8 1.7 1253052 881896 ? Sl Jul30 35:04 sidekiq 5.2.7 [0 of 5 busy]
postgres 1915 0.0 0.0 251804 2176 ? Ss Jul30 0:00 postgres: logger
mongodb 1921 2.0 20.1 11199364 9910176 ? Sl Jul30 88:43 /opt/rh/rh-mongodb34/root/usr/bin/mongod -f /etc/opt/rh/rh-mongodb34/mongod.conf run
postgres 2071 0.0 0.7 834940 383632 ? Ss Jul30 0:25 postgres: checkpointer
postgres 2072 0.0 0.0 834392 24724 ? Ss Jul30 0:07 postgres: background writer
postgres 2073 0.0 0.0 834256 18656 ? Ss Jul30 0:49 postgres: walwriter
postgres 2076 0.0 0.0 835212 3472 ? Ss Jul30 0:09 postgres: autovacuum launcher
postgres 2077 0.0 0.0 252372 2748 ? Ss Jul30 0:45 postgres: stats collector
postgres 2078 0.0 0.0 835100 3068 ? Ss Jul30 0:00 postgres: logical replication launcher
root 2142 0.0 0.0 89704 2204 ? Ss Jul30 0:03 /usr/libexec/postfix/master -w
postfix 2190 0.0 0.0 102232 4320 ? S Jul30 0:01 qmgr -l -t unix -u
pulp 2445 0.0 0.1 354052 69544 ? S Jul30 0:05 /usr/bin/python3 /usr/bin/gunicorn pulpcore.app.wsgi:application --bind 127.0.0.1:24817 --access-logfile -
pulp 2687 0.0 0.1 370704 71296 ? S Jul30 2:15 /usr/bin/python3 /usr/bin/gunicorn pulpcore.content:server --bind 127.0.0.1:24816 --worker-class aiohttp.GunicornWebWorker -w 2 --access-logfile -
root 2697 0.0 0.0 0 0 ? S< Jul30 0:00 [kworker/12:1H]
root 2703 0.0 0.0 0 0 ? S< Jul30 0:00 [kworker/15:1H]
root 2706 0.0 0.0 0 0 ? S< Jul30 0:02 [kworker/4:1H]
pulp 2707 0.0 0.1 370704 71292 ? S Jul30 2:16 /usr/bin/python3 /usr/bin/gunicorn pulpcore.content:server --bind 127.0.0.1:24816 --worker-class aiohttp.GunicornWebWorker -w 2 --access-logfile -
root 2708 0.0 0.0 0 0 ? S< Jul30 0:00 [kworker/5:1H]
root 2709 0.0 0.0 0 0 ? S< Jul30 0:00 [kworker/6:1H]
root 2711 0.0 0.0 0 0 ? S< Jul30 0:00 [kworker/3:1H]
root 2713 0.0 0.0 0 0 ? S< Jul30 0:00 [kworker/8:1H]
root 2768 0.0 0.0 0 0 ? S< Jul30 0:01 [kworker/14:1H]
foreman+ 3607 0.0 0.1 810388 68764 ? Sl Jul30 1:16 ruby /usr/bin/smart_proxy_dynflow_core -d -p /var/run/foreman-proxy/smart_proxy_dynflow_core.pid
apache 3782 1.6 0.6 1440164 318268 ? Sl Jul30 71:59 /usr/bin/python /usr/bin/celery worker -n reserved_resource_worker-6@%h -A pulp.server.async.app -c 1 --events --umask 18 --pidfile=/var/run/pulp/reserved_resource_worker-6.pid
apache 3783 1.4 0.6 1446580 333924 ? Sl Jul30 61:31 /usr/bin/python /usr/bin/celery worker -n reserved_resource_worker-5@%h -A pulp.server.async.app -c 1 --events --umask 18 --pidfile=/var/run/pulp/reserved_resource_worker-5.pid
apache 3788 0.0 0.1 897712 76488 ? Sl Jul30 3:21 /usr/bin/python /usr/bin/celery worker -A pulp.server.async.app -n resource_manager@%h -Q resource_manager -c 1 --events --umask 18 --pidfile=/var/run/pulp/resource_manager.pid
apache 3789 0.3 0.3 1391376 194532 ? Sl Jul30 16:03 /usr/bin/python /usr/bin/celery worker -n reserved_resource_worker-2@%h -A pulp.server.async.app -c 1 --events --umask 18 --pidfile=/var/run/pulp/reserved_resource_worker-2.pid
apache 3792 0.2 0.5 1458236 247976 ? Sl Jul30 9:20 /usr/bin/python /usr/bin/celery worker -n reserved_resource_worker-7@%h -A pulp.server.async.app -c 1 --events --umask 18 --pidfile=/var/run/pulp/reserved_resource_worker-7.pid
apache 3794 0.1 0.3 1379744 183960 ? Sl Jul30 6:12 /usr/bin/python /usr/bin/celery worker -n reserved_resource_worker-0@%h -A pulp.server.async.app -c 1 --events --umask 18 --pidfile=/var/run/pulp/reserved_resource_worker-0.pid
apache 3801 0.4 0.3 1438308 188748 ? Sl Jul30 20:29 /usr/bin/python /usr/bin/celery worker -n reserved_resource_worker-1@%h -A pulp.server.async.app -c 1 --events --umask 18 --pidfile=/var/run/pulp/reserved_resource_worker-1.pid
apache 3805 0.1 0.3 1381716 182412 ? Sl Jul30 7:36 /usr/bin/python /usr/bin/celery worker -n reserved_resource_worker-4@%h -A pulp.server.async.app -c 1 --events --umask 18 --pidfile=/var/run/pulp/reserved_resource_worker-4.pid
apache 3812 3.3 0.4 1484580 227872 ? Sl Jul30 143:20 /usr/bin/python /usr/bin/celery worker -n reserved_resource_worker-3@%h -A pulp.server.async.app -c 1 --events --umask 18 --pidfile=/var/run/pulp/reserved_resource_worker-3.pid
postgres 3971 0.0 0.0 836232 25600 ? Ss Jul30 0:17 postgres: pulp pulpcore ::1(52680) idle
postgres 3973 0.0 0.0 836260 26076 ? Ss Jul30 0:18 postgres: pulp pulpcore ::1(52682) idle
postgres 3974 0.0 0.0 836232 25856 ? Ss Jul30 0:17 postgres: pulp pulpcore ::1(52684) idle
postgres 4057 0.0 0.0 835572 24904 ? Ss Jul30 0:22 postgres: pulp pulpcore ::1(52697) idle
postgres 4058 0.0 0.0 835572 24904 ? Ss Jul30 0:22 postgres: pulp pulpcore ::1(52696) idle
root 4611 0.0 0.0 0 0 ? S< Jul30 0:00 [kworker/7:1H]
apache 5874 0.1 0.2 1390876 145972 ? Sl 03:31 1:13 (wsgi:pulp) -DFOREGROUND
apache 5875 0.1 0.3 1460764 153028 ? Sl 03:31 1:06 (wsgi:pulp) -DFOREGROUND
apache 5876 0.1 0.3 1390364 189520 ? Sl 03:31 1:10 (wsgi:pulp) -DFOREGROUND
apache 5877 0.0 0.1 854540 49828 ? Sl 03:31 0:03 (wsgi:pulp-cont -DFOREGROUND
apache 5878 0.0 0.1 789004 49564 ? Sl 03:31 0:03 (wsgi:pulp-cont -DFOREGROUND
apache 5879 0.0 0.1 854540 51852 ? Sl 03:31 0:03 (wsgi:pulp-cont -DFOREGROUND
apache 5880 0.0 0.1 955596 75908 ? Sl 03:31 0:15 (wsgi:pulp_forg -DFOREGROUND
root 5881 0.0 0.0 217328 2008 ? Ssl 03:31 0:00 PassengerWatchdog
root 5884 0.4 0.0 1051396 6788 ? Sl 03:31 2:55 PassengerHelperAgent
nobody 5891 0.0 0.0 233092 4376 ? Sl 03:31 0:00 PassengerLoggingAgent
apache 5899 0.0 0.0 214084 11444 ? S 03:31 0:02 /usr/sbin/httpd -DFOREGROUND
apache 5900 0.0 0.0 214100 11448 ? S 03:31 0:02 /usr/sbin/httpd -DFOREGROUND
apache 5901 0.0 0.0 214072 11464 ? S 03:31 0:02 /usr/sbin/httpd -DFOREGROUND
apache 5902 0.0 0.0 214080 11464 ? S 03:31 0:03 /usr/sbin/httpd -DFOREGROUND
apache 5903 0.0 0.0 214104 11504 ? S 03:31 0:02 /usr/sbin/httpd -DFOREGROUND
apache 5904 0.0 0.0 214088 11488 ? S 03:31 0:02 /usr/sbin/httpd -DFOREGROUND
apache 5905 0.0 0.0 214084 11444 ? S 03:31 0:02 /usr/sbin/httpd -DFOREGROUND
apache 5906 0.0 0.0 214076 11460 ? S 03:31 0:02 /usr/sbin/httpd -DFOREGROUND
foreman 6098 0.0 0.6 1063212 318032 ? Sl 03:31 0:39 Passenger AppPreloader: /usr/share/foreman
postgres 6242 0.0 0.0 836872 12388 ? Ss 03:31 0:00 postgres: foreman foreman [local] idle
apache 7185 0.0 0.0 214088 11476 ? S 03:37 0:02 /usr/sbin/httpd -DFOREGROUND
apache 7196 0.0 0.0 214076 11448 ? S 03:37 0:02 /usr/sbin/httpd -DFOREGROUND
apache 7197 0.0 0.0 214084 11476 ? S 03:37 0:02 /usr/sbin/httpd -DFOREGROUND
apache 7198 0.0 0.0 214084 11448 ? S 03:37 0:02 /usr/sbin/httpd -DFOREGROUND
apache 7208 0.0 0.0 214092 11448 ? S 03:37 0:02 /usr/sbin/httpd -DFOREGROUND
apache 7209 0.0 0.0 214084 11456 ? S 03:37 0:02 /usr/sbin/httpd -DFOREGROUND
postgres 10870 0.0 0.0 836872 12372 ? Ss Jul30 0:00 postgres: foreman foreman [local] idle
postgres 10913 0.0 0.0 836872 12340 ? Ss Jul30 0:00 postgres: foreman foreman [local] idle
postgres 11346 0.0 0.0 835608 7304 ? Ss Jul30 0:00 postgres: foreman foreman [local] idle
postgres 11350 0.0 0.3 844224 186928 ? Ss Jul30 0:44 postgres: foreman foreman [local] idle
postgres 11351 0.0 0.3 840636 188396 ? Ss Jul30 0:44 postgres: foreman foreman [local] idle
postgres 11370 0.0 0.0 835608 7300 ? Ss Jul30 0:00 postgres: foreman foreman [local] idle
postgres 11375 0.0 0.2 843336 139596 ? Ss Jul30 0:13 postgres: foreman foreman [local] idle
postgres 11376 0.0 0.3 841708 162188 ? Ss Jul30 0:13 postgres: foreman foreman [local] idle
postgres 11399 0.0 0.0 835608 7300 ? Ss Jul30 0:00 postgres: foreman foreman [local] idle
postgres 11403 0.0 0.0 836544 32244 ? Ss Jul30 0:02 postgres: foreman foreman [local] idle
postgres 11404 0.0 0.0 836564 32108 ? Ss Jul30 0:02 postgres: foreman foreman [local] idle
root 11454 0.0 0.0 0 0 ? S< Jul30 0:00 [kworker/2:1H]
root 12021 0.0 0.0 0 0 ? S< Jul30 0:00 [kworker/9:1H]
root 12221 0.0 0.0 0 0 ? S< Jul30 0:00 [kworker/10:1H]
root 12258 0.0 0.0 0 0 ? S< Jul30 0:00 [dio/dm-1]
apache 12813 0.0 0.0 214180 11552 ? S 03:54 0:01 /usr/sbin/httpd -DFOREGROUND
apache 12822 0.0 0.0 214108 11460 ? S 03:54 0:01 /usr/sbin/httpd -DFOREGROUND
apache 12860 0.0 0.0 214180 11552 ? S 03:54 0:02 /usr/sbin/httpd -DFOREGROUND
apache 12866 0.0 0.0 214148 11528 ? S 03:54 0:01 /usr/sbin/httpd -DFOREGROUND
apache 12877 0.0 0.0 214096 11440 ? S 03:54 0:01 /usr/sbin/httpd -DFOREGROUND
apache 12878 0.0 0.0 214100 11464 ? S 03:54 0:01 /usr/sbin/httpd -DFOREGROUND
postgres 16654 0.0 0.3 839740 184836 ? Ss Jul30 0:44 postgres: foreman foreman [local] idle
root 16959 0.0 0.0 0 0 ? S< Jul30 0:00 [kworker/13:1H]
postfix 22673 0.0 0.0 102068 4148 ? S Jul30 0:00 tlsmgr -l -t unix -u
root 34895 0.0 0.0 0 0 ? S 06:13 0:00 [kworker/u256:2]
postgres 35865 0.0 0.2 836588 124364 ? Ss Jul30 0:12 postgres: foreman foreman [local] idle
root 46131 0.0 0.0 0 0 ? S 07:29 0:00 [kworker/5:2]
root 51231 0.0 0.0 0 0 ? S 08:07 0:01 [kworker/10:0]
root 64913 0.0 0.0 0 0 ? S 09:35 0:00 [kworker/3:0]
root 64942 0.0 0.0 0 0 ? R 09:35 0:01 [kworker/2:1]
root 73659 0.0 0.0 0 0 ? S 10:35 0:00 [kworker/13:2]
root 85483 0.0 0.0 0 0 ? S 11:57 0:01 [kworker/8:1]
root 94786 0.0 0.0 0 0 ? S 13:00 0:00 [kworker/7:0]
root 97103 0.0 0.0 0 0 ? S 13:13 0:01 [kworker/u256:0]
root 97410 0.0 0.0 0 0 ? S 13:14 0:00 [kworker/1:2]
root 99785 0.0 0.0 0 0 ? S 13:30 0:00 [kworker/1:0]
root 104636 0.0 0.0 0 0 ? S 14:07 0:00 [kworker/11:1]
root 106234 0.0 0.0 0 0 ? S 14:13 0:00 [kworker/0:2]
root 106430 0.0 0.0 0 0 ? S 14:14 0:00 [kworker/4:0]
root 108047 0.0 0.0 0 0 ? S 14:26 0:00 [kworker/7:2]
root 109673 0.0 0.0 0 0 ? S 14:39 0:00 [kworker/12:1]
postfix 110723 0.0 0.0 102056 4072 ? S 14:46 0:00 pickup -l -t unix -u
root 112226 0.0 0.0 0 0 ? S 14:59 0:00 [kworker/15:1]
root 112285 0.0 0.0 0 0 ? S 14:59 0:00 [kworker/8:0]
root 112364 0.0 0.0 0 0 ? S 15:00 0:00 [kworker/10:2]
root 112411 0.0 0.0 0 0 ? S 15:00 0:00 [kworker/4:1]
root 112949 0.0 0.0 0 0 ? S 15:04 0:00 [kworker/12:2]
root 113406 0.0 0.0 0 0 ? S 15:07 0:00 [kworker/8:2]
root 113619 0.0 0.0 0 0 ? S 15:09 0:00 [kworker/9:2]
root 113744 0.0 0.0 0 0 ? S 15:10 0:00 [kworker/11:2]
root 113984 0.0 0.0 0 0 ? S 15:11 0:00 [kworker/14:0]
foreman 115280 1.0 1.0 1709380 526464 ? Sl 15:14 0:11 Passenger RackApp: /usr/share/foreman
postgres 115288 0.0 0.0 835608 6784 ? Ss 15:14 0:00 postgres: foreman foreman [local] idle
postgres 115292 0.0 0.0 836240 14228 ? Ss 15:14 0:00 postgres: foreman foreman [local] idle
postgres 115293 0.0 0.0 836376 9508 ? Ss 15:14 0:00 postgres: foreman foreman [local] idle
postgres 115297 0.0 0.0 841972 46392 ? Ss 15:14 0:01 postgres: foreman foreman [local] idle
apache 115299 0.0 0.0 213940 10816 ? S 15:14 0:00 /usr/sbin/httpd -DFOREGROUND
root 115336 0.0 0.0 0 0 ? S 15:14 0:00 [kworker/3:2]
root 115352 0.0 0.0 0 0 ? S 15:14 0:00 [kworker/5:0]
root 115353 0.0 0.0 0 0 ? S 15:14 0:00 [kworker/2:2]
root 115723 0.0 0.0 183812 5900 ? Ss 15:15 0:00 sshd: k202081 [priv]
k202081 115736 0.0 0.0 183812 2624 ? S 15:15 0:00 sshd: k202081@pts/0
k202081 115737 0.0 0.0 125844 2104 pts/0 Ss 15:15 0:00 -bash
root 115762 0.0 0.0 272724 4940 pts/0 S 15:15 0:00 sudo bash
root 115768 0.0 0.0 0 0 ? S 15:15 0:00 [kworker/6:1]
root 115770 0.0 0.0 115680 2192 pts/0 S 15:15 0:00 bash
root 116106 0.0 0.0 0 0 ? S 15:18 0:00 [kworker/15:2]
root 116668 0.0 0.0 0 0 ? S 15:23 0:00 [kworker/15:0]
root 116758 0.0 0.0 0 0 ? S 15:24 0:00 [kworker/0:1]
root 116773 0.0 0.0 0 0 ? S 15:24 0:00 [kworker/9:0]
foreman 116910 1.0 0.9 1286980 484652 ? Sl 15:25 0:04 Passenger RackApp: /usr/share/foreman
postgres 116918 0.0 0.0 835608 6788 ? Ss 15:25 0:00 postgres: foreman foreman [local] idle
postgres 116922 0.0 0.0 835608 7316 ? Ss 15:25 0:00 postgres: foreman foreman [local] idle
postgres 116923 0.0 0.0 835860 7920 ? Ss 15:25 0:00 postgres: foreman foreman [local] idle
root 117054 0.0 0.0 0 0 ? S 15:26 0:00 [kworker/13:0]
postgres 117200 0.3 0.0 839440 46908 ? Ss 15:27 0:00 postgres: foreman foreman [local] idle
root 117383 0.0 0.0 0 0 ? S 15:29 0:00 [kworker/9:1]
root 117518 0.0 0.0 0 0 ? S 15:30 0:00 [kworker/2:0]
root 117520 0.0 0.0 0 0 ? S 15:30 0:00 [kworker/4:2]
root 117531 0.0 0.0 0 0 ? S 15:30 0:00 [kworker/6:0]
postgres 117691 0.0 0.0 835608 5160 ? Ss 15:31 0:00 postgres: candlepin candlepin 127.0.0.1(54049) idle
postgres 117692 0.0 0.0 835608 5160 ? Ss 15:31 0:00 postgres: candlepin candlepin 127.0.0.1(54052) idle
postgres 117693 0.0 0.0 835608 5160 ? Ss 15:31 0:00 postgres: candlepin candlepin 127.0.0.1(54048) idle
postgres 117755 0.0 0.0 836520 17716 ? Ss 15:32 0:00 postgres: foreman foreman [local] idle
postgres 117814 0.0 0.0 835608 5160 ? Ss 15:32 0:00 postgres: candlepin candlepin 127.0.0.1(54096) idle
postgres 117815 0.0 0.0 835608 5160 ? Ss 15:32 0:00 postgres: candlepin candlepin 127.0.0.1(54097) idle
root 117879 0.0 0.0 165772 1888 pts/0 R+ 15:33 0:00 ps aux
root 121404 0.0 0.0 0 0 ? S 01:41 0:00 [kworker/6:2]
postgres 127410 0.0 0.2 837532 122608 ? Ss Jul31 0:10 postgres: foreman foreman [local] idle
root 129814 0.0 0.0 0 0 ? S 02:46 0:21 [kworker/14:1]
@aruzicka found a new issue and opened a PR here: https://github.com/Dynflow/dynflow/pull/362
If you are able and want to try applying the patch, go for it; otherwise you’ll need to wait for a build.
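If you do go the manual route, a rough sketch (the dynflow gem path and version here are assumptions — they depend on your installation and the tfm SCL layout):
# cd /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.*
# curl -sL https://github.com/Dynflow/dynflow/pull/362.diff | patch -p1
# foreman-maintain service restart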
I have just applied the patch to those two files and restarted; I had a couple of tasks hanging since yesterday, including some metadata generation.
Restarted what? Just to be sure.
Everything:
# foreman-maintain service restart
So far it’s looking good: I haven’t had any hanging tasks. Yesterday I updated to 3.16, which reverted my patched files, and just last night there was a hanging task again. I have applied the PR again.
I’ll keep observing now on 3.16 with the PR applied…
Any update on this?
3.16 + Foreman 2.0.2 and the patch https://github.com/Dynflow/dynflow/pull/362 ?
I am having the same issue with a large repo (~50 GB in size); smaller ones run smoothly.
That can’t be right. Katello 3.16 goes with Foreman 2.1. If you have 3.16 and 2.0.2, I think there must be something broken…
With the current versions and the PR I don’t see any issues anymore. Which step of the task is hanging exactly?
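The Dynflow console for the stuck task (under Monitor > Tasks in the web UI, /foreman_tasks/tasks) shows which step it stops at; if you have hammer with the tasks plugin installed, something like this should also list what is currently running (a sketch — option names may differ by version):
# hammer task list --search 'state = running'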
Dynflow 1.4.7, containing the fix, was released to RubyGems on Saturday; it will take a few days until it reaches the repos.
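Once it lands, you can check what you are actually running with something like this (the package name assumes the EL7 tfm SCL packaging):
# rpm -q tfm-rubygem-dynflow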
Sorry, that is correct: 2.1 / 3.16. I have been trying 2.0 + 3.15 and other versions to get this working, running into issues along the way, and I’m starting to go crazy.