Smart proxy sync throws Katello::Errors::Pulp3Error

Problem: Yesterday I upgraded from Katello 4.16.3 to 4.17.1, and since then the smart proxy sync task occasionally throws Katello::Errors::Pulp3Error. A pulpcore-worker on the smart proxy side is killed by a general protection fault and leaves a coredump. When I start the smart proxy sync again, it finishes successfully.

Expected outcome: Synchronization of smart proxies runs without errors

Foreman and Proxy versions: 3.15/4.17.1

Foreman and Proxy plugin versions:

Distribution and version: RHEL9

Other relevant data:
Smart proxy has 32GB of RAM.

Logs from smart proxy:

kernel: [48636.858646] traps: pulpcore-worker[254711] general protection fault ip:7fdac58253c7 sp:7fff488870a0 error:0 in libpython3.12.so.1.0[7fdac56ff000+276000]

Logs from foreman server:

2025-08-26T11:46:02 [E|bac|75a11d32] Pulp task error (Katello::Errors::Pulp3Error)
 75a11d32 | /usr/share/gems/gems/katello-4.17.1/app/lib/actions/pulp3/abstract_async_task.rb:107:in `block in check_for_errors'
 75a11d32 | /usr/share/gems/gems/katello-4.17.1/app/lib/actions/pulp3/abstract_async_task.rb:105:in `each'
 75a11d32 | /usr/share/gems/gems/katello-4.17.1/app/lib/actions/pulp3/abstract_async_task.rb:105:in `check_for_errors'
 75a11d32 | /usr/share/gems/gems/katello-4.17.1/app/lib/actions/pulp3/abstract_async_task.rb:161:in `poll_external_task'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/action/polling.rb:100:in `poll_external_task_with_rescue'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/action/polling.rb:22:in `run'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/action/cancellable.rb:15:in `run'
 75a11d32 | /usr/share/gems/gems/katello-4.17.1/app/lib/actions/pulp3/abstract_async_task.rb:10:in `run'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/action.rb:590:in `block (3 levels) in execute_run'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/middleware/stack.rb:28:in `pass'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/middleware.rb:20:in `pass'
 75a11d32 | /usr/share/gems/gems/katello-4.17.1/app/lib/actions/middleware/record_smart_proxy_sync_history.rb:26:in `run'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/middleware/stack.rb:24:in `call'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/middleware/stack.rb:28:in `pass'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/middleware.rb:20:in `pass'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/middleware.rb:33:in `run'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/middleware/stack.rb:24:in `call'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/middleware/stack.rb:28:in `pass'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/middleware.rb:20:in `pass'
 75a11d32 | /usr/share/gems/gems/katello-4.17.1/app/lib/actions/middleware/remote_action.rb:16:in `block in run'
 75a11d32 | /usr/share/gems/gems/katello-4.17.1/app/lib/actions/middleware/remote_action.rb:40:in `block in as_remote_user'
 75a11d32 | /usr/share/gems/gems/katello-4.17.1/app/models/katello/concerns/user_extensions.rb:21:in `cp_config'
 75a11d32 | /usr/share/gems/gems/katello-4.17.1/app/lib/actions/middleware/remote_action.rb:27:in `as_cp_user'
 75a11d32 | /usr/share/gems/gems/katello-4.17.1/app/lib/actions/middleware/remote_action.rb:39:in `as_remote_user'
 75a11d32 | /usr/share/gems/gems/katello-4.17.1/app/lib/actions/middleware/remote_action.rb:16:in `run'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/middleware/stack.rb:24:in `call'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/middleware/stack.rb:28:in `pass'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/middleware.rb:20:in `pass'
 75a11d32 | /usr/share/gems/gems/foreman-tasks-11.0.0/app/lib/actions/middleware/rails_executor_wrap.rb:14:in `block in run'
 75a11d32 | /usr/share/gems/gems/activesupport-7.0.8.7/lib/active_support/execution_wrapper.rb:92:in `wrap'
 75a11d32 | /usr/share/gems/gems/foreman-tasks-11.0.0/app/lib/actions/middleware/rails_executor_wrap.rb:13:in `run'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/middleware/stack.rb:24:in `call'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/middleware/stack.rb:28:in `pass'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/middleware.rb:20:in `pass'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/action/progress.rb:29:in `with_progress_calculation'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/action/progress.rb:15:in `run'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/middleware/stack.rb:24:in `call'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/middleware/stack.rb:28:in `pass'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/middleware.rb:20:in `pass'
 75a11d32 | /usr/share/gems/gems/foreman-tasks-11.0.0/app/lib/actions/middleware/load_setting_values.rb:20:in `run'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/middleware/stack.rb:24:in `call'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/middleware/stack.rb:28:in `pass'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/middleware.rb:20:in `pass'
 75a11d32 | /usr/share/gems/gems/foreman-tasks-11.0.0/app/lib/actions/middleware/keep_current_request_id.rb:15:in `block in run'
 75a11d32 | /usr/share/gems/gems/foreman-tasks-11.0.0/app/lib/actions/middleware/keep_current_request_id.rb:52:in `restore_current_request_id'
 75a11d32 | /usr/share/gems/gems/foreman-tasks-11.0.0/app/lib/actions/middleware/keep_current_request_id.rb:15:in `run'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/middleware/stack.rb:24:in `call'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/middleware/stack.rb:28:in `pass'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/middleware.rb:20:in `pass'
 75a11d32 | /usr/share/gems/gems/foreman-tasks-11.0.0/app/lib/actions/middleware/keep_current_timezone.rb:15:in `block in run'
 75a11d32 | /usr/share/gems/gems/foreman-tasks-11.0.0/app/lib/actions/middleware/keep_current_timezone.rb:44:in `restore_curent_timezone'
 75a11d32 | /usr/share/gems/gems/foreman-tasks-11.0.0/app/lib/actions/middleware/keep_current_timezone.rb:15:in `run'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/middleware/stack.rb:24:in `call'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/middleware/stack.rb:28:in `pass'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/middleware.rb:20:in `pass'
 75a11d32 | /usr/share/gems/gems/foreman-tasks-11.0.0/app/lib/actions/middleware/keep_current_taxonomies.rb:15:in `block in run'
 75a11d32 | /usr/share/gems/gems/foreman-tasks-11.0.0/app/lib/actions/middleware/keep_current_taxonomies.rb:45:in `restore_current_taxonomies'
 75a11d32 | /usr/share/gems/gems/foreman-tasks-11.0.0/app/lib/actions/middleware/keep_current_taxonomies.rb:15:in `run'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/middleware/stack.rb:24:in `call'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/middleware/stack.rb:28:in `pass'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/middleware.rb:20:in `pass'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/middleware.rb:33:in `run'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/middleware/stack.rb:24:in `call'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/middleware/stack.rb:28:in `pass'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/middleware.rb:20:in `pass'
 75a11d32 | /usr/share/gems/gems/foreman-tasks-11.0.0/app/lib/actions/middleware/keep_current_user.rb:15:in `block in run'
 75a11d32 | /usr/share/gems/gems/foreman-tasks-11.0.0/app/lib/actions/middleware/keep_current_user.rb:54:in `restore_curent_user'
 75a11d32 | /usr/share/gems/gems/foreman-tasks-11.0.0/app/lib/actions/middleware/keep_current_user.rb:15:in `run'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/middleware/stack.rb:24:in `call'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/middleware/world.rb:31:in `execute'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/action.rb:589:in `block (2 levels) in execute_run'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/action.rb:588:in `catch'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/action.rb:588:in `block in execute_run'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/action.rb:491:in `block in with_error_handling'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/action.rb:491:in `catch'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/action.rb:491:in `with_error_handling'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/action.rb:583:in `execute_run'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/action.rb:304:in `execute'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/execution_plan/steps/abstract_flow_step.rb:18:in `block (2 levels) in execute'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/execution_plan/steps/abstract.rb:168:in `with_meta_calculation'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/execution_plan/steps/abstract_flow_step.rb:17:in `block in execute'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/execution_plan/steps/abstract_flow_step.rb:32:in `open_action'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/execution_plan/steps/abstract_flow_step.rb:16:in `execute'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/director.rb:95:in `execute'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/executors/sidekiq/worker_jobs.rb:12:in `block (2 levels) in perform'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/executors.rb:18:in `run_user_code'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/executors/sidekiq/worker_jobs.rb:10:in `block in perform'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/executors/sidekiq/worker_jobs.rb:26:in `with_telemetry'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/executors/sidekiq/worker_jobs.rb:9:in `perform'
 75a11d32 | /usr/share/gems/gems/dynflow-1.9.1/lib/dynflow/executors/sidekiq/serialization.rb:28:in `perform'
 75a11d32 | [ sidekiq ]
 75a11d32 | [ concurrent-ruby ]

Remark: there was an issue discussed here: Content proxy complete sync out-of-memory - #5 by gvde, which I hoped would be solved by this version of Katello. Maybe these two problems are related somehow.

Hi @JendaVodka

We’re currently investigating a segfault issue at sync time that seems to be related to createrepo_c 1.2.1 and potentially Python 3.12.

Can you tell whether your machine is memory-constrained when the coredump happens, or does it seem completely unrelated to memory usage?

If you’re interested in doing some testing, I have RPMs of createrepo_c 1.1.3 built for Python 3.12, plus a special pulp-rpm 3.29.4 build that accepts the older createrepo_c.

We’re trying to determine now whether the newer createrepo_c introduced a regression, so we’re looking for data on whether syncs succeed 100% of the time on the older createrepo_c.

We have a tracker for this issue just on Satellite for now (https://issues.redhat.com/browse/SAT-35513).

Hi @iballou
thank you for your quick response.
The issue happened both on the Foreman server with 64GB during repo sync and on the Foreman capsule with 32GB during capsule sync. The Foreman server in particular has plenty of memory, so this does not seem to be related to overall system memory contention. However, it may be connected to a per-process memory limit or some ulimit of the pulp user (stack size etc.). The strange thing is that a second sync shortly afterwards finishes successfully.
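
The per-process limit theory is easy to check: every running worker exposes its effective limits under /proc/<pid>/limits. A minimal sketch; it inspects the current shell via $$ so it is self-contained, since a real pulpcore-worker PID is host-specific (on a capsule you might locate one with something like `pgrep -o -f pulpcore-worker`):

```shell
# Show the resource limits most relevant to a crashing worker process.
# pid=$(pgrep -o -f pulpcore-worker)   # on a real capsule (hypothetical match)
pid=$$                                 # current shell, so the sketch is self-contained
grep -E 'Max (stack size|address space|data size)' "/proc/$pid/limits"
```

If "Max address space" or "Max data size" shows anything other than unlimited, a large sync could plausibly push a single worker over that limit.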
Unfortunately I have no testing environment for Foreman, only a production one, but if you give me a link where those packages can be downloaded and instructions on how to install them, I may give it a try.

Thanks for the info. In case the sync issue gets annoying, I’ve attached the RPMs to test.
createrepo_c-1_1_3.tgz (947.5 KB)

If you do test this out, first check what version of pulp-rpm your machine currently has. If it’s lower than 3.29.4, you’ll want to run the installer and restart services to ensure any new Pulp migrations run (usually there aren’t any, because Pulp doesn’t typically backport migrations). As an extra precaution, it could be worth taking a snapshot or a foreman-maintain backup.
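
The version check described above can be scripted with `sort -V`. This is only a sketch: the `installed` value is a placeholder, and on a real box it would come from an `rpm -q` query against whatever the pulp-rpm package is named on that distribution:

```shell
# Decide whether the installer needs to re-run after swapping in pulp-rpm 3.29.4.
installed="3.28.0"   # placeholder; query the real version with rpm -q
required="3.29.4"
lowest=$(printf '%s\n%s\n' "$installed" "$required" | sort -V | head -n1)
if [ "$lowest" = "$installed" ] && [ "$installed" != "$required" ]; then
    echo "pulp-rpm $installed < $required: run foreman-installer and restart services"
else
    echo "pulp-rpm $installed is already >= $required"
fi
```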

Any progress on this? I updated to 3.15.0/4.17.1 on Sunday and I am seeing the same issue: content proxy syncs fail due to crashing pulpcore-workers:

Sep  8 03:59:31 foreman8-content kernel: traps: pulpcore-worker[41722] general protection fault ip:7f804b824d77 sp:7ffce2e44710 error:0 in libpython3.12.so.1.0[7f804b6ff000+276000]
Sep  9 04:04:12 foreman8-content kernel: traps: pulpcore-worker[110722] general protection fault ip:7f804b824d77 sp:7ffce2e44710 error:0 in libpython3.12.so.1.0[7f804b6ff000+276000]
Sep  9 04:45:54 foreman8-content kernel: traps: pulpcore-worker[118474] general protection fault ip:7fe408224d77 sp:7ffc6581c550 error:0 in libpython3.12.so.1.0[7fe4080ff000+276000]

This is causing me some pain…
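
One way to quantify how often the workers are crashing is to count these fault lines in the kernel log. A minimal sketch; on a live host the input would come from `journalctl -k` or /var/log/messages, but a sample line is inlined here so the snippet is self-contained:

```shell
# Count pulpcore-worker general protection faults in kernel log output.
sample='Sep  9 04:04:12 foreman8-content kernel: traps: pulpcore-worker[110722] general protection fault ip:7f804b824d77 sp:7ffce2e44710 error:0 in libpython3.12.so.1.0'
printf '%s\n' "$sample" | grep -c 'pulpcore-worker\[[0-9]*\] general protection fault'
```

Correlating the timestamps of these lines with failed sync tasks confirms whether every Pulp3Error maps to a worker crash.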

I have also noticed that despite no errors being shown during sync, some repositories have to be re-synced via a complete sync, otherwise they do not appear correctly on the capsule. In my case these are EPEL repositories with the Immediate download policy and Additive mirroring policy.

There is some discussion going on in the related Jira: https://issues.redhat.com/browse/SAT-35513
Debugging is happening but it’s a bit slow due to the complexity of the issue.

I think we may need to revert the version of createrepo_c in the meantime. Since Katello 4.18 GA is supposed to be today, there should be news soon.

Edit: we are indeed delaying Katello 4.18.0 GA for this issue.

For an update, we have a PR out to libcomps to fix the seg fault issue: Resolve segfault issues with libcomps python extension by dralley · Pull Request #132 · rpm-software-management/libcomps · GitHub

The patch fixes the seg fault in our minimal reproducer script. Our overnight sync testing also showed no failures.

The current plan is to wait until next week for a maintainer review. If we hear nothing, I’m proposing that we package the patch into libcomps ourselves and proceed with the Katello 4.18 GA next week. In that case, we would use the patched libcomps until there is an official release with our fix.

What is the current best workaround for this issue on 3.15/4.17? I’m hitting it almost every night during our automatic publish/promote. I have to do an optimized sync for each content proxy having the problem to get everything visible everywhere…

We are about to officially release a new version of libcomps that includes the patch mentioned above. It is considered a must have for the Katello 4.18 release, which was delayed to today, assuming everything is green. The new package will hit the Pulpcore repo that serves both Katello 4.17 and 4.18.

In fact, I just noticed that our nightly Pulpcore repo has the upgraded package already: Index of /pulpcore/nightly/el9/x86_64

This nightly repo is currently still Pulpcore 3.73 compatible, since we haven’t started the 3.85 packaging for Katello 4.19 yet. It should be safe to upgrade libcomps from that repository; the package should make it into the normal 3.73 repo that serves Katello 4.17 very soon.

The only other workaround that we’ve identified is downgrading createrepo_c, which seems to have uncovered the libcomps segfault issue (but we’re not sure why it wasn’t prevalent earlier).

We would be very interested in hearing whether the seg faults do indeed subside with the libcomps upgrade. So far we’ve only been able to reproduce the issue reliably via an artificial script that simulates what Pulp does during syncing.

Update - it sounds like the libcomps upgraded package should be released in the Pulpcore 3.73 repo today.

The new version of pulp-rpm will pull in https://yum.theforeman.org/pulpcore/3.73/el9/x86_64/libcomps-0.1.23-1.el9.x86_64.rpm, which should fix the issue.

I updated yesterday (three libcomps RPMs) and, at least as of this morning, there were no failures.

I can confirm that the sync now works without failures.

Thanks for the reports back; it’s really helpful, especially considering how often y’all were reproducing the seg fault.