Pulp3 Migration failed

Hi,
I have done everything (except delete the RHEL 8 and CentOS 8 repos).
How can I see which repo is causing this problem?

[root@server]# foreman-rake katello:pulp3_migration --trace
Rubocop not loaded.
** Invoke katello:pulp3_migration (first_time)
** Invoke dynflow:client (first_time)
** Invoke environment (first_time)
** Execute environment
** Execute dynflow:client
** Execute katello:pulp3_migration
Starting task.
2021-05-10 13:51:40 -0300: Content migration starting.
2021-05-10 14:18:32 -0300: Distribution creation 7/34Migration failed, You will want to investigate: https://server/foreman_tasks/tasks/3fc7cff9-3e8d-4e6e-89ac-59c905e05970
rake aborted!
ForemanTasks::TaskError: Task 3fc7cff9-3e8d-4e6e-89ac-59c905e05970: Katello::Errors::Pulp3Error: 1 subtask(s) failed for task group /pulp/api/v3/task-groups/40bdb2f8-ca2a-4d36-b37f-384706381109/.
/opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.18.2.1/lib/katello/tasks/pulp3_migration.rake:33:in `block (2 levels) in <top (required)>'
/opt/rh/rh-ruby25/root/usr/share/gems/gems/rake-12.3.0/lib/rake/task.rb:251:in `block in execute'
/opt/rh/rh-ruby25/root/usr/share/gems/gems/rake-12.3.0/lib/rake/task.rb:251:in `each'
/opt/rh/rh-ruby25/root/usr/share/gems/gems/rake-12.3.0/lib/rake/task.rb:251:in `execute'
/opt/rh/rh-ruby25/root/usr/share/gems/gems/rake-12.3.0/lib/rake/task.rb:195:in `block in invoke_with_call_chain'
/opt/rh/rh-ruby25/root/usr/share/ruby/monitor.rb:226:in `mon_synchronize'
/opt/rh/rh-ruby25/root/usr/share/gems/gems/rake-12.3.0/lib/rake/task.rb:188:in `invoke_with_call_chain'
/opt/rh/rh-ruby25/root/usr/share/gems/gems/rake-12.3.0/lib/rake/task.rb:181:in `invoke'
/opt/rh/rh-ruby25/root/usr/share/gems/gems/rake-12.3.0/lib/rake/application.rb:160:in `invoke_task'
/opt/rh/rh-ruby25/root/usr/share/gems/gems/rake-12.3.0/lib/rake/application.rb:116:in `block (2 levels) in top_level'
/opt/rh/rh-ruby25/root/usr/share/gems/gems/rake-12.3.0/lib/rake/application.rb:116:in `each'
/opt/rh/rh-ruby25/root/usr/share/gems/gems/rake-12.3.0/lib/rake/application.rb:116:in `block in top_level'
/opt/rh/rh-ruby25/root/usr/share/gems/gems/rake-12.3.0/lib/rake/application.rb:125:in `run_with_threads'
/opt/rh/rh-ruby25/root/usr/share/gems/gems/rake-12.3.0/lib/rake/application.rb:110:in `top_level'
/opt/rh/rh-ruby25/root/usr/share/gems/gems/rake-12.3.0/lib/rake/application.rb:83:in `block in run'
/opt/rh/rh-ruby25/root/usr/share/gems/gems/rake-12.3.0/lib/rake/application.rb:186:in `standard_exception_handling'
/opt/rh/rh-ruby25/root/usr/share/gems/gems/rake-12.3.0/lib/rake/application.rb:80:in `run'
/opt/rh/rh-ruby25/root/usr/share/gems/gems/rake-12.3.0/exe/rake:27:in `<top (required)>'
/opt/rh/rh-ruby25/root/usr/bin/rake:23:in `load'
/opt/rh/rh-ruby25/root/usr/bin/rake:23:in `<main>'
Tasks: TOP => katello:pulp3_migration

So, I am new to the whole migration thing and am kind of stuck.

I am getting the error: WARNING: MISSING OR CORRUPTED CONTENT DETECTED

And the file indicated in the migration stats shows me a list of a THOUSAND rpms. The packages are all from "8" repos (RedHat 8, CentOS 8, or Oracle Linux 8).

I have tried running the orphaned object cleanup. No dice.

The list of actions to perform includes: “Performing a ‘Verify Checksum’ sync under Advanced Sync Options, let it complete, and re-running the migration”, but I have not been able to find that option.

In my situation, I cannot wipe out all of the centos8/redhat8/etc stuff because that’s half of my entire katello infrastructure.

And “Mark currently corrupted or missing content as skipped” makes me nervous, as with a thousand packages, I cannot tell if ALL of them are “old” or not.

Recommendations?

Ah, I finally found the advanced sync (with verify content). I am giving that a shot and will reply back.

@caseybea awesome, let us know how it goes!

@pabloalcantara sorry for the delay. You may try it again and check the output of 'journalctl -u pulpcore-worker@*' to see if any tracebacks are printed.
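
For example, something like this should surface any tracebacks around the failure window (the --since timestamp is just a placeholder for your failed run):

journalctl -u 'pulpcore-worker@*' --since '2021-05-10 13:00' | grep -i -B 2 -A 20 traceback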

Well, zero progress.

First, I redid ALL the syncs, now with "verify" checked under the advanced options.

Second, I re-executed the orphaned content delete.

Third, I re-ran the content prepare.

Same error. The SAME exact 1,000 packages are listed in the reference file from the migration stats command.

Help?

Running Prepare content for Pulp 3
================================================================================
Prepare content for Pulp 3:
Starting task.
2021-05-25 14:10:30 -0500: Importing migrated content type rpm: 1080/1459
Some corrupted or missing content found, run 'foreman-maintain content migration-stats' for more information.
                                                                      [FAIL]
Failed executing foreman-rake katello:pulp3_migration, exit status 255
--------------------------------------------------------------------------------
Scenario [Prepare content for Pulp 3] failed.

The following steps ended up in failing state:

  [content-prepare]

Resolve the failed steps and rerun
the command. In case the failures are false positives,
use --whitelist="content-prepare"

I would do a little bit of investigation. It's possible that there were multiple copies of the packages that it claims are missing/corrupt. For example, let's say it complained about 'kernel-3.10.0-957.12.2.el7.x86_64' being missing/corrupt; you can run:

find /var/lib/pulp/content/ | grep kernel-3.10.0-957.12.2.el7.x86_64

My guess is that will show at least one copy on disk. You could also go on a test client and verify that you can yum install one of the listed rpms (basically proving that the rpms mentioned aren’t the ones missing/corrupt).
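
If you want to be thorough, you could also checksum the copies on disk. A rough sketch, assuming the 2+62 hex directory components under /var/lib/pulp/content/units/rpm/ encode the unit's sha256 (which is what the paths suggest):

find /var/lib/pulp/content/units/rpm/ -name 'kernel-3.10.0-957.12.2.el7.x86_64.rpm' | while read -r f; do
  # if the on-disk copy is intact, the printed digest should match
  # the two directory components of the path joined together
  sha256sum "$f"
done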

Another thing you could do is go into the foreman-rake console and run:

Katello::Rpm.where(migrated_pulp3_href: nil).where(filename: 'kernel-3.10.0-957.12.2.el7.x86_64.rpm').count
Katello::Rpm.where(migrated_pulp3_href: nil).where(filename: 'kernel-3.10.0-957.12.2.el7.x86_64.rpm')[0].repositories.map(&:name)
Katello::Rpm.where(migrated_pulp3_href: nil).where(filename: 'kernel-3.10.0-957.12.2.el7.x86_64.rpm')[0].repositories.map(&:content_view).map(&:name)

The first line will print how many unmigrated rpms have that exact filename.
The second line will print the names of the repositories the first unmigrated rpm with that filename is in.
The third line will print the content views those repositories are in.
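
If you have a lot of filenames to check, a rough loop in foreman-rake console over the migration-stats list could save time. The list path below is hypothetical; point it at whatever file your /tmp/unmigratable_content-* run produced, assuming one rpm filename per line:

# hypothetical path; substitute the file from your migration-stats output
File.readlines('/tmp/unmigratable_content/rpms.txt').map(&:strip).each do |fname|
  rpm = Katello::Rpm.where(migrated_pulp3_href: nil).find_by(filename: fname)
  next unless rpm
  puts "#{fname}: repos=#{rpm.repositories.map(&:name).uniq.join(',')} " \
       "views=#{rpm.repositories.map(&:content_view).compact.map(&:name).uniq.join(',')}"
end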

It's very likely that these corrupt/missing rpms went missing during some past event, and since you haven't noticed until now, they aren't really in use.

I've spot-checked several of the RPMs; they all appear to be from CentOS 8:

irb(main):012:0> Katello::Rpm.where(migrated_pulp3_href: nil).where(filename: 'rsync-3.1.3-6.el8.x86_64.rpm')[0].repositories.map(&:name)
=> ["centos8_base", "centos8_base", "centos8_base", "centos8_base", "centos8_base", "centos8_base"]
irb(main):013:0> Katello::Rpm.where(migrated_pulp3_href: nil).where(filename: 'rsync-3.1.3-6.el8.x86_64.rpm')[0].repositories.map(&:content_view).map(&:name)
=> ["Centos8", "Centos8", "Default Organization View", "Centos8", "Centos8", "Centos8"]
irb(main):014:0> Katello::Rpm.where(migrated_pulp3_href: nil).where(filename: 'rpm-apidocs-4.14.2-25.el8.noarch.rpm')[0].repositories.map(&:content_view).map(&:name)
=> ["Centos8", "Centos8", "Default Organization View", "Centos8", "Centos8", "Centos8"]

And from a further random sampling, the versions do all appear to be "older".

I may first try just wiping all traces of centos8 (deleting the hosts, content views, repos, all of it) and see if that does it.
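
If it helps anyone else, the teardown can be scripted with hammer. A sketch using my object names, assuming the standard hammer subcommands; note you may need to delete promoted content view versions before the view itself will delete:

hammer host delete --name centos8-testhost.example.com
# repeat the remove-from-environment step for each lifecycle environment
hammer content-view remove-from-environment --name Centos8 --organization "Default Organization" --lifecycle-environment Production
hammer content-view delete --name Centos8 --organization "Default Organization"
hammer repository delete --name centos8_base --product centos8 --organization "Default Organization"
hammer product delete --name centos8 --organization "Default Organization"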

Each of the rpms listed appears in a TON of places (example below), but the katello commands above SEEM to point only to the Centos8 content view.

[root@katello unmigratable_content-20210525-18215-1ivjglw]# locate rsync-3.1.3-6.el8.x86_64.rpm
/var/lib/pulp/content/units/rpm/81/c2108792e7185de0bd2e19f50bc768eaa737b451d4c279ac30ad7de8fa9c16/rsync-3.1.3-6.el8.x86_64.rpm
/var/lib/pulp/content/units/rpm/f5/d24a109074170f865b8cfef767a3b6a1d0497c2965d8442099f824eba9be68/rsync-3.1.3-6.el8.x86_64.rpm
/var/lib/pulp/published/yum/master/yum_distributor/1-Centos8-Development-0d69ed0f-5f4e-4058-a99f-e8411e38bf82/1620657772.86/Packages/r/rsync-3.1.3-6.el8.x86_64.rpm
/var/lib/pulp/published/yum/master/yum_distributor/1-Centos8-Library-0d69ed0f-5f4e-4058-a99f-e8411e38bf82/1620657418.23/Packages/r/rsync-3.1.3-6.el8.x86_64.rpm
/var/lib/pulp/published/yum/master/yum_distributor/1-Centos8-Production-0d69ed0f-5f4e-4058-a99f-e8411e38bf82/1620657790.12/Packages/r/rsync-3.1.3-6.el8.x86_64.rpm
/var/lib/pulp/published/yum/master/yum_distributor/1-Centos8-v19_0-0d69ed0f-5f4e-4058-a99f-e8411e38bf82/1607441046.98/Packages/r/rsync-3.1.3-6.el8.x86_64.rpm
/var/lib/pulp/published/yum/master/yum_distributor/1-Centos8-v20_0-0d69ed0f-5f4e-4058-a99f-e8411e38bf82/1620657415.0/Packages/r/rsync-3.1.3-6.el8.x86_64.rpm
/var/lib/pulp/published/yum/master/yum_distributor/1-Oracle8-Development-57f9d878-d7c7-4910-a157-fb9ddd6d5659/1620658270.27/Packages/r/rsync-3.1.3-6.el8.x86_64.rpm
/var/lib/pulp/published/yum/master/yum_distributor/1-Oracle8-Library-57f9d878-d7c7-4910-a157-fb9ddd6d5659/1620657879.37/Packages/r/rsync-3.1.3-6.el8.x86_64.rpm
/var/lib/pulp/published/yum/master/yum_distributor/1-Oracle8-Production-57f9d878-d7c7-4910-a157-fb9ddd6d5659/1620658290.45/Packages/r/rsync-3.1.3-6.el8.x86_64.rpm
/var/lib/pulp/published/yum/master/yum_distributor/1-Oracle8-v11_0-57f9d878-d7c7-4910-a157-fb9ddd6d5659/1617647575.54/Packages/r/rsync-3.1.3-6.el8.x86_64.rpm
/var/lib/pulp/published/yum/master/yum_distributor/1-Oracle8-v12_0-57f9d878-d7c7-4910-a157-fb9ddd6d5659/1620657869.33/Packages/r/rsync-3.1.3-6.el8.x86_64.rpm
/var/lib/pulp/published/yum/master/yum_distributor/1-RedHat8-Development-40e9c880-d53e-4848-941f-20a77265b849/1620659168.87/Packages/r/rsync-3.1.3-6.el8.x86_64.rpm
/var/lib/pulp/published/yum/master/yum_distributor/1-RedHat8-Library-40e9c880-d53e-4848-941f-20a77265b849/1620658647.0/Packages/r/rsync-3.1.3-6.el8.x86_64.rpm
/var/lib/pulp/published/yum/master/yum_distributor/1-RedHat8-Production-40e9c880-d53e-4848-941f-20a77265b849/1620659185.1/Packages/r/rsync-3.1.3-6.el8.x86_64.rpm
/var/lib/pulp/published/yum/master/yum_distributor/1-RedHat8-v14_0-40e9c880-d53e-4848-941f-20a77265b849/1617648404.04/Packages/r/rsync-3.1.3-6.el8.x86_64.rpm
/var/lib/pulp/published/yum/master/yum_distributor/1-RedHat8-v15_0-40e9c880-d53e-4848-941f-20a77265b849/1620658640.66/Packages/r/rsync-3.1.3-6.el8.x86_64.rpm
/var/lib/pulp/published/yum/master/yum_distributor/40e9c880-d53e-4848-941f-20a77265b849/1621426079.95/Packages/r/rsync-3.1.3-6.el8.x86_64.rpm
/var/www/html/centos8/BaseOS/x86_64/kickstart/Packages/rsync-3.1.3-6.el8.x86_64.rpm
/var/www/html/centos8/BaseOS/x86_64/os/Packages/rsync-3.1.3-6.el8.x86_64.rpm

(Note: I have several other quite active RedHat8-like repos, views, hosts, etc. for all of the major CentOS 8 replacements: Oracle Linux 8, RedHat 8, Rocky Linux 8, and so on. But the CentOS 8 one itself can go, since that product is dead now; I was keeping it around for reference and only had one test host tied to it.)

I will try the whole shooting match from the start tomorrow once I verify all of the centos8 bits are totally gone.

OK, so after wiping the centos8 stuff clean (and redoing orphan cleanup), I am down from 1000 to only 36 packages that cause the content migration prepare to fail.

A random check shows that some (maybe all?) of the packages are CURRENT versions, and they exist in all redhat8-like content views.

What do I do now? I’ve already re-done the sync of all repos with the verify option. It looks like the offending packages all came from EPEL8.

Example output of one package:

[root@katello ~]# locate ImageMagick-devel-6.9.10.86-1.el8.x86_64.rpm
/var/lib/pulp/content/units/rpm/ce/f47df05364b886c49a60fd101fc31530041375b93b5a6763923942e169ac45/ImageMagick-devel-6.9.10.86-1.el8.x86_64.rpm
/var/lib/pulp/published/yum/master/yum_distributor/1-Oracle8-Development-012e5b25-7d32-4ad8-a002-9d082dbfc706/1620658270.34/Packages/i/ImageMagick-devel-6.9.10.86-1.el8.x86_64.rpm
/var/lib/pulp/published/yum/master/yum_distributor/1-Oracle8-Library-012e5b25-7d32-4ad8-a002-9d082dbfc706/1620657921.68/Packages/i/ImageMagick-devel-6.9.10.86-1.el8.x86_64.rpm
/var/lib/pulp/published/yum/master/yum_distributor/1-Oracle8-Production-012e5b25-7d32-4ad8-a002-9d082dbfc706/1620658290.94/Packages/i/ImageMagick-devel-6.9.10.86-1.el8.x86_64.rpm
/var/lib/pulp/published/yum/master/yum_distributor/1-Oracle8-v11_0-012e5b25-7d32-4ad8-a002-9d082dbfc706/1617647617.15/Packages/i/ImageMagick-devel-6.9.10.86-1.el8.x86_64.rpm
/var/lib/pulp/published/yum/master/yum_distributor/1-Oracle8-v12_0-012e5b25-7d32-4ad8-a002-9d082dbfc706/1620657911.33/Packages/i/ImageMagick-devel-6.9.10.86-1.el8.x86_64.rpm
/var/lib/pulp/published/yum/master/yum_distributor/1-RedHat8-Development-012e5b25-7d32-4ad8-a002-9d082dbfc706/1620659169.02/Packages/i/ImageMagick-devel-6.9.10.86-1.el8.x86_64.rpm
/var/lib/pulp/published/yum/master/yum_distributor/1-RedHat8-Library-012e5b25-7d32-4ad8-a002-9d082dbfc706/1620658692.85/Packages/i/ImageMagick-devel-6.9.10.86-1.el8.x86_64.rpm
/var/lib/pulp/published/yum/master/yum_distributor/1-RedHat8-Production-012e5b25-7d32-4ad8-a002-9d082dbfc706/1620659185.16/Packages/i/ImageMagick-devel-6.9.10.86-1.el8.x86_64.rpm
/var/lib/pulp/published/yum/master/yum_distributor/1-RedHat8-v14_0-012e5b25-7d32-4ad8-a002-9d082dbfc706/1617648450.98/Packages/i/ImageMagick-devel-6.9.10.86-1.el8.x86_64.rpm
/var/lib/pulp/published/yum/master/yum_distributor/1-RedHat8-v15_0-012e5b25-7d32-4ad8-a002-9d082dbfc706/1620658686.12/Packages/i/ImageMagick-devel-6.9.10.86-1.el8.x86_64.rpm
/var/lib/pulp/published/yum/master/yum_distributor/1-Rocky8-Development-012e5b25-7d32-4ad8-a002-9d082dbfc706/1621956310.72/Packages/i/ImageMagick-devel-6.9.10.86-1.el8.x86_64.rpm
/var/lib/pulp/published/yum/master/yum_distributor/1-Rocky8-Library-012e5b25-7d32-4ad8-a002-9d082dbfc706/1621629694.88/Packages/i/ImageMagick-devel-6.9.10.86-1.el8.x86_64.rpm
/var/lib/pulp/published/yum/master/yum_distributor/1-Rocky8-Production-012e5b25-7d32-4ad8-a002-9d082dbfc706/1621956328.38/Packages/i/ImageMagick-devel-6.9.10.86-1.el8.x86_64.rpm
/var/lib/pulp/published/yum/master/yum_distributor/1-Rocky8-v2_0-012e5b25-7d32-4ad8-a002-9d082dbfc706/1620310053.06/Packages/i/ImageMagick-devel-6.9.10.86-1.el8.x86_64.rpm
/var/lib/pulp/published/yum/master/yum_distributor/1-Rocky8-v3_0-012e5b25-7d32-4ad8-a002-9d082dbfc706/1621629687.79/Packages/i/ImageMagick-devel-6.9.10.86-1.el8.x86_64.rpm
/var/www/html/epel8/Everything/x86_64/Packages/i/ImageMagick-devel-6.9.10.86-1.el8.x86_64.rpm
[root@katello ~]# foreman-rake console
Loading production environment (Rails 6.0.3.4)
irb(main):001:0> Katello::Rpm.where(migrated_pulp3_href: nil).where(filename: 'ImageMagick-devel-6.9.10.86-1.el8.x86_64.rpm').count
=> 1
irb(main):002:0> Katello::Rpm.where(migrated_pulp3_href: nil).where(filename: 'ImageMagick-devel-6.9.10.86-1.el8.x86_64.rpm')[0].repositories.map(&:name)
=> ["epel8_base", "epel8_base", "epel8_base", "epel8_base", "epel8_base", "epel8_base", "epel8_base", "epel8_base", "epel8_base", "epel8_base", "epel8_base", "epel8_base", "epel8_base", "epel8_base", "epel8_base", "epel8_base"]
irb(main):003:0>
irb(main):004:0> Katello::Rpm.where(migrated_pulp3_href: nil).where(filename: 'ImageMagick-devel-6.9.10.86-1.el8.x86_64.rpm')[0].repositories.map(&:content_view).map(&:name)
=> ["RedHat8", "RedHat8", "Default Organization View", "Rocky8", "Oracle8", "Oracle8", "Rocky8", "Oracle8", "Rocky8", "Oracle8", "Rocky8", "Rocky8", "RedHat8", "Oracle8", "RedHat8", "RedHat8"]
irb(main):005:0>

I also verified that this package (and exact version) is indeed installable via katello on a sample host

===============================================================================================================================================================================================================================
 Package                                                            Architecture                         Version                                            Repository                                                    Size
===============================================================================================================================================================================================================================
Installing:
 ImageMagick-devel                                                  x86_64                               6.9.10.86-1.el8                                    CTSI_EPEL8_epel8_base           

Hey @caseybea, that is good news. It's likely that there were multiple copies of the rpm (which can happen if, say, a kickstart repo includes an rpm using a sha256 checksum and an os repo includes one using a sha1 checksum).

I would continue with the migration. After the switchover, re-syncing should cause anything that was missed (if anything) to be redownloaded, provided it's available in the upstream repo. Worst case, you would have to republish/promote any content views if you are using them.
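
A quick way to gauge what's left before and after the run, in foreman-rake console, using the same model the queries above use:

# rpms that have not been migrated to Pulp 3 yet
Katello::Rpm.where(migrated_pulp3_href: nil).count
# rpms the migration flagged as missing/corrupt
Katello::Rpm.where(missing_from_migration: true).count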

How can I find which rpms are problematic, and with which command? I want to remove the affected product from Foreman.

Dang, wish I could.

I’m now stuck here:

@jost if you run 'foreman-maintain content migration-stats' you should see some output like:

Corrupted or missing content has been detected, you can examine the list of content in /tmp/unmigratable_content-20210325-28817-9cyo58 and take action by either:

That directory should have a list of all the rpms that were corrupt/missing.
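
Once you have that directory, plain shell is enough to size up the list (assuming one rpm filename per line, which is the format I'd expect):

wc -l /tmp/unmigratable_content-20210325-28817-9cyo58/*
# crude grouping by package name (everything before the first dash)
awk -F- '{print $1}' /tmp/unmigratable_content-20210325-28817-9cyo58/* | sort | uniq -c | sort -rn | head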

@caseybea can you check 'journalctl -u pulpcore-worker@*' and see if you can find a relevant traceback?

Did that and posted in the relevant thread.

I didn't get any error from 'foreman-maintain content migration-stats' and /tmp/unmigratable_content-20210528-23274-hfq2wr/ is empty, but step 2 fails:

foreman-maintain content prepare
Running Prepare content for Pulp 3
================================================================================
Prepare content for Pulp 3:
Rubocop not loaded.
Starting task.
2021-05-28 14:27:34 +0200: Processing Pulp 2 repositories, importers, distributors 47/102
2021-05-28 14:30:34 +0200: Initial Migration steps complete.
2021-05-28 14:30:54 +0200: Pre-migrating Pulp 2 RPM content (detail info) 0/3933
2021-05-28 14:31:24 +0200: Pre-migrating Pulp 2 RPM content (detail info) 3923/3928
2021-05-28 14:32:34 +0200: Initial Migration steps complete.
2021-05-28 14:33:54 +0200: Pre-migrating Pulp 2 ERRATUM content (general info) 0/15973
2021-05-28 14:34:24 +0200: Pre-migrating Pulp 2 ERRATUM content (general info) 9083/24415
2021-05-28 14:36:14 +0200: Pre-migrating Pulp 2 ERRATUM content (detail info) 39840/54569
2021-05-28 14:36:24 +0200: Pre-migrating Pulp 2 ERRATUM content (detail info) 44935/59594
2021-05-28 14:36:34 +0200: Pre-migrating Pulp 2 ERRATUM content (detail info) 50042/64629
2021-05-28 14:37:34 +0200: Pre-migrating Pulp 2 ERRATUM content (general info) 67485/81811
2021-05-28 14:41:45 +0200: Pre-migrating Pulp 2 ERRATUM content (detail info) 157312/170326
2021-05-28 14:43:15 +0200: Pre-migrating Pulp 2 ERRATUM content (general info) 186797/199364
2021-05-28 14:45:35 +0200: Pre-migrating Pulp 2 ERRATUM content (general info) 238699/250494
2021-05-28 14:46:55 +0200: Pre-migrating Pulp 2 ERRATUM content (detail info) 266141/277540
2021-05-28 14:48:25 +0200: Pre-migrating Pulp 2 ERRATUM content (general info) 300773/311537
2021-05-28 14:55:26 +0200: Pre-migrating Pulp 2 ERRATUM content (detail info) 431759/435828
2021-05-28 15:04:27 +0200: Initial Migration steps complete.
2021-05-28 15:06:47 +0200: Initial Migration steps complete.
2021-05-28 15:06:57 +0200: Initial Migration steps complete.
2021-05-28 15:30:59 +0200: Initial Migration steps complete.
2021-05-28 15:36:39 +0200: Initial Migration steps complete.
2021-05-28 15:37:39 +0200: Initial Migration steps complete.
2021-05-28 15:50:40 +0200: Initial Migration steps complete.
2021-05-28 16:02:01 +0200: Initial Migration steps complete.
2021-05-28 16:05:01 +0200: Initial Migration steps complete.
2021-05-28 16:12:52 +0200: Initial Migration steps complete.
2021-05-28 16:21:52 +0200: Initial Migration steps complete.
2021-05-28 16:57:55 +0200: Initial Migration steps complete.
2021-05-28 17:12:06 +0200: Initial Migration steps complete.
2021-05-28 17:42:09 +0200: Initial Migration steps complete.
2021-05-28 17:46:49 +0200: Initial Migration steps complete.
2021-05-28 18:01:10 +0200: Initial Migration steps complete.
2021-05-28 18:11:11 +0200: Migrating rpm content to Pulp 3 rpm 0/116425
2021-05-28 18:18:52 +0200: Migrating rpm content to Pulp 3 rpm 0/116425
2021-05-28 18:23:12 +0200: Migrating rpm content to Pulp 3 rpm 0/116425
2021-05-28 19:46:50 +0200: Migrating rpm content to Pulp 3 rpm 0/116425
2021-05-28 19:51:10 +0200: Migrating rpm content to Pulp 3 rpm 0/116425
2021-05-28 20:00:11 +0200: Migrating rpm content to Pulp 3 rpm 0/116425
2021-05-28 20:07:21 +0200: Migrating rpm content to Pulp 3 rpm 0/116425
2021-05-28 20:43:55 +0200: Migrating rpm content to Pulp 3 rpm 0/116425
2021-05-28 20:50:45 +0200: Migrating rpm content to Pulp 3 rpm 0/116425
2021-05-28 21:00:36 +0200: Migrating rpm content to Pulp 3 rpm 0/116425Migration failed, You will want to investigate: https://cvrtforeman01.acme.com/foreman_tasks/tasks/8d38f826-d14f-4e8a-ac38-e2ba058078e3
rake aborted!
ForemanTasks::TaskError: Task 8d38f826-d14f-4e8a-ac38-e2ba058078e3:
/opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.18.2.1/lib/katello/tasks/pulp3_migration.rake:33:in `block (2 levels) in <top (required)>'
/opt/rh/rh-ruby25/root/usr/share/gems/gems/rake-12.3.0/exe/rake:27:in `<top (required)>'
Tasks: TOP => katello:pulp3_migration
(See full trace by running task with --trace)
                                                                      [FAIL]
Failed executing foreman-rake katello:pulp3_migration, exit status 1
--------------------------------------------------------------------------------
Scenario [Prepare content for Pulp 3] failed.

The following steps ended up in failing state:

  [content-prepare]

Resolve the failed steps and rerun
the command. In case the failures are false positives,
use --whitelist="content-prepare"

Pulp errors; the failure happened at 2021-05-28 21:00:36:
journalctl -u pulpcore-worker@*

May 28 20:47:07 cvrtforeman01.acme.com pulpcore-worker-3[13578]: pulp: rq.worker:INFO: Cleaning registries for queue: 13578@cvrtforeman01.acme.com
May 28 21:00:43 cvrtforeman01.acme.com systemd[1]: Stopping Pulp RQ Worker...
May 28 21:00:43 cvrtforeman01.acme.com systemd[1]: Stopping Pulp RQ Worker...
May 28 21:00:43 cvrtforeman01.acme.com pulpcore-worker-2[13576]: pulp: pulpcore.tasking.services.worker_watcher:INFO: Cleaning up shutdown worker '13576@cvrtforeman01.acme.com'.
May 28 21:00:43 cvrtforeman01.acme.com pulpcore-worker-3[13578]: pulp: pulpcore.tasking.services.worker_watcher:INFO: Cleaning up shutdown worker '13578@cvrtforeman01.acme.com'.
May 28 21:00:43 cvrtforeman01.acme.com systemd[1]: Stopping Pulp RQ Worker...
May 28 21:00:43 cvrtforeman01.acme.com systemd[1]: Stopping Pulp RQ Worker...
May 28 21:00:43 cvrtforeman01.acme.com pulpcore-worker-1[13580]: pulp: pulpcore.tasking.services.worker_watcher:INFO: Cleaning up shutdown worker '13580@cvrtforeman01.acme.com'.
May 28 21:00:43 cvrtforeman01.acme.com pulpcore-worker-4[13581]: pulp: pulpcore.tasking.services.worker_watcher:INFO: Cleaning up shutdown worker '13581@cvrtforeman01.acme.com'.
May 28 21:00:43 cvrtforeman01.acme.com pulpcore-worker-3[13578]: pulp: rq.worker:INFO: Warm shut down requested
May 28 21:00:43 cvrtforeman01.acme.com pulpcore-worker-4[13581]: pulp: rq.worker:INFO: Warm shut down requested
May 28 21:00:43 cvrtforeman01.acme.com pulpcore-worker-2[13576]: pulp: rq.worker:INFO: Warm shut down requested
May 28 21:00:43 cvrtforeman01.acme.com pulpcore-worker-1[13580]: pulp: rq.worker:INFO: Cleaning registries for queue: 13580@cvrtforeman01.acme.com
May 28 21:00:43 cvrtforeman01.acme.com pulpcore-worker-1[13580]: pulp: rq.worker:INFO: 13580@cvrtforeman01.acme.com: 95148108-27d4-4ccc-a331-5be509a03f37
May 28 21:00:43 cvrtforeman01.acme.com pulpcore-worker-1[13580]: pulp: pulpcore.tasking.tasks:ERROR: The task 775e6fbe-74c0-4784-8a63-4983ebd13f75 exited immediately for some reason. Marking as failed. Check the logs for more details
May 28 21:00:43 cvrtforeman01.acme.com pulpcore-worker-1[13580]: pulp: rq.worker:INFO: 13580@cvrtforeman01.acme.com: Job OK (95148108-27d4-4ccc-a331-5be509a03f37)
May 28 21:00:44 cvrtforeman01.acme.com systemd[1]: Stopped Pulp RQ Worker.
May 28 21:00:44 cvrtforeman01.acme.com systemd[1]: Stopped Pulp RQ Worker.
May 28 21:00:44 cvrtforeman01.acme.com systemd[1]: Stopped Pulp RQ Worker.
May 28 21:02:13 cvrtforeman01.acme.com systemd[1]: pulpcore-worker@1.service stop-sigterm timed out. Killing.
May 28 21:02:13 cvrtforeman01.acme.com systemd[1]: pulpcore-worker@1.service: main process exited, code=killed, status=9/KILL
May 28 21:02:13 cvrtforeman01.acme.com systemd[1]: Stopped Pulp RQ Worker.
May 28 21:02:13 cvrtforeman01.acme.com systemd[1]: Unit pulpcore-worker@1.service entered failed state.
May 28 21:02:13 cvrtforeman01.acme.com systemd[1]: pulpcore-worker@1.service failed.
May 28 21:04:50 cvrtforeman01.acme.com systemd[1]: Started Pulp RQ Worker.
May 28 21:04:50 cvrtforeman01.acme.com systemd[1]: Started Pulp RQ Worker.
May 28 21:04:51 cvrtforeman01.acme.com systemd[1]: Started Pulp RQ Worker.
May 28 21:04:51 cvrtforeman01.acme.com systemd[1]: Started Pulp RQ Worker.
May 28 21:05:04 cvrtforeman01.acme.com pulpcore-worker-1[17134]: pulp: pulpcore.tasking.services.worker_watcher:INFO: Cleaning up shutdown worker '17134@cvrtforeman01.acme.com'.
May 28 21:05:04 cvrtforeman01.acme.com pulpcore-worker-4[17138]: pulp: pulpcore.tasking.services.worker_watcher:INFO: Cleaning up shutdown worker '17138@cvrtforeman01.acme.com'.
May 28 21:05:04 cvrtforeman01.acme.com pulpcore-worker-2[17126]: pulp: pulpcore.tasking.services.worker_watcher:INFO: Cleaning up shutdown worker '17126@cvrtforeman01.acme.com'.
May 28 21:05:04 cvrtforeman01.acme.com pulpcore-worker-3[17127]: pulp: pulpcore.tasking.services.worker_watcher:INFO: Cleaning up shutdown worker '17127@cvrtforeman01.acme.com'.
May 28 21:05:05 cvrtforeman01.acme.com pulpcore-worker-1[17134]: pulp: rq.worker:INFO: Worker rq:worker:17134@cvrtforeman01.acme.com: started, version 1.5.2
May 28 21:05:05 cvrtforeman01.acme.com pulpcore-worker-1[17134]: pulp: rq.worker:INFO: *** Listening on 17134@cvrtforeman01.acme.com...
May 28 21:05:05 cvrtforeman01.acme.com pulpcore-worker-3[17127]: pulp: rq.worker:INFO: Worker rq:worker:17127@cvrtforeman01.acme.com: started, version 1.5.2
May 28 21:05:05 cvrtforeman01.acme.com pulpcore-worker-3[17127]: pulp: rq.worker:INFO: *** Listening on 17127@cvrtforeman01.acme.com...
May 28 21:05:05 cvrtforeman01.acme.com pulpcore-worker-1[17134]: pulp: pulpcore.tasking.services.worker_watcher:INFO: New worker '17134@cvrtforeman01.acme.com' discovered
May 28 21:05:05 cvrtforeman01.acme.com pulpcore-worker-3[17127]: pulp: pulpcore.tasking.services.worker_watcher:INFO: New worker '17127@cvrtforeman01.acme.com' discovered
May 28 21:05:05 cvrtforeman01.acme.com pulpcore-worker-2[17126]: pulp: rq.worker:INFO: Worker rq:worker:17126@cvrtforeman01.acme.com: started, version 1.5.2
May 28 21:05:05 cvrtforeman01.acme.com pulpcore-worker-2[17126]: pulp: rq.worker:INFO: *** Listening on 17126@cvrtforeman01.acme.com...
May 28 21:05:05 cvrtforeman01.acme.com pulpcore-worker-4[17138]: pulp: rq.worker:INFO: Worker rq:worker:17138@cvrtforeman01.acme.com: started, version 1.5.2
May 28 21:05:05 cvrtforeman01.acme.com pulpcore-worker-4[17138]: pulp: rq.worker:INFO: *** Listening on 17138@cvrtforeman01.acme.com...
May 28 21:05:05 cvrtforeman01.acme.com pulpcore-worker-1[17134]: pulp: pulpcore.tasking.services.worker_watcher:ERROR: There are 0 pulpcore-resource-manager processes running. Pulp will not operate correctly without at least one pulpcore-resource-mananger process running.
May 28 21:05:05 cvrtforeman01.acme.com pulpcore-worker-4[17138]: pulp: pulpcore.tasking.services.worker_watcher:INFO: New worker '17138@cvrtforeman01.acme.com' discovered
May 28 21:05:05 cvrtforeman01.acme.com pulpcore-worker-3[17127]: pulp: pulpcore.tasking.services.worker_watcher:ERROR: There are 0 pulpcore-resource-manager processes running. Pulp will not operate correctly without at least one pulpcore-resource-mananger process running.
May 28 21:05:05 cvrtforeman01.acme.com pulpcore-worker-1[17134]: pulp: rq.worker:INFO: Cleaning registries for queue: 17134@cvrtforeman01.acme.com
May 28 21:05:05 cvrtforeman01.acme.com pulpcore-worker-3[17127]: pulp: rq.worker:INFO: Cleaning registries for queue: 17127@cvrtforeman01.acme.com
May 28 21:05:05 cvrtforeman01.acme.com pulpcore-worker-2[17126]: pulp: pulpcore.tasking.services.worker_watcher:INFO: New worker '17126@cvrtforeman01.acme.com' discovered
May 28 21:05:05 cvrtforeman01.acme.com pulpcore-worker-4[17138]: pulp: pulpcore.tasking.services.worker_watcher:ERROR: There are 0 pulpcore-resource-manager processes running. Pulp will not operate correctly without at least one pulpcore-resource-mananger process running.
May 28 21:05:05 cvrtforeman01.acme.com pulpcore-worker-3[17127]: pulp: pulpcore.tasking.services.worker_watcher:ERROR: There are 0 pulpcore-resource-manager processes running. Pulp will not operate correctly without at least one pulpcore-resource-mananger process running.
May 28 21:05:05 cvrtforeman01.acme.com pulpcore-worker-4[17138]: pulp: rq.worker:INFO: Cleaning registries for queue: 17138@cvrtforeman01.acme.com
May 28 21:05:05 cvrtforeman01.acme.com pulpcore-worker-2[17126]: pulp: pulpcore.tasking.services.worker_watcher:ERROR: There are 0 pulpcore-resource-manager processes running. Pulp will not operate correctly without at least one pulpcore-resource-mananger process running.
May 28 21:05:05 cvrtforeman01.acme.com pulpcore-worker-2[17126]: pulp: rq.worker:INFO: Cleaning registries for queue: 17126@cvrtforeman01.acme.com
May 28 21:05:05 cvrtforeman01.acme.com pulpcore-worker-1[17134]: pulp: pulpcore.tasking.services.worker_watcher:ERROR: There are 0 pulpcore-resource-manager processes running. Pulp will not operate correctly without at least one pulpcore-resource-mananger process running.
May 28 21:05:05 cvrtforeman01.acme.com pulpcore-worker-2[17126]: pulp: pulpcore.tasking.services.worker_watcher:ERROR: There are 0 pulpcore-resource-manager processes running. Pulp will not operate correctly without at least one pulpcore-resource-mananger process running.
May 28 21:05:05 cvrtforeman01.acme.com pulpcore-worker-4[17138]: pulp: pulpcore.tasking.services.worker_watcher:ERROR: There are 0 pulpcore-resource-manager processes running. Pulp will not operate correctly without at least one pulpcore-resource-mananger process running.
May 28 21:06:45 cvrtforeman01.acme.com pulpcore-worker-3[17127]: pulp: pulpcore.tasking.services.worker_watcher:ERROR: Worker '13580@cvrtforeman01.acme.com' has gone missing, removing from list of workers
May 28 21:06:45 cvrtforeman01.acme.com pulpcore-worker-3[17127]: pulp: pulpcore.tasking.services.worker_watcher:ERROR: The worker named 13580@cvrtforeman01.acme.com is missing. Canceling the tasks in its queue.
May 28 21:06:45 cvrtforeman01.acme.com pulpcore-worker-3[17127]: pulp: pulpcore.tasking.services.worker_watcher:ERROR: There are 0 pulpcore-resource-manager processes running. Pulp will not operate correctly without at least one pulpcore-resource-mananger process running.
May 28 21:06:45 cvrtforeman01.acme.com pulpcore-worker-1[17134]: pulp: pulpcore.tasking.services.worker_watcher:ERROR: There are 0 pulpcore-resource-manager processes running. Pulp will not operate correctly without at least one pulpcore-resource-mananger process running.
May 28 21:06:45 cvrtforeman01.acme.com pulpcore-worker-4[17138]: pulp: pulpcore.tasking.services.worker_watcher:ERROR: There are 0 pulpcore-resource-manager processes running. Pulp will not operate correctly without at least one pulpcore-resource-mananger process running.
May 28 21:06:45 cvrtforeman01.acme.com pulpcore-worker-2[17126]: pulp: pulpcore.tasking.services.worker_watcher:ERROR: There are 0 pulpcore-resource-manager processes running. Pulp will not operate correctly without at least one pulpcore-resource-mananger process running.
May 28 21:07:42 cvrtforeman01.acme.com pulpcore-worker-3[17127]: pulp: rq.worker:INFO: 17127@cvrtforeman01.acme.com: 7f6e4de8-2418-4801-a553-3d3bab208991
May 28 21:07:42 cvrtforeman01.acme.com pulpcore-worker-3[17127]: pulp: pulp_2to3_migration.pulp2.connection:INFO: Attempting to connect to localhost:27017
May 28 21:25:08 cvrtforeman01.acme.com pulpcore-worker-4[17138]: pulp: rq.worker:INFO: Cleaning registries for queue: 17138@cvrtforeman01.acme.com
May 28 21:25:08 cvrtforeman01.acme.com pulpcore-worker-2[17126]: pulp: rq.worker:INFO: Cleaning registries for queue: 17126@cvrtforeman0

Where can I check the reason this failed?

Two more questions:

1. When it fails, I always get this error: "File icons.tar.gz doesn't exists"

Jun 01 11:33:39 cvrtforeman01. rq[7724]: cr_repomd_record_fill: File icons.tar.gz doesn't exists
Jun 01 11:33:39 cvrtforeman01. pulpcore-worker-3[23656]: pulp: rq.worker:ERROR: Traceback (most recent call last):
Jun 01 11:33:39 cvrtforeman01. pulpcore-worker-3[23656]: File "/usr/lib/python3.6/site-packages/rq/worker.py", line 936, in perform_job
Jun 01 11:33:39 cvrtforeman01. pulpcore-worker-3[23656]: rv = job.perform()
Jun 01 11:33:39 cvrtforeman01. pulpcore-worker-3[23656]: File "/usr/lib/python3.6/site-packages/rq/job.py", line 684, in perform
2. What exactly does this mean: ::Katello::Rpm.where(migrated_pulp3_href: nil)?

foreman-rake console
Rubocop not loaded.

Loading production environment (Rails 6.0.3.4)

irb(main):001:0> ::Katello::Rpm.where(migrated_pulp3_href: nil)

=> #<ActiveRecord::Relation [#<Katello::Rpm id: 199099, pulp_id: "1fbe1b1c-d647-4b89-932f-13ad6d263de5", created_at: "2021-05-04 23:17:40", updated_at: "2021-05-04 23:17:40", name: "voms-server", version: "2.1.0", release: "0.19.rc1.el8", arch: "x86_64", epoch: "0", filename: "voms-server-2.1.0-0.19.rc1.el8.x86_64.rpm", sourcerpm: "voms-2.1.0-0.19.rc1.el8.src.rpm", checksum: "fbea7b83ebc8fc1f22b0c24d5216d968ad6c3d33d1d5084913...", version_sortable: "01-2.01-1.01-0", release_sortable: "01-0.02-19.$rc.01-1.$el.01-8", summary: "Virtual Organization Membership Service Server", nvra: "voms-server-2.1.0-0.19.rc1.el8.x86_64", modular: false, migrated_pulp3_href: nil, evr: "(0,\"{\"\"(2,)\"\",\"\"(1,)\"\",\"\"(0,)\"\"}\",\"{\"\"(0,)\"\",\"\"(19...", missing_from_migration: false, ignore_missing_from_migration: false>, #<Katello::Rpm id: 199101, pulp_id: "22c3ba28-04bf-403a-aa79-a8cff37cd7f8", created_at: "2021-05-04 23:17:40", updated_at: "2021-05-04 23:17:40", name: "libjwt", version: "1.12.1", release: "7.el8", arch: "x86_64", epoch: "0", filename: "libjwt-1.12.1-7.el8.x86_64.rpm", sourcerpm: "libjwt-1.12.1-7.el8.src.rpm", checksum: "04067ea32bb5be09d19332f7970baf2c1c71288e75ca870ad8...", version_sortable: "01-1.02-12.01-1", release_sortable: "01-7.$el.01-8", summary: "A Javascript Web Token library in C", nvra: "libjwt-1.12.1-7.el8.x86_64", modular: false, migrated_pulp3_href: nil, evr: "(0,\"{\"\"(1,)\"\",\"\"(12,)\"\",\"\"(1,)\"\"}\",\"{\"\"(7,)\"\",\"\"(0...", missing_from_migration: false, ignore_missing_from_migration: false>, #<Katello::Rpm id: 199102, pulp_id: "34a51679-98ea-4529-873f-e8096150c5c4", created_at: "2021-05-04 23:17:41", updated_at: "2021-05-04 23:17:41", name: "voms-clients-cpp", version: "2.1.0", release: "0.19.rc1.el8", arch: "x86_64", epoch: "0", filename: "voms-clients-cpp-2.1.0-0.19.rc1.el8.x86_64.rpm", sourcerpm: "voms-2.1.0-0.19.rc1.el8.src.rpm", checksum: "1754c3a2b66772c05618ddf549955f5e1742eed02b86334d29...", version_sortable: "01-2.01-1.01-0", release_sortable: "01-0.02-19.$rc.01-1.$el.01-8", summary: "Virtual Organization Membership Service Clients", nvra: "voms-clients-cpp-2.1.0-0.19.rc1.el8.x86_64", modular: false, migrated_pulp3_href: nil, evr: "(0,\"{\"\"(2,)\"\",\"\"(1,)\"\",\"\"(0,)\"\"}\",\"{\"\"(0,)\"\",\"\"(19...", missing_from_migration: false, ignore_missing_from_migration: false>, #<Katello::Rpm id: 199103, pulp_id: "4ce89f4e-6c90-490a-ab53-b6110661b22b", created_at: "2021-05-04 23:17:42", updated_at: "2021-05-04 23:17:42", name: "osslsigncode", version: "2.1", release: "3.el8", arch: "x86_64", epoch: "0", filename: "osslsigncode-2.1-3.el8.x86_64.rpm", sourcerpm: "osslsigncode-2.1-3.el8.src.rpm", checksum: "a938097acd769aeea6394270b0692641113ddc1e1b4c8b92d4...", version_sortable: "01-2.01-1", release_sortable: "01-3.$el.01-8", summary: "OpenSSL based Authenticode signing for PE/MSI/Java...", nvra: "osslsigncode-2.1-3.el8.x86_64", modular: false, migrated_pulp3_href: nil, evr: "(0,\"{\"\"(2,)\"\",\"\"(1,)\"\"}\",\"{\"\"(3,)\"\",\"\"(0,el)\"\",\"\"(...", missing_from_migration: false, ignore_missing_from_migration: false>, #<Katello::Rpm id: 199104, pulp_id: "4f72a234-e873-4a35-ab07-38a809c438a5", created_at: "2021-05-04 23:17:42", updated_at: "2021-05-04 23:17:42", name: "voms-devel", version: "2.1.0", release: "0.19.rc1.el8", arch: "x86_64", epoch: "0", filename: "voms-devel-2.1.0-0.19.rc1.el8.x86_64.rpm", sourcerpm: "voms-2.1.0-0.19.rc1.el8.src.rpm", checksum: "89d7595c01ac22771638de93365d6fcb6ddcd4f8ad8a017695...", version_sortable: 
"01-2.01-1.01-0", release_sortable: "01-0.02-19.$rc.01-1.$el.01-8", summary: "Virtual Organization Membership Service Developmen...", nvra: "voms-devel-2.1.0-0.19.rc1.el8.x86_64", modular: false, migrated_pulp3_href: nil, evr: "(0,\"{\"\"(2,)\"\",\"\"(1,)\"\",\"\"(0,)\"\"}\",\"{\"\"(0,)\"\",\"\"(19...", missing_from_migration: false, ignore_missing_from_migration: false>, #<Katello::Rpm id: 199105, pulp_id: "7c21fb63-944e-416d-91ca-f64e56b496da", created_at: "2021-05-04 23:17:43", updated_at: "2021-05-04 23:17:43", name: "libjwt-devel", version: "1.12.1", release: "7.el8", arch: "x86_64", epoch: "0", filename: "libjwt-devel-1.12.1-7.el8.x86_64.rpm", sourcerpm: "libjwt-1.12.1-7.el8.src.rpm", checksum: "7e403e3101daa5c07fbacf026cd530109b44a0597269873df6...", version_sortable: "01-1.02-12.01-1", release_sortable: "01-7.$el.01-8", summary: "Development files for libjwt", nvra: "libjwt-devel-1.12.1-7.el8.x86_64", modular: false, migrated_pulp3_href: nil, evr: "(0,\"{\"\"(1,)\"\",\"\"(12,)\"\",\"\"(1,)\"\"}\",\"{\"\"(7,)\"\",\"\"(0...", missing_from_migration: false, ignore_missing_from_migration: false>, #<Katello::Rpm id: 199106, pulp_id: "8173b5b4-f716-404f-aba7-585eaaacfb3e", created_at: "2021-05-04 23:17:43", updated_at: "2021-05-04 23:17:43", name: "voms", version: "2.1.0", release: "0.19.rc1.el8", arch: "x86_64", epoch: "0", filename: "voms-2.1.0-0.19.rc1.el8.x86_64.rpm", sourcerpm: "voms-2.1.0-0.19.rc1.el8.src.rpm", checksum: "c4dcb20e9ea5a1ed0695794768318a7a4d02f825bb76fae68a...", version_sortable: "01-2.01-1.01-0", release_sortable: "01-0.02-19.$rc.01-1.$el.01-8", summary: "Virtual Organization Membership Service", nvra: "voms-2.1.0-0.19.rc1.el8.x86_64", modular: false, migrated_pulp3_href: nil, evr: "(0,\"{\"\"(2,)\"\",\"\"(1,)\"\",\"\"(0,)\"\"}\",\"{\"\"(0,)\"\",\"\"(19...", missing_from_migration: false, ignore_missing_from_migration: false>, #<Katello::Rpm id: 199107, pulp_id: "84525119-74ac-47f4-b34a-f30d07a688bf", created_at: "2021-05-04 23:17:43", updated_at: "2021-05-04 23:17:43", name: "did", version: "0.18", release: "1.el8", arch: "noarch", epoch: "0", filename: "did-0.18-1.el8.noarch.rpm", sourcerpm: "did-0.18-1.el8.src.rpm", checksum: "49bb1168ad9689f6c014a4f22b59e958511d4bbd32aeecf0de...", version_sortable: "01-0.02-18", release_sortable: "01-1.$el.01-8", summary: "What did you do last week, month, year?", nvra: "did-0.18-1.el8.noarch", modular: false, migrated_pulp3_href: nil, evr: "(0,\"{\"\"(0,)\"\",\"\"(18,)\"\"}\",\"{\"\"(1,)\"\",\"\"(0,el)\"\",\"\"...", missing_from_migration: false, ignore_missing_from_migration: false>, #<Katello::Rpm id: 199108, pulp_id: "89854fb0-968e-46ec-819b-81a7a3798e35", created_at: "2021-05-04 23:17:44", updated_at: "2021-05-04 23:17:44", name: "voms-doc", version: "2.1.0", release: "0.19.rc1.el8", arch: "noarch", epoch: "0", filename: "voms-doc-2.1.0-0.19.rc1.el8.noarch.rpm", sourcerpm: "voms-2.1.0-0.19.rc1.el8.src.rpm", checksum: "827e12e5354e7e1dbd2d288ad88084d04ae2df783789609306...", version_sortable: "01-2.01-1.01-0", release_sortable: "01-0.02-19.$rc.01-1.$el.01-8", summary: "Virtual Organization Membership Service Documentat...", nvra: "voms-doc-2.1.0-0.19.rc1.el8.noarch", modular: false, migrated_pulp3_href: nil, evr: "(0,\"{\"\"(2,)\"\",\"\"(1,)\"\",\"\"(0,)\"\"}\",\"{\"\"(0,)\"\",\"\"(19...", missing_from_migration: false, ignore_missing_from_migration: false>, #<Katello::Rpm id: 199112, pulp_id: "cd30a592-dd73-4585-b435-062693a18da2", created_at: "2021-05-04 23:17:46", updated_at: "2021-05-04 23:17:46", name: 
"python3-unicodecsv", version: "0.14.1", release: "23.el8", arch: "noarch", epoch: "0", filename: "python3-unicodecsv-0.14.1-23.el8.noarch.rpm", sourcerpm: "python-unicodecsv-0.14.1-23.el8.src.rpm", checksum: "a5b93b5bc72a61233e6a241788ed5b532405120a26d2784495...", version_sortable: "01-0.02-14.01-1", release_sortable: "02-23.$el.01-8", summary: "Drop-in replacement for Python 2.7's csv module wh...", nvra: "python3-unicodecsv-0.14.1-23.el8.noarch", modular: false, migrated_pulp3_href: nil, evr: "(0,\"{\"\"(0,)\"\",\"\"(14,)\"\",\"\"(1,)\"\"}\",\"{\"\"(23,)\"\",\"\"(...", missing_from_migration: false, ignore_missing_from_migration: false>, ...]>

irb(main):002:0>

For example, this package voms-server-2.1.0-0.19.rc1.el8.x86_64: does it mean that this package from the EPEL 8 repo is missing, or that the EPEL 8 repo itself is causing problems?

This error (File icons.tar.gz doesn't exists) was also seen here: Pulp 2 to Pulp 3 migration failed - #10 by AdamR

I don't understand what's going on, but I am reaching out to the pulp team on IRC to try to understand whether it's a bug or what the cause might be.

This issue seems tied to very specific SUSE repositories.
The pulp team has identified an issue: Issue #8275: ComplexRepoMigration fails with "file doesn't exists or not a regular file" - Migration Plugin - Pulp