Moving forward with Pulpcore RPM Packaging

I’m posting on behalf of the @packaging team. Today we package Pulp-related RPMs on demand when Katello asks for a new release. This has worked in the past, but it is heavy work: it usually takes weeks and is error-prone, and our automation currently lacks the ability to package Python projects that only have a pyproject.toml.

For Pulp 3.28 packaging we decided to package poetry, hatch, and flit; this allows us to build packages that don’t have setup.py/setup.cfg files. Automation can be created to inspect a package and decide whether to use pyp2rpm or pyp2spec; I used both during Pulp 3.28 branching.
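As a rough illustration (not the exact commands used for the 3.28 branching), generating a spec for one release could look like the sketch below; the package name, version, and flags are only examples, and flag names may differ between tool versions:

```bash
# Hedged sketch: generate spec files for a release, assuming pyp2rpm and
# pyp2spec are installed. Flag names may vary between tool versions.
set -euo pipefail

PKG="pulpcore"    # example package name
VER="3.28.0"      # example version

# pyp2rpm handles classic setup.py/setup.cfg projects and writes the
# generated spec to stdout by default.
pyp2rpm "${PKG}" -v "${VER}" > "python-${PKG}.spec"

# pyp2spec understands pyproject.toml-only projects and writes the
# generated spec into the current directory.
pyp2spec "${PKG}" --version "${VER}"
```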

My plan for the near future is to create nightly Pulp RPM packages, the same way we do today for Foreman/Katello. This will let us stay in sync with Pulp, or at least try to, until we have the automation in place. The main goal is to reduce the time between Katello asking for a branch and @packaging delivering the RPMs.

After talking with @ehelms we decided that Copr is the best place to build these nightlies right now, since we already have an effort underway to move from Koji to Copr. I went ahead and created the project pulpcore-nightly-staging.
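For reference, a minimal sketch of what the Copr side could look like with copr-cli; the chroots and SRPM name below are only placeholders:

```bash
# Hedged sketch: create the Copr project once and push a build into it.
set -euo pipefail

PROJECT="pulpcore-nightly-staging"

# Create the project (repeat --chroot for every build target you want).
copr-cli create "${PROJECT}" \
  --chroot epel-9-x86_64 \
  --chroot centos-stream-9-x86_64

# Submit an SRPM produced by the packaging tooling (file name is an example).
copr-cli build "${PROJECT}" python-pulpcore-3.28.0-1.src.rpm
```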

We are open to ideas and contributions, especially on the automation side.


I don’t have much to add from a tech perspective, but I think this would be a great thing to have for future Katello + Pulp integration. It’s easy for Katello to test against a new Pulp – packaging for it is the hardest part by far.

Will there be much overhead in maintaining the nightly RPMs before automation is set up?


Probably much less overhead compared to branching the way we do right now. Having nightlies will allow us to stay only one or two weeks behind a new X release from Pulp.

The automation will help a lot with dependency solving, especially if we manage to get some sort of lock file like poetry.lock.
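For example, if upstream ever publishes a poetry.lock, it could be flattened into pinned requirements for the spec generation; a small sketch assuming Poetry is available (recent versions need the poetry-plugin-export plugin for this command):

```bash
# Hedged sketch: turn a poetry.lock into fully pinned requirements that
# RPM dependency generation could consume.
poetry export -f requirements.txt --without-hashes -o requirements.txt
```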

In our internal packaging (which is built on top of, and works because of, all the work you do on the upstream packaging), we do have some automation for things like packaging arbitrary Pulp component Z releases using the Python package from PyPI, and fully automatic changes to the upstream SPEC file. While this is not in a state to be used as-is for any upstream packaging workflows, and I don’t have the bandwidth to meaningfully contribute to your nightly build efforts in the short term, I will keep an eye on these developments. My goal is to keep looking for opportunities to work together more closely on the upstream packaging in the future.

I have proposed a session at PulpCon, “Workshop: Ready to use Pulp CI jobs for everyone?”, that aims in the same direction: assessing whether there is an opportunity to move any of our internal CI work upstream and to collaborate more closely on packaging. I would be glad if you could join that session (assuming it happens). I am certainly looking forward to your talk on PEP-517.

Does that include the ability to build packages that use setuptools with only a pyproject.toml and no setup.py?

It took me a while to understand what this was about. Do I get it right that you want to use automation to do bumping, just like we do in foreman-packaging?

For context: today that works by using the bump_packages workflow. The short summary is that for both deb/develop and rpm/develop it runs a script to list the updates (deb and rpm). Both build a matrix and we then call bumping on that matrix, again using scripts (deb and rpm). Those then submit a PR for each update.
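Very roughly, and purely as an illustration (the helper script names below are placeholders, not files that exist in foreman-packaging), the shape of one bump run is:

```bash
# Illustrative only: the real workflow lives in GitHub Actions; the
# helper scripts here are hypothetical stand-ins for the listing and
# bumping steps described above.
set -euo pipefail

# Step 1: list packages that have a newer version available upstream.
./list-rpm-updates > updates.txt          # hypothetical helper

# Step 2: bump each package on its own branch and open a PR for it.
while read -r pkg version; do
  git checkout -b "bump-${pkg}-${version}" rpm/develop
  ./bump-rpm-package "${pkg}" "${version}" # hypothetical helper
  git commit -am "Bump ${pkg} to ${version}"
  gh pr create --fill --base rpm/develop
done < updates.txt
```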

Looking at pulpcore-packaging, there is already some automation, but _generate_deps.sh hasn’t been maintained since 3.17.

So splitting up the problem: step 1 is a way to update requirements.txt. This can be manual, or you can consider looser requirements that allow patch versions to be picked up automatically when they show up. Step 2 is writing some GitHub Actions logic to do the updates.
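One possible way to handle step 1, assuming pip-tools is acceptable and the loose ranges live in a requirements.in file (both assumptions, not something the current repository does):

```bash
# Hedged sketch using pip-tools: loose version ranges live in
# requirements.in, and the fully pinned requirements.txt is regenerated
# from them whenever new releases show up.
pip install pip-tools
pip-compile --upgrade --output-file requirements.txt requirements.in
```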

Recently I wrote up a blog post describing the techniques we use for dynamic matrix testing (Advanced matrix testing with GitHub actions - Partial Solutions), but I’m not sure you need it here.

One thing to note is that GitHub only allows scheduled jobs on the default branch.

Yes, for Pulpcore 3.28 branching we already added this option. If you take a look at python-jaraco-classes.spec, we are not using setup.py/setup.cfg; instead we are using pyproject.toml.
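For anyone curious, here is a minimal, made-up sketch (not the actual jaraco.classes metadata) of a setuptools project with no setup.py/setup.cfg, built through the standard PEP 517 front end, which is the same build path the pyproject RPM macros drive during an RPM build:

```bash
# Hedged sketch: a setuptools project defined only by pyproject.toml.
# All names and versions below are invented for illustration.
set -euo pipefail

mkdir -p example_project
echo '__version__ = "0.1.0"' > example_project/__init__.py

cat > pyproject.toml <<'EOF'
[build-system]
requires = ["setuptools>=61"]
build-backend = "setuptools.build_meta"

[project]
name = "example-project"
version = "0.1.0"
EOF

# PEP 517 build front end; produces an sdist and a wheel in dist/.
pip install build
python -m build
```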


Yes, the inspiration is the automation that y’all have on foreman-packaging.

3.18 was the bump to Python 3.9, and at that time we started to see a lot of new packages that only used pyproject.toml. That made this automation complicated to use; in the end I was only using it to generate the list of packages to update, because it would fail if a new package didn’t have a setup.py.

My plan is to do something along those lines: maybe use Dependabot to update requirements.txt for now, and if we manage to get a proper lock file later, iterate on that.

For now the main goal is to update the develop branch weekly or biweekly, to make branching easier when Katello asks for it.