RFC: Dropping support for sqlite


This post stemmed from a comment @ehelms made on another post (GraphQL as api v3), which made a good point that I don’t want to get lost in comment #23 of an unrelated thread.
Today, we run tests on 3 databases: postgresql, mysql, and sqlite.
However, in production, only two of those are supported by foreman core (mysql and pg), and katello only supports pg.
I suggest that we drop support for sqlite completely.


Pros:
  1. Fewer tests to run on Jenkins, freeing up capacity for other tasks.
  2. Less database-specific code that needs to be written.
  3. Ability to clean up some workarounds needed to make sqlite work.
  4. All developers would work against a database that is actually used in production, reducing “it worked on my box” issues and improving day-to-day testing in an environment closer to production.
  5. No wasted effort fixing issues that only occur on sqlite just to get CI green, with no impact for users.
  6. One fewer gem to package into RPMs.


Cons:
  1. Developers currently using sqlite will need to migrate their development environment to mysql or postgresql.
  2. Slightly more complicated initial dev environment setup (unless using forklift, which I think sets up pg anyway).
  3. Running tests locally on sqlite is a bit faster IIRC, so local tests would get slower for developers currently on sqlite.

Please let me know your thoughts or if I missed something, or just use the little heart button below if you agree :slight_smile:


One more addition to Pro #1 - in core, we currently run the test suite 5 times in the matrix, and upgrade tests run on all 3 databases (katello obviously only runs on pg). The full test suite is also executed for plugin PRs IIRC. Granted, integration tests are only run on pg and take longer, but I guesstimate dropping sqlite will reduce average Jenkins load caused by PR testing by around 10-15%.

I have no problem with dropping it, but we should not only ask devs. There are users who might be using it in production. I know we don’t recommend it, but the manual says it’s possible (Foreman :: Manual), and from time to time someone asks how to migrate. Once we have an ack from users, the manual should be updated too.

Is this the reason why PG tests are twice as slow as MySQL and sqlite3?

Anyway, if there are issues with sqlite3 for graphql, is it a good idea to start talking about dropping MySQL as well?

The PG tests are slower because we only run integration tests on PG.

:+1: on getting the users involved. It doesn’t have to be just a poll (where, I guess, from a user’s perspective keeping something is always better than dropping something), but making sure the reasoning behind the suggestion is clearly stated and discussed there before we start going that way.

How should we go about it?
Would it make sense to declare that 1.17 is the last release to support sqlite?
We already have guidelines for migrating from sqlite to postgresql or mysql in the Foreman :: Manual.


I’d keep sqlite for now. We found a way to make graphql work with sqlite (actually, it wasn’t the only limitation) and I think it is handy to have around. We can definitely limit the tests on jenkins, oldest supported ruby version should be enough. But if you want to drop it, I don’t really mind.

Personally I always like the option to quickly get a dev setup installed by using sqlite so if possible, I’d like to keep it.

This is true.

Can you point me to cases where this is needed? Grepping through the source only points me to https://github.com/theforeman/foreman/blob/b9ec31be65210a0ab87202fa95f764f8998ccc51/test/models/compute_resource_test.rb#L210-L219

Again, I’d like to see some examples.

It may lead to using more postgresql specific features which would break on mysql. Sqlite is the lowest common denominator. It’s very likely that anything sqlite can do also works on postgresql and mysql.

How often does this happen?

It’s a subpackage of foreman so it’s automatically rebuilt when we update foreman. We do have one package in the rails SCL, but that’s minimal effort because it’s very stable and doesn’t require updating very often.

It also requires knowledge of a few more tools. Currently you can wipe the database with rm, back it up with cp and restore it with mv.
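To illustrate the difference (file names and the database name here are hypothetical; the PostgreSQL commands assume the client tools and a running local server):

```shell
# SQLite: the whole database is a single file, so plain file tools suffice.
mkdir -p db && touch db/development.sqlite3   # stand-in for an existing dev database
cp db/development.sqlite3 db/backup.sqlite3   # back up: copy the file
rm db/development.sqlite3                     # wipe: delete the file
mv db/backup.sqlite3 db/development.sqlite3   # restore: move the backup back

# Rough PostgreSQL equivalents (commented out; need a running server):
# pg_dump foreman > foreman.sql               # back up
# dropdb foreman && createdb foreman          # wipe and recreate
# psql foreman < foreman.sql                  # restore
```

Not a big hurdle, but it is more than rm/cp/mv.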

At least for DEBs (I didn’t check the RPM situation) we do use sqlite in core and core plugin builds for at least asset precompilation to work. While this might be possible otherwise, we’d need to research this first.

I’d be all in for limiting all sqlite (and mysql) CI builds to, let’s say, Ruby 2.4.

RPMs also use sqlite in the same way.


There are actually a bunch of others you can find by running git grep adapter, mostly in migrations.
There were also cases where we changed our code to work in a database-agnostic way because the initial implementation failed on one of the dbs, though there might not be an easy way of finding those.

That would only be the case if all developers moved to developing on postgresql and none to mysql.
TBH, I think most developers already are on postgres (this is certainly the case in our office, and for developers using forklift) but I may be wrong.

This is a valid point, though I don’t have any idea how many people actually use this ability atm.

I wonder, do we actually need a working database for the asset compilation?

Isn’t this already the case? It is in core, just with 2.3 IIRC.

I believe it’s actually related to building the apipie cache. https://github.com/theforeman/foreman-packaging/commit/248db50fb9740c0d4a0af531763e0870e3927c31 would suggest it.

I do use sqlite3 for development, actually, and I’d appreciate it if we could avoid dropping it if there is a chance. But I can migrate to PostgreSQL if that’s too difficult.

Yes, apipie by itself doesn’t require that, but since it needs to actually load the controller definitions, there were often issues where some of them (such as in the documentation itself) referenced the active model classes, which led to errors when the db was not ready. The pragmatic solution was to use sqlite for this purpose, which made this kind of issue go away.

I am in the same boat: I would like to drop sqlite as an officially supported method, but our build tasks require a database today and sqlite is the lightest-weight option. That being said, apipie:cache and asset compilation are two of the biggest pain points when doing things like RPM building or container building, due to their slowness and the requirement to create a database that is then destroyed afterward.

Does the DB for the apipie and asset tasks actually need to be fully migrated and seeded? Or could we go with a “ghost” sqlite db that contains nothing and ignore the Foreman::MaintenanceException thrown when the db isn’t fully migrated?

Fully migrated from my experience.

AFAIK Rails has no concept of a descriptive model. In Rails the schema is derived from the database, so you need to migrate it.

Compare it to Django, where database models are explicitly modelled: there you don’t need the database to know how columns are supposed to look, because you can use the model metadata. The downside is that you need to take care of creating migrations, but for a few releases now you can generate those. Django even treats migrations as a graph, so you can create an alternative squashed migration: say you have incremental migrations a through m plus a squashed a–k. If you start from scratch and need to get to m, you apply a–k, then l and m; but if you are already at h, you apply the remaining incremental changes instead.
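For reference, that Django workflow looks roughly like this (a sketch against a hypothetical project with a manage.py and an app named app; these are standard Django management commands, and the migration names are made up):

```shell
# Generate incremental migrations from changes to the declarative models:
python manage.py makemigrations app

# Create an alternative squashed migration covering 0001..0011 ("a-k"):
python manage.py squashmigrations app 0001 0011

# Apply migrations; Django picks a path through the graph from the
# current database state, using the squash when starting from scratch:
python manage.py migrate app
```

The point being that the schema lives in the model definitions, not only in the database, which is exactly what Rails lacks for the apipie/asset use case discussed above.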