Scenarios for Foreman 1.17 test day

Hello,

It looks like a few blocker bugs made it into the last Foreman releases, so we would like to propose a Test Day event and make it a regular part of every Foreman release candidate. The idea is to have a checklist of the most important end-to-end scenarios, each tested with at least one RC version.

We are putting together an initial version of a document with such scenarios. The idea is to collect a few more scenarios, put the document on the RedMine wiki, and link this “template” from the ReleaseEngineering process to remind us to send the list along with the RC1 announcement, so we can all participate in doing the job.

This should hopefully increase our confidence, because it looks like a sizable portion of our user community holds off on upgrades for one or several releases. For example, in the last community survey the vast majority of respondents were on an N-1 or older version (36 + 14 + 13 = 63 per cent). I’d assume most of these users are not participating in RC testing.

https://theforeman.org/2017/03/2017-foreman-survey-analysis.html

Please feel free to add your own end-to-end scenarios to the document. I will move it to RedMine and announce the move next week, both here and in the document itself.

If we agree on this, let’s do our initial “test run” event with the 1.17 release. It’s not in the Release Candidate phase anymore, but it could be a good chance to see whether this works for us. I volunteer to run the initial round, unless someone else is interested. I was thinking:

  • prepare a new “instance” of the checklist (wiki or somewhere else where people can easily mark items as done - suggestions? see the sketch after this list)
  • announcement thread and email
  • put the day in our shared calendar
  • participate
  • wrap up as a blog post (all new issues from that day considered Test Day bug reports)
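
To make the checklist “instance” idea a bit more concrete, here is a rough sketch of how such a page could be generated as Markdown that people paste into the wiki or an etherpad and tick off. The release number and scenario names are placeholders I made up, not the agreed list:

```python
# Rough sketch only: render a Test Day checklist "instance" as Markdown.
# The release number and scenario names are placeholders, not the agreed list.

SCENARIOS = [
    "Fresh install with foreman-installer default answers",
    "Provision a libvirt host via PXE",
    "Upgrade from the previous stable release",
    "Register a host and run a remote job",
]

def render_checklist(release, scenarios):
    lines = [f"# Foreman {release} Test Day checklist", ""]
    for scenario in scenarios:
        # "- [ ]" renders as a tickable checkbox in Discourse/GitHub-style Markdown
        lines.append(f"- [ ] {scenario} (tester: _unclaimed_, result: _pending_)")
    return "\n".join(lines)

if __name__ == "__main__":
    print(render_checklist("1.17", SCENARIOS))
```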

Superb idea! We’ve done occasional bug days in the past, usually successfully, so something regular would be welcome. Using the wiki for the “master” checklist sounds good - does an etherpad page make sense for the per-release instances of the checklist?

Happy to assist with write-ups and suchlike - if we create an event in the Events category, we can post the results of the day there first, in rough form, and blog it later.

Oh, since you mention the survey data, you might as well have the latest data (I’m preparing the blog now :P)

So that’s down to just 44% on a version older than the latest - but be careful here. The last survey was held just weeks after 1.14 came out, while this year the survey ran just before 1.17 was released, giving people much more time to upgrade. If we guess a midpoint, I’d say the results are broadly comparable.

(I know that’s a low-res PNG, the final report uses SVG for proper scaling :P)


Nice, it all makes sense, and in this light it’s not that bad. We can say with confidence that adoption is slower, but it’s happening :slight_smile:

I see more and more scenarios coming in - please keep going. Make sure your favorite feature, workflow, or plugin is on the list, because that creates the possibility that someone will pick it up and test it for the next version!


How many of these scenarios can be automated? #everyDayIsTestDay

Automation was deliberately taken off the table for this effort. This is all about defining a GO/NO-GO set of scenarios for making a release. We already have good automated coverage for installation and sanity testing - those are intentionally not mentioned in the document either.
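
For context, by “sanity testing” I mean checks roughly like the sketch below - the URL, credentials, and even the exact API endpoint are placeholder assumptions for whatever test instance you point it at, not a description of our actual test suite:

```python
# Minimal sanity-check sketch against a Foreman instance.
# URL, credentials, and endpoint details are assumptions for your own setup.
import requests

FOREMAN_URL = "https://foreman.example.com"  # hypothetical test instance
AUTH = ("admin", "changeme")                 # hypothetical credentials

def foreman_is_up():
    # /api/status is expected to answer with the running Foreman version when healthy;
    # verify=False only because test instances often use self-signed certificates
    resp = requests.get(FOREMAN_URL + "/api/status", auth=AUTH, verify=False)
    resp.raise_for_status()
    version = resp.json().get("version", "unknown")
    print("Foreman responded, version {}".format(version))

if __name__ == "__main__":
    foreman_is_up()
```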

Just a reminder - if you want a workflow to be on the official list, add it to the document. We will likely do our first run with one of the 1.18 RCs, as it is getting too late for 1.17. Let’s see.

I added a few more; I’ll do my best to join the test day if someone organizes it :slight_smile: