RFC: Changing how we handle Webpack Building


#41

This looks promising and the write-up is very detailed, much appreciated! Would it be much work to do a POC within Foreman?


#42

Another important thing: with today’s architecture, Jest requires a mock for each and every component we use from Foreman core (the foremanReact alias). I hope that using meta packages would remove that requirement.
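For context, a minimal sketch of what that looks like in a plugin today, assuming a Jest config with `moduleNameMapper`; the component paths and mock locations below are hypothetical examples, but each foremanReact import currently needs an entry like this:

```js
// jest.config.js (sketch; component paths and mock locations are hypothetical)
module.exports = {
  moduleNameMapper: {
    // every component imported from Foreman core via the foremanReact alias
    // currently needs its own hand-written mock in the plugin repository
    '^foremanReact/components/common/EmptyState$':
      '<rootDir>/webpack/__mocks__/foremanReact/components/common/EmptyState.js',
    '^foremanReact/redux/API$':
      '<rootDir>/webpack/__mocks__/foremanReact/redux/API.js',
  },
};
```

With a real Foreman npm (meta) package installed in node_modules, Jest could presumably resolve those imports directly and the mapping table could go away.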


#43

Sure, I can give it a shot and will let you know.

That would be nice. It seems a meta package would remove a lot of the package duplication that we currently need for testing in Foreman plugins.


#44

Thanks @John_Mitsch, this is something we were looking at working on this iteration, but I’m glad to see that you got to it first. Even if using Foreman as an npm package doesn’t fix the build issues, it will definitely help with testing and de-duplication of packages.

We were planning on just referencing Foreman from the Katello package.json via a GitHub URL rather than using the file: approach. What do you think of that?
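For illustration, a sketch of what that could look like in Katello’s package.json; the package name and branch here are assumptions, not a final proposal:

```json
{
  "devDependencies": {
    "foreman": "github:theforeman/foreman#develop"
  }
}
```

The file: approach from the POC would instead look like `"foreman": "file:../foreman"`, which requires a checkout at a known relative path, while the GitHub URL lets npm fetch a specific branch or tag at install time.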


#45

Perhaps it’s more of a downstream problem, but it also relates to our branches: it means every plugin must refer to the right branch. Currently plugins like foreman_ansible are built against multiple Foreman releases at once, and keeping that flexibility is very important IMHO.

So the question is: how do we find the right sources in an offline environment? Note we do control the build environment.


#46

I don’t know enough about the build process to say for sure; I used the subfolder approach because it was easy to demonstrate.

Sounds like it would have to work in an offline environment, as @ekohl says. Maybe the build team can clarify their criteria around this?


#47

> So the question is: how do we find the right sources in an offline environment? Note we do control the build environment.

An idea I had for the build machines was hosting a local copy of the npm registry within the build environment and using a package-lock.json file. I’m not sure of the limitations and capabilities of the build environment, so I don’t know if this is a feasible option, but I do wonder whether it could alleviate the need to package node modules as RPMs.
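As a rough sketch of what I mean (the mirror URL is made up), the build environment could carry an .npmrc pointing npm at the internal mirror:

```ini
# .npmrc inside the build environment (hypothetical mirror URL)
registry=https://npm-mirror.internal.example/
```

With that in place, `npm ci` should install exactly the versions pinned in the committed package-lock.json without reaching the public registry, assuming the mirror serves every pinned version.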


#48

I don’t think that’s going to work. All RPMs are built in a container without any network access at all. For reproducible builds you need to provide all release artifacts at build time AFAIK, but people can correct me if I’m wrong.

It is also very useful to be able to do local RPM builds, so you don’t want to rely on something that only exists within Koji. This allows community members to build RPMs for their own closed-source plugins. We know ATIX (cc @x9c4) also rebuilds RPMs for their product, similar to how Red Hat does downstream builds.


#49

Currently the packaging process gives us some protection against issues like https://github.com/dominictarr/event-stream/issues/116 (even though we don’t do a real audit of the packages). IMHO it’s a huge feature that we don’t pull packages straight from npm but rather from a trusted source. I would like to see how the new process would protect us from malicious code and attacks.


#50

These are good points, thanks for the reply!


#51

Another point of concern is tree shaking. I’m not sure if it will be a problem, but I think the vendor bundle should not be stripped of unused functions, because you don’t know what a plugin might rely on. Since tree shaking doesn’t happen in development builds, I’m not sure how we can reliably detect whether it will be a problem.
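To make the concern concrete, here is a hedged sketch of the kind of production config the vendor build would probably need; the separate vendor entry point is an assumption, but the optimization flags are the standard webpack switches that control tree shaking:

```js
// webpack.vendor.config.js (sketch; the dedicated vendor entry is an assumption)
const path = require('path');

module.exports = {
  mode: 'production',
  entry: { vendor: './webpack/vendor_entry.js' },
  optimization: {
    // keep every export, even ones core itself never imports,
    // since a plugin may rely on them at runtime
    usedExports: false,
    // don't drop whole modules that look side-effect free but unused
    sideEffects: false,
    minimize: true,
  },
  output: {
    filename: '[name].js',
    path: path.resolve(__dirname, 'public/webpack'),
  },
};
```

Even then, the only reliable check might be running a plugin’s test suite or a smoke test against a production build of core, since development builds skip tree shaking entirely.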