On 2021-12-10 10:43:10 +0100 (+0100), Jean-Philippe Evrard wrote:
On Thu, Dec 9, 2021, at 19:03, Jeremy Stanley wrote: [...]
* Skyline performs its release versioning by committing the version strings to files in its repositories rather than assigning them at build time from Git tag metadata. This leads to ambiguity as to what exact state of the repository represents that version, as well as opening up the possibility of forgetting to merge the version update before tagging (a very common problem which drove OpenStack to rely on its current model). Wholly different tools are also needed in order to track versions for these projects as a result. Should the rest of OpenStack's projects follow suit there? Has something changed to make the idea of using Git to signal releases a bad one?
You're again raising a valid point, one for which the Skyline team will have to find a solution.
For me, it boils down to defining a "project release interface" which projects should follow, explaining the expected behaviour around releasing. We don't have one. I am not claiming we should, I am merely asking us to think about what we are doing.
Projects could indeed move to using pbr/setuptools. However, I am not sure many developers understand why. Should we simply explain what is expected in terms of building the version from git? People might even come up with interesting solutions. Promoting creativity will create tech debt, but might also reduce it over time. What if we don't need pbr anymore? Will we even notice?
Isn't that exactly what I did? You'll notice that here, and in my review, I didn't say "use PBR"; I outlined the desired properties instead. Using PBR is likely the easiest way to achieve them, since we already have a lot of examples, but like you I don't think the implementation is important as long as we can find a suitable solution to the problems we face. That said, using a consistent implementation across as many projects as possible increases our familiarity, and our ability to address new issues with the smallest set of solutions.
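To make the tag-based model concrete, here is a rough sketch of how a PEP 440 version string can be derived from `git describe --tags` output, so that nothing needs to be committed to the repository. This loosely mimics the way pbr counts commits since the most recent tag to produce a development version; it is an illustration, not pbr's actual algorithm:

```python
import re


def version_from_describe(describe: str) -> str:
    """Derive a PEP 440 version from `git describe --tags` output.

    "3.2.1" (checkout is exactly on the tag)      -> "3.2.1"
    "3.2.1-14-gabc1234" (14 commits past the tag) -> "3.2.2.dev14"

    Toy example only: pbr's real logic also handles Sem-Ver commit
    footers, pre-release tags, and more.
    """
    match = re.fullmatch(r"(\d+)\.(\d+)\.(\d+)(?:-(\d+)-g[0-9a-f]+)?", describe)
    if match is None:
        raise ValueError(f"unrecognized describe output: {describe!r}")
    major, minor, patch, distance = match.groups()
    if distance is None:
        # Exactly on a release tag: the tag alone is the version.
        return f"{major}.{minor}.{patch}"
    # Past the tag: bump the patch and mark it as a dev version, so the
    # build is unambiguously "after 3.2.1, before the next release".
    return f"{major}.{minor}.{int(patch) + 1}.dev{distance}"
```

The point is that the tag, not a committed file, is the single source of truth, so there is no version string to forget to merge before tagging.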
* Skyline uses makefiles to script its build and test workflows rather than tox, adding a tox wrapper around the make targets in order to nominally comply with the OpenStack PTI. As a result, it's unable to rely on most of tox's features for environment isolation, package management, path overrides, et cetera. OpenStack projects have solved a lot of cross-project development workflow challenges through applying consistent changes across the tox.ini files of repositories, should those solutions be abandoned and recreated for make instead?
I didn't say so. I only partially agree with you there, for once :)
For me, tox is a test runner that provides many nice features thanks to its code (and its integration with setuptools). I would indeed prefer that Python projects use tox as their testing interface. Should that be the standard? I don't know. Does it make sense to rely on tox's features for isolation/package management/other things? I don't know. You might be aware that it's possible to integrate tox with poetry. Does it provide all we _need_? I don't know: we aren't explicit about what we need, we just talk about tools.
If I were to rephrase: is it easier to fit the existing mold, or to clarify what we need, why, and how we can deal with the tech debt? Probably the former. Should we do it? I don't know. The TC should, however, evaluate this seriously.
Let's be clear, I have nothing against make. In fact, I somewhat prefer it, since it's useful for projects written in all sorts of different programming languages beyond Python. At one point we even came very close to recommending OpenStack switch from tox to make, during one of the more unfortunate backward-incompatible tox releases some years ago. Clark had a proof of concept recreating the fundamentals of tox itself in Makefiles, and it wasn't half bad. But having one project use make while the rest use tox isn't really great for anyone; it turns the make-based project into a second-class citizen within the whole. Its maintainers have to do a lot more work to support the same workflows with a different tool, and the community as a whole incurs some additional burden in working around that difference (additional CI jobs, separate functionality, different developer documentation). If we think make is a superior solution, we should use it across all of OpenStack and not just in a few random projects.
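To illustrate the wrapper pattern being discussed, here is a hypothetical tox.ini sketch (environment names and commands are illustrative, not Skyline's actual configuration) contrasting a tox environment that merely shells out to make with a native one where tox manages the virtualenv:

```ini
# "Wrapped" environment: tox is only a thin shim around make, so tox's
# isolation, dependency management and path overrides are all bypassed.
[testenv:wrapped]
skip_install = true
allowlist_externals = make
commands = make test

# Native environment: tox builds the venv, installs the package and its
# test dependencies, and cross-project tox.ini changes apply normally.
[testenv:py3]
deps = -r{toxinidir}/test-requirements.txt
commands = stestr run {posargs}
```

The wrapped form nominally satisfies the PTI's entry points, but none of the community-wide tox tooling can reach inside the make targets.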
* Skyline does not put its external Python package dependencies in central lists, nor use tools which would allow it to rely on OpenStack's global constraints implementation, making it very hard to guarantee that its software can run with the same versions of shared dependencies as the rest of OpenStack. Is this unnecessary? Have we given up on the idea that two pieces of OpenStack software can be co-installed into one environment, or that Linux distributions won't be stuck packaging many different versions of the same software because OpenStack projects no longer agree on which versions work for them?
Now we're hitting the nail on the head, IMO :)
This is the core of the issue, isn't it? I think that's what we need to document. While I (personally) don't believe it's important nowadays (cf. containers), I am sure distros will disagree with me. It's important to know who we are pleasing, as we can't please everyone ;) Let's be clear about the audience. [...]
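For readers unfamiliar with the mechanism under discussion: OpenStack projects typically consume a single community-wide "upper constraints" list at install time, so every project resolves the same version of each shared dependency. A common tox.ini sketch of the pattern looks roughly like this (the TOX_CONSTRAINTS_FILE override and exact URL path are the widely used convention, but check a real project's tox.ini before copying):

```ini
[testenv]
# -c applies the shared pin list to every pip install in this env,
# without adding any new requirements of its own.
install_command = pip install -c{env:TOX_CONSTRAINTS_FILE:https://releases.openstack.org/constraints/upper/master} {opts} {packages}
deps =
  -r{toxinidir}/requirements.txt
  -r{toxinidir}/test-requirements.txt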
Just my opinion, but I think buying into container hype to the point where we make OpenStack only deployable through containerization would be a massive step backwards. -- Jeremy Stanley