On 10/3/24 19:14, Stephen Finucane wrote:
On Wed, 2024-10-02 at 11:13 -0700, Ghanshyam Mann wrote:
---- On Wed, 02 Oct 2024 04:17:01 -0700 Stephen Finucane wrote ---
On Tue, 2024-10-01 at 10:41 -0700, Ghanshyam Mann wrote:
Hi All,
As you know, we have a flag, 'python_requires', in setup.cfg which is pinned to the minimum supported Python version[1]. Every time we upgrade our Python testing, we usually bump the minimum Python version in setup.cfg, and that prevents the project from being installed on lower Python versions. We do have a list of tested and supported Python versions in the setup.cfg classifiers, which is good and enough information to tell which Python versions are tested[2].
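For anyone not familiar with the two settings being discussed, the relevant setup.cfg stanzas look roughly like this (the version bound and classifier list here are illustrative, not Nova's actual current values):

```ini
[metadata]
name = nova
# Hard gate: pip refuses to install this package on any older interpreter.
python_requires = >=3.9
# Informational only: advertises which versions are tested/supported,
# but does not block installation.
classifier =
    Programming Language :: Python :: 3
    Programming Language :: Python :: 3.9
    Programming Language :: Python :: 3.10
    Programming Language :: Python :: 3.11
```

The key difference is that 'python_requires' is enforced by pip at install time, while the classifiers are purely advisory.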
The 'python_requires' flag creates an issue (which has happened many times in the past) when a dependency like an oslo lib bumps its minimum version, making testing fail for every project still testing older Python versions. I know we should remove the old tested Python versions from projects first, but somehow we are all busy, and the coordinated effort is always missed.
IMO, we should remove this flag as it creates more issues than benefits. Opinions? It would also let users continue to install things on old Python versions if that works for them.
[1] https://github.com/openstack/nova/blob/8c1a47c9cf6e1001fbefd6ff3b76314e39c81...
[2] https://github.com/openstack/nova/blob/8c1a47c9cf6e1001fbefd6ff3b76314e39c81...
-gmann
As has been said by others downthread, I think this is the wrong way to go. python_requires delivers value for end users, and if we're failing to make use of it then that's the real issue we need to address.
Our deliverables can be broadly divided into services (a.k.a. "applications"), and libraries and clients (a.k.a. "libraries"). My understanding of dependency management is that you want to be strict in what you accept w.r.t. Python and dependency versions for an application (to keep the test matrix manageable), but you need to be significantly looser when it comes to libraries (so that they remain broadly useful). This is why things like Pipfile.lock files are encouraged for applications but not for libraries. Put another way, there's nothing wrong with saying we don't want to support installing Nova on e.g. Ubuntu 22.04, but we probably don't want to artificially limit a client like OSC or a library like debtcollector or stevedore in these environments. As I understand it, the issue we had previously (and the reason we've since been reluctant to bump 'python_requires') is that we forgot/ignored this rule and proceeded to make our *library* dependencies unnecessarily strict rather than focusing first on the services.
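To illustrate the strict-vs-loose distinction in requirements terms (the package name is real; every version number below is invented for the example):

```ini
# Application (service) style: exact, tested pins, typically from a
# lock file or upper-constraints, keeping the test matrix small:
oslo.config==9.4.0

# Library style: only the lower bound the code genuinely needs,
# keeping the library broadly co-installable:
oslo.config>=8.0.0
```

The same asymmetry argued for above applies to 'python_requires': a service can afford a narrow interpreter range, while a library wants the widest range it can honestly test.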
I think libs and clients were my main concern over removing 'python_requires', as those are what caused the pain of breaking service testing. I am OK with this proposal if we can remove this flag for our libraries and clients but keep it for services.
Ah, so I'm not suggesting removing 'python_requires' from anywhere. The issue here is that our 'python_requires' doesn't mean anything because we don't test the oldest version, and we haven't been doing this because we did the bump wrong previously. I'm suggesting that we avoid bumping it for libraries until after we've bumped all the services, *and* that we keep testing services against the older Python versions. In 2024.2, that would mean we'd have been testing libraries against Python 3.8 to Python 3.12 (all voting), for example, while services would only have been testing against Python 3.9 to Python 3.11. In Zed, this would have meant testing libraries on all Python versions released from 3.7 to 3.10, while we only tested services on 3.8 and 3.10. By doing this, we ensure 'python_requires' means something and people can rely on it.
IIUC, the conclusion we got from the discussion during the last cycle was:

- We remove Python 3.8 from test runtimes and give services one cycle (the whole 2024.2 cycle) to remove py3.8 testing and bump the minimum.
- We start dropping Python 3.8 support from libraries like oslo, because Python 3.8 reaches its EOL right after the 2024.2 release.

"Removing Python 3.8 support from libraries once it is removed from services" is the ideal order, I agree, but the unfortunate fact is that there are still some "inactive" projects (not officially marked inactive) which lack attention to such global transitions. It's no longer practical to be blocked by all service projects, because that means we'd need to wait forever. I can revert all unreleased changes dropping Python 3.8 support from oslo, so that such projects are not broken mid-cycle, but I'm skeptical that giving one cycle really helps the situation, given that many projects failed to pay attention to the work even after we got that agreement in the previous cycle. My initial plan was to stick with the original plan and remove Python 3.8 support from oslo early in 2024.2, to send a strong signal to services that they should move away from 3.8. I'll change the plan if this sounds too drastic to people, but I really want to set a clear deadline (which I believe we have already set twice, and failed to hold) for the extension.
Instead of dropping 'python_requires', how about we start making better use of it to actually signal what we want to support? I see two steps we could take to make these things broadly useful:
* Drop support for older Python versions in services to reflect the supported runtimes [1], and
* Start testing libraries and clients against all upstream-supported Python versions
In concrete terms, this would mean very little right now, since Python 3.8 is now EOL and Python 3.9 is in the supported runtimes list. However, in e.g. 2025.2, that would mean services would all drop support for Python 3.9 (assuming Rocky Linux 10 is a thing and we no longer test for Rocky Linux 9) while libraries and clients would continue to support it until 2026.1, which would coincide with the upstream EOL. In even more concrete terms, this would ideally involve a bot to publish the 'python_requires' bumps for services and splitting 'openstack-python3-jobs' into e.g. 'openstack-service-jobs' (covering only Python versions listed in the supported runtimes doc) and 'openstack-library-jobs' (covering all upstream supported Python versions, which will soon include 3.13).
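A rough sketch of what that Zuul template split could look like (the two template names come from the proposal above; the per-version job names follow the existing openstack-tox-pyXY convention, and the exact version lists are illustrative, not a decided matrix):

```yaml
# Hypothetical project-templates; neither exists today.
- project-template:
    name: openstack-service-jobs
    check:
      jobs:
        # Only the versions in the supported runtimes doc.
        - openstack-tox-py39
        - openstack-tox-py310
        - openstack-tox-py311

- project-template:
    name: openstack-library-jobs
    check:
      jobs:
        # All upstream-supported Python versions.
        - openstack-tox-py39
        - openstack-tox-py310
        - openstack-tox-py311
        - openstack-tox-py312
        # New releases could start out non-voting.
        - openstack-tox-py313:
            voting: false
```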
I like the idea, and this way, we can make libs/clients more usable instead of making them uninstallable on older versions.
I think making them uninstallable on older versions is a feature, not a bug. If we didn't do this, then a user on e.g. Python 3.8 would need to manually figure out the last version of e.g. openstacksdk that would work on their environment rather than let pip figure it out for them. The trick here is to actually ensure python_requires maps to something realistic and to be careful in how we do the bumps vis-à-vis services vs. libraries.
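To make that pip behaviour concrete, here is a small self-contained sketch of the candidate filtering pip does using the Requires-Python metadata that 'python_requires' publishes (the release versions and minimums below are invented for illustration, not real openstacksdk metadata):

```python
# Hypothetical release metadata: version tuple -> minimum Python version,
# mirroring the Requires-Python field pip reads from the package index.
RELEASES = {
    (0, 61, 0): (3, 6),
    (1, 5, 0): (3, 8),
    (3, 0, 0): (3, 9),
}

def latest_installable(interpreter, releases):
    """Mimic pip's resolution: discard releases whose Requires-Python
    excludes the running interpreter, then pick the newest survivor."""
    compatible = [v for v, min_py in releases.items() if interpreter >= min_py]
    return max(compatible) if compatible else None

# A Python 3.8 user transparently gets the last compatible release:
print(latest_installable((3, 8), RELEASES))   # (1, 5, 0)
# A Python 3.12 user gets the newest release:
print(latest_installable((3, 12), RELEASES))  # (3, 0, 0)
```

Without 'python_requires', pip would hand the 3.8 user the newest release and the failure would only surface at runtime, which is the manual-archaeology scenario described above.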
One question: what do you mean by "all upstream-supported Python versions"? Do you mean the supported Python versions of all supported stable branches (excluding the EOL ones), not just the current cycle's testing runtime?
Not quite. I mean that libraries in 2024.2 would have been tested against Python 3.8 to Python 3.12, will be tested against Python 3.9 to Python 3.12 (and maybe 3.13 later in the cycle as non-voting) in 2025.1, and will be tested against Python 3.9 to Python 3.12 in 2025.2. Those are the Python versions that will be supported upstream for the duration of those releases. These versions should be encoded in stable branches too, so the stable/2024.2 branch will always test against Python 3.9 to Python 3.12.
Hope that helps, Stephen
-gmann
I think this would allow us to make 'python_requires' really meaningful, providing lots of upside and limited downside for end users. It should not consume a lot of extra CI resources, since we would only be adding extra testing for libraries, which have much smaller, simpler test suites. It would also have the added advantage of highlighting issues with new Python versions in libraries before services need to start worrying about them.
Thoughts?
Stephen