On Thu, 2024-10-03 at 10:45 -0700, Ghanshyam Mann wrote:
---- On Thu, 03 Oct 2024 03:14:12 -0700 Stephen Finucane wrote ---
On Wed, 2024-10-02 at 11:13 -0700, Ghanshyam Mann wrote:
---- On Wed, 02 Oct 2024 04:17:01 -0700 Stephen Finucane wrote ---
On Tue, 2024-10-01 at 10:41 -0700, Ghanshyam Mann wrote:
Hi All,
As you know, we have a 'python_requires' flag in setup.cfg which is pinned to the minimum supported Python version [1]. Every time we upgrade our Python testing, we usually bump the minimum Python version in setup.cfg, and that prevents the project from being installed on lower Python versions. We also have the list of tested and supported Python versions in the setup.cfg classifiers, which is good enough information to tell which Python versions are tested [2].
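For reference, the metadata in question looks roughly like this; a minimal sketch with illustrative version numbers, not Nova's actual values:

```ini
[metadata]
name = nova
# Published as Requires-Python metadata; pip refuses to install a
# release whose specifier excludes the running interpreter.
python_requires = >=3.9
classifiers =
    Programming Language :: Python :: 3
    Programming Language :: Python :: 3.9
    Programming Language :: Python :: 3.10
    Programming Language :: Python :: 3.11
    Programming Language :: Python :: 3.12
```

The classifiers are informational only; it is 'python_requires' that actually affects pip's resolution.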
The 'python_requires' flag creates an issue (which has happened many times in the past) when a dependency like an oslo lib bumps the version, making all projects that still test old Python versions fail. I know we should first remove testing of the old Python versions from projects, but somehow we are all busy and the coordinated effort always gets missed.
IMO, we should remove this flag as it creates more issues than benefits. Opinions? It would also let users continue to install things on old Python versions if that works for them.
[1] https://github.com/openstack/nova/blob/8c1a47c9cf6e1001fbefd6ff3b76314e39c81...
[2] https://github.com/openstack/nova/blob/8c1a47c9cf6e1001fbefd6ff3b76314e39c81...
-gmann
As others have said down-thread, I think this is the wrong way to go. 'python_requires' delivers value for end-users, and if we're failing to make use of it then that's the real issue we need to address.
Our deliverables can be broadly divided into services (a.k.a. "applications") and libraries and clients (a.k.a. "libraries"). My understanding of dependency management is that you want to be strict in what you accept w.r.t. Python and dependency versions for an application (to keep the test matrix manageable), but you need to be significantly looser when it comes to libraries (so that they remain broadly useful). This is why things like Pipfile.lock files are encouraged for applications but not for libraries. Put another way, there's nothing wrong with saying we don't want to support installing Nova on e.g. Ubuntu 22.04, but we probably don't want to artificially limit a client like OSC or a library like debtcollector or stevedore in these environments. As I understand it, the issue we had previously (and the reason we've since been reluctant to bump 'python_requires') is that we forgot or ignored this rule and made our *library* dependencies unnecessarily strict rather than focusing first on the services.
Libs and clients were my main concern with removing 'python_requires', since those are what caused the pain of breaking service testing. I am OK with this proposal if we can remove this flag for our libraries and clients but keep it for services.
Ah, so I'm not suggesting removing 'python_requires' from anywhere. The issue here is that our 'python_requires' values don't mean anything because we don't test against the oldest version they allow, and we stopped doing that because we got the bumps wrong previously. I'm suggesting that we avoid bumping it for libraries until after we've bumped all the services, *and* that we keep testing services against the older Python versions. In 2024.2, that would have meant testing libraries against Python 3.8 through Python 3.12 (all voting), for example, while services would only have been tested against Python 3.9 through Python 3.11. In Zed, it would have meant testing libraries on all Python versions released from 3.7 to 3.10, while we only tested services on 3.8 and 3.10. By doing this, we ensure 'python_requires' means something and people can rely on it.
Ok, I get your idea now.
Honestly, I do not think this plan will work well. As Takashi mentioned, services are not removing support for older versions aggressively, and it is hard to finish the service-side work before the deadline. Even if we give them the same cycle deadline (or the next one) and make the libs a little slower to drop older versions, the situation will be the same: we will bump the min version in libs and break the service gates. The Python 3.8 drop is the best example of this situation: services had not dropped support, but Oslo had to bump the min version because Python 3.8 is EOL and cannot be supported in Oslo anymore. That is the main reason I think dropping this flag from libs/clients will solve this problem.
It sounds like the real issue lies with the services rather than the libraries: if the services were always updating their 'python_requires' to reflect the lower bound of our tested runtimes matrix, then we wouldn't have an issue, right? Perhaps we could make this slightly easier by auto-proposing bumps at the start of a new cycle (where appropriate)? That won't guarantee they get merged (you can bring a horse to water...), but at least it sends a signal around expectations for service-type projects and gives libraries cover as they do the same later?

Stephen
-gmann
Instead of dropping 'python_requires', how about we start making better use of it to actually signal what we want to support? I see two steps we could take to make these things broadly useful:
* Drop support for older Python versions in services to reflect the supported runtimes [1], and
* Start testing libraries and clients against all upstream supported Python versions
In concrete terms, this would mean very little right now, since Python 3.8 is now EOL and Python 3.9 is in the supported runtimes list. However, in e.g. 2025.2, that would mean services would all drop support for Python 3.9 (assuming Rocky Linux 10 is a thing and we no longer test for Rocky Linux 9) while libraries and clients would continue to support it until 2026.1, which would coincide with the upstream EOL. In even more concrete terms, this would ideally involve a bot to publish the 'python_requires' bumps for services and splitting 'openstack-python3-jobs' into e.g. 'openstack-service-jobs' (covering only Python versions listed in the supported runtimes doc) and 'openstack-library-jobs' (covering all upstream supported Python versions, which will soon include 3.13).
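A hypothetical sketch of that split as Zuul project-templates; only the two template names come from the proposal above, and the job lists are purely illustrative:

```yaml
# Illustrative only: actual job names and lists would follow the
# supported runtimes doc and upstream CPython support at the time.
- project-template:
    name: openstack-service-jobs
    check:
      jobs:
        - openstack-tox-py39    # lower bound of the supported runtimes doc
        - openstack-tox-py311

- project-template:
    name: openstack-library-jobs
    check:
      jobs:
        - openstack-tox-py39
        - openstack-tox-py310
        - openstack-tox-py311
        - openstack-tox-py312   # all upstream-supported CPython versions
```

Projects would then pick the template matching their deliverable type, making the service/library distinction explicit in CI.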
I like the idea, and this way, we can make libs/clients more usable instead of making them uninstallable on older versions.
I think making them uninstallable on older versions is a feature, not a bug. If we didn't do this, then a user on e.g. Python 3.8 would need to manually figure out the last version of e.g. openstacksdk that would work on their environment rather than let pip figure it out for them. The trick here is to actually ensure python_requires maps to something realistic and to be careful in how we do the bumps vis-à-vis services vs. libraries.
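The mechanics can be sketched with the third-party `packaging` library, which implements the same specifier logic pip applies when it reads a release's published Requires-Python metadata (the `>=3.9` value below is illustrative):

```python
from packaging.specifiers import SpecifierSet

# python_requires is published as Requires-Python metadata; pip skips
# any release whose specifier excludes the running interpreter and
# falls back to the newest release that does allow it.
requires = SpecifierSet(">=3.9")

print(requires.contains("3.9"))  # True: a 3.9 user gets this release
print(requires.contains("3.8"))  # False: a 3.8 user falls back to an older release
```

This is exactly the "let pip figure it out" behaviour: no manual archaeology through release notes to find the last compatible version.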
One question: what do you mean by "all upstream supported Python versions"? Do you mean the supported Python versions of all supported stable branches (excluding the EOL ones), not just the current cycle's testing runtime?
Not quite. I mean that libraries in 2024.2 would have been tested against Python 3.8 to Python 3.12, will be tested against Python 3.9 to Python 3.12 (and maybe 3.13 later in the cycle as non-voting) in 2025.1, and will be tested against Python 3.9 to Python 3.12 in 2025.2. Those are the Python versions that will be supported upstream for the duration of those releases. These versions should be encoded in stable branches too, so the stable/2024.2 branch will always test against Python 3.9 to Python 3.12.
Hope that helps,
Stephen
-gmann
I think this would allow us to make 'python_requires' really meaningful, providing lots of upside and limited downside for end-users. It should not consume a lot of extra CI resources, since we would only be adding extra testing for libraries, which have much smaller, simpler test suites. It would also have the added advantage of highlighting issues with new Python versions for libraries before services need to start worrying about them.
Thoughts?
Stephen