On Tue, 2024-10-01 at 10:41 -0700, Ghanshyam Mann wrote:
Hi All,
As you know, we have a flag 'python_requires' in setup.cfg which is pinned to the minimum supported Python version [1]. Every time we upgrade our Python testing, we usually bump the minimum Python version in setup.cfg, and that prevents the project from being installed on lower Python versions. We do have the list of tested and supported Python versions in the setup.cfg classifiers, which is good and enough information to tell which Python versions are tested [2].
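(For context, a minimal illustrative sketch of what these two fields look like in setup.cfg; the spelling and section vary slightly between plain setuptools and pbr-based projects, and the version numbers below are examples rather than Nova's actual metadata:)

    [metadata]
    name = nova
    # Trove classifiers advertising which interpreter versions are
    # actually tested and supported.
    classifiers =
        Programming Language :: Python :: 3
        Programming Language :: Python :: 3.9
        Programming Language :: Python :: 3.10
        Programming Language :: Python :: 3.11
        Programming Language :: Python :: 3.12

    [options]
    # Hard floor: pip will not select this release on anything older.
    python_requires = >=3.9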
The 'python_requires' flag creates an issue (which has happened many times in the past) when a dependency such as an oslo library bumps its own minimum Python version, which makes the jobs of every project still testing older Python versions fail. I know we should first remove the old Python versions from the projects' testing, but somehow we are all busy, and the coordinated effort is always missed.
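(To make the failure mode concrete, imagine a hypothetical 'oslo.foo' 2.0.0 release raising its floor:

    [options]
    python_requires = >=3.10

A consuming project that still runs e.g. py3.9 unit test jobs then breaks, because pip honours the Requires-Python metadata published on PyPI: it either falls back to an older oslo.foo release than the one requirements expect, or, if the version is pinned by constraints, refuses to install it at all.)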
IMO, we should remove this flag, as it creates more issues than benefits. Opinions? Removing it would also let users continue installing things on old Python versions if that still works for them.
[1] https://github.com/openstack/nova/blob/8c1a47c9cf6e1001fbefd6ff3b76314e39c81...
[2] https://github.com/openstack/nova/blob/8c1a47c9cf6e1001fbefd6ff3b76314e39c81...
-gmann
As has been said by others down-thread, I think this is the wrong way to go. python_requires delivers value for end users, and if we're failing to make use of it then that's the real issue we need to address.

Our deliverables can be broadly divided into services (a.k.a. "applications") on one hand and libraries and clients (a.k.a. "libraries") on the other. My understanding of dependency management is that you want to be strict in what you accept w.r.t. Python and dependency versions for an application (to keep the test matrix manageable), but you need to be significantly looser when it comes to libraries (so that they remain broadly useful). This is why things like Pipfile.lock files are encouraged for applications but not for libraries. Put another way, there's nothing wrong with saying we don't want to support installing Nova on e.g. Ubuntu 22.04, but we probably don't want to artificially limit a client like OSC or a library like debtcollector or stevedore in those environments.

As I understand it, the issue we had previously (and the reason we've since been reluctant to bump 'python_requires') is that we forgot/ignored this rule and proceeded to make our *library* dependencies unnecessarily strict rather than focusing first on the services.

Instead of dropping 'python_requires', how about we start making better use of it to actually signal what we want to support? I see two steps we could take to make these things broadly useful:

* Drop support for older Python versions in services to reflect the supported runtimes [1], and
* Start testing libraries and clients against all upstream-supported Python versions

In concrete terms, this would mean very little right now, since Python 3.8 is now EOL and Python 3.9 is in the supported runtimes list. However, in e.g. 2025.2, it would mean services would all drop support for Python 3.9 (assuming Rocky Linux 10 is a thing and we no longer test for Rocky Linux 9), while libraries and clients would continue to support it until 2026.1, which would coincide with the upstream EOL.

In even more concrete terms, this would ideally involve a bot to publish the 'python_requires' bumps for services, and splitting 'openstack-python3-jobs' into e.g. 'openstack-service-jobs' (covering only Python versions listed in the supported runtimes doc) and 'openstack-library-jobs' (covering all upstream-supported Python versions, which will soon include 3.13). A rough sketch of what these two templates might look like is appended below.

I think this would allow us to make 'python_requires' really meaningful, providing lots of upside and limited downside for end users. It should not consume a lot of extra CI resources, since we would only be adding extra testing for libraries, which have much smaller, simpler test suites. It would also have the added advantage of highlighting issues with new Python versions for libraries before services need to start worrying about them.

Thoughts?

Stephen
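(Purely as an illustrative sketch of the split, not a tested definition: this assumes the existing openstack-tox-py3X unit test jobs, and the actual version lists would be driven by the supported runtimes document and the upstream EOL schedule.)

    - project-template:
        name: openstack-service-jobs
        # Only the interpreter versions listed in the supported
        # runtimes document for the current release.
        check:
          jobs:
            - openstack-tox-py39
            - openstack-tox-py312
        gate:
          jobs:
            - openstack-tox-py39
            - openstack-tox-py312

    - project-template:
        name: openstack-library-jobs
        # Every interpreter version still supported upstream.
        check:
          jobs:
            - openstack-tox-py39
            - openstack-tox-py310
            - openstack-tox-py311
            - openstack-tox-py312
        gate:
          jobs:
            - openstack-tox-py39
            - openstack-tox-py310
            - openstack-tox-py311
            - openstack-tox-py312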