[all] Remove 'python_requires' flag from setup.cfg
Hi All,

As you know, we have a 'python_requires' flag in setup.cfg which is pinned to the minimum supported Python version[1]. Every time we upgrade our Python testing, we usually bump the minimum Python version in setup.cfg, and that prevents the project from being installed on lower Python versions. We do have the list of tested and supported Python versions in the setup.cfg classifiers, which is good and enough information to tell which Python versions are tested[2].

The 'python_requires' flag creates an issue (which has happened many times in the past) when a dependency like an oslo lib bumps the version, because it makes every project still testing old Python versions fail. I know we should first remove the old Python versions from projects' testing, but somehow we are all busy, and a coordinated effort is always missed.

IMO, we should remove this flag, as it creates more issues than benefits. Opinions? It would also let users continue to install things on old Python versions if that works for them.

[1] https://github.com/openstack/nova/blob/8c1a47c9cf6e1001fbefd6ff3b76314e39c81... [2] https://github.com/openstack/nova/blob/8c1a47c9cf6e1001fbefd6ff3b76314e39c81...

-gmann
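For readers who haven't looked at these files recently, the two fields under discussion sit side by side in setup.cfg. The sketch below is illustrative only (project name and version lists are made up, not copied from nova):

```ini
[metadata]
name = example-project
# The flag under discussion: pip skips any release whose
# python_requires excludes the running interpreter.
python_requires = >=3.9

# The classifiers only *document* which versions are tested;
# pip does not enforce them during installation.
classifier =
    Programming Language :: Python :: 3
    Programming Language :: Python :: 3.9
    Programming Language :: Python :: 3.10
    Programming Language :: Python :: 3.11
    Programming Language :: Python :: 3.12
```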
This might be something we should discuss at the vPTG. On 10/2/24 02:41, Ghanshyam Mann wrote:
Hi All,
As you know, we have a 'python_requires' flag in setup.cfg which is pinned to the minimum supported Python version[1]. Every time we upgrade our Python testing, we usually bump the minimum Python version in setup.cfg, and that prevents the project from being installed on lower Python versions. We do have the list of tested and supported Python versions in the setup.cfg classifiers, which is good and enough information to tell which Python versions are tested[2].
The sad fact is that classifiers are not updated in a timely manner and are sometimes horribly outdated. If you look at setup.cfg in a few repos, you may see that some don't declare 3.11 support while a few others still declare 3.6 support. Unless we establish a coordinated way to update these fields promptly, this will eventually confuse users. What I've learned is that inactive projects with unmaintained CI may not maintain these fields, and I'm unsure the 'use less strict fields' strategy really helps maintenance.
The 'python_requires' flag creates an issue (which has happened many times in the past) when a dependency like an oslo lib bumps the version, because it makes every project still testing old Python versions fail. I know we should first remove the old Python versions from projects' testing, but somehow we are all busy, and a coordinated effort is always missed.
I understand the pain described, but what is the benefit of allowing projects to maintain CI with an EOL Python version? Also, if a project is not capable of maintaining its CI, then that is apparently a sign that the project is inactive, and I'm unsure how much of a tradeoff we should pay to maintain it. We recently started marking some projects as inactive, and this doesn't sound aligned with that direction.
IMO, we should remove this flag, as it creates more issues than benefits. Opinions? It would also let users continue to install things on old Python versions if that works for them.
My main concern here is that some internal/external repositories are actually removing compatibility with Python 3.8, and removing the flag may cause weird errors instead of an explicit failure during installation.
[1] https://github.com/openstack/nova/blob/8c1a47c9cf6e1001fbefd6ff3b76314e39c81... [2] https://github.com/openstack/nova/blob/8c1a47c9cf6e1001fbefd6ff3b76314e39c81...
-gmann
---- On Tue, 01 Oct 2024 11:06:12 -0700 Takashi Kajinami wrote ---
This might be something we should discuss at the vPTG.
On 10/2/24 02:41, Ghanshyam Mann wrote:
Hi All,
As you know, we have a 'python_requires' flag in setup.cfg which is pinned to the minimum supported Python version[1]. Every time we upgrade our Python testing, we usually bump the minimum Python version in setup.cfg, and that prevents the project from being installed on lower Python versions. We do have the list of tested and supported Python versions in the setup.cfg classifiers, which is good and enough information to tell which Python versions are tested[2].
The sad fact is that classifiers are not updated in a timely manner and are sometimes horribly outdated. If you look at setup.cfg in a few repos, you may see that some don't declare 3.11 support while a few others still declare 3.6 support. Unless we establish a coordinated way to update these fields promptly, this will eventually confuse users.
What I've learned is that inactive projects with unmaintained CI may not maintain these fields, and I'm unsure the 'use less strict fields' strategy really helps maintenance.
The 'python_requires' flag creates an issue (which has happened many times in the past) when a dependency like an oslo lib bumps the version, because it makes every project still testing old Python versions fail. I know we should first remove the old Python versions from projects' testing, but somehow we are all busy, and a coordinated effort is always missed.
I understand the pain described, but what is the benefit of allowing projects to maintain CI with an EOL Python version? Also, if a project is not capable of maintaining its CI, then that is apparently a sign that the project is inactive, and I'm unsure how much of a tradeoff we should pay to maintain it. We recently started marking some projects as inactive, and this doesn't sound aligned with that direction.
Yeah, we should clean these up as soon as we stop testing something. When we add release notes about dropping testing and support for a Python version, we can easily update the classifiers in setup.cfg as well.
IMO, we should remove this flag, as it creates more issues than benefits. Opinions? It would also let users continue to install things on old Python versions if that works for them.
My main concern here is that some internal/external repositories are actually removing compatibility with Python 3.8, and removing the flag may cause weird errors instead of an explicit failure during installation.
We declare the minimum Python version bump in the cycle's testing runtime, and each deliverable's Python classifiers also get updated. I think that is enough information for anyone looking to find out which Python versions are tested and which are not guaranteed. My point is just to state what we test instead of saying what we do not test. -gmann
[1] https://github.com/openstack/nova/blob/8c1a47c9cf6e1001fbefd6ff3b76314e39c81... [2] https://github.com/openstack/nova/blob/8c1a47c9cf6e1001fbefd6ff3b76314e39c81...
-gmann
On 2024-10-01 10:41:57 -0700 (-0700), Ghanshyam Mann wrote:
As you know, we have a 'python_requires' flag in setup.cfg which is pinned to the minimum supported Python version[1]. Every time we upgrade our Python testing, we usually bump the minimum Python version in setup.cfg, and that prevents the project from being installed on lower Python versions. We do have the list of tested and supported Python versions in the setup.cfg classifiers, which is good and enough information to tell which Python versions are tested[2].
The 'python_requires' flag creates an issue (which has happened many times in the past) when a dependency like an oslo lib bumps the version, because it makes every project still testing old Python versions fail. I know we should first remove the old Python versions from projects' testing, but somehow we are all busy, and a coordinated effort is always missed.
IMO, we should remove this flag, as it creates more issues than benefits. Opinions? It would also let users continue to install things on old Python versions if that works for them. [...]
The entire point of that value is that, if the software can't work on a particular older version of the CPython interpreter, pip will choose the most recent release of the package known to work on the user's interpreter. If we drop it, anyone using pip to install on older Python versions will end up installing the latest releases of our packages (assuming declared minimum versions of dependencies will work with that interpreter). Increasing the value too early does cause problems for development of projects depending on that package, but removing it causes problems for users installing from PyPI. Can't we just do a better job of gauging when it's safe to increase the setting? -- Jeremy Stanley
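The pip behaviour Jeremy describes can be sketched in a few lines: the resolver discards any release whose Requires-Python metadata (populated from python_requires) excludes the running interpreter, then picks the newest of what remains. This is a simplified illustration with made-up version numbers, not pip's actual implementation (which handles full PEP 440 specifiers):

```python
# Simplified sketch of how a resolver narrows candidate releases using
# each release's Requires-Python metadata. Versions are illustrative.

def parse(v):
    """Turn a version string like '3.8' into a comparable tuple (3, 8)."""
    return tuple(int(p) for p in v.split("."))

def satisfies(interpreter, requires_python):
    """Check a '>=X.Y' Requires-Python spec (the common OpenStack form)."""
    assert requires_python.startswith(">=")
    return parse(interpreter) >= parse(requires_python[2:])

def best_release(releases, interpreter):
    """Pick the newest release whose Requires-Python admits the interpreter."""
    ok = [r for r in releases if satisfies(interpreter, r["requires_python"])]
    return max(ok, key=lambda r: parse(r["version"]), default=None)

# Illustrative release history: the minimum Python was bumped in 30.0.0.
releases = [
    {"version": "29.2.0", "requires_python": ">=3.8"},
    {"version": "30.0.0", "requires_python": ">=3.9"},
]

print(best_release(releases, "3.8")["version"])  # -> 29.2.0, older but installable
print(best_release(releases, "3.9")["version"])  # -> 30.0.0, the latest release
```

Dropping python_requires corresponds to every release passing the `satisfies` check, so a Python 3.8 user would silently get 30.0.0 even if it no longer works there.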
---- On Tue, 01 Oct 2024 11:09:18 -0700 Jeremy Stanley wrote ---
On 2024-10-01 10:41:57 -0700 (-0700), Ghanshyam Mann wrote:
As you know, we have a 'python_requires' flag in setup.cfg which is pinned to the minimum supported Python version[1]. Every time we upgrade our Python testing, we usually bump the minimum Python version in setup.cfg, and that prevents the project from being installed on lower Python versions. We do have the list of tested and supported Python versions in the setup.cfg classifiers, which is good and enough information to tell which Python versions are tested[2].
The 'python_requires' flag creates an issue (which has happened many times in the past) when a dependency like an oslo lib bumps the version, because it makes every project still testing old Python versions fail. I know we should first remove the old Python versions from projects' testing, but somehow we are all busy, and a coordinated effort is always missed.
IMO, we should remove this flag, as it creates more issues than benefits. Opinions? It would also let users continue to install things on old Python versions if that works for them. [...]
The entire point of that value is that, if the software can't work on a particular older version of the CPython interpreter, pip will choose the most recent release of the package known to work on the user's interpreter. If we drop it, anyone using pip to install on older Python versions will end up installing the latest releases of our packages (assuming declared minimum versions of dependencies will work with that interpreter).
I think this should be OK, as users know that they are using an older Python version and that installing the latest OpenStack may cause some issues. We still explicitly declare the supported Python versions in the testing runtime and the package classifiers.
Increasing the value too early does cause problems for development of projects depending on that package, but removing it causes problems for users installing from PyPI. Can't we just do a better job of gauging when it's safe to increase the setting? -- Jeremy Stanley
On 10/2/24 08:39, Ghanshyam Mann wrote:
---- On Tue, 01 Oct 2024 11:09:18 -0700 Jeremy Stanley wrote ---
On 2024-10-01 10:41:57 -0700 (-0700), Ghanshyam Mann wrote:
As you know, we have a 'python_requires' flag in setup.cfg which is pinned to the minimum supported Python version[1]. Every time we upgrade our Python testing, we usually bump the minimum Python version in setup.cfg, and that prevents the project from being installed on lower Python versions. We do have the list of tested and supported Python versions in the setup.cfg classifiers, which is good and enough information to tell which Python versions are tested[2].
The 'python_requires' flag creates an issue (which has happened many times in the past) when a dependency like an oslo lib bumps the version, because it makes every project still testing old Python versions fail. I know we should first remove the old Python versions from projects' testing, but somehow we are all busy, and a coordinated effort is always missed.
IMO, we should remove this flag, as it creates more issues than benefits. Opinions? It would also let users continue to install things on old Python versions if that works for them. [...]
The entire point of that value is that, if the software can't work on a particular older version of the CPython interpreter, pip will choose the most recent release of the package known to work on the user's interpreter. If we drop it, anyone using pip to install on older Python versions will end up installing the latest releases of our packages (assuming declared minimum versions of dependencies will work with that interpreter).
I think this should be OK, as users know that they are using an older Python version and that installing the latest OpenStack may cause some issues. We still explicitly declare the supported Python versions in the testing runtime and the package classifiers.
That's "theoretically" correct, but honestly speaking, I doubt that everyone using pip for installation checks the relevant document to select the version they install; in most cases they just rely on the resolver. It might be a small portion, but I'm sure this would break some existing usage patterns.
Increasing the value too early does cause problems for development of projects depending on that package, but removing it causes problems for users installing from PyPI. Can't we just do a better job of gauging when it's safe to increase the setting? -- Jeremy Stanley
Hi, On Tuesday, 1 October 2024 20:09:18 CEST, Jeremy Stanley wrote:
On 2024-10-01 10:41:57 -0700 (-0700), Ghanshyam Mann wrote:
As you know, we have a 'python_requires' flag in setup.cfg which is pinned to the minimum supported Python version[1]. Every time we upgrade our Python testing, we usually bump the minimum Python version in setup.cfg, and that prevents the project from being installed on lower Python versions. We do have the list of tested and supported Python versions in the setup.cfg classifiers, which is good and enough information to tell which Python versions are tested[2].
The 'python_requires' flag creates an issue (which has happened many times in the past) when a dependency like an oslo lib bumps the version, because it makes every project still testing old Python versions fail. I know we should first remove the old Python versions from projects' testing, but somehow we are all busy, and a coordinated effort is always missed.
IMO, we should remove this flag, as it creates more issues than benefits. Opinions? It would also let users continue to install things on old Python versions if that works for them. [...]
The entire point of that value is that, if the software can't work on a particular older version of the CPython interpreter, pip will choose the most recent release of the package known to work on the user's interpreter. If we drop it, anyone using pip to install on older Python versions will end up installing the latest releases of our packages (assuming declared minimum versions of dependencies will work with that interpreter).
Increasing the value too early does cause problems for development of projects depending on that package, but removing it causes problems for users installing from PyPI. Can't we just do a better job of gauging when it's safe to increase the setting? -- Jeremy Stanley
Just for the record, I checked the 2023 user survey results [1], and it seems that about 18% of users install OpenStack using PyPI. That is a significant number IMO, and we shouldn't make their life harder. [1] https://governance.openstack.org/tc/user_survey/analysis-2023.html#how-are-y... -- Slawek Kaplonski Principal Software Engineer Red Hat
On Tue, 2024-10-01 at 10:41 -0700, Ghanshyam Mann wrote:
Hi All,
As you know, we have a 'python_requires' flag in setup.cfg which is pinned to the minimum supported Python version[1]. Every time we upgrade our Python testing, we usually bump the minimum Python version in setup.cfg, and that prevents the project from being installed on lower Python versions. We do have the list of tested and supported Python versions in the setup.cfg classifiers, which is good and enough information to tell which Python versions are tested[2].
The 'python_requires' flag creates an issue (which has happened many times in the past) when a dependency like an oslo lib bumps the version, because it makes every project still testing old Python versions fail. I know we should first remove the old Python versions from projects' testing, but somehow we are all busy, and a coordinated effort is always missed.
IMO, we should remove this flag, as it creates more issues than benefits. Opinions? It would also let users continue to install things on old Python versions if that works for them.
[1] https://github.com/openstack/nova/blob/8c1a47c9cf6e1001fbefd6ff3b76314e39c81... [2] https://github.com/openstack/nova/blob/8c1a47c9cf6e1001fbefd6ff3b76314e39c81...
-gmann
As has been said by others down-thread, I think this is the wrong way to go. python_requires delivers value for end users, and if we're failing to make use of it then that's the real issue we need to address.

Our deliverables can be broadly divided into services (a.k.a. "applications") and libraries and clients (a.k.a. "libraries"). My understanding of dependency management is that you want to be strict in what you accept w.r.t. Python and dependency versions for an application (to keep the test matrix manageable), but you need to be significantly looser when it comes to libraries (so that they remain broadly useful). This is why things like Pipfile.lock files are encouraged for applications but not for libraries. Put another way, there's nothing wrong with saying we don't want to support installing Nova on e.g. Ubuntu 22.04, but we probably don't want to artificially limit a client like OSC or a library like debtcollector or stevedore in these environments. As I understand it, the issue we had previously (and the reason we've since been reluctant to bump 'python_requires') is that we forgot/ignored this rule and proceeded to make our *library* dependencies unnecessarily strict rather than focusing first on the services.

Instead of dropping 'python_requires', how about we start making better use of it to actually signal what we want to support? I see two steps we could take to make these things broadly useful:

* Drop support for older Python versions in services to reflect the supported runtimes [1], and
* Start testing libraries and clients against all upstream-supported Python versions

In concrete terms, this would mean very little right now, since Python 3.8 is now EOL and Python 3.9 is in the supported runtimes list. However, in e.g. 2025.2, that would mean services would all drop support for Python 3.9 (assuming Rocky Linux 10 is a thing and we no longer test for Rocky Linux 9), while libraries and clients would continue to support it until 2026.1, which would coincide with the upstream EOL. In even more concrete terms, this would ideally involve a bot to publish the 'python_requires' bumps for services and splitting 'openstack-python3-jobs' into e.g. 'openstack-service-jobs' (covering only Python versions listed in the supported runtimes doc) and 'openstack-library-jobs' (covering all upstream-supported Python versions, which will soon include 3.13).

I think this would allow us to make 'python_requires' really meaningful, providing lots of upside and limited downside for end users. It should not consume a lot of extra CI resources, since we would only be adding extra testing for libraries, which have much smaller, simpler test suites. It would also have the added advantage of highlighting issues with new Python versions for libraries before services need to start worrying about them.

Thoughts? Stephen
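The proposed job-template split could look roughly like the sketch below. The template names follow the suggestion in this thread and the job lists are illustrative; this is not an actual openstack-zuul-jobs change:

```yaml
# Hypothetical Zuul project-templates sketching the proposed split.
- project-template:
    name: openstack-service-jobs
    check:
      jobs:
        # Only the versions in the supported-runtimes document.
        - openstack-tox-py39
        - openstack-tox-py310
        - openstack-tox-py311
        - openstack-tox-py312

- project-template:
    name: openstack-library-jobs
    check:
      jobs:
        # All upstream-supported CPython versions, including new
        # releases such as 3.13 once available.
        - openstack-tox-py39
        - openstack-tox-py310
        - openstack-tox-py311
        - openstack-tox-py312
        - openstack-tox-py313
```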
Hi On 10/2/24 13:17, Stephen Finucane wrote:
As has been said by others down-thread, I think this is the wrong way to go. python_requires delivers value for end users, and if we're failing to make use of it then that's the real issue we need to address.
Our deliverables can be broadly divided into services (a.k.a. "applications") and libraries and clients (a.k.a. "libraries"). My understanding of dependency management is that you want to be strict in what you accept w.r.t. Python and dependency versions for an application (to keep the test matrix manageable), but you need to be significantly looser when it comes to libraries (so that they remain broadly useful). This is why things like Pipfile.lock files are encouraged for applications but not for libraries. Put another way, there's nothing wrong with saying we don't want to support installing Nova on e.g. Ubuntu 22.04, but we probably don't want to artificially limit a client like OSC or a library like debtcollector or stevedore in these environments. As I understand it, the issue we had previously (and the reason we've since been reluctant to bump 'python_requires') is that we forgot/ignored this rule and proceeded to make our *library* dependencies unnecessarily strict rather than focusing first on the services.
Instead of dropping 'python_requires', how about we start making better use of it to actually signal what we want to support? I see two steps we could take to make these things broadly useful:
* Drop support for older Python versions in services to reflect the supported runtimes [1], and
* Start testing libraries and clients against all upstream-supported Python versions
In concrete terms, this would mean very little right now, since Python 3.8 is now EOL and Python 3.9 is in the supported runtimes list. However, in e.g. 2025.2, that would mean services would all drop support for Python 3.9 (assuming Rocky Linux 10 is a thing and we no longer test for Rocky Linux 9), while libraries and clients would continue to support it until 2026.1, which would coincide with the upstream EOL. In even more concrete terms, this would ideally involve a bot to publish the 'python_requires' bumps for services and splitting 'openstack-python3-jobs' into e.g. 'openstack-service-jobs' (covering only Python versions listed in the supported runtimes doc) and 'openstack-library-jobs' (covering all upstream-supported Python versions, which will soon include 3.13).
I think this would allow us to make 'python_requires' really meaningful, providing lots of upside and limited downside for end-user. It should not consume a lot of extra CI resources, since we would only be adding extra testing for libraries which have much smaller, simpler test suites. It would also have the added advantage of highlighting issues with new Python versions for libraries before services need to start worrying about this.
Thoughts?
I absolutely support that, Stephen. Services and clients are absolutely different things and must be treated differently. We have users from "older" clouds, and we should not stop supporting certain "platforms" just because services move along. With that said, I am not in any way saying that we should support EOL "platforms" for clients. Artem
---- On Wed, 02 Oct 2024 04:17:01 -0700 Stephen Finucane wrote ---
On Tue, 2024-10-01 at 10:41 -0700, Ghanshyam Mann wrote:
Hi All,
As you know, we have a 'python_requires' flag in setup.cfg which is pinned to the minimum supported Python version[1]. Every time we upgrade our Python testing, we usually bump the minimum Python version in setup.cfg, and that prevents the project from being installed on lower Python versions. We do have the list of tested and supported Python versions in the setup.cfg classifiers, which is good and enough information to tell which Python versions are tested[2].
The 'python_requires' flag creates an issue (which has happened many times in the past) when a dependency like an oslo lib bumps the version, because it makes every project still testing old Python versions fail. I know we should first remove the old Python versions from projects' testing, but somehow we are all busy, and a coordinated effort is always missed.
IMO, we should remove this flag, as it creates more issues than benefits. Opinions? It would also let users continue to install things on old Python versions if that works for them.
[1] https://github.com/openstack/nova/blob/8c1a47c9cf6e1001fbefd6ff3b76314e39c81... [2] https://github.com/openstack/nova/blob/8c1a47c9cf6e1001fbefd6ff3b76314e39c81...
-gmann
As has been said by others down-thread, I think this is the wrong way to go. python_requires delivers value for end users, and if we're failing to make use of it then that's the real issue we need to address.
Our deliverables can be broadly divided into services (a.k.a. "applications") and libraries and clients (a.k.a. "libraries"). My understanding of dependency management is that you want to be strict in what you accept w.r.t. Python and dependency versions for an application (to keep the test matrix manageable), but you need to be significantly looser when it comes to libraries (so that they remain broadly useful). This is why things like Pipfile.lock files are encouraged for applications but not for libraries. Put another way, there's nothing wrong with saying we don't want to support installing Nova on e.g. Ubuntu 22.04, but we probably don't want to artificially limit a client like OSC or a library like debtcollector or stevedore in these environments. As I understand it, the issue we had previously (and the reason we've since been reluctant to bump 'python_requires') is that we forgot/ignored this rule and proceeded to make our *library* dependencies unnecessarily strict rather than focusing first on the services.
I think libs and clients were my main concern with removing 'python_requires', as those caused the pain of breaking service testing. I am OK with this proposal if we can remove this flag for our libraries and clients but keep it for services.
Instead of dropping 'python_requires', how about we start making better use of it to actually signal what we want to support? I see two steps we could take to make these things broadly useful:
* Drop support for older Python versions in services to reflect the supported runtimes [1], and
* Start testing libraries and clients against all upstream-supported Python versions
In concrete terms, this would mean very little right now, since Python 3.8 is now EOL and Python 3.9 is in the supported runtimes list. However, in e.g. 2025.2, that would mean services would all drop support for Python 3.9 (assuming Rocky Linux 10 is a thing and we no longer test for Rocky Linux 9), while libraries and clients would continue to support it until 2026.1, which would coincide with the upstream EOL. In even more concrete terms, this would ideally involve a bot to publish the 'python_requires' bumps for services and splitting 'openstack-python3-jobs' into e.g. 'openstack-service-jobs' (covering only Python versions listed in the supported runtimes doc) and 'openstack-library-jobs' (covering all upstream-supported Python versions, which will soon include 3.13).
I like the idea; this way, we can make libs/clients more usable instead of making them uninstallable on older versions. One question: what do you mean by "all upstream supported Python versions"? Do you mean the supported Python versions of all supported stable branches (excluding EOL ones), not just the current cycle's testing runtime? -gmann
I think this would allow us to make 'python_requires' really meaningful, providing lots of upside and limited downside for end-user. It should not consume a lot of extra CI resources, since we would only be adding extra testing for libraries which have much smaller, simpler test suites. It would also have the added advantage of highlighting issues with new Python versions for libraries before services need to start worrying about this.
Thoughts?
Stephen
On Wed, 2024-10-02 at 11:13 -0700, Ghanshyam Mann wrote:
---- On Wed, 02 Oct 2024 04:17:01 -0700 Stephen Finucane wrote ---
On Tue, 2024-10-01 at 10:41 -0700, Ghanshyam Mann wrote:
Hi All,
As you know, we have a 'python_requires' flag in setup.cfg which is pinned to the minimum supported Python version[1]. Every time we upgrade our Python testing, we usually bump the minimum Python version in setup.cfg, and that prevents the project from being installed on lower Python versions. We do have the list of tested and supported Python versions in the setup.cfg classifiers, which is good and enough information to tell which Python versions are tested[2].
The 'python_requires' flag creates an issue (which has happened many times in the past) when a dependency like an oslo lib bumps the version, because it makes every project still testing old Python versions fail. I know we should first remove the old Python versions from projects' testing, but somehow we are all busy, and a coordinated effort is always missed.
IMO, we should remove this flag, as it creates more issues than benefits. Opinions? It would also let users continue to install things on old Python versions if that works for them.
[1] https://github.com/openstack/nova/blob/8c1a47c9cf6e1001fbefd6ff3b76314e39c81... [2] https://github.com/openstack/nova/blob/8c1a47c9cf6e1001fbefd6ff3b76314e39c81...
-gmann
As has been said by others down-thread, I think this is the wrong way to go. python_requires delivers value for end users, and if we're failing to make use of it then that's the real issue we need to address.
Our deliverables can be broadly divided into services (a.k.a. "applications") and libraries and clients (a.k.a. "libraries"). My understanding of dependency management is that you want to be strict in what you accept w.r.t. Python and dependency versions for an application (to keep the test matrix manageable), but you need to be significantly looser when it comes to libraries (so that they remain broadly useful). This is why things like Pipfile.lock files are encouraged for applications but not for libraries. Put another way, there's nothing wrong with saying we don't want to support installing Nova on e.g. Ubuntu 22.04, but we probably don't want to artificially limit a client like OSC or a library like debtcollector or stevedore in these environments. As I understand it, the issue we had previously (and the reason we've since been reluctant to bump 'python_requires') is that we forgot/ignored this rule and proceeded to make our *library* dependencies unnecessarily strict rather than focusing first on the services.
I think libs and clients were my main concern with removing 'python_requires', as those caused the pain of breaking service testing. I am OK with this proposal if we can remove this flag for our libraries and clients but keep it for services.
Ah, so I'm not suggesting removing 'python_requires' from anywhere. The issue here is that our 'python_requires' values don't mean anything because we don't test the oldest version, and we haven't been testing it because we did the bumps wrong previously. I'm suggesting that we avoid bumping it for libraries until after we've bumped all the services *and* that we keep testing libraries against the older Python versions. In 2024.2, that would mean we'd have been testing libraries against Python 3.8 to Python 3.12 (all voting), for example, while services would only have been testing against Python 3.9 to Python 3.11. In Zed, this would have meant testing libraries on all Python versions released from 3.7 to 3.10, while we only tested services on 3.8 and 3.10. By doing this, we ensure 'python_requires' means something and people can rely on it.
Instead of dropping 'python_requires', how about we start making better use of it to actually signal what we want to support? I see two steps we could take to make these things broadly useful:
* Drop support for older Python versions in services to reflect the supported runtimes [1], and
* Start testing libraries and clients against all upstream-supported Python versions
In concrete terms, this would mean very little right now, since Python 3.8 is now EOL and Python 3.9 is in the supported runtimes list. However, in e.g. 2025.2, that would mean services would all drop support for Python 3.9 (assuming Rocky Linux 10 is a thing and we no longer test for Rocky Linux 9), while libraries and clients would continue to support it until 2026.1, which would coincide with the upstream EOL. In even more concrete terms, this would ideally involve a bot to publish the 'python_requires' bumps for services and splitting 'openstack-python3-jobs' into e.g. 'openstack-service-jobs' (covering only Python versions listed in the supported runtimes doc) and 'openstack-library-jobs' (covering all upstream-supported Python versions, which will soon include 3.13).
I like the idea; this way, we can make libs/clients more usable instead of making them uninstallable on older versions.
I think making them uninstallable on older versions is a feature, not a bug. If we didn't do this, then a user on e.g. Python 3.8 would need to manually figure out the last version of e.g. openstacksdk that would work on their environment rather than let pip figure it out for them. The trick here is to actually ensure python_requires maps to something realistic and to be careful in how we do the bumps vis-à-vis services vs. libraries.
One question, what do you mean by the "all upstream supported Python versions"? Do you mean supported Python versions of all supported stable branches (excluding the EOL one), not just current cycle testing runtime?
Not quite. I mean that libraries in 2024.2 would have been tested against Python 3.8 to Python 3.12, will be tested against Python 3.9 to Python 3.12 (and maybe 3.13 later in the cycle as non-voting) in 2025.1, and will be tested against Python 3.9 to Python 3.12 in 2025.2. Those are the Python versions that will be supported upstream for the duration of those releases. These versions should be encoded in stable branches too, so the stable/2024.2 branch will always test against Python 3.9 to Python 3.12. Hope that helps, Stephen
-gmann
I think this would allow us to make 'python_requires' really meaningful, providing lots of upside and limited downside for end-user. It should not consume a lot of extra CI resources, since we would only be adding extra testing for libraries which have much smaller, simpler test suites. It would also have the added advantage of highlighting issues with new Python versions for libraries before services need to start worrying about this.
Thoughts?
Stephen
On 10/3/24 19:14, Stephen Finucane wrote:
On Wed, 2024-10-02 at 11:13 -0700, Ghanshyam Mann wrote:
---- On Wed, 02 Oct 2024 04:17:01 -0700 Stephen Finucane wrote ---
On Tue, 2024-10-01 at 10:41 -0700, Ghanshyam Mann wrote:
Hi All,
As you know, we have a flag 'python_requires' in setup.cfg which is pinned to the minimum python version supported[1]. Every time we upgrade our Python testing, we usually bump the minimum python version in setup.cfg and that prevents the project from being installed on lower Python versions. We do have list of tested and supported Python versions in setup.cfg classifier, which is good and enough information to tell what all Python versions are tested[2].
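[Editor's note: for reference, the two fields being contrasted here look roughly like this in a setup.cfg; the version lists are illustrative, not any project's actual current values.]

```ini
[metadata]
# advisory only: nothing enforces these classifiers at install time
classifier =
    Programming Language :: Python :: 3
    Programming Language :: Python :: 3.9
    Programming Language :: Python :: 3.10
    Programming Language :: Python :: 3.11

[options]
# enforced: pip refuses to install on interpreters older than this
python_requires = >=3.9
```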
Flag 'python_requires' creates an issue (which has happened many times in the past) when a dependency like oslo lib bumps the version, and it makes all projects testing old Python versions fail. I know we should remove the old Python versions tested from projects first, but somehow, we all are busy, and a coordinated effort is always missed.
IMO, we should remove this flag as it creates more issues than benefits. opinion? It will also help to continue users to install things on old python versions if it is working for them.
[1] https://github.com/openstack/nova/blob/8c1a47c9cf6e1001fbefd6ff3b76314e39c81... [2] https://github.com/openstack/nova/blob/8c1a47c9cf6e1001fbefd6ff3b76314e39c81...
-gmann
As has been said by others down-thread, I think this is the wrong way to go. python_requires delivers value for end-users and if we're failing to make use of it then that's the real issue we need to address.
Our deliverables can be broadly divided into services (a.k.a. "applications"), libraries and clients (a.k.a. "libraries"). My understanding of dependency management is that you want to be strict in what you accept w.r.t. Python and dependency versions for an application (to keep the test matrix manageable), but you need to be significantly looser when it comes to libraries (so that they remain broadly useful). This is why things like Pipfile.lock files are encouraged for applications but not for libraries. Put another way, there's nothing wrong with saying we don't want to support installing Nova on e.g. Ubuntu 22.04, but we probably don't want to artificially limit a client like OSC or a library like debtcollector or stevedore in these environments. As I understand it, the issue we had previously (and the reason we've since been reluctant to bump 'python_requires') is that we forgot/ignored this rule and proceeded to make our *library* dependencies unnecessarily strict rather than focusing first on the services.
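[Editor's note: the strict-vs-loose distinction above can be sketched with hypothetical version numbers. An application deployment pins exactly through a constraints file, while a library's own metadata declares only loose lower bounds.]

```ini
# deploying a service: exact pins come from a constraints file, e.g.
#   pip install ./nova -c upper-constraints.txt
# where upper-constraints.txt contains pins such as:
#   stevedore===5.2.0

# a library's setup.cfg, by contrast, stays deliberately loose:
[options]
python_requires = >=3.9
install_requires =
    stevedore>=3.0
```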
I think libs and clients were my main concern over removing 'python_requires', as those caused the pain of breaking service testing. I am OK with this proposal if we can remove this flag for our libraries and clients but keep it for services.
Ah, so I'm not suggesting removing 'python_requires' from anywhere. The issue here is that our 'python_requires' doesn't mean anything because we don't test the oldest version, and we haven't been doing this because we did the bumps wrong previously. I'm suggesting that we avoid bumping it for libraries until after we've bumped all the services *and* that we keep testing services against the older Python versions. In 2024.2, that would mean we'd have been testing libraries against Python 3.8 to Python 3.12 (all voting), for example, while services would only have been testing against Python 3.9 to Python 3.11. In Zed, this would have meant testing libraries on all Python versions released from 3.7 to 3.10, while we only tested services on 3.8 and 3.10. By doing this, we ensure 'python_requires' means something and people can rely on it.
IIUC, the conclusion we reached from the discussion during the last cycle was:
- We remove Python 3.8 from the test runtimes and give services one cycle (the whole 2024.2 cycle) to remove py3.8 testing and bump the minimum.
- We start dropping Python 3.8 support from libraries like oslo, because Python 3.8 reaches its EOL right after the 2024.2 release.

"Removing Python 3.8 support from libraries once it is removed from services" is the ideal order, I agree, but the unfortunate fact is that there are still some "inactive" projects (not officially marked inactive) which lack attention to such global transitions. It's no longer practical to be blocked by all service projects, because that means waiting forever. I can revert all unreleased changes that drop Python 3.8 support from oslo, so that such projects are not broken mid-cycle, but I'm skeptical that giving one more cycle really helps, given that many projects failed to pay attention to the work even after we reached that agreement in the previous cycle. My initial plan was to stick with the original plan and remove Python 3.8 support from oslo early in 2024.2, to send a strong signal to services that they should move away from 3.8. I'll change the plan in case this sounds too drastic to people, but I really want to set a clear deadline (which I believe we have already set twice, and missed) for any extension.
Instead of dropping 'python_requires', how about we start making better use of it to actually signal what we want to support? I see two steps we could take to make these things broadly useful:
* Drop support for older Python versions in services to reflect the supported runtimes [1], and
* Start testing libraries and clients against all upstream supported Python versions
In concrete terms, this would mean very little right now, since Python 3.8 is now EOL and Python 3.9 is in the supported runtimes list. However, in e.g. 2025.2, that would mean services would all drop support for Python 3.9 (assuming Rocky Linux 10 is a thing and we no longer test for Rocky Linux 9) while libraries and clients would continue to support it until 2026.1 which would coincide with the upstream EOL. In even more concrete terms, this would ideally involve a bot to publish the 'python_requires' bumps for services and splitting 'openstack-python3-jobs' into e.g. 'openstack-service-jobs' (covering only Python versions listed in the supported runtimes doc) and 'openstack-library-jobs' (covering all upstream supported Python versions, which will soon include 3.13).
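[Editor's note: a sketch of what the proposed split could look like as Zuul project-templates. The template names are the hypothetical ones from the message, not existing templates; the job names follow the existing 'openstack-tox-pyXX' pattern, and the version lists are illustrative.]

```yaml
- project-template:
    name: openstack-service-jobs
    check:
      jobs:
        # only the versions in the supported runtimes doc
        - openstack-tox-py39
        - openstack-tox-py312

- project-template:
    name: openstack-library-jobs
    check:
      jobs:
        # every Python version still supported upstream
        - openstack-tox-py39
        - openstack-tox-py310
        - openstack-tox-py311
        - openstack-tox-py312
        - openstack-tox-py313:
            voting: false
```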
I like the idea, and this way, we can make libs/clients more usable instead of making them uninstallable on older versions.
I think making them uninstallable on older versions is a feature, not a bug. If we didn't do this, then a user on e.g. Python 3.8 would need to manually figure out the last version of e.g. openstacksdk that would work on their environment rather than let pip figure it out for them. The trick here is to actually ensure python_requires maps to something realistic and to be careful in how we do the bumps vis-à-vis services vs. libraries.
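[Editor's note: to make the "let pip figure it out" point concrete, here is a toy model of how the Requires-Python metadata published by 'python_requires' steers version selection. This is not pip's real resolver, and the release list is hypothetical.]

```python
# Toy model (NOT pip's real resolver) of how Requires-Python metadata
# steers version selection. Release data below is hypothetical.

def parse(version):
    """'3.12' -> (3, 12), so versions compare numerically."""
    return tuple(int(part) for part in version.split("."))

def satisfies(interpreter, requires_python):
    # Handles only the '>=X.Y' form that OpenStack projects typically use.
    assert requires_python.startswith(">=")
    return parse(interpreter) >= parse(requires_python[2:])

# (version, Requires-Python) pairs, newest first -- hypothetical releases
RELEASES = [
    ("4.1.0", ">=3.9"),
    ("4.0.0", ">=3.9"),
    ("3.3.0", ">=3.8"),
]

def best_release(interpreter):
    """Return the newest release installable on the given interpreter."""
    for version, spec in RELEASES:
        if satisfies(interpreter, spec):
            return version
    return None  # nothing installable -> pip would report an error

print(best_release("3.8"))   # 3.3.0 -- old interpreter gets the last compatible release
print(best_release("3.12"))  # 4.1.0 -- current interpreter gets the newest
```

Without the metadata, the user on the old interpreter would instead get the newest release and hit runtime failures, which is the manual-pinning problem described above.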
One question, what do you mean by the "all upstream supported Python versions"? Do you mean supported Python versions of all supported stable branches (excluding the EOL one), not just current cycle testing runtime?
Not quite. I mean that libraries in 2024.2 would have been tested against Python 3.8 to Python 3.12, will be tested against Python 3.9 to Python 3.12 (and maybe 3.13 later in the cycle as non-voting) in 2025.1, and will be tested against Python 3.9 to Python 3.12 in 2025.2. Those are the Python versions that will be supported upstream for the duration of those releases. These versions should be encoded in stable branches too, so the stable/2024.2 branch will always test against Python 3.9 to Python 3.12.
Hope that helps, Stephen
-gmann
I think this would allow us to make 'python_requires' really meaningful, providing lots of upside and limited downside for end-users. It should not consume a lot of extra CI resources, since we would only be adding extra testing for libraries, which have much smaller, simpler test suites. It would also have the added advantage of highlighting issues with new Python versions for libraries before services need to start worrying about them.
Thoughts?
Stephen
---- On Thu, 03 Oct 2024 03:14:12 -0700 Stephen Finucane wrote ---
[snip]
Ah, so I'm not suggesting removing 'python_requires' from anywhere. The issue here is that our 'python_requires' don't mean anything because we don't test the oldest version, and that we haven't been doing this because we did the bump wrong previously. I'm suggesting that we avoid bumping it for libraries until after we've bumped all the services *and* that we keep testing services against the older Python versions. In 2024.2, that would mean we'd have been testing libraries against Python 3.8 to Python 3.12 (all voting), for example, while services would only have been testing against Python 3.9 to Python 3.11. In Zed, this would have meant testing libraries on all Python released from 3.7 to 3.10, while we only tested services on 3.8 and 3.10. By doing this, we ensure 'python_requires' means something and people can rely on it.
Ok, I get your idea now. Honestly, I do not think this plan will work. As Takashi mentioned, services are not removing support aggressively, and it is hard to finish the service-side work before the deadline. Even if we give them the same cycle deadline or the next one and make libs a little slower to drop the older version, the situation will be the same: we will be bumping the min version in libs and breaking the service gate. The py3.8 drop is the best example of this situation, where services have not dropped support and Oslo had to bump the min version because Python 3.8 is EOL and cannot be supported in Oslo anymore. That is the main reason I think dropping this flag from libs/clients will solve this problem. -gmann
On Thu, 2024-10-03 at 10:45 -0700, Ghanshyam Mann wrote:
[snip]
Ok, I get your idea now.
Honestly, I do not think this plan works fine. As Takashi mentioned, services are not removing the support aggressively, and it is hard to finish the service side work before the deadline. Even if we give them the same cycle deadline or the next one and make libs a little slow to drop the older version, the situation will be the same. We will be bumping the min version in libs, breaking the service gate. Py3.8 drop is the best example to see this situation where services have not dropped the support, and Oslo had to bump the min version as python 3.8 is EOL and cannot be supported in Oslo anymore. That is the main reason I think dropping this flag from libs/client will solve this problem.
Actually, from my perspective, when we drop a version from the testing runtime, support for it is dropped. Even if we had not implemented that in Nova or Placement, once we removed it from the testing runtimes I would have considered it unsupported, even if testing was still in place. But I do think we should be more proactive about updating the testing in the services to better match the runtimes. So regardless of whether we have 'python_requires' or not, I think being more aggressive about updating our testing to match the agreed testing runtimes is a good thing.
On Thu, 2024-10-03 at 10:45 -0700, Ghanshyam Mann wrote:
[snip]
Ok, I get your idea now.
Honestly, I do not think this plan works fine. As Takashi mentioned, services are not removing the support aggressively, and it is hard to finish the service side work before the deadline. Even if we give them the same cycle deadline or the next one and make libs a little slow to drop the older version, the situation will be the same. We will be bumping the min version in libs, breaking the service gate. Py3.8 drop is the best example to see this situation where services have not dropped the support, and Oslo had to bump the min version as python 3.8 is EOL and cannot be supported in Oslo anymore. That is the main reason I think dropping this flag from libs/client will solve this problem.
It sounds like the real issue lies with the services rather than the libraries: if the services were always updating their 'python_requires' to reflect the lower bound of our tested runtimes matrix, then we wouldn't have an issue, right? Perhaps we could make this slightly easier by auto-proposing bumps at the start of a new cycle (where appropriate)? That won't guarantee they'll get merged (you can bring a horse to water...), but at least it sends a signal around expectations for service-type projects and gives libraries cover as they do the same later? Stephen
-gmann
Instead of dropping 'python_requires', how about we start making better use of it to actually signal what we want to support? I see two step we could do to make these things broadly useful:
* Drop support for older Python versions in services to reflect the support runtimes [1], and * Start testing libraries and clients against all upstream supported Python versions
In concrete terms, this would mean very little right now, since Python 3.8 is now EOL and Python 3.9 is in the supported runtimes list. However, in e.g. 2025.2, that would mean services would all drop support for Python 3.9 (assuming Rocky Linux 10 is a thing and we no longer test for Rocky Linux 9) while libraries and clients would continue to support it until 2026.1 which would coincide with the upstream EOL. In even more concrete terms, this would ideally involve a bot to publish the 'python_requires' bumps for services and splitting 'openstack-python3-jobs' into e.g. 'openstack-service-jobs' (covering only Python versions listed in the supported runtimes doc) and 'openstack-library- jobs' (covering all upstream supported Python versions, which will soon include 3.13).
I like the idea, and this way, we can make libs/clients more usable instead of making them uninstallable on older versions.
I think making them uninstallable on older versions is a feature, not a bug. If we didn't do this, then a user on e.g. Python 3.8 would need to manually figure out the last version of e.g. openstacksdk that would work on their environment rather than let pip figure it out for them. The trick here is to actually ensure python_requires maps to something realistic and to be careful in how we do the bumps vis-à-vis services vs. libraries.
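The pip behaviour being described here can be sketched in a few lines. This is a toy model of the resolution that 'python_requires' enables, not pip's actual implementation, and the release data is hypothetical:

```python
# Toy model of how a resolver uses python_requires metadata: given the
# running interpreter, pick the newest release whose requirement admits it.
# The release list below is hypothetical, not real package data.

def pick_release(releases, interpreter):
    """Return the newest version whose python_requires admits interpreter."""
    def min_version(requires):
        # Parse only simple ">=X.Y" specifiers for this sketch
        return tuple(int(p) for p in requires.lstrip(">=").split("."))

    compatible = [
        (version, requires)
        for version, requires in releases
        if interpreter >= min_version(requires)
    ]
    return max(compatible)[0] if compatible else None

releases = [((2, 0, 0), ">=3.9"), ((1, 9, 0), ">=3.6")]
print(pick_release(releases, (3, 8)))   # a Python 3.8 user gets (1, 9, 0)
print(pick_release(releases, (3, 12)))  # a Python 3.12 user gets (2, 0, 0)
```

This is the "let pip figure it out" behaviour: an older interpreter silently falls back to the last compatible release instead of failing at runtime.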
One question, what do you mean by the "all upstream supported Python versions"? Do you mean supported Python versions of all supported stable branches (excluding the EOL one), not just current cycle testing runtime?
Not quite. I mean that libraries in 2024.2 would have been tested against Python 3.8 to Python 3.12, will be tested against Python 3.9 to Python 3.12 (and maybe 3.13 later in the cycle as non-voting) in 2025.1, and will be tested against Python 3.9 to Python 3.12 in 2025.2. Those are the Python versions that will be supported upstream for the duration of those releases. These versions should be encoded in stable branches too, so the stable/2024.2 branch will always test against Python 3.9 to Python 3.12.
Hope that helps, Stephen
-gmann
I think this would allow us to make 'python_requires' really meaningful, providing lots of upside and limited downside for end-users. It should not consume a lot of extra CI resources, since we would only be adding extra testing for libraries, which have much smaller, simpler test suites. It would also have the added advantage of highlighting issues with new Python versions for libraries before services need to start worrying about them.
Thoughts?
Stephen
On Fri, 2024-10-04 at 12:16 +0100, Stephen Finucane wrote:
Ok, I get your idea now.
Honestly, I do not think this plan will work. As Takashi mentioned, services are not removing support aggressively, and it is hard to finish the service-side work before the deadline. Even if we give them the same cycle deadline or the next one and make libs a little slower to drop the older version, the situation will be the same: we will be bumping the min version in libs, breaking the service gate. The py3.8 drop is the best example of this situation, where services had not dropped support and Oslo had to bump the min version as Python 3.8 is EOL and cannot be supported in Oslo anymore. That is the main reason I think dropping this flag from libs/clients will solve this problem.
It sounds like the real issue lies with the services, rather than the libraries: if the services were always updating their 'python_requires' to reflect the lower bound of our tested runtimes matrix then we wouldn't have an issue, right? Perhaps we could make this slightly easier by auto-proposing bumps of this at the start of a new cycle (where appropriate)?
Because of the SLURP lifecycle and the new yearly cadence of Python, that would be the start of every SLURP release, excluding releases where, for smooth upgrades, we need to support the previous Python. I.e. we should already require 3.9 for 2025.1. For 2026.1 we could move to 3.10; however, since we need to support upgrading from CentOS 9 Stream (3.9) to CentOS 10 Stream (3.12), we can't actually make 3.10 the minimum until 2026.2.
That won't guarantee that they'll get merged (you can bring a horse to water...) but at least it sends a signal around expectations for service-type projects and give libraries cover as they do the same later?
Stephen
On Fri, 2024-10-04 at 14:56 +0100, smooney@redhat.com wrote:
On Fri, 2024-10-04 at 12:16 +0100, Stephen Finucane wrote:
Ok, I get your idea now.
Honestly, I do not think this plan will work. As Takashi mentioned, services are not removing support aggressively, and it is hard to finish the service-side work before the deadline. Even if we give them the same cycle deadline or the next one and make libs a little slower to drop the older version, the situation will be the same: we will be bumping the min version in libs, breaking the service gate. The py3.8 drop is the best example of this situation, where services had not dropped support and Oslo had to bump the min version as Python 3.8 is EOL and cannot be supported in Oslo anymore. That is the main reason I think dropping this flag from libs/clients will solve this problem.
It sounds like the real issue lies with the services, rather than the libraries: if the services were always updating their 'python_requires' to reflect the lower bound of our tested runtimes matrix then we wouldn't have an issue, right? Perhaps we could make this slightly easier by auto-proposing bumps of this at the start of a new cycle (where appropriate)?
Because of the SLURP lifecycle and the new yearly cadence of Python, that would be the start of every SLURP release, excluding releases where, for smooth upgrades, we need to support the previous Python.
I.e. we should already require 3.9 for 2025.1. For 2026.1 we could move to 3.10; however, since we need to support upgrading from CentOS 9 Stream (3.9) to CentOS 10 Stream (3.12), we can't actually make 3.10 the minimum until 2026.2.
Eek, that would mean supporting 3.9 until October 2026, which would be over a year after upstream support ended [1]? I'd have expected that you would update the underlying OS before you'd update OpenStack running on top, though none of this is mentioned in the SLURP docs [2].
Stephen
[1] https://devguide.python.org/versions/
[2] https://docs.openstack.org/project-team-guide/release-cadence-adjustment.htm...
That won't guarantee that they'll get merged (you can bring a horse to water...) but at least it sends a signal around expectations for service-type projects and give libraries cover as they do the same later?
Stephen
On Fri, Oct 4, 2024, at 6:45 AM, Stephen Finucane wrote:
On Fri, 2024-10-04 at 14:56 +0100, smooney@redhat.com wrote:
I.e. we should already require 3.9 for 2025.1. For 2026.1 we could move to 3.10; however, since we need to support upgrading from CentOS 9 Stream (3.9) to CentOS 10 Stream (3.12), we can't actually make 3.10 the minimum until 2026.2.
Eek, that would mean supporting 3.9 until October 2026, which would be over a year after support was ended upstream [1]? I'd have expected that you would update the underlying OS before you'd update OpenStack running on top, though none of this is mentioned in the SLURP docs [2].
In theory 2025.1 will have Python 3.12 support [3]. That means those doing SLURP on centos/rhel/rocky/alma could follow an upgrade path like this:
1. 2024.1 + CentOS 9 Stream + Python 3.9
2. Upgrade to next SLURP release
3. 2025.1 + CentOS 9 Stream + Python 3.9
4. Upgrade operating system
5. 2025.1 + CentOS 10 Stream + Python 3.12
6. Upgrade to next SLURP release
7. 2026.1 + CentOS 10 Stream + Python 3.12
This means that 2025.1 should be the last release that needs to support 3.9 on this upgrade path. 2025.2 should be able to remove it a full year and two releases earlier than assumed above.
Stephen
[1] https://devguide.python.org/versions/
[2] https://docs.openstack.org/project-team-guide/release-cadence-adjustment.htm...
[3] https://governance.openstack.org/tc/reference/runtimes/2025.1.html
On 2024-10-04 12:16:57 +0100 (+0100), Stephen Finucane wrote: [...]
if the services were always updating their 'python_requires' to reflect the lower bound of our tested runtimes matrix then we wouldn't have an issue, right? [...]
I think it's orthogonal. Where problems mostly arise is when a dependency has increased its python_requires above the version of Python a CI job is set to use, but we attempt to apply a constraints file which insists on installing only the newer version even though that isn't suitable. Per my other reply, environment markers in the constraints file (or separate constraints files per interpreter minor version) would alleviate this. We experience the same exact problem with external dependencies, but when it comes to our own dependencies we've struggled to accept that we could deal with them in the same way.

Less often, it's happening in integration test jobs where python_requires has been increased on a branch of some project and a test for another project wants to install the current state of that branch from source for an older interpreter. Yes, coordinating the dropping of the affected jobs with a mass update to python_requires could help there, though we'd need better answers for how to deal with unresponsive reviewers in less-active projects (TC delegates someone to step in and approve those changes? ignore it and let those projects stay broken? mark the projects as officially inactive and drop them from the upcoming release?).

-- Jeremy Stanley
On Wed, 2 Oct 2024 at 20:14, Ghanshyam Mann <gmann@ghanshyammann.com> wrote:
---- On Wed, 02 Oct 2024 04:17:01 -0700 Stephen Finucane wrote ---
On Tue, 2024-10-01 at 10:41 -0700, Ghanshyam Mann wrote:
Hi All,
As you know, we have a flag 'python_requires' in setup.cfg which is pinned to the minimum python version supported[1]. Every time we upgrade our Python testing, we usually bump the minimum python version in setup.cfg and that prevents the project from being installed on lower Python versions. We do have a list of tested and supported Python versions in the setup.cfg classifiers, which is good and enough information to tell which Python versions are tested[2].
The 'python_requires' flag creates an issue (which has happened many times in the past) when a dependency like an oslo lib bumps the version, and it makes all projects testing old Python versions fail. I know we should remove the old Python versions tested from projects first, but somehow, we are all busy, and a coordinated effort is always missed.
IMO, we should remove this flag as it creates more issues than benefits. Opinions? It will also help users continue to install things on old Python versions if it is working for them.
[1] https://github.com/openstack/nova/blob/8c1a47c9cf6e1001fbefd6ff3b76314e39c81...
[2] https://github.com/openstack/nova/blob/8c1a47c9cf6e1001fbefd6ff3b76314e39c81...
-gmann
As has been said by others down thread, I think this is the wrong way to go. python_requires delivers value for end-users and if we're failing to make use of it then that's the real issue we need to address.
Our deliverables can be broadly divided into services (a.k.a. "applications"), libraries and clients (a.k.a. "libraries"). My understanding of dependency management is that you want to be strict in what you accept w.r.t. Python and dependency versions for an application (to keep the test matrix manageable), but you need to be significantly looser when it comes to libraries (so that they remain broadly useful). This is why things like Pipfile.lock files are encouraged for applications but not for libraries. Put another way, there's nothing wrong with saying we don't want to support installing Nova on e.g. Ubuntu 22.04, but we probably don't want to artificially limit a client like OSC or a library like debtcollector or stevedore in these environments. As I understand it, the issue we had previously (and the reason we've since been reluctant to bump 'python_requires') is that we've forgotten/ignored this rule and proceeded to make our *library* dependencies unnecessarily strict rather than focusing first on the services.
I think libs and clients were my main concern over removing 'python_requires', as those caused the pain of breaking service testing. I am ok with this proposal if we can remove this flag for our libraries and clients but keep it for services.
Actually I think that it is particularly for clients that we should NOT drop python_requires. End users may be using old distributions and following documentation that currently works, but relies on having python_requires for pip to do the right thing. Similarly, consider automation that is installing packages from PyPI. We don't want to break this – we've sometimes forgotten to bump python_requires in the past and it has been painful.
On Thu, Oct 3, 2024, at 12:43 PM, Pierre Riteau wrote:
On Wed, 2 Oct 2024 at 20:14, Ghanshyam Mann <gmann@ghanshyammann.com> wrote:
I think Libs and client were my main concern over removing the 'python_requires' as those caused the pain of breaking service testing. I am ok with this proposal if we can remove this flag for our library and clients but keep it for services.
Actually I think that it is particularly for clients that we should NOT drop python_requires.
End users may be using old distributions and following documentation that currently works, but relies on having python_requires for pip to do the right thing. Similarly, consider automation that is installing packages from PyPI. We don't want to break this – we've sometimes forgotten to bump python_requires in the past and it has been painful.
Agreed. The issues gmann is referring to have to do with situations where constraints and python_requires are out of sync. When that happens you get an unclear error message about no version available and a failure to install things even though you can clearly see that version is available. But we have to remember that constraints are largely an artifact of how we run our CI jobs, and we can address these issues there. Users in the real world won't have the necessary context to know what specific versions of things work with which specific versions of Python. We should continue to mark that with python_requires so that the tooling handles this automatically for them. Then we can figure out some way of avoiding conflicts between python_requires and constraints going forward. I think the proposals to phase things out in stages should work well at doing that.
On 2024-10-03 13:20:44 -0700 (-0700), Clark Boylan wrote: [...]
Then we can figure out some way of avoiding conflicts between python_requires and constraints going forward. I think the proposals to phase things out in stages should work well at doing that.
We've been putting python_version environment markers in our upper-constraints lists when necessary to work around this exact problem with external dependencies. Unless we want to switch to having separate version-specific constraints files, this is the way to reconcile python_requires and constraints for our own libraries too.
-- Jeremy Stanley
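As a sketch, constraints entries with such environment markers look like the following. The package name and version numbers here are hypothetical, chosen only to illustrate the marker syntax:

```
# Hypothetical upper-constraints.txt entries: pip evaluates the
# python_version environment marker and applies whichever pin matches
# the running interpreter, so older interpreters keep an installable pin.
examplelib===2.0.0;python_version>='3.9'
examplelib===1.9.0;python_version<'3.9'
```

With entries like these, a CI job on an older interpreter no longer hits the "no matching distribution" failure, because the constraint it is asked to honour is one that its interpreter can actually satisfy.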
participants (9)
- Artem Goncharov
- Clark Boylan
- Ghanshyam Mann
- Jeremy Stanley
- Pierre Riteau
- smooney@redhat.com
- Stephen Finucane
- Sławek Kapłoński
- Takashi Kajinami