[all][tc] Dropping lower-constraints testing from all projects
Hello Everyone,
You might have seen the discussion around dropping the lower constraints testing, as maintaining it has become more challenging than the value it currently provides.
A few of the ML threads around this discussion:
- http://lists.openstack.org/pipermail/openstack-discuss/2020-December/019521....
- http://lists.openstack.org/pipermail/openstack-discuss/2020-December/019390....
As Oslo and many other projects are dropping it or have already dropped it, we should decide this for all other projects as well; otherwise it can become even more challenging than it is currently.
We have not defined it in the PTI or the testing runtime, so it is always up to projects if they still want to keep it, but we should decide on a general recommendation here.
-gmann
Should it also be dropped from stable branches? eg in magnum, it blocks the gate for stable/victoria atm.
Cheers, Spyros
On 2021-01-06 19:22:48 +0100 (+0100), Spyros Trigazis wrote:
> Should it also be dropped from stable branches? eg in magnum, it blocks the gate for stable/victoria atm.
[...]
If it's broken, I'd say you have three choices:
1. fix the job on those branches
2. drop the job from those branches
3. EOL those branches
It's a bit soon to be considering #3 for stable/victoria, so you're really left with options #1 and #2 there.
---- On Wed, 06 Jan 2021 12:33:42 -0600 Jeremy Stanley fungi@yuggoth.org wrote ----
> On 2021-01-06 19:22:48 +0100 (+0100), Spyros Trigazis wrote:
> > Should it also be dropped from stable branches? eg in magnum, it blocks the gate for stable/victoria atm.
> [...]
> If it's broken, I'd say you have three choices:
> 1. fix the job on those branches
> 2. drop the job from those branches
> 3. EOL those branches
> It's a bit soon to be considering #3 for stable/victoria, so you're really left with options #1 and #2 there.
I would suggest dropping the job, as fixing it on stable branches can end up taking the same amount of effort as fixing it on master. If we keep fixing it on stable, we can instead fix it on master and backport, so it is better to drop it from all gates, including the stable branches.
-gmann
Let's decide on a topic and/or story to track it? eg lc_drop
Spyros
Hi,
On 1/6/21 6:59 PM, Ghanshyam Mann wrote:
> Hello Everyone,
> You might have seen the discussion around dropping the lower constraints testing, as maintaining it has become more challenging than the value it currently provides.
As a downstream distribution package maintainer, I see this as a major regression of the code quality that upstream is shipping. Without l-c tests, there's no assurance of the reality of a lower-bound dependency.
So then we're back to 5 years ago, when OpenStack artificially set very high lower bounds because we just didn't know...
Please don't do it.
Cheers,
Thomas Goirand (zigo)
On 2021-01-06 22:04:34 +0100 (+0100), Thomas Goirand wrote: [...]
> As a downstream distribution package maintainer, I see this as a major regression of the code quality that upstream is shipping. Without l-c tests, there's no assurance of the reality of a lower-bound dependency.
> So then we're back to 5 years ago, when OpenStack artificially set very high lower bounds because we just didn't know...
> Please don't do it.
The tidbit you're missing here is that we never actually had working lower-bounds checks. The recent update to make pip correctly confirm requested versions of packages get installed, which has caused these jobs to all fall over, proves that. So it's not a regression. I'm personally in favor of doing lower-bounds checking of our software, always have been, but nobody's done the work to correctly implement it. The old jobs we had punted on it, and now we can see clearly that they weren't actually testing what people thought. Properly calculating transitive dependency lower-bounds requires a modified dependency solver with consistent inverse sorting, and that doesn't presently exist.
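To make the failure mode concrete, a minimal sketch of the consistency check the 2020 resolver now enforces, using the packaging library that pip builds on; the package name and versions here are hypothetical stand-ins, not from this thread:

```python
# Hypothetical illustration: a lower-constraints pin that conflicts with a
# transitive requirement. Old pip quietly installed the pin anyway; the new
# resolver (pip >= 20.3) rejects the whole set as unsatisfiable instead.
from packaging.requirements import Requirement
from packaging.version import Version

pinned = Version("3.33.0")                      # our lower-constraints pin
transitive = Requirement("oslo.utils>=3.36.0")  # declared by another dep

# contains() is the per-package test the resolver enforces set-wide.
print(transitive.specifier.contains(pinned))    # False -> conflict
```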
---- On Wed, 06 Jan 2021 15:04:34 -0600 Thomas Goirand zigo@debian.org wrote ----
> Hi,
> On 1/6/21 6:59 PM, Ghanshyam Mann wrote:
> > Hello Everyone,
> > You might have seen the discussion around dropping the lower constraints testing, as maintaining it has become more challenging than the value it currently provides.
> As a downstream distribution package maintainer, I see this as a major regression of the code quality that upstream is shipping. Without l-c tests, there's no assurance of the reality of a lower-bound dependency.
> So then we're back to 5 years ago, when OpenStack artificially set very high lower bounds because we just didn't know...
Hi Zigo,
We discussed the usage of the l-c file among different packagers in the 14th Jan TC meeting[1];
can you confirm whether Debian directly depends on the l-c file and uses it, or whether it is just good for code quality if a project maintains it?
The packagers below do not use the l-c file and instead use u-c:
- RDO
- openstack-charms
- ubuntu
[1] http://eavesdrop.openstack.org/meetings/tc/2021/tc.2021-01-14-15.00.log.html...
-gmann
On Mon, Jan 18, 2021 at 7:24 PM Ghanshyam Mann gmann@ghanshyammann.com wrote:
> The packagers below do not use the l-c file and instead use u-c:
> - RDO
> - openstack-charms
> - ubuntu
FWIW, if we are including openstack-charms here, then I can also confirm the same for Kolla - we just use u-c.
-yoctozepto
On 1/18/21 7:23 PM, Ghanshyam Mann wrote:
> Hi Zigo,
> We discussed the usage of the l-c file among different packagers in the 14th Jan TC meeting[1];
> can you confirm whether Debian directly depends on the l-c file and uses it, or whether it is just good for code quality if a project maintains it?
> The packagers below do not use the l-c file and instead use u-c:
> - RDO
> - openstack-charms
> - ubuntu
> [1] http://eavesdrop.openstack.org/meetings/tc/2021/tc.2021-01-14-15.00.log.html...
> -gmann
Hi,
Of course, I'm using upper-constraints too, to try to package those versions as much as possible; however, the dependencies are expressed according to lower-constraints.
Let's say we have a dependency a that has ===2.1 in u-c, but 1.9 in l-c. I'll write:
Depends: a (>= 1.9)
but will try to get 2.1 in Debian. In the end, if 1.9 is in Debian stable backports, I may decide to skip the backporting work for 2.1, as the project is telling me that 1.9 works OK and that backporting would be useless work.
Does this make sense now?
Cheers,
Thomas Goirand (zigo)
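For readers following along, a minimal sketch of the translation rule zigo describes above, assuming simple one-specifier requirement lines and a naive python3- package-name mapping; the stable-version table is a hypothetical stand-in, not real Debian data:

```python
# Sketch of the requirements.txt -> debian/control translation: keep the
# upstream lower bound only when Debian stable is too old to satisfy it.
from packaging.requirements import Requirement
from packaging.version import Version

DEBIAN_STABLE = {"python3-a": Version("1.8")}  # assumed versions in stable


def depends_entry(req_line: str) -> str:
    req = Requirement(req_line)                # e.g. "a>=1.9"
    deb_name = "python3-" + req.name.lower()   # naive name mapping
    floors = [Version(s.version) for s in req.specifier if s.operator == ">="]
    floor = max(floors) if floors else None
    in_stable = DEBIAN_STABLE.get(deb_name)
    if floor and (in_stable is None or in_stable < floor):
        # Stable is too old: keep the bound, so a backport is required.
        return f"{deb_name} (>= {floor})"
    # Stable already satisfies the minimum: no versioned bound needed.
    return deb_name


print(depends_entry("a>=1.9"))  # -> python3-a (>= 1.9)
```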
On 2021-01-19 08:03:25 +0100 (+0100), Thomas Goirand wrote: [...]
> Of course, I'm using upper-constraints too, to try to package those versions as much as possible; however, the dependencies are expressed according to lower-constraints.
[...]
The same lower bounds would also typically be expressed in the requirements.txt file. Presumably you looked there before projects added lower-constraints.txt files? Given that complete lower bounds testing isn't feasible and the jobs we were running weren't actually correctly testing minimum versions of everything, these have always been a "best effort" assertion anyway.
I gather you run Tempest tests against your OpenStack packages on Debian already, so if a dependency there is too low you'll find out and can let the project maintainers know that their minimum version for that in requirements.txt isn't correct. Hopefully that doesn't come up very often, but for things we can't realistically test, getting notified by downstream distributors and users is the best feedback mechanism we can hope for.
---- On Tue, 19 Jan 2021 09:32:02 -0600 Jeremy Stanley fungi@yuggoth.org wrote ----
> On 2021-01-19 08:03:25 +0100 (+0100), Thomas Goirand wrote: [...]
> > Of course, I'm using upper-constraints too, to try to package those versions as much as possible; however, the dependencies are expressed according to lower-constraints.
> [...]
> The same lower bounds would also typically be expressed in the requirements.txt file. Presumably you looked there before projects added lower-constraints.txt files? Given that complete lower bounds testing isn't feasible and the jobs we were running weren't actually correctly testing minimum versions of everything, these have always been a "best effort" assertion anyway.
> I gather you run Tempest tests against your OpenStack packages on Debian already, so if a dependency there is too low you'll find out and can let the project maintainers know that their minimum version for that in requirements.txt isn't correct. Hopefully that doesn't come up very often, but for things we can't realistically test, getting notified by downstream distributors and users is the best feedback mechanism we can hope for.
Yeah, in requirements.txt we always have a lower bound for deps, and we do not update it or sync it with u-c. We will not be testing those as such, but as Jeremy mentioned, if there is some wrong lower bound then we can fix it quickly.
Usually, on every new feature or interface dependency, we do bump that lower bound in requirements.txt. We usually check whether anything new we are using is reflected in this file.
-gmann
Hi Jeremy,
Thanks for your reply.
On 1/19/21 4:32 PM, Jeremy Stanley wrote:
> On 2021-01-19 08:03:25 +0100 (+0100), Thomas Goirand wrote: [...]
> > Of course, I'm using upper-constraints too, to try to package those versions as much as possible; however, the dependencies are expressed according to lower-constraints.
> [...]
> The same lower bounds would also typically be expressed in the requirements.txt file. Presumably you looked there before projects added lower-constraints.txt files?
I should rephrase. Yes, I'm looking into requirements.txt to translate this into the dependencies in debian/control. The exact workflow is to compare what's in requirements.txt with what's in the current Debian Stable. If the version is satisfied in Debian Stable, I just don't express any lower bound. If the dependency in Debian Stable isn't high enough, I write whatever upstream wrote as the minimum version in debian/control, meaning that it will need a version backported to Debian Stable to run, or the version in Debian Testing/Unstable.
I do expect it to be correct in requirements.txt, and as we always say in the OpenStack world "if it's not tested it's broken"... Which is what bothers me here.
> Given that complete lower bounds testing isn't feasible and the jobs we were running weren't actually correctly testing minimum versions of everything, these have always been a "best effort" assertion anyway.
Correct, though $topic for this thread is "let's throw the baby out with the bath water"^W^W^W^W "Let's stop testing from all projects". :)
If $topic was "let's relax testing on l-c" or "can we find a solution", you'd have my full acceptance, as I can only agree that there's only so much one can do in a day of work, and that our head count is shrinking. At the same time, though, I have a hard time understanding a general call for removing useful tests.
In the end, I'm not the person who will be maintaining these tests. That's not my role, and I simply wouldn't be able to do that upstream work on the 500+ packages that I maintain (nearly) alone. Though it's my duty to warn the community about the consequences it may have downstream.
> I gather you run Tempest tests against your OpenStack packages on Debian already
I'm trying to find the time to do it, but until I have a full CI up and running (which isn't fully the case yet for my installer), it's still a manual and painful process. :/
I'm close to having such a CI up and running, though with my current setup, it demands a lot of resources (i.e. a machine with 256GB of RAM).
Contributions would be very much appreciated.
> so if a dependency there is too low you'll find out and can let the project maintainers know that their minimum version for that in requirements.txt isn't correct.
It happened numerous times that I filed such bug reports. Which proves two things:
- that it's not tested (enough)
- that testing would be useful
> Hopefully that doesn't come up very often
It used to happen about 5 or 6 times per release. People who have worked on a project for long enough can probably remember me asking about failed unit tests, and being told to upgrade this or that.
This type of trouble may mean spending 2 or 3 days not understanding what's happening, until someone on IRC who knows the project well enough finds it in 5 minutes.
Since l-c has been tested, I can't remember finding such a problem that wasn't my own fault, so it was a huge improvement.
One also has to keep in mind that, if on a single release I can find a wrong lower bound in a requirements.txt 5 or 6 times, this probably means there are 10 times more wrong lower bounds really in there. I don't see most problems, because I don't test lower bounds myself, and I try to package as much as possible what's in u-c. So I only bump into something when I forgot to upgrade *AND* upstream has a wrong lower bound.
> but for things we can't realistically test, getting notified by downstream distributors and users is the best feedback mechanism we can hope for.
Something I don't understand: why can't we use an older version of pip, if the problem is the newer pip resolver? Or can't the current pip be patched to fix things? It's not as if there was no prior art... Maybe I'm missing the big picture?
On 1/19/21 7:16 PM, Ghanshyam Mann wrote:
> Yeah, in requirements.txt we always have a lower bound for deps, and we do not update it or sync it with u-c. We will not be testing those as such, but as Jeremy mentioned, if there is some wrong lower bound then we can fix it quickly.
> Usually, on every new feature or interface dependency, we do bump that lower bound in requirements.txt. We usually check whether anything new we are using is reflected in this file.
From a downstream distribution package maintainer's point of view, having an upstream that does that work is just super nice, and rare. Though the manual process you describe above is far from trivial and very error-prone, unfortunately. And this isn't specific to OpenStack, of course.
Cheers,
Thomas Goirand (zigo)
On 2021-01-20 00:09:39 +0100 (+0100), Thomas Goirand wrote: [...]
> Something I don't understand: why can't we use an older version of pip, if the problem is the newer pip resolver? Or can't the current pip be patched to fix things? It's not as if there was no prior art... Maybe I'm missing the big picture?
[...]
To get to the heart of the matter, when using older versions of pip it was just quietly installing different versions of packages than we asked it to, and versions of transitive dependencies which directly conflicted with the versions other dependencies said they required. When pip finally (very recently) implemented a coherent dependency solver, it started alerting us directly to this fact. We could certainly find a way to hide our heads in the sand and go back to testing with old pip and pretending we knew what was being tested there, but the question is whether what we were actually testing that way was worthwhile enough to try to continue doing it, now that we have proof it wasn't what we were wanting to test.
The challenge with actually testing what we wanted has always been that there's many hundreds of packages we depend on and, short of writing one ourselves, no tool available to find a coherent set of versions of them which satisfy the collective lower bounds. The way pip works, it wants to always solve for the newest possible versions which satisfy an aggregate set of version ranges, and what we'd want for lower bounds checking is the inverse of that.
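As an illustration of how one-sided the tooling is, the "best effort" half of the problem (pinning declared direct minimums) is nearly trivial; a minimal sketch follows, assuming plain "name>=X" requirement lines. Everything beyond it, i.e. solving the transitive set downward, is the part with no tool:

```python
# Minimal sketch: flip each declared minimum in requirements.txt into an
# exact pin, which is roughly what a lower-constraints file asserts. This
# covers direct dependencies only; nothing here solves transitive lower
# bounds, which is the missing "inverse resolver" described above.
import pathlib
import sys

from packaging.requirements import Requirement
from packaging.version import Version


def lower_pins(req_file: str) -> list[str]:
    pins = []
    for raw in pathlib.Path(req_file).read_text().splitlines():
        line = raw.split("#")[0].strip()   # drop comments and blank lines
        if not line:
            continue
        req = Requirement(line)            # assumes plain specifier lines
        floors = [s.version for s in req.specifier if s.operator == ">="]
        if floors:
            pins.append(f"{req.name}=={min(floors, key=Version)}")
    return pins


if __name__ == "__main__":
    print("\n".join(lower_pins(sys.argv[1])))
```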
Hi,
For Windows related projects such as os-win and networking-hyperv, we decided to keep the lower constraints job but remove indirect dependencies from the lower-constraints.txt file.
This made it much easier to maintain and it allows us to at least cover direct dependencies. I suggest considering this approach instead of completely dropping the lower constraints job, whenever possible. Another option might be to make it non-voting while it’s getting fixed.
Lucian Petrut
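A rough sketch of the pruning Lucian describes, assuming one specifier per line and ignoring setup.cfg extras for brevity: keep only the lower-constraints entries whose names appear in the requirements files.

```python
# Prune lower-constraints.txt down to direct dependencies: anything not
# named in requirements.txt/test-requirements.txt is treated as indirect
# and dropped. Simplified; real files may carry markers, comments, extras.
import pathlib

from packaging.requirements import Requirement
from packaging.utils import canonicalize_name


def names(path: str) -> set[str]:
    out = set()
    for raw in pathlib.Path(path).read_text().splitlines():
        line = raw.split("#")[0].strip()
        if line:
            out.add(canonicalize_name(Requirement(line).name))
    return out


direct = names("requirements.txt") | names("test-requirements.txt")
kept = [
    line for line in pathlib.Path("lower-constraints.txt").read_text().splitlines()
    if line.strip() and canonicalize_name(Requirement(line).name) in direct
]
pathlib.Path("lower-constraints.txt").write_text("\n".join(kept) + "\n")
```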
On 1/20/21 8:26 AM, Lucian Petrut wrote:
> Hi,
> For Windows related projects such as os-win and networking-hyperv, we decided to keep the lower constraints job but remove indirect dependencies from the lower-constraints.txt file.
> This made it much easier to maintain and it allows us to at least cover direct dependencies. I suggest considering this approach instead of completely dropping the lower constraints job, whenever possible. Another option might be to make it non-voting while it’s getting fixed.
> Lucian Petrut
Hi,
If this could be done, it'd be very nice already.
Cheers,
Thomas Goirand (zigo)
On Wed, 2021-01-20 at 07:26 +0000, Lucian Petrut wrote:
> Hi,
> For Windows related projects such as os-win and networking-hyperv, we decided to keep the lower constraints job but remove indirect dependencies from the lower-constraints.txt file.
> This made it much easier to maintain and it allows us to at least cover direct dependencies. I suggest considering this approach instead of completely dropping the lower constraints job, whenever possible. Another option might be to make it non-voting while it’s getting fixed.
> Lucian Petrut
Yes, I've looked into doing this elsewhere (as promised) and it seems to do the job quite nicely. It's not perfect, but it does seem to be "good enough" and captures basic things like "I depend on this function found in oslo.foo vX.Y and forgot to bump my minimum version to reflect this". I think these jobs probably offer _more_ value now than they did in the past, given pip is now finally honouring the explicit constraints we express in these files, so I would be in favour of this approach rather than dropping l-c entirely. I do realize that there is some degree of effort here in getting e.g. all the oslo projects fixed, but I'm happy to help out with this and have already fixed quite a few projects.

I also wouldn't be opposed to dropping l-c on *stable* branches so long as we maintain it for master, on the basis that they were already broken so nothing is really changing. Sticking to older, admittedly broken versions of pip for stable branches is another option and might help us avoid a deluge of "remove/fix l-c" patches for stable branches, but I don't know how practical that is?
Stephen
On Wed, Jan 20, 2021 at 12:14 PM Stephen Finucane stephenfin@redhat.com wrote:
Hmm, I agree with this approach. Sounds quite sane.
I have a related question - do you have a tool to recommend that would check whether all modules used directly by the project are already in requirements.txt? I.e. that there are no directly-used modules that are actually pulled in as indirect dependencies? That would improve the proposed approach as well as the general state of requirements.
-yoctozepto
On 2021-01-20 12:29:44 +0100 (+0100), Radosław Piliszek wrote: [...]
> I have a related question - do you have a tool to recommend that would check whether all modules used directly by the project are already in requirements.txt? I.e. that there are no directly-used modules that are actually pulled in as indirect dependencies? That would improve the proposed approach as well as the general state of requirements.
I worked on this problem with r1chardj0n3s at an Infra team get-together circa mid-2014, after Nova unexpectedly broke when a declared dependency dropped one of its own dependencies which Nova had at some point started directly importing without remembering to also declare it in requirements.txt. I can't take credit, he did all the real work on it, but we ended up not getting it added as a common linter because it reused private internals of pip which later evaporated.
It looks like it was actively adopted and resurrected by a new author six months ago, so may be worth revisiting:
https://pypi.org/project/pip-check-reqs/
FWIW, I still think it's fundamentally a good idea.
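For a sense of what such a linter does, a naive sketch follows. pip-check-reqs itself maps imports to installed distributions properly; this one just compares top-level names, so stdlib modules and name mismatches (e.g. oslo.config vs oslo_config) will show up as noise:

```python
# Naive "imported but not declared" check: walk a package's Python files,
# collect top-level absolute imports, and diff against requirements.txt.
import ast
import pathlib
import sys

from packaging.requirements import Requirement


def imported_top_levels(pkg_dir: str) -> set[str]:
    found = set()
    for path in pathlib.Path(pkg_dir).rglob("*.py"):
        tree = ast.parse(path.read_text(), filename=str(path))
        for node in ast.walk(tree):
            if isinstance(node, ast.Import):
                found.update(a.name.split(".")[0] for a in node.names)
            elif isinstance(node, ast.ImportFrom) and node.module and not node.level:
                found.add(node.module.split(".")[0])
    return found


def declared(req_file: str) -> set[str]:
    out = set()
    for raw in pathlib.Path(req_file).read_text().splitlines():
        line = raw.split("#")[0].strip()
        if line:
            # crude normalization: dist names use "-", import names use "_"
            out.add(Requirement(line).name.lower().replace("-", "_"))
    return out


if __name__ == "__main__":
    missing = imported_top_levels(sys.argv[1]) - declared(sys.argv[2])
    print("imported but not declared:", sorted(missing))
```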
On 1/20/21 12:13 PM, Stephen Finucane wrote:
> On Wed, 2021-01-20 at 07:26 +0000, Lucian Petrut wrote:
> > Hi, For Windows related projects such as os-win and networking-hyperv, we decided to keep the lower constraints job but remove indirect dependencies from the lower-constraints.txt file. [...]
> Yes, I've looked into doing this elsewhere (as promised) and it seems to do the job quite nicely. [...] I also wouldn't be opposed to dropping l-c on *stable* branches
+1. I don't really mind for things already released and already proven to work, though please everyone: take care when backporting patches.
Thanks a lot for your proposal. :)
Cheers,
Thomas Goirand (zigo)
---- On Wed, 20 Jan 2021 05:13:39 -0600 Stephen Finucane stephenfin@redhat.com wrote ----
> On Wed, 2021-01-20 at 07:26 +0000, Lucian Petrut wrote:
> > Hi,
> > For Windows related projects such as os-win and networking-hyperv, we decided to keep the lower constraints job but remove indirect dependencies from the lower-constraints.txt file.
> > This made it much easier to maintain and it allows us to at least cover direct dependencies. I suggest considering this approach instead of completely dropping the lower constraints job, whenever possible. Another option might be to make it non-voting while it’s getting fixed.
> > Lucian Petrut
> Yes, I've looked into doing this elsewhere (as promised) and it seems to do the job quite nicely. It's not perfect, but it does seem to be "good enough" and captures basic things like "I depend on this function found in oslo.foo vX.Y and forgot to bump my minimum version to reflect this". I think these jobs probably offer _more_ value now than they did in the past, given pip is now finally honouring the explicit constraints we express in these files, so I would be in favour of this approach rather than dropping l-c entirely. I do realize that there is some degree of effort here in getting e.g. all the oslo projects fixed, but I'm happy to help out with this and have already fixed quite a few projects. I
I thought Oslo did drop it instead of fixing all the failing l-c jobs? Maybe I am missing something or misreading it?
> also wouldn't be opposed to dropping l-c on *stable* branches so long as we maintain it for master, on the basis that they were already broken so nothing is really changing. Sticking to older, admittedly broken versions of pip for stable branches is another option and might help us avoid a deluge of "remove/fix l-c" patches for stable branches, but I don't know how practical that is?
I agree on the point about dropping it on stable to make stable maintenance easy. But I think making/keeping jobs n-v is very dangerous, and it can easily turn into 'false information'. The n-v concept was to keep failing/starting jobs n-v temporarily and, once they are fixed/stable, make them voting. I do not think keeping any job n-v permanently is a good approach.
I am still not convinced how 'removing indirect deps from l-c' makes 'knowing the lower bounds of OpenStack packages' better. I think it makes it less informative than it is currently. How will we know the lower bound for indirect deps? Don't packagers need those, or can they go with their u-c? If so, then why not for direct deps too?
In general, my take here as an upstream maintainer is that we should ship things completely tested, which serve the complete planned mission. We should not ship/commit anything half baked; we can keep such things open as a TODO in case anyone volunteers to fix them.
-gmann
On 1/20/21 6:42 PM, Ghanshyam Mann wrote:
> I am still not convinced how 'removing indirect deps from l-c' makes 'knowing the lower bounds of OpenStack packages' better. I think it makes it less informative than it is currently. How will we know the lower bound for indirect deps?
I do not expect OpenStack upstream to solve that problem.
> Don't packagers need those
We may, but this is studied for each direct dependency, one by one. I see no point in trying to solve indirect dependency version bounds, as it's up to each direct dependency to test them.
> In general, my take here as an upstream maintainer is that we should ship things completely tested, which serve the complete planned mission. We should not ship/commit anything half baked; we can keep such things open as a TODO in case anyone volunteers to fix them.
What was completely wrong was, a few years ago, shipping artificially inflated lower bounds, like expressing that each and every project needed the very latest version of every oslo library, which was obviously not the case.
The lower bound testing was trying to address this. Not indirect dependencies, which IMO are completely out of scope. However, when testing a lower bound for a direct dependency, you may need a lower version of an indirect dependency, and that's where it becomes tricky.
Cheers,
Thomas Goirand (zigo)
On Wed, 2021-01-20 at 11:42 -0600, Ghanshyam Mann wrote:
> ---- On Wed, 20 Jan 2021 05:13:39 -0600 Stephen Finucane stephenfin@redhat.com wrote ----
> [...]
> I thought Oslo did drop it instead of fixing all the failing l-c jobs? Maybe I am missing something or misreading it?
It's been proposed but nothing is merged, pending discussions.
> > also wouldn't be opposed to dropping l-c on *stable* branches so long as we maintain it for master, on the basis that they were already broken so nothing is really changing. Sticking to older, admittedly broken versions of pip for stable branches is another option and might help us avoid a deluge of "remove/fix l-c" patches for stable branches, but I don't know how practical that is?
> I agree on the point about dropping it on stable to make stable maintenance easy. But I think making/keeping jobs n-v is very dangerous, and it can easily turn into 'false information'. The n-v concept was to keep failing/starting jobs n-v temporarily and, once they are fixed/stable, make them voting. I do not think keeping any job n-v permanently is a good approach.
I agree non-voting only makes sense if you plan to fix it at a later date. If not, you should remove it.
> I am still not convinced how 'removing indirect deps from l-c' makes 'knowing the lower bounds of OpenStack packages' better. I think it makes it less informative than it is currently. How will we know the lower bound for indirect deps? Don't packagers need those, or can they go with their u-c? If so, then why not for direct deps too?
What we have doesn't work, and direct dependencies are the only things we can truly control. In the scenario you're suggesting, not only do we need to track dependencies, but we also need to track the dependencies of dependencies, and the dependencies of the dependencies of the dependencies, and so on down the rabbit hole. For each of these indirect dependencies, of which there may be many, we need to figure out the minimum version manually, because as has been noted many times there is no standardized machinery in place in pip etc. to find (and test) the minimum dependency versions supported by a package. Put another way, if we depend on package foo, which depends on package bar, which depends on package baz, we can state our own informed minimum version for foo, but we will need to inspect foo to find a minimum version of bar that is suitable, and we will need to inspect bar to find a minimum version of baz that is suitable. An impossible ask.
> In general, my take here as an upstream maintainer is that we should ship things completely tested, which serve the complete planned mission. We should not ship/commit anything half baked; we can keep such things open as a TODO in case anyone volunteers to fix them.
Maintaining l-c for direct dependencies on all OpenStack projects would mean we can at least guarantee that these packages have been tested with their supposed minimum versions. Considering that for a package like nova at least 1/4 of the dependencies are "OpenStack-backed", this is no small deal. These jobs encourage us to ensure these minimums still make sense and to correct things if not. As noted previously, they're not perfect, but they still provide a service that we won't have if we simply delete this machinery entirely.
Stephen
On 2021-01-21 09:30:19 +0000 (+0000), Stephen Finucane wrote: [...]
> What we have doesn't work, and direct dependencies are the only things we can truly control. In the scenario you're suggesting, not only do we need to track dependencies, but we also need to track the dependencies of dependencies, and the dependencies of the dependencies of the dependencies, and so on down the rabbit hole. For each of these indirect dependencies, of which there may be many, we need to figure out the minimum version manually, because as has been noted many times there is no standardized machinery in place in pip etc. to find (and test) the minimum dependency versions supported by a package. Put another way, if we depend on package foo, which depends on package bar, which depends on package baz, we can state our own informed minimum version for foo, but we will need to inspect foo to find a minimum version of bar that is suitable, and we will need to inspect bar to find a minimum version of baz that is suitable. An impossible ask.
[...]
Where this begins to fall apart, as I mentioned earlier, is that the larger your transitive dependency set, the more likely it is that a direct dependency is *also* an indirect dependency (maybe many layers down). If a dependency of your dependency updates to a version which insists on a newer version of some other direct dependency of yours than what you've set in lower-constraints.txt, then your jobs are going to break and need lower bounds adjustments or additional indirect dependencies added to the lower-constraints.txt to roll them back to versions which worked with the others you've set. Unlike upper-constraints.txt where it's assumed that a complete transitive set of dependencies is covered, this will mean additional churn in your stable branches over time.
Or is the idea that we would only ever do lower bounds checking on the release under development, and then remove those jobs when we branch?
On Thu, 2021-01-21 at 14:50 +0000, Jeremy Stanley wrote:
> On 2021-01-21 09:30:19 +0000 (+0000), Stephen Finucane wrote: [...]
> > What we have doesn't work, and direct dependencies are the only things we can truly control. [...]
> [...]
> Where this begins to fall apart, as I mentioned earlier, is that the larger your transitive dependency set, the more likely it is that a direct dependency is *also* an indirect dependency (maybe many layers down). If a dependency of your dependency updates to a version which insists on a newer version of some other direct dependency of yours than what you've set in lower-constraints.txt, then your jobs are going to break and need lower bounds adjustments or additional indirect dependencies added to the lower-constraints.txt to roll them back to versions which worked with the others you've set. Unlike upper-constraints.txt where it's assumed that a complete transitive set of dependencies is covered, this will mean additional churn in your stable branches over time.
Ah, I didn't grasp this particular point when you initially raised it. I've spent some time playing around with pip (there's a fun code base) to no avail, so this (the reduced dependency set) is still the best idea I've got, flaws and all. With that said, our 'upper-constraints.txt' should still capture all dependencies, including those indirect ones, no? As such, none of those should be increasing on stable branches, which means the set of transitive dependencies should remain fixed once the branch is cut?
> Or is the idea that we would only ever do lower bounds checking on the release under development, and then remove those jobs when we branch?
This is also an option if it proves to be particularly painful. It does feel like this would indicate a failure of our upper-constraints to cap a dependency, even if it's indirect, but I realize there are limits to what we can achieve here.
Stephen
On 2021-01-27 10:00:14 +0000 (+0000), Stephen Finucane wrote:
> On Thu, 2021-01-21 at 14:50 +0000, Jeremy Stanley wrote:
> > On 2021-01-21 09:30:19 +0000 (+0000), Stephen Finucane wrote: [...]
> > > What we have doesn't work, and direct dependencies are the only things we can truly control. [...]
> > [...]
> > Where this begins to fall apart, as I mentioned earlier, is that the larger your transitive dependency set, the more likely it is that a direct dependency is *also* an indirect dependency (maybe many layers down). [...]
> Ah, I didn't grasp this particular point when you initially raised it. I've spent some time playing around with pip (there's a fun code base) to no avail, so this (the reduced dependency set) is still the best idea I've got, flaws and all. With that said, our 'upper-constraints.txt' should still capture all dependencies, including the indirect ones, no? As such, none of those should be increasing on stable branches, which means the set of transitive dependencies should remain fixed once the branch is cut?
> > Or is the idea that we would only ever do lower bounds checking on the release under development, and then remove those jobs when we branch?
> This is also an option if it proves to be particularly painful. It does feel like this would indicate a failure of our upper-constraints to cap a dependency, even if it's indirect, but I realize there are limits to what we can achieve here.
Well, that's perhaps an option: you're basically suggesting combining the complete global upper-constraints.txt from openstack/requirements with an incomplete local lower-constraints.txt in each project. That should allow you to have a complete transitive set, though how you would go about integrating them would still need to be determined.
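One way to read that suggestion, as a hedged sketch assuming plain "name===ver"/"name==ver" constraint lines: take the full upper-constraints set and override just the entries a project's partial lower-constraints file pins, so the transitive remainder stays capped.

```python
# Combine the complete global upper-constraints with a project's partial
# lower-constraints: local lower pins win, everything else keeps the u-c
# pin, yielding a complete constraint set for pip. Parsing is simplified.
import pathlib

from packaging.requirements import Requirement
from packaging.utils import canonicalize_name


def pins(path: str) -> dict[str, str]:
    out = {}
    for raw in pathlib.Path(path).read_text().splitlines():
        line = raw.split("#")[0].strip()
        if line:
            out[canonicalize_name(Requirement(line).name)] = line
    return out


merged = pins("upper-constraints.txt")        # complete transitive set
merged.update(pins("lower-constraints.txt"))  # local lower pins override
pathlib.Path("merged-constraints.txt").write_text("\n".join(merged.values()) + "\n")
```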
On 2021-01-20 11:13:39 +0000 (+0000), Stephen Finucane wrote: [...]
> I also wouldn't be opposed to dropping l-c on *stable* branches so long as we maintain it for master, on the basis that they were already broken so nothing is really changing.
[...]
The main proposal was for dropping them from stable branches of most projects due to the complexities of cascading dependencies between stable point releases of interdependent projects with essentially frozen requirements. I also think we should be okay if some teams don't feel they have time to fix or maintain master branch testing of lower bounds.
> Sticking to older, admittedly broken versions of pip for stable branches is another option and might help us avoid a deluge of "remove/fix l-c" patches for stable branches, but I don't know how practical that is?
[...]
Most of our testing is wrapped by tox, and the version of pip used depends on what's vendored into the version of virtualenv automatically pulled in by tox. In short, tox wants to use the latest available virtualenv (and thus also pip, setuptools, wheel...). Controlling this is doable, but nontrivial and a bit fragile.
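For anyone who wants to try it anyway, a hedged sketch of the knobs involved; the version numbers are illustrative, and this is exactly the fragile part: which pip you get is decided by whichever virtualenv tox provisions.

```ini
# tox.ini sketch: ask tox (>= 3.2) to provision itself with an older
# virtualenv, which in turn seeds the pre-2020-resolver pip it vendors.
# virtualenv also honours a VIRTUALENV_PIP environment variable for the
# seeded pip version, typically exported from the CI job environment.
[tox]
minversion = 3.2
requires =
    virtualenv < 20.2  # illustrative cap: a release vendoring old pip
```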
On 2021-01-20 07:26:05 +0000 (+0000), Lucian Petrut wrote:
> For Windows related projects such as os-win and networking-hyperv, we decided to keep the lower constraints job but remove indirect dependencies from the lower-constraints.txt file.
> This made it much easier to maintain and it allows us to at least cover direct dependencies. I suggest considering this approach instead of completely dropping the lower constraints job, whenever possible. Another option might be to make it non-voting while it’s getting fixed.
[...]
The fewer dependencies a project has, the easier this becomes. I'm not against projects continuing to do it if they can get it to work, but wouldn't want us to pressure folks to spend undue effort on it when they already have a lot more on their plates. I can understand where for projects with a very large set of direct dependencies this still has the problem that your stated minimums may conflict with (semi-circular) dependencies declared elsewhere in the transitive dependency set outside your lower-constraints.txt/requirements.txt file.
---- On Wed, 20 Jan 2021 16:45:19 -0600 Jeremy Stanley fungi@yuggoth.org wrote ----
> On 2021-01-20 07:26:05 +0000 (+0000), Lucian Petrut wrote:
> > For Windows related projects such as os-win and networking-hyperv, we decided to keep the lower constraints job but remove indirect dependencies from the lower-constraints.txt file.
> > This made it much easier to maintain and it allows us to at least cover direct dependencies. I suggest considering this approach instead of completely dropping the lower constraints job, whenever possible. Another option might be to make it non-voting while it’s getting fixed.
> [...]
> The fewer dependencies a project has, the easier this becomes. I'm not against projects continuing to do it if they can get it to work, but wouldn't want us to pressure folks to spend undue effort on it when they already have a lot more on their plates. I can understand where for projects with a very large set of direct dependencies this still has the problem that your stated minimums may conflict with (semi-circular) dependencies declared elsewhere in the transitive dependency set outside your lower-constraints.txt/requirements.txt file.
I tried the direct-deps-only approach with cinder, nova and placement, by keeping only the deps that we have in requirements.txt, test-requirements.txt, and 'extras' in setup.cfg. We need all three to pass the requirement-check job, as it checks for all those deps to be in lower-constraints.txt:
- https://review.opendev.org/q/topic:%22l-c-direct-deps-only%22+(status:open%2...)
Some nova tests are failing, which might need some deps version bumps, but overall it seems direct deps work fine and this removes a lot of deps from the l-c file (77 from nova, 64 from cinder). With that testing, I am OK with the proposal now (in my experience with the community goals effort, I spent 50% of the time on fixing indirect deps).
I am summarizing the discussion and earlier proposal below; please let us know if that works for everyone and, accordingly, we can take the next step to document this somewhere and projects can start working on it.
- Only keep direct deps in lower-constraints.txt
- Remove the lower constraints testing from all stable branches.
-gmann
On 2021-02-17 18:49:53 -0600 (-0600), Ghanshyam Mann wrote:
> I am summarizing the discussion and earlier proposal below; please let us know if that works for everyone and, accordingly, we can take the next step to document this somewhere and projects can start working on it.
> - Only keep direct deps in lower-constraints.txt
> - Remove the lower constraints testing from all stable branches.
And proactively remove lower-constraints jobs *when* branching as well, since your unconstrained transitive dependencies will necessarily drift over time (unlike how jobs relying on upper-constraints work).
Also, the lower-constraints jobs remain optional: if teams find the hassle of maintaining them outweighs the testing benefits they see, then they should feel free to stop running them.
---- On Thu, 18 Feb 2021 08:52:52 -0600 Jeremy Stanley fungi@yuggoth.org wrote ----
> On 2021-02-17 18:49:53 -0600 (-0600), Ghanshyam Mann wrote:
> > I am summarizing the discussion and earlier proposal below; please let us know if that works for everyone and, accordingly, we can take the next step to document this somewhere and projects can start working on it.
> > - Only keep direct deps in lower-constraints.txt
> > - Remove the lower constraints testing from all stable branches.
> And proactively remove lower-constraints jobs *when* branching as well, since your unconstrained transitive dependencies will necessarily drift over time (unlike how jobs relying on upper-constraints work).
+1, this is something we could do in the release bot, maybe?
> Also, the lower-constraints jobs remain optional: if teams find the hassle of maintaining them outweighs the testing benefits they see, then they should feel free to stop running them.
Indeed, any extra testing as per team interest and bandwidth is always welcome and helpful.
-gmann
On 2/17/21 7:49 PM, Ghanshyam Mann wrote:
---- On Wed, 20 Jan 2021 16:45:19 -0600 Jeremy Stanley fungi@yuggoth.org wrote ----
On 2021-01-20 07:26:05 +0000 (+0000), Lucian Petrut wrote:
For Windows related projects such as os-win and networking-hyperv, we decided to keep the lower constraints job but remove indirect dependencies from the lower-constraints.txt file.
This made it much easier to maintain and it allows us to at least cover direct dependencies. I suggest considering this approach instead of completely dropping the lower constraints job, whenever possible. Another option might be to make it non-voting while it’s getting fixed.
[...]
The fewer dependencies a project has, the easier this becomes. I'm not against projects continuing to do it if they can get it to work, but I wouldn't want us to pressure folks to spend undue effort on it when they already have a lot more on their plates. I can see how, for projects with a very large set of direct dependencies, this still has the problem that your stated minimums may conflict with (semi-circular) dependencies declared elsewhere in the transitive dependency set, outside your lower-constraints.txt/requirements.txt file.
I tried the direct-deps-only approach with cinder, nova, and placement, keeping only the deps we have in requirements.txt, test-requirements.txt, and 'extras' in setup.cfg. We need all three of those to pass the requirement-check job, as it checks that all of those deps are present in lower-constraints.txt.
Some nova tests are failing, which might need some dependency version bumps, but overall direct deps seem to work fine and remove a lot of entries from the l-c file (77 from nova, 64 from cinder). With that testing done, I am OK with the proposal now (in my experience with the community goals effort, I spent 50% of the time fixing indirect deps).
I am summarizing the discussion and the earlier proposal below; please let us know if this works for everyone, and accordingly we can take the next step of documenting it somewhere so that projects can start working on it.
Thanks for all your work on this, Ghanshyam.
- Only keep direct deps in lower-constraints.txt
I think this makes sense in master as a sanity check that the requirements.txt minima haven't drifted out of date to the extent that they break unit tests.
- Remove the lower constraints testing from all stable branches.
I agree, there doesn't seem to be any point in this any more.
One question: a takeaway I had from the discussion is that the lower-constraints files aren't being used by packagers, etc., so there doesn't seem to be any point in trying to keep the specified versions as low as possible. So I'm planning to update the cinder project deliverables' requirements.txt and lower-constraints.txt files for the upcoming release to whatever pip freeze indicates is actually being used right after Milestone 3. Are there any objections to this strategy?
-gmann
-- Jeremy Stanley
---- On Fri, 19 Feb 2021 07:51:27 -0600 Brian Rosmaita rosmaita.fossdev@gmail.com wrote ----
On 2/17/21 7:49 PM, Ghanshyam Mann wrote:
---- On Wed, 20 Jan 2021 16:45:19 -0600 Jeremy Stanley fungi@yuggoth.org wrote ----
On 2021-01-20 07:26:05 +0000 (+0000), Lucian Petrut wrote:
For Windows-related projects such as os-win and networking-hyperv, we decided to keep the lower-constraints job but remove indirect dependencies from the lower-constraints.txt file.
This made it much easier to maintain, and it allows us to at least cover direct dependencies. I suggest considering this approach instead of completely dropping the lower-constraints job, whenever possible. Another option might be to make it non-voting while it's getting fixed.
[...]
The fewer dependencies a project has, the easier this becomes. I'm not against projects continuing to do it if they can get it to work, but I wouldn't want us to pressure folks to spend undue effort on it when they already have a lot more on their plates. I can see how, for projects with a very large set of direct dependencies, this still has the problem that your stated minimums may conflict with (semi-circular) dependencies declared elsewhere in the transitive dependency set, outside your lower-constraints.txt/requirements.txt file.
I tried the direct-deps-only approach with cinder, nova, and placement, keeping only the deps we have in requirements.txt, test-requirements.txt, and 'extras' in setup.cfg. We need all three of those to pass the requirement-check job, as it checks that all of those deps are present in lower-constraints.txt.
Some nova tests are failing, which might need some dependency version bumps, but overall direct deps seem to work fine and remove a lot of entries from the l-c file (77 from nova, 64 from cinder). With that testing done, I am OK with the proposal now (in my experience with the community goals effort, I spent 50% of the time fixing indirect deps).
I am summarizing the discussion and the earlier proposal below; please let us know if this works for everyone, and accordingly we can take the next step of documenting it somewhere so that projects can start working on it.
Thanks for all your work on this, Ghanshyam.
- Only keep direct deps in lower-constraints.txt
I think this makes sense in master as a sanity check that the requirements.txt minima haven't drifted out of date to the extent that they break unit tests.
- Remove the lower constraints testing from all stable branches.
I agree, there doesn't seem to be any point in this any more.
One question: a takeaway I had from the discussion is that the lower-constraints files aren't being used by packagers, etc., so there doesn't seem to be any point in trying to keep the specified versions as low as possible. So I'm planning to update the cinder project deliverables' requirements.txt and lower-constraints.txt files for the upcoming release to whatever pip freeze indicates is actually being used right after Milestone 3. Are there any objections to this strategy?
As per the checks in the TC meeting and the ML discussion, it was found that only Debian checks lower constraints; all the other packagers use upper constraints [1].
Are you suggesting making the latest Milestone 3 the base for lower constraints and then, in future, keeping them as long as they work fine, or will you update them every cycle to the latest release (which is effectively the upper constraints, right)?
Maybe Thomas can answer whether that works for Debian.
[1] http://lists.openstack.org/pipermail/openstack-discuss/2021-January/019877.h...
-gmann
-gmann
-- Jeremy Stanley
On 2021-02-19 12:43:47 -0600 (-0600), Ghanshyam Mann wrote: [...]
As per the checks in the TC meeting and the ML discussion, it was found that only Debian checks lower constraints; all the other packagers use upper constraints [1].
Are you suggesting making the latest Milestone 3 the base for lower constraints and then, in future, keeping them as long as they work fine, or will you update them every cycle to the latest release (which is effectively the upper constraints, right)?
[...]
Debian is somewhat unique among the distributions providing OpenStack packages due to:
1. they carry packages for a vast amount of software (nearly 100K packages lined up for the next release), so there are lots of opportunities for version conflicts
2. they attempt when possible to only carry one version of any given software project in order to minimize their maintenance and security burden
3. they include OpenStack and its dependencies directly in the main distribution rather than in a side repository like UCA or RDO
4. they insist that all software within the distribution should be co-installable
As such, the more information they can get about what versions of dependencies we expect them to be able to use, the better able they'll be to find compromises in dependency version conflicts between OpenStack and unrelated software they also distribute.
---- On Wed, 17 Feb 2021 18:49:53 -0600 Ghanshyam Mann gmann@ghanshyammann.com wrote ----
---- On Wed, 20 Jan 2021 16:45:19 -0600 Jeremy Stanley fungi@yuggoth.org wrote ----
On 2021-01-20 07:26:05 +0000 (+0000), Lucian Petrut wrote:
For Windows-related projects such as os-win and networking-hyperv, we decided to keep the lower-constraints job but remove indirect dependencies from the lower-constraints.txt file.
This made it much easier to maintain, and it allows us to at least cover direct dependencies. I suggest considering this approach instead of completely dropping the lower-constraints job, whenever possible. Another option might be to make it non-voting while it's getting fixed.
[...]
The fewer dependencies a project has, the easier this becomes. I'm not against projects continuing to do it if they can get it to work, but I wouldn't want us to pressure folks to spend undue effort on it when they already have a lot more on their plates. I can see how, for projects with a very large set of direct dependencies, this still has the problem that your stated minimums may conflict with (semi-circular) dependencies declared elsewhere in the transitive dependency set, outside your lower-constraints.txt/requirements.txt file.
I tried the direct-deps-only approach with cinder, nova, and placement, keeping only the deps we have in requirements.txt, test-requirements.txt, and 'extras' in setup.cfg. We need all three of those to pass the requirement-check job, as it checks that all of those deps are present in lower-constraints.txt.
Some nova tests are failing, which might need some dependency version bumps, but overall direct deps seem to work fine and remove a lot of entries from the l-c file (77 from nova, 64 from cinder). With that testing done, I am OK with the proposal now (in my experience with the community goals effort, I spent 50% of the time fixing indirect deps).
I am summarizing the discussion and the earlier proposal below; please let us know if this works for everyone, and accordingly we can take the next step of documenting it somewhere so that projects can start working on it.
- Only keep direct deps in lower-constraints.txt
- Remove the lower constraints testing from all stable branches.
The TC continued the discussion in the previous week's [1] and this week's [2] meetings. Given the current resources, compute bandwidth, and the effort of maintaining this, it is fine to drop it if a project wants to.
In the Project Team Guide, lower-bound testing is already mentioned as an optional thing to test [3]. We can clarify it further, in the testing section too; I am doing that in - https://review.opendev.org/c/openstack/project-team-guide/+/781900
In summary: it is up to each project to maintain and test the lower bounds; if they want to drop it, that is entirely fine too.
Feel free to reach out to the TC on #openstack-tc with any further questions or for clarification.
[1] http://eavesdrop.openstack.org/meetings/tc/2021/tc.2021-03-11-15.00.log.html...
[2] http://eavesdrop.openstack.org/meetings/tc/2021/tc.2021-03-18-15.01.log.html...
[3] 3rd paragraph in https://docs.openstack.org/project-team-guide/dependency-management.html#sol...
-gmann
-gmann
-- Jeremy Stanley
On Wed, 6 Jan 2021 at 19:07, Ghanshyam Mann gmann@ghanshyammann.com wrote:
Hello Everyone,
You might have seen the discussion around dropping lower-constraints testing, as maintaining it has become more challenging than the value it provides.
A few of the ML threads around this discussion:
- http://lists.openstack.org/pipermail/openstack-discuss/2020-December/019521....
- http://lists.openstack.org/pipermail/openstack-discuss/2020-December/019390....
As Oslo and many other projects are dropping it or have already dropped it, we should decide this for all the other projects as well; otherwise it can become more challenging than it is currently.
We have not defined it in the PTI or the testing runtime, so it is always up to projects whether they still want to keep it, but we should settle on a general recommendation here.
I would suggest dropping the lower-constraints job only in projects where it becomes too difficult to maintain.
I fixed those jobs in Blazar yesterday. It was a bit painful, but in the process I discovered several requirements were incorrectly defined, as we were using features not available in the minimum version required…
On Wed, 2021-01-06 at 11:59 -0600, Ghanshyam Mann wrote:
Hello Everyone,
You might have seen the discussion around dropping lower-constraints testing, as maintaining it has become more challenging than the value it provides.
A few of the ML threads around this discussion:
- http://lists.openstack.org/pipermail/openstack-discuss/2020-December/019521....
- http://lists.openstack.org/pipermail/openstack-discuss/2020-December/019390....
As Oslo and many other projects are dropping it or have already dropped it, we should decide this for all the other projects as well; otherwise it can become more challenging than it is currently.
We have not defined it in the PTI or the testing runtime, so it is always up to projects whether they still want to keep it, but we should settle on a general recommendation here.
Out of curiosity, would limiting the list in lower-constraints to the set of requirements listed in 'requirements.txt' help matters? That would at least ensure the lower version of our explicit dependencies worked. The main issue I could see with this is potentially a lot of thrashing from pip as it attempts to find versions of implicit dependencies that satisfy the various constraints, but I guess we'll have to address that when we come to it.
Stephen
On 2021-01-07 10:15:59 +0000 (+0000), Stephen Finucane wrote: [...]
Out of curiosity, would limiting the list in lower-constraints to the set of requirements listed in 'requirements.txt' help matters? That would at least ensure the lower version of our explicit dependencies worked. The main issue I could see with this is potentially a lot of thrashing from pip as it attempts to find versions of implicit dependencies that satisfy the various constraints, but I guess we'll have to address that when we come to it.
You can try it locally easily enough, but my recollection from before is that what you'll find for larger projects is that old releases of some dependencies don't pin upper bounds on their own dependencies and wind up not being usable because they drag in something newer than they can actually use, so it'll be an iterative effort to figure those out. That has essentially been my problem with the lower-bounds testing model: it's a manual effort to figure out which versions of modules in the transitive set will actually be compatible with one another, and then you basically get to redo that work any time you want to adjust a lower bound for something.
But do give it a shot and let us know if it winds up being easier than all that.
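For anyone wanting to try that locally, a minimal sketch; it assumes the project keeps a lower-constraints.txt at the repo root, and pip is left free to resolve everything not listed there:

    $ python -m venv lc-venv && . lc-venv/bin/activate
    $ pip install -c lower-constraints.txt -r requirements.txt -r test-requirements.txt
    $ pip check    # flags any resolved combination that violates a declared dependency
    $ stestr run   # or whatever test runner the project uses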
---- On Thu, 07 Jan 2021 10:34:21 -0600 Jeremy Stanley fungi@yuggoth.org wrote ----
On 2021-01-07 10:15:59 +0000 (+0000), Stephen Finucane wrote: [...]
Out of curiosity, would limiting the list in lower-constraints to the set of requirements listed in 'requirements.txt' help matters? That would at least ensure the lower version of our explicit dependencies worked. The main issue I could see with this is potentially a lot of thrashing from pip as it attempts to find versions of implicit dependencies that satisfy the various constraints, but I guess we'll have to address that when we come to it.
You can try it locally easily enough, but my recollection from before is that what you'll find for larger projects is that old releases of some dependencies don't pin upper bounds on their own dependencies and wind up not being usable because they drag in something newer than they can actually use, so it'll be an iterative effort to figure those out. That has essentially been my problem with the lower-bounds testing model: it's a manual effort to figure out which versions of modules in the transitive set will actually be compatible with one another, and then you basically get to redo that work any time you want to adjust a lower bound for something.
But do give it a shot and let us know if it winds up being easier than all that.
I have not tested it yet, but from past testing I remember I ended up adding some implicit deps to l-c explicitly, because they were not compatible with the project's explicit deps and their dependency constraints. So I am not sure whether restricting l-c to the requirements.txt deps can work, but it is good to try.
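As an invented illustration of that situation: an old lower bound on a direct dep may only be satisfiable alongside an old indirect dep, so the indirect pin has to live in l-c even though the package is not in requirements.txt:

    $ cat lower-constraints.txt
    requests==2.18.0    # direct dep; lower bound mirrors requirements.txt
    urllib3==1.22       # indirect dep of requests; left unpinned, an older
                        # pip could install a release too new for requests 2.18.0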
-gmann
-- Jeremy Stanley
On Thu, Jan 7, 2021 at 5:42 PM Jeremy Stanley fungi@yuggoth.org wrote:
On 2021-01-07 10:15:59 +0000 (+0000), Stephen Finucane wrote: [...]
Out of curiosity, would limiting the list in lower-constraints to the set of requirements listed in 'requirements.txt' help matters? That would at least ensure the lower version of our explicit dependencies worked. The main issue I could see with this is potentially a lot of thrashing from pip as it attempts to find versions of implicit dependencies that satisfy the various constraints, but I guess we'll have to address that when we come to it.
You can try it locally easily enough, but my recollection from before is that what you'll find for larger projects is that old releases of some dependencies don't pin upper bounds on their own dependencies and wind up not being usable because they drag in something newer than they can actually use <snip>
This is also why we can't really have a smart-enough solver trying to minimize dep versions, as some deps have no bounds on either side. What would the verdict then be? If it were to install the oldest version ever released, I bet it would fail most of the time.
For me, lower constraints are far too complicated to really get right, and, moreover, checking only unit tests with them is likely not thorough enough to guarantee that they result in working deployments.
I don't envy distro packagers, but really the only thing they can bet on is deciding on a set of packaged deps and running tempest against the curated deployment (plus unit tests, as they are much cheaper anyhow). Thanks to upper constraints we know there is at least one combination of deps that will work. We can't really ensure any other combination in a simple manner.
-yoctozepto
On 2021-01-07 17:55:44 +0100 (+0100), Radosław Piliszek wrote: [...]
This is also why we can't really have a smart-enough solver trying to minimize dep versions, as some deps have no bounds on either side. What would the verdict then be? If it were to install the oldest version ever released, I bet it would fail most of the time.
Yes, I expect that too would require some manual tweaking to find appropriate versions to override with; however, that wouldn't need to be redone nearly as often as what you end up with when you're fighting tools which always want to install the most recent available version.
For me, lower constraints are far too complicated to really get right, and, moreover, checking only unit tests with them is likely not thorough enough to guarantee that they result in working deployments.
[...]
This I agree with. I think lower-bounds checking is theoretically possible with appropriate tools (which don't currently exist), but it would still involve filling in yourself for the authors of less rigorously maintained projects in your transitive dependency set. More generally, basically nothing in the Python packaging ecosystem is designed with the idea of supporting a solution to this, and there's very little to encourage a project to even list, much less keep up to date, minimum versions of dependencies, except to force an upgrade.
On 1/6/21 12:59 PM, Ghanshyam Mann wrote:
Hello Everyone,
You might have seen the discussion around dropping lower-constraints testing, as maintaining it has become more challenging than the value it provides.
I think the TC needs to discuss this explicitly (at a meeting or two, not just on the ML) and give the projects some guidance. I agree that there's little point in maintaining the l-c if they're not actually useful to anyone in their current state, but if their helpfulness (or potential helpfulness) outweighs the maintenance burden, then we should keep them. (How's that for a profound statement?)
Maybe someone can point me to where I can RTFM to get a clearer picture, but my admittedly vague idea of what the l-c are for is that they have something to do with making packaging easier. If that's the case, it would be good for the TC to reach out to some OpenStack packagers/distributors to find out how they use l-c (if at all) and what changes could make them actually useful, and then we can re-assess the maintenance burden.
This whole experience with the new pip resolver has been painful, I think, because it hit all projects and all branches at once. My experience, however, is that if I'd been updating the minimum versions for all the cinder deliverables in their requirements.txt and l-c.txt files every cycle to reflect a pip freeze at Milestone-3 it would have been a lot easier.
What do other projects do about this? In Cinder, we've just been updating the requirements on demand, not proactively, and as a result for some dependencies we claimed that foo>=0.9.0 is OK -- but except for unit tests in the l-c job, cinder deliverables haven't been using anything other than foo>=16.0 since rocky. So in master, I took advantage of having to revise requirements and l-c to make some major jumps in minimum versions. And I'm thinking of doing a pip-freeze-based minimum version update of requirements.txt at M-3 each cycle from now on, which will force me to make an l-c.txt update too. (Maybe I was supposed to be doing that all along? Or maybe it's a bad idea? I could use some guidance here.)
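For what it's worth, the mechanical part of that plan could be as small as the sketch below, run in a venv installed from master right after M-3; the output would still need trimming to direct deps and a sanity pass:

    $ pip freeze > lower-constraints.txt        # exact versions the gate actually ran
    $ sed 's/==/>=/' lower-constraints.txt      # candidate new minima for requirements.txt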
It would be good for the l-c to reflect reality; on the other hand, updating the minimum versions in requirements.txt (and hence in l-c) too aggressively probably won't help packagers at all. (Or maybe it will, I don't know.) Then again, having the l-c is useful from the standpoint of letting you know when your minimum acceptable version in requirements.txt will break your unit tests. But if we're updating the minimum versions of dependencies every cycle to known-good minimum versions, an l-c failure is going to be pretty rare, so maybe it's not worth the trouble of maintaining the l-c.txt file and CI job.
One other thing: if we do keep l-c, we need to have some guidance about what's actually supposed to be in there. (Or I need to RTFM.) I've noticed that as we've added new dependencies to cinder, we've included the dependency in l-c.txt, but not its indirect dependencies. I guess we should have been adding the indirect dependencies all along, too? (Spoiler alert: we haven't.)
This email has gotten too long, so I will shut up now.
cheers, brian
A few of the ML threads around this discussion:
- http://lists.openstack.org/pipermail/openstack-discuss/2020-December/019521....
- http://lists.openstack.org/pipermail/openstack-discuss/2020-December/019390....
As Oslo and many other projects are dropping it or have already dropped it, we should decide this for all the other projects as well; otherwise it can become more challenging than it is currently.
We have not defined it in the PTI or the testing runtime, so it is always up to projects whether they still want to keep it, but we should settle on a general recommendation here.
-gmann
On 21-01-08 10:21:51, Brian Rosmaita wrote:
On 1/6/21 12:59 PM, Ghanshyam Mann wrote:
Hello Everyone,
You might have seen the discussion around dropping lower-constraints testing, as maintaining it has become more challenging than the value it provides.
I think the TC needs to discuss this explicitly (at a meeting or two, not just on the ML) and give the projects some guidance. I agree that there's little point in maintaining the l-c if they're not actually useful to anyone in their current state, but if their helpfulness (or potential helpfulness) outweighs the maintenance burden, then we should keep them. (How's that for a profound statement?)
Maybe someone can point me to where I can RTFM to get a clearer picture, but my admittedly vague idea of what the l-c are for is that they have something to do with making packaging easier. If that's the case, it would be good for the TC to reach out to some OpenStack packagers/distributors to find out how they use l-c (if at all) and what changes could make them actually useful, and then we can re-assess the maintenance burden.
This whole experience with the new pip resolver has been painful, I think, because it hit all projects and all branches at once. My experience, however, is that if I'd been updating the minimum versions for all the cinder deliverables in their requirements.txt and l-c.txt files every cycle to reflect a pip freeze at Milestone-3 it would have been a lot easier.
What do other projects do about this? In Cinder, we've just been updating the requirements on demand, not proactively, and as a result for some dependencies we claimed that foo>=0.9.0 is OK -- but except for unit tests in the l-c job, cinder deliverables haven't been using anything other than foo>=16.0 since rocky. So in master, I took advantage of having to revise requirements and l-c to make some major jumps in minimum versions. And I'm thinking of doing a pip-freeze-based minimum version update of requirements.txt at M-3 each cycle from now on, which will force me to make an l-c.txt update too. (Maybe I was supposed to be doing that all along? Or maybe it's a bad idea? I could use some guidance here.)
It would be good for the l-c to reflect reality; on the other hand, updating the minimum versions in requirements.txt (and hence in l-c) too aggressively probably won't help packagers at all. (Or maybe it will, I don't know.) Then again, having the l-c is useful from the standpoint of letting you know when your minimum acceptable version in requirements.txt will break your unit tests. But if we're updating the minimum versions of dependencies every cycle to known-good minimum versions, an l-c failure is going to be pretty rare, so maybe it's not worth the trouble of maintaining the l-c.txt file and CI job.
One other thing: if we do keep l-c, we need to have some guidance about what's actually supposed to be in there. (Or I need to RTFM.) I've noticed that as we've added new dependencies to cinder, we've included the dependency in l-c.txt, but not its indirect dependencies. I guess we should have been adding the indirect dependencies all along, too? (Spoiler alert: we haven't.)
This email has gotten too long, so I will shut up now.
cheers, brian
A few of the ML threads around this discussion:
- http://lists.openstack.org/pipermail/openstack-discuss/2020-December/019521....
- http://lists.openstack.org/pipermail/openstack-discuss/2020-December/019390....
As Oslo and many other projects are dropping it or have already dropped it, we should decide this for all the other projects as well; otherwise it can become more challenging than it is currently.
We have not defined it in the PTI or the testing runtime, so it is always up to projects whether they still want to keep it, but we should settle on a general recommendation here.
-gmann
/requirements hat
l-c was mainly promoted as a way to know when you are using a feature that is not in an old release. The way we generally test is with newer constraints, which don't test what we state we support (the range between the lower bound in requirements.txt and upper-constraints).
While I do think it's useful to know when the range of versions of a library needs to be updated... I understand that it may not be useful in practice, whether because of the maintenance required from devs, the load on the testing infrastructure generated by testing lower constraints, or the fact that downstream packagers do not use it.
Search this page for lower-constraints: https://docs.openstack.org/project-team-guide/dependency-management.html
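An invented illustration of that untested range: a project can declare a lower bound that CI never actually exercises (package names and versions are made up):

    # requirements.txt (what we claim to support):
    #   oslo.config>=5.2.0
    # upper-constraints.txt (what CI actually installs):
    #   oslo.config===8.4.0
    # Code relying on a post-5.2.0 feature passes CI under u-c while silently
    # breaking the declared lower bound -- exactly what the l-c job exists to catch.
    $ pip install -r requirements.txt -c upper-constraints.txt   # tests only the top of the range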
Indirect dependencies in lower-constraints were not encouraged, IIRC, both for maintenance reasons (a lot of churn) and because 'hopefully' downstream deps are doing the same thing and testing their own deps for the changes they need.
/downstream packager hat
I do not look at lower-constraints, but I do look at the lower bounds in the requirements.txt file (from which lower-constraints are generated). I look for updates to the lower bounds to know if a library that was already packaged needs updating, though I do try to target the version mentioned in upper-constraints.txt when updating. More and more I've just made sure that the entire dependency tree for OpenStack matches what is packaged. Even then, though, if the minimum is not updated, this pushes the problem down onto users.
/user (deployer) perspective
Why does $PROJECT not work, I'm going to report it as a bug to $distro, $deployment and $upstream.
What they did was not update the version of pyroute2 (or something) because $project didn't update the lower bound to require it.
On 1/8/2021 11:04 AM, Matthew Thode wrote:
On 21-01-08 10:21:51, Brian Rosmaita wrote:
On 1/6/21 12:59 PM, Ghanshyam Mann wrote:
Hello Everyone,
You might have seen the discussion around dropping lower-constraints testing, as maintaining it has become more challenging than the value it provides.
I think the TC needs to discuss this explicitly (at a meeting or two, not just on the ML) and give the projects some guidance. I agree that there's little point in maintaining the l-c if they're not actually useful to anyone in their current state, but if their helpfulness (or potential helpfulness) outweighs the maintenance burden, then we should keep them. (How's that for a profound statement?)
Maybe someone can point me to where I can RTFM to get a clearer picture, but my admittedly vague idea of what the l-c are for is that they have something to do with making packaging easier. If that's the case, it would be good for the TC to reach out to some OpenStack packagers/distributors to find out how they use l-c (if at all) and what changes could make them actually useful, and then we can re-assess the maintenance burden.
This whole experience with the new pip resolver has been painful, I think, because it hit all projects and all branches at once. My experience, however, is that if I'd been updating the minimum versions for all the cinder deliverables in their requirements.txt and l-c.txt files every cycle to reflect a pip freeze at Milestone-3 it would have been a lot easier.
What do other projects do about this? In Cinder, we've just been updating the requirements on demand, not proactively, and as a result for some dependencies we claimed that foo>=0.9.0 is OK -- but except for unit tests in the l-c job, cinder deliverables haven't been using anything other than foo>=16.0 since rocky. So in master, I took advantage of having to revise requirements and l-c to make some major jumps in minimum versions. And I'm thinking of doing a pip-freeze-based minimum version update of requirements.txt at M-3 each cycle from now on, which will force me to make an l-c.txt update too. (Maybe I was supposed to be doing that all along? Or maybe it's a bad idea? I could use some guidance here.)
It would be good for the l-c to reflect reality; on the other hand, updating the minimum versions in requirements.txt (and hence in l-c) too aggressively probably won't help packagers at all. (Or maybe it will, I don't know.) Then again, having the l-c is useful from the standpoint of letting you know when your minimum acceptable version in requirements.txt will break your unit tests. But if we're updating the minimum versions of dependencies every cycle to known-good minimum versions, an l-c failure is going to be pretty rare, so maybe it's not worth the trouble of maintaining the l-c.txt file and CI job.
One other thing: if we do keep l-c, we need to have some guidance about what's actually supposed to be in there. (Or I need to RTFM.) I've noticed that as we've added new dependencies to cinder, we've included the dependency in l-c.txt, but not its indirect dependencies. I guess we should have been adding the indirect dependencies all along, too? (Spoiler alert: we haven't.)
This email has gotten too long, so I will shut up now.
cheers, brian
A few of the ML threads around this discussion:
- http://lists.openstack.org/pipermail/openstack-discuss/2020-December/019521....
- http://lists.openstack.org/pipermail/openstack-discuss/2020-December/019390....
As Oslo and many other projects are dropping it or have already dropped it, we should decide this for all the other projects as well; otherwise it can become more challenging than it is currently.
We have not defined it in the PTI or the testing runtime, so it is always up to projects whether they still want to keep it, but we should settle on a general recommendation here.
-gmann
/requirements hat
l-c was mainly promoted as a way to know when you are using a feature that is not in an old release. The way we generally test is with newer constraints, which don't test what we state we support (the range between the lower bound in requirements.txt and upper-constraints).
While I do think it's useful to know when the range of versions of a library needs to be updated... I understand that it may not be useful in practice, whether because of the maintenance required from devs, the load on the testing infrastructure generated by testing lower constraints, or the fact that downstream packagers do not use it.
Search this page for lower-constraints: https://docs.openstack.org/project-team-guide/dependency-management.html
I am in the same boat as Brian: the lower constraints have never made much sense to me. The documentation you shared above is helpful for understanding how everything works, but I think it needs to be enhanced, as it isn't clear to me as a Cinder team member what I should do to avoid breakages.
If we can add some documentation and guidance on how to maintain these on the branches, to avoid a major breakage like this in the future, I think it would be a useful effort.
Jay
Indirect dependencies in lower-constraints were not encouraged, IIRC, both for maintenance reasons (a lot of churn) and because 'hopefully' downstream deps are doing the same thing and testing their own deps for the changes they need.
/downstream packager hat
I do not look at lower-constraints, but I do look at the lower bounds in the requirements.txt file (from which lower-constraints are generated). I look for updates to the lower bounds to know if a library that was already packaged needs updating, though I do try to target the version mentioned in upper-constraints.txt when updating. More and more I've just made sure that the entire dependency tree for OpenStack matches what is packaged. Even then, though, if the minimum is not updated, this pushes the problem down onto users.
/user (deployer) perspective
Why does $PROJECT not work, I'm going to report it as a bug to $distro, $deployment and $upstream.
What they did was not update the version of pyroute2 (or something) because $project didn't update the lower bound to require it.
/requirements hat
l-c was mainly promoted as a way to know when you are using a feature that is not in an old release. The way we generally test is with newer constraints, which don't test what we state we support (the range between the lower bound in requirements.txt and upper-constraints).
While I do think it's useful to know when the range of versions of a library needs to be updated... I understand that it may not be useful in practice, whether because of the maintenance required from devs, the load on the testing infrastructure generated by testing lower constraints, or the fact that downstream packagers do not use it.
Search this page for lower-constraints: https://docs.openstack.org/project-team-guide/dependency-management.html
I am in the same boat as Brian: the lower constraints have never made much sense to me. The documentation you shared above is helpful for understanding how everything works, but I think it needs to be enhanced, as it isn't clear to me as a Cinder team member what I should do to avoid breakages.
If we can add some documentation and guidance on how to maintain these on the branches, to avoid a major breakage like this in the future, I think it would be a useful effort.
Jay
I agree documentation could really be improved here. But one thing worth pointing out is that the current breakages are really due to enhancements in pip.
Our l-c jobs have been broken for a long time, and we are only now being made painfully aware of it. But now that pip actually enforces these things, theoretically at least, once we fix the problems we have in l-c they should be much easier to maintain going forward. We shouldn't see random stable branch failures, because once we fix our requirements they should remain stable.
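A sketch of that shift in behavior, with invented package names: where old pip would install a conflicting set and leave it to pip check to complain, pip >= 20.3 refuses up front:

    $ pip install 'libA==1.0' 'libB==2.0'   # suppose libB 2.0 requires libA>=2.0
    # pip < 20.3:  installs both anyway, leaving a silently broken environment
    # pip >= 20.3: aborts and reports the dependency conflict instead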
// 2 cents
---- On Fri, 08 Jan 2021 11:04:41 -0600 Matthew Thode mthode@mthode.org wrote ----
On 21-01-08 10:21:51, Brian Rosmaita wrote:
On 1/6/21 12:59 PM, Ghanshyam Mann wrote:
Hello Everyone,
You might have seen the discussion around dropping lower-constraints testing, as maintaining it has become more challenging than the value it provides.
I think the TC needs to discuss this explicitly (at a meeting or two, not just on the ML) and give the projects some guidance. I agree that there's little point in maintaining the l-c if they're not actually useful to anyone in their current state, but if their helpfulness (or potential helpfulness) outweighs the maintenance burden, then we should keep them. (How's that for a profound statement?)
Maybe someone can point me to where I can RTFM to get a clearer picture, but my admittedly vague idea of what the l-c are for is that they have something to do with making packaging easier. If that's the case, it would be good for the TC to reach out to some OpenStack packagers/distributors to find out how they use l-c (if at all) and what changes could make them actually useful, and then we can re-assess the maintenance burden.
This whole experience with the new pip resolver has been painful, I think, because it hit all projects and all branches at once. My experience, however, is that if I'd been updating the minimum versions for all the cinder deliverables in their requirements.txt and l-c.txt files every cycle to reflect a pip freeze at Milestone-3 it would have been a lot easier.
What do other projects do about this? In Cinder, we've just been updating the requirements on demand, not proactively, and as a result for some dependencies we claimed that foo>=0.9.0 is OK -- but except for unit tests in the l-c job, cinder deliverables haven't been using anything other than foo>=16.0 since rocky. So in master, I took advantage of having to revise requirements and l-c to make some major jumps in minimum versions. And I'm thinking of doing a pip-freeze-based minimum version update of requirements.txt at M-3 each cycle from now on, which will force me to make an l-c.txt update too. (Maybe I was supposed to be doing that all along? Or maybe it's a bad idea? I could use some guidance here.)
It would be good for the l-c to reflect reality; on the other hand, updating the minimum versions in requirements.txt (and hence in l-c) too aggressively probably won't help packagers at all. (Or maybe it will, I don't know.) Then again, having the l-c is useful from the standpoint of letting you know when your minimum acceptable version in requirements.txt will break your unit tests. But if we're updating the minimum versions of dependencies every cycle to known-good minimum versions, an l-c failure is going to be pretty rare, so maybe it's not worth the trouble of maintaining the l-c.txt file and CI job.
One other thing: if we do keep l-c, we need to have some guidance about what's actually supposed to be in there. (Or I need to RTFM.) I've noticed that as we've added new dependencies to cinder, we've included the dependency in l-c.txt, but not its indirect dependencies. I guess we should have been adding the indirect dependencies all along, too? (Spoiler alert: we haven't.)
This email has gotten too long, so I will shut up now.
cheers, brian
A few of the ML threads around this discussion:
- http://lists.openstack.org/pipermail/openstack-discuss/2020-December/019521....
- http://lists.openstack.org/pipermail/openstack-discuss/2020-December/019390....
As Oslo and many other projects are dropping it or have already dropped it, we should decide this for all the other projects as well; otherwise it can become more challenging than it is currently.
We have not defined it in the PTI or the testing runtime, so it is always up to projects whether they still want to keep it, but we should settle on a general recommendation here.
-gmann
/requirements hat
l-c was mainly promoted as a way to know when you are using a feature that is not in an old release. The way we generally test is with newer constraints, which don't test what we state we support (the range between the lower bound in requirements.txt and upper-constraints).
While I do think it's useful to know when the range of versions of a library needs to be updated... I understand that it may not be useful in practice, whether because of the maintenance required from devs, the load on the testing infrastructure generated by testing lower constraints, or the fact that downstream packagers do not use it.
Search this page for lower-constraints: https://docs.openstack.org/project-team-guide/dependency-management.html
Indirect dependencies in lower-constraints were not encouraged, IIRC, both for maintenance reasons (a lot of churn) and because 'hopefully' downstream deps are doing the same thing and testing their own deps for the changes they need.
/downstream packager hat
I do not look at lower-constraints, but I do look at the lower bounds in the requirements.txt file (from which lower-constraints are generated). I look for updates to the lower bounds to know if a library that was already packaged needs updating, though I do try to target the version mentioned in upper-constraints.txt when updating. More and more I've just made sure that the entire dependency tree for OpenStack matches what is packaged. Even then, though, if the minimum is not updated, this pushes the problem down onto users.
I do not have downstream packaging maintenance experience, but when testing locally or resolving deps I do look at the lower bounds in requirements.txt.
The challenge with that will be keeping the requirements.txt lower bounds up to date, as our CI will be testing with u-c.
-gmann
/user (deployer) perspective
Why does $PROJECT not work, I'm going to report it as a bug to $distro, $deployment and $upstream.
What they did was not update the version of pyroute2 (or something) because $project didn't update the lower bound to require it.
-- Matthew Thode
---- On Fri, 08 Jan 2021 09:21:51 -0600 Brian Rosmaita rosmaita.fossdev@gmail.com wrote ----
On 1/6/21 12:59 PM, Ghanshyam Mann wrote:
Hello Everyone,
You might have seen the discussion around dropping lower-constraints testing, as maintaining it has become more challenging than the value it provides.
I think the TC needs to discuss this explicitly (at a meeting or two, not just on the ML) and give the projects some guidance. I agree that there's little point in maintaining the l-c if they're not actually useful to anyone in their current state, but if their helpfulness (or potential helpfulness) outweighs the maintenance burden, then we should keep them. (How's that for a profound statement?)
Yes, that is the plan. This ML thread is to get initial feedback, and then we will discuss it in a meeting.
I have added this to the agenda for the next TC meeting, on 14 Jan.
- https://wiki.openstack.org/wiki/Meetings/TechnicalCommittee#Agenda_Suggestio...
-gmann
Maybe someone can point me to where I can RTFM to get a clearer picture, but my admittedly vague idea of what the l-c are for is that they have something to do with making packaging easier. If that's the case, it would be good for the TC to reach out to some OpenStack packagers/distributors to find out how they use l-c (if at all) and what changes could make them actually useful, and then we can re-assess the maintenance burden.
This whole experience with the new pip resolver has been painful, I think, because it hit all projects and all branches at once. My experience, however, is that if I'd been updating the minimum versions for all the cinder deliverables in their requirements.txt and l-c.txt files every cycle to reflect a pip freeze at Milestone-3 it would have been a lot easier.
What do other projects do about this? In Cinder, we've just been updating the requirements on demand, not proactively, and as a result for some dependencies we claimed that foo>=0.9.0 is OK -- but except for unit tests in the l-c job, cinder deliverables haven't been using anything other than foo>=16.0 since rocky. So in master, I took advantage of having to revise requirements and l-c to make some major jumps in minimum versions. And I'm thinking of doing a pip-freeze-based minimum version update of requirements.txt at M-3 each cycle from now on, which will force me to make an l-c.txt update too. (Maybe I was supposed to be doing that all along? Or maybe it's a bad idea? I could use some guidance here.)
It would be good for the l-c to reflect reality; on the other hand, updating the minimum versions in requirements.txt (and hence in l-c) too aggressively probably won't help packagers at all. (Or maybe it will, I don't know.) Then again, having the l-c is useful from the standpoint of letting you know when your minimum acceptable version in requirements.txt will break your unit tests. But if we're updating the minimum versions of dependencies every cycle to known-good minimum versions, an l-c failure is going to be pretty rare, so maybe it's not worth the trouble of maintaining the l-c.txt file and CI job.
One other thing: if we do keep l-c, we need to have some guidance about what's actually supposed to be in there. (Or I need to RTFM.) I've noticed that as we've added new dependencies to cinder, we've included the dependency in l-c.txt, but not its indirect dependencies. I guess we should have been adding the indirect dependencies all along, too? (Spoiler alert: we haven't.)
This email has gotten too long, so I will shut up now.
cheers, brian
A few of the ML threads around this discussion:
- http://lists.openstack.org/pipermail/openstack-discuss/2020-December/019521....
- http://lists.openstack.org/pipermail/openstack-discuss/2020-December/019390....
As Oslo and many other projects are dropping it or have already dropped it, we should decide this for all the other projects as well; otherwise it can become more challenging than it is currently.
We have not defined it in the PTI or the testing runtime, so it is always up to projects whether they still want to keep it, but we should settle on a general recommendation here.
-gmann
participants (12)
- Brian Rosmaita
- Ghanshyam Mann
- Jay S. Bryant
- Jeremy Stanley
- Lucian Petrut
- Matthew Thode
- Pierre Riteau
- Radosław Piliszek
- Sean McGinnis
- Spyros Trigazis
- Stephen Finucane
- Thomas Goirand