On 1/8/2021 11:04 AM, Matthew Thode wrote:
On 21-01-08 10:21:51, Brian Rosmaita wrote:
Hello Everyone,
On 1/6/21 12:59 PM, Ghanshyam Mann wrote:
You might have seen the discussion around dropping the lower constraints testing, as maintaining it has become more challenging than the value it currently provides.

I think the TC needs to discuss this explicitly (at a meeting or two, not just on the ML) and give the projects some guidance. I agree that there's little point in maintaining the l-c if they're not actually useful to anyone in their current state, but if their helpfulness (or potential helpfulness) outweighs the maintenance burden, then we should keep them. (How's that for a profound statement?)
Maybe someone can point me to where I can RTFM to get a clearer picture, but my admittedly vague idea of what the l-c are for is that they have something to do with making packaging easier. If that's the case, it would be good for the TC to reach out to some openstack packagers/distributors to find out how they use l-c (if at all) and what changes could be made to make them actually useful, and then we can re-assess the maintenance burden.
This whole experience with the new pip resolver has been painful, I think, because it hit all projects and all branches at once. My suspicion, however, is that if I'd been updating the minimum versions in the requirements.txt and l-c.txt files of all the cinder deliverables every cycle, to reflect a pip freeze at Milestone-3, this would have been a lot easier.
What do other projects do about this? In Cinder, we've just been updating the requirements on demand, not proactively, and as a result for some dependencies we've claimed that foo>=0.9.0 is OK -- yet, except for the unit tests in the l-c job, cinder deliverables haven't actually run against anything older than foo 16.0 since rocky. So in master, I took advantage of having to revise requirements and l-c to make some major jumps in minimum versions. And I'm thinking of doing a pip-freeze-based minimum-version update of requirements.txt at M-3 each cycle from now on, which will force me to make an l-c.txt update too. (Maybe I was supposed to be doing that all along? Or maybe it's a bad idea? I could use some guidance here.)
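(To make that concrete, here's roughly the workflow I have in mind; the package name and versions below are invented for illustration:

    # at Milestone-3, capture what the gate is actually installing
    pip freeze > frozen.txt

    # suppose frozen.txt shows foo==16.1.0; then raise the minimums:
    #   requirements.txt:       foo>=16.1.0
    #   lower-constraints.txt:  foo==16.1.0

That way the lower bound we advertise is a version we've actually run against, rather than an ancient foo>=0.9.0 that nobody has installed in years.)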
It would be good for the l-c to reflect reality, but updating the minimum versions in requirements.txt (and hence in l-c) too aggressively probably won't help packagers at all. (Or maybe it will, I don't know.) On the other hand, having the l-c is useful as a way of letting you know when your minimum acceptable version in requirements.txt will break your unit tests. But if we're updating the minimum versions of dependencies every cycle to known-good minimum versions, an l-c failure is going to be pretty rare, so maybe it's not worth the trouble of maintaining the l-c.txt and the CI job.
One other thing: if we do keep l-c, we need to have some guidance about what's actually supposed to be in there. (Or I need to RTFM.) I've noticed that as we've added new dependencies to cinder, we've included the dependency in l-c.txt, but not its indirect dependencies. I guess we should have been adding the indirect dependencies all along, too? (Spoiler alert: we haven't.)
This email has gotten too long, so I will shut up now.
cheers, brian
A few of the ML threads around this discussion:
- http://lists.openstack.org/pipermail/openstack-discuss/2020-December/019521....
- http://lists.openstack.org/pipermail/openstack-discuss/2020-December/019390....
As Oslo and many other projects are dropping it or have already dropped it, we should decide this for all the other projects as well; otherwise things will be even more confusing than they are currently.
We have not defined it in the PTI or the testing runtime, so it is always up to projects whether they still want to keep it, but we should settle on a general recommendation here.
-gmann
/requirements hat
l-c was mainly promoted as a way to know when you are using a feature that is not in an old release. The way we generally test is with newer constraints, which don't exercise what we state we support (the range between the lower bound in requirements.txt and upper-constraints).
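For example (library and versions invented for illustration), a project might have:

    # requirements.txt -- the range we claim to support
    oslo.config>=5.2.0

    # upper-constraints.txt -- what the normal jobs actually install
    oslo.config===8.4.0

    # lower-constraints.txt -- what only the l-c job installs
    oslo.config==5.2.0

Everything between 5.2.0 and 8.4.0 is nominally supported, but only the two endpoints ever get exercised -- and without the l-c job, only the upper end does.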
While I do think it's useful to know when the supported range of versions of a library needs to be updated... I understand that l-c testing may not be worth it, whether because of the maintenance it requires from devs, the load it generates on the testing infrastructure, or because downstream packagers do not use it.
Search this page for "lower-constraints": https://docs.openstack.org/project-team-guide/dependency-management.html
I am in the same boat as Brian: the lower-constraints have never made much sense to me. The documentation you share above is helpful for understanding how everything works, but I think it needs to be enhanced, because it isn't clear to me as a Cinder team member what I should do to avoid breakages. If we can add some documentation and guidance on how to maintain these files across branches, to avoid a major breakage like this in the future, I think it would be a useful effort.
Jay
Indirect dependencies in lower-constraints were not encouraged, iirc, both for maintenance reasons (a lot of churn) and because 'hopefully' the downstream deps are doing the same thing and testing their own deps for the changes they need.
/downstream packager hat
I do not look at lower-constraints, but I do look at the lower bounds in the requirements.txt file (from which lower-constraints are generated). I look for updates to the lower bounds to know whether a library that was already packaged needs updating, though I do try to target the version mentioned in upper-constraints.txt when updating. More and more, I've just made sure that the entire dependency tree for openstack matches what is packaged. Even then, though, if the minimum is not updated, the problem gets pushed down to users.
/user (deployer) perspective
"Why does $PROJECT not work? I'm going to report it as a bug to $distro, $deployment, and $upstream."
What had happened was that the distro did not update the version of pyroute2 (or something like that), because $project never raised its lower bound to require the newer release.
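In requirements-file terms (versions invented for illustration), that failure mode looks like:

    # $project actually needs a feature added in pyroute2 0.5.0,
    # but requirements.txt still says:
    pyroute2>=0.4.0

Since the lower bound never moved, a packager watching lower bounds gets no signal to update the package, and the deployer is the one who hits the breakage.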