[all][tc] Dropping lower-constraints testing from all projects

Stephen Finucane stephenfin at redhat.com
Thu Jan 21 09:30:19 UTC 2021


On Wed, 2021-01-20 at 11:42 -0600, Ghanshyam Mann wrote:
>  ---- On Wed, 20 Jan 2021 05:13:39 -0600 Stephen Finucane <stephenfin at redhat.com> wrote ----
>  > On Wed, 2021-01-20 at 07:26 +0000, Lucian Petrut wrote:
>  > > Hi,
>  > >  
>  > > For Windows related projects such as os-win and networking-hyperv,
>  > > we decided to keep the lower constraints job but remove indirect
>  > > dependencies from the lower-constraints.txt file.
>  > >  
>  > > This made it much easier to maintain and it allows us to at least cover
>  > > direct dependencies. I suggest considering this approach instead of
>  > > completely dropping the lower constraints job, whenever possible.
>  > > Another option might be to make it non-voting while it’s getting fixed.
>  > >  
>  > > Lucian Petrut
>  > 
>  > Yes, I've looked into doing this elsewhere (as promised) and it seems to do the
>  > job quite nicely. It's not perfect but it does seem to be "good enough" and
>  > captures basic things like "I depend on this function found in oslo.foo vX.Y and
>  > forgot to bump my minimum version to reflect this". I think these jobs probably
>  > offer _more_ value now than they did in the past, given pip is now finally
>  > honouring the explicit constraints we express in these files, so I would be in
>  > favour of this approach rather than dropping l-c entirely. I do realize that
>  > there is some degree of effort here in getting e.g. all the oslo projects fixed,
>  > but I'm happy to help out with this and have already fixed quite a few projects. I
> 
> I thought oslo did drop that instead of fixing all the failing l-c jobs? Maybe I am
> missing something or misreading it?

It's been proposed but nothing is merged, pending discussions.

>  > also wouldn't be opposed to dropping l-c on *stable* branches so long as we
>  > maintained for master, on the basis that they were already broken so nothing is
>  > really changing. Sticking to older, admittedly broken versions of pip for stable
>  > branches is another option and might help us avoid a deluge of "remove/fix l-c"
>  > patches for stable branches, but I don't know how practical that is?
> 
> I agree on the point about dropping it on stable to make stable maintenance
> easier. But I think making/keeping these jobs n-v is risky, as it can easily turn
> into 'false information'. The n-v concept was to keep failing or newly added jobs
> n-v temporarily and to make them voting once they are fixed/stable. I do not think
> keeping any job n-v permanently is a good approach.

I agree non-voting only makes sense if you plan to fix it at a later date. If
not, you should remove it.
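
For anyone who does want the temporary n-v route while fixing things, it should only
be a small change in the project's Zuul config, something along these lines (a sketch
only; the actual job layout differs between projects):

    - project:
        check:
          jobs:
            - openstack-tox-lower-constraints:
                voting: false

i.e. keep the job running in check but drop it from gate until it's green again.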

> I am still not convinced how 'removing indirect deps from l-c' makes
> 'knowing the lower bounds of OpenStack packages' better. I think it makes it
> less informative than it is currently. How will we know the lower
> bounds for indirect deps? Don't packagers need those, or can they go
> with u-c instead? And if so, why not for direct deps as well?

What we have doesn't work, and direct dependencies are the only things we can
truly control. In the scenario you're suggesting, we would need to track not only
our own dependencies but also the dependencies of those dependencies, and their
dependencies in turn, and so on down the rabbit hole. For each of these indirect
dependencies, of which there may be many, we would have to determine the minimum
version manually, because, as has been noted many times, there is no standardized
machinery in pip or elsewhere to find (and test) the minimum dependency versions
supported by a package. Put another way, if we depend on package foo, which depends
on package bar, which in turn depends on package baz, we can state our own informed
minimum version for foo, but we would need to inspect foo to find a suitable minimum
version of bar, and then inspect bar to find a suitable minimum version of baz.
That's an impossible ask.
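
To make that concrete, here's a rough sketch of what trimming a lower-constraints.txt
to direct dependencies only might look like (the version numbers below are purely
illustrative, not taken from any real project):

    # lower-constraints.txt, before: direct and transitive minimums pinned
    oslo.config==6.1.0        # direct dependency, the minimum we actually claim
    rfc3986==0.3.1            # only here because oslo.config pulls it in
    debtcollector==1.2.0      # only here because oslo.config pulls it in

    # lower-constraints.txt, after: direct dependencies only
    oslo.config==6.1.0

The transitive entries disappear, and whatever versions of rfc3986 and debtcollector
the resolver picks for oslo.config==6.1.0 are accepted as-is.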

> In general, my take here as an upstream maintainer is that we should ship
> things that are completely tested and that serve the complete planned mission.
> We should not ship/commit anything half baked. Instead, we can keep such things
> open as a TODO in case anyone volunteers to fix them.

Maintaining l-c for direct dependencies on all OpenStack projects would mean we
can at least guarantee that these packages have been tested with their supposed
minimum versions. Considering that for a package like nova at least a quarter of
the dependencies are "OpenStack-backed", that is no small deal. These jobs encourage
us to check that those minimums still make sense and to correct them if not. As
noted previously, they're not perfect, but they still provide a service that we
won't have if we simply delete this machinery entirely.
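
For reference, the mechanism itself is simple enough: the lower-constraints tox
environment installs the project with lower-constraints.txt applied as a pip
constraints file, so the unit tests run against the stated minimums. Roughly like
this (the exact stanza varies per project, so treat it as a sketch rather than a
copy of any particular tox.ini):

    [testenv:lower-constraints]
    deps =
      -c{toxinidir}/lower-constraints.txt
      -r{toxinidir}/test-requirements.txt
      -r{toxinidir}/requirements.txt

With the transitive pins removed from lower-constraints.txt, the new resolver is free
to pick any compatible versions for the indirect dependencies while still holding our
direct dependencies at their declared minimums.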

Stephen

> -gmann
> 
> 
>  > 
>  > Stephen
>  > 
>  > > From: Jeremy Stanley
>  > > Sent: Wednesday, January 20, 2021 1:52 AM
>  > > To: openstack-discuss at lists.openstack.org
>  > > Subject: Re: [all][tc] Dropping lower-constraints testing from all projects
>  > >  
>  > > On 2021-01-20 00:09:39 +0100 (+0100), Thomas Goirand wrote:
>  > > [...]
>  > > > Something I don't understand: why can't we use an older version of
>  > > > pip, if the problem is the newer pip resolver? Or can't the
>  > > > current pip be patched to fix things? It's not as if there was no
>  > > > prior art... Maybe I'm missing the big picture?
>  > > [...]
>  > >  
>  > > To get to the heart of the matter, when using older versions of pip
>  > > it was just quietly installing different versions of packages than
>  > > we asked it to, and versions of transitive dependencies which
>  > > directly conflicted with the versions other dependencies said they
>  > > required. When pip finally (very recently) implemented a coherent
>  > > dependency solver, it started alerting us directly to this fact. We
>  > > could certainly find a way to hide our heads in the sand and go back
>  > > to testing with old pip and pretending we knew what was being tested
>  > > there, but the question is whether what we were actually testing
>  > > that way was worthwhile enough to try to continue doing it, now that
>  > > we have proof it wasn't what we were wanting to test.
>  > >  
>  > > The challenge with actually testing what we wanted has always been
>  > > that there's many hundreds of packages we depend on and, short of
>  > > writing one ourselves, no tool available to find a coherent set of
>  > > versions of them which satisfy the collective lower bounds. The way
>  > > pip works, it wants to always solve for the newest possible
>  > > versions which satisfy an aggregate set of version ranges, and what
>  > > we'd want for lower bounds checking is the inverse of that.
>  > > -- 
>  > > Jeremy Stanley
>  > >  
>  > 
>  > 
>  > 
>  > 
> 




