On Thu, Jan 7, 2021 at 5:42 PM Jeremy Stanley <fungi@yuggoth.org> wrote:
> On 2021-01-07 10:15:59 +0000 (+0000), Stephen Finucane wrote:
> [...]
>> Out of curiosity, would limiting the list in lower-constraints to the set of requirements listed in 'requirements.txt' help matters? That would at least ensure the lower versions of our explicit dependencies worked. The main issue I could see with this is potentially a lot of thrashing from pip as it attempts to find versions of implicit dependencies that satisfy the various constraints, but I guess we'll have to address that when we come to it.
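(For reference, a rough sketch of what that trimming could look like, assuming the usual requirements.txt / lower-constraints.txt layout; the script itself is hypothetical, not something that exists today:)

from packaging.requirements import Requirement
from packaging.utils import canonicalize_name

def explicit_names(path):
    # Collect the canonical names of the project's explicit dependencies.
    names = set()
    with open(path) as f:
        for line in f:
            line = line.split("#")[0].strip()
            if line:
                names.add(canonicalize_name(Requirement(line).name))
    return names

explicit = explicit_names("requirements.txt")

# Keep only the lower-constraints entries that correspond to explicit deps.
with open("lower-constraints.txt") as src, open("lower-constraints.new", "w") as dst:
    for line in src:
        entry = line.split("#")[0].strip()
        if not entry or canonicalize_name(Requirement(entry).name) in explicit:
            dst.write(line)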
> You can try it locally easily enough, but my recollection from before is that what you'll find for larger projects is that old releases of some dependencies don't pin upper bounds on their own dependencies and wind up not being usable because they drag in something newer than they can actually use <snip>
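(A concrete way to see that: a historical pin with no ceiling, say 'somedep>=1.0' (the name is made up), is satisfied by any future release, however incompatible:)

from packaging.specifiers import SpecifierSet

# What an old release might declare for its own dependency: a floor, no ceiling.
old_pin = SpecifierSet(">=1.0")
print("5.0" in old_pin)  # True, so the resolver is free to pick something far too new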
This is also why we can't really have a smart-enough solver that tries to minimize dep versions: some deps have no bounds on either side. What would the verdict be then? If it were to install the oldest version ever released, I bet it would fail most of the time.

For me, lower constraints are just too complicated to get right, and, moreover, checking them with unit tests alone is unlikely to be enough to guarantee that they result in working deployments. I don't envy distro packagers, but really the only thing they can bet on is deciding on a set of packaged deps and running tempest against the curated deployment (plus unit tests, as they are much cheaper anyhow).

Thanks to upper-constraints we know there is at least one combination of deps that will work. We can't really ensure any other in a simple manner.

-yoctozepto
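PS: to make "at least one combination that works" concrete, here is a minimal sketch (mine, not an existing tool) that checks an installed environment against upper-constraints; it assumes a local copy of the file:

from importlib.metadata import PackageNotFoundError, version
from packaging.requirements import Requirement

with open("upper-constraints.txt") as f:  # assumed local copy
    for line in f:
        entry = line.split("#")[0].strip()
        if not entry:
            continue
        req = Requirement(entry)  # e.g. "somedep===1.2.3" (made-up name)
        try:
            installed = version(req.name)
        except PackageNotFoundError:
            continue  # not installed here, nothing to check
        if installed not in req.specifier:
            print(f"{req.name}: have {installed}, known-good is {req.specifier}")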