On 2022-02-22 17:56:15 +0100 (+0100), Thomas Goirand wrote:
[...]
> I'm now thinking: we should re-do lower constraints checks!!!
> Your thoughts anyone?
The reason they were dropped is that the current way we install Python library dependencies for testing is to rely on pip, and when pip eventually grew a consistent dependency solver it quickly pointed out that we weren't testing what we thought we were. Back when pip was willing to install mutually conflicting versions of transitive dependencies, the lower bounds we claimed to test were a convenient fiction.

There is no automated tool available which can find a coherent set of lower bounds; pip is focused on finding the highest available versions which meet the specified ranges. One reason such a tool doesn't exist, though, is that as you choose earlier and earlier versions of packages, you also effectively travel back in time to older packaging standards with less available data for making appropriate dependency decisions (and you'll eventually hit bedrock in places where nobody was declaring minimum versions, so you get foo==0.0.1 for lots of foo). It's this "time travel" aspect which is most problematic, because dependencies are declared within the packages themselves, so there is no way to go back later and update the minimum requirements for an already published version.

It's probably feasible to hand-curate, with lots of trial and error, a coherent lower-bound set for a project with a moderate number of dependencies, but every time you update it you have to repeat that same cumbersome manual experimentation due to ripple effects from interdependencies between the packages. At OpenStack's collective scale, it borders on impossible.
-- 
Jeremy Stanley