[all] Lower-constraints in some projects broken - update your repos
Lower-constraints should test that the minimal requirements work together. The way we use the install_command in tox.ini, pip often ignores the constraints and installs a newer package than requested.

First example: cloudkitty (see [1], [2], [3], [4]). The repo has:

install_command = pip install -c{env:UPPER_CONSTRAINTS_FILE:https://releases.openstack.org/constraints/upper/master} {opts} {packages}

and in lower-constraints.txt: "python-keystoneclient==1.9.0". And this runs the test with "keystoneclient==4.0.0" [1]. Removing the install_command [3] or moving the constraints from install_command into deps [2] gives "ERROR: Could not find a version that satisfies the requirement python-keystoneclient==1.9.0". So, this works as expected - but it means this repo never tested what it expected to test.

Second example: kolla-cli [5]. After removing the install_command with constraints, lower-constraints suddenly fails with: "Could not find a version that satisfies the requirement mypy==0.6". Same problem as with cloudkitty.

I fear that every repo that has constraints in its install_command has a broken lower-constraints file. Just remove it and see that nothing works anymore ;/

Therefore, I suggest that repos remove install_command - the default is just fine in newer tox (3.x) - and fix their lower-constraints.txt to really test that the specified package versions work together. Information about lower-constraints.txt is at [6].

Andreas

References:
[1] https://review.opendev.org/720767
[2] https://review.opendev.org/720768
[3] https://review.opendev.org/720770
[4] https://review.opendev.org/720775
[5] https://review.opendev.org/#/c/720754/3
[6] https://docs.openstack.org/project-team-guide/dependency-management.html

--
Andreas Jaeger aj@suse.com Twitter: jaegerandi
SUSE Software Solutions Germany GmbH, Maxfeldstr. 5, D 90409 Nürnberg
(HRB 36809, AG Nürnberg) GF: Felix Imendörffer
GPG fingerprint = EF18 1673 38C4 A372 86B1 E699 5294 24A3 FF91 2ACB
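The suggested fix can be sketched as a tox.ini change; the env names and deps layout below follow the common OpenStack template and are illustrative, not copied from any particular repo:

```ini
# Before: a repo-wide install_command applies *upper* constraints to every
# env, so pip overrides the pins in lower-constraints.txt.
[testenv]
install_command = pip install -c{env:UPPER_CONSTRAINTS_FILE:https://releases.openstack.org/constraints/upper/master} {opts} {packages}

# After: drop install_command entirely (the tox 3.x default is
# "python -m pip install {opts} {packages}") and pass the constraints
# file per-env via deps instead.
[testenv]
deps =
  -c{env:UPPER_CONSTRAINTS_FILE:https://releases.openstack.org/constraints/upper/master}
  -r{toxinidir}/requirements.txt
  -r{toxinidir}/test-requirements.txt

[testenv:lower-constraints]
deps =
  -c{toxinidir}/lower-constraints.txt
  -r{toxinidir}/requirements.txt
  -r{toxinidir}/test-requirements.txt
```

With this layout, the lower-constraints env fails loudly (as in the cloudkitty and kolla-cli examples) when a pinned version cannot actually be installed, instead of silently testing something newer.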
Thanks, Andreas, for the analysis. I have more to add: in Zuul we rely on extra wheels; in the wild, usually only on PyPI ones. This adds more flavour to lower-constraints breakage as py3 incompatibilities make their appearance in full, e.g.:

Collecting MarkupSafe==1.0
  Using cached MarkupSafe-1.0.tar.gz (14 kB)
ERROR: Command errored out with exit status 1:
Complete output (5 lines):
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/tmp/pip-install-_vser3_6/MarkupSafe/setup.py", line 6, in <module>
    from setuptools import setup, Extension, Feature
ImportError: cannot import name 'Feature'

-yoctozepto

On Fri, Apr 17, 2020 at 6:53 PM Andreas Jaeger <aj@suse.com> wrote:
Lower-constraints should test that the minimal requirements work together.
The way we use the install_command in tox.ini, pip often ignores the constraints and installs a newer package than requested.
First example: cloudkitty (see [1], [2], [3], [4])
The repo has:

install_command = pip install -c{env:UPPER_CONSTRAINTS_FILE:https://releases.openstack.org/constraints/upper/master} {opts} {packages}

and in lower-constraints.txt: "python-keystoneclient==1.9.0"
And this runs the test with "keystoneclient==4.0.0" [1]
Removing the install_command [3] or moving constraints from install_command into deps [2] gives "ERROR: Could not find a version that satisfies the requirement python-keystoneclient==1.9.0"
So, this works as expected - but it means this repo never tested what it expected to test.
Second example: kolla-cli [5]. After removing the install_command with constraints, lower-constraints suddenly fails with: "Could not find a version that satisfies the requirement mypy==0.6"
Same problem as with cloudkitty.
I fear that every repo that has constraints in its install_command has a broken lower-constraints file. Just remove it and see that nothing works anymore ;/
Therefore, I suggest that repos remove install_command - the default is just fine in newer tox (3.x) - and fix their lower-constraints.txt to really test that the specified package versions work together.
Information about lower-constraints.txt is at [6].

Andreas
References:
[1] https://review.opendev.org/720767
[2] https://review.opendev.org/720768
[3] https://review.opendev.org/720770
[4] https://review.opendev.org/720775
[5] https://review.opendev.org/#/c/720754/3
[6] https://docs.openstack.org/project-team-guide/dependency-management.html
On 2020-04-17 19:00:31 +0200 (+0200), Radosław Piliszek wrote:
Thanks, Andreas, for the analysis.
I have more to add: in Zuul we rely on extra wheels; in the wild, usually only on PyPI ones. This adds more flavour to lower-constraints breakage as py3 incompatibilities make their appearance in full, e.g.:
Collecting MarkupSafe==1.0
  Using cached MarkupSafe-1.0.tar.gz (14 kB)
ERROR: Command errored out with exit status 1:
Complete output (5 lines):
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/tmp/pip-install-_vser3_6/MarkupSafe/setup.py", line 6, in <module>
    from setuptools import setup, Extension, Feature
ImportError: cannot import name 'Feature'
[...]
I don't understand what you're trying to demonstrate here. MarkupSafe-1.0.tar.gz is an sdist, not a wheel. In our CI jobs we retrieve sdists from PyPI (via a caching proxy), and also do the same for any wheels which are published on PyPI. Some packages which aren't generally available as wheels for our target platforms are supplemented by sets we periodically pre-build and serve from a central wheelhouse, to save time so jobs don't have to redundantly rebuild them all from sdists themselves on every run.

--
Jeremy Stanley
Sorry for the lack of an example; here it is:
https://b702c277c3869af6f0a9-3df34f04e18be629eb587340c626577b.ssl.cf5.rackcd...

Collecting MarkupSafe==1.0
  Downloading http://mirror.bhs1.ovh.openstack.org/wheel/ubuntu-18.04-x86_64/markupsafe/Ma... (31 kB)

-yoctozepto

On Fri, Apr 17, 2020 at 7:43 PM Jeremy Stanley <fungi@yuggoth.org> wrote:
On 2020-04-17 19:00:31 +0200 (+0200), Radosław Piliszek wrote:
Thanks, Andreas, for the analysis.
I have more to add: in Zuul we rely on extra wheels. In the wild usually only on PyPI ones. This adds more flavour to lower-constraints breakage as py3 incompats make their appearance in full, e.g.:
Collecting MarkupSafe==1.0
  Using cached MarkupSafe-1.0.tar.gz (14 kB)
ERROR: Command errored out with exit status 1:
Complete output (5 lines):
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/tmp/pip-install-_vser3_6/MarkupSafe/setup.py", line 6, in <module>
    from setuptools import setup, Extension, Feature
ImportError: cannot import name 'Feature'
[...]
I don't understand what you're trying to demonstrate here. MarkupSafe-1.0.tar.gz is an sdist, not a wheel. In our CI jobs we retrieve sdists from PyPI (via a caching proxy), and also do the same for any wheels which are published on PyPI. Some packages which aren't generally available as wheels for our target platforms are supplemented by sets we periodically pre-build and serve from a central wheelhouse, to save time so jobs don't have to redundantly rebuild them all from sdists themselves on every run.

--
Jeremy Stanley
On 2020-04-17 19:47:22 +0200 (+0200), Radosław Piliszek wrote:
Sorry for the lack of an example; here it is: https://b702c277c3869af6f0a9-3df34f04e18be629eb587340c626577b.ssl.cf5.rackcd...
Collecting MarkupSafe==1.0
  Downloading http://mirror.bhs1.ovh.openstack.org/wheel/ubuntu-18.04-x86_64/markupsafe/Ma... (31 kB)
[...]
Yep, so in that case the job is fetching a MarkupSafe wheel we pre-built with Python 3.6 on Ubuntu 18.04 (Bionic) for 64-bit x86 CPU architectures. That seems normal. If we didn't provide that in the configured wheelhouse, pip would retrieve the MarkupSafe sdist and build an equivalent platform-specific wheel at job runtime to install instead.

The build log you linked shows a failure unrelated to the MarkupSafe package:

ERROR: No matching distribution found for mypy==0.6 (from -c /home/zuul/src/opendev.org/openstack/kolla-cli/lower-constraints.txt (line 43))

The output leading up to that seems to indicate that mypy==0.6 is not a valid/available package version on PyPI, and indeed https://pypi.org/project/mypy/#history confirms that to be the case.

--
Jeremy Stanley
Yeah. That is all true. I just wanted to let you know that lower-constraints may break more often when run outside of CI.

That said, what should lower-constraints really include? I understand [1] that it should just explicitly ask for specific versions of the packages in requirements.txt. However, per what Sean just told me [2], that is seemingly not enough. All in all, it passed the CI checks successfully. :-)

[1] https://docs.openstack.org/project-team-guide/dependency-management.html#upd...
[2] https://review.opendev.org/720754

-yoctozepto

On Fri, Apr 17, 2020 at 8:09 PM Jeremy Stanley <fungi@yuggoth.org> wrote:
On 2020-04-17 19:47:22 +0200 (+0200), Radosław Piliszek wrote:
Sorry for the lack of an example; here it is: https://b702c277c3869af6f0a9-3df34f04e18be629eb587340c626577b.ssl.cf5.rackcd...
Collecting MarkupSafe==1.0
  Downloading http://mirror.bhs1.ovh.openstack.org/wheel/ubuntu-18.04-x86_64/markupsafe/Ma... (31 kB)
[...]
Yep, so in that case the job is fetching a MarkupSafe wheel we pre-built with Python 3.6 on Ubuntu 18.04 (Bionic) for 64-bit x86 CPU architectures. That seems normal. If we didn't provide that in the configured wheelhouse, pip would retrieve the MarkupSafe sdist and build an equivalent platform-specific wheel at job runtime to install instead.
The build log you linked shows a failure unrelated to the Markupsafe package:
ERROR: No matching distribution found for mypy==0.6 (from -c /home/zuul/src/opendev.org/openstack/kolla-cli/lower-constraints.txt (line 43))
The output leading up to that seems to indicate that mypy==0.6 is not a valid/available package version on PyPI, and indeed https://pypi.org/project/mypy/#history confirms that to be the case.

--
Jeremy Stanley
On 4/17/20 1:40 PM, Radosław Piliszek wrote:
Yeah. That is all true.
I just wanted to let you know that lower-constraints may break more often when run outside of CI.
That said, what should lower-constraints really include? I understand [1] that it should just explicitly ask for specific versions of the packages in requirements.txt. However, per what Sean just told me [2], that is seemingly not enough. All in all, it passed the CI checks successfully. :-)
[1] https://docs.openstack.org/project-team-guide/dependency-management.html#upd...
[2] https://review.opendev.org/720754
-yoctozepto
This had to be explained to me a few times before I got it, so don't feel bad. ;)

The lower-constraints file really does need to include everything, not just the project's direct dependencies. If it only contains what is listed in requirements.txt, then there is the possibility that one of the indirect dependencies that does not strictly enforce its own required version will end up pulling in a newer version than what is expected.

The goal of lower-constraints testing is to have a static set of all requirements that are known and validated to work. If we only restrict the 1st-level dependencies, then without changing anything, that known set that was passing tests could suddenly stop working.

We actually had a good discussion of this on IRC today, so it might be helpful to read that for a little more context:

http://eavesdrop.openstack.org/irclogs/%23openstack-sdks/%23openstack-sdks.2...

I think ultimately the goal when this was added was to be able to communicate downstream an acceptable range of compatible packages that could be installed together and expected to work. We could probably be more aggressive about raising those minimums to keep picking up newer things. But unless we can get a static(ish) snapshot of the whole dependency graph like this, even that would likely break often. Or not accurately test what we think we are testing.
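Sean's point about indirect dependencies can be made concrete with a toy dependency walk (the package metadata below is invented for illustration; in reality it comes from each package's own requirements):

```python
# Toy illustration: why pinning only direct deps is not enough.
# The dependency metadata below is invented, not real PyPI data.
metadata = {
    "myproj": ["python-keystoneclient"],
    "python-keystoneclient": ["oslo.config"],
    "oslo.config": ["stevedore"],
    "stevedore": [],
}

def transitive_deps(root):
    """Walk the dependency graph and collect everything that gets installed."""
    seen, stack = set(), [root]
    while stack:
        pkg = stack.pop()
        for dep in metadata[pkg]:
            if dep not in seen:
                seen.add(dep)
                stack.append(dep)
    return seen

direct = set(metadata["myproj"])       # what requirements.txt lists
full = transitive_deps("myproj")       # what pip actually installs
# Anything in the full set that lower-constraints.txt does not pin
# floats to whatever version pip picks on the day of the test run.
unpinned = full - direct
print(sorted(unpinned))  # → ['oslo.config', 'stevedore']
```

The unpinned set is exactly the gap Sean describes: the job passes today, but nothing stops those packages from drifting tomorrow.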
On Fri, Apr 17, 2020 at 9:06 PM Sean McGinnis <sean.mcginnis@gmx.com> wrote:
This had to be explained to me a few times before I got it, so don't feel bad. ;)
No problem, I don't. :-) Cannot feel bad when docs say what they say.
The lower-constraints file really does need to include everything, not just the project's direct dependencies. If it only contains what is listed in requirements.txt, then there is the possibility that one of the indirect dependencies that does not strictly enforce its own required version will end up pulling in a newer version than what is expected.
The goal of lower constraints testing is to have a static set of all requirements that are known and validated to work. If we only restrict the 1st level dependencies, then without changing anything, that known set that was passing tests could suddenly stop working.
We actually had a good discussion of this on IRC today, so it might be helpful to read that for a little more context:
http://eavesdrop.openstack.org/irclogs/%23openstack-sdks/%23openstack-sdks.2...
I read it all and agree with Dmitry that this would make the most sense if we had to pin only direct deps (which is exactly what I did). However, I see the point that pip has shortcomings in dependency resolution and may end up with a broken dependency chain.
I think ultimately the goal when this was added was to be able to communicate downstream an acceptable range of compatible packages that could be installed together and expect to work.
Let's ask packagers if they ever used this info. Maybe we are doing something just for doing it (and not really *doing* it actually). ;-)
We could probably be more aggressive about raising those minimums to keep picking up newer things. But unless we can get a static(ish) snapshot of the whole dependency graph like this, even that would likely break often. Or not accurately test what we think we are testing.
These are pain points for sure. On the one hand, it looks like lower-constraints tries to be like an npm/yarn lockfile in the nodejs world [1] [2]. On the other, it looks like it is there to ensure our lower constraints are modern enough to handle our code. But it seemingly falls short of doing either, because the first is not enforced, and the second is additionally limited in that we only run unit testing (as far as I could see), whereas real deps testing will actually happen in functional testing (as in unit testing a ton of functionality is mocked anyway, so we may often end up testing importability at most).

All in all, I think we discovered shortcomings in our methodology, tooling and docs, and it's a good topic for a meeting agenda (reqs team? QA? TC?).

[1] https://docs.npmjs.com/configuring-npm/package-locks.html
[2] https://classic.yarnpkg.com/en/docs/yarn-lock/

-yoctozepto
On 2020-04-18 10:41:16 +0200 (+0200), Radosław Piliszek wrote: [...]
Let's ask packagers if they ever used this info. Maybe we are doing something just for doing it (and not really *doing* it actually). ;-) [...]
Distro package maintainers asked us to indicate what ranges of dependencies our projects can work with. Some projects additionally want to be able to assert that they support older versions of some of their dependencies when deployed in a stand-alone fashion.

It's been only relatively recently that we switched from enforcing exactly matching version ranges for dependencies across all OpenStack projects to allowing them to indicate that they support older versions of software than some other projects. This assertion isn't much use, though, if it's not tested. Providing a project-specific list of lower constraints allows a project to verify that its indicated minimum dependency versions are actually working in some capacity, and at the same time provides a list of the exact versions of its entire transitive dependency set used in performing those tests.
it looks like lower constraints tries to be like npm/yarn lockfile in nodejs world [1] [2].
The constraints mechanism in pip was actually added by members of the OpenStack community as a stop-gap until pip could grow an intelligent dependency solver. It allows us to pre-compute a known compatible set of package versions.

I'm not really that familiar with the state of the art in the Javascript ecosystem, but pip constraints lists are not a list of dependencies; they're a list of versions the installer should use to override its normal version selection for any matching package names in your list of dependencies, and in their transitive dependency chain as well. This allows you to keep your declaration of supported direct dependencies and their possible ranges of versions separate from the list of exact versions you want applied for some particular scenario.

Because a constraints list is treated as a sieve, it can be shared between multiple projects in a particular deployment scenario without changing the list of packages which will be installed for each (merely which versions of them are installed).
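That sieve behaviour can be modelled in a few lines of Python (a toy model of version selection, not pip's actual resolver; the package names and version lists are illustrative):

```python
# Toy model of pip's -c behaviour: a constraints file never *adds*
# packages, it only narrows which version is chosen for packages that
# were requested anyway. Version lists are ordered oldest to newest.
available = {
    "markupsafe": ["0.23", "1.0", "1.1.1"],
    "jinja2": ["2.8", "2.10", "2.11.1"],
}
# The flask pin is inert: nothing requests flask, so it is never installed.
constraints = {"markupsafe": "1.0", "flask": "0.12"}

def pick(requested):
    chosen = {}
    for name in requested:
        versions = available[name]
        if name in constraints:
            # Constraint acts as a sieve over the candidate versions.
            versions = [v for v in versions if v == constraints[name]]
        # Without a constraint, take the newest candidate in the list.
        chosen[name] = versions[-1]
    return chosen

print(pick(["markupsafe", "jinja2"]))
# → {'markupsafe': '1.0', 'jinja2': '2.11.1'}
```

Note how the constrained package is held back, the unconstrained one floats to the newest version, and the pin for a package nobody requested has no effect, which is exactly why the same constraints list can be shared across projects.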
On the other, that it is to ensure our lower constraints are modern enough to handle our code. But it seemingly falls short doing either, because the first is not enforced,
In what way is it not enforced? Or put another way, what were you expecting it to enforce which it doesn't?
and the second is additionally limited in that we only run unit testing (as far as I could see) where real deps testing will actually happen in functional testing (as in unit testing a ton of functionality is mocked anyway, so we may often end up testing importability at most). [...]
There is nothing stopping folks from applying constraints lists in whatever Python-based jobs they want.

--
Jeremy Stanley
On Sat, Apr 18, 2020 at 2:56 PM Jeremy Stanley <fungi@yuggoth.org> wrote:
On 2020-04-18 10:41:16 +0200 (+0200), Radosław Piliszek wrote:
it looks like lower constraints tries to be like npm/yarn lockfile in nodejs world [1] [2].
The constraints mechanism in pip was actually added by members of the OpenStack community as a stop-gap until pip could grow an intelligent dependency solver. It allows us to pre-compute a known compatible set of package versions.

I'm not really that familiar with the state of the art in the Javascript ecosystem, but pip constraints lists are not a list of dependencies; they're a list of versions the installer should use to override its normal version selection for any matching package names in your list of dependencies, and in their transitive dependency chain as well. This allows you to keep your declaration of supported direct dependencies and their possible ranges of versions separate from the list of exact versions you want applied for some particular scenario.

Because a constraints list is treated as a sieve, it can be shared between multiple projects in a particular deployment scenario without changing the list of packages which will be installed for each (merely which versions of them are installed).
Sure, the upper-constraints are doing that just fine globally. The issue is that the lower-constraints are not really trustworthy.
On the other, that it is to ensure our lower constraints are modern enough to handle our code. But it seemingly falls short doing either, because the first is not enforced,
In what way is it not enforced? Or put another way, what were you expecting it to enforce which it doesn't?
Oh, I mean the lockfile part. If lower-constraints jobs pass without enforcing each transitive dependency, then it's not enforced in this way.
and the second is additionally limited in that we only run unit testing (as far as I could see) where real deps testing will actually happen in functional testing (as in unit testing a ton of functionality is mocked anyway, so we may often end up testing importability at most). [...]
There is nothing stopping folks from applying constraints lists in whatever Python-based jobs they want.
Indeed, I'm just pointing it out. -yoctozepto
On 2020-04-18 16:18:24 +0200 (+0200), Radosław Piliszek wrote:
On Sat, Apr 18, 2020 at 2:56 PM Jeremy Stanley <fungi@yuggoth.org> wrote: [...]
In what way is it not enforced? Or put another way, what were you expecting it to enforce which it doesn't?
Oh, I mean the lockfile part. If lower-constraints jobs pass without enforcing each transitive dependency, then it's not enforced in this way. [...]
I wouldn't mind digging into a specific example of this. It seems likely to be one (or more) of:

* an incorrect or incomplete configuration
* a misunderstanding about what is being constrained
* a bug in pip or setuptools
* a broken CI job

The way it's supposed to work is that when pip decides to install a package (whether directly or as a dependency of something else), it checks the available versions of that package against the supplied list of version constraints and errors if there is no available version of the package which meets those constraints. If that's not what's happening, then something's clearly wrong.

--
Jeremy Stanley
On Sat, Apr 18, 2020 at 5:18 PM Jeremy Stanley <fungi@yuggoth.org> wrote:
On 2020-04-18 16:18:24 +0200 (+0200), Radosław Piliszek wrote:
On Sat, Apr 18, 2020 at 2:56 PM Jeremy Stanley <fungi@yuggoth.org> wrote: [...]
In what way is it not enforced? Or put another way, what were you expecting it to enforce which it doesn't?
Oh, I mean the lockfile part. If lower-constraints jobs pass without enforcing each transitive dependency, then it's not enforced in this way. [...]
I wouldn't mind digging into a specific example of this. It seems likely to be one (or more) of: <snip>
Be my guest. The case is about *transitive* dependencies, not direct ones. See the already mentioned kolla-cli change [1]. This is what is not enforced (except by Sean's legit -1 :-) ).

[1] https://review.opendev.org/720754

-yoctozepto
On 2020-04-18 17:22:38 +0200 (+0200), Radosław Piliszek wrote:
On Sat, Apr 18, 2020 at 5:18 PM Jeremy Stanley <fungi@yuggoth.org> wrote:
On 2020-04-18 16:18:24 +0200 (+0200), Radosław Piliszek wrote:
On Sat, Apr 18, 2020 at 2:56 PM Jeremy Stanley <fungi@yuggoth.org> wrote: [...]
In what way is it not enforced? Or put another way, what were you expecting it to enforce which it doesn't?
Oh, I mean the lockfile part. If lower-constraints jobs pass without enforcing each transitive dependency, then it's not enforced in this way. [...]
I wouldn't mind digging into a specific example of this. It seems likely to be one (or more) of: <snip>
Be my guest. The case is about *transitive* dependencies, not direct. See the already mentioned kolla-cli change. [1] This is what is not enforced (except for Sean's legit -1 :-) ).
Thanks. So you're asserting that the problem here is that 720754,4 has a passing openstack-tox-lower-constraints build but you think it should not? Can you explain a little more as to why you think it should have failed?

I see you removing a bunch of package versions from the constraints list, but that's not what the job is intended to catch. It's there to find out if tests pass using the versions you're saying you want installed. If that's the concern, it's like saying a job should fail if you remove some tests from it. We run jobs to tell us if the things we want to test work, not to tell us that we've stopped testing something (coverage jobs being an obvious exception there).

What I would consider a problem with the job is if the constraints file specified one version of a package but a different version of that package got installed instead. If *that's* what's happening (though skimming the logs I don't see any evidence of it) then I agree something is wrong and we should seek to fix it. I did at least double-check that the entries you've left in lower-constraints.txt match those pip installed according to tox/lower-constraints-1.log, so it looks to me like it's working as designed.

--
Jeremy Stanley
On Sat, Apr 18, 2020 at 6:01 PM Jeremy Stanley <fungi@yuggoth.org> wrote:
On 2020-04-18 17:22:38 +0200 (+0200), Radosław Piliszek wrote:
On Sat, Apr 18, 2020 at 5:18 PM Jeremy Stanley <fungi@yuggoth.org> wrote:
On 2020-04-18 16:18:24 +0200 (+0200), Radosław Piliszek wrote:
On Sat, Apr 18, 2020 at 2:56 PM Jeremy Stanley <fungi@yuggoth.org> wrote: [...]
In what way is it not enforced? Or put another way, what were you expecting it to enforce which it doesn't?
Oh, I mean the lockfile part. If lower-constraints jobs pass without enforcing each transitive dependency, then it's not enforced in this way. [...]
I wouldn't mind digging into a specific example of this. It seems likely to be one (or more) of: <snip>
Be my guest. The case is about *transitive* dependencies, not direct. See the already mentioned kolla-cli change. [1] This is what is not enforced (except for Sean's legit -1 :-) ).
Thanks. So you're asserting that the problem here is that 720754,4 has a passing openstack-tox-lower-constraints build but you think it should not? Can you explain a little more as to why you think it should have failed?
I am not sure whether it should. I am just asking how far this thing should go. This is all regarding Sean's comment that I removed too much. No other agenda. :D
I see you removing a bunch of package versions from the constraints list, but that's not what the job is intended to catch. It's there to find out if tests pass using the versions you're saying you want installed. If that's the concern, it's like saying a job should fail if you remove some tests from it. We run jobs to tell us if the things we want to test work, not to tell us that we've stopped testing something (coverage jobs being an obvious exception there).
OK, that makes it clear. No coverage, just extra install-pinning fun. Too lazy to test, but I guess an empty lower-constraints.txt would make tox happy as well.
What I would consider a problem with the job is if the constraints file specified one version of a package but a different version of that package got installed instead. If *that's* what's happening (though skimming the logs I don't see any evidence of it) then I agree something is wrong and we should seek to fix it. I did at least double-check that the entries you've left in lower-constraints.txt match those pip installed according to tox/lower-constraints-1.log, so it looks to me like it's working as designed.
Nah, it's surely not happening. The point was really about coverage and what we (as the OpenStack community) actually want from lower-constraints.

-yoctozepto
On 2020-04-18 18:54:33 +0200 (+0200), Radosław Piliszek wrote:
On Sat, Apr 18, 2020 at 6:01 PM Jeremy Stanley <fungi@yuggoth.org> wrote: [...]
What I would consider a problem with the job is if the constraints file specified one version of a package but a different version of that package got installed instead. If *that's* what's happening (though skimming the logs I don't see any evidence of it) then I agree something is wrong and we should seek to fix it. I did at least double-check that the entries you've left in lower-constraints.txt match those pip installed according to tox/lower-constraints-1.log, so it looks to me like it's working as designed.
Nah, it's surely not happening. The point was really about coverage and what we want from lower-constraints really (as OpenStack community).
In another subthread, Andreas has pointed out an example where the lower-constraints job was actually ignoring mismatched requirements under some circumstances, which does seem buggy. If we want to keep these jobs, we should probably find a way to make them fail on unsatisfiable constraints sets.

Subsequent investigation has, however, led me to suspect that the effort involved in establishing and maintaining a coherent set of lower constraints for many projects is currently untenable.

--
Jeremy Stanley
More fun: change https://review.opendev.org/720877 removes:

install_command = pip install {opts} {packages}

And suddenly lower-constraints fails - and the failure looks correct, so removing it is fine but will cause a surprise ;(

Andreas
On Sat, Apr 18, 2020 at 6:14 PM Andreas Jaeger <aj@suse.com> wrote:
More fun:
change https://review.opendev.org/720877 removes:
install_command = pip install {opts} {packages}
And suddenly lower-constraints fails - and the failure looks correct, so removing it is fine but will cause a surprise ;(
Umm, is there something special about this? It looks like the kolla-cli case to me.

-yoctozepto
On 2020-04-18 18:11:58 +0200 (+0200), Andreas Jaeger wrote:
More fun:
change https://review.opendev.org/720877 removes:
install_command = pip install {opts} {packages}
And suddenly lower-constraints fails - and the failure looks correct, so removing it is fine but will cause a surprise ;(
After analyzing this over IRC, it looks like openstack/networking-hyperv probably has an incoherent lower-constraints.txt file, because it asks for (one of many examples) neutron-lib==1.28.0 and oslo.concurrency==3.25.0, but neutron-lib 1.28.0 declares it requires oslo.concurrency>=3.26.0, so this won't work.

The lower-constraints jobs are built on the premise that they're working with a satisfiable set of constraints. Any time you alter a lower bound for any dependency or add a new dependency, you'll need to *fully* rebuild your lower-constraints.txt because of the possible knock-on effects such a change can have on the transitive dependencies tracked there.

https://review.opendev.org/675572 replaced neutron-lib>=1.18.0 with neutron-lib>=1.28.0 in the requirements.txt, but only edited the corresponding entry in lower-constraints.txt instead of taking all of neutron-lib's dependencies (and their dependencies, and so on) into account. The result is a lower-constraints job which isn't testing what they think it's testing.

The good news is that's all fixable. The bad news is that this problem is probably widespread, because the places I expected to talk about how to properly maintain your lower-constraints.txt file provide minimal and potentially misleading advice:

https://docs.openstack.org/project-team-guide/dependency-management.html#add...
https://docs.openstack.org/project-team-guide/dependency-management.html#upd...

Neither of those suggests rebuilding your lower-constraints.txt when you alter a lower bound or add a new requirement. Further, I thought I remembered there being a utility in the openstack/requirements repository for correctly generating a lower-constraints.txt file, but this does not actually exist that I can find.
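The incoherence described here could in principle be caught mechanically by checking every pin against the minimums declared by the other pinned packages. A toy sketch, with the neutron-lib/oslo.concurrency numbers hand-copied from the example above and the declared-minimum metadata otherwise invented (versions are modelled as plain tuples):

```python
# Toy coherence check for a lower-constraints set: every pinned package's
# own declared minimums must be satisfied by the other pins.
# Versions are tuples; the metadata is illustrative, not parsed from PyPI.
pins = {"neutron-lib": (1, 28, 0), "oslo.concurrency": (3, 25, 0)}
declared_minimums = {
    # neutron-lib 1.28.0 declares it requires oslo.concurrency>=3.26.0
    "neutron-lib": {"oslo.concurrency": (3, 26, 0)},
    "oslo.concurrency": {},
}

def find_conflicts(pins, declared_minimums):
    """Return (package, dep, pinned_version, required_minimum) tuples."""
    conflicts = []
    for pkg, requires in declared_minimums.items():
        for dep, minimum in requires.items():
            if dep in pins and pins[dep] < minimum:
                conflicts.append((pkg, dep, pins[dep], minimum))
    return conflicts

for pkg, dep, pinned, minimum in find_conflicts(pins, declared_minimums):
    print(f"{pkg} needs {dep}>={minimum}, but lower-constraints pins {pinned}")
```

A real checker would need to read each pinned package's actual metadata and handle full version specifiers, but even this shape would have flagged the networking-hyperv file.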
The more I think back, the more I'm starting to remember that I suggested the only thorough way (short of reimplementing pip's version selection logic ourselves) to generate a lower-constraints file from the lower bounds in a requirements set would be to use an altered copy of pip which inverted its version comparisons, so that it selected the lowest available versions of packages which would satisfy every version range rather than the highest. I don't think anybody has made that, and so I expect instead we've collectively hand-waved with an assumption that people would be able to figure out a working set of lower constraints instead (which is in my opinion rather unrealistic given the complexity of the transitive dependency trees of even some of our medium-sized projects).

The upshot of this is that correctly calculating a lower-constraints set for any project is likely to involve a lot of trial and error, probably far more than most projects are going to want to bear. If we had a tool we could run to generate a coherent lower-constraints.txt for a project, then this might be different, but I suspect that too is going to be more hassle than anyone wants to invest their time in creating.

--
Jeremy Stanley
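The "inverted pip" idea boils down to selecting the minimum candidate satisfying every declared range instead of the maximum. A toy sketch of that selection step (available versions invented for illustration; a real tool would also have to recurse through transitive dependencies):

```python
# Toy sketch of the "inverted pip" idea: given a package's declared
# version ranges, select the lowest published version instead of the
# highest. Versions are tuples; the available list is invented.
available = {"neutron-lib": [(1, 18, 0), (1, 28, 0), (1, 29, 1)]}

def select(name, ranges, lowest=False):
    """ranges: list of (op, version) pairs with op in {'>=', '<'}."""
    candidates = [
        v for v in available[name]
        if all(v >= bound if op == ">=" else v < bound for op, bound in ranges)
    ]
    if not candidates:
        raise LookupError(f"no version of {name} satisfies {ranges}")
    # Normal pip behaviour picks max(); the inverted variant picks min().
    return min(candidates) if lowest else max(candidates)

ranges = [(">=", (1, 28, 0))]
print(select("neutron-lib", ranges))               # → (1, 29, 1), newest match
print(select("neutron-lib", ranges, lowest=True))  # → (1, 28, 0), lowest published match
```

The interesting detail the min() variant captures is that the lowest *satisfying* version need not equal the declared lower bound: if the bound version was never published (like mypy==0.6 earlier in the thread), only the inverted resolution against real version lists produces a pin that can actually install.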
On Apr 18, 2020, at 1:29 PM, Jeremy Stanley <fungi@yuggoth.org> wrote:
On 2020-04-18 18:11:58 +0200 (+0200), Andreas Jaeger wrote:
More fun:
change https://review.opendev.org/720877 removes:
install_command = pip install {opts} {packages}
And suddenly lower-constraints fails - and the failure looks correct, so removing it is fine but will cause a surprise ;(
After analyzing this over IRC, it looks like openstack/networking-hyperv probably has an incoherent lower-constraints.txt file because it asks for (one of many examples) neutron-lib==1.28.0 and oslo.concurrency==3.25.0 but neutron-lib 1.28.0 declares it requires oslo.concurrency>=3.26.0 so this won't work.
The lower constraints jobs are built on the premise that they're working with a satisfiable set of constraints. Any time you alter a lower bound for any dependency or add a new dependency, you'll need to *fully* rebuild your lower-constraints.txt because of the possible knock-on effects such a change can have on the transitive dependencies tracked there.
https://review.opendev.org/675572 replaced neutron-lib>=1.18.0 with neutron-lib>=1.28.0 in the requirements.txt, but only edited the corresponding entry in lower-constraints.txt instead of taking all of neutron-lib's dependencies (and their dependencies, and so on) into account. The result is a lower constraints job which isn't testing what they think it's testing.
The good news is that's all fixable. The bad news is that this problem is probably widespread, because the places I expected to talk about how to properly maintain your lower-constraints.txt file provide minimal and potentially misleading advice:
https://docs.openstack.org/project-team-guide/dependency-management.html#add...
https://docs.openstack.org/project-team-guide/dependency-management.html#upd...
Neither of those suggest rebuilding your lower-constraints.txt when you alter a lower bound or add a new requirement. Further, I thought I remembered there being a utility in the openstack/requirements repository for correctly generating a lower-constraints.txt file, but this does not actually exist that I can find.
IIRC, the scripting I did way back when [1] was pretty naive. It looked at requirements.txt, converted >= to == and assumed either that any second-order dependencies installed as a result of that set of constraints were the lowest likely to work, or that teams would take care of editing the list.
There used to be an update-requirements tool in the requirements repo, but it was removed as part of that work and I’m not sure it ever dealt with lower bounds.
[1] http://lists.openstack.org/pipermail/openstack-dev/2018-March/128352.html
The more I think back, the more I'm starting to remember that I suggested the only thorough way (short of reimplementing pip's version selection logic ourselves) to generate a lower constraints file from the lower bounds in a requirements set would be by using an altered copy of pip which inverted its version comparisons so that it selected the lowest available versions of packages which would satisfy every version range rather than the highest. I don't think anybody has made that, and so I expect instead we've collectively hand-waved with an assumption that people would be able to figure out a working set of lower constraints instead (which is in my opinion rather unrealistic given the complexity of the transitive dependency trees of even some of our medium-sized projects).
The upshot of this is that correctly calculating a lower constraint set for any project is likely to involve a lot of trial and error, probably far more than most projects are going to want to bear. If we had a tool we could run to generate a coherent lower-constraints.txt for a project, then this might be different, but I suspect that too is going to be more hassle than anyone wants to invest their time in creating.
Yes, it seems that very few distributions actually want the *lowest* values, either. What they seem to want is something that matches what they are already packaging, which is likely to be less than the version in the upper constraints list.
-- Jeremy Stanley
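The first-order half of the mismatch Jeremy describes (a requirements.txt lower bound that was raised while the matching `==` pin in lower-constraints.txt was not) can at least be detected mechanically. Below is a minimal, hypothetical sketch of such a check; the function and regex names are invented for illustration, it ignores PEP 440 subtleties (markers, extras, pre-releases), and it deliberately does not attempt the hard part, namely the transitive dependencies (e.g. neutron-lib 1.28.0 itself requiring oslo.concurrency>=3.26.0):

```python
import re

# Naive patterns for "name>=version" requirements and "name==version" pins.
# Real requirement lines can carry markers and extras; this is a sketch only.
REQ_RE = re.compile(r'^\s*([A-Za-z0-9._-]+)\s*>=\s*([0-9][0-9A-Za-z.]*)')
PIN_RE = re.compile(r'^\s*([A-Za-z0-9._-]+)\s*==\s*([0-9][0-9A-Za-z.]*)')

def parse(lines, pattern):
    """Collect {name: version} from lines matching the given pattern."""
    found = {}
    for line in lines:
        line = line.split('#', 1)[0]  # drop trailing comments
        m = pattern.match(line)
        if m:
            found[m.group(1).lower()] = m.group(2)
    return found

def mismatched_bounds(requirements, lower_constraints):
    """Return {name: (lower_bound, pinned)} where the two files disagree."""
    reqs = parse(requirements, REQ_RE)
    pins = parse(lower_constraints, PIN_RE)
    return {name: (bound, pins.get(name))
            for name, bound in reqs.items()
            if pins.get(name) != bound}

# Example mirroring the thread: the requirement was raised to >=1.28.0
# but the constraints file still pins the old lower bound.
reqs = ["neutron-lib>=1.28.0", "oslo.concurrency>=3.25.0"]
lower = ["neutron-lib==1.18.0", "oslo.concurrency==3.25.0"]
print(mismatched_bounds(reqs, lower))
# prints {'neutron-lib': ('1.28.0', '1.18.0')}
```

Catching the second-order conflicts would additionally require reading each pinned package's own declared requirements, which is exactly where hand maintenance breaks down.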
On Apr 19, 2020, at 10:37 AM, Doug Hellmann <doug@doughellmann.com> wrote:
On Apr 18, 2020, at 1:29 PM, Jeremy Stanley <fungi@yuggoth.org> wrote:
On 2020-04-18 18:11:58 +0200 (+0200), Andreas Jaeger wrote:
More fun:
change https://review.opendev.org/720877 removes:
install_command = pip install {opts} {packages}
And suddenly lower-constraints fails - and the failure looks correct, so removing it is fine but will cause a surprise ;(
After analyzing this over IRC, it looks like openstack/networking-hyperv probably has an incoherent lower-constraints.txt file because it asks for (one of many examples) neutron-lib==1.28.0 and oslo.concurrency==3.25.0 but neutron-lib 1.28.0 declares it requires oslo.concurrency>=3.26.0 so this won't work.
The lower constraints jobs are built on the premise that they're working with a satisfiable set of constraints. Any time you alter a lower bound for any dependency or add a new dependency, you'll need to *fully* rebuild your lower-constraints.txt because of the possible knock-on effects such a change can have on the transitive dependencies tracked there.
https://review.opendev.org/675572 replaced neutron-lib>=1.18.0 with neutron-lib>=1.28.0 in the requirements.txt, but only edited the corresponding entry in lower-constraints.txt instead of taking all of neutron-lib's dependencies (and their dependencies, and so on) into account. The result is a lower constraints job which isn't testing what they think it's testing.
The good news is that's all fixable. The bad news is that this problem is probably widespread, because the places I expected to talk about how to properly maintain your lower-constraints.txt file provide minimal and potentially misleading advice:
https://docs.openstack.org/project-team-guide/dependency-management.html#add...
https://docs.openstack.org/project-team-guide/dependency-management.html#upd...
Neither of those suggest rebuilding your lower-constraints.txt when you alter a lower bound or add a new requirement. Further, I thought I remembered there being a utility in the openstack/requirements repository for correctly generating a lower-constraints.txt file, but this does not actually exist that I can find.
IIRC, the scripting I did way back when [1] was pretty naive. It looked at requirements.txt, converted >= to == and assumed either that any second order dependencies installed as a result of that set of constraints were the lowest likely to work or that teams would take care of editing the list.
There used to be an update-requirements tool in the requirements repo, but it was removed as part of that work and I’m not sure it ever dealt with lower bounds.
[1] http://lists.openstack.org/pipermail/openstack-dev/2018-March/128352.html
The more I think back, the more I'm starting to remember that I suggested the only thorough way (short of reimplementing pip's version selection logic ourselves) to generate a lower constraints file from the lower bounds in a requirements set would be by using an altered copy of pip which inverted its version comparisons so that it selected the lowest available versions of packages which would satisfy every version range rather than the highest. I don't think anybody has made that, and so I expect instead we've collectively hand-waved with an assumption that people would be able to figure out a working set of lower constraints instead (which is in my opinion rather unrealistic given the complexity of the transitive dependency trees of even some of our medium-sized projects).
https://github.com/pypa/pip/pull/8086 might be useful. It doesn’t force the use of lower bounds, but it should result in the minimum versions being selected where possible.
The upshot of this is that correctly calculating a lower constraint set for any project is likely to involve a lot of trial and error, probably far more than most projects are going to want to bear. If we had a tool we could run to generate a coherent lower-constraints.txt for a project, then this might be different, but I suspect that too is going to be more hassle than anyone wants to invest their time in creating.
Yes, it seems that very few distributions actually want the *lowest* values, either. What they seem to want is something that matches what they are already packaging, which is likely to be less than the version in the upper constraints list.
-- Jeremy Stanley
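The "inverted pip" idea above can be illustrated in miniature: keep the same candidate filtering, but flip the final selection so the resolver prefers the lowest version that satisfies every range rather than the highest. This toy sketch (invented names, naive dotted-version parsing, no backtracking across packages) only demonstrates that single flipped step, not a real PEP 440 resolver:

```python
# Comparison operators a specifier may use in this toy model.
OPS = {
    '>=': lambda a, b: a >= b,
    '<=': lambda a, b: a <= b,
    '<':  lambda a, b: a < b,
    '>':  lambda a, b: a > b,
    '==': lambda a, b: a == b,
}

def vtuple(v):
    # Naive dotted-version parsing; real PEP 440 versions need far more care.
    return tuple(int(p) for p in v.split('.'))

def select(available, specs, lowest=True):
    """Pick the lowest (or highest) available version meeting every spec.

    specs is a list of (operator, bound) pairs, e.g. ('>=', '3.26.0').
    Returns None when no candidate satisfies all of them.
    """
    ok = [v for v in available
          if all(OPS[op](vtuple(v), vtuple(bound)) for op, bound in specs)]
    if not ok:
        return None
    return min(ok, key=vtuple) if lowest else max(ok, key=vtuple)

# The networking-hyperv example: the project's own bound plus the one
# neutron-lib 1.28.0 imposes on oslo.concurrency.
avail = ['3.24.0', '3.25.0', '3.26.0', '3.27.1']
specs = [('>=', '3.25.0'), ('>=', '3.26.0')]
print(select(avail, specs))                # lowest coherent choice: 3.26.0
print(select(avail, specs, lowest=False))  # what normal pip picks: 3.27.1
```

The hard part a real tool would face is not this selection step but the backtracking: lowering one package changes which versions of its dependents are valid, exactly the knock-on effect described earlier in the thread.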
On 18/04/2020 18.11, Andreas Jaeger wrote:
More fun:
change https://review.opendev.org/720877 removes:
install_command = pip install {opts} {packages}
And suddenly lower-constraints fails - and the failure looks correct, so removing it is fine but will cause a surprise ;(
Andreas
So, as diagnosed by Jeremy on IRC, the old command ignores failures; check this output:
https://zuul.opendev.org/t/openstack/build/cbac188c8bcf4125b38c91fc229a4c3b/...
It shows errors - but the installation succeeds anyway; full results in https://zuul.opendev.org/t/openstack/build/cbac188c8bcf4125b38c91fc229a4c3b
So, there are no new errors - we finally see them ;(
More in Jeremy's email,
Andreas
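One way to make this class of silent failure loud, sketched below with a throwaway venv and a deliberately tiny stand-in for a project's real requirements and constraints files: run `pip check` after the install and honor its exit code, so a dependency set that contradicts its own declared requirements fails the job instead of slipping through. This is an illustration of the diagnostic idea, not the actual job definition:

```shell
set -e
tmp=$(mktemp -d)
python3 -m venv "$tmp/env"

# Stand-in requirements and an empty constraints file; a real run would use
# the repo's requirements.txt and lower-constraints.txt instead.
printf 'pip\n' > "$tmp/requirements.txt"
: > "$tmp/lower-constraints.txt"

"$tmp/env/bin/pip" install -q -r "$tmp/requirements.txt" \
    -c "$tmp/lower-constraints.txt"

# Non-zero exit here means the installed set is internally inconsistent,
# exactly the condition the old install_command let pass unnoticed.
"$tmp/env/bin/pip" check
```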
On Sat, Apr 18, 2020 at 7:48 PM Andreas Jaeger <aj@suse.com> wrote:
On 18/04/2020 18.11, Andreas Jaeger wrote:
More fun:
change https://review.opendev.org/720877 removes:
install_command = pip install {opts} {packages}
And suddenly lower-constraints fails - and the failure looks correct, so removing it is fine but will cause a surprise ;(
Andreas
So, as diagnosed by Jeremy on IRC, the old command ignores failures, check this output:
https://zuul.opendev.org/t/openstack/build/cbac188c8bcf4125b38c91fc229a4c3b/...
It shows errors - but the installation succeeds anyway; full results in https://zuul.opendev.org/t/openstack/build/cbac188c8bcf4125b38c91fc229a4c3b
So, there are no new errors - we finally see them ;(
Hmm, but I see that stage 1 behaves the same in either patch:
https://zuul.opendev.org/t/openstack/build/83104f33d2c24ac5bab700583066e2a7/...
It is only stage 2 that fails:
https://zuul.opendev.org/t/openstack/build/83104f33d2c24ac5bab700583066e2a7/...
-yoctozepto
On Sat, 18 Apr 2020 at 19:48, Andreas Jaeger <aj@suse.com> wrote:
On 18/04/2020 18.11, Andreas Jaeger wrote:
More fun:
change https://review.opendev.org/720877 removes:
install_command = pip install {opts} {packages}
And suddenly lower-constraints fails - and the failure looks correct, so removing it is fine but will cause a surprise ;(
Andreas
So, as diagnosed by Jeremy on IRC, the old command ignores failures, check this output:
https://zuul.opendev.org/t/openstack/build/cbac188c8bcf4125b38c91fc229a4c3b/...
It shows errors - but the installation succeeds anyway; full results in https://zuul.opendev.org/t/openstack/build/cbac188c8bcf4125b38c91fc229a4c3b
So, there are no new errors - we finally see them ;(
More in Jeremy's email,
Ah, that explains what happened in neutron when we dropped the install_command then: https://review.opendev.org/#/c/694568/
Same result, dropping the line caused the lower-constraints job to start failing and show the errors previously ignored.
-- Bernard Cafarelli
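The fixes discussed across the thread converge on the same shape: drop the custom install_command (the tox 3.x default, `python -m pip install {opts} {packages}`, is fine) and let each env pass its own constraints through deps so pip actually enforces them. A hedged tox.ini sketch of that shape follows; section names and the environment-variable fallback mirror common OpenStack usage, but treat it as illustrative rather than any project's exact patch:

```ini
[testenv]
# No install_command override: the tox 3.x default is used, so constraints
# given in deps are honored instead of being overridden per install step.
deps =
  -c{env:UPPER_CONSTRAINTS_FILE:https://releases.openstack.org/constraints/upper/master}
  -r{toxinidir}/requirements.txt
  -r{toxinidir}/test-requirements.txt

[testenv:lower-constraints]
# The lower-constraints env swaps in the local pins; if they are incoherent,
# pip now fails loudly instead of quietly installing newer versions.
deps =
  -c{toxinidir}/lower-constraints.txt
  -r{toxinidir}/requirements.txt
  -r{toxinidir}/test-requirements.txt
```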
participants (6)
- Andreas Jaeger
- Bernard Cafarelli
- Doug Hellmann
- Jeremy Stanley
- Radosław Piliszek
- Sean McGinnis