[openstack-dev] [all][pbr] splitting our deployment vs install dependencies

Robert Collins robertc at robertcollins.net
Wed Apr 15 22:44:15 UTC 2015


On 16 April 2015 at 01:50, Sean Dague <sean at dague.net> wrote:
> On 04/12/2015 06:43 PM, Robert Collins wrote:

>> Thoughts? If there's broad apathy-or-agreement I can turn this into a
>> spec for fine coverage of ramifications and corner cases.
>
> I'm definitely happy someone else is diving in here; just beware the
> dragons, there are many.
>
> I think some of the key problems are the following (let's call these the
> requirements requirements):

:) There's definitely enough meat here that we're going to want a spec,
so we can review the conclusions and inputs in one place.

> == We would like to be able to install multiple projects into a single
> devstack instance, and have all services work.

-- why do we want this? [asking for completeness - not objecting]. I
think we want this because it's easier for folks mucking around to not
have to remember which venv holds what; because we have projects like
neutron and nova that install bits into each other's processes via
common libraries; and because deployers have asked us to be sure that
we can co-install everything.

> This is hard because:
>
> 1. these are multiple projects so pip can't resolve all requirements at
> once to get to a solved state (also, optional dependencies in particular
> configs mean these can be installed later)

I don't understand your first clause here. Pip certainly can resolve
all requirements at once: for instance, 'pip install path_to_nova
path_to_swift path_to_neutron' would resolve the requirements for all
three at once. We're not doing that today, but it's not a pip
limitation. Today https://github.com/pypa/pip/issues/2687 will rear
its head, but that bug may be quite shallow.

As far as optional deps go, we need to get those into setuptools
extras; then pip can examine them for us (see the sketch below).
Enabling that is on the stack of patches I'm rolling up at the moment.
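
For illustration, here is a minimal sketch of declaring optional deps
as setuptools extras - the project name, extra names and packages are
hypothetical, not real OpenStack metadata:

    # setup.py - illustrative extras declaration only.
    from setuptools import setup

    setup(
        name='example-service',
        version='1.0',
        install_requires=['oslo.config'],   # always needed
        extras_require={
            # optional deps, grouped by the configuration that uses them
            'ldap': ['python-ldap'],
            'memcache': ['python-memcached'],
        },
    )

Then 'pip install example-service[ldap]' pulls in the LDAP deps, and
the resolver sees them up front rather than after a later, separate
install.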

> 2. pip's solver ignores setup_requires -
> https://github.com/pypa/pip/issues/2612#issuecomment-91114298
> which means we can get inconsistent results

Ish. The actual issue we ran into was setuptools issue 141
(https://bitbucket.org/pypa/setuptools/issue/141/setup_requires-feature-does-not-handle).
We can tackle that directly and then require a newer setuptools to
solve it - it doesn't need any larger context.
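
For context, setup_requires lives in setup.py itself and, absent that
fix, is satisfied by easy_install at build time, outside pip's
resolver - so pip's pins never apply to it. Illustrative only:

    # setup.py - the setup_requires line below is resolved by
    # easy_install when setup.py runs, so pip's version pins
    # do not apply to it.
    from setuptools import setup

    setup(
        setup_requires=['pbr'],   # build-time dep, e.g. pbr
        pbr=True,
    )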

> 3. doing this iteratively in projects can cause the following to happen
>
> A requires B>1.0,<2.0
> C requires B>1.2
>
> pip install C can make the pip install A requirements invalid later.

https://github.com/pypa/pip/issues/2687 again. I suspect this is a ~10
line patch: read all the package metadata present on the system, and
union those deps in inside PackageSet.add_requirement.
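
To make the shape of that concrete, here is a rough standalone sketch
of the check - using pkg_resources rather than pip's internals, with
function and variable names of my own invention:

    import pkg_resources

    def find_conflicts(project_name, candidate_version):
        """Return (installed project, requirement) pairs that installing
        project_name==candidate_version would violate."""
        conflicts = []
        for dist in pkg_resources.working_set:    # everything installed
            for req in dist.requires():           # its declared deps
                if (req.project_name.lower() == project_name.lower()
                        and candidate_version not in req):
                    conflicts.append((dist.project_name, str(req)))
        return conflicts

    # With A (requiring B>1.0,<2.0) installed, a later 'pip install C'
    # that wants B==2.1 would surface as:
    #   find_conflicts('B', '2.1') -> [('A', 'B>1.0,<2.0')]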

> This can end up in a failure of a service to start (if pkg_resources is
> actually checking things), or very subtle bugs later.
>
> Today global-requirements attempts to address this by continuously
> narrowing the requirements definitions for everything we have under our
> control so that pip is living in a rubber room and can only get an
> answer we know works.

> == However... this has exposed an additional issue: libraries not
> released at release time
>
> Way more things are getting g-r syncs than top level projects.
> Synchronizing requirements for things that all release at the same time
> makes a lot of sense. However we're synchronizing requirements into
> libraries that release at different cadence. This has required all
> libraries to also have stable/ branches, for requirements matching.
>
> In an ideal world libraries would have very broad requirements, which
> would not have caps in them. Non-library projects would have narrower
> requirements that we know work.

I mostly agree; that is, I think, the heart of the issue I started
this thread about.
For libraries I trivially agree.
For non-library projects, I think we still need to aim for
known-not-bad rather than known-good requirements, but for CI our
overrides can resolve that into a known-good set - and we can publish
that set in some well-known, automation-friendly way.

Concretely, devstack should be doing one pip install run, and in
stable branches that needs to look something like:

$ pip install -r known-good-list $path_to_nova $path_to_neutron ....
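
where known-good-list is just a pinned requirements file. Illustrative
contents only - the pins below are made up:

    # known-good-list: exact versions CI has verified to co-install,
    # overriding the broad ranges each project declares.
    oslo.config==1.9.3
    requests==2.5.1
    SQLAlchemy==0.9.9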

> == End game?
>
> *If* pip install took into account the requirements of everything
> already installed like apt or yum does, and resolve accordingly
> (including saying that's not possible unless you uninstall or upgrade
> X), we'd be able to pip install and get a working answer at the end. Maybe?

As Clint notes, this makes co-installability a constraint, but
pragmatically it already is. As I noted above,
https://github.com/pypa/pip/issues/2687 is that issue, and it is
shallow. Fixing it won't fix everything for us, though: it will still
leave us open to https://github.com/pypa/pip/issues/988 and setuptools
issue 141
(https://bitbucket.org/pypa/setuptools/issue/141/setup_requires-feature-does-not-handle)
at a minimum.

> Honestly, there are so many fixes on fixes here to our system, I'm not
> sure even this would fix it.

-Rob


-- 
Robert Collins <rbtcollins at hp.com>
Distinguished Technologist
HP Converged Cloud


