[openstack-dev] [stable][requirements] External dependency caps introduced in 499db6b

Adam Gandelman adamg at ubuntu.com
Fri Feb 20 05:26:24 UTC 2015


It's more than just the naming.  In the original proposal, requirements.txt
is the compiled list of all pinned deps (direct and transitive), while
requirements.in reflects what people will actually use.  Whatever is in
requirements.txt affects the egg's requires.txt.  Instead, we can keep
requirements.txt unchanged and have it still be the canonical list of
dependencies, while
requirements.out/requirements.gate/requirements.whatever is an upstream
artifact we produce and use to keep things sane on our slaves.
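
For illustration, the split might look roughly like this (the pins below are
just derived from caps mentioned elsewhere in this thread, and the transitive
entry is a made-up example):

  # requirements.txt -- canonical ranges, what projects and packagers consume
  requests>=1.2.1,!=2.4.0,<=2.2.1
  osprofiler>=0.3.0,<=0.3.0

  # requirements.gate/requirements.out -- compiled pins, what the slaves install
  requests==2.2.1
  osprofiler==0.3.0
  six==1.9.0    # transitive dep, hypothetical example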

Maybe all we need is:

* update the existing post-merge job on the requirements repo to produce a
requirements.txt (as it does now) as well as the compiled version.

* modify devstack with a toggle to have it process dependencies
from the compiled version when necessary

I'm not sure how the second bit jibes with the existing devstack
installation code, specifically the libraries installed from git or master, but
we can probably add something to warm the system with dependencies from the
compiled version prior to calling pip/setup.py/etc.
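
Roughly something like the following, assuming a hypothetical toggle name and
the usual /opt/stack checkout of the requirements repo:

  # local.conf / stackrc toggle (name is made up, not an existing setting)
  USE_COMPILED_REQUIREMENTS=${USE_COMPILED_REQUIREMENTS:-False}

  # early in stack.sh, before any project's pip install / setup.py develop runs
  if [[ "$USE_COMPILED_REQUIREMENTS" == "True" ]]; then
      sudo pip install --no-deps -r /opt/stack/requirements/requirements.gate
  fi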

Adam



On Thu, Feb 19, 2015 at 2:31 PM, Joe Gordon <joe.gordon0 at gmail.com> wrote:

>
>
> On Thu, Feb 19, 2015 at 1:48 PM, Adam Gandelman <adamg at ubuntu.com> wrote:
>
>> This creates a bit of a problem for downstream (packagers and probably
>> others).  Shipping a requirements.txt with explicit pins will end up
>> producing an egg with a requires.txt that reflects those pins, unless there
>> is some other magic planned that I'm not aware of.  I can't speak for all
>> packaging flavors, but I know Debian packaging interacts quite closely with
>> things like requirements.txt and the resulting egg's requires.txt to determine
>> appropriate system-level package dependencies.  This would require a lot of
>> tedious work on packagers' part to get something functional.
>>
>> What if it's flipped?  How about keeping requirements.txt with the caps,
>> and using that as input to produce something like requirements.gate that
>> is passed to 'pip install --no-deps' on our slaves?  We'd end up installing
>> and using the explicitly pinned requirements while the services/libraries
>> themselves remain flexible.  This might hit the issue Doug pointed out, where
>> requirements updates across projects are not synchronized.
>>
>>
> Switching them to requirements.txt and requirements.gate works for me. If
> a simple renaming makes things better, then great!
>
> As for Doug's comment, yes we need to work something out to overwrite
> requirements.gate, under your proposed naming, with global requirements.
>
>
>> Adam
>>
>>
>>
>> On Thu, Feb 19, 2015 at 12:59 PM, Joe Gordon <joe.gordon0 at gmail.com>
>> wrote:
>>
>>>
>>>
>>> On Wed, Feb 18, 2015 at 7:14 AM, Doug Hellmann <doug at doughellmann.com>
>>> wrote:
>>>
>>>>
>>>>
>>>> On Wed, Feb 18, 2015, at 10:07 AM, Donald Stufft wrote:
>>>> >
>>>> > > On Feb 18, 2015, at 10:00 AM, Doug Hellmann <doug at doughellmann.com>
>>>> wrote:
>>>> > >
>>>> > >
>>>> > >
>>>> > > On Tue, Feb 17, 2015, at 03:17 PM, Joe Gordon wrote:
>>>> > >> On Tue, Feb 17, 2015 at 4:19 AM, Sean Dague <sean at dague.net>
>>>> wrote:
>>>> > >>
>>>> > >>> On 02/16/2015 08:50 PM, Ian Cordasco wrote:
>>>> > >>>> On 2/16/15, 16:08, "Sean Dague" <sean at dague.net> wrote:
>>>> > >>>>
>>>> > >>>>> On 02/16/2015 02:08 PM, Doug Hellmann wrote:
>>>> > >>>>>>
>>>> > >>>>>>
>>>> > >>>>>> On Mon, Feb 16, 2015, at 01:01 PM, Ian Cordasco wrote:
>>>> > >>>>>>> Hey everyone,
>>>> > >>>>>>>
>>>> > >>>>>>> The os-ansible-deployment team was working on updates to add
>>>> support
>>>> > >>>>>>> for
>>>> > >>>>>>> the latest version of juno and noticed some interesting
>>>> version
>>>> > >>>>>>> specifiers
>>>> > >>>>>>> introduced into global-requirements.txt in January. It
>>>> introduced some
>>>> > >>>>>>> version specifiers that seem a bit impossible, like the one for
>>>> > >>> requests
>>>> > >>>>>>> [1]. There are others that equate presently to pinning the
>>>> versions of
>>>> > >>>>>>> the
>>>> > >>>>>>> packages [2, 3, 4].
>>>> > >>>>>>>
>>>> > >>>>>>> I understand fully and support the commit because of how it
>>>> improves
>>>> > >>>>>>> pretty much everyone’s quality of life (no fires to put out
>>>> in the
>>>> > >>>>>>> middle
>>>> > >>>>>>> of the night on the weekend). I’m also aware that a lot of the
>>>> > >>>>>>> downstream
>>>> > >>>>>>> redistributors tend to work from global-requirements.txt when
>>>> > >>>>>>> determining
>>>> > >>>>>>> what to package/support.
>>>> > >>>>>>>
>>>> > >>>>>>> It seems to me like there’s room to clean up some of these
>>>> > >>> requirements
>>>> > >>>>>>> to
>>>> > >>>>>>> make them far more explicit and less misleading to the human
>>>> eye (even
>>>> > >>>>>>> though tooling like pip can easily parse/understand these).
>>>> > >>>>>>
>>>> > >>>>>> I think that's the idea. These requirements were generated
>>>> > >>>>>> automatically, and fixed issues that were holding back several
>>>> > >>> projects.
>>>> > >>>>>> Now we can apply updates to them by hand, to either move the
>>>> lower
>>>> > >>>>>> bounds down (as in the case Ihar pointed out with stevedore)
>>>> or clean
>>>> > >>> up
>>>> > >>>>>> the range definitions. We should not raise the limits of any
>>>> Oslo
>>>> > >>>>>> libraries, and we should consider raising the limits of
>>>> third-party
>>>> > >>>>>> libraries very carefully.
>>>> > >>>>>>
>>>> > >>>>>> We should make those changes on one library at a time, so we
>>>> can see
>>>> > >>>>>> what effect each change has on the other requirements.
>>>> > >>>>>>
>>>> > >>>>>>>
>>>> > >>>>>>> I also understand that stable-maint may want to occasionally
>>>> bump the
>>>> > >>>>>>> caps
>>>> > >>>>>>> to see if newer versions will not break everything, so what
>>>> is the
>>>> > >>>>>>> right
>>>> > >>>>>>> way forward? What is the best way to both maintain a stable
>>>> branch
>>>> > >>> with
>>>> > >>>>>>> known working dependencies while helping out those who do so
>>>> much work
>>>> > >>>>>>> for
>>>> > >>>>>>> us (downstream and stable-maint) and not permanently pinning
>>>> to
>>>> > >>> certain
>>>> > >>>>>>> working versions?
>>>> > >>>>>>
>>>> > >>>>>> Managing the upper bounds is still under discussion. Sean
>>>> pointed out
>>>> > >>>>>> that we might want hard caps so that updates to stable branch
>>>> were
>>>> > >>>>>> explicit. I can see either side of that argument and am still
>>>> on the
>>>> > >>>>>> fence about the best approach.
>>>> > >>>>>
>>>> > >>>>> History has shown that it's too much work keeping testing
>>>> functioning
>>>> > >>>>> for stable branches if we leave dependencies uncapped. If
>>>> particular
>>>> > >>>>> people are interested in bumping versions when releases happen,
>>>> it's
>>>> > >>>>> easy enough to do with a requirements proposed update. It will
>>>> even run
>>>> > >>>>> tests that in most cases will prove that it works.
>>>> > >>>>>
>>>> > >>>>> It might even be possible for someone to build some automation
>>>> that did
>>>> > >>>>> that as stuff from pypi is released so we could have the best of
>>>> both
>>>> > >>>>> worlds. But I think capping is definitely something we want as a
>>>> > >>>>> project, and it reflects the way that most deployments will
>>>> consume this
>>>> > >>>>> code.
>>>> > >>>>>
>>>> > >>>>>     -Sean
>>>> > >>>>>
>>>> > >>>>> --
>>>> > >>>>> Sean Dague
>>>> > >>>>> http://dague.net
>>>> > >>>>
>>>> > >>>> Right. No one is arguing the very clear benefits of all of this.
>>>> > >>>>
>>>> > >>>> I’m just wondering if for the example version identifiers that I
>>>> gave in
>>>> > >>>> my original message (and others that are very similar) we
>>>> want to make
>>>> > >>>> the strings much simpler for people who tend to work from them
>>>> (i.e.,
>>>> > >>>> downstream re-distributors whose jobs are already difficult
>>>> enough). I’ve
>>>> > >>>> offered to help at least one of them in the past who maintains
>>>> all of
>>>> > >>>> their distro’s packages themselves, but they refused so I’d like
>>>> to help
>>>> > >>>> them any way possible. Especially if any of them chime in as this
>>>> being
>>>> > >>>> something that would be helpful.
>>>> > >>>
>>>> > >>> Ok, your links got kind of scrambled. Can you next time please
>>>> inline
>>>> > >>> the key relevant content in the email, because I think we all
>>>> missed the
>>>> > >>> original message intent as the key content was only in footnotes.
>>>> > >>>
>>>> > >>> From my point of view, normalization patches would be fine.
>>>> > >>>
>>>> > >>> requests>=1.2.1,!=2.4.0,<=2.2.1
>>>> > >>>
>>>> > >>> Is actually an odd one, because that's still there because we're
>>>> using
>>>> > >>> Trusty level requests in the tests, and my ability to have
>>>> devstack not
>>>> > >>> install that has thus far failed.
>>>> > >>>
>>>> > >>> Things like:
>>>> > >>>
>>>> > >>> osprofiler>=0.3.0,<=0.3.0 # Apache-2.0
>>>> > >>>
>>>> > >>> Can clearly be normalized to osprofiler==0.3.0 if you want to
>>>> propose
>>>> > >>> the patch manually.
>>>> > >>>
>>>> > >>
>>>> > >> global-requirements for stable branches serves two uses:
>>>> > >>
>>>> > >> 1. Specify the set of dependencies that we would like to test
>>>> against
>>>> > >> 2.  A tool for downstream packagers to use when determining what to
>>>> > >> package/support.
>>>> > >>
>>>> > >> For #1, Ideally we would like a set of all dependencies, including
>>>> > >> transitive, with explicit versions (very similar to the output of
>>>> > >> pip-freeze). But for #2 the standard requirement file with a range
>>>> is
>>>> > >> preferred. Putting an upper bound on each dependency, instead of
>>>> using a
>>>> > >> '==' was a compromise between the two use cases.
>>>> > >>
>>>> > >> Going forward I propose we have a requirements.in and a
>>>> requirements.txt
>>>> > >> file. The requirements.in file would contain the range of
>>>> dependencies,
>>>> > >> and
>>>> > >> requirements.txt would contain the pinned set, and eventually the
>>>> pinned
>>>> > >> set including transitive dependencies.
>>>> > >>
>>>> > >> Thoughts?
>>>> > >
>>>> > > I'm interested in seeing what that list looks like. I suspect we
>>>> have
>>>> > > some libraries listed in the global requirements now that aren't
>>>> > > actually used, and I'm sure there is a long list of transitive
>>>> > > dependencies to add to it.
>>>> > >
>>>> > > I'm not entirely comfortable with the idea of pinning completely,
>>>> but I
>>>> > > guess it's the best of two bad options. It solves the "we don't have
>>>> > > enough people around to manage stable branches" problem in one way
>>>> (by
>>>> > > not letting releases outside our control break our test jobs), but
>>>> if we
>>>> > > don't have people around now to fix things who is going to keep up
>>>> with
>>>> > > updating that requirements list as new versions of projects come
>>>> out? We
>>>> > > can write a job to automatically detect new packages and test them,
>>>> but
>>>> > > who is going to review patches submitted by that bot? Maybe that's a
>>>> > > small enough amount of work that it will be easier to find help.
>>>> > >
>>>> > > We've been playing whack-a-mole with issues because we made changes
>>>> to
>>>> > > the way we deal with versions and dependencies without fully
>>>> > > understanding the consequences of some of the changes. They looked
>>>> > > innocent at first, but because of assumptions in other jobs or other
>>>> > > parts of the system they caused problems. So I think we should be
>>>> > > careful about making this decision and think about some of the other
>>>> > > things that might fall out before pushing more changes up.
>>>> > >
>>>> > > For example, if we're syncing requirements into stable branches of
>>>> > > projects based on requirements.txt, and that becomes a set of pins
>>>> > > instead of a set of ranges with caps, how do we update projects?
>>>> Should
>>>> > > we sync from requirements.in instead of requirements.txt, to allow
>>>> > > projects to maintain the ranges in their own requirements files? Or
>>>> do
>>>> > > we want those requirements files to reflect the pins from the global
>>>> > > list?
>>>> >
>>>> > I'm not sure I fully understand what folks are proposing here with two
>>>> > different files, but if you’re putting ``==`` specifiers into the
>>>> > install_requires of various projects, then I believe that is going to
>>>> > cause a
>>>> > fairly large amount of pain.
>>>>
>>>> As I understand it, Joe's idea is to have an input file
>>>> ("requirements.in") that uses >=, <=, and != to specify a range of
>>>> valid
>>>> versions. Those ranges would guide packagers about what requirements we
>>>> think work. The list would also be "compiled" into a list of
>>>> requirements using only == to create a requirements.txt file that would
>>>> be used for the tests on our CI systems.
>>>>
>>>
>>> That is correct. The workflow would be:
>>>
>>> When preparing a stable branch:
>>>
>>> In global requirements
>>>
>>> * rename requirements.txt to requirements.in
>>> * create a new requirements.txt file that consists of a full set of pinned
>>> dependencies (that is, using '=='), transitive and all.
>>>
>>> For each project consuming global requirements
>>>
>>> * rename requirements.txt to requirements.in
>>> * create a new requirements.txt file that consists of a full set of pinned
>>> dependencies (that is, using '=='), transitive and all, while making sure
>>> the versions align with global requirements
>>> * change how we install dependencies from 'pip install -r requirements.txt'
>>> to 'pip install --no-deps -r requirements.txt'
>>>
>>>
>>> So now we have two files for requirements: requirements.txt to specify
>>> the exact versions we use for testing, and requirements.in to specify
>>> the range we think *should* work.
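>>>
>>> A rough sketch of what a project would end up with (the names and versions
>>> below are only placeholders):
>>>
>>>     # requirements.in -- ranges, synced from the global requirements.in
>>>     oslo.config>=1.6.0,<1.7.0
>>>
>>>     # requirements.txt -- exact pins used in the gate, transitive deps included
>>>     oslo.config==1.6.1
>>>     stevedore==1.2.0
>>>
>>>     pip install --no-deps -r requirements.txt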
>>>
>>>
>>>> >
>>>> > ---
>>>> > Donald Stufft
>>>> > PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA
>>>> >
>>>> >
>>>>
>>>>
>>>>
>>>
>>>
>>>
>>>
>>>
>>
>>
>>
>
>
>