[openstack-dev] [oslo][requirements][all] requesting assistance to unblock SQLAlchemy 1.1 from requirements

Mike Bayer mbayer at redhat.com
Wed Mar 15 16:39:48 UTC 2017


On 03/15/2017 11:42 AM, Sean Dague wrote:
> Perhaps, but in doing so oslo.db is going to get the pin and uc from
> stable/ocata, which is going to force it back to SQLA < 1.1, which will
> prevent oslo.db changes that require >= 1.1 to work.

So... do we want to make that job non-voting, or something like that?
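For what it's worth, the conflict mechanism itself is easy to reproduce outside the gate. This is just an illustrative sketch (mine, not from any job log) of the check pkg_resources performs when keystone and oslo.db still carry the stable/ocata cap quoted later in this thread:

```python
# Illustrative sketch (not from the gate): reproduce the version check
# behind the ContextualVersionConflict quoted below in this thread.
from pkg_resources import Requirement

# The cap declared by keystone / oslo.db on stable/ocata, exactly as it
# appears in the traceback: Requirement.parse('SQLAlchemy<1.1.0,>=1.0.10')
capped = Requirement.parse('SQLAlchemy<1.1.0,>=1.0.10')

# pkg_resources checks the installed version of a distribution against
# every consumer's declared requirement; a version outside the specifier
# set is reported as a conflict naming those consumers.
print('1.1.5' in capped)   # the dsvm worker had 1.1.5 installed -> False
print('1.0.12' in capped)  # a capped 1.0.x release would satisfy it -> True
```

When the `False` case happens at import time, pkg_resources raises ContextualVersionConflict listing the installed distribution, the unsatisfied requirement, and the set of packages that declared it, which is the exact shape of the error below.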



>
> 	-Sean
>
> On 03/15/2017 11:26 AM, Roman Podoliaka wrote:
>> Isn't the purpose of that specific job -
>> gate-tempest-dsvm-neutron-src-oslo.db-ubuntu-xenial-ocata - to test a
>> change to the library master branch with stable releases (i.e. Ocata)
>> - of all other components?
>>
>> On Wed, Mar 15, 2017 at 5:20 PM, Sean Dague <sean at dague.net> wrote:
>>> On 03/15/2017 10:38 AM, Mike Bayer wrote:
>>>>
>>>>
>>>> On 03/15/2017 07:30 AM, Sean Dague wrote:
>>>>>
>>>>> The problem was the original patch kept a cap on SQLA, just moved it up
>>>>> to the next pre-release, not realizing the caps in general are the
>>>>> concern by the requirements team. So instead of upping the cap, I just
>>>>> removed it entirely. (It also didn't help on clarity that there was a
>>>>> completely unrelated fail in the tests which made it look like the
>>>>> system was stopping this.)
>>>>>
>>>>> This should hopefully let new SQLA releases very naturally filter out to
>>>>> all our services and libraries.
>>>>>
>>>>>     -Sean
>>>>>
>>>>
>>>> So the failure I'm seeing now is *probably* the one I saw earlier when
>>>> we tried to do this: the tempest run fails on trying to run a keystone
>>>> request, but I can't find the same error in the logs this time.
>>>>
>>>> In an earlier build of https://review.openstack.org/#/c/423192/, we saw
>>>> this:
>>>>
>>>> ContextualVersionConflict: (SQLAlchemy 1.1.5
>>>> (/usr/local/lib/python2.7/dist-packages),
>>>> Requirement.parse('SQLAlchemy<1.1.0,>=1.0.10'), set(['oslo.db',
>>>> 'keystone']))
>>>>
>>>> stack trace was in the apache log:  http://paste.openstack.org/show/601583/
>>>>
>>>>
>>>> But now, on our own oslo.db build, the same jobs are failing and
>>>> halting at keystone, and I can't find any error:
>>>>
>>>> the failure is:
>>>>
>>>>
>>>> http://logs.openstack.org/30/445930/1/check/gate-tempest-dsvm-neutron-src-oslo.db-ubuntu-xenial-ocata/815962d/
>>>>
>>>>
>>>> and is on:  https://review.openstack.org/#/c/445930/
>>>>
>>>>
>>>> if someone w/ tempest expertise could help with this that would be great.
>>>
>>> It looks like oslo.db master is being used with ocata services?
>>> http://logs.openstack.org/30/445930/1/check/gate-tempest-dsvm-neutron-src-oslo.db-ubuntu-xenial-ocata/815962d/logs/devstacklog.txt.gz#_2017-03-15_13_10_52_434
>>>
>>>
>>> I suspect that's the root issue. That should be stable/ocata branch, right?
>>>
>>>         -Sean
>>>
>>> --
>>> Sean Dague
>>> http://dague.net
>>>
>>> __________________________________________________________________________
>>> OpenStack Development Mailing List (not for usage questions)
>>> Unsubscribe: OpenStack-dev-request at lists.openstack.org?subject:unsubscribe
>>> http://lists.openstack.org/cgi-bin/mailman/listinfo/openstack-dev
>>
>
>


