[openstack-dev] gate jobs - papercuts
Morgan Fainberg
morgan.fainberg at gmail.com
Tue Jan 31 21:55:06 UTC 2017
On Tue, Jan 31, 2017 at 10:37 AM, Matthew Treinish <mtreinish at kortar.org>
wrote:
> On Tue, Jan 31, 2017 at 01:19:41PM -0500, Steve Martinelli wrote:
> > On Tue, Jan 31, 2017 at 12:49 PM, Davanum Srinivas <davanum at gmail.com>
> > wrote:
> >
> > > Folks,
> > >
> > > Here's the list of jobs that failed in the gate queue, captured with
> > > my script[1][2] since around 10:00 AM today. Each job failed with just
> > > one bad test.
> > >
> > > http://logs.openstack.org/48/423548/11/gate/gate-keystone-python27-db-ubuntu-xenial/a1f55ca/
> > > - keystone.tests.unit.test_v3_auth.TestMFARules
> > >
> > > <http://logs.openstack.org/61/424961/1/gate/gate-tempest-dsvm-cells-ubuntu-xenial/8a1f9e7/>
> >
> >
> > This was due to a race condition between token issuance and validation;
> > it should be fixed now.
>
> Is there a bug open for this? If so, let's get an elastic-recheck query up
> for it so we can track it and get it off the uncategorized page:
>
>
No bug. This also isn't really fixable in general, because the time
resolution for tokens and revocation events is 1 second. The answer is to
use freezegun and freeze time when doing things that can cause a revocation
at the same moment as issuance (this can usually only be hit within
keystone's unit tests). It is also unlikely to be something that can easily
be searched for in Elasticsearch, since it surfaces as a "token cannot be
validated" message (token not found/revoked/etc.), which appears in many
cases where tokens cannot be validated (both legitimately and in races like
this one).
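
For illustration, here is a minimal, self-contained sketch of the freezegun
pattern I mean. This is not keystone's actual test code; the truncated_now
helper is just a stand-in I'm inventing for the one-second resolution:

    import datetime

    from freezegun import freeze_time


    def truncated_now():
        # Stand-in for the behaviour described above: timestamps are only
        # kept with one-second resolution, so sub-second ordering is lost.
        return datetime.datetime.utcnow().replace(microsecond=0)


    with freeze_time("2017-01-31 12:00:00") as frozen_time:
        issued_at = truncated_now()

        # Advance the frozen clock past the one-second resolution before
        # doing anything that would record a revocation event.
        frozen_time.tick(delta=datetime.timedelta(seconds=1))
        revoked_at = truncated_now()

        # With the explicit tick the ordering is unambiguous; without it,
        # both timestamps collapse to the same second and "is this token
        # revoked?" becomes a race.
        assert revoked_at > issued_at
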
The other cases that hit this were so bad they only passed at a ~5% rate.
So in short, an elastic-recheck query would be pointless here, short of
matching specifically on the test name as a failure.
> http://status.openstack.org/elastic-recheck/data/integrated_gate.html
>
> Our categorization rate is quite low right now, and it'll only make it
> harder to debug other failures if we've got a bunch of unknown races going
> on.
>
> We have a lot of tools to make debugging the gate easier and make everyone
> more productive. But it feels like we haven't been utilizing them fully
> lately, which makes gate backups more likely and digging out of the hole
> harder.
>
> Thanks,
>
> Matt Treinish
>