[openstack-dev] [TripleO] Review metrics - what do we want to measure?
openstack at nemebean.com
Thu Aug 14 00:51:52 UTC 2014
One thing I am very interested in finally following up on, especially in
light of the snazzy new Gerrit separation for CI jobs, is to make the
check-tripleo job leave an actual vote rather than just a comment. This
would clean up, at least for stats purposes, the many reviews sitting
with a failing CI run. It would also make it easier to find
reviews that need a recheck or are legitimately breaking CI.
On 08/13/2014 06:03 PM, James Polley wrote:
> In recent history, we've been looking each week at stats from
> http://russellbryant.net/openstack-stats/tripleo-openreviews.html to get a
> gauge on how our review pipeline is tracking.
> The main stats we've been tracking are the "since the last revision
> without -1 or -2" numbers. I've included some history at , but the summary
> is that our 3rd quartile has slipped from 13 days to 16 days over the last 4
> weeks or so. Our 1st quartile is fairly steady lately, around 1 day (down
> from 4 a month ago) and median is unchanged around 7 days.
> There was lots of discussion in our last meeting about what could be
> causing this. However, the thing we wanted to bring to the list for the
> discussion is:
> Are we tracking the right metric? Should we be looking to something else to
> tell us how well our pipeline is performing?
> The meeting logs have quite a few suggestions about ways we could tweak the
> existing metrics, but if we're measuring the wrong thing that's not going
> to help.
> I think that what we are looking for is a metric that lets us know whether
> the majority of patches are getting feedback quickly. Maybe there's some
> other metric that would give us a good indication?
>  Current "Stats since the last revision without -1 or -2" :
> Average wait time: 10 days, 17 hours, 6 minutes
> 1st quartile wait time: 1 day, 1 hour, 36 minutes
> Median wait time: 7 days, 5 hours, 33 minutes
> 3rd quartile wait time: 16 days, 8 hours, 16 minutes
> At last week's meeting we had: 3rd quartile wait time: 15 days, 13 hours,
> 47 minutes
> A week before that: 3rd quartile wait time: 13 days, 9 hours, 11 minutes
> The week before that was the mid-cycle, but the week before that:
> 19:53:38 <lifeless> Stats since the last revision without -1 or -2 :
> 19:53:38 <lifeless> Average wait time: 10 days, 17 hours, 49 minutes
> 19:53:38 <lifeless> 1st quartile wait time: 4 days, 7 hours, 57 minutes
> 19:53:38 <lifeless> Median wait time: 7 days, 10 hours, 52 minutes
> 19:53:40 <lifeless> 3rd quartile wait time: 13 days, 13 hours, 25 minutes
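For reference, the quartile figures quoted above are just quartiles over the
per-review wait durations. A minimal Python sketch of that computation (the
data here is hypothetical and this is not the actual reviewstats code):

```python
from datetime import timedelta
import statistics

def quartile_waits(waits):
    """Return (1st quartile, median, 3rd quartile) for a list of
    timedelta wait times, using statistics.quantiles' default
    (exclusive) method."""
    secs = [w.total_seconds() for w in waits]
    q1, med, q3 = statistics.quantiles(secs, n=4)
    return tuple(timedelta(seconds=s) for s in (q1, med, q3))

# Hypothetical wait times for five open reviews:
waits = [timedelta(days=d) for d in (1, 3, 7, 12, 16)]
q1, med, q3 = quartile_waits(waits)  # 2 days, 7 days, 14 days
```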
>  Some of the things suggested as potential causes of the long 3rd quartile:
> * We have small number of really old reviews that have only positive scores
> but aren't being landed
> * Some reviews get a -1 but then sit for a long time waiting for the author
> to reply
These aren't reflected in the stats above, though. Anything with a
negative vote gets kicked out of that category, and that's the category
I think we care about.
> * We have some really old reviews that suddenly get revived after a long
> period being in WIP or abandoned, which reviewstats seems to miscount
> * Reviewstats counts weekends, we don't (so a change that gets pushed at
> 5pm US Friday and gets reviewed at 9am Aus Monday would be seen by us as
> having no wait time, but by reviewstats as ~36 hours)
I would also add to this list:
* Old reviews with failed CI runs that no one has bothered to
recheck or fix. I find quite a few of these when going through the
outstanding review list. The voting change I mentioned above would
address these, since they would then carry a negative vote.
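The weekend point in the last quoted bullet is easy to demonstrate. A minimal
sketch (the function names and sample timestamps are mine, and I've kept both
times in one zone for simplicity; the US-Friday/Aus-Monday case above also
crosses time zones):

```python
from datetime import datetime, timedelta

def wall_clock_wait(pushed, reviewed):
    """Wait time the way reviewstats counts it: raw elapsed time."""
    return reviewed - pushed

def weekday_wait(pushed, reviewed):
    """Wait time excluding Saturdays and Sundays, counted in
    whole-hour steps."""
    wait = timedelta()
    step = timedelta(hours=1)
    t = pushed
    while t < reviewed:
        if t.weekday() < 5:  # Monday=0 .. Friday=4
            wait += step
        t += step
    return wait

pushed = datetime(2014, 8, 8, 17, 0)    # Friday 17:00
reviewed = datetime(2014, 8, 11, 5, 0)  # Monday 05:00
wall = wall_clock_wait(pushed, reviewed)  # 60 hours on the clock
busy = weekday_wait(pushed, reviewed)     # only 12 weekday hours
```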