[openstack-dev] [reviews] putting numbers on -core team load

Daniel P. Berrange berrange at redhat.com
Thu Nov 14 14:03:30 UTC 2013


On Thu, Nov 14, 2013 at 08:29:36PM +1300, Robert Collins wrote:
> At the summit there were lots of discussions about reviews, and I made
> the mistake of sending a mail to Russell proposing a few new stats we
> could gather.
> 
> I say mistake, because he did so and then some... we now have extra
> info - consider:
> 
> http://russellbryant.net/openstack-stats/nova-reviewers-30.txt
> 
> There are two new things:
> In each row a new column 'received' - this counts the number of
> incoming reviews to each reviewer. Its use should be obvious - but
> remember that folk who contribute the occasional patch probably don't
> have the context to be doing reviews... those who are contributing
> many patches and getting many incoming reviews however...
> This gives us the philanthropists (or perhaps team supporters...)
> 
> |        klmitch **       |     137    0  19   0 118   9    86.1% | 14 ( 11.9%)  |      0 (∞)    |
> 
> (the bit at the end is a Unicode infinity... we'll need to work on that).
> 
> And so on :)
> 
> Down the bottom of the page:
> 
> Total reviews: 2980 (99.3/day)
> Total reviewers: 241
> Total reviews by core team: 1261 (42.0/day)
> Core team size: 17
> New patch sets in the last 30 days: 1989 (66.3/day)
> 
> and for 90 days:
> 
> Total reviews: 10705 (118.9/day)
> Total reviewers: 406
> Total reviews by core team: 5289 (58.8/day)
> Core team size: 17
> New patch sets in the last 90 days: 7515 (83.5/day)
> 
> This is the really interesting bit. Remembering that every patch needs
> - at minimum - 2 +2's, the *minimum* viable core team review rate to
> keep up is patch sets per day * 2:
> 30 days: 132 core reviews/day
> 90 days: 167 core reviews/day
> 
> But we're getting:
> 30 days: 42/day, which is 90/day short
> 90 days: 59/day, which is 108/day short
> 
> One confounding factor here is that this counts (AIUI) pushed changes,
> not change IDs - so we don't need two +2's for every push, we need two
> +2's for every change ID. We should add that as a separate metric I
> think, as the needed +2 count will be a bit lower.

NB, this analysis seems to assume that we actually /want/ to eventually
approve every submitted patch. There's going to be some portion that is
abandoned or rejected. It's probably a reasonably small portion of the
total, so it won't change the order of magnitude, but I'd be interested
in stats on how many patches we reject in one way or another.

> Anyhow, to me - this is circling in nicely on having excellent
> information (vs data) on the review status, and from there we can
> start to say 'right, to keep up, Nova needs N core reviewers
> consistently doing Y reviews per day. If Y is something sensible like
> 3 or 4, we can work backwards. Using the current figures (which since
> we don't have changeId as a separate count are a little confounded)
> that would give us:
> 
> time period    reviews/core/day    core-team-size
> 30 days                3                 44
> 30 days                4                 33
> 30 days                8                 17
> 90 days                3                 56
> 90 days                4                 42
> 90 days               10                 17
> 
> Also note that these are calendar days, so no weekends or leave for -core!
> 
> What else....
> 
> in the last 30 days core have done 42% of reviews; in the last 90
> days, 49%. So that's getting better.
> 
> I know Russell has had concerns about core cohesion in the past, but I
> don't think doing 8 detailed reviews every day including weekends is
> individually sustainable. IMO we badly need more core reviewers....
> and that means:
> 
>  - 20 or so volunteers
>  - who step up and do - pick a number - say 3 - reviews a day, every
> work day, like clockwork.
>  - and follow up on their reviewed patches to learn what other
> reviewers say, and why
>  - until the nova-core team & Russell are happy that they can
> contribute effectively as -core.
> 
> Why such a big number of volunteers? Because we need a big number of
> people to spread load, because Nova has a high incoming patch rate.
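
For what it's worth, the team-size table is easy to reproduce - it
just divides the minimum core review rate by the per-reviewer rate,
which the table above then rounds to whole reviewers. A quick sketch,
with the same caveats (pushed changes rather than change IDs, and
calendar days with no allowance for weekends or leave):

for days, patch_sets in [(30, 1989), (90, 7515)]:
    needed = patch_sets * 2 / float(days)  # minimum core reviews/day
    for rate in (3, 4, 8, 10):
        print("%2d days @ %2d reviews/core/day -> %4.1f cores"
              % (days, rate, needed / rate))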

One concern I have is that this exercise could turn out to be a bit like
building new roads to solve traffic jams: the more roads you build, the
more car usage you trigger. In some ways the limited rate of approvals
can be seen as a natural brake that keeps the rate of patch submissions
from getting out of control. I'm not saying we've reached that point
currently, but at some point I think it is wise to ask what rate of code
churn we should aim to support as a project.


One of the problems of a project as large as Nova is that it is hard for
one person to be expert at reviewing all areas of the code. As you increase
the size of the core review team, we have to be careful we don't end up in
a situation where too many patches are being reviewed & approved by people
who are not the natural experts in that area. At the same time you don't
want people to be strictly siloed to just one area of the codebase, because
you want people seeing the big picture. If we were to grow the core review
team, I think it may be worth trying to be a bit clearer about who the
experts are for different areas of the code, so they can focus reviews on
areas where they would be most valuable. We do this a little informally at
the moment, but I think it'd be useful to make it a bit more explicit.

Regards,
Daniel
-- 
|: http://berrange.com      -o-    http://www.flickr.com/photos/dberrange/ :|
|: http://libvirt.org              -o-             http://virt-manager.org :|
|: http://autobuild.org       -o-         http://search.cpan.org/~danberr/ :|
|: http://entangle-photo.org       -o-       http://live.gnome.org/gtk-vnc :|


