[openstack-dev] [OpenStack][Cinder-Core]Nomination for Cinder core members

Mark McLoughlin markmc at redhat.com
Thu May 9 09:51:30 UTC 2013


On Wed, 2013-05-08 at 16:00 -0400, Russell Bryant wrote:
> On 05/08/2013 02:13 PM, Sheng Bo Hou wrote:
> > Hi Cinder contributors,
> > 
> > I would like to be a bit immodest and nominate Vincent Hou, myself, as a
> > core contributor for the OpenStack Block Storage (Cinder) project.
> > 
> > Last October, I began my work in Cinder by testing and looking for bugs,
> > then moved on to implementing blueprints and fixing bugs. After that, I
> > started reviewing patches. I took some screenshots from the Launchpad
> > website, indicating my contributions and how active I am in the Cinder
> > community. You can find my name in the ranking list.
> > 
> > Please take a look at the snapshots below. They are why I think I am
> > qualified to be a core member. I sincerely hope the OpenStack Cinder
> > community will consider me for core membership. Thank you very much.
> > 
> > PS:
> > 1. Here are some of the snapshots for the patches I reviewed:
> 
> Screenshots are kind of a bizarre way to do this ... I think links to
> relevant pages would have sufficed.  :-)
> 
> This message prompted me to take a look at who is doing Cinder reviews.
>  I'm not on cinder-core, so I don't really get a vote, but I'll provide
> some commentary anyway.  :-)
> 
> Review numbers aren't the only metric, but they are something.  When I
> look at these things, I'm looking for some indication of how involved
> people are, as well as whether people are putting effort into providing
> good constructive feedback with -1/-2, or if they are just rubber
> stamping things (primarily just +1/+2).
> 
> Dolph Mathews summarized core member criteria *really* well in another
> thread when he said:
> 
> > Ultimately, "core contributor" to me simply means that this person's downvotes on code reviews are consistently well thought out and meaningful, such that an upvote by the same person shows a lot of confidence in the patch.
> 
> I think that is a *great* way to summarize the criteria that should be used.
> 
> The numbers for the last 90 days are here (please correct me if the
> cinder-core members are marked incorrectly)

This is really useful data. I think it's interesting to compare across
projects, so I've posted the same 90 day stats for nova here:

  https://gist.github.com/markmc/5546607

Some comments below.

> The line for the self-nomination here is:
> 
> |     houshengbo    |       54 (0|2|52|0) (96.3%)       |
> 
> 
> ** -- cinder-core team member
> +-------------------+-----------------------------------+
> |      Reviewer     | Reviews (-2|-1|+1|+2) (+/- ratio) |
> +-------------------+-----------------------------------+
> |  john-griffith ** |     307 (13|27|1|266) (87.0%)     |

Looks like John carries a massive review burden. He's doing far more
reviews than anyone else.

> |   avishay-il **   |      109 (1|28|46|34) (73.4%)     |
> |  zhiteng-huang ** |       80 (3|14|5|58) (78.8%)      |
> |      rushiagr     |       75 (0|11|64|0) (85.3%)      |
> |     thingee **    |       71 (0|19|8|44) (73.2%)      |
> |  duncan-thomas ** |       66 (0|15|3|48) (77.3%)      |
> |     houshengbo    |       54 (0|2|52|0) (96.3%)       |
> |   kurt-f-martin   |       46 (0|10|36|0) (78.3%)      |
> |      eharney      |       38 (0|7|31|0) (81.6%)       |
> |   oliver-leahy-l  |       26 (0|6|20|0) (76.9%)       |

Just looking at the top 10 reviewers, the +/- ratio is consistently
above 70% and averages ~80%.

Looking at the top 15 nova reviewers, the +/- ratio is consistently
above 60% and averages ~75%.

So ... not a massive difference between projects. I had looked at the
nova review stats for the last 14 days, where the +/- ratio for some
reviewers was more like 40%, so the cinder stats looked high to me at
first.

Cheers,
Mark.



