[openstack-dev] stalled bug fixes for vmware driver

Russell Bryant rbryant at redhat.com
Fri Sep 20 20:37:49 UTC 2013


On 09/20/2013 04:11 PM, Dan Wendlandt wrote:
> Hi Russell, 
> 
> Thanks for the detailed thoughts.  Comments below,
> 
> Dan
> 
> 
> On Fri, Sep 20, 2013 at 11:52 AM, Russell Bryant
> <rbryant at redhat.com> wrote:
> 
>     On 09/20/2013 02:02 PM, Dan Wendlandt wrote:
>     > I think the real problem here is that in Nova there are bug fixes that
>     > are tiny and very important to a particular subset of the user
>     > population and yet have been around for well over a month without
>     > getting a single core review.
>     >
>     > Take for example https://review.openstack.org/#/c/40298/ , which fixes
>     > an important snapshot bug for the vmwareapi driver.  This was posted
>     > well over a month ago on August 5th.  It is a solid patch of 54
>     > new/changed lines including unit test enhancements.  The commit
>     > message clearly shows which tempest tests it fixes.  It has been
>     > reviewed by many vmware reviewers with +1s for a long time, but the
>     > patch just keeps having to be rebased as it sits waiting for core
>     > reviewer attention.
>     >
>     > To me, the high-level takeaway is that it is hard to get new
>     > contributors excited about working on Nova when their well-written and
>     > well-targeted bug fixes just sit there, getting no feedback and not
>     > moving closer to merging.  The bug above was the developer's first
>     > patch to OpenStack, and while he hasn't complained a bit, I think the
>     > experience is far from the community behavior we need if we are to
>     > encourage new, high-quality contributors from diverse sources.  For
>     > Nova to succeed in its goal of being a platform-agnostic cloud layer,
>     > I think this is something we need a community strategy to address, and
>     > I'd love to see it as part of the discussion put forward by those
>     > people nominating themselves as PTL.
> 
>     I've discussed this topic quite a bit in the past.  In short, my
>     approach has been:
> 
>     1) develop metrics
>     2) set goals
>     3) track progress against those goals
> 
>     The numbers I've been using are here:
> 
>         http://russellbryant.net/openstack-stats/nova-openreviews.html
> 
> 
> It's great that you have dashboards like this, very cool.  The
> interesting thing here is that the patches I am talking about are not
> waiting on reviews in general, but rather on core reviews.  They have
> plenty of reviews from non-core folks who provide feedback (and they
> keep getting +1'd again as they are rebased every few days).  Perhaps a
> good additional metric to track would be items that have spent a lot of
> time without a negative review but have not gotten any core reviews.  I
> think that is the root of the issue in the case of the reviews I'm
> talking about.

The numbers I track do not reset the timer on any +1 (or +2, actually).
It only resets when a review gets a -1 or -2.  At that point, the review
is waiting for an update from the submitter.  The point is, getting a
bunch of +1s does not make a patch show up lower on the list.  Also, the
3rd list (time since the last -1) does not reset on a rebase, so that is
covered in this tracking, too.
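
To make the reset rule concrete, here is a rough Python sketch of that
logic.  The event format is simplified and hypothetical; the real stats
scripts work from Gerrit query output, but the reset behavior is the
same: only a -1 or -2 restarts the clock.

    from datetime import datetime

    def waiting_since(created, events):
        """Return the point in time a review has been waiting from.

        created -- datetime the patch was first submitted
        events  -- list of (datetime, score) tuples, e.g. (ts, -1), (ts, +2)
        """
        # Only a negative vote restarts the clock; +1/+2 votes and
        # rebases do not, so a patch with nothing but +1s keeps aging.
        waiting = created
        for ts, score in sorted(events):
            if score < 0:
                waiting = ts
        return waiting

    # Example: a patch posted Aug 5 with two later +1s still counts as
    # waiting since Aug 5.
    created = datetime(2013, 8, 5)
    events = [(datetime(2013, 8, 10), 1), (datetime(2013, 9, 1), 1)]
    age = datetime(2013, 9, 20) - waiting_since(created, events)
    print("waiting for %d days" % age.days)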

-- 
Russell Bryant


