[openstack-dev] Grizzly's out - let the numbers begin...

Mark McLoughlin markmc at redhat.com
Fri Apr 5 11:43:14 UTC 2013


On Fri, 2013-04-05 at 11:03 +0200, Thierry Carrez wrote:
> Michael Still wrote:
> > On Fri, Apr 5, 2013 at 2:47 AM, Stefano Maffulli <stefano at openstack.org> wrote:
> >> On 04/04/2013 08:24 AM, Mark McLoughlin wrote:
> > 
> > [snip]
> > 
> >>> Frankly, though, I'm losing faith in this "number of commits" statistic
> >>> being all that useful. Most of my commits were probably trivial cfg
> >>> cleanups in Nova.
> >>
> >> No single number has meaning by itself; it only acquires meaning as part
> >> of a story. If you look at the count of commits together with the number
> >> of bugs closed over time, so that you can see trends, you can get an idea
> >> of how much effort goes into stabilization vs new features. Looking at
> >> commits and lines of code added/removed can also help you understand where
> >> the development effort is going. Or at least the numbers will seem to tell
> >> you something and push you to dive deeper.
> > 
> > Yes. I am increasingly worried about employers managing to these
> > numbers as well. I feel that the numbers that get discussed in the
> > community are the ones employees are likely to optimise for.
> > 
> > I'd like to see a more general discussion about what contributions we
> > value as a community, and how we encourage those contributions.
> 
> Yeah, I share the same concern, and the official "activity board" only
> makes it more pressing:
> 
> On one hand, it's extremely difficult to get those stats right. Tracking
> where everyone works is painful (the Bitergia report, for example, still
> counts me under "Rackspace"). The metrics themselves can be a bit
> unfair: we already mentioned the issue with "number of commits"... but
> others fail as well (take "bug activity": stable branch management
> results in opening a lot of bugs purely for process tracking, an order
> of magnitude more than the number of bugs opened through QA research).
> 
> On the other hand, more and more people/companies want to be able to
> brag about their OpenStack contribution, or at least not to appear too
> bad on those lists. That creates an incentive to game the system once
> some stats are made "official".
> 
> Until now, the stats posted were mostly one-shot and ad-hoc, to serve as
> a reputational pressure encouraging people to contribute more,
> especially in strategic areas. But if they become official and stable
> metrics, they will be gamed, defeating their original purpose.
> 
> Bruce Schneier puts it better than I can in "Liars and Outliers":
> 
> "When you start measuring something and then judge people based on that
> measurement, you encourage people to game the measurement instead of
> doing whatever it is you wanted in the first place."

Agree with everything you say. I don't think I've seen obvious examples
of people trying to game the stats, but with the emphasis we're putting
on them, it's only natural that people will start gaming them.

Cheers,
Mark.



