[openstack-dev] [all][stackalytics] Gaming the Stackalytics stats
Nikhil Komawar
nik.komawar at gmail.com
Fri Apr 8 22:35:44 UTC 2016
Hi,
Steve, thanks for pointing that out, and Dims, thanks for starting the
discussion.
I guess I feel that the drastic step is not (or at least may not be)
necessary. Here's the reason why: we're trying to solve a subjective
problem with an objective solution. All systems have loopholes, and
there will be people who try to take advantage of them, so we should
look into more contextual info while forming opinions.
To put this in more practical terms: we would be disallowing stats for
certain specific events, even though a significant number of those +1s
may genuinely matter. An extreme case of this: a downstream OpenStack
consumer/cloud operator appoints someone to keep an eye on the ongoing
upstream work on requirements and packages, and to report back on
changes they need to beware of internally. It would be a loss for
management to be unable to track such individuals' work. At the less
extreme end, if I have to ask someone to take another look at
requirements changes and make sure that the project changes which
potentially conflict with the updates are appropriate, such individuals
might be demotivated to pick up those jobs -- specifically stable and
release liaisons, and sometimes cross-project efforts. I think we need
to value such work and give, first, the individuals a way to keep
themselves motivated, and then management a way to keep a check.
Hence, this is a subjective problem: it applies to some cases and not
to others; the info is valuable to have but needs to be consumed
correctly. On top of that, a general rule of statistics is that the
larger your sample set, the more accurate the results. How and where to
read them is the problem we should solve. Think of similarly
speculative numbers, for example quarterly results at your respective
orgs: aren't you interested in them, even though they don't always tell
the story of the value added or subtracted by the org as a whole? Yet
they are important, to keep us moving and keep us motivated!
Hence, my proposal is:
* instead of completely ignoring the stats on such reviews, restrict
the decision to ignore (or not ignore) them to the "generic" +1s
* introduce a new "more info" tab/UI element in Stackalytics and keep
those stats there; consequently, we would need to modify the
Stackalytics processor to route that info there (see the sketch after
this list)
* encourage the teams to read the review stats carefully, say the
percentage of -1s vs. +1s, and be more subjective in the evaluation by
browsing some of the reviews (TBH, I know that +0s are sometimes the
best feedback on reviews). I think this is a bit easier for me to say
because I'm looking primarily from the Glance perspective, which is a
relatively small team where we happen to stumble upon each other's
reviews often.
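To illustrate the second bullet, here's a minimal sketch of how the
processor could split generic +1s into their own bucket instead of
throwing them away. To be clear, this is not the actual Stackalytics
code; the record fields, function names, and bucket names below are all
hypothetical.

# Hypothetical sketch -- not the real Stackalytics processor API.
# Splits review votes into the "main" stats and a separate
# "generic +1" bucket, so the info is kept but shown on its own tab.

def classify_mark(mark):
    """Return the stats bucket for a single review vote.

    `mark` is assumed to be a dict resembling the records the
    processor emits, e.g. {'value': 1, 'comment': '', 'author': 'a'}.
    A vote is "generic" if it's a +1 with no accompanying comment.
    """
    is_generic = (mark.get('value') == 1
                  and not mark.get('comment', '').strip())
    return 'generic_plus_ones' if is_generic else 'main'


def split_marks(marks):
    """Bucket an iterable of votes; nothing is thrown away."""
    buckets = {'main': [], 'generic_plus_ones': []}
    for mark in marks:
        buckets[classify_mark(mark)].append(mark)
    return buckets


if __name__ == '__main__':
    sample = [
        {'author': 'a', 'value': 1, 'comment': ''},              # generic
        {'author': 'b', 'value': 1, 'comment': 'LGTM, tested'},  # counts
        {'author': 'c', 'value': -1, 'comment': 'breaks gate'},  # counts
    ]
    buckets = split_marks(sample)
    print('main: %d, generic +1s: %d'
          % (len(buckets['main']), len(buckets['generic_plus_ones'])))

The point is that the "generic" votes still exist in the data and can
be surfaced on a separate tab, rather than being dropped outright.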
In the interest of keeping the community inclusive, collaborative and
healthy,
yours sincerely,
On 4/8/16 1:26 PM, Davanum Srinivas wrote:
> Team,
>
> Steve pointed out a problem in Stackalytics:
> https://twitter.com/stevebot/status/718185667709267969
>
> It's pretty clear what's happening if you look here:
> https://review.openstack.org/#/q/owner:openstack-infra%2540lists.openstack.org+status:open
>
> Here's the drastic step (which I'd like to avoid):
> https://review.openstack.org/#/c/303545/
>
> What do you think?
>
> Thanks,
> Dims
--
Thanks,
Nikhil