I'm going to preface this with a comment that I'm not completely sold on
this idea yet, but I think it's worth a larger discussion and this might
be a seed to help start it. I've tagged Ironic as this proposal is for
Ironic specifically, but I'm very, very interested in the larger
community's opinion on this kind of approach, so I also tagged all.
One of the toughest parts about walking the path to becoming core is how
fuzzy the targets are. With Ironic having recently added a two-tier
review system, with reviewers and approvers as separate groups, we have
an opportunity to allow a much, much bigger tent for the ironic-reviewers
group without risking code merging that hasn't been approved by one of
our folks with more experience and context in OpenStack and Ironic as a whole.
So here's the suggestion -- we come up with a static set of criteria,
measured over a fixed period, with a clear bar of "if you do this, you
are an ironic-reviewer". For instance, we could say (I'm not sure about
these numbers): if you average Y (5?) reviews per week and have merged a
major Ironic feature or multiple bugfix patches, you would automatically
become a reviewer.
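
Just to make that concrete, here's a rough Python sketch of what an
automated check against that kind of bar could look like. The function
name, the twelve-week lookback window, and the "three bugfixes" fallback
are all placeholders I'm making up for illustration; the actual review
data would have to be pulled from Gerrit (or similar stats):

    from datetime import datetime, timedelta

    # Placeholder thresholds -- the reviews/week number comes from the
    # "Y (5?)" example above; the lookback window and the bugfix count
    # are made up for illustration only.
    REVIEWS_PER_WEEK = 5
    LOOKBACK_WEEKS = 12
    MIN_BUGFIXES = 3

    def meets_reviewer_bar(review_dates, merged_major_feature, merged_bugfixes):
        """Check whether a contributor clears the hypothetical reviewer bar.

        review_dates: datetimes of reviews the contributor has left
        merged_major_feature: True if they landed a major Ironic feature
        merged_bugfixes: number of bugfix patches they have merged
        """
        window_start = datetime.now() - timedelta(weeks=LOOKBACK_WEEKS)
        recent = [d for d in review_dates if d >= window_start]
        enough_reviews = len(recent) / LOOKBACK_WEEKS >= REVIEWS_PER_WEEK
        enough_patches = merged_major_feature or merged_bugfixes >= MIN_BUGFIXES
        return enough_reviews and enough_patches

The point is only that the rule can be written down and evaluated
mechanically; the exact numbers are what I'd like the discussion to settle.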
Obviously, we'd have to keep safeguards to prevent abuse. In the past,
our community had a rule that changes required reviews from two separate
corporate entities; in this case, I'd suggest a waiting period to prevent
quick merging by reviewers with a single perspective. Exceptions would
remain, as in the current system, for emergency CI fixes and
documentation improvements.
To be clear: the safeguards shouldn't be seen as a lack of trust, but
maybe instead as an acknowledgement that structural controls hold value
even if there's no reason to assume mistrust.
Full proposal:
* ironic-approvers (+2 and Workflow +1) continues to operate as usual,
with admission based on technical merit, trust, and review frequency.
* ironic-reviewers (+2 only) becomes a data-driven group, where
reviewers are added/removed based on individual performance metrics that
are TBD (likely some combination of reviews, commits, and bugs fixed).
I'll note that today, there's not really any reasonable set of metrics
that would add additional reviewers to this group.
* Code changes would need either reviews from representatives of two
separate companies, or to have sat for a short waiting period to ensure
other folks had a chance to look at them.
* Urgent CI fixes, trivial changes, and documentation updates would
still only need one review to land.
What do folks think? Would this be a good carrot for existing non-cores
to up their review and upstream contribution commitments?
-
Jay Faulkner
G-Research OSS