[openstack-dev] Avoiding regression in project governance

Tim Bell Tim.Bell at cern.ch
Wed Mar 11 19:06:21 UTC 2015


> -----Original Message-----
> From: Stefano Maffulli [mailto:stefano at openstack.org]
> Sent: 11 March 2015 03:16
> To: openstack-dev at lists.openstack.org
> Subject: Re: [openstack-dev] Avoiding regression in project governance
> 
> On Tue, 2015-03-10 at 15:23 -0700, James E. Blair wrote:
> > The holy grail of this system would be the "suitable for production
> > deployment" tag, but no one has figured out how to define it yet.
> 
> Are crazy ideas welcome in this phase?
> 
> I start with 2 below:
> 
> Preface: an idea circulates about visually displaying in a web page the
> projects.yaml file and the tags in there. Visitors would be able to browse the list
> of projects and sort, pick, search and find what they need from a nice
> representation of the 'big tent'.
> 
> 1) how about we pull the popularity of OpenStack projects as reported in the
> User Survey and display such number on the page where we list the projects?
> What if, together with the objective tags managed by TC and community at
> large, we show also the number of known deployment as guidance?
> 

I think we can make this work. Assuming more than N deployments (to my mind, N > 5 or so) report that they are using project X, we can say that it is used in production/POC/... and publish the number of nodes/hypervisors/etc.

This makes the data concrete while keeping it anonymous, which avoids fishing queries. It also allows our community to enter what they are doing in one place rather than answering multiple surveys. I am keen to avoid queries such as "How many hypervisors are installed for public clouds using Xen?", but if we agree that a threshold of more than 5 reports avoids company identification, I feel this is feasible.

It also helps address the "maturity" question concretely: if a project is in production in 200 deployments, I would consider it reasonably mature; if there is only one, I would worry.
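To make the threshold idea concrete, here is a rough sketch of the aggregation I have in mind. The response format below is purely hypothetical (the real User Survey schema is different and the raw data sits behind NDA); the point is only that a figure gets published when more than 5 distinct organisations report a project:

from collections import defaultdict

THRESHOLD = 5  # minimum distinct organisations before we publish a figure

# hypothetical response format: (organisation, project, stage, node_count)
responses = [
    ("org-a", "nova", "production", 120),
    ("org-b", "nova", "production", 40),
    # ... further survey responses ...
]

def publishable_counts(responses, threshold=THRESHOLD):
    """Aggregate per (project, stage) and return only entries reported by
    more than `threshold` distinct organisations, so no single company
    can be identified from the published numbers."""
    orgs = defaultdict(set)
    nodes = defaultdict(int)
    for org, project, stage, node_count in responses:
        key = (project, stage)
        orgs[key].add(org)
        nodes[key] += node_count
    return {
        key: {"deployments": len(orgs[key]), "nodes": nodes[key]}
        for key in orgs
        if len(orgs[key]) > threshold
    }

The published page would then only ever show the aggregated deployment and node counts, never individual responses.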

> 2) there are some 'fairly objective' indicators of quality of open source code,
> developed in a handful of academic projects that I'm aware of (Calipso and
> sos-opensource.org come to mind, but there are others). Maybe we can build a
> tool that pulls those metrics from each of our repositories and provides more
> guidance to visitors so they can form their own opinion?
> 
> Nobody can really vouch for 'production ready', but we can probably provide
> data for someone to form a more informed opinion. Too crazy?
> 

If an operator says they are using a project for their production cloud, and there is a reasonable scalability profile to go with it, that is a strong (but not guaranteed) endorsement for me. There could be attempts to influence the results, but since the survey responses can be scrutinised in more detail by the people with NDA access, that would discourage such behaviour.
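On point 2, such a tool could start very small. A rough sketch of a possible starting point (my own assumption, not an existing tool; it only needs a local clone of each repository and the git CLI) that pulls two simple activity indicators:

import subprocess

def repo_metrics(path, since="90 days ago"):
    """Pull two basic activity indicators from a local git clone:
    commit volume and distinct author count over a recent window."""
    authors = subprocess.check_output(
        ["git", "-C", path, "log", "--since", since, "--format=%ae"],
        text=True,
    ).splitlines()  # one author email per commit
    return {"commits": len(authors), "distinct_authors": len(set(authors))}

# e.g. run repo_metrics() over a clone of each repository listed in
# projects.yaml and render the numbers next to the tags on the proposed page.

Numbers like these are not "production ready" proof on their own, but combined with the survey deployment counts they give visitors something objective to weigh.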

> .stef
> 
> 


