[User-committee] User Committee Meeting Monday
lauren at openstack.org
Fri Dec 11 22:31:45 UTC 2015
To answer Jon’s question, NPS was added by request of several board members early last year.
Heidi Joy Tretheway has been the point of contact on our Foundation staff working with the independent analyst to produce the survey report. Unfortunately, she’s out this week and will be flying back Monday, so she won’t be able to join the meeting. In the meantime, she passed along a quick update which I will share here.
When we analyzed the latest user survey data, we looked at a demographic variable (user role, e.g. app developer), a firmographic variable (e.g. company size), and deployment stage. We learned that overall, there was no significant difference in NPS scores for people who identified with a specific function, such as app developers, or for companies by size. As a result, we didn’t do further data cuts on demographic/firmographic variables. We did learn that people with deployments in production tended to rate OpenStack more highly (NPS of 43 for production, vs 24 for dev/qa and 20 for POC). This is covered in the survey report, along with verbatim comments from respondents: https://www.openstack.org/assets/survey/Public-User-Survey-Report.pdf
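For readers unfamiliar with how figures like the 43 / 24 / 20 above are derived: NPS is conventionally computed from a 0-10 "how likely are you to recommend" question as the percentage of promoters (9-10) minus the percentage of detractors (0-6). A minimal sketch of that arithmetic (the function name and sample ratings are illustrative, not actual survey data):

```python
# Standard NPS formula: % promoters (ratings 9-10) minus % detractors
# (ratings 0-6) on a 0-10 recommendation scale. Passives (7-8) are
# counted in the total but contribute to neither side.

def net_promoter_score(ratings):
    """Return NPS as an integer in the range [-100, 100]."""
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return round(100 * (promoters - detractors) / len(ratings))

# Hypothetical responses: 3 promoters, 4 passives, 3 detractors -> NPS 0
sample = [10, 9, 9, 8, 8, 7, 7, 6, 5, 3]
print(net_promoter_score(sample))  # prints 0
```

This is also why, as noted later in the thread, a score of 40 is quite good: it requires promoters to outnumber detractors by 40 points of the entire respondent pool.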
One cause for variance is that unfortunately we’re not comparing apples to apples with the trending data in the latest survey report. Upon further review, the chart on the top right of page 9 is misleading, because the May 2015 responses are based on deployments only, while the September 2015 responses are from all survey respondents (not just deployments). Going forward, I think we should focus on deployments as our trend line. The independent analyst took a first pass at comparing NPS results from the last three surveys (with V3 being Sept 2015, V2 being May 2015 and V1 being October 2014), and shared the table below. These are INITIAL findings, but I wanted to share the direction, and we may need to update page 9 in the report.
As a next step, the independent analyst plans to draw up correlations (particularly for low scores) associated with particular technology decisions (e.g. projects or tools) and attitudinal data from the “your thoughts” section (e.g. we might find that firms that value X highly tend to rate OpenStack lowest). We might also want to take the opposite approach and look at why the NPS score seemed to pop in April 2015. With the holiday schedule, I imagine it will be January before we can deliver more data. In the meantime, if you have any thoughts on that approach or specific questions / comparisons you’d like to see, please speak up.
> On Dec 11, 2015, at 2:49 PM, Roland Chan <roland at aptira.com> wrote:
> Indeed. If we are to drop NPS, it should only be in exchange for some other measure of satisfaction.
> At present this is the only place that we get quantifiable data on the performance of OpenStack as a whole. Getting more specific data (e.g. on particular services/tools, or particular initiatives) could be a substitute, but going without entirely would be fairly risky. Having said that, an overall figure is very useful, especially in combination with commentary from the survey.
> On Sat, 12 Dec 2015 07:19 Frank Days <frank.days at tesora.com> wrote:
> NPS is an industry standard measure. It is also a super tough standard to score well on. I think people have a tendency to look at a 40 and think they are failing, when that is actually a pretty good score.
> I’m a marketer so my bias is towards things like NPS that can show that people who are using OpenStack are more than satisfied with the platform. This is also something that the naysayers can’t argue with.
> I think if we drop NPS, then the haters will have an easy way to ask why and if we have something to hide.
> Frank Days | VP, Marketing
> Direct: +1.978.707.8010 ext. 1017
> frank.days at tesora.com | @tangyslice | Skype: fmdays
> 125 CambridgePark Drive, Suite 400, Cambridge, MA 02140
>> On Dec 11, 2015, at 2:44 PM, Jonathan Proulx <jon at csail.mit.edu> wrote:
>> Yes NPS is what is meant on the agenda.
>> I actually don't have any recollection why it was added to the
>> survey, so would welcome some discussion on that topic now.
>> It's on the current agenda because Roland was concerned with the delta
>> in the last two surveys and was looking for more insight into that.
>> If it's not a useful number (and I can see your case for that), then
>> we shouldn't collect it, and if we're not going to collect it in the
>> future there's no point in looking more deeply at it now.
>> Does anyone recall the thought behind adding NPS, or have an argument
>> for its relevance in our context?
>> On Fri, Dec 11, 2015 at 11:32:48AM -0800, Stefano Maffulli wrote:
>> :On 12/10/2015 04:11 PM, Shilla Saebi wrote:
>> :> This is a reminder to let everyone know that we have the monthly user
>> :> committee meeting scheduled for Monday December 14th at 1900 UTC in
>> :> #openstack-meeting on freenode.
>> :Thanks for the reminder. Unfortunately I won't be able to join the
>> :real-time conversation (conflicting work schedule).
>> :Looking at the minimalist agenda (copied below for convenience), I see
>> :topics that would be very useful to discuss on the mailing list for
>> :deeper analysis before bringing them to real-time chat.
>> :I would like to debate the NP score, which I assume is the Net Promoter
>> :score. There are many strong reasons for removing the NPS from the
>> :survey altogether.
>> :The main objection: NPS is designed to track *customer* loyalty to a
>> :*brand*. Typical corporations can identify their customers and their
>> :brand with precision, and can therefore effectively measure such loyalty.
>> :OTOH the OpenStack Foundation has many types of customers, the concept of
>> :its products is not exactly defined, and ultimately OpenStack can't be
>> :considered a brand in the sense of the original article that launched NPS:
>> :https://hbr.org/2003/12/the-one-number-you-need-to-grow/
>> :I'd argue that the NPS collected in the Survey in its current form has
>> :no value whatsoever. http://dilbert.com/strip/2008-05-08
>> :Is the User Committee convinced that an open source project like
>> :OpenStack gets any value from tracking this score? Why exactly is that
>> :number tracked in the survey, what exactly does the UC want to get from
>> :that number, what actionable results are expected from it?
>> :PS adding the Proposed Agenda below so you all can see it without extra
>> :* Meeting schedule - proposal to switch to biweekly (from monthly) & have
>> :alternating times, with one APAC friendly
>> :* Next steps for Survey analysis (cuts against NP score)
>> :* discuss where we want to take the UC and what we can change
>> :User-committee mailing list
>> :User-committee at lists.openstack.org
>> :http://lists.openstack.org/cgi-bin/mailman/listinfo/user-committee