[User-committee] User Survey Data Analysis

Lauren Sell lauren at openstack.org
Tue Nov 17 00:57:44 UTC 2015


> On Nov 16, 2015, at 6:24 PM, Stefano Maffulli <stefano at openstack.org> wrote:
> 
> On 11/16/2015 03:50 PM, Roland Chan wrote:
>> I'm requesting further analysis of the user survey data in order to
>> determine what factors may be contributing to the deterioration of the
>> NPS results over the last two surveys.
> 
> This is one of the cases where I'm not particularly concerned about the
> change in that score but rather its absolute value: the NPS in the
> latest edition could probably be considered a new baseline instead of
> a downward trend, because the sample size of the survey has changed so
> dramatically.
> 
> Also, I'm not sure exactly how valuable the NPS score is for the thing
> that is called 'openstack' in the context of the survey: is that the
> upstream code? the packages from vendors? old, unmaintained releases?
> neutron? devstack? How valuable is that number, really?

I mostly agree the NPS question in itself isn’t the biggest issue, especially since the respondent makeup is quite different as you mentioned. Making sure we get the commentary and feedback (from the entire report) in front of the right people in the community is a bigger priority to me. There were quite a few comments as to the “why,” and the most representative are listed in pages 10-13 of the report. We also provided project-specific commentary to PTLs who submitted questions for the survey.

That said, I’m definitely open to further analysis. We can see if the lower NPS scores correlate with specific releases, projects, tools, etc. I’m open to any ideas people might have on where to start here. The User Experience team also seems keen to do deeper analysis of the commentary, which could lead to more insights and trending. Would the product team be a good place to funnel this feedback?
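To make the proposed correlation analysis concrete, here is a minimal sketch of how NPS could be computed per segment (release, project, tool, etc.). This assumes the standard NPS definition (percentage of promoters scoring 9-10 minus percentage of detractors scoring 0-6) and a hypothetical flat list of responses; the actual survey data shape in the Foundation's tool will differ.

```python
from collections import defaultdict

def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    if not scores:
        return 0.0
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100.0 * (promoters - detractors) / len(scores)

# Hypothetical responses tagged by release; field names are assumptions.
responses = [
    {"release": "Juno", "score": 9},
    {"release": "Juno", "score": 10},
    {"release": "Kilo", "score": 6},
    {"release": "Kilo", "score": 8},
]

by_release = defaultdict(list)
for r in responses:
    by_release[r["release"]].append(r["score"])

for release, scores in sorted(by_release.items()):
    print(release, nps(scores))
```

Grouping the same way by project or deployment tool would show whether the lower scores cluster in particular segments rather than across the board.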

>> I don't know what form the data for the last two user surveys is, but if
>> it's in surveymonkey, 
> 
> it's not in surveymonkey, it's in a custom-built tool managed by the
> Foundation. Access to the data is restricted to very few individuals
> because of its super-sensitive content.

That’s correct. The three User Committee members, Foundation staff and the independent analyst currently have access to the raw data. Tim Bell has championed forming a larger survey working group that can help carry the load of deeper analysis, as well as coordination across the different groups in the community who are currently conducting their own surveys. That was the topic we wanted to continue discussing on the mailing list following today’s meeting.

>> we had some good results using their built-in
>> toolset (credit to Lauren for that) with the Diversity survey results.
> 
> Speaking of which, I heard rumors that the results were published at the
> Board meeting in Tokyo but I haven't seen that presentation made public
> yet. Did I miss the public announcement?

Yes! The full report (the most in-depth to date!) is available here: https://www.openstack.org/assets/survey/Public-User-Survey-Report.pdf

I believe Tom sent it to the mailing lists the day before the Tokyo Summit.

> 
> /stef
> 
> _______________________________________________
> User-committee mailing list
> User-committee at lists.openstack.org
> http://lists.openstack.org/cgi-bin/mailman/listinfo/user-committee
