[User-committee] Analysis of User Survey
Roland Chan
roland at aptira.com
Tue May 17 23:53:55 UTC 2016
Last time, we had someone analyse a large portion of the survey looking for
correlations, and some were found relating to how the OpenStack software was
obtained. At a minimum we can refresh the analysis of those questions. If
there are any other new ones people want to add, please speak up.
I can't find the last analysis, BTW. I remember analysis for correlation
being done on methods used to obtain OpenStack, and I think deployment
size, but beyond that I don't recall.
What I think would be useful is having a consistent and deeper (relative to
the headline items published in the user survey) approach that gets done
automatically as part of the survey process. Perhaps we should be tagging
questions for analysis as part of the survey construction process.
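For concreteness, the per-question NPS breakdown discussed in this thread could be automated with a short script along these lines. This is only a sketch of the idea; the response format (a list of answer/score pairs) and all names are hypothetical, not part of any existing survey tooling:

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6).

    Returns None for an empty segment rather than dividing by zero.
    """
    if not scores:
        return None
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100.0 * (promoters - detractors) / len(scores)

def nps_by_answer(responses):
    """Compute NPS per answer segment.

    `responses` is a list of (answer, nps_score) pairs, e.g. pairing each
    respondent's answer to a tagged survey question with their NPS rating.
    """
    segments = {}
    for answer, score in responses:
        segments.setdefault(answer, []).append(score)
    return {answer: nps(scores) for answer, scores in segments.items()}
```

Tagging a question for analysis would then just mean feeding its answers through `nps_by_answer` each cycle, so the breakdown is reproducible survey to survey.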
Roland
On Tue, 17 May 2016 at 05:19 Kruithof Jr, Pieter <
pieter.kruithof.jr at intel.com> wrote:
> Hi Roland,
>
> Are you interested in correlating the NPS scores with the open-ended
> questions?
>
> Piet
>
> Piet Kruithof
> Sr User Experience Architect,
> Intel Open Source Technology Group
> Project Technical Lead (PTL)
> OpenStack UX project
>
> From: Heidi Joy Tretheway <heidijoy at openstack.org>
> Date: Monday, May 16, 2016 at 12:56 PM
> To: Roland Chan <roland at aptira.com>
> Cc: "user-committee at lists.openstack.org" <
> user-committee at lists.openstack.org>
> Subject: Re: [User-committee] Analysis of User Survey
>
> Hi Roland,
>
> I’d like to better understand which specific questions you’d like to
> correlate with NPS answers. Once we have the scope of the project nailed
> down, we can estimate the associated cost and decide whether to spend the
> time/money on that effort. Another key question is whether we are looking
> at raw NPS scores, or a change in NPS scores over time relative to a
> specific question.
>
> An example of what we’d need to proceed:
>
> - App developer section (section 3)
> - Question: With which other clouds do app users interact? (Figure 3.2)
> - NPS scores from 2016-01 ONLY
> - Show an NPS score from the population that answered each response.
>
> Keep in mind that only sections one and two of the survey provide an
> adequate volume of responses (1,000+) to break down answers by NPS score
> with statistical significance. The question above, for example, would only
> yield ~100-200 NPS responses per answer, which is just 8-15% of the total
> who answered the NPS question.
>
> I recommend forming a team with others who want to dig more deeply into
> NPS and deciding which questions are most important to further analyze.
> Additionally, I’d like your team to discuss how to act on the data, as a
> key discipline of NPS is having a plan in place to address the findings.
> For example, if the questions reveal a challenge of complexity, would the
> recommendation be to create further documentation or content that addresses
> this user concern?
>
> The comment analysis committee has already identified key themes from the
> NPS comment data that each could generate some great action items. Rather
> than focusing on the quantitative side of deeper NPS metrics, may I suggest
> looking at the qualitative input from the questions on NPS score reason and
> what areas of OpenStack have further room for improvement?
>
>
> On May 12, 2016, at 7:58 PM, Roland Chan <roland at aptira.com> wrote:
>
> I was looking for the NPS analysis we did: attempting to correlate the NPS
> scores (particularly detractors) to other responses.
>
> Roland
>
> On 13 May 2016 at 12:48, Lauren Sell <lauren at openstack.org> wrote:
>
>> Good question. We recruited a user survey working group this cycle to
>> help with comment analysis, and Piet was one of the volunteers.
>>
>> The comments they reviewed were not associated with any user, but we
>> still asked them to sign the User Committee confidentiality agreement.
>>
>>
>> On May 12, 2016, at 8:25 PM, Roland Chan <roland at aptira.com> wrote:
>>
>> Thanks Piet. I think we need to get the Foundation to coordinate the
>> activity because of the privacy concerns around looking at the raw survey
>> data.
>>
>> Lauren, is that correct?
>>
>> Roland
>>
>> On 12 May 2016 at 12:21, Kruithof Jr, Pieter <
>> pieter.kruithof.jr at intel.com> wrote:
>>
>>> Hi Roland,
>>>
>>> I was one of the folks that helped with the analysis of the qualitative
>>> data this last survey. I don’t mind helping again if you’re looking for
>>> volunteers.
>>>
>>> Piet
>>>
>>> Piet Kruithof
>>> Sr User Experience Architect,
>>> Intel Open Source Technology Group
>>> Project Technical Lead (PTL)
>>> OpenStack UX project
>>>
>>> From: Roland Chan <roland at aptira.com>
>>> Date: Wednesday, May 11, 2016 at 7:49 PM
>>> To: "user-committee at lists.openstack.org" <
>>> user-committee at lists.openstack.org>
>>> Subject: [User-committee] Analysis of User Survey
>>>
>>> Hi All,
>>>
>>> We did a little bit of deeper analysis of the last user survey,
>>> particularly with respect to NPS scores. I'd certainly be interested in
>>> continuing that effort to see if there are trends in the data now that we
>>> have another data point.
>>>
>>> Anyone else think so?
>>>
>>> Roland
>>>
>>
>>
>>
> _______________________________________________
> User-committee mailing list
> User-committee at lists.openstack.org
> http://lists.openstack.org/cgi-bin/mailman/listinfo/user-committee