[User-committee] User Committee Meeting Monday
Roland Chan
roland at aptira.com
Sun Dec 13 21:54:06 UTC 2015
The analysis and follow-up you describe are the normal approach to an NPS
survey, although I'm not sure whether we (in the broadest sense) can ask
questions of particular respondents directly; questions may have to be
routed through the Foundation for privacy reasons. I don't see that
responding to the verbatim comments and analysing the NPS score are
mutually exclusive, particularly since we have to outsource the analysis,
so it doesn't distract us in the slightest.
The analysis I've requested is an attempt to draw correlations between
high/medium/low scores and other factors. As you point out, it could be
particular vendors (or in fact use of any vendor) that is driving scores
higher or lower. It could also be any number of other things.
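To make the quantitative side concrete, here's a rough sketch of the kind
of per-segment breakdown I have in mind. It is purely illustrative: the
scores, segment labels and grouping are made up, and this is not the
analyst's method, just the shape of the output I'd expect.

from collections import defaultdict

def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100.0 * (promoters - detractors) / len(scores)

# Made-up responses: (0-10 score, segment such as primary vendor)
responses = [
    (9, "vendor-a"), (10, "vendor-a"), (7, "vendor-a"),
    (6, "vendor-b"), (4, "vendor-b"), (8, "vendor-b"),
    (10, "no-vendor"), (9, "no-vendor"), (5, "no-vendor"),
]

by_segment = defaultdict(list)
for score, segment in responses:
    by_segment[segment].append(score)

print("overall NPS: %+.0f" % nps([s for s, _ in responses]))
for segment, scores in sorted(by_segment.items()):
    print("%-10s NPS: %+.0f (n=%d)" % (segment, nps(scores), len(scores)))

The real work, of course, is in choosing the segments and having enough
responses in each one for any differences to be meaningful.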
Whilst we do have heaps of anecdotal evidence that can guide us, one thing
anecdotal evidence isn't good for is identifying trends and progress over
time. In any NPS-driven program it is critical to listen to the verbatim
comments, as they can provide important direction. The NPS data also
provides useful information when combined with other correlated data, or if
the NPS queries are performed after specific interactions (which is
something we might want to look at in the medium term).
In summary: analysis of the performance of OpenStack needs to be both
qualitative and quantitative.
*Roland Chan*
*Aptira - Asia Pacific’s leading provider of OpenStack*
Direct/mobile: +61 4 28 28 48 58
General enquiries: +61 2 8030 2333
Australia toll free: 1800 APTIRA
Website aptira.com
Twitter @aptira
On 12 December 2015 at 11:07, Stefano Maffulli <stefano at openstack.org>
wrote:
> On 12/11/2015 02:31 PM, Lauren Sell wrote:
> > When we analyzed the latest user survey data, we looked at a demographic
> > variable (user role, e.g. app developer), a firmographic variable (e.g.
> > company size), and deployment stage. We learned that overall, there was
> > no significant difference in NPS scores for people who identified with a
> > specific function, such as app developers, or for companies by size.
> [...]
>
> that's all very clear from the survey report. I'm still wondering what
> this is supposed to mean though: how did the promoters/detractors
> acquire 'openstack'? In other words, when they cast the 1-10 vote, what
> are they exactly grading, what does openstack mean to them in such
> context? Are they grading the open source project or their vendor?
>
> A much better use of User Committee time would then be to read pages
> 11-12 of the report and discuss those comments, instead of worrying
> about a change in a number that has vague origins and is incomparable to
> previous editions, as you explain below.
>
> > One cause for variance is that unfortunately we’re not comparing apples
> > to apples with the trending data in the latest survey report.
> [...]
> As a next step, the independent analyst plans to draw up correlations
> [...]
>
> I think this effort is a huge distraction: we have plenty of anecdotal
> evidence from the survey of pain points, and we're using time in the
> User Committee and Foundation to slice and dice a number that comes from
> vague definitions.
>
> In fact, I'd like to see more work to follow up with individual
> interviews with 'detractors' to get actionable insights. For example, who
> is the person saying that "Development is happening at a rapid
> pace, [...] but ‘productization’ is lagging"? What is 'productization',
> and what exactly would that person like to see? Asking them will give us
> more insights and things to do to improve the situation, if really needed.
>
> Also, I'd like to see more focus on other data we already have from the
> contributor and user community: are the quarterly reports with the
> efficiency metrics in bugs and code contributions still being produced?
> Knowing how fast bugs are being fixed and features added is much more
> valuable than stretching an 'industry standard' to an open source project.
>
> /stef
>
> _______________________________________________
> User-committee mailing list
> User-committee at lists.openstack.org
> http://lists.openstack.org/cgi-bin/mailman/listinfo/user-committee
>