From the forums: How we make the most of our NPS data

By Lori Gauthier, Ph.D.

Published February 11, 2016
Last modified February 11, 2016

Ever wonder how Zendesk analyzes and follows up on NPS data? We know as well as our customers that collecting NPS data is one thing, but making sense of all that information, and making it actionable, is sometimes another.

Last week, as part of our ongoing “Zendesk on Zendesk” discussion series, I shared a bit about how we group customers according to their NPS responses, how we effectively analyze Detractor comments and identify root causes of dissatisfaction, and then how we share NPS results with internal teams.

In case you missed it, what follows are some highlights from the full post.

Organizing customers by NPS response
Once we've sent out the NPS survey and collected responses, we then group customers’ ratings according to NPS guidelines:

  • 0 to 6 are known as Detractors. Scores in this range may indicate dissatisfaction with your company.
  • 7 or 8 are known as Passives.
  • 9 or 10 are known as Promoters.

The score itself is calculated by simply subtracting the percentage of Detractors from the percentage of Promoters. Passives are excluded.
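The grouping and subtraction described above can be sketched in a few lines of Python. This is a minimal illustration, not Zendesk's implementation; the sample scores are hypothetical.

```python
def classify(score: int) -> str:
    """Group a 0-10 rating according to standard NPS guidelines."""
    if score <= 6:
        return "Detractor"
    if score <= 8:
        return "Passive"
    return "Promoter"

def net_promoter_score(scores: list[int]) -> float:
    """NPS = % Promoters minus % Detractors; Passives are excluded
    from the subtraction but still count toward the total."""
    groups = [classify(s) for s in scores]
    promoters = groups.count("Promoter") / len(groups) * 100
    detractors = groups.count("Detractor") / len(groups) * 100
    return promoters - detractors

# Hypothetical batch: 5 Promoters, 3 Passives, 2 Detractors
sample = [9, 10, 9, 10, 9, 7, 8, 7, 3, 5]
print(net_promoter_score(sample))  # 50% - 20% = 30.0
```

Note that a batch of all Passives yields a score of 0, even though no one is actively detracting, which is why the comments behind the numbers matter so much.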

To display these groups in Zendesk, you can create a customer list for each segment: Promoters, Passives, and Detractors. Customer lists, which are part of the same add-on that includes NPS surveys, are like views, but for customers instead of tickets. Then, from the list, we have the option to engage with those customers through an email campaign using MailChimp or SurveyMonkey.

NPS-based customer lists also make it easy to review a list of customers who fall into each group and identify potential trends. For example, are there significant differences between small businesses, mid-market, and enterprise companies? Do certain industries tend to consistently have issues with the product? These insights can help us determine where to focus our energy to best serve our customers' needs.

Analyzing comments
In the past, we’ve relied quite heavily on text analysis software, but we’ve found that its algorithms frequently miss the real reasons why customers are unlikely to recommend Zendesk. These text analysis programs are great at synthesizing what survey respondents say, but fail miserably at discerning why they are saying it. Such insights often require a human touch.

Our NPS surveys include a free-form follow-up question asking customers to explain their rating. We've found that digging deep into the comments provided is a great way to understand the real reasons behind their frustrations.

Customers can sometimes misidentify exactly what's causing them difficulty. For example, our customers may comment that they’re not likely to recommend Zendesk because of an issue with the product. Yet when we review those same comments within a broader context, we often find the “product issues” can be remedied with advice, training, or plan updates. Some questions we ask include:

  • Is the customer’s Zendesk properly configured?
  • Have they received the training they need to use the product effectively?
  • Have they outgrown their plan? Would another plan be more appropriate?

Reviewing the comments is an important step that allows us not only to see what Detractors are saying, but to get to the why behind their comments.

Join us in the forums to read the entire post and continue the conversation.