Why Many CX Programs Are Failing to Deliver ROI
In the Customer Experience space, much of the recent discussion has centred on the choice of CX metric, i.e. NPS versus the alternatives. While the choice of metric is important, in the context of customer feedback program success it’s really only part of the story.

It’s not all about Net Promoter Score (NPS)

If you want to have an NPS vs xyz discussion, feel free to send me a note and I’m happy to discuss and debate based on real examples. This discussion, however, is about the other things that matter just as much, and potentially more, to customer feedback program success.

So here’s the first in a series of discussions about the other things that are crucial to the development of a successful customer feedback program. (Yes, it’s not sexy, but customer feedback is really what this discussion centres around, and it’s what drove SAP’s $8 billion acquisition of Qualtrics.)

Over the last 10 years I have developed and managed some of the biggest (and smallest) customer feedback programs globally, and there are a number of gaps I regularly see. I’ll detail and discuss these in a series of practical discussions with real examples.

So let’s talk about the ‘gaps’ in experience programs and how we can close them to deliver better experiences.

Human vs Machine or Human and Machine?

While most surveys do and should include closed-ended questions such as rating scales, by far the richest data sits in the unstructured verbatim responses from customers. This unstructured comment data is often where customers tell you the ‘why’ behind their experience.

For many years, and even today, many survey-based businesses have limited the amount of unstructured comment data they collect, or avoided it altogether, because they haven’t been able to process it and draw meaning from it.

Customer comment data was always seen as a wealth of knowledge, but without an easy way to classify it, it was the poor cousin in the development of true insight, and was (and still is) often used simply to ‘support’ the findings from the closed-ended questions. But now it can and should be the hero!

Previously, gaining insight into the drivers of dissatisfaction (and delight) required complex, time-consuming and expensive research: first qualitatively exploring, then quantitatively measuring the incidence and impact of different aspects of product or service delivery.

The Human and the Machine - Gaining Insight and Driving Change through the Real Voice of the Customer

Enter Artificial Intelligence, and in particular Text Analysis using Natural Language Processing.

Sounds complex? Well, it kind of is, but with the advent of artificial intelligence and advances in text-processing algorithms, the good news is that it’s now accessible.

Firstly, to be clear, this is not a word cloud, and a word cloud is not text analysis.

Leveraging the ‘Machine’, we can quickly identify key themes and topics within open-text data and determine the incidence and impact of specific aspects of product or service delivery.
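To make that concrete, here is a minimal sketch of what machine-assisted theme discovery can look like, assuming Python with scikit-learn. The comments, theme count and keywords are invented for illustration, and this is not the specific tooling behind any one platform:

```python
# A minimal sketch of machine-assisted theme discovery on open-text survey
# comments, assuming Python with scikit-learn. The comments and theme count
# are made up; a production pipeline would add cleaning, lemmatisation and
# human review of the discovered themes.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import NMF

comments = [
    "The baby seat was broken and staff could not replace it",
    "Check-in was fast and the staff were friendly",
    "No child seat available, we had to wait an hour",
    "Great service, will definitely recommend to friends",
    "Queue at check-in was far too long",
]

# Turn raw comments into TF-IDF features, dropping common English stopwords.
vectorizer = TfidfVectorizer(stop_words="english")
X = vectorizer.fit_transform(comments)

# Factorise the comments into a small number of latent themes (topics).
n_themes = 3
nmf = NMF(n_components=n_themes, random_state=0)
doc_topic = nmf.fit_transform(X)

# Show the top words that define each discovered theme.
terms = vectorizer.get_feature_names_out()
for t, weights in enumerate(nmf.components_):
    top_words = [terms[i] for i in weights.argsort()[::-1][:4]]
    print(f"Theme {t}: {', '.join(top_words)}")

# Incidence: share of comments whose strongest theme is theme t.
dominant = doc_topic.argmax(axis=1)
for t in range(n_themes):
    share = (dominant == t).mean()
    print(f"Theme {t} is the dominant theme in {share:.0%} of comments")
```

The same theme tagging can then be joined back to the rating-scale responses to estimate impact, for example the gap in likelihood to recommend between customers who raise a theme and those who don’t (a worked sketch of that appears after the real-world example below).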

As a researcher, what I find most exciting is how it leverages the ‘customer voice’ to determine these drivers. It’s not closed-ended questions that we’ve ‘assumed’ cover the things that matter to our customers; it’s what our customers are telling us matters.

And once we’ve identified the specific things that annoy or delight customers, we can continuously monitor customer comments to measure change or improvement over time and determine whether initiatives are a) actually being implemented and b) working as hoped.
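As a sketch of that ongoing monitoring, again assuming Python (this time with pandas), with invented dates and comments and a crude keyword rule standing in for a real theme model:

```python
# A minimal sketch, assuming Python with pandas, of monitoring a single theme
# over time once it has been identified. The data and the simple keyword rule
# are made up; in practice the theme tagging would come from the text-analysis
# model itself.
import pandas as pd

feedback = pd.DataFrame(
    {
        "date": ["2023-01-15", "2023-01-20", "2023-02-05",
                 "2023-02-18", "2023-03-02", "2023-03-21"],
        "comment": [
            "No child seat available at pickup",
            "Friendly staff, quick service",
            "Child seat was dirty and poorly fitted",
            "Easy booking, no complaints",
            "New baby seats were spotless, thank you",
            "Great experience, will return",
        ],
    }
)
feedback["date"] = pd.to_datetime(feedback["date"])

# Flag comments that mention the theme (a crude keyword rule for the sketch).
theme_keywords = ("child seat", "baby seat")
feedback["mentions_theme"] = feedback["comment"].str.lower().str.contains(
    "|".join(theme_keywords)
)

# Share of comments mentioning the theme, month by month.
monthly = (
    feedback.groupby(feedback["date"].dt.to_period("M"))["mentions_theme"]
    .mean()
    .rename("theme_incidence")
)
print(monthly)
```

If an operational fix lands, negative mentions of that theme should visibly fall in the months after the change; in practice you would also split mentions by sentiment so that a rise in positive mentions isn’t mistaken for the problem persisting.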

And don’t think this requires big data or huge and costly initiatives.

Real-World Example

For one of our clients, we identified a problem that affected only 5% of their customers, but a highly important segment: families with young children.

One part of the journey for this segment was having a monumentally negative effect on their experience and, as a result, on their likelihood to recommend or return in the future.

What was the issue?

Baby/Child Seats.

Knowing they had an issue with baby/child seats, did they make operational changes to address it?

Yes.

Did the change have the desired effect?

Absolutely.

Did they know this before the analysis of the customer comment data?

No, and without the assistance of both a smart machine and smart humans, they would not have identified the issue quickly and easily.

This is a simple, singular example of how you can use existing unstructured comment data to uncover insights that produce real business change and impact.
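As promised above, here is a minimal sketch of one simple way to size that kind of impact: compare likelihood-to-recommend scores for customers whose comments mention a theme against everyone else. It assumes Python with pandas, and the scores and comments are invented for illustration, not the client’s data:

```python
# A minimal sketch of sizing a theme's impact: compare 0-10 likelihood-to-
# recommend scores for customers whose comments mention the theme against
# everyone else. All data below is invented for illustration.
import pandas as pd

responses = pd.DataFrame(
    {
        "ltr": [2, 9, 3, 10, 8, 1, 9, 7],  # 0-10 likelihood to recommend
        "comment": [
            "No baby seat fitted when we arrived",
            "Smooth pickup, friendly team",
            "Child seat missing, had to buy one",
            "Brilliant, will book again",
            "Quick and easy, no issues",
            "Baby seat was broken",
            "Lovely staff, clean car",
            "Fine overall, a short wait at the desk",
        ],
    }
)

# Tag responses that mention the theme (crude keyword rule for the sketch).
mentions = responses["comment"].str.lower().str.contains("baby seat|child seat")

def nps(scores: pd.Series) -> float:
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = (scores >= 9).mean()
    detractors = (scores <= 6).mean()
    return 100 * (promoters - detractors)

print("Incidence of theme:", f"{mentions.mean():.0%}")
print("NPS when theme mentioned:", round(nps(responses.loc[mentions, "ltr"])))
print("NPS otherwise:", round(nps(responses.loc[~mentions, "ltr"])))
```

On this toy data the gap is stark; on real data you would also check sample sizes before acting on a difference like this.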

Think you might have something to learn from your existing data?

Whilst we have a full solution that runs from capturing customer feedback through to producing actionable insights, we can also work with other comment sources, such as internal customer complaints data, online customer review data, existing survey data, or all of these!

Get in touch for a free review of your first data source.

Next, I’ll discuss and share real-world examples of how taking action at the individual customer level can have a monumental impact on customer retention and advocacy.


Like to know more?

Book a time to discuss how we can help.