Tag Archives: survey

From drivers to design thinking

[Image: network diagram]
Driver analysis is great, isn’t it? It reduces the long list of items on your questionnaire to a few key drivers of satisfaction or NPS. A nice simple conclusion—“these are the things we need to invest in if we want to improve”.

But what if it’s not clear how to improve?

Often the key drivers turn out to be big picture, broad-brush, items. Things like “value” or “being treated as a valued customer” which are more or less proxies for overall satisfaction. Difficult to action.
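For anyone who hasn’t run one, driver analysis is usually some flavour of regression of the overall measure on the individual items. Here is a minimal sketch in Python; the file name and column names are purely illustrative, and a standardised linear regression is just one common approach (real studies often use more robust importance measures such as relative weights).

```python
# Minimal driver-analysis sketch (illustrative only).
# Assumes a CSV of survey responses with one column per questionnaire item
# plus an overall satisfaction column; all names here are hypothetical.
import pandas as pd
from sklearn.linear_model import LinearRegression

responses = pd.read_csv("survey_responses.csv")
items = [c for c in responses.columns if c != "overall_sat"]

# Standardise the items so the coefficients are roughly comparable
X = (responses[items] - responses[items].mean()) / responses[items].std()
y = responses["overall_sat"]

model = LinearRegression().fit(X, y)

# The items with the largest coefficients are the "key drivers"
drivers = pd.Series(model.coef_, index=items).sort_values(ascending=False)
print(drivers)
```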

Looking beyond key drivers, there’s a lot of insight to be gained by looking at how all your items relate to each other, as well as to overall satisfaction and NPS. Those correlations, best studied as either a correlogram (one option below) or a network diagram (top right), can tell you a lot without requiring much in the way of assumptions about the data.
[Image: correlogram]
In particular, examining the links between specific items can support a design thinking approach to improving the customer experience based on a more detailed understanding of how your customers see the experiences you create.
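To make that concrete, here is a sketch of the sort of analysis I mean: compute the full item-by-item correlation matrix, then look at it as a correlogram and as a simple network diagram. The file and column names are invented, and the 0.5 cut-off for drawing a network edge is an arbitrary choice for illustration.

```python
# Correlations between all questionnaire items (illustrative sketch).
import pandas as pd
import seaborn as sns
import matplotlib.pyplot as plt
import networkx as nx

responses = pd.read_csv("survey_responses.csv")   # hypothetical file
corr = responses.corr(method="spearman")          # pairwise item correlations

# Correlogram: a heatmap of the correlation matrix
sns.heatmap(corr, annot=True, fmt=".2f", cmap="vlag", center=0)
plt.title("How the items relate to each other")
plt.show()

# Network diagram: draw an edge wherever two items correlate strongly
G = nx.Graph()
for i in corr.columns:
    for j in corr.columns:
        if i < j and corr.loc[i, j] >= 0.5:
            G.add_edge(i, j, weight=corr.loc[i, j])
nx.draw_networkx(G)
plt.show()
```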

Your experiences have a lot of moving parts—don’t you think you ought to know how they mesh together?


The Graphic Gameplan

My job is to give clients actionable insight about their customers.

“Actionable insight”—what a dreadful phrase! Can we make it a bit less management speak?

My job is to help clients understand what their customers want so that they can do a better job of giving it to them.

The trouble is that understanding is only the first step. If we stop at understanding we’re likely to do more harm than good. I like to quote Bruce Lee:

“Knowing is not enough; we must apply.

Willing is not enough; we must do.”

Bruce Lee

So how do we turn our knowledge about customers, and our willingness to improve, into action?

You need three things: top-level commitment, buy-in from throughout the business, and ideas. To get them, you’re going to need to go further than simply presenting the results of your customer insight—you need to involve your colleagues in creating an action plan.

That means some kind of workshop. Workshops are great, but they can often be feel-good days that generate loads of ideas and enthusiasm with little in the way of concrete results.

Good workshops require structure. Build exercises to explore and generate ideas, but finish with a converging exercise in order to deliver a clear way forward. ‘Gamestorming’ is a great book I turn to when I need an exercise for a workshop.


One of my favourites for helping people move from insight to action is the “Graphic Gameplan”. The beauty of this exercise is that it forces participants to break ideas for improving the customer experience into specific actions, slotting them into a strategic timeline view. It leaves you with momentum, accountability, and a clear vision of what is happening next.

If you don’t have a gameplan for improving your customer experience, maybe it’s time to organise a workshop?


Why you want a low score

It’s surprising how often I meet organisations whose leaders want a high score more than they want happy customers.

Some don’t even seem to notice the mental bait-and-switch they’ve played when they pretend it’s the same thing.

In order to improve, you need what a client of ours once called a “burning platform for change”.

A score that looks ok, even if we’d rather it was higher, means there is no burning platform. No burning platform means no significant change.

Often what gets in the way is a measurement process which flatters the organisation.

We’ll ignore deliberate gaming of the score, or completely biased questionnaires, and look at two more subtle problems.


Using a weak measure

All customer survey scores show a skew towards the top end of the scale. Most customers are at least reasonably happy with most organisations. After all, how long would you stick with a company you were scoring at the bottom end of the scale?

At the same time, relatively few organisations have a majority of customers giving them “top box” scores at the extreme end of the scale.

In other words, most companies are quite good at customer satisfaction, but few are consistently excellent. Data from the UKCSI as well as our own client league table backs this up.

When it comes to scores, this means that measuring “% Satisfied” (i.e. the proportion of customers at the top end of the scale) is a tremendously weak and flattering measure.

Companies with over 90% “satisfied” customers can be below average performers when a strong measure is used.

But it sounds good, doesn’t it?

Both Customer Satisfaction Index (CSI) and Net Promoter Score (NPS) will give you a much tougher measure, one that’s more likely to push your organisation to change.
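To see just how flattering the weak measure can be, here is a toy calculation on a made-up set of 0–10 scores, treating 7 and above as “satisfied” purely for illustration (the standard NPS definition is used: promoters score 9–10, detractors 0–6).

```python
# Toy example: the same responses, two very different headlines.
scores = [10, 9, 9, 8, 8, 8, 8, 7, 7, 6, 6, 5]   # made-up 0-10 scores

pct_satisfied = sum(s >= 7 for s in scores) / len(scores)
promoters = sum(s >= 9 for s in scores) / len(scores)
detractors = sum(s <= 6 for s in scores) / len(scores)
nps = (promoters - detractors) * 100

print(f"% Satisfied: {pct_satisfied:.0%}")   # 75% -- sounds comfortable
print(f"NPS: {nps:.0f}")                     # 0 -- a burning platform
```

Same customers, same data; only one of those headlines creates a burning platform.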

 

Benchmarking for comfort, not for ideas

Benchmarking can be a brilliant tool for improvement, or a distraction that does nothing but get in the way. David Ogilvy once said:

We all have a tendency to use research as a drunkard uses a lamppost—for support, not for illumination.

Benchmarking is much the same.

Internal benchmarking is a very powerful way to improve an organisation’s performance by sharing best practice and taking advantage of people’s natural competitiveness. Enterprise Rent-A-Car used this very effectively in the late 90s, as discussed in this classic HBR case study.

External benchmarking is useful to help you understand the range of performance that’s been achieved by others, and to find ideas for improvement (Southwest Airlines looked at Formula 1 pit crews to improve their turnaround time).

In practice, many organisations indulge in what I call vanity benchmarking – redefining their comparison set until they find a league table they look good in.

Even worse, some organisations (inadvertently or otherwise) cheat. They use different scales, or different methodologies, or change the way NPS is calculated, or exclude customers who made a complaint, or any one of 1,000 other tricks.

Benchmarking should be about finding opportunities to improve, not a PR exercise.
