I’ve written about Customer Effort Score (CES) before and gently chided how intractable it is to define precisely. Of course, it’s not fair to pick on CES alone; as I’ve noted elsewhere, even common metrics like First Contact Resolution run into definitional problems when they encounter actual Customer opinions (we all have our own definitions).
But specifically, when it comes to Effort (or, as I sometimes call it, “hassle”), I remember a wise Process Engineer who used to work for me once noting: “We’re defining ‘hassle’ from our own perspective.” He was correct to point it out in that instance. I wonder: Are you doing the same? (more…)
With all due deference to Matt Dixon, “effort” can be a tricky thing to define. I worked with one team that, it seemed, went around and around about it constantly. Matt’s Customer Effort Score (CES) metric essentially asks Customers to rate their satisfaction with the amount of effort they expended to solve an issue or otherwise accomplish something.
Now, right away you can see the question this begs: How do we even know the issue has been solved in the first place? This goes to an age-old conundrum: how can we ensure a Customer’s issue has actually been resolved before we send out a survey for feedback, regardless of the survey type? After all, it adds insult to injury to ask, “Hey, how’d we do?” while the Customer is still waiting for a solution. But let’s set that issue aside for now, as it’s a common concern (NPS, C-SAT, and all the others share the same limitation). (more…)
“Well, it’s because they’re different.”
That response, not deliberately snarky but admittedly an oversimplified tautology, was understandably unsatisfying for the support business leader who’d asked me why I thought NPS would differ across the different lines of business his organization supported. But in the end, it’s no more complicated than that. Never mind that, without realizing it, I was loosely quoting Vanilla Ice; sometimes, in the context of a business we think we already understand, it’s hard to see the forest for the trees.
In my defense, the Customer profiles were different, the products in the two lines of business were tremendously different, and even the people supporting them were different, working out of different contact centers.
This occurred to me the other day when I was reading through some discussions regarding two approaches in the VoC world: Transactional versus Relationship surveys. (more…)
The topic of the Voice of the Customer (VoC) has many branches and sub-categories. On the topic of surveys alone (which is only one part of VoC), there’s plenty to discuss: the formatting of surveys, proper response rates, how and what sorts of questions to ask, which channel to survey through, even whom to survey. Beyond that, there are numerous other methods of collecting the Voice of the Customer: market analyses, social media (SoMe) monitoring and analysis, competitive comparisons, and of course we can’t forget Walking in the Customers’ Shoes. Each of these methods likewise comes with its own set of approaches and execution methods.
But what about what comes out of those efforts? Sometimes we concern ourselves so much with the day-to-day transactional work of collecting the VoC that we forget why we’re doing it in the first place. In the worst case, we substitute raw winning-versus-losing motivations for insights and devolve the entire process into: “What’s the score today?” Let’s back up, though, and recognize what I’ve said so many times I should just make a bumper sticker out of it: VoC insights are of no use if you don’t use them to improve your Customers’ experiences. That leads to a remarkable—and to some, shocking—conclusion:
You should be hungry for negative feedback. (more…)
I’m a big fan, as you know, of negative feedback. I suggest that CX professionals be greedy for negative feedback. Since slaps on the back and hoorahs from your most ardent fans don’t really help you improve, you should be eager to hear “suggestions” from your Customers as to how you can better serve them. Fortunately, there’s rarely a shortage of such inputs. So what do you do with this feedback? There are three ways in which you should be using every negative piece of information you receive from your Customers, regardless of the method of transmission: (more…)
This is part two of a series of posts on the four components of a good CX system. I introduced the concept here and my first post, on CX strategic alignment, is here. Soon I’ll write about Process Engineering and wrap up with notes on what it takes to build and maintain a Customer-centric culture.
Folks often boil the Voice of the Customer (VoC) down to nothing more than surveying. This is a big mistake. You’ve heard the expression that we need to “meet the Customers where they are” when it comes to our offerings. Well, gathering their input should take the same approach. Just as our products and services meet the different needs of different Customers in different ways, so too should we understand the individuality and unique journeys of each segment of our Customer base when soliciting their feedback. (more…)
In this article I’ll explain the components that need to be in place to elevate an organization’s Customer Experience. I won’t go too deeply into the four parts; rather, I’ll provide an overview of how they work together, with a brief explanation of what each one is. Then, in a continuation of this series, I’ll explain with more specificity how they work (and work together), as each deserves its own article.