I recently wrote about how some brands don’t really listen to their Customers when they develop new functions, features, and even new products, and how frustrating that can be. That’s often because a brand feels so comfortable and strong that they don’t really need to listen. “We know what our Customers want, we’ve been doing this for years,” they seem to be saying.
A lot of the work I do is with entrepreneurs, helping them get started with CX and inculcating their new organizations with a truly Customer-centric culture. I was discussing this dynamic with one of my start-up friends recently, and it brought up a trend with early-stage organizations that I shared with her and figured I’d also pass along here. […]
I participated recently in another one of these awesome forums where CXers gather to chat and share ideas about our profession. The main topic centered on VoC approaches, and at one point someone brought up the challenge of interpreting his company’s NPS results. To paraphrase him, sometimes one Customer may rate an experience as a ‘0’ while another Customer may rate the exact same experience as a ‘10’. […]
I once worked with a client who had a somewhat complicated concern about survey data: Every time they interacted with a Customer, they’d send out a survey invitation. That was all well and good, but they never knew (does anybody?) when they’d get a response. They put a time limit on the survey (the invitation expired after 30 days), but they knew their Customers were busy, so it might be a couple of weeks before someone got around to filling it out.
That dynamic made reporting a bit trickier because the results that arrived on any given day were not necessarily reflective of their performance at that time. Because responses would trickle in over the course of several weeks to a month, the picture and insights from one day’s experiences would be diluted and confused with those of later days. Now, in some situations this may seem a trivial concern (What difference would it make, for example, if the Monday results got mixed in with the Tuesday and Wednesday results?), but I’d been working with this group for a while, and they’d embraced my admonition to act on the insights they were gathering from their Customers. So in this case, the changes they were making should show up in the results of their VoC program…keeping each survey result aligned with what was going on when the encounter it represented actually occurred was important. […]
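The fix the client landed on amounts to a grouping choice: report scores by the date of the interaction, not the date the response arrived. A minimal sketch of that idea, assuming hypothetical response records that carry both dates (the field names and sample data here are illustrative, not from the client’s actual system):

```python
from datetime import date

# Hypothetical survey responses: each records when the Customer
# interaction happened and when the completed survey actually arrived.
responses = [
    {"interaction_date": date(2024, 3, 4),  "response_date": date(2024, 3, 20), "score": 9},
    {"interaction_date": date(2024, 3, 4),  "response_date": date(2024, 3, 6),  "score": 7},
    {"interaction_date": date(2024, 3, 18), "response_date": date(2024, 3, 21), "score": 10},
]

def average_by(records, date_field):
    """Average survey scores grouped by the given date field."""
    buckets = {}
    for r in records:
        buckets.setdefault(r[date_field], []).append(r["score"])
    return {d: sum(scores) / len(scores) for d, scores in buckets.items()}

# Grouping by response_date mixes experiences from different weeks into
# one day's report; grouping by interaction_date keeps each score tied
# to the encounter it represents, so improvements show up on the right day.
by_interaction = average_by(responses, "interaction_date")
by_response = average_by(responses, "response_date")
```

The point of the sketch is just the choice of grouping key: the two late-arriving March 4 responses land in the March 4 bucket either way the Customer answers, so a change the team made mid-month shows up against the dates it actually affected.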
A friend of mine was ribbing me the other day about the Net Promoter System and how I’m a pretty ardent critic of it, or at least a questioner…not least because I think there’s a better approach (hint: It has to do with the proper purpose of doing CX in the first place). […]
This is another in a series of articles I decided to start writing a while back calling out brands for doing the right thing when it comes to CX. There’s a lot of negativity out there, and I’m even a big fan of learning from our (and others’!) bad CX practices. You’ll notice I don’t name names when I write about poor CX, but here it’s the opposite and I endeavor to highlight those who get it right. […]
I’ve been with a certain service provider for about 20 years now. It’s the longest I’ve ever been with any brand, at least that I can think of off the top of my head. Sometimes you stay because it’s the only game in town (your cable company likely fits this example). Sometimes you stay out of a sense of laziness (are you a Coke person or a Pepsi person?). But I’m not sure I’d say that I’m staying with them out of ‘loyalty’.
It’s kind of a weird thing to think about: loyalty to a brand. […]