118: CSAT with Nicholas Zeisler

Nicholas Zeisler, of Zeisler Consulting, joins me on the podcast for the first time, to talk about how we should pay attention to all the detail behind the scenes when analysing and working to improve our CSAT metrics. He’s promised to return to talk about tying CSAT to wider company goals, but in the meantime, here’s a video he made earlier on that.


I’d love your thoughts on this episode! Comment below, and like/love/share/support if you found this inspiring, thought-provoking, or useful!

Charlotte Ward 0:13
Hello, and welcome to Episode 118 of the Customer Support Leaders podcast. I’m Charlotte Ward. The theme for this week is customer satisfaction, so stay tuned for five leaders talking about that very topic. I’d like to welcome to the podcast today, for the first time, Nicholas Zeisler. Nicholas, it’s lovely to meet you and have you on the show. Would you like to introduce yourself?

Nicholas Zeisler 0:42
Thanks, Charlotte. Certainly, I appreciate the opportunity to join you and your voluminous fans out there. My name is Nicholas Zeisler. I am the principal of Zeisler Consulting. It’s a CX consultancy that blends Voice of the Customer work, for those insights, with process engineering, which is to say doing something with the voice of the customer that you’re collecting, and then also concentrating quite a bit on developing customer-centric culture within organisations as well.

Charlotte Ward 1:10
Awesome. Thank you so much. So Voice of the Customer is a big part of what I brought you here, what I lured you here, to talk about. Under false pretences! I’m sorry, I absolutely wasn’t. I would love to spend the next few minutes talking with you about customer satisfaction, good old CSAT, which so far, in all of my conversations with other support leaders, recorded or not, has elicited quite a variety of responses, from love to loathing, I have to say. Where are you on that scale?

Nicholas Zeisler 1:44
I am on that scale, which is to say I find myself all over that scale. There’s a lot of discussion about this, and I used to be quite the contrarian. When I worked at HP, I was a bit opposed to the use of NPS, which was something that we swore by at the time. And the more I looked into it, the more I realised that it wasn’t so much “NPS sucks”, “CSAT sucks”, “Customer Effort Score, or whatever we use as a top-line CX KPI, is terrible”, because there are always going to be people who are acolytes on one side or the other, people who are going to champion this one or that one. And quite seriously, Charlotte, at the end of the day, it’s more what you do with it. Because there are people working in an organisation that just swears by CSAT: that’s the ultimate, that’s what we use around here, that’s the gold standard. And there’ll be some folks in there who’ll say, “Oh, wow, this is just no good. We shouldn’t be using this.” And I say, you know, you’re absolutely right, because on its own, it is no good. And I break it down into two things. First of all, it’s no good without some sort of supporting information, because I see CSAT and NPS just like revenues or sales: quite frankly, there’s not a knob you can turn to make them go where you want them to go. It’s an output measure; it’s a top-level KPI. And, you know, I started this whole career of mine as an analyst, and one of the things you learn right away is that those higher-level numbers are influenced by so many other things. And so if you’re just collecting and just looking at CSAT, or NPS, or pick your poison, you’re not going to do yourself any good unless you’re also looking at attribute data, unless you’re digging into the different aspects of the experience the customer had.
Let’s talk about, you know, the agent’s attitude, whether the agent solved the problem, whether the agent was conversational or whatnot. What is going on behind the scenes, below the surface of that KPI? Yeah, if all you’re looking at is CSAT, it does suck, because it isn’t going to give you that insight.

Charlotte Ward 3:59
Even if it’s a big number, it still sucks, right? Because what did it tell you?

Nicholas Zeisler 4:03
That’s right: we’re at 100, and the information sucks. Your number’s great, but it’s not a good measure if you’re not digging down below the surface and finding out what’s driving it. You have to look for those corresponding numbers; you have to look at the correlation between all of that other attribute data and the KPI. And quite frankly, you have to do something which also kind of sucks, because there’s a lot of work involved: you have to look into your verbatims. You have to read the things that your customers say to you in those open-ended text questions that, ultimately, you definitely should be asking. I trust you are, Charlotte.

Charlotte Ward 4:38
Well, I have yet, in my new place, to implement a CSAT programme, but believe me, when I wrote about CSAT quite some time ago, I expressed a desire. I had a dream, Nicholas, I had a dream, that one day, and maybe this is the place, who knows, I will implement it: a CSAT programme that has no numbers tied to it at all.

Nicholas Zeisler 5:04
Very good.

Charlotte Ward 5:05
It’s just an open-ended question. And it will not even be very specific. The question will be: is there anything you would like to tell us?

Nicholas Zeisler 5:16
And there’s not even a question mark, and that’s the end of it, right? I love it. That’s fantastic. Because, quite frankly, you know, the second part of my admonition about CSAT sucking as a measure is: so what? What are you going to do with it? The one thing that’s for sure about any of these top-line numbers is that you can’t do anything about them. You can’t do anything about CSAT, you can’t do anything about NPS, you can’t do anything about Customer Effort Score, generally speaking. But you sure as heck can do something about why all of our customers are telling us that we’re not solving their problems when they call into our help centre, or why all our customers are telling us that our IVR is a real pain in the ass to try to navigate. Whatever it is, you can actually do something about that. And that’s where, quite frankly, to your point, those open-ended questions really serve you very well. It’s unfortunate that a lot of times people just want to make a word cloud, or say, “Hey, here’s some sort of search that we did.” But you’re asking your customers for their feedback; they’re doing you a favour by telling you what their experience was. You should be grateful for that, and as a result, you should be reading every damn one of those.

Charlotte Ward 6:26
Absolutely, absolutely. I want to track back slightly to something you said before, about looking at the underlying attributes and correlations. Obviously, there are all sorts of things you can go away and look at in your organisation that might be an influence on CSAT. An easy one: response times, right? That’s a nice easy one, and it’s one that people often incentivise on, or look for correlations to with CSAT. You know, one thing I learned about correlation is that it doesn’t equal causation. And I’d be interested in how you identify what those factors, those attributes, are or might be, and how you distinguish an accidental correlation from something you can actually action.

Nicholas Zeisler 7:16
Yeah, that’s a great question, Charlotte. And in fact, you’d be surprised how often it is that if you ask that top-level CSAT question, and then ask five or six attribute questions, some of those five or six attribute questions turn out to be highly correlated with it. And you know why? Because those are the questions that you asked; they’re only going to give you those insights, unless you also give customers that open-ended question. And that, by the way, should feed into your VoC programme: your surveys shouldn’t be cast in stone. It’s the insights you’re getting from those open-ended questions that should inform what the newer version of your survey, with new attribute questions, looks like. You could just add and add and add every time you see something, and then of course nobody will respond to your survey, right? Because it’s as long as your arm. But you should be looking for insights in those open-ended text responses so that you can refine the questions that you’re asking. Yeah, absolutely. And then I’d also say: make sure that when you define something a certain way, your customers understand it the same way as well. When you talk about resolution, does resolution mean that your agent closed the ticket? Or does resolution mean that the customer is satisfied, the problem is fixed, and they’re made whole?

Charlotte Ward 8:35
I love that: just make sure that you’re both talking about the same thing, because resolution doesn’t mean the same thing to everyone. That’s a great big dollop of food for thought in that one sentence.

Nicholas Zeisler 8:47
Well, my work is done here.

Charlotte Ward 8:48
We might have to have a long pause now, while I just think.

Nicholas Zeisler 8:53
Now Charlotte and Nicholas will reflect.

Charlotte Ward 8:55
Yeah, exactly. So you talked about refining the questions there. You’re probably also refining a lot of stuff internally as well: you’re improving processes, you’re redefining incentivisation, you’re doing a myriad of things, internally and externally. Maybe this is a question for another podcast, Nicholas, but tell me, if you can, in a couple of sentences: how do you decide the cause and effect as you’re changing both aspects? Because these are both kind of moving targets as well.

Nicholas Zeisler 9:30
Well, yeah. The thing about it is that we CX people can satisfy ourselves with CX being CX. We all convince ourselves, and each other, how important it is, and in a similar sense we can become kind of solipsistic and just figure, well, we’re going to fix this stuff, right? But we should be able to show our colleagues in other parts of the business that things are getting better as a result of it. Some brilliant person once said that if you want to change perception, change reality; of course, I’ve mangled the quote there. But the idea is that we’re measuring this, and this, we believe, is driving CSAT. And this presumes that you’ve already made the connection that driving CSAT drives revenues, drives sales, drives the bottom line, right? If you’ve already got that, that’s good. But you have to be able to show that when you turn this knob here, this input variable, when you hire a couple more people for the contact centre, when you change the way we do this, when the script goes this way, when you blow up that stupid IVR and implement a new one, people will stop complaining about this. They won’t stop complaining altogether, but they’ll stop complaining about this. Well, that’s an impact, right? And if that correlates, and leads to that improvement in CSAT, or NPS, or whatever your CX KPI is, then you can feel confident that you’ve actually done it. And people will look at that and say, okay, I don’t know all this CX voodoo that you’re doing over there, but I do see that when we changed this, it really did change the top-line KPI from the customer experience perspective. Sometimes that’s all you can get, right? You’d better be sure that when that KPI changes, your revenues and your sales change as well. But like I said, that’s a topic for another podcast. That’s the one there, too. Yeah.

Charlotte Ward 11:27
Please come back and record that another time with me, will you? “Be happy to!” That’s it for today. Go to customersupportleaders.com/118 for the show notes, and I’ll see you next time.

Transcribed by https://otter.ai


A little disclaimer about the podcast, blog interviews and articles on this site: the views, thoughts, and opinions expressed in the text and podcast belong solely to the author or interviewee, and not necessarily to any employer, organization, committee or other group or individual.