
Andy's Support Notes

November 6, 2023

Customer satisfaction surveys and you

I sat, you sat, we all sat for CSAT

"Eureka Silk is the best [front]" by Boston Public Library is licensed under CC BY 2.0.

As you’re getting your ticketing system up and running, you may notice an option for enabling customer satisfaction (CSAT) surveys. These are short (one- or two-question) surveys that attempt to answer a simple question: how well did this support experience go? Typically it works like this:

  1. The issue is resolved

  2. Depending on how integrated the CSAT tool is, either the system automatically sends a survey to the customer or the engineer initiates the process (a rough automation sketch follows this list)

  3. The customer (ideally) responds to the survey

  4. The support engineer and their manager review the customer response and determine next steps, if any
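
To make step 2 concrete, here's a minimal sketch of what an automated trigger might look like. Everything here is an assumption rather than a real ticketing-system API: the webhook payload shape, the survey URL format, and the addresses are all hypothetical, and the demo assumes a local mail relay.

```python
# Minimal sketch of step 2: sending the survey automatically when a
# ticket is resolved. The payload shape, survey URL format, and email
# addresses are hypothetical; adapt them to your ticketing system.
import smtplib
from email.message import EmailMessage

SURVEY_BASE_URL = "https://surveys.example.com/csat"  # hypothetical survey host

def build_survey_email(ticket_id: str, customer_email: str) -> EmailMessage:
    """Compose a one-question survey email, embedding the ticket ID in
    the link so the response is directly attributable to the issue."""
    msg = EmailMessage()
    msg["Subject"] = f"How did we do on ticket #{ticket_id}?"
    msg["From"] = "support@example.com"
    msg["To"] = customer_email
    msg.set_content(
        "Your issue was just resolved. If you have a moment, let us know "
        f"how it went: {SURVEY_BASE_URL}?ticket={ticket_id}"
    )
    return msg

def on_ticket_resolved(ticket: dict) -> None:
    """Hypothetical webhook callback: fires once for every resolved
    ticket, so every interaction is surveyed, not just memorable ones."""
    msg = build_survey_email(ticket["id"], ticket["customer_email"])
    with smtplib.SMTP("localhost") as smtp:  # assumes a local mail relay
        smtp.send_message(msg)

if __name__ == "__main__":
    on_ticket_resolved({"id": "12345", "customer_email": "jane@example.com"})
```

Embedding the ticket ID in the survey link also sets up the "directly attributable" property discussed below.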

The best feedback from customers is the kind that comes immediately, rather than months later during a Customer Success check-in or, even worse, at renewal. If you’re doing well, it’s good to know. If you need to make course corrections, the sooner you know this the better off you’ll be.

Personal note

Before I get into the details of what makes a good CSAT setup, I wanted to talk a little about my own experience. For a long time I was resistant to enabling CSAT in my own ticketing system, for several good (and less good) reasons:

  • It felt very transactional. We went to great lengths to make our ticket handling process feel human, for lack of a better term: no autoresponses, no canned message templates, and ticket nudges written by hand. Following all that up with an automated message asking the customer to fill out a survey felt like it would ruin the impression we’d gone to so much trouble to cultivate.

  • How good would the data be, anyway? We all hate filling out surveys, and speaking for myself, the only time I bother is if I’m really happy or really mad. Would we even get useful data out of these CSAT surveys?

  • What if it tells us things we don’t want to hear? The flip side of the above question. Maybe we’d get lots of responses and they’d be terrible. Were we just fooling ourselves in our opinion that we were doing a good job with customers? In case it’s not obvious, this is not a good reason to avoid CSAT!

Now, obviously, I’ve come around since then. All it took was taking the plunge: setting up the CSAT survey and letting it run for a couple of weeks. What that experience taught us:

  • Letting our customers talk about their experience made things more human, not less. We got valuable feedback outside of the immediate context of having a problem to solve. Instead, customers spoke about the overall experience with the benefit of hindsight.

  • When customers could share feedback with just a couple of clicks, there was a lot of it. We ended up getting a 20-30% response rate almost immediately, which was far higher than we expected.

  • The feedback was a mix of praise and constructive criticism. If we were doing a terrible job overall, we’d have heard about it outside the CSAT mechanism. By normalizing feedback like this, we were able to get a balanced view of how the team was doing, and act on it accordingly.

That last point is important, and I’ll go into it more below: no matter the tenor of the feedback, there are lessons to be learned and applied. If you’re not doing that, the feedback loop is incomplete.

Components of a good CSAT survey

If your ticketing system doesn’t offer CSAT surveys, or the package you’ve chosen doesn’t include them, there are a number of inexpensive third-party providers that integrate with most ticketing systems. Whichever method you choose, CSAT is a powerful adjunct to your support ticketing process. When you’re setting up your CSAT survey, particularly with tools that permit more customization than just ‘enable/disable’, here are a few things to keep in mind to maximize the response rate—and usefulness—of CSAT.

  • Short: remember that these surveys are fundamentally an imposition. You’re asking your customer for a favor, and so the best way to get a response is to make it as easy as possible on them. One question, an optional space for free-form comments, done. Any more than that and your busy customers just won’t bother.

  • Focused: there are all kinds of different customer sentiment measurements out there, but CSAT is specifically aimed at just one thing. It’s right there in the name: customer satisfaction. Don’t worry about asking bigger picture questions about the product or the company—this survey is all about measuring the effectiveness of the customer interaction that just occurred.

  • Automatic: however you choose to implement CSAT surveys, make sure that they’re automatically sent for every single ticket. It’s the only way to make sure you’re getting a fair sampling of all support interactions, not just the ones where the engineer remembers to send the link (or, more troublingly, decides to send it only when they feel it was a successful interaction).

  • Directly attributable: you must be able to link each CSAT response to the specific support issue it’s associated with. A separate survey tool that isn’t integrated with the ticketing system can be fine for gauging general customer sentiment, but if you can’t point to the specific interactions the customer liked (or disliked), you’re wasting your time. A minimal sketch of this linkage follows the list.
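
As one rough sketch of that attribution, here's a way to store each response keyed to the ticket it came from, using only Python's standard library. The table and field names are illustrative, not from any particular survey product, and the 1-5 scale is just one common convention.

```python
# Sketch of direct attribution: every CSAT response is stored against
# the ticket it came from. Table and field names are illustrative.
import sqlite3

def init_db(conn: sqlite3.Connection) -> None:
    conn.execute(
        """CREATE TABLE IF NOT EXISTS csat_responses (
               ticket_id TEXT NOT NULL,     -- links back to the support issue
               score     INTEGER NOT NULL,  -- 1 = very unsatisfied .. 5 = very satisfied
               comment   TEXT               -- optional free-form feedback
           )"""
    )

def record_response(conn: sqlite3.Connection, ticket_id: str,
                    score: int, comment: str = "") -> None:
    """Store one survey response, keyed to its originating ticket."""
    if not 1 <= score <= 5:
        raise ValueError("score must be between 1 and 5")
    conn.execute(
        "INSERT INTO csat_responses (ticket_id, score, comment) VALUES (?, ?, ?)",
        (ticket_id, score, comment),
    )
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    init_db(conn)
    record_response(conn, "12345", 2, "Took too long to get an answer")
    # Because ticket_id travels with the score, a low rating can be
    # traced straight back to the interaction that caused it.
```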

Using the responses

So now that you’re getting responses, hopefully a good number of them, what do you do with that information? After all, it’s just a couple of data points. But even with that limited information there is a lot you can learn. Start looking for patterns in both positive and negative responses: what do they have in common? Is there something that your customers particularly like or dislike? Is a negative CSAT response an outlier or indicative of a systemic issue? The only way to find out is to look at the context, that is, the specific support issue that CSAT response is connected to.

  • Individual development: if one engineer is consistently getting good (or bad) feedback, bring it up with them directly! Everyone likes to hear they’re on the right track, and even if the news isn’t great, it still needs to be raised so you can work with them on figuring out what needs improvement.

  • Process improvement: if you get negative customer feedback but can’t find anything in the ticket itself that would point to dissatisfaction, that’s a sign you need to gather more context. Did things move too slowly? Do your ticket processes not include regular updates? You may need to reach out to the customer, perhaps with the assistance of Customer Success, to find the root cause of the negative feedback.

  • Product feedback: sometimes CSAT responses, especially in a free-form response section, have nothing to do with the interaction itself and everything to do with the product you’re supporting. Positive or negative feedback directed toward the product can be just as valuable, however, and should be shared regularly with Product leadership.

  • Change from baseline: once you’ve been gathering CSAT responses for a few months, you’ll have a baseline response rate and satisfaction rate that you can monitor going forward. Any major deviation, positive or negative, is a sign that you need to look more closely at what has changed (a rough monitoring sketch follows this list).
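
As one rough example of that monitoring, assuming responses are bucketed by month and scored 1-5, a few lines of Python can compute a satisfaction rate and flag months that drift from the baseline. The "4 or 5 counts as satisfied" definition and the 10-point tolerance are arbitrary example choices, not industry standards.

```python
# Sketch of baseline monitoring: compute a monthly satisfaction rate,
# then flag months that deviate sharply from the rest. The thresholds
# and data shapes here are assumptions, not fixed industry values.
from statistics import mean

def satisfaction_rate(scores: list[int]) -> float:
    """Share of responses scoring 4 or 5 on a 1-5 scale (one common
    definition of CSAT; adjust to match your survey's scale)."""
    return sum(s >= 4 for s in scores) / len(scores)

def flag_deviations(monthly_rates: dict[str, float],
                    tolerance: float = 0.10) -> list[tuple[str, float]]:
    """Flag any month whose rate differs from the mean of the other
    months by more than `tolerance` (an arbitrary example threshold)."""
    flagged = []
    for month, rate in monthly_rates.items():
        others = [r for m, r in monthly_rates.items() if m != month]
        if others and abs(rate - mean(others)) > tolerance:
            flagged.append((month, rate))
    return flagged

if __name__ == "__main__":
    rates = {"2023-08": 0.86, "2023-09": 0.88, "2023-10": 0.71}
    print(flag_deviations(rates))  # [('2023-10', 0.71)]: worth a closer look
```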

The sooner you start gathering this information, the sooner you’ll be getting use out of it. The next time I set up a ticketing system, I’ll be enabling CSAT as early as possible to start getting the benefits of that additional data as I build a team and support processes.

Thanks for reading Andy's Support Notes 💻💥📝!
