CSAT: Customer Satisfaction

From Support Driven Wiki

What is CSAT?

CSAT stands for “customer satisfaction”; the “SAT” is simply the first syllable of “satisfaction”. With “customer support” commonly being initialized as CS, a distinct abbreviation was needed for customer satisfaction. CSAT is often pronounced “see sat”, or, as Tim Cook calls it, “customer sat”.

CSAT measures the satisfaction of each customer interaction with your company’s customer support and is used for customer support agent improvement and development. It differs from Net Promoter Score (NPS), which measures overall company and product feedback, and Customer Effort Score (CES), which measures overarching feedback about your support process.
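As a concrete illustration (a sketch, not from the original article), CSAT is commonly reported as the percentage of satisfied responses out of all survey responses received:

```python
# Minimal sketch of the common CSAT calculation: the percentage of
# responses counted as "satisfied" among all survey responses.
# The rating values below are hypothetical examples.
def csat_percentage(responses, satisfied_values):
    """Return CSAT as the percentage of responses counted as satisfied."""
    if not responses:
        return 0.0
    satisfied = sum(1 for r in responses if r in satisfied_values)
    return 100.0 * satisfied / len(responses)

# Example: thumbs up / thumbs down style ratings
ratings = ["up", "up", "down", "up"]
print(csat_percentage(ratings, {"up"}))  # → 75.0
```

What counts as “satisfied” depends on the survey style you choose (thumbs, faces, or a numeric scale), which is why the satisfied set is a parameter here.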


What are some common ways of using CSAT?

Thumbs up, thumbs down

A simple thumbs up, thumbs down approach is good for getting high-level numbers to display to a board or on a website, but it offers little detail to act on.

Happy, neutral, sad

A 3-4 point happy-to-sad face rating, or a simple “How did I do? Good, just okay, not good.” included with each email reply, is best for rating the customer service representatives themselves and the experience of that particular support interaction. Ratings on specific replies, rather than on the entire experience, are more helpful in reviewing how you are doing with your customer support responses. They also give you a chance to up your game on the next response, instead of the customer receiving a separate CSAT survey 24 hours later and waiting for their feedback to be acted on.

1-5 or 1-10 score

Some use the Net Promoter Score (NPS) and CSAT interchangeably with a 1-5 or 1-10 score approach. This is best for evaluating the product and how good it is, but isn’t necessarily as useful for evaluating customer support representatives.
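When a numeric scale is used, one common convention (an assumption on my part, not stated in the article) is to count only the top ratings as “satisfied” when converting raw scores into a CSAT percentage:

```python
# A common "top-box" convention (an assumption, not from the article):
# on a 1-5 scale, only ratings of 4 and 5 count as "satisfied".
def csat_from_scores(scores, satisfied_threshold=4):
    """Percentage of scores at or above the satisfaction threshold."""
    if not scores:
        return 0.0
    satisfied = sum(1 for s in scores if s >= satisfied_threshold)
    return 100.0 * satisfied / len(scores)

print(csat_from_scores([5, 4, 3, 5, 2]))  # → 60.0
```

On a 1-10 scale you would raise the threshold accordingly (e.g. 9, as NPS does for “promoters”), which is part of why the two metrics are sometimes conflated.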


What are the pitfalls of using CSAT?

CSAT does not equal loyalty

CSAT isn’t a strong indicator of customer loyalty. Customers can be easily satisfied with their support experience, yet unwilling to stick with your company. It’s extremely helpful for understanding whether a particular support interaction was satisfactory - and many support reps find it valuable for editing their replies - but it’s not the best way to gauge the larger, overall sentiments customers might have.

This problem stems from the fact that the methodology behind CSAT was previously built around a much more traditional means of support. Support has changed, and continues to change, meaning what we track has to evolve with it.

Does not identify process or product pain

CSAT is often not a good way of identifying a bad support process or product pain. An over-performing customer support representative can overcome a bad process or buggy feature to “save the day”, win the customer’s favour, and receive a high CSAT rating. As a result, high CSAT scores don’t reveal whether support processes or product issues need to be addressed.

Mistaken for product satisfaction

Users sometimes rate the product instead of the customer support experience or response. This distorts the survey findings and can lead low scores to be incorrectly attributed to bad customer support. This is also why CSAT and NPS are measured separately.

Score chasing

Teams sometimes make the mistake of chasing CSAT targets. CSAT is a by-product and reflection of the quality of customer support, so teams should set quality benchmarks and work on improving their customer support quality first; CSAT will improve as the quality of responses improves.

Isolated scores = misleading data

Measuring the CSAT score or percentage without also measuring the response rate can give a distorted view of support agent quality. For example, one support agent might have a CSAT score of 100% but a response rate of 10%, while another might have a 96% CSAT score with a 35% response rate. The high-CSAT agent might be providing technically correct answers but not doing a great job engaging with customers, while the other might be great with customers but occasionally rushing their answers or getting them wrong. Measuring both CSAT and response rate lets you tailor your coaching to each agent’s actual gap, whether that gap is between “Yay!” and “Boo!” or between “Yay!” and “Meh”.
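The two-agent scenario above can be sketched in a few lines (the ticket and survey counts here are invented to roughly match the percentages in the example):

```python
# Hypothetical sketch of why CSAT alone misleads: compute CSAT and
# response rate side by side. All counts below are invented examples.
def agent_metrics(tickets_handled, surveys_returned, positive_ratings):
    """Return (csat_pct, response_rate_pct) for one agent."""
    response_rate = 100.0 * surveys_returned / tickets_handled
    csat = 100.0 * positive_ratings / surveys_returned
    return csat, response_rate

# Agent A: perfect CSAT, but only 10% of customers bothered to respond.
print(agent_metrics(tickets_handled=200, surveys_returned=20, positive_ratings=20))
# → (100.0, 10.0)

# Agent B: slightly lower CSAT, but far more customers engaged.
print(agent_metrics(tickets_handled=200, surveys_returned=70, positive_ratings=67))
```

Looking at the pair of numbers rather than CSAT alone makes it obvious that Agent A’s 100% rests on very thin evidence.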

What tools can be used to collect CSAT?

Help desk integrations

Help desk software like Zendesk, Groove, Desk.com, etc. offers in-app CSAT ratings that can be included with each support ticket response, along with in-app CSAT analytics. This makes it easy to collect ratings, tie them back to individual conversations, and run reports against your other help desk stats.

Third-party software

If your help desk provider doesn’t include a way to track CSAT, you can also use third-party software such as Nicereply and Delighted (used primarily for NPS) to gauge CSAT.