Metrics to create winning customer and agent experiences

How the world's best cloud contact centers are turning to CX & AX observability to delight customers.

Let’s be honest - measuring actual CX/AX is tricky. With so many moving parts in a modern contact center service, it is difficult to know what to monitor, what matters, and what doesn’t. Even with an ocean of data available to us, how do we know whether the customer actually had an exceptional experience - or, more to the point, when they did not?

You can't improve what you don't measure

CX observability is all about measuring the interaction across the customer journey to detect areas that need improvement. Operata makes this part simple.

Customer Experience (CX) is the sum of its parts… lots of parts…

Delivering and operating a modern contact center service is complex: CCaaS providers, cloud infrastructure, corporate networks & VPNs, work from home agent environments, PCs, ISPs, softphones, integrations, headsets, virtual desktops - the list goes on. Any one of these can contribute to, or cause, a poor customer experience. For instance, an agent using the incorrect microphone/headset could result in an inaudible conversation.

The Agent Experience (AX) is a leading indicator of Customer Experience.

Unhappy agents = unhappy customers. This is not a new equation; however, good customer experience is often prioritized over good agent experience. Monitoring AX gives you a leading indicator of customer experience - for example, an agent frustrated by technical difficulties while speaking with customers. So how can you effectively measure CX - and AX - in your organization?

1. Ditch the MOS score as the sole indicator of CX/AX.

Your cloud contact center service may report on Mean Opinion Score (MOS); however, MOS in isolation is a poor measure of customer experience. While it can indicate a problem network, it reflects only three network elements of an interaction: jitter, packet loss, and latency. Furthermore, it is generally consumed as an average and ignores other problematic factors such as audio levels or agent behavior.
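To make concrete how narrow MOS is, here is a common simplified E-model approximation (a sketch based on ITU-T G.107, not Operata's or any vendor's actual implementation) that estimates MOS from just those three network metrics. Notice that nothing about audio levels, one-way speech, or agent behavior can influence the result:

```python
def estimate_mos(latency_ms: float, jitter_ms: float, packet_loss_pct: float) -> float:
    """Approximate MOS via a simplified ITU-T G.107 E-model.

    Only network metrics appear here: audio levels, one-way speech,
    and agent behavior are invisible to this score.
    """
    # Fold jitter into a single effective delay figure.
    effective_latency = latency_ms + 2 * jitter_ms + 10.0

    # The R-factor starts near 93.2 and is penalized for delay...
    if effective_latency < 160:
        r = 93.2 - effective_latency / 40.0
    else:
        r = 93.2 - (effective_latency - 120.0) / 10.0

    # ...and for packet loss, then clamped to the valid range.
    r -= 2.5 * packet_loss_pct
    r = max(0.0, min(r, 100.0))

    # Map the R-factor onto the 1-5 MOS scale.
    return 1.0 + 0.035 * r + 7e-6 * r * (r - 60.0) * (100.0 - r)
```

A call on a clean network scores around 4.4 with this formula even if the customer was dropped mid-request after a long queue wait - which is exactly the blind spot described below.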

In order to get a more holistic understanding of the CX/AX, it is important to consider the following:

  • The percentage of hold, mute, and talk time during the interaction
  • Audio levels (i.e., could the customer hear the agent, and vice versa?)
  • Errors and PC performance issues that impact the agent's ability to serve the customer
  • Browser versions (older browser versions may affect WebRTC performance)
  • One-way speech (the agent could not hear the customer, or vice versa)
  • Multiple softphones contending
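As an illustration of combining these signals alongside MOS, a rule-based health check might look like the following. The thresholds and field names are hypothetical, chosen only to show the idea; they are not Operata's scoring model:

```python
from dataclasses import dataclass

@dataclass
class Interaction:
    mos: float                  # network-quality score, 1-5
    hold_pct: float             # % of the call spent on hold
    mute_pct: float             # % of the call spent on mute
    agent_audio_dbfs: float     # agent mic level (0 dBFS = full scale)
    customer_audio_dbfs: float  # customer audio level
    one_way_speech: bool        # only one party was audible
    softphone_errors: int       # softphone errors during the call

def flag_issues(call: Interaction) -> list[str]:
    """Rule-based interaction health check; all thresholds are illustrative."""
    issues = []
    if call.mos < 3.5:
        issues.append("degraded network quality")
    if call.hold_pct > 40:
        issues.append("excessive hold time")
    if call.agent_audio_dbfs < -40 or call.customer_audio_dbfs < -40:
        issues.append("inaudible party (low audio level)")
    if call.one_way_speech:
        issues.append("one-way speech")
    if call.softphone_errors > 0:
        issues.append("softphone errors")
    return issues
```

The point of the sketch: a call with a perfect MOS can still raise flags - one-way speech, an inaudible party - that MOS alone can never surface.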

Operata continuously observes these factors alongside MOS to give superior insights into the actual customer experience.

An excellent example of where MOS fails the CX measurement test is an interaction where the customer is disconnected midway through their request after queuing for an above-average length of time. The MOS on this call would be perfect, but the CX and AX were terrible.

Remember, exceptional CX is the sum of its parts and not one measure.

2. Consider which metrics are the best representation of CX & AX in your organization.

Business users may find it challenging to interpret what, for example, a “High Softphone Error Rate” means to CX. Equally, a technical operations team member won’t necessarily know what a CSAT score is or what to do with it. Let’s not forget the agents - showing them a live MOS score is somewhat unhelpful if they can’t do anything about it!

When evaluating a service monitoring platform like Operata, consider the needs of both technical and non-technical users - after all, improving CX and AX is a team effort.

Operata provides default dashboards with mission-critical measures for various users: IT Operations, DevOps, CXOs, supervisors, and everyone in-between.

3. Listen and respond to feedback from your agents.

Agents are the eyes and ears of your organization and play a vital role in assessing CX. Yet agents often lack the tools to quickly and effectively communicate with the service management teams who look after the contact center technology.

Adequately equipped, agents are ideally placed to provide real-time feedback on issues impacting CX and AX, such as:

  • Poor sound quality
  • Missed calls
  • Softphone failures
  • One-way audio
  • Call disconnections
  • Audio delays
  • User experience issues

For newly implemented services, this gives the delivery team immediate feedback on overall system health as perceived by end-users. For services already in operation, it provides an excellent summary of how satisfied your agents are with the performance of the service.

Operata provides a discreet, user-friendly widget that allows agents to report these issues quickly and efficiently to get back to what they do best: delighting customers.

Remember, happy agents = happy customers!

Measuring CX can be a difficult task, but it need not be!

  • Consider your audience and what metrics help them create winning customer and agent experiences.
  • Ditch the MOS score as the sole measure of customer experience; instead, consider it one of many contributing factors.
  • Listen to your agents - their insights and feedback are one of the most powerful and simple indicators of CX/AX.

About Operata

At Operata our mission is to help businesses have quality conversations. It's about a better connection. We are doing this by building the world's first experience management platform for cloud contact centers. Operata customers include leading insurers, telcos, banks, and MSPs.

Sign up for a 14-day free trial here

GET YOUR FULL COPY OF THE WHITEPAPER

Article by John Mitchem, published April 26, 2022, in CX.