This site is operated by a business or businesses owned by Informa PLC and all copyright resides with them. Informa PLC's registered office is 5 Howick Place, London SW1P 1WG. Registered in England and Wales. Number 8860726.
Contact Center Data: Making the Most of Your Stats
Contact centers abound with data. I would argue it is the only department in an organization that knows, down to the minute, what its employees are doing, how they are doing, and what the overall customer experience is at any given point in time. Contact centers succeed or fail by their data on an interval-by-interval, day-by-day basis.
Contact center data comes from a multitude of sources, and that alone is clearly a challenge. Those sources will have different databases, different timeframes for reporting and even use different calculations for similar metrics, such as service level. Understanding which source is best to retrieve the information is crucial. What you are trying to measure, why you are measuring it, and who the data/report is for is vital to know before delving into all the information that is available. It is also important to have a flow chart of the routing of contacts through the system and to recognize how the routing rules can affect the statistics; different designs can achieve similar customer experiences yet produce different statistical results.
Furthermore, it is beneficial to have the ability to test the routing of each contact type to verify the pegging (counting) and timing of that data in the database. I have found over the years that the documentation is not always accurate or complete. Being able to test and verify that data instills confidence in the information provided by the application. A diagram of how the applications are connected and interact, and what data is sent or exchanged, is important for the business analyst and contact center systems support person. They will want to know, for example, that an ACD routing application feeds data into a workforce management (WFM) application; usually there are two feeds — a real-time data feed and an interval-historical data feed.
The various sources of data include, but are not limited to:
- ACD routing application (voice, chat, SMS, email, social, video, dialer)
- IVR application
- WFM application
- Quality management (QA) application, providing voice and text analytics
- Learning management application
- Knowledge management application
- CRM application (sales data)
- Customer surveys – CSAT/Net Promoter Score (NPS)
- Website analytics
There is a glut of data, and along with that glut comes confusion. Since the dawn of contact center data and reports, the recurring questions have been:
- How come X does not add up to Y?
- What is the calculation for service level?
- Why is the service level from my routing solution different from my WFM solution?
- Why is an agent’s chat (or email) time greater than their logged-in time for the interval or day?
- How come agent X has 0 calls answered from 4:00 p.m. to 4:30 p.m. but 10 minutes of talk time?
- Why are there more abandoned calls than calls offered?
- Why does answered plus abandoned not equal calls offered?
- Is the time the customer is put on hold by the agent included in talk time?
- How can I get information about contacts when my center is closed?
- What happens when an agent goes into their Break Not Ready code while still on a customer call? Is that time being collected as soon as they enter it? The break timer is showing on the real-time display.
To answer some of these questions, it is imperative to obtain the vendor’s “data dictionary.” The data dictionary will provide information such as the data structure, what kind of database it is, how to access it for custom reporting, time frames of the data (interval, day, week, month) and what the various data elements mean and what calculations are in use. The data dictionary may be a little cryptic at first; however, with some assistance from the vendor, it will quickly become a go-to resource.
Most applications come with canned reports, and most of those are inadequate for a supervisor or manager. When I have asked vendors why they don’t create more useful canned reports, the common response is that customers all want different things, and that they can always access the database using Crystal Reports or some other report-writing software (and the expertise to go with it) to customize the reports into the form they need. In the SMB market, however, the company often does not have those skills in house, and extra investment is required to get even the basic reports into a useful form. On top of that, it is not just about accessing the database: you have to understand what you are accessing, how that data is stored in its different perspectives and views, where to find the data you are looking for, and how that data is calculated. I believe the vendors could create better canned reports that a majority of customers could start out using.
It is important never to assume anything about contact center statistics, especially when moving from one vendor’s application to another. When data, such as talk time, crosses an interval, what happens?
In the example, a call comes in and is answered at 8:10 a.m. but finishes at 8:20 a.m. That data would look different depending on which system you are looking at: System A prorates the talk time across the intervals the call touches (five minutes in the 8:00 interval, five minutes in the 8:15 interval), while System B pegs all 10 minutes into the 8:15 interval, where the call ended.
Same call, two different systems, and the overall data at the end of the day will be the same. The interval data, however, pegs differently, and because System B puts all of the data for calls that cross an interval boundary into the next interval, it can have an impact on your WFM application and staffing, especially for the opening few intervals of the day. If you were to use System B, you would likely need to adjust your planning accordingly.
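To make the difference concrete, here is a minimal Python sketch of the two pegging behaviors described above: one system prorates talk time across every interval the call touches, the other pegs all of it in the interval where the call completes. The function names, the "System A"/"System B" labels, and the 15-minute interval length are assumptions for illustration, not any vendor's actual logic.

```python
from datetime import datetime, timedelta

INTERVAL = timedelta(minutes=15)

def interval_start(t):
    """Floor a timestamp to the start of its 15-minute reporting interval."""
    return t.replace(minute=t.minute - t.minute % 15, second=0, microsecond=0)

def peg_prorated(answered, ended):
    """System A style: split talk time across every interval the call touches."""
    pegs = {}
    cursor = answered
    while cursor < ended:
        start = interval_start(cursor)
        key = start.strftime("%H:%M")
        chunk = min(ended, start + INTERVAL) - cursor
        pegs[key] = pegs.get(key, timedelta()) + chunk
        cursor = min(ended, start + INTERVAL)
    return pegs

def peg_at_completion(answered, ended):
    """System B style: all talk time lands in the interval where the call ends."""
    return {interval_start(ended).strftime("%H:%M"): ended - answered}

answered = datetime(2024, 1, 1, 8, 10)
ended = datetime(2024, 1, 1, 8, 20)

print(peg_prorated(answered, ended))      # five minutes each in 08:00 and 08:15
print(peg_at_completion(answered, ended)) # all ten minutes in 08:15
```

Note that both functions report the same total talk time for the day; only the interval-level view differs, which is exactly why a WFM forecast built on one behavior can misfire against data pegged with the other.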
Another calculation that varies among vendors is the service-level equation. There are four different formulas used in the industry, and you need to know which one your vendor uses; oftentimes, the routing application vendor will use a different formula than the WFM application vendor. If you don’t agree with the formula a vendor uses, then hopefully your application is flexible enough to let you define your own and use it in the canned reports without having to create custom reports. In most systems, however, I have found that not to be the case.
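As an illustration, here are four service-level variants commonly seen in the industry. These are representative formulas, not necessarily the exact four any given vendor implements, and the names are my own; the point is that the same interval of calls can report noticeably different service levels depending on how abandons are treated.

```python
def service_levels(ans_in, ans_total, abn_in, abn_total):
    """Four common service-level variants (illustrative; your vendor's may differ).

    ans_in    -- calls answered within the threshold
    ans_total -- all answered calls
    abn_in    -- calls abandoned within the threshold
    abn_total -- all abandoned calls
    """
    offered = ans_total + abn_total
    return {
        # Answered in threshold over everything offered
        "sl_all_offered": ans_in / offered,
        # Abandons ignored entirely
        "sl_answered_only": ans_in / ans_total,
        # Quick abandons counted as successes
        "sl_abandon_credit": (ans_in + abn_in) / offered,
        # Quick abandons removed from the denominator
        "sl_ignore_short_abn": ans_in / (offered - abn_in),
    }

# 100 calls offered: 80 of 90 answered within threshold; 10 abandoned, 4 quickly
for name, sl in service_levels(80, 90, 4, 10).items():
    print(f"{name}: {sl:.1%}")
```

With this one interval of data, the four formulas yield service levels ranging from 80.0% to 88.9%, which is why a routing report and a WFM report can disagree while both being "correct."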
As mentioned above, know your audience and the type of data and reports that are most useful to them:
- Graphs are especially beneficial for highlighting trends or comparisons, while rows and columns of data usually just turn people off.
- Directors are interested in big-picture data: forecasted volume versus actual and forecast accuracy, budgeted staff versus actual staff, customer service and NPS scores, accessibility, cost per contact and/or revenue per contact, and overall quality scores.
- A contact center manager needs to review all the metrics the directors get, along with the overall customer journey or experience, IVR utilization and completion, web analytics for customer self-service, staff attrition, and the supervisors’ agent-group statistics: adherence, attendance, average handle time (AHT), quality scores, CSAT, and employee engagement. Beyond these are a myriad of other metrics that can point out areas for improvement and areas of success.
- A supervisor needs data on their own agent group, broken down per agent: adherence, attendance, AHT, and QA across all the modes of customer interaction, plus CSAT, sales, employee engagement, and learning objectives met to date. For supervisors, it is always good to show information on each employee compared to the group, provided they are handling similar types of contacts. As with directors, graphs demonstrate the information better, and with a group of agents, a control chart best illustrates where individuals fall within the group (see graph below).
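For the control-chart idea, here is a minimal sketch using an individuals (XmR) chart, one common way to set control limits around per-agent numbers so that genuine outliers stand out from normal variation. The per-agent AHT figures are hypothetical, and 2.66 is the standard XmR constant applied to the average moving range.

```python
from statistics import mean

def xmr_limits(values):
    """Individuals (XmR) control chart limits: X-bar +/- 2.66 x average moving range."""
    center = mean(values)
    mr_bar = mean(abs(b - a) for a, b in zip(values, values[1:]))
    return center - 2.66 * mr_bar, center, center + 2.66 * mr_bar

# Hypothetical per-agent AHT in seconds for one agent group
aht = [310, 295, 305, 320, 298, 470, 315, 300, 290, 308]
lcl, center, ucl = xmr_limits(aht)
outliers = [v for v in aht if not lcl <= v <= ucl]
print(f"center={center:.0f}s, limits=({lcl:.0f}s, {ucl:.0f}s), outliers={outliers}")
```

Here the 470-second agent falls above the upper control limit, flagging a coaching conversation, while the rest of the group's spread is treated as ordinary variation rather than a performance problem.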
The data available from all these sources is plentiful, yet the information required at each level of management is much more focused. You must know the best source for the data, understand how that data is calculated and what is included in it, and know which data can be compared. The goal is to develop meaningful reports that demonstrate how the organization or individual is doing compared to their goals and objectives, and where improvements could be made. That sounds simple, yet it is not a task for those who don’t enjoy doing a lot of exploration into the reams of information from potentially nine different sources.
Helm is writing on behalf of the Society of Communications Technology Consultants, an international organization of independent information and communications technology professionals serving clients in all business sectors and government worldwide.