Customer Satisfaction was another example of data that was easy to attain. The department utilized a third-party survey company that sent survey invitations (via e-mail) to the customer of every closed trouble call. The vendor collected the responses, tabulated the results, and sent a weekly report to the department; management was given a monthly copy of the results for inclusion in the Report Card. All of the measures we had identified were collectable unobtrusively, with no disruption of the department's workflow. This is not always possible, but it is always a goal.
Since much of the data came from automated systems, it enjoyed a high level of detachment from human error. In the few places where humans did interact with the data, accuracy rested on the team's desire to produce accurate information, a desire reinforced by leadership.
Recall the Metric Development Plan. We realized it was important to identify the source, the data (each component), how and when to collect it, and how to analyze it. In the case of the Service Desk, we took the categories and measures identified for the Service Desk and further broke them down into the data needed, where that data would be found, and how it would be analyzed before being programmed into a software tool for display.
This was our starting point. We identified these measures before the data was collected and analyzed. As with most things, there were other options to choose from. After looking at the analysis, we reevaluated our draft of the measures.
Let's return to the Metric Development Plan:
Purpose statement
How the metrics will be used
How the metrics won't be used
Customers of the metrics
Analysis
Schedules
A Picture for the Rest of Us
Prose
Purpose statement. Our purpose statement was defined for us: how can we communicate to our leadership how healthy our services and products are?
How it will be used. You might think this answer would be simple and obvious; it would be used to answer the questions. In this particular case, it would be used to communicate the health of the Service Desk from the customer's point of view.
How it won't be used. Most people want this to be obvious also, expecting not to have to answer the question. Of course, I had to answer it.
It would not be used to differentiate between analysts. It would not be used for performance reviews. And it would not be used to push the team of analysts to reach different levels of performance or to hit targets.
Customers of the metrics. The customers of this metric (Health of the Service Desk) were first and foremost the Service Desk manager and the analysts. They were the owners of the data, and they were the "rightful" owners of the information derived. Other customers included the director (whom the manager answered to), the CIO, and finally the executive. All of these were customers. Each customer needed different levels of information. The owners (the analysts and their manager) could benefit from even the lowest levels of the data. The director would need to see the anomalies. She would want to know what the causes of those anomalies were. The CIO would want to know about anomalies that required his level of involvement. If the Service Desk needed an upgrade to their phone system, or a new automated tool, it would have to be approved by the CIO. The data would help support these requests.
The CIO would also want to know about any trends (positive or negative) or anomalies that might reflect customer dissatisfaction. Basically, the CIO would want to know about anything his boss might ask about. Most of the time this happened when a group of customers complained about a problem area. The CIO shouldn't hear about the anomaly from his boss.
The same can be said of the executive. If the service's health was below expectations, and it ended up reflecting back on the parent organization, the executive would rightly want to know why and what was being done to make things better (either repeat the positive experiences or eliminate the negative).
Analysis. Besides the planned analysis, the results of the information and metrics would themselves suggest further analysis. With the groundwork laid out, it was time to dive a little deeper. We had to collect the data and analyze it to ensure our initial guesses of the measures we'd use were on target.
Availability
We started with the abandoned call rate for the service. When we looked at the data shown in Figure 9-1, I asked the manager (and the staff) whether it felt right. Did they think the department was unresponsive to the customer? If the rate was higher than expected, was it accurate? If it was accurate, why was it so high?
Figure 9-1. Abandoned call rate.
The manager had heard many times before that abandoned rates were standard measures of performance for call centers. When she looked at the data, she said it "didn't feel right." Not because it cast the department in a bad light, but because she had confidence that her unit was more responsive to the needs of the customer than the rate showed (the data showed that the department was "losing" more than two out of every ten calls).
This prompted us to investigate. We looked at two facets: the processes and procedures used to answer calls, and the raw data the system produced. The process showed that calls that were not answered within two rings were sent to an automated recording informing the caller that all analysts were busy and one would be with the caller shortly. It was telling that the recording provided information about any known issues with the IT services; "the current network outage is being worked and should be back up shortly" is one example. The recording's first 30 seconds conveyed information that may have satisfied the callers' needs.
Upon further inspection, we found that the raw data included the length of the call (initiation time vs. abandoned time). This allowed us to pull another measure, as shown in Table 9-4.
The resulting measure is shown in Figure 9-2. We looked at it compared to the total abandoned rate to see if it told a clearer story.
Figure 9-2. Percentage of abandoned calls less than 30 seconds in duration.
As with all measures, a decision had to be made about how to present it (which graphical representation to use). In the case of Availability, we started with an Abandoned Call Rate in the form of a Percentage of Calls Abandoned. When we added the new measure, we again used a percentage. We could show it in relation to the total, or show only the Percentage of Abandoned Calls with the qualification that "abandoned" was defined as calls abandoned after 30 seconds.
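The difference between the two measures can be sketched in a few lines of code. This is a minimal illustration, not the department's actual tooling: the `Call` record, its field names, and the sample data are all hypothetical, standing in for whatever the phone system's raw export actually contained.

```python
from dataclasses import dataclass

@dataclass
class Call:
    answered: bool        # did an analyst pick up?
    wait_seconds: float   # time from initiation to answer or hang-up

def abandoned_rate(calls):
    """Total Abandoned Rate: share of all calls the caller hung up on."""
    abandoned = [c for c in calls if not c.answered]
    return len(abandoned) / len(calls)

def abandoned_over_threshold_rate(calls, threshold=30.0):
    """Refined Availability measure: share of all calls abandoned only
    after waiting longer than `threshold` seconds."""
    late = [c for c in calls if not c.answered and c.wait_seconds > threshold]
    return len(late) / len(calls)

calls = [
    Call(answered=True,  wait_seconds=8),
    Call(answered=False, wait_seconds=12),  # gave up quickly (wrong number?)
    Call(answered=False, wait_seconds=95),  # waited, then gave up
    Call(answered=True,  wait_seconds=20),
    Call(answered=False, wait_seconds=25),  # hung up during the recording
]

print(f"Total abandoned rate: {abandoned_rate(calls):.0%}")                 # 60%
print(f"Abandoned after 30s:  {abandoned_over_threshold_rate(calls):.0%}")  # 20%
```

Note how the refined measure drops the short-duration hang-ups (wrong numbers, callers satisfied by the recording), which is exactly why it told a clearer story than the raw rate.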
After a year of looking at the measures in conjunction with each other, the department chose to drop the Total Abandoned Rate and use only Abandoned Calls Less Than 30 Seconds. This was a better answer to the question of availability since it would allow for the following:
Wrong numbers
Questions answered/problems resolved by the recording
Customers who changed their mind. (They may have chosen to use the new e-mail or chat functions for assistance. Perhaps their problem resolved itself.) While the assumption that a caller who didn't wait more than 30 seconds was not disappointed by the service was only an assumption, it was believed that this would provide a more accurate account. This would have to be compared with the Speed measures (see the additions to the Speed measures) and the customer satisfaction results, since the customers filling out the survey were ones who stayed on the phone long enough to have their call answered.
With a solid start on Availability, let's look at Speed. Speed started as Time to Resolve, which was known to be a concern with customers. Not that the organization was deficient in this aspect, but the customer cared about how long it took to resolve an issue.
Speed
We started with the open and close times from the trouble-call tracking system. This data required human interaction; each analyst had to be conscientious in his behaviors and adherence to the processes, procedures, and policies established around trouble-call tracking. If the workforce hadn't trusted the manager, I would have had to spend a considerable amount of time ensuring that the workforce understood that the information would not be misused and that it would be in the best interest of the department for the data input into the system to be accurate, regardless of the story that it told. If the analyst "fudged" the data so that it wouldn't "look so bad" or so that it "looked extra good," the information would be flawed and the resulting decisions could be wrong. Since the analysts trusted their manager, I only spent a little time explaining how the measures and information could be used to improve processes, and would not reflect on individual performance. The key to this explanation was consistent with any of the measures and any of the units I worked with.
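The Time to Resolve calculation itself is simple once the open and close timestamps are trusted. Here is a minimal sketch; the ticket records, timestamp format, and the choice to report a mean in hours are all assumptions for illustration, not the department's actual system.

```python
from datetime import datetime

# Hypothetical ticket records: (open timestamp, close timestamp) as an
# analyst might have logged them in the trouble-call tracking system.
tickets = [
    ("2023-03-01 09:15", "2023-03-01 10:45"),
    ("2023-03-01 11:00", "2023-03-02 16:30"),
    ("2023-03-02 08:05", "2023-03-02 08:35"),
]

FMT = "%Y-%m-%d %H:%M"

def hours_to_resolve(opened: str, closed: str) -> float:
    """Elapsed hours between ticket open and ticket close."""
    delta = datetime.strptime(closed, FMT) - datetime.strptime(opened, FMT)
    return delta.total_seconds() / 3600

durations = [hours_to_resolve(o, c) for o, c in tickets]
avg = sum(durations) / len(durations)
print(f"Mean time to resolve: {avg:.1f} hours")  # Mean time to resolve: 10.5 hours
```

The calculation is only as good as its inputs, which is the chapter's point: a fudged close time corrupts the measure silently, so the analysts' honest data entry mattered more than the arithmetic.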
The data, then measures, then information, and finally metrics should not reflect the performance of an individual.
The metrics, moreover, did not reflect on how efficiently the department was run.
What the data, measures, information, and metrics did clearly reflect was the customers' perception. Regardless of what the "truth" was, the department had to address the customers' perception. This is especially true of speed.