
In the first article in this series, I outlined the 11 most common failure modes for IT outsourcing relationships.  These are summarized below for your reference:

  • The vendor over-promises and fails to deliver on its commitments
  • The client fails to exercise proper governance over the vendor contract
  • The vendor underprices the contract and fails to earn a profit
  • The contract fails to align vendor with client goals and objectives
  • Vendor reports contain raw data, but rarely include proper diagnosis
  • The client does not understand the metrics included in vendor reports
  • Both client and vendor view the contract as a zero-sum game
  • Vendors spin data and reports to cast themselves in the most favorable light
  • Continuous improvement is ill-defined or not included in the contract
  • Vendors experience extremely high turnover on a client project
  • Vendors and/or the client do not adequately train personnel

In this sixth installment of the series, I will address a persistent problem with vendor reporting: the disconnect between data and analysis in vendor reports.

Data Without Analysis

ITSM vendors are notorious for conflating data and analysis, or for mistakenly assuming that data, by itself, produces insight.  On both counts they are wrong, and that has serious consequences for their clients and for the industry as a whole.

Today’s ITSM and related systems – e.g., telephony, knowledge, remote control, AI, etc. – can produce an unlimited number of reports with a single keystroke.  These reports can be customized for any timeframe and any format (tabular, line graph, bar chart, pie chart, etc.), and can be packaged with a multitude of colors and layouts to be deceptively beautiful.  How are they deceptive?  Because the reports these systems produce automatically contain almost nothing but raw data.  Yet many consumers of these reports are fooled by the professional presentation, and assume that the reports contain deep insights.

The best way to explain the difference between raw data and insight is by example.  If I read a report that tells me that last month’s First Contact Resolution Rate (FCR) was 76%, Customer Satisfaction (CSAT) was 82%, and the Average Speed of Answer (ASA) was 17 seconds, that is raw data.  It might be interesting, but it is otherwise useless: without context and without interpretation, it is virtually meaningless.  However, if I am presented with a time series of the same data over 12 months, and the report is accompanied by the following analysis, I can do something with it:

  • FCR has been trending downwards, from 82% to 76% over the past 12 months, and is now in the bottom quartile of all service desks.
  • Customer Satisfaction has also been trending downwards from 85% to 82% over the past 12 months, and is now in the third quartile of all service desks.
  • Average Speed of Answer has improved from 66 seconds to 17 seconds over the past 12 months, and is now in the top quartile of all service desks.
  • We know that FCR and CSAT are strongly correlated (a quick way to verify this in your own data is sketched after this list), so it makes sense that CSAT has decreased at the same time that FCR has decreased.
  • The service desk added 20 new staff over the past 12 months. This has reduced the ASA dramatically, but their inexperience has produced a lower FCR, and hence a lower CSAT.
  • The aggressive ASA is an indication that the service desk may be overstaffed at this point.  Essentially, this service desk has traded quality for speed.  Staffing up without adequate training and experience has produced very fast response times, but two critical quality metrics have suffered: FCR and CSAT.
  • The service desk needs to reverse direction, and return to its previous focus on quality.  This may require some remedial training for the service desk analysts.  Additionally, top-quartile performance targets should be established for CSAT and FCR: 94% and 88%, respectively.
  • ASA is relatively unimportant, as customers are far more sensitive to FCR than they are to ASA.  So, it may be appropriate to reduce analyst headcount through attrition, and to focus future hiring efforts on more qualified, more experienced, and better-educated analysts.  The net effect on cost is likely to be a wash, and ASA will increase, but you can expect quality – i.e., FCR and CSAT – to improve dramatically.
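
For readers who want to check these kinds of claims themselves, here is a minimal Python sketch; it is a hypothetical illustration, not MetricNet’s methodology.  It computes the monthly trend in each metric and the Pearson correlation between FCR and CSAT.  The 12 monthly values are invented to match the narrative above (FCR drifting from 82% to 76%, CSAT from 85% to 82%), so substitute the figures from your own vendor reports:

    # Quantify the 12-month trend in each metric, and the correlation
    # between FCR and CSAT.  Monthly values are hypothetical, chosen
    # to mirror the narrative above; use your own vendor data instead.
    from statistics import correlation, linear_regression  # Python 3.10+

    months = list(range(1, 13))
    fcr  = [82, 81, 81, 80, 79, 79, 78, 78, 77, 77, 76, 76]  # percent
    csat = [85, 85, 84, 84, 83, 83, 83, 82, 82, 82, 82, 82]  # percent

    # Slope of the least-squares fit: points gained or lost per month.
    fcr_trend = linear_regression(months, fcr).slope
    csat_trend = linear_regression(months, csat).slope

    # Pearson correlation coefficient between the two quality metrics.
    r = correlation(fcr, csat)

    print(f"FCR trend:  {fcr_trend:+.2f} points/month")
    print(f"CSAT trend: {csat_trend:+.2f} points/month")
    print(f"FCR vs. CSAT correlation: r = {r:.2f}")

With data like this, both trends come out negative and r is strongly positive (close to 1.0), which is consistent with the diagnosis above.  Correlation alone does not prove causation, but a dozen lines of analysis like this is all it takes to move from raw data toward insight, which is precisely why its absence from vendor reports is so hard to excuse.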

Although this very simple example is hypothetical, it is appalling that even this basic level of analysis is missing from almost all vendor reports.  The bottom line is that you cannot act upon raw data, but you can act upon insight.  So, ask yourself two questions the next time you read a vendor report: What does it mean, and what are you going to do about it?  If you cannot answer those two simple questions, your vendor reports are not delivering value.

For an industry that is now almost 50 years old, why does this gap between data and insight persist?  Part of the answer is that clients don’t know any better.  They put too much faith in their vendor, and assume that the data-rich reports produced by the vendor contain insight and analysis.  Moreover, they are not trained to ask the right questions, so the important questions simply don’t get asked.  The second part of the answer is that vendors benefit from this opacity.  The less you know about cause-and-effect relationships, underlying drivers, and acceptable performance targets, the better it is for the vendor.  It means they can deliver sloppy results and avoid accountability, both of which hurt you, the client.

So, what’s the answer?  How do we make the leap from meaningless reports to deep insights?  Good governance is part of the answer, so it may be worth re-reading Part III of this series.  Additionally, you, the client, must be bold and fearless when it comes to asking “Why” and “How”.  Why is CSAT trending downward?  How do you know FCR drives CSAT?  How can you improve FCR?  When can I expect to see improvements?  How will the improvement be sustained?  Where will you find recruits who are qualified to hit the FCR target?  You get the idea.  You must be relentless!  Don’t stop asking questions until you are satisfied with your vendor’s answers.  And then, hold them accountable!

Some Final Thoughts on Data vs. Analysis

The best defense against vendor reports that lack analysis is an educated client.  So, I am going to suggest some additional resources that will bring you up to speed, and give you the tools you need to hold your vendor accountable.

Cause-and-Effect for Service Desk KPIs | IT Service Desk Metrics

The Seven Most Important Service Desk Key Performance Indicators

Introduction to Outsourced IT Service Desk Metrics | 43 Definitions, Formulas & Key Correlations

Unleashing the Enormous Power of IT Service and Support KPIs

With a bit of effort, you can transform your ITSM vendor into a force for good in your organization.  A key part of that transformation involves overhauling the existing management reports to deliver true insights that lead to continuous improvement, and ultimately world-class performance!

Jeffrey Rumburg

Jeff Rumburg is a co-founder and Managing Partner of MetricNet, where he is responsible for global strategy, product development, and financial operations for the company. As a leading expert in benchmarking and re-engineering, Mr. Rumburg authored a best-selling book on benchmarking, and has been retained as a benchmarking expert by such well-known companies as American Express, Hewlett-Packard, General Motors, IBM, and Sony. Mr. Rumburg was honored in 2014 with the Ron Muns Lifetime Achievement Award for his contributions to the IT Service and Support industry. Prior to co-founding MetricNet, Mr. Rumburg was president and founder of The Verity Group, an international management consulting firm specializing in IT benchmarking. While at Verity, Mr. Rumburg launched a number of syndicated benchmarking services that provided low-cost benchmarks to more than 1,000 corporations worldwide. Mr. Rumburg has also held a number of executive positions at META Group and Gartner. As a vice president at Gartner, Mr. Rumburg led a project team that reengineered Gartner’s global benchmarking product suite. And as vice president at META Group, Mr. Rumburg’s career was focused on business and product development for IT benchmarking. Mr. Rumburg’s education includes an M.B.A. from the Harvard Business School, an M.S. magna cum laude in Operations Research from Stanford University, and a B.S. magna cum laude in Mechanical Engineering. He is author of A Hands-On Guide to Competitive Benchmarking: The Path to Continuous Quality and Productivity Improvement, and has taught graduate-level engineering and business courses.
