My counterparts in the UK have written an article that is a worthy read, "Systems Thinking and the Case Against Benchmarking," which discusses benchmarking in the public sector for housing repairs. In the article, Paul Buxton outlines four conditions that must be met before you can learn from another organization. They are:
- That the other organization is operating in a comparable environment.
- That the other organization’s performance is better than your own.
- That you are able to understand the reasons the performance is better.
- That the lessons learned can be applied to your organization.
Paul points out that there are "significant difficulties" in meeting any of these conditions, and that "all are necessary to be sure that performance is not made worse."
My background in customer service consulting has allowed me to observe that benchmarking either gives an organization a false sense of security (my metrics compare well) or leads to tampering with the system when the metrics are deemed sub-optimal. In the latter case, I have seen an "industry benchmarked standard" for call answer rate of 93.49% while a service organization sat at 85%. Consider the problems with this comparison:
- What is the operational definition of "answer rate"? I came from a background of defining SLAs favorably for the Fortune 500 companies that hired me. What is counted and not counted in that answer rate? If the phone system is down, those calls may not get counted, and there are any number of other scenarios where we exclude things that might make the answer-rate percentage lower. Believe me, data gets manipulated all the time to put companies in a positive light.
- Is my service really better or worse if my answer rate is lower? Besides the operational definition problem, there are other problems. I could be answering 100% of calls and still not be providing good service.
- By taking action, I risk sub-optimization. My pursuit of getting 100% of calls answered may negatively impact other parts of the system, leading to increased total costs.
- Different customer demands than those behind the "benchmarked standard" may affect call answer rate. Every service company has a different set of customers.
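The operational-definition problem in the first bullet is easy to show with arithmetic. The sketch below uses hypothetical call counts (all numbers are made up for illustration) to show how the same day's calls yield two different "answer rates" depending on what is excluded:

```python
# Illustration (hypothetical numbers): how the operational definition of
# "answer rate" changes the reported figure for the same underlying calls.

calls = {
    "answered": 870,       # calls answered by an agent
    "abandoned": 100,      # callers who hung up while waiting
    "system_down": 30,     # calls lost while the phone system was down
}

total = sum(calls.values())  # all calls offered that day

# Naive definition: answered calls over every call offered.
naive_rate = calls["answered"] / total

# "Favorable" definition: quietly exclude calls lost to system outages.
favorable_rate = calls["answered"] / (total - calls["system_down"])

print(f"naive:     {naive_rate:.1%}")      # 87.0%
print(f"favorable: {favorable_rate:.1%}")  # 89.7%
```

Same system, same service, nearly three points of difference purely from the definition. Without knowing which definition sits behind a benchmark, comparing against it is meaningless.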
The command-and-control thinker likes the idea of using the benchmark to create a "standard," and usually the next step is to use that standard as a target. The organization then sets its resources (people, technology, process, etc.) to achieving the "benchmarked" target. The target guarantees neither better service nor reduced costs. There are too many unknowns about what we are comparing against, making benchmarking a waste of resources.
Worse, service companies try to copy competitors when their systems are completely different . . . different people, culture, technology, processes, etc. The result can be disastrous consequences, built on the faulty assumptions that benchmarking promotes.
There is a better way. A service organization can identify measures that relate to purpose and act on the causes of variation. W. Edwards Deming taught us this. What is your system capable of achieving, and what are the causes of variation in your system? The 95 Method (my choice of method) promotes performing "check" to gain knowledge of your system (purpose, measures, demand, flow and value). This systems thinking approach will give you a change strategy that will allow you to achieve business improvement.
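One concrete way to ask "what is my system capable of?" is a process behavior (XmR) chart in the Shewhart/Deming tradition. The sketch below uses made-up daily answer rates (an assumption for illustration, not real client data) and the standard XmR constant 2.66 to compute natural process limits:

```python
# Sketch: an XmR (individuals & moving range) chart, a Shewhart-style tool
# for separating routine variation from special-cause signals.
# The daily answer rates below are made-up illustration data.

rates = [84, 86, 83, 88, 85, 87, 82, 86, 84, 85]  # % of calls answered per day

mean = sum(rates) / len(rates)

# Moving range: absolute difference between consecutive points.
moving_ranges = [abs(b - a) for a, b in zip(rates, rates[1:])]
mr_bar = sum(moving_ranges) / len(moving_ranges)

# Standard XmR limits: mean +/- 2.66 * average moving range.
upper = mean + 2.66 * mr_bar
lower = mean - 2.66 * mr_bar

print(f"mean {mean:.1f}%, natural process limits {lower:.2f}%-{upper:.2f}%")
```

A point outside the limits signals a special cause worth investigating; points inside are routine variation, i.e. the system's current capability. Note that a benchmark target of 93.49% sits outside these limits: no amount of exhorting people will reach it without changing the system itself.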
So, what is benchmarking good for? Absolutely nothing! . . . say it again.
Tripp Babbitt is a speaker, blogger and consultant to the service industry (private and public). He is focused on exposing the problems of command-and-control management and the termination of bad service through the application of new thinking . . . systems thinking. Download the free Understanding Your Organization as a System and gain knowledge of systems thinking, or contact us about our intervention services at [email protected]. Reach him on Twitter at www.twitter.com/TriBabbitt.