Implementation: How It Looks When You Get There
A previous post (Implementing Performance Measurement, November 12, 2005) explored what it takes to implement a court performance measurement system (CPMS). It concluded that even a well-conceived, well-designed CPMS will not necessarily get used unless it is woven into the fabric of a court’s management practices and processes. While such integration with strategic planning, performance-based budgeting, and other formal management processes may be more demanding (more on this will follow in future postings), some relatively simple things can be done with powerful effects.
Two simple techniques that facilitate the implementation of a CPMS are: (1) making the review of the court's core performance results a permanent agenda item on the court’s executive meetings and (2) assigning ownership of core measures to key managers. Consider the following ideal scenario.
Your court has successfully built a CPMS that includes a set of eight core performance measures aligned with the court’s key success factors and strategic goals. Using computer software to display live performance data, court managers can easily access performance data for the core measures, as well as for the various subordinate measures linked to each core measure (see “Step 3. Developing hierarchies or families of measures,” Six-Step Process for Building an Effective Court Performance Measurement System (CPMS), October 15, 2005).
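The idea of a hierarchy or family of measures can be sketched as a simple nested data structure: one core measure with the subordinate measures a manager can drill into linked beneath it. The measure names and figures below are hypothetical, for illustration only, not actual court data.

```python
# Hypothetical sketch of a "family" of measures: one core measure
# with linked subordinate measures a manager can drill into.
# All names and figures are illustrative, not real data.

core_measure = {
    "name": "On-Time Case Processing (% within standard)",
    "value": 88.0,
    "subordinates": [
        {"name": "Civil cases within standard (%)", "value": 91.0},
        {"name": "Criminal cases within standard (%)", "value": 84.0},
        {"name": "Family cases within standard (%)", "value": 89.0},
    ],
}

def drill_down(measure):
    """Print a core measure, then the subordinate measures linked to it."""
    print(f"{measure['name']}: {measure['value']}")
    for sub in measure.get("subordinates", []):
        print(f"  - {sub['name']}: {sub['value']}")

drill_down(core_measure)
```

A live CPMS display would pull these values from the court’s case management data, but the drill-down relationship between a core measure and its subordinates is the same.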
Your executive team has decided to make review of the court's core performance measures a standing item on its meeting agendas. In addition, it has assigned an “owner” to each of the core measures. While owners of measures are all top managers, only two are members of the executive team.
Members of the team expect to address four questions at every executive meeting:
1. Performance Level. What is the current performance level compared to established upper and lower “controls” (e.g., performance targets, objectives, benchmarks and tolerance levels)?
2. Trends. What does performance look like over time? Is it better, worse or flat? How much variability is there?
3. Problem Diagnosis. What happened to make performance decline, improve, or stay the same? What are some credible explanations?
4. Action. What should be done to improve poor performance, reverse a declining trend, or celebrate and recognize good performance (i.e., performance that reached or exceeded upper controls)?
Executive team meeting protocols require that the team first view the few summary screens of the performance measurement display to determine which of the eight measures are out of range of the upper and lower controls. Because executive team members and measure “owners” reviewed the performance measures several times in the days leading up to the executive meeting, Questions 1 and 2 are answered within a few minutes. The lower and upper controls for each core measure allow the executive team members to be comfortable with delving deeper into the data only for the select measures that warrant their attention. By prior agreement, only those core measures that are out of range will be discussed by the executive team, unless issues are raised by the meeting participants (e.g., some recent event has made it likely that a particular measure will soon decline precipitously even though it is currently in control). Those executive team members and managers assigned ownership of measures that are currently out of range -- who already know they will be asked to explain what happened to make performance decline or improve, and what should be done about it -- are not caught by surprise and are well prepared to answer Questions 3 and 4 about “their” measures.
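The screening step this protocol describes -- compare each core measure against its agreed lower and upper controls and flag only the out-of-range ones for discussion -- can be sketched in a few lines. The measure names, control limits, and current values below are hypothetical, for illustration only.

```python
# Hypothetical sketch of the screening step: flag only those core
# measures whose current value falls outside the agreed lower/upper
# controls. All names, controls, and values are illustrative.

measures = [
    # (name, lower control, upper control, current value)
    ("Clearance rate (%)",          95.0, 105.0, 97.5),
    ("On-time case processing (%)", 85.0, 100.0, 82.0),
    ("Average cost per case ($)",  300.0, 450.0, 470.0),
]

def out_of_range(measures):
    """Return the measures whose current value lies outside the controls."""
    flagged = []
    for name, lower, upper, value in measures:
        if value < lower or value > upper:
            flagged.append((name, value))
    return flagged

for name, value in out_of_range(measures):
    print(f"Discuss at executive meeting: {name} (current: {value})")
```

In this sketch, only the second and third measures would reach the executive agenda; the first is within its controls and, by prior agreement, would not be discussed.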
Making performance measurement results a standing item on executive meeting agendas and assigning ownership of performance measures are just two relatively simple techniques for integrating performance measurement into the court’s management practices and processes. Executive team members focus on what’s most important and don’t get bogged down in too much data.
Copyright CourtMetrics 2005. All rights reserved.