Q & A: Adopted, Adapted or Homemade Measures?

Q: Why not simply adopt the set of 10 performance measures prescribed by the CourTools or by other authoritative sources and dispense with the six steps for building a court performance measurement system (see the October 15, 2005, posting, Six-Step Process for Building an Effective Court Performance System (CPMS))? After all, the CourTools are endorsed by the Conference of Chief Justices (CCJ) and the Conference of State Court Administrators (COSCA) (see Resolution 14). Why reinvent the wheel?

A: This is an important question, especially in view of the significant investment of time and resources required to build a court performance measurement system (CPMS). Unquestionably, models such as the CourTools are extremely valuable. No individual court, court system, or justice system that includes courts should proceed with performance measurement without first studying the CourTools -- and none of the dozen or so my colleagues have assisted in the last year did. However, for performance measurement to be effective, potential users must commit to a measure's purpose in the context of the court's goals, values and operations, not just to the measure itself. No two courts are alike. No effective performance measurement system can be built in an organizational and political vacuum. Therefore, no simple plug-and-play adoption is likely to succeed.

Regardless of whether a court eventually adopts, adapts or decides to develop its own unique performance measures, it must answer fundamental questions: Is the measure meaningfully related to what the court's leaders and stakeholders care about? What would a court do in response to the information revealed by the performance measure? Is the measure similar to measures currently used by the court or its justice partners? What required or desired performance information is not currently available to the court? What specific performance measures would provide that information? How does the court select a vital few, instead of a trivial many, performance measures? To what degree should the selected measures be tried and demonstrated before implementation? When and how should the performance data be collected? To whom and how often should it be distributed? In what format and on what schedule should the performance information be conveyed and displayed? The six-step design process simply introduces discipline, conceptual clarity and method to the myriad conceptual, methodological and analytic issues associated with answering these questions.

Alignment of Performance Measurement with Mission

Effective performance measurement must be aligned with the unique guiding ideas of a court. Few would take issue with the need for a court to formulate its own values, purpose, and vision of its future that define its unique organizational "character." While there are methods and techniques for clarifying values, crafting mission statements, and building scenarios for the future, there are no plug-and-play guiding ideas for courts. Every court must create its own guiding ideas. This requirement extends to the identification of performance measures and the development of a system for measuring court performance.

Effective performance measures are those aligned with a court's mission, values and vision, and those compatible with its operational environment. For example, though attractive as a model, the 68 performance measures of the Court Performance Standards reflect distinct guiding principles and civic ideals -- such as a focus on outcomes for citizens to the almost complete exclusion of attention to internal processes -- that may not be compatible with the mission, vision and values of a particular court. The Court Performance Standards are characterized by five performance areas representing the perspective of the citizens served by the courts: access to justice; expedition and timeliness; equality, fairness and integrity; independence and accountability; and public trust and confidence. The perspectives of those who "run" the court are conspicuously absent. Other perspectives or key performance areas -- such as those focused on financial, internal business, innovation, and organizational "learning" factors -- are legitimate alternatives for building a system of performance measurement. The perspectives that best reflect a court's guiding ideas will, to a large degree, determine the selection of performance measures.

Several years ago, in the first version of its performance measurement program, the 19th Judicial Circuit of Lake County, Illinois, relied on the four perspectives of the balanced scorecard approach to identify performance measures: (1) short- and long-term budget impact (What are the real costs incurred versus real costs saved? How are the costs being controlled and allocated?); (2) client/customer satisfaction (Are our customers satisfied with our services? Are they receiving high quality and valuable services?); (3) internal efficiency of operations/processes (Are the program functions/operations the most efficient way of doing things?); and (4) innovation and growth (How has the program enhanced the skills, training and progress of the organization?). The Hennepin County District Court (Fourth Judicial District of Minnesota) in Minneapolis used four perspectives to identify and categorize an initial listing of 37 possible performance measures: (1) customer -- provide access, fair and equal treatment; (2) process -- fair, timely, efficient and effective case processing; (3) human resources -- employee satisfaction, quality staff and diversity promotion; and (4) financial -- joint judicial and administrative management, adequate and stable funding. The perspectives chosen by these courts to shape their selection of specific performance measures include not only that of the citizens who are served by the courts -- the perspective of the Court Performance Standards -- but also the performance areas of concern to those who run the courts.

Compatibility with Strategic Initiatives and Management Processes

Simple adoption is likely to be unsuccessful, and considerable adaptation and homemade solutions are likely to be necessary, because performance measurement must reflect not only a court's unique guiding ideas but also its strategic initiatives, operating environment, and key management processes. Drawing on their experience with performance measurement in the City of Memphis, Joy Clay and Victoria Bass, writing in a 2002 issue of Government Finance Review, note that:

More often than not, implementing performance measurement as an isolated "add on" requirement results in superficial measures and an effort that neither supports improved resource allocation nor ensures that individual priorities are aligned with government-wide priorities. As a result, performance measurement becomes another fleeting management fad of no added value to the organization. Aligning performance measurement with other key management processes, however, institutionalizes performance measurement as a driver of improved management decision-making.

Again, this is not simply a matter of adopting models, though these models can be instructive. Model measures simply may not be compatible with a court's strategic initiatives.

Given that most courts already use dozens, even hundreds, of input, output and outcome measures and indicators, the imposition of a set of model measures "adopted" by the court's leaders and top-level managers may be resisted by court staff. Resistance is likely to be greatest when the adopted performance measures are seen as a new and unnecessary wrinkle on what the court already has in place.

In 1997, Dan Straub and his colleagues helped the California Administrative Office of the Courts to "streamline" the 68 performance measures of the Court Performance Standards to bring them within the capabilities of smaller courts. Four criteria were used to select measures:

Return on investment (ROI). If a performance measure would require more court resources than it would return in useful information, it was modified or eliminated altogether from further consideration.
Managerial relevance. What could the court do in response to the information revealed by the performance measure? If a court would likely do little with the performance data, the performance measure was modified or eliminated.
Existing performance measures. Performance measures of the Court Performance Standards that were similar to measures currently used by the court (e.g., state approved testing of interpreters) were not considered.
Jurisdiction. To be considered, performance measures had to be applicable to the jurisdiction of the court.
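For readers who think in procedural terms, the four screening criteria above can be expressed as a simple filter applied to a list of candidate measures. This is an illustrative sketch only; the measure names, cost and value scores, and data structures are hypothetical and do not come from the California project.

```python
# Hypothetical sketch: screening candidate performance measures with the
# four criteria (ROI, managerial relevance, existing measures, jurisdiction).

def keep_measure(measure, court):
    """Return True if a candidate measure passes all four screening criteria."""
    # Return on investment: drop measures that cost more than they return.
    if measure["cost"] > measure["information_value"]:
        return False
    # Managerial relevance: drop measures the court could not act on.
    if not measure["actionable"]:
        return False
    # Existing measures: drop measures the court already uses.
    if measure["name"] in court["existing_measures"]:
        return False
    # Jurisdiction: drop measures inapplicable to the court's caseload.
    if measure["case_type"] not in court["jurisdiction"]:
        return False
    return True

# Hypothetical candidates and court profile for illustration.
candidates = [
    {"name": "interpreter testing", "cost": 2, "information_value": 5,
     "actionable": True, "case_type": "criminal"},
    {"name": "juror representativeness", "cost": 8, "information_value": 3,
     "actionable": True, "case_type": "jury"},
    {"name": "time to disposition", "cost": 3, "information_value": 9,
     "actionable": True, "case_type": "criminal"},
]
court = {
    "existing_measures": {"interpreter testing"},
    "jurisdiction": {"criminal", "family"},
}

selected = [m["name"] for m in candidates if keep_measure(m, court)]
# Only "time to disposition" survives: interpreter testing is already in
# place, and juror measures fail both the ROI and jurisdiction screens.
```

The point of the sketch is simply that the criteria act as successive screens, so a candidate measure must clear every one of them to be retained.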

Unique Jurisdiction and Organizational Structures

As the California Administrative Office of the Courts recognized, a court's jurisdiction, as well as its organizational and political structure, will dictate the selection and development of performance measures. Courts that include large adult and juvenile probation departments are likely to consider measures of the success of probation essential to an effective performance measurement system. Similarly, family courts with jurisdiction over child abuse and neglect cases are likely to take their cue from the federal Adoption and Safe Families Act and complementary state legislation and formulate performance measures gauging the outcomes of court programs and services for the safety and well-being of children and families. Limited jurisdiction courts that have no, or only a few, jury trials will have little use for performance measures of the fairness and equality of the methods a court uses to compile juror source lists and to draw a venire representative of the total adult population of the jurisdiction.

Copyright CourtMetrics 2005. All rights reserved.
