Step 6 – Designing Performance Measurement Displays, Dashboards and Scorecard Systems (Part 1)
This is the eleventh in a series of articles exploring the Six-Step Process for Building an Effective Court Performance Measurement System (CPMS) first summarized in Made2Measure (M2M) in October 2005. See below for a listing of the parts in this series to date.
If you have reached this final step of the design of a court performance measurement system (CPMS), you’re probably already convinced that your court or court system gains a lasting advantage by providing leaders, managers, and staff -- as well as other stakeholders -- clear and actionable data to monitor, analyze and manage performance. But how are you going to do this? How are you going to ensure that those who need the performance data are able to access it? Performance measurement displays, dashboards or scorecards represent the final output of the design of an effective CPMS.
The Right Place, Time and Manner
It makes no sense to take the steps of designing the right measures and assembling the critical performance information if the data are not delivered to the right people, at the right time, and in the right way. The goal of creating a performance measurement display system is to provide intended users with meaningful information that they can quickly and easily access, assimilate and understand. An effective display enables court leaders, managers and other users to monitor, analyze, and manage the critical processes and activities needed to achieve goals. Computer-based performance displays -- often referred to as performance dashboards or scorecards -- let busy managers and staff view performance measures at a glance, and then move easily through successive levels of actionable strategic, tactical and operational performance information to get the insight they need to solve problems and to improve programs and services.
Step 6 of building a CPMS is the design, development, and implementation of a performance measurement display system -- presentations that allow end users to access, view and use the performance data effectively. This step addresses critical questions about the functionality of the system: Beyond a vital few core performance measures that focus on the strategic level, how far down the hierarchy of measures -- into the tactical and operational levels -- should a user be able to drill? Should the display have only monitoring functions to track exceptions, or should it include analysis and management functions? By what media (e.g., static or dynamic reports) will performance data be delivered to the users? How will the data be displayed (e.g., real-time graphic displays of performance data)? How will users “navigate” through the performance data (e.g., drill-down capabilities, links, and flexible reporting functionality)? How should the display look (e.g., like the dashboard of an automobile)?
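To make the drill-down idea concrete, here is a minimal sketch in Python of one way such a hierarchy of measures might be represented, with a core strategic measure at the top and tactical and operational measures beneath it. The measure names, values and field names are hypothetical, not drawn from any actual court system:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Measure:
    """One node in a hierarchy of performance measures."""
    name: str
    level: str                     # "strategic", "tactical" or "operational"
    value: Optional[float] = None  # latest measured value
    children: List["Measure"] = field(default_factory=list)

    def drill_down(self, depth: int = 1, indent: int = 0) -> None:
        """Print this measure and its subordinates, `depth` levels deep."""
        print(f"{'  ' * indent}{self.name} ({self.level}): {self.value}")
        if depth > 0:
            for child in self.children:
                child.drill_down(depth - 1, indent + 1)

# A core strategic measure with hypothetical tactical and operational
# subordinates beneath it.
timeliness = Measure("Time to Disposition (days)", "strategic", 180.0, [
    Measure("Civil Case Age", "tactical", 210.0, [
        Measure("Pending Civil Cases Over 12 Months", "operational", 412.0),
    ]),
    Measure("Criminal Case Age", "tactical", 150.0),
])

# A user "drills" two levels below the core measure.
timeliness.drill_down(depth=2)
```

How deep the display should let a user drill is then simply a question of what `depth` each class of user is given.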
The Traditional Way – Written or Static Electronic Reports
Performance measurement displays have benefited greatly from technology advances. Organizations are putting a lot of energy into building dashboards and scorecards using a variety of business intelligence (BI) and data integration technologies. Data warehouses have today made the sourcing of performance data less onerous. Technology vendors have developed computer software applications with dynamic performance displays that sit on top of automated management information systems. Electronic presentations on computer screens are replacing paper reports.
Unfortunately, an early problem that still plagues most courts is that the desired performance data are delivered in the form of written or static electronic reports that are not readily accessible to potential users. Consequently, considerable human effort is required to monitor, analyze and manage performance results.
Consider the state of the art in most courts today. Standardized printed or static electronic performance reports, typically limited to case processing data, are first produced by the information technology (IT) division of the court, the IT division of a county or city, or a state administrative office of courts (AOC) with data provided in written form by operational staff. The reports are then assembled, printed and distributed. Weeks or even months may pass between the generation of the performance data and the distribution of the reports. Critical performance data might not reach the attention of potential users until months after the performance data are generated. More often than not, the reports are difficult to understand for all but the most sophisticated users. Finally, if permitted at all, a court manager’s requests for any “special” or ad hoc reports are reviewed and handled by court analysts or statisticians and then passed on to the very busy IT division, where the request may sit unattended for months in an over-filled inbox.
Because it is impractical, if not impossible, to pre-design all the hundreds, even thousands, of permutations of measurement results for even a few core and subordinate measures, the contents of standardized reports are typically predetermined a year or more in advance by a committee distant from the potential court users. The result is too much irrelevant data, delivered too late, in an “unfriendly” format.
A Better Way
There is a better way. Courts today are drowning in data and will succeed to the extent that they are able to harness relevant performance information to make better, more informed and quicker decisions. They will need to develop better performance guidance systems and displays to make real-time decisions on the basis of real-time performance data. The development of such guidance systems is likely to take center stage in the field of judicial administration in the next few years. Although written or standardized electronic reports are useful for many users, they cannot meet the needs of courts that need rapid and flexible self-service access to performance data at the strategic, tactical and operational levels.
Requirements of an Effective Performance Measurement Display
An effective performance display is one that users can readily access, that is easily read and understood, and that is organized for easy navigation among core and subordinate measures. It provides a “line of sight” between core strategic measures and subordinate tactical and operational measures that conveys to everyone what the drivers of success are and provides them with the concrete knowledge of how they contribute to that success.
An overriding design principle in the creation of measurement display systems is that they are simple, yet intuitively informative and useful. The car dashboard metaphor is apt. An instrument panel on a car’s dashboard should not require lengthy inspection. It should yield all the needed performance-related information by a quick scan. Similarly, performance measurement displays should be designed in ways that provide critical performance data on demand and at a glance. Although file drawers and paper reports are still the dominant means of storing and delivering information in many courts, computers are today the dominant means of creating, storing and modifying data. The advent of client/server computing and data warehousing, and the increasing availability of electronic databases and computer software applications that incorporate a variety of technologies for harvesting, monitoring, analyzing, and managing performance data, have made computer-based electronic information the delivery medium of choice. Today performance data displays can be made available on demand by means of computer-based methods and systems.
Easy Access, Good Organization
An effective display design is one that is attractive to its users. A general principle of organization is that the most critical or frequently used performance information should occupy the most dominant or most easily accessible position possible, and that supporting information should be placed peripherally. A user should not have to wade through a vast amount of data to find a needed piece of information. An effective display includes quick and easy ways to sift the exact information needed from the massive amount of information that is available but not needed.
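One way to picture this sifting principle in code: start from all available measures, keep only those that miss their goals, and give the worst performers the dominant position. This is a minimal sketch; the measure names, values and goals are invented for illustration:

```python
measures = [
    {"name": "Clearance Rate", "value": 0.93, "goal": 1.00, "higher_is_better": True},
    {"name": "Time to Disposition (days)", "value": 210, "goal": 180, "higher_is_better": False},
    {"name": "Trial Date Certainty", "value": 0.94, "goal": 0.90, "higher_is_better": True},
]

def shortfall(m):
    """Fraction by which a measure misses its goal; zero or less is on target."""
    gap = m["goal"] - m["value"] if m["higher_is_better"] else m["value"] - m["goal"]
    return gap / m["goal"]

# The worst performers get the dominant position; on-target measures drop away.
exceptions = sorted((m for m in measures if shortfall(m) > 0),
                    key=shortfall, reverse=True)
for m in exceptions:
    print(f"{m['name']}: {m['value']} vs. goal {m['goal']} ({shortfall(m):.0%} short)")
```

The user sees two measures needing attention instead of every number in the system; the supporting detail stays a drill-down away.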
Rapid Assimilation and Comprehension
An effective display should quickly yield all critically needed performance information. Single values of a particular measure should be placed in the context of baselines, the dispersion (range) of the data, controls or exceptions (e.g., performance goals and benchmarks), and trends to give them meaning. This is best accomplished by charts, figures and other graphic displays. You may not know the significance of a measurement in isolation -- an average cost per case of $837, for example. Such a measurement takes on meaning, however, when charted with a baseline value (a blue line), benchmarks (a green line), and trend lines.
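As a sketch of that chart, the snippet below plots invented monthly cost-per-case figures against a blue baseline line and a green benchmark line, matching the color convention described above. It assumes the matplotlib plotting library; the dollar figures are hypothetical:

```python
import matplotlib.pyplot as plt

months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun"]
cost_per_case = [798, 805, 820, 812, 829, 837]  # hypothetical trend data

# The single value ($837) gains meaning from baseline, benchmark and trend.
plt.plot(months, cost_per_case, marker="o", color="black",
         label="Average cost per case")
plt.axhline(790, color="blue", linestyle="--", label="Baseline (prior year)")
plt.axhline(825, color="green", linestyle="--", label="Benchmark (peer courts)")
plt.ylabel("Cost per case ($)")
plt.title("Average Cost per Case in Context")
plt.legend()
plt.show()
```

A glance at such a chart tells the user not just where the measure stands, but where it has been and where it ought to be.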
Line of Sight
In order for performance measures to serve as an incentive -- for measures to be motivational -- there must be a line of sight between the measure and the actions that can be taken by the individual to change the value of the measure. The users of the display should be able to track how their performance contributes to the performance of the court as a whole. This line of sight creates what Bob Frost (Measuring Performance: Using the New Metrics to Deploy Strategy and Improve Performance. Dallas, TX: Measurement International, 2000) calls a “results tracking culture … one of the most powerful competitive advantages your enterprise can have.” Line of sight through the strategic, tactical and operational levels can be created by graphic displays of measurement hierarchies, navigation techniques, definitions, explanations, and references.
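The arithmetic behind line of sight can be this simple: a court-wide measure computed directly from the unit-level measures beneath it, so each division can see exactly how its numbers move the court’s number. The divisions and figures below are hypothetical:

```python
# Unit-level inputs to a court-wide clearance rate (dispositions / filings).
divisions = {
    "Civil":    {"filings": 4200, "dispositions": 3990},
    "Criminal": {"filings": 6100, "dispositions": 6222},
    "Family":   {"filings": 2500, "dispositions": 2300},
}

for name, d in divisions.items():
    print(f"{name} clearance rate: {d['dispositions'] / d['filings']:.1%}")

# The strategic measure is just the sum of its operational parts.
total_filings = sum(d["filings"] for d in divisions.values())
total_dispositions = sum(d["dispositions"] for d in divisions.values())
print(f"Court-wide clearance rate: {total_dispositions / total_filings:.1%}")
```

A Family Division manager looking at this display can see, concretely, that clearing 200 more cases moves not only the division’s rate but the court’s.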
Is Sophisticated Technology Required?
But what about courts, especially small courts, that do not have the money to buy, or the technology resources to build, sophisticated computer-based performance display systems? Are such systems necessary for success? Not necessarily.
The recommendation (see Step 6.1 below) that courts begin their consideration of a performance data display system with a study of commercially available computer software applications (or the efforts of courts like the Maricopa County Trial Courts that are building their own display systems) simply recognizes that computers can be invaluable for collecting, assembling, and delivering critical performance data. There is no single best way to display and communicate performance data. Performance display software is simply an efficient way of doing it that avoids the difficulties of written reports and electronic spreadsheets.

A common reason that courts fail to detect performance problems is that the methods used to report and communicate performance data are poorly designed -- or not designed at all. Potential users of the performance information get bogged down in too much data or too many reports. A typical format for performance reports (usually limited to case data) is a spreadsheet with many columns and rows of mind-numbing figures identified by words, phrases and acronyms that are not easily understood. A midsize court in Arizona routinely prepares a “transmittal sheet” for its monthly statistical report to the state administrative office with a note that the presiding judge “makes no representation that these figures … are completely accurate.” It’s little wonder that important performance data get missed.

Few courts today operate without desktop programs like Excel or Access to organize, analyze and present data for budget and case processing reports. Computer software that sets up a simple, easy-to-read and easy-to-update system to track performance is merely an extension of such programs. Step 6 of building an effective CPMS should not be read as a requirement to buy or to develop sophisticated performance management computer software, but instead as a strong recommendation to look at increasingly available technology to reach farther, faster, wider and deeper than paper reports can.
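To illustrate how small that extension can be, the sketch below reads a spreadsheet export (CSV) of monthly measures and flags any that stray from their goals. The measures, figures and the five-percent exception rule are all invented for illustration:

```python
import csv
import io

# Stand-in for a spreadsheet export; in practice this would be open("report.csv").
spreadsheet = io.StringIO("""\
measure,month,value,goal
Clearance Rate,June,0.93,1.00
Time to Disposition,June,210,180
Collection Rate,June,0.97,0.95
""")

for row in csv.DictReader(spreadsheet):
    value, goal = float(row["value"]), float(row["goal"])
    # Crude exception rule for the sketch: flag anything more than 5% off goal.
    status = "REVIEW" if abs(value - goal) / goal > 0.05 else "ok"
    print(f"{row['measure']} ({row['month']}): {value} vs. goal {goal} -> {status}")
```

A dozen lines of a general-purpose tool already deliver the monitoring function; dashboard software adds the navigation, drill-down and graphic display on top.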
Next in this series: Step 6. Dashboards, Scorecards and Data Display Systems (Part 2)
Previous postings in this series:
- Introduction to the Six-Step Process for the Design of an Effective Performance Measurement System (Part 1), Made2Measure, June 6, 2006
- Introduction to the Six-Step Process (Part 2), Made2Measure, June 12, 2006
- Step 1 -- Assessing Currently Used Performance Measures, Made2Measure, June 17, 2006
- Q & A: Can Step 1 Be Taken At the State-Level, Made2Measure, June 23, 2006
- Step 2 -- Identifying Desired Performance Measures (Part 1), Made2Measure, July 2, 2006
- Step 2 – Identifying Desired Performance Measures (Steps 2.1 and 2.2), Made2Measure, July 10, 2006
- Step 2 – Identifying Desired Performance Measures (Steps 2.3 and 2.4), Made2Measure, July 15, 2006
- Step 3 – Creating Measurement Hierarchies, Made2Measure, July 27, 2006
- Step 4 – Testing, Demonstrating and Developing Measures, Made2Measure, August 7, 2006
- Step 5 – Developing Data Collection and Delivery Timeframes, Made2Measure, August 20, 2006
For the latest posts and archives of Made2Measure click here.
Copyright CourtMetrics 2006. All rights reserved.