Step 6 – Designing Performance Measurement Display, Dashboards and Scorecard Systems (Part 2)
This is the final installment in a series of articles exploring the Six-Step Process for Building an Effective Court Performance Measurement System (CPMS) first summarized in Made2Measure (M2M) in October 2005. See listing below.
The design of a measurement display system need not wait until the first five design steps have been completed. A rough mock-up of a display of the core performance measures – in the form of a simple hand-sketch, for example – can be produced as early as Step 2 – Identifying Desired Performance Measures. A simple mock-up will facilitate the first five steps by helping potential users imagine what they might see when they interface with the court performance measurement system (CPMS), and it will help to focus and energize the building of the CPMS. Completion of Step 3 – Creating Measurement Hierarchies adds detail to this mock-up with the addition of subordinate measures cascading from the core measures.
The Role of Technology
The last several years have seen significant technological innovation related to the sixth and final step of building a CPMS – the development of performance information displays, dashboards and scorecard systems.
Buy or Build
Courts committed to the development of performance display systems face the question common to court managers and court IT professionals seeking to automate their business processes: whether to buy and adapt a commercially available system or to build one from scratch. Regardless of the choice, a good place to start Step 6 is with a study of the functionality of computer software applications referred to as business performance management (BPM) software, business intelligence (BI) solutions, or simply “performance dashboards,” offered by an increasing number of technology companies including Microsoft, Microstrategy, SAS, SPSS, PerformanceSoft®, Cognos®, and BusinessObjects®, to name just a few.
Although there are many commercially available BPM, BI and performance dashboard applications, at the time of this writing only one performance management application, CourtMetrix, produced by ACS, Inc., is specific to courts. A prototype of CourtMetrix was introduced at the Court Technology Conference (CTC8) in Kansas City, Missouri, in October 2003. It was tested in various courts in Washington State in 2004 and 2005. (Full disclosure: the author is a co-inventor of CourtMetrix.) Several courts, including the Superior Court of Arizona in Maricopa County (Phoenix) and the Multnomah County Circuit Court in Portland, Oregon, and state-wide court systems in at least two states, Arizona and North Carolina, have developed their own performance measurement display systems.
Wayne W. Eckerson, the author of Performance Dashboards – Measuring, Monitoring, and Managing Your Business (Wiley, 2006), believes that the development of BPM, BI, and performance dashboards is the last big market for business computer software. His book is a good reference for court managers thinking about designing performance measurement displays. The appendix at the end of his book provides a comprehensive set of criteria to evaluate performance measurement displays, dashboards and scorecards along several dimensions including design (e.g., web-based, end-user design, strategy maps), analysis (e.g., drill down/up, interactive reports), delivery (e.g., custom output, publishing), administration (e.g., metadata, alerts), infrastructure (e.g., compatibility, usage statistics), and vendor (e.g., service and support, pricing).
Generally, the aim of BPM, BI and dashboard applications is to provide relevant, valid and timely performance information to decision makers at the strategic, tactical and operational levels. The applications typically incorporate a variety of software for capturing, organizing, analyzing, and displaying performance data. Their functionality may include: (a) automatic extraction, transformation, and loading of performance data from various sources (e.g., automated case processing systems, spreadsheets, and packaged applications) on a predetermined schedule; (b) subscription to various levels and sub-levels of strategic, tactical and operational data displays as determined by the court or the individual users; and (c) convenient access to, analysis and management of interrelated performance data organized to allow users to apply numerous attributes (e.g., location, division, type of case) to any performance measure (e.g., clearance rates). Users are able to drill down to lower levels of data breakouts, slice and dice subsets of data for on-screen viewing, and link to other information. Performance displays may include a variety of management and collaboration capabilities (e.g., alerts and notifications) whereby executives and managers can guide the way users communicate and share performance information. Such applications are invaluable for managers and other users who need to understand performance data but who may not have the time or talent for ad hoc queries and analysis of a variety of performance databases.
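The "slice and dice" capability described in (c) can be sketched in a few lines of code. The records, attribute names and figures below are hypothetical, and real BPM applications work against full databases rather than in-memory lists; this is only a minimal illustration of combining attributes with a measure such as clearance rates.

```python
# Minimal sketch: performance records carry attributes (division, case type)
# that users can combine with a measure -- here, clearance rates.
from collections import defaultdict

# Illustrative records: cases filed and disposed, tagged by attributes.
records = [
    {"division": "Family",   "case_type": "custody",     "filed": 120, "disposed": 110},
    {"division": "Family",   "case_type": "support",     "filed": 80,  "disposed": 85},
    {"division": "Criminal", "case_type": "felony",      "filed": 200, "disposed": 190},
    {"division": "Criminal", "case_type": "misdemeanor", "filed": 300, "disposed": 310},
]

def clearance_rates(records, *attributes):
    """Clearance rate (disposed / filed) broken out by any combination of attributes."""
    filed = defaultdict(int)
    disposed = defaultdict(int)
    for r in records:
        key = tuple(r[a] for a in attributes)  # () means court-wide total
        filed[key] += r["filed"]
        disposed[key] += r["disposed"]
    return {key: disposed[key] / filed[key] for key in filed}

# Court-wide rate first, then "drill down" to division-level breakouts.
overall = clearance_rates(records)
by_division = clearance_rates(records, "division")
```

The same function "drills down" further by adding attributes, e.g. `clearance_rates(records, "division", "case_type")` yields one rate per division and case type.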
The opening screen of CourtMetrix, for example, summarizes a court’s current performance information in a single score, the Court Performance Index or CPI, a composite of seven court performance measures based on the CourTools. A green or red triangle and a number indicating whether and how much the CPI has moved are shown alongside the CPI. (Think of the way the Dow Jones Industrial Average is shown on television screens and monitors.) With a few mouse clicks, and by “dropping down” through several screens of progressively more detailed and less aggregated data, a court manager or judge is able to pinpoint the court, division, case type or resources needing attention and to get clues about the root cause of the downturn and possible corrective actions. CourtMetrix enables -- in a matter of seconds -- the critical functions of performance measurement: (1) identifying current performance (establishing baselines); (2) ascertaining trends; (3) determining goals, boundaries, tolerances and exceptions (setting controls); (4) diagnosing problem areas; and (5) supporting planning and management efforts (prediction and forecasting).
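The display logic of a composite index like the CPI can be pictured with a short sketch. This is not the actual CourtMetrix implementation: the scores, the simple averaging, and the function names are illustrative assumptions only.

```python
# Hypothetical sketch of a composite index with a movement indicator,
# here assumed (for illustration only) to be a simple average of seven
# core measure scores, each normalized to a 0-100 scale.

def composite_index(scores):
    """Average of core measure scores (each assumed normalized to 0-100)."""
    return sum(scores) / len(scores)

def movement(current, previous):
    """Return the change and a display arrow, like the green/red triangle."""
    delta = round(current - previous, 1)
    arrow = "up" if delta > 0 else "down" if delta < 0 else "flat"
    return delta, arrow

# Seven hypothetical core measure scores, today versus the previous check.
today = [82, 75, 90, 68, 71, 88, 79]
yesterday = [82, 75, 90, 74, 71, 88, 79]

cpi_now = composite_index(today)  # 79.0
delta, arrow = movement(cpi_now, composite_index(yesterday))
```

A single lowered measure (68 versus 74 above) pulls the composite down, which is exactly the "drop down to find the culprit" workflow the article describes.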
Step 6.1 – Designing the Measurement Display System
The first sub-step in Step 6 is to determine how the display should look and function. Assuming that the display will be computer-based, the initial design task overlaps with Step 6.2, software development, outlined below. The designers should begin by preparing a static mock-up of a graphical user interface (GUI) -- computer screen pages of the core and subordinate measures -- as well as a general description of the ways users will navigate and use the display to monitor, analyze and manage performance data for all the core and subordinate measures.
Regardless of whether a court purchases or builds a performance measurement display system, it must tackle critical issues of system architecture, including data import and export, a calculation engine, graphic publication capability, and a calculated results database for every performance measure. Connectivity to and extraction of relevant data from external databases in various automated systems are challenges likely to occupy court technologists for some time.
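One way to picture the "calculated results database" mentioned above is as a simple table of precomputed measure values that the display layer queries, instead of recalculating from source systems on every screen refresh. The schema, table and column names, and figures below are illustrative assumptions, not a prescribed architecture.

```python
# Illustrative sketch of a calculated results store: one row per measure,
# breakout and period, written by the calculation engine and read by the
# display layer. Names and values are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE measure_results (
        measure  TEXT NOT NULL,
        division TEXT NOT NULL,
        period   TEXT NOT NULL,   -- e.g. '2006-Q3'
        value    REAL NOT NULL,
        PRIMARY KEY (measure, division, period)
    )
""")
conn.execute("INSERT INTO measure_results VALUES ('Clearance Rate', 'Family',   '2006-Q3', 97.5)")
conn.execute("INSERT INTO measure_results VALUES ('Clearance Rate', 'Criminal', '2006-Q3', 101.2)")

# The display layer then queries precomputed values for any breakout.
rows = conn.execute(
    "SELECT division, value FROM measure_results WHERE measure = ?",
    ("Clearance Rate",),
).fetchall()
```

Keeping calculated results separate from the source case-processing systems is what makes the "matter of seconds" response times of a dashboard feasible.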
Step 6.2 – Software Development
Step 6.2 includes not only the actual writing of computer code but also the preparation of requirements and objectives, the design of what is to be coded, as well as testing and confirmation that what is developed has met objectives. It should proceed through successive phases that are familiar to computer software engineers:
1. Identification of required software
2. Analysis of software requirements
3. Detailed specifications of software requirements
4. Software design
5. Software coding
6. System integration and testing
These phases should be coordinated with the steps of building a CPMS. Software engineers should have the freedom to be innovative and to feel ownership of the CPMS. Activities such as defining process and business requirements, documenting user, functional and system requirements, and establishing the top-level architecture, technical approach and system design ideally should proceed in parallel with the initial steps. This is especially critical in the design of the conceptual, logical, and physical structure of the databases, and in the specification of data objects and of the data extraction, management and transfer functions for all the performance measures of the CPMS. An effective, usable database is possible only if its designers have adequately incorporated the results of a full needs and task analysis. The database designer, in addition to programming skills, must understand the target users of the database, its intended functionality in both the short and long term, and all system and human parameters affecting its use.
A Future Scenario
Suppose a quick morning check of a court's performance display featuring the Court Performance Index (CPI) -- a composite of core performance measures like that in CourtMetrix (see above) -- reveals a green, upward-pointing triangle signifying a slight increase in the CPI since the last check. All is well, concludes the court administrator checking the display. She can relax and get a cup of coffee. On the other hand, the coffee may have to wait if the check reveals a red, downward-pointing triangle signifying a downturn in the CPI.
Assume that this particular morning, after she has gotten her cup of coffee, the court administrator decides to click the CPI icon to bring up a more detailed screen displaying the balanced scorecard of seven core performance measures or indices that make up the CPI. Each of the performance measures is displayed with the same “look” as the CPI, i.e., a single score (value) for the core measure, a smaller number indicating a change in the measure (both the value and percent of change), if any, and a triangle (arrow) indicating a downturn or upturn in the measure. She discovers that two of the core measures are down from her previous check of the performance display, and that one core measure -- Employee Opinion, an online survey of court employees posted the evening of the previous day -- is largely responsible for dragging down the court's CPI. A click on the tab for this measure reveals a graph of the measure over time. The court administrator sees that the measure has dropped a fraction of a point below the "control" level, a predetermined level of tolerance marking when some corrective action needs to be taken with respect to the measure. She decides to call the director of human resources, the “owner” of the measure, and ask him to meet her in her office.
Together, the court administrator and human resource director drill down into the tactical levels of the display. The breakouts or disaggregations of the measure by court division on the right side of the screen display results for the Family Division, which happen to be the lowest among all five divisions of the court. It is clear that the Family Division’s Employee Opinion results are largely responsible for the downturn in the measure, while the other divisions' survey results have remained essentially unchanged. By clicking on the Family Division tab, the court administrator and human resource director are able to see yet another screen showing the breakdown of the Family Division survey results, item by item, as well as a graph showing the overall results for the Family Division over the last several quarters. They both learn that the downturn in the Family Division is mostly due to significant decreases in its 63 employees' ratings of the first three of the 20 items of the survey:
· I know what is expected of me at work.
· I have the materials and equipment I need to do my work right.
· I have the opportunity to do what I do best every day.
Less than fifty percent of the Family Division staff agreed or strongly agreed with these statements, far below the overall court averages and below the Family Division’s averages for these items in the past. The dissatisfaction reflected in the results for these three items makes sense to the human resource director, who had studied the data earlier that morning. It is largely due, he surmises, to the fact that the Family Division just moved into new court facilities; it has yet to have all of its computer equipment and files moved from the old location, and things are in turmoil. While the court administrator is not happy with these data, she is gratified that more than four out of five of the Family Division staff still indicate that their mission is important to them, as shown by the results of yet another survey item.
The human resource director expresses his confidence that once the move to the new court building is complete the Family Division staff will become more satisfied -- at least this is what he’s prepared to tell the judges' executive committee. He also knows from the results of previous surveys over the last five quarters that the early responders to the survey tend to be far more negative than the later responders and that the measure is likely to improve in a day or two before the survey is closed. The court administrator decides not to take action immediately but to mobilize her management team in the event that the survey results do not improve as more of the court employees complete and transmit the online survey questionnaires. She hopes that the increase in satisfaction of the later responders in the Family Division might take the overall Employee Opinions measure back above the control level, obviating the need to put the performance measure on the executive committee meeting agenda.
A few more mouse clicks produce screens revealing important additional “metadata” related to this measure including trends over time; the alignment of the measures with the court's management processes such as strategic planning, budgeting, quality improvement, and employee evaluation; related measures; and best practices in the performance area gauged by the measures.
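The "control level" logic at the heart of this scenario can be sketched simply: each measure carries a predetermined tolerance, and any measure that falls below it is flagged, along with its "owner," for corrective action. All names and values below are hypothetical.

```python
# Illustrative sketch of control-level checking: measures that fall below
# their predetermined tolerance generate alerts naming the measure's owner.
# Measures, thresholds and owners here are made up for illustration.

def check_controls(measures, controls, owners):
    """Return alerts for every measure that has fallen below its control level."""
    alerts = []
    for name, value in measures.items():
        if value < controls[name]:
            alerts.append({"measure": name, "value": value,
                           "control": controls[name], "owner": owners[name]})
    return alerts

measures = {"Employee Opinion": 61.4, "Clearance Rate": 98.2}
controls = {"Employee Opinion": 62.0, "Clearance Rate": 95.0}
owners   = {"Employee Opinion": "Director of Human Resources",
            "Clearance Rate": "Court Administrator"}

alerts = check_controls(measures, controls, owners)
# Only Employee Opinion is below its control level in this example.
```

In a real display, such alerts would drive the notifications and the red-triangle indicators rather than be inspected as a list, and the decision whether to act immediately, as in the scenario, remains a management judgment.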
A Cautionary Note
Performance displays are here to stay. They give users an intuitive way to get performance data on a self-service basis. However, the design and development of the performance-related information displays, dashboards and scorecards -- the "how should it look" part, although important -- is only the final step in building a performance measurement system. It is quite possible that early projections of attractive but fanciful performance dashboards, like the awe-inspiring tip of an iceberg, may distract court managers from looking beneath them and building the necessary base upon which the performance information displays must rest. After all, if your court is off course and heading into trouble, it is great fun to imagine yourself at the control panel of a spaceship checking the management equivalent of a flight speedometer, odometer and temperature gauge. The lure of an attractive graphical user interface should not create a false sense of security that all that is left to be done is building a spreadsheet or engaging in some relatively straightforward software development task. As noted by Wayne W. Eckerson, the author of Performance Dashboards: Measuring, Monitoring, and Managing Your Business, an effective performance dashboard “is more than just a screen with fancy performance graphics on it: it is a full-fledged business information system that is built on a business intelligence and data integration infrastructure.”
Previous posts in this series:
- Introduction to the Six-Step Process for the Design of an Effective Performance Measurement System (Part 1), Made2Measure, June 6, 2006
- Introduction to the Six-Step Process (Part 2), Made2Measure, June 12, 2006
- Step 1 -- Assessing Currently Used Performance Measures, Made2Measure, June 17, 2006
- Q & A: Can Step 1 Be Taken At the State-Level, Made2Measure, June 23, 2006
- Step 2 -- Identifying Desired Performance Measures (Part 1), Made2Measure, July 2, 2006
- Step 2 – Identifying Desired Performance Measures (Steps 2.1 and 2.2), Made2Measure, July 10, 2006
- Step 2 – Identifying Desired Performance Measures (Steps 2.3 and 2.4), Made2Measure, July 15, 2006
- Step 3 – Creating Measurement Hierarchies, Made2Measure, July 27, 2006
- Step 4 – Testing, Demonstrating and Developing Measures, Made2Measure, August 07, 2006
- Step 5 – Developing Data Collection and Delivery Timeframes, Made2Measure, August 20, 2006
- Step 6 – Designing Performance Measurement Displays, Dashboards and Scorecard Systems (Part 1), Made2Measure, September 10, 2006
For the latest posts and archives of Made2Measure click here.
Copyright CourtMetrics 2006. All rights reserved.