Measuring and Managing Encounters of Court Users and Court Employees

Along with a number of client-partners, I’ve been rethinking the way courts gauge the satisfaction of court users and the strength and engagement of their workforce. And, yes, how we can improve the court user – court employee encounter.

Quite a few court leaders and managers still regard these lines of performance measurement and management as “touchy-feely,” “soft,” or “subjective,” despite the overwhelming evidence to the contrary from some hard-nosed researchers and practitioners. But that’s a subject for another time.

Human Sigma

In their book, Human Sigma: Managing the Employee – Customer Encounter (Gallup Press, 2007) – a much expanded version of their groundbreaking article “Manage Your Human Sigma” that appeared in the July/August 2006 issue of Harvard Business Review – Gallup researchers John H. Fleming and Jim Asplund rewrite the rules for how we should measure and improve employee - customer encounters. Though their work is based on extensive research on employees and customers of private companies, Fleming and Asplund have much to say to managers of courts and court systems.

Their work is especially relevant to courts and court systems that assess court user satisfaction and employee engagement using the National Center for State Courts’ CourTools Measure 1, Access and Fairness, and Measure 9, Court Employee Satisfaction, or some variations of these two performance measures.

Although precise counts do not exist, I estimate that today over 100 courts in a dozen states have employed these performance measures. A number of states, including Arizona and Utah, have done so on a state-wide basis. Appellate courts also have gotten into the game. The Oregon Court of Appeals, for example, conducted its first court user survey of members of the appellate and trial bench this time last year and is planning a second survey in the near future. (See a summary of results of the first bench and bar survey on the Oregon Court of Appeals website.)

Six Sigma (from which Human Sigma derives) is a quality management approach pioneered by Motorola twenty-five years ago. It became a genuine management movement that revolutionized the way private companies approached manufacturing (mostly non-human) quality by emphasizing extremely high objectives, systematic data collection, and analysis of results to control variation and improve manufacturing processes (the Greek letter sigma is used to denote variation from a standard or other reference point).

Six Sigma was inspired by quality improvement methodologies like quality control, Total Quality Management (TQM), and Zero Defects. Motorola has reported billions of dollars in savings from using Six Sigma. Along with Motorola, companies that use Six Sigma methodologies today include General Electric, Bank of America, and Honeywell International.

Fleming and Asplund’s Human Sigma applies Six Sigma principles and techniques to sales and service organizations, relying heavily on the measurement of human -- i.e., employee and customer -- engagement.

New Rule

Human Sigma’s Rule # 1 is that employee and customer experiences are not separate entities and, therefore, should be assessed and managed together. Good employees make good customers; you can’t have one without the other, at least not for long.

Because most companies are not organized or prepared to manage these human resources together, Fleming and Asplund suggest that companies may need to reorganize to do so. That’s the organizational part of Rule # 1.

This new rule is especially relevant for the increasing number of courts and court systems that are measuring court user and court employee satisfaction and engagement using Measure 1 and Measure 9 of CourTools, or some variations of them. The message is that we must view both sides of the court user – court employee encounter as interrelated and mutually dependent.

While the four CourTools measures focused on timeliness and expedition of case processing are grouped together as Measures 2, 3, 4 and 5, and their descriptions are cross-referenced, it is revealing that Measures 1 and 9 stand apart and are not cross-referenced. Rule # 1 should apply.

The Court’s Human Sigma

Measure 1, a survey of court users, and Measure 9, a survey of court employees, use a similar metric -- namely, the percent of respondents who agree with statements about their encounter with the court and the court’s treatment of them. Disaggregating the metrics by survey item, respondent demographics, and by characteristics of the encounter (e.g., where it occurred and for what reason) can pinpoint variations in the court user and court employee encounter that may require management attention.
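As a rough sketch of what that disaggregation might look like in practice, the snippet below computes percent agreement from individual survey responses, broken out by division, respondent role, and survey item. The table layout, column names, and data are hypothetical assumptions for illustration only, not part of CourTools.

import pandas as pd

# Hypothetical survey responses: one row per respondent per survey item,
# with agree coded 1 if the respondent agreed with the statement, else 0.
responses = pd.DataFrame({
    "division": ["Civil", "Civil", "Criminal", "Criminal", "Criminal"],
    "role":     ["witness", "juror", "witness", "witness", "attorney"],
    "item":     ["treated_with_respect"] * 5,
    "agree":    [1, 1, 0, 1, 0],
})

# Percent agreement, disaggregated by division, respondent role, and item.
pct_agree = (
    responses
    .groupby(["division", "role", "item"])["agree"]
    .mean()
    .mul(100)
    .round(1)
    .rename("pct_agree")
    .reset_index()
)
print(pct_agree)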

For example, a problem, as well as a good course of action, may be suggested by the following survey results: a relatively low percent of court users in a particular situation (e.g., appearing as witnesses) who believe that they were treated with courtesy, dignity, and respect in a particular court division; and a relatively low percent of court employees in that same division who believe that their colleagues care about high quality. If these results deviate meaningfully from the central tendencies of other divisions or of court users in different situations, an astute manager may identify a problem and a possible solution.

Simply by identifying other divisions or situations with superior results, good managers may be close to a solution to the problem. See Undesirable Variation in Court Performance, Made2Measure, April 23, 2007.
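As a rough sketch of how such a paired comparison might be flagged, the snippet below (again using hypothetical data and column names) compares each division’s agreement rate on a court user item and a court employee item against the court-wide medians and flags divisions that fall well below both -- the kind of paired signal Rule # 1 suggests looking for. The ten-point threshold is an assumption for illustration, not a CourTools standard.

import pandas as pd

# Hypothetical division-level agreement rates for one Measure 1 item
# (court users treated with respect) and one Measure 9 item
# (employees believe colleagues care about high quality).
division_scores = pd.DataFrame({
    "division":             ["Civil", "Criminal", "Family", "Probate"],
    "user_pct_respect":     [88.0, 71.0, 90.0, 86.0],
    "employee_pct_quality": [82.0, 64.0, 85.0, 80.0],
})

THRESHOLD = 10.0  # assumed: flag divisions 10+ points below the median

user_median = division_scores["user_pct_respect"].median()
emp_median = division_scores["employee_pct_quality"].median()

# Divisions where both sides of the encounter lag the rest of the court.
flagged = division_scores[
    (division_scores["user_pct_respect"] < user_median - THRESHOLD)
    & (division_scores["employee_pct_quality"] < emp_median - THRESHOLD)
]
print(flagged)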

The point is that when court user measures and court employee measures are viewed together as interrelated and mutually dependent, problems are more readily identified and solutions more easily found.

For the latest posts and archives of Made2Measure click here.

© Copyright CourtMetrics 2008. All rights reserved.
