
B&G REPORT.


Guidelines for Building a Potent Performance Management System



A few weeks ago, we were honored to join Harry Hatry, a pioneer of performance management and a distinguished fellow at the Urban Institute, and Batia Katz, also of the Urban Institute, in co-authoring a paper called “Do’s and Don’ts: Tips for Strengthening Your Performance Management Systems.”


The paper also included contributions from a formidable group of performance experts: Maria Aristigueta, dean of the Biden School of Public Policy and Administration at the University of Delaware; Don Moynihan, McCourt Chair at the McCourt School of Public Policy at Georgetown University; and Kathy Newcomer, professor in the Trachtenberg School of Public Policy and Public Administration at the George Washington University.


We thought we’d whet your appetite for the paper with a handful of our favorite items from the “Do’s” category. They aren’t taken verbatim from the paper; we’ve edited many for length.


And by the way, we’ll be joining Harry and Maria in a webinar about the paper, sponsored by the Center for Accountability and Performance (CAP) of the American Society for Public Administration (ASPA), on March 1 at 1:00 Eastern Time. You can register for it here: https://register.gotowebinar.com/register/2151968976437873676?source=TW


Collecting Performance Data


Do seek input from stakeholders as part of your process for selecting performance measures to track. Stakeholders are likely to include frontline employees (those who serve your program participants); special interest group members; elected officials; and, especially, participants in each relevant demographic group.


Do make sure that mission statements focus on expected benefits. What benefits or outcomes are sought from the program, and for whom? What negative outcomes does the program seek to avoid or alleviate? Too often, mission statements identify only how benefits will be achieved without clarifying what benefits are sought.


Do include both output and outcome indicators in performance measurement systems. Select the measures used to track the performance of new programs at an early stage of program development. Defining and sharing these measures gives those with responsibility for the program clear guidance about what is expected of them.


Analyzing Performance Data


Do compare outcome values broken out (disaggregated) by demographic characteristics, such as age group, race/ethnicity, gender, educational level, and location. This is of major importance in identifying service equity issues and in pinpointing service procedures that could boost outcomes for particular demographic groups. Identify and highlight in performance reports any unexpected issues these breakouts reveal.
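For readers who handle the data themselves, here is a minimal sketch of what such a breakout might look like, written in Python with the pandas library. The column names and figures are invented for illustration and aren’t drawn from the paper.

```python
# Illustrative sketch only: breaking an outcome measure out by demographic group.
# The column names and values below are hypothetical.
import pandas as pd

# Each row is one program participant and whether the desired outcome was achieved.
records = pd.DataFrame({
    "age_group":      ["18-24", "25-44", "25-44", "45-64", "18-24", "45-64"],
    "race_ethnicity": ["Black", "White", "Hispanic", "White", "Hispanic", "Black"],
    "outcome_met":    [1, 1, 0, 1, 0, 1],
})

# Overall outcome rate, then the same rate disaggregated by each characteristic.
print("Overall rate:", records["outcome_met"].mean())
for column in ["age_group", "race_ethnicity"]:
    breakout = records.groupby(column)["outcome_met"].agg(["count", "mean"])
    print(f"\nBy {column}:")
    print(breakout)
```

Large gaps between the group rates and the overall rate are exactly the kind of unexpected finding worth highlighting in a performance report.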


Do compare the performance values over time. Look for trends that indicate the need for action.


Do compare the actual values of performance indicators with the targets that were set for them. Targets serve both as an accountability tool and as a way to motivate program staff.
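Along the same lines, here is a short hypothetical sketch (again in Python with pandas, using made-up indicators and numbers) of one simple way to line actual values up against targets, keeping in mind that for some indicators lower values are better.

```python
# Illustrative sketch only: comparing actual indicator values with their targets.
# Indicator names, targets, and actuals are invented for the example.
import pandas as pd

indicators = pd.DataFrame({
    "indicator": ["Avg. fire response time (min)", "Job placements", "Inspections completed"],
    "target":    [6.0, 500, 1200],
    "actual":    [6.8, 540, 1100],
    # For response time, a lower value is better; for the others, higher is better.
    "lower_is_better": [True, False, False],
})

indicators["variance"] = indicators["actual"] - indicators["target"]
indicators["met_target"] = indicators.apply(
    lambda row: row["actual"] <= row["target"] if row["lower_is_better"]
    else row["actual"] >= row["target"],
    axis=1,
)
print(indicators[["indicator", "target", "actual", "variance", "met_target"]])
```

A simple table like this, updated regularly and shared with program staff, supports both purposes named above: accountability and motivation.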


Presenting Performance Findings


Do identify any significant ways in which data collection has changed over time. Otherwise, users may be misled by year-to-year changes that are attributable not to real-world improvements or declines but simply to changes in the way the data have been collected.


Do clearly define each performance indicator. Both the data collectors and the data users should be able to understand what is being measured. For example, fire departments can measure response time from the moment a call comes in until trucks arrive at the scene, or, alternatively, from the moment the trucks leave the station.


Do tell stories that illustrate data’s meaning and importance. Numbers alone will only communicate effectively to readers who enter a document with curiosity. Real-world anecdotes will engage a far larger audience.


Disseminating Performance Findings

Do share findings with the press. One common complaint is that performance information gets attention only when it is negative. That can be counteracted only with a proactive approach. One key to getting attention in the press is to provide information that runs contrary to common assumptions.


Do make the latest data on the performance indicators readily accessible to program managers throughout the year. As issues arise during the year, the latest performance data should be available to managers to use in addressing those issues.


Do provide summaries and highlights to report audiences after each performance report is produced.


Using Performance Findings


Do reply to queries about findings, even if they are critical in nature. If it turns out that a query challenges findings in a way that could raise some doubts, it’s worth acknowledging that. Trust and credibility grow when room for doubt is acknowledged.


Do periodically review the performance measurement process and update it as needed. Is it tracking the right things? Are the data collection procedures producing data of sufficient quality? Is the information valuable enough to justify a measurement’s added cost? Are the performance findings clearly presented? Has the information reached everyone who can use it?


Do unleash the full power of performance data, not only through regularly published reports, but also at other opportunities throughout the year. Use the performance measurement process to help address issues as they arise. This will enable decisions to be made with the latest available performance data. It will also enable program managers to obtain performance information tailored more closely to the particular issue at hand.
