STRENGTHENING PERFORMANCE MANAGEMENT
- greenebarrett
In January 2022, a little more than a year before his death at the age of 92, we had the honor of working with Harry Hatry on an Urban Institute report titled “Dos and Don’ts Tips for Strengthening Your Performance Management Systems.” While we’d been writing about this broad topic for years, what made this special was that we were working once again with one of the giants in the realm of performance management.
Contributions to this report were also made by well-known figures in the world of performance: Maria Aristigueta, Don Moynihan, Kathy Newcomer, Jonathan Schwabish and Tim Triplett.
Hatry died on February 20, 2023, and as far as we know this was the last major paper of his incredibly productive life.
In a recent conversation, Chris Mihm, retired managing director of strategic issues at the U.S. Government Accountability Office (GAO), mentioned that he was currently using this report in a performance management course he is teaching at the Maxwell School of Citizenship and Public Affairs at Syracuse University. It struck us that its contents are evergreen – and worth sharing again.
The report was divided into five sections: collecting performance data, analyzing performance data, presenting performance findings, disseminating performance findings, and using performance findings.
Following are our favorite Do’s and Don’ts from each of these categories.

Collecting Performance Data
Do signal to all staff that performance is a top priority for the program.
Do seek input from stakeholders as part of your process for selecting performance measures.
Do make sure that mission statements focus on expected benefits.
Do include both output and outcome indicators in performance measurement systems.
Do identify and track performance indicators for potentially important unintended effects.
Do collect demographic information on each participant so that every performance indicator can be broken out by demographic characteristic.
Don’t assume that regularly surveying participants is costly.
Don’t overwhelm your data users with too many measurements; instead, provide ready access to subgroup breakouts when needed.
Analyzing Performance Data
Do compare the outcome values broken out (disaggregated) by demographic characteristics (such as age group, race/ethnicity, gender, educational level and location), as in the sketch after this list.
Do dig into the outcome data and compare outcomes for similar participant groups by different key service characteristics.
Do compare performance values over time.
Do compare performance values before versus after services to help assess whether changes in service delivery practices have been successful.
Do consider expressing targets as a range of values rather than selecting a single number.
Don’t ask leading questions in participant surveys.
Don’t overemphasize rankings as a substitute for examining the underlying performance values when comparing organizational units.
Don’t use the “number or percentage of targets met” as a significant performance indicator.
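To make the disaggregation and before-versus-after comparisons above concrete, here is a minimal Python sketch. It assumes a simple tabular dataset; the column names (participant_id, age_group, period, outcome) and the values are hypothetical illustrations, not drawn from the report.

```python
# Minimal sketch: breaking out (disaggregating) a program outcome by demographic
# group and comparing before/after values. All columns and numbers are hypothetical.
import pandas as pd

records = pd.DataFrame({
    "participant_id": [1, 2, 3, 4, 5, 6, 7, 8],
    "age_group":      ["18-34", "18-34", "35-54", "35-54", "55+", "55+", "18-34", "35-54"],
    "period":         ["before", "after", "before", "after", "before", "after", "after", "before"],
    "outcome":        [52, 61, 48, 57, 45, 50, 63, 47],
})

# Average outcome broken out by age group and by period.
breakout = (
    records
    .groupby(["age_group", "period"])["outcome"]
    .mean()
    .unstack("period")
)

# Before-versus-after change for each subgroup, not just the overall average.
breakout["change"] = breakout["after"] - breakout["before"]
print(breakout)
```

A breakout like this can reveal subgroups whose outcomes are moving in a different direction than the program-wide average.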
Presenting Performance Findings
Do provide information identifying other departments, agencies or programs contributing to achievement of a given result.
Do identify any significant ways in which the data collection has changed over time.
Do clearly define each performance indicator. Both the data collectors and data users should be able to understand what is being measured.
Do provide information on the extent of uncertainty of any important findings reported.
Do provide explanatory notes in the reports that share the performance data, especially for the important findings, both major negative and major positive ones.
Don’t report differences as statistically significant without also providing information on the size of the difference (see the sketch after this list).
Don’t hide bad news.
Don’t cherry-pick the most favorable results for publication to hype your performance.
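To illustrate the point about significance versus size, here is a minimal Python sketch that reports the raw difference and a standardized effect size (Cohen’s d) alongside the p-value. The two samples are invented for illustration only.

```python
# Minimal sketch: report the size of a difference alongside its statistical
# significance. The sample values below are made-up illustrative numbers.
from statistics import mean, stdev
from scipy.stats import ttest_ind

group_a = [72, 68, 75, 70, 74, 69, 73, 71]   # e.g. outcome scores, site A (hypothetical)
group_b = [66, 64, 69, 63, 67, 65, 68, 66]   # e.g. outcome scores, site B (hypothetical)

t_stat, p_value = ttest_ind(group_a, group_b)

# Size of the difference: the raw gap in means and a standardized effect size
# (Cohen's d, using the pooled standard deviation).
raw_difference = mean(group_a) - mean(group_b)
pooled_sd = ((stdev(group_a) ** 2 + stdev(group_b) ** 2) / 2) ** 0.5
cohens_d = raw_difference / pooled_sd

print(f"p-value: {p_value:.3f}")
print(f"difference in means: {raw_difference:.1f} points (Cohen's d = {cohens_d:.2f})")
```

Reporting both numbers lets readers judge whether a statistically significant difference is also large enough to matter in practice.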
Disseminating Performance Findings
Do provide regular reports (scorecards) on the findings and disseminate them to all interested staff and public officials including frontline workers.
Do provide summaries and highlights for key audiences after each performance report is produced.
Do share findings with the press.
Do make the latest data on the performance indicators readily accessible to program managers throughout the year.
Do ask a staff data analyst to highlight the findings of interest.
Do hold regular “how are we doing” performance reviews with staff, using the findings from the performance measurement data as a starting point for those meetings.
Using Performance Findings
Do use performance measurement data as a major tool for learning about and improving services.
Do follow up with program managers and staff members to discuss performance results.
Do train people at all levels in an organization on how to use performance measurements and what they mean.
Do use performance information to motivate employees and contractors to improve outcomes, even if the rewards are non-monetary forms of recognition.
Do connect with other agencies to tackle complex issues involving multiple agencies and programs.
Do reply to queries about findings.
Don’t assume the program caused notable changes in the outcomes.
Don’t use performance measurements as a “gotcha” exercise.
#StateandLocalPerformanceMeasurement #StateandLocalPerformanceManagement #StateandLocalGovernmentPerformance #StateandLocalGovernmentData #StateandLocalDataAnalysis #CityGovernmentPerformanceManagement #CityGovernmentPerformanceMeasurement #CityGovernmentPerformance #CityGovernmentData #CityGovernmentDataAnalysis #CountyGovernmentPerformanceMeasurement #CountyGovernmentPerformanceManagement #CountyGovernmentPerformance #CountyGovernmentData #CountyGovernmentDataAnalysis #DisaggregatingPerformanceInformation #DisaggregatingPerformanceMeasurement #ReportingStateandLocalGovernmentPerformance #StateandLocalPerformanceMeasurementRecommendations #HarryHatryAndPerformanceMeasurement #StateandLocalPerformanceManagementTraining #DisseminatingPublicSectorPerformanceFindings #UsingPublicSectorPerformanceFindings #PresentingPublicSectorPerformanceFindings #StateandLocalGovernmentManagement #UrbanInstitute #HarryHatry #ChrisMihm #MaxwellSchool #BandGReport #BarrettandGreeneInc