Back in 2020, then-President Donald Trump proclaimed that “The murder rate in Baltimore and Detroit is higher than El Salvador, Guatemala or even Afghanistan.” That statement was misleading and part of it was outright false, but even beyond that, he left out the fact that reported homicides in Detroit were near 50-year lows.
Currently, Detroit has the third highest homicide rate in the country, according to World Population Review, which is still an unfortunate state of affairs. But look at the trends and a new picture emerges. According to the city’s data, it “finished 2023 with 252 homicides, the fewest recorded since 1966.”
Most experts would agree – and Detroit is a perfect illustration – that any single point of data can be misleading if it’s not put into a broader framework, often with the use of trend lines. As Ron Holifield, CEO of Strategic Government Resources, told us, “When you’re just looking at a single piece of data without context, it’s like looking through a peephole without seeing the entire room. Under the worst of circumstances that leads to a false and misleading perception.”
It’s certainly easy for reporters to take a single point of data from a recent year and turn it into a headline (either positive or negative). But historical perspective changes a single piece of information into something that’s genuinely informative.
Says Liz Steward, the vice president of marketing and research at Envisio, a strategy and performance management software company: “Only sharing point-in-time data can be worse than providing no data at all because showing an individual number can minimize a very big problem or exaggerate one.”
Sometimes, it’s in the interest of a reporter or an advocacy group to avoid looking further than a single number and use it as representative of a full story. “If you see a number that supports your argument it might be easier to just take it, without digging deeper,” according to Sam Gallaher, head of data science at Third Line, an audit and financial management software company. “It’s definitely a challenge in doing research and being open to numbers that challenge your hypothesis. It takes some real effort to get past that.”
On the flip side, digging a little deeper into statistical history can turn a bad news story into a good one. Entities that understand this and make a point of it can help the press to get the story right. We developed a deep understanding of this in the years that preceded our work on the Government Performance Project. As we’ve recalled in this space, “In the early 1990s Alabama’s leaders took a very poor grade in our evaluations of state government management capacity for the long-defunct Financial World magazine and compared it to our prior – and even worse – evaluation. The state got some very positive reports in the local press by pointing to the improvement, with promises of more to come.”
It's worth noting, however, that simply showing information one or two years back can have the perverse effect of misleading people when that recent data was itself anomalous. For example, comparing data in the last year or so to that which was accurate during the depths of the pandemic can lead to misunderstandings. As a result, many data-wise organizations are comparing current data to that which was generated pre-pandemic. For example, when the Pew Charitable Trusts examined employment rates last summer, it compared first quarter 2023 numbers with those from early 2020.
As Mike Maciag, a policy researcher and former data analyst, told us, “I’m sure you guys have come across dashboards, where they show information compared to the prior year, which is better than nothing. But snapshots are snapshots, and you’re comparing things to a point in time and that can be misleading when the prior year was abnormal.”
While space limitations may prevent many sources of data from featuring a table that shows ten years of prior data, there is an alternative that can help: compare current-year data to a five- or ten-year average.
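For readers who maintain their own dashboards or spreadsheets, the comparison described above is simple to compute. Here is a minimal sketch in Python; the annual counts and the function name are hypothetical illustration values, not real city data.

```python
# Sketch: compare the most recent figure to a multi-year average,
# rather than to a single (possibly anomalous) prior year.
# All numbers below are hypothetical, for illustration only.

def vs_multiyear_average(history, current, years=5):
    """Return the average of the last `years` data points and the
    percent change of `current` against that average
    (a negative value indicates a decline)."""
    baseline = history[-years:]              # last `years` observations
    avg = sum(baseline) / len(baseline)      # multi-year average
    pct_change = (current - avg) / avg * 100
    return avg, pct_change

annual_counts = [320, 305, 310, 290, 275]    # hypothetical prior years
avg, pct = vs_multiyear_average(annual_counts, current=252)
print(f"5-year average: {avg:.0f}, change vs. average: {pct:+.1f}%")
# prints: 5-year average: 300, change vs. average: -16.0%
```

A single-year comparison (252 vs. 275) would show a smaller drop; measuring against the five-year average smooths out one-off spikes or dips in the baseline.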
While there’s no control over how the press or social media outlets use data, state and local governments can help to keep the public better informed by making it easier for others to get a reasonable understanding of its meaning. “Reporters might, if they have time, go back and look at trend lines,” says Maciag. “But a lot of times that’s difficult to find.”
Cities, counties and states that produce well-wrought publicly available dashboards can help overcome that challenge. Take Corona, a city of about 166,000 in Riverside County, California. Its dashboard shows point-in-time data for a number of key performance indicators, but very clearly directs users to historical data.
For example, average response time to a fire there in the most recent quarter was four minutes and 53 seconds. Was that good? Bad? Indifferent? Taken on its own, this number lacks meaning. But at the click of a button you can see that eight quarters ago, it was five minutes and 10 seconds, and the trend line shows that though there have been ups and downs, the fire department has been bringing that number down steadily.
In the final analysis, Nate Silver, author of “The Signal and the Noise: Why So Many Predictions Fail But Some Don’t,” had it just right when he wrote, “Data is useless without context.”
#StateandLocalGovernmentData #StateandLocalPerformanceMeasurement #CityData #DetroitHomicideRate #StrategicGovernmentResources #RonHolifield #Envisio #CityCrimeData #CityTrendData #DataTrends #DataContext #LyingWithStatistics #CityDataWithoutContext #PublicSectorDashboards #MisleadingData #MisleadingCityData #LizSteward #ThirdLine #PewCharitableTrusts #MikeMaciag #ElizabethSteward #FinancialWorldMagazine #GovernmentPerformanceProject #StateGovernmentEvaluations #DataSnapshotsvsTrendLines #CityofCoronaCA #RiversideCounty #DashboardBestPractice #NateSilver #TheSignalandtheNoise #CityGovernmentPressCoverage #StateandLocalGovernmentManagement