DO NOT BELIEVE EVERYTHING YOU READ
- greenebarrett

We’ve just come across a study by WalletHub titled “Best and Worst States to Be a Police Officer.” This is just the kind of study we like to follow for use on this website. In it, California was ranked as the best.
Then we took a moment to reflect on the findings of this research, and it struck us that it may not make sense to rank the best places to be a police officer by state when this is really a local issue. It could well be terrific to be an officer in one city and a nightmare of a job elsewhere, particularly in a state like California, where there’s such huge variation from place to place. So, though WalletHub clearly outlined its criteria (for which it gets credit), giving a little thought to its findings made us somewhat more dubious.
This concern matters to us because, over the years, we’ve happily written about scores of reports (probably hundreds) focused on broad areas of state and local government, like budgeting, human resources, performance management, infrastructure and so on. And we’ve been involved in creating some of these, too.
All of this made us reflect on the ways in which we try to determine the value, validity and common sense of reports about state and local governments.
Some time ago, we wrote about the reasons that we distrust some of the studies and reports that we read because of hidden flaws. Sometimes, these are just sloppy errors. Anyone can make a mistake. But the warning signs have remained consistent. We often spot them when we probe for more information. For example, our suspicions escalate:
· When a study focuses on a common problem in a government program, but the author can’t produce a concrete example or answer basic questions we pose when we call.
· If the data chosen to support a report’s findings is too old to logically represent current conditions or if a survey sample is too small to extrapolate a credible conclusion.
· When a report on a controversial topic includes only the supporting evidence on one side of an issue, ignoring a stockpile of facts that take the other side. (We’re not looking for an “on the one hand; on the other hand” approach, but researchers owe it to the audience to acknowledge a contrary point of view.)

In a blog post that we wrote more than a year ago, we recalled a time when we wrote and co-produced a documentary about Walt Disney. For that project, we interviewed 77 people. Many knew Disney personally. Others were film and Disney company historians.
In a taped conversation, one of them told us that Disney’s father had never had any success in life. We pointed to information we knew that contradicted this point, and the historian said the following: “Yes, that may be true, but it doesn’t fit into the theme.”
Years ago, one of our editors (whom we’ll not mention by name, for obvious reasons) complained that we “hadn’t come to our conclusions” before we embarked on reporting. We were younger at the time and lacked the courage to argue that this approach was ridiculous. Fortunately, it didn’t take long to learn that it was also relatively rare.
Most of the academics and journalists we know do try to be fair and balanced in their work. But we are watchful for situations in which that’s not the case.
And that’s a problem: when other researchers rely on misleading published narratives, the false conclusions can be repeated until everyone believes they’re true.
#StateandLocalManagement #StateandLocalGovernmentPerformance #StateandLocalManagementResearch #StateandLocalPerformanceResearch #StateandLocalGovernmentReports #StateandLocalReports #StateandLocalStudies #StateandLocalResearch #StateandLocalResearchFlaws #WaltDisneyDocumentary #WaltDisneyManBehindTheMyth #StateandLocalGovernmentData #StateandLocalDataQuality #UnbalancedResearchReports #StateRankingFlaws #QuestionableSurveyResearch #ReportBias #MissingData #OldData #WalletHub #BandGReport #BarrettandGreeneInc