Fact Checking Politifact: Wrong About Jon Stewart’s Use of the Word “Misinformed”
mis·in·form (mis-in-form) tr.v. mis·in·formed, mis·in·form·ing, mis·in·forms: To provide with incorrect information.
Politifact often does a good service checking the veracity (or lack thereof) of controversial claims made by public figures. But they seem to have encountered some difficulty fact-checking Jon Stewart’s claim that Fox viewers are “the most consistently misinformed media viewers.”
Politifact maintains that Stewart is wrong about Fox viewers, based on the findings of five different polls. Three were conducted by the Pew Research Center: the 2007 Political Knowledge Survey, the 2008 Media Survey, and the 2010 Media Consumption Survey. Two were conducted by the University of Maryland’s Program on International Policy Attitudes (PIPA): 2003’s Misperceptions, the Media and the Iraq War, and 2010’s Misinformation and the 2010 Election.
The three Pew polls measure how informed viewers are. They don’t even belong in the discussion, because they don’t go to Stewart’s point. “Do you know who the Secretary of State is” or “what is the name of the Vice President” are questions that you can answer if you’re paying attention. There is no shortage of people who go glassy-eyed and stupid while staring at cable news, and I’m proud to be one of them. I can feel the lull of Kathleen Parker’s voice shaving points off my IQ. I might be able to tell you who the Secretary of Education is under ordinary circumstances, but freely admit that listening to Bill Bennett drone on about anything is enough to flip the switch on enough synapses that answering any question becomes a challenge. It’s a guilty pleasure for people who don’t smoke pot.
On the other hand, the two PIPA studies measure how misinformed viewers are. That’s a very different yardstick. Listening to Dana Bash may freeze a few neurons in the “off” position, and I may not get the news value that I should out of the segment, but unless she says something that is manifestly untrue I can’t claim to have been misinformed.
Politifact loads the two PIPA studies at the bottom of the story, and dismisses one of them for having prompted the “fiercest counterattack” on methodological grounds. One of the “fierce counterattack” links they supply goes to John Lott, who dismisses it as a “Soros-funded poll” because Soros donated to the Tides Foundation, and the Tides Foundation donated to PIPA. (That’s some impressive methodology right there.) The other goes to Brent Bozell, who dismisses PIPA as “liberal pranksters masquerading as pollsters.” Bet you never saw that one coming.
The first of the PIPA polls, the 2003 Misperceptions, the Media and the Iraq War poll, determined whether people believed the following to be true:
- There were links between Saddam Hussein and 9/11
- There were links between Saddam Hussein and al-Qaeda
- There was evidence of weapons of mass destruction
- There was evidence of chemical and biological weapons use
- World public opinion was favorable regarding the war in Iraq
All of those claims were manifestly false, yet anyone who watched Fox during the run-up to the war can verify that they were repeated endlessly on the network (though Fox certainly wasn’t the only outlet guilty of that sin). The PIPA study identified those who held one or more of these misperceptions, broken down by their primary news source:
Eighty percent of Fox viewers held one or more of these misperceptions.
Although many of the questions from the 2010 survey were less black-and-white, Fox viewers were still more likely to believe that:
- Most scientists do not agree that climate change is occurring
- The stimulus legislation did not include any tax cuts
- The auto bailout only occurred under Obama
- When TARP came up for a vote, most Republicans opposed it
Now you might try to advance the argument that these were broadly held Republican beliefs at the time, and Fox cannot be held responsible for the fact that conservatives embraced them. But as the 2003 PIPA study showed (graph above), Republicans who watched Fox were more likely to hold misperceptions than Republicans as a whole, 54% to 43%. They were also more likely to hold misperceptions than Republicans who got their news from PBS-NPR, who had a rate of 32%.
The way Stewart phrased the comment, it’s not enough to show a sliver of evidence that Fox News’ audience is ill-informed. The evidence needs to show that they are “consistently” misinformed, a term he used not once but three times.
That’s right. He did use the term “misinformed.” Not “ill-informed.”
By Politifact’s own measure, Jon Stewart was right when he claimed that “every study” has found Fox viewers to be consistently the most misinformed, because every study they cite that surveys misinformation draws that conclusion. Politifact is wrong to read a meaning into Stewart’s words that he never claimed, and then use three polls that don’t apply to what he said to measure the veracity of that claim.
Politifact deserves credit for holding people accountable for the claims they make. But in order to be a trusted and reliable arbiter, an organization must also be scrupulous in policing itself. Their post on Stewart needs revisiting, after a quick trip to the dictionary.