THE OPINION MAKERS: AN INSIDER EXPOSES THE TRUTH BEHIND THE POLLS by David W. Moore
Part 6
Chapter 5 "Misleading the Public"
Reviewed by Thomas Riggins
In this chapter of his book Moore explains why so many polls contradict one another and generally misrepresent what the American people actually think about major policy issues. One of the problems is that many policy issues are arcane and complex, and many people, if not most, are not following them and do not really know what to think about them. This is a fact, Moore says, "that media pollsters generally do everything in their power to conceal. Rather than allow respondents to freely acknowledge they don't have an opinion, pollsters pressure them to choose one of the available options."
One of the tricks of the polling trade is that vastly different results can be obtained depending on how the questions are designed. This is especially the case, Moore points out, when people are not well informed about an issue and forced-choice questions are presented to them. An example is the polling done by Frank Luntz, a Republican pollster working for Arctic Power, a group that favors drilling in the Arctic National Wildlife Refuge. He reported that drilling was favored 51% to 34%. This came a month after a poll by John Zogby for the Wilderness Society, an anti-drilling group, reported that drilling was opposed 55% to 38%. Zogby presented the issue as an environmental one, whereas Luntz presented it as one of energy independence.
Another example: in 2003 an ABC/Washington Post poll found that Americans opposed sending U.S. troops to Liberia 51% to 41%, while a CNN/USA Today/Gallup poll found that they approved sending the troops 57% to 36%. Moore quotes the editor in chief at Gallup as saying: "Opinions about sending in U.S. troops are therefore very dependent on how the case is made in the questions and what elements of the situation there are stressed to them [the respondents]." The two polls essentially manufactured their respective results, "neither of which," Moore concludes, "told the truth about how little people knew and how unengaged they were from the issue."
The next example concerns the State Children's Health Insurance Program (SCHIP), whereby the federal government helps the states pay for children's insurance (for families not poor enough to qualify for Medicaid but not well off enough to buy their own). Polls were taken in 2007 to see if the public supported this program. CNN found 61% supported SCHIP, CBS found 81%, ABC/WP found 72%, and Gallup found 52% OPPOSED. Why this great disparity? It was "because each poll fed its respondents selected information, which the general public did not have. The real public, where only half of Americans knew anything about the program, wasn't represented in these polls at all."
One last poll. In 2006 Americans were asked, "Should it be more difficult to obtain an abortion in this country?" Pew asked the question twice: its first poll found 66% YES, its second only 37% YES; Harris found 40% YES, and CBS/NYT found 60% YES. It turns out that most Americans are not really informed about this issue, and Moore says that "forcing them to come up with an opinion in an interview meant that different polls ended up with contradictory results."
So what can we conclude regarding media polls that claim to tell us what the American people think? Well, Moore writes that "By manipulating Americans who are ill informed or unengaged in policy matters into giving pseudo opinions, pollsters create an illusory public opinion that is hardly a reflection of reality." In other words, most opinion polls on public policy are junk.
Next time, Chapter 6 "Damaging Democracy."