Monday, October 27, 2008


Part 9: the last installment

Chapter 8 "A New Direction"

Reviewed by Thomas Riggins

Moore tells us the big problem with the polls, which the pollsters themselves know, is that the polls are deliberately designed NOT to reveal what the American people are really thinking.

He points out that people can have opinions that are superficial or ones that are deeply held. Pollsters go out of their way to smooth over this difference because media clients want clear-cut expressions of opinion.

People are also often ignorant of the issues they are asked about, so pollsters fill them in (they are then no longer a representative sample) to get a definite answer. Sometimes pollsters do ask if people have heard about the issue; other times they don't, depending on the issue and the kind of responses they want. "That's," Moore says, "a deliberately manipulative tactic that cannot help but undercut pollsters' claims of scientific objectivity."

When asking for opinions, pollsters should always include a question that asks if the respondent knows or cares about the issue. Knowing the state of the public's ignorance is just as important as knowing what it thinks, and "suppressing it for commercial or other purposes is simply unacceptable."

Moore also says a question should be asked about the "intensity" of the opinion. Pollsters should also stop supplying information to the respondents as that makes the poll "hypothetical" rather than an actual reflection of what people are thinking.

The following rule should be applied. Any poll that does not reveal that at least 20 per cent of the respondents are "disengaged" has probably been manipulated. The poll "should be viewed with deep suspicion."
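Moore's rule of thumb lends itself to a simple check. The sketch below is hypothetical (the function name, answer categories, and data are illustrative; only the 20 percent threshold comes from Moore):

```python
# A minimal sketch of Moore's rule of thumb: if fewer than 20% of
# respondents register as "disengaged" (no opinion / don't know /
# don't care), treat the poll's results with suspicion.

def looks_manipulated(responses, threshold=0.20):
    """responses: list of answer strings; returns True if the share
    of disengaged answers falls below the threshold."""
    disengaged = {"no opinion", "don't know", "don't care", "unsure"}
    share = sum(1 for r in responses if r.lower() in disengaged) / len(responses)
    return share < threshold

# Example: a poll where only 5 of 100 respondents admit disengagement
sample = ["agree"] * 60 + ["disagree"] * 35 + ["no opinion"] * 5
print(looks_manipulated(sample))  # → True (share is 0.05, well below 0.20)
```

The point of the check is not the code but the diagnostic: a poll that reports almost no "don't know" answers has probably pushed respondents into manufacturing opinions.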

Another thing to be wary of, according to Moore, is a device called the "national electorate." During primary season most polls take a nationwide survey and try to predict the primaries on that basis. This is why they are so often off the mark. It is too expensive to take state-by-state polls, so the cheaper, and less accurate, "national electorate" is polled instead. If this practice can't be gotten rid of, then pollsters should at least, after asking "If the election were held today, who would you vote for?", add a question about the degree of support for the respondent's choice--i.e., definitely would vote for, leaning towards but might change, have not really decided, etc.

In a section called "Fuzzy Opinion," we learn that wording can determine the outcome of a poll. For example, if you ask a question about the government's wanting to ban some action and use the term "not allow" instead of "forbid," more people will say they agree with the government. More people will agree with programs labeled "assistance to the poor" than if the term "welfare" is used. More people will support "gay and lesbian relations" than "homosexual relations." So pollsters know how to get the results they want once they figure out which buzzwords to use or to avoid.

Even the order of the questions can make a poll fuzzy. Given a choice between two answers, most people choose the second over the first. Order also matters when there are multiple questions: Moore gives the example of Bill Clinton getting a better rating when he was rated after Al Gore rather than before.

Moore concludes that "any measure of public opinion is at best a rough approximation of what people are thinking." The margin of error is only one of many ways polls can be misleading. He ends his book by saying the polls could be a better reflection of reality if they would only honestly try to measure the "extent of public disengagement" and not publish "false results to conceal public ignorance and apathy." However, there is no evidence that any of the major media polls are willing to do this. He hopes that their many contradictions will eventually shame them into being more honest with the public. As of now, they are doing a disservice to the democratic process.
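The "margin of error" Moore mentions is the one source of polling error that is routinely quantified. For a simple random sample it is usually reported at the 95 percent confidence level; a minimal sketch of the standard formula (this is textbook sampling theory, not a calculation from the book):

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error for a proportion p measured in a simple
    random sample of size n (z = 1.96 for 95% confidence)."""
    return z * math.sqrt(p * (1 - p) / n)

# A candidate polling at 50% in a sample of 1,000 respondents:
moe = margin_of_error(0.5, 1000)
print(f"{moe:.3f}")  # → 0.031, i.e. about plus or minus 3 percentage points
```

Note that this figure captures only random sampling error. Moore's whole argument is that the larger distortions, such as manufactured opinions, question wording, and question order, are never reflected in that tidy plus-or-minus number.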