5/4/2022 The Opinion Makers: An Insider Exposes the Truth Behind Polls. David W. Moore. Reviewed by: Thomas Riggins (9/9)

Chapter 8, "A New Direction"

Moore tells us the big problem with the polls, which the pollsters themselves know, is that polls are deliberately designed NOT to reveal what the American people are really thinking. He points out that people can hold opinions that are superficial or ones that are deeply held. Pollsters go out of their way to smooth over this difference because media clients want clear-cut expressions of opinion. People are also often ignorant of the issues they are asked about, so pollsters fill them in (at which point they are no longer a representative sample) in order to get a definite answer. Sometimes pollsters do ask whether people have heard about an issue; other times they don't, depending on the issue and the kind of responses they want. "That's," Moore says, "a deliberately manipulative tactic that cannot help but undercut pollsters' claims of scientific objectivity."

When asking for opinions, pollsters should always include a question that asks whether the respondent knows or cares about the issue. Knowing the state of the public's ignorance is just as important as knowing what it thinks, and "suppressing it for commercial or other purposes is simply unacceptable." Moore also says a question should be asked about the "intensity" of the opinion. Pollsters should likewise stop supplying information to respondents, since that makes the poll "hypothetical" rather than an actual reflection of what people are thinking. Moore proposes the following rule: any poll that does not reveal that at least 20 percent of the respondents are "disengaged" has probably been manipulated and "should be viewed with deep suspicion."

Another thing to be wary of, according to Moore, is a device called the "national electorate." During primary season most polls take a nationwide survey and try to predict the primaries on that basis.
This is why they are so often off course. It is too expensive to take state-by-state polls, so the cheaper, and less accurate, "national electorate" is polled instead. If the practice can't be gotten rid of, then at least, after asking "If the election were held today, who would you vote for?", a question should be added about the degree of support for the respondent's choice: definitely would vote for, leaning toward but might change, have not really decided, etc.

In a section called "Fuzzy Opinion," we learn that wording can determine the outcome of a poll. For example, if you ask a question about the government wanting to ban some action and use the term "not allow" instead of "forbid," more people will say they agree with the government. More people will agree with programs labeled "assistance to the poor" than if the term "welfare" is used. More people will support "gay and lesbian relations" than "homosexual relations." So pollsters know how to get the results they want once they figure out which buzzwords to use or to avoid. Even the order of the answers can make a poll fuzzy: given a choice between two answers, most people choose the second over the first. The order of questions also matters when there are multiple questions. Moore gives the example of Bill Clinton getting a better rating when he was rated after Al Gore rather than before.

Moore concludes that "any measure of public opinion is at best a rough approximation of what people are thinking." The margin of error is only one of many ways polls can be misleading. He ends his book by saying the polls could be a better reflection of reality if they would only honestly try to measure the "extent of public disengagement" and not publish "false results to conceal public ignorance and apathy." However, there is no evidence that any of the major media polls are willing to do this. He hopes that their many contradictions will eventually shame them into being more honest with the public.
As of now, they are doing a disservice to the democratic process.

Author: Thomas Riggins is a retired philosophy teacher (NYU, The New School for Social Research, among others) who received a PhD from the CUNY Graduate Center (1983). He has been active in the civil rights and peace movements since the 1960s, when he was chairman of the Young People's Socialist League at Florida State University and also worked for CORE on voter registration in north Florida (Leon County). He has written for many online publications, such as People's World and Political Affairs, where he was an associate editor. He also served on the board of the Bertrand Russell Society and was president of the Corliss Lamont chapter in New York City of the American Humanist Association.
4/29/2022 The Opinion Makers: An Insider Exposes the Truth Behind Polls. David W. Moore. Reviewed by: Thomas Riggins (8/9)

Chapter 7, "Uncertain Future"

In this chapter Moore tries to look ahead at the future of polling. He begins with a discussion of how the polls blew it with their predictions of the outcome of the New Hampshire primary held in January 2008: eleven different polls forecast a defeat for Clinton and a victory for Obama. Clinton won. The main reason the polls were wrong, Moore believes, is that they stopped polling too early. There were many undecided voters, and there was a big pro-Clinton shift just a couple of days before the election.

Besides stopping polling too early, what else goes on with polls that can make them unreliable? We have discussed some of the problems in earlier parts of this series, so let's look at some new ones. First there is the problem of getting a "representative sample." Many polls have a big problem with nonresponsive contacts, so the question arises whether those who do agree to respond are just as "representative" of the population as the larger group that includes both responsive and nonresponsive individuals. One way to handle this problem is to check your sample against the Current Population Survey (CPS) provided by the U.S. Census Bureau. Moore says there are five groups that are underrepresented in polls, namely "those who are younger, lower educated, Hispanic, nonwhite, and living in urban areas." This may look like a serious problem, but it is not: pollsters can correct for this bias by using the CPS, which allows the "underrepresented people in the survey [i.e., the poll] to be accorded the same influence their group would have in the population at large...." Nevertheless, Moore does point out some problems that standard polling faces even with the CPS.
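The correction Moore describes amounts to post-stratification weighting. A minimal sketch, using hypothetical age groups and made-up numbers (not from Moore's book or the actual CPS), shows the basic arithmetic:

```python
# Hypothetical groups and shares -- illustrative numbers only, not from
# Moore's book or the actual CPS.
population_share = {"18-29": 0.20, "30-49": 0.35, "50+": 0.45}  # target shares (e.g., from CPS)
sample_counts = {"18-29": 100, "30-49": 350, "50+": 550}        # young adults underrepresented

n = sum(sample_counts.values())  # total respondents (1,000)

# Weight = population share / sample share. Respondents in groups the
# poll under-reached get weights above 1, restoring their group's
# influence to what it would have in the population at large.
weights = {g: population_share[g] / (sample_counts[g] / n) for g in population_share}

print(weights)  # 18-29 respondents count roughly double; 30-49 unchanged
```

The weighted answers of the 100 young respondents then stand in for the 200 the sample "should" have contained, which is exactly why Moore cautions that weighting only helps if the respondents a poll does reach resemble the ones it missed.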
The way most polls are conducted results in overrepresentation of Republicans and conservatives at the expense of moderates and Democrats. Even when using "rigorous" techniques not usually applied to most polls in order to get a more representative sample, one pollster admitted that "much of the population remained beyond our view." So there we were, two weeks before the 2008 general election, with the polls favoring Obama. Maybe so, but we shouldn't have trusted their accuracy, especially when Moore says that so many would-be respondents simply fail to respond that nonresponse "represents an ever-present threat to the validity of all polls." There is also a problem Moore doesn't discuss: widespread vote rigging and voter intimidation by the Republicans, which may throw off the polls and change the (official) outcome of the election.

What about people who only use cell phones? The consensus was that not enough people relied solely on cell phones to make lack of contact with this segment of the population a problem. Figures indicate that about 7 percent of voters in 2004 had only cell phones and no landlines; these were also mostly younger voters. Using the CPS they can be weighted into the general population without distorting the representative sample pollsters are trying to select. This may not always be the case, and in January 2008 Gallup began cell phone interviewing alongside the typical landline technique. Moore said it was just a matter of time before this became general practice for all the polls.

And now, let's look at internet polling. This was not working out too well. Harris and Zogby did internet polling: they got volunteers to join a panel and then sent them questionnaires. The problem was that their "panels tended to be disproportionately white, male, young, better educated, techno-oriented, and, apparently, conservative." They were simply not representative.
At least as far as elections go, telephone polls appeared to be the better tool. Another problem with Harris's and Zogby's internet polls was that the firms were very secretive about how they arrived at their results. Moore says "that they violate many of the scientific approaches developed over the past several decades for obtaining representative samples of the general public." There was one other firm doing internet polling, Knowledge Networks, that was trying to use the best scientific principles. It still had some problems, but it seems that if internet polling has a real future it will be along the lines established by this firm.

Moore draws three conclusions at the end of this chapter. 1. Polls will continue to manufacture rather than report public opinion if they don't change the way they ask and analyze their questions. 2. As a result of not changing their methods, they will continue to serve powerful entrenched interests, not the public. 3. Therefore, most people will come to see polls as the enemy of the democratic process rather than the help they could be. Since their real incentive is to make money serving their clients, not honestly reporting public opinion, I don't see how they are going to change. Coming up, the final part of this series, Chapter 8, "A New Direction."
4/27/2022 The Opinion Makers: An Insider Exposes the Truth Behind Polls. David W. Moore. Reviewed by: Thomas Riggins (7/9)

Chapter 6, "Damaging Democracy"

This chapter is rather short compared to most of the others and somewhat repetitive. Moore draws some conclusions, based on the other chapters, about the effects the polls have on American democracy. Since, as we have seen, the polls misrepresent what the American public is thinking and mislead everyone as to the popularity and winning chances of competing politicians (most of the time), he thinks most polls actually "damage the democratic process." If people believe false polls, they may not oppose policies that they incorrectly believe to be popular. He opposes national polls of Democrats versus polls of Republicans because he thinks they don't really predict how people will actually vote, and hence are misleading. Although general polls of the population about the popularity of a candidate at a given time are a little better (they are "at least trying to describe a future real-world event," the result of the general election in the case of the presidency), he thinks they are "typically worthless." As earlier chapters have shown, polls taken more than two weeks out from the election are basically unreliable. They can be harmful to democracy as well, insofar as they can be used "to suppress the truth about the electorate." He ends this chapter by remarking that "polling can be a useful tool for enhancing democracy, but only if it is used to serve the needs of the public, not the whims of the press." This would suggest that we need some sort of neutral national polling agency, perhaps run by academics with no axe to grind [a rare breed] who were only interested in actually finding out what the public was thinking.
At any rate, the system we have now, mostly sponsored by mass media seeking viewers and readers, or by political action groups and special interests, does more harm than good and is more akin to propaganda than news reporting. Next: Chapter 7, "Uncertain Future."

4/22/2022 The Opinion Makers: An Insider Exposes the Truth Behind Polls. David W. Moore. Reviewed by: Thomas Riggins (6/9)

Chapter 5, "Misreading the Public"

In this chapter of his book Moore explains why so many polls contradict each other and generally misrepresent what the American people actually think about major policy issues. One of the problems is that many policy issues are both arcane and complex, and many, if not most, people are not following a given issue and basically don't know what to think about it. This is a fact, Moore says, "that media pollsters generally do everything in their power to conceal. Rather than allow respondents to freely acknowledge they don't have an opinion, pollsters pressure them to choose one of the available options." One of the tricks of the trade in polling is that vastly different results can be obtained depending on how the questions in the poll are designed.
This is especially the case, Moore points out, when people are not well informed about the issue and forced-choice questions are presented to them. An example is the polling done by Frank Luntz, a Republican pollster working for Arctic Power, a group that favors drilling in the Arctic National Wildlife Refuge. He reported that drilling was favored 51% to 34%. This was a month after a poll by John Zogby for the Wilderness Society, an anti-drilling group, reported that drilling was opposed 55% to 38%. Zogby presented the issue as an environmental one, whereas Luntz presented it as an issue of energy independence.

Another example: in 2003 an ABC/Washington Post poll found that Americans opposed the U.S. sending troops to Liberia 51% to 41%, while a CNN/USA Today/Gallup poll found that they approved sending the troops 57% to 36%. Moore quotes the editor in chief at Gallup as saying: "Opinions about sending in U.S. troops are therefore very dependent on how the case is made in the questions and what elements of the situation there are stressed to them [the respondents]." The two polls essentially manufactured their respective results, "neither of which," Moore concludes, "told the truth about how little people knew and how unengaged they were from the issue."

The next example concerns the State Children's Health Insurance Program (SCHIP), whereby the federal government helps the states pay for children's insurance (for families not poor enough for Medicaid nor rich enough to buy their own). Polls were taken in 2007 to see if the public supported this program. CNN said 61% supported SCHIP, CBS found 81%, ABC/Washington Post found 72%, and Gallup found 52% OPPOSED. Why this great disparity? It was "because each poll fed its respondents selected information, which the general public did not have. The real public, where only half of Americans knew anything about the program, wasn't represented in these polls at all." One last poll.
In 2006 Americans were asked, "Should it be more difficult to obtain an abortion in this country?" Pew asked twice: the first Pew poll got 66% YES, while the second got only 37% YES; Harris got 40% YES, and CBS/New York Times got 60% YES. It turns out that most Americans are not really informed about this issue, and Moore says that "forcing them to come up with an opinion in an interview meant that different polls ended up with contradictory results." So what can we conclude about media polls claiming to tell us what the American people think? Moore writes that "By manipulating Americans who are ill informed or unengaged in policy matters into giving pseudo opinions, pollsters create an illusory public opinion that is hardly a reflection of reality." In other words, most opinion polls on public policy are junk. Next time, Chapter 6, "Damaging Democracy."

4/20/2022 The Opinion Makers: An Insider Exposes the Truth Behind Polls. David W. Moore. Reviewed by: Thomas Riggins (5/9)

Chapter 4, "Inscrutable Elections"

Moore devotes this entire chapter to showing how the "polls often present a highly misleading if not outright false picture of how the candidates are faring and what voters are thinking."
A good example is his report on a media poll conducted in New Hampshire in September 2007 to try to determine how the primary might turn out. There were two versions, one widely reported in the media, the other basically ignored. In the first, widely reported version, the standard forced-choice question ("if the election were held today, who would you vote for?") yielded an 11% undecided total and 43% for Clinton to 20% for Obama (26% for others). But when the poll was given again with the FIRST question asking whether people were still undecided about whom to vote for, the undecided figure was 55%, with Clinton 24%, Obama 10%, and others 11%. Even though Clinton eventually won, the huge 43% early lead was an illusion.

Besides the "forced choice" question, which leads to unreliable results, there is another tactic pollsters use that leads to misleading results. If you remember, at the beginning of the 2008 race for president we were told that the polls revealed two "front-runners": Clinton and Giuliani. They were never really "front-runners" at that time at all. How did the pollsters determine that they were? [Giuliani!!!] Moore says, "The problem with the apparent front-runner status of both Giuliani and Clinton is that it was based on national polls of Republicans and Democrats, respectively. These are people who don't vote-- at least not all in the same primary election, and maybe not at all." To get an accurate picture the pollsters would have to run separate polls, state by state, in the states with the first primaries. "Polling a national sample of Democrats and Republicans," Moore writes, "reveals nothing about the dynamics of the state-by-state nomination contest." There are three reasons why this isn't done: it costs too much, people are too undecided, and the results in one state can affect the results in another. But a nice juicy "front-runner" story is better than nothing from the media's point of view.
The conclusion to be drawn from this chapter is that polls won't tell you how the voters are really thinking (or feeling) about the candidates during most of the election cycle. Within about two weeks of the actual election most people have finally made up their minds, so at this time, and only at this time, the "forced choice" method will give a "fairly accurate estimate" of what is going on. "But," Moore warns, "in the weeks and months and years before the election, these polls give a misleading, even false, picture about the electorate." What is true of polls about the electorate is also true of polls on public policy, such as peace and war, abortion, the environment, etc. So, stay tuned for Chapter 5, "Misreading the Public."

4/15/2022 The Opinion Makers: An Insider Exposes the Truth Behind Polls. David W. Moore. Reviewed by: Thomas Riggins (4/9)

Chapter 3, "Telling Americans What They Think"

In this chapter Moore explains how and why the polls so often go wrong and why, even though the pollsters know how to end the problem, they refuse to do so. We have all heard about the 1948 election polling fiasco: "Dewey Defeats Truman!" The reason all the major polls got it wrong is that they quit polling in the middle of October.
Since 1948 they have learned that so many undecided voters are out there who don't finally make up their minds until the last two weeks before the election that, as Moore says, "no pollster today would even consider predicting a presidential winner based on polls conducted two weeks or longer before election day." So the best polls are those taken as close as possible to the election. The reason so many meaningless polls come out during the election season is that the media demand them to spark interest in their papers and shows (so they can pump up ad revenues-- it's all quite cynical).

There is a phrase, from Gallup, that all pollsters use, which goes something like this: "if the election were held today, for whom would you vote?" This is a forced-choice question, which results, as the pollsters know, in a false picture of reality. They refuse to first ask whether the respondent is still undecided, because that does not provide the dramatic picture of a split nation demanded by the media. Sometimes they may report a "no opinion" figure-- usually very small, because few respondents refuse to make the forced choice. The question also does not reflect the degree of commitment behind the respondent's forced choice, and many people end up changing their minds nearer to election day. All of this is uninteresting to the media. Private polls done for individuals or parties can reflect all this information, but they are rarely made public. The media quite consciously dupe us-- remember that the next time your favorite news anchor tells you what the polls say about the upcoming election or what the people think about this or that policy.

Finally, Moore informs us that there is a "crisis" in polling today. The crisis is the result of "the refusal of media polls to tell the truth about those surveyed and about the larger electorate. Rather than tell us the 'essential facts' about the public they feed us a fairy-tale picture of a completely rational, all-knowing, and fully engaged citizenry.
They studiously avoid reporting on widespread public apathy, indecision, and ignorance. The net result is conflicting poll results and a distortion of public opinion that challenges the credibility of the whole polling enterprise. Nowhere is this more often the case than in election polling." I would add that it is not just in polling that the media falsify our view of the world. Our so-called "free" media constantly distort the news. Try this experiment: compare Fox News, ABC, NBC, CBS, the BBC, and then Prensa Latina (the Cuban state news service), all online, on, say, coverage of the Middle East or South America. I think you will know from this experiment which news service to trust in the future. Next up, Chapter 4, "Inscrutable Elections."

4/13/2022 The Opinion Makers: An Insider Exposes the Truth Behind Polls. David W. Moore. Reviewed by: Thomas Riggins (3/9)

Chapter 2, "Manufacturing Public Opinion"

Moore opens this chapter by pointing out that large sections of the public know little, and care even less, about many of the issues that pollsters ask them to give opinions about.
Since the pollsters want dramatic splits in public opinion (it makes the poll more interesting for their media clients), they use questions with forced-choice answers (i.e., usually two choices are given and "unsure" is not offered) and ignore the fact that many people don't know much about the issues. The polls thus often "distort or completely mischaracterize what the American public is really thinking." I almost said they "misunderestimate" what is going on.

One of the tricks for getting around public ignorance on the subject of a poll is to supply some information to the person being polled: "As you may know, X has said that Y is the case. Do you agree with what X says or not?" But now you have biased the sample population you are polling by giving them this information; they no longer represent a typical cross section of the public. All polls do this and thus get "a manufactured opinion based on a mythological public-- measures that look like they represent what a rational, informed, and engaged citizenry might be thinking."

The Gallup people tried to get more honest reflections of public opinion. George Gallup decided on a five-question poll that would also measure what the public knew about an issue. Moore reproduces the results of a 1953 poll concerning support for the Taft-Hartley Act: CHANGE IT 19%, LEAVE IT AS IT IS 11%, REPEAL IT 3%, NO OPINION 7%, HAVE NOT FOLLOWED THE ISSUE 60%. This approach has not been adopted because the media clients of the polls don't consider it newsworthy to report that two-thirds of the public is not aware of the issues they are reporting on. Even though the polling companies know this type of poll is more accurate, they have decided to rely mostly on the forced-choice method because it gets the big dramatic results their media clients want.
This is shocking: the polling companies have no concern for the "truth" but only want to sell their services to their media clients, who also have no concern for "truth" but only want more readers or viewers. This chapter points out the three main ways in which the polls falsify public opinion. 1) By not revealing how much of the public is uninformed about the issues or doesn't care. 2) By using forced-choice methods to get the response the pollster wants rather than what the person being polled would really have said; this is a variant of the first way. 3) By supplying the person being polled with information he or she didn't have before (as a way of extracting a "choice") and thus biasing the sample: most Americans would not have had the information that was supplied, so the polling sample is no longer really representative. This third point could also lead to a fourth, insofar as the information supplied by pollsters is mostly an oversimplified presentation of the issues. Moore ends the chapter with a quote from Daniel Yankelovich, himself a great pollster, and one with integrity: "Sad to say, the media who sponsor opinion polls on policy issues have little or no stake in the quality of the poll findings they report." Coming up next: Chapter 3, "Telling Americans What They Think."
4/8/2022 The Opinion Makers: An Insider Exposes the Truth Behind Polls. David W. Moore. Reviewed by: Thomas Riggins (2/9)

Chapter 1, "Iraq and the Polls-- The Myth of War Support"

Discussing the preface to this book in Part 1, we reviewed the claim that the polls distort and misrepresent public opinion. Here is a case in point: how the polls manufactured pro-Iraq War sentiment to bolster the claims of the Bush administration. In the run-up to the war in 2003, the press was reporting a pro-war mood in the country. The pollsters typically asked their questions giving two choices and forcing the interviewee to choose an answer. Sometimes a control question was asked when the polling company really wanted to find out what was going on. The CNN/USA Today/Gallup poll did this in February 2003 [the war began about a month later] and found that about 30% supported the war, 30% were against it, and 40% couldn't care less. That is hardly a pro-war mood when 70% either don't care or are opposed. Moore points out that this neutral factor was not measured by the other polls and was practically ignored by the CNN/USA Today/Gallup poll itself. Ignoring the control question "reveals much," Moore adds, "about the way that media polls manufacture public opinion for their own purposes."

The problem, Moore says, is that polls want an answer to their questions even if the people they are asking don't know or care about the issue. Moore says we should distinguish between DIRECTIVE opinions (the person really wants the opinion carried out) and PERMISSIVE opinions (the person doesn't really care what happens). Moore gives the example of the poll mentioned above. For the war: 59%. Against the war: 38%. No opinion: 3%. That is the standard "forced answer" poll, and it looks like the people want war! This would have been the reported result had Gallup not asked a control or follow-up question.
People were asked if they would be upset if their opinions were not carried out-- i.e., if the government did the opposite of what they thought. Now, if you separate the strong or directive opinions on the war (yes vs. no) from the permissive, don't-care group (plus the no-opinion group), you get the following: for the war 29%, against the war 30%, and no opinion, unsure, or don't really care 41%. So reporting that 59% favored the war would not have been a true statement of how the public really felt. Most polls (including Gallup's) don't usually use a follow-up question, so most polls are deceitful. The truth was that about 71% of the people didn't want war, were unsure, or didn't care one way or the other.

Moore also makes a distinction between "top-of-mind" responses and reasoned ones-- that is, between an opinion that is just something someone has heard or been told about (say, by a pollster) but really doesn't know much about, and one that has been arrived at after thinking and reading about it. This is the difference between a knee-jerk response and a well-thought-out one. To get "newsworthy" polls for their clients (the big media), most pollsters lump these two groups together, even though the answer of the "top-of-mind" person may have been elicited by the form or wording of the question itself.

Here is another example of a misleading poll. It was once claimed that most Americans supported what the government was doing at Guantanamo. In 2007 Gallup pollsters did a standard poll asking whether Gitmo should be closed or not. They got this answer: yes, close it-- 33%; no-- 53%; undecided-- 13%. But when a control question was asked (as in the Iraq war poll above), i.e., would you mind if the government did just the opposite of what you think, the response became: yes, close it-- 19%; no-- 28%; undecided, don't care-- 52%. A big difference, as you can see! Finally, remember the anti-missile shield?
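The directive/permissive reclassification in the Iraq and Guantanamo examples above is simple arithmetic, and a short sketch makes it concrete. The fractions of each camp that would be "upset" are back-computed from the figures Moore reports and are used here purely for illustration:

```python
# Forced-choice result reported by the standard Iraq war poll (percent).
forced = {"for": 59, "against": 38, "no opinion": 3}

# Fraction of each camp that said they would be upset if the government
# did the opposite -- i.e., whose opinion is directive. Back-computed
# from Moore's figures (29% directive-for, 30% directive-against).
directive_share = {"for": 29 / 59, "against": 30 / 38}

# Directive supporters and opponents; everyone else is permissive,
# unsure, or has no opinion.
directive = {k: round(forced[k] * directive_share[k]) for k in directive_share}
permissive = 100 - sum(directive.values())

print(directive)   # {'for': 29, 'against': 30}
print(permissive)  # 41
```

The same headline poll thus supports two very different stories: "59% favor the war" or "only 29% actively want it," which is Moore's point about how much the follow-up question matters.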
In 2002 Bush took the US out of the 1972 Anti-Ballistic Missile Treaty and claimed he had the support of the American people. Forced-choice polls had been taken and seemed to back him up -- the majority of the American people were for the anti-missile shield. Gallup did a forced-choice poll (only two answers allowed, though a person could volunteer an "I don't know"). Here is how the poll turned out: for the shield, 64%; against it, 30%; neutral, 6%. Then Gallup did the same poll but with a control question that allowed people who didn't care, or didn't know anything about what was going on, to opt out. This result was: for it, 29%; against, 13%; neutral, 59%. The second poll gave a much truer picture of what Americans were thinking than did the first.

Moore says opinions are easily manipulated and "that on all sorts of issues, the media polls continually and systematically distort public opinion, with severe consequences for us all." Just ask yourselves if control questions were used in any of the polls that came out saying how popular Palin is. If not, why not? What about polls on Putin, Biden, Trump? Coming up Wednesday -- a review of Chapter Two: "Manufacturing Public Opinion".

Author: Thomas Riggins is a retired philosophy teacher (NYU, The New School for Social Research, among others) who received a PhD from the CUNY Graduate Center (1983). He has been active in the civil rights and peace movements since the 1960s, when he was chairman of the Young People's Socialist League at Florida State University and also worked for CORE on voter registration in north Florida (Leon County). He has written for many online publications, such as People's World and Political Affairs, where he was an associate editor. He also served on the board of the Bertrand Russell Society and was president of the Corliss Lamont chapter in New York City of the American Humanist Association.

4/6/2022 The Opinion Makers: An Insider Exposes the Truth Behind Polls. David W. Moore. Reviewed By: Thomas Riggins
(1/9)

This is an important book to read, especially during this electoral season. It's not very long (161 pages), but it thoroughly explains how public opinion polls are manipulated in this country to produce the results wanted by those who use them to influence and shape public perceptions of reality. As Professor Mark Crispin Miller of New York University says, the book presents "A powerful argument that polls do not merely misinform us but pose a genuine, if subtle, threat to our democracy." The author, David W. Moore, knows what he is talking about, having been a senior editor for the Gallup Poll for thirteen years. Anyone who wants to know how the polls are used to manipulate public thinking could do no better than to buy and read this book, either from Beacon Press or through Amazon.com. In the meantime, I will go over the salient points in the book so that readers will be better prepared not to be taken in by questionable polls. I will begin with the preface.

Moore begins by reminding us, with a quote from political scientists Lawrence Jacobs and Robert Shapiro: "Whether democratic government survives is not foreordained or guaranteed. What is critical is creating the expectation that substantial government responsiveness to public opinion is appropriate and necessary." [This brings to mind former Vice President Cheney's response to being informed that the majority of the American people were against the Bush administration's policies in Iraq: "So?"] Public opinion polls are one way that the American people's will can be expressed. But, as Moore says, "For years, we pollsters have systematically misled the American people about the accuracy of our polls." Polls do state a margin of error, but the following statement should be attached to every poll: "In addition to sampling error, question wording and practical difficulties in conducting surveys can introduce error or bias into the findings of public opinion polls."
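The margin of error that polls do report covers only sampling error. For context, here is the textbook 95%-confidence formula for a simple random sample -- a standard statistics illustration, not anything taken from Moore's book, and the sample sizes are hypothetical:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% sampling margin of error (as a proportion) for a simple random
    sample of size n, at worst-case observed proportion p = 0.5."""
    return z * math.sqrt(p * (1 - p) / n)

# A typical national poll of about 1,000 respondents:
print(round(100 * margin_of_error(1000), 1))  # 3.1 (i.e., roughly +/- 3 points)
```

So a typical 1,000-person poll reports roughly plus-or-minus 3 points; the review's point is that wording effects and survey-practice problems are additional sources of error not included in that figure.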
Moore, in fact, when at Gallup, included that statement in a report to a client who had commissioned a poll. The client's response was: "It essentially says you can't trust any of the numbers. What good is a report like that?" Since this would make all polls doubtful, Moore says that the polling industry and its clients just ignore this qualification. What this means is that polls are "rough estimates" and not, as they claim to be, "precise reflections of reality." If you check closely "you will see large variations among highly reputable polling organizations." [I would question the use of the term "highly reputable"!]

Which polls does Moore have in mind? The four most important are the New York Times/CBS News Poll, the Washington Post/ABC News Poll, Pew Research, and the USA Today/Gallup Poll. Other, less influential polls he mentions are CNN, NBC/Wall Street Journal, Time, Newsweek, Associated Press/Ipsos, Los Angeles Times, Fox, John Zogby [sometimes with Reuters], and Harris Interactive. What do these polls all have in common? They "give us distorted readings of the electoral climate, manufacture a false public consensus on policy issues, and in the process undermine American democracy." Stay tuned to see how they do this.

Coming up Friday, a review of Chapter One: "Iraq and the Polls -- The Myth of War Support".