
6.2 How is public opinion measured?

People may also feel social pressure to answer questions in accordance with the norms of their area or peers. If they are embarrassed to admit how they would vote, they may lie to the interviewer. In the 1982 governor’s race in California, Tom Bradley was far ahead in the polls, yet on Election Day he lost. This result was nicknamed the Bradley effect, on the theory that voters who answered the poll were afraid to admit they would not vote for a black man because it would appear politically incorrect and racist. In the 2016 presidential election, the level of support for Republican nominee Donald Trump may have been artificially low in the polls because some respondents did not want to admit they were voting for Trump.

In 2010, Proposition 19, which would have legalized and taxed marijuana in California, met with a new version of the Bradley effect. Nate Silver, a political blogger, noticed that polls on the marijuana proposition were inconsistent, sometimes showing the proposition would pass and other times showing it would fail. Silver compared the polls and the way they were administered, because some polling companies used an interviewer and some used robo-calling. He then proposed that voters speaking with a live interviewer gave the socially acceptable answer that they would vote against Proposition 19, while voters interviewed by a computer felt free to be honest.

Nate Silver. 2010. “The Broadus Effect? Social Desirability Bias and California Proposition 19.” FiveThirtyEight Politics. July 27, 2010. http://fivethirtyeight.com/features/broadus-effect-social-desirability-bias/ (February 18, 2016).
While this theory has not been proven, it is consistent with other findings that interviewer demographics can affect respondents’ answers. African Americans, for example, may give different responses to interviewers who are white than to interviewers who are black.
D. Davis. 1997. “The Direction of Race of Interviewer Effects among African-Americans: Donning the Black Mask.” American Journal of Political Science 41 (1): 309–322.

In 2010, polls about California’s Proposition 19 were inconsistent, depending on how they were administered, with voters who spoke with a live interviewer declaring they would vote against Proposition 19 and voters who were interviewed via a computer declaring support for the legislation. The measure was defeated on Election Day.

Push polls

One of the newer byproducts of polling is the creation of push polls, which consist of political campaign information presented as polls. A respondent is called and asked a series of questions about his or her position or candidate selections. If the respondent’s answers favor the opposing candidate, the next questions give negative information about that candidate in an effort to change the voter’s mind.

In 2014, a fracking ban was placed on the ballot in Denton, Texas. Fracking, which involves injecting pressurized water into drilled wells, helps energy companies collect additional gas from the earth. It is controversial, with opponents arguing it causes water pollution, noise pollution, and earthquakes. During the campaign, a number of local voters received a call that polled them on how they planned to vote on the proposed fracking ban.

Kate Sheppard. 2014. “Top Texas Regulator: Could Russia Be Behind City’s Proposed Fracking Ban?” Huffington Post. July 16, 2014. http://www.huffingtonpost.com/2014/07/16/fracking-ban-denton-russia_n_5592661.html (February 18, 2016).
If the respondent was unsure about or planned to vote for the ban, the questions shifted to provide negative information about the organizations proposing the ban. One question asked, “If you knew the following, would it change your vote . . . two Texas railroad commissioners, the state agency that oversees oil and gas in Texas, have raised concerns about Russia’s involvement in the anti-fracking efforts in the U.S.?” The question played upon voter fears about Russia and international instability in order to convince them to vote against the fracking ban.

These techniques are not limited to issue votes; candidates have used them to attack their opponents. The hope is that voters will think the poll is legitimate and believe the negative information provided by a “neutral” source.

Summary

The purpose of a poll is to identify how a population feels about an issue or candidate. Many polling companies and news outlets use statisticians and social scientists to design accurate, scientific polls and to reduce errors. A scientific poll will try to create a representative and random sample to ensure the responses are similar to what the actual population of an area believes. Scientific polls also have lower margins of error, which means they better predict what the overall public or population thinks. Most polls are administered by phone, online, or via social media. Even in scientific polls, issues like timing, social pressure, lack of knowledge, and human nature can create results that do not match true public opinion. Polls can also be used as campaign devices to try to change a voter’s mind on an issue or candidate.
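
To make the margin-of-error idea above concrete, the short sketch below (not drawn from the original text) uses the common approximation for a simple random sample, z * sqrt(p * (1 - p) / n), with z = 1.96 for a 95 percent confidence level; the sample sizes shown are hypothetical examples.

import math

def margin_of_error(sample_size, proportion=0.5, z=1.96):
    # 95% margin of error for an estimated proportion from a simple random sample.
    # proportion=0.5 is the most conservative (largest) case.
    return z * math.sqrt(proportion * (1 - proportion) / sample_size)

for n in (500, 1000, 1500):  # hypothetical sample sizes for illustration
    print(f"n = {n}: +/- {margin_of_error(n):.1%}")

Under these assumptions, a sample of roughly 1,000 respondents yields a margin of error of about plus or minus 3 percentage points, which is why larger, well-designed samples predict overall opinion more closely.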



OpenStax, American Government. OpenStax CNX. December 5, 2016. Download for free at http://cnx.org/content/col11995/1.15