Measuring Public Opinion
GV344 Activity

Introduction
Hey there, (Name) here! Alright, so if you wouldn't mind just filling out this short questionnaire, we can get started here. Do you think I am A) awesome, B) amazing, C) the best, D) rad, or E) too wonderful for words? Don't worry, there are no wrong answers here, just go with your gut! I'm looking for your honest opinion! Of me!

Video 1 - Introduction
While I investigate how scientific my opinion poll is, why don't you take a look at this video that explains why we take opinion polls in the first place. Roll tape! What's that you say? You say you think I may have left some options off this list? Hunh... I'll have to look into that.

Video 1
Excuse me, sir or madam, how old are you? What is your ethnic group? How much do you make a year? Where do you live? Are you married or single? How many children do you have? How did you vote in last year's election? Will you vote in this election, and who will you be voting for?
Every day, thousands, if not millions, of people are asked these and many other questions. Is it some grand conspiracy to steal their identity? No, it's only an effort to measure public opinion. Who wants this information? Everyone from the government and political candidates to companies that sell you cars and bubble gum. What are they going to do with this information? Create public policy, or sell you that new car.

One of the tools used is the public opinion poll. A poll is when a small number of people, called a sample, are asked a set of questions. The idea is that this sample will reflect the feelings of the population as a whole. Polls seek out public opinion on everything under the sun: which brands of toothpaste we use, how much television we watch, where we are going to go on our next vacation, and on and on. Some polls are concerned with trying to discover Americans' political opinions. This is a fairly complicated process. You have to create a set of questions that will produce valid, accurate results. No poll is perfect, but a valid poll recognizes the degree of uncertainty in its result. This measurement is known as the margin of error. For example, if 48% of the sample polled said that they liked chocolate broccoli, and the poll has a 5% margin of error, the actual percentage of the population that likes chocolate broccoli could be anywhere from 43% to 53%. That is, up to five percentage points less than 48% or five percentage points more.

For a poll to be well constructed, it must be conducted on a random, but representative, sample of the population and contain valid questions. The major polling organizations, such as the Gallup Organization and the Harris Survey, go to great lengths to make sure that their polls ask valid questions of carefully chosen samples. So how well do they work? A well-constructed scientific poll does a good job of measuring public opinion, though it does have limits.
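The chocolate-broccoli arithmetic above can be sketched in a few lines of Python. As an illustration only: the transcript does not give a formula for the margin of error, so this sketch assumes the standard 95% confidence approximation from statistics, moe ≈ 1.96·√(p(1−p)/n), and the sample size of 384 is a hypothetical value chosen because it happens to yield roughly the 5-point margin in the example.

```python
import math

def margin_of_error(p, n, z=1.96):
    """Approximate 95% margin of error for a sample proportion p from n respondents."""
    return z * math.sqrt(p * (1 - p) / n)

# The transcript's example: 48% of the sample likes chocolate broccoli,
# with a 5-percentage-point margin of error.
p = 0.48
moe = 0.05
low, high = p - moe, p + moe
print(f"True population support could be anywhere from {low:.0%} to {high:.0%}")
# -> 43% to 53%

# A hypothetical sample of 384 people gives roughly that 5-point margin:
print(f"margin of error for n=384: {margin_of_error(0.48, 384):.3f}")  # ~0.050
```

Note the asymmetry of the claim: the poll reports a single number, but an honest reading of it is always the whole interval.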
No poll is perfect, but they are pretty good indicators of
what the public thinks about candidates, issues, and products. At least, no one has found a better way of measuring that yet. Well, if there are scientific polls, there must be unscientific polls as well, and indeed there are, but they don't provide usable information or results. Let's say you take an internet poll. What do the results really tell us? Not much. Since it's not a good sample of the public, of internet users, or even of users of the web site, it's just a tally of the opinions of the people who took the time to complete the survey. That means that any conclusions drawn from those opinions will not reflect the opinions of the population. Even valid polls have difficulty measuring the strength of an opinion, that is, its intensity. How could a poll measure an opinion's stability, or how much that opinion will change? And what about the opinion's relevance? For this reason, polls are conducted nearly nonstop during a Presidential election. Pollsters know that public opinion about the candidates changes, and so that opinion has to be measured again and again.

We've learned about how the media uses polls, but what about polls on the media? Since the media has great power to influence political questions, pollsters regularly try to find out what Americans think about the media. The Pew Research Center for the People and the Press takes comprehensive surveys every two years about Americans' opinions of the media. Let's take a look at some of the results of that survey, the last one conducted in 2007. For the most part, people have positive feelings about the media, though they seem to prefer local sources over national news. The Pew Center has also combined results from six of its surveys to show how attitudes toward the media compare with attitudes toward different areas of the government. Looking at the data, apparently as much as people like to complain about the news and those who deliver it, they still have a better opinion of the media than of the government.
When done correctly, public opinion polls can provide a wealth of information about Americans and their likes and dislikes. This includes their views on politics and the media. In the next lecture, we'll look at the mistakes that occur when polls are done incorrectly.

Video 1 - Recap
So, a poll tells us how the population as a whole feels about things. And polls are conducted to get people's opinions on everything from politics to underarm deodorant. It seems everybody wants to know what you think! But polls are only useful if they meet certain requirements. You can't ask three tall people how they feel about underwater basket-weaving and then say you know how everybody feels about it! Anyway, you can see the video again to catch anything you may have missed, or we can keep going.

Video 2 - Introduction
So, in nineteen thirty-six, George Gallup correctly predicted that FDR would win the presidency, and Literary Digest magazine was thoroughly embarrassed by its prediction that Alfred M. Landon would win by a landslide.
How did Literary Digest get it so incredibly, incredibly wrong? I'm glad you asked! This video will explain how polls don't always get it right.

Video 2
So far, we've seen that poll results are used as evidence for or against a policy and as an indicator of how people feel about a candidate, so it is important that polling figures be accurate. But what happens when they aren't? When deciding whether a poll is likely to be right or wrong about a national issue, the first thing to consider is the type of poll and who conducted it. If a poll is something other than a scientific poll, and we've talked about this before, you can ignore it. This rules out internet polls, call-in polls, newspaper and magazine polls, and any other sort of straw poll. About the only real use of this sort of poll might be in helping a magazine determine the views of its readers, but remember, they won't provide any meaningful information about the general U.S. population. The most reliable scientific polls are conducted by national polling organizations. These companies' reputations and profits rely on providing accurate information. That means that their polls are much more likely to be conducted according to strict procedures to ensure valid results. Groups like Gallup, Harris, Zogby, and Yankelovich, as well as major media outlets, conduct their polls to produce valid results and provide a margin of error for them. Keep in mind that the results of these polls are seldom wrong about opinions at a particular point in time, but, as you'll recall, polls have great difficulty measuring the stability of people's opinions. That is, the same questions asked of the same sample at a different time will almost certainly yield different poll results. This doesn't mean the first poll was wrong; it just means that people's minds have changed.
One case in which poll results can seem wrong is when the difference between two measures in a poll falls within the margin of error. Let's see what that means. Suppose a Gallup poll indicates that 48% of the public favors candidate A, and 46% favors candidate B. The other 6% is split among minor candidates and undecided voters. Gallup calculates that the margin of error for this poll is 3%. That means that candidate A's support could range anywhere from 45% to 51%, and candidate B's could range anywhere from 43% to 49%. Notice that the candidates overlap in the range of support from 45% to 49%. So if B winds up winning the election with 49% of the vote, and A ends up with only 46%, it seems as if the poll was wrong. But given the closeness of the candidates and the margin of error, in fact, the poll was right on the money.

When deciding how much faith to place in a given poll, bear in mind that scientific polls are excellent at determining what public opinion is, but poor at determining how strongly the opinion is held, how likely it is to change, and how important it is to the people who hold it. There are two fairly easy ways that this can help you evaluate a poll. First, compare it to others on the same topic conducted around the same time. Do they all give the same results? Second, compare it to others on the same topic conducted at different times. Do they show a consistent trend? Do they indicate change? Good polls give good results, but their limitations must be kept in mind.

Here's an example from political history of polls predicting the outcome of a Presidential election and getting the results spectacularly wrong. In 1948, the incumbent President, Democrat Harry Truman, was given virtually no chance against his Republican challenger, Thomas Dewey, the Governor of New York. On election night, the Chicago Tribune, a Republican-leaning newspaper, printed an edition with the headline, "Dewey
Defeats Truman." The only problem was that Truman defeated Dewey. The newspaper had based its predictions on poll results. So what went wrong? Pollsters had conducted many of their polls by telephone, and in 1948, many poor and working-class people, traditional Democratic supporters, did not own phones, so the poll results did not take their preferences into account. Here we see the problem of unrepresentative samples. Even worse, the pollsters were so confident of their results early on that they stopped polling weeks before election day, and Truman benefited from a surge of popularity late in the campaign. Pollsters had failed to take into account the instability of public opinion. This polling disaster gave a strong push to scientific polling, and there has not been a comparable polling failure since 1948. Remember, polls are tools; how well they work depends on how well they are made.

Video 2 - Recap
The election of nineteen thirty-six wasn't the last time somebody messed up their prediction of who would win. The Chicago Tribune made a similar mistake during the election of nineteen forty-eight. But even when a poll is very scientific and its results are accurate, no poll can tell us how strongly people feel about their opinions. How strongly do YOU feel that you got everything you needed from that video? You can watch it again, or we can move on to a question.
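The "within the margin of error" example from Video 2 (candidate A polled at 48%, candidate B at 46%, margin of error 3 points) can be worked through in a few lines. This is a minimal sketch for illustration; the helper name is hypothetical, and the numbers come straight from the transcript.

```python
def support_range(share, margin):
    """The range a candidate's true support could fall in, given the margin of error."""
    return (share - margin, share + margin)

margin = 3  # percentage points

a_low, a_high = support_range(48, margin)  # candidate A polled at 48%
b_low, b_high = support_range(46, margin)  # candidate B polled at 46%

print(f"Candidate A: {a_low}% to {a_high}%")  # 45% to 51%
print(f"Candidate B: {b_low}% to {b_high}%")  # 43% to 49%

# Where both ranges overlap, either candidate could plausibly be ahead,
# so a B win at 49% with A at 46% does not make the poll wrong.
overlap = (max(a_low, b_low), min(a_high, b_high))
print(f"Overlap: {overlap[0]}% to {overlap[1]}%")  # 45% to 49%
```

The overlap is exactly why careful reporting of a close race calls it a "statistical tie" rather than declaring a leader.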