Skeptoid #883: Lies, Damned Lies, and Polls
Watch out next time you take a poll... is someone trying to learn about you, or manipulate you? Learn about your ad choices: dovetail.prx.org/ad-choices
| Time | Text |
|---|---|
| 00:13:49 | Scrutinizing Pollster Lies |
| What are you going to do the next time a pollster reaches out to you with a quick survey? | |
| Are you going to ignore it? | |
| Take it? | |
| One thing you might not think to do is to carefully scrutinize it and look for some certain telltale signs, because it might not be what it appears. | |
| In this world, there are lies, damned lies, and polls. | |
| That's coming up right now on Skeptoid. | |
| Hi, I'm Alex Goldman. | |
| You may know me as the host of Reply All, but I'm done with that. | |
| I'm doing something else now. | |
| I've started a new podcast called Hyperfixed. | |
| On every episode of Hyperfixed, listeners write in with their problems and I try to solve them. | |
| Some massive and life-altering, and some so minuscule it'll boggle your mind. | |
| No matter the problem, no matter the size, I'm here for you. | |
| That's Hyperfixed, the new podcast from Radiotopia. | |
| Find it wherever you listen to podcasts or at hyperfixedpod.com. | |
| You're listening to Skeptoid. | |
| I'm Brian Dunning from Skeptoid.com. | |
| Lies, damned lies, and polls. | |
| Your cell phone rings, and it's an unrecognized number. | |
| Let's say you're one of those people who lives on the edge and allows unrecognized numbers to ring through, and so you answer it. | |
| Surprisingly, this one is not a scam call, but it's the next closest thing, a pollster, calling with a telephone survey. | |
| There might be one question, there might be 10. | |
| There might be a lightning round of demographic questions at the end of the call. | |
| And then you hang up and wonder what just happened. | |
| Because whatever you think may have been the purpose behind that survey, there's a very good chance that you're wrong. | |
| The difference between a poll and a survey is not really a hard and fast one. | |
| Both terms refer to the same thing, a questioning of some target group. | |
| But a poll is usually short, often with only a single question, and a survey is usually longer with multiple questions. | |
| There's one other important difference outside the formal definition, and that's what we're interested in today: how they can be used or misused. | |
| The proper intended use of polls and surveys is to learn something. | |
| A poll is a quick way to find out where people are at on some important issue, while a survey is a way to collect deeper information. | |
| Let's say you're designing a new car and you want to know what features you should put into it. | |
| You're going to want to know what people like and dislike these days. | |
| You're going to want to know what they can afford. | |
| You're going to want to know what demographic will be interested in it. | |
| So you'd probably do a survey to ask all kinds of questions so you can find trends in the data and get a solid handle on what you should be building, at what price, and for whom. | |
| Surveys are best for when you really want to learn something, especially nuanced information. | |
| You might have to pay people a few bucks to take the survey, but that's okay because the knowledge is worth it to you. | |
| But let's say you're putting in a new sports stadium and you've got the choice to build it in town or outside of town. | |
| There's no point in a long survey with a bunch of questions. | |
| You just want a simple vote from as many people as you can get. | |
| That's a job for a poll. | |
| You probably don't need to pay anyone to answer your poll, not only because it's so quick and easy, but because there are plenty of people who want their vote heard and are happy to give their opinion. | |
| A poll is also a fine way to learn something. | |
| It's like a hammer. | |
| Boom, you get one data point very quickly. | |
| But also like a hammer, you have to hit it square, or you'll bend your nail or skew your results. | |
| That means you have to word the question very precisely. | |
| Here's a recent example. | |
| In April 2023, Navigator Research conducted a poll asking Americans if they support or oppose Donald Trump being criminally indicted. | |
| 52% supported it and 40% opposed it. | |
| But when they asked the same question, but this time including the details of what he was indicted for, support went up to 54% and opposition went down to 39%. | |
| Now what does Navigator Research do with that information? | |
| And which results should they report? | |
| There are subtleties to even the simplest of polls. | |
| You can't just ask, do you prefer candidate 1 or candidate A? | |
| Because even the order of the options matters. | |
| And guess what? | |
| We thought we had a very simple thing to do, ask a single question. | |
| And now all of a sudden we're already mired in science. | |
| When there are multiple options, pollsters have to account for primacy bias and recency bias, the tendency for respondents to select one of the first or last options, respectively. | |
| So the order must be randomized for all respondents. | |
| Primacy bias is the stronger, so when you only have two options, the first one offered will be selected more often, all else equal. | |
| So if you want to get the most accurate data, you need to put candidate A first just as often as candidate 1. | |
| You have to randomize. | |
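
The randomization step described here can be sketched in a few lines of Python. This is a minimal illustration, not anything the episode itself provides; the option labels and respondent count are made up:

```python
import random

def present_options(options):
    """Return the answer options in a fresh random order for one respondent,
    so no option systematically benefits from primacy or recency bias."""
    shuffled = options[:]  # copy so the caller's list is untouched
    random.shuffle(shuffled)
    return shuffled

# Hypothetical two-option poll: over many respondents, each option
# should land in the first slot about half the time.
options = ["Candidate A", "Candidate 1"]
first_counts = {name: 0 for name in options}
for _ in range(10_000):
    first_counts[present_options(options)[0]] += 1

print(first_counts)  # each count should be roughly 5,000
```

Shuffling independently for every respondent doesn't remove primacy bias for any one person; it spreads the bias evenly across the options so it cancels out in the aggregate.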
| But that's just one small issue with polls, and it assumes the pollster is trying to collect accurate data. | |
| Are there any other purposes for polls and surveys, particularly in the way their results can be used? | |
| Of course there are. | |
| Learning something is not necessarily the reason many polls are conducted. | |
| Consider who hired the pollster and why they were hired. | |
| All the complications that can skew the results of a survey or poll are problems that knowledge-seeking survey designers have to be aware of and account for. | |
| But to the spin doctor, or the political campaign, or the think tank, or anyone else in the propaganda business, they make up a toolbox of nifty little tricks to get the data to say what they want it to say. | |
| Primacy bias is just the beginning. | |
| Here are a couple others. | |
| Acquiescence bias. | |
| People tend to go for the friendly answer, to answer yes to a yes-no question, or to agree with an agree-disagree question, where their acquiescence does not actually reflect their true feelings. | |
| Simply agreeing requires less thought. | |
| It seems like you're being nicer to the interviewer, and we tend to perceive authority in the questioner and assume they know more than we do. | |
| An unbiased question will present the actual choices, rather than asking whether you agree or not with one of them. | |
| The context effect. | |
| The order of the questions can matter because some may contain information that skews our perception of later questions. | |
| For example, one survey might open with the question of how well we approve of the job the president is doing, while another survey might make that the third question, after asking what we thought of that time he bombed civilians in Iraq, and what we thought of the economy taking a giant dive. | |
| This second survey has primed us with negative information about the president before asking what we think of him. | |
| In a world that can feel overwhelming, spreading thoughtful, evidence-based content is one of the best ways to make a positive impact. | |
| Ask your local public radio station to air the Skeptoid Files, a 30-minute radio-friendly version of Skeptoid that pairs two related episodes promoting real science, true history, and critical thinking. | |
| And in these challenging times for public media, we're offering these broadcasts for free to radio stations, available on the PRX Exchange or directly from Skeptoid Media. | |
| It's an easy ask. | |
| Just send a quick message to your station's programming director. | |
| By helping to bring the Skeptoid Files to the airwaves, you'll help promote the essential skills we all need to tell fact from fiction. | |
| Just go to your local station's website, find the programming director's email address, or just their general email address. | |
| You can even use the telephone. | |
| I know that might sound crazy. | |
| It's an old legacy device that allows real-time voice communication. | |
| I know that's weird, but hey, it's an option. | |
| The world can feel chaotic, but you're not powerless. | |
| When you promote critical thinking, you can help your community tell fact from fiction. | |
| And that's how we shape a better future. | |
| In uncertain times, spreading good ideas can make you feel helpful, not helpless. | |
| Let's stand up for reason, truth, and understanding. | |
| Together, let's get them to air the Skeptoid Files from Skeptoid Media, available on the PRX Exchange, and they'll know what that is. | |
| Typically, unbiased surveys use what's called probability sampling to determine who to poll. | |
| This means that the sample you select will, statistically, match the population at large. | |
| Two factors are needed to do this correctly. | |
| Number one, every person in the population must have an equal chance of being selected. | |
| And number two, you must be able to determine each person's chance of being selected. | |
| If both of these are done, then you can randomly choose which people to poll via any method that actually is random and be assured that your answers will be a true representation of the overall population's views. | |
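
Those two conditions can be illustrated with the simplest form of probability sampling, a simple random sample: drawing n people without replacement from a complete list of the population gives every person the same, known inclusion probability of n/N. This toy sketch is my own illustration; the population list is invented:

```python
import random

def simple_random_sample(population, n):
    """Draw n people without replacement. Every member of the population
    has the same known inclusion probability: n / len(population)."""
    return random.sample(population, n)

# Hypothetical sampling frame of 1,000 people.
population = [f"person_{i}" for i in range(1000)]
sample = simple_random_sample(population, 100)

inclusion_probability = len(sample) / len(population)
print(inclusion_probability)  # 0.1, equal and known for everyone
```

In practice pollsters rarely have a complete, accurate list of the whole population, and that gap between the frame and the real population is exactly where sampling errors like the one in the Brexit example creep in.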
| For example, in the 2016 Brexit referendum, pollsters failed to do this. | |
| They ended up oversampling younger and more educated people who favored remaining in the European Union and undersampled older and less educated people who favored leaving. | |
| The resulting expectation was that remaining in the EU would probably win, though by a slim margin, when in fact the referendum went the other way. | |
| Non-probability surveys are those in which you're not trying to measure the whole population, but a deliberately targeted subset. | |
| You might want to find out what features are important to pickup truck owners. | |
| So obviously you would want to limit your population to pickup truck owners. | |
| While there are countless legitimate cases for limiting the survey population, it's another obvious way that pollsters with ulterior motives can produce desired results instead of actual results. | |
| If I go to churches to ask whether people are religious, I'll be able to report that nearly everyone polled is religious. | |
| If I work for the Japanese Sumo Federation's marketing department, I might go to a sumo tournament to ask people what their favorite sport is. | |
| And guess what survey result I'll be able to publish? | |
| This type of population selection is called sampling bias, and it's used to fool the public all the time. | |
| A political candidate looking to report that his campaign is favored might have his pollsters go to one of his own rallies to conduct the poll. | |
| Recently, a religious anti-abortion group reported the results of a survey that found women do not actually want the right to choose. | |
| It turned out the survey was seen and taken only by visitors on their own website. | |
| There's also something called the mode effect, and this refers to the mode or method by which the poll is administered. | |
| Telephone, mail, internet, in-person, etc. | |
| When you're face-to-face with a person, you might be less likely to give honest and open answers to certain sensitive questions. | |
| Pew Research conducted an experiment where they asked people about their financial status. | |
| When asked online, 20% admitted it was poor. | |
| But when asked in person, only 14% did. | |
| The 6% difference was a mode effect. | |
| When they asked a question that was not quite so personal, like what they thought of healthcare laws, no mode effect appeared. | |
| The mode effect can be used deliberately to take advantage of another common bias seen in survey respondents, social desirability bias, also called the Bradley effect, named after Los Angeles mayor Tom Bradley, a black candidate who lost California's 1982 governor's race to a white candidate despite leading in the polls. | |
| People tend to give pollsters answers that are more socially acceptable, such as indicating a willingness to vote for a non-white candidate. | |
| Then, when they get behind the anonymity of the voting machine curtain, they vote the way they truly feel, social desirability thrown to the wind. | |
| Let's say you're a pollster hired by a political campaign that wants to report that Americans don't care very much about social justice issues. | |
| You're well aware of the mode effect and of social desirability bias, so your best bet is to ask your survey questions in an impersonal way, like via mail or a website, so that people don't feel pressured to give more socially desirable responses. | |
| But if your goal is the opposite, say to report that social justice is more important to Americans than ever, you should probably ask these questions face to face, because fewer people are likely to admit in person that they don't care about women's rights, LGBT rights, and so on. | |
| There is an even darker side to the polling business, and it's called the push poll. | |
| A push poll is one where the poll itself is little more than a ruse. | |
| Probably nobody's ever even going to look at the responses. | |
| The most common type of push poll is one to trash a political candidate. | |
| The pollster might ask, would you vote for Joe Biden knowing he'd be the oldest man ever to take office? | |
| You don't care how people answer. | |
| 00:03:21 | The Dark Side of Push Polls |
| The whole point was just to raise alarm about Joe Biden's ability. | |
| One such smear campaign was conducted against candidate John McCain in 2000. | |
| Telephone pollsters asked probable McCain voters, primarily white Christians, would you be more or less likely to vote for John McCain for president if you knew he had fathered an illegitimate black child? | |
| He hadn't, of course. | |
| It was just an attempt to scare away his voters. | |
| Now this episode is not a comprehensive expose of the many ways surveys and polls can be used and abused to both inform and deceive us. | |
| My hope is that it is at least enough of a spark to prompt you to investigate further on your own. | |
| The more people learn to recognize when a survey is legit and when it's not, the better informed we'll all be and the less effective the misinformers' tools will be. | |
| Just remember, whenever anyone approaches you with a poll, your first reaction should always be to be skeptical. | |
| A great big Skeptoid shout-out to premium supporters Gary Nolan at The Logical Libertarian on Twitter, Patrick Ekenbergi, Edvolved the Dog and his human, and Blake Bona Fede. | |
| Check out our 40-minute movie, Principles of Curiosity, that teaches the basics of scientific skepticism and critical thinking in a far-ranging journey that takes you from the depths of Death Valley to the highest points in space. | |
| It's free on YouTube and at principlesofcuriosity.com. | |
| Are you getting the Skeptoid podcast companion email? | |
| It comes out each week along with each new episode, featuring the wonder of the week, show notes, and much more. | |
| If you don't, you're only getting half the show. | |
| So come to skeptoid.com and click on podcast companion email. | |
| You're listening to Skeptoid, a listener-supported program. | |
| I'm Brian Dunning from Skeptoid.com. | |
| Hello, everyone. | |
| This is Adrienne Hill from Skookum Studios in Calgary, Canada, the land of maple syrup and moose. | |
| And I'm here to ask you to consider becoming a premium member of Skeptoid for as little as $5 per month. | |
| And that's only the cost of a couple of Tim Horton's double-doubles. | |
| And that's Canadian for coffee with double cream and sugar. | |
| Why support Skeptoid? | |
| If you are like me and don't like ads, but like extended versions of each episode, Premium is for you. | |
| If you want to support a worthwhile nonprofit that combats pseudoscience, promotes critical thinking, and provides free access to teachers to use the podcast in the classroom via the Teacher's Toolkit, then sign up today. | |
| Remember that skepticism is the best medicine. | |
| Next to giggling, of course. | |
| Until next time, this is Adrienne Hill. | |
| From PRX. | |