Hey everybody, welcome to "What The Fact." I'm Katie, and this is Aaron, and we're editors with PolitiFact.
Today we're going to talk about polling. It's actually one of my favorite topics, Katie.
Now, we're not pollsters, and we don't interpret the polls to say who's winning and losing. But so many times, people look at polls and draw the wrong conclusions.
As the referees, the fact-checkers, we get to put their claims on the Truth-O-Meter, and they don't always fare too well. So we have a good sense of how polling works, and we're going to help you make sure you don't make the same mistakes that a lot of politicians do.
Yeah, we have some good tips to help you read polls like a fact-checker. But before we get into that, Newsy is going to help us understand: How do polls factor into the Democratic debates?
The DNC is using poll results to determine who appears on stage. So let's get a better picture of how that works.
Poll numbers play a key role in determining which presidential candidates make the debate stage, but not all polls are created equal. We're going to take a look at a few groups on the Democratic National Committee's approved pollster list and explain why they're considered reputable.
The DNC's qualified poll list is made up largely of two types of sponsors: media organizations and universities. Quinnipiac is one of the university-based pollsters the DNC accepts. Its polling institute is a well-known group that conducts both national and state-level surveys. Typically, Quinnipiac University polls gather responses from a random group of over 1,000 registered voters, and respondents are contacted on the phone using either landlines or cellphones.
The DNC-approved poll list includes representation from all the big news networks and a handful of key newspapers. Fox News' poll methodology says it aims for about 1,000 responses to its surveys, and interviewers randomly call both landlines and cellphones. Polls done by The New York Times have also contacted people by phone, but recent examples show the number of people contacted for each survey varies.
The reliability of public opinion polls depends on a number of factors. If the questions use biased language, that can impact how people respond. If the demographics are off or if the sample size is too small, then the results won't be a representative sample of the larger population. Presumably, the DNC wants to rely on poll numbers that are representative of American voters. So the committee's decision to accept results from more than a dozen polling entities should help it achieve that.
Alright, thank you, Newsy, for that rundown of the process. Now, as fact-checkers, we're going to give you three of the tips we use (we actually use a lot more than three, but these are the big ones) to understand poll results so we don't get carried away in any horse-race coverage ourselves.
Yeah, first is margin of error, and this is something I'm sure you're all familiar with. Basically, it's that little caveat at the end of every poll that says the result is plus or minus 4.5%. What that means is that, essentially, the number you're looking at has a range, right?
And it's a range because it's a survey, which means it's a sample, and samples aren't perfect. We didn't survey everyone in the country, for instance. So if the poll says plus or minus 4.5% and you see a number that's 45, it could easily be 49.5 or it could be 40.5. You really have to think of it as a band, a range, rather than a specific number.
So, for instance, in any race, if it's 45-44 and the margin of error is 3%, it's basically tied. And that's the way you should think about it. Don't necessarily think someone is winning or losing when the gap is within that margin of error. The big thing there, Katie, is that the fewer people you actually survey, the bigger that margin of error is.
So the more people you survey, the smaller it gets, up to a point. But in any survey there will always be a margin of error, so it's something you need to pay attention to in any poll you look at.
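The relationship between sample size and margin of error the hosts describe can be sketched with the standard textbook formula for a simple random sample at 95% confidence. This is a simplified illustration, not how any particular pollster computes its published margins (real polls also account for weighting and survey design):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a simple random sample
    of size n. p=0.5 is the worst case, giving the widest interval."""
    return z * math.sqrt(p * (1 - p) / n)

# Bigger samples shrink the margin, but with diminishing returns:
for n in (500, 1000, 2000):
    print(f"n={n}: +/- {margin_of_error(n) * 100:.1f} points")

# Reading a 45% result with a 4.5-point margin as a band, not a number:
share, moe = 0.45, 0.045
print(f"band: {share - moe:.3f} to {share + moe:.3f}")  # 0.405 to 0.495
```

Note that doubling the sample from 1,000 to 2,000 only trims the margin from about 3.1 points to about 2.2, which is part of why pollsters rarely go far beyond 1,000 respondents.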
OK, so "margin of error," big tip, always look for those three words. The next thing we want you to know about polling is that state polls and national polls are quite different.
This was a really big deal in 2016. So, Aaron, the national polls in 2016 pretty accurately predicted the mood of the country over all. They had Hillary Clinton over Donald Trump by about 3 percentage points.
But when you looked at the state polling, which is done less frequently and involves different pools of folks, they would have told you that Donald Trump was polling a lot better in those key battleground states.
Yeah, and it's important to note, of course, that polling is expensive. We talked to pollsters in Iowa who said this isn't easy work to do, so you're not going to see as many state polls, which can make it harder for you to understand what's happening in those battleground states. Alright, Aaron, what's our third tip?
Well, the third one I'm going to say, and this is kind of an advanced tip, is that you've got to look at educational attainment in a poll. This comes down to a process called "weighting." Right?
Yeah, so basically any pollster is trying their best to figure out who's actually going to vote. And since they're only sampling a small group of people, not everyone, they have to figure out: How will people vote who only went to high school, versus people who got a bachelor's degree, versus people who got a Ph.D.?
One of the things pollsters have learned over time is that the people who respond to polls the most tend to be those with the most education. And that doesn't actually match who comes out and votes on Election Day.
And so pollsters are trying to create a sort of balance, and one of the things they said in 2016, for instance, is maybe some of the pollsters got that wrong.
They relied too heavily on people who had more formal education, which might have skewed some numbers toward Hillary Clinton that turned out not to be there.
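The weighting idea above can be sketched in a few lines. Every number here is invented purely for illustration (the group shares and support levels are not from any real poll): the point is just that if college graduates answer polls more often than they vote, reweighting each education group to its share of the electorate changes the headline number.

```python
# Hypothetical shares, for illustration only.
sample = {          # share of poll respondents in each education group
    "high_school": 0.25,
    "bachelors":   0.55,
    "postgrad":    0.20,
}
electorate = {      # assumed share of actual voters in each group
    "high_school": 0.40,
    "bachelors":   0.45,
    "postgrad":    0.15,
}
# Hypothetical support for Candidate A within each group
support = {"high_school": 0.40, "bachelors": 0.55, "postgrad": 0.60}

# Unweighted estimate: average respondents as surveyed
unweighted = sum(sample[g] * support[g] for g in sample)

# Weighted estimate: each group counts by its electorate share
weight = {g: electorate[g] / sample[g] for g in sample}
weighted = sum(sample[g] * weight[g] * support[g] for g in sample)

print(f"unweighted: {unweighted:.1%}")
print(f"weighted:   {weighted:.1%}")
```

With these made-up numbers, the raw sample shows the candidate above 52%, but weighting the under-sampled high-school group up and the over-sampled college groups down pulls the estimate below 50%: the kind of shift the hosts say some 2016 state polls missed.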
So where can people find out whether a poll is taking education into account?
Oh, it's within the meat of the poll. One thing we always say is to go back to the primary source behind any article that's talking about a poll.
Most polls you can find online, and pollsters are getting pretty good at explaining their methodology and whom they surveyed. If you can't find the methodology, don't trust the poll. Just skip it altogether, because any good pollster is going to make it available so you can dig into it yourself and figure out: Is this really legit, or is it more wishy-washy?
OK, I hope those three tips about margin of error, state versus national polling, and weighting help you better read polls so you stay ahead of the horse-race coverage, which is only just starting. And there's a lot more to come.
Yeah. Until next week, I'm Katie and this is Aaron. We'll see you then, bye.