Do Election Day exit polls prove that John Kerry actually beat George W. Bush? As Congress moved on Thursday to certify the electoral votes for the presidency, the polls -- which have fueled a great deal of skepticism over the legitimacy of Bush's win in the two months since the vote -- merit revisiting.
As we know, exit polls on Election Day generally pointed toward a Kerry win. The numbers, which weren't officially released by any of the media organizations that sponsored the polls but which nevertheless found their way to the Web during the course of the day, appeared to show Kerry slightly ahead in many battleground states as well as in the national popular vote count. To folks who suspect that Bush didn't actually win, those numbers are a key bit of evidence -- they provide a kind of moral authority to the movement, a reason to keep on fighting in Congress and in Ohio. But the argument has never had much force: Because much of what we know about the exit polls is based on leaks to Web sites and not on any kind of official release from media organizations, it's been difficult to place much faith in the polling numbers over the election results.
This week, though, Scoop, an online magazine based in New Zealand, released a stash of exit poll data that, on first glance, seems to strengthen the case of the people who question Bush's win. The documents appear to be the ones that the National Election Pool, the consortium of television networks that ran the poll, sent to subscribers (mostly newspapers) during the course of Election Day. It's unclear how Scoop came upon this information (the site didn't respond to requests). But the data more or less confirms what skeptics of Bush's victory have been saying all along -- that even late on Election Day, it looked as if Kerry had a good lead on Bush in the exits.
By examining a few of the dozens of documents obtained by Scoop -- which show, variously, the day's polling for House races and for the presidential race regionally and nationally, at three different times on Election Day -- we can get a good idea of how the race looked to news organizations as people went to the voting booths. At midday Eastern time, NEP interviewers had spoken to about 8,000 voters; 51 percent said they'd voted for Kerry, 48 percent for Bush, and 1 percent for Ralph Nader. But those numbers had an obvious problem: at that point the NEP's sample (which can be seen in this PDF file) included too many women -- 58 percent of respondents were female, 42 percent male, and the women who'd been interviewed preferred Kerry to Bush, 53 percent to 45.
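For readers who want a sense of what that gender skew could plausibly do to the topline, here is a back-of-the-envelope sketch in Python. The men's preferences are not in the leaked documents; they are back-solved from the reported totals, and the 54/46 reweighting target is an assumption, so treat this as an illustration of the mechanics rather than the NEP's own weighting.

```python
# Back-of-the-envelope reweighting of the midday numbers. The men's preferences
# are NOT in the leaked documents; they are back-solved from the reported totals
# (Kerry 51-48 overall, women 53-45 at 58 percent of respondents). The 54/46
# target mix is likewise an assumption for illustration.
women_share, men_share = 0.58, 0.42      # reported respondent mix at midday
kerry_women, bush_women = 0.53, 0.45     # reported preference among women

# Infer the implied preference among men from the overall 51-48 split.
kerry_men = (0.51 - women_share * kerry_women) / men_share   # about 48 percent
bush_men = (0.48 - women_share * bush_women) / men_share     # about 52 percent

# Reweight the same preferences to an assumed 54/46 gender mix.
kerry_adj = 0.54 * kerry_women + 0.46 * kerry_men
bush_adj = 0.54 * bush_women + 0.46 * bush_men
print(f"Implied men:  Kerry {kerry_men:.1%}, Bush {bush_men:.1%}")
print(f"Reweighted:   Kerry {kerry_adj:.1%}, Bush {bush_adj:.1%}")
```

Under those assumptions, correcting the gender mix trims Kerry's midday lead by roughly half a point -- a nudge, not a reversal.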
A more interesting set of numbers is available in this PDF file, which shows what the NEP knew later that evening -- at 7:33 p.m. Eastern time, when it had interviewed 11,027 people. By this time, the poll had a more natural distribution by gender -- 54 percent women, 46 percent men. Women still preferred Kerry to Bush, but what's interesting is that even at this late hour, with a better gender breakdown, Kerry still led Bush by a 3-point margin, 51 to 48. Because the NEP's exit poll included so many interviews, it had a relatively low margin of error -- only 1 percent. Kerry's lead in the poll exceeded that margin.
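As a sanity check on that figure, the standard margin-of-error arithmetic for a sample of 11,027 comes out to just under a point, assuming a simple random sample; the true margin would be somewhat wider because exit polls sample precincts in clusters. A minimal sketch:

```python
import math

# Sanity check on the roughly 1-point margin of error, assuming a simple random
# sample. Real exit polls sample precincts in clusters, which widens the true
# margin, so treat this as a lower bound.
n = 11027    # interviews completed by 7:33 p.m. Eastern
p = 0.51     # Kerry's share in the poll

moe = 1.96 * math.sqrt(p * (1 - p) / n)    # 95 percent half-width
print(f"Margin of error: +/- {moe:.2%}")   # about +/- 0.9 points
```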
It isn't exactly earth-shattering news that the NEP had Kerry so far ahead of Bush at this late hour; many reporters who had access to the numbers (Salon was not a subscriber) have suggested that the polls showed Kerry winning, and Steve Coll, the managing editor of the Washington Post, disclosed the numbers in an online chat on the paper's Web site. Still, the documents obtained by Scoop underline the starkness of the case -- as of the close of voting on the East Coast on Election Day, national political reporters and operatives in both campaigns had every reason to believe that Kerry was going to win and Bush was going to lose.
So what happened? Were the polls wrong, or were the election results wrong? Since the vote, a small group of pollsters, statisticians, mathematicians and political bloggers has been ruminating over these questions in great detail online, poring over all the available evidence to determine why the polls said what they did on Election Day -- and whether we should be mad at the NEP for screwing up the exit polling, or at Bush for stealing the election. Their work, while valiant, has not come to much: they've been hampered by the secrecy of the NEP, which has offered vague suggestions that the poll was flawed but no specific, comprehensive explanation of how and why it showed Kerry ahead. (Officials at the NEP, including its pollster, Warren Mitofsky, were not available to comment for this article. In November, Joe Lenski, one of the pollsters working on the exits, told me that the NEP would be conducting an in-depth study to determine if and how the poll failed; no such study has been released to the public.)
At present, then, despite the new data obtained by Scoop and the work of online analysts, we can't really say much about how, why or whether the polls failed, or what we ought to conclude about the sanctity of this election. But we do know some things for sure. First, according to the Washington Post, the NEP conducted an internal review of 1,400 precincts in which it interviewed voters, and it determined that on average, Kerry's share of the vote was overstated by 1.9 percent.
Why did this happen? In the few interviews he's given, Mitofsky has suggested that Kerry voters were simply more cooperative with pollsters than Bush voters. "In an exit poll, everybody doesn't agree to be interviewed," he told PBS's "NewsHour" shortly after the election. "It's voluntary, and the people refuse usually at about the same rate, regardless of who they support. [But] when you have a very energized electorate, which contributed to the big turnout, sometimes the supporters of one candidate refuse at a greater rate than the supporters of the other candidate." Mitofsky added that he recognized this problem during the day, and that he warned "members" of the NEP -- that is, the major television networks and the Associated Press -- but not "subscribers," which included most major newspapers. That, he said, was a mistake, but he nevertheless considered the polls a success because media organizations did not make projections based on the data. "There were no mistakes in the projections. We were very cautious with them, and none were wrong, even though the exit polls did overstate Kerry in a number of states."
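To see how Mitofsky's explanation works mechanically, consider a toy example with made-up response rates: even in a precinct that actually split 50-50, a modest gap in willingness to talk to interviewers tilts the sample.

```python
# A toy version of Mitofsky's differential-response explanation, with made-up
# numbers: a precinct that actually split 50-50, where Kerry voters agree to be
# interviewed slightly more often than Bush voters.
actual_kerry, actual_bush = 0.50, 0.50       # true split (hypothetical)
response_kerry, response_bush = 0.56, 0.50   # assumed completion rates

sampled_kerry = actual_kerry * response_kerry
sampled_bush = actual_bush * response_bush
poll_kerry = sampled_kerry / (sampled_kerry + sampled_bush)
print(f"The exit poll would show Kerry at {poll_kerry:.1%}")   # about 52.8%
```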
Mitofsky raises a good point -- even if the exit polls overstated Kerry's margins, how "wrong" were they? On Election Day, were the exit polls all but guaranteeing a Kerry win? Are the election results so far off the exit poll projections that we should question them?
The best-known amateur analysis of the exit polls -- conducted by Steven Freeman, a professor at the University of Pennsylvania's Center for Organizational Dynamics -- argues that the discrepancies in the poll results were too large to have occurred by chance, and that some other cause (oh, say, widespread vote rigging) is the likelier scenario. Freeman, who looked at state-by-state exit poll data collected from CNN's Web site during the course of Election Day -- data apparently released by mistake -- examined the difference between the predicted results and the actual results in Ohio, Florida and Pennsylvania. The chance that Kerry's predicted vote could have diverged from his actual count simultaneously in all three states was vanishingly small, he concluded in a PDF document that was widely passed around online. He initially put the odds of such an error occurring by chance alone at 250 million to 1, though he later revised the number down to 662,000 to 1. "As much as we can say in social science that something is impossible, it is impossible that the discrepancies between predicted and actual vote counts in the three critical battleground states of the 2004 election could have been due to chance or random error," he wrote.
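Freeman's headline odds come from combining the three states' discrepancies, roughly like this: convert each state's exit-poll-versus-official gap into a probability of occurring by chance, then multiply the three probabilities together on the assumption that the state polls err independently. The sketch below uses invented z-scores purely to show the mechanics; they are not Freeman's figures.

```python
from math import erf, sqrt

# A sketch of the kind of calculation behind Freeman's odds, NOT his actual
# figures: turn each state's exit-poll-versus-official discrepancy into a
# one-tailed probability, then multiply the three probabilities, assuming the
# state polls err independently of one another.
def tail_prob(z):
    """Probability of a normal deviate at least z standard errors out."""
    return 0.5 * (1 - erf(z / sqrt(2)))

# Invented z-scores, for illustration only.
z_scores = {"Ohio": 2.0, "Florida": 2.2, "Pennsylvania": 2.4}

combined = 1.0
for state, z in z_scores.items():
    p = tail_prob(z)
    combined *= p
    print(f"{state}: z = {z:.1f}, tail probability = {p:.4f}")

print(f"Combined probability, if independent: {combined:.2e}")
print(f"Roughly 1 in {1 / combined:,.0f}")
```

The multiplication is what produces such dramatic odds: three individually modest discrepancies, assumed independent, compound into hundreds of thousands to one.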
To listen to Freeman, then, you might think we have a problem with the election. But other experts aren't so sure. Mark Blumenthal -- the Democratic pollster whose blog, Mystery Pollster, has followed the exit poll story more closely than any other outlet -- questions Freeman's analysis of those three states. Blumenthal found no statistically significant discrepancies in Florida, Ohio and Pennsylvania. In other words, although Kerry's projected vote was overstated there, the numbers were still within the margin of error for the polls conducted in those states. (The state polls have larger margins of error than the national poll; each state's margin depends on the number of interviews conducted there, which can be found in this methodology statement released by the NEP.)
"In a lot of ways, this is a lot of sound and fury, signifying not very much," Blumenthal says, because even if the poll showed Kerry ahead, he wasn't ahead by enough in the important states to have those states called for him. "I remember looking at the numbers as they were coming in at maybe 6 that night," Blumenthal says. "And all I cared about was Florida, Ohio, Wisconsin, Iowa, those states. Everything else doesn't matter. And I remember looking at it and thinking that these numbers aren't big enough to know that Kerry is going to win. I've lived through enough election nights to know that what you know at 5 doesn't necessarily end up happening later on."
On his blog, Blumenthal has pointed out that the polls showed Kerry ahead in only four states that he eventually lost -- Ohio, Iowa, Nevada and New Mexico. But in none of those states was Kerry's lead greater than the poll's margin of error. "The exit polls had Kerry ahead by 4 percentage points in Ohio, by 3 in New Mexico, by 2 in Iowa and by 1 point in Nevada," he wrote. But the margin of error for the polls in those states was at least 5 to 7 points -- Kerry's 4-point lead in Ohio, for example, is simply not statistically significant.
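To see why a 4-point lead can be statistically meaningless, here is a rough margin-of-error calculation for a single state. The sample size and design effect below are assumptions for illustration -- the actual per-state figures are in the NEP methodology statement -- but they show how the margin on the gap between two candidates easily reaches 5 or 6 points.

```python
import math

# Rough margin of error for a single state's exit poll. The sample size and
# design effect are assumptions for illustration; the NEP methodology statement
# has the actual per-state figures.
n = 2000               # assumed interviews in one state
design_effect = 1.7    # assumed inflation for clustered precinct sampling
p = 0.5                # worst-case proportion

moe_share = 1.96 * math.sqrt(design_effect * p * (1 - p) / n)
moe_gap = 2 * moe_share   # margin on the gap between two candidates
print(f"Margin on one candidate's share: +/- {moe_share:.1%}")
print(f"Margin on the Kerry-Bush gap:    +/- {moe_gap:.1%}")
# With these assumptions the gap margin is about +/- 5.7 points, so a 4-point
# lead would not be statistically significant.
```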
Blumenthal points out, too, that there's a prevailing belief among people who question the election results that we should trust the poll numbers over the official count because the polls are hardly ever wrong. But they're frequently wrong, he says -- as many newspaper reporters can attest. To cite just one example of the many he lists on his blog: in 1992, the national exit poll overstated Bill Clinton's margin by 2.5 percent. The only reason there wasn't a hue and cry about it was that Clinton won.
But Blumenthal is not dismissive of the concerns of those who say that the differences between the exit polls and the actual results should cause us to distrust the election. These theories, he says, have been fostered by the NEP's secrecy. If the NEP gave us some plausible explanation for what happened and why, perhaps it could put many fears to rest.
It's unclear whether that will happen anytime soon. Mitofsky, who led the exit polling, is said to be thin-skinned about his work; there's no sign he's willing to take part in an open, public examination of what went wrong. Indeed, just this week, when Mickey Kaus, Slate's in-house blogger, chided Mitofsky for misinforming the newspapers about the state of the race on Election Day, Mitofsky fired off an angry e-mail defending his work and attacking Kaus. "If my clients were as misinformed as you seem to think how come none of them announced an incorrect winner from the 120 races we covered that day?" he said. "It seems that the only ones confused were the leakers and the bloggers. I guess I should include you in that list, but I'll bet you don't make mistakes. We have never claimed that all the exit polls were accurate. Then again, neither is your reporting."
It doesn't give one much hope for getting to the bottom of things.