Twice in the past nine months, polling firms have had their results questioned after people with a mathematical bent analyzed their data. The most recent case has resulted in a high-profile lawsuit by the liberal political blog Daily Kos.
The first case involved Strategic Vision. Nate Silver, the statistician behind FiveThirtyEight, based his claim on a statistical analysis of Strategic Vision's results. By comparing the firm's numbers with the way genuinely random data behaves, he showed that they were incredibly unlikely to be real, unmanipulated polling data: "millions to one against," Silver wrote.
Sets of numbers from polling data, when compiled over time, will conform to certain rules of random distribution. But when people manipulate or make up data, they don’t end up with that same randomness — humans tend toward patterns.
The classic example is the Gambler's Fallacy: people think that if a coin toss comes up heads a few times in a row, it's more likely to come up tails next. Of course, randomness doesn't work that way. So when people fake data, they inject the patterns they mistakenly think randomness should show.
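To make this concrete, here is a minimal sketch of one common fabrication check: tallying the trailing digits of reported results. This is an illustration of the general technique, not a reconstruction of Silver's actual analysis, and the digit weights used to model a human fabricator below are invented for the example, not drawn from any real poll.

```python
import random
from collections import Counter

def chi_square_uniform(digits):
    """Chi-square statistic measuring how far digit counts stray from uniform.

    For 1000 genuinely random decimal digits, this hovers near its
    expected value of 9 (degrees of freedom); large values suggest
    the digits were not drawn uniformly at random.
    """
    counts = Counter(digits)
    n = len(digits)
    expected = n / 10
    return sum((counts.get(d, 0) - expected) ** 2 / expected for d in range(10))

random.seed(0)

# Genuinely random trailing digits.
true_random = [random.randrange(10) for _ in range(1000)]

# A crude, invented model of human-fabricated digits: over-using a few
# favorite values (here 2, 3, 5, and 7) instead of drawing uniformly.
fabricated = random.choices(range(10), weights=[1, 1, 3, 3, 1, 3, 1, 3, 1, 1], k=1000)

print(chi_square_uniform(true_random))  # small: consistent with uniform digits
print(chi_square_uniform(fabricated))   # large: the preference pattern shows through
```

The point is not the exact statistic but the asymmetry: it is easy for humans to produce numbers that *feel* random, and very hard to produce numbers that *are* random under this kind of scrutiny.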
Silver’s exacting mathematical analysis revealed serious problems, and the stakes were high. Pollsters can have a huge effect on behavior, because media outlets, from newspapers and TV to radio and the Web, report polls and set the tone for what the country thinks about political issues and candidates. Silver later noted that after his report, Strategic Vision seemed to have stopped doing any regular polling, although the firm did release a poll of Georgia voters in March.
Now another pollster seems to have problems, as well. Daily Kos announced Tuesday that it would be suing Research 2000, a pollster it had paid to produce its "State of the Nation" poll for a year and a half.
Here again, statistical analysis revealed serious problems. A report prepared for Daily Kos found significant deviations from the patterns genuinely random data produces. But this time, it’s more than just a blogger shedding light on a problem: Daily Kos says "we were victims of that fraud," and is taking Research 2000 to court.
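One telltale deviation is numbers that are too *stable*: real polls bounce around because of sampling error, while invented results often move suspiciously little. The sketch below illustrates that idea in general terms; the sample sizes, seeds, and the `fabricate_polls` model are illustrative assumptions, not figures from the actual report prepared for Daily Kos.

```python
import random
import statistics

def simulate_polls(true_support=0.50, n=1000, weeks=30, seed=1):
    """Honest weekly polls: each week is a fresh sample of n respondents."""
    rng = random.Random(seed)
    return [sum(rng.random() < true_support for _ in range(n)) / n
            for _ in range(weeks)]

def fabricate_polls(true_support=0.50, weeks=30, seed=2):
    """A crude fabrication model: tiny hand-picked wiggles around a target,
    far smaller than real sampling error would produce."""
    rng = random.Random(seed)
    return [true_support + rng.choice([-0.002, 0.0, 0.002]) for _ in range(weeks)]

honest = simulate_polls()
faked = fabricate_polls()

# With n = 1000 and support near 50%, sampling error alone gives a
# week-to-week standard deviation of about sqrt(0.5 * 0.5 / 1000) ≈ 0.016,
# i.e. roughly 1.6 percentage points of natural bounce.
print(statistics.stdev(honest))  # near 0.016: normal sampling noise
print(statistics.stdev(faked))   # far smaller: suspiciously steady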
Many people say they don’t trust polls: surveys can be biased, questions poorly worded, and samples unrepresentative. But the media often rely on polls for at least a basic idea of where the public stands, and report stories built on that idea.
The public needs to be scrupulous, then, in examining pollsters. We can thank these statisticians for doing the number-crunching that exposed the problems. And everyone can ask for more: demand that pollsters be completely transparent about how they conduct their surveys, so that analysts like Nate Silver can regularly check their data.
If you see a news story citing a certain pollster and want to check up on it, FiveThirtyEight has recently updated its pollster ratings, a scorecard of how reliable each pollster’s publicly released pre-election polls have been at predicting elections. The site also regularly questions inconsistencies in polls.
Keep yourself informed, and you’ll be less likely to be misled.