In the days leading up to the 2012 presidential election, many pollsters and political pundits predicted that the race would be one of the closest in recent memory. But one statistician correctly predicted the outcome of the election in 49 of the 50 states before voting even began.
Using data analysis, Nate Silver, a New York Times blogger, gave President Barack Obama a roughly 90% chance of being re-elected. At the time of this writing, Silver’s state-by-state projections have proven correct in every state except Florida, where it’s too early to know whether he’s correct because votes are still being tallied.
Because of the success of Silver’s projections – which some political pundits had maligned – this election is now being described not only as a win for Obama, but also as a win for big data and the “quants” of the world.
“It’s not just about Nate Silver, but the use of statistics in general and to express prediction in terms of probabilities,” Andrew Lipsman, vice president of marketing and industry analyst at ComScore, tells AdAge. “It’s interesting because so much of how the election moves is driven by the media narrative, and in some instances it can be a self-fulfilling prophecy. The Romney momentum narrative didn’t exist.”
While much of the media coverage focused on descriptions of an expected tight race in Ohio, for example, Silver’s big data analytics model predicted that Obama was ahead.
An article in CNET notes that Silver knows this via a “painstaking analysis of every poll of the Buckeye State available to him, and 100,000 simulated elections that showed, when all was said and done, that the most crucial state in this year’s election, one that Romney could not win without, was not the nail biter many said it was, but rather a comfortable lead for the president.”
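The simulation step CNET describes can be sketched in miniature. The code below is illustrative only, not Silver’s actual model: it assumes a single state’s polling-average margin and an uncertainty around that margin (both numbers hypothetical), then estimates a win probability by drawing many simulated election outcomes.

```python
import random

def simulate_state(mean_margin, std_dev, n_sims=100_000, seed=42):
    """Estimate a candidate's win probability in one state by Monte Carlo.

    mean_margin: the polling-average margin in percentage points
    (positive favors the candidate); std_dev: an assumed uncertainty
    around that margin. Both values here are hypothetical inputs, not
    figures from Silver's model.
    """
    rng = random.Random(seed)
    # Draw a simulated final margin for each run; a positive draw is a win.
    wins = sum(1 for _ in range(n_sims) if rng.gauss(mean_margin, std_dev) > 0)
    return wins / n_sims

# A hypothetical Ohio-like state: candidate up ~3 points, ±3.5 uncertainty.
prob = simulate_state(3.0, 3.5)  # roughly 0.8 under these assumptions
```

Run enough simulations and a seemingly tight topline margin can still translate into a decisive win probability, which is the point the CNET passage makes about Ohio.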
Silver’s success means a lot for big data, analytics and mathematical models, and it is likely to have broad implications for future elections and election coverage. Silver’s model takes human bias out of the equation, Mashable notes.
“The ever-present temptation to cherry-pick polls is subverted. You set your parameters at the start, deciding how much weight and accuracy you’re going to give to each poll based purely on their historical accuracy,” according to Mashable. “You feed in whatever other conditions you think will matter to the result. By 2016, if the networks are paying attention, don’t be surprised to see that the talking heads are all Nate Silver clones. Every media organization will now want its own state poll-based algorithm.”
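The weighting idea in the Mashable passage can be illustrated with a minimal sketch. This is not Silver’s method; the poll margins and weights below are hypothetical, with each poll’s weight fixed in advance to stand in for its pollster’s historical accuracy.

```python
def weighted_polling_average(polls):
    """Combine poll results into one estimate via a weighted average.

    polls: list of (margin, weight) tuples, where margin is the poll's
    reported margin in percentage points and weight is an accuracy score
    set before the analysis begins (higher = historically more accurate).
    """
    total_weight = sum(weight for _, weight in polls)
    return sum(margin * weight for margin, weight in polls) / total_weight

# Three hypothetical polls of the same race, weights chosen up front.
polls = [(2.0, 0.9), (4.0, 0.6), (1.0, 0.5)]
avg = weighted_polling_average(polls)  # 2.35
```

Because the weights are parameters fixed at the start, an analyst cannot quietly discount an unfavorable poll after seeing its result, which is the cherry-picking temptation the quote says the approach subverts.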
- Subscribe to our blog to stay up to date on the latest insights and trends in data analysis and big data analytics.
- Please join us on Thursday, November 15th at 11 a.m. EST for our complimentary webcast, “Structured + Unstructured Data: Creating Greater Value with Big Data Variety,” presented by Syed Mahmood, Senior Product Marketing Manager, TIBCO Spotfire; Rik Tamm-Daniels, VP of Technology, Attivio; and Parul Sharma, Solutions Manager, 3K Technologies. In this webcast, Syed Mahmood of TIBCO Spotfire and Rik Tamm-Daniels of Attivio will discuss data source trends and how enterprises can leverage non-conventional data sources to uncover deeper insights. Then, Parul Sharma of 3K Technologies will demonstrate how the Spotfire analytics platform and Attivio’s Active Intelligence Engine (AIE) can be used to extract valuable insights by combining structured and unstructured data.
Spotfire Blogging Team