
Nate Silver crunches the numbers

Political junkies look to Brooklyn-based blogger who correctly predicted 49 out of 50 states in 2008 election
By Luiza Ch. Savage

When a string of new polls came out this week showing Mitt Romney making major gains on the heels of his aggressive debate performance against a subdued President Barack Obama, there was a sense of panic among the President’s supporters. “Has any candidate lost 18 points among women voters in one night ever?” fretted commentator Andrew Sullivan. “On every single issue, Obama has instantly plummeted into near-oblivion.”

Democrats mourned and Republicans gloated, but one voice stayed calm amidst the furor.

Nate Silver, a 34-year-old Brooklyn-based statistician and blogger who correctly predicted the results of 49 out of 50 states and every Senate race in the 2008 election, tried to cool emotions on Monday. “According to Twitter, Barack Obama went from a huge favorite at 1 p.m. to a huge underdog at 4 p.m.,” Silver tweeted. “Get a grip, people.”

Silver’s blog, FiveThirtyEight.com (named for the number of electors in the U.S. Electoral College system that technically elects presidents), was licensed by the New York Times after his 2008 success. In a post on Sunday, he counselled caution in over-interpreting the latest polls: “Polling data is often very noisy, and not all polls use equally rigorous methodology. But the polls, as a whole, remain consistent with the idea that they may end up settling where they were before the conventions, with Mr. Obama ahead by about two points. Such an outcome would be in line with what history and the fundamentals of the economy would lead you to expect.”

It was the kind of cool-headed, data-driven, politically agnostic analysis that has made Silver’s work a must-read for political junkies. But his passion for numbers transcends political polls. A few years after graduating from the University of Chicago with an economics degree, Silver quit a job as an economic consultant to make a living playing online poker and obsessing over baseball statistics. He designed a statistical system for predicting baseball player performance that was eventually purchased by Baseball Prospectus magazine, and he co-authored several books on baseball analysis.

Now Silver has come out with a new book on the art and science of forecasting. In The Signal and the Noise: Why So Many Predictions Fail—But Some Don’t, Silver travels the globe investigating forecasting in fields ranging from hurricane and earthquake prediction to finance and national security. Along the way, he explores why ratings agencies missed the risks in mortgage-backed securities, why national weather forecasts are more accurate than the ones on local TV news, and why the more you see a political pundit prognosticating, the more likely he or she is to be wrong.

Silver argues that in the era of “Big Data”—when 2.5 quintillion bytes of data are generated each day—human judgment matters more than ever. “We have more information than we’ve ever had before,” Silver says in an interview, “but we’ve got the same brains we had as cavemen.” Those brains are wired to look for patterns. The trick is to train them to distinguish real patterns from randomness, the “signal” from the “noise.” And he lays out ways for the rest of us to become better at judging everything from the latest presidential poll to the likelihood that your spouse is cheating on you if you happen to find a pair of unfamiliar underwear in your home. (He has a mathematical theorem for the latter; more on that later.)

The future is not knowable, but we can get better at forecasting it. “Between ignorance and knowledge,” quips Silver, “is probability.” First, though, Silver wants you to forget what you learned in your statistics class in high school or university.

“For some reason statistics and probability are taught in a very abstract way,” he complains. “In contrast, think about the way you learn to read—you don’t start by memorizing syntax and grammar rules.” To improve your ability to make decisions about the future, Silver counsels joining a fantasy sports league or playing poker. “Anything that gets you thinking probabilistically,” he says. And anything that allows you to test your predictions, discover your biases and acquire humility.

The best forecasters, he argues, “are comfortable with shades of ambiguity.” He endorses the work of Berkeley psychologist Philip Tetlock, who divides prognosticators into two styles of thinking—“hedgehogs,” who are seized with a grand theory or idea as the basis for their predictions, and “foxes,” who are open to shades of grey and constantly question their own biases, revise their predictions to take account of new information, and are able to express their predictions probabilistically. The foxes make better forecasters.

“Keep asking yourself, ‘What is my process for improvement and getting smarter in light of my biases? What is my process for evaluating how well I do?’ ” says Silver.

His book is one part how-to guide and several parts inquisitive math nerd’s exploration of forecasting disciplines. Silver calls on meteorologists, NASA climate scientists, professional poker coaches and epidemiologists who predict mutations of flu strains. He interviews a Canadian who was one of the chief engineers who programmed a computer that beat chess masters by anticipating their moves, and a Winnipegger who moved to L.A. and made millions betting on basketball games. At one point, he sits down with Donald Rumsfeld to discuss “unknown unknowns.”

Silver’s approach is summed up in an equation called Bayes’ theorem, which provides a systematic way of estimating how much a new piece of evidence changes the probability of something happening—given what you already know. For example, in the case of mysterious underwear, the likelihood that your spouse is cheating on you will depend heavily on what you estimated that likelihood to be before you found the underwear. If you were already suspicious of your spouse, the formula will yield a high probability that the underwear is incriminating. If you had no reason to doubt your spouse’s loyalty, the underwear is much less likely to be a smoking gun. “It’s a little mathematical tool to tell you how much you should change your views when you encounter new information given how much you already know,” he says.
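As a rough sketch of how the arithmetic plays out, the snippet below applies Bayes’ theorem to the underwear scenario. The prior beliefs and likelihoods are illustrative assumptions chosen for the example, not figures taken from Silver’s book.

```python
# Bayes' theorem: update the probability of a hypothesis (cheating)
# given a new piece of evidence (the unfamiliar underwear).
# All numbers below are illustrative assumptions, not Silver's own.

def posterior(prior, p_evidence_if_true, p_evidence_if_false):
    """Return P(hypothesis | evidence) via Bayes' theorem."""
    numerator = prior * p_evidence_if_true
    denominator = numerator + (1 - prior) * p_evidence_if_false
    return numerator / denominator

# A suspicious spouse: prior belief of 30% that cheating is going on.
print(posterior(0.30, 0.50, 0.05))   # ~0.81 -- the underwear looks damning

# A trusting spouse: prior belief of only 4%.
print(posterior(0.04, 0.50, 0.05))   # ~0.29 -- suspicious, but far from proof
```

The same piece of evidence moves the two spouses to very different conclusions, which is exactly the point: how much a new fact should change your mind depends on what you believed before you learned it.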

This Bayesian approach is at the heart of Silver’s political forecasting. For example, two days after Romney’s successful debate performance, Obama got the good news that the U.S. unemployment rate had dropped from 8.1 per cent to 7.8 per cent. Democrats trumpeted the change, while Republicans suggested the number was insignificant at best, rigged at worst.

Silver emphasizes that in building a good forecasting model, it’s important to avoid partisan bias in judging how important the new piece of information is. To do that, he sets up rules for how his model will treat new information, such as economic numbers, before he knows what those numbers will be. His model assigns importance to economic data based on how correlated they have been with election outcomes in the past. Likewise, the importance that he assigns to polls depends on how accurate the polling firm’s results have been in prior elections. “We are not taking each piece of information as being equally valuable,” he says.
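The toy sketch below illustrates the weighting principle Silver describes, assuming a handful of hypothetical pollsters and reliability weights; it is not FiveThirtyEight’s actual model, just the general idea of trusting each source in proportion to its past track record.

```python
# Weight each new input (here, polls) by how well that source has
# tracked actual outcomes in past elections. Pollster names, margins
# and weights are hypothetical, purely for illustration.

polls = [
    # (pollster, candidate's margin in points, historical accuracy weight)
    ("Pollster A", +3.0, 0.9),
    ("Pollster B", -1.0, 0.5),
    ("Pollster C", +1.5, 0.7),
]

weighted_margin = (
    sum(margin * weight for _, margin, weight in polls)
    / sum(weight for _, _, weight in polls)
)
print(f"Weighted average margin: {weighted_margin:+.1f} points")
```

A simple unweighted average would treat the unreliable pollster as seriously as the reliable one; the weighted version lets the better track record pull harder on the estimate.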

By Monday, Silver’s model showed that Romney’s chances of winning the election had nearly doubled from before the debate, rising from 13.9 per cent to 25.2 per cent. Good news for Romney, but hardly reason for Democratic panic.

But not every field of prediction has the advantage of vast swathes of historical data to help build and refine forecasting models.

I ask Silver about the President’s predicament when he had to decide whether to order a risky commando raid on the compound in Abbottabad at a time when his advisers could only estimate the chances Osama bin Laden was there at 40 to 70 per cent. In a situation where there is little hard data, or you are forecasting things that occur at rare intervals, Silver says, “I think you have to be more focused on process.” And we’re back to talking about foxes and hedgehogs. Says Silver: “You have to be a fox.”