Approximately 50 times a day, I visit a handful of presidential election–forecasting websites to remind myself of Joe Biden’s chances of becoming the next president. Honestly, I’m not sure why I do it to myself.
Am I using this information to make up my mind? Please; Election Day is tomorrow, and my mind has been made up for more than 1,450 days. Am I seeking more information to shape my advice to other people? No; I would, in any probabilistic scenario, advise everybody to vote, specifically for the person endorsed by my employer. Am I trying to calibrate the anxiety I should feel, right now, about an event over which I have almost no control? Yep, that sounds right.
And yet, when the results are in, I won’t have any way of knowing if the forecasts were a useful guide for my anxiety. For instance, the smartest modelers out there say Biden has roughly a 90 percent chance of winning the presidential election. If Biden wins, it won’t necessarily mean the forecasts were correct. Maybe his “real” odds were closer to 50 percent (or 99.9 percent). If Trump wins, it won’t necessarily mean his 10 percent odds were wrong, either. One-in-10 events happen all the time. Just now, I flipped a coin three times, and it landed heads-heads-tails. The odds of that exact sequence: 12.5 percent. Unlikely things happen so constantly that you could say reality is mostly one unlikely thing after another.
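The arithmetic behind those two claims, the 12.5 percent coin sequence and the routine occurrence of one-in-10 events, fits in a few lines of Python (a quick illustrative sketch):

```python
# Three independent coin flips, each with probability 1/2, so any
# exact sequence (such as heads-heads-tails) has probability (1/2)**3.
p_exact_sequence = (1 / 2) ** 3
print(p_exact_sequence)  # 0.125, i.e. 12.5 percent

# The "one-in-10 events happen all the time" point: across just ten
# independent 10-percent shots, the chance that at least one lands
# is 1 - 0.9**10, roughly 65 percent.
p_at_least_one = 1 - 0.9 ** 10
print(round(p_at_least_one, 2))  # 0.65
```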
Obsessing over the probabilities of unique, one-off events over which we have infinitesimally little individual control, such as a presidential election, is a metaphysically strange thing to do. Still, everybody I know, including me, has been doing it for weeks.
I wanted to know if America’s forecast addiction was understandable, paradoxical, or pathological. So I called Jordan Ellenberg, a mathematician and University of Wisconsin professor, who wrote a best-selling book about thinking mathematically. This interview has been condensed and edited.
Derek Thompson: You are a highly credentialed mathematician, a professor, and a best-selling author on the subject of math. So I want to know—emotionally, psychologically—what it means to you that Joe Biden has a roughly 90 percent chance of victory.
Jordan Ellenberg: You’re asking if a trained mathematician has a finely tuned sense of how to feel about the difference between Biden having an 80 percent chance of victory and a 90 percent chance of victory. And I’m not sure that I do.
I think it’s actually more helpful to be a baseball fan than a mathematician in this circumstance. If you have a long experience of watching baseball, you have a sense of how nervous to be in a given situation. If your baseball team is up two runs and pitching in the bottom of the ninth inning, you are very likely to win the game. [Ed: About 95 percent, in fact.] Now, there are many, many games in baseball history where teams have come back to win after being down two runs in the bottom of the ninth. But that doesn’t mean you should be scared every time your favorite team is up in that scenario.
Thompson: I’m glad you mentioned fear, because when I’m looking at an election forecast, the numbers cash out as feelings. Ninety percent odds of a Biden win? Okay, that makes me happy. But 70 percent? Now I’m anxious. And 20 percent? Terrified. It’s almost as if these election-forecasting sites should express their probabilities in emojis.
Ellenberg: According to some philosophers of mathematics, probability is a measure of your feelings. It’s a measure of your degree of belief in some proposition. That’s all it is.
As a teacher of math, you’re always trying to emphasize that every single mathematical formalism in the world was developed from a real problem. People didn’t decide to come up with abstract concepts for no reason. We have a formal theory of probability, and it started with dice games, with people who were gambling. They were trying to understand: What should guide my actions? What should guide my decision making? That’s where our tradition of probability theory starts. It’s notable that probability theory comes incredibly late in the history of mathematics, around the middle of the 17th century. That’s a sign, possibly, that it’s very difficult to think about probabilities intuitively. As opposed to, say, an older subject in mathematics, like geometry, where we can more easily grasp what things are shaped like. Probabilities are hard to think about, even for experts.
Thompson: Let’s go back to 2016. Did you feel that, compared with your friends and family, being a mathematician made you more prepared for Trump’s win? Were you more attuned to the possibility of the less likely event?
Ellenberg: You know, I don’t want to pat my own back. But I think I did a good job in 2016—and am doing a good job now, not only with the election but also with the pandemic—of having a certain level of epistemic humility about the world. It makes sense to me to say: There is a good reason to bet on X, but we don’t know for sure. There are so many people, especially pundits on TV, saying: “Now look, this is what’s going to happen.” And then somebody else says, “No, this is going to happen.” That’s a very different perspective to have on the future: to believe that a clear answer exists, if only we’re clever enough to see it.
Thompson: You’re describing two very different mindsets, or frameworks of thinking, about the future. So, let me try an analogy here. Some people are optimistic, and some are pessimistic; that’s one psychological spectrum. What you’re describing is another psychological spectrum: Let’s call it the epistemic-humility axis. Or, the who-the-hell-knows axis. Some people think about the future like a perfectly solvable equation and think: There is one answer here, and if I think hard enough, I’m going to solve it. But some people think about the future, and, even as they’re trying to discern what’s going to happen, they’re focused on what they don’t know, or cannot know.
Ellenberg: Yes, I like that. I think these are orthogonal axes. A pessimist can be overly certain about the future, or a pessimist can be epistemically humble. But that humility doesn’t come naturally to us. It’s exhausting to be epistemically humble all the time.
I think about this a lot in the current pandemic. You see a lot of analysts saying, “If I think about it hard enough, I can figure out why this state got hit harder than that state, and why this country got hit harder than that other country.” They think that if they look hard enough, and study the curves and the policies, they’re going to figure out the entire difference between the Netherlands and Belgium. Maybe they will. But also, maybe random individual choices and dumb luck play a role in the spread of a pandemic. The truth is that there is a lot of stochasticity in the world. You have to be prepared to live in a world where strange things happen, and you can’t immediately and totally explain them.
Thompson: Speaking of opposite ends of the epistemic-humility spectrum: I was listening to the FiveThirtyEight podcast the other day, and the hosts were saying that people in their lives were coming up to them and asking: “I know your site says Biden has an 87 percent chance of winning, but what’s really going to happen?” I thought that was pretty funny. Here you have readers of a probability website, asking the site’s methodologists to discard probability and just tell them the one thing that’s going to happen. As if the forecasting model is a ruse, and the Wizard of Oz is behind the curtain looking at the one true crystal ball. That seems to be an illustration of the phenomenon you’re describing. Many normal, smart people just want to know the one answer.
Ellenberg: It is funny. It’s also normal. We feel differently about probabilities, depending on the circumstance.
In many contexts, we overlook small odds. FiveThirtyEight gives Trump a 3 percent chance of winning New Mexico. Most people are going to look at that and say: “Okay, Trump isn’t going to win New Mexico.” I would not complain if you said that.
But now think about COVID. If you’re a 65-year-old person, you have a 99 percent chance of survival if you get this virus. The odds of death are smaller than the odds of Trump winning New Mexico, which I just said you can maybe ignore. But a 1 percent chance of your own death in two weeks? That’s a completely different risk. People go out of their way to avoid things that give them a one-in-100 chance of dying. When they don’t, we call them idiots. So the nature of the event is highly relevant to your feelings, as it should be! Even the most hard-core rationalist would never say you should feel the same about a 1 percent chance of dropping dead versus a 1 percent chance of losing your car keys.
Thompson: So probabilities are almost like guides for our feelings about the future?
Ellenberg: I would put it even more strongly: Feelings are for the same thing that math is for. They’re both for guiding your decisions and helping you select actions and helping you understand things. Relevant to your decision making is how strongly you feel about the outcome. So, yes, probabilities are about feelings.
Thompson: Any last words of advice for people who have already voted and are sitting at home frantically trying to read every poll to divine a future that they cannot control?
Ellenberg: A good mental-health question to ask yourself is: What am I actually gaining from trying to figure this out now? None of us sitting at home is going to decide the election. The meaningful actions we’re going to take in support of our preferred candidate at every level have mostly already been taken or decided. So what are we gaining? We’re all about to find out the answer. Our epistemic situation when we know the outcome of the election will be exactly the same no matter how hard we think about it right now. Our stress affects nothing.