The "18" Numerals are turned on after being delivered to be placed atop 1 Times Square for the New Year's Eve ball drop in New York City, on December 13, 2017.Brendan McDermid / Reuters

2017 was a wild ride, and 2018 doesn’t seem inclined to put on the brakes. Who could have guessed last year that Matt Lauer would go from Today to yesterday—felled, along with Harvey Weinstein, Al Franken, Bill O’Reilly, and so many others, by the open discussion of their creepy “open secrets”? That FBI Director James Comey would be fired? That Facebook would find Russian influence operations had reached more than 126 million Americans? That ISIS would lose Raqqa and attack Barcelona? That a Nobel Peace Prize winner in Burma would allow ethnic cleansing within her borders? That a Congress with a 19 percent approval rating would pass a tax bill with 25 percent approval under a president with just 32 percent approval? That a Democrat would be elected senator from Alabama? Or that a Republican accused of molesting children would nearly win that seat?

Predicting is one tough business. Intelligence analysts work on it every day, trying to assess the future before it unfolds. The rest of us get a small taste of this dicey world every New Year’s Eve, when we resolve to do things differently next year. But just a week into the new year, you can already hear the sound of resolutions shattering all over the country.

A few years ago, I vowed to make daily exercise my New Year’s resolution. I was so committed, I bought a Fitbit and strapped that sucker onto my wrist, telling myself it was “exercise jewelry.” By February, I had adopted a different catchphrase: “strategic Fitbit usage plan.” I was 100 percent successful because I only wore my Fitbit on days when I was sure I’d hit 10,000 steps and get that glorious little wrist buzz. On days when I holed up to write, the Fitbit spent time “charging” in a drawer, so it didn’t count. By summer, my Fitbit was living in that drawer full-time, alongside her dust-bunny friends. I have since stopped trying to fool myself. My 2018 resolution is: Eat more chocolate.

New Year’s resolutions are predictions about the future. They are usually aspirational. And they are almost always deceptive. Like so many people, I did not end up doing what I said I would. And here’s the thing: I failed at the easiest prediction possible—me predicting me, just a few weeks into the future.

Now imagine how hard it is for an intelligence analyst to predict how other people will behave—months, even years from now. And intelligence targets don’t want to be accurately predicted. They are doing everything they can to mislead and hide from America’s clever dot-collectors and connectors.

Many factors make prediction difficult. Usually we focus on the wrong ones—like believing that people are inherently unpredictable. Sure, people often do things that you wouldn’t expect for all sorts of reasons—new options or opportunities arise, interests and affinities change, new partners or advisers exert influence, and sometimes life just intervenes. As the CIA’s Sherman Kent learned with Nikita Khrushchev back in 1962, world leaders can zig when you expect them to zag, and those unpredictable moves can be especially consequential—in good ways and bad. Kent’s shop missed signals of the Cuban missile crisis in part because the missile deployment was so out of keeping with past Soviet practice and because Kent viewed the move as “suicidal.” Mao Zedong stunned the world in 1972 when he welcomed Richard Nixon to Beijing, setting China on a path from the Cultural Revolution to the capitalist revolution. Ronald Reagan was a Cold War hawk in his first term but a peacemaker in his second, nearly reaching a remarkable deal with Soviet leader Mikhail Gorbachev at Reykjavik to abolish all nuclear weapons. More recently, Russia’s Vladimir Putin and Turkey’s Recep Erdogan have been dragging their countries backwards, from democracy to autocracy.

Yes, people can be unpredictable. But it’s the predictable weaknesses of our thinking that often blind us the most. Peering over the horizon requires overcoming the faulty wiring of our own brains.

Psychologists have found all sorts of cognitive biases that distort how we perceive and process the world around us. A big one is that we ascribe higher probabilities to events that we can easily recall—like sensational news stories. That’s why, for example, Americans are more afraid of dying in shark attacks than car accidents, even though fatal car crashes are about 60,000 times more likely. In fact, many things have a higher probability of killing you than sharks, including being trampled in a Black Friday sale. A few years ago, the world was gripped by the Ebola outbreak, which killed an estimated 11,000 people from 2014 to 2016. (“Is the U.S. Prepared for an Ebola Outbreak?” blared The New York Times.) Meanwhile, influenza, the common flu, killed about 50 times more people during the same period worldwide—somewhere between half a million and a million people.
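For readers who like to see the arithmetic, here is a minimal back-of-the-envelope sketch in Python. It simply takes the figures quoted above at face value; the flu number is an assumed midpoint of the half-million-to-one-million range, not an official count.

```python
# Back-of-the-envelope comparison of vivid risks versus mundane ones.
# Inputs are the rough figures cited in the text (the flu figure is an
# assumed midpoint of the quoted range), not official statistics.

ebola_deaths_2014_2016 = 11_000      # estimated toll of the 2014-2016 outbreak
flu_deaths_2014_2016 = 550_000       # assumed midpoint of "half a million to a million"

ratio = flu_deaths_2014_2016 / ebola_deaths_2014_2016
print(f"Flu killed roughly {ratio:.0f} times as many people as Ebola in 2014-2016.")
# -> Flu killed roughly 50 times as many people as Ebola in 2014-2016.

# The same availability bias drives the shark example: the vivid risk gets
# the headlines even when the mundane one is vastly more likely.
car_vs_shark = 60_000                # fatal car crashes vs. fatal shark attacks, per the text
print(f"A fatal car crash is about {car_vs_shark:,} times more likely than a fatal shark attack.")
```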

Psychologists have also found “confirmation bias”—the tendency for people to readily believe new information that confirms their pre-existing beliefs and views but discount new information that challenges them. Any horoscope reader suffers from confirmation bias, believing only the good bits and whatever else resonates with their preconceptions of their astrological sign. “Yes, yes, that sounds just like a Taurus!”

We’re also suckers for optimism. Optimism bias, or wishful thinking, can be seen in everything from investing to sports to politics. Research finds that people expect their own investments to perform better than average, and that NFL fans over-predict wins and under-predict losses for their favorite team even when they are paid to predict accurately. Remember Brexit? It came as a surprise, but it shouldn’t have. Polls consistently showed the referendum was a very tight contest: of the 35 polls conducted in the weeks before the vote, 17 showed the “Leave” campaign ahead, 15 showed the “Remain” side ahead, and the remaining three were ties. But many, it seems, were hoping that the U.K. would never really leave Europe, and looked only at the bright side of the numbers they saw. The night before the referendum, betting markets gave “Remain” an 88 percent chance of winning.
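A rough sketch makes the gap between the polling evidence and the market price visible. Treating the share of polls with each side ahead as a crude probability is an illustrative simplification, not a real forecasting model, but it shows how much wishful thinking the 88 percent price required.

```python
# Compare the polling evidence with the betting-market price on the eve of
# the Brexit referendum. The poll counts come from the text; splitting the
# tied polls evenly is an illustrative assumption.

polls_total = 35
polls_leave_ahead = 17
polls_remain_ahead = 15
polls_tied = polls_total - polls_leave_ahead - polls_remain_ahead  # 3 ties

# Naive "poll-counting" estimate of Remain's chances, splitting ties evenly.
p_remain_polls = (polls_remain_ahead + polls_tied / 2) / polls_total

p_remain_market = 0.88  # betting-market probability of Remain the night before

print(f"Share of polls favoring Remain (ties split): {p_remain_polls:.0%}")
print(f"Betting-market price for Remain:             {p_remain_market:.0%}")
print(f"Gap the wishful thinkers had to ignore:      {p_remain_market - p_remain_polls:+.0%}")
```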

And then there’s math. Even really smart people are often terrible with probabilities. Each year, when I co-teach a Stanford MBA class about political risk, I ask students whether they would take a pill that could make them look their all-time best, forever. The pill has been tested thoroughly and is 99.9 percent safe. Usually all but one or two students say they’d take the pill. Then I tell them there’s a 1-in-1,000 chance that ingesting the pill will cause instant death. How many would still opt for the beauty pill? Not so many hands go up—even though, statistically speaking, 99.9 percent safe is exactly the same as a 1-in-1,000 risk of death. The point of this exercise isn’t the danger of vanity. It’s the difficulty of communicating risk. The old saying that “numbers don’t lie” is more of a lie than we think.
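Spelled out as a tiny calculation, the equivalence is hard to miss; this is just the two framings from the classroom example expressed in code.

```python
# Two framings of the same risk. They are arithmetically identical,
# yet they produce very different gut reactions.

p_safe = 0.999                 # "the pill is 99.9 percent safe"
p_death = 1 - p_safe           # complement: the chance the pill kills you
one_in_n = 1 / p_death         # express that chance as "1 in N"

print(f"Probability of death: {p_death:.4f}  (i.e., 1 in {one_in_n:.0f})")
# -> Probability of death: 0.0010  (i.e., 1 in 1000)

assert abs(p_death - 1 / 1000) < 1e-12  # 99.9% safe == 1-in-1,000 risk of death
```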

2018 will be full of foreign-policy surprises. Most will not be the good kind. We already know some of the big risks and events in store: rising tensions and the possibility of war with North Korea, the potential for even greater instability in the Middle East as Mohammed bin Salman embarks on a high-stakes gambit to reform Saudi Arabia’s economy, a no-holds-barred U.S. congressional election season, and nefarious Russian cyber activities, to name a few. To deal with them effectively, policymakers would be wise to think more about thinking. Beware of misestimating the likelihood of events, discounting information that doesn’t fit with prior beliefs, and succumbing to optimism bias. These cognitive traps have serious consequences. And one can be certain that they aren’t going anywhere.
