The number of American teens who excel at advanced math has surged. Why?
On a sultry evening last July, a tall, soft-spoken 17-year-old named David Stoner and nearly 600 other math whizzes from all over the world sat huddled in small groups around wicker bistro tables, talking in low voices and obsessively refreshing the browsers on their laptops. The air in the cavernous lobby of the Lotus Hotel Pang Suan Kaew in Chiang Mai, Thailand, was humid, recalls Stoner, whose light South Carolina accent warms his carefully chosen words. The tension in the room made it seem especially heavy, like the atmosphere at a high-stakes poker tournament.
Stoner and five teammates were representing the United States in the 56th International Mathematical Olympiad. They figured they’d done pretty well over the two days of competition. God knows, they’d trained hard. Stoner, like his teammates, had endured a grueling regimen for more than a year—practicing tricky problems over breakfast before school and taking on more problems late into the evening after he completed the homework for his college-level math classes. Sometimes, he sketched out proofs on the large dry-erase board his dad had installed in his bedroom. Most nights, he put himself to sleep reading books like New Problems in Euclidean Geometry and An Introduction to Diophantine Equations.
Today’s empires are born on the web, and exert tremendous power in the material world.
Mark Zuckerberg hasn’t had the best week.
First, Facebook’s Free Basics platform was effectively banned in India. Then, a high-profile member of Facebook’s board of directors, the venture capitalist Marc Andreessen, sounded off about the decision to his nearly half-a-million Twitter followers with a stunning comment.
“Anti-colonialism has been economically catastrophic for the Indian people for decades,” Andreessen wrote. “Why stop now?”
After that, the Internet went nuts.
Andreessen deleted his tweet, apologized, and underscored that he is “100 percent opposed to colonialism” and “100 percent in favor of independence and freedom.” Zuckerberg, Facebook’s CEO, followed up with his own Facebook post to say Andreessen’s comment was “deeply upsetting” to him, and not representative of the way he thinks “at all.”
Does his faith influence his judicial decision making?
March was a hugely important month for religion and the Supreme Court, and a pivotal moment for Justice Antonin Scalia, the subject of a fat new biography. Too bad we couldn’t talk plainly about what was, and is, at stake. In a country historically averse to political debates about competing faiths, nowhere is frank discussion of religion more taboo than at the U.S. Supreme Court. “Religion is the third rail of Supreme Court politics. It’s not something that’s talked about in polite company,” as Jeff Shesol, the author of a book about the New Deal Court, put it. He was speaking with NPR’s Nina Totenberg in 2010, when John Paul Stevens was looking at retirement and, for the first time in American history, there was the prospect of six Catholics, three Jews, and no Protestants on the highest court in the land—a watershed almost too “radioactive,” Totenberg remarked, even to note. And beware of venturing any further than that, as the University of Chicago Law School’s Geoffrey Stone did in a controversial 2007 blog post suggesting that the Supreme Court’s five conservatives likely derived their abortion views from Catholic doctrine: Scalia—a devout Catholic, and the current Court’s longest-serving conservative—announced a boycott of the school until Stone leaves the faculty.
Multiple news outlets in Texas are reporting that Justice Antonin Scalia, who served for three decades on the U.S. Supreme Court, died in his sleep of apparent natural causes Saturday at age 79.
According to the San Antonio Express-News, Scalia was attending a private party at the Cibolo Creek Ranch in west Texas on Friday night. Staff reportedly found him on Saturday.
President Ronald Reagan appointed Scalia to the Supreme Court in 1986, where he became the intellectual leader of the Court’s conservative wing. Among the wider public, Scalia was one of the few justices who became a household name, due in part to his outspokenness and his often-blistering dissents from the bench.
We’ll have more information as it becomes available.
And why stopping it requires that governments get out of the way
As it stands, the international coalition is far from winning the information war against the Islamic State. Its air strikes may be squeezing the group in Iraq and Syria and killing many of its leaders, but that has not halted the self-proclaimed caliphate’s ideological momentum. Indeed, at the end of 2015, it was estimated that the number of foreigners traveling to join militant groups in Iraq and Syria—predominantly the Islamic State—had more than doubled in the course of just 18 months. What’s more, while these figures may be striking, sheer numbers are less important than intent when it comes to the organization’s actual threat to the world. As we have already seen, it takes a very small number of people to unleash great terror, whether in Iraq, Syria, or elsewhere.
Meryl Streep explained her all-white film-festival jury by claiming that “we’re all Africans, really.” She’s right, and so wrong.
Meryl Streep, at the Berlin film festival this week, was asked why—given #OscarsSoWhite, given that it’s 2016, given that come on—she had convened an all-white panel to judge this year’s festival entrants. Invoking the rhetoric of an American president who had visited Berlin in the course of the last century, Streep dismissed objections to her panel’s monochromism. “The thing that I notice is that there is a core of humanity that travels right through every culture,” she said. “And after all, we’re all from Africa originally, you know. We’re all Berliners; we’re all Africans, really.”
This wasn’t terribly surprising. When Streep was asked, last year, in the course of promoting her extremely feminist film Suffragette, whether she is herself a feminist, the actor replied that, no, she isn’t. Instead: “I am a humanist,” she said. “I am for nice, easy balance.”
Einstein’s gravitational waves rest on a genuinely radical idea.
After decades of anticipation, we have directly detected gravitational waves—ripples in spacetime traveling at the speed of light through the universe. Scientists at LIGO (the Laser Interferometer Gravitational-Wave Observatory) have announced that they have measured waves coming from the inspiral of two massive black holes, providing a spectacular confirmation of Albert Einstein’s general theory of relativity, whose hundredth anniversary was celebrated just last year.
Finding gravitational waves indicates that Einstein was (once again) right, and opens a new window onto energetic events occurring around the universe. But there’s a deeper lesson, as well: a reminder of the central importance of locality, an idea that underlies much of modern physics.
Our telephone habits have changed, but so have the infrastructure and design of the handset.
One of the ironies of modern life is that everyone is glued to their phones, but nobody uses them as phones anymore. Not by choice, anyway. Phone calls—you know, where you put the thing up to your ear and speak to someone in real time—are becoming relics of a bygone era, the “phone” part of a smartphone turning vestigial as communication evolves, willingly or not, into data-oriented formats like text messaging and chat apps.
The distaste for telephony is especially acute among Millennials, who have come of age in a world of AIM and texting, then gchat and iMessage, but it’s hardly limited to young people. When asked, people with a distaste for phone calls argue that they are presumptuous and intrusive, especially given alternative methods of contact that don’t make unbidden demands for someone’s undivided attention. In response, some have diagnosed a kind of telephoniphobia among this set. When even initiating phone calls is a problem—and even innocuous ones, like phoning the local Thai place to order takeout—then anxiety rather than habit may be to blame: When asynchronous, textual media like email or WhatsApp allow you to intricately craft every exchange, the improvisational nature of ordinary, live conversation can feel like an unfamiliar burden. Those in power sometimes think that this unease is a defect in need of remediation, while those supposedly afflicted by it say they are actually just fine, thanks very much.
How those three little words sound around the world
I love saying “I love you.” I’ll say “love ya” to my parents when I’m about to get off the phone with them, and “love you!!” to my wife as she’s heading out the door for work (“love you???” on Gchat means I’ve gotten myself into trouble with her and I’m searching for a way out). I tell my son I love him, and he doesn’t even get it—he’s an infant. I’ve been known to proclaim that I love sushi and football and Benjamin Franklin (I mean, how could you not love Ben?).
Many people in this world would find my behavior rather strange. That’s because Americans are exceptionally promiscuous when it comes to professing their love. In the United States, “I love you” is at once exalted and devalued. It can mean everything ... or nothing at all. This is not universally the case.
Ben Stiller’s follow-up to his own comedy classic is a downright bummer, no matter how many celebrity cameos it tries to cram in.
You don’t need to go to the theater to get the full experience of Zoolander 2. Simply get your hands on a copy of the original, watch it, and then yell a bunch of unfunny topical lines every time somebody tells a joke. That’s how it feels to watch Ben Stiller’s sequel to his 2001 spoof of the fashion industry: Zoolander 2 takes pains to reference every successful gag you remember from the original, and then embellish them in painful—often offensive, almost always outdated—fashion. It’s a film that has no real reason to exist, and it spends its entire running time reaffirming that fact.
The original Zoolander, to be fair, had no business being as funny as it was—it made fun of an industry that already seems to exist in a constant state of self-parody, and much of its humor relied on simple malapropisms and sight gags. But it was hilarious anyway as a candid snapshot of the fizzling-out of ’90s culture. Like almost any zeitgeist comedy, it belonged to a particular moment—and boy, should it have stayed there. With Zoolander 2, Stiller (who directed, co-wrote, and stars) tries to recapture the magic of 2001 by referencing its past glories with increasing desperation, perhaps to avoid the fact that he has nothing new to say about the fashion industry or celebrity culture 15 years later.
Russell Simmons on advocating the world’s most pretentious diet
A true illustration of our place in the universe
The city could be underwater within the century—but it has a plan.