"You know Ladurée has opened down the street from the hotel." I was interested in everything that Mark McClusky, a writer for Wired, was telling me, as we sat in the luxury of the two-star Cheval Blanc restaurant in Basel. We were both tagging along with our respective publishing colleagues to the international watch fair Baselworld, and
enjoying the view of the sun setting over the Rhine outside the
two-story arched windows. Liveried waiters and sommeliers hovered
beside the white-naperied tables, waiting to bring us fantastically
expensive run-of-the-mill updated Continental food and way-marked-up
French wine, though I bravely went against the sommelier's
recommendation and ordered a Swiss red Cornalin. (It was fine, but the table breathed a collective sigh of relief when I ordered a Cahors as the
second bottle.) McClusky's news about Ladurée, though, had stopped me.
It quickly became clear that McClusky, whose writing on Grant Achatz and
other technology-minded chefs I've admired, shared more than an
interest in food and colleagues we love working with. (When we emailed
greetings to Bob Cohn, grand master of TheAtlantic.com
and a former editor of McClusky's, he fired back one very precise
word: "BOONDOGGLE!") We both love macarons, the buttercream-filled
almond-meringue sandwich cookies that have taken over the food world and that
grown men can admit to liking, even if a male fondness for cupcakes
dare not speak its name (I spoke it in this video).
Ladurée--the maker of the Pierre Hermé-supervised macaron that is considered
the international gold standard--right in the heart of Sprungli luxemburgli-land! Luxemburgli are the Swiss version of macarons, and have a place in Swiss hearts almost as high as Sprungli's truffes du jour,
the fresh truffles that are considered the ne plus ultra of fresh
chocolate here, which the Swiss would of course define as the world's
best. Sprungli is the historic chocolate-maker, general city luxury
caterer, and macaron-baker that holds a place of pride in the city.
This called for a taste-off.
So this morning
we mounted an expedition to the new branch, which turns out to be one
of three in Switzerland; Ladurée is opening many stores
where rich people live, though so far none, sadly, in the U.S. McClusky
ordered a box of 15 to bring back to Oakland, and then chose the four
varieties we thought we could compare against Sprungli: pistachio,
caramel, and the two varieties of chocolate Ladurée offers, its plain
and Madagascar, which it says is 72 percent cocoa liquor, one of those
meaningless claims. Then we went to the largest of the many branches of
Sprungli, which this year is celebrating its 175th anniversary, and ordered the
closest equivalent. Sprungli has branches at the airport, and has its
macaron packaging down better to avoid crushing: plastic dome-shaped
covers protect Sprungli's macarons, which are smaller and more button-shaped than Ladurée's more substantial disks, which resemble flattened yo-yos. Ladurée sells beautiful and expensive gift boxes in various
decorative schemes, but the interiors don't feature the grooved plastic trays in which the macarons are displayed at its shops (and which the
central Paris HQ presumably uses for shipping macarons to its various branches).
So even if Ladurée macarons are much tougher than the fragile
luxemburgli, they're likelier to get jostled while traveling.
Then McClusky led me down Bahnhofstrasse, the main commercial street where all the
buildings are of course impeccably clean, to the lakefront, where we
opened the goods. First up, his choice, was pistachio. Sprungli's were
greasy and unpleasant. The filling tasted much more of almond
extract than pistachio, and had little flavor beyond the slimy texture
I generally loathe in buttercream. Ladurée's tasted of real pistachios,
which also gave saving grit to the buttercream. Though I like the airy,
meringue-like puff of the Sprungli shell, which crumbles and disappears
when you bite into it, the much chewier, brownie-textured meringue of
the Ladurée shell made the pistachio macaron a much better cookie. "Not
a fair fight," McClusky remarked.
Sprungli didn't do much better on the caramel: the filling was undercooked and
underflavored, whereas Ladurée's had the depth and chew of
butterscotch. But we did like the salt on the Sprungli shell. Both
bakeries even call their flavor "salted caramel" and offer nothing
else, salt with caramel having overtaken the world much like
the molten chocolate cake originally created by Jean-Georges Vongerichten, a version of which we'd had (uncredited, of course) at
the Cheval Blanc. "Sprungli's going down," McClusky said.
Then came the chocolate, which should be the flagship for both
houses--and certainly should be for Sprungli. And here the tables turned.
Sprungli's chocolate ganache had a lovely, fruity acidity and a complex
flavor that grew and lingered--a really fine chocolate encased in a
light, cocoa-y shell that set it off without getting in the way of the lingering, changing aftertaste.
Ladurée's plain chocolate macaron tasted of almost nothing but salt:
every Ladurée macaron, in fact, left a noticeable, and sometimes
unpleasant, aftertaste of salt. Neither the shell nor the ganache had
any strength or distinction of flavor. The Madagascar was better, but only marginally: it did taste
of chocolate, but was completely unremarkable, and again too salty. These were disks of inferior
chocolate. The Swiss, saved by chocolate again! The strange apparition of
Heidi--who appears in a lurid, Technicolor, cartoon-like series of illuminated stills in the airport train
as you round a corner, already disoriented, frame after frame of her
with blinding blond pigtails leaning against a mountain as the sounds of cowbells, mooing, and an a cappella choir suddenly invade your ears; her
picture takes over your retina and, you hope, not your dreams--would
doubtless approve, and keep smiling her mysterious, satisfied smile.
Corby Kummer's work in The Atlantic has established him as one of the most widely read, authoritative, and creative food writers in the United States. The San Francisco Examiner pronounced him "a dean among food writers in America."
The man who made computers personal was a genius and a jerk. A new documentary wonders whether his legacy can accommodate both realities.
An iPhone is a machine much like any other: motherboard, modem, microphone, microchip, battery, wires of gold and silver and copper twisting and snaking, the whole assembly arranged under a piece of glass whose surface—coated with an oxide of indium and tin to make it electrically conductive—sparks to life at the touch of a warm-blooded finger. But an iPhone, too, is much more than a machine. The neat ecosystem that hums under its heat-activated glass holds grocery lists and photos and games and jokes and news and books and music and secrets and the voices of loved ones and, quite possibly, every text you’ve ever exchanged with your best friend. Thought, memory, empathy, the stuff we sometimes shorthand as “the soul”: There it all is, zapping through metal whose curves and coils were designed to be held in a human hand.
When Kenneth Jarecke photographed an Iraqi man burned alive, he thought it would change the way Americans saw the Gulf War. But the media wouldn’t run the picture.
The Iraqi soldier died attempting to pull himself up over the dashboard of his truck. The flames engulfed his vehicle and incinerated his body, turning him to dusty ash and blackened bone. In a photograph taken soon afterward, the soldier’s hand reaches out of the shattered windshield, which frames his face and chest. The colors and textures of his hand and shoulders look like those of the scorched and rusted metal around him. Fire has destroyed most of his features, leaving behind a skeletal face, fixed in a final rictus. He stares without eyes.
On February 28, 1991, Kenneth Jarecke stood in front of the charred man, parked amid the carbonized bodies of his fellow soldiers, and photographed him. At one point, before he died this dramatic mid-retreat death, the soldier had had a name. He’d fought in Saddam Hussein’s army and had a rank and an assignment and a unit. He might have been devoted to the dictator who sent him to occupy Kuwait and fight the Americans. Or he might have been an unlucky young man with no prospects, recruited off the streets of Baghdad.
In the name of emotional well-being, college students are increasingly demanding protection from words and ideas they don’t like. Here’s why that’s disastrous for education—and mental health.
Something strange is happening at America’s colleges and universities. A movement is arising, undirected and driven largely by students, to scrub campuses clean of words, ideas, and subjects that might cause discomfort or give offense. Last December, Jeannie Suk wrote in an online article for The New Yorker about law students asking her fellow professors at Harvard not to teach rape law—or, in one case, even use the word violate (as in “that violates the law”) lest it cause students distress. In February, Laura Kipnis, a professor at Northwestern University, wrote an essay in The Chronicle of Higher Education describing a new campus politics of sexual paranoia—and was then subjected to a long investigation after students who were offended by the article and by a tweet she’d sent filed Title IX complaints against her. In June, a professor protecting himself with a pseudonym wrote an essay for Vox describing how gingerly he now has to teach. “I’m a Liberal Professor, and My Liberal Students Terrify Me,” the headline said. A number of popular comedians, including Chris Rock, have stopped performing on college campuses (see Caitlin Flanagan’s article in this month’s issue). Jerry Seinfeld and Bill Maher have publicly condemned the oversensitivity of college students, saying too many of them can’t take a joke.
According to Franklin, what mattered in business was humility, restraint, and discipline. But today’s Type-A MBAs would find him qualified for little more than a career in middle management.
When he retired from the printing business at the age of 42, Benjamin Franklin set his sights on becoming what he called a “Man of Leisure.” To modern ears, that title might suggest Franklin aimed to spend his autumn years sleeping in or stopping by the tavern, but to colonial contemporaries, it would have intimated aristocratic pretension. A “Man of Leisure” was typically a member of the landed elite, someone who spent his days fox hunting and affecting boredom. He didn’t have to work for a living, and, frankly, he wouldn’t dream of doing so.
Having worked as a successful shopkeeper with a keen eye for investments, Franklin had earned his leisure. But rather than cultivate the fine arts of indolence, he declared that retirement was “time for doing something useful.” Hence, the many activities of Franklin’s retirement: scientist, statesman, and sage, as well as one-man civic society for the city of Philadelphia. His post-employment accomplishments earned him the sobriquet of “The First American” in his own lifetime, and yet, for succeeding generations, the endeavor that was considered his most “useful” was the working life he left behind when he embarked on a life of leisure.
In continuing to tinker with the universe she built eight years after it ended, J.K. Rowling might be falling into the same trap as Star Wars’s George Lucas.
September 1st, 2015 marked a curious footnote in Harry Potter marginalia: According to the series’s elaborate timeline, rarely referenced in the books themselves, it was the day James S. Potter, Harry’s eldest son, started school at Hogwarts. It’s not an event directly written about in the books, nor one of particular importance, but their creator, J.K. Rowling, dutifully took to Twitter to announce what amounts to footnote details: that James was sorted into House Gryffindor, just like his father, to the disappointment of Teddy Lupin, Harry’s godson, apparently a Hufflepuff.
It’s not earth-shattering information that Harry’s kid would end up in the same house his father was in, and the Harry Potter series’s insistence on sorting all of its characters into four broad personality quadrants largely based on their family names has always struggled to stand up to scrutiny. Still, Rowling’s tweet prompted much garment-rending among the books’ devoted fans. Can a tweet really amount to a piece of canonical information for a book? There isn’t much harm in Rowling providing these little embellishments years after her books were published, but even idle tinkering can be a dangerous path to take, with the obvious example being the insistent tweaks wrought by George Lucas on his Star Wars series.
I traveled to every country on earth. In some cases, the adventure started before I could get there.
Last summer, my Royal Air Maroc flight from Casablanca landed at Malabo International Airport in Equatorial Guinea, and I completed a 50-year mission: I had officially, and legally, visited every recognized country on earth.
This means 196 countries: the 193 members of the United Nations, plus Taiwan, Vatican City, and Kosovo, which are not members but are, to varying degrees, recognized as independent countries by other international actors.
In five decades of traveling, I’ve crossed countries by rickshaw, pedicab, bus, car, minivan, and bush taxi; a handful by train (Italy, Switzerland, Moldova, Belarus, Ukraine, Romania, and Greece); two by riverboat (Gabon and Germany); Norway by coastal steamer; Gambia and the Amazonian parts of Peru and Ecuador by motorized canoe; and half of Burma by motor scooter. I rode completely around Jamaica on a motorcycle and Nauru on a bicycle. I’ve also crossed three small countries on foot (Vatican City, San Marino, and Liechtenstein), and parts of others by horse, camel, elephant, llama, and donkey. I confess that I have not visited every one of the 7,107 islands in the Philippine archipelago or most of the more than 17,000 islands constituting Indonesia, but I’ve made my share of risky voyages on the rickety inter-island rustbuckets you read about in the back pages of the Times under headlines like “Ship Sinks in Sulu Sea, 400 Presumed Lost.”
Some people see threats even when none are present. Strangely, it can make them more creative.
For much of his life, Isaac Newton seemed like he was on the verge of a nervous breakdown. In 1693, the collapse finally arrived: After not sleeping for five days straight, Newton sent letters accusing his friends of conspiring against him. He was refraining from publishing books, he said at one point that year, “for fear that disputes and controversies may be raised against me by ignoramuses.”
Newton was, by many accounts, highly neurotic. Brilliant, but neurotic nonetheless. He was prone to depressive jags, mistrust, and angry outbursts.
Unfortunately, his genius might have been rooted in his maladjustments. His mental state led him to brood over past mistakes, and eventually, a breakthrough would dawn. “I keep the subject constantly before me,” he once said, “and wait till the first dawnings open slowly, by little and little, into a full and clear light.”
A tattooed, profanity-loving Lutheran pastor believes young people are drawn to Jesus, tradition, and brokenness.
“When Christians really critique me for using salty language, I literally don’t give a shit.”
This is what it’s like to talk to Nadia Bolz-Weber, the tattooed Lutheran pastor, former addict, and head of a Denver church that’s 250 members strong. She’s frank and charming, and yes, she tends to cuss—colorful words pepper her new book, Accidental Saints. But she also doesn’t put a lot of stock in her own schtick.
“Oh, here’s this tattooed pastor who is a recovering alcoholic who used to be a stand-up comic—that’s interesting for like five minutes,” she said. “The fact that people want to hear from me—that, I really feel, has less to do with me and more to do with a Zeitgeist issue.”
What do Google's trippy neural network-generated images tell us about the human mind?
When a collection of artificial brains at Google began generating psychedelic images from otherwise ordinary photos, engineers compared what they saw to dreamscapes. They named their image-generation technique Inceptionism and called the code used to power it Deep Dream.
But many of the people who saw the images reacted the same way: These things didn’t come from a dream world. They came from an acid trip.
The computer-made images feature scrolls of color, swirling lines, stretched faces, floating eyeballs, and uneasy waves of shadow and light. The machines seemed to be hallucinating, and in a way that appeared uncannily human.
The idea behind the project was to test the extent to which a neural network had learned to recognize various animals and landscapes by asking the computer to describe what it saw. So, instead of just showing a computer a picture of a tree and saying, "tell me what this is," engineers would show the computer an image and say, "enhance whatever it is you see."
Climate change means the end of our world, but the beginning of another—one with a new set of species and ecosystems.
A few years ago in a lab in Panama, Klaus Winter tried to conjure the future. A plant physiologist at the Smithsonian Tropical Research Institute, he planted seedlings of 10 tropical tree species in small, geodesic greenhouses. Some he allowed to grow in the kind of environment they were used to out in the forest, around 79 degrees Fahrenheit. Others, he subjected to uncomfortably high temperatures. Still others, unbearably high temperatures—up to a daily average temperature of 95 degrees and a peak of 102 degrees. That’s about as hot as Earth has ever been.
It’s also the kind of environment tropical trees have a good chance of living in by the end of this century, thanks to climate change. Winter wanted to see how they would do.