I've been working on this piece for the magazine, and managed to finish a draft today. I wanted to put Rakim's "I Know You Got Soul" up here (you know, for the "Been a long time, I shouldn't have left you") but somehow I got diverted and started watching KRS-ONE's live shows. I love Rakim--still my favorite MC ever, with Raekwon and Nas coming in close. But KRS-ONE is almost a category unto himself.
I didn't see KRS live until I was like 20 years old. Before then I knew him as a great MC, by which I mean an incredible lyricist with a great flow. But then I went away to college and saw him live. It's very hard to explain what happens (to this very day) at a KRS-ONE show. The first thing I need to say is KRS is a great, great performer--certainly the best rapper I've ever seen, and arguably the best performer I've seen bar none.
This is no small feat. I've never seen KRS with a band. He controls the crowd simply through voice and presence. For those of us who grew up controlling nothing, this has long had special meaning, and perhaps is key to understanding hip-hop's enduring power. At any rate, there are basically two KRS-ONEs. The first is the one I knew as a boy--an intellectual wordsmith, a philosopher. The second, the live one, I met later, is something else, something visceral and ferocious, something that represents something beyond the artist himself.
The last time I saw KRS was in 1998. He came on stage and the bassline to "The Bridge Is Over" came on. He didn't say a word. He just walked the stage. The crowd went insane. People who'd never been to New York, and some, no doubt, from Queens itself, leaping in the air, chanting "The Bridge is over, The Bridge is over..." It was like his mere presence, the scowl on his face, his bop, combined with the music to transport us somewhere else.
This is nostalgia at its most powerful and meaningful. 1986 has a specific meaning to some of us. It's that era of Just-ICE, Mantronix and Sparky D, that moment just before hip-hop really broke ('88, '89) and became one of the most significant artistic movements of the latter half of the century. At that point it was really just a baby, but those of us who cradled it in the twin decks of our boom-boxes, out on porches, on benches, in projects, in dorm rooms, felt that we were watching something incredible happen. And then it did.
KRS represents that time--the Big Bang. His manic energy, his awkward freestyle, the way he mugs the crowd when "Still Number One" comes on. It's as if he takes in all of the dark energy of old, all that we felt on those streets addled with crack, haunted by Saturday Night Specials, drinks it and then radiates. You have to see him. Even now he is talismanic. A shaman of our lovely and painful past, who somehow still moves the crowd in this odd and different future.
I learned a long time ago to not speak of "greatest" anything in hip-hop unironically. But KRS has an actual claim to "greatest." There's nothing like him. I don't know that there ever will be.
A rock structure, built deep underground, is one of the earliest hominin constructions ever found.
In February 1990, thanks to a 15-year-old boy named Bruno Kowalsczewski, footsteps echoed through the chambers of Bruniquel Cave for the first time in tens of thousands of years.
The cave sits in France’s scenic Aveyron Valley, but its entrance had long been sealed by an ancient rockslide. Kowalsczewski’s father had detected faint wisps of air emerging from the scree, and the boy spent three years clearing away the rubble. He eventually dug out a tight, thirty-meter-long passage that the thinnest members of the local caving club could squeeze through. They found themselves in a large, roomy corridor. There were animal bones and signs of bear activity, but nothing recent. The floor was pockmarked with pools of water. The walls were punctuated by stalactites (the ones that hang down) and stalagmites (the ones that stick up).
Washington voters handed Hillary Clinton a primary win, symbolically reversing the result of the state caucus where Bernie Sanders prevailed.
Washington voters delivered a bit of bad news for Bernie Sanders’s political revolution on Tuesday. Hillary Clinton won the state’s Democratic primary, symbolically reversing the outcome of the state’s Democratic caucus in March where Sanders prevailed as the victor. The primary result won’t count for much since delegates have already been awarded based on the caucus. (Sanders won 74 delegates, while Clinton won only 27.) But Clinton’s victory nevertheless puts Sanders in an awkward position.
Sanders has styled himself as a populist candidate intent on giving a voice to voters in a political system in which, as he describes it, party elites and wealthy special-interest groups exert too much control. As the primary election nears its end, Sanders has railed against Democratic leaders for unfairly intervening in the process, a claim he made in the aftermath of the contentious Nevada Democratic convention earlier this month. He has also criticized superdelegates—elected officials and party leaders who can support whichever candidate they choose—for effectively crowning Clinton.
Narcissism, disagreeableness, grandiosity—a psychologist investigates how Trump’s extraordinary personality might shape his possible presidency.
In 2006, Donald Trump made plans to purchase the Menie Estate, near Aberdeen, Scotland, aiming to convert the dunes and grassland into a luxury golf resort. He and the estate’s owner, Tom Griffin, sat down to discuss the transaction at the Cock & Bull restaurant. Griffin recalls that Trump was a hard-nosed negotiator, reluctant to give in on even the tiniest details. But, as Michael D’Antonio writes in his recent biography of Trump, Never Enough, Griffin’s most vivid recollection of the evening pertains to the theatrics. It was as if the golden-haired guest sitting across the table were an actor playing a part on the London stage.
“It was Donald Trump playing Donald Trump,” Griffin observed. There was something unreal about it.
Americans persist in thinking that Adam Smith's rules for free trade are the only legitimate ones. But today's fastest-growing economies are using a very different set of rules. Once, we knew them—knew them so well that we played by them, and won. Now we seem to have forgotten.
In Japan in the springtime of 1992, a trip to Hitotsubashi University, famous for its economics and business faculties, brought me unexpected good luck. Like several other Japanese universities, Hitotsubashi is almost heartbreaking in its cuteness. The road from the station to the main campus is lined with cherry trees, and my feet stirred up little puffs of white petals. Students glided along on their bicycles, looking as if they were enjoying the one stress-free moment of their lives.
They probably were. In surveys huge majorities of students say that they study "never" or "hardly at all" during their university careers. They had enough of that in high school.
I had gone to Hitotsubashi to interview a professor who was making waves. Since the end of the Second World War, Japanese diplomats and businessmen have acted as if the American economy should be the model for Japan's own industrial growth. Not only should Japanese industries try to catch up with America's lead in technology and production but also the nation should evolve toward a standard of economic maturity set by the United States. Where Japan's economy differed from the American model—for instance, in close alliances between corporations which U.S. antitrust laws would forbid—the difference should be considered temporary, until Japan caught up.
Speculation about how Ramsay Bolton might die reveals the challenges of devising a cathartic TV death—and illuminates a larger issue facing the series.
Warning: Season 6 spoilers abound.
Ever since Ramsay Bolton revealed himself as Westeros’s villain-in-chief, Game of Thrones fans have wanted him dead. He first appeared in season three disguised as a Northern ally sent to help Theon Greyjoy but quickly turned out to be a lunatic whose appetite for cruelty only grew as the series progressed. (Last year, Atlantic readers voted him the actual worst character on television.) After several colorful and nauseating years of rape, torture, murder, and bad visual puns, speculation about the Bolton bastard’s looming death has reached its peak this sixth season. But “Will Ramsay die this season?” also gives way to a slightly more complicated question: “How should Ramsay die?”
Bernie Sanders is contesting the Democratic primary to the end, just as Hillary Clinton did eight years ago—but that parallel has its limits.
In May of 2008, two Democrats were somehow still fighting over the nomination. The stronger of the two had a comfortable lead in delegates and made calls to unify the party. But the weaker contender, buoyed by a loyal base, refused to give up. It got awkward.
The difference in 2016, of course, is Hillary Clinton’s position in the drama. She played the spoiler eight years ago, refusing to concede to Barack Obama in a primary that dragged into June, to the consternation of party elders. (They were nervously eyeing John McCain, who had pluckily sewn up his nomination by late February.) But this year, she is the candidate ascendant, impatient to wrap up this whole Bernie Sanders business and take on Donald Trump.
In an ironic twist, the Republican nominee—the author of many a failed real-estate deal—is trying to use the Clintons’ bad 1978 land purchase against Hillary Clinton.
Suddenly it looks like the presidential campaign could turn into a referendum on the 1990s. No, that doesn’t mean you get to vote your opinion on Third Eye Blind. Instead, Donald Trump seems to be determined to dredge up the detritus of the decade to attack Hillary Clinton.
Democrats knew what they were getting with the Clintons—an incredible political powerhouse, and a perpetual whiff of scandal. What they didn’t know, and still don’t, is how bad it will be this time, and how much it will matter.
Now comes one of the first tests. On Monday, Trump released a short video highlighting accusations of sexual assault lodged against Bill Clinton by Kathleen Willey and Juanita Broaddrick. Attacks on Bill Clinton’s scandals are certainly fair game—the former president will find plenty of defenders, but his behavior will not. Whether they will work is a different matter. Hillary Clinton is trying to strike a delicate balance, reminding people why they liked the Clinton years without running as a nostalgia candidate, but she is ultimately the candidate—not her husband. The attacks could also simply remind people of Trump’s own checkered past as both a friend of the Clintons and a subject of sexual-harassment allegations. (I write in more depth about the risks, rewards, and lessons of this strategy here.)
For centuries, philosophers and theologians have almost unanimously held that civilization as we know it depends on a widespread belief in free will—and that losing this belief could be calamitous. Our codes of ethics, for example, assume that we can freely choose between right and wrong. In the Christian tradition, this is known as “moral liberty”—the capacity to discern and pursue the good, instead of merely being compelled by appetites and desires. The great Enlightenment philosopher Immanuel Kant reaffirmed this link between freedom and goodness. If we are not free to choose, he argued, then it would make no sense to say we ought to choose the path of righteousness.
Today, the assumption of free will runs through every aspect of American politics, from welfare provision to criminal law. It permeates the popular culture and underpins the American dream—the belief that anyone can make something of themselves no matter what their start in life. As Barack Obama wrote in The Audacity of Hope, American “values are rooted in a basic optimism about life and a faith in free will.”
The day—a celebration of corporate conformity disguised as a celebration of individuality—helped to bring about the current dominance of “business casual.”
The New York Times ran a story Wednesday announcing “The End of the Office Dress Code.” The suit and its varied strains, the article argues—corporate uniforms that celebrate, well, corporate uniformity—are giving way to more individualized interpretations of “office attire.” As the writer Vanessa Friedman puts it, “We live in a moment in which the notion of a uniform is increasingly out of fashion, at least when it comes to the implicit codes of professional and public life.”
It’s true. We live in a time in which our moguls dress in hoodies and t-shirts, and in which more and more workers are telecommuting—working not just from home, but from PJs. It’s a time, too, when the lines between “work” and “everything else” are increasingly—and sometimes frustratingly—fluid. And so: It’s also a time when many of us are trying to figure out, together, what “work clothes” actually means, and the extent to which the term might vary across professions. As Emma McClendon, who curated a new exhibit on uniforms for the Museum at the Fashion Institute of Technology, summed it up: “We are in a very murky period.”
What’s harder to believe: that it took a year for Andrea Constand to accuse the star of sexual assault, or that it’s taken 11 years and dozens more women coming forward for those accusations to be heard in court?
To date, more than 50 women have accused Bill Cosby of sexual misconduct. Constand was the first. In January of 2005 she told police that a year earlier, Cosby had touched and penetrated her after drugging her. A prosecutor decided against proceeding with the case, and Constand followed up with a civil suit that resulted in a 2006 settlement. After that came an accelerating drip of women making allegations about incidents spanning a wide swath of Cosby’s career, from Kristina Ruehli (1965) to Chloe Goins (2008).