Regarding predecessors: it wasn't at the same level of immersion, broad talent, or personal response, but early sides by The Righteous Brothers were an absolute staple of black radio in the early '60s, when that kind of "reverse crossover" was pretty rare. Their material went downhill pretty quickly, though.
Earlier, the only example I know of someone who literally chose to "become black" in music was Johnny Otis, a Greek-American kid from Berkeley. In his teens he married his black sweetheart and traveled with all-black blues and "territory" swing bands in the late '30s and early '40s. He was totally immersed in black culture and music, and had a string of big-band, early R&B, and R&R hits via his own band and the singers he promoted, like Big Mama Thornton, Esther Phillips, and Etta James.
But Johnny basically claimed blackness in a pre-civil-rights world, when you couldn't live and move across "color lines" without making a life choice and sticking to it. He was simply considered black, even making the cover of "Negro Achievements" magazine in the early '50s, and was deeply involved in SoCal black political activism. I've talked to older black folks who saw his band at the Apollo in the '40s and early '50s and had no idea he was white.
I caught some justifiable heat for linking to Teena Marie's more pop joints, but I refuse to front--I'll take The Righteous Brothers' version of "Unchained Melody" over Harry Belafonte's. There. I said it. Ghost ruined that song.
There's also some interesting conversation here about how black people have thought about white people who've done the reverse crossover. As is mentioned in the comment, this deserves to be bracketed off from simply singing R&B, or performing blackness. It's interesting that Brucds notes that many people thought Otis was black. I can totally see why. Pictured above are Abe and Effa Manley, co-owners of the Newark Eagles. Again, I would not have known this, but Effa was white--but had a black stepfather and was thus considered "a light-skinned black."
I think what all of this really shows is how much blackness, like any culture, isn't about a melanin count, per se. So much of it is historical, along with specific mores. It's no mistake that both Manley and Teena Marie grew up around black people. Johnny Otis claimed to be "black by persuasion."
You look at these people's lives--Manley was the first woman inducted into the Baseball Hall of Fame--and it's kind of humbling. It's like, I'm the descendant of slaves, and they aren't. But really, if Teena Marie says "I'm a black artist with white skin," who am I to truly object? Johnny Otis was on the cover of Negro Achievements. That's more than you can say for the kid.
Also, for a less poppy perspective, Questlove breaks down his Teena Marie favorites.
As it’s moved beyond the George R.R. Martin novels, the series has evolved both for better and for worse.
Well, that was more like it. Sunday night’s Game of Thrones finale, “The Winds of Winter,” was the best episode of the season—the best, perhaps, in a few seasons. It was packed full of major developments—bye, bye, Baelor; hello, Dany’s fleet—but still found the time for some quieter moments, such as Tyrion’s touching acceptance of the role of Hand of the Queen. I was out of town last week and thus unable to take my usual seat at our Game of Thrones roundtable. But I did have some closing thoughts about what the episode—and season six in general—told us about how the show has evolved.
Last season, viewers got a limited taste—principally in the storylines in the North—of how the show would be different once showrunners Benioff and Weiss ran out of material from George R.R. Martin’s novels and had to set out on their own. But it was this season in which that exception truly became the norm. Though Martin long ago supplied Benioff and Weiss with a general narrative blueprint of the major arcs of the story, they can no longer rely on the books scene by scene. Game of Thrones is truly their show now. And thanks to changes in pacing, character development, and plot streamlining, it’s also a markedly different show from the one we watched in seasons one through four—for the worse and, to some degree, for the better.
Readers share their own experiences in an ongoing series.
Prompted by Emma Green’s note on the Supreme Court case Whole Woman’s Health v. Hellerstedt, for which a group of lawyers filed a document openly describing their abortions, readers share their own stories in an ongoing collection edited by Chris Bodenner. We are posting a wide range of experiences—from pro-choice and pro-life readers, women and men alike—so if you have an experience not represented so far, please send us a note: email@example.com.
It happened gradually—and until the U.S. figures out how to treat the problem, it will only get worse.
It’s 2020, four years from now. The campaign is under way to succeed the president, who is retiring after a single wretched term. Voters are angrier than ever—at politicians, at compromisers, at the establishment. Congress and the White House seem incapable of working together on anything, even when their interests align. With lawmaking at a standstill, the president’s use of executive orders and regulatory discretion has reached a level that Congress views as dictatorial—not that Congress can do anything about it, except file lawsuits that the divided Supreme Court, its three vacancies unfilled, has been unable to resolve.
On Capitol Hill, Speaker Paul Ryan resigned after proving unable to pass a budget, or much else. The House burned through two more speakers and one “acting” speaker, a job invented following four speakerless months. The Senate, meanwhile, is tied in knots by wannabe presidents and aspiring talk-show hosts, who use the chamber as a social-media platform to build their brands by obstructing—well, everything. The Defense Department is among hundreds of agencies that have not been reauthorized, the government has shut down three times, and, yes, it finally happened: The United States briefly defaulted on the national debt, precipitating a market collapse and an economic downturn. No one wanted that outcome, but no one was able to prevent it.
How much do you really need to say to put a sentence together?
Just as fish presumably don’t know they’re wet, many English speakers don’t know that the way their language works is just one of endless ways it could have come out. It’s easy to think that what one’s native language puts words to, and how, reflects the fundamentals of reality.
But languages are strikingly different in the level of detail they require a speaker to provide in order to put a sentence together. In English, for example, here’s a simple sentence that comes to my mind for rather specific reasons related to having small children: “The father said ‘Come here!’” This statement specifies that there is a father, that he conducted the action of speaking in the past, and that he indicated the child should approach him at the location “here.” What else would a language need to do?
People in Great Britain felt their leaders weren’t treating them fairly. Politicians in the U.S. should take note.
Britain’s Brexit vote has shocked the political elites of both the U.S. and Europe. The vote wasn’t just about the EU; in fact, polls before the referendum consistently showed that Europe wasn’t at the top of voters’ lists of concerns. But on both sides of the Atlantic Ocean, large numbers of people feel that the fundamental contracts of capitalism and democracy have been broken. In a capitalist economy, citizens tolerate rich people if they share in the wealth, and in a democracy, they give their consent to be governed if those governing do so in their interest. The Brexit vote was an opportunity for people to tell elites that both promises have been broken. The most effective line of the Leave campaign was “take back control.” It is also Donald Trump’s line.
The switch in their first joint campaign appearance is a reflection of the Democrats' confidence—and her lead in the polls.
NEWS BRIEF There are two simple ways to cut through the bluster and the spin to see how a presidential campaign is really feeling about its prospects at any given moment: You can follow the money, and you can follow the plane.
Is a candidate retrenching by spending more time and ad dollars in states their party has won in the past and must hold onto in November? Or is he or she being more aggressive—and aspirational—by trying to expand the map and add states that are more difficult, and potentially less crucial, to capture the White House?
On Wednesday, the Hillary Clinton campaign offered up a clue to its level of confidence when it announced that it had rescheduled a joint event with President Obama—the first since he endorsed his former secretary of state—for July 5. The rally was originally scheduled for mid-June, but was canceled following the Orlando shooting. What was notable about the announcement, however, is that the Clinton-Obama road show is launching in a different state than the campaign first planned. The postponed rally was to occur in Wisconsin, a state that Democrats haven’t lost in a presidential year since 1984 but which had been seen as a potential pickup for Donald Trump. Clinton and Obama will instead appear in North Carolina, which the president won narrowly in 2008 but lost four years ago.
Their degrees may help them secure entry-level jobs, but to advance in their careers, they’ll need much more than technical skills.
American undergraduates are flocking to business programs, and finding plenty of entry-level opportunities. But when businesses go hunting for CEOs or managers, “they will say, a couple of decades out, that I’m looking for a liberal arts grad,” said Judy Samuelson, executive director of the Aspen Institute’s Business and Society Program.
That presents a growing challenge to colleges and universities. Students are clamoring for degrees that will help them secure jobs in a shifting economy, but to succeed in the long term, they’ll require an education that allows them to grow, adapt, and contribute as citizens—and to build successful careers. That’s why many schools are shaking up their curricula to ensure that undergraduate business majors receive something they may not even know they need—a rigorous liberal-arts education.
The star Daily Show correspondent is moving on to make her own scripted comedy, and her gain is the show’s huge loss.
When Jon Stewart announced he was leaving The Daily Show last year, many fans lobbied for Jessica Williams to replace him, pushing one of the show’s standout performers into a limelight she deemed herself not quite ready for. “Thank you, but I am extremely under-qualified for the job!” Williams tweeted. Comedy Central eventually picked Trevor Noah for the gig, and in the months since, Williams’s star has only risen higher. It’s no huge surprise, then, that on Wednesday she told Entertainment Weekly she was moving on from The Daily Show to develop her own scripted series for Comedy Central. It’s great news for Williams, but a huge loss for the show she’s leaving behind.
The discussion over Williams becoming The Daily Show host in 2015 turned into a minor political maelstrom. Williams publicly pushed back against the idea that she had “impostor syndrome,” as one writer suggested, for calling herself “under-qualified” and pointing to her young age (25 at the time) as a reason for her lack of interest in the position. Indeed, there are a thousand reasons not to want the daily grind of a TV hosting gig, and the heightened scrutiny and criticism Noah has received in his year on the job is among them. But as Williams’s popularity and talents have grown, and as The Daily Show has struggled to retain its critical cachet after Stewart’s departure, it’s been hard not to mourn the alternate outcome in which Williams took the host job and steered the series in a fresher, more relevant direction.
The Supreme Court declined to hear a major religious-freedom case on Tuesday, showing how much things have changed since Hobby Lobby.
Two years ago, the U.S. Supreme Court handed down a controversial 5-4 ruling about birth control and religion, Burwell v. Hobby Lobby Stores, Inc. Because of the ruling, private companies owned by religious people, including the craft-supply chain Hobby Lobby, can now refuse to cover certain kinds of birth control in their employee insurance plans, a requirement that was put in place by the 2010 Affordable Care Act. Supporters of the ruling claimed it as a triumph for religious freedom and an important precedent for cases about conscience-based objections to contraception.
Two years later, a pharmacy chain in Washington state, Stormans Inc., which operates a store in Olympia called Ralph’s Thriftway, has been denied a hearing before the Supreme Court. The pharmacy’s owners, along with two other pharmacists who are also plaintiffs in the case, Stormans, Inc. v. Wiesman, refused to stock emergency contraception, including Plan B and ella, for religious reasons—they believe the pills are effectively abortifacients. Long-standing state regulations require Washington pharmacies to stock a “representative assortment of drugs in order to meet the pharmaceutical needs of ... patients.” The requirements were updated in 2007, specifying that pharmacies must deliver all FDA-approved drugs to customers; they can’t refer people to get medication at a different location for any kind of religious or moral reasons.