“Ta-Nehisi, Brian, colorist Laura Martin, letterer Joe Sabino, assistant editor Chris Robinson, and I have been working on this series for months already, so we’re happy to have a launch date as we’re all anxious to start getting this book out in front of people,” series editor Wil Moss told Marvel.com. “We may be biased, but we think it’s something pretty special!”
A few months ago, I was fortunate enough to be contracted to work on Marvel’s Black Panther. I didn’t want to say too much before I got started, but now, with a few scripts in, having gotten comfortable with my editors, and having been blown away by Brian Stelfreeze’s art (early sketches of which you see here), I’m feeling a little better. With that in mind, my hope is, from time to time, to update you guys on the process of making the thing.
I guess I should start by saying I’ve never done this before. I expect that there will be stumbles and screw-ups on my part. My nightmare basically involves this turning into some sort of stunt or vanity project. I did not take this on to look pretty, or add a line to my CV. I took it on for the same reason I take on new stories—to grow intellectually and artistically. In this case it’s another genre—fictional, serial story-telling—one a good distance away from journalism, memoir, and essays.
Still I find myself falling back on old principles. I’m a writer who really values organization. I value it even more when saddled with the relatively high probability of failure. In that regard, my basic approach has been as follows:
1.) Read a ton of back issues and try to think about what I find interesting (Ramonda) and what I find less interesting (M’Baku).
2.) Get a detailed outline done of all the issues I was contracted to write.
3.) Write those scripts early in order to give Brian, and my editors, a chance to tell me what I am doing wrong.
4.) Revise the outline regularly, as events (and finished scripts) dictate a need to change.
That has been the plan. Having a plan doesn’t guarantee success. But not having a plan probably guarantees failure.
One thing I did not count on was the extent to which the art would shape the story. Brian’s thoughts on T’Challa, and his supporting cast, have been invaluable. You can see the fruits of collaboration in the image above. After talking back and forth we came up with some new ideas for how T’Challa’s famed Vibranium-weave suit might work—in this case, absorbing kinetic energy and allowing him to fire that energy back out in short energy bursts. “Energy bursts” almost gets it wrong—think “force-push” not “optic blast.” All the old powers are there—enhanced senses, agility, peak-human strength, etc. But this idea (and others) really came out of Brian’s thoughts—not just on the suit—but on the properties of Vibranium itself.
Writing, for me, is a lonely exercise. I pitch an idea to my editors and then I disappear for a while. There are a few regular check-ins, but generally the next thing they see from me is a draft. Black Panther has been different. There’s a lot more collaboration and conversation. Barely three days go by in which I don’t talk to Brian or my editor, Wil Moss.
I’ll have more to say about that process as the days go on. For now, enjoy some of Brian’s (awesome) concept art. I’ve seen some of his penciled pages already. They’re glorious. I’m trying to keep up.
Hailed as a savant, lampooned as a fraud, Britain’s likely next prime minister must lead his country through its moment of maximum peril—and opportunity.
Late morning on Tuesday, July 23, the denouement in Boris Johnson’s lifelong quest for political power will be revealed, when the committee organizing the Conservative Party’s leadership election announces the winner of the race to replace Theresa May. The following day, the winner—Johnson is the heavy favorite—will be driven to Buckingham Palace for an audience with the Queen, and be formally appointed prime minister.
It will be the culmination of seven weeks of national campaigning in which Johnson has slowly and cautiously closed in on the prize. Yet in reality it has been a 40-year pursuit, Johnson relentlessly driving forward, each step a mere prelude to the next in his seemingly unstoppable rise.
Two families, two bodies, and a wilderness of secrets
They found what was left of him in the spring of 2014. Firefighters battling a huge blaze on Alaska’s Kenai Peninsula first spotted a boot in the dirt. Then they noticed some bones scattered across a wide grassy area. Fire crews in Alaska are used to seeing the bones of moose, caribou, bears, and other large creatures that live and die in these woods. So it wasn’t until crew members found a human skull that they stopped to consider that the pieces might go together. The skull was resting on its side, the face angled toward the ground. A few blackened molars clung to the upper jaw. The lower jaw was missing.
As May navigated the transition from prime minister to backbench MP, she turned to her predecessors Tony Blair, Gordon Brown, and David Cameron.
In the final weeks of her premiership, as she considered life after Downing Street, Theresa May sought out the only people who could reasonably give her advice: her predecessors. The outgoing British prime minister reached out to David Cameron, Gordon Brown, and Tony Blair to discuss life after office, officials told The Atlantic.
The revelation sheds new light on the prime minister’s mind-set as she faces up to the reality of her life outside the trappings of office as a member of the exclusive club of ex–prime ministers, currently inhabited by just four living individuals. (May also spoke to John Major, the Conservative prime minister from 1990 to 1997, though only on the sidelines of a cricket reception, the two officials said.)
An astonishing number of students start college in America without finishing it: Roughly 40 percent of college enrollees don’t go on to get a degree within six years of starting to work toward one.
The good news is that in recent decades things have gotten a bit less bad. By one calculation, at four-year state schools that didn’t make the top 50 public universities in U.S. News & World Report’s rankings, the graduation rate within six years rose from about 40 percent for students starting in the early 1990s to about 50 percent for students starting in the late 2000s. (The phenomenon was not limited to non-elite schools.)
When Jeff Denning, an economist at Brigham Young University, started looking closely at the data on college-completion rates, he was a bit perplexed by what, exactly, was driving this uptick. He and some of his BYU colleagues noticed that a range of indicators from those two decades pointed in the direction of lower, not higher, graduation rates: More historically underrepresented groups of students (who tend to have lower graduation rates) were enrolling, students appeared to be studying less and spending more time working outside of school, and student-to-faculty ratios weren’t decreasing. “We started thinking, What could possibly explain this increase?” Denning told me. “Because we were stuck with not being able to explain anything.”
When, exactly, did the astronaut set foot on the moon? No one knows.
The Apollo 11 mission was, in most respects, a feat of extraordinary precision.
Traveling at a maximum velocity of about seven miles a second, the Saturn V rocket would have launched the crew far off course in the event of even a slight navigational error. From nearly 240,000 miles away, Houston’s Mission Control could track the spacecraft’s position to within 30 feet. The command module’s guidance computer kept time to the millisecond.
And yet for all that precision, no one can say with absolute certainty when, exactly, Neil Armstrong first set foot on the moon.
Most of the details of the moment are canonical: Armstrong took his one small step on July 20, 1969—50 years ago this past Saturday. The step took place just after 10:56 p.m. Eastern time that night. And Armstrong bookended the step with the lines “Okay, I’m going to step off the [lunar module] now” and “That’s one small step for man, one giant leap for mankind.” (Or was it “one small step for a man,” as Armstrong insisted?) At some point during the roughly eight-second interval between those two lines, he became the first human being to walk on the moon. But when exactly he did so is less than clear.
No one has done more to dispel the myth of social mobility than Raj Chetty. But he has a plan to make equality of opportunity a reality.
Raj Chetty got his biggest break before his life began. His mother, Anbu, grew up in Tamil Nadu, a tropical state at the southern tip of the Indian subcontinent. Anbu showed the greatest academic potential of her five siblings, but her future was constrained by custom. Although Anbu’s father encouraged her scholarly inclinations, there were no colleges in the area, and sending his daughter away for an education would have been unseemly.
But as Anbu approached the end of high school, a minor miracle redirected her life. A local tycoon, himself the father of a bright daughter, decided to open a women’s college, housed in his elegant residence. Anbu was admitted to the inaugural class of 30 young women, learning English in the spacious courtyard under a thatched roof and traveling in the early mornings by bus to a nearby college to run chemistry experiments or dissect frogs’ hearts before the men arrived.
Inside the mind of a psychologist who helps determine whether parents are “good enough” to keep their children
The Vermont lake was the perfect setting for a mother-daughter day. The mother packed water and towels. The daughter, an excitable young girl, shoved cheese sticks into a cooler. When the two arrived at the beach, they swung the cooler between them as they walked to the water.
But the mother’s smile was strained, because the day of family fun would be closely watched. Joining the pair was Sharon Lamb, a psychologist who evaluates parents and makes recommendations to family courts regarding whether their rights to their children should be terminated. The daughter had been in foster care for two years, and her mother was in danger of losing custody permanently. Lamb was there to help determine whether the mother could be considered fit to parent.
Ten years ago, a neuroscientist said that within a decade he could simulate a human brain. Spoiler: It didn’t happen.
On July 22, 2009, the neuroscientist Henry Markram walked onstage at the TEDGlobal conference in Oxford, England, and told the audience that he was going to simulate the human brain, in all its staggering complexity, in a computer. His goals were lofty: “It’s perhaps to understand perception, to understand reality, and perhaps to even also understand physical reality.” His timeline was ambitious: “We can do it within 10 years, and if we do succeed, we will send to TED, in 10 years, a hologram to talk to you.” If the galaxy-brain meme had existed then, it would have been a great time to invoke it.
One could argue that the nature of pioneers is to reach far and talk big, and that it’s churlish to single out any one failed prediction when science is so full of them. (Science writers joke that breakthrough medicines and technologies always seem five to 10 years away, on a rolling window.) But Markram’s claims are worth revisiting for two reasons. First, the stakes were huge: In 2013, the European Commission awarded his initiative—the Human Brain Project (HBP)—a staggering 1 billion euro grant (worth about $1.42 billion at the time). Second, the HBP’s efforts, and the intense backlash to them, exposed important divides in how neuroscientists think about the brain and how it should be studied.
President Trump’s attorney general had the first word on the Mueller investigation. It may end up being the final word.
Back in May, Representative Justin Amash of Michigan held a town hall to defend his position as the lone Republican calling for impeachment proceedings against President Donald Trump. Amash explained to voters that he’d arrived at this position after reading Special Counsel Robert Mueller’s 448-page report on Russian interference in the 2016 election and possible obstruction of justice by the president. But for at least one voter, that explanation was more like revelation: As far as she was aware, Trump had been totally exonerated.
“I was surprised to hear there was anything negative in the Mueller report at all about President Trump. I hadn’t heard that before,” Cathy Garnaat, a Republican who supported Amash and Trump, told NBC that night. “I’ve mainly listened to conservative news and I hadn’t heard anything negative about that report, and President Trump has been exonerated.”
President Trump has instructed aides to prepare for sweeping budget cuts if he wins a second term in the White House, five people briefed on the discussions said, a move that would dramatically reverse the big-spending approach he adopted during his first 30 months in office. Trump’s advisers say he will be better positioned to crack down on spending and shrink or eliminate certain agencies after next year, particularly if Republicans regain control of the House of Representatives.
If Trump is really contemplating large cuts in a second term, it’d be very strange. But the idea of a serious fiscal-conservative turn after 2020, no matter how unlikely, does raise a larger question: What exactly would be the point of a second Trump term?