“Ta-Nehisi, Brian, colorist Laura Martin, letterer Joe Sabino, assistant editor Chris Robinson, and I have been working on this series for months already, so we’re happy to have a launch date as we’re all anxious to start getting this book out in front of people,” series editor Wil Moss told Marvel.com. “We may be biased, but we think it's something pretty special!”
A few months ago, I was fortunate enough to be contracted to work on Marvel’s Black Panther. I didn’t want to say too much before I got started, but now, with a few scripts in, having gotten comfortable with my editors, and having been blown away by Brian Stelfreeze’s art (early sketches of which you see here), I’m feeling a little better. With that in mind, my hope is, from time to time, to update you guys on the process of making the thing.
I guess I should start by saying I’ve never done this before. I expect that there will be stumbles and screw-ups on my part. My nightmare basically involves this turning into some sort of stunt or vanity project. I did not take this on to look pretty, or add a line to my CV. I took it on for the same reason I take on new stories—to grow intellectually and artistically. In this case it’s another genre—fictional, serial story-telling—one a good distance away from journalism, memoir, and essays.
Still, I find myself falling back on old principles. I’m a writer who really values organization. I value it even more when saddled with the relatively high probability of failure. In that regard, my basic approach has been as follows:
1.) Read a ton of back issues and try to think about what I find interesting (Ramonda) and what I find less interesting (M’Baku).
2.) Get a detailed outline done of all the issues I was contracted to write.
3.) Write those scripts early in order to give Brian, and my editors, a chance to tell me what I am doing wrong.
4.) Revise the outline regularly, as events (and finished scripts) dictate a need to change.
That has been the plan. Having a plan doesn’t guarantee success. But not having a plan probably guarantees failure.
One thing I did not count on was the extent to which the art would shape the story. Brian’s thoughts on T’Challa, and his supporting cast, have been invaluable. You can see the fruits of collaboration in the image above. After talking back and forth we came up with some new ideas for how T’Challa’s famed Vibranium-weave suit might work—in this case, absorbing kinetic energy and allowing him to fire that energy back out in short energy bursts. “Energy bursts” almost gets it wrong—think “force-push” not “optic blast.” All the old powers are there—enhanced senses, agility, peak-human strength, etc. But this idea (and others) really came out of Brian’s thoughts—not just on the suit—but on the properties of Vibranium itself.
Writing, for me, is a lonely exercise. I pitch an idea to my editors and then I disappear for a while. There are a few regular check-ins, but generally the next thing they see from me is a draft. Black Panther has been different. There’s a lot more collaboration and conversation. Barely three days go by without my talking to Brian or my editor, Wil Moss.
I’ll have more to say about that process as the days go on. For now, enjoy some of Brian’s (awesome) concept art. I’ve seen some of his penciled pages already. They’re glorious. I’m trying to keep up.
As their goosebumps have long suggested, women perform better on tests of cognitive function at toastier room temperatures.
If “I told you so” had a sensation, it would be the sweet cocoon of an 80-degree workspace. For years, women have been saying that the AC is on too damn high. We’ve dragged not one but two sweaters to the office in the summer: one for our slowly numbing legs, and one for our shivering shoulders. Scientific studies have already shown that offices are set for men’s frostier preferred temperatures.
Now a new paper confirms what many of us have long suspected. Women don’t just prefer warmer office temperatures. They perform better in them, too.
For the study, published today in the journal PLOS One, the researchers Tom Chang and Agne Kajackaite had 543 college students in Berlin take different types of tests in a room set to various temperatures between 61 and 91 degrees Fahrenheit. First, the participants had to answer logic problems, like the one about a bat costing $1 more than a ball. Then, the students were asked to add up two-digit numbers without a calculator. Finally, they had to form German words out of the letter scramble ADEHINRSTU.
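For context, that first task is presumably the classic bat-and-ball item from the Cognitive Reflection Test; the exact wording used in the Berlin study is an assumption here, but the standard version asks: a bat and a ball cost $1.10 together, and the bat costs $1.00 more than the ball. How much does the ball cost? The intuitive answer, 10 cents, is wrong. Working it out with the ball’s price as $b$:

$$b + (b + 1.00) = 1.10 \;\Rightarrow\; 2b = 0.10 \;\Rightarrow\; b = 0.05,$$

so the ball costs 5 cents and the bat $1.05.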
Out with the kitchen table, and in with the couch.
According to a recent survey of more than 1,000 American adults, the table is becoming a less and less popular surface to eat on. Nearly three-quarters of those surveyed said they grew up typically eating dinner at a kitchen table, but a little less than half said they do so now when eating at home.
Where are they dining instead? The couch and the bedroom are both far more popular now than in the respondents’ youth. Thirty percent of the survey takers cited the couch as their primary at-home eating location, and 17 percent took meals in the bedroom. To put it another way, the number of respondents who most often eat at a kitchen table nowadays is roughly the same as the number who eat either on the couch or in their bedroom.
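A rough arithmetic check of that last claim, assuming the “little less than half” who eat at a kitchen table is about 47 percent of respondents:

$$30\% \;(\text{couch}) + 17\% \;(\text{bedroom}) = 47\%,$$

essentially matching the share who still eat at the table.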
Poor white Americans’ current crisis shouldn’t have caught the rest of the country as off guard as it has.
Sometime during the past few years, the country started talking differently about white Americans of modest means. Early in the Obama era, the ennobling language of campaign pundits prevailed. There was much discussion of “white working-class voters,” with whom the Democrats, and especially Barack Obama, were having such trouble connecting. Never mind that this overbroad category of Americans—the exit pollsters’ definition was anyone without a four-year college degree, or more than a third of the electorate—obliterated major differences in geography, ethnicity, and culture. The label served to conjure a vast swath of salt-of-the-earth citizens living and working in the wide-open spaces between the coasts—Sarah Palin’s “real America”—who were dubious of the effete, hifalutin types increasingly dominating the party that had once purported to represent the common man. The “white working class” connoted virtue and integrity. A party losing touch with it was a party unmoored.
If mothers and fathers speak openly about child-care obligations, their colleagues will adapt.
I’m an economist. I love data and evidence. I love them so much that I write books about data-based parenting. When questions arise about how to support parents at work (for example, from Alexandria Ocasio-Cortez on Twitter), my first impulse is to endorse paid parental leave. Mountains of data and evidence show that paid leave is good for children’s health, and for mothers in particular. I am more than comfortable making a data-based case for this policy.
But experience, rather than pure data, leads me to believe that what happens after paid leave is nearly as crucial—that is to say, what happens when Mom and Dad return to the office. We need to normalize the experience of parenting while working.
As 23 candidates struggle for attention, one name stands out.
Barack Obama is literally more popular than Jesus among Democrats. Unfortunately, neither the former president nor any of the party’s 23 candidates currently seeking the 2020 nomination know quite what to do with that information.
Of course, before any serious endorsement conversation can commence, Obama has to finish his book (between rounds of golf and raising millions for his foundation). The writing has been going more slowly than he’d expected, and according to several people who have spoken with him, the 44th president is feeling competitive with his wife, whose own book, Becoming, was the biggest release of 2018 and is on track to be the best-selling memoir in history. Speaking on the condition of anonymity, like others in this story, these sources note he’ll occasionally point out in conversation that he’s writing this book himself, while Michelle used a ghostwriter. He’s also trying to balance the historical and political needs of a project that will be up to his standards as a writer, and not 1,000 pages long. Obama’s research process has been intense and convoluted, and it’s still very much ongoing, from the legal pads he had shipped to Marlon Brando’s old island in French Polynesia, where he spent a month in March 2017, to the interviews that aides have been conducting with former members of his administration to jog and build out memories.
A neuroscientist on how we came to be aware of ourselves.
Ever since Charles Darwin published On the Origin of Species in 1859, evolution has been the grand unifying theory of biology. Yet one of our most important biological traits, consciousness, is rarely studied in the context of evolution. Theories of consciousness come from religion, from philosophy, from cognitive science, but not so much from evolutionary biology. Maybe that’s why so few theories have been able to tackle basic questions such as: What is the adaptive value of consciousness? When did it evolve and what animals have it?
The Attention Schema Theory (AST), developed over the past five years, may be able to answer those questions. The theory suggests that consciousness arises as a solution to one of the most fundamental problems facing any nervous system: Too much information constantly flows in to be fully processed. The brain evolved increasingly sophisticated mechanisms for deeply processing a few select signals at the expense of others, and in the AST, consciousness is the ultimate result of that evolutionary sequence. If the theory is right—and that has yet to be determined—then consciousness evolved gradually over the past half billion years and is present in a range of vertebrate species.
John Walker Lindh was the first American to face charges related to the War on Terror. Dozens have followed.
John Walker Lindh, the “American Taliban,” is leaving prison. When the young Californian began serving his sentence for the crime of supporting the group—nearly two decades ago—he was 21, and America was fighting the Taliban in Afghanistan as part of the post-9/11 War on Terror. Now, the United States is holding negotiations with the group to try to get troops out of the country, and has even considered paying Taliban emissaries’ expenses to get to peace talks.
Lindh’s incarceration has spanned nearly the entirety of America’s post-9/11 wars. Early on, many Americans saw him as the face of terror, even though he was never convicted of plotting attacks against them. He had joined the Taliban in the summer of 2001, months before the U.S. was at war with the group, to help it fight in its own civil war. He had stayed with the group after 9/11, and had been present at a prisoner uprising that killed the 32-year-old CIA officer Johnny Micheal Spann, the first American to die in the new war. By then, George W. Bush had declared that the U.S. would make no distinction between al-Qaeda, bin Laden’s international terrorist network that had perpetrated the 9/11 attacks, and the Taliban, the Islamist fundamentalist government in Afghanistan that had sheltered al-Qaeda while it plotted. Americans were shocked to see one of their own on the other side.
Applying to schools has become an endless chore—one that teaches students nothing about what really matters in higher education.
The crazed pursuit of college admissions helps no one thrive. And while the Varsity Blues admissions scandal shines a light on families that break the rules, it’s time to consider the unhappiness of families that play by them. While competition for seats may be inevitable, students scramble to do ever more to get into college—and give away more of their childhood to do so. This competition might seem a problem only for middle-class and wealthy families. But students of modest means suffer most when applying to college becomes an endless list of tasks requiring time and other resources.
As the CEO of the College Board, I see this arms race up close. We administer the SAT, a test that helps admissions officers assess the reading, writing, and math skills of students across the country and around the world. We also administer the Advanced Placement program, which helps students earn credit for college-level work they do while in high school. We know these tools to be useful, but we also see how they can contribute to the arms race. The College Board can and will do more to limit the excesses—more on that below—but there is more at stake than which tests kids take or don’t take.
Certain stars have a history distinct from all the others around them.
We are made of star stuff, as Carl Sagan told us. The first stars ignited billions of years ago, out of the cold, primordial gas in the dark universe. The stars blazed until they exploded in bursts powerful enough to forge heavy chemical elements. The process repeated itself, over and over, all across space. The new elements found their way into other stars, and then planets, and, eventually, life.
It’s a remarkable cosmic tale, with a recent twist. Some of the stardust has managed to become sentient, work out its own history, and use that knowledge to better understand the stars.
Astronomers know stars so well, in fact, that they can tell when one doesn’t belong—when it’s migrated to our galaxy from a completely different one.
Disney’s live-action remake of the 1992 animated classic is a special-effects-laden extravaganza that comes off as clumsy and half-hearted.
Disney’s 1992 classic Aladdin is one of the greatest cinematic arguments for the storytelling potential of animation, which is perfectly expressed through the character of Genie. As voiced by Robin Williams and rendered in two dimensions, he’s a slapstick genius who can conjure anything, appear in any shape or size, and gleefully defy the laws of physics. For years, animation was the only way such a fantastic character could exist on-screen, but in 2019, visual effects have advanced enough that audiences can see a gigantic blue version of Will Smith try to give the same performance. Technological progress has clearly gone too far.
Guy Ritchie’s live-action remake of Aladdin, the latest in a long line of Disney revivals of its own greatest works, exists in the same nostalgic sphere as recent hits such as Beauty and the Beast and The Jungle Book. It’s a garish, special-effects-laden extravaganza that still manages to feel tossed-off and half-hearted. The film is entirely devoted to the property it’s adapting, but its mimicry underlines just how pale an imitation it is. The only participant really trying to energize the project is Smith, who—poor man—has to spend much of his screen time transformed into a rubbery CGI monstrosity who’s impossible to take seriously.