I've been struggling all weekend to write something worthwhile about Manning Marable. On Saturday, I had the good fortune of receiving a note from Georgia State historian, and former Marable grad assistant, John McMillan, seeking a place to publish his own tribute. I hastily offered this page and McMillan was kind enough to offer his memories. They are as follows:
In hindsight, this is embarrassing to admit, but here goes. When I first met Manning Marable in 1996, at age 26, I was nervous. Partly I was on edge because I was trying to make a big decision: Should I pursue a Ph.D. in African-American history at either Rutgers or Michigan, where I'd been offered full funding? Or should I go to Columbia (my first choice), with no money upfront, but with some vague possibility of securing a teaching fellowship down the line?
Months before, Manning had already written me to say that if I were admitted to Columbia, he'd be keen to take me on as one of his graduate students. (That was a thrill unto itself!) Nevertheless, I couldn't help but wonder (and this is the embarrassing part): did he know I was white? And if so, would he have any doubts about my commitment to Black Studies, or my intellectual authority to work in the field?
Keep in mind, by then I'd read virtually all of Manning's
major works, including the earlier, more polemical stuff, like How Capitalism
Underdeveloped Black America, where he declared, "Progressive white Americans
must succeed in overturning their own racism."
No problem there, I chuckled. I'd long made a point of challenging racism in others, and I've always tried (to the best of my ability) never to tolerate it in myself. But then, he added this:
"Nothing short of a commitment to racial equality and Black
freedom such as that exhibited by the militant white abolitionist John Brown
will be sufficient."
There was only one way to gauge Manning's attitude, and that was to show up at his office. I made the haul all the way from mid-Michigan to New York City in my Chevy pick-up truck. At that point in my life, I'd never been anywhere near an Ivy League campus. My first memory of the area around Columbia comes from driving up and down Broadway, Amsterdam Ave., and perhaps a dozen cross streets in between, again and again and again, screaming and pounding on my dashboard over my inability to find a parking space.
As soon as I met Manning, though, all of my anxiety melted away. As anyone who knew him would agree, one of his most striking qualities was his affability. And although I probably would not have said this in print while he was still alive, the plain fact is, he really did look a lot like a teddy bear.
One thing I remember from that day is how vigorously he stressed the fact that he saw himself as both a scholar and an activist. For him, the two vocations were inseparable. What's more, he wanted me to know that when he became the founding director of the Institute for Research in African American Studies (IRAAS) a few years earlier, he'd envisioned it as fundamentally a community resource. And by "community," he pointed out, "I don't mean just Columbia, or even Morningside Heights." He gestured toward the window of his 6th-floor office, which afforded views to the north and the east. "We're not in Morningside Heights! We're in Harlem!"
To this end, he had a remarkable capacity for making time for virtually anyone who wanted something from him, even including the conspiratorial-minded guy with the rusty stains on his shirt (or was it blood?) who would occasionally show up unannounced at Manning's door, asking to bend his ear. Then there was this other fellow: he was never around, except for on the periodic occasions when the Institute would lay out a very nice buffet in honor of some distinguished guest speaker, in which case he would always be there, first in line, testing the capacity of his Styrofoam plates with enormous mounds of chicken wings, mini quiches, cocktail shrimp, and whatever else. (Okay, I'll confess: I once watched as Manning quietly observed this guy from the corner of the room, sighed heavily, and rolled his eyes.)
Manning was also one of the hardest workers in all of academia. In the mid-to-late 1990s, you might recall, a whole corps of "black public intellectuals" was suddenly gaining more exposure than they'd probably ever dreamed of. After a long period during which black scholars were more likely to toil away in obscurity, with their contributions slighted or overlooked, now at least a few of them - through a combination of intelligence, charisma, and moxie - seemed to be everywhere. And while some celebrated the new visibility of people like Henry Louis Gates, Jr., Cornel West, and Michael Eric Dyson, others sensed a certain entrepreneurialism in their approach. Sure, they could all talk a very good game, people used to grouse. Hell, put them in range of a microphone, and they'll talk about anything! But when it came to scholarship, what did they actually do?
That was never quite my own view, but regardless: nobody ever credibly said such a thing about Manning. Sure, he made TV appearances and gave paid lectures (oh, how he must have loved Black History Month). But he was also the author of god-knows-how-many books and articles, the great bulk of which showcased his deep immersion in fields as diverse as history, sociology, political science, economics, and even literature. His new, nearly 600-page opus, Malcolm X: A Life of Reinvention, is already being celebrated as an exhaustively researched tome, one that will completely upend our understanding of that fabled leader.
What an incredible exercise in self-restraint it must have been to keep plugging away on that biography for fifteen-odd years, all the while sitting on so many explosive revelations. I remember him excitedly making a few vague allusions to the discoveries he was making, way back in the day. Now we all know just what he was onto.
At the moment, I'm awfully sad that I didn't stay in better touch with Manning in recent years, though I can take some solace from the fact that about six weeks ago, I sent him a warmly inscribed copy of my first monograph.I have so many fond memories of our conversations from the three-year period that I worked for him, but I'll always treasure that first meeting the best.After listening to my concerns, putting me at ease, and making me laugh out loud, he said something I did not expect: "I might be able to help you out."
Five months later, I'd relocated to Manhattan, and I was meeting a considerable portion of my grad-school expenses by working as his research assistant. (We collaborated on two books.) Without him, I'm not sure I'd have mustered the courage to go to Columbia, something that later turned out - without question - to be one of the great blessings of my life. And yet whenever I tried to thank Manning for anything - whether for helping to pay for my education, or for buying me a sandwich (as he sometimes did) - I always got the same response. He'd shrug, smile impishly, and say, "Hey, what do you expect? I'm a socialist!"
John McMillian is Assistant Professor of history at Georgia State University. His most recent book is Smoking Typewriters: The Sixties Underground Press and the Rise of Alternative Media in America (Oxford, 2011).
Defining common cultural literacy for an increasingly diverse nation.
Is the culture war over?
That seems an absurd question. This is an age when Confederate monuments still stand; when white-privilege denialism is surging on social media; when legislators and educators in Arizona and Texas propose banning ethnic studies in public schools and assign textbooks euphemizing the slave trade; when fear of Hispanic and Asian immigrants remains strong enough to prevent immigration reform in Congress; when the simple assertion that #BlackLivesMatter cannot be accepted by all but is instead contested petulantly by many non-blacks as divisive, even discriminatory.
And that’s looking only at race. Add gender, guns, gays, and God to the mix and the culture war seems to be raging along quite nicely.
Be kind, show understanding, do good—but, some scientists say, don’t try to feel others’ pain.
In 2006, then-senator Barack Obama gave a commencement speech offering what seemed like very sensible advice. “There’s a lot of talk in this country about the federal deficit,” he told Northwestern’s graduating class. “But I think we should talk more about our empathy deficit—the ability to put ourselves in someone else’s shoes; to see the world through those who are different from us—the child who’s hungry, the laid-off steelworker, the immigrant woman cleaning your dorm room.”
In the years since then, the country has followed Obama’s counsel, at least when it comes to talking about empathy. It’s become a buzzword, extolled by Arianna Huffington, taught to doctors and cops, and used as a test for politicians. “We are on the cusp of an epic shift,” according to Jeremy Rifkin’s 2010 book The Empathic Civilization. “The Age of Reason is being eclipsed by the Age of Empathy.”
The Islamic State is no mere collection of psychopaths. It is a religious group with carefully considered beliefs, among them that it is a key agent of the coming apocalypse. Here’s what that means for its strategy—and for how to stop it.
What is the Islamic State?
Where did it come from, and what are its intentions? The simplicity of these questions can be deceiving, and few Western leaders seem to know the answers. In December, The New York Times published confidential comments by Major General Michael K. Nagata, the Special Operations commander for the United States in the Middle East, admitting that he had hardly begun figuring out the Islamic State’s appeal. “We have not defeated the idea,” he said. “We do not even understand the idea.” In the past year, President Obama has referred to the Islamic State, variously, as “not Islamic” and as al-Qaeda’s “jayvee team,” statements that reflected confusion about the group, and may have contributed to significant strategic errors.
Former Senator Jim Webb is the fifth Democrat to enter the race—and by far the most conservative one.
In a different era’s Democratic Party, Jim Webb might be a serious contender for the presidential nomination. He’s a war hero and former Navy secretary, but he has been an outspoken opponent of recent military interventions. He’s a former senator from Virginia, a purple state. He has a strong populist streak, could appeal to working-class white voters, and might even have crossover appeal from his days as a member of the Reagan administration.
In today’s leftward-drifting Democratic Party, however, it’s hard to see Webb—who declared his candidacy Thursday—getting very far. As surprising as Bernie Sanders’s rise in the polls has been, he looks more like the Democratic base than Webb does. The Virginian is progressive on a few major issues, including the military and campaign spending, but he’s far to the center or even right on others: He’s against affirmative action, supports gun rights, and is a defender of coal. During the George W. Bush administration, Democrats loved to have him as a foil to the White House. It’s hard to imagine the national electorate will cotton to him in the same way. Webb’s statement essentially saying he had no problem with the Confederate battle flag flying in places like the grounds of the South Carolina capitol may have been the final straw. (At 69, he’s also older than Hillary Clinton, whose age has been a topic of debate, though still younger than Bernie Sanders or Joe Biden.)
People labeled “smart” at a young age don’t deal well with being wrong. Life grows stagnant.
At whatever age smart people develop the idea that they are smart, they also tend to develop vulnerability around relinquishing that label. So the difference between telling a kid “You did a great job” and “You are smart” isn’t subtle. That is, at least, according to one growing movement in education and parenting that advocates for retirement of “the S word.”
The idea is that when we praise kids for being smart, those kids think: Oh good, I'm smart. And then later, when those kids mess up, which they will, they think: Oh no, I'm not smart after all. People will think I’m not smart after all. And that’s the worst. That’s a risk to avoid, they learn. “Smart” kids stand to become especially averse to making mistakes, which are critical to learning and succeeding.
For centuries, experts have predicted that machines would make workers obsolete. That moment may finally be arriving. Could that be a good thing?
1. Youngstown, U.S.A.
The end of work is still just a futuristic concept for most of the United States, but it is something like a moment in history for Youngstown, Ohio, one its residents can cite with precision: September 19, 1977.
For much of the 20th century, Youngstown’s steel mills delivered such great prosperity that the city was a model of the American dream, boasting a median income and a homeownership rate that were among the nation’s highest. But as manufacturing shifted abroad after World War II, Youngstown steel suffered, and on that gray September afternoon in 1977, Youngstown Sheet and Tube announced the shuttering of its Campbell Works mill. Within five years, the city lost 50,000 jobs and $1.3 billion in manufacturing wages. The effect was so severe that a term was coined to describe the fallout: regional depression.
The retired general and former CIA director holds forth on the Middle East.
ASPEN, Colo.—Retired U.S. Army General David Petraeus pioneered America’s approach to counterinsurgency, led the surge in Iraq, served as director of the CIA for a year, and was sentenced to two years’ probation for leaking classified information to his mistress. On Wednesday at the Aspen Ideas Festival, he was interviewed by my colleague, Jeffrey Goldberg, about subjects including efforts to stop Iran’s nuclear program; the civil war in Syria; ISIS and the threat it poses to the United States; and the Iraq War.
Here are several noteworthy moments from their conversation, slightly condensed:
The Risks of Attacking Iran
Jeffrey Goldberg: So you believe that, under certain circumstances, President Obama would still use military force against Iran?
David Petraeus: I think he would, actually. I know we’ve had red lines that didn’t turn out to be red lines. ... I think this is a different issue, and I clearly recognize how the administration has sought to show that this is very, very different from other sort of off-the-cuff remarks.
Goldberg: How did the Obama administration stop Israel from attacking Iran? And do you think that if this deal does go south, that Israel would be back in the picture?
Petraeus: I don’t, actually. I think Israel is very cognizant of its limitations. ... The Israelis do not have anything that can crack this deeply buried enrichment site ... and if you cannot do that, you’re not going to set the program back very much. So is it truly worth it, then?
So that’s a huge limitation. It’s also publicly known that we have a 30,000-pound projectile that no one else has, that no one else can even carry. The Massive Ordnance Penetrator was under design for almost six years. ... If necessary, we can take out all these facilities and set them back a few years, depending on your assumptions.
But that’s another roll of the iron dice, as Bismarck used to say, and you never know when those dice are rolled what the outcome is going to be. You don’t know what risks could materialize for those who are in harm’s way.
You don’t know what the response could be by Iran.
There’s always the chance that there will be salvos at Israel, but what if they decide to go at the Gulf states, where we have facilities in every single one?
This is not something to be taken lightly, clearly.
Freed last March from Death Row after serving 29 years for a crime he didn’t commit, the Louisiana native died Monday of lung cancer.
On November 5, 1983, a jewelry shop owner named Isadore Rozeman was murdered in a robbery in Shreveport, La. Glenn Ford, a local man who knew Rozeman slightly, was not guilty of the crime. But the next year, the 34-year-old was nonetheless convicted of Rozeman’s murder and sentenced to death. For the next 29 years, three months, and five days, Ford lived in an 8-foot by 10-foot cell in Louisiana’s Angola prison and spent most of his days in solitary confinement.
In March 2014, Ford was finally exonerated and released. But soon after gaining his freedom, Ford was condemned anew—this time by a lung cancer diagnosis. On Monday, he died of the disease in New Orleans, where a nonprofit organization had provided him with a home. Ford was 65.
The meaning of the Confederate flag is best discerned in the words of those who bore it.
This afternoon, in announcing her support for removing the Confederate flag from the capitol grounds, South Carolina Governor Nikki Haley asserted that killer Dylann Roof had “a sick and twisted view of the flag” which did not reflect “the people in our state who respect and in many ways revere it.” If the governor meant that very few of the flag’s supporters believe in mass murder, she is surely right. But on the question of whose view of the Confederate flag is more twisted, she is almost certainly wrong.
Roof’s belief that black life had no purpose beyond subjugation is “sick and twisted” in the exact same manner as the beliefs of those who created the Confederate flag were “sick and twisted.” The Confederate flag is directly tied to the Confederate cause, and the Confederate cause was white supremacy. This claim is not the result of revisionism. It does not require reading between the lines. It is the plain meaning of the words of those who bore the Confederate flag across history. These words must never be forgotten. Over the next few months the word “heritage” will be repeatedly invoked. It would be derelict to not examine the exact contents of that heritage.