I've been struggling all weekend to write something worthwhile about Manning Marable. On Saturday, I had the good fortune of receiving a note from Georgia State historian, and former Marable grad assistant, John McMillian, seeking a place to publish his own tribute. I hastily offered this page, and McMillian was kind enough to offer his memories. They are as follows:
In hindsight, this is embarrassing to admit, but here goes. When I first met Manning Marable in 1996, at age 26, I was nervous. Partly I was on edge because I was trying to make a big decision: Should I pursue a Ph.D. in African-American history at either Rutgers or Michigan, where I'd been offered full funding? Or should I go to Columbia (my first choice), with no money upfront, but with some vague possibility of securing a teaching fellowship down the line?
Months before, Manning had already written me to say that if I were admitted to Columbia, he'd be keen to take me on as one of his graduate students. (That was a thrill unto itself!) Nevertheless, I couldn't help but wonder (and this is the embarrassing part): Did he know I was white? And if so, would he have any doubts about my commitment to Black Studies, or my intellectual authority to work in the field?
Keep in mind, by then I'd read virtually all of Manning's major works, including the earlier, more polemical stuff, like How Capitalism Underdeveloped Black America, where he declared, "Progressive white Americans must succeed in overturning their own racism."
No problem there, I chuckled. I'd long made a point of challenging racism in others, and I've always tried (to the best of my ability) never to tolerate it in myself. But then he added this:
"Nothing short of a commitment to racial equality and Black freedom such as that exhibited by the militant white abolitionist John Brown will be sufficient."
There was only one way to gauge Manning's attitude, and that was to show up at his office. I made the haul all the way from mid-Michigan to New York City in my Chevy pick-up truck. At that point in my life, I'd never been anywhere near an Ivy League campus. My first memory of the area around Columbia comes from driving up and down Broadway, Amsterdam Ave., and perhaps a dozen cross streets in between, again and again and again, screaming and pounding on my dashboard over my inability to find a parking space.
As soon as I met Manning, though, all of my anxiety melted away. As anyone who knew him would agree, one of his most striking qualities was his affability. And although I probably would not have said this in print while he was still alive, the plain fact is, he really did look a lot like a teddy bear.
One thing I remember from that day is how vigorously he stressed the fact that he saw himself as both a scholar and an activist. For him, the two vocations were inseparable. What's more, he wanted me to know that when he became the founding director of the Institute for Research in African American Studies (IRAAS) a few years earlier, he'd envisioned it as fundamentally a community resource. And by "community," he pointed out, "I don't mean just Columbia, or even Morningside Heights." He gestured toward the window of his sixth-floor office, which afforded views to the north and the east. "We're not in Morningside Heights! We're in Harlem!"
To this end, he had a remarkable capacity for making time for virtually anyone who wanted something from him, even including the conspiratorial-minded guy with the rusty stains on his shirt (or was it blood?) who would occasionally show up unannounced at Manning's door, asking to bend his ear. Then there was this other fellow: he was never around, except for on the periodic occasions when the Institute would lay out a very nice buffet in honor of some distinguished guest speaker, in which case he would always be there, first in line, testing the capacity of his Styrofoam plates with enormous mounds of chicken wings, mini quiches, cocktail shrimp, and whatever else. (Okay, I'll confess: I once watched as Manning quietly observed this guy from the corner of the room, sighed heavily, and rolled his eyes.)
Manning was also one of the hardest workers in all of academia. In the mid-to-late 1990s, you might recall, a whole corps of "black public intellectuals" was suddenly gaining more exposure than they'd probably ever dreamed of. After a long period during which black scholars were more likely to toil away in obscurity, with their contributions being slighted or overlooked, now at least a few of them - through a combination of intelligence, charisma, and moxie - seemed to be everywhere. And while some celebrated the new visibility of people like Henry Louis Gates, Jr., Cornel West, and Michael Eric Dyson, others sensed a certain entrepreneurialism in their approach. Sure, they could all talk a very good game, people used to grouse. Hell, put them in range of a microphone, and they'll talk about anything! But when it came to scholarship, what did they actually do?
That was never quite my own view, but regardless: nobody ever credibly said such a thing about Manning. Sure, he made TV appearances and gave paid lectures (oh, how he must have loved Black History Month). But he was also the author of god-knows-how-many books and articles, the great bulk of which showcased his deep immersion in fields as diverse as history, sociology, political science, economics, and even literature. His new, nearly 600-page opus, Malcolm X: A Life of Reinvention, is already being celebrated as an exhaustively researched tome, one that will completely upend our understanding of that fabled leader.
What an incredible exercise in self-restraint it must have been to keep plugging away on that biography for fifteen-odd years, all the while sitting on so many explosive revelations. I remember him excitedly making a few vague allusions to the discoveries he was making, way back in the day. Now we all know just what he was onto.
At the moment, I'm awfully sad that I didn't stay in better touch with Manning in recent years, though I can take some solace from the fact that about six weeks ago, I sent him a warmly inscribed copy of my first monograph. I have so many fond memories of our conversations from the three-year period that I worked for him, but I'll always treasure that first meeting the best. After listening to my concerns, putting me at ease, and making me laugh out loud, he said something I did not expect: "I might be able to help you out."
Five months later, I'd relocated to Manhattan, and I was meeting a considerable portion of my grad school expenses by working as his research assistant. (We collaborated on two books.) Without him, I'm not sure I'd have mustered the courage to go to Columbia, something that later turned out - without question - to be one of the great blessings of my life. And yet whenever I tried to thank Manning for anything - whether for helping to pay for my education, or for buying me a sandwich (as he sometimes did), I always got the same response. He'd shrug, smile impishly, and say, "Hey, what do you expect? I'm a socialist!"
John McMillian is Assistant Professor of history at Georgia State University. His most recent book is Smoking Typewriters: The Sixties Underground Press and the Rise of Alternative Media in America (Oxford, 2011).
Meet the Bernie Sanders supporters who say they won’t switch allegiances, no matter what happens in the general election.
Loyal fans of Bernie Sanders have a difficult decision to make. If Hillary Clinton faces off against Donald Trump in the 2016 presidential election, legions of Sanders supporters will have to decide whether to switch allegiances or stand by Bernie until the bitter end.
At least some supporters of the Vermont senator insist they won’t vote for Clinton, no matter what. Many view the former secretary of state with her deep ties to the Democratic establishment as the polar opposite of Sanders and his rallying cry of political revolution. Throwing their weight behind her White House bid would feel like a betrayal of everything they believe.
These voters express unwavering dedication to Sanders on social media, deploying hashtags like NeverClinton and NeverHillary, and circulating petitions like www.wontvotehillary.com, which asks visitors to promise “under no circumstances will I vote for Hillary Clinton.” It’s garnered more than 56,500 signatures so far. Many feel alienated by the Democratic Party. They may want unity, but not if it means a stamp of approval for a political status quo they believe is fundamentally flawed and needs to be fixed.
The candidate has exposed the tension between democracy and liberal values—just like the Arab Spring did.
When I was living in the Middle East, politics always felt existential, in a way that I suppose I could never fully understand. After all, I could always leave (as my relatives in Egypt were fond of reminding me). But it was easy enough to sense it. Here, in the era of Arab revolt, elections really had consequences. Politics wasn’t about policy; it was about a battle over the very meaning and purpose of the nation-state. These were the things that mattered more than anything else, in part because they were impossible to measure or quantify.
The primary divide in most Arab countries was between Islamists and non-Islamists. The latter, especially those of a more secular bent, feared that Islamist rule, however “democratic” it might be, would alter the nature of their countries beyond recognition. It wouldn’t just affect their governments or their laws, but how they lived, what they wore, and how they raised their sons and daughters.
Nearly half of Americans would have trouble finding $400 to pay for an emergency. I’m one of them.
Since 2013, the Federal Reserve Board has conducted a survey to "monitor the financial and economic status of American consumers." Most of the data in the latest survey, frankly, are less than earth-shattering: 49 percent of part-time workers would prefer to work more hours at their current wage; 29 percent of Americans expect to earn a higher income in the coming year; 43 percent of homeowners who have owned their home for at least a year believe its value has increased. But the answer to one question was astonishing. The Fed asked respondents how they would pay for a $400 emergency. The answer: 47 percent of respondents said that either they would cover the expense by borrowing or selling something, or they would not be able to come up with the $400 at all. Four hundred dollars! Who knew?
Boosting your ego won’t make you feel better. Instead, try talking to yourself like you would your best friend.
In 1986, California state assemblyman John Vasconcellos came up with what he believed could be “a vaccine for major social ills” like teen pregnancy and drug abuse: a special task-force to promote self-esteem among Californians. The effort folded three years later, and was widely considered not to have accomplished much.
To Kristin Neff, a psychology professor at the University of Texas, that’s not surprising. Though self-esteem continues to reverberate as a pop-psych cure-all, the quest for inflated egos, in her view, is misguided and largely pointless.
There’s nothing wrong with being confident, to answer Demi Lovato’s question. The trouble is how we try to achieve high self-regard. Often, it’s by undermining others or comparing our achievements to those around us. That’s not just unsustainable, Neff argues, it can also lead to narcissism or depressive bouts during hard times.
There’s no escaping the pressure that U.S. inequality exerts on parents to make sure their kids succeed.
More than a half-century ago, Betty Friedan set out to call attention to “the problem that has no name,” by which she meant the dissatisfaction of millions of American housewives.
Today, many are suffering from another problem that has no name, and it’s manifested in the bleak financial situations of millions of middle-class—and even upper-middle-class—American households.
Poverty doesn’t describe the situation of middle-class Americans, who by definition earn decent incomes and live in relative material comfort. Yet they are in financial distress. For people earning between $40,000 and $100,000 (i.e. not the very poorest), 44 percent said they could not come up with $400 in an emergency (either with cash or with a credit card whose bill they could pay off within a month). Even more astonishing, 27 percent of those making more than $100,000 also could not. This is not poverty. So what is it?
It’s a paradox: Shouldn’t the most accomplished be well equipped to make choices that maximize life satisfaction?
There are three things, once one’s basic needs are satisfied, that academic literature points to as the ingredients for happiness: having meaningful social relationships, being good at whatever it is one spends one’s days doing, and having the freedom to make life decisions independently.
But research into happiness has also yielded something a little less obvious: Being better educated, richer, or more accomplished doesn’t do much to predict whether someone will be happy. In fact, it might mean someone is less likely to be satisfied with life.
That second finding is the puzzle that Raj Raghunathan, a professor of marketing at The University of Texas at Austin's McCombs School of Business, tries to make sense of in his recent book, If You're So Smart, Why Aren't You Happy? Raghunathan's writing does fall under the category of self-help (with all of the pep talks and progress worksheets that that entails), but his commitment to scientific research serves as ballast for the genre's more glib tendencies.
The U.S. president talks through his hardest decisions about America’s role in the world.
Friday, August 30, 2013, the day the feckless Barack Obama brought to a premature end America’s reign as the world’s sole indispensable superpower—or, alternatively, the day the sagacious Barack Obama peered into the Middle Eastern abyss and stepped back from the consuming void—began with a thundering speech given on Obama’s behalf by his secretary of state, John Kerry, in Washington, D.C. The subject of Kerry’s uncharacteristically Churchillian remarks, delivered in the Treaty Room at the State Department, was the gassing of civilians by the president of Syria, Bashar al-Assad.
Stuffed to overflowing with superheroes, the studio’s latest nonetheless understands that character is key.
Way back in 2012, I was genuinely astonished by the cinematic juggling act that Joss Whedon accomplished in The Avengers. Six heroes pulled from widely different walks of super-life: Who could believe he’d manage to integrate them all into a coherent story?
These days, that challenge looks rudimentary. A year ago, Whedon’s The Avengers: Age of Ultron found space to squeeze in three more heroes and a brand-new super-villain, along with another half-dozen characters from the ever-expanding Marvel universe. And now, in Captain America: Civil War—which serves in many respects as a third Avengers movie—we have fully a dozen heroes divvied up into two competing super-teams. At this rate, pretty soon Marvel Studios honcho Kevin Feige will have to rent out a stadium just to accommodate his lycra-clad swarms.
A professor of cognitive science argues that the world is nothing like the one we experience through our senses.
As we go about our daily lives, we tend to assume that our perceptions—sights, sounds, textures, tastes—are an accurate portrayal of the real world. Sure, when we stop and think about it—or when we find ourselves fooled by a perceptual illusion—we realize with a jolt that what we perceive is never the world directly, but rather our brain’s best guess at what that world is like, a kind of internal simulation of an external reality. Still, we bank on the fact that our simulation is a reasonably decent one. If it wasn’t, wouldn’t evolution have weeded us out by now? The true reality might be forever beyond our reach, but surely our senses give us at least an inkling of what it’s really like.