I've been struggling all weekend to write something worthwhile about Manning Marable. On Saturday, I had the good fortune of receiving a note from Georgia State historian, and former Marable grad assistant, John McMillian, seeking a place to publish his own tribute. I hastily offered this page and McMillian was kind enough to offer his memories. They are as follows:
In hindsight, this is embarrassing to admit, but here goes. When I first met Manning Marable in 1996, at age 26, I was nervous. Partly I was on edge because I was trying to make a big decision: Should I pursue a Ph.D. in African-American history at either Rutgers or Michigan, where I'd been offered full funding? Or should I go to Columbia (my first choice), with no money upfront, but with some vague possibility of securing a teaching fellowship down the line?
Months before, Manning had already written me to say that if I were admitted to Columbia, he'd be keen to take me on as one of his graduate students. (That was a thrill unto itself!) Nevertheless, I couldn't help but wonder (and this is the embarrassing part): did he know I was white? And if so, would he have any doubts about my commitment to Black Studies, or my intellectual authority to work in the field?
Keep in mind, by then I'd read virtually all of Manning's major works, including the earlier, more polemical stuff, like How Capitalism Underdeveloped Black America, where he declared, "Progressive white Americans must succeed in overturning their own racism."
No problem there, I chuckled. I'd long made a point of challenging racism in others, and I've always tried (to the best of my ability) never to tolerate it in myself. But then he added this: "Nothing short of a commitment to racial equality and Black freedom such as that exhibited by the militant white abolitionist John Brown will be sufficient."
There was only one way to gauge Manning's attitude, and that was to show up at his office. I made the haul all the way from mid-Michigan to New York City in my Chevy pick-up truck. At that point in my life, I'd never been anywhere near an Ivy League campus. My first memory of the area around Columbia comes from driving up and down Broadway, Amsterdam Ave., and perhaps a dozen cross streets in between, again and again and again, screaming and pounding on my dashboard over my inability to find a parking space.
As soon as I met Manning, though, all of my anxiety melted away. As anyone who knew him would agree, one of his most striking qualities was his affability. And although I probably would not have said this in print while he was still alive, the plain fact is, he really did look a lot like a teddy bear.
One thing I remember from that day is how vigorously he stressed the fact that he saw himself as both a scholar and an activist. For him, the two vocations were inseparable. What's more, he wanted me to know that when he became the founding director of the Institute for Research in African American Studies (IRAAS) a few years earlier, he'd envisioned it as fundamentally a community resource. And by "community," he pointed out, "I don't mean just Columbia, or even Morningside Heights." He gestured toward the window of his 6th-floor office, which afforded views to the north and the east. "We're not in Morningside Heights! We're in Harlem!"
To this end, he had a remarkable capacity for making time for virtually anyone who wanted something from him, even including the conspiratorial-minded guy with the rusty stains on his shirt (or was it blood?) who would occasionally show up unannounced at Manning's door, asking to bend his ear. Then there was this other fellow: he was never around, except for on the periodic occasions when the Institute would lay out a very nice buffet in honor of some distinguished guest speaker, in which case he would always be there, first in line, testing the capacity of his Styrofoam plates with enormous mounds of chicken wings, mini quiches, cocktail shrimp, and whatever else. (Okay, I'll confess: I once watched as Manning quietly observed this guy from the corner of the room, sighed heavily, and rolled his eyes.)
Manning was also one of the hardest workers in all of academia. In the mid-to-late 1990s, you might recall, a whole corps of "black public intellectuals" was suddenly gaining more exposure than they'd probably ever dreamed of. After a long period during which black scholars were more likely to toil away in obscurity, with their contributions being slighted or overlooked, now at least a few of them - through a combination of intelligence, charisma, and moxie - seemed to be everywhere. And while some celebrated the new visibility of people like Henry Louis Gates, Jr., Cornel West, and Michael Eric Dyson, others sensed a certain entrepreneurialism in their approach. Sure, they could all talk a very good game, people used to grouse. Hell, put them in range of a microphone, and they'll talk about anything! But when it came to scholarship, what did they actually do?
That was never quite my own view, but regardless: nobody ever credibly said such a thing about Manning. Sure, he made TV appearances and gave paid lectures (oh, how he must have loved Black History Month). But he was also the author of god-knows-how-many books and articles, the great bulk of which showcased his deep immersion in fields as diverse as history, sociology, political science, economics, and even literature. His new, nearly 600-page opus, Malcolm X: A Life of Reinvention, is already being celebrated as an exhaustively researched tome, one that will completely upend our understanding of that fabled leader.
What an incredible exercise in self-restraint it must have been to keep plugging away on that biography for fifteen-odd years, all the while sitting on so many explosive revelations. I remember him excitedly making a few vague allusions to the discoveries he was making, way back in the day. Now we all know just what he was onto.
At the moment, I'm awfully sad that I didn't stay in better touch with Manning in recent years, though I can take some solace from the fact that about six weeks ago, I sent him a warmly inscribed copy of my first monograph. I have so many fond memories of our conversations from the three-year period that I worked for him, but I'll always treasure that first meeting the best. After listening to my concerns, putting me at ease, and making me laugh out loud, he said something I did not expect: "I might be able to help you out."
Five months later, I'd relocated to Manhattan, and I was meeting a considerable portion of my grad school expenses by working as his research assistant. (We collaborated on two books.) Without him, I'm not sure I'd have mustered the courage to go to Columbia, something that later turned out - without question - to be one of the great blessings of my life. And yet whenever I tried to thank Manning for anything - whether for helping to pay for my education, or for buying me a sandwich (as he sometimes did) - I always got the same response. He'd shrug, smile impishly, and say, "Hey, what do you expect? I'm a socialist!"
John McMillian is Assistant Professor of history at Georgia State University. His most recent book is Smoking Typewriters: The Sixties Underground Press and the Rise of Alternative Media in America (Oxford, 2011).
The number of American teens who excel at advanced math has surged. Why?
On a sultry evening last July, a tall, soft-spoken 17-year-old named David Stoner and nearly 600 other math whizzes from all over the world sat huddled in small groups around wicker bistro tables, talking in low voices and obsessively refreshing the browsers on their laptops. The air in the cavernous lobby of the Lotus Hotel Pang Suan Kaew in Chiang Mai, Thailand, was humid, recalls Stoner, whose light South Carolina accent warms his carefully chosen words. The tension in the room made it seem especially heavy, like the atmosphere at a high-stakes poker tournament.
Stoner and five teammates were representing the United States in the 56th International Mathematical Olympiad. They figured they’d done pretty well over the two days of competition. God knows, they’d trained hard. Stoner, like his teammates, had endured a grueling regime for more than a year—practicing tricky problems over breakfast before school and taking on more problems late into the evening after he completed the homework for his college-level math classes. Sometimes, he sketched out proofs on the large dry-erase board his dad had installed in his bedroom. Most nights, he put himself to sleep reading books like New Problems in Euclidean Geometry and An Introduction to Diophantine Equations.
For decades, some psychologists have claimed that bilinguals have better mental control. Their work is now being called into question.
In one of his sketches, comedian Eddie Izzard talks about how English speakers see bilingualism: “Two languages in one head? No one can live at that speed! Good lord, man. You’re asking the impossible,” he says. This satirical view used to be a serious one. People believed that if children grew up with two languages rattling around their heads, they would become so confused that their “intellectual and spiritual growth would not thereby be doubled, but halved,” wrote one professor in 1890. “The use of a foreign language in the home is one of the chief factors in producing mental retardation,” said another in 1926.
A century on, things are very different. Since the 1960s, several studies have shown that bilingualism leads to many advantages, beyond the obvious social benefits of being able to speak to more people. It also supposedly improves executive function—a catch-all term for advanced mental abilities that allow us to control our thoughts and behavior, such as focusing on a goal, ignoring distractions, switching attention, and planning for the future.
After getting shut down late last year, a website that allows free access to paywalled academic papers has sprung back up in a shadowy corner of the Internet.
There’s a battle raging over whether academic research should be free, and it’s overflowing into the dark web.
Most modern scholarly work remains locked behind paywalls, and unless your computer is on the network of a university with an expensive subscription, you have to pay a fee, often around 30 dollars, to access each paper.
Many scholars say this system makes publishers rich—Elsevier, a company that controls access to more than 2,000 journals, has a market capitalization about equal to that of Delta Airlines—but does not benefit the academics who conducted the research, or the public at large. Others worry that free academic journals would have a hard time upholding the rigorous standards and peer reviews that the most prestigious paid journals are famous for.
The Islamic State is no mere collection of psychopaths. It is a religious group with carefully considered beliefs, among them that it is a key agent of the coming apocalypse. Here’s what that means for its strategy—and for how to stop it.
What is the Islamic State?
Where did it come from, and what are its intentions? The simplicity of these questions can be deceiving, and few Western leaders seem to know the answers. In December, The New York Times published confidential comments by Major General Michael K. Nagata, the Special Operations commander for the United States in the Middle East, admitting that he had hardly begun figuring out the Islamic State’s appeal. “We have not defeated the idea,” he said. “We do not even understand the idea.” In the past year, President Obama has referred to the Islamic State, variously, as “not Islamic” and as al-Qaeda’s “jayvee team,” statements that reflected confusion about the group, and may have contributed to significant strategic errors.
The ancient civilization may have tracked Jupiter using sophisticated methods, but its reasons for stargazing were very different from ours.
We’ve never escaped the influence of the Babylonians. That there are 60 seconds in a minute, 60 minutes in an hour, and 360 degrees in a full circle, are all echoes of the Babylonian preference for counting in base 60. An affinity for base 12 (inches in a foot, pence in an old British shilling) is also an offshoot, 12 being a factor of 60.
All this suggests that the Babylonians had a mathematics worth copying, which is why the Greeks did copy it and thereby rooted these number systems in Western tradition. The latest indication of Babylonian mathematical sophistication is the discovery that their astronomers knew that, in effect, the distance traveled by a moving object is equal to the area under the graph of velocity plotted against time. Previously it had been thought that this relationship wasn't recognized until the fourteenth century in Europe. But historian Mathieu Ossendrijver of the Humboldt University in Berlin found the calculation described in a series of clay tablets inscribed with cuneiform writing in Babylonia during the fourth to the first centuries B.C.E., where it was used to figure out the distance traveled across the sky by the planet Jupiter.
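The geometric relationship described on the tablets can be sketched numerically: if you sample an object's velocity at several moments, the area under the velocity-time graph, here approximated with trapezoids, gives the distance traveled. The numbers below are purely illustrative, not data from the tablets.

```python
def trapezoid_area(times, velocities):
    """Approximate the area under a velocity curve sampled at the given times,
    summing one trapezoid per interval. That area equals distance traveled."""
    area = 0.0
    for i in range(1, len(times)):
        dt = times[i] - times[i - 1]
        area += 0.5 * (velocities[i] + velocities[i - 1]) * dt
    return area

# An object slowing steadily from 10 to 2 units of speed over 4 units of time
times = [0, 1, 2, 3, 4]
velocities = [10, 8, 6, 4, 2]
print(trapezoid_area(times, velocities))  # prints 24.0
```

For a velocity that changes linearly, as in this example, the trapezoid area is exact, which is the same property the Babylonian astronomers are described as exploiting for Jupiter's motion.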
After a pair of poor showings in New Hampshire, Chris Christie and Carly Fiorina drop out of the race.
The Republican race is headed to South Carolina with two fewer candidates. The day after finishing sixth and seventh in the New Hampshire primaries, New Jersey Governor Chris Christie and former Hewlett-Packard CEO Carly Fiorina announced on Wednesday that they were suspending their campaigns.
Fiorina was always a long shot—she was practically a political newcomer, having only run one unsuccessful Senate campaign. And while her record at HP was vulnerable to attack, Republican figures saw in her both private-sector experience and a woman who could counter Hillary Clinton’s monopoly on a “historic” woman’s candidacy. While many political professionals sniffed at Fiorina’s candidacy, remembering that 2010 Senate race, she broke out after a commanding performance in the undercard to the first Republican debate. That earned her a promotion to the main stage at the next debate, where she scored another victory. But it was all downhill from there. Dogged by questions of honesty and unable to earn media attention, her campaign faded quickly.
This morning I went on Democracy Now to discuss my critique of “class-first” policy as a way of ameliorating the effects of racism. In the midst of that discussion I made the point that one can maintain a critique of a candidate—in this case Bernie Sanders—and still feel that that candidate is deserving of your vote. Amy Goodman, being an excellent journalist, did exactly what she should have done—she asked if I were going to vote for Senator Sanders.
I, with some trepidation, answered in the affirmative. I did so because I’ve spent my career trying to get people to answer uncomfortable questions. Indeed, the entire reason I was on the show was to try to push liberals into directly addressing an uncomfortable issue that threatens their coalition. It seemed wrong, somehow, to ask others to step into their uncomfortable space and not do so myself. So I answered.
Issued last summer, the rules are the centerpiece of the White House's climate-change-fighting agenda, and they play a big part in the recent, tepid optimism about global warming. Without the proposal of the plan, the United States couldn't have secured the Paris Agreement, the first international treaty to mitigate greenhouse-gas emissions, last December. And without the adoption of the plan, the United States almost certainly won't be able to comply with that document. If the world were to lose the Paris Agreement—which was not a total solution to the climate crisis, but meant to be a first, provisional step—years could be lost in the diplomatic fight to reduce climate change's dangers.
Most people in the U.S. believe their country is going to hell. But they’re wrong. What a three-year journey by single-engine plane reveals about reinvention and renewal.
When news broke late last year of a mass shooting in San Bernardino, California, most people in the rest of the country, and even the state, probably had to search a map to figure out where the city was. I knew exactly, having grown up in the next-door town of Redlands (where the two killers lived) and having, by chance, spent a long period earlier in the year meeting and interviewing people in the unglamorous “Inland Empire” of Southern California as part of an ongoing project of reporting across America.
Some of what my wife, Deb, and I heard in San Bernardino before the shootings closely matched the picture that the nonstop news coverage presented afterward: San Bernardino as a poor, troubled town that sadly managed to combine nearly every destructive economic, political, and social trend of the country as a whole. San Bernardino went into bankruptcy in 2012 and was only beginning to emerge at the time of the shootings. Crime is high, household income is low, the downtown is nearly abandoned in the daytime and dangerous at night, and unemployment and welfare rates are persistently the worst in the state.
One day in 1979, James Burke, the chief executive of Johnson & Johnson, summoned more than 20 of his key people into a room, jabbed his finger at an internal document, and proposed destroying it.
The document was hardly incriminating. Entitled “Our Credo,” its plainspoken list of principles—including a higher duty to “mothers, and all others who use our products”—had been a fixture on company walls since 1943. But Burke was worried that managers had come to regard it as something like the Magna Carta: an important historical document, but hardly a tool for modern decision making. “If we’re not going to live by it, let’s tear it off the wall,” Burke told the group, using the weight of his office to force a debate. And that is what he got: a room full of managers debating the role of moral duties in daily business, and then choosing to resuscitate the credo as a living document.