When I heard Obama's invocation of the Proclamation last week, it immediately struck me as wrong -- but for different reasons. This letter, written to President Lincoln in 1864, has always stuck with me:
Belair [Md.] Aug 25th 1864
Mr president It is my Desire to be free. to go to see my people on the eastern shore. my mistress wont let me you will please let me know if we are free. and what i can do. I write to you for advice. please send me word this week. or as soon as possible and oblidge.
When I read this I was basically of Obama's view -- that the Proclamation was a necessary compromise, the sort of thing that is essential to American democracy. But I also thought it was important to always remember that compromise, whatever its virtue, isn't an abstract concept. It's the compromising of the lives of actual people. But in the course of researching the column I came to a somewhat different opinion -- that the Proclamation actually went further than I thought.
Better people here will know this, but my understanding is that there really was no constitutional mechanism by which Lincoln could -- with a wave of his pen -- emancipate the slaves of loyal owners. Thus there never really was a choice between, say, ending slavery everywhere and ending it just in disloyal states. The compromise was whether the Proclamation would cover all formerly rebel areas that had fallen under Union control -- occupied areas of Tennessee, Virginia, South Carolina, and Louisiana, for instance. And the Proclamation did actually exempt some of those areas.
But on the other side of the ledger there's the fact that Lincoln immediately effected the largest act of manumission in American history with a stroke of the pen. I haven't come across a precise count of who was immediately freed by the Proclamation, but it was in the thousands, and Foner estimates that it may well have ranged into the tens of thousands. Other states had emancipated slaves -- but almost always gradually. Nothing like this -- an immediate grant of freedom to thousands of slaves -- had happened before.
This is to say nothing of those slaves who were freed as the Union Army pushed south. To me, that really is the heart of the Proclamation's genius. Remember that it was not an act of kindness, but a hard-nosed policy of belligerence put forth by a country trying to win a war. The Proclamation necessarily united the war for the Union with the destruction of slavery. It's almost impossible to imagine a Union in which slavery was destroyed in the Deep South but somehow thrived in the border states. Finally, and least appreciated in my view, the Proclamation brought, at final count, almost 200,000 black men into the Union Army.
It's worth considering that the Proclamation was not the act of
Lincoln moving closer to the slave-holders, but to their opponents. From
Eric Foner's Pulitzer Prize winner, The Fiery Trial:
The Emancipation Proclamation differed dramatically from Lincoln's previous
policies regarding slavery and emancipation, some of which dated back
to his days in the Illinois legislature and Congress. It abandoned the
idea of seeking the cooperation of slaveholders in emancipation, and of
distinguishing between loyal and disloyal owners. It was immediate, not
gradual; contained no mention of monetary compensation for slaveowners,
did not depend on action by the states, and made no reference to
colonization (in part, perhaps, because gradualism, compensation, and
colonization had no bearing on the "military necessity"
that justified the document.) Lincoln had long resisted the enlistment
of black soldiers; now he welcomed them into the Union Army. The
Proclamation addressed slaves directly, not as the property of the
country's enemies but as persons with wills of their own whose action
might help win the Civil War.
I want to hammer down on Foner's point about arming blacks. In the summer of 1862, Lincoln said that he feared if he armed blacks "in a few weeks the arms would be in the hands of the rebels." A year later, he was arguing that in military matters, blacks were...
greatest available, and yet unavailed of force for restoring the Union. The bare sight of fifty thousand armed, and drilled black soldiers on the banks of the Mississippi, would end the rebellion at once.
There is some bravado here, no doubt. But it's important to understand that this isn't just about
the violence itself. It's difficult to understand, in today's society,
what it actually meant to recognize another human's right to hold a gun.
The right to bear arms was, in previous centuries, directly tied to
citizenship, as was military service. To open the Army to men of all
colors was to admit the possibility of expanding the franchise, and
perhaps even political office, across the color line. It was to grant
that America's broad aristocracy would not be forever color-bound.
That is exactly what happened. It may not come across in my writing, but I
have deep roots in America's radical tradition, in general, and the
black radical tradition specifically. Like a lot of people of that ilk, I had a tendency to write off the Proclamation as a weak-kneed compromise proffered by another racist president. By last week, I was
past that point. Still, the research really affirmed something for me -- those of us who are radicals, whether practicing or not, shouldn't downplay the Proclamation; we should take credit for it. As Douglass
did. As Phillips did. The Proclamation and all that followed is a
textbook example of what a dose of radicalism can do for democracy.
I started this letter musing about an enslaved black woman whom Lincoln's
compromise left in limbo. She should be remembered--but she shouldn't
be remembered alone. Again from Foner:
Despite its palpable limitations, the proclamation set off scenes of jubilation
among free blacks in the North and contrabands and slaves in the South.
At Beaufort on the Sea Islands, over 5,000 African-Americans celebrated
their freedom by singing what a white observer called "the Marseillaise
of the slave"; "In that New Jerusalem, I am not afraid to die; We must
fight for liberty in that New Jerusalem." In the North, blacks gathered
in their churches. "I have never witnessed," the abolitionist Benjamin
R. Plumly wrote to Lincoln from Philadelphia, "such intense, intelligent
and devout Thanksgiving..." When one person suggested that Lincoln
might pursue "some form of colonization," a woman shouted, "God won't let him."
Indeed God didn't.
Finally, I think Lincoln's own words give some sense of how to handle
such momentous events with humility--"I claim not to have controlled
events, but confess plainly that events have controlled me."
P.S. In addition to Foner's The Fiery Trial, his Free Soil, Free Labor, Free Men and James Oakes' The Radical and The Republican were essential to all of my thinking.
Know your history, as they say.
MORE: The author of The Radical and The Republican is James Oakes, not Stephen Oates. My sincerest apologies for the botch.