America under the Articles of Confederation was a mess, not unlike Europe today. Could Europe's mess likewise lead to a stronger European Union?
German Chancellor Angela Merkel and French President Nicolas Sarkozy after a joint press conference at the Elysee Palace in Paris / AP
On Monday, French President Nicolas Sarkozy and German Chancellor Angela Merkel spoke of their desire to change the treaties currently holding the European Union together, and to push its member countries toward a tighter, more integrated federation. "We want to make sure that the imbalances that led to the situation in the euro zone today cannot happen again," Sarkozy said.
This would be a worthy project, of course, but treaty change and Union restructuring are about as big a political and legal headache as one could possibly imagine. With Europe this fractured, could leaders really agree upon a new setup? Is the political will present in the right states? Will the markets wait for the deliberations?
One always wants to be careful with historical comparisons: though TV pundits toss them around like party favors, their predictive power is limited. Looking to the Great Depression, for instance, doesn't immediately yield a solution or a timeline for our current economic woes. But there's another moment in American history that makes for a better comparison to Europe today: the period following the 1781 ratification of the Articles of Confederation.
There's something comforting about turning to a time when America was nearly as screwed up as Europe is today. Not only did the U.S. emerge from the tangle, but it emerged considerably stronger: the states' and nation's financial and logistical problems and mismanagement wound up pushing them to develop "a more perfect Union." So let's take a look for a moment at our own country's poor showing under the Articles of Confederation.
The Articles of Confederation, our pre-Constitution national legal framework, were drafted during the American Revolution and ratified largely in the late 1770s, with Maryland finally signing on in 1781. By the late 1780s -- though historians may disagree over the extent to which the states were in actual crisis -- the Confederation and its members were looking pretty shabby.
America was then a picture of, at least superficially, fascinating disarray. Under the Articles, the federal government had no power to tax. It procured money through a "requisition" system, with all states contributing, in theory, to the cost of providing national public goods. In practice, this was a disaster. After only a few requisitions, as Keith Dougherty and Michael J.G. Cain recall in "Marginal Cost Sharing and the Articles of Confederation," an article in the journal Public Choice, "states learned to withhold their payments, leaving Congress without the resources to carry out its constitutional responsibilities. Lack of revenue prevented Congress from forcefully responding to British non-compliance with the 1783 Anglo-American peace treaty, reacting to the Spanish blockade of the Mississippi River between 1784 and 1787, enforcing treaties with the Indians by limiting western movement of settlers, and averting the piracy of the Barbary states."
A pretty little arrangement, no? Much has been written about the Articles' failings, but what Dougherty and Cain point out is that they "failed to organize a union where state and national interests coincided." In practice, "states fully complying with [...] requisitions, when others did not, incurred a greater portion of the national costs than originally intended." Each state did best by letting the others pay: the free-rider problem, game theory 101.
Another set of problems, of course, came from the states themselves, which were proving truly terrible at handling taxation on their own. In fact, though the particulars differ, the strong financial and debt-driven component of the Confederation's problems calls to mind Athens over the past year.
The American Revolution, after all, was a war, and wars are so famously costly that historians like Charles Tilly have argued that warmaking and its associated taxation were the main drivers of state formation in Western Europe. The Revolution had been financed through loans, bonds, and poorly conceived paper currency, and the states then took on this debt. The historians Oscar and Mary Flug Handlin estimated that Massachusetts in the 1780s owed over $5 million to the Confederation, with its total debt around $14 million -- enormous sums at the time. The taxes imposed to pay it down proved extremely burdensome, one of several factors fueling unrest in Massachusetts that culminated in the armed uprising known as Shays' Rebellion -- which, in turn, fed the growing consensus that a new system might be a good idea.
Causation is tricky to establish: Robert Freer argued forcefully in The New England Quarterly back in the '60s that "in all likelihood, the Constitutional Convention would have met when it did, the same document would have been drawn up, and it would have been ratified even if Shays's Rebellion had not taken place." But one reason Freer believed we would have gotten the Constitution anyway is that there were plenty of other examples of financial and political disorder, like the failure to pay federal requisitions and the states of Maryland, Virginia, and Rhode Island mucking around with paper money.
America under the Articles of Confederation, in short, was a mess. And though you could debate the details endlessly, there's little doubt that the extent of that mess was on political leaders' minds when they started talking about reconfiguring things and calling the Constitutional Convention. Thus far, the Constitution has had a better track record than the Articles in keeping order.
What does this mean for Europe today? There are a number of cases in both American and European history where a non-lethal screwup -- say, the Articles of Confederation -- has provided the necessary impetus to establish a more screwup-resistant setup, as with the Constitution. So, while the ratings agencies seem to get more pessimistic by the day about Europe's prospects, maybe European leaders could pull out of this with something even stronger. Merkel and Sarkozy likely have a very nasty path ahead of them, if, indeed, either of them manages to stay around to push the treaty modifications through -- France, recall, has elections scheduled for next year. But there's a universe in which Europe exits this crisis stronger than it entered it.
The man who made computers personal was a genius and a jerk. A new documentary wonders whether his legacy can accommodate both realities.
An iPhone is a machine much like any other: motherboard, modem, microphone, microchip, battery, wires of gold and silver and copper twisting and snaking, the whole assembly arranged under a piece of glass whose surface—coated with an oxide of indium and tin to make it electrically conductive—sparks to life at the touch of a warm-blooded finger. But an iPhone, too, is much more than a machine. The neat ecosystem that hums under its touch-sensitive glass holds grocery lists and photos and games and jokes and news and books and music and secrets and the voices of loved ones and, quite possibly, every text you’ve ever exchanged with your best friend. Thought, memory, empathy, the stuff we sometimes shorthand as “the soul”: There it all is, zapping through metal whose curves and coils were designed to be held in a human hand.
In continuing to tinker with the universe she built, eight years after the series ended, J.K. Rowling might be falling into the same trap as Star Wars’s George Lucas.
September 1, 2015, marked a curious footnote in Harry Potter lore: According to the series’s elaborate timeline, rarely referenced in the books themselves, it was the day James S. Potter, Harry’s eldest son, started school at Hogwarts. It’s not an event directly written about in the books, nor one of particular importance, but their creator, J.K. Rowling, dutifully took to Twitter to announce what amounts to marginalia: that James was sorted into House Gryffindor, just like his father, to the disappointment of Teddy Lupin, Harry’s godson and, apparently, a Hufflepuff.
It’s not earth-shattering information that Harry’s kid would end up in the same house his father was in, and the Harry Potter series’s insistence on sorting all of its characters into four broad personality quadrants, largely based on their family names, has always struggled to stand up to scrutiny. Still, Rowling’s tweet prompted much garment-rending among the books’ devoted fans. Can a tweet really amount to a piece of canonical information for a book? There isn’t much harm in Rowling providing these little embellishments years after her books were published, but even idle tinkering can be a dangerous path to take, the obvious example being George Lucas’s insistent tweaks to his Star Wars films.
In the name of emotional well-being, college students are increasingly demanding protection from words and ideas they don’t like. Here’s why that’s disastrous for education—and mental health.
Something strange is happening at America’s colleges and universities. A movement is arising, undirected and driven largely by students, to scrub campuses clean of words, ideas, and subjects that might cause discomfort or give offense. Last December, Jeannie Suk wrote in an online article for The New Yorker about law students asking her fellow professors at Harvard not to teach rape law—or, in one case, even use the word violate (as in “that violates the law”) lest it cause students distress. In February, Laura Kipnis, a professor at Northwestern University, wrote an essay in The Chronicle of Higher Education describing a new campus politics of sexual paranoia—and was then subjected to a long investigation after students who were offended by the article and by a tweet she’d sent filed Title IX complaints against her. In June, a professor protecting himself with a pseudonym wrote an essay for Vox describing how gingerly he now has to teach. “I’m a Liberal Professor, and My Liberal Students Terrify Me,” the headline said. A number of popular comedians, including Chris Rock, have stopped performing on college campuses (see Caitlin Flanagan’s article in this month’s issue). Jerry Seinfeld and Bill Maher have publicly condemned the oversensitivity of college students, saying too many of them can’t take a joke.
According to Franklin, what mattered in business was humility, restraint, and discipline. But today’s Type-A MBAs would find him qualified for little more than a career in middle management.
When he retired from the printing business at the age of 42, Benjamin Franklin set his sights on becoming what he called a “Man of Leisure.” To modern ears, that title might suggest Franklin aimed to spend his autumn years sleeping in or stopping by the tavern, but to colonial contemporaries, it would have intimated aristocratic pretension. A “Man of Leisure” was typically a member of the landed elite, someone who spent his days fox hunting and affecting boredom. He didn’t have to work for a living, and, frankly, he wouldn’t dream of doing so.
Having worked as a successful shopkeeper with a keen eye for investments, Franklin had earned his leisure. But rather than cultivate the fine arts of indolence, he said retirement was “time for doing something useful.” Hence the many roles of Franklin’s retirement: scientist, statesman, and sage, as well as one-man civic society for the city of Philadelphia. His post-employment accomplishments earned him the sobriquet “The First American” in his own lifetime, and yet, for succeeding generations, the endeavor considered his most “useful” was the working life he left behind when he embarked on a life of leisure.
Some people see threats even when none are present. Strangely, this tendency can make them more creative.
For much of his life, Isaac Newton seemed like he was on the verge of a nervous breakdown. In 1693, the collapse finally arrived: After not sleeping for five days straight, Newton sent letters accusing his friends of conspiring against him. He was refraining from publishing books, he said at one point that year, “for fear that disputes and controversies may be raised against me by ignoramuses.”
Newton was, by many accounts, highly neurotic. Brilliant, but neurotic nonetheless. He was prone to depressive jags, mistrust, and angry outbursts.
Yet his genius might have been rooted in his maladjustments. His mental state led him to brood over past mistakes, and eventually, a breakthrough would dawn. “I keep the subject constantly before me,” he once said, “and wait till the first dawnings open slowly, by little and little, into a full and clear light.”
When Kenneth Jarecke photographed an Iraqi man burned alive, he thought it would change the way Americans saw the Gulf War. But the media wouldn’t run the picture.
The Iraqi soldier died attempting to pull himself up over the dashboard of his truck. The flames engulfed his vehicle and incinerated his body, turning him to dusty ash and blackened bone. In a photograph taken soon afterward, the soldier’s hand reaches out of the shattered windshield, which frames his face and chest. The colors and textures of his hand and shoulders look like those of the scorched and rusted metal around him. Fire has destroyed most of his features, leaving behind a skeletal face, fixed in a final rictus. He stares without eyes.
On February 28, 1991, Kenneth Jarecke stood in front of the charred man, parked amid the carbonized bodies of his fellow soldiers, and photographed him. At one point, before he died this dramatic mid-retreat death, the soldier had had a name. He’d fought in Saddam Hussein’s army and had a rank and an assignment and a unit. He might have been devoted to the dictator who sent him to occupy Kuwait and fight the Americans. Or he might have been an unlucky young man with no prospects, recruited off the streets of Baghdad.
I traveled to every country on earth. In some cases, the adventure started before I could get there.
Last summer, my Royal Air Maroc flight from Casablanca landed at Malabo International Airport in Equatorial Guinea, and I completed a 50-year mission: I had officially, and legally, visited every recognized country on earth.
This means 196 countries: the 193 members of the United Nations, plus Taiwan, Vatican City, and Kosovo, which are not members but are, to varying degrees, recognized as independent countries by other international actors.
In five decades of traveling, I’ve crossed countries by rickshaw, pedicab, bus, car, minivan, and bush taxi; a handful by train (Italy, Switzerland, Moldova, Belarus, Ukraine, Romania, and Greece); two by riverboat (Gabon and Germany); Norway by coastal steamer; Gambia and the Amazonian parts of Peru and Ecuador by motorized canoe; and half of Burma by motor scooter. I rode completely around Jamaica on a motorcycle and Nauru on a bicycle. I’ve also crossed three small countries on foot (Vatican City, San Marino, and Liechtenstein), and parts of others by horse, camel, elephant, llama, and donkey. I confess that I have not visited every one of the 7,107 islands in the Philippine archipelago or most of the more than 17,000 islands constituting Indonesia, but I’ve made my share of risky voyages on the rickety inter-island rustbuckets you read about in the back pages of the Times under headlines like “Ship Sinks in Sulu Sea, 400 Presumed Lost.”
Climate change means the end of our world, but the beginning of another—one with a new set of species and ecosystems.
A few years ago in a lab in Panama, Klaus Winter tried to conjure the future. A plant physiologist at the Smithsonian Tropical Research Institute, he planted seedlings of 10 tropical tree species in small, geodesic greenhouses. Some he allowed to grow in the kind of environment they were used to out in the forest, around 79 degrees Fahrenheit. Others, he subjected to uncomfortably high temperatures. Still others, unbearably high temperatures—up to a daily average temperature of 95 degrees and a peak of 102 degrees. That’s about as hot as Earth has ever been.
It’s also the kind of environment tropical trees have a good chance of living in by the end of this century, thanks to climate change. Winter wanted to see how they would do.
A tattooed, profanity-loving Lutheran pastor believes young people are drawn to Jesus, tradition, and brokenness.
“When Christians really critique me for using salty language, I literally don’t give a shit.”
This is what it’s like to talk to Nadia Bolz-Weber, the tattooed Lutheran pastor, former addict, and head of a Denver church that’s 250 members strong. She’s frank and charming, and yes, she tends to cuss—colorful words pepper her new book, Accidental Saints. But she also doesn’t put a lot of stock in her own schtick.
“Oh, here’s this tattooed pastor who is a recovering alcoholic who used to be a stand-up comic—that’s interesting for like five minutes,” she said. “The fact that people want to hear from me—that, I really feel, has less to do with me and more to do with a Zeitgeist issue.”
What do Google's trippy neural network-generated images tell us about the human mind?
When a collection of artificial brains at Google began generating psychedelic images from otherwise ordinary photos, engineers compared what they saw to dreamscapes. They named their image-generation technique Inceptionism and called the code used to power it Deep Dream.
But many of the people who saw the images reacted the same way: These things didn’t come from a dream world. They came from an acid trip.
The computer-made images feature scrolls of color, swirling lines, stretched faces, floating eyeballs, and uneasy waves of shadow and light. The machines seemed to be hallucinating, and in a way that appeared uncannily human.
The idea behind the project was to test the extent to which a neural network had learned to recognize various animals and landscapes by asking the computer to describe what it saw. So, instead of just showing a computer a picture of a tree and saying, "tell me what this is," engineers would show the computer an image and say, "enhance whatever it is you see."
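To make that instruction concrete, here is a rough, hypothetical sketch of the idea in PyTorch. It is not Google's actual Deep Dream code, but it captures the same basic move the engineers describe: pick an intermediate layer of a pretrained network, then repeatedly nudge the image's pixels, via gradient ascent, so that whatever patterns that layer already detects grow stronger. The network, layer, file name, step size, and iteration count below are all illustrative assumptions.

```python
# A rough sketch of the "enhance whatever it is you see" idea (not Google's code):
# run an image through a pretrained network, then adjust the image's pixels so
# that an intermediate layer's activations grow stronger, exaggerating whatever
# the network already "sees." Network, layer, step size, and iterations are
# illustrative choices, not the original Deep Dream settings.
import torch
from torchvision import models, transforms
from PIL import Image

model = models.googlenet(weights="DEFAULT").eval()

# Grab activations from one intermediate layer with a forward hook.
captured = {}
model.inception4c.register_forward_hook(
    lambda module, inputs, output: captured.update(feat=output)
)

# Turn an ordinary photo into a tensor we can take gradients with respect to.
to_tensor = transforms.Compose([transforms.Resize((224, 224)), transforms.ToTensor()])
img = to_tensor(Image.open("tree.jpg").convert("RGB")).unsqueeze(0).requires_grad_(True)

# Gradient ascent on the pixels: strengthen whatever the layer responds to.
for _ in range(20):
    model(img)
    captured["feat"].norm().backward()   # stronger activations = higher "score"
    with torch.no_grad():
        img += 0.01 * img.grad / (img.grad.abs().mean() + 1e-8)
        img.grad.zero_()
        img.clamp_(0.0, 1.0)             # keep pixel values in a valid range
```

Run long enough, or pointed at deeper layers, that feedback loop is what turns an ordinary photo into the swirls, eyeballs, and dreamlike textures the Deep Dream images became known for.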