America under the Articles of Confederation was a mess, not unlike Europe today. Could Europe's mess likewise lead to a stronger European Union?
German Chancellor Angela Merkel and French President Nicolas Sarkozy after a joint press conference at the Elysee Palace in Paris / AP
On Monday, French President Nicolas Sarkozy and German Chancellor Angela Merkel spoke of their desire to change the treaties currently holding the European Union together, and to push member countries toward a tighter, more integrated federation. "We want to make sure that the imbalances that led to the situation in the euro zone today cannot happen again," Sarkozy said.
This would be a worthy project, of course, but treaty change and Union restructuring are about as big a political and legal headache as one could possibly imagine. With Europe this fractured, could leaders really agree upon a new setup? Is the political will present in the right states? Will the markets wait for the deliberations?
One always wants to be careful with historical comparisons: though TV pundits toss them around like party favors, their predictive power is limited. Looking to the Great Depression, for instance, doesn't immediately yield a solution or a timeline for our current economic woes. But there's another moment in American history that makes for a better comparison to Europe today: the 1781 signing of the Articles of Confederation.
There's something comforting about turning to a time when America was nearly as screwed up as Europe is today. Not only did the U.S. emerge from the tangle, but it emerged considerably stronger: the states' and nation's financial and logistical problems and mismanagement wound up pushing them to develop "a more perfect Union." So let's take a look for a moment at our own country's poor showing under the Articles of Confederation.
The Articles of Confederation, our pre-Constitution national legal framework, were drafted during the American Revolution and ratified largely in the late 1770s, Maryland finally signing on in 1781. By the late 1780s -- though historians may disagree over the extent to which the states were in actual crisis -- the Confederation and its members were looking pretty shabby.
America was then a picture of, at least superficially, fascinating disarray. Under the Articles, the federal government had no power to tax. It procured money through a "requisition" system, with all states contributing, in theory, to the cost of providing national public goods. In practice, this was a disaster. After only a few requisitions, as Keith Dougherty and Michael J.G. Cain recall in an article on "Marginal Cost Sharing and the Articles of Confederation" in the journal Public Choice, "states learned to withhold their payments, leaving Congress without the resources to carry out its constitutional responsibilities. Lack of revenue prevented Congress from forcefully responding to British non-compliance with the 1783 Anglo-American peace treaty, reacting to the Spanish blockade of the Mississippi River between 1784 and 1787, enforcing treaties with the Indians by limiting western movement of settlers, and averting the piracy of the Barbary states."
A pretty little arrangement, no? Much has been written about the Articles' failings, but what Dougherty and Cain point out is that they "failed to organize a union where state and national interests coincided." In practice, "states fully complying with [...] requisitions, when others did not, incurred a greater portion of the national costs than originally intended." Game theory 101.
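To make that incentive problem concrete, here is a minimal sketch in Python of the requisition game the Articles set up. The payoff numbers (CONTRIBUTION, BENEFIT_PER_PAYER) are invented for illustration, not drawn from Dougherty and Cain or the historical record; what matters is the structure: the cost of paying falls on one state while the benefit is shared by all thirteen, so withholding dominates even though universal payment leaves every state better off.

```python
# A toy model of the requisition free-rider problem. All numbers are
# invented for illustration; only the incentive structure matters.

CONTRIBUTION = 10       # private cost to a state that pays its requisition
BENEFIT_PER_PAYER = 8   # shared value each paying state adds to the national
                        # public good (defense, treaty enforcement), enjoyed
                        # by every state whether or not it paid

def payoff(i_pay: bool, others_paying: int) -> int:
    """One state's payoff, given its own choice and how many of the
    other twelve states pay their requisitions."""
    total_payers = others_paying + (1 if i_pay else 0)
    public_good = BENEFIT_PER_PAYER * total_payers
    return public_good - (CONTRIBUTION if i_pay else 0)

# Compare paying vs. withholding for every possible number of other payers.
for others in range(13):
    print(f"{others:2d} others pay | pay: {payoff(True, others):4d} "
          f"| withhold: {payoff(False, others):4d}")

# Because BENEFIT_PER_PAYER < CONTRIBUTION, withholding beats paying by 2
# no matter what the others do -- yet all thirteen paying (94 each) beats
# all thirteen withholding (0 each). Defection is the dominant strategy.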
Another set of problems, of course, came from the fact that the states were proving truly terrible at handling taxation on their own. In fact, though the particulars differ, the strong financial and debt-driven component of the Confederation's problems really does call to mind Athens over the past year.
The American Revolution, after all, was a war, and wars are so famously costly that historians like Charles Tilly have argued that warmaking and its associated taxation were the main drivers of state formation in Western Europe. The Revolution had been financed through loans, bonds, and poorly conceived paper currency, and the states then took on this debt. Historians Oscar and Mary Flug Handlin estimated that Massachusetts in the 1780s owed over $5 million to the Confederation, out of a total debt of around $14 million -- enormous sums at the time. The taxes imposed to service that debt proved extremely burdensome, helping fuel the unrest in Massachusetts that culminated in the armed uprising known as Shays's Rebellion, which in turn fed the growing consensus that a new system might be a good idea.
Causation is tricky to establish: Robert Feer argued forcefully in The New England Quarterly back in the '60s that "in all likelihood, the Constitutional Convention would have met when it did, the same document would have been drawn up, and it would have been ratified even if Shays's Rebellion had not taken place." But one of the reasons Feer thought we would have gotten the Constitution anyway is that there were plenty of other examples of financial and political disorder, like the failure to pay federal requisitions and the states of Maryland, Virginia, and Rhode Island mucking around with paper money.
America under the Articles of Confederation, in short, was a mess. And though you could debate the details endlessly, there's little doubt that the extent of that mess was in political leaders' minds when they started talking about reconfiguring things and calling the Constitutional Convention. Thus far, the Constitution has had a better track record than the Articles in keeping order.
What does this mean for Europe today? There are a number of cases in both American and European history where a non-lethal screwup -- say, the Articles of Confederation -- has provided the necessary impetus to establish a more screwup-resistant setup, as the Constitution did. So while the ratings agencies seem to grow more pessimistic about Europe's prospects by the day, European leaders could yet pull out of this with something even stronger. Merkel and Sarkozy likely have a very nasty path ahead of them -- if, indeed, either of them stays around long enough to push the treaty modifications through; France, recall, has elections scheduled for next year. But there's a universe in which Europe exits this crisis in a better position than the one it entered in.