An unexpected wave of democratization reshaped the world not so long ago. Could it happen again now?
When Portugal's Estado Novo dictatorship fell in the Carnation Revolution of 1974, and through months of political turmoil afterward, it wasn't particularly obvious that Portugal would end up a democracy: It had never been one before; in fact, for most of the 20th century it had been under authoritarian rule. Next door in Spain, the also-authoritarian regime of Francisco Franco seemed plenty stable. And throughout the West, journalists, intellectuals, and academics tended to assume that the whole Iberian peninsula -- along with most of Latin America -- wasn't fit for democracy on account of its Latin-Catholic social mores. Similar ideas about Asia's and Africa's ostensible incompatibility with democracy were commonplace.
At the time of the Carnation Revolution, only 41 of the world's then-150 states were democracies, and most of these were first-world, advanced-industrial economies. But after Portugal pulled off its big democratic transition in the mid-'70s, Greece and Spain followed, leading to what Samuel Huntington called the "third wave" of democratization globally: During the '80s, civilian governments replaced military rulers across Latin America, eventually including Chile; Ferdinand Marcos's dictatorship fell in the Philippines; military rule ended in South Korea; and martial law was lifted in Taiwan, beginning a 10-year democratic transition there. By 1990, between the fall of the Berlin Wall and the collapse of the Soviet Union, most Eastern European countries were holding meaningful elections. Also in 1990 -- the year a watershed democratic transition got underway in Benin, and the same year Nelson Mandela was released from prison in apartheid South Africa -- there were just three democracies on the African continent; only seven years later, the majority of African states were holding competitive elections. (For a fuller retrospective on the third wave, check out Larry Diamond's "Universal Democracy?")
Here and in other areas of the world hit by the third wave, there's been ideological resistance, endemic corruption, and daunting regression offsetting the advance of democracy. But of the almost 200 states in existence around the world today, 123 are democratic, and no form of government has anything close to the broad global legitimacy theirs does.
Until now, North Africa and the Middle East have remained mainly unmoved by this current (Israel, Lebanon, and our own attempts to engineer democratization in Iraq over the last decade notwithstanding). That seemed, maybe, to be changing in 2009, with Iran's Green Revolution. And it appears decisively to be changing now -- in Tunisia, Egypt, Libya, and potentially across the region.
No, none of the countries affected by today's pattern of fast-replicating protest movements is obviously in the midst of a real transition to democracy, and we can't really tell yet how close to one any of them might be. In some cases, as in Yemen, it's not even clear that anti-government agitation will organize itself around democratic goals at all. And it remains entirely possible that the regional momentum building since the outset of this year's Jasmine Revolution in Tunisia will stall, or that the democratic hopes driving this momentum will end up crushed, whether abruptly by force or gradually by political failure.
But here at The Atlantic, the idea of democracy strikes us nevertheless as the right frame for looking at the broader story around these uprisings. This isn't just because the story is ultimately tough to scope in regional or cultural terms -- though it is: It's North African but also Middle Eastern; it's Arab but also Berber and Persian; it's Muslim but also secular. And it's not just because the story is ultimately impossible to imagine apart from its global history -- though it's that, too: Without the third wave having normalized democratic ideas internationally, and without the proliferation of Western social media, we wouldn't have the Arab social movements we now have. It's also because the story is already affecting global events as much as it's been affected by them: While the idea of democracy has rapidly gone from a latent aspiration among Arab peoples to a manifest threat to Arab political orders, Arab mass protest movements have almost as rapidly created powerful demonstration effects, influencing others not just throughout the region but around the world -- including scenes as geographically remote as Zimbabwe and China.
From the escalation of protests in Tunisia, through the revolution in Egypt, the current crisis in Libya, and the ongoing demonstrations in Yemen, Bahrain, and elsewhere, to the unprecedented ways in which social technology has changed the political game in country after country, The Atlantic has been on the regional story in the Middle East and North Africa with some of the sharpest and most creative reporting and analysis we have going. As of today, we've also launched a special section at TheAtlantic.com, The Democracy Report, bringing this coverage together from across our channels (International, Technology, Politics, and others). So check back, read around, and stay with the discussion -- down in the comments, on Facebook, or on Twitter.
A CFPB investigation concluded that TransUnion and Equifax deceived Americans about the reports they provided and the fees they charged.
In personal finance, practically everything can turn on one’s credit score. It’s both an indicator of one’s financial past, and the key to accessing necessities—without insane costs—in the future. But on Tuesday, the Consumer Financial Protection Bureau announced that two of the three major credit-reporting agencies responsible for doling out those scores—Equifax and TransUnion—have been deceiving and taking advantage of Americans. The Bureau ordered the agencies to pay more than $23 million in fines and restitution.
In its investigation, the Bureau found that the two agencies had been misrepresenting the scores provided to consumers, telling them that the score reports they received were the same reports that lenders and businesses received, when, in fact, they were not. The investigation also found problems with the way the agencies advertised their products, using promotions that suggested that their credit reports were either free or cost only $1. According to the CFPB, the agencies did not properly disclose that after a trial of seven to 30 days, individuals would be enrolled in a full-price subscription, which could total $16 or more per month. The Bureau also found Equifax to be in violation of the Fair Credit Reporting Act, which requires the agencies to provide one free report every 12 months through a central site. Before viewing their free report, consumers were forced to view advertisements for Equifax, which is prohibited by law.
In 1985, Neil Postman observed an America imprisoned by its own need for amusement. He was, it turns out, extremely prescient.
Earlier this month, thousands of protesters gathered at Washington’s National Mall to advocate for an assortment of causes: action against global climate change, federal funding for scientific research, a generally empirical approach to the world and its mysteries. The protesters at the March for Science, as scientists are wont to do, followed what has become one of the established formulas for such an event, holding clever signs, wearing cheeky outfits, and attempting, overall, to carnivalize their anger. “Make the Barrier Reef Great Again,” read one sign at the March. “This is my sine,” read another. “I KNEW TO WEAR THIS,” one woman had written on the poncho she wore that soggy Saturday, “BECAUSE SCIENCE PREDICTED THE RAIN.” Three protesters, sporting sensible footwear and matching Tyrannosaurus rex costumes, waved poster boards bearing messages like “Jurassick of this shit.”
The MIT economist Peter Temin argues that economic inequality results in two distinct classes. And only one of them has any power.
A lot of factors have contributed to American inequality: slavery, economic policy, technological change, the power of lobbying, globalization, and so on. In their wake, what’s left?
That’s the question at the heart of a new book, The Vanishing Middle Class: Prejudice and Power in a Dual Economy, by Peter Temin, an economist from MIT. Temin argues that, following decades of growing inequality, America is now left with what is more or less a two-class system: One small, predominantly white upper class that wields a disproportionate share of money, power, and political influence and a much larger, minority-heavy (but still mostly white) lower class that is all too frequently subject to the first group’s whims.
American society increasingly mistakes intelligence for human worth.
As recently as the 1950s, possessing only middling intelligence was not likely to severely limit your life’s trajectory. IQ wasn’t a big factor in whom you married, where you lived, or what others thought of you. The qualifications for a good job, whether on an assembly line or behind a desk, mostly revolved around integrity, work ethic, and a knack for getting along—bosses didn’t routinely expect college degrees, much less ask to see SAT scores. As one account of the era put it, hiring decisions were “based on a candidate having a critical skill or two and on soft factors such as eagerness, appearance, family background, and physical characteristics.”
The 2010s, in contrast, are a terrible time to not be brainy. Those who consider themselves bright openly mock others for being less so. Even in this age of rampant concern over microaggressions and victimization, we maintain open season on the nonsmart. People who’d swerve off a cliff rather than use a pejorative for race, religion, physical appearance, or disability are all too happy to drop the s‑bomb: Indeed, degrading others for being “stupid” has become nearly automatic in all forms of disagreement.
The party appears to be struggling to convince the public it represents a better alternative to President Trump and the GOP.
If Democrats want to regain the power they’ve lost at the state and federal level in recent years, they will have to convince more voters they can offer solutions to their problems.
That may be especially difficult, however, if voters think the party and its representatives in government don’t understand or care about them. And according to a recently released poll, many voters may, in fact, feel that way. The Washington Post-ABC News survey, released this week, found that a majority of the public thinks the Democratic Party is out of touch with the concerns of average Americans. More Americans think Democrats are out of touch than believe the same of the Republican Party or President Trump.
From joy and attachment to anxiety and protectiveness, mothering behavior begins with biochemical reactions.
The artist Sarah Walker once told me that becoming a mother is like discovering the existence of a strange new room in the house where you already live. I always liked Walker's description because it’s more precise than the shorthand most people use for life with a newborn: Everything changes.
A lot of things do change, of course. But for new mothers, some of the starkest differences are also the most intimate ones: the emotional changes. Which, it turns out, are also largely neurological.
Even before a woman gives birth, pregnancy tinkers with the very structure of her brain, several neurologists told me. After centuries of observing behavioral changes in new mothers, scientists are only recently beginning to definitively link the way a woman acts with what's happening in her prefrontal cortex, midbrain, parietal lobes, and elsewhere. Gray matter becomes more concentrated. Activity increases in regions that control empathy, anxiety, and social interaction. On the most basic level, these changes, prompted by a flood of hormones during pregnancy and in the postpartum period, help attract a new mother to her baby. In other words, those maternal feelings of overwhelming love, fierce protectiveness, and constant worry begin with reactions in the brain.
“Somewhere at Google there is a database containing 25 million books and nobody is allowed to read them.”
You were going to get one-click access to the full text of nearly every book that’s ever been published. Books still in print you’d have to pay for, but everything else—a collection slated to grow larger than the holdings at the Library of Congress, Harvard, the University of Michigan, or at any of the great national libraries of Europe—would have been available for free at terminals that were going to be placed in every local library that wanted one.
At the terminal you were going to be able to search tens of millions of books and read every page of any book you found. You’d be able to highlight passages and make annotations and share them; for the first time, you’d be able to pinpoint an idea somewhere inside the vastness of the printed record, and send somebody straight to it with a link. Books would become as instantly available, searchable, copy-pasteable—as alive in the digital world—as web pages.
There’s a common perception that women siphon off the wealth of their exes and go on to live in comfort. It’s wrong.
A 38-year-old woman living in Everett, Washington, recently told me that nine years ago, she had a well-paying job, immaculate credit, substantial savings, and a happy marriage. When her first daughter was born, she and her husband decided that she would quit her job in publishing to stay home with the baby. She loved being a mother and homemaker, and when another daughter came, she gave up the idea of going back to work.
Seven years later, her husband told her to leave their house, and filed for a divorce she couldn’t afford. “He said he was tired of my medical issues, and unwilling to work on things,” she said, citing her severe rheumatoid arthritis and OCD, both of which she manages with medication. “He kicked me out of my own house, with no job and no home, and then my only recourse was to lawyer up. I’m paying them on credit.” (Some of the men and women quoted in this article have been kept anonymous because they were discussing sensitive financial matters, some of them involving ongoing legal disputes.)
A new documentary explores how early experiences drive development.
The idea that new babies are empty vessels waiting to be filled with knowledge of the world around them doesn’t sound unreasonable. With their unfocused eyes and wrinkly skin, tiny humans sometimes look more like amoebas than complex beings.
Yet scientists have built a body of evidence, particularly over the last three decades, that suggests this is patently untrue. “When kids are born, they’re already little scientists exploring the world,” said the filmmaker Estela Renner via a video conference from Brazil before a recent screening of her new documentary The Beginning of Life (streaming on Netflix) at the World Bank in Washington, D.C.
That’s something Renner, a Brazilian mother of three, discovered as she spoke with early-childhood experts and parents in nine countries around the world about the impact a child’s environment in the first few years of life has on not only her physical development, but her cognitive, social, and emotional development, too. “I didn’t know that kids were not blank slates,” she said. “It changed the way I look at babies.” If more people recognized that fact, the way communities and policymakers think about and invest in the early years of life might be different.
The wealthiest Americans donate 1.3 percent of their income; the poorest, 3.2 percent. What's up with that?
When Mort Zuckerman, the New York City real-estate and media mogul, lavished $200 million on Columbia University in December to endow the Mortimer B. Zuckerman Mind Brain Behavior Institute, he did so with fanfare suitable to the occasion: the press conference was attended by two Nobel laureates, the president of the university, the mayor, and journalists from some of New York’s major media outlets. Many of the 12 other individual charitable gifts that topped $100 million in the U.S. last year were showered with similar attention: $150 million from Carl Icahn to the Mount Sinai School of Medicine, $125 million from Phil Knight to the Oregon Health & Science University, and $300 million from Paul Allen to the Allen Institute for Brain Science in Seattle, among them. If you scanned the press releases, or drove past the many university buildings, symphony halls, institutes, and stadiums named for their benefactors, or for that matter read the histories of grand giving by the Rockefellers, Carnegies, Stanfords, and Dukes, you would be forgiven for thinking that the story of charity in this country is a story of epic generosity on the part of the American rich.