How America Lost Its Mind

The election of Donald Trump, Kurt Andersen argued in his September cover story, revealed that a critical mass of Americans has become untethered from reality. Andersen also traced the roots of the exceptionalism that has turned the country into “Fantasyland.”


Kurt Andersen documents well how America has lost its mind, but only partly why. We live in a country whose history, economy, and culture have brought together people of many different races, ethnicities, and religions.

But diversity, one of our nation’s strengths, also creates tension. Culture is what allows us to make sense of the world. Travelers who experience culture shock know how disorienting it is when people say or do things that are not immediately intelligible, that don’t seem normal. So when Somali Muslims move in down the street, their neighbors may feel uncomfortable; the difference in their physical appearance, language, clothing, or music makes some residents feel they don’t know what’s going on, and that is disconcerting. It leads them to retreat, to insist that the world is the way they say it is no matter how many facts may say otherwise. Some point the finger at an easy target and want the source of their consternation to simply disappear. But that is the most magical of magical thinking, and this is why America has lost its mind.

I agree with Andersen that this story could have a happy ending. We need to become world citizens, and such a trend is already under way. Economics and emigration move people around the globe at a rate history has never before witnessed. Social media have their downside, but they also put the entire world in the palm of your hand. Diversity is becoming the new normal. Although adults today may look askance at those Somali neighbors, their children will not wonder about their clothing, language, or religion—only about whether they listen to Beyoncé and play soccer. The recent sad display of white supremacy in Charlottesville, Virginia, is perhaps evidence of the death throes of an old order.

Robert L. Kelly
Professor of Anthropology, University of Wyoming
Laramie, Wyo.


Kurt Andersen’s article was, for me, a hard look in the mirror. I drank from the wells of Esalen, relativism, and post-structuralism; my education was chock-full of Foucault, and any expectation that objectivity was a “thing” was driven from my mind like Saint Patrick’s snakes from Ireland. But when I left graduate school I realized that 100 percent relativism was flawed; in a democratic society, we need to take a stand for what we believe in, and we need to define right and wrong.

It’s not a coincidence that relativism gained strength in the wake of the civil-rights movement and when America became part of the global community: It allows us to embrace cultural diversity. Considering what relativism offers and that we want to live in a diverse society, our task is to identify the values that hold us together. If that were easy, we would have figured it out already.

Victoria Finn
Zurich, Switzerland


I think Kurt Andersen is correct that the current form of the disease infecting the political right metastasized via Fox News and the internet. By purposefully tearing down all the “validating institutions,” the GOP untethered its base from any set of stable facts.

But Andersen fails to explain why this worked so well: grievance. Cultivating grievance and outrage has been a core strategy of partisan media, especially on the right, and it is the reason so many people are living in the same Fantasyland. This is reinforced by people’s natural desire to fit in to their respective communities, rally around teams (or political parties), and identify enemies. Furthermore, a lot of people feel that things are deeply broken and moving in the wrong direction. Progressives like me look at climate change, economic inequality, and social-justice issues. The right wing seems to have latched on to immigration, terrorism, secularism, and political correctness. When things feel broken, it’s understandable to direct some frustration at institutions and experts who have failed to fix them. And it’s not a long leap from there to feeling like those institutions are not only inadequate but complicit.

It’s easy to feel hopeless if you’ve ever tried to argue with a Fantasylander. Reasonable people argue with facts and evidence, but Fantasylanders counter by opposing the very idea of evidence, facts, experts. One weapon they have, in the age of the internet, is the reality that most facts are ultimately a consensus based on faith. I don’t mean that there are no objective facts. I mean that most of us can’t be absolutely certain of them. I accept the fact that climate change is real and man-made, and that childhood vaccines are a good thing, not because I conducted rigorous experiments but because I trust the people who did, as well as the institutions they belong to.

Shawn Smith
San Francisco, Calif.


I once toured Dachau, the Nazi concentration camp near Munich, Germany. During my two hours there, four groups of elementary-school students came to tour the site, which included a poster of a Hitler Youth member with the caption “Verführt. Verleitet. Verheizt.” (“Tempted. Misled. Slaughtered.”) Clearly Germany, unlike America, has an investment in truth as a means to avoid deceptive beliefs about its past and a repeat of barbaric practices. Meanwhile, McGraw-Hill published a high-school textbook that described the people captured, brought to the U.S., and forced to labor on plantations as “workers” rather than slaves.

If the “facts” taught in childhood are a deliberate misrepresentation of reality, is it a surprise that the population continues to nurture views that are independent of facts? Or that we have a president whose administration espouses a policy of deliberate lies that it dares to label “alternative facts”?

Richard DeBeau
Northfield, Minn.


Everything Kurt Andersen describes in his article is a mere symptom of our dysfunction, not a cause. The real culprit is plain old-fashioned racism—pure and simple. Racism is the reason America cannot properly enter the 21st century. It pollutes and degrades every national dialogue we have, from education to health care to foreign policy.

If you want to point to a date when our politics went “haywire,” try 1964. That was the year the Civil Rights Act was passed and the GOP sold its soul to become the party of racism in order to win elections in the Dixiecrat states. It was a desperate strategy by an out-of-touch party to maintain some power, but it worked—at the expense of our nation.

One wonders how Mr. Andersen could miss this point in his analysis. Since 1964, the GOP has managed to weaponize white racial insecurity and inflame bigotry for the benefit of candidates who otherwise couldn’t get elected dogcatcher. Conservative talk radio and Fox News continue to worm their way into the brains of white Americans through endless diatribes of paranoia supported by carefully managed disinformation. Their patriotic accomplishment has been to damage or destroy the judgment (not to mention the hearts) of millions of white Americans.

The GOP has quite possibly inflicted a mortal wound upon itself, but the disease of racism won’t disappear with the party. We’re going to have to take some decisive and far-reaching actions if we’re to preserve our country.

Royal Mason
San Diego, Calif.


I read “How America Lost Its Mind” with interest, but I was frustrated by Kurt Andersen’s single concept of truth. For him, if faith is involved, it is not truth but “magical thinking.” He does not distinguish between the fundamentalist faithful and the faithful who seek insight and inspiration rather than prediction and certainty.

We have science-based structures to organize our physical lives and we have faith-based structures to organize our lives’ meaning. One cannot take the place of the other. The error of fundamentalism is in replacing science with religion. The opposite error is failing to understand that everyone lives by principles that cannot be “rationally” verified.

Patricia DeWitt
Jacksonville, Fla.


Religion has been the primary way by which human beings have tried to understand the meaning of life and make sense out of creation. I do not argue for the truth of these beliefs in terms of objective reality, but like it or not, they form the historical philosophical basis for our concepts of morality and proper human behavior. To denigrate them as simply an example of a flight into the irrational is perhaps the best example of the intellectual chasm today between those on the left and those on the right. It also is a large factor in why reasoned discussion between the two sides has been on such a dramatic downtrend: For those on the right, it is difficult to have a reasonable conversation with someone who makes no secret about the fact that he thinks you are both benighted and stupid.

Bruce Franzese
White Plains, N.Y.


Kurt Andersen has engaged in a great degree of magical thinking of his own, trying to make a connection between Christianity and his perception of unreason in American dialogue. If that were the case, it would be a very long game indeed, as it has taken 2,000 years of Christian tradition and teaching (not to mention millennia of Jewish belief preceding it) to finally result in the wackification of America.

Had Andersen taken the time to acquire more than a popular-culture understanding of orthodox Christianity, he would have found that it explicitly rejects the relativism that he points to as the main symptom of the mindlessness syndrome. All of orthodox Christian belief is laid out for anyone to see; truths are not transient and personal but rather universal and eternal. Ten commandments establish all our responsibilities toward God and our fellow creatures. While Andersen may try to set up a conflict between faith and reason, I have always been counseled by the Church to understand that the Bible is not a science book. It does not purport to explain the workings of the world, but rather our relationship with the God who made it, by whatever mechanism he did.

Donald Trump does not come from an orthodox-Christian background. His religious training, such as it is, came from the prosperity-gospel school, which is unorthodox in preaching faith (as much in self as in God) as a means to wealth. The fact that many Christians supported him may speak to a fall away from orthodox teaching and toward moral relativism and secular standards that allow a cheat, an adulterer, and a liar to be judged to be the better choice among candidates.

Jon Abel
Richardson, Texas


Regarding the origins of relativism in the 1960s counterculture, Kurt Andersen’s history is incomplete. He describes New Age irrationalism as bursting onto the scene ex nihilo—an instantaneous spasm of adolescent wooziness. But it’s important to remember which “reality” the counterculture was rejecting: that of the Robert McNamaras of the world, whose uncritical faith in capitalism cost thousands of soldiers (and countless civilians) their lives. Their influence extends through Ronald Reagan’s presidency and Dick Cheney’s shadow reign to the policies of Ayn Rand fanboys like Paul Ryan. Clinging to a belief in markets that works for them, these ideologues have managed to tip the Supreme Court—a bastion, one would hope, of reasonableness—toward the bizarre fundamentalism of the Citizens United ruling, sure to mire American politics for decades to come. Most astonishing of all is how these fanatics have persuaded working-class conservatives to support the very policies that have decimated their communities and livelihoods.

Andersen is unduly harsh toward the harebrained idealists of the ’60s, perhaps overreacting to his own youthful gullibility. Far crazier, it seems to me, is the gullibility of mature adults who persistently worship false idols that impoverish them.

David Southward
Milwaukee, Wis.


Kurt Andersen replies:

This article, which focused on the past half century, was an excerpt from my new book, Fantasyland, which is about a dozen times as long and whose subtitle is A 500-Year History—that is, all of the issues raised in these letters are addressed at length in the book. As I explain there, the “grievance and outrage” that Shawn Smith mentions has been a defining American habit of mind since the Pilgrims left England. I don’t agree with Royal Mason that “plain old-fashioned racism—pure and simple” is the single cause of the Fantasyland phenomenon, but I do devote a large part of the book to myths and fantasies of the Old South and white superiority cultivated from the antebellum period through the 1890s and 1920s and 1960s to now. And while Christianity was only passingly mentioned in the Atlantic excerpt, a plurality of my book’s arguments and history concern our very peculiar American Protestantisms; Jon Abel’s version of Christianity sounds like a sensible strain with which I have no quarrel.


Getting Smart About Smartphones

In September, Jean M. Twenge chronicled the potential ill effects of ubiquitous smartphones and social media on young adults (“Has the Smartphone Destroyed a Generation?”).

Illustration by Jasu Hu

We are often asked whether social media and smartphones are good or bad for teens. Parents, teachers, policy makers—even teens themselves—seek clear and simple answers. In this respect, Jean Twenge’s article does not disappoint.

Twenge draws a straight line tracing broad trends to a single source: networked technologies. We see three main problems with this. First, Twenge uses correlational data to make causal claims. Yet correlation neither implies nor confirms causation. Second, despite stating that “no single factor ever defines a generation,” Twenge devotes her piece to a single-factor characterization of “iGen.” Third, just as digital media are unlikely to be the sole cause of teens’ attitudes and behaviors, they’re also unlikely to have a singular, uniform impact on all teens.

Our research documents that youth can have distinctly different experiences on the same networked platforms; existing peer and family relationships and prior levels of well-being are among the many factors that converge to determine whether a teen has a positive or negative experience on Instagram, Snapchat, or iMessage. Cherry-picking studies overlooks more-nuanced accounts of teens and technology, as well as the reality that many youth have routinely positive experiences online. Giving in to the allure of simple narratives does a disservice to our young people and undercuts our ability to help them. Only through the deciphering of teens’ complex relationship with technology can we fashion effective strategies for supporting them.

Katie Davis
University of Washington
Seattle, Wash.

Emily Weinstein and Howard Gardner
Harvard University
Cambridge, Mass.


Fellow parents, it’s time for us to consider another possible explanation for why our kids are increasingly disengaged. It’s because we’ve disengaged ourselves; we’re too busy looking down at our screens to look up at our kids.

I know: It’s how I live myself. Children are super annoying—especially teen-agers, I would say, now that I’ve got one … I would much rather spend an hour perusing Wonder Woman crafts on Pinterest than listening to my 13-year-old ramble on about anime. As a friend warned me when I first got pregnant, “Children are simultaneously overwhelming and understimulating.” Why wouldn’t we want to be distracted from that? …

[There is] a competing explanation for the recent declines in adolescent independence that Twenge observes. Fostering independence takes work: Someone has to teach the kid to drive, show them how to get to the mall, maybe prod them to make some friends and get outside …

My entire experience of parenthood has been lived in the tug-of-war between child and screen; my kids can’t remember a time when they didn’t have to compete with my iPhone in order to get my attention. Like many people, I find that my constant screen interactions are a matter of professional obligation as well as personal taste, so I live life as a constant juggling act between the needs of my children and the distractions of social media …

I think we can do better than Twenge’s suggestions of instilling “the importance of moderation,” or “mild boundary-setting.” The off switch has its place, but if that’s all we have to offer our kids, we aren’t helping prepare them for what it means to live in a digital world …

It’s so important for us to both discover and model ways of being online that help our kids embrace the potential of social media, smartphones, and whatever the next thing is to come along …

My own research suggests that the best way we can do that is by embracing our role as digital mentors: actively encouraging our kids to use technology, but offering ongoing support and guidance in how to use it appropriately.

Alexandra Samuel
Excerpt from a JSTOR Daily article


Jean M. Twenge replies:

My thanks to the letter writers. I welcome the opportunity to delve into these issues more deeply. I explicitly noted in the article that the correlational analyses I did can’t rule out the possibility that unhappy teens spend more time on screens. However, two studies following people over time found that more social-media use led to unhappiness, but unhappiness did not lead to more social-media use. A third study was a true experiment (which can determine causation); it randomly assigned adults to give up Facebook for a week, or not. Those who gave up Facebook ended the week happier, less lonely, and less depressed.

If depression causes social-media use, why did depression increase so suddenly after 2011? If the increase in depression occurred first, some other, unknown factor would have had to cause depression to rise so sharply, which would then have led to more smartphone and social-media use. It seems far more likely that smartphone and social-media use went up, and the increase in depression followed.

When weighing evidence and interventions, we have to consider the risks of doing something versus doing nothing. There doesn’t seem to be much risk involved in limiting teens’ smartphone or social-media use to two hours a day or less. However, letting teens continue to spend six-plus hours a day with new media risks having these negative mental-health trends continue if screen time is even part of the cause for their rise.


Seventy Years of The Atlantic

To commemorate our 160th anniversary this month, The Atlantic sought out the subscriber who has been reading the magazine the longest. That search brought us to 90-year-old William Allan Plummer, who received his first issue as a college student on the GI Bill in 1947. “I was interested in world events and culture,” he told The Atlantic recently.


The Big Question: What crime most changed the course of history?

(On TheAtlantic.com, readers answered October’s Big Question and voted on one another’s responses. Here are the top vote-getters.)

5. The original sin of America was slavery. Whether or not it was a crime at the time, there is no denying that slavery is antithetical to all we stand for as a country.

Alayna Buckner

4. The extermination of American Indians, which ultimately led to the creation of the world’s greatest superpower, the United States of America.

Gary Kohl

3. The mass murder of Jews, Gypsies, homosexuals, and others in the Holocaust.

Louis Franzini

2. The execution of Jesus of Nazareth was a travesty of justice, but his followers’ belief that he rose from the dead contributed to the end of the pagan Roman empire.

Marie D. Hoff

1. Gavrilo Princip assassinating Archduke Franz Ferdinand. This crime triggered World War I, led to the October Revolution and the end of the Habsburg and Ottoman empires, rewrote the world map, set the stage for World War II, and affects just about every person alive today.

Louis Nagel


Correction:

“How America Lost Its Mind,” by Kurt Andersen (September), stated that among all American state legislators, there is only one avowed atheist. In fact, according to the Center for Freethought Equality, there are several.


To contribute to The Conversation, please email letters@theatlantic.com. Include your full name, city, and state.