Since 1857, The Atlantic has been challenging established answers with tough questions. In this video, actor Michael K. Williams—best known as Omar Little from The Wire—wrestles with a question of his own: Is he being typecast?
What are you asking yourself about the world and its conventional wisdom? We want to hear your questions—and your thoughts on where to start finding the answers: firstname.lastname@example.org. Each week, we’ll update this thread with a new question and your responses.
When I was 11, my mother died. My father had become blind a few years before, from a rare form of glaucoma. He had no choice but to allow me to do things that are normally done by an adult, such as budgeting and paying bills, cooking and cleaning, and various other things. He had to talk to me in an honest way, make me understand things, and rely on my judgment in lots of matters. Other adults did too. I was never a child again after my mother died, and my dad knew it.
Another reader’s mother also died at a pretty young age:
I became an adult when my mother died and my dad started dating four months later. I was 20 years old. Once he had a new woman in his life (whom he is still married to now) and essentially a new family, I was out. We had really started to be at odds the year before, when I had started to do things my way instead of his way. He had pretty much taken for granted that I could make it in this world without his advice or anything.
For this next reader, it was boarding school:
I’m not sure the end of childhood is the sort of thing that one can pinpoint; seems to me there were rather a number of distinct rites of passage. The first was when I went to boarding school, around age 10. When my parents dropped me off that first day, I knew I was on my own. Calling home to say they should come get you was not an option; my parents made this pretty clear, but it was not necessary. I knew.
Another reader had to go abroad to step out of childhood:
When I was an exchange student, my father came down to visit. There I was, living independently in a foreign country at 17. I could speak the language fluently and had to navigate for him.
I was 18 and turned down an Ivy League school and, using my own money, moved to Italy to live with my 25-year-old girlfriend. I think my parents would say I became an adult when, at 26, I handed them a copy of my will and let them know I had been chosen to deploy in a war zone as a civilian alongside a joint counter-terrorism/counter-insurgency military unit. Strangely enough, they weren’t happy with either bit of news.
Rachael was 18 and had just started college:
I received an offer for health insurance in the mail. It went to my home address, and my mom was absolutely thrilled at the notion that I would be off her employer-supplied (but expensive) health insurance. I began paying for my own health insurance at that time, but I also let her know that I wouldn’t allow her to claim me as a dependent on her income taxes anymore. I paid my own expenses (insurance, college, etc.). I laugh now when I think of “kids” still being on their parents’ insurance until they’re 26.
This next reader is almost 26:
I became an adult in the past year or so, when my dad learned how much was in my savings account and I mentioned my credit score in the context of considering new car costs. I think my parents assumed up to that point that I was just scraping by and blowing my money irresponsibly, and they were impressed with the degree to which I was caring for myself.
I think calling my dad and asking for advice has helped him to see me differently as well. There’s something about discussing investments and trying to decide on insurance plans that I would assume makes it hard to keep seeing your kid as a kid.
Another reader also nods to financial independence: “I became an adult when I began to pick up the check for my parents by surreptitiously passing plastic to the maître d’ early in the meal.” This next reader entered adulthood in a brutal fashion:
I was 20 and began working at the same factory as my father did. He was in maintenance as an industrial electrician. There had been a summer program for employees’ children and I worked out well enough that I was hired at the end of the summer. He was proud that I carried my weight.
However, three years into it and just after my 23rd birthday, my hand got caught in a take-up roll for a large paper machine and I was flung around like a rag doll. With both femurs and my left ulna, left radius, and left humerus broken, I spent months recovering.
But I kept a good attitude, believing falsely that I would be back to my normal self. My dad told me that he could never have had such a good attitude having gone through what I did.
This next reader also defined his adulthood alongside his dad’s admiration:
I became an adult when I joined the ROTC program my freshman year of college to appease my dad. He got a glimmer of pride in his eyes, saw it as me taking initiative, and was proud that I seemed interested in serving my country. I wasn’t really; I only did it to get him off my back so I could do drugs and other hedonistic things in college. But it was good to have his approval for once and no stress.
The first moment of adulthood was rather mundane for this reader: “Probably the day Mom asked if she could come to my apartment 45 minutes away to use my washing machine because hers was broken.” For another reader, the moment was different for each of his parents:
For my dad, I’d say the writing was on the wall around the time I was 16 and it became apparent that I was physically stronger than he was. He maintained the power of the purse for a few years after, but, considering that it was at about the same time when he’d occasionally offer me a beer, I’m inclined to accept that I was an adult in his eyes.
Mom? Damn, who knows what she really thinks about anything, but I’m guessing she first fully acknowledged my adulthood not through any of the accomplishments or milestones, but when, in 2003, I bought her a car and she therefore had a tangible symbol that could be seen and acknowledged by others.
This next reader’s parents need grandkids before truly considering her an adult:
I’m a 31-year-old married attorney homeowner with no kids. I don’t think my parents look at me as an adult. I don’t think they will until I have kids. This is most obviously manifested in my parents’ constant amnesia about my being a lawyer. They will be having discussions about some legal consideration, I’ll weigh in, yet my opinion is given no weight whatsoever. Though maybe I’m just being whiny that they aren’t taking me seriously. Is that the same thing as not seeing me as an adult? They seem very proud of me, but that doesn’t seem akin to viewing me as an adult either.
Jason might never become an adult in his parents’ eyes:
In some ways, my sister and I will always be “kids.” My mom will randomly start doing things to my hair or bring over something like saucepans that we already have and don’t need. My dad critiques my yard and is always convinced something is wrong with my car (there isn’t, Dad; it’s fine!). I just laugh at this stuff, though; it doesn’t really bother me.
It bothers Doug, though; he finds the topic a “sore spot”:
I’m not convinced my parents believe I’m an adult. I’m 32, the youngest of three, the second most-educated (sister has multiple grad degrees), and the highest earner. I’m constantly getting asked if I’m saving enough or if I need help with anything. Meanwhile, I never ask for help, but both of my older sisters constantly need help. I’m the one who made sure my dad got power of attorney for my grandmother before she totally lost it, and who opened college accounts for my nieces and nephews.
Henry David Thoreau is something of a poster child for solitude. In his essay “Walking,” published just after his death in our June 1862 issue, Thoreau made the case “for absolute freedom and wildness … to regard man as an inhabitant, or a part and parcel of Nature, rather than a member of society”:
We should go forth on the shortest walk, perchance, in the spirit of undying adventure, never to return, prepared to send back our embalmed hearts only as relics to our desolate kingdoms. If you are ready to leave father and mother, and brother and sister, and wife and child and friends, and never see them again—if you have paid your debts, and made your will, and settled all your affairs, and are a free man—then you are ready for a walk.
Thoreau himself was “a genuine American weirdo,” as Jedediah Purdy recently put it, and solitude suited him: His relentless individualism irritated his friends, including Atlantic co-founder Ralph Waldo Emerson, who described Thoreau’s habit of contradicting every point in pursuit of his own ideals as “a little chilling to the social affections.” Emerson may have had Thoreau in mind when, in our December 1857 issue, he mused that “many fine geniuses” felt the need to separate themselves from the world, to keep it from intruding on their thoughts. Yet he questioned whether such withdrawal was good for a person, not to mention for society as a whole:
This banishment to the rocks and echoes no metaphysics can make right or tolerable. This result is so against nature, such a half-view, that it must be corrected by a common sense and experience. “A man is born by the side of his father, and there he remains.” A man must be clothed with society, or we shall feel a certain bareness and poverty, as of a displaced and unfurnished member. He is to be dressed in arts and institutions, as well as body-garments. Now and then a man exquisitely made can live alone, and must; but coop up most men, and you undo them. …
When a young barrister said to the late Mr. Mason, “I keep my chamber to read law,”—“Read law!” replied the veteran, “’tis in the courtroom you must read law.” Nor is the rule otherwise for literature. If you would learn to write, ’tis in the street you must learn it. Both for the vehicle and for the aims of fine arts, you must frequent the public square. … Society cannot do without cultivated men.
Emerson concluded that the key to effective, creative thought was to maintain a balance between solitary reflection and social interaction: “The conditions are met, if we keep our independence, yet do not lose our sympathy.”
Four decades later, in our November 1901 issue, Paul Elmer More identified a radical sympathy in the work of Nathaniel Hawthorne, which stemmed, he argued, from Hawthorne’s own “imperial loneliness of soul”:
His words have at last expressed what has long slumbered in human consciousness. … Not with impunity had the human race for ages dwelt on the eternal welfare of the soul; for from such meditation the sense of personal importance had become exacerbated to an extraordinary degree. … And when the alluring faith attendant on this form of introspection paled, as it did during the so-called transcendental movement into which Hawthorne was born, there resulted necessarily a feeling of anguish and bereavement more tragic than any previous moral stage through which the world had passed. The loneliness of the individual, which had been vaguely felt and lamented by poets and philosophers of the past, took on a poignancy altogether unexampled. It needed but an artist with the vision of Hawthorne to represent this feeling as the one tragic calamity of mortal life, as the great primeval curse of sin … the universal protest of the human heart.
Fast-forward a century, and what More described as “the solitude that invests the modern world” had only deepened—while “the sense of personal importance” gained new narcissistic vehicles in the form of social-media tools that let us “connect” online while keeping our real, messy selves as private as we choose. Which is not a bad thing: In some ways, the internet looks like the perfect way to achieve Emerson’s ideal balance between independent thought and social engagement. Or is it? In 2012, Stephen Marche examined that promise in an essay on Facebook and loneliness:
A considerable part of Facebook’s appeal stems from its miraculous fusion of distance with intimacy, or the illusion of distance with the illusion of intimacy. Our online communities become engines of self-image, and self-image becomes the engine of community. The real danger with Facebook is not that it allows us to isolate ourselves, but that by mixing our appetite for isolation with our vanity, it threatens to alter the very nature of solitude.
The new isolation is not of the kind that Americans once idealized, the lonesomeness of the proudly nonconformist, independent-minded, solitary stoic, or that of the astronaut who blasts into new worlds. Facebook’s isolation is a grind. What’s truly staggering about Facebook usage is not its volume—750 million photographs uploaded over a single weekend—but the constancy of the performance it demands. More than half its users—and one of every 13 people on Earth is a Facebook user—log on every day. Among 18-to-34-year-olds, nearly half check Facebook minutes after waking up, and 28 percent do so before getting out of bed. The relentlessness is what is so new, so potentially transformative. Facebook never takes a break. We never take a break.
The same year, Brian Patrick Eha also noted the changing nature of solitude—particularly the kind of solitude achieved by wearing headphones in public. “We are each of us cocooned in noise,” he wrote, “and can escape from one another’s only when immersed in our own.” For both Marche and Eha, the problem with technology is not its tendency to isolate people so much as the way it works to prevent us—through a sense of connection or simply through distraction—from fully experiencing that isolation and all it entails.
And as the author Dorthe Nors explained in 2014 for our By Heart series of writer interviews, a full experience of isolation has serious benefits:
The artistic process unfolds in the lonely hours. That’s when the work happens. You have to control the creative energy that you’ve got. You have to discipline yourself to fulfill it. And that work only happens alone.
Solitude, I think, heightens artistic receptivity in a way that can be challenging and painful. When you sit there, alone and working, you get thrown back on yourself. Your life and your emotions, what you think and what you feel, are constantly being thrown back on you. And then the “too much humanity” feeling is even stronger: you can’t run away from yourself. You can’t run away from your emotions and your memory and the material you’re working on. Artistic solitude is a decision to turn and face these feelings, to sit with them for long periods of time.
For Nors, as for Hawthorne, solitude not only enables personal reflection, but also grants access to some deeper, more universal strain of human feeling. That’s the same lesson that Nathaniel Rich, writing in our latest issue, took from the story of Christopher Knight, who spent 27 years living utterly alone in the woods of Maine:
Since his arrest in April 2013, Knight has agreed to be interviewed by a single journalist. Michael Finkel published an article about him in GQ in 2014 and has now written a book, The Stranger in the Woods, that combines an account of Knight’s story with an absorbing exploration of solitude and man’s eroding relationship with the natural world. Though the “stranger” in the title is Knight, one closes the book with the sense that Knight, like all seers, is the only sane person in a world gone insane—that modern civilization has made us strangers to ourselves.
Yet a total withdrawal from civilization can’t be the answer—nor, at a political moment when empathy and understanding seem ever-more-urgently needed, can walling yourself off from other people’s ideas be wise. In February, Emma Green offered this critique of a new book by Rod Dreher, a conservative Christian thinker who calls for like-minded members of his faith to withdraw from public life into communities of their own:
Dreher wrote The Benedict Option for people like him—those who share his faith, convictions, and feelings of cultural alienation. But even those who might wish to join Dreher’s radical critique of American culture, people who also feel pushed out and marginalized by the shallowness of modern life, may feel unable to do so. Many people, including some Christians, feel that knowing, befriending, playing with, and learning alongside people who are different from them adds to their faith, rather than threatening it. For all their power and appeal, Dreher’s monastery walls may be too high, and his mountain pass too narrow.
So, tell us about your experience: How do you incorporate solitary reflection into a 21st-century lifestyle? Can you see communitarian benefits in spending more time on your own—or, on the other hand, point to what society loses when more people spend more time alone? Please send your answers (and your questions) to email@example.com.
The election of the elders of an evangelical church is usually an uncontroversial, even unifying event. But this summer, at an influential megachurch in Northern Virginia, something went badly wrong. A trio of elders didn’t receive 75 percent of the vote, the threshold necessary to be installed.
“A small group of people, inside and outside this church, coordinated a divisive effort to use disinformation in order to persuade others to vote these men down as part of a broader effort to take control of this church,” David Platt, a 43-year-old minister at McLean Bible Church and a best-selling author, charged in a July 4 sermon.
Platt said church members had been misled, having been told, among other things, that the three individuals nominated to be elders would advocate selling the church building to Muslims, who would convert it into a mosque. In a second vote on July 18, all three nominees cleared the threshold. But that hardly resolved the conflict. Members of the church filed a lawsuit, claiming that the conduct of the election violated the church’s constitution.
Being inclusive is important. But it’s not everything.
Who can get pregnant? It sounds like a trick question. For centuries, English speakers have talked about “pregnant women” without a second thought, but a vocal and growing movement wants to replace that phrase with the more inclusive pregnant people. And because the United States hasn’t yet found an issue it can’t turn into a polarized debate, a partisan divide has already formed. The received wisdom is now that a good liberal should always say “pregnant people,” if only because it upsets Tucker Carlson.
I disagree. Language evolves, and inclusion for transgender people matters. But for now I will keep using pregnant women in almost all circumstances.
Pregnant people is a relatively new phrase. Google’s Ngram viewer, which trawls English-language books dating back to 1800, finds absolutely no trace of it before 1978, and a sharp spike in the past decade. It now appears in CNN headlines, Planned Parenthood advice, Washington Post columns, and CDC guidelines on COVID-19 vaccination. Its usage reflects a growing awareness that not everyone who gets pregnant defines themselves as a woman—transgender men and nonbinary people can give birth too. (Nonbinary is itself a very recent coinage; the usage examples given in Merriam-Webster’s dictionary date back only to 2015.) Using more inclusive language, the American Civil Liberties Union’s deputy legal director, Louise Melling, recently told my colleague Emma Green, “should do a fair amount of work to help address discrimination. If we talk about ‘pregnant people,’ it’s a reminder to all of us to catch ourselves when we’re sitting in the waiting room at the GYN that we’re not going to stare at the man who’s there.”
Claims about the drug are based on shoddy science—but that science is entirely unremarkable in its shoddiness.
Ivermectin is an antiparasitic drug, and a very good one. If you are infected with the roundworms that cause river blindness or the parasitic mites that cause scabies, it is wonderfully effective. It is cheap; it is accessible; and its discoverers won the Nobel Prize in 2015. It has also been widely promoted as a coronavirus prophylactic and treatment.
This promotion has been broadly criticized as a fever dream conceived in the memetic bowels of the internet and as a convenient buttress for bad arguments against vaccination. This is not entirely fair. Perhaps 70 to 100 studies have been conducted on the use of ivermectin for treating or preventing COVID-19; several dozen of them support the hypothesis that the drug is a plague mitigant. Two meta-analyses, which looked at data aggregated across subsets of these studies, concluded that the drug has value in the fight against the pandemic.
The James Webb Space Telescope, the long-awaited successor to Hubble, is mired in controversy over its namesake.
In 1999, Karen Knierman picked up a free mug at her first big astronomy conference, just before she started grad school. It bore the logo of an ambitious observatory, designed to peer at the most distant galaxies in the universe: NGST, short for Next Generation Space Telescope. The mug was on Knierman’s desk in 2002 when NASA made a surprise announcement: NGST was going to become JWST, after James Webb. Knierman sipped from her suddenly out-of-date mug and wondered, Who?
That was the prevailing reaction among scientists at the time. Webb, who died in 1992, was more of a behind-the-scenes manager than a space-science star; he had served as NASA’s second administrator, in the 1960s, during the run-up to the Apollo moon landings. But scientists went with the rebrand. Work on the telescope continued. Scientists got new merch, new mugs.
Midnight Mass is a morally urgent critique of how faith can fuel everyday cruelty and violence.
This story contains spoilers for the Netflix series Midnight Mass.
The Exorcist is a film I’ve long loved because it raised the bar not just for horror, but also for movies that explore questions of faith and doubt, good and evil, life and death. I know all of its beats by heart, but when I recently rewatched the 1973 classic, the ending hit differently. The movie concludes with an exorcism, naturally. Chris MacNeil has brought her daughter, Regan, to a host of medical professionals in a desperate attempt to save her from what turns out to be a demonic possession. But the only person who can save the girl, it seems, is a priest. The camera lingers on the mother’s exhausted face as two priests close the door to her daughter’s bedroom and go to work.
A brilliant new account upends bedrock assumptions about 30,000 years of change.
Many years ago, when I was a junior professor at Yale, I cold-called a colleague in the anthropology department for assistance with a project I was working on. I didn’t know anything about the guy; I just selected him because he was young, and therefore, I figured, more likely to agree to talk.
Five minutes into our lunch, I realized that I was in the presence of a genius. Not an extremely intelligent person—a genius. There’s a qualitative difference. The individual across the table seemed to belong to a different order of being from me, like a visitor from a higher dimension. I had never experienced anything like it before. I quickly went from trying to keep up with him, to hanging on for dear life, to simply sitting there in wonder.
Barack Obama seems almost tragically fixated on the idea that poetry, podcasting, and TV programming can heal our national wounds.
After hours of searching conversation about America and the human soul, the former president of the United States reiterated his brand identity. “Here’s what makes me optimistic ... because, you know, I’m the hope guy,” Barack Obama told Bruce Springsteen in a chat recorded last year for their podcast, Renegades: Born in the USA. Transcripts of that conversation have now been adapted into a book with the same title that also features reproductions of Obama’s speeches, snatches of Springsteen’s lyrics, and hundreds of photographs.
In 2008, Obama became the “hope guy” by promising national unity after the turbulent George W. Bush years. Marketed by street-art posters and celebrity sing-alongs, deploying a dynamic oratorical style and an inspiring personal story, the would-be first Black president pitched himself as a transformational figure—and pitched America on the story of progress it could tell itself if it elected him.
Thousands of pages of internal documents offer the clearest picture yet of how Facebook endangers American democracy—and show that the company’s own employees know it.
Before I tell you what happened at exactly 2:28 p.m. on Wednesday, January 6, 2021, at the White House—and how it elicited a very specific reaction, some 2,400 miles away, in Menlo Park, California—you need to remember the mayhem of that day, the exuberance of the mob as it gave itself over to violence, and how several things seemed to happen all at once.
At 2:10 p.m., a live microphone captured a Senate aide’s panicked warning that “protesters are in the building,” and both houses of Congress began evacuating.
At 2:13 p.m., Vice President Mike Pence was hurried off the Senate floor and out of the chamber.
At 2:15 p.m., thunderous chants were heard: “Hang Mike Pence! Hang Mike Pence!”
At the White House, President Donald Trump was watching the insurrection live on television. The spectacle excited him. Which brings us to 2:28 p.m., the moment when Trump shared a message he had just tweeted with his 35 million Facebook followers: “Mike Pence didn’t have the courage to do what should have been done to protect our Country and our Constitution … USA demands the truth!”
And why we’re failing to do the same things in America
Having grown up in Germany, I am skeptical of the popular notion that life is so much more rational and efficient in the country than it is anywhere else. Those who believe that Germans are incapable of irrationality should suggest imposing a speed limit on the country’s highways. And those who believe that Germans are incapable of inefficiency should learn how much time and money were spent to build Berlin’s new airport.
And yet I have, since returning to Germany about a month ago, been struck by how much more rational, efficient, and pragmatic the country’s handling of the late stages of the coronavirus pandemic has been. While the American response to COVID-19 has barely gone beyond the measures that were first adopted in the spring of 2020, Germany has phased in a series of additional policies over the past 18 months. None of them seriously disrupts daily life, and yet they collectively put the country in a much better position to contain the virus.
In ways both large and small, American society still assumes that the default adult has a partner and that the default household contains multiple people.
If you were to look under the roofs of American homes at random, it wouldn’t take long to find someone who lives alone. By the Census Bureau’s latest count, there are about 36 million solo dwellers, and together they make up 28 percent of U.S. households.
Even though this percentage has been climbing steadily for decades, these people are still living in a society that is tilted against them. In the domains of work, housing, shopping, and health care, much of American life is a little—and in some cases, a lot—easier if you have a partner or live with family members or housemates. The number of people who are inconvenienced by that fact grows every year.
Those who live alone, to be clear, are not lonely and miserable. Research indicates that, young or old, single people are more social than their partnered peers. Bella DePaulo, the author of How We Live Now: Redefining Home and Family in the 21st Century, reeled off to me some of the pleasures of having your own space: “the privacy, the freedom to arrange your life and your space just the way you want it—you get to decide when to sleep, when to get up, what you eat, when you eat, what you watch on Netflix, how you set the thermostat.”