Since 1857, The Atlantic has been challenging established answers with tough questions. In this video, actor Michael K. Williams—best known as Omar Little from The Wire—wrestles with a question of his own: Is he being typecast?
What are you asking yourself about the world and its conventional wisdom? We want to hear your questions—and your thoughts on where to start finding the answers: firstname.lastname@example.org. Each week, we’ll update this thread with a new question and your responses.
The initial wave of reader response to our question “Is a long life really worth it?” was overwhelmingly “meh, not so much.” But since then, many sexagenarians, septuagenarians, octogenarians, and nonagenarians have emailed more enthusiastic outlooks on old age. Here’s Jim:
A thought-provoking discussion, but it really misses the key point. I turn 65 in a couple of months, but I don’t expect to “retire” at 65—or ever. I’m fit and healthy and having the greatest fun of my life at the head of a fast-growing business. In a quarter century, if still alive, I might have to slow down a bit, but there will still be something useful for me to do.
The founding pastor of our church has poor hearing and is almost blind, but a few weeks ago he preached a great sermon to celebrate his 100th birthday. He still contributes in other ways as well.
Not everyone can continue working, but there is a huge need for volunteers in areas that do not require physical agility. Unless totally senile—and that’s something that will never happen to most of us—we all have something to offer.
Maggie is a quarter century older than Jim but has a very similar view:
Life isn’t over because I’m no longer “useful.” I’m 90 and have spent the last decade trying to be okay with not always being the helping hand. Though my greatest joy has come from knowing I have touched another’s life by being helpful, I have to remember that I am still touching people’s lives as long as I am alive. I’m so pleasantly surprised that people want to be around me.
I was pretty grim when I had to stop driving because a slight accident damaged the car beyond repair. My health also gave way and I was briefly hospitalized. It was a big adjustment. But now I am walking, exercising at the gym once a week, taking part in demonstrations, and forgetting about how old I am. I don’t see any other options.
In three weeks I will have my 90th birthday. I am certainly glad I did not die at 75. Since then, I have seen four more grandchildren born, two grandchildren graduate from college, and two from high school. I sold my financial advisory firm to my partners and helped start a new Trust Company, now serving as Regional Director and on their Board. I have had some wonderful trips and been able to enjoy sailing, tennis, and horseback riding up until two years ago. I have recently bought a set of golf clubs and look forward to enjoying a new sport.
Carol frames aging this way:
Everyone has three ages: chronological, biological, and mental. (The most important, by far, is our mental age.) I’m chronologically 81, biologically 65 and mentally 60.
Tony adds some perspective:
Consider this: Well into his 80s, Verdi [the Italian composer] was still at it; ahead were two of his greatest operas, Otello and Falstaff. And Michelangelo was still there, chisel in hand, well into his 80s. Problem is, we think it’s all over—but life, and sometimes ourselves too, always has a surprise in store.
Maureen calls old age “my blessing”:
I will be 70 on my next birthday! I have finally begun to live my truth. I am fortunate in that I have an appreciation for life that never occurred to me in my younger years. I love every sunrise and sunset. I enjoy watching the bunnies, hummingbirds, lizards, and butterflies. My grandchildren enjoy my company. I am my husband’s best friend. I have a deep spiritual connection. I take nothing for granted.
Life for me is beautiful—not because it is perfect, but because it is lovely even in its imperfection. I have made peace with my past and have no fears for my future. I am grateful for every moment! I will stay here on this amazing planet as long as I can.
Another positive outlook comes from Charlie:
Aging is not a sickness or a disease. No one yet has died knowing all there is to know and enjoying everything there is to enjoy! So why not try to be that first? Optimism, positivism, aggressiveness, regardless of your age, is what it means to be human. Cells may die and energy may lessen. But whatever is left should be used to live and love as fully as possible. We are always and ever in the process of becoming!
Joyce has some tips for healthy living in your eighties:
I read Ezekiel Emanuel’s article [“Why I Hope to Die at 75”] and agree with much of it; I certainly don’t want to have lots of effort made to keep me alive if I should unfortunately end up in a hospital and have no intention of any surgeries.
However, I am 83 and not hoping to die any time soon. I am unusually healthy for my age and do many things to remain so: I take no prescription drugs; I exercise regularly including weight lifting, walking and Tai Chi; I eat well, including fresh vegetable juice every day or so; I have good regular connections with family and close friends; I experience good art forms, including playing the piano, singing, movies, novels (currently my husband, who is 85, and I are watching the fine BBC series Lark Rise to Candleford and reading aloud together Margaret Atwood’s novel Blind Assassin).
Karen is 80 years old and wisely keeps her smartphone at bay:
I go for long walks every day it isn’t raining or unbearably cold. It is my job to keep myself as mobile and as healthy as possible. I don’t wear headphones or keep my phone on when I walk. I want to observe what wonders nature is revealing: sights, sounds, odors. I find the sound of the ocean is restful and restorative. As I near the end of my life, birds, otters, flowers, sunrises and sunsets take on extra meaning for I know I have a limited time in which to enjoy them.
And Nancy shares a great saying:
I am 79 and still teaching college courses—for another year at least, if lucky. Then for as long as I am able, I will continue to volunteer. As a good friend said, “You ought to be all spent up before you go.”
Here’s how an Atlantic author answered that question in September 1858:
Full of anticipations, full of simple, sweet delights, are these [childhood] years, the most valuable of [a] lifetime. Then wisdom and religion are intuitive. But the child hastens to leave its beautiful time and state, and watches its own growth with impatient eye. Soon he will seek to return. The expectation of the future has been disappointed. Manhood is not that free, powerful, and commanding state the imagination had delineated. And the world, too, disappoints his hope. He finds there things which none of his teachers ever hinted to him. He beholds a universal system of compromise and conformity, and in a fatal day he learns to compromise and conform.
But it wasn’t until the 20th century that scientists began to seriously study child development. In our July 1961 issue, Peter B. Neubauer heralded “The Century of the Child”:
Gone is the sentimental view that childhood is an era of innocence and the belief that an innate process of development continuously unfolds along more or less immutable lines. Freud suggested that, from birth on, the child’s development proceeds in a succession of well-defined stages, each with its own distinctive psychic organization, and that at each stage environmental factors can foster health and achievement or bring about lasting retardation and pathology. …
Freudian psychology does not, as some people apparently imagine, provide a set of ready-made prescriptions for the rearing of children. … The complexity of the interactions between mother and child cannot be reduced to rigid formulas. Love and understanding cannot be prescribed, and if they are not genuinely manifested, the most enlightened efforts to do what is best for the child may not be effective.
According to this view, children weren’t miniature adults, but they were preparing for adulthood. Growing up was a process that had to be managed by adults, which made the boundaries of childhood both more important and more nebulous.
A few years later, in our October 1968 issue, Richard Poirier described the backlash to a wave of campus protests as “The War Against the Young.” He implored older adults to take young people’s ideas seriously:
It is perhaps already irrelevant, for example, to discuss the so-called student revolt as if it were an expression of “youth.” The revolt might more properly be taken as a repudiation by the young of what adults call “youth.” It may be an attempt to cast aside the strangely exploitative and at once cloying, the protective and impotizing concept of “youth” which society foists on people who often want to consider themselves adults.
What’s more, Poirier argued, idealism shouldn’t just be the province of the young:
If young people are freeing themselves from a repressive myth of youth only to be absorbed into a repressive myth of adulthood, then youth in its best and truest form, of rebellion and hope, will have been lost to us, and we will have exhausted the best of our natural resources.
But how much redefinition could adulthood handle? In our February 1975 issue, Midge Decter addressed an anxious letter to that generation of student revolutionaries, who—though “no longer entitled to be called children”—had not yet fulfilled the necessary rites of passage for being “fully accredited adults”:
Why have you, the children, found it so hard to take your rightful place in the world? Just that. Why have your parents’ hopes for you come to seem so impossible of attainment?
Some of their expectations were, to be sure, exalted. … But … beneath these throbbing ambitions were all the ordinary—if you will, mundane—hopes that all parents harbor for their children: that you would grow up, come into your own, and with all due happiness and high spirit, carry forward the normal human business of mating, home-building, and reproducing—replacing us, in other words, in the eternal human cycle. And it is here that we find ourselves to be most uneasy, both for you and about you.
Decter blamed this state of affairs on overindulgent parenting: Adults, she argued, had failed their children by working too hard to protect them from unhappiness and by treating their “youthful rebellion” with too much deference.
The next decades’ developments in child psychology gave parents new advice. In our March 1987 issue, Bruno Bettelheim stressed the importance of letting kids guide their own play, without parents pushing them to obey rules they aren’t yet developmentally ready for. And in our February 1990 issue, Robert Karen outlined attachment theorists’ recommendations for how to “enable children to thrive emotionally and come to feel that the world of people is a positive place”—standards measured in part by a baby’s willingness to explore apart from its mother.
Were these parenting styles encouraging kids’ independence, or failing to push them hard enough? A generation after Decter, Lori Gottlieb also worried about parental indulgence in her 2011 Atlantic piece “How to Land Your Kid in Therapy”:
The message we send kids with all the choices we give them is that they are entitled to a perfect life—that, as Dan Kindlon, the psychologist from Harvard, puts it, “if they ever feel a twinge of non-euphoria, there should be another option.” [Psychologist Wendy] Mogel puts it even more bluntly: what parents are creating with all this choice are anxious and entitled kids whom she describes as “handicapped royalty.” …
When I was my son’s age, I didn’t routinely get to choose my menu, or where to go on weekends—and the friends I asked say they didn’t, either. There was some negotiation, but not a lot, and we were content with that. We didn’t expect so much choice, so it didn’t bother us not to have it until we were older, when we were ready to handle the responsibility it requires. But today, [psychologist Jean] Twenge says, “we treat our kids like adults when they’re children, and we infantilize them when they’re 18 years old.”
In her April 2014 article “The Overprotected Kid,” Hanna Rosin lamented the loss of independence that once helped kids come of age:
One common concern of parents these days is that children grow up too fast. But sometimes it seems as if children don’t get the space to grow up at all; they just become adept at mimicking the habits of adulthood. As [geographer Roger] Hart’s research shows, children used to gradually take on responsibilities, year by year. They crossed the road, went to the store; eventually some of them got small neighborhood jobs. Their pride was wrapped up in competence and independence, which grew as they tried and mastered activities they hadn’t known how to do the previous year. But these days, middle-class children, at least, skip these milestones. They spend a lot of time in the company of adults, so they can talk and think like them, but they never build up the confidence to be truly independent and self-reliant.
Yet how exactly do you measure “true” independence and self-reliance? And what’s the final milestone that marks the transition to adulthood? Decter suggested it was settling down with a stable career and a family. But in her 2016 Atlantic piece, “When Are You Really an Adult?,” Julie Beck places that rite of passage in historical context:
The economic boom that came after World War II made Leave It to Beaver adulthood more attainable than it had ever been. Even for very young adults. There were enough jobs available for young men, [historian Steven] Mintz writes, that they sometimes didn’t need a high-school diploma to get a job that could support a family. And social mores of the time strongly favored marriage over unmarried cohabitation. Hence: job, spouse, house, kids. But this was a historical anomaly. …
Many young people, [psychologist Jeffrey] Jensen Arnett says, still want these things—to establish careers, to get married, to have kids. (Or some combination thereof.) They just don’t see them as the defining traits of adulthood. Unfortunately, not all of society has caught up, and older generations may not recognize the young as adults without these markers. A big part of being an adult is people treating you like one, and taking on these roles can help you convince others—and yourself—that you’re responsible.
So, adults: What convinced you? Many readers have discussed the topic already, and we’d like to reopen the call for your stories—this time with an eye to the gaps between what it takes to feel like an adult and what it takes to be seen as one. Did you feel you’d become an adult long before you got treated like one? Or have you passed the markers of adulthood without quite feeling you’ve fully grown up? If you’re a parent, when did you feel your kids had grown up, or what will it take to make you certain? Please send your answers—and questions—to email@example.com.
One reader writes:
When I was 11, my mother died. My father had become blind a few years before, from a rare form of glaucoma. He had no choice but to allow me to do things that are normally done by an adult, such as budgeting and paying bills, cooking and cleaning, and other various things. He had to talk to me in an honest way, and make me understand things and rely on my judgment in lots of matters. Other adults did too. I was never a child again after my mother died, and my dad knew it.
Another reader’s mother also died at a pretty young age:
I became an adult when my mother died and my dad started dating four months later. I was 20 years old. Once he had a new woman in his life (whom he is still married to now) and essentially a new family, I was out. We had really started to be at odds the year before, when I had started to do things my way instead of his way. He had pretty much taken for granted that I could make it in this world without his advice or anything.
For this next reader, it was boarding school:
I’m not sure the end of childhood is the sort of thing that one can pinpoint; seems to me there were rather a number of distinct rites of passage. The first was when I went to boarding school, around age 10. When my parents dropped me off that first day, I knew I was on my own. Calling home to say they should come get you was not an option; my parents made this pretty clear, but it was not necessary. I knew.
Another reader had to go abroad to step out of childhood:
When I was an exchange student, my father came down to visit. There I was, living independently in a foreign country at 17. I could speak the language fluently and had to navigate us for him.
I was 18 and turned down an Ivy League school and, using my own money, moved to Italy to live with my 25-year-old girlfriend. I think my parents would say I became an adult when, at 26, I handed them a copy of my will and let them know I had been chosen to deploy in a war zone as a civilian alongside a joint counter-terrorism/counter-insurgency military unit. Strangely enough, they weren’t happy with either bit of news.
Rachael was 18 and had just started college:
I received an offer for health insurance in the mail. It went to my home address, and my mom was absolutely thrilled at the notion that I would be off her employer-supplied (but expensive) health insurance. I began paying for my own health insurance at that time, but I also let her know that I wouldn’t allow her to claim me as a dependent on her income taxes anymore. I paid my own expenses (insurance, college, etc). I laugh now when I think of “kids” still being on their parents’ insurance until they’re 26.
This next reader is almost 26:
I became an adult in the past year or so, when my dad learned how much was in my savings account and I mentioned my credit score in the context of considering new car costs. I think my parents assumed up to that point that I was just scraping by and blowing my money irresponsibly, and they were impressed with the degree to which I was caring for myself.
I think calling my dad and asking for advice has helped him to see me differently as well. There’s something about discussing investments and trying to decide on insurance plans that I would assume makes it hard to keep seeing your kid as a kid.
Another reader also nods to financial independence: “I became an adult when I began to pick up the check for my parents by surreptitiously passing plastic to the maître d’ early in the meal.” This next reader entered adulthood in a brutal fashion:
I was 20 and began working at the same factory as my father did. He was in maintenance as an industrial electrician. There had been a summer program for employees’ children and I worked out well enough that I was hired at the end of the summer. He was proud that I carried my weight.
However, three years into it and just after my 23rd birthday, my hand got caught in a take-up roll for a large paper machine and I was flung around like a rag doll. With both femurs and my left ulna, left radius, and left humerus broken, I spent months recovering.
But I kept a good attitude, believing falsely that I would be back to my normal self. My dad told me that he could never have had such a good attitude having gone through what I did.
This next reader also defined his adulthood alongside his dad’s admiration:
I became an adult when I joined the ROTC program my freshman year of college to appease my dad. He got a glimmer of pride in his eyes, saw it as me taking initiative, and was proud that I seemed interested in serving my country. I wasn’t really; I only did it to get him off my back so I could do drugs and other hedonistic things in college. But it was good to have his approval for once and no stress.
The first moment of adulthood was rather mundane for this reader: “Probably the day Mom asked if she could come to my apartment 45 minutes away to use my washing machine because hers was broken.” For another reader, the moment was different for each of his parents:
For my dad, I’d say the writing was on the wall around the time I was 16 and it became apparent that I was physically stronger than he was. He maintained the power of the purse for a few years after, but, considering that it was at about the same time when he’d occasionally offer me a beer, I’m inclined to accept that I was an adult in his eyes.
Mom? Damn, who knows what she really thinks about anything, but I’m guessing she first fully acknowledged my adulthood not through any of the accomplishments or milestones, but when, in 2003, I bought her a car and she therefore had a tangible symbol that could be seen and acknowledged by others.
This next reader’s parents need grandkids before truly considering her an adult:
I’m a 31-year-old married attorney homeowner with no kids. I don’t think my parents look at me like an adult. I don’t think they will until I have kids. This is most obviously manifested in my parents’ constant amnesia of me being a lawyer. They will be having discussions about some legal consideration, I’ll weigh in, yet my opinion is given no weight whatsoever. Though maybe I’m just being whiny that they aren’t taking me seriously. Is that the same thing as not seeing me as an adult? They seem very proud of me, but that doesn’t seem akin to viewing me as an adult either.
Jason might never become an adult in his parents’ eyes:
In some ways, my sister and I will always be “kids.” My mom will randomly start doing things to my hair or bring over something like saucepans that we already have and don’t need. My dad critiques my yard and is always convinced something is wrong with my car (there isn’t, Dad; it’s fine!). I just laugh at this stuff, though; it doesn’t really bother me.
It bothers Doug, though; he finds the topic a “sore spot”:
I’m not convinced my parents believe I’m an adult. I’m 32, the youngest of three, the second most-educated (sister has multiple grad degrees), and the highest earner. I’m constantly getting asked if I’m saving enough or if I need help with anything. Meanwhile, I never ask for help, but both of my older sisters constantly need help. I’m the one who made sure my dad got power of attorney for my grandmother before she totally lost it, and who opened college accounts for my nieces and nephews.
Henry David Thoreau is something of a poster child for solitude. In his essay “Walking,” published just after his death in our June 1862 issue, Thoreau made the case “for absolute freedom and wildness … to regard man as an inhabitant, or a part and parcel of Nature, rather than a member of society”:
We should go forth on the shortest walk, perchance, in the spirit of undying adventure, never to return, prepared to send back our embalmed hearts only as relics to our desolate kingdoms. If you are ready to leave father and mother, and brother and sister, and wife and child and friends, and never see them again—if you have paid your debts, and made your will, and settled all your affairs, and are a free man—then you are ready for a walk.
Thoreau himself was “a genuine American weirdo,” as Jedediah Purdy recently put it, and solitude suited him: His relentless individualism irritated his friends, including Atlantic co-founder Ralph Waldo Emerson, who described Thoreau’s habit of contradicting every point in pursuit of his own ideals as “a little chilling to the social affections.” Emerson may have had Thoreau in mind when, in our December 1857 issue, he mused that “many fine geniuses” felt the need to separate themselves from the world, to keep it from intruding on their thoughts. Yet he questioned whether such withdrawal was good for a person, not to mention for society as a whole:
This banishment to the rocks and echoes no metaphysics can make right or tolerable. This result is so against nature, such a half-view, that it must be corrected by a common sense and experience. “A man is born by the side of his father, and there he remains.” A man must be clothed with society, or we shall feel a certain bareness and poverty, as of a displaced and unfurnished member. He is to be dressed in arts and institutions, as well as body-garments. Now and then a man exquisitely made can live alone, and must; but coop up most men, and you undo them. …
When a young barrister said to the late Mr. Mason, “I keep my chamber to read law,”—“Read law!” replied the veteran, “’tis in the courtroom you must read law.” Nor is the rule otherwise for literature. If you would learn to write, ’tis in the street you must learn it. Both for the vehicle and for the aims of fine arts, you must frequent the public square. … Society cannot do without cultivated men.
Emerson concluded that the key to effective, creative thought was to maintain a balance between solitary reflection and social interaction: “The conditions are met, if we keep our independence, yet do not lose our sympathy.”
Four decades later, in our November 1901 issue, Paul Elmer More identified a radical sympathy in the work of Nathaniel Hawthorne, which stemmed, he argued, from Hawthorne’s own “imperial loneliness of soul”:
His words have at last expressed what has long slumbered in human consciousness. … Not with impunity had the human race for ages dwelt on the eternal welfare of the soul; for from such meditation the sense of personal importance had become exacerbated to an extraordinary degree. … And when the alluring faith attendant on this form of introspection paled, as it did during the so-called transcendental movement into which Hawthorne was born, there resulted necessarily a feeling of anguish and bereavement more tragic than any previous moral stage through which the world had passed. The loneliness of the individual, which had been vaguely felt and lamented by poets and philosophers of the past, took on a poignancy altogether unexampled. It needed but an artist with the vision of Hawthorne to represent this feeling as the one tragic calamity of mortal life, as the great primeval curse of sin … the universal protest of the human heart.
Fast-forward a century, and what More described as “the solitude that invests the modern world” had only grown deeper—while “the sense of personal importance” gained new narcissistic vehicles in the form of social-media tools that let us “connect” online while keeping our real, messy selves as private as we choose. Which is not necessarily a bad thing: In some ways, the internet looks like the perfect way to achieve Emerson’s ideal balance between independent thought and social engagement. Or is it? In our May 2012 cover story, “Is Facebook Making Us Lonely?,” Stephen Marche argued that the new connectivity carries its own kind of isolation:
A considerable part of Facebook’s appeal stems from its miraculous fusion of distance with intimacy, or the illusion of distance with the illusion of intimacy. Our online communities become engines of self-image, and self-image becomes the engine of community. The real danger with Facebook is not that it allows us to isolate ourselves, but that by mixing our appetite for isolation with our vanity, it threatens to alter the very nature of solitude.
The new isolation is not of the kind that Americans once idealized, the lonesomeness of the proudly nonconformist, independent-minded, solitary stoic, or that of the astronaut who blasts into new worlds. Facebook’s isolation is a grind. What’s truly staggering about Facebook usage is not its volume—750 million photographs uploaded over a single weekend—but the constancy of the performance it demands. More than half its users—and one of every 13 people on Earth is a Facebook user—log on every day. Among 18-to-34-year-olds, nearly half check Facebook minutes after waking up, and 28 percent do so before getting out of bed. The relentlessness is what is so new, so potentially transformative. Facebook never takes a break. We never take a break.
The same year, Brian Patrick Eha also noted the changing nature of solitude—particularly the kind of solitude achieved by wearing headphones in public. “We are each of us cocooned in noise,” he wrote, “and can escape from one another’s only when immersed in our own.” For both Marche and Eha, the problem with technology is not its tendency to isolate people so much as the way it works to prevent us—through a sense of connection or simply through distraction—from fully experiencing that isolation and all it entails.
And as the author Dorthe Nors explained in 2014 for our By Heart series of writer interviews, a full experience of isolation has serious benefits:
The artistic process unfolds in the lonely hours. That’s when the work happens. You have to control the creative energy that you’ve got. You have to discipline yourself to fulfill it. And that work only happens alone.
Solitude, I think, heightens artistic receptivity in a way that can be challenging and painful. When you sit there, alone and working, you get thrown back on yourself. Your life and your emotions, what you think and what you feel, are constantly being thrown back on you. And then the “too much humanity” feeling is even stronger: you can’t run away from yourself. You can’t run away from your emotions and your memory and the material you’re working on. Artistic solitude is a decision to turn and face these feelings, to sit with them for long periods of time.
For Nors, like for Hawthorne, solitude not only enables personal reflection, but also grants access to some deeper, more universal strain of human feeling. That’s the same lesson that Nathaniel Rich, writing in our latest issue, took from the story of Christopher Knight, who spent 27 years living utterly alone in the woods of Maine:
Since his arrest in April 2013, Knight has agreed to be interviewed by a single journalist. Michael Finkel published an article about him in GQ in 2014 and has now written a book, The Stranger in the Woods, that combines an account of Knight’s story with an absorbing exploration of solitude and man’s eroding relationship with the natural world. Though the “stranger” in the title is Knight, one closes the book with the sense that Knight, like all seers, is the only sane person in a world gone insane—that modern civilization has made us strangers to ourselves.
Yet a total withdrawal from civilization can’t be the answer—nor, at a political moment when empathy and understanding seem ever-more-urgently needed, can walling yourself off from other people’s ideas be wise. In February, Emma Green offered this critique of a new book by Rod Dreher, a conservative Christian thinker who calls for like-minded members of his faith to withdraw from public life into communities of their own:
Dreher wrote The Benedict Option for people like him—those who share his faith, convictions, and feelings of cultural alienation. But even those who might wish to join Dreher’s radical critique of American culture, people who also feel pushed out and marginalized by the shallowness of modern life, may feel unable to do so. Many people, including some Christians, feel that knowing, befriending, playing with, and learning alongside people who are different from them adds to their faith, not that it threatens it. For all their power and appeal, Dreher’s monastery walls may be too high, and his mountain pass too narrow.
So, tell us about your experience: How do you incorporate solitary reflection into a 21st-century lifestyle? Can you see communitarian benefits in spending more time on your own—or, on the other hand, point to what society loses when more people spend more time alone? Please send your answers (and your questions) to firstname.lastname@example.org.
January 6 was practice. Donald Trump’s GOP is much better positioned to subvert the next election.
Technically, the next attempt to overthrow a national election may not qualify as a coup. It will rely on subversion more than violence, although each will have its place. If the plot succeeds, the ballots cast by American voters will not decide the presidency in 2024. Thousands of votes will be thrown away, or millions, to produce the required effect. The winner will be declared the loser. The loser will be certified president-elect.
The prospect of this democratic collapse is not remote. People with the motive to make it happen are manufacturing the means. Given the opportunity, they will act. They are acting already.
Who or what will safeguard our constitutional order is not apparent today. It is not even apparent who will try. Democrats, big and small D, are not behaving as if they believe the threat is real. Some of them, including President Joe Biden, have taken passing rhetorical notice, but their attention wanders. They are making a grievous mistake.
How to make everyday risk assessments when there are many shades of what it means to be vaccinated
This past spring, if someone told you that they were fully vaccinated, you knew precisely what they meant: At least two weeks before, they’d received two doses of the Moderna COVID-19 vaccine, two doses of Pfizer, or one dose of Johnson & Johnson.
Now what it means to be vaccinated encompasses much more variety. Some people who have gotten their initial doses haven’t gotten a booster dose, and some people mixed and matched the brands of their first shots and their booster. What’s more, everyone is on their own personal timeline, depending on when they got their shots. Amid this complexity, kids under 5 still aren’t eligible for any shots at all.
As the weather gets colder in much of the country and people bring more of their socializing indoors, this variety of vaccination histories introduces questions Americans didn’t previously have to deal with. Is it still safe to hang out with someone who is vaccinated but not boosted? Can unvaccinated little kids safely spend time with unboosted adults? And will the new coronavirus variant, Omicron, further complicate the risk calculus of an already complicated winter?
An episode chock-full of emotional violence may have ended with real tragedy.
This article contains spoilers through the eighth episode of Succession Season 3.
What’s clear by now is that the Roys need to stay away from water. Every late-in-the-season tragedy and act of bloodshed, whether real or intangible, has been tied to the element that classically represents femininity, emotion, and intuition. These are not, of course, qualities that you’d associate with the Roys, and yet balance will have its way in the end.
Tonight’s episode, “Chiantishire,” named for the unfortunate moniker upper-class Brits give the region of Tuscany, ended with Kendall Roy (played by Jeremy Strong) lying facedown in an infinity pool in the Italian countryside. It was a climactic and devastating end to an episode chock-full of emotional violence, but also one that seems fated to be an installment on the upcoming podcast being made about the “curse of the Roys.” The news of that show, delivered casually by Kendall’s PR flack Comfrey (Dasha Nekrasova), added yet more anguish to Kendall’s buffet of psychological slights. First, his mother, Lady Caroline (Harriet Walter), made fun of his buzz cut. Then she asked him to back out of certain events at her wedding so his more powerful father could attend. Finally, Logan (Brian Cox) taunted Kendall over an uneaten dinner, refusing to agree to buy him out of Waystar Royco, and reminding him that his addictions had led to a young man’s death at the wedding of his sister, Shiv (Sarah Snook). “Whenever you fucked up, I cleaned up your shit,” Logan said. “And I’m a bad person? Fuck off, kiddo.”
To head off the next insurrection, we’ll need to practice envisioning the worst.
A year after the insurrection, I’m trying to imagine the death of American democracy. It’s somehow easier to picture the Earth blasted and bleached by global warming, or the human brain overtaken by the tyranny of artificial intelligence, than to foresee the end of our 250-year experiment in self-government.
The usual scenarios are unconvincing. The country is not going to split into two hostile sections and fight a war of secession. No dictator will send his secret police to round up dissidents in the dead of night. Analogies like these bring the comfort of at least being familiar. Nothing has aided Donald Trump more than Americans’ failure of imagination. It’s essential to picture an unprecedented future so that what may seem impossible doesn’t become inevitable.
Countries with low vaccination rates are suffering from more than just inequity.
In the public-health world, the rise of Omicron prompted a great, big “I told you so.” Since the new variant was detected in South Africa, advocacy groups, the WHO, and global-health experts have said it was a predictable consequence of vaccine inequity. Rich countries are hoarding vaccine doses, they said, leaving much of the developing world under-vaccinated. But in reality, countries with low vaccination rates are suffering from more than just inequity.
South Africa, the country where the variant was first reported, did receive vaccines far too late, partly because wealthy countries did not donate enough doses and pharmaceutical companies refused to share their technology. At one point, South Africa had to export doses of the Johnson & Johnson vaccine that it had manufactured in-country in order to comply with a contract it had signed with the company. The COVID-19 vaccines must be kept cold, and because not everywhere in South Africa has reliable roads and refrigeration, the country has struggled to store and transport vaccine doses to far-flung areas.
As we peer around the corner of the pandemic, let’s talk about what we want to do—and not do—with the rest of our lives.
At the bleakest moment in the pandemic, when you felt your most stressed, most scared, least centered, you probably heard some variation of the phrase This is really hard. Maybe you read it; maybe your manager said it to you; maybe you said it to yourself. But that’s the truth: Our nearly two years of living through a pandemic have been hard. And like everything else in the United States, that difficulty has not been evenly distributed. It has been hardest for those on the front lines, those afraid of how customers will react to their requests to put on a mask, those out of work or in constant fear of the way COVID variants are whipping through their community. It has been hard, in different ways, for those attempting to work and supervise school from home, for those in complete isolation, for those terrified of being around other people. It is fucking hard, in so many intersecting and unfair ways.
The GOP’s leaders are attempting to destroy the foundations of American democracy.
In October of 1860, The Atlantic’s first editor, James Russell Lowell, wrote of Abraham Lincoln that he “had experience enough in public affairs to make him a statesman, and not enough to make him a politician.” Lowell, in his endorsement, was mainly concerned not with Lincoln’s personal qualities but with the redemptive possibilities of his new party. The Republicans, Lowell wrote, “know that true policy is gradual in its advances, that it is conditional and not absolute, that it must deal with facts and not with sentiments.”
There is insufficient space in any one issue of this magazine to trace the Republican Party’s decomposition from Lincoln’s day to ours. It is enough to say that its most recent, and most catastrophic, turn—toward authoritarianism, nativism, and conspiracism—threatens the republic that it was founded to save.
The film’s lead is reprehensible and self-aggrandizing––and mesmerizing to watch.
Mikey Saber, the preening, confident chump who’s the ostensible hero of Sean Baker’s new film, Red Rocket, enters on-screen to a loud and familiar tune: “Bye Bye Bye,” by *NSync. The song is a piece of mainstream pop from yesteryear (it’s a shiver-inducing 21 years old), and its usage in this arty indie film seems laced with irony. Baker knows, though, that for all its non-subtlety, “Bye Bye Bye” is still as catchy as it was the day of its release, and he uses it to suggest the same of Mikey (played by Simon Rex): He’s his own kind of relic, rolling back into his hometown after a failed career in Los Angeles, but he’s still got a glint of charm to him.
Baker has always told small-scale stories set on the margins of America—2015’s Tangerine was a bittersweet Christmas tale about trans sex workers, and 2017’s The Florida Project was about “hidden homeless” families living in a motel. Both of those films were empathetic works about people enduring incredibly challenging circumstances—Baker, who often casts first-time actors in his work, is a master of displaying unvarnished truth on-screen. Red Rocket is far more sour than sweet, but that’s part of the point; Mikey is a reprehensible fellow, but he’s clawed his way through life by sheer force of will, and as such, the camera simply can’t look away.
Too many people still have no protection against the coronavirus.
No one knows exactly what endemic COVID will look like, but whatever it looks like, this—gestures at the current situation—ain’t it. COVID is not yet endemic. There is little doubt that the coronavirus will get there eventually, when almost everyone has been vaccinated or infected or both, but right now we are still living through a messy and potentially volatile transition period. Cases are ticking up again. A new variant is afoot. The challenge ahead is figuring out how to manage the transition to endemicity, however long it takes.
COVID is not yet endemic because too many people still lack any immunity from either vaccination or infection, here in the United States and globally. Europe is a cautionary tale in this regard: Countries such as Germany and Austria that have slightly better vaccination coverage than the U.S.—68 percent and 66 percent, respectively, compared with 60 percent here—are nevertheless seeing their cases and hospitalizations soar in yet another wave. Even with most people vaccinated, there isn’t enough immunity to blunt big and fast surges of Delta. Just 15 percent of the population without immunity is still a huge absolute number in a country with millions of people, says Lloyd Chapman, an infectious-disease modeler at the London School of Hygiene and Tropical Medicine. Chapman and his colleagues have estimated the number of unvaccinated and unexposed people who could still be hospitalized for COVID in Europe based on each country’s age structure. (He is planning to do a similar analysis for the U.S.) “The main headline point would be that,” he says, “there’s still a long way to go.” And that was before Omicron. The new variant could be even better at evading previous immunity than Delta, and its spread might push endemicity further off into the future.
Why is Hollywood still hiring this raging anti-Semite?
Every day, as dawn’s rosy fingers reach through my window, I arise and check in with Twitter, to see what fresh hell awaits. Generally, by about 6:30, I’ve been made furious by the outrage du jour. But recently, I experienced more of a sense of bemusement than ire, as I took in Deadline’s headline: “Mel Gibson in Talks to Direct Lethal Weapon 5.”
Gibson is a well-known Jew-hater (anti-Semite is too mild). His prejudices are well documented. So my question is, what does a guy have to do these days to get put on Hollywood’s no-fly list? I’m a character actor. I tend to take the jobs that come my way. But—and this hurts to write—you couldn’t pay me enough to work with Mel Gibson.
Now, I love the Lethal Weapon movies (at least the first few). And Danny Glover’s a gem. But Gibson? Yes, he’s a talented man. Many horrible people produce wonderful art. Put me down as an ardent fan of Roald Dahl, Pablo Picasso, and Edith Wharton; can’t get enough of what they’re selling. But these three had the good taste to die. That makes it a lot easier to enjoy their output. Gibson lives. And Tinseltown need not employ him further.