Since 1857, The Atlantic has been challenging established answers with tough questions. In this video, actor Michael K. Williams—best known as Omar Little from The Wire—wrestles with a question of his own: Is he being typecast?
What are you asking yourself about the world and its conventional wisdom? We want to hear your questions—and your thoughts on where to start finding the answers: hello@theatlantic.com. Each week, we’ll update this thread with a new question and your responses.
Here’s how an Atlantic author answered that question in September 1858:
Full of anticipations, full of simple, sweet delights, are these [childhood] years, the most valuable of [a] lifetime. Then wisdom and religion are intuitive. But the child hastens to leave its beautiful time and state, and watches its own growth with impatient eye. Soon he will seek to return. The expectation of the future has been disappointed. Manhood is not that free, powerful, and commanding state the imagination had delineated. And the world, too, disappoints his hope. He finds there things which none of his teachers ever hinted to him. He beholds a universal system of compromise and conformity, and in a fatal day he learns to compromise and conform.
But it wasn’t until the 20th century that scientists began to seriously study child development. In our July 1961 issue, Peter B. Neubauer heralded “The Century of the Child”:
Gone is the sentimental view that childhood is an era of innocence and the belief that an innate process of development continuously unfolds along more or less immutable lines. Freud suggested that, from birth on, the child’s development proceeds in a succession of well-defined stages, each with its own distinctive psychic organization, and that at each stage environmental factors can foster health and achievement or bring about lasting retardation and pathology. …
Freudian psychology does not, as some people apparently imagine, provide a set of ready-made prescriptions for the rearing of children. … The complexity of the interactions between mother and child cannot be reduced to rigid formulas. Love and understanding cannot be prescribed, and if they are not genuinely manifested, the most enlightened efforts to do what is best for the child may not be effective.
According to this view, children weren’t miniature adults, but they were preparing for adulthood. Growing up was a process that had to be managed by adults, which made the boundaries of childhood both more important and more nebulous.
A few years later, in our October 1968 issue, Richard Poirier described the backlash to a wave of campus protests as “The War Against the Young.” He implored older adults to take young people’s ideas seriously:
It is perhaps already irrelevant, for example, to discuss the so-called student revolt as if it were an expression of “youth.” The revolt might more properly be taken as a repudiation by the young of what adults call “youth.” It may be an attempt to cast aside the strangely exploitative and at once cloying, the protective and impotizing concept of “youth” which society foists on people who often want to consider themselves adults.
What’s more, Poirier argued, idealism shouldn’t just be the province of the young:
If young people are freeing themselves from a repressive myth of youth only to be absorbed into a repressive myth of adulthood, then youth in its best and truest form, of rebellion and hope, will have been lost to us, and we will have exhausted the best of our natural resources.
But how much redefinition could adulthood handle? In our February 1975 issue, Midge Decter addressed an anxious letter to that generation of student revolutionaries, who—though “no longer entitled to be called children”—had not yet fulfilled the necessary rites of passage for being “fully accredited adults”:
Why have you, the children, found it so hard to take your rightful place in the world? Just that. Why have your parents’ hopes for you come to seem so impossible of attainment?
Some of their expectations were, to be sure, exalted. … But … beneath these throbbing ambitions were all the ordinary—if you will, mundane—hopes that all parents harbor for their children: that you would grow up, come into your own, and with all due happiness and high spirit, carry forward the normal human business of mating, home-building, and reproducing—replacing us, in other words, in the eternal human cycle. And it is here that we find ourselves to be most uneasy, both for you and about you.
Decter blamed this state of affairs on overindulgent parenting: Adults, she argued, had failed their children by working too hard to protect them from unhappiness and by treating their “youthful rebellion” with too much deference.
The next decades’ developments in child psychology gave parents new advice. In our March 1987 issue, Bruno Bettelheim stressed the importance of letting kids guide their own play, without parents pushing them to obey rules they aren’t yet developmentally ready for. And in our February 1990 issue, Robert Karen outlined attachment theorists’ recommendations for how to “enable children to thrive emotionally and come to feel that the world of people is a positive place”—standards measured in part by a baby’s willingness to explore apart from its mother.
Were these parenting styles encouraging kids’ independence, or failing to push them hard enough? A generation after Decter, Lori Gottlieb also worried about parental indulgence in her 2011 Atlantic piece “How to Land Your Kid in Therapy”:
The message we send kids with all the choices we give them is that they are entitled to a perfect life—that, as Dan Kindlon, the psychologist from Harvard, puts it, “if they ever feel a twinge of non-euphoria, there should be another option.” [Psychologist Wendy] Mogel puts it even more bluntly: what parents are creating with all this choice are anxious and entitled kids whom she describes as “handicapped royalty.” …
When I was my son’s age, I didn’t routinely get to choose my menu, or where to go on weekends—and the friends I asked say they didn’t, either. There was some negotiation, but not a lot, and we were content with that. We didn’t expect so much choice, so it didn’t bother us not to have it until we were older, when we were ready to handle the responsibility it requires. But today, [psychologist Jean] Twenge says, “we treat our kids like adults when they’re children, and we infantilize them when they’re 18 years old.”
In her April 2014 article “The Overprotected Kid,” Hanna Rosin lamented the loss of independence that once helped kids come of age:
One common concern of parents these days is that children grow up too fast. But sometimes it seems as if children don’t get the space to grow up at all; they just become adept at mimicking the habits of adulthood. As [geographer Roger] Hart’s research shows, children used to gradually take on responsibilities, year by year. They crossed the road, went to the store; eventually some of them got small neighborhood jobs. Their pride was wrapped up in competence and independence, which grew as they tried and mastered activities they hadn’t known how to do the previous year. But these days, middle-class children, at least, skip these milestones. They spend a lot of time in the company of adults, so they can talk and think like them, but they never build up the confidence to be truly independent and self-reliant.
Yet how exactly do you measure “true” independence and self-reliance? And what’s the final milestone that marks the transition to adulthood? Decter suggested it’s settling down with a stable career and a family. But in her 2016 Atlantic piece “When Are You Really an Adult?,” Julie Beck places that rite of passage in historical context:
The economic boom that came after World War II made Leave It to Beaver adulthood more attainable than it had ever been. Even for very young adults. There were enough jobs available for young men, [historian Steven] Mintz writes, that they sometimes didn’t need a high-school diploma to get a job that could support a family. And social mores of the time strongly favored marriage over unmarried cohabitation. Hence: job, spouse, house, kids. But this was a historical anomaly. …
Many young people, [psychologist Jeffrey] Jensen Arnett says, still want these things—to establish careers, to get married, to have kids. (Or some combination thereof.) They just don’t see them as the defining traits of adulthood. Unfortunately, not all of society has caught up, and older generations may not recognize the young as adults without these markers. A big part of being an adult is people treating you like one, and taking on these roles can help you convince others—and yourself—that you’re responsible.
So, adults: What convinced you? Many readers have discussed the topic already, and we’d like to reopen the call for your stories—this time with an eye to the gaps between what it takes to feel like an adult and what it takes to be seen as one. Did you feel you’d become an adult long before you got treated like one? Or have you passed the markers of adulthood without quite feeling you’ve fully grown up? If you’re a parent, when did you feel your kids had grown up, or what will it take to make you certain? Please send your answers—and questions—to hello@theatlantic.com.
The initial wave of reader response to our question “Is a long life really worth it?” was an overwhelming “meh, not so much.” But since then, many sexagenarians, septuagenarians, octogenarians, and nonagenarians have emailed more enthusiastic outlooks on old age. Here’s Jim:
A thought-provoking discussion, but it really misses the key point. I turn 65 in a couple of months, but I don’t expect to “retire” at 65—or ever. I’m fit and healthy and having the greatest fun of my life at the head of a fast-growing business. In a quarter century, if still alive, I might have to slow down a bit, but there will still be something useful for me to do.
The founding pastor of our church has poor hearing and is almost blind, but a few weeks ago he preached a great sermon to celebrate his 100th birthday. He still contributes in other ways as well.
Not everyone can continue working, but there is a huge need for volunteers in areas that do not require physical agility. Unless totally senile—and that’s something that will never happen to most of us—we all have something to offer.
Maggie is a quarter century older than Jim but has a very similar view:
Life isn’t over because I’m no longer “useful.” I’m 90 and have spent the last decade trying to be okay with not always being the helping hand. Though my greatest joy has come from knowing I have touched another’s life by being helpful, I have to remember that I am still touching people’s lives as long as I am alive. I’m so pleasantly surprised that people want to be around me.
I was pretty grim when I had to stop driving because a slight accident damaged the car beyond repair. My health also gave way and I was briefly hospitalized. It was a big adjustment. But now I am walking, exercising at the gym once a week, taking part in demonstrations, and forgetting about how old I am. I don’t see any other options.
In three weeks I will have my 90th birthday. I am certainly glad I did not die at 75. Since then, I have seen four more grandchildren born, two grandchildren graduate from college, and two from high school. I sold my financial advisory firm to my partners and helped start a new Trust Company, now serving as Regional Director and on their Board. I have had some wonderful trips and been able to enjoy sailing, tennis, and horseback riding up until two years ago. I have recently bought a set of golf clubs and look forward to enjoying a new sport.
Carol frames aging this way:
Everyone has three ages: chronological, biological, and mental. (The most important, by far, is our mental age.) I’m chronologically 81, biologically 65 and mentally 60.
Tony adds some perspective:
Consider this: Well into his 80s, Verdi [the Italian composer] was still at it; ahead were two of his greatest operas, Otello and Falstaff. And Michelangelo was still there, chisel in hand, well into his 80s. Problem is, we think it’s all over—but life, and sometimes ourselves too, always has a surprise in store.
Maureen calls old age “my blessing”:
I will be 70 on my next birthday! I have finally begun to live my truth. I am fortunate in that I have an appreciation for life that never occurred to me in my younger years. I love every sunrise and sunset. I enjoy watching the bunnies, hummingbirds, lizards, and butterflies. My grandchildren enjoy my company. I am my husband’s best friend. I have a deep spiritual connection. I take nothing for granted.
Life for me is beautiful—not because it is perfect, but because it is lovely even in its imperfection. I have made peace with my past and have no fears for my future. I am grateful for every moment! I will stay here on this amazing planet as long as I can.
Another positive outlook comes from Charlie:
Aging is not a sickness or a disease. No one yet has died knowing all there is to know and enjoying everything there is to enjoy! So why not try to be that first? Optimism, positivism, aggressiveness, regardless of your age, is what it means to be human. Cells may die and energy may lessen. But whatever is left should be used to live and love as fully as possible. We are always and ever in the process of becoming!
Joyce has some tips for healthy living in your eighties:
I read Ezekiel Emanuel’s article [“Why I Hope to Die at 75”] and agree with much of it; I certainly don’t want to have lots of effort made to keep me alive if I should unfortunately end up in a hospital and have no intention of any surgeries.
However, I am 83 and not hoping to die any time soon. I am unusually healthy for my age and do many things to remain so: I take no prescription drugs; I exercise regularly including weight lifting, walking and Tai Chi; I eat well, including fresh vegetable juice every day or so; I have good regular connections with family and close friends; I experience good art forms, including playing the piano, singing, movies, novels (currently my husband, who is 85, and I are watching the fine BBC series Lark Rise to Candleford and reading aloud together Margaret Atwood’s novel The Blind Assassin).
Karen is 80 years old and wisely keeps her smartphone at bay:
I go for long walks every day it isn’t raining or unbearably cold. It is my job to keep myself as mobile and as healthy as possible. I don’t wear headphones or keep my phone on when I walk. I want to observe what wonders nature is revealing: sights, sounds, odors. I find the sound of the ocean is restful and restorative. As I near the end of my life, birds, otters, flowers, sunrises and sunsets take on extra meaning for I know I have a limited time in which to enjoy them.
And Nancy shares a great saying:
I am 79 and still teaching college courses—for another year at least, if lucky. Then for as long as I am able, I will continue to volunteer. As a good friend said, “You ought to be all spent up before you go.”
Living a long life seems the obvious goal for most people, and many have, like Dylan Thomas, raged against the dying of the light. Others—like the transhumanists that Olga featured recently—want to transcend death entirely.
Well, like most things, the answer is not a simple yes or no; it depends—on so many factors, some of which we can control (e.g. not smoking) and some of which we can’t control (e.g. our genetic make-up). If you’re in good health physically and have all your faculties and some purposeful work or hobby, or just something you really enjoyed doing, then it might be a good idea to live a long life. But those are a lot of ifs.
Another reader, John, looks to human connections:
Health is essential to making survival good, but it also helps to have a caring partner, for companionship and support. I am biased, because at 81, I have my health and a good wife. I’d like to live past 100 if these conditions remain. But if I become disabled, chronically ill or alone, life is unlikely to be worth it.
Rita has a bleaker outlook:
Looking at my genetics, I’m starting to think I may live a long time. I’m not yet 70, but I can probably expect to go until 95 at least.
This doesn’t fill me with joy. Who’s going to look after me when my eyesight starts to crap out and I get weaker? Where’s the money going to come from to continue to pay my bills? These are not minor questions. Their answers, as far as I can see, are “nobody” and “nowhere.”
And anyway, it’s not as if I can look forward to hiking in the desert or exploring foreign cities in my extreme old age. Nor will many of us be directing films or conducting research in our nineties. What most of us can anticipate is day after day staring at a TV set, wondering if anyone is coming for a visit.
She adds, “That Atlantic excerpt you cited from 1928 nails it”—namely, “Any programme which has for its object the prolongation of life must also have, accompanying this increased span of life, the ability of the individual to engage actively and with some degree of effectiveness in the affairs of life.” Another reader, Bernyce, is also worried about infirmity:
After the age of 75, the human body declines—if not steadily, then in jerks and/or slopes. People begin to lose hearing, eyesight, and useful teeth, as well as the ability to digest food that may be ingested. A younger friend (74), living in an assisted-living facility because her son lives 200 miles away and she is no longer able to walk, says her companions say the food is delicious. She says the desserts are tasteful but everything else is flavorless and slippery. People who have loved ones to care for them may be more fortunate.
Watch the French film Amour. It is a short, beautiful, and painful glimpse of the end of life in a loving marriage. Even when we are not alone, the end of life is very difficult.
Here’s the haunting trailer for Amour:
Here’s Jim:
At the somewhat advanced age of 88 (and I’ll be 89 in a few days), I’m tired. I think I’ve accomplished all I’m capable of and am ready to rest … permanently, I guess. Curious to see what, if anything, comes next. I’ll let you know.
Jim’s “I’m tired” reminds me of a similar sigh of acceptance that came from William Buckley during one of his final interviews before his death at the age of 82:
The clip is worth watching in full, even if you’re no fan of the conservative figure, but it begins with Charlie Rose asking Buckley if he wishes he were 20 again, and he replies:
No, absolutely not. If I had a pill that would reduce my age by 25 years I wouldn’t take it. Because I’m tired of life. I really am. I am utterly prepared to stop living on. There are no enticements to me that justify the weariness, the repetition ...
Buckley goes on to quote Sherwin Nuland—a surgeon, professor of bioethics, and author of How We Die: Reflections on Life’s Final Chapter—who once said, “The greatest enemy of older people is young doctors,” because they’re determined to keep you alive at any cost. This next reader would likely fight them off:
I am ready to go at 61. We have no problem helping our sick and injured pets, farm animals, etc. find final peace, and now people are beginning to evolve on this point too. Thank god. (Yes, I think god would agree.)
Let’s face it, after 60, folks begin kicking the ol’ bucket from normal end of life reasons. Seems the body remembers “hard” living in the early years. And this is okay. I’m reading The Razor’s Edge right now and that helps me understand.
Here’s Bill:
As an 85-year-old, I recognize that my usefulness is coming to a close.
At this time, I seem to provide joy to my children and grandchildren.
When I become a liability and need the constant care of others, I am content to have my life end, even if I have to take care of that myself.
At this time I do not need nor want that kind of care. But it may come soon, and I can face that comfortably.
John quips, “At 74, I have recently said to my adult children, ‘You know, this getting old is getting old.’” Sharon is a very longtime reader:
Dear Atlantic, magazine of my youth and age;
I believe that one’s life should be as long as one can make a contribution in some way. For me, personally, I wish to live only as long as I can be useful. At 72, and a few years before, I made the decision that when I felt I could no longer contribute in a tangible way, I will end my life.
I was greatly miffed by an article by a know-all person of the psychiatric persuasion, who said that anyone who wished to end his or her life was depressed. In my opinion, that’s balderdash. My firm belief is that we should live only as long as we can help to decrease our particular footprint on the planet by benefitting others. My desire is to have 15 years of retirement, but if I can’t meet my personal hook, I’ll discard that goal.
I think it is immoral to artificially prolong the physical existence of an individual who is in no more than a vegetative state. On the other hand, I believe that no one has the right to make that choice for another person.
Kent has some advice:
I think everyone should think about a long life, and when you’re about halfway there or within 30 years of being there, set yourself a goal of how old and how alert you want to be. It’s likely to affect your health and wealth by making you focus on more important things in life and your ability to experience them. The earth doesn’t owe anyone longevity so it’s up to you to figure out what and where and when you’ll take charge of your existence and final stages of life.
Kent’s note reminds me of my stepfather, who’s approaching 70 and has a really wise approach to the remainder of his life: Instead of focusing on how long he’s going to live, he’s focused on how short he can make the window of time in which he’ll be infirm. By eating healthy, cycling dozens of miles per week, and generally keeping his stress low, he’s determined to shrink that final period as much as possible.
This next reader, Rachel, also looks to her parents:
I am compelled to write to you! That has never happened before.
In the last six years, I saw both my parents off this planet. Both were happy to go and did not overstay. My mother, always in good health, had hoped for some more years but fell ill. Once that happened, she did not want to linger. It was too physically painful.
My father simply grew lonely and disinterested, and he too welcomed the end. He actually asked me to hasten it for him, but I reminded him it was against the law (!)
Now I have my parents-in-law. He is a priest whose life revolved around being connected to others and doing pastoral work but who has recoiled into himself these last five years and today makes no contribution to anyone, anything, anywhere. This is so wrong. He could bring meaning to people but has closed those doors.
My mother-in-law, who has had to put him into a home because she cannot care for him, spends her days wracked with guilt for having done so. While he abhors the thought of death (I thought he would want to go to his maker??), she welcomes it—to be relieved of her guilt.
But neither is dying soon. What kind of life is this for them and their families, everybody’s pocketbook, and the earth’s resources?
I am soon 58 and HAVE NO DESIRE to live long. My parents checked out at 87 and 89 and I would be happy to go sooner, while I am still making some contribution to the world and to my loved ones.
Emma contributes through teaching:
Life that includes giving, sharing, and caring for others is worth it. In contrast, life as a “parasite”—endlessly entertained by television and card games—is perhaps a more arrogant use of resources. Of course I can say this now, at age 75, the day I teach a Chinese emigre English, the day after I teach three little girls piano, and the day when I will soon perform music for fellow residents in our retirement community.
What I will say ten years from now, when all I hope for is to see my grandchildren safely through adolescence, and I have no energy to spare for what I do now, I do not know.
“If we are being quite frank, there are a few exceptional people who may have something special to give to humanity, but the vast majority of people are simply useless.”
Are the majority of people useless? If we only consider people who have made contributions to the world through their inventions, philosophies, scientific or medical research, political leadership, military or business achievements, etc., then I would agree that the vast majority of people would seem to be useless.
However, every person who has ever lived on the face of the earth has influenced or impacted the lives of those around them in ways we know nothing about, unless their life touched us personally. And then only I can know how they impacted my own life experience.
Some of these people’s influences were/are positive and constructive; some negative and destructive. But they all contribute to the evolutionary process of the human consciousness and therefore each person’s experience, which in turn influences the lives of people of succeeding cultures and generations.
The greater question to me is why we are here at all. What is the reason or need for our actual existence? But this gets into a philosophical discussion that could go on and on.
That’s the question that reader John Harris has been asking himself lately. He’s not alone: In 1862, one of The Atlantic’s founders, Ralph Waldo Emerson, wondered the same thing about aging. Acknowledging that “the creed of the street is, Old Age is not disgraceful, but immensely disadvantageous,” Emerson set out to explain the upsides of senescence. A common theme is the sense of serenity that comes with age and experience:
Youth suffers not only from ungratified desires, but from powers untried, and from a picture in his mind of a career which has, as yet, no outward reality. He is tormented with the want of correspondence between things and thoughts. … Every faculty new to each man thus goads him and drives him out into doleful deserts, until it finds proper vent. … One by one, day after day, he learns to coin his wishes into facts. He has his calling, homestead, social connection, and personal power, and thus, at the end of fifty years, his soul is appeased by seeing some sort of correspondence between his wish and his possession. This makes the value of age, the satisfaction it slowly offers to every craving. He is serene who does not feel himself pinched and wronged, but whose condition, in particular and in general, allows the utterance of his mind.
By 1928, advances in medicine had made it more possible to take a long lifespan for granted. In an Atlantic article titled “The Secret of Longevity” (unavailable online), Cary T. Grayson noted that “probably at no other time in the history of the human race has so much attention been paid to the problem of prolonging the span of life.” He offered a word of warning:
Any programme which has for its object the prolongation of life must also have, accompanying this increased span of life, the ability of the individual to engage actively and with some degree of effectiveness in the affairs of life. Merely to live offers little to the individual if he has lost the ability to think, to grieve, or to hope. There is perhaps no more depressing picture than that of the person who remains on the stage after his act is over.
On the other hand, as Cullen Murphy contended in our January 1993 issue, an eternity spent with no decrease in faculties wouldn’t necessarily be desirable either:
There are a lot of characters in literature who have been endowed with immortality and who do manage to keep their youth. Unfortunately, in many cases nobody else does. Spouses and friends grow old and die. Societies change utterly. The immortals, their only constant companion a pervading loneliness, go on and on. This is the pathetic core of legends like those of the Flying Dutchman and the Wandering Jew. In Natalie Babbitt’s Tuck Everlasting, a fine and haunting novel for children, the Tuck family has inadvertently achieved immortality by drinking the waters of a magic spring. As the years pass, they are burdened emotionally by an unbridgeable remoteness from a world they are in but not of.
Since antiquity, Murphy wrote, literature has had a fairly united stance on immortality: “Tamper with the rhythms of nature and something inevitably goes wrong.” After all, people die to make room for more people, and pushing lifespans beyond their ordinary limits risks straining resources as well as reshaping families.
Charles C. Mann examined some of those potential consequences in his May 2005 Atlantic piece “The Coming Death Shortage,” predicting a social order increasingly stratified between “the very old and very rich on top … a mass of the ordinary old … and the diminishingly influential young.” Presciently, a few years before the collapse of the real-estate bubble that wiped out millions of Americans’ retirement savings, Mann outlined the effects of an increased proportion of older people in the workforce:
When lifespans extend indefinitely, the effects are felt throughout the life cycle, but the biggest social impact may be on the young. According to Joshua Goldstein, a demographer at Princeton, adolescence will in the future evolve into a period of experimentation and education that will last from the teenage years into the mid-thirties. … In the past the transition from youth to adulthood usually followed an orderly sequence: education, entry into the labor force, marriage, and parenthood. For tomorrow’s thirtysomethings, suspended in what Goldstein calls “quasi-adulthood,” these steps may occur in any order.
In other words, Emerson’s period of “ungratified desires and powers untried” would be extended indefinitely. Talk about doleful deserts! On top of such Millennial malaise, Mann also predicted increased marital stress, declining birth rates, a depleted labor force, and a widespread economic slowdown as the world’s most powerful nations entered a “longevity crisis.”
But that’s just one vision. Another came from Gregg Easterbrook, who anticipated “a grayer, quieter, better future” in his October 2014 Atlantic article “What Happens When We All Live to 100?” His argument has some echoes of Emerson’s, but with modern science to back it up:
Neurological studies of healthy aging people show that the parts of the brain associated with reward-seeking light up less as time goes on. Whether it’s hot new fashions or hot-fudge sundaes, older people on the whole don’t desire acquisitions as much as the young and middle-aged do. Denounced for generations by writers and clergy, wretched excess has repelled all assaults. Longer life spans may at last be the counterweight to materialism.
Deeper changes may be in store as well. People in their late teens to late 20s are far more likely to commit crimes than people of other ages; as society grays, the decline of crime should continue. Violence in all guises should continue downward, too. … Research by John Mueller, a political scientist at Ohio State University, suggests that as people age, they become less enthusiastic about war. Perhaps this is because older people tend to be wiser than the young—and couldn’t the world use more wisdom?
It’s a good point. Couldn’t we all use more wisdom, more experience, more opportunities to learn? Wouldn’t we make better use of our lives if our lives went on forever? Not so fast, Olga Khazan wrote last month:
A common fear about life in our brave, new undying world is that it will just be really boring, says S. Matthew Liao, director of the Center for Bioethics at New York University. Life, Liao explained, is like a party—it has a start and end time. … “But imagine there’s a party that doesn’t end,” he continued. “It would be bad, because you’d think, ‘I could go there tomorrow, or a month from now.’ There’s no urgency to go to the party anymore.”
The Epicureans of ancient Greece thought about it similarly, [psychologist Sheldon] Solomon said. They saw life as a feast: “If you were at a meal, you’d be satiated, then stuffed, then repulsed,” he said. “Part of what makes each of us uniquely valuable is the great story. We have a plot, and ultimately it concludes.”
Even so, some futurists believe immortality is within reach:
So, what do you think: Is there a limit to how long people should live? Is it selfish to want eternity for yourself, or would having even a few immortals around make the world better for everyone? Here’s one reader’s take:
This reminds me a bit of the Cylons in the “new” Battlestar Galactica.
With the ability to reincarnate infinitely, and be effectively immortal, they were callous towards humans, and killed humans with impunity. It was only when their ability to reincarnate was ended and they became effectively mortal (and thus subject to basically the same rules of death as humans) that they were driven to behave in a moral way.
But another reader argues:
I for one think the world would be a better place if we collectively took a longer view, and what better way to do that than to give everyone a stake in it?
Because of the Internet I write more and receive feedback from people I know (on Facebook) and online strangers (on TAD and other platforms that use Disqus). I use it as a jumping-off place and resource for planning lessons for my high-school students in science.
However, I don’t practice music as often as I used to.
On a similar note, another reader confesses, “I draw less because I’m always on TAD”:
As a sketch artist, I appreciate my ability to Google things I want to draw for a reference point, but that doesn’t make me more creative. I already had the image in my head and the ability to draw. I honed my skills drawing people the old-fashioned way, looking at pictures in books or live subjects and practicing till my fingers were going to fall off.
In my opinion, the internet also encourages people to copy the work of others that goes “viral” rather than creating something truly original. The fact that you can monetize that viral quality also makes it more likely that people will try to copy rather than create.
That’s the same reason a third reader worries that “the internet has become stifling for creativity”:
Maybe I am not looking in the right place, but most platforms seem to be more about reblogging/retweeting/reposting other people’s creations. Then there is the issue of having work stolen and credits removed.
As another reader notes, “This is the central conflict of fan fiction”:
It’s obviously creative. On the other hand, it is all based on blatant copying of another writer’s work. How much is this a huge expansion of a creative outlet, and how much is this actually people choosing to limit their own creativity by colonizing somebody else’s world rather than creating a new one?
For my part, I tend to think the internet has encouraged and elevated some amazing new forms of creativity based on reaction and re-creation, collaboration and synthesis. Take this delightful example:
Those creative forms are a big part of my job too: When I go to work, I’m either distilling my colleagues’ articles for our Daily newsletter or piecing together reader emails for Notes, and those curatorial tasks have been exciting and challenging in ways that I never expected. But I’ve also missed writing fiction and poetry and literary criticism, and I worry sometimes that I’m letting those creative muscles atrophy. If you’re a fanfic reader or writer (or videographer, or meme-creator, or content-aggregator) and would like to share your experience, please let us know: hello@theatlantic.com.
This next reader speaks up for creativity as “the product of synthesis”:
It’s not so much a quest for pure “originality,” as it is a quest for original perspectives or original articulations. I’d say that my creativity has been fueled by letting myself fall into occasional rabbit holes. Whether that’s plodding through artists I don’t know well on Spotify or following hyperlinks in a Wiki piece until I have forgotten about what it was that I initially wondered, that access to knowledge in a semi-random form triggers the old noggin like little else.
On the other hand: So much knowledge! So many rabbit holes! Jim is paralyzed:
I find many more ideas and inspirations, but the flow of information and ideas is so vast that I never find time to develop them. I need to get off the internet.
Diane is also exasperated:
The promise of digital technology was: spinning piles of straw into useful pieces of gold.
My reality is: looking for golden needles in a giant haystack of unusable straw.
I spend so much time looking for the few things actually useful to my project, my writing, my daily info needs, and by the end of the day I feel like I’ve wasted so much time and effort sorting through useless crap. And the pile of useless keeps getting bigger and bigger, like a bad dream.
This next reader provides some tips for productive discovery:
I am old enough to vaguely recall a time before I began to use the internet on a daily basis. What I would do, back then, when I got stuck and could not find a creative angle on a problem, was to go to some arbitrary corner of the library, take down the first book that caught my interest even though it had nothing to do with the problem at hand, and read a few pages—sometimes, the whole book. More often than not, it would trigger all sorts of analogies, and at least a few of them usually turned out to be fruitful. (Even if nothing turned out to be relevant, I usually still learned something interesting, so it was a win-win strategy.) It was a great way (to borrow Horace Walpole’s definition of serendipity) to make discoveries, by accidents and sagacity, of things one was not in quest of.
I try to use the internet in a somewhat similar fashion: When I’m stuck, I often spend a morning strolling around arbitrary corners of the internet, trying to discover stuff I did not know I was in quest of. Typically, I start in some academic resource like JSTOR. (I almost always start by limiting my search to articles at least 50 years old; it ensures that one does not end up reading fashionable stuff and thus thinking the same thoughts as all the other hamsters in the academic wheel. Also, older articles are usually far more well-written than the crap that results from the publish-or-perish system.) I am not above using e.g. Wikipedia, though, at least as a point of departure.
I also like reading old stuff in online newspaper/magazine archives. Sometimes, a stray remark in one of those wonderful 19th-century magazines written by and for men of letters is all you need to get a fresh angle on a familiar problem.
Gotta love those 19th-century magazines. In some ways, their mission wasn’t so different from that of the Facebook groups and Reddit threads and Disqus forums of today: creating a space for discourse and exchange and reflection, where exciting new ideas could bump up against each other. As James Russell Lowell, The Atlantic’s founding editor, wrote to a friend in 1857, “The magazine is to be free without being fanatical, and we hope to unite in it all available talent of all modes of opinion.” And as Terri, one of the founding members of TAD, reflects today:
TAD itself has been a creative endeavor for me and the other mods. Envisioning the community we wanted. Coming up with ideas to bring it to life. We developed ideas around the mix of politics, open and fun threads that the community has taken on and grown. It really has been a creative experience in collaboration on the internet.
Check out TAD’s whole discussion on creativity here, as well as many more. As for the offline benefits of online collaboration, take it from this reader—a “furniture maker and Weimaraner enthusiast”:
I would like to share a story about a project I am working on in which the internet has certainly aided my creativity. Zeus, our 8-month-old Weimaraner, is a couch hog. When my girlfriend and I sit down on the couch to watch TV, he will sit directly in front of us and bark until we make room for him. There are three large dog beds in the house, but Zeus steadfastly refuses to lie on the dog beds.
I am a member of a Weimaraner-owner Facebook group called Weim Crime. Several people in the group have had similar problems. We came up with a solution I tested out last week: build a dog bunk bed with one bed on the bottom and one bed about the same height as our couch.
It has worked out very well. Zeus quietly relaxes on the top dog bunk while we sit on the couch. I am now collecting feedback from that same group before building the more attractive final version. I have received very useful feedback—for example, lowering the top bunk deck to 18 inches or lower to prevent joint injuries. My end goal is to design and build a simple, low-cost dog bunk bed that is more attractive than the prototype and post a YouTube video showing other owners how to build a similar one.
This is just one silly project, but the feedback and interest I have received regarding it have been really inspiring.
What questions about your day-to-day experience of the world have you been pondering? We welcome your feedback and inspirations. Check back Monday for the next discussion question in this series—and in the meantime, enjoy some Weimaraner art:
What the internet does to the mind is something of an eternal question. Here at The Atlantic, in fact, we pondered that question before the internet even existed. Back in 1945, in his prophetic essay “As We May Think,” Vannevar Bush outlined how technology that mimics human logic and memory could transform “the ways in which man produces, stores, and consults the record of the race”:
Presumably man’s spirit should be elevated if he can better review his shady past and analyze more completely and objectively his present problems. He has built a civilization so complex that he needs to mechanize his records more fully if he is to push his experiment to its logical conclusion and not merely become bogged down part way there by overtaxing his limited memory. His excursions may be more enjoyable if he can reacquire the privilege of forgetting the manifold things he does not need to have immediately at hand, with some assurance that he can find them again if they prove important.
Bush didn’t think machines could ever replace human creativity, but he did hope they could make the process of having ideas more efficient. “Whenever logical processes of thought are employed,” he wrote, “there is opportunity for the machine.”
Fast-forward six decades, and search engines had claimed that opportunity, acting as a stand-in for memory and even for association. In his October 2006 piece “Artificial Intelligentsia,” James Fallows confronted the new reality:
If omnipresent retrieval of spot data means there’s less we have to remember, and if categorization systems do some of the first-stage thinking for us, what will happen to our brains?
I’ve chosen to draw an optimistic conclusion, from the analogy of eyeglasses. Before corrective lenses were invented, some 700 years ago, bad eyesight was a profound handicap. In effect it meant being disconnected from the wider world, since it was hard to take in knowledge. With eyeglasses, this aspect of human fitness no longer mattered in most of what people did. More people could compete, contribute, and be fulfilled. …
It could be the same with these new computerized aids to cognition. … Increasingly we all will be able to look up anything, at any time—and, with categorization, get a head start in thinking about connections.
But in his July 2008 piece “Is Google Making Us Stupid?,” Nicholas Carr was troubled by search engines’ treatment of information as “a utilitarian resource to be mined and processed with industrial efficiency.” And he questioned the idea that artificial intelligence would make people’s lives better:
It suggests a belief that intelligence is the output of a mechanical process, a series of discrete steps that can be isolated, measured, and optimized. In Google’s world, the world we enter when we go online, there’s little place for the fuzziness of contemplation. Ambiguity is not an opening for insight but a bug to be fixed. The human brain is just an outdated computer that needs a faster processor and a bigger hard drive.
Even as Carr appreciated the ease of online research, he felt the web was “chipping away [his] capacity for concentration and contemplation.” It was as if the rote tasks of research and recall, far from wasting innovators’ time, were actually the building blocks of more creative, complex thought.
On the other hand, “you should be skeptical of my skepticism,” as Carr put it. And from the beginning, one great benefit of the internet was that it brought people in contact not just with information, but with other people’s ideas. In April 2016, Adrienne LaFrance reflected on “How Early Computer Games Influenced Internet Culture”:
In the late 1970s and early 1980s, game makers—like anyone who found themselves tinkering with computers at the time—were inclined to share what they learned, and to build on one another’s designs. … That same culture, and the premium it placed on openness, would eventually carry over to the early web: a platform that anyone could build on, that no one person or company could own. That idea is at the heart of what proponents for net neutrality are trying to protect—that is, the belief that openness is a central value, perhaps even the foundational value, of what is arguably the most important technology of our time.
But as tech culture evolved and pervaded life outside the web, even its problem-solving methods began to seem reductive at times. Ian Bogost outlined that paradox in November 2016 when a new product called ketchup leather was billed as the “solution” to soggy burgers:
The technology critic Evgeny Morozov calls this sort of thinking “solutionism”—the belief that all problems can be solved by a single and simple technological solution. … Morozov is concerned about solutionism because it recasts social conditions that demand deeper philosophical and political consideration as simple hurdles for technology. …
But solutionism has another, subtler downside: It trains us to see everything as a problem in the first place. Not just urban transit or productivity, but even hamburgers. Even ketchup!
So, what’s your personal experience of how the internet affects creativity? Can you point to a digital distraction—Netflix, say, or Flappy Bird—that’s enriched your thinking in other areas of your life? On the flip side of the debate, can you point to a tool like email or Slack that’s sharpened your efficiency but narrowed the scope of your ideas? We’d like to hear your stories; please send us a note: hello@theatlantic.com.