As we look back on the first 10 years of the new millennium, the ubiquity of the Internet and the growth of all its social networking possibilities, from email to BlackBerrys, iPhones and Facebook, is surely one of the most significant changes to emerge from the decade. Granted, the changes began a decade before that. But the past 10 years have seen a phenomenal boom in Internet access and usage ... up 1600% in the Middle East, 1300% in Africa, and more than 300% worldwide, so that Internet users now number more than half the total population of the world, according to Internet World Stats.
We can now easily connect with friends on different continents without waiting two weeks for a letter, talk via computer without the expense of international long distance, and share new baby or other photos with hundreds of friends and relatives in a single posting, via personal Web sites and Facebook pages. My next-door neighbors, who are from Turin, Italy and just had a baby, even hooked up webcams around the house so the distant grandparents could watch live video of their new grandson playing, feeding, and sleeping from half a world away. With chat rooms, email, Skype, Facebook, and the worldwide Web community, the possibility of being isolated or without someone to "talk" to is far more remote.
But are there hidden costs to all this connectedness? Is it possible that for some, there is loneliness, not safety, in numbers? Two essays by William Deresiewicz in The Chronicle of Higher Education--one last January, and one penned only a few days ago--argue that it is. In his most recent essay, Deresiewicz cites two studies, one from 1985 and one from 2004, that show a marked decline in people who have a "close confidant." In 1985, only one out of 10 people said they lacked such a person in their life. By 2004, that number had climbed to four out of 10. And that was before so many blogs and social networking sites expanded the number of options (and distractions) for how we spend whatever social connection time we have.
So as we spend more time connecting to the world, it appears that at least some of us may be trading off depth for breadth. We are at once more connected and less connected, depending on how you look at it. But that's not the only impact that concerns Deresiewicz. In his essay from last January, entitled "The End of Solitude," he talks about the impact of constant connectivity on our comfort with being quiet and alone. Just as boredom comes from a discomfort with idle time, he argues that loneliness comes not from being alone per se, but from discomfort over being alone. Just as a small child has to learn to put themselves to sleep, we have to learn how to be comfortable with being alone. And that takes practice ... practice that is far easier to avoid with all the distractions of constant connectivity.
The essays are an interesting read on the history of friendship, social values, and how evolving technology has affected our social connections, from the evolution of the suburbs to the advent of the Internet. And whether you agree with his assessment of Facebook and its impact on social connection, he raises some interesting and valid points.
Without question, there are certain elements that exist in inverse proportion to each other. An Olympic gold-medal athlete has deep expertise in one area, but generally trades off experience and knowledge in other subjects for that one field of excellence. You can go deep, or broad, but generally not both. Quality begins to degrade if increasing quantity is demanded in the same time frame. If you have 10 priorities, you really have none. The same goes for intimacy. Just ask anyone who's tried to balance multiple intimate relationships at the same time.
Friendship is less demanding than a more intimate and vulnerable romantic connection, but the same principle applies. I've noticed, the more times I've moved, and the more people I've met, how much harder it is to keep up with all those friendships on any significant level. Acquaintances are easy to maintain with casual group emails and holiday notes. But real friends? They take time and energy--both to develop, and to nurture and maintain.
Facebook, Twitter, group emails, texting, and other mass communication and connection vehicles don't preclude anyone also taking the time and focus to develop a few deep friendships, any more than they preclude taking time to read, think, or get comfortable with yourself, alone. But they do throw more potential and tempting distractions into the mix, as well as a slightly guilty feeling that we should be keeping up with all those people. In our increasingly immediate, non-stop society, all of us struggle to find enough time for family and friends. And the more of that already-squeezed time anyone spends maintaining a broad network of Internet, text, Facebook, and Twitter friends and updates, the less time and energy they have to devote to any one friend or person. It's just simple math.
Once upon a time, books and conversations were the only distractions we had. We also tended to stay in small, local communities, so we had years to develop ties with one small group of people. Is there a link between our moving away from those communities and the development of more media to assuage the loneliness and distance that ensued? I wouldn't be surprised if someone told me there was. But in any event, the media and distractions came. First radio and movies. Then TV. Then videos. Then video games, the Internet and the cell phone. For the past 50 years, there's been some passive way to avoid facing silence, alone with yourself, if you really wanted to.
At the very least, the increase in connection and distraction possibilities increases the need to make choices among all the options. There is no technology that can speed up the time it takes to have an intimate, personal, and unique conversation with a single friend. But it can increase the number of friends, past and present, with whom I could have those conversations, either via email or just through the reconnection magic of Internet searches. So the temptation is there to become scattered--and in trying to keep up with all, to end up keeping up well with none.
Does that mean that our friendships are in danger of becoming less deep, or that the increased distractions mean we've gotten worse at learning to be alone, in silence? Maybe. But only if we've allowed it. Avoiding scatteredness--in social connections, anyway--is simply a matter of prioritizing and letting go of things that are less important. And getting immersed in distractions is a choice. For those who are afraid of being alone, there have always been distractions. For those of us who recognize the value of silence and deeper connections, I doubt the advent of new technologies will suddenly change our craving for those things.
Indeed, as I've sat in a snow-bound Connecticut house, curled up with a bad cold the past few days, I've remembered again the beauty of a slower pace of living. One that allows for a long chat with an old friend, a well-developed thought, or the joy of spending time over a piece of writing not due two hours later. But I also love being able to keep in frequent touch with lifelong friends who live in Paris, in ways we never would if it took mailing international letters, instead of email, to connect.
As always, it's a matter of balance; of being master of the sorcery at our disposal, instead of letting it master us. Of course, balance itself is a skill that, like being comfortable with solitude or a deep friendship, requires patience, dedicated effort ... and evolves, in most cases, with age, experience, and time.
The Islamic State is no mere collection of psychopaths. It is a religious group with carefully considered beliefs, among them that it is a key agent of the coming apocalypse. Here’s what that means for its strategy—and for how to stop it.
What is the Islamic State?
Where did it come from, and what are its intentions? The simplicity of these questions can be deceiving, and few Western leaders seem to know the answers. In December, The New York Times published confidential comments by Major General Michael K. Nagata, the Special Operations commander for the United States in the Middle East, admitting that he had hardly begun figuring out the Islamic State’s appeal. “We have not defeated the idea,” he said. “We do not even understand the idea.” In the past year, President Obama has referred to the Islamic State, variously, as “not Islamic” and as al-Qaeda’s “jayvee team,” statements that reflected confusion about the group, and may have contributed to significant strategic errors.
Even when a dentist kills an adored lion, and everyone is furious, there’s loftier righteousness to be had.
Now is the point in the story of Cecil the lion—amid non-stop news coverage and passionate social-media advocacy—when people get tired of hearing about Cecil the lion. Even if they hesitate to say it.
But Cecil fatigue is only going to get worse. On Friday morning, Zimbabwe’s environment minister, Oppah Muchinguri, called for the extradition of the man who killed him, the Minnesota dentist Walter Palmer. Muchinguri would like Palmer to be “held accountable for his illegal action”—paying a reported $50,000 to kill Cecil with an arrow after luring him away from protected land. And she’s far from alone in demanding accountability. This week, the Internet has served as a bastion of judgment and vigilante justice—just like usual, except that this was a perfect storm directed at a single person. It might be called an outrage singularity.
The new version of Apple’s signature media software is a mess. What are people with large MP3 libraries to do?
When the developer Erik Kemp designed the first metadata system for MP3s in 1996, he provided only three options for attaching text to the music: every audio file could be labeled with an artist, a song name, and an album title.
Kemp’s system has since been augmented and improved upon, but never replaced. Which makes sense: Like the web itself, his schema was shipped, good enough, and an improvement on the vacuum that preceded it. Those three big tags, as they’re called, work well with pop and rock written between 1960 and 1995. This didn’t prevent rampant mislabeling in the early days of the web, though, as anyone who remembers Napster can tell you. And his system stumbles even more when it needs to capture hip hop’s tradition of guest MCs or jazz’s vibrant culture of studio musicianship.
A controversial treatment shows promise, especially for victims of trauma.
It’s straight out of a cartoon about hypnosis: A black-cloaked charlatan swings a pendulum in front of a patient, who dutifully watches and ping-pongs his eyes in turn. (This might be chased with the intonation, “You are getting sleeeeeepy...”)
Unlike most stereotypical images of mind alteration—“Psychiatric help, 5 cents” anyone?—this one is real. An obscure type of therapy known as EMDR, or Eye Movement Desensitization and Reprocessing, is gaining ground as a potential treatment for people who have experienced severe forms of trauma.
Here’s the idea: The person is told to focus on the troubling image or negative thought while simultaneously moving his or her eyes back and forth. To prompt this, the therapist might move his fingers from side to side, or he might use tapping or a wand. The patient is told to let her mind go blank and notice whatever sensations come to mind. These steps are repeated throughout the session.
Forget credit hours—in a quest to cut costs, universities are simply asking students to prove their mastery of a subject.
MANCHESTER, Mich.—Had Daniella Kippnick followed in the footsteps of the hundreds of millions of students who have earned university degrees in the past millennium, she might be slumping in a lecture hall somewhere while a professor droned. But Kippnick has no course lectures. She has no courses to attend at all. No classroom, no college quad, no grades. Her university has no deadlines or tenure-track professors.
Instead, Kippnick makes her way through different subjects on the way to a bachelor’s in accounting. When she feels she’s mastered a certain subject, she takes a test at home, where a proctor watches her from afar by monitoring her computer and observing her over a video feed. If she proves she’s competent--by getting the equivalent of a B--she passes and moves on to the next subject.
A leading neuroscientist who has spent decades studying creativity shares her research on where genius comes from, whether it is dependent on high IQ—and why it is so often accompanied by mental illness.
As a psychiatrist and neuroscientist who studies creativity, I’ve had the pleasure of working with many gifted and high-profile subjects over the years, but Kurt Vonnegut—dear, funny, eccentric, lovable, tormented Kurt Vonnegut—will always be one of my favorites. Kurt was a faculty member at the Iowa Writers’ Workshop in the 1960s, and participated in the first big study I did as a member of the university’s psychiatry department. I was examining the anecdotal link between creativity and mental illness, and Kurt was an excellent case study.
He was intermittently depressed, but that was only the beginning. His mother had suffered from depression and committed suicide on Mother’s Day, when Kurt was 21 and home on military leave during World War II. His son, Mark, was originally diagnosed with schizophrenia but may actually have bipolar disorder. (Mark, who is a practicing physician, recounts his experiences in two books, The Eden Express and Just Like Someone Without Mental Illness Only More So, in which he reveals that many family members struggled with psychiatric problems. “My mother, my cousins, and my sisters weren’t doing so great,” he writes. “We had eating disorders, co-dependency, outstanding warrants, drug and alcohol problems, dating and employment problems, and other ‘issues.’ ”)
The Vermont senator’s revolutionary zeal has met its moment.
There’s no way this man could be president, right? Just look at him: rumpled and scowling, bald pate topped by an entropic nimbus of white hair. Just listen to him: ranting, in his gravelly Brooklyn accent, about socialism. Socialism!
And yet here we are: In the biggest surprise of the race for the Democratic presidential nomination, this thoroughly implausible man, Bernie Sanders, is a sensation.
He is drawing enormous crowds—11,000 in Phoenix, 8,000 in Dallas, 2,500 in Council Bluffs, Iowa—the largest turnout of any candidate from any party in the first-to-vote primary state. He has raised $15 million in mostly small donations, to Hillary Clinton’s $45 million—and unlike her, he did it without holding a single fundraiser. Shocking the political establishment, it is Sanders—not Martin O’Malley, the fresh-faced former two-term governor of Maryland; not Joe Biden, the sitting vice president—to whom discontented Democratic voters looking for an alternative to Clinton have turned.
The authors in the running for Britain's most prestigious literary award come from seven countries and include seven women writers.
The longlist for the Man Booker Prize, one of the most prestigious literary awards, was announced Wednesday. For the second year, the prize was open to writers of any nationality who publish books in English in the U.K., and this year five American writers made the list of 13 contenders, chosen by five judges from a pool of 156 total works.
The U.S. is, in fact, the best-represented country, with other entrants hailing from Great Britain, Jamaica, New Zealand, Nigeria, Ireland, and India. There are three debut novelists and one former winner on the list, and women writers outnumber men seven to six. From dystopian and political novels to a multitude of iterations on the family drama, the selections capture the ever-changing human experience in very different ways.
An alpenhorn performance in Switzerland, a portrait of Vladimir Putin made of spent ammunition from Ukraine, fireworks in North Korea, Prince Charles surprised by an eagle, wildfire in California, protests in the Philippines and Turkey, a sunset in Crimea, and much more.
Members of Colombia's younger generation say they “will not torture for tradition.”
MEDELLÍN, Colombia—On a scorching Saturday in February, hundreds of young men and women in Medellín stripped down to their swimsuit bottoms, slathered themselves in black and red paint, and sprawled out on the hot cement in Los Deseos Park in the north of the city. From my vantage point on the roof of a nearby building, the crowd of seminude protesters formed the shape of a bleeding bull—a vivid statement against the centuries-old culture of bullfighting in Colombia.
It wasn’t long ago that Colombia was among the world’s most important countries for bullfighting, due to the quality of its bulls and its large number of matadors. In his 1989 book Colombia: Tierra de Toros (“Colombia: Land of Bulls”), Alberto Lopera chronicled the maturation of the sport that Spanish conquistadors had introduced to South America in the 16th century, from its days as an unorganized brouhaha of bulls and booze in colonial plazas to a more traditional Spanish-style spectacle whose fans filled bullfighting rings across the country.