My faith in adult society got a little boost this past weekend when I read that a growing number of people are becoming disillusioned with Facebook and are discontinuing their affiliation with, or at least their frequent visits to, the site.
Not that Facebook, or its conceptual offspring Twitter, is in any immediate danger of extinction. The user numbers of both networks are still climbing. But as Virginia Heffernan reported in the Sunday Times Magazine, a growing number of people are becoming disenchanted with Facebook--and in some cases with the whole idea of it--for a variety of reasons.
For some, it's concerns about privacy. Facebook isn't just a friendly neighborhood park; the company profits from the information it collects on users. There were many who objected, in early 2008, to the fact that the site was holding onto profile information even after people closed down their accounts. Not to mention the "oops" when Facebook decided to let everyone in a user's circle know about purchases the user had made elsewhere on the internet. There were also some who turned away after the kerfuffle over Facebook's assertion, last February, that it owned the copyright to all content on the site, and some who object to having their personal activity so closely monitored by some large, unseen entity.
But what intrigued me about the group Heffernan interviewed was the number who were simply tiring of checking in on other people's lives all the time, of investing in connections that felt more like stalking or distant newsletters than direct one-on-one friendship, and who felt a growing unease about how they were spending, or wasting, their time.
I find these growing sentiments reassuring because of an assessment a friend of mine made last spring about the social-network frenzy of Facebook and Twitter. A friend, it should be noted, whose entire job revolves around the development of new technology in Silicon Valley. Both of those technologies, he said, were really geared toward the needs and interests of teenagers and young people. Twitter, after all, evolved from cell phone texting, which nobody does anywhere near as impressively, or frequently, as the under-20 crowd. And Facebook was started by college students as a kind of snide "pig book" to put various students' photos together and allow people to weigh in on who was "hotter." It evolved into a college networking site, and expanded from there. But, still.
The tasks that Facebook and Twitter enhance ... staying connected with as large a group as possible, staying up-to-the-minute informed about what everyone in the social world you care about is doing, and in the process keeping track of where you fit in the social hierarchy of it all ... have been a primary focus of teenagers since time immemorial. Forty years ago, there were gossip cliques by the school lockers and fights over who got to use the family phone to keep up with the latest social status news. All Facebook and Twitter do is give teenagers additional tools to accomplish one of their prime developmental tasks: figuring out how to define themselves in relation to, and as distinct from, the rest of their peers, and exploring a wide variety of social connections within that group.
So in that context, texting, Facebook and Twitter are all terrific developments that, among other things, certainly free up the family phone. The puzzling thing is why they've been so popular among people who are supposed to be a bit beyond that stage. At some point in our development, we're supposed to let go of that obsessive focus on what everyone else is doing in order to focus on our own work and achievements. We're supposed to mature into valuing fewer but more meaningful friendships over the herd social groups we favored as teenagers. And hopefully, we're supposed to get busy enough with more significant contributions to family, community and the world that we neither care about, nor have time for, the movements and chatter of people we're not that deeply connected to. As free time becomes more limited, choices have to be made. And there's a trade-off: to go deep, you can't go as broad.
There are certainly valuable uses for Facebook, even for the 30-something-and-beyond set. Most of my friends who have teenagers have joined to gain a better awareness of the technology and world their children are experiencing ... and to keep track of what's going on in their children's lives. And for older people who can't get out as much, social networking sites offer a way to stay connected with the world, and to keep loneliness at bay. Not to mention their appeal to marketers, who see a fairly easy way to reach large groups of people (and especially the all-important young demographic) with a sales message.
So the sites have their uses. But using them to compensate for the loneliness of old age, track your kids, or sell a product is different from being giddy about them--or being addicted to them--for their own sake. And that's the part that's perplexed me about their growing use and popularity among the over-30 set. When teenagers are texting or twittering inane comments during class, they're being difficult, but age-appropriate. When senators are twittering inane comments during major policy speeches, there's something slightly askew.
But perhaps the fascination with both sites is just a product of our innately curious and exploratory natures. When my sister and I, at ages 15 and 17, bought lacrosse sticks (boys', because we couldn't locate girls'), I remember the way my dad was drawn almost irresistibly toward the back yard where we were trying them out. He watched from the back window, then the open door, then the grass at the foot of the steps. We could feel how much he was itching to have a go at it, even though he'd never held a lacrosse stick in his life. When we finally offered him a turn, he lit up like a Christmas tree and laughed out loud at the novelty of the play. He had a blast with it. But he didn't have the need to play as long as my sister and I did. He tried it, had fun, and then moved on to the other tasks and activities of his day.
The kids come up with something new, and we can't help but want to try it out. But with different life and developmental tasks demanding our focus and time, we don't, or at least we shouldn't, stay as obsessed with it as they are--whether the "it" is the hula hoop, skateboarding, hanging out at the mall ... or a passionate attachment to Facebook or Twitter.
Is that natural dissipation of interest coming to pass with the social networking sites, as well? Hard to say. But if Heffernan's subjects are any guide, it may be ... until, of course, the next exciting new fad, fashion, techno-gizmo, or toy comes to town.
The paper of record’s inaccurate reporting on a nonexistent criminal investigation was a failure that should entail more serious consequences.
I have read The New York Times since I was a teenager as the newspaper to be trusted, the paper of record, the definitive account. But the huge embarrassment over the story claiming a criminal investigation of Hillary Clinton for her emails—a story that led the website and was prominent on the front page before being corrected in the usual, cringeworthy fashion of journalists who stonewall any alleged errors and then downplay the real ones—is a direct challenge to its fundamental credibility. And the paper’s response since the initial error was uncovered has been neither adequate nor acceptable.
This is not some minor mistake. Stories, once published, take on a life of their own. If they reinforce existing views or stereotypes, they fit perfectly into Mark Twain’s observation, “A lie can travel halfway around the world while the truth is putting on its shoes.” (Or perhaps Twain never said it, in which case the ubiquity of that attribution serves to validate the point.) And a distorted and inaccurate story about a prominent political figure running for president is especially damaging and unconscionable.
A newly discovered artifact buried with one of Jamestown’s most prominent leaders suggests he could have been a crypto-Catholic.
After 400 years in the Virginia dirt, the box came out of the ground looking like it had been plucked from the ocean. A tiny silver brick, now encrusted with a green patina and rough as sandpaper. Buried beneath it was a human skeleton. The remains would later be identified as those of Captain Gabriel Archer, one of the most prominent leaders at Jamestown, the first permanent English colony in America. But it was the box, which appeared to be an ancient Catholic reliquary, that had archaeologists bewildered and astonished.
“One of the major surprises was the discovery of this mysterious small silver box,” said James Horn, the president of the Jamestown Rediscovery Foundation. “I have to say, we’re still trying to figure this out. You have the very strange situation of a Catholic reliquary being found with the leader of the first Protestant church in the country.”
The agreement doesn’t guarantee that Tehran will never produce nuclear weapons—because no agreement could do so.
A week ago I volunteered my way into an Atlantic debate on the merits of the Iran nuclear agreement. The long version of the post is here; the summary is that the administration has both specific facts and longer-term historic patterns on its side in recommending the deal.
On the factual front, I argued that opponents had not then (and have not now) met President Obama’s challenge to propose a better real-world alternative to the negotiated terms. “Better” means one that would make it less attractive for Iran to pursue a bomb, over a longer period of time. “Real world” means not the standard “Obama should have been tougher” carping but a specific demand that the other countries on “our” side, notably including Russia and China, would have joined in insisting on, and that the Iranians would have accepted.
Orr: “Sometimes a thing happens. Splits your life. There’s a before and after. I got like five of them at this point.”
This was Frank offering a pep talk to the son of his murdered former henchman Stan in tonight’s episode. (More on this in a moment.) But it’s also a line that captures this season of True Detective so perfectly that it almost seems like a form of subliminal self-critique.
Remember when Ray got shot in episode two and appeared to be dead, but came back with a renewed sense of purpose and stopped drinking? No? That’s okay. Neither does the show: It was essentially forgotten after the subsequent episode. Remember when half a dozen (or more) Vinci cops were killed in a bloody shootout along with dozen(s?) of civilians? No? Fine: True Detective’s left that behind, too. Unless I missed it, there was not a single mention of this nationally historic bloodbath tonight.
The former secretary of defense lobbied for the repeal of “Don’t ask, don’t tell,” and has now ended the Boy Scouts’ ban on gay scoutmasters.
Eagle Scout. Young Republican. CIA recruit. Air Force officer. CIA director. Secretary of defense.
It’s not the résumé of a radical civil-rights campaigner, but Robert Gates has now integrated two of the great bastions of macho American traditional morality—first the U.S. armed forces, and now the Boy Scouts of America. In both cases, Gates pursued a careful, gradual strategy, one that wasn’t fast enough for activists. In both cases, he was careful to take the temperature of constituents. And in both cases, once he was ready to act, he did so decisively. In the end, what seemed to matter most was not Gates’s personal feelings but his determination to safeguard institutions he cared about and his deft skills as a bureaucratic operator.
Has the Obama administration’s pursuit of new beginnings blinded it to enduring enmities?
“The president said many times he’s willing to step out of the rut of history.” In this way Ben Rhodes of the White House, who over the years has broken new ground in the grandiosity of presidential apologetics, described the courage of Barack Obama in concluding the Joint Comprehensive Plan of Action with the Islamic Republic of Iran, otherwise known as the Iran deal. Once again Rhodes has, perhaps inadvertently, exposed the president’s premises more clearly than the president likes to do. The rut of history: It is a phrase worth pondering. It expresses a deep scorn for the past, a zeal for newness and rupture, an arrogance about old struggles and old accomplishments, a hastiness with inherited precedents and circumstances, a superstition about the magical powers of the present. It expresses also a generational view of history, which, like the view of history in terms of decades and centuries, is one of the shallowest views of all.
This is the third in a series. Readers are invited to send their own responses to firstname.lastname@example.org, and we will post their strongest critiques of the book and the accompanying reviews. (The first batch is here.) To further encourage civil and substantive responses via email, we are closing the comments section. You can follow the whole series on Twitter at #BTWAM and read all of the responses to the book from Atlantic readers and contributors.
Several years ago, Ta-Nehisi Coates took his son, not yet 5, to see a movie on the Upper West Side of Manhattan. As his son made his way off the escalator, a white woman pushed him and said, “Come on!” Chaos ensued. There was a black parent’s rage and a white man’s threat to have the black parent arrested. Coates narrates the incident in cool, steady prose. Ultimately, he writes of the regret he carries: “In seeking to defend you I was, in fact, endangering you.”
Companies that overvalue alpha-male behavior need to change—both to retain female talent and for the bottom line.
When it comes to gender equality in the workplace, the research on its economic benefits is clear: Equality can boost profits and enhance reputation. And then there’s also the fact that it’s simply fairer. But progress for women in the workplace has so far been inadequate: Women are woefully underrepresented in executive positions, the pay gap persists, and the motherhood penalty is very real.
Barbara Annis is the founder of the Gender Intelligence Group, a consultancy that works with executives at major firms (including Deloitte, American Express, BMO Financial Group, and eBay) to create strategies to transform their work cultures into ones that are friendly to both men and women.
I recently spoke with Annis about her work and the challenges to achieving gender parity. The following transcript of our conversation has been edited for clarity.
How a radical epilepsy treatment in the early 20th century paved the way for modern-day understandings of perception, consciousness, and the self.
In 1939, a group of 10 people between the ages of 10 and 43, all with epilepsy, traveled to the University of Rochester Medical Center, where they would become the first people to undergo a radical new surgery.
The patients were there because they all struggled with violent and uncontrollable seizures. The procedure they were about to have was untested on humans, but they were desperate—none of the standard drug therapies for seizures had worked.
Between February and May of 1939, Rochester’s chief of neurosurgery, William Van Wagenen, opened up each patient’s skull and cut through the corpus callosum, the part of the brain that connects the left hemisphere to the right and is responsible for the transfer of information between them. It was a dramatic move: By slicing through the bundle of neurons connecting the two hemispheres, Van Wagenen was cutting the left half of the brain away from the right, halting all communication between the two.