My faith in adult society got a little boost this past weekend when I read that a growing number of people are becoming disillusioned with Facebook and are discontinuing their affiliation with, or at least their frequent visits to, the site.
Not that Facebook, or its conceptual offspring Twitter, is in any immediate danger of extinction. The user numbers of both networks are still climbing. But as Virginia Heffernan reported in the Sunday Times Magazine, a growing number of people are becoming disenchanted with Facebook--and in some cases with the whole idea of Facebook--for a number of reasons.
For some, it's concerns about privacy. Facebook isn't just a friendly neighborhood park; the company profits from the information it collects on users. Many objected, in early 2008, to the fact that the site was holding on to profile information even after people closed their accounts. Not to mention the "oops" when Facebook decided to let everyone in a user's circle know about purchases that user made elsewhere on the internet. Others turned away after the kerfuffle over Facebook's assertion, last February, that it owned the copyright to all content on the site, and some simply object to having their personal activity so closely monitored by a large, unseen entity.
But what intrigued me about the group Heffernan interviewed was the number who were simply tiring of checking in on other people's lives all the time, investing in connections that felt more like stalking or distant newsletters than direct one-on-one friendship, and feeling a growing unease about how they were spending, or wasting, their time.
I find these growing sentiments reassuring because of an assessment a friend of mine made last spring about the social-network frenzy of Facebook and Twitter. A friend, it should be noted, whose entire job revolves around the development of new technology in Silicon Valley. Both of those technologies, he said, were really geared toward the needs and interests of teenagers and young people. Twitter, after all, evolved from cell phone texting, which nobody does anywhere near as impressively, or frequently, as the under-20 crowd. And Facebook was started by college students as a kind of snide "pig book" to put various students' photos together and allow people to weigh in on who was "hotter." It evolved into a college networking site, and expanded from there. But, still.
The tasks that Facebook and Twitter enhance ... staying connected with as large a group as possible, staying up-to-the-minute informed about what everyone in the social world you care about is doing, and in the process keeping track of where you fit in the social hierarchy of it all ... have been a primary focus of teenagers since time immemorial. Forty years ago, there were gossip cliques by the school lockers and fights over who got to use the family phone to keep up with the latest social status news. All Facebook and Twitter do is give teenagers additional tools to accomplish one of their prime developmental tasks: figuring out how to define themselves in relation to, and as distinct from, the rest of their peers, and exploring a wide variety of social connections within that group.
So in that context, texting, Facebook and Twitter are all terrific developments that, among other things, certainly free up the family phone. The puzzling thing is why they've been so popular among people who are supposed to be a bit beyond that stage. At some point in our development, we're supposed to let go of that obsessive focus on what everyone else is doing in order to focus on our own work and achievements. We're supposed to mature into valuing fewer but more meaningful friendships over the herd social groups we favored as teenagers. And hopefully, we get busy enough with more significant contributions to family, community and the world that we no longer care about, or have time for, the movements and chatter of people we're not that deeply connected to. As free time becomes more limited, choices have to be made. And there's a trade-off: to go deep, you can't go as broad.
There are certainly valuable uses for Facebook, even in the 30-something and beyond set. Most of my friends who have teenagers have joined so they have a better awareness of the technology and world their children are experiencing ... and to help them keep track of what's going on in their children's lives. And for older people who can't get out as much, social networking sites offer a way to stay connected with the world, and to keep loneliness at bay. Not to mention their appeal to marketers, who see a way to reach large groups of people (and especially the all-important young demographic) with a sales message in a fairly easy manner.
So the sites have their uses. But using them to compensate for the loneliness of old age, track your kids, or sell a product is different from being giddy about them--or being addicted to them--for their own sake. And that's the part that's perplexed me about their growing use and popularity among the over-30 set. When teenagers are texting or twittering inane comments during class, they're being difficult, but age-appropriate. When Senators are twittering inane comments during major policy speeches, there's something slightly askew.
But perhaps the fascination with both sites is just a product of our innately curious and exploratory natures. When my sister and I, at ages 15 and 17, bought lacrosse sticks (boys', because we couldn't locate girls'), I remember the way my dad was drawn almost irresistibly toward the back yard where we were trying them out. He watched from the back window, then the open door, then the grass at the foot of the steps. We could feel how much he was itching to have a go at it, even though he'd never held a lacrosse stick in his life. When we finally offered him a turn, he lit up like a Christmas tree and laughed out loud at the novelty of the play. He had a blast with it. But he didn't have the need to play as long as my sister and I did. He tried it, had fun, and then moved on to the other tasks and activities of his day.
The kids come up with something new, and we can't help but want to try it out. But with different life and developmental tasks demanding our focus and time, we don't, or at least we shouldn't, stay as obsessed with it as they are--whether the "it" is the hula hoop, skateboarding, hanging out at the mall ... or a passionate attachment to Facebook or Twitter.
Is that natural dissipation of interest coming to pass with the social networking sites, as well? Hard to say. But if Heffernan's subjects are any guide, it may be ... until, of course, the next exciting new fad, fashion, techno-gizmo, or toy comes to town.
On both sides of the Atlantic—in the United Kingdom and the United States—political parties are realigning and voters’ allegiances are shifting.
When United Kingdom voters last week narrowly approved a referendum to leave the European Union, they underscored again how an era of unrelenting economic and demographic change is shifting the axis of politics across much of the industrialized world from class to culture.
Contrary to much initial speculation, the victory for the U.K. leave campaign didn’t point toward victory in the U.S. presidential election for Donald Trump, who is voicing very similar arguments against globalization and immigration. The British results, in fact, underscored the obstacles facing his agenda of defensive nationalism in the vastly more diverse U.S. electorate.
But the Brexit referendum did crystallize deepening cultural fault lines in U.K. politics that are also likely to shape the contest between Trump and Hillary Clinton. In that way, the results prefigure both a continuing long-term realignment in the electoral base of each American party—and a possible near-term reshuffle of the tipping-point states in presidential politics.
How much do you really need to say to put a sentence together?
Just as fish presumably don’t know they’re wet, many English speakers don’t know that the way their language works is just one of endless ways it could have come out. It’s easy to think that what one’s native language puts words to, and how, reflects the fundamentals of reality.
But languages are strikingly different in the level of detail they require a speaker to provide in order to put a sentence together. In English, for example, here’s a simple sentence that comes to my mind for rather specific reasons related to having small children: “The father said ‘Come here!’” This statement specifies that there is a father, that he conducted the action of speaking in the past, and that he indicated the child should approach him at the location “here.” What else would a language need to do?
They say religious discrimination against Christians is as big a problem as discrimination against other groups.
Many, many Christians believe they are subject to religious discrimination in the United States. A new report from the Public Religion Research Institute and Brookings offers evidence: Almost half of Americans say discrimination against Christians is as big a problem as discrimination against other groups, including blacks and other minorities. Three-quarters of Republicans and Trump supporters said this, and so did nearly eight out of 10 white evangelical Protestants. Of the latter group, six in 10 believe that although America once was a Christian nation, it is no longer—a huge jump from 2012.
Polling data can be split up in a million different ways. It’s possible to sort by ethnicity, age, political party, and more. The benefit of sorting by religion, though, is that it highlights people’s beliefs: the way their ideological and spiritual convictions shape their self-understanding. This survey suggests that race is not enough to explain the sense of loss some white Americans seem to feel about their country, although it’s part of the story; the same is true of age, education level, and political affiliation. People’s beliefs seem to have a distinctive bearing on how they view changes in American culture, politics, and law—and whether they feel threatened. No group is more likely to express this fear than conservative Christians.
It happened gradually—and until the U.S. figures out how to treat the problem, it will only get worse.
It’s 2020, four years from now. The campaign is under way to succeed the president, who is retiring after a single wretched term. Voters are angrier than ever—at politicians, at compromisers, at the establishment. Congress and the White House seem incapable of working together on anything, even when their interests align. With lawmaking at a standstill, the president’s use of executive orders and regulatory discretion has reached a level that Congress views as dictatorial—not that Congress can do anything about it, except file lawsuits that the divided Supreme Court, its three vacancies unfilled, has been unable to resolve.
On Capitol Hill, Speaker Paul Ryan resigned after proving unable to pass a budget, or much else. The House burned through two more speakers and one “acting” speaker, a job invented following four speakerless months. The Senate, meanwhile, is tied in knots by wannabe presidents and aspiring talk-show hosts, who use the chamber as a social-media platform to build their brands by obstructing—well, everything. The Defense Department is among hundreds of agencies that have not been reauthorized, the government has shut down three times, and, yes, it finally happened: The United States briefly defaulted on the national debt, precipitating a market collapse and an economic downturn. No one wanted that outcome, but no one was able to prevent it.
In an era fixated on science, technology, and data, the humanities are in decline. They’re more vital than ever.
Earlier this month, the Washington Post journalist Jeff Guo wrote a detailed account of how he’d managed to maximize the efficiency of his cultural consumption. “I have a habit that horrifies most people,” he wrote. “I watch television and films in fast forward … the time savings are enormous. Four episodes of Unbreakable Kimmy Schmidt fit into an hour. An entire season of Game of Thrones goes down on the bus ride from D.C. to New York.”
Guo’s method, which he admits has ruined his ability to watch TV and movies in real time, encapsulates how technology has allowed many people to accelerate the pace of their daily routines. But is faster always better when it comes to art? In a conversation at the Aspen Ideas Festival, co-sponsored by the Aspen Institute and The Atlantic, Drew Gilpin Faust, the president of Harvard University, and the cultural critic Leon Wieseltier agreed that true study and appreciation of the humanities are rooted in slowness—in the kind of deliberate education that can be accrued over a lifetime. While this can seem almost antithetical at times to the pace of modern life, and as subjects like art, philosophy, and literature face steep declines in enrollment at academic institutions in the U.S., both argued that studying the humanities is vital for the ways in which it teaches us how to be human.
American-Indian cooking has all the makings of a culinary trend, but it’s been limited by many diners’ unfamiliarity with its dishes and its loaded history.
DENVER—In 2010, the restaurateur Matt Chandra told The Atlantic that the Native American restaurant he and business partner Ben Jacobs had just opened would have 13 locations “in the near future.” But six years later, just one other outpost of their fast-casual restaurant, Tocabe, is up and running.
In the last decade, at least a handful of articles predicted that Native American food would soon see wider reach and recognition. “From the acclaimed Kai restaurant in Phoenix to Fernando and Marlene Divina's James Beard Award-winning cookbook, Foods of the Americas, to the White Earth Land Recovery Project, which sells traditional foods like wild rice and hominy, this long-overlooked cuisine is slowly gaining traction in the broader culinary landscape,” wrote Katie Robbins in her Atlantic piece. “[T]he indigenous food movement is rapidly gaining momentum in the restaurant world,” proclaimed Mic in the fall of 2014. This optimism sounds reasonable enough: The shift in the restaurant world toward more locally sourced ingredients and foraging dovetails nicely with the hallmarks of Native cuisine, which is often focused on using local crops or herds. Yet while there are a few Native American restaurants in the U.S. (there’s no exact count), the predicted rise hasn’t really happened, at least not to the point where most Americans are familiar with Native American foods or restaurants.
As it’s moved beyond the George R.R. Martin novels, the series has evolved both for better and for worse.
Well, that was more like it. Sunday night’s Game of Thrones finale, “The Winds of Winter,” was the best episode of the season—the best, perhaps, in a few seasons. It was packed full of major developments—bye, bye, Baelor; hello, Dany’s fleet—but still found the time for some quieter moments, such as Tyrion’s touching acceptance of the role of Hand of the Queen. I was out of town last week and thus unable to take my usual seat at our Game of Thrones roundtable. But I did have some closing thoughts about what the episode—and season six in general—told us about how the show has evolved.
Last season, viewers got a limited taste—principally in the storylines in the North—of how the show would be different once the showrunners David Benioff and D.B. Weiss ran out of material from George R.R. Martin’s novels and had to set out on their own. But it was this season in which that exception truly became the norm. Though Martin long ago supplied Benioff and Weiss with a general narrative blueprint of the major arcs of the story, they can no longer rely on the books scene by scene. Game of Thrones is truly their show now. And thanks to changes in pacing, character development, and plot streamlining, it’s also a markedly different show from the one we watched in seasons one through four—for the worse and, to some degree, for the better.
University leaders and observers discuss the intersection of student protests, free speech, and academic freedom.
In a Thursday debate titled “Academic Freedom, Safe Spaces, Dissent, and Dignity,” faculty or administrators from Yale, Wesleyan, Mizzou, and the University of Chicago discussed last semester’s student protests and their intersection with free speech. They shared the stage at the Aspen Ideas Festival, co-hosted by the Aspen Institute and The Atlantic, with Jonathan Greenblatt of the Anti-Defamation League; Kirsten Powers, author of The Silencing: How the Left Is Killing Free Speech; and Greg Lukianoff, who leads the Foundation for Individual Rights in Education.
My colleague Jeffrey Goldberg was the moderator.
The most interesting exchange involved Stephen Carter, a law professor at Yale, and Michael S. Roth, the president of Wesleyan University.
As incomes fall across the nation, even better-off areas like Sheboygan County, Wisconsin, are faltering.
SHEBOYGAN, Wis.—There is still a sizable middle class in this county of 115,000 on the shores of Lake Michigan, a pleasant hour’s drive from Milwaukee. You can see it in the cars that pour in and out of the parking lots of local factories, in the restaurants packed with older couples on weeknights, and in the bars that seem to be on every single corner. You can see it in the local parks, including one called Field of Dreams, where kids play soccer and baseball and their parents sit and watch.
About 63 percent of adults in Sheboygan make between $41,641 and $124,924, meaning the area has one of the highest shares of middle-class households in the country, according to a report from the Pew Research Center. Nationally, only 51 percent of adults are middle-class.