My faith in adult society got a little boost this past weekend when I read that a growing number of people are becoming disillusioned with Facebook and are discontinuing their affiliation with the site, or at least their frequent visits to it.
Not that Facebook, or its conceptual offspring Twitter, is in any immediate danger of extinction. The user numbers of both networks are still climbing. But as Virginia Heffernan reported in the Sunday Times Magazine, there's a growing number of people who are becoming disenchanted with Facebook--and in some cases the whole idea of Facebook--for a variety of reasons.
For some, it's concerns about privacy. Facebook isn't just a friendly neighborhood park; the company profits from the information it collects on users. There were many who objected, in early 2008, to the fact that the site was holding onto profile information even when people closed down their accounts. Not to mention the "oops" when Facebook decided to let everyone in a user's circle know about purchases a user made on other sites. There were also some who turned away after the kerfuffle over Facebook's assertion, last February, that it owned the copyright to all content on the site, and some who object to having their personal activity so closely monitored by some large, unseen entity.
But what intrigued me about the group Heffernan interviewed was the number who were simply tiring of checking in on other people's lives all the time, investing in connections that felt more like stalking or distant newsletters instead of direct one-on-one friendship, and a growing unease about how they're spending, or wasting, their time.
I find these growing sentiments reassuring because of an assessment a friend of mine made last spring about the social-network frenzy of Facebook and Twitter. A friend, it should be noted, whose entire job revolves around the development of new technology in Silicon Valley. Both of those technologies, he said, were really geared toward the needs and interests of teenagers and young people. Twitter, after all, evolved from cell phone texting, which nobody does anywhere near as impressively, or frequently, as the under-20 crowd. And Facebook was started by college students as a kind of snide "pig book" to put various students' photos together and allow people to weigh in on who was "hotter." It evolved into a college networking site, and expanded from there. But, still.
The tasks that Facebook and Twitter enhance ... staying connected with as large a group as possible, staying up-to-the-minute informed about what everyone in the social world you care about is doing, and in the process keeping track of where you fit in the social hierarchy of it all ... have been a primary focus of teenagers since time immemorial. Forty years ago, there were gossip cliques by the school lockers and fights over who got to use the family phone to keep up with the latest social status news. All Facebook and Twitter do is give teenagers additional tools to accomplish one of their prime developmental tasks: figuring out how to define themselves in relation to, and as distinct from, the rest of their peers, and exploring a wide variety of social connections within that group.
So in that context, texting, Facebook and Twitter are all terrific developments that, among other things, certainly free up the family phone. The puzzling thing is why they've been so popular among people who are supposed to be a bit beyond that stage. At some point in our development, we're supposed to let go of that obsessive focus on what everyone else is doing in order to focus on our own work and achievements. We're supposed to mature into valuing fewer but more meaningful friendships over the herd social groups we favored as teenagers. And hopefully, we're supposed to get busy enough with more significant contributions to family, community and the world that we neither care about, nor have time for, the movements and chatter of people we're not that deeply connected to. As free time becomes more limited, choices have to be made. And there's a trade-off: to go deep, you can't go as broad.
There are certainly valuable uses for Facebook, even in the 30-something and beyond set. Most of my friends who have teenagers have joined so they have a better awareness of the technology and world their children are experiencing ... and to help them keep track of what's going on in their children's lives. And for older people who can't get out as much, social networking sites offer a way to stay connected with the world, and to keep loneliness at bay. Not to mention their appeal to marketers, who see a way to reach large groups of people (and especially the all-important young demographic) with a sales message in a fairly easy manner.
So the sites have their uses. But using them to compensate for the loneliness of old age, track your kids, or sell a product, is different from being giddy about them--or being addicted to them--for their own sake. And that's the part that's perplexed me about their growing use and popularity among the over-30 set. When teenagers are texting or twittering inane comments during class, they're being difficult, but age-appropriate. When senators are twittering inane comments during major policy speeches, there's something slightly askew.
But perhaps the fascination with both sites is just a product of our innately curious and exploratory natures. When my sister and I, at ages 15 and 17, bought lacrosse sticks (boys', because we couldn't locate girls'), I remember the way my dad was drawn almost irresistibly toward the back yard where we were trying them out. He watched from the back window, then the open door, then the grass at the foot of the steps. We could feel how much he was itching to have a go at it, even though he'd never held a lacrosse stick in his life. When we finally offered him a turn, he lit up like a Christmas tree and laughed out loud at the novelty of the play. He had a blast with it. But he didn't have the need to play as long as my sister and I did. He tried it, had fun, and then moved on to the other tasks and activities of his day.
The kids come up with something new, and we can't help but want to try it out. But with different life and developmental tasks demanding our focus and time, we don't, or at least we shouldn't, stay as obsessed with it as they are--whether the "it" is the hula hoop, skateboarding, hanging out at the mall ... or a passionate attachment to Facebook or Twitter.
Is that natural dissipation of interest coming to pass with the social networking sites, as well? Hard to say. But if Heffernan's subjects are any guide, it may be ... until, of course, the next exciting new fad, fashion, techno-gizmo, or toy comes to town.
What looks at first glance like an opening up of possibilities is actually an attack on the human imagination.
You might not like what I’m about to say about the multiverse. But don’t worry; you’ve already had your revenge. If there are an infinite number of parallel universes, there will be any number of terrible dictatorships, places where life has become very difficult for people who like to string words together. Somewhere out there, there’s a society in which every desperate little essay like this one comes with a tiny, unremarkable button: push it, and the author will be immediately electrocuted to death.
Maybe your hate is more visceral—you already know I’ll die some day, but you want to see it happen; you need to see me groveling. You can if you want. Fly upwards from the plane of our solar system, keep on going, through the endless huddles of galaxies, never forgetting your purpose, until space and time run out altogether. Eventually you’ll find yourself in another universe, on a damp patch of grass and broken concrete, unwatched by whatever local gang or galactic empire rules the city rising in foggy shapes beyond the marshes. There, you’ll see a creature strangely similar to yourself, beating me to death with whatever bits of scrap are lying around.
Hillary Clinton has her problems, but Donald Trump is unfit for the presidency.
On one hand, there’s former Secretary of State Hillary Clinton, who oversaw “grossly inadequate” security at a diplomatic facility in Benghazi, Libya, the site of a deadly September 11, 2012, terrorist attack.
As pay TV slowly declines, cable news faces a demographic cliff. And nobody has further to fall than the merchant of right-wing outrage.
October 7, 2016, will be the 20th birthday of the Fox News Channel, and at the moment, the network is experiencing the soap-operatic highs and lows typical of any teenager on television. In many ways, the summer of 2016 may go down in Fox News history as the company’s nadir. Its founder and leader Roger Ailes has been dishonorably dispatched, the remaining executives are dealing with a flurry of sexual harassment lawsuits, and one of its most public faces, Sean Hannity, has ignominiously remodeled himself as a gutless Trump whisperer.
And yet Fox News’ fortunes are ascendant, at least in the most quantifiable sense. The network’s annual profit in 2015 soared by about 20 percent. For the first time ever, Fox News has been the most-watched cable network among both primetime and daytime viewers for several months, with a larger audience than its nominal rivals, CNN and MSNBC, combined. Led by “The O'Reilly Factor,” Fox News doesn’t just have the best-rated news show on cable television; according to The Wrap, it has the 13 best-rated news shows on cable television.
Why did the company trend a false article about Megyn Kelly?
Oh, Facebook. Just when the company seems to have avoided the responsibility of being a news organization (and all the attendant controversy), it finds itself back in the editorial muck.
Last week, Facebook made a surprise overhaul of its “Trending Stories” feature, the sidebar that highlights some of the most popular news stories on Facebook. Where the company had previously provided a short, human-written summary of the news at hand, it now only described the story in a one- or two-word phrase: “#Tokyo2020: Japanese Prime Minister Appears in Surprise Performance During Rio Ceremony,” became just “#Tokyo2020.”
Facebook’s decision to simplify the feature seemed like an attempt to wriggle out of editorial responsibility: What had been a messy human-led process would now become an algorithm-guided one. The company also laid off the 26 employees who had run the feature—19 curators and seven copyeditors—with little warning on Friday, according to Quartz.
In the primaries, he avoided policy debates by promising to build a wall—but the general election is forcing him into specifics.
The biggest political story of the last week has been Donald Trump’s flip-flop on deporting undocumented immigrants. This Sunday on CNN, Mike Pence filibustered his way through the subject for almost seven minutes before Jake Tapper finally declared, “You did not address the issue” and moved on. Chris Christie on ABC and Kellyanne Conway on CBS were no more coherent. The Daily Beast summed up the morning with the headline, “Immigration Flip-Flop Leaves Trump Campaign Flailing on Sunday Shows.”
But focusing on Trump’s “flip-flop” misses the point. Trump’s real problem isn’t that he’s changed his position on immigration. It’s that he’s trying to formulate one at all.
What the commentary of the last few days has generally overlooked is that while immigration was key to Trump’s success in the Republican primary, Trump never actually offered an immigration policy. To the contrary, his success rested in large measure on his ability to avoid one. Trump’s strategy on immigration, as on other key issues, was to cut through the Gordian knot of public policy with aggressive, quick fix solutions. Terrorism? Ban Muslims. ISIS? Bomb the hell out of them and take their oil. Loss of manufacturing jobs? Slap massive tariffs on companies that outsource American jobs.
A new anatomical understanding of how movement controls the body’s stress response system
Elite tennis players have an uncanny ability to clear their heads after making errors. They constantly move on and start fresh for the next point. They can’t afford to dwell on mistakes.
Peter Strick is not a professional tennis player. He’s a distinguished professor and chair of the department of neurobiology at the University of Pittsburgh Brain Institute. He’s the sort of person to dwell on mistakes, however small.
“My kids would tell me, Dad, you ought to take up Pilates. Do some yoga,” he said. “But I’d say, as far as I’m concerned, there's no scientific evidence that this is going to help me.”
Still, the meticulous skeptic espoused more of a tennis approach to dealing with stressful situations: Just teach yourself to move on. Of course there is evidence that ties practicing yoga to good health, but not the sort that convinced Strick. Studies show correlations between the two, but he needed a physiological mechanism to explain the relationship. Vague conjecture that yoga “decreases stress” wasn’t sufficient. How? Simply by distracting the mind?
Marketing ditties once had a distinctive, hokey sound, but today’s advertisers have ditched them for standard pop songs.
Most Americans can recite their share of jingles. Perhaps they can’t remember their partner’s cell phone number, but they know every digit required to reach Empire Carpet. Or every word of “I’m a Toys ‘R Us Kid.” Or that the best part of waking up is Folgers in their cup.
And yet, despite its effectiveness, the jingle has become a relic of the mid-20th-century commercials it once dominated. Today’s pop songs and yesterday’s classics have effectively replaced the jingle: A Kanye West song plays in an ad for Bud Light Platinum, Lady Gaga’s “Applause” is a party anthem for the Kia Soul’s spokeshamsters, and a Bob Dylan track helps advertise Victoria’s Secret. Amid all this, Oscar Mayer decided to retire two of the most popular jingles of all time, “My Bologna Has a First Name” and “I Wish I Was an Oscar Mayer Wiener.” In 2010, the company announced a new ad campaign, sans the old tunes. “What we did not want to do was write jingles,” an ad exec told The New York Times.
What to do if you’re a Hillary fan seated next to a Trump supporter at a wedding
When America is finally great again, they’ll make the latte with soy milk like you asked.
All those political cracks, not to mention earnest proclamations, mean that for the next 10 weeks, many casual interactions run the risk of erupting into full-blown partisan warfare. It’s more of a danger for those with family members or close friends who support opposing candidates and views. But on Facebook, hot-button scuffles can break out between almost anyone. (I recently witnessed a college friend who lives in Europe arguing about gun rights with a random guy from my high school in Texas, whom I myself have spoken with only a few times in person.)
One reason Americans find the other side’s views so inflammatory is that increasingly, they view their political party as more of a tribe than a checkbox. “People start seeing themselves or their political views as the main representation of their values, and what is right and wrong,” said Emanuel Maidenberg, a clinical professor of psychiatry and biobehavioral sciences at UCLA.
In the name of emotional well-being, college students are increasingly demanding protection from words and ideas they don’t like. Here’s why that’s disastrous for education—and mental health.
Something strange is happening at America’s colleges and universities. A movement is arising, undirected and driven largely by students, to scrub campuses clean of words, ideas, and subjects that might cause discomfort or give offense. Last December, Jeannie Suk wrote in an online article for The New Yorker about law students asking her fellow professors at Harvard not to teach rape law—or, in one case, even use the word violate (as in “that violates the law”) lest it cause students distress. In February, Laura Kipnis, a professor at Northwestern University, wrote an essay in The Chronicle of Higher Education describing a new campus politics of sexual paranoia—and was then subjected to a long investigation after students who were offended by the article and by a tweet she’d sent filed Title IX complaints against her. In June, a professor protecting himself with a pseudonym wrote an essay for Vox describing how gingerly he now has to teach. “I’m a Liberal Professor, and My Liberal Students Terrify Me,” the headline said. A number of popular comedians, including Chris Rock, have stopped performing on college campuses (see Caitlin Flanagan’s article in this month’s issue). Jerry Seinfeld and Bill Maher have publicly condemned the oversensitivity of college students, saying too many of them can’t take a joke.
Economics hasn’t been able to explain irrational choices. Can neuroscience?
Humans often make bad decisions. If you like Snickers more than Milky Way, it seems obvious which candy bar you’d pick, given a choice of the two. Traditional economic models follow this logical intuition, suggesting that people assign a value to each choice—say, Snickers: 10, Milky Way: 5—and select the top scorer. But our decision-making system is subject to glitches.
In one recent experiment, Paul Glimcher, a neuroscientist at New York University, and collaborators asked people to choose among a variety of candy bars, including their favorite—say, a Snickers. If offered a Snickers, a Milky Way and an Almond Joy, participants would always choose the Snickers. But if they were offered 20 candy bars, including a Snickers, the choice became less clear. They would sometimes pick something other than the Snickers, even though it was still their favorite. When Glimcher would remove all the choices except the Snickers and the selected candy, participants would wonder why they hadn’t chosen their favorite.
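For readers who want to see the intuition in concrete terms, the contrast between the two models can be sketched in a few lines of code. This is an illustrative toy, not Glimcher's actual experimental model: the candy names and values come from the article's example, while the noise mechanism and numbers are assumptions chosen only to show how a noisy valuation system makes more errors as the menu grows.

```python
import random

# Toy valuations from the article's example: Snickers is the favorite.
values = {"Snickers": 10, "Milky Way": 5, "Almond Joy": 5}

def rational_choice(options):
    # Traditional economic model: assign each option a fixed value
    # and always pick the top scorer. The favorite always wins.
    return max(options, key=lambda o: values[o])

def error_rate(n_others, trials=2000, noise=3.0, seed=1):
    # Hypothetical noisy chooser: each valuation gets independent
    # Gaussian noise on every trial. With more non-favorite options,
    # the odds grow that some option's noisy score beats the favorite.
    rng = random.Random(seed)
    errors = 0
    for _ in range(trials):
        favorite_score = 10 + rng.gauss(0, noise)
        best_other = max(5 + rng.gauss(0, noise) for _ in range(n_others))
        if best_other > favorite_score:
            errors += 1
    return errors / trials
```

Under these assumed numbers, `rational_choice` returns "Snickers" no matter how many bars are offered, while `error_rate(19)` (a 20-bar menu) comes out well above `error_rate(2)` (a 3-bar menu): the fixed-value model cannot reproduce the participants' mistakes, but a small amount of valuation noise does, and the mistakes multiply with the size of the choice set.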