My faith in adult society got a little boost this past weekend when I read that a growing number of people are becoming disillusioned with Facebook and are discontinuing their affiliation with the site, or at least their frequent visits to it.
Not that Facebook, or its conceptual offspring Twitter, is in any immediate danger of extinction. Both networks' user numbers are still climbing. But as Virginia Heffernan reported in the Sunday Times Magazine, a growing number of people are becoming disenchanted with Facebook--and in some cases with the whole idea of Facebook--for a variety of reasons.
For some, it's concerns about privacy. Facebook isn't just a friendly neighborhood park; the company profits from the information it collects on users. Many objected, in early 2008, to the fact that the site held onto profile information even after people closed their accounts. Not to mention the "oops" when Facebook decided to let everyone in a user's circle know about purchases the user had made elsewhere on the internet. Others turned away after the kerfuffle over Facebook's assertion, last February, that it owned the copyright to all content on the site, and some simply object to having their personal activity so closely monitored by a large, unseen entity.
But what intrigued me about the group Heffernan interviewed was the number who were simply tiring of checking in on other people's lives all the time, investing in connections that felt more like stalking or distant newsletters than direct one-on-one friendship, and growing uneasy about how they're spending, or wasting, their time.
I find these growing sentiments reassuring because of an assessment a friend of mine made last spring about the social-network frenzy of Facebook and Twitter--a friend, it should be noted, whose entire job revolves around the development of new technology in Silicon Valley. Both of those technologies, he said, were really geared toward the needs and interests of teenagers and young people. Twitter, after all, evolved from cell phone texting, which nobody does anywhere near as impressively, or frequently, as the under-20 crowd. And Facebook was started by college students as a kind of snide "pig book" to put various students' photos together and allow people to weigh in on who was "hotter." It evolved into a college networking site, and expanded from there. But, still.
The tasks that Facebook and Twitter enhance ... staying connected with as large a group as possible, staying up-to-the-minute informed about what everyone in the social world you care about is doing, and in the process keeping track of where you fit in the social hierarchy of it all ... have been a primary focus of teenagers since time immemorial. Forty years ago, there were gossip cliques by the school lockers and fights over who got to use the family phone to keep up with the latest social status news. All Facebook and Twitter do is give teenagers additional tools to accomplish one of their prime developmental tasks: figuring out how to define themselves in relation to, and as distinct from, the rest of their peers, and exploring a wide variety of social connections within that group.
So in that context, texting, Facebook and Twitter are all terrific developments that, among other things, certainly free up the family phone. The puzzling thing is why they've been so popular among people who are supposed to be a bit beyond that stage. At some point in our development, we're supposed to let go of that obsessive focus on what everyone else is doing in order to focus on our own work and achievements. We're supposed to mature into valuing fewer but more meaningful friendships over the herd social groups we favored as teenagers. And, one hopes, we're supposed to get busy enough with more significant contributions to family, community and the world that we no longer care much about, or have time for, the movements and chatter of people we're not that deeply connected to. As free time becomes more limited, choices have to be made. And there's a trade-off: to go deep, you can't go as broad.
There are certainly valuable uses for Facebook, even in the 30-something and beyond set. Most of my friends who have teenagers have joined so they have a better awareness of the technology and world their children are experiencing ... and to help them keep track of what's going on in their children's lives. And for older people who can't get out as much, social networking sites offer a way to stay connected with the world, and to keep loneliness at bay. Not to mention their appeal to marketers, who see a way to reach large groups of people (especially the all-important young demographic) with a sales message relatively easily.
So the sites have their uses. But using them to compensate for the loneliness of old age, track your kids, or sell a product is different from being giddy about them--or being addicted to them--for their own sake. And that's the part that's perplexed me about their growing use and popularity among the over-30 set. When teenagers are texting or twittering inane comments during class, they're being difficult, but age-appropriate. When Senators are twittering inane comments during major policy speeches, there's something slightly askew.
But perhaps the fascination with both sites is just a product of our innately curious and exploratory natures. When my sister and I, at ages 15 and 17, bought lacrosse sticks (boys', because we couldn't locate girls'), I remember the way my dad was drawn almost irresistibly toward the back yard where we were trying them out. He watched from the back window, then the open door, then the grass at the foot of the steps. We could feel how much he was itching to have a go at it, even though he'd never held a lacrosse stick in his life. When we finally offered him a turn, he lit up like a Christmas tree and laughed out loud at the novelty of the play. He had a blast with it. But he didn't have the need to play as long as my sister and I did. He tried it, had fun, and then moved on to the other tasks and activities of his day.
The kids come up with something new, and we can't help but want to try it out. But with different life and developmental tasks demanding our focus and time, we don't, or at least we shouldn't, stay as obsessed with it as they are--whether the "it" is the hula hoop, skateboarding, hanging out at the mall ... or a passionate attachment to Facebook or Twitter.
Is that natural dissipation of interest coming to pass with the social networking sites, as well? Hard to say. But if Heffernan's subjects are any guide, it may be ... until, of course, the next exciting new fad, fashion, techno-gizmo, or toy comes to town.
The Islamic State is no mere collection of psychopaths. It is a religious group with carefully considered beliefs, among them that it is a key agent of the coming apocalypse. Here’s what that means for its strategy—and for how to stop it.
What is the Islamic State?
Where did it come from, and what are its intentions? The simplicity of these questions can be deceiving, and few Western leaders seem to know the answers. In December, The New York Times published confidential comments by Major General Michael K. Nagata, the Special Operations commander for the United States in the Middle East, admitting that he had hardly begun figuring out the Islamic State’s appeal. “We have not defeated the idea,” he said. “We do not even understand the idea.” In the past year, President Obama has referred to the Islamic State, variously, as “not Islamic” and as al-Qaeda’s “jayvee team,” statements that reflected confusion about the group, and may have contributed to significant strategic errors.
Without the financial support that many white families can provide, minority young people have to continually make sacrifices that set them back.
The year after my father died, I graduated from grad school, got a new job, and looked forward to saving for a down payment on my first home, a dream I had always had, but found lofty. I pulled up a blank spreadsheet and made a line item called “House Fund.”
That same week I got a call from my mom—she was struggling to pay off my dad’s funeral expenses. I looked at my “House Fund” and sighed. Then I deleted it and typed the words “Funeral Fund” instead.
My father’s passing was unexpected. And so was the financial burden that came with it.
For many Millennials of color, these sorts of trade-offs aren’t an anomaly. During key times in their lives when they should be building assets, they’re spending money on basic necessities and often helping out family. Their financial future is a rocky one, and much of it comes down to how much—or how little—assistance they receive.
In the name of emotional well-being, college students are increasingly demanding protection from words and ideas they don’t like. Here’s why that’s disastrous for education—and mental health.
Something strange is happening at America’s colleges and universities. A movement is arising, undirected and driven largely by students, to scrub campuses clean of words, ideas, and subjects that might cause discomfort or give offense. Last December, Jeannie Suk wrote in an online article for The New Yorker about law students asking her fellow professors at Harvard not to teach rape law—or, in one case, even use the word violate (as in “that violates the law”) lest it cause students distress. In February, Laura Kipnis, a professor at Northwestern University, wrote an essay in The Chronicle of Higher Education describing a new campus politics of sexual paranoia—and was then subjected to a long investigation after students who were offended by the article and by a tweet she’d sent filed Title IX complaints against her. In June, a professor protecting himself with a pseudonym wrote an essay for Vox describing how gingerly he now has to teach. “I’m a Liberal Professor, and My Liberal Students Terrify Me,” the headline said. A number of popular comedians, including Chris Rock, have stopped performing on college campuses (see Caitlin Flanagan’s article in this month’s issue). Jerry Seinfeld and Bill Maher have publicly condemned the oversensitivity of college students, saying too many of them can’t take a joke.
Welfare reform has driven many low-income parents to depend more heavily on family and friends for food, childcare, and cash.
Pity the married working mom, who barely has time to do the dishes or go for a run at night, much less spend a nice evening playing Boggle with her husband and kids.
But if married working parents are struggling with time management these days, imagine the struggles of low-income single parents. Single-parent households (which by and large are headed by women) have more than tripled as a share of American households since 1960. Now, 35 percent of children live in single-parent households.
But while the numbers are growing, the amount of help available to single mothers is not. Ever since the 1996 Personal Responsibility and Work Opportunity Law (generally referred to as welfare reform) placed time limits and work requirements on benefits in an effort to get welfare recipients back into the workforce, single-parent families have had a harder time receiving government benefits. Some states have made it more difficult for low-income single-parent families to get other types of assistance too, such as imposing work requirements and other barriers for food stamps. According to a recent New York Times column, between 1983 and 2004, government benefits dropped by more than a third for the lowest-income single-parent families.
Managers who believe themselves to be fair and objective judges of ability often overlook women and minorities who are deserving of job offers and pay increases.
Americans are, compared with populations of other countries, particularly enthusiastic about the idea of meritocracy, a system that rewards merit (ability + effort) with success. Americans are more likely to believe that people are rewarded for their intelligence and skills and are less likely to believe that family wealth plays a key role in getting ahead. And Americans’ support for meritocratic principles has remained stable over the last two decades despite growing economic inequality, recessions, and the fact that there is less mobility in the United States than in most other industrialized countries.
This strong commitment to meritocratic ideals can lead to suspicion of efforts that aim to support particular demographic groups. For example, initiatives designed to recruit or provide development opportunities to under-represented groups often come under attack as "reverse discrimination." Some companies even justify not having diversity policies by highlighting their commitment to meritocracy. If a company evaluates people on their skills, abilities, and merit, without consideration of their gender, race, sexuality, and so on, and managers are objective in their assessments, then there is no need for diversity policies, the thinking goes.
A week after officials released a video of an officer shooting Laquan McDonald 16 times, Mayor Rahm Emanuel said the superintendent had lost the trust of the community.
It took 14 months for Chicago authorities to release the videotape of an officer killing Laquan McDonald. But now that the footage is public, events have begun to move much faster.
Mayor Rahm Emanuel fired Police Superintendent Garry McCarthy overnight, the Chicago Sun-Times and Tribune reported. Emanuel announced the move Tuesday morning. The mayor had previously scheduled the press conference to announce the creation of a task force on police accountability.
McCarthy’s professional demise seemed pre-ordained by Tuesday. He was at the center of two raging controversies: first, whether the police department acted improperly in investigating McDonald’s death, and second, whether top city leaders delayed charging Officer Jason Van Dyke because of political considerations. At least one person was going to be fired, and McCarthy was first on the list.
Places like St. Louis and New York City were once similarly prosperous. Then, 30 years ago, the United States turned its back on the policies that had been encouraging parity.
Despite all the attention focused these days on the fortunes of the “1 percent,” debates over inequality still tend to ignore one of its most politically destabilizing and economically destructive forms. This is the growing, and historically unprecedented, economic divide that has emerged in recent decades among the different regions of the United States.
Until the early 1980s, a long-running feature of American history was the gradual convergence of income across regions. The trend goes back to at least the 1840s, but grew particularly strong during the middle decades of the 20th century. This was, in part, a result of the South catching up with the North in its economic development. As late as 1940, per-capita income in Mississippi, for example, was still less than one-quarter that of Connecticut. Over the next 40 years, Mississippians saw their incomes rise much faster than did residents of Connecticut, until by 1980 Mississippi’s per-capita income had reached 58 percent of Connecticut’s.
The competition is fierce, the key players are billionaires, but the path—and even the destination—remains uncertain.
The race to bring driverless cars to the masses is only just beginning, but already it is a fight for the ages. The competition is fierce, secretive, and elite. It pits Apple against Google against Tesla against Uber: all titans of Silicon Valley, in many ways as enigmatic as they are revered.
As these technology giants zero in on the car industry, global automakers are being forced to dramatically rethink what it means to build a vehicle for the first time in a century. Aspects of this race evoke several pivotal moments in technological history: the construction of railroads, the dawn of electric light, the birth of the automobile, the beginning of aviation. There’s no precedent for what engineers are trying to build now, and no single blueprint for how to build it.
Why are so many kids with bright prospects killing themselves in Palo Alto?
The air shrieks, and life stops. First, from far away, comes a high whine like angry insects swarming, and then a trampling, like a herd moving through. The kids on their bikes who pass by the Caltrain crossing are eager to get home from school, but they know the drill. Brake. Wait for the train to pass. Five cars, double-decker, tearing past at 50 miles an hour. Too fast to see the faces of the Silicon Valley commuters on board, only a long silver thing with black teeth. A Caltrain coming into a station slows, invites you in. But a Caltrain at a crossing registers more like an ambulance, warning you fiercely out of its way.
The kids wait until the passing train forces a gust you can feel on your skin. The alarms ring and the red lights flash for a few seconds more, just in case. Then the gate lifts up, signaling that it’s safe to cross. All at once life revives: a rush of bikes, skateboards, helmets, backpacks, basketball shorts, boisterous conversation. “Ew, how old is that gum?” “The quiz is next week, dipshit.” On the road, a minivan makes a left a little too fast—nothing ominous, just a mom late for pickup. The air is again still, like it usually is in spring in Palo Alto. A woodpecker does its work nearby. A bee goes in search of jasmine, stinging no one.