My faith in adult society got a little boost the past weekend when I read that a growing number of people are becoming disillusioned with Facebook and are discontinuing their affiliation, or at least their frequent visits, to the site.
Not that Facebook, or its conceptual offspring Twitter, is in any immediate danger of extinction. Both networks' user numbers are still climbing. But as Virginia Heffernan reported in the Sunday Times Magazine, a growing number of people are becoming disenchanted with Facebook--and in some cases with the whole idea of Facebook--for a number of reasons.
For some, it's concerns about privacy. Facebook isn't just a friendly neighborhood park; the company profits from the information it collects on users. Many objected, in early 2008, to the fact that the site held onto profile information even after people closed their accounts. Not to mention the "oops" when Facebook decided to let everyone in a user's circle know about purchases a user had made elsewhere on the internet. Others turned away after the kerfuffle over Facebook's assertion, last February, that it owned the copyright to all content on the site, and some object to having their personal activity so closely monitored by some large, unseen entity.
But what intrigued me about the group Heffernan interviewed was the number who were simply tiring of checking in on other people's lives all the time, investing in connections that felt more like stalking or distant newsletters instead of direct one-on-one friendship, and a growing unease about how they're spending, or wasting, their time.
I find these growing sentiments reassuring because of an assessment a friend of mine made last spring about the social-network frenzy of Facebook and Twitter. A friend, it should be noted, whose entire job revolves around the development of new technology in Silicon Valley. Both of those technologies, he said, were really geared toward the needs and interests of teenagers and young people. Twitter, after all, evolved from cell phone texting, which nobody does anywhere near as impressively, or frequently, as the under-20 crowd. And Facebook was started by college students as a kind of snide "pig book" to put various students' photos together and allow people to weigh in on who was "hotter." It evolved into a college networking site, and expanded from there. But, still.
The tasks that Facebook and Twitter enhance ... staying connected with as large a group as possible, staying up-to-the-minute informed about what everyone in the social world you care about is doing, and in the process keeping track of where you fit in the social hierarchy of it all ... have been a primary focus of teenagers since time immemorial. Forty years ago, there were gossip cliques by the school lockers and fights over who got to use the family phone to keep up with the latest social status news. All Facebook and Twitter do is give teenagers additional tools to accomplish one of their prime developmental tasks: figuring out how to define themselves in relation to, and as distinct from, the rest of their peers, and exploring a wide variety of social connections within that group.
So in that context, texting, Facebook and Twitter are all terrific developments that, among other things, certainly free up the family phone. The puzzling thing is why they've been so popular among people who are supposed to be a bit beyond that stage. At some point in our development, we're supposed to let go of that obsessive focus on what everyone else is doing in order to focus on our own work and achievements. We're supposed to mature into valuing fewer but more meaningful friendships over the herd social groups we favored as teenagers. And hopefully, we're supposed to get busy enough with more significant contributions to family, community and the world to neither care about, nor have time for, the movements and chatter of people we're not that deeply connected to. As free time becomes more limited, choices have to be made. And there's a trade-off: to go deep, you can't go as broad.
There are certainly valuable uses for Facebook, even in the 30-something and beyond set. Most of my friends who have teenagers have joined so they have a better awareness of the technology and world their children are experiencing ... and to help them keep track of what's going on in their children's lives. And for older people who can't get out as much, social networking sites offer a way to stay connected with the world, and to keep loneliness at bay. Not to mention their appeal to marketers, who see a way to reach large groups of people (and especially the all-important young demographic) with a sales message in a fairly easy manner.
So the sites have their uses. But using them to compensate for the loneliness of old age, track your kids, or sell a product is different from being giddy about them--or being addicted to them--for their own sake. And that's the part that's perplexed me about their growing use and popularity among the over-30 set. When teenagers are texting or twittering inane comments during class, they're being difficult, but age-appropriate. When Senators are twittering inane comments during major policy speeches, there's something slightly askew.
But perhaps the fascination with both sites is just a product of our innately curious and exploratory natures. When my sister and I, at ages 15 and 17, bought lacrosse sticks (boys', because we couldn't locate girls'), I remember the way my dad was drawn almost irresistibly toward the back yard where we were trying them out. He watched from the back window, then the open door, then the grass at the foot of the steps. We could feel how much he was itching to have a go at it, even though he'd never held a lacrosse stick in his life. When we finally offered him a turn, he lit up like a Christmas tree and laughed out loud at the novelty of the play. He had a blast with it. But he didn't have the need to play as long as my sister and I did. He tried it, had fun, and then moved on to the other tasks and activities of his day.
The kids come up with something new, and we can't help but want to try it out. But with different life and developmental tasks demanding our focus and time, we don't, or at least we shouldn't, stay as obsessed with it as they are--whether the "it" is the hula hoop, skateboarding, hanging out at the mall ... or a passionate attachment to Facebook or Twitter.
Is that natural dissipation of interest coming to pass with the social networking sites, as well? Hard to say. But if Heffernan's subjects are any guide, it may be ... until, of course, the next exciting new fad, fashion, techno-gizmo, or toy comes to town.
There are two types of people in the world: those with hundreds of unread messages, and those who can’t relax until their inboxes are cleared out.
For some, it’s a spider. For others, it’s an unexpected run-in with an ex. But for me, discomfort is a dot with a number in it: 1,328 unread-message notifications? I just can’t fathom how anyone lives like that.
How is it that some people remain calm as unread messages trickle into their inboxes and then roost there unattended, while others can't sit still knowing that bolded-black emails and red-dotted Slack messages are waiting? I may operate toward the extreme end of compulsive notification-eliminators, but surveys suggest I'm not alone: One 2012 study found that 70 percent of work emails were attended to within six seconds of their arrival.
This has led me to a theory that there are two types of emailers in the world: those who can comfortably ignore unread notifications, and those who feel the need to take action immediately.
The plight of non-tenured professors is widely known, but what about the impact they have on the students they’re hired to instruct?
Imagine meeting your English professor by the trunk of her car for office hours, where she doles out information like a taco vendor in a food truck. Or getting an e-mail error message when you write your former biology professor asking for a recommendation because she is no longer employed at the same college. Or attending an afternoon lecture in which your anthropology professor seems a little distracted because he doesn’t have enough money for bus fare. This is an increasingly widespread reality of college education.
Many students—and parents who foot the bills—may assume that all college professors are adequately compensated professionals with a distinct arrangement in which they have a job for life. In actuality those are just tenured professors, who represent less than a quarter of all college faculty. Odds are that students will be taught by professors with less job security and lower pay than those tenured employees, which research shows results in diminished services for students.
Most people have probably heard the phrase "when the shit hits the fan" in reference to something gone awry at work or in life. In either setting, when the shit does hit the fan, people tend to look to the most competent person in the room to take over.
And too bad for that person. A new paper by a team of researchers from Duke University, the University of Georgia, and the University of Colorado looks at not only how extremely competent people are treated by their co-workers and peers, but also how those people feel when, at crucial moments, everyone turns to them. The researchers find that responsible employees are not terribly pleased about this dynamic either.
New research confirms what they say about nice guys.
Smile at the customer. Bake cookies for your colleagues. Sing your subordinates’ praises. Share credit. Listen. Empathize. Don’t drive the last dollar out of a deal. Leave the last doughnut for someone else.
Sneer at the customer. Keep your colleagues on edge. Claim credit. Speak first. Put your feet on the table. Withhold approval. Instill fear. Interrupt. Ask for more. And by all means, take that last doughnut. You deserve it.
Follow one of those paths, the success literature tells us, and you’ll go far. Follow the other, and you’ll die powerless and broke. The only question is, which is which?
Of all the issues that preoccupy the modern mind—Nature or nurture? Is there life in outer space? Why can’t America field a decent soccer team?—it’s hard to think of one that has attracted so much water-cooler philosophizing yet so little scientific inquiry. Does it pay to be nice? Or is there an advantage to being a jerk?
Science: Humblebragging doesn’t work. If you want to brag, just brag. Even better, just complain.
"Nothing is more deceitful," said Darcy, "than the appearance of humility. It is often only carelessness of opinion, and sometimes an indirect boast." - Jane Austen
Praise and sympathy: They are two of life’s essentials, the oxygen and carbon dioxide of social interaction. The first is most directly elicited by bragging, and the second, by complaining. The humblebrag—e.g. I’m exhausted from Memorial Day weekend; it’s soooo hard to get out of Nantucket—sits at the center of these competing needs. It is a boast in sheepish clothing, kvelling dressed in kvetch. And, like nearly all forms of multi-tasking, the drive to satisfy two goals at once typically results in double-failure.
The military origins of wearable tech, a century before the Apple Watch
On July 9, 1916, The New York Times puzzled over a fashion trend: Europeans were starting to wear bracelets with clocks on them. Time had migrated to the human wrist, and the development required some explaining.
“Until recently,” the paper observed, “the bracelet watch has been looked upon by Americans as more or less of a joke. Vaudeville artists and moving-picture actors have utilized it as a funmaker, as a ‘silly ass’ fad.”
But the wristwatch was a “silly-ass fad” no more. “The telephone and signal service, which play important parts in modern warfare, have made the wearing of watches by soldiers obligatory,” the Times observed, two years into World War I. “The only practical way in which they can wear them is on the wrist, where the time can be ascertained readily, an impossibility with the old style pocket watch.” Improvements in communications technologies had enabled militaries to more precisely coordinate their maneuvers, and coordination required soldiers to discern the time at a glance. Rifling through your pocket for a watch was not advisable in the chaos of the trenches.
The Islamic State is no mere collection of psychopaths. It is a religious group with carefully considered beliefs, among them that it is a key agent of the coming apocalypse. Here’s what that means for its strategy—and for how to stop it.
What is the Islamic State?
Where did it come from, and what are its intentions? The simplicity of these questions can be deceiving, and few Western leaders seem to know the answers. In December, The New York Times published confidential comments by Major General Michael K. Nagata, the Special Operations commander for the United States in the Middle East, admitting that he had hardly begun figuring out the Islamic State’s appeal. “We have not defeated the idea,” he said. “We do not even understand the idea.” In the past year, President Obama has referred to the Islamic State, variously, as “not Islamic” and as al-Qaeda’s “jayvee team,” statements that reflected confusion about the group, and may have contributed to significant strategic errors.
Like the Nancy Drew series, almost all of the thrillers in the popular teenage franchise were produced by ghostwriters, thanks to a business model that proved to be prescient.
In the opening pages of a recent installment of the children’s book series The Hardy Boys, black smoke drifts through the ruined suburb of Bayport. The town's residents, dressed in tatters and smeared with ash, stumble past the local pharmacy and diner. Shards of glass litter the sidewalk. “Unreal,” says the mystery-solving teenager Joe Hardy—and he's right. Joe and his brother Frank are on a film set, and the people staggering through the scene are actors dressed as zombies. But as is always the case with Hardy Boys books, something still isn’t quite right: This time, malfunctioning sets nearly kill several actors, and the brothers find themselves in the middle of yet another mystery.
For many intellectually and developmentally disabled people, large campuses or farmsteads may be better options than small group homes. But new state laws could make it hard for big facilities to survive.
In December 2014, I watched 24-year-old Andrew Parles fit wood shapes into a simple puzzle in the new vocational building at the Bancroft Lakeside Campus, a residential program in New Jersey that serves 47 adults with autism and intellectual disabilities. The task wasn’t challenging for Andrew, but his team was taking it slow: Andrew was still recovering from surgery after detaching his own retinas through years of self-injurious behavior. A staff member stood nearby—not hovering, exactly, but close enough to intervene if Andrew suddenly started to hit himself in the head. His mother, Lisa, was hopeful that he’d soon be able to participate in the programs he had enjoyed before his surgery: working in Lakeside’s greenhouse, painting in the art studio, delivering food for Meals on Wheels.
Orr: “It’s a pleasure to meet you, Your Grace. My name is Tyrion Lannister.”
At last! I know I speak for quite a few book readers when I say that pretty much the only thing that kept me going through the eleventy thousand discursive, digressive pages of George R. R. Martin’s fifth tome, A Dance With Dragons, was the promise of Tyrion finally meeting up with Daenerys Targaryen. And, of course, after eleventy thousand pages, it never happened. So on behalf of myself and everyone else who sacrificed sleep, work, family, and friends waiting for this moment, let me say thank you, David Benioff and D. B. Weiss. Bonus points for what seemed to be a cameo by Strong Belwas (a book character who was written out of the show) as the nameless fighter who freed Tyrion from his chains.