To hear the subjects talk about it, you'd think they'd discovered a new law of physics. Shocking as it might seem, it turns out that if you're not live-blogging or tweeting or reporting on an event as it unfolds--and you know that nobody is going to quote you or write about the event after the fact--you act and experience an event quite differently. "You actually listen to the conversation, not just wait for your turn to speak," marveled a blogger who, the New York Times reported in a recent article, has begun organizing strictly off-the-record gatherings in New York.
The difference between reporting an event (or performing for reporters) and simply experiencing an event or conversation is one journalists learn very early. Twenty years ago, when I got my first job as an aviation journalist and was assigned to cover air shows, my pilot friends were green with envy. I was going to get paid for what they'd do for free. But reporting on an event is very, very different from simply experiencing it. It turns you from a participant into an observer. You have to step back and gauge which facts are the most important to gather and what story line you're going to pursue. The same is true for photographers. The eye of a photographer is an analytical one, judging the best angle, best light, and best focal length in a world reduced to whatever narrow slice is visible through a viewfinder. Real-time impressions and emotion are sacrificed for a lasting, illustrative image. There is a cost to recording an event; a cost paid in removal from full immersion in, or enjoyment of, the moment as it actually happens.
By the same token, there's a night-and-day difference between a public versus a private conversation. Any journalist, spokesperson or politician could tell you that. So could anyone who has spent much time watching cable television pundits talk past each other in mind-numbing diatribes sparked by the presence of cameras and their attendant promise of publicity and notoriety.
But it used to be that only certain professionals understood, or had to struggle with, the downsides of either recording events, or being the subject of a recorded event. Now, almost everyone is getting a taste of it, thanks to the prolific spread of blogging, Facebook, Twitter and other mass publication vehicles. And--thanks God, as my Italian neighbors would say--some of them are beginning to realize that the sword has two sides. That perhaps not everything has to be, or even should be, transmitted instantly into the universal, public realm.
There are many things to applaud about a world where more people can have voices and communication among groups is easier. Twitter proved an invaluable tool for the protesters in Iran, spreading word of public gatherings and police movements. And the mass proliferation of blogs reminds me, at least in some ways, of the dawning of the cable television era, when niche groups suddenly found programming targeted specifically to them.
There's also nothing inherently evil about any of the new communication methods or technologies. But most advances come with some kind of tradeoff, and anything used to excess begins to be problematic--as some people are beginning to find out with an all-shared, all-the-time lifestyle. One clear issue is preservation of privacy--especially for those who didn't volunteer to be part of a global chat-room. But living a reported life 24 hours a day also has a cost, not only in terms of the type and quality of the interactions and conversations it allows, but also in terms of how present the reporters are in any given moment.
It's possible to both experience and report an event, but not instantaneously or simultaneously. When I had an assignment to fly a U-2 spy plane last fall, high enough to see the curvature of the earth, I got so preoccupied with taking photos and notes that I realized, part-way through the flight, that I wasn't actually experiencing any of it with any real depth. And to write anything of substance, I needed to first experience something of substance. So I turned off my intercom microphone, put the camera down, and just sat for a while. Looked out the window. Focused on what my senses were experiencing. Let my mind wander and my eyes drink in my surroundings. And in the richness of that silence, impressions softly bloomed. Of how fragile the world's atmosphere appeared. How being that high above the earth felt as if we were surreptitious invaders at the edge of a foreign realm ruled by powerful titans who needed no heat, air pressure or oxygen to survive. Of how lonely even a beautiful planet would be without anyone to welcome you home again.
And even those thoughts, put so cleanly into words here, took time to process and ferment, once I came back to earth.
New technology is all well and good. But the fact remains that it's tough to talk and listen at the same time, or be connected to an outside audience (even that of posterity) and still be fully immersed in the place, time, and dynamic of where you are. And that's a long-standing law of physics--or at least of human neuroscience and psychology--that I doubt any new technology is likely to overcome.
Should you drink more coffee? Should you take melatonin? Can you train yourself to need less sleep? A physician’s guide to sleep in a stressful age.
During residency, I worked hospital shifts that could last 36 hours, without sleep, often without breaks of more than a few minutes. Even writing this now, it sounds to me like I’m bragging or laying claim to some fortitude of character. I can’t think of another type of self-injury that might be similarly lauded, except maybe binge drinking. Technically the shifts were 30 hours, the mandatory limit imposed by the Accreditation Council for Graduate Medical Education, but we stayed longer because people kept getting sick. Being a doctor is supposed to be about putting other people’s needs before your own. Our job was to power through.
The shifts usually felt shorter than they were, because they were so hectic. There was always a new patient in the emergency room who needed to be admitted, or a staff member on the eighth floor (which was full of late-stage terminally ill people) who needed me to fill out a death certificate. Sleep deprivation manifested as bouts of anger and despair mixed in with some euphoria, along with other sensations I’ve not had before or since. I remember once sitting with the family of a patient in critical condition, discussing an advance directive—the terms defining what the patient would want done were his heart to stop, which seemed likely to happen at any minute. Would he want to have chest compressions, electrical shocks, a breathing tube? In the middle of this, I had to look straight down at the chart in my lap, because I was laughing. This was the least funny scenario possible. I was experiencing a physical reaction unrelated to anything I knew to be happening in my mind. There is a type of seizure, called a gelastic seizure, during which the seizing person appears to be laughing—but I don’t think that was it. I think it was plain old delirium. It was mortifying, though no one seemed to notice.
How Vladimir Putin is making the world safe for autocracy
Since the end of World War II, the most crucial underpinning of freedom in the world has been the vigor of the advanced liberal democracies and the alliances that bound them together. Through the Cold War, the key multilateral anchors were NATO, the expanding European Union, and the U.S.-Japan security alliance. With the end of the Cold War and the expansion of NATO and the EU to virtually all of Central and Eastern Europe, liberal democracy seemed ascendant and secure as never before in history.
Under the shrewd and relentless assault of a resurgent Russian authoritarian state, all of this has come under strain with a speed and scope that few in the West have fully comprehended, and that puts the future of liberal democracy in the world squarely where Vladimir Putin wants it: in doubt and on the defensive.
The same part of the brain that allows us to step into the shoes of others also helps us restrain ourselves.
You’ve likely seen the video before: a stream of kids, confronted with a single, alluring marshmallow. If they can resist eating it for 15 minutes, they’ll get two. Some do. Others cave almost immediately.
This “Marshmallow Test,” first conducted in the 1960s, perfectly illustrates the ongoing war between impulsivity and self-control. The kids have to tamp down their immediate desires and focus on long-term goals—an ability that correlates with their later health, wealth, and academic success, and that is supposedly controlled by the front part of the brain. But a new study by Alexander Soutschek at the University of Zurich suggests that self-control is also influenced by another brain region—and one that casts this ability in a different light.
Why extreme wealth makes it hard for people to do better than their parents did.
The numbers are sobering: People born in the 1940s had a 92 percent chance of earning more than their parents did at age 30. For people born in the 1980s, by contrast, the chances were just 50-50.
The finding comes from a new paper out of The Equality of Opportunity Project, a joint research effort of Harvard and Stanford led by the economist Raj Chetty. The paper puts numbers on what many have seen firsthand for years: The American dream—the ability to climb the economic ladder and achieve more than one’s parents did—is less and less a reality with every decade that goes by.
There are two main reasons why today’s 30-somethings have a harder time than their parents did, according to the authors. First, the expansion of the gross domestic product has slowed since the 1950s, when growth was frequently above 5 percent a quarter. That means the economic pie is growing at a slower rate than it once did, so there’s less to go around. Second, the distribution of that growth is more unequal, and more benefits are accruing to those at the top. Those at the bottom, on the other hand, are not able to achieve as big a share as they once did. Their wages are not growing, so they are stuck at the same level as, or below, their parents. “Because incomes have been stagnant for a relatively large proportion of society, it’s harder for people who stay within that chunk to beat their parents in absolute terms,” Robert Manduca, one of the paper’s co-authors, told me.
Why the ingrained expectation that women should desire to become parents is unhealthy
In 2008, Nebraska decriminalized child abandonment. The move was part of a "safe haven" law designed to address increased rates of infanticide in the state. As with other safe-haven laws, parents in Nebraska who felt unprepared to care for their babies could drop them off in a designated location without fear of arrest and prosecution. But legislators made a major logistical error: They failed to set an age limit for dropped-off children.
Within just weeks of the law passing, parents started dropping off their kids. But here's the rub: None of them were infants. A couple of months in, 36 children had been left in state hospitals and police stations. Twenty-two of the children were over 13 years old. A 51-year-old grandmother dropped off a 12-year-old boy. One father dropped off his entire family -- nine children from ages one to 17. Others drove from neighboring states to drop off their children once they heard that they could abandon them without repercussion.
His paranoid style paved the road for Trumpism. Now he fears what’s been unleashed.
Glenn Beck looks like the dad in a Disney movie. He’s earnest, geeky, pink, and slightly bulbous. His idea of salty language is "bullcrap."
The atmosphere at Beck’s Mercury Studios, outside Dallas, is similarly soothing, provided you ignore the references to genocide and civilizational collapse. In October, when most commentators considered a Donald Trump presidency a remote possibility, I followed audience members onto the set of The Glenn Beck Program, which airs on Beck’s website, theblaze.com. On the way, we passed through a life-size replica of the Oval Office as it might look if inhabited by a President Beck, complete with a portrait of Ronald Reagan and a large Norman Rockwell print of a Boy Scout.
A chain helmed by the nominee for labor secretary has unseated Chick-Fil-A as the perfect encapsulation of this cultural moment.
Despite his predilections for KFC or taco bowls, or his appearances in ads for Pizza Hut and McDonald’s, the president-elect is really a Carl’s Jr. kind of guy. The California-based chain is best known for its oversized burgers, hypersexualized ads, and confusing affiliation with Hardee’s—the fast-food chain it acquired back in 1997. Like Trump, Carl’s Jr. aspires to flashiness and brashly appeals to men. Its slogan? Eat Like You Mean It. Trump made this unspoken kinship official on Thursday, when he announced Andy Puzder, the longtime CEO of Carl’s Jr. and Hardee’s, as his choice for labor secretary.
A professor of cognitive science argues that the world is nothing like the one we experience through our senses.
As we go about our daily lives, we tend to assume that our perceptions—sights, sounds, textures, tastes—are an accurate portrayal of the real world. Sure, when we stop and think about it—or when we find ourselves fooled by a perceptual illusion—we realize with a jolt that what we perceive is never the world directly, but rather our brain’s best guess at what that world is like, a kind of internal simulation of an external reality. Still, we bank on the fact that our simulation is a reasonably decent one. If it weren’t, wouldn’t evolution have weeded us out by now? The true reality might be forever beyond our reach, but surely our senses give us at least an inkling of what it’s really like.
A report will be shared with lawmakers before Trump’s inauguration, a top advisor said Friday.
Updated at 2:20 p.m.
President Obama asked intelligence officials this week to perform a “full review” of election-related hacking, and plans to share a report of its findings with lawmakers before he leaves office on January 20, 2017.
Deputy White House Press Secretary Eric Schultz said Friday that the investigation will reach all the way back to 2008, and will examine patterns of “malicious cyber-activity timed to election cycles.” He emphasized that the White House is not questioning the results of the November election.
Asked whether a sweeping investigation could be completed in the time left in Obama’s final term—just six weeks—Schultz replied that intelligence agencies will work quickly, because preparing the report is “a major priority for the president of the United States.”
Civic participation offers a way out of the 2016 doldrums.
For anyone still in a post-election stupor, unsure what to do or how to repair our ailing democracy, here are three words of advice:
Start a club.
I don’t mean that sarcastically, as in, “Oh, you got a beef with Trump or the rest of them in Washington? Well, join the club!” I mean it literally. Make a group. Invite people. Create rules and rituals. Establish goals. Meet regularly. In short: Start a club.
This is the great democratic self-cure sitting right before our eyes. I was reminded of this immediately after the election, when so many people I knew were in states of shock or despondence. At Citizen University, the nonprofit I run, my colleagues and I decided that doing something was better than doing nothing. We accelerated plans for a project called Civic Saturday, which we’d been intending to launch in the new year but instead launched four days after Donald Trump was elected president.