After years of too little attention, the subject of head injuries in sports, and how to prevent them, is now what Twitter would call a "trending topic."
First came the turnaround in attitudes toward NFL player head injuries, and the helmet-to-helmet tackles and hits that increase the risk of those injuries. Then came the discussion about skier Lindsey Vonn's continued participation in the World Cup last week, despite clear indications and admissions on her part that she was still skiing "in a fog" after suffering a concussion in a training accident. And now, there's U.S. Lacrosse debating whether the girls -- who currently are required to wear only protective eye gear -- should be required to wear helmets as well.
Girls' lacrosse has dramatically different rules than the boys' game: body checks are illegal, as are certain stick checks, and there is a regulated safety zone around each girl's head. Nevertheless, research quoted in a New York Times article today concluded that when it comes to concussions, lacrosse ranks third in female sports (behind basketball and soccer). In addition, despite the less-aggressive nature and rules of the girls' game, girls' lacrosse has an in-game concussion rate only 15 percent lower than the boys'.
So if concussions are an issue in girls' lacrosse, the argument goes, we should require girls to wear more protective headgear. After all, the boys' helmets, intended to reduce skull fracture and intracranial bleeding, are thought to reduce the number of concussions, as well.
But does the addition of extra safety gear actually reduce the risk of the injuries it is designed to prevent? Well, yes ... and no. Which is what fuels the debate on the issue.
Taken by itself, it's easy enough to prove that wearing a helmet, like wearing a seat belt, decreases the chance or severity of injury in an impact. But humans are far more complex creatures than crash test dummies. And so the true impact of safety equipment becomes far more complex, as well.
In his 1995 book Risk, British researcher John Adams spelled out several reasons why safety equipment does not always increase safety the way its designers or legislators think it will. The first is a phenomenon called "risk compensation," in which humans respond to additional safety equipment by taking greater risks than they did when they felt less protected. For example, Adams said, while seat belts unquestionably gave a person better protection if they were in a collision, the chances of being in a collision went up in places with seat belt laws, because seat-belted drivers took more risks in how they drove.
For all the time and discussion space we devote to the goal of eliminating accidents or injuries, Adams suggests that people have "risk thermostats," and that we all adjust our behavior to maintain the level of risk in our lives that we find acceptable. We all compensate for the extra margin provided by safety equipment to some degree, and some of us will push the new boundaries further than others. All of which means that safety equipment often doesn't make as much of a difference as its proponents believe it will.
Indeed, there are many who argue that mandatory helmets, and increasingly strong helmets, have actually exacerbated the problem of head injury in sports ranging from boys' lacrosse and ice hockey to professional football. So perhaps helmets for female lacrosse players really are a bad idea, as U.S. Lacrosse (the sport's governing body) argues.
So what's the solution? In many cases, improving safety has had more to do with changing a group's culture and attitudes about high-risk activities than it does any specific technological advance -- especially in individual sports or hobbies.
A prominent example is the Cirrus Design company, profiled by James Fallows in his Atlantic article and subsequent book Free Flight. In an effort to build a safer aircraft, Cirrus included a full-airplane parachute and vastly improved "glass" cockpit displays in its Cirrus airplane. But when the airplane was first introduced, it actually had a significantly higher-than-average fatality rate, because pilots -- comforted by the extra technology and safety systems -- "compensated" by pushing the aircraft into weather they wouldn't otherwise have attempted. In the end, the company was able to bring its accident rates down by requiring additional training and working to change the culture of its buyers -- at least to some degree.
The field of scuba diving also vastly reduced its accident rate over several decades by changing its group attitudes toward risk. Once upon a time, diving was a macho sport where the toughest regularly pushed the limits. Today, attitudes about pushing the limits have changed. Dive without a buddy, or push your depth or time limits, and you're likely to be seen as stupid, not brave.
Notably, the NFL is now taking a similar approach toward head injuries. Instead of simply improving the cushioning in players' helmets, the NFL is trying to change the league's culture, rules and consequences related to hits to the head, or tackles "leading" with a player's helmet. How well that works remains to be seen, of course. But the popular image and standard for what's "admirable" and "acceptable" in tackling technique has already changed dramatically, even in the breathtakingly short span of a single season.
But girls' lacrosse already has a restrictive set of rules regarding contact. And most of the concussions its players suffer come from accidental contact and falls, not intentionally aggressive maneuvering. So is it a different case? Could helmets actually make it safer?
"I think helmets encourage you to push the limits of whatever the rules are," one high school athlete responded, when I asked the question. "If you're only allowed one kind of hit, you'll hit as hard as you can in that one way. But given that girls' lacrosse has so many rules restricting contact, [helmets] might actually help."
Of course, given the complexities of how humans assess and respond to risk, and the fact that lacrosse players are unlikely to be timid or risk-averse by nature, it's also a fair bet that whatever safety margin helmets provide would -- at best -- be narrowed by some amount by compensating behavior on the part of the players. Which means at some point in the future, U.S. Lacrosse, like Cirrus and the NFL, may find itself compensating for that compensation through more complex solutions than the seemingly simple answer of a helmet.
Angela Merkel has served formal notice that she will lead the German wandering away from the American alliance.
Seven years after the end of the Second World War, on the 10th of March 1952, the governments of the United States, the United Kingdom, France, and the newly established Federal Republic of Germany received an astounding note from the Soviet Union.
The Soviet Union offered to withdraw the troops that then occupied eastern Germany and to end its rule over the occupied zone. Germany would be reunited under a constitution that allowed the country freedom to choose its own social system. Germany would even be allowed to rebuild its military, and all Germans except those convicted of war crimes would regain their political rights. In return, the Allied troops in western Germany would also be withdrawn—and reunited Germany would be forbidden to join the new NATO alliance.
As Republicans in Congress try to fend off the flurry of scandals, they are haunted by a question: Is this as good as it’s going to get?
The speaker of the House strode to his lectern on a recent Thursday to confront another totally normal day on Capitol Hill: health care, tax reform, a president under investigation, rumblings of impeachment.
“Morning, everybody!” Paul Ryan chirped. “Busy week!”
It was indeed: Less than a day had passed since the appointment of a special prosecutor to investigate Russia’s involvement in the presidential campaign; just a few hours since President Trump angrily tweeted that the investigation was “the single greatest witch hunt of a politician in American history!”; and only minutes since the Russia-linked former national-security adviser, Michael Flynn, had begun defying congressional subpoenas. A few days prior, the president had been accused of revealing sensitive intelligence information to the Russian foreign minister.
Should you drink more coffee? Should you take melatonin? Can you train yourself to need less sleep? A physician’s guide to sleep in a stressful age.
During residency, I worked hospital shifts that could last 36 hours, without sleep, often without breaks of more than a few minutes. Even writing this now, it sounds to me like I'm bragging or laying claim to some fortitude of character. I can't think of another type of self-injury that might be similarly lauded, except maybe binge drinking. Technically the shifts were 30 hours, the mandatory limit imposed by the Accreditation Council for Graduate Medical Education, but we stayed longer because people kept getting sick. Being a doctor is supposed to be about putting other people's needs before your own. Our job was to power through.
The shifts usually felt shorter than they were, because they were so hectic. There was always a new patient in the emergency room who needed to be admitted, or a staff member on the eighth floor (which was full of late-stage terminally ill people) who needed me to fill out a death certificate. Sleep deprivation manifested as bouts of anger and despair mixed in with some euphoria, along with other sensations I’ve not had before or since. I remember once sitting with the family of a patient in critical condition, discussing an advance directive—the terms defining what the patient would want done were his heart to stop, which seemed likely to happen at any minute. Would he want to have chest compressions, electrical shocks, a breathing tube? In the middle of this, I had to look straight down at the chart in my lap, because I was laughing. This was the least funny scenario possible. I was experiencing a physical reaction unrelated to anything I knew to be happening in my mind. There is a type of seizure, called a gelastic seizure, during which the seizing person appears to be laughing—but I don’t think that was it. I think it was plain old delirium. It was mortifying, though no one seemed to notice.
She lived with us for 56 years. She raised me and my siblings without pay. I was 11, a typical American kid, before I realized who she was.
The ashes filled a black plastic box about the size of a toaster. It weighed three and a half pounds. I put it in a canvas tote bag and packed it in my suitcase this past July for the transpacific flight to Manila. From there I would travel by car to a rural village. When I arrived, I would hand over all that was left of the woman who had spent 56 years as a slave in my family’s household.
In his new book, Ben Sasse has identified the right project for America: rehabilitating a shared moral language.
In just two short years, Senator Ben Sasse has gone from Capitol Hill newbie to digital president puncher, tweeting about Donald Trump’s affairs and the Midwestern dumpster fires he found more appealing than 2016’s Oval Office contenders.
Yet, on his breaks from Twitter, Sasse managed to craft a serious new book, The Vanishing American Adult. It advances a thesis that’s at once out of place at this political moment and almost too on-the-nose for the Trump years: He believes Americans have lost their sense of personal integrity and discipline. For the country to deal with the troubles ahead—including automation, political disengagement, and the rise of nativist, huckster politicians, he says—people must recover their sense of virtue. The republic depends on it.
In the next two months, Congress will have to raise the debt ceiling and pass a budget. GOP leaders don’t know how they’re going to do either of them.
There's nothing that united Republicans more tightly during the Obama years than their shared criticism of all the debt racked up on the president's watch. They raised political hell every time Democrats needed to raise the debt ceiling, and in 2011 they brought the country to the brink of default by insisting on spending cuts and reforms in exchange for their votes.
This year, however, it’s all on them.
Trump administration officials told lawmakers this week that the Treasury Department would need authority to issue more debt earlier than expected this year, urging Congress to act before its traditional summer recess begins in August. Republican leaders initially believed they would have until the fall before the Treasury Department exhausted the "extraordinary measures" it undertakes to buy more time, but Trump's budget director, Mick Mulvaney, testified that tax receipts have come in slower than expected.
Facing reported financial problems and allegations of abuse, the once-bankable star now seems stuck in franchise hell with no obvious exit.
When Johnny Depp sailed onscreen in 2003’s Pirates of the Caribbean: The Curse of the Black Pearl as Captain Jack Sparrow (to this day, a memorable superhero entrance), it was his first-ever appearance in a summer blockbuster. He’d been in surprise wintertime hits (Edward Scissorhands, Sleepy Hollow), well-regarded Oscar players (Donnie Brasco, Chocolat), and, of course, many a cult classic (Fear and Loathing in Las Vegas, Ed Wood). But the idea of Depp headlining a big-budget, mainstream franchise film was alarming enough to Disney’s then-studio head Michael Eisner that he protested, on seeing early footage, that Depp was “ruining the movie!”
Fourteen years later, Disney is serving up a fifth Pirates of the Caribbean, this time subtitled Dead Men Tell No Tales, budgeted at a cool $230 million. Since bursting into international superstardom with the first Pirates, Depp has become increasingly reliant on mega-budgeted action films and broad comedies. At the same time, his public profile has collapsed after his now ex-wife Amber Heard accused him of domestic violence during their divorce, and stories emerged of the mega-budgeted lifestyle that had somehow mired Depp in deep financial trouble despite his movie earnings.
A century and a half after the Civil War, Mayor Mitch Landrieu asked his city to reexamine its past—and to wrestle with hard truths.
Mayor Mitch Landrieu of New Orleans has revived the genre of Memorial Day orations. In his widely read and replayed speech of May 19, 2017, defending his leadership of the removal of four prominent public monuments, one to Reconstruction-era white supremacist violence, and the other three to Confederate leaders, Robert E. Lee, Jefferson Davis, and P. G. T. Beauregard, Landrieu eloquently tried to pull the Confederacy once and for all – at least in New Orleans – down from its pedestals. He beautifully labeled his city "a bubbling cauldron of many cultures," expressing its ancient roots in many Native American peoples; in at least two European empires; in African, Irish, Italian, French, and many other ethnic lineages; and of course in cuisine, jazz and "second lines." New Orleans, he said, is a city made by all the nations of the world, but one great "gumbo" made from many. The speech was as deeply patriotic as it was also deeply political: "e pluribus unum" carries a weight right now in Trump's America that makes most politicians shy from such fulsome embraces of pluralism and brutally honest historical consciousness. Indeed, any historical consciousness, save for toxic forms of nostalgia, is out of style among Trump's supporters as well as his cowed, silent enablers in the Republican Party.
The condition has long been considered untreatable. Experts can spot it in a child as young as 3 or 4. But a new clinical approach offers hope.
This is a good day, Samantha tells me: 10 on a scale of 10. We’re sitting in a conference room at the San Marcos Treatment Center, just south of Austin, Texas, a space that has witnessed countless difficult conversations between troubled children, their worried parents, and clinical therapists. But today promises unalloyed joy. Samantha’s mother is visiting from Idaho, as she does every six weeks, which means lunch off campus and an excursion to Target. The girl needs supplies: new jeans, yoga pants, nail polish.
At 11, Samantha is just over 5 feet tall and has wavy black hair and a steady gaze. She flashes a smile when I ask about her favorite subject (history), and grimaces when I ask about her least favorite (math). She seems poised and cheerful, a normal preteen. But when we steer into uncomfortable territory—the events that led her to this juvenile-treatment facility nearly 2,000 miles from her family—Samantha hesitates and looks down at her hands. “I wanted the whole world to myself,” she says. “So I made a whole entire book about how to hurt people.”
Colleges are adjusting to increasing contact with adults who are more ingrained in their children’s lives than ever.
Stacy G.'s daughter was having a meltdown. A sophomore at a prestigious private college, she wanted an internship at Boston Children's Hospital, a plum job that would look great on her applications to graduate school. After four weeks of frantically waiting for the school to arrange an interview at the hospital, Stacy called her daughter's adviser at the internships office to complain.
“For $65,000 [in full attendance costs], you can bet your sweet ass that I’m calling that school ... If your children aren’t getting what they’ve been promised, colleges are going to get that phone call from parents,” Stacy said. “It’s my money. It’s a lot of money. We did try to have her handle it on her own, but when it didn’t work out, I called them.”