One of the great mysteries of the Toyota debacle is why the company ignored the complaints for so long. Or at least it's a mystery to reporters on cable news, abetted by consumer advocates who were all too happy to imply that Toyota didn't care how many people it killed as long as it made a profit.
Maybe so, but I doubt it; you don't usually make a profit by killing your customers. It's too risky, in this age of nosy regulators and angry consumer activists.
Toyota's behavior becomes a bit more explicable when you consider this argument from Ted Frank:
The Los Angeles Times recently did a story detailing all of the NHTSA reports of Toyota "sudden acceleration" fatalities, and, though the Times did not mention it, the ages of the drivers involved were striking.
In the 24 cases where driver age was reported or readily inferred, the drivers included those of the ages 60, 61, 63, 66, 68, 71, 72, 72, 77, 79, 83, 85, 89--and I'm leaving out the son whose age wasn't identified, but whose 94-year-old father died as a passenger.
These "electronic defects" apparently discriminate against the elderly, just as the sudden acceleration of Audis and GM autos did before them. (If computers are going to discriminate against anyone, they should be picking on the young, who are more likely to take up arms against the rise of the machines and future Terminators).
In the original Sudden Acceleration Incident craze that afflicted America in the late eighties, the National Highway Traffic Safety Administration eventually ruled that the problem was "pedal misapplication", aka stepping on the gas when you meant to step on the brake. These incidents were highly correlated with three things: being elderly, being short, and parking (or leaving a parking space). The elderly are more prone to the sort of neuronal misfiring described in yesterday's New York Times. Shorter people have to hunt more for the pedals. And starting up from a complete stop is the most likely time to press the wrong pedal.
I was interested in Frank's argument, so I took a look at the LA Times article, which is admirably thorough. Here are the results, categorized into a nifty, if not necessarily useful, spreadsheet. I went one better than Frank, tracking down the ages of all but a couple of the named drivers. If y'all wondered why I wasn't blogging today, well, there's your answer. I've excluded three cases where the information was too sparse to tell what happened, but otherwise, that's the complete list.
Several things are striking. First, the age distribution really is extremely skewed. The overwhelming majority are over 55.
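To put a rough number on that skew, here's a minimal sketch in Python, using only the thirteen ages quoted in Frank's excerpt above; my spreadsheet has a few more entries, so treat the output as illustrative rather than definitive.

```python
# Ages quoted in Ted Frank's excerpt above (the cases where driver age
# was reported or readily inferred). Illustrative only: the full
# spreadsheet contains a few additional entries.
ages = [60, 61, 63, 66, 68, 71, 72, 72, 77, 79, 83, 85, 89]

median_age = sorted(ages)[len(ages) // 2]  # 13 values, so a true middle element
share_over_55 = sum(a > 55 for a in ages) / len(ages)

print(f"n = {len(ages)}, median age = {median_age}")
print(f"share over 55 = {share_over_55:.0%}")  # every quoted age qualifies
```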
Here's what else you notice: a slight majority of the incidents involved someone parking, pulling out of a parking space, sitting in stop-and-go traffic, or waiting at a light or stop sign . . . in other words, probably starting up from a complete stop.
In many of the other cases, we don't really know what happened, because there were no witnesses to exactly when the car started to run away.
In fact, it's a little hard to be sure that some of the cases were sudden acceleration incidents at all, because the witnesses to what happened in the car were all killed; the families are trying to reconstruct events from their knowledge of the deceased. Obviously, most people will err on the side of believing that the car was at fault, rather than a beloved relative.
Further complicating matters, most of the cases involve a lawsuit against Toyota, a complainant facing possible criminal charges, or both.
In some of the cases, the police or doctors have an alternate theory of what happened: one of the drivers was bipolar, which puts you at extraordinarily high risk of suicide, and no one knows what actually happened in the car. At least two others involve young men who were driving at very high speed, which is something young men tend to do with or without a sticky accelerator. Several more of the drivers seem to have had a medical event, like a stroke, to which doctors and/or police attribute the acceleration.
The oddest "striking" fact is that a disproportionate number seem to be immigrants--something like a third, by my count, which is about double the number of immigrants in the general population. I have no idea what to make of that; are they more likely to file complaints with the NHTSA? Maybe they're shorter, on average, or learned to drive later in life? Or perhaps it's just a statistical fluke.
At any rate, when you look at these incidents all together, it's pretty clear why Toyota didn't investigate this "overwhelming evidence" of a problem: they look a lot like typical cases of driver error. I don't know that all of them are. But I do know that however advanced Toyota's electronics are, they're not yet clever enough to pick on senior citizens.
Unfortunately, that won't help Toyota much. It will still face a wave of lawsuits, and all the negative publicity means that it may be hard for the company to get a fair trial. Even if it does, the verdict in the court of public opinion will still hurt its sales for some time to come.