One of the great mysteries of the Toyota debacle is why Toyota ignored the complaints for so long. Or at least it's a mystery to reporters on cable news, abetted by consumer advocates who were all too happy to imply that Toyota didn't care how many people it killed as long as it made a profit.
Maybe so, but I doubt it; you don't usually make a profit by killing your customers. It's too risky, in this age of nosy regulators and angry consumer activists.
Toyota's behavior becomes a bit more explicable when you consider this argument from Ted Frank:
The Los Angeles Times recently did a story detailing all of the NHTSA reports of Toyota "sudden acceleration" fatalities, and, though the Times did not mention it, the ages of the drivers involved were striking.
In the 24 cases where driver age was reported or readily inferred, the drivers included those of the ages 60, 61, 63, 66, 68, 71, 72, 72, 77, 79, 83, 85, 89--and I'm leaving out the son whose age wasn't identified, but whose 94-year-old father died as a passenger.
These "electronic defects" apparently discriminate against the elderly, just as the sudden acceleration of Audis and GM autos did before them. (If computers are going to discriminate against anyone, they should be picking on the young, who are more likely to take up arms against the rise of the machines and future Terminators).
In the original Sudden Acceleration Incident craze that afflicted America in the late eighties, the National Highway Traffic Safety Administration eventually ruled that the problem was "pedal misapplication," aka stepping on the gas when you meant to step on the brake. These incidents were highly correlated with three things: being elderly, being short, and parking (or leaving a parking space). The elderly are more prone to the sort of neuronal misfiring described in yesterday's New York Times. Shorter people have to hunt more for the pedals. And starting up from a complete stop is the most likely time to press the wrong pedal.
I was interested in Frank's argument, so I took a look at the LA Times article, which is admirably thorough. Here are the results, categorized into a nifty, though not necessarily useful, spreadsheet. I went one step further than Frank, tracking down the ages of all but a couple of the named drivers. If y'all wondered why I wasn't blogging today, well, there's your answer. I've excluded three cases where the information was just too sparse to have any idea what happened, but otherwise, that's the complete list.
Several things are striking. First, the age distribution really is extremely skewed. The overwhelming majority of the drivers are over 55.
Here's what else you notice: a slight majority of the incidents involved someone parking, pulling out of a parking space, sitting in stop-and-go traffic, or waiting at a light or stop sign . . . in other words, probably starting up from a complete stop.
In many of the other cases, we don't really know what happened, because there were no witnesses to exactly when the car started to run away.
In fact, it's a little hard to be sure that some of the cases were sudden acceleration incidents at all, because the witnesses to what happened in the car were all killed; the family is trying to reconstruct events from their knowledge of the deceased. Obviously, most people are going to err on the side of believing that the car, rather than a beloved relative, was at fault.
Further complicating matters, most of the cases involve a lawsuit against Toyota, a complainant facing possible criminal charges, or both.
In some of the cases, the police or doctors have an alternate theory of what happened: one of the drivers was bipolar, which puts you at extraordinarily high risk of suicide, and no one knows what actually happened in the car. At least two others involve young men who were driving at very high speed, which is something that young men tend to do with or without a sticky accelerator. Several more of the drivers seem to have had a medical event, like a stroke, to which doctors and/or police attribute the acceleration.
The oddest "striking" fact is that a disproportionate number seem to be immigrants--something like a third, by my count, which is about double the number of immigrants in the general population. I have no idea what to make of that; are they more likely to file complaints with the NHTSA? Maybe they're shorter, on average, or learned to drive later in life? Or perhaps it's just a statistical fluke.
At any rate, when you look at these incidents all together, it's pretty clear why Toyota didn't investigate this "overwhelming evidence" of a problem: they look a lot like typical cases of driver error. I don't know that all of them are. But I do know that however advanced Toyota's electronics are, they're not yet clever enough to pick on senior citizens.
Unfortunately, that won't help Toyota much. It will still face a wave of lawsuits, and all the negative publicity means that it may be hard for the company to get a fair trial. Even if it does, the verdict in the court of public opinion will still hurt its sales for some time to come.
It happened gradually—and until the U.S. figures out how to treat the problem, it will only get worse.
It’s 2020, four years from now. The campaign is under way to succeed the president, who is retiring after a single wretched term. Voters are angrier than ever—at politicians, at compromisers, at the establishment. Congress and the White House seem incapable of working together on anything, even when their interests align. With lawmaking at a standstill, the president’s use of executive orders and regulatory discretion has reached a level that Congress views as dictatorial—not that Congress can do anything about it, except file lawsuits that the divided Supreme Court, its three vacancies unfilled, has been unable to resolve.
On Capitol Hill, Speaker Paul Ryan resigned after proving unable to pass a budget, or much else. The House burned through two more speakers and one “acting” speaker, a job invented following four speakerless months. The Senate, meanwhile, is tied in knots by wannabe presidents and aspiring talk-show hosts, who use the chamber as a social-media platform to build their brands by obstructing—well, everything. The Defense Department is among hundreds of agencies that have not been reauthorized, the government has shut down three times, and, yes, it finally happened: The United States briefly defaulted on the national debt, precipitating a market collapse and an economic downturn. No one wanted that outcome, but no one was able to prevent it.
The results of the referendum are, in theory, not legally binding.
Lest we think the Euroskepticism displayed this week by British voters is new, let me present a scene from the BBC’s Yes, Minister, a comedy about the U.K. civil service’s relationship with a minister. The series ran from 1980 to ’84 (and, yes, it was funny), at a time when the European Union was a mere glint in its founders’ eyes.
The Europe being referred to in the scene is the European Economic Community (EEC), an eventually 12-member bloc established in the late 1950s to bring about greater economic integration among its members.
In many ways, the seeds of the U.K.’s Thursday referendum on its membership in the European Union were sown soon after the country joined the now-defunct EEC in 1973. Then, as now, the ruling Conservative Party and opposition Labour, along with the rest of the country, were deeply divided over the issue. In the run-up to the general election the following year, Labour promised in its manifesto to put the U.K.’s EEC membership to a public referendum. Labour eventually came to power and Parliament passed the Referendum Act in 1975, fulfilling that campaign promise. The vote was held on June 5, 1975, and the result was what the political establishment had hoped for: an overwhelming 67 percent of voters supported the country’s EEC membership.
The city is riding high after the NBA Finals. But with the GOP convention looming, residents are bracing for disappointment.
Cleveland’s in a weird mood.
My son and I attended the Indians game on Father’s Day, the afternoon before Game 7 of the NBA Finals—which, in retrospect, seems like it should be blockbustered simply as The Afternoon Before—when the Cavaliers would take on the Golden State Warriors and bring the city its first major-league sports championship in 52 years.
I am 52 years old. I’ve lived in Northeast Ohio all my life. I know what Cleveland feels like. And it’s not this.
In the ballpark that day, 25,269 of us sat watching a pitcher’s duel, and the place was palpably subdued. The announcer and digitized big-screen signage made no acknowledgement of the city’s excitement over the Cavaliers. There were no chants of “Let’s Go Cavs,” no special seventh-inning-stretch cheer for the Indians’ basketball brothers, who play next door in the Quicken Loans Arena, which in a few weeks will host the Republican National Convention.
American society increasingly mistakes intelligence for human worth.
As recently as the 1950s, possessing only middling intelligence was not likely to severely limit your life’s trajectory. IQ wasn’t a big factor in whom you married, where you lived, or what others thought of you. The qualifications for a good job, whether on an assembly line or behind a desk, mostly revolved around integrity, work ethic, and a knack for getting along—bosses didn’t routinely expect college degrees, much less ask to see SAT scores. As one account of the era put it, hiring decisions were “based on a candidate having a critical skill or two and on soft factors such as eagerness, appearance, family background, and physical characteristics.”
The 2010s, in contrast, are a terrible time to not be brainy. Those who consider themselves bright openly mock others for being less so. Even in this age of rampant concern over microaggressions and victimization, we maintain open season on the nonsmart. People who’d swerve off a cliff rather than use a pejorative for race, religion, physical appearance, or disability are all too happy to drop the s‑bomb: Indeed, degrading others for being “stupid” has become nearly automatic in all forms of disagreement.
The June 23 vote represents a huge popular rebellion against a future in which British people feel increasingly crowded within—and even crowded out of—their own country.
I said goodnight to a gloomy party of Leave-minded Londoners a few minutes after midnight. The paper ballots were still being counted by hand. Only the British overseas territory of Gibraltar had reported final results. Yet the assumption of a Remain victory filled the room—and depressed my hosts. One important journalist had received a detailed briefing earlier that evening on the results of the government’s exit polling: 57 percent for Remain.
The polling industry will be one victim of the Brexit vote. A few days before the vote, I met with a pollster who had departed from the cheap and dirty methods of his peers to perform a much more costly survey for a major financial firm. His results showed a comfortable margin for Remain. Ten days later, anyone who heeded his expensive advice suffered the biggest percentage losses since the 2008 financial crisis.
The Republican candidate is deeply unpopular, and his Democratic rival is promoting her own version of American nationalism.
American commentators have spent the weekend pondering the similarities between Britain’s vote to leave the European Union and America’s impending vote on whether to take leave of its senses by electing Donald Trump. The similarities have been well-rehearsed: The supporters of Brexit—like the supporters of Trump—are older, non-college-educated, non-urban, distrustful of elites, xenophobic, and nostalgic. Moreover, many British commentators discounted polls showing that Brexit might win, just as many American commentators, myself very much included, discounted polls showing that Trump might win the Republican nomination. Brexit may even result in the installation this fall of a new British prime minister, Boris Johnson, who is entertaining, self-promoting, vaguely racist, doughy, and orange. It’s all too familiar.
Why professors, librarians, and politicians are shunning the liberal arts in the name of STEM
I have been going to academic conferences since I was about 12 years old. Not that I am any sort of prodigy—both of my parents are, or were at one point, academics, so I was casually brought along for the ride. I spent the bulk of my time at these conferences in hotel lobbies, transfixed by my Game Boy, waiting for my mother to be done and for it to be dinnertime. As with many things that I was made to do as a child, however, I eventually came to see academic conferences as an integral part of my adult life.
So it was that, last year, I found myself hanging out at the hotel bar at the annual conference of the Modern Language Association, despite the fact that I am not directly involved with academia in any meaningful way. As I sipped my old fashioned, I listened to a conversation between several aging literature professors about the “digital humanities,” which, as far as I could tell, was a needlessly jargonized term for computers in libraries and writing on the Internet. The digital humanities were very “in” at MLA that year. They had the potential, said a white-haired man in a tweed jacket, to modernize and reinvigorate humanistic scholarship, something that all involved seemed to agree was necessary. The bespectacled scholars nodded their heads with solemn understanding, speaking in hushed tones about how they wouldn’t be making any new tenure-track hires that year.
A hotly contested, supposedly ancient manuscript suggests Christ was married. But believing its origin story—a real-life Da Vinci Code, involving a Harvard professor, a onetime Florida pornographer, and an escape from East Germany—requires a big leap of faith.
On a humid afternoon this past November, I pulled off Interstate 75 into a stretch of Florida pine forest tangled with runaway vines. My GPS was homing in on the house of a man I thought might hold the master key to one of the strangest scholarly mysteries in recent decades: a 1,300-year-old scrap of papyrus that bore the phrase “Jesus said to them, My wife.” The fragment, written in the ancient language of Coptic, had set off shock waves when an eminent Harvard historian of early Christianity, Karen L. King, presented it in September 2012 at a conference in Rome.
Never before had an ancient manuscript alluded to Jesus’s being married. The papyrus’s lines were incomplete, but they seemed to describe a dialogue between Jesus and the apostles over whether his “wife”—possibly Mary Magdalene—was “worthy” of discipleship. Its main point, King argued, was that “women who are wives and mothers can be Jesus’s disciples.” She thought the passage likely figured into ancient debates over whether “marriage or celibacy [was] the ideal mode of Christian life” and, ultimately, whether a person could be both sexual and holy.
How the Brexit vote activated some of the most politically destabilizing forces threatening the U.K.
Among the uncertainties unleashed by the Brexit referendum, which early Friday morning heralded the United Kingdom’s coming breakup with the European Union, was what happens to the “union” of the United Kingdom itself. Ahead of the vote, marquee campaign themes included, on the “leave” side, the question of the U.K.’s sovereignty within the European Union—specifically its ability to control migration—and, on the “remain” side, the economic benefits of belonging to the world’s largest trading bloc, as well as the potentially catastrophic consequences of withdrawing from it. Many of the key arguments on either side concerned the contours of the U.K.-EU relationship, and quite sensibly so. “Should the United Kingdom remain a member of the European Union or leave the European Union?” was, after all, the precise question people were voting on.
Patrick Griffin, Bill Clinton’s chief congressional-affairs lobbyist, recalls the lead-up to the bill’s passage in 1994—and the steep political price that followed.
For those who question whether anything will ever be done to curb the use of military-grade weaponry for mass shootings in the United States, history provides some good news—and some bad. The good news is that there is, within the recent past, an example of a president—namely Bill Clinton—who successfully wielded the powers of the White House to institute a partial ban of assault weapons from the nation’s streets. The bad news, however, is that Clinton’s victory proved so costly to him and to his party that it stands as an enduring cautionary tale in Washington about the political dangers of taking on the issue of gun control.
In 1994, Clinton signed into law the Public Safety and Recreational Firearms Use Protection Act, placing restrictions on the number of military features a gun could have and banning large-capacity magazines for consumer use. Given the potent dynamics of Second Amendment politics, it was a signal accomplishment. Yet the story behind the ban has been largely forgotten, partly because it expired in 2004 and partly because the provision was embedded in the larger crime bill.