I have no special standing to speak on the career of Senator Daniel Inouye of Hawaii, who died yesterday at age 88 -- apart from his having been in public office for the entirety of my conscious life. David Graham did a very good appreciation of Inouye last night on our site. (Wikipedia photo of Lt. Inouye, at roughly age 20.)
But on the principle that you should never pass up the opportunity to give a deserved compliment; with the knowledge that we've come reflexively to view all politicians as unprincipled corner-cutters (a perspective Americans have held through most of our long history); and with the understanding that Inouye's bravest exploits were long enough ago that many Americans would never have heard of them, I wanted to direct attention to the character and dignity of this man.
His bravery during the Second World War was of both the physical and the moral variety. Physical, in the episode for which he won the Medal of Honor. The details of what he did, as set out in the Medal of Honor citation, are almost incredible. His moral courage lay in volunteering to serve, in a segregated Japanese-American unit in the European theater (the famous 442nd Regimental Combat Team), at a time when tens of thousands of Japanese-Americans were being interned as alleged security threats.
Inouye was most clearly in the public eye during the Watergate hearings, which themselves occurred before most of today's U.S. population was born. He was dignified, fair-minded but probing, and non-showboating, in the way we would like to think our senators should always be. Jim Webb, a fellow decorated and wounded combat veteran who served with Inouye these past six years in the Senate, released this statement last night, which rings true.
I deeply regret the passing of Senator Inouye, for whom I had enormous respect as a famed soldier, a principled public servant, and a United States Senator who broke new historical ground with his service. He was a leader whose dignity and judgment caused him to be listened to by politicians of both parties and of all political philosophies. He will be remembered as one of the great Senators of the post-World War Two era. I am grateful for having had the opportunity to serve alongside him.
It is worth recognizing and remembering people who have played positive roles in national life. Their examples might do some good.
UPDATE: Thanks to George Conk for pointing me toward this TPM appreciation of Inouye last night. Conk has video of Inouye's other most prominent moment in the national spotlight, during the Iran-Contra hearings.
Senator Inouye first got my attention in his confrontation with Oliver North (and Brendan Sullivan). His closing statement is a masterful 'we come not to bury Caesar but to praise him.' I've got the video and text links here.
I was living in Southeast Asia during the Iran-Contra hearings, which in those days before worldwide 24/7 cable news meant that I never actually saw them. Inouye's statement on North, shown in video here, is genuinely gripping.
James Fallows is a national correspondent for The Atlantic and has written for the magazine since the late 1970s. He has reported extensively from outside the United States and once worked as President Carter's chief speechwriter. His latest book is China Airborne.
Why haven’t more challengers entered the race to defeat the Iraq War hawk, Patriot Act supporter, and close friend of big finance?
As Hillary Clinton loses ground to Bernie Sanders in Iowa, where her lead shrinks by the day, it’s worth noticing that she has never made particular sense as the Democratic Party’s nominee. She may be more electable than her social-democratic rival from Vermont, but plenty of Democrats are better positioned to represent the center-left coalition. Why have they let the former secretary of state keep them out of the race? If Clinton makes it to the general election, I understand why most Democrats will support her. She shares their views on issues as varied as preserving Obamacare, abortion rights, extending legal status to undocumented workers, strengthening labor unions, and imposing a carbon tax to slow climate change.
Conservatives want to defund the group, even if it means shutting down the government. And they’re holding the GOP leadership accountable.
It has become an annual harbinger of autumn in this era of divided government: The calendar swings from August to September, Congress returns from its long summer break, and Republican leaders try to figure out how to keep the federal lights on past the end of the month.
In 2013, John Boehner gave in to Senator Ted Cruz and his conservative allies in the House, and the government shut down for two weeks in a failed fight over Obamacare. A year ago, Boehner and Mitch McConnell succeeded in twice putting off a losing battle over immigration until after they could wrest control of the Senate from the Democrats.
With federal funding set to expire on September 30, conservatives are once again demanding a standoff that Boehner and McConnell are hell-bent on avoiding. This time around, the issue that might prevent an orderly—if temporary—extension of funding is Planned Parenthood. Along with Cruz, House conservatives insist that any spending bill sent to President Obama’s desk explicitly prohibit taxpayer dollars from going to the women’s health organization, which has come under fire over undercover videos that purportedly show its officials discussing the sale of fetal tissue. Democrats have rallied around Planned Parenthood, and an effort to ax its approximately $500 million in annual funding is likely to fall short, either by running into a filibuster in the Senate or a presidential veto.
In the name of emotional well-being, college students are increasingly demanding protection from words and ideas they don’t like. Here’s why that’s disastrous for education—and mental health.
Something strange is happening at America’s colleges and universities. A movement is arising, undirected and driven largely by students, to scrub campuses clean of words, ideas, and subjects that might cause discomfort or give offense. Last December, Jeannie Suk wrote in an online article for The New Yorker about law students asking her fellow professors at Harvard not to teach rape law—or, in one case, even use the word violate (as in “that violates the law”) lest it cause students distress. In February, Laura Kipnis, a professor at Northwestern University, wrote an essay in The Chronicle of Higher Education describing a new campus politics of sexual paranoia—and was then subjected to a long investigation after students who were offended by the article and by a tweet she’d sent filed Title IX complaints against her. In June, a professor protecting himself with a pseudonym wrote an essay for Vox describing how gingerly he now has to teach. “I’m a Liberal Professor, and My Liberal Students Terrify Me,” the headline said. A number of popular comedians, including Chris Rock, have stopped performing on college campuses (see Caitlin Flanagan’s article in this month’s issue). Jerry Seinfeld and Bill Maher have publicly condemned the oversensitivity of college students, saying too many of them can’t take a joke.
Actually, a good amount: Belittling their plight by comparing it to blue-collar workers’ ignores the trickle-down harms of an exhausting work culture.
Over the past few decades, workers without college degrees have not only seen jobs disappear and wages stagnate—the jobs that remain have, all too often, gotten worse. Constant surveillance is common; schedules are erratic; escalating performance quotas exact ever-faster work. But these trends, often thought to be confined to front-line workers, have crept up corporate hierarchies, affecting managers and executives. That’s prompted a new controversy: Are white-collar workers victims of exploitation, or merely whining?
A devastating report on the work culture at Amazon’s headquarters recently reignited the debate. The New York Times’s August exposé, based on dozens of interviews, portrayed a firm with all the regimentation and rigidity of military boot camp, minus the esprit de corps. Workers routinely cried at their desks. Rather than being comforted or accommodated, sick employees were dumped into Orwellian-sounding “Performance Improvement Plans” that simply hastened their eventual departures. Faced with a comprehensive employee-ranking system, cabals of managers agreed to praise one another while talking down the performance of others. Amazon’s “collaborative feedback tool” encouraged a Panopticon of vicious feedback—and similar software may be coming to many more firms.
But letting customers buy their own would force cable companies to improve their equipment.
One of the least glamorous realities of the American cable industry is a relic invented in 1948: the cable box. The box has become a fixture in the American household, not least because it is surprisingly profitable. Earlier this year, a U.S. Senate study found that American households pay $231 a year on average renting cable boxes. Further, the report estimated that 99 percent of cable customers rented their equipment, and that, across the country, those rentals added up to a $19.5 billion industry.
The senators who commissioned the study, Ed Markey of Massachusetts and Richard Blumenthal of Connecticut, noted that this dependable rental revenue gave the industry little incentive to innovate and make better cable boxes. Which raises a good question: Why aren’t more people purchasing their cable boxes?
The NBC show isn’t casting its net wide enough when it comes to finding new players.
Since the departure of many of its biggest stars two years ago, Saturday Night Live has mostly avoided major cast changes. Yesterday, NBC announced the show would add only one new cast member for its 41st season—the near-unknown stand-up comic Jon Rudnitsky. SNL is, of course, a sketch-comedy show, but it keeps hiring mostly white stand-ups who have a markedly different skill set, with limited results. As critics and viewers keep calling for greater diversity on the show, it’s hard to fathom the series’s reasoning in sticking to old habits.
As is unfortunately typical today, controversy has already arisen over some tasteless old jokes from Rudnitsky’s Twitter and Vine feeds, similar to the furor that greeted Trevor Noah’s hiring at The Daily Show this summer. But Rudnitsky was apparently hired on the back of his stand-up performances, not his Internet presence, similar to the other young stand-ups the show has hired in recent years: Pete Davidson, Brooks Wheelan (since fired), and Michael Che. It’s a peculiar route to the show, because SNL is 90 percent sketch acting, and unless you’re hosting Weekend Update (like Che), you’re not going to do a lot of stand-up material. So why hire Rudnitsky?
I traveled to every country on earth. In some cases, the adventure started before I could get there.
Last summer, my Royal Air Maroc flight from Casablanca landed at Malabo International Airport in Equatorial Guinea, and I completed a 50-year mission: I had officially, and legally, visited every recognized country on earth.
This means 196 countries: the 193 members of the United Nations, plus Taiwan, Vatican City, and Kosovo, which are not members but are, to varying degrees, recognized as independent countries by other international actors.
In five decades of traveling, I’ve crossed countries by rickshaw, pedicab, bus, car, minivan, and bush taxi; a handful by train (Italy, Switzerland, Moldova, Belarus, Ukraine, Romania, and Greece); two by riverboat (Gabon and Germany); Norway by coastal steamer; Gambia and the Amazonian parts of Peru and Ecuador by motorized canoe; and half of Burma by motor scooter. I rode completely around Jamaica on a motorcycle and Nauru on a bicycle. I’ve also crossed three small countries on foot (Vatican City, San Marino, and Liechtenstein), and parts of others by horse, camel, elephant, llama, and donkey. I confess that I have not visited every one of the 7,107 islands in the Philippine archipelago or most of the more than 17,000 islands constituting Indonesia, but I’ve made my share of risky voyages on the rickety inter-island rustbuckets you read about in the back pages of the Times under headlines like “Ship Sinks in Sulu Sea, 400 Presumed Lost.”
Massive hurricanes striking Miami or Houston. Earthquakes leveling Los Angeles or Seattle. Deadly epidemics. Meet the “maximums of maximums” that keep emergency planners up at night.
For years before Hurricane Katrina, storm experts warned that a big hurricane would inundate the Big Easy. Reporters noted that the levees were unstable and could fail. Yet hardly anyone paid attention to these Cassandras until after the levees had broken, the Gulf Coast had been blown to pieces, and New Orleans sat beneath feet of water.
The wall-to-wall coverage afforded to the anniversary of Hurricane Katrina reveals the sway that a deadly act of God or man can hold over people, even 10 years later. But it also raises uncomfortable questions about how well prepared the nation is for the next catastrophe, whether that’s a hurricane or something else. There are plenty of people warning about the dangers that lie ahead, but that doesn’t mean that the average citizen or most levels of government are anywhere near ready for them.
Learning to program involves a lot of Googling, logic, and trial-and-error—but almost nothing beyond fourth-grade arithmetic.
I’m not in favor of anyone learning to code unless she really wants to. I believe you should follow your bliss, career-wise, because most of the things you’d buy with all the money you’d make as a programmer won’t make you happy. Also, if your only reason for learning to code is that you want to be a journalist and you think it’s the only way to break into the field, that premise is false.
I’m all for people not becoming coders, in other words—as long as they make that decision for the right reasons. “I’m bad at math” is not the right reason.
Math has very little to do with coding, especially at the early stages. In fact, I’m not even sure why people conflate the two. (Maybe it has to do with the fact that both fields are male-dominated.)
Many educators are introducing meditation into the classroom as a means of improving kids’ attention and emotional regulation.
A five-minute walk from the rickety, raised track that carries the 5 train through the Bronx, the English teacher Argos Gonzalez balanced a rounded metal bowl on an outstretched palm. His class—a mix of black and Hispanic students in their late teens, most of whom live in one of the poorest districts in New York City—was by now used to the sight of this unusual object: a Tibetan meditation bell.
“Today we’re going to talk about mindfulness of emotion,” Gonzalez said with a hint of a Venezuelan accent. “You guys remember what mindfulness is?” Met with quiet stares, Gonzalez gestured to one of the posters pasted at the back of the classroom, where the students a few weeks earlier had brainstormed terms describing the meaning of “mindfulness.” There were some tentative mumblings: “being focused,” “being aware of our surroundings.”