Something to read while waiting for Congress to vote on the debt ceiling.
It happened gradually—and until the U.S. figures out how to treat the problem, it will only get worse.
It’s 2020, four years from now. The campaign is under way to succeed the president, who is retiring after a single wretched term. Voters are angrier than ever—at politicians, at compromisers, at the establishment. Congress and the White House seem incapable of working together on anything, even when their interests align. With lawmaking at a standstill, the president’s use of executive orders and regulatory discretion has reached a level that Congress views as dictatorial—not that Congress can do anything about it, except file lawsuits that the divided Supreme Court, its three vacancies unfilled, has been unable to resolve.
On Capitol Hill, Speaker Paul Ryan resigned after proving unable to pass a budget, or much else. The House burned through two more speakers and one “acting” speaker, a job invented following four speakerless months. The Senate, meanwhile, is tied in knots by wannabe presidents and aspiring talk-show hosts, who use the chamber as a social-media platform to build their brands by obstructing—well, everything. The Defense Department is among hundreds of agencies that have not been reauthorized, the government has shut down three times, and, yes, it finally happened: The United States briefly defaulted on the national debt, precipitating a market collapse and an economic downturn. No one wanted that outcome, but no one was able to prevent it.
As it’s moved beyond the George R.R. Martin novels, the series has evolved both for better and for worse.
Well, that was more like it. Sunday night’s Game of Thrones finale, “The Winds of Winter,” was the best episode of the season—the best, perhaps, in a few seasons. It was packed full of major developments—bye, bye, Baelor; hello, Dany’s fleet—but still found the time for some quieter moments, such as Tyrion’s touching acceptance of the role of Hand of the Queen. I was out of town last week and thus unable to take my usual seat at our Game of Thrones roundtable. But I did have some closing thoughts about what the episode—and season six in general—told us about how the show has evolved.
Last season, viewers got a limited taste—principally in the storylines in the North—of how the show would be different once showrunners Benioff and Weiss ran out of material from George R.R. Martin’s novels and had to set out on their own. But it was this season in which that exception truly became the norm. Though Martin long ago supplied Benioff and Weiss with a general narrative blueprint of the major arcs of the story, they can no longer rely on the books scene by scene. Game of Thrones is truly their show now. And thanks to changes in pacing, character development, and plot streamlining, it’s also a markedly different show from the one we watched in seasons one through four—for the worse and, to some degree, for the better.
How much do you really need to say to put a sentence together?
Just as fish presumably don’t know they’re wet, many English speakers don’t know that the way their language works is just one of endless ways it could have come out. It’s easy to think that what one’s native language puts words to, and how, reflects the fundamentals of reality.
But languages are strikingly different in the level of detail they require a speaker to provide in order to put a sentence together. In English, for example, here’s a simple sentence that comes to my mind for rather specific reasons related to having small children: “The father said ‘Come here!’” This statement specifies that there is a father, that he conducted the action of speaking in the past, and that he indicated the child should approach him at the location “here.” What else would a language need to do?
On both sides of the Atlantic—in the United Kingdom and the United States—political parties are realigning and voters’ allegiances are shifting.
When United Kingdom voters last week narrowly approved a referendum to leave the European Union, they underscored again how an era of unrelenting economic and demographic change is shifting the axis of politics across much of the industrialized world from class to culture.
Contrary to much initial speculation, the victory for the U.K. leave campaign didn’t point toward victory in the U.S. presidential election for Donald Trump, who is voicing very similar arguments against globalization and immigration. The British results, in fact, underscored the obstacles facing his agenda of defensive nationalism in the vastly more diverse U.S. electorate.
But the Brexit referendum did crystallize deepening cultural fault lines in U.K. politics that are also likely to shape the contest between Trump and Hillary Clinton. In that way, the results prefigure both a continuing long-term realignment in the electoral base of each American party—and a possible near-term reshuffle of the tipping-point states in presidential politics.
Readers share their own experiences in an ongoing series.
Prompted by Emma Green’s note on the Supreme Court case Whole Woman’s Health v. Hellerstedt, for which a group of lawyers filed a document openly describing their abortions, readers share their own stories in an ongoing collection edited by Chris Bodenner. We are posting a wide range of experiences—from pro-choice and pro-life readers, women and men alike—so if you have an experience not represented so far, please send us a note: firstname.lastname@example.org.
People in Great Britain felt their leaders weren’t treating them fairly. Politicians in the U.S. should take note.
Britain’s Brexit vote has shocked the political elites of both the U.S. and Europe. The vote wasn’t just about the EU; in fact, polls before the referendum consistently showed that Europe wasn’t at the top of voters’ lists of concerns. But on both sides of the Atlantic Ocean, large numbers of people feel that the fundamental contracts of capitalism and democracy have been broken. In a capitalist economy, citizens tolerate rich people if they share in the wealth, and in a democracy, they give their consent to be governed if those governing do so in their interest. The Brexit vote was an opportunity for people to tell elites that both promises have been broken. The most effective line of the Leave campaign was “take back control.” It is also Donald Trump’s line.
Trump and others vow to pull out of the TPP and beef up tariffs, but that wouldn’t stop companies from continuing to move jobs to where labor is cheapest.
Free trade is largely to blame for the decline of America’s middle class. That’s the increasingly popular sentiment embraced by politicians like Donald Trump, Bernie Sanders, and even, to some degree, Hillary Clinton, in the run-up to the 2016 presidential election. In a speech Tuesday, Trump went the furthest yet, vowing to rip up past trade deals and renegotiate other trade agreements “to get a better deal for our workers.”
But is it really trade deals like the North American Free Trade Agreement (NAFTA) that have caused manufacturing jobs to disappear? Or is it just the predilection of companies to go where labor is the cheapest, whether or not there are trade deals? To be sure, some trade deals might have incentivized companies that were on the fence to go overseas. And the U.S. could have handled those departures better, focusing more on retraining the workers who lost their jobs. But at the end of the day, reversing trade policy won’t bring jobs back to America if there are other places in the world where goods can be made with cheaper labor. And unless politicians want to ask Americans to work for a few dollars a day, which is the going rate in many developing countries, those aren’t jobs that would create middle-class wages anyway.
Astronomers say they have discovered an ancient astronomical tool, potentially used by prehistoric humans for stargazing rituals.
Telescopes as we know them today trace their origins back to the Enlightenment. The earliest such devices emerged about 400 years ago. But humankind has fashioned environments for stargazing for far longer than that.
Scholars have long speculated about the astronomical orientation of the Pyramids at Giza, for instance, and the possibility that Stonehenge was built to be a celestial observatory.
Now, there’s evidence of ancient telescopic structures that date back even farther, to about 6,000 years ago. Astronomers are exploring ancient tombs in Portugal that they believe may have been used by prehistoric humans to enhance specific views of the night skies. Researchers are focusing on the alignment of the stars with megalithic tombs—stone structures known as dolmens that feature long narrow entrances that act as apertures, essentially zooming in on stars and planets that wouldn’t always be visible from the outside. “These structures could therefore have been the first astronomical tools to support the watching of the skies, millennia before telescopes were invented,” the Royal Astronomical Society wrote in a statement announcing the research on Wednesday.
Their degrees may help them secure entry-level jobs, but to advance in their careers, they’ll need much more than technical skills.
American undergraduates are flocking to business programs, and finding plenty of entry-level opportunities. But when businesses go hunting for CEOs or managers, “they will say, a couple of decades out, that I’m looking for a liberal arts grad,” said Judy Samuelson, executive director of the Aspen Institute’s Business and Society Program.
That presents a growing challenge to colleges and universities. Students are clamoring for degrees that will help them secure jobs in a shifting economy, but to succeed in the long term, they’ll require an education that allows them to grow, adapt, and contribute as citizens—and to build successful careers. And it’s why many schools are shaking up their curricula to ensure that undergraduate business majors receive something they may not even know they need—a rigorous liberal-arts education.
American society increasingly mistakes intelligence for human worth.
As recently as the 1950s, possessing only middling intelligence was not likely to severely limit your life’s trajectory. IQ wasn’t a big factor in whom you married, where you lived, or what others thought of you. The qualifications for a good job, whether on an assembly line or behind a desk, mostly revolved around integrity, work ethic, and a knack for getting along—bosses didn’t routinely expect college degrees, much less ask to see SAT scores. As one account of the era put it, hiring decisions were “based on a candidate having a critical skill or two and on soft factors such as eagerness, appearance, family background, and physical characteristics.”
The 2010s, in contrast, are a terrible time to not be brainy. Those who consider themselves bright openly mock others for being less so. Even in this age of rampant concern over microaggressions and victimization, we maintain open season on the nonsmart. People who’d swerve off a cliff rather than use a pejorative for race, religion, physical appearance, or disability are all too happy to drop the s‑bomb: Indeed, degrading others for being “stupid” has become nearly automatic in all forms of disagreement.
The common definition of "clean" might be detrimental to our skin.
A rock monster tries to save a village from destruction.
Wading into an important existential quagmire