Last Saturday I did something that I probably should have done a long time ago--I went gallery-hopping with a few buddies through Chelsea. It's really sinful to live in New York for almost ten years, as I have, consider yourself a creative person and not take the ritual tour. At any rate, I was motivated by much of the art I'd seen while away this summer in the Woods. There were all kinds of people there--painters, sculptors, photographers, singers, composers, poets, novelists, essayists, printmakers, architects--really an entire range of folks. In the evenings, an artist would usually give a presentation of their work. I think I understood, at best, 20 percent of what I saw, though I was moved by quite a bit more.
That got me thinking about something I've talked about here--the vast and blissful ignorance of childhood. As a kid, there was so much that I didn't understand. I can remember being five and hearing my Dad say to his friends, "I can dig it, I can dig it" and thinking "Dig what?" That's just a small thing, and there were seemingly hundreds of those small things. And then there were big things--Did the Human Beatbox really have a heart attack? Was Scott La Rock really shot? Is wrestling actually real? What did Gwen Stacy look like? Where does Optimus Prime's trailer really go when he transforms? Why does that girl in pre-Algebra keep punching me in the arm?
Of course, as a kid, I hated having all those questions; I hated the not knowing. I took to imagination as a kind of coping mechanism for my ignorance, in much the same way that early societies took to religion to explain the night. If you can't know what actually happened to Scott La Rock, why not find your father's old Rand McNally atlas, flip to a map of New York, stare really hard at that yellow portion marked "Bronx" in red lettering, and try to divine what happened? You fill in the gaps of what you can't know with your own imagination, and then, some decades later, that filling-in process becomes an essential tool of your life.
When I was in the Woods and saw those presentations, it was that old feeling again. I have no capacity to understand jazz, classical, or opera. But I was lucky enough to be in the company of about twenty fellow artists, all of us assembled to hear this woman sing for an hour, and lucky again to sit among other artists and hear this dude play for an hour. I knew they both were big deals, and when I heard them, I could tell they were enormously talented. But I had no context to explain why. I couldn't tell you, technically, why they were great in the way that I can tell you, technically, why Fitzgerald or Black Thought are great. I was left only with emotion and imagination.
I grew up without the internet, and in that world, where literal truth could not be readily verified, emotion and imagination were often all I had. I want to get back to that feeling, to a place where there are gaping holes in my understanding which do not hunger for literal fact. So I went to Chelsea and saw a lot of stuff that I did not understand. So I went to Chelsea and got unconscious and got uncomfortable.
The piece above is at Slag, and I encourage everyone in the area to see the whole exhibition. (The screen can't really carry the piece's incredible depth and weight.) It's an oil painting, "Funeral," by Mircea Suciu, a dude I'd never heard of. That's my loss. His stuff really stuck with me. But damn if I can tell you why. I'm not even sure I need to know. Sometimes knowing is beside the point.
It happened gradually—and until the U.S. figures out how to treat the problem, it will only get worse.
It’s 2020, four years from now. The campaign is under way to succeed the president, who is retiring after a single wretched term. Voters are angrier than ever—at politicians, at compromisers, at the establishment. Congress and the White House seem incapable of working together on anything, even when their interests align. With lawmaking at a standstill, the president’s use of executive orders and regulatory discretion has reached a level that Congress views as dictatorial—not that Congress can do anything about it, except file lawsuits that the divided Supreme Court, its three vacancies unfilled, has been unable to resolve.
On Capitol Hill, Speaker Paul Ryan resigned after proving unable to pass a budget, or much else. The House burned through two more speakers and one “acting” speaker, a job invented following four speakerless months. The Senate, meanwhile, is tied in knots by wannabe presidents and aspiring talk-show hosts, who use the chamber as a social-media platform to build their brands by obstructing—well, everything. The Defense Department is among hundreds of agencies that have not been reauthorized, the government has shut down three times, and, yes, it finally happened: The United States briefly defaulted on the national debt, precipitating a market collapse and an economic downturn. No one wanted that outcome, but no one was able to prevent it.
It’s the cloudless map’s first major makeover since 2013.
More than 1 billion people use Google Maps every month, making it possibly the most popular atlas ever created. On Monday, it gets a makeover, and its many users will see something different when they examine the planet’s forests, fields, seas, and cities.
Google has added 700 trillion pixels of new data to its service. The new map, which activates this week for all users of Google Maps and Google Earth, consists of orbital imagery that is newer, more detailed, and of higher contrast than the previous version.
Most importantly, this new map contains fewer clouds than before—only the second time Google has unveiled a “cloudless” map. Google had not updated its low- and medium-resolution satellite map in three years.
The way members of the ‘model minority’ are treated in elite-college admissions could affect race-based standards moving forward.
In his new book, Earning Admission: Real Strategies for Getting Into Highly Selective Colleges, the strategist Greg Kaplan urges Asians not to identify as such on their applications. “Your child should decline to state her background if she identifies with a group that is overrepresented on campus even if her name suggests affiliation,” he advises parents, also referencing Jews. Such tips are increasingly common in the college-advising world; it’s not unusual for consultants, according to The Boston Globe, to urge students to “deemphasize the Asianness” in their resumes or avoid writing application essays about their immigrant parents “coming from Vietnam with $2 in a rickety boat and swimming away from sharks.”
It’s not because they’re inherently harsher leaders than men, but because they often respond to sexism by trying to distance themselves from other women.
There are two dominant cultural ideas about the role women play in helping other women advance at work, and they are seemingly at odds: the Righteous Woman and the Queen Bee.
The Righteous Woman is an ideal, a belief that women have a distinct moral obligation to have one another’s backs. This kind of sentiment is best typified by Madeleine Albright’s now famous quote, “There is a special place in hell for women who don’t help each other!” The basic idea is that since all women experience sexism, they should be more attuned to the gendered barriers that other women face. In turn, this heightened awareness should lead women to foster alliances and actively support one another. If women don’t help each other, this is an even worse form of betrayal than those committed by men. And hence, the special place in hell reserved for those women.
Fears of civilization-wide idleness are based too much on the downsides of being unemployed in a society premised on the concept of employment.
People have speculated for centuries about a future without work, and today is no different, with academics, writers, and activists once again warning that technology is replacing human workers. Some imagine that the coming work-free world will be defined by inequality: A few wealthy people will own all the capital, and the masses will struggle in an impoverished wasteland.
A different, less paranoid, and not mutually exclusive prediction holds that the future will be a wasteland of a different sort, one characterized by purposelessness: Without jobs to give their lives meaning, people will simply become lazy and depressed. Indeed, today’s unemployed don’t seem to be having a great time. One Gallup poll found that 20 percent of Americans who have been unemployed for at least a year report having depression, double the rate for working Americans. Also, some research suggests that the explanation for rising rates of mortality, mental-health problems, and addiction among poorly educated, middle-aged people is a shortage of well-paid jobs. Another study shows that people are often happier at work than in their free time. Perhaps this is why many worry about the agonizing dullness of a jobless future.
American society increasingly mistakes intelligence for human worth.
As recently as the 1950s, possessing only middling intelligence was not likely to severely limit your life’s trajectory. IQ wasn’t a big factor in whom you married, where you lived, or what others thought of you. The qualifications for a good job, whether on an assembly line or behind a desk, mostly revolved around integrity, work ethic, and a knack for getting along—bosses didn’t routinely expect college degrees, much less ask to see SAT scores. As one account of the era put it, hiring decisions were “based on a candidate having a critical skill or two and on soft factors such as eagerness, appearance, family background, and physical characteristics.”
The 2010s, in contrast, are a terrible time to not be brainy. Those who consider themselves bright openly mock others for being less so. Even in this age of rampant concern over microaggressions and victimization, we maintain open season on the nonsmart. People who’d swerve off a cliff rather than use a pejorative for race, religion, physical appearance, or disability are all too happy to drop the s‑bomb: Indeed, degrading others for being “stupid” has become nearly automatic in all forms of disagreement.
The 18th-century ailment was on the brink of elimination before budget cuts helped bring it back.
In recent months, newspapers around the country have published stories that sound like they could have been written 100 years ago. Indiana’s syphilis cases skyrocketed by 70 percent in a single year. Texas’ Lubbock County was under a “syphilis alert.” Various counties face shortages of the medication used to treat syphilitic pregnant women.
But the headlines are very much modern—and urgent. Syphilis is back, public-health experts say.
For many years, syphilis was considered a practically ancient ailment—a “Great Pox” that, like tuberculosis or polio, Americans just don’t get anymore. There were just 6,000 cases of primary and secondary syphilis in 2000, and the CDC briefly thought the disease’s total elimination was within reach.
Three Atlantic staffers discuss “The Winds of Winter,” the tenth and final episode of the sixth season.
Every week for the sixth season of Game of Thrones, Christopher Orr, Spencer Kornhaber, and Lenika Cruz discussed new episodes of the HBO drama. Because no screeners were made available to critics in advance this year, we'll be posting our thoughts in installments.
Obama has taken credit for his administration’s deferred-action program. But legally speaking, this challenge was about something else.
In her law-professor days, now-Justice Elena Kagan wrote a much-noted article arguing that presidents should, in effect, take ownership of their administrations’ bureaucratic policymaking. EPA environmental regulation should be embraced as presidential environmental regulation. FDA public-health regulation should be seen as presidential health regulation. Presidents should be encouraged to make regulation their own in both how they engage with the bureaucracy and how they discuss an administration’s regulatory output. She argued: “[P]residential leadership enhances transparency, enabling the public to comprehend more accurately the sources and nature of bureaucratic power.”
United States v. Texas—a challenge to a Department of Homeland Security program to provide undocumented immigrant parents of U.S. citizen children temporary protection against involuntary removal—shows that the opposite is true. Both the media and the public appear confused about “the sources and nature of [DHS’s] power.” Far from promoting public comprehension, President Obama, no doubt abetted by his opponents, has muddled public understanding by aggressively branding the program as his own.
Critics claim British voters were unqualified to decide such a complicated issue. But democracy itself isn’t the problem.
It’s easy, in retrospect, to characterize David Cameron’s decision to hold a referendum on Britain’s EU membership as a colossal blunder, at least from the prime minister’s perspective. The idea was reportedly conceived at a pizza restaurant at Chicago O’Hare airport, an inauspicious place to hatch plans of international consequence. Cameron, by many accounts, promised to stage the vote not because he believed in it, or took it especially seriously, or felt the public was demanding it, but because he wanted to appease right-wing “euroskeptics” in his party ahead of the 2015 election. It worked. Cameron won that election, and soon found himself campaigning for Britain to remain in the European Union. Then a majority of Britons voted to do just the opposite. A disgraced David Cameron now finds himself without a job and his country temporarily without its bearings, in a jolted world. Blunders don’t get much bigger.