The press blames black flight from major cities on whites, but history and the numbers show that's not true.
Whenever we talk about gentrification, it really is a good idea not simply to understand who's coming and who's going, but precisely when the coming and going happened. In reference to our conversations around Washington, D.C., it's really important to understand that the black population was falling in the city long before the arrival of hipsters, interlopers, and white people in general.
Washington's black population peaked in 1970 at just over half a million (537,712, to be precise). It has declined steadily ever since, with the biggest decline occurring between 1970 and 1980, when almost 100,000 black people left the city. Whites were also leaving the city by then, but at a much slower rate; the major white out-migration happened in the '50s and '60s.
By 1990, whites had started coming back. But black people, mirroring a national trend, continued to leave. At present there are around 343,000 African-Americans in the District, a smaller number, but still the largest ethnic group in the city. I say this to point out that the idea that incoming whites are "forcing out" large numbers of blacks has yet to be demonstrated.
A slew of newspaper articles assumes the truth of the gentrification thesis. But any proponent of that thesis (explicit or implicit) needs to fully explore and answer the following question: Is white migration into the city forcing black migration back out?
Asserting that this is the case because it "feels true" isn't evidence. Indeed, it's the flip side of blaming white migration to the suburbs on riotous, criminally inclined blacks.
I don't say this so much in defense of hipster interlopers as in opposition to the theory that black people are, solely, the objects of the forces acting upon them. Understanding the vestiges of white supremacy isn't the same as understanding black people. There needs to be a lot more agency in this discussion. There also needs to be a lot less nostalgia.
On that note, I'd mention that "Chocolate City," like most majority-black cities, is a recent innovation, covering the last half of the 20th century. As late as 1950, there were more whites than blacks in Washington, and the city was still gaining white residents. By 1960 (pre-riots, mind you) their numbers were falling precipitously. The shift was seen, at the time, as a bad thing. Still, it would be facile to conclude that the latest shift back is a "good" thing.
More likely, we are using a local matter as an inadequate substitute for a broader national situation that still plagues us. The fact is that the two parties (those blacks who remain by choice or otherwise, and those whites who are returning) are not equal. In the District, you are looking at a black population that is reeling under a cocktail of an ancient wealth gap, poor criminal justice policy, and economic instability. On the other side, you have a well-educated, well-insulated white population with different wants and different needs. There is much more here to consider about what that means, about what people feel like they're losing. Even as I interrogate the statistics, I maintain that people are not stupid, and that it's critically important to understand why they feel as they do. Black people have not owned much in this country. And yet, in the later years of the 20th century, we felt like we owned many of America's great cities.
I suspect much of our present angst can be traced to the lifting of that illusion.
Ta-Nehisi Coates is a national correspondent at The Atlantic, where he writes about culture, politics, and social issues. He is the author of The Beautiful Struggle and the forthcoming Between the World and Me.