The press blames black flight from major cities on whites, but history and the numbers show that's not true.
Whenever we talk about gentrification, it's a good idea to understand not simply who's coming and who's going, but precisely when the coming and going happened. In reference to our conversations around Washington, D.C., it's important to understand that the black population was falling in the city long before the arrival of hipsters, interlopers, and white people in general.
Washington's black population peaked in 1970 at just over half a million (537,712, to be precise). It has declined steadily ever since, with the biggest drop occurring between 1970 and 1980, when almost 100,000 black people left the city. Whites were also leaving the city by then, but at a much slower rate--the major white out-migration happened in the '50s and '60s.
By 1990, whites had started coming back. But black people--mirroring a national trend--continued to leave. At present there are around 343,000 African-Americans in the District: a smaller number, but still the largest ethnic group in the city. I say this to point out that the idea that incoming whites are "forcing out" large numbers of blacks has yet to be demonstrated.
A slew of newspaper articles assume the truth of gentrification. But any proponent of the gentrification thesis (explicit or implicit) needs to fully explore and answer the following question: Is white migration into the city forcing black migration back out?
Speaking as though this is the case because it "feels true" isn't evidence. Indeed it's the flip side of blaming white migration to the suburbs on riotous, criminally inclined blacks.
I don't say this so much in defense of hipster interlopers as I do in opposition to the theory that black people are solely a thing that is acted upon. Understanding the vestiges of white supremacy isn't the same as understanding black people. There needs to be a lot more agency in this discussion. There also needs to be a lot less nostalgia.
On that note, I'd mention that "Chocolate City"--like most majority-black cities--is a recent innovation, covering the last half of the 20th century. As late as 1950, there were more whites than blacks in Washington, and the city was still gaining white residents. By 1960--pre-riots, mind you--their numbers were falling precipitously. The shift was seen, at the time, as a bad thing. Still, it would be facile to conclude that the latest shift back is a "good" thing.
More likely, we are using a local matter as an inadequate substitute for a broader national situation that still plagues us. The fact is that the two parties--those blacks who remain, by choice or otherwise, and those whites who are returning--are not equal. In the District, you are looking at a black population that is reeling under a cocktail of an ancient wealth gap, poor criminal-justice policy, and economic instability. On the other side, you have a well-educated, well-insulated white population with different wants and different needs.
There is much more here to consider about what that means, about what people feel like they're losing. Even as I interrogate the statistics, I maintain that people are not stupid, and that it's critically important to understand why they feel as they do. Black people have not owned much in this country. And yet, in the later years of the 20th century, we felt like we owned many of America's great cities.
I suspect much of our present angst can be traced to the lifting of that illusion.
The condition has long been considered untreatable. Experts can spot it in a child as young as 3 or 4. But a new clinical approach offers hope.
This is a good day, Samantha tells me: 10 on a scale of 10. We’re sitting in a conference room at the San Marcos Treatment Center, just south of Austin, Texas, a space that has witnessed countless difficult conversations between troubled children, their worried parents, and clinical therapists. But today promises unalloyed joy. Samantha’s mother is visiting from Idaho, as she does every six weeks, which means lunch off campus and an excursion to Target. The girl needs supplies: new jeans, yoga pants, nail polish.
At 11, Samantha is just over 5 feet tall and has wavy black hair and a steady gaze. She flashes a smile when I ask about her favorite subject (history), and grimaces when I ask about her least favorite (math). She seems poised and cheerful, a normal preteen. But when we steer into uncomfortable territory—the events that led her to this juvenile-treatment facility nearly 2,000 miles from her family—Samantha hesitates and looks down at her hands. “I wanted the whole world to myself,” she says. “So I made a whole entire book about how to hurt people.”
She lived with us for 56 years. She raised me and my siblings without pay. I was 11, a typical American kid, before I realized who she was.
The ashes filled a black plastic box about the size of a toaster. It weighed three and a half pounds. I put it in a canvas tote bag and packed it in my suitcase this past July for the transpacific flight to Manila. From there I would travel by car to a rural village. When I arrived, I would hand over all that was left of the woman who had spent 56 years as a slave in my family’s household.
Five years ago, on a boat off the southern coast of Sri Lanka, I met the largest animal that exists or has ever existed.
The blue whale grows up to 110 feet in length. Its heart is the size of a small car. Its major artery is big enough that you could wedge a small child into it (although you probably shouldn’t). It’s an avatar of hugeness. And its size is evident if you ever get to see one up close. From the surface, I couldn’t make out the entire animal—just the top of its head as it exposed its blowhole and took a breath. But then, it dove. As its head tilted downwards, its arching back broke the surface of the water in a graceful roll. And it just kept going, and going, and going. By the time the huge tail finally broke the surface, an unreasonable amount of time had elapsed.
A recent push for diversity has been blamed for weak print sales, but the company’s decades-old business practices are the true culprit.
Marvel Comics has been having a rough time lately. Readers and critics met last year’s Civil War II—a blockbuster crossover event (and a spiritual tie-in to the year’s big Marvel movie)—with indifference and scorn. Two years of plummeting print-comics sales culminated in a February during which only one series managed to sell more than 50,000 copies. Three crossover events designed to pump up excitement came and went with little fanfare, while the lead-up to 2017’s blockbuster crossover Secret Empire—in which a fascist Captain America subverts and conquers the United States—sparked such a negative response that the company later put out a statement imploring readers to buy the whole thing before judging it. On March 30, a battered Marvel decided to try to get to the bottom of the problem with a retailer summit—and promptly stuck its foot in its mouth.
The office was, until a few decades ago, the last stronghold of fashion formality. Silicon Valley changed that.
Americans began the 20th century in bustles and bowler hats and ended it in velour sweatsuits and flannel shirts—the most radical shift in dress standards in human history. At the center of this sartorial revolution was business casual, a genre of dress that broke the last bastion of formality—office attire—to redefine the American wardrobe.
Born in Silicon Valley in the early 1980s, business casual consists of khaki pants, sensible shoes, and button-down collared shirts. By the time it was mainstream, in the 1990s, it flummoxed HR managers and employees alike. “Welcome to the confusing world of business casual,” declared a fashion writer for the Chicago Tribune in 1995. With time and some coaching, people caught on. Today, though, the term “business casual” is nearly obsolete for describing the clothing of a workforce that includes many who work from home in yoga pants, put on a clean T-shirt for a Skype meeting, and don’t always go into the office.
Unexpected discoveries in the quest to cure an extraordinary skeletal condition show how medically relevant rare diseases can be.
When Jeannie Peeper was born in 1958, there was only one thing amiss: her big toes were short and crooked. Doctors fitted her with toe braces and sent her home. Two months later, a bulbous swelling appeared on the back of Peeper’s head. Her parents didn’t know why: she hadn’t hit her head on the side of her crib; she didn’t have an infected scratch. After a few days, the swelling vanished as quickly as it had arrived.
When Peeper’s mother noticed that the baby couldn’t open her mouth as wide as her sisters and brothers, she took her to the first of various doctors, seeking an explanation for her seemingly random assortment of symptoms. Peeper was 4 when the Mayo Clinic confirmed a diagnosis: she had a disorder known as fibrodysplasia ossificans progressiva (FOP).
Several studies show beneficiaries of the program are more likely to be obese. But the answer is not to cut benefits, some academics say.
Among the programs President Trump proposed slashing in the budget blueprint he released Tuesday, the Supplemental Nutrition Assistance Program, formerly known as the food stamp program, would lose 29 percent of its funding over 10 years.
Conservative groups praised the budget proposal’s combination of boosted defense spending and cuts to “domestic programs that are redundant, improper, or otherwise wasteful,” as Romina Boccia, a fellow in federal budgetary affairs at the Heritage Foundation, said in a statement. Liberal groups, meanwhile, said it would “harm America's most vulnerable people and make matters worse for those who can least afford it,” as Felicia Wong, president of the Roosevelt Institute, a progressive think tank, put it.
I bought into the St. Ives lie for years. In the already insecure times of high school and college, my skin was host to constant colonies of acne, my nose peppered with blackheads, my chin and forehead a topographical horror of cystic zits that lasted for weeks. But as I moved into adulthood, it didn’t go away, making me, I suppose, part of a trend—adult acne is on the rise, particularly among women.
I’m sure it never really seemed so bad to others as it did to me, as is the way with these things. I covered it up with layers of gloppy foundation, then with more proficiently applied makeup later on, then went on hormonal birth control, which improved the situation significantly.
But for many of the years in between, I washed my face with St. Ives Apricot Scrub, an exfoliator made with granules of walnut-shell powder. It is extremely rough. Perhaps too rough. We’ll find out: Kaylee Browning and Sarah Basile recently filed a class-action lawsuit against St. Ives’s maker, Unilever, alleging that the wash “leads to long-term skin damage” and “is not fit to be sold as a facial scrub.”
The national park wouldn’t let him collect rocks for research.
“How did the Grand Canyon form?” is a question so commonly pondered that YouTube is rife with explanations. Go down into the long tail of Grand Canyon videos, and you’ll eventually find a two-part, 35-minute lecture by Andrew Snelling. The first sign this isn’t a typical geology lecture comes about a minute in, when Snelling proclaims, “The Grand Canyon does provide a testament to the biblical account of Earth’s history.”
Snelling is a prominent young-Earth creationist. For years, he has given lectures, guided biblical-themed Grand Canyon rafting tours, and worked for the nonprofit Answers in Genesis. (The CEO of Answers in Genesis, Ken Ham, is also behind the Creation Museum and the Ark Encounter theme park.) Young-Earth creationism, in contrast to other forms of creationism, specifically holds that the Earth is only thousands of years old. Snelling believes that the Grand Canyon formed after Noah’s flood—and he now claims the U.S. government is blocking his research in the canyon because of his religious views.