I am invested in, and emotionally responsive to, the idea of black people doing for themselves.
Yes. So many of us are. Including me. You can scroll back through the archives and look at my response to Obama's Father's Day speech, to his speech to the NAACP, and his invocation of personal responsibility. For those of us of a particular quasi-nationalist persuasion, the idea hits a sweet spot and lives in a long tradition of "doing for self."
The difference between Obama, on the one hand, and Douglass, Garvey, Ida, and Martin, on the other, is that the latter always paired "do for self" with a solid critique of race in America. When Ida Wells was attacking the "manhood" of black men during the height of the Red Summers, she was just as aggressively inveighing against American racism. Douglass's final autobiography is filled with a moral critique of his own community. But it's obviously paired with a recognition of the history and nature of racism.
Obama is only practically capable of delivering on half of that formula -- and it's a half that's very popular in the larger country. The self-help nationalistic strain of black thought resonates with how Americans think about themselves. Consider this:
For all of Malcolm's invective, his most seductive notion was that of collective self-creation: the idea that black people could, through force of will, remake themselves. Toward the end of his book, Marable tells the story of Gerry Fulcher, a white police officer, who--almost against his will--fell under Malcolm's sway. Assigned to wiretap Malcolm's phone, Fulcher believed Malcolm to be "one of the bad guys," interested in killing cops and overthrowing the government. But his views changed. "What I heard was nothing like I expected," said Fulcher. "I remember saying to myself, 'Let's see, he's right about that ... He wants [blacks] to get jobs. He wants them to get education. He wants them to get into the system. What's wrong with that?'"
Fulcher is a white police officer who should be plotting against Malcolm--but "do for self" resonated with Fulcher. With that said, "do for self" -- divorced from a critique of racism--has the convenient side-effect of letting white people off the hook. This is the version of "do for self" that Obama delivers -- a palatable black nationalism, inoffensive to, and uncritical of, white people. Obama's a smart dude, with a serious knowledge of black history. I suspect he knew what he was doing when he went on his bamboozled riff. I suspect he knows exactly what he's doing now.
I am not unsympathetic to his dilemma. There's simply no way he can be president and be honest with the country about race. The one time he tried it, during Gates-gate, he paid for it.
On top of that, Obama's very presence in the White House has deep symbolic significance to many African-Americans. There's an element of the black Left that would have that symbolism dismissed, and argue that black people have somehow been duped. I think that's wrong. Living with racism is hard. Living with the belief that racism has not changed, and never will change, is even harder. Obama is radical evidence that the latter claim, no matter how much we feel it, is false. That means something.
Of course the result is that Obama gets a pass on policy from black people that Hillary Clinton simply would never enjoy. If you're in the business of pressuring the Democratic Party to be more progressive, this is a source of frustration.
So where does that leave us? Is it wrong for the head of the American government to speak as a black citizen out of convenience? What do we say to the crowds of black people in Beaumont, Texas, who cheer his rhetoric on? I think there's something to be learned there. And yet I also believe in applying pressure.
I watch that clip in Beaumont and laugh. I don't know what that says.
A new anatomical understanding of how movement controls the body’s stress response system
Elite tennis players have an uncanny ability to clear their heads after making errors. They constantly move on and start fresh for the next point. They can’t afford to dwell on mistakes.
Peter Strick is not a professional tennis player. He’s a distinguished professor and chair of the department of neurobiology at the University of Pittsburgh Brain Institute. He’s the sort of person to dwell on mistakes, however small.
“My kids would tell me, ‘Dad, you ought to take up Pilates. Do some yoga,’” he said. “But I’d say, as far as I’m concerned, there’s no scientific evidence that this is going to help me.”
Still, the meticulous skeptic espoused more of a tennis approach to dealing with stressful situations: Just teach yourself to move on. Of course there is evidence that ties practicing yoga to good health, but not the sort that convinced Strick. Studies show correlations between the two, but he needed a physiological mechanism to explain the relationship. Vague conjecture that yoga “decreases stress” wasn’t sufficient. How? Simply by distracting the mind?
No one will ever find a closer exoplanet—now the race is on to see if there is life on its surface.
One hundred and one years ago this October, a Scottish astronomer named Robert Innes pointed a camera at a grouping of stars near the Southern Cross, the defining feature of the night skies above his adopted Johannesburg. He was looking for a small companion to Alpha Centauri, our closest neighboring star system.
Hunched over glass photographic plates, Innes teased out a signal. Across five years of images, a small, faint star moved, wiggling on the sky. It shifted just as much as Alpha Centauri, suggesting its fate was intertwined with that binary system. But this small star was closer to the sun than Alpha. Innes suggested calling it Proxima Centauri, using the Latin word for “nearest.”
The dim red star soon entered the collective imagination, inspiring dreams of interstellar travel. Gravity has linked the star to the Alpha Centauri system, but our culture of science and storytelling has linked it to the solar system. Today, that link will grow stronger, when an international team of astronomers announces that this nearest of stars also hosts the closest exoplanet, one that might look a whole lot like Earth.
Do mission-driven organizations with tight budgets have any choice but to demand long, unpaid hours of their staffs?
Earlier this year, at the encouragement of President Obama, the Department of Labor finalized the most significant update to the federal rules on overtime in decades. The new rules will more than double the salary threshold for guaranteed overtime pay, from about $23,000 to $47,476. Once the rules go into effect this December, millions of employees who make less than that will be guaranteed overtime pay under the law when they work more than 40 hours a week.
Unsurprisingly, some business lobbies and conservatives disparaged the rule as unduly burdensome. But pushback also came from what might have been an unexpected source: a progressive nonprofit called the U.S. Public Interest Research Group (PIRG). “Doubling the minimum salary to $47,476 is especially unrealistic for non-profit, cause-oriented organizations,” U.S. PIRG said in a statement. “[T]o cover higher staffing costs forced upon us under the rule, we will be forced to hire fewer staff and limit the hours those staff can work—all while the well-funded special interests that we're up against will simply spend more.”
City dwellers spend nearly every moment of every day awash in Wi-Fi signals. Homes, streets, businesses, and office buildings are constantly blasting wireless signals every which way for the benefit of nearby phones, tablets, laptops, wearables, and other connected paraphernalia.
When those devices connect to a router, they send requests for information—a weather forecast, the latest sports scores, a news article—and, in turn, receive that data, all over the air. As it communicates with the devices, the router is also gathering information about how its signals are traveling through the air, and whether they’re being disrupted by obstacles or interference. With that data, the router can make small adjustments to communicate more reliably with the devices it’s connected to.
This much is obvious: Young people don’t buy homes like they used to.
In the aftermath of the recession and weak recovery, the share of 18-to-34-year-olds—a.k.a. Millennials—who own a home has fallen to a 30-year low. For the first time on record going back more than a century, young people are now more likely to live with their parents than with a spouse.
It’s become in vogue to argue that young people’s turn against homeownership might be a good thing. After all, houses are not always dependable investment vehicles, a lesson the country learned all too painfully after the Great Recession. Without being anchored to any one city from their mid-20s into their 30s, young people who don’t own are free to roam about the country in search of the best jobs. What’s more, given the copious advantages of a college degree in this economy, perhaps many young people could be commended for investing in their intelligence, professional networks, and abilities rather than devoting that same income to a roof, floor, and furniture.
If his administration gets its way, it would be even easier for future commanders in chief to take military action without approval from Congress.
President Obama has been emphatically warning Americans about the dangers of a Trump presidency. But these warnings divert attention from a much darker reality. His Justice Department is in fact pushing the law in a direction that will enable the next president to declare war against any “terrorist” group or nation without the consent of Congress.
This reality is clear from the Department’s response to a lawsuit challenging the legality of Obama’s war against the Islamic State.
In 1973, Congress passed the War Powers Resolution over President Richard Nixon’s veto. It represented the culmination of a national effort to prevent future presidents from repeating Nixon’s unilateral escalations in Vietnam. The Resolution provides that, when a president commits American forces to a new military engagement, he has 60 days to gain the explicit authorization of Congress for the war. If Congress refuses its consent, the Resolution requires the commander in chief to withdraw his forces from the battlefield within the next 30 days.
Donald Trump’s campaign manager says he’s actually winning, thanks to “undercover” supporters. Plenty of past presidential hopefuls have mistakenly believed the same.
A candidate or operative on a campaign that's losing has three options: despair, acceptance (admit what’s happening and try to fix it), or denial. Right now, the Donald Trump campaign is exhibiting all three.
For despair, there are the staffers who are reportedly “suicidal” inside Trump Tower, and those who have simply quit. For acceptance, Trump himself has admitted he’s in trouble. But newly promoted campaign manager Kellyanne Conway is taking the denial route.
“Donald Trump performs consistently better in online polling where a human being is not talking to another human being about what he or she may do in the election,” she told the British outlet Channel 4. “It’s because it’s become socially desirable, if you’re a college educated person in the United States of America, to say that you’re against Donald Trump.”
A new survey suggests the logistics of going to services can be the biggest barrier to participation—and Americans’ faith in religious institutions is declining.
The standard narrative of American religious decline goes something like this: A few hundred years ago, European and American intellectuals began doubting the validity of God as an explanatory mechanism for natural life. As science became a more widely accepted method for investigating and understanding the physical world, religion became a less viable way of thinking—not just about medicine and mechanics, but also culture and politics and economics and every other sphere of public life. As the United States became more secular, people slowly began drifting away from faith.
Of course, this tale is not just reductive—it’s arguably inaccurate, in that it seems to capture neither the reasons nor the reality behind contemporary American belief. For one thing, the U.S. is still overwhelmingly religious, despite years of predictions about religion’s demise. A significant number of people who don’t identify with any particular faith group still say they believe in God, and roughly 40 percent pray daily or weekly. While there have been changes in this kind of private belief and practice, the most significant shift has been in the way people publicly practice their faith: Americans, and particularly young Americans, are less likely to attend services or identify with a religious group than they have at any time in recent memory.
The U.S. presidential nominee’s anti-Islam rhetoric has motivated some to speak out against stereotypes.
Donald Trump has effectively declared Muslims the enemy, accusing them of shielding terrorists in their midst, pushing to ban them from entering the country, and suggesting that the United States should start thinking seriously about profiling them. In response, some American Muslim women are speaking out against Trump and his anti-Muslim rhetoric.
“I never really felt like I was ‘the other’ until now,” said Mirriam Seddiq, a 45-year-old immigration and criminal-defense lawyer from Northern Virginia who recently started a political-action committee called American Muslim Women. “It’s a strange realization to have, but it’s what motivated me to do this. There are so many misconceptions about Muslim women, and I want to help counter that narrative.”