A new report from the National Science Board says that when it comes to high-tech industries, the U.S. is still by far the global leader.
There's a common trope that the United States, having been gutted of its manufacturing jobs by the brute force of globalization, is now on the verge of giving up its crown as the world's leading innovator. Yesterday, The Washington Post published an article on our loss of high-tech manufacturing jobs to Asia that played right into the theme. Here was the apocalyptic lede:
The United States lost more than a quarter of its high-tech manufacturing jobs during the past decade as U.S.-based multinational companies placed a growing percentage of their research-and-development operations overseas, the National Science Board reported Tuesday.
Scary stuff. And a bit overwrought.
There's no question that the U.S. needs to be vigilant about its place as the world's research lab. But when it comes to R&D, we're still number one, and the report cited by The Washington Post, "Science and Engineering Indicators 2012," actually helps our case.
In 2009, the United States was responsible for 31% of the world's R&D spending. That was down from 38% in 1999. As the Post dolefully notes, R&D expenditures in "China and nine other Asian countries have risen to match that of the United States." Is that so scary? It takes China, Japan, and eight other high-growth Asian countries just to equal total research spending in the U.S. Are we supposed to mourn the death of a unipolar R&D age?
The picture brightens further once you look at U.S. R&D growth on its own. From 2004 to 2009, it grew from $302 billion to more than $400 billion. Even after the recession, investment (at least, non-residential investment) in the United States has been anything but moribund.
Some American corporations have moved their highly technical design and engineering work to manufacturing centers in Asia, particularly China. But that shift hasn't been dramatic. In 1999, U.S.-based multinationals spent 87.4% of their R&D budgets, about $126 billion, domestically. In 2009, they spent 84.3% of their budgets here, or roughly $199 billion. We're taking a similar slice of a much bigger pie.
So a massive shift in R&D toward Asia is still largely hypothetical. But what about those hundreds of thousands of high-tech manufacturing jobs we've lost? In some industries, such as personal electronics, the losses have been due to the migration of manufacturing to Asia. But that's just a piece of the story. We can also blame the recession and the large jumps in U.S. productivity fueled by technological advances.
In the end, the U.S. still has the world's most robust set of high-tech manufacturing industries, which according to the NSB include pharmaceuticals, communications equipment, computers, aircraft and spacecraft, scientific and measuring equipment, and semiconductors. We may not lead in cell phones these days, but we do well in aircraft and drugs. Measured by value added, the U.S. high-tech sector was worth $390 billion in 2010. Compare that to the second-place European Union, at $270 billion, and third-place China, at $260 billion.
Should we be on guard? Yes. We need to keep investing in education and research to maintain our place in the world of discovery and technology. But, despite some dour headlines, we're still here.
The Islamic State is no mere collection of psychopaths. It is a religious group with carefully considered beliefs, among them that it is a key agent of the coming apocalypse. Here’s what that means for its strategy—and for how to stop it.
What is the Islamic State?
Where did it come from, and what are its intentions? The simplicity of these questions can be deceiving, and few Western leaders seem to know the answers. In December, The New York Times published confidential comments by Major General Michael K. Nagata, the Special Operations commander for the United States in the Middle East, admitting that he had hardly begun figuring out the Islamic State’s appeal. “We have not defeated the idea,” he said. “We do not even understand the idea.” In the past year, President Obama has referred to the Islamic State, variously, as “not Islamic” and as al-Qaeda’s “jayvee team,” statements that reflected confusion about the group, and may have contributed to significant strategic errors.
Two hundred fifty years of slavery. Ninety years of Jim Crow. Sixty years of separate but equal. Thirty-five years of racist housing policy. Until we reckon with our compounding moral debts, America will never be whole.
And if thy brother, a Hebrew man, or a Hebrew woman, be sold unto thee, and serve thee six years; then in the seventh year thou shalt let him go free from thee. And when thou sendest him out free from thee, thou shalt not let him go away empty: thou shalt furnish him liberally out of thy flock, and out of thy floor, and out of thy winepress: of that wherewith the LORD thy God hath blessed thee thou shalt give unto him. And thou shalt remember that thou wast a bondman in the land of Egypt, and the LORD thy God redeemed thee: therefore I command thee this thing today.
— Deuteronomy 15:12–15
Besides the crime which consists in violating the law, and varying from the right rule of reason, whereby a man so far becomes degenerate, and declares himself to quit the principles of human nature, and to be a noxious creature, there is commonly injury done to some person or other, and some other man receives damage by his transgression: in which case he who hath received any damage, has, besides the right of punishment common to him with other men, a particular right to seek reparation.
Writing used to be a solitary profession. How did it become so interminably social?
Whether we’re behind the podium or awaiting our turn, numbing our bottoms on the chill of metal foldout chairs or trying to work some life into our terror-stricken tongues, we introverts feel the pain of the public performance. This is because there are requirements to being a writer. Other than being a writer, I mean. Firstly, there’s the need to become part of the writing “community,” which compels every writer who craves self-respect and success to attend community events, help to organize them, buzz over them, and—despite blitzed nerves and staggering bowels—present and perform at them. We get through it. We bully ourselves into it. We dose ourselves with beta blockers. We drink. We become our own worst enemies for a night of validation and participation.
Bernie Sanders and Jeb Bush look abroad for inspiration, heralding the end of American exceptionalism.
This election cycle, two candidates have dared to touch a third rail in American politics.
Not Social Security reform. Not Medicare. Not ethanol subsidies. The shibboleth that politicians are suddenly willing to discuss is the idea that America might have something to learn from other countries.
The most notable example is Bernie Sanders, who renewed his praise for Western Europe in a recent interview with Ezra Klein. “Where is the UK? Where is France? Germany is the economic powerhouse in Europe,” Sanders said. “They provide health care to all of their people, they provide free college education to their kids.”
On ABC’s This Week in May, George Stephanopoulos asked Sanders about this sort of rhetoric. “I can hear the Republican attack ad right now: ‘He wants America to look more like Scandinavia,’” the host said. Sanders didn’t flinch:
Forget credit hours—in a quest to cut costs, universities are simply asking students to prove their mastery of a subject.
MANCHESTER, Mich.—Had Daniella Kippnick followed in the footsteps of the hundreds of millions of students who have earned university degrees in the past millennium, she might be slumping in a lecture hall somewhere while a professor droned. But Kippnick has no course lectures. She has no courses to attend at all. No classroom, no college quad, no grades. Her university has no deadlines or tenure-track professors.
Instead, Kippnick makes her way through a series of subject areas on the way to a bachelor’s in accounting. When she feels she’s mastered a certain subject, she takes a test at home, where a proctor watches her from afar by monitoring her computer and watching her over a video feed. If she proves she’s competent—by getting the equivalent of a B—she passes and moves on to the next subject.
Most of the big names in futurism are men. What does that mean for the direction we’re all headed?
In the future, everyone’s going to have a robot assistant. That’s the story, at least. And as part of that long-running narrative, Facebook just launched its virtual assistant. They’re calling it Moneypenny—after the secretary from the James Bond films. Which means the symbol of our march forward, once again, ends up being a nod back. In this case, Moneypenny is a throwback to an age when Bond’s womanizing was a symbol of manliness and many women were, no matter what they wanted to be doing, secretaries.
Why can’t people imagine a future without falling into the sexist past? Why does the road ahead keep leading us back to a place that looks like the Tomorrowland of the 1950s? Well, when it comes to Moneypenny, here’s a relevant data point: More than two-thirds of Facebook employees are men. That’s a ratio reflected among another key group: futurists.
Even when a dentist kills an adored lion, and everyone is furious, there’s loftier righteousness to be had.
Now is the point in the story of Cecil the lion—amid non-stop news coverage and passionate social-media advocacy—when people get tired of hearing about Cecil the lion. Even if they hesitate to say it.
But Cecil fatigue is only going to get worse. On Friday morning, Zimbabwe’s environment minister, Oppah Muchinguri, called for the extradition of the man who killed him, the Minnesota dentist Walter Palmer. Muchinguri would like Palmer to be “held accountable for his illegal action”—paying a reported $50,000 to kill Cecil with an arrow after luring him away from protected land. And she’s far from alone in demanding accountability. This week, the Internet has served as a bastion of judgment and vigilante justice—just like usual, except that this was a perfect storm directed at a single person. It might be called an outrage singularity.
On July 16, 1945, the United States Army detonated the world’s first nuclear weapon in New Mexico’s Jornada del Muerto desert.
On July 16, 1945, the United States Army detonated the world’s first nuclear weapon in New Mexico’s Jornada del Muerto desert. The test, code-named “Trinity,” was a success, unleashing an explosion with the energy of about 20 kilotons of TNT and beginning the nuclear age. Since then, nearly 2,000 nuclear tests have been performed. Most of these took place during the 1960s and 1970s. When the technology was new, tests were frequent and often spectacular, and they led to the development of newer, more deadly weapons. Since the 1990s, there have been efforts to limit the testing of nuclear weapons, including a U.S. moratorium and a U.N. comprehensive test ban treaty. As a result, testing has slowed—though not halted—and there are looming questions about who will take over for those experienced engineers who are now near retirement. Gathered here are images from the first 30 years of nuclear testing. (A version of this article first ran here in 2011.)
During the multi-country press tour for Mission: Impossible – Rogue Nation, not even Jon Stewart has dared ask Tom Cruise about Scientology.
During the media blitz for Mission: Impossible – Rogue Nation over the past two weeks, Tom Cruise has seemingly been everywhere. In London, he participated in a live interview at the British Film Institute with the presenter Alex Zane, the movie’s director, Christopher McQuarrie, and a handful of his fellow cast members. In New York, he faced off with Jimmy Fallon in a lip-sync battle on The Tonight Show and attended the Monday night premiere in Times Square. And, on Tuesday afternoon, the actor recorded an appearance on The Daily Show With Jon Stewart, where he discussed his exercise regimen, the importance of a healthy diet, and how he still has all his own hair at 53.
Stewart, who during his career has won two Peabody Awards for public service and the Orwell Award for “distinguished contribution to honesty and clarity in public language,” was the toughest interviewer Cruise faced on the tour, in what has been a difficult year for the actor. In April, HBO broadcast Alex Gibney’s documentary Going Clear, a film based on the book of the same title by Lawrence Wright exploring the Church of Scientology, of which Cruise is a high-profile member. The movie alleges, among other things, that the actor personally profited from slave labor (church members were paid 40 cents an hour to outfit the star’s airplane hangar and motorcycle), and that his former girlfriend, the actress Nazanin Boniadi, was punished by the Church with forced menial work after telling a friend about her relationship troubles with Cruise. For Cruise “not to address the allegations of abuse,” Gibney said in January, “seems to me palpably irresponsible.” But in The Daily Show interview, as with all of Cruise’s other appearances, Scientology wasn’t mentioned.
Some say the so-called sharing economy has gotten away from its central premise—sharing.
This past March, in an up-and-coming neighborhood of Portland, Maine, a group of residents rented a warehouse and opened a tool-lending library. The idea was to give locals access to everyday but expensive garage, kitchen, and landscaping tools—such as chainsaws, lawnmowers, wheelbarrows, a giant cider press, and soap molds—to save unnecessary expense as well as clutter in closets and tool sheds.
The residents had been inspired by similar tool-lending libraries across the country—in Columbus, Ohio; in Seattle, Washington; in Portland, Oregon. The ethos made sense to the Mainers. “We all have day jobs working to make a more sustainable world,” says Hazel Onsrud, one of the Maine Tool Library’s founders, who works in renewable energy. “I do not want to buy all of that stuff.”