In the 1840s, a French economist named Frédéric Bastiat wrote:
In the economic sphere, an act, a habit, an institution, a law produces not only one effect, but a series of effects. Of these effects, the first alone is immediate; it appears simultaneously with its cause; it is seen. The other effects emerge only subsequently; they are not seen. There is only one difference between a good economist and a bad one: the bad economist confines himself to the visible effect; the good economist takes into account both the effect that can be seen and those which must be foreseen.
In layman's terms, this is the law of unintended consequences, and it plays out, like Murphy's Law, in more spheres than just economics. And while not all unintended consequences are negative, we notice them most when an attempt to improve something ends up with an unexpected counter-effect. The saying "the road to Hell is paved with good intentions" refers not only to those who think of doing good but don't act, but also to those who think they're acting to a good end but end up causing harm.
The conundrum of the law is that, in many cases, the two types of effects are too closely linked to separate out cleanly. Eliminating the unintended negative consequence would require eliminating the positive effect, as well.
The first time I went to Sudan, for example, I interviewed aid workers and pilots who were flying relief supplies into regions of the country that had been decimated by 18 years of civil war. Without the supplies, people would die. But the local population had also grown dependent on the handouts, and some of the aid was being stolen by troops and helping to support continued fighting. What do you do in a situation like that? In that case, the need to stave off death by starvation was deemed more important than the subtler problems of stolen food and long-term economic impact.
But the issue gets stickier when the "seen" effect isn't addressing a need that's quite so dire or immediate. Take the case of a second-hand bookseller in Salisbury, England, who claims he was put out of business by Oxfam--a non-profit organization that, ironically, was one of the groups sending supplies into war-torn Sudan.
Oxfam does a lot of good work in the world. The United Nations camps for Darfur refugees I visited a couple of years ago in eastern Chad had been set up and were being run by Oxfam personnel who were sacrificing a lot to be there. Doing that work, of course, requires money. U.N. contracts supply part of the organization's operating budget, but Oxfam also relies heavily on charitable donations. According to a recent New York Times article on the subject, Oxfam receives $500 million a year in support from the British government as well. Like many charitable entities, from Goodwill to local hospital foundations, Oxfam also runs a series of shops where it sells donated goods. The proceeds help to support its development and aid programs around the world. It's a win-win for everyone -- donors get a tax break, starving children in Africa get food and clean water.
But here's the sticky part. Oxfam has opened up 130 used book stores around Europe, which bring in a reported $32 million a year ... and are competing with small, mom-and-pop used booksellers in the same neighborhoods. Oxfam's storefronts are renovated, clean, and similarly designed and decorated ... investments it can afford to make, because it has government support, volunteer workers, and tax-deductible, donated products. So it has a market advantage because of its special status as a non-profit organization--an advantage that at least a couple of booksellers claim has put them out of business.
The Oxfam spokesperson quoted in the Times article seemed a tad insensitive, at best, when he shrugged and quipped "Independent candle makers don't have the business they once had either. And if someone's business model is so marginal that an Oxfam shop opening nearby decimates it, then we are not the problem." This, mind you, from an organization that deals almost exclusively with people around the world whose "business models" are so marginal that they would not survive at all without outside assistance.
Marc Harrison, a former Catholic priest who had to close his second-hand bookstore when he couldn't pay his mortgage this past summer, accused Oxfam of "destroying lives here to save them elsewhere."
It's true, of course, that Oxfam's proceeds go to a good cause instead of into personal pockets--although part of those proceeds goes toward the salaries of its worldwide personnel. It's also hard to argue that a former priest who has to close his second-hand bookshop because he can't pay his mortgage is a greedy capitalist. I would wager, in fact, that one doesn't open a second-hand bookstore for the golden profits it's going to garner, any more than people open animal shelters for the good, easy money involved. It's more about preserving something considered precious and finding orphans good second homes. And while the world is not fair, and businesses often have an edge over a competitor because of more favorable loan or other business terms, the Oxfam case does seem to represent particularly unfair competition.
It's an argument that has been raised before, in many different sectors. In trade negotiations in the aerospace industry, Boeing argued that Airbus had an unfair edge because of its government subsidies; Airbus argued back that Boeing had benefited from NASA's research, which was a subsidy of a different sort. And NASA itself has been accused of unfair competition in soliciting new business to try to shore up its ever-changing and unsteady Congressional funding. NASA had always allowed private corporations to use its test facilities for a fee, but the fee used to be less than what other commercial test centers charged, because much of the overhead was covered by civil-servant salaries. Private industry objected, and NASA ceded the point, changing to a system of "full cost accounting" that put its costs at a level more comparable to those of private entities.
But it's easier to make those adjustments in a field where business is done by contract pricing. It would be harder to implement that kind of "level-playing-field" shift in the used bookstore market. The used clothing industry--also populated by many non-profit organizations--has a small commercial component, as well, but most for-profit "consignment stores" (the upmarket term for a used clothing outlet) tend to be pickier about the quality of their products, to differentiate themselves from the everyday thrift stores. They also offer a cut of the proceeds to people who might otherwise donate their clothing to a non-profit outlet.
Perhaps booksellers could follow the same model, although the profit margin may not be big enough for that to create much incentive in the used book industry. But regardless, the practice of non-profits generating funds through commercial means--while a staple of support for charitable organizations for many years--can sometimes unintentionally cross into muddy, gray areas of commerce, fairness, and collateral damage. Successfully navigating the lines between good works, self-sustaining funding, and commercial competition is a tough challenge. And a solution that preserves the good benefits while avoiding the negative side-consequences may prove as elusive in Salisbury as it did in Sudan.
Non-profit organizations do a tremendous amount of good in the world. But the irony remains that a good intention, and even really good work, can sometimes carry with it "unseen" and unintended consequences. At home, as well as abroad.
FEMA Administrator Craig Fugate on why the Katrina response failed, why it’s important to talk about “survivors” instead of “victims,” and why citizens can’t just wait for the government to save them in a huge disaster
The man who made computers personal was a genius and a jerk. A new documentary wonders whether his legacy can accommodate both realities.
An iPhone is a machine much like any other: motherboard, modem, microphone, microchip, battery, wires of gold and silver and copper twisting and snaking, the whole assembly arranged under a piece of glass whose surface—coated with an oxide of indium and tin to make it electrically conductive—sparks to life at the touch of a warm-blooded finger. But an iPhone, too, is much more than a machine. The neat ecosystem that hums under its heat-activated glass holds grocery lists and photos and games and jokes and news and books and music and secrets and the voices of loved ones and, quite possibly, every text you’ve ever exchanged with your best friend. Thought, memory, empathy, the stuff we sometimes shorthand as “the soul”: There it all is, zapping through metal whose curves and coils were designed to be held in a human hand.
By continuing to tinker with the universe she built, eight years after bringing it to a close, J.K. Rowling might be falling into the same trap as Star Wars’s George Lucas.
September 1st, 2015 marked a curious footnote in Harry Potter marginalia: According to the series’s elaborate timeline, rarely referenced in the books themselves, it was the day James S. Potter, Harry’s eldest son, started school at Hogwarts. It’s not an event directly written about in the books, nor one of particular importance, but their creator, J.K. Rowling, dutifully took to Twitter to announce the details: that James was sorted into House Gryffindor, just like his father, to the disappointment of Teddy Lupin, Harry’s godson, apparently a Hufflepuff.
It’s not earth-shattering information that Harry’s kid would end up in the same house his father was in, and the Harry Potter series’s insistence on sorting all of its characters into four broad personality quadrants largely based on their family names has always struggled to stand up to scrutiny. Still, Rowling’s tweet prompted much garment-rending among the books’ devoted fans. Can a tweet really amount to a piece of canonical information for a book? There isn’t much harm in Rowling providing these little embellishments years after her books were published, but even idle tinkering can be a dangerous path to take, with the obvious example being the insistent tweaks wrought by George Lucas on his Star Wars series.
In the name of emotional well-being, college students are increasingly demanding protection from words and ideas they don’t like. Here’s why that’s disastrous for education—and mental health.
Something strange is happening at America’s colleges and universities. A movement is arising, undirected and driven largely by students, to scrub campuses clean of words, ideas, and subjects that might cause discomfort or give offense. Last December, Jeannie Suk wrote in an online article for The New Yorker about law students asking her fellow professors at Harvard not to teach rape law—or, in one case, even use the word violate (as in “that violates the law”) lest it cause students distress. In February, Laura Kipnis, a professor at Northwestern University, wrote an essay in The Chronicle of Higher Education describing a new campus politics of sexual paranoia—and was then subjected to a long investigation after students who were offended by the article and by a tweet she’d sent filed Title IX complaints against her. In June, a professor protecting himself with a pseudonym wrote an essay for Vox describing how gingerly he now has to teach. “I’m a Liberal Professor, and My Liberal Students Terrify Me,” the headline said. A number of popular comedians, including Chris Rock, have stopped performing on college campuses (see Caitlin Flanagan’s article in this month’s issue). Jerry Seinfeld and Bill Maher have publicly condemned the oversensitivity of college students, saying too many of them can’t take a joke.
Fractured by internal conflict and foreign intervention for centuries, Afghanistan made several tentative steps toward modernization in the mid-20th century. In the 1950s and 1960s, some of the biggest strides were made toward a more liberal and westernized lifestyle, while trying to maintain a respect for more conservative factions. Though officially a neutral nation, Afghanistan was courted and influenced by the U.S. and Soviet Union during the Cold War, accepting Soviet machinery and weapons, and U.S. financial aid. This was a brief, relatively peaceful era, when modern buildings were constructed in Kabul alongside older traditional mud structures, when burqas became optional for a time, and the country appeared to be on a path toward a more open, prosperous society. Progress was halted in the 1970s, as a series of bloody coups, invasions, and civil wars began; those conflicts continue to this day and have reversed almost all of the steps toward modernization taken in the 1950s and '60s. Keep in mind, when looking at these images, that the average life expectancy for Afghans born in 1960 was 31, so the vast majority of those pictured have likely passed on since.
In Beijing, China marked the 70th anniversary of the end of World War II, and its role in defeating Japan, by holding an enormous military parade and declaring a new national holiday. The spectacle involved more than 12,000 troops, 500 pieces of military hardware, and 200 aircraft of various types, representing what military officials said was the Chinese military's most cutting-edge technology. While the entire event was a show of strength, Chinese officials insisted the message was about peace, with the logo displayed on posters featuring an image of a dove.
According to Franklin, what mattered in business was humility, restraint, and discipline. But today’s Type-A MBAs would find him qualified for little more than a career in middle management.
When he retired from the printing business at the age of 42, Benjamin Franklin set his sights on becoming what he called a “Man of Leisure.” To modern ears, that title might suggest Franklin aimed to spend his autumn years sleeping in or stopping by the tavern, but to colonial contemporaries, it would have intimated aristocratic pretension. A “Man of Leisure” was typically a member of the landed elite, someone who spent his days fox hunting and affecting boredom. He didn’t have to work for a living, and, frankly, he wouldn’t dream of doing so.
Having worked as a successful shopkeeper with a keen eye for investments, Franklin had earned his leisure. But rather than cultivate the fine arts of indolence, he declared retirement “time for doing something useful.” Hence, the many activities of Franklin’s retirement: scientist, statesman, and sage, as well as one-man civic society for the city of Philadelphia. His post-employment accomplishments earned him the sobriquet of “The First American” in his own lifetime, and yet, for succeeding generations, the endeavor that was considered his most “useful” was the working life he left behind when he embarked on a life of leisure.
Some people see threats even when none are present. Strangely, it can make them more creative.
For much of his life, Isaac Newton seemed like he was on the verge of a nervous breakdown. In 1693, the collapse finally arrived: After not sleeping for five days straight, Newton sent letters accusing his friends of conspiring against him. He was refraining from publishing books, he said at one point that year, “for fear that disputes and controversies may be raised against me by ignoramuses.”
Newton was, by many accounts, highly neurotic. Brilliant, but neurotic nonetheless. He was prone to depressive jags, mistrust, and angry outbursts.
Strangely, his genius might have been rooted in his maladjustments. His mental state led him to brood over problems relentlessly, and eventually, a breakthrough would dawn. “I keep the subject constantly before me,” he once said, “and wait till the first dawnings open slowly, by little and little, into a full and clear light.”
I traveled to every country on earth. In some cases, the adventure started before I could get there.
Last summer, my Royal Air Maroc flight from Casablanca landed at Malabo International Airport in Equatorial Guinea, and I completed a 50-year mission: I had officially, and legally, visited every recognized country on earth.
This means 196 countries: the 193 members of the United Nations, plus Taiwan, Vatican City, and Kosovo, which are not members but are, to varying degrees, recognized as independent countries by other international actors.
In five decades of traveling, I’ve crossed countries by rickshaw, pedicab, bus, car, minivan, and bush taxi; a handful by train (Italy, Switzerland, Moldova, Belarus, Ukraine, Romania, and Greece); two by riverboat (Gabon and Germany); Norway by coastal steamer; Gambia and the Amazonian parts of Peru and Ecuador by motorized canoe; and half of Burma by motor scooter. I rode completely around Jamaica on a motorcycle and Nauru on a bicycle. I’ve also crossed three small countries on foot (Vatican City, San Marino, and Liechtenstein), and parts of others by horse, camel, elephant, llama, and donkey. I confess that I have not visited every one of the 7,107 islands in the Philippine archipelago or most of the more than 17,000 islands constituting Indonesia, but I’ve made my share of risky voyages on the rickety inter-island rustbuckets you read about in the back pages of the Times under headlines like “Ship Sinks in Sulu Sea, 400 Presumed Lost.”
The Islamic State is no mere collection of psychopaths. It is a religious group with carefully considered beliefs, among them that it is a key agent of the coming apocalypse. Here’s what that means for its strategy—and for how to stop it.
What is the Islamic State?
Where did it come from, and what are its intentions? The simplicity of these questions can be deceiving, and few Western leaders seem to know the answers. In December, The New York Times published confidential comments by Major General Michael K. Nagata, the Special Operations commander for the United States in the Middle East, admitting that he had hardly begun figuring out the Islamic State’s appeal. “We have not defeated the idea,” he said. “We do not even understand the idea.” In the past year, President Obama has referred to the Islamic State, variously, as “not Islamic” and as al-Qaeda’s “jayvee team,” statements that reflected confusion about the group, and may have contributed to significant strategic errors.
When Kenneth Jarecke photographed an Iraqi man burned alive, he thought it would change the way Americans saw the Gulf War. But the media wouldn’t run the picture.
The Iraqi soldier died attempting to pull himself up over the dashboard of his truck. The flames engulfed his vehicle and incinerated his body, turning him to dusty ash and blackened bone. In a photograph taken soon afterward, the soldier’s hand reaches out of the shattered windshield, which frames his face and chest. The colors and textures of his hand and shoulders look like those of the scorched and rusted metal around him. Fire has destroyed most of his features, leaving behind a skeletal face, fixed in a final rictus. He stares without eyes.
On February 28, 1991, Kenneth Jarecke stood in front of the charred man, parked amid the carbonized bodies of his fellow soldiers, and photographed him. At one point, before he died this dramatic mid-retreat death, the soldier had had a name. He’d fought in Saddam Hussein’s army and had a rank and an assignment and a unit. He might have been devoted to the dictator who sent him to occupy Kuwait and fight the Americans. Or he might have been an unlucky young man with no prospects, recruited off the streets of Baghdad.