In the 1840s, a French economist named Frédéric Bastiat wrote:
In the economic sphere, an act, a habit, an institution, a law produces not only one effect, but a series of effects. Of these effects, the first alone is immediate; it appears simultaneously with its cause; it is seen. The other effects emerge only subsequently; they are not seen. There is only one difference between a good economist and a bad one: the bad economist confines himself to the visible effect; the good economist takes into account both the effect that can be seen and those which must be foreseen.
In layman's terms, this is the law of unintended consequences, and it plays out, like Murphy's Law, in more spheres than just economics. And while not all unintended consequences are negative, we notice them most when an attempt to improve something ends up with an unexpected counter-effect. The saying "the road to Hell is paved with good intentions" refers not only to those who think of doing good but don't act, but also to those who think they're acting to a good end but end up causing harm.
The conundrum of the law is that, in many cases, the two types of effects are too closely linked to separate out cleanly. Eliminating the unintended negative consequence would require eliminating the positive effect, as well.
The first time I went to Sudan, for example, I interviewed aid workers and pilots who were flying relief supplies into regions of the country that had been decimated by 18 years of civil war. Without the supplies, people would die. But the local population had also grown dependent on the handouts, and some of the aid was being stolen by troops and helping to support continued fighting. What do you do in a situation like that? In that case, the need to stave off death by starvation was deemed more important than the subtler problems of stolen food and long-term economic impact.
But the issue gets stickier when the "seen" effect isn't addressing a need that's quite so dire or immediate. Take the case of a second-hand bookseller in Salisbury, England, who claims he was put out of business by Oxfam--a non-profit organization that, ironically, was one of the organizations sending supplies into war-torn Sudan.
Oxfam does a lot of good work in the world. The United Nations camps for Darfur refugees I visited a couple of years ago in eastern Chad had been set up and were being run by Oxfam personnel who were sacrificing a lot to be there. Doing that work, of course, requires money. U.N. contracts supply part of the organization's operating budget, but Oxfam also relies heavily on charitable donations. According to a recent New York Times article on the subject, Oxfam also receives $500 million a year in support from the British government. Like many charitable entities, from Goodwill to local hospital foundations, Oxfam also runs a series of shops where it sells donated goods. The proceeds help to support its development and aid programs around the world. It's a win-win for everyone -- donors get a tax break, starving children in Africa get food and clean water.
But here's the sticky part. Oxfam has opened up 130 used book stores around Europe, which bring in a reported $32 million a year ... and are competing with small, mom-and-pop used booksellers in the same neighborhoods. Oxfam's storefronts are renovated, clean, and similarly designed and decorated ... upgrades it can afford to invest in, because it has government support, volunteer workers and tax-deductible, donated products. So it has a market advantage because of its special status as a non-profit organization--an advantage that at least a couple of booksellers claim has put them out of business.
The Oxfam spokesperson quoted in the Times article seemed a tad insensitive, at best, when he shrugged and quipped "Independent candle makers don't have the business they once had either. And if someone's business model is so marginal that an Oxfam shop opening nearby decimates it, then we are not the problem." This, mind you, from an organization that deals almost exclusively with people around the world whose "business models" are so marginal that they would not survive at all without outside assistance.
Marc Harrison, a former Catholic priest who had to close his second-hand bookstore when he couldn't pay his mortgage this past summer, accused Oxfam of "destroying lives here to save them elsewhere."
It's true, of course, that Oxfam's proceeds go to a good cause, instead of personal pockets--although part of its operating budget goes to the salaries of its worldwide personnel. It's also hard to argue that a former priest who has to close his second-hand bookshop because he can't pay his mortgage is a greedy capitalist. I would wager, in fact, that one doesn't open a second-hand bookstore for the golden profits it's going to garner, any more than people open animal shelters for the good, easy money involved. It's more about preserving something considered precious and finding orphans good second homes. And while the world is not fair, and businesses often have an edge over a competitor because of more favorable loan or other business terms, the Oxfam case does seem to represent particularly unfair competition.
It's an argument that has been raised before, in many different sectors. In trade negotiations in the aerospace industry, Boeing argued that Airbus had an unfair edge because of its government subsidies; Airbus argued back that Boeing had benefitted from NASA's research, which was a subsidy of a different sort. And NASA itself has been accused of unfair competition in soliciting new business to try to shore up its ever-changing and unsteady Congressional funding. NASA had always allowed private corporations to use its test facilities for a fee, but the fee used to be less than what other commercial test centers charged, because much of the overhead was covered by civil-servant salaries. Private industry objected, and NASA ceded the point, changing to a system of "full cost accounting" which put its costs at a more comparable level to that of private entities.
But it's easier to make those adjustments in a field where business is done by contract pricing. It would be harder to implement that kind of "level-playing-field" shift in the used bookstore market. The used clothing industry--also populated by many non-profit organizations--has a small commercial component, as well, but most for-profit "consignment stores" (the upmarket term for a used clothing outlet) tend to be pickier about the quality of their products to differentiate themselves from the everyday thrift stores. They also offer donors a piece of the profits, to lure customers who might otherwise donate the clothing to a non-profit outlet.
Perhaps booksellers could follow the same model, although the profit margin may not be big enough for that to create much incentive in the used book industry. But regardless, the question of non-profits generating funds through commercial means--while a staple of support for charitable organizations for many years--can sometimes unintentionally cross into some muddy, gray areas of commerce, fairness, and collateral damage. Successfully navigating the lines between good works, self-sustaining funding, and commercial competition and rights is a tough challenge. And a solution that preserves the good benefits while avoiding the negative side-consequences may prove as elusive in Salisbury as it did in Sudan.
Non-profit organizations do a tremendous amount of good in the world. But just as with the work they do around the world, the irony remains that a good intention, and even really good work, can sometimes carry with it "unseen" and unintended consequences. At home, as well as abroad.
A rock structure, built deep underground, is one of the earliest hominin constructions ever found.
In February 1990, thanks to a 15-year-old boy named Bruno Kowalsczewski, footsteps echoed through the chambers of Bruniquel Cave for the first time in tens of thousands of years.
The cave sits in France’s scenic Aveyron Valley, but its entrance had long been sealed by an ancient rockslide. Kowalsczewski’s father had detected faint wisps of air emerging from the scree, and the boy spent three years clearing away the rubble. He eventually dug out a tight, thirty-meter-long passage that the thinnest members of the local caving club could squeeze through. They found themselves in a large, roomy corridor. There were animal bones and signs of bear activity, but nothing recent. The floor was pockmarked with pools of water. The walls were punctuated by stalactites (the ones that hang down) and stalagmites (the ones that stick up).
The day—a celebration of corporate conformity disguised as a celebration of individuality—helped to bring about the current dominance of “business casual.”
The New York Times ran a story Wednesday announcing “The End of the Office Dress Code.” The suit and its varied strains, the article argues—corporate uniforms that celebrate, well, corporate uniformity—are giving way to more individualized interpretations of “office attire.” As the writer Vanessa Friedman puts it, “We live in a moment in which the notion of a uniform is increasingly out of fashion, at least when it comes to the implicit codes of professional and public life.”
It’s true. We live in a time in which our moguls dress in hoodies and t-shirts, and in which more and more workers are telecommuting—working not just from home, but from PJs. It’s a time, too, when the lines between “work” and “everything else” are increasingly—and sometimes frustratingly—fluid. And so: It’s also a time when many of us are trying to figure out, together, what “work clothes” actually means, and the extent to which the term might vary across professions. As Emma McClendon, who curated a new exhibit on uniforms for the Museum at the Fashion Institute of Technology, summed it up: “We are in a very murky period.”
Narcissism, disagreeableness, grandiosity—a psychologist investigates how Trump’s extraordinary personality might shape his possible presidency.
In 2006, Donald Trump made plans to purchase the Menie Estate, near Aberdeen, Scotland, aiming to convert the dunes and grassland into a luxury golf resort. He and the estate’s owner, Tom Griffin, sat down to discuss the transaction at the Cock & Bull restaurant. Griffin recalls that Trump was a hard-nosed negotiator, reluctant to give in on even the tiniest details. But, as Michael D’Antonio writes in his recent biography of Trump, Never Enough, Griffin’s most vivid recollection of the evening pertains to the theatrics. It was as if the golden-haired guest sitting across the table were an actor playing a part on the London stage.
“It was Donald Trump playing Donald Trump,” Griffin observed. There was something unreal about it.
Washington voters handed Hillary Clinton a primary win, symbolically reversing the result of the state caucus where Bernie Sanders prevailed.
Washington voters delivered a bit of bad news for Bernie Sanders’s political revolution on Tuesday. Hillary Clinton won the state’s Democratic primary, symbolically reversing the outcome of the state’s Democratic caucus in March where Sanders prevailed as the victor. The primary result won’t count for much since delegates have already been awarded based on the caucus. (Sanders won 74 delegates, while Clinton won only 27.) But Clinton’s victory nevertheless puts Sanders in an awkward position.
Sanders has styled himself as a populist candidate intent on giving a voice to voters in a political system in which, as he describes it, party elites and wealthy special-interest groups exert too much control. As the primary election nears its end, Sanders has railed against Democratic leaders for unfairly intervening in the process, a claim he made in the aftermath of the contentious Nevada Democratic convention earlier this month. He has also criticized superdelegates—elected officials and party leaders who can support whichever candidate they choose—for effectively coronating Clinton.
Americans persist in thinking that Adam Smith's rules for free trade are the only legitimate ones. But today's fastest-growing economies are using a very different set of rules. Once, we knew them—knew them so well that we played by them, and won. Now we seem to have forgotten them.
In Japan in the springtime of 1992 a trip to Hitotsubashi University, famous for its economics and business faculties, brought me unexpected good luck. Like several other Japanese universities, Hitotsubashi is almost heartbreaking in its cuteness. The road from the station to the main campus is lined with cherry trees, and my feet stirred up little puffs of white petals. Students glided along on their bicycles, looking as if they were enjoying the one stress-free moment of their lives.
They probably were. In surveys huge majorities of students say that they study "never" or "hardly at all" during their university careers. They had enough of that in high school.
I had gone to Hitotsubashi to interview a professor who was making waves. Since the end of the Second World War, Japanese diplomats and businessmen have acted as if the American economy should be the model for Japan's own industrial growth. Not only should Japanese industries try to catch up with America's lead in technology and production but also the nation should evolve toward a standard of economic maturity set by the United States. Where Japan's economy differed from the American model—for instance, in close alliances between corporations which U.S. antitrust laws would forbid—the difference should be considered temporary, until Japan caught up.
A Brexit advocate says U.S. support for the EU fundamentally misreads what the institution has become.
With less than a month until British citizens vote on whether the U.K. should stay in or leave the European Union, Americans could be forgiven for being preoccupied with their own political dramas. Still, President Obama conspicuously weighed in on the British debate in April, writing in The Daily Telegraph “with the candour of a friend” that the vote’s outcome would be “of deep interest to the United States.” Specifically: “The U.S. and the world need your outsized influence to continue—in Europe.”
British voters themselves aren’t so convinced. Polls currently show the “Remain” side in the lead, but the outcome is by no means assured. Advocates of continued U.K. membership in the 28-member political and economic bloc have argued that exiting the organization would severely damage the British economy; diminish the U.K.’s international influence; and destabilize a European continent already wracked by a refugee crisis and economic problems. Those advocating for a so-called Brexit—the “Leave” camp—argue that it would liberate the U.K. from onerous regulations devised and enforced by non-representative foreign bodies based in Brussels. (EU bodies set policy for member states on, among other things, trade, agriculture, and some fiscal matters; member states generally retain control over their own foreign and defense policies. Britain specifically has negotiated the ability to opt out of certain EU-wide policies, particularly on immigration and further political integration.) With its sovereignty thus restored, the U.K. would be better able to handle its own economic, immigration, and other challenges.
For centuries, philosophers and theologians have almost unanimously held that civilization as we know it depends on a widespread belief in free will—and that losing this belief could be calamitous. Our codes of ethics, for example, assume that we can freely choose between right and wrong. In the Christian tradition, this is known as “moral liberty”—the capacity to discern and pursue the good, instead of merely being compelled by appetites and desires. The great Enlightenment philosopher Immanuel Kant reaffirmed this link between freedom and goodness. If we are not free to choose, he argued, then it would make no sense to say we ought to choose the path of righteousness.
Today, the assumption of free will runs through every aspect of American politics, from welfare provision to criminal law. It permeates the popular culture and underpins the American dream—the belief that anyone can make something of themselves no matter what their start in life. As Barack Obama wrote in The Audacity of Hope, American “values are rooted in a basic optimism about life and a faith in free will.”
The U.S. president talks through his hardest decisions about America’s role in the world.
Friday, August 30, 2013, the day the feckless Barack Obama brought to a premature end America’s reign as the world’s sole indispensable superpower—or, alternatively, the day the sagacious Barack Obama peered into the Middle Eastern abyss and stepped back from the consuming void—began with a thundering speech given on Obama’s behalf by his secretary of state, John Kerry, in Washington, D.C. The subject of Kerry’s uncharacteristically Churchillian remarks, delivered in the Treaty Room at the State Department, was the gassing of civilians by the president of Syria, Bashar al-Assad.
While fish are disappearing from the oceans, squid, octopus, and cuttlefish populations have been rising since the 1960s. Why?
Every winter in Spencer Gulf, a large inlet intruding into Australia’s south coast, hundreds of thousands of giant cuttlefish gather to breed. They’re about the size and weight of a corgi, with ever-changing displays of shadow and colour rippling across their dynamic skins. At the height of the breeding season, these amorous, multi-armed, living rainbows can get so numerous that there’s one of them in every square meter of water.
But lately, these mating swarms have dwindled to a small fraction of their former glory, and no one knows why. Pollution, warming waters, and a dearth of prey are all possibilities. But Bronwyn Gillanders from the University of Adelaide suspected that the decline might just be part of a natural cycle, a downward trend stuck between upward ones. She couldn’t test that idea, since no one had any long-term data on giant cuttlefish numbers. But such data did exist for other cephalopods—octopuses, squid, and other species of cuttlefish. Gillanders’s team member Zoe Doubleday pulled it all together, by scouring earlier studies and contacting other scientists.
What’s harder to believe: that it took a year for Andrea Constand to accuse the star of sexual assault, or that it’s taken 11 years and dozens more women coming forward for those accusations to be heard in court?
To date, more than 50 women have accused Bill Cosby of sexual misconduct. Constand was the first. In January of 2005 she told police that a year earlier, Cosby had touched and penetrated her after drugging her. A prosecutor decided against proceeding with the case, and Constand followed up with a civil suit that resulted in a 2006 settlement. After that came an accelerating drip of women making allegations about incidents spanning a wide swath of Cosby’s career, from Kristina Ruehli (1965) to Chloe Goins (2008).