In the 1840s, a French economist named Frédéric Bastiat wrote:
In the economic sphere, an act, a habit, an institution, a law produces not only one effect, but a series of effects. Of these effects, the first alone is immediate; it appears simultaneously with its cause; it is seen. The other effects emerge only subsequently; they are not seen. There is only one difference between a good economist and a bad one: the bad economist confines himself to the visible effect; the good economist takes into account both the effect that can be seen and those which must be foreseen.
In layman's terms, this is the law of unintended consequences, and it plays out, like Murphy's Law, in more spheres than just economics. And while not all unintended consequences are negative, we notice most when an attempt to improve something ends up with an unexpected counter-effect. The saying "the road to Hell is paved with good intentions" refers not only to those who think of doing good but don't act, but also to those who think they're acting to a good end but end up causing harm.
The conundrum of the law is that, in many cases, the two types of effects are too closely linked to separate out cleanly. Eliminating the unintended negative consequence would require eliminating the positive effect, as well.
The first time I went to Sudan, for example, I interviewed aid workers and pilots who were flying relief supplies into regions of the country that had been decimated by 18 years of civil war. Without the supplies, people would die. But the local population had also grown dependent on the handouts, and some of the aid was being stolen by troops and helping to support continued fighting. What do you do in a situation like that? In that case, the need to stave off death by starvation was deemed more important than the subtler problems of stolen food and long-term economic impact.
But the issue gets stickier when the "seen" effect isn't addressing a need that's quite so dire or immediate. Take the case of a second-hand bookseller in Salisbury, England, who claims he was put out of business by Oxfam--a non-profit organization that, ironically, was one of the organizations sending supplies into war-torn Sudan.
Oxfam does a lot of good work in the world. The United Nations camps for Darfur refugees I visited a couple of years ago in eastern Chad had been set up and were being run by Oxfam personnel who were sacrificing a lot to be there. Doing that work, of course, requires money. U.N. contracts supply part of the organization's operating budget, but Oxfam also relies heavily on charitable donations. According to a recent New York Times article on the subject, Oxfam also receives $500 million a year in support from the British government. Like many charitable entities, from Goodwill to local hospital foundations, Oxfam also runs a series of shops where it sells donated goods. The proceeds help to support its development and aid programs around the world. It's a win-win for everyone -- donors get a tax break, starving children in Africa get food and clean water.
But here's the sticky part. Oxfam has opened up 130 used book stores around Europe, which bring in a reported $32 million a year ... and are competing with small, mom-and-pop used booksellers in the same neighborhoods. Oxfam has renovated, clean, similarly designed and decorated storefronts ... which it can afford to invest in, because it has government support, volunteer workers and tax-deductible, donated products. So it has a market advantage because of its special status as a non-profit organization--an advantage that at least a couple of booksellers claim has put them out of business.
The Oxfam spokesperson quoted in the Times article seemed a tad insensitive, at best, when he shrugged and quipped "Independent candle makers don't have the business they once had either. And if someone's business model is so marginal that an Oxfam shop opening nearby decimates it, then we are not the problem." This, mind you, from an organization that deals almost exclusively with people around the world whose "business models" are so marginal that they would not survive at all without outside assistance.
Marc Harrison, a former Catholic priest who had to close his second-hand bookstore when he couldn't pay his mortgage this past summer, accused Oxfam of "destroying lives here to save them elsewhere."
It's true, of course, that Oxfam's proceeds go to a good cause, instead of personal pockets--although part of its operating budget goes to the salaries of its worldwide personnel. It's also hard to argue that a former priest who has to close his second-hand bookshop because he can't pay his mortgage is a greedy capitalist. I would wager, in fact, that one doesn't open a second-hand bookstore for the golden profits it's going to garner, any more than people open animal shelters for the good, easy money involved. It's more about preserving something considered precious and finding orphans good second homes. And while the world is not fair, and businesses often have an edge over a competitor because of more favorable loan or other business terms, the Oxfam case does seem to represent particularly unfair competition.
It's an argument that has been raised before, in many different sectors. In trade negotiations in the aerospace industry, Boeing argued that Airbus had an unfair edge because of its government subsidies; Airbus argued back that Boeing had benefitted from NASA's research, which was a subsidy of a different sort. And NASA itself has been accused of unfair competition in soliciting new business to try to shore up its ever-changing and unsteady Congressional funding. NASA had always allowed private corporations to use its test facilities for a fee, but the fee used to be less than what other commercial test centers charged, because much of the overhead was covered by civil-servant salaries. Private industry objected, and NASA ceded the point, changing to a system of "full cost accounting" which put its costs at a more comparable level to that of private entities.
But it's easier to make those adjustments in a field where business is done by contract pricing. It would be harder to implement that kind of "level-playing-field" shift in the used bookstore market. The used clothing industry--also populated by many non-profit organizations--has a small commercial component, as well, but most for-profit "consignment stores" (the upmarket term for a used clothing outlet) tend to be pickier about the quality of their products to differentiate themselves from the everyday thrift stores. They also offer donors a piece of the profits, to lure customers who might otherwise donate the clothing to a non-profit outlet.
Perhaps booksellers could follow the same model, although the profit margin may not be big enough for that to create much incentive in the used book industry. But regardless, the question of non-profits generating funds through commercial means--while a staple of support for charitable organizations for many years--can sometimes unintentionally cross into some muddy, gray areas of commerce, fairness, and collateral damage. Successfully navigating the lines between good works, self-sustaining funding, and commercial competition and rights is a tough challenge. And a solution that preserves the good benefits while avoiding the negative side-consequences may prove as elusive in Salisbury as it did in Sudan.
Non-profit organizations do a tremendous amount of good in the world. But just as with the work they do around the world, the irony remains that a good intention, and even really good work, can sometimes carry with it "unseen" and unintended consequences. At home, as well as abroad.
Should you drink more coffee? Should you take melatonin? Can you train yourself to need less sleep? A physician’s guide to sleep in a stressful age.
During residency, I worked hospital shifts that could last 36 hours, without sleep, often without breaks of more than a few minutes. Even writing this now, it sounds to me like I’m bragging or laying claim to some fortitude of character. I can’t think of another type of self-injury that might be similarly lauded, except maybe binge drinking. Technically the shifts were 30 hours, the mandatory limit imposed by the Accreditation Council for Graduate Medical Education, but we stayed longer because people kept getting sick. Being a doctor is supposed to be about putting other people’s needs before your own. Our job was to power through.
The shifts usually felt shorter than they were, because they were so hectic. There was always a new patient in the emergency room who needed to be admitted, or a staff member on the eighth floor (which was full of late-stage terminally ill people) who needed me to fill out a death certificate. Sleep deprivation manifested as bouts of anger and despair mixed in with some euphoria, along with other sensations I’ve not had before or since. I remember once sitting with the family of a patient in critical condition, discussing an advance directive—the terms defining what the patient would want done were his heart to stop, which seemed likely to happen at any minute. Would he want to have chest compressions, electrical shocks, a breathing tube? In the middle of this, I had to look straight down at the chart in my lap, because I was laughing. This was the least funny scenario possible. I was experiencing a physical reaction unrelated to anything I knew to be happening in my mind. There is a type of seizure, called a gelastic seizure, during which the seizing person appears to be laughing—but I don’t think that was it. I think it was plain old delirium. It was mortifying, though no one seemed to notice.
Why the ingrained expectation that women should desire to become parents is unhealthy
In 2008, Nebraska decriminalized child abandonment. The move was part of a "safe haven" law designed to address increased rates of infanticide in the state. Like other safe-haven laws, Nebraska's allowed parents who felt unprepared to care for their babies to drop them off in a designated location without fear of arrest and prosecution. But legislators made a major logistical error: They failed to implement an age limitation for dropped-off children.
Within just weeks of the law passing, parents started dropping off their kids. But here's the rub: None of them were infants. A couple of months in, 36 children had been left in state hospitals and police stations. Twenty-two of the children were over 13 years old. A 51-year-old grandmother dropped off a 12-year-old boy. One father dropped off his entire family -- nine children from ages one to 17. Others drove from neighboring states to drop off their children once they heard that they could abandon them without repercussion.
How Vladimir Putin is making the world safe for autocracy
Since the end of World War II, the most crucial underpinning of freedom in the world has been the vigor of the advanced liberal democracies and the alliances that bound them together. Through the Cold War, the key multilateral anchors were NATO, the expanding European Union, and the U.S.-Japan security alliance. With the end of the Cold War and the expansion of NATO and the EU to virtually all of Central and Eastern Europe, liberal democracy seemed ascendant and secure as never before in history.
Under the shrewd and relentless assault of a resurgent Russian authoritarian state, all of this has come under strain with a speed and scope that few in the West have fully comprehended, and that puts the future of liberal democracy in the world squarely where Vladimir Putin wants it: in doubt and on the defensive.
His paranoid style paved the road for Trumpism. Now he fears what’s been unleashed.
Glenn Beck looks like the dad in a Disney movie. He’s earnest, geeky, pink, and slightly bulbous. His idea of salty language is "bullcrap."
The atmosphere at Beck’s Mercury Studios, outside Dallas, is similarly soothing, provided you ignore the references to genocide and civilizational collapse. In October, when most commentators considered a Donald Trump presidency a remote possibility, I followed audience members onto the set of The Glenn Beck Program, which airs on Beck’s website, theblaze.com. On the way, we passed through a life-size replica of the Oval Office as it might look if inhabited by a President Beck, complete with a portrait of Ronald Reagan and a large Norman Rockwell print of a Boy Scout.
The same part of the brain that allows us to step into the shoes of others also helps us restrain ourselves.
You’ve likely seen the video before: a stream of kids, confronted with a single, alluring marshmallow. If they can resist eating it for 15 minutes, they’ll get two. Some do. Others cave almost immediately.
This “Marshmallow Test,” first conducted in the 1960s, perfectly illustrates the ongoing war between impulsivity and self-control. The kids have to tamp down their immediate desires and focus on long-term goals—an ability that correlates with their later health, wealth, and academic success, and that is supposedly controlled by the front part of the brain. But a new study by Alexander Soutschek at the University of Zurich suggests that self-control is also influenced by another brain region—and one that casts this ability in a different light.
Modern slot machines develop an unbreakable hold on many players—some of whom wind up losing their jobs, their families, and even, as in the case of Scott Stevens, their lives.
On the morning of Monday, August 13, 2012, Scott Stevens loaded a brown hunting bag into his Jeep Grand Cherokee, then went to the master bedroom, where he hugged Stacy, his wife of 23 years. “I love you,” he told her.
Stacy thought that her husband was off to a job interview followed by an appointment with his therapist. Instead, he drove the 22 miles from their home in Steubenville, Ohio, to the Mountaineer Casino, just outside New Cumberland, West Virginia. He used the casino ATM to check his bank-account balance: $13,400. He walked across the casino floor to his favorite slot machine in the high-limit area: Triple Stars, a three-reel game that cost $10 a spin. Maybe this time it would pay out enough to save him.
A report will be shared with lawmakers before Trump’s inauguration, a top advisor said Friday.
Updated at 2:20 p.m.
President Obama asked intelligence officials to perform a “full review” of election-related hacking this week, and the White House plans to share a report of its findings with lawmakers before he leaves office on January 20, 2017.
Deputy White House Press Secretary Eric Schultz said Friday that the investigation will reach all the way back to 2008, and will examine patterns of “malicious cyber-activity timed to election cycles.” He emphasized that the White House is not questioning the results of the November election.
Asked whether a sweeping investigation could be completed in the time left in Obama’s final term—just six weeks—Schultz replied that intelligence agencies will work quickly, because preparing the report is “a major priority for the president of the United States.”
“Well, you’re just special. You’re American,” remarked my colleague, smirking from across the coffee table. My other Finnish coworkers, from the school in Helsinki where I teach, nodded in agreement. They had just finished critiquing one of my habits, and they could see that I was on the defensive.
I threw my hands up and snapped, “You’re accusing me of being too friendly? Is that really such a bad thing?”
“Well, when I greet a colleague, I keep track,” she retorted, “so I don’t greet them again during the day!” Another chimed in, “That’s the same for me, too!”
Unbelievable, I thought. According to them, I’m too generous with my hellos.
When I told them I would do my best to greet them just once every day, they told me not to change my ways. They said they understood me. But the thing is, now that I’ve viewed myself from their perspective, I’m not sure I want to remain the same. Change isn’t a bad thing. And since moving to Finland two years ago, I’ve kicked a few bad American habits.
A professor of cognitive science argues that the world is nothing like the one we experience through our senses.
As we go about our daily lives, we tend to assume that our perceptions—sights, sounds, textures, tastes—are an accurate portrayal of the real world. Sure, when we stop and think about it—or when we find ourselves fooled by a perceptual illusion—we realize with a jolt that what we perceive is never the world directly, but rather our brain’s best guess at what that world is like, a kind of internal simulation of an external reality. Still, we bank on the fact that our simulation is a reasonably decent one. If it weren’t, wouldn’t evolution have weeded us out by now? The true reality might be forever beyond our reach, but surely our senses give us at least an inkling of what it’s really like.
No other place mixes affordability, opportunity, and wealth so well. What’s its secret?
If the American dream has not quite shattered as the Millennial generation has come of age, it has certainly scattered. Living affordably and trying to climb higher than your parents did were once considered complementary ambitions. Today, young Americans increasingly have to choose one or the other—they can either settle in affordable but stagnant metros or live in economically vibrant cities whose housing prices eat much of their paychecks unless they hit it big.
The dissolution of the American dream isn’t just a feeling; it is an empirical observation. In 2014, economists at Harvard and Berkeley published a landmark study examining which cities have the highest intergenerational mobility—that is, the best odds that a child born into a low-income household will move up into the middle class or beyond. Among large cities, the top of the list was crowded with rich coastal metropolises, including San Francisco, San Jose, Los Angeles, San Diego, and New York City.