Megan Garber writes about some of our earliest forays into lighting our cities with giant "moon-towers" which imitated luna, err, lumination:
During the hot summer of 1882, the installation of the new moon towers became its own kind of brilliant spectacle. People gathered to witness the building of structures that represented Progress and Ingenuity and, in a very real sense, The Future. They also gathered to witness some drama. Since electrical engineers were just learning their trade -- that trade, in Detroit's case, being the erection of 150-foot-tall poles anchoring 500 pounds' worth of lights -- accidents were, perhaps, inevitable. And falling towers -- thin metal, plus gravity -- had an uncanny way of slicing through roofs as they toppled toward the ground.
The light itself, though, was the true attraction. It was, as one observer put it, "picturesque and romantic" -- just as Brush had guaranteed. Within the glow of the manmade moons, "the foliage is weird and beautiful. All places within the scope of light are bathed in the faint but fairy-like illumination of the moon in its first-quarter."
But not all of the crowds were excited about the new buildings studding their town's landscape. On the contrary, "many Detroiters," Freeberg writes, "were skeptical from the start." Some found the towers to be eyesores, each structure braced with a chaotic network of wires and posts. (One man even tried to chop down the wires that hung near his home, an act of civic-cosmetic rebellion for which he was arrested.) The lights also brought unanticipated complications along with their steady illumination. Animals, for one thing, were unaccustomed to the newly extended daytime. Chickens and geese, unable to sleep in this new state of omnipresent light, began to die of exhaustion.
Humans, too, found the high-slung orbs to be as disorienting as they were ethereal. As tall as the towers were, they still left shadows in their wake -- shadows tinged with sharp blue light, Freeberg notes, which left pedestrians "dazed and puzzled." Foggy evenings, combined with the air pollution of a newly industrialized America, could thrust all of Detroit into effective darkness -- meaning, Freeberg writes, that "Detroiters could only speculate about the lovely sight that their lights must be creating as they shone down on the blanket of mist and soot that smothered the city." Even during occasions when the fog broke enough to allow some light to penetrate to the streets below, "many found themselves groping along sidewalks in an eerie gloom."
I think it was Cynic who once described The Atlantic's Tech channel as half-tech and half-history. I often think that technology is under-appreciated in understanding the advance of civil rights. In the 1860s, Northern soldiers advancing into the South were shocked to see that slavery was as bad as abolitionists had said it was. One hundred years later, Northern whites could see that Bull Connor was every bit as bad as civil rights workers said he was, right from their own couches.
At any rate, I love the historical approach to technology. When I finished reading this, I couldn't quite get Tesla out of my head.
It’s a paradox: Shouldn’t the most accomplished be well equipped to make choices that maximize life satisfaction?
There are three things, once one’s basic needs are satisfied, that academic literature points to as the ingredients for happiness: having meaningful social relationships, being good at whatever it is one spends one’s days doing, and having the freedom to make life decisions independently.
But research into happiness has also yielded something a little less obvious: Being better educated, richer, or more accomplished doesn’t do much to predict whether someone will be happy. In fact, it might mean someone is less likely to be satisfied with life.
That second finding is the puzzle that Raj Raghunathan, a professor of marketing at The University of Texas at Austin’s McCombs School of Business, tries to make sense of in his recent book, If You’re So Smart, Why Aren’t You Happy? Raghunathan’s writing does fall under the category of self-help (with all of the pep talks and progress worksheets that that entails), but his commitment to scientific research serves as ballast for the genre’s more glib tendencies.
Two scholars discuss the ups and downs of life as a right-leaning professor.
“I don’t think I can say it too strongly, but literally it just changed my life,” said a scholar, about reading the work of Ayn Rand. “It was like this awakening for me.”
Different versions of this comment appear throughout Jon A. Shields and Joshua M. Dunn Sr.’s book on conservative professors, Passing on the Right, usually about people like Milton Friedman and John Stuart Mill and Friedrich Hayek. The scholars they interviewed speak in a dreamy way about these nerdy celebrities, perhaps imagining an alternate academic universe—one where social scientists can be freely conservative.
The assumption that most college campuses lean left is so widespread in American culture that it has almost become a caricature: intellectuals in thick-rimmed glasses preaching Marxism on idyllic grassy quads; students protesting minor infractions against political correctness; raging professors trying to prove that God is, in fact, dead. Studies about professors’ political beliefs and voting behavior suggest this assumption is at least somewhat correct. But Shields and Dunn set out to investigate a more nuanced question: For the minority of professors who are cultural and political conservatives, what’s life actually like?
The president’s unique approach to the White House Correspondents’ Dinner will surely be missed.
No U.S. President has been a better comedian than Barack Obama. It’s really that simple.
Now that doesn’t mean other modern-day presidents couldn’t tell a joke. John F. Kennedy, Ronald Reagan, and Bill Clinton excelled at it. But Obama has transformed the way presidents use comedy—not just engaging in self-deprecation or playfully teasing his rivals, but turning his barbed wit on his opponents.
He puts that approach on display every year at the White House Correspondents’ Dinner. This annual tradition, which began in 1921 when 50 journalists (all men) gathered in Washington, D.C., has become a showcase for each president’s comedy chops. Some presidents have been bad, some have been good. Obama has been the best. He’s truly the killer comedian in chief.
Nearly half of Americans would have trouble finding $400 to pay for an emergency. I’m one of them.
Since 2013, the Federal Reserve Board has conducted a survey to “monitor the financial and economic status of American consumers.” Most of the data in the latest survey, frankly, are less than earth-shattering: 49 percent of part-time workers would prefer to work more hours at their current wage; 29 percent of Americans expect to earn a higher income in the coming year; 43 percent of homeowners who have owned their home for at least a year believe its value has increased. But the answer to one question was astonishing. The Fed asked respondents how they would pay for a $400 emergency. The answer: 47 percent of respondents said that either they would cover the expense by borrowing or selling something, or they would not be able to come up with the $400 at all. Four hundred dollars! Who knew?
A professor of cognitive science argues that the world is nothing like the one we experience through our senses.
As we go about our daily lives, we tend to assume that our perceptions—sights, sounds, textures, tastes—are an accurate portrayal of the real world. Sure, when we stop and think about it—or when we find ourselves fooled by a perceptual illusion—we realize with a jolt that what we perceive is never the world directly, but rather our brain’s best guess at what that world is like, a kind of internal simulation of an external reality. Still, we bank on the fact that our simulation is a reasonably decent one. If it weren’t, wouldn’t evolution have weeded us out by now? The true reality might be forever beyond our reach, but surely our senses give us at least an inkling of what it’s really like.
“A typical person is more than five times as likely to die in an extinction event as in a car crash,” says a new report.
Nuclear war. Climate change. Pandemics that kill tens of millions.
These are the most viable threats to globally organized civilization. They’re the stuff of nightmares and blockbusters—but unlike sea monsters or zombie viruses, they’re real, part of the calculus that political leaders consider every day. And according to a new report from the U.K.-based Global Challenges Foundation, they’re much more likely than we might think.
In its annual report on “global catastrophic risk,” the nonprofit debuted a startling statistic: Across the span of a lifetime, the average American is more than five times likelier to die during a human-extinction event than in a car crash.
Partly that’s because the average person will probably not die in an automobile accident. Every year, one in 9,395 people dies in a crash; that translates to about a 0.01 percent chance per year. But that chance compounds over the course of a lifetime. At life-long scales, one in 120 Americans dies in an accident.
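The compounding here is just a survival calculation: if each year carries the same small risk, the chance of escaping it every year for N years is (1 - p)^N, and the lifetime risk is the complement. A minimal sketch, assuming a roughly 79-year lifespan (my round figure, not one from the report):

```python
# Lifetime risk from a small annual risk, assuming each year is independent.
annual_risk = 1 / 9395   # yearly chance of dying in a car crash (about 0.01%)
years = 79               # assumed average lifespan, for illustration only

# Chance of surviving every single year, then take the complement:
lifetime_risk = 1 - (1 - annual_risk) ** years
print(f"lifetime odds: about 1 in {1 / lifetime_risk:.0f}")  # → about 1 in 119
```

Which lands right around the "one in 120" figure quoted above; small annual risks roughly add up (79 × 0.01% ≈ 0.84%) because the probabilities are tiny.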
After the successful Allied invasions of western France, Germany gathered reserve forces and launched a massive counter-offensive in the Ardennes, which collapsed by January. At the same time, Soviet forces were closing in from the east, invading Poland and East Prussia. By March, Western Allied forces were crossing the Rhine River, capturing hundreds of thousands of troops from Germany's Army Group B. The Red Army had meanwhile entered Austria, and both fronts quickly approached Berlin. Strategic bombing campaigns by Allied aircraft were pounding German territory, sometimes destroying entire cities in a night. In the first several months of 1945, Germany put up a fierce defense, but rapidly lost territory, ran out of supplies, and exhausted its options. In April, Allied forces pushed through the German defensive line in Italy. East met West on the River Elbe on April 25, 1945, when Soviet and American troops met near Torgau, Germany. Then came the end of the Third Reich, as the Soviets took Berlin, Adolf Hitler committed suicide on April 30, and Germany surrendered unconditionally on all fronts on May 8 (May 7 on the Western Front). Hitler's planned "Thousand-Year Reich" lasted only 12 incredibly destructive years. (This entry is Part 17 of a weekly series.)
The U.S. president talks through his hardest decisions about America’s role in the world.
Friday, August 30, 2013, the day the feckless Barack Obama brought to a premature end America’s reign as the world’s sole indispensable superpower—or, alternatively, the day the sagacious Barack Obama peered into the Middle Eastern abyss and stepped back from the consuming void—began with a thundering speech given on Obama’s behalf by his secretary of state, John Kerry, in Washington, D.C. The subject of Kerry’s uncharacteristically Churchillian remarks, delivered in the Treaty Room at the State Department, was the gassing of civilians by the president of Syria, Bashar al-Assad.
Why thyroid diseases are so common—and still so mysterious
When I first suspected I was suffering from hypothyroidism, I did what any anxious, Internet-connected person would do and Googled "dysfunctional thyroid symptoms," and, in another tab, "hypothyroid thinning hair??" for good measure.
What came up sounded like someone describing me for an intimately detailed police sketch:
heightened sensitivity to cold
unexplained weight gain
a pale, puffy face ("Finally, a medical explanation for this," I thought.)
This, combined with the fact that a close family member had recently been diagnosed with a thyroid disorder, sent me scurrying to the nearest endocrinologist's office. They took a blood test, and two weeks later the results came back. Sure enough, the doctor said solemnly, I had hypothyroidism, which meant my thyroid was under-active. She would be starting me on thyroid medication. She couldn't know for sure, but I might have to take drugs for the rest of my life.
Online communities like those on Tumblr are perpetuating ideas of "beautiful suffering," confusing what it means to be clinically depressed.
A few months ago, Laura U., a typical 16-year-old at an international school in Paris, sat at her computer wishing she looked just like the emaciated women on her Tumblr dashboard. She pined to be mysterious, haunted, fascinating, like the other people her age that she saw in black and white photos with scars along their wrists, from taking razor blades to their skin. She convinced herself that the melancholic quotes she was reading—“Can I just disappear?” or “People who die by suicide don’t want to end their lives, they want to end their pain”—applied to her.