An extremist group has seized the African city of Timbuktu, systematically destroying its monuments.
The West African city of Timbuktu was once one of Africa's richest and most important, a nexus of trans-Saharan trade and a center of religious and scientific learning as far back as the 1400s. The relics of that history still stand in the form of such world heritage sites as the University of Sankore. More recently, this city in the sprawling West African country of Mali has been a tourism draw. But on April 2, it came under new ownership: rebels from an ethnic minority known as the Tuareg, who had sought independence for years. Five days later they got it, declaring northern Mali the independent country of Azawad. Then, on June 1, breakaway rebels with the extremist Islamist group Ansar Dine (translation: "Defenders of Faith") took control of Timbuktu.
In their first month of rule, Ansar Dine's fighters have shut down the tourism industry ("We are against tourism. They foster debauchery," a representative said), sent locals fleeing, and, over the past four days, destroyed half of the shrines that mark Timbuktu's ancient and remarkable history. The United Nations condemned the destruction and the International Criminal Court suggested it could be a war crime, but Ansar Dine insisted it won't slow down, later pulling a beautiful Gothic door off the Sidi Yahya mosque, which became one of the world's great centers of learning during the 1400s. The group follows an extreme form of Islam (though a relatively modern one; it emerged in late-1700s Saudi Arabia) that regards Timbuktu's shrines and mosque-universities as sacrilegious, a form of idol-worship. Their campaign is still going -- it's been compared to the Taliban's early-2001 destruction of ancient Buddha statues -- and some observers worry that many of Timbuktu's historical treasures, which have survived countless invasions and empires, won't live out the month.
Because you may never be able to visit them yourself if you haven't already, here are the photos and stories of some of Timbuktu's most important historical sites.
A team of donkeys walks past the Djingarey Ber, the oldest mosque in Timbuktu. Mansa Musa, the Malian emperor, paid an architect 200 kilograms of gold to design it, a show of his kingdom's prestige, and it was completed in 1327. Ever since, it has been a symbol of the grandeur of the medieval Malian empire. Though Mali is today a very poor part of the world, 14th-century Timbuktu was a center of wealth, trade, and education, including at mosques like this one, which doubled as learning centers. (emilio labrador/Flickr)
A Tuareg man stands in front of the Djingarey Ber mosque. Many Tuaregs, who are traditionally nomadic and tend to live in Mali's north, have long sought to secede from the south, where the capital city of Bamako sits some 600 miles away. Djingarey Ber is built mostly from mud-brick and wood (though there is one large limestone wall), yet it has amazingly stood for almost 700 years. Its architect installed cactus-like sticks in the sides of the walls so that, every year after the seasonal rains, engineers could climb up the side to repair any damage, which they've done for centuries since. (Reuters)
The interior of the Djingarey Ber mosque, which was designed to hold 2,000 worshipers at a time. UNESCO designated it a World Heritage Site in 1988. (Wikimedia Commons)
Locals cart goods past the Sankore mosque, which is often known as Sankore University for its remarkable history as a place for education as well as religion. Though less architecturally significant than the older Djingarey Ber, Sankore developed in the 15th and 16th centuries as one of the medieval world's great centers of learning. Students would travel here to learn history, math, and astronomy, as well as Islam, from its respected scholars. It is still in use as a mosque; a speaker, used to broadcast the daily call to prayer, juts out from its side. (emilio labrador/Flickr)
This photo shows Sankore from the side opposite its famous mud-brick minaret. The wall shown is the outer courtyard wall. (upyernoz/Flickr)
This is the main entrance of the Sidi Yahya mosque, which, along with Sankore and Djingarey Ber, makes up what is sometimes called the "University of Timbuktu," a trio of medieval-era centers of Islam and education. It was built in 1400 but left empty in expectation of a holy leader, who emerged in 1441 as a man named Sidi Yahya, for whom the complex was later named. (Muhamed Maznillah)
The ornately decorated front door of the Sidi Yahya mosque reflects the growing Moroccan and Gothic influences on 15th-century Timbuktu. (Muhamed Maznillah)
The tomb of Sidi Yahya himself, the namesake of the 15th-century mosque in which he is buried. In early June, members of the extremist group Ansar Dine, which has seized Timbuktu, destroyed his tomb. They declared that the burial site made Yahya a false idol, threatening to continue their destruction of Timbuktu's historic sites. Though the UN and many others condemned Ansar Dine's act, there appears to be little anyone can do to stop the group. (Muhamed Maznillah)
A U.S. museum displays a copy of a manuscript page, the original of which is in Timbuktu, hand-written by the prominent Islamic scholar Omar ibn Said. The West African's 19th-century religious writings were both an important contribution to Islamic thinking and a testament to Timbuktu's continued significance, centuries later, for Islam. Said was captured by slave-traders in 1807 and shipped to the Carolinas, where he died in 1864, still enslaved, at the age of 93 or 94. His writings are held in Timbuktu's Mama Haidara Manuscript Library. Though Ansar Dine extremists have not targeted this library, locals say they are worried about its cache of ancient Islamic manuscripts, some of which go back to the 13th century. (AP Images)
The streets in front of Sankore are usually fuller than this. But this photo was taken on April 11, a week and a half after rebels seized Timbuktu, reportedly sending many residents fleeing over fear of more fighting. (AP Images)
This building probably isn't in danger, but its story is a reminder of Timbuktu's history: Africans have long traversed the Sahara, typically through Timbuktu, using the strategically located city to pass goods, slaves, and knowledge between black sub-Saharan Africa and the Arab-dominated north. But the first European to reach Timbuktu was the Scottish explorer Alexander Gordon Laing, who set out from Tripoli in July 1825 at the behest of the UK colonial secretary. He arrived over a year later, in August 1826, broke, sick, and without a right hand, which he'd lost in one of many skirmishes with marauding Tuareg. He settled into this house, where he planned to remain only three days before continuing on, but ended up staying 38, on the last of which he was murdered. (upyernoz/Flickr)
It’s a paradox: Shouldn’t the most accomplished be well equipped to make choices that maximize life satisfaction?
There are three things, once one’s basic needs are satisfied, that academic literature points to as the ingredients for happiness: having meaningful social relationships, being good at whatever it is one spends one’s days doing, and having the freedom to make life decisions independently.
But research into happiness has also yielded something a little less obvious: Being better educated, richer, or more accomplished doesn’t do much to predict whether someone will be happy. In fact, it might mean someone is less likely to be satisfied with life.
That second finding is the puzzle that Raj Raghunathan, a professor of marketing at The University of Texas at Austin’s McCombs School of Business, tries to make sense of in his recent book, If You’re So Smart, Why Aren’t You Happy? Raghunathan’s writing does fall under the category of self-help (with all of the pep talks and progress worksheets that that entails), but his commitment to scientific research serves as ballast against the genre’s more glib tendencies.
Nearly half of Americans would have trouble finding $400 to pay for an emergency. I’m one of them.
Since 2013, the Federal Reserve Board has conducted a survey to “monitor the financial and economic status of American consumers.” Most of the data in the latest survey, frankly, are less than earth-shattering: 49 percent of part-time workers would prefer to work more hours at their current wage; 29 percent of Americans expect to earn a higher income in the coming year; 43 percent of homeowners who have owned their home for at least a year believe its value has increased. But the answer to one question was astonishing. The Fed asked respondents how they would pay for a $400 emergency. The answer: 47 percent of respondents said that either they would cover the expense by borrowing or selling something, or they would not be able to come up with the $400 at all. Four hundred dollars! Who knew?
“A typical person is more than five times as likely to die in an extinction event as in a car crash,” says a new report.
Nuclear war. Climate change. Pandemics that kill tens of millions.
These are the most viable threats to globally organized civilization. They’re the stuff of nightmares and blockbusters—but unlike sea monsters or zombie viruses, they’re real, part of the calculus that political leaders consider every day. And according to a new report from the U.K.-based Global Challenges Foundation, they’re much more likely than we might think.
In its annual report on “global catastrophic risk,” the nonprofit debuted a startling statistic: Across the span of their lives, the average American is more than five times likelier to die during a human-extinction event than in a car crash.
Partly that’s because the average person will probably not die in an automobile accident. Every year, about one in 9,395 Americans dies in a crash; that translates to roughly a 0.01 percent chance per year. But that chance compounds over the course of a lifetime: over a full lifespan, about one in 120 Americans dies in a car accident.
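The compounding behind those figures can be checked with a few lines of arithmetic. This is a sketch, not the report's own calculation: the one-in-9,395 annual odds come from the text, while the roughly 79-year lifespan is an assumption.

```python
# Annual odds of dying in a car crash, per the text: 1 in 9,395.
annual_risk = 1 / 9395

# Assumed average lifespan in years (an assumption, not from the text).
years = 79

# Probability of surviving every single year, then its complement:
# the chance of dying in a crash at some point over a lifetime.
lifetime_risk = 1 - (1 - annual_risk) ** years

print(f"Lifetime risk: {lifetime_risk:.4f}, about 1 in {1 / lifetime_risk:.0f}")
```

Under these assumptions the result lands near one in 120, consistent with the lifetime figure the article cites.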
A professor of cognitive science argues that the world is nothing like the one we experience through our senses.
As we go about our daily lives, we tend to assume that our perceptions—sights, sounds, textures, tastes—are an accurate portrayal of the real world. Sure, when we stop and think about it—or when we find ourselves fooled by a perceptual illusion—we realize with a jolt that what we perceive is never the world directly, but rather our brain’s best guess at what that world is like, a kind of internal simulation of an external reality. Still, we bank on the fact that our simulation is a reasonably decent one. If it weren’t, wouldn’t evolution have weeded us out by now? The true reality might be forever beyond our reach, but surely our senses give us at least an inkling of what it’s really like.
There’s a common perception that women siphon off the wealth of their exes and go on to live in comfort. It’s wrong.
A 38-year-old woman living in Everett, Washington, recently told me that nine years ago, she had a well-paying job, immaculate credit, substantial savings, and a happy marriage. When her first daughter was born, she and her husband decided that she would quit her job in publishing to stay home with the baby. She loved being a mother and homemaker, and when another daughter came, she gave up the idea of going back to work.
Seven years later, her husband told her to leave their house, and filed for a divorce she couldn’t afford. “He said he was tired of my medical issues, and unwilling to work on things,” she said, citing her severe rheumatoid arthritis and OCD, both of which she manages with medication. “He kicked me out of my own house, with no job and no home, and then my only recourse was to lawyer up. I’m paying them on credit.” (Some of the men and women quoted in this article have been kept anonymous because they were discussing sensitive financial matters, some of them involving ongoing legal disputes.)
Garry Marshall's patronizing 'holiday anthology' film boasts a star-studded ensemble, but its characters seem barely human.
It’s hard to know where to begin with Mother’s Day, a misshapen Frankenstein of a movie that feels like it escaped the Hallmark headquarters halfway through its creation and rampaged into theaters, trying to teach audiences how to love. The third in Garry Marshall’s increasingly strange “holiday anthology” series, Mother’s Day isn’t the rom-com hodge-podge that Valentine’s Day was, or the bizarre morass of his follow-up New Year’s Eve. But it does inspire the kind of holy terror that you feel all the way down to your bones, or the revolted tingling that strikes one at a karaoke performance gone tragically wrong.
While it’s aiming for frothiness and fun, Mother’s Day is a patronizing and sickly sweet endeavor that widely misses the mark for its entire 118-minute running time (it feels much longer). The audience gets the sense that there are many Big Truths to be learned: that family harmony is important, that it’s good to accept different lifestyles without judgment, that loss is a natural part of the circle of life. But its overall construction—as a work of cinema—always feels a little off. One character gets a life lesson from a clown at a children’s party, and departs with a hearty “Thanks, clown!” Extras wander in the background and deliver halting bits of expositional dialogue like malfunctioning robots. Half of the lines seem to have been recorded post-production and are practically shouted from off-screen to patch over a narrative that makes little sense. Mother’s Day is bad in the regular ways (e.g. the acting and writing), but also in that peculiar way, where it feels as though the film’s creator has never met actual humans before.
Congress delayed the fight over funding to combat the virus—a decision that comes at the cost of public health and potentially billions for the U.S. economy.
In all likelihood, Congress was never all that close to finding a way to push past factional politics and fund efforts to fight Zika. Lawmakers have adjourned for recess after a failure to find common ground on the issue, and as my colleague Nora Kelly notes, the divide comes mostly over the same political issues that hold up any congressional productivity. Despite ample evidence of the virus’s severity, Republicans balk at the idea of expanding public-health funding and executive spending, or they propose counterproductive “poison-pill” measures, such as raiding the Ebola fund. Congress seems relatively lukewarm about finding a solution, but that inactivity could cost far more in the long run.
In Trump’s aftermath, his enemies on the right will have to take stock and propose a meaningful alternative vision for the GOP’s future.
Donald Trump’s big victories in the Mid-Atlantic primaries don’t represent quite the end of the ballgame—but they come damn close.
And now Donald Trump’s many and fierce opponents in the Republican Party and the conservative movement face the hour of decision. Trump looks ever more certain to be the party nominee. Yet not perhaps since George McGovern in 1972 has a presumptive nominee so signally failed to carry the most committed members of his party with him.
So what happens now to those who regard themselves as party thought-leaders? Do they submit? Or do they continue to resist?
Resistance now means something more—and more dangerous—than tapping out #NeverTrump on Twitter. It means working to defeat Trump even knowing that the almost certain beneficiary will be Hillary Clinton.
Knowing the right people certainly has benefits, but how long do they last?
It would seem a safe bet that when faced with two offers from similarly prestigious companies, a job candidate would, most of the time, end up taking the one with higher pay. But when New York University’s Jason Greenberg and MIT’s Roberto M. Fernandez analyzed over 700 job offers from a cohort of students graduating from elite MBA programs, they found that something other than pay was driving students’ decisions.
In a paper that will soon be published in the journal Sociological Science, Greenberg and Fernandez write that the students were significantly more likely to accept jobs found through networking—done either through alums of their program or their own social connections—even if those jobs came with lower pay than offers arriving through more formal channels, like on-campus recruiting. The choice, the researchers suggest, may be driven by students’ interest in their own career development, and a belief that taking a job with more networking opportunities would give them a professional edge, even if it came at the cost of compensation.
DATE: MAY 1, 1994
FROM: DR. HUNTER S. THOMPSON
SUBJECT: THE DEATH OF RICHARD NIXON: NOTES ON THE PASSING OF AN AMERICAN MONSTER.... HE WAS A LIAR AND A QUITTER, AND HE SHOULD HAVE BEEN BURIED AT SEA.... BUT HE WAS, AFTER ALL, THE PRESIDENT.
"And he cried mightily with a strong voice, saying, Babylon the great is fallen, is fallen, and is become the habitation of devils, and the hold of every foul spirit and a cage of every unclean and hateful bird."
Richard Nixon is gone now, and I am poorer for it. He was the real thing -- a political monster straight out of Grendel and a very dangerous enemy. He could shake your hand and stab you in the back at the same time. He lied to his friends and betrayed the trust of his family. Not even Gerald Ford, the unhappy ex-president who pardoned Nixon and kept him out of prison, was immune to the evil fallout. Ford, who believes strongly in Heaven and Hell, has told more than one of his celebrity golf partners that "I know I will go to hell, because I pardoned Richard Nixon."