How on earth did Oprah Winfrey--an unlikely media mogul if there ever was one--get so popular, powerful and rich, all at the same time? Ever since she announced at the end of last week that she was walking away from her legendarily popular syndicated talk show in 2011, the airwaves have been abuzz with discussion about what it means, what impact it will have ... and how she managed to get this successful in the first place.
In yesterday's New York Times, columnist David Carr argued that Oprah Winfrey should be studied in every business school in America--not only for the smart moves she made, but also for the mistakes she didn't make. She didn't go public with her company, so she retained control. She didn't lend her name to other people's products. When she decided to add a magazine to her stable, she created her own, with such a clear sense of branding that she put herself on each and every cover. She didn't use her wealth to invest in fields she knew nothing about. Oprah did extend her brand into new shows, from Dr. Phil to Rachael Ray, but her offshoots all had the same feel and market as the mothership. And she didn't try to cash in on every possible profit opportunity, including the success of the books she turned into overnight bestsellers.
Oprah's branding success, according to Vogue editor Anna Wintour, came from steering her business through "personal choices," like a woman who has an enviably clear and innate sense of what looks good on her. Which is undoubtedly true. But that complicates the matter of how one would teach or replicate Oprah's success in b-school.
Oprah Winfrey, after all, gives a whole new meaning to the "Chicago School" of economics. A meaning that would make Milton Friedman, the father of the adage "the purpose of business is to make as much money as possible for shareholders," turn over in his grave. Oprah never had shareholders, of course, which simplified the matter. But, still. In an era when the bottom line reigned supreme, Oprah gave away cars, eschewed commissions on products she made popular, and turned down the short-term money that going public or selling the company could have brought. And made $2.3 billion as a result.
Scholars could parse all her decisions for wisdom about brand management, risk, leadership, growth strategies, marketing, and internal R&D investment. They might even find places where her success seemed to illustrate well-known models or schools of thought. Someone is probably working on it right now, as a matter of fact. Which is all well and good, because there's certainly a lot of wisdom that can be gleaned from the story of Oprah's successful climb from a local Chicago talk show host to the CEO of her own production company and network, while becoming a seismic cultural force and, arguably, the most powerful and wealthy woman in America.
The trouble is, Oprah's success isn't just the sum of her strategies. The engine that not only drove those particular strategies, but also made them successful, was a deep sense of identity, authenticity, and purpose that can't be imitated or crafted through method. If Oprah has a deep and guiding understanding of her audience, it's not because she's methodically observed them. It's because she's lived their struggles, hopes, joys and sorrows. And those struggles gave her first a connection, and then a purpose, from which all other decisions organically flowed.
In the world of Silicon Valley, it's said there are two types of entrepreneurs: missionaries and mercenaries. Mercenaries can make a lot of money if they're smart and have good strategies. But missionary entrepreneurs are the ones who change industries and the world--not only because they continue on no matter how hard the going gets, but because they bring to bear an irresistible combination of passion, authenticity, and a sense of purpose bigger than mere profit or themselves. Success, for them, is as much about impact as it is about profit. Which is, ironically, how many of them become incredibly profitable.
Clearly, Oprah is a missionary entrepreneur. But how do you teach someone to be a successful missionary? Even Polonius' advice to Laertes, "to thine own self be true," is insufficient. If asked, I suspect Oprah would say that first you have to learn who you are, where you came from, how that affects and informs you, and what matters in the world. You also have to care about something bigger than yourself, and imagine a way in which your particular skills could allow you to make a difference in that area. And whether you seek that path out or stumble upon it along the way, you have to care about making that difference enough that the vision of it keeps you going through the dark, and can act as a compass to steer your decisions along the way.
Add to that some smarts, savvy, and sharp thinking about content, brand management, marketing, and growth, and you have a legend in the making. But those last bits are the only pieces that can be taught. Honest self-knowledge, authenticity, passion and purpose are harder to acquire. Most often, they emerge from battles fought in the midnights of our solitude, if we manage to scrounge up the courage to face what we find there.
But if you can't teach the intuition that emerges from those internal journeys, you can at least teach its importance. Asking "what would Oprah do?" might not be a bad exercise when contemplating tough or tempting business options. It's not a quantifiable model, of course, and the results can't be proven. But it wouldn't be a bad placeholder while encouraging students to explore enough about themselves and the world to develop a true-steering compass and passionate purpose of their own.
Note: I will be offline for the next week, returning Friday, December 4th. Photo Credit: Flickr User whoohoo120
It leaves people bed-bound and drives some to suicide, but there's little research money devoted to the disease. Now, change is coming, thanks to the patients themselves.
This past July, Brian Vastag, a former science reporter, placed an op-ed with his former employer, the Washington Post. It was an open letter to Francis Collins, the director of the National Institutes of Health and a man Vastag had once used as a source on his beat.
“I’ve been felled by the most forlorn of orphan illnesses,” Vastag wrote. “At 43, my productive life may well be over.”
There was no cure for his disease, known by some as chronic fatigue syndrome, Vastag wrote, and little NIH funding available to search for one. Would Collins step up and change that?
“As the leader of our nation’s medical research enterprise, you have a decision to make,” he wrote. “Do you want the NIH to be part of these solutions, or will the nation’s medical research agency continue to be part of the problem?”
What will happen to digital collections of books, movies, and music when the tech giants fall?
When you purchase a movie from Amazon Instant Video, you’re not buying it, exactly. It’s more like renting indefinitely.
This distinction matters if your notion of “buying” is that you pay for something once and then you get to keep that thing for as long as you want. Increasingly, in the world of digital goods, a purchasing transaction isn’t that simple.
There are two key differences between buying media in a physical format and buying it in a digital one. First, there’s the technical aspect: Maintaining long-term access to a file requires a hard copy of it—that means, for example, downloading a film, not just streaming it from a third party’s server. The second distinction is a bit more complicated, and it has to do with how the law has shaped digital rights in the past 15 years. It helps to think about the experience of a person giving up CDs and using iTunes for music purchases instead.
A new report details a black market in nuclear materials.
On Wednesday, the Associated Press published a horrifying report about criminal networks in the former Soviet Union trying to sell “radioactive material to Middle Eastern extremists.” At the center of these cases (the AP learned of four in the past five years) was a “thriving black market in nuclear materials” in a “tiny and impoverished Eastern European country”: Moldova.
It’s a new iteration of an old problem with a familiar geography. The breakup of the Soviet Union left a superpower’s worth of nuclear weapons scattered across several countries without a superpower’s capacity to keep track of them. When Harvard’s Graham Allison flagged this problem in 1996, he wrote that the collapse of Russia’s “command-and-control society” left nothing secure.
The presumptive successor to John Boehner abruptly ended his bid after determining he could not get the support he needed from conservatives.
Behind Kevin McCarthy’s stunning decision Thursday to end his bid for speaker lay a simple calculation: Even if he could scrape together the 218 votes he needed to win the formal House election later this month, he would begin his term a crippled leader unable to unite a party that he said was “deeply divided.”
The majority leader and presumed successor to John Boehner had been widely expected to win the House GOP’s secret-ballot nomination on Thursday. All he needed was a simple majority of the 247-member caucus, and he easily had the votes over long-shot challengers Jason Chaffetz of Utah and Daniel Webster of Florida, who won the endorsement of the renegade House Freedom Caucus. But even if he’d won on Thursday, McCarthy knew he would still fall short of the threshold he needed on the floor, where Democrats would vote as a bloc against him.
Somewhere in Europe, a man who goes by the name “Mikro” spends his days and nights targeting Islamic State supporters on Twitter.
In August 2014, a Twitter account affiliated with Anonymous, the hacker-crusader collective, declared “full-scale cyber war” against ISIS: “Welcome to Operation Ice #ISIS, where #Anonymous will do it’s [sic] part in combating #ISIS’s influence in social media and shut them down.”
In July, I traveled to a gloomy European capital city to meet one of the “cyber warriors” behind this operation. Online, he goes by the pseudonym Mikro. He is vigilant, bordering on paranoid, about hiding his actual identity, on account of all the death threats he has received. But a few months after I initiated a relationship with him on Twitter, Mikro allowed me to visit him in the apartment he shares with his girlfriend and two Rottweilers. He works alone from his chaotic living room, using an old, battered computer—not the state-of-the-art setup I had envisaged. On an average day, he told me, he spends up to 16 hours fixed to his sofa. He starts around noon, just after he wakes up, and works late into the night and early morning.
Why Americans increasingly want inexperienced presidential candidates
The presidency, it’s often said, is a job for which everyone arrives unprepared. But just how unprepared is unprepared enough?
Political handicappers weigh presidential candidates’ partisanship, ideology, money, endorsements, consultants, and, of course, experience. Yet they too rarely consider an element of growing importance to voters: freshness. Increasingly, American voters view being qualified for the presidency as a disqualification.
In 2003, I announced the 14-Year Rule in National Journal. The rule was actually discovered by a presidential speechwriter named John McConnell, but because his job required him to keep his name out of print, I graciously stepped up to take credit. It is well known that to be elected president, you pretty much have to have been a governor or a U.S. senator. What McConnell had figured out was this: No one gets elected president who needs longer than 14 years to get from his or her first gubernatorial or Senate victory to either the presidency or the vice presidency. Surprised, I scoured the history books and found that the rule works astonishingly well going back to the early 20th century, when the modern era of presidential electioneering began.
American politicians are now eager to disown a failed criminal-justice system that’s left the U.S. with the largest incarcerated population in the world. But they've failed to reckon with history. Fifty years after Daniel Patrick Moynihan’s report “The Negro Family” tragically helped create this system, it's time to reclaim his original intent.
By his own lights, Daniel Patrick Moynihan, ambassador, senator, sociologist, and itinerant American intellectual, was the product of a broken home and a pathological family. He was born in 1927 in Tulsa, Oklahoma, but raised mostly in New York City. When Moynihan was 10 years old, his father, John, left the family, plunging it into poverty. Moynihan’s mother, Margaret, remarried, had another child, divorced, moved to Indiana to stay with relatives, then returned to New York, where she worked as a nurse. Moynihan’s childhood—a tangle of poverty, remarriage, relocation, and single motherhood—contrasted starkly with the idyllic American family life he would later extol.
In the name of emotional well-being, college students are increasingly demanding protection from words and ideas they don’t like. Here’s why that’s disastrous for education—and mental health.
Something strange is happening at America’s colleges and universities. A movement is arising, undirected and driven largely by students, to scrub campuses clean of words, ideas, and subjects that might cause discomfort or give offense. Last December, Jeannie Suk wrote in an online article for The New Yorker about law students asking her fellow professors at Harvard not to teach rape law—or, in one case, even use the word violate (as in “that violates the law”) lest it cause students distress. In February, Laura Kipnis, a professor at Northwestern University, wrote an essay in The Chronicle of Higher Education describing a new campus politics of sexual paranoia—and was then subjected to a long investigation after students who were offended by the article and by a tweet she’d sent filed Title IX complaints against her. In June, a professor protecting himself with a pseudonym wrote an essay for Vox describing how gingerly he now has to teach. “I’m a Liberal Professor, and My Liberal Students Terrify Me,” the headline said. A number of popular comedians, including Chris Rock, have stopped performing on college campuses (see Caitlin Flanagan’s article in this month’s issue). Jerry Seinfeld and Bill Maher have publicly condemned the oversensitivity of college students, saying too many of them can’t take a joke.
Forget the Common Core: Finland’s youngsters are in charge of determining what happens in the classroom.
“The changes to kindergarten make me sick,” a veteran teacher in Arkansas recently admitted to me. “Think about what you did in first grade—that’s what my 5-year-old babies are expected to do.”
The difference between first grade and kindergarten may not seem like much, but what I remember about my first-grade experience in the mid-’90s doesn’t match the kindergarten she described in her email: three and a half hours of daily literacy instruction, an hour and a half of daily math instruction, 20 minutes of daily “physical activity time” (officially banned from being called “recess”), and two 56-question standardized tests in literacy and math—in the fourth week of school.
That veteran Arkansas teacher—who teaches 20 students without an aide—has fought to integrate 30 minutes of “station time” into the literacy block, which includes “blocks, science, magnetic letters, play dough with letter stamps to practice words, books, and storytelling.” But the most controversial area of her classroom isn’t the blocks or the stamps; rather, it’s the “house station with dolls and toy food”—items her district tried to remove last year. The implication was clear: There’s no time for play in kindergarten anymore.