Honesty seems like such a no-brainer of a requirement. But it's caused a great deal of controversy in Canada over the past few weeks--controversy heightened by the upcoming launch of a new, politically conservative Canadian television channel called Sun TV.
A Licensee shall not broadcast ... d) false or misleading news.
At first glance, it seems such an obvious, common-sense requirement that I was a little surprised that the Canadians had felt a need to put it in writing, or that anyone could possibly argue against it. But with a little more thought, I realized how profound the stricture really was. I also began to wonder why we don't have a similar requirement here in the U.S.--and how different our public discourse might be if we did.
The controversy over the Canadian rule erupted in January, when the Canadian Radio-television and Telecommunications Commission (CRTC), Canada's equivalent to our FCC, proposed amending the rule to prohibit only:
...any news that the licensee knows to be false or misleading and that endangers or is likely to endanger the lives, health or safety of the public.
The root of the proposed amendment apparently goes back 10 years to a Canadian Supreme Court ruling that affirmed the free speech right of a Holocaust denier named Ernst Zundel to espouse those views. The Canadian Joint Parliamentary Committee on the Scrutiny of Regulations subsequently asked the CRTC to review its "false and misleading news" prohibition to determine if it violated free-speech guarantees.
The CRTC dragged its feet for 10 years. But then, this January, the proposed amendment was announced. Why the sudden action after 10 years of inaction? That's part of the controversy. The CRTC chairman says they were ordered to do it by the regulatory committee, but one of the committee co-chairmen says that's not true.
The controversy was also heightened by the impending launch of a new, privately owned Canadian television station called Sun TV, now scheduled to go on-air April 18th. Sun TV is owned by Quebecor, the same company that owns the Toronto Sun tabloid newspaper, which has a reputation as a right-wing publication. The station is being promoted as a feisty, "controversially Canadian, hard-news" television version of the paper (according to Quebecor's president) and an outlet that will "take on mainstream media" (according to its vice president).
Critics accused the CRTC of looking to change the rules to give Sun TV more leeway in what it broadcasts. But both the CRTC and the parliamentary committee deny any connection between the two events. And it is true that the committee had been requesting a review of the rule for a decade. In any event, a huge public outcry ensued, and the parliamentary committee finally looked into the matter itself and concluded that a broadcast station did not have the same rights and freedoms as an individual and, further, that a broadcasting license was a privilege, not a right. The committee pointed out that stations already had to comply with numerous restrictions and conditions to get and maintain their licenses, including limits on the content of their broadcasts. Consequently, the CRTC withdrew its proposed amendment. Canada will continue to require stations to refrain from broadcasting "false or misleading news."
Or, at least, the rule will remain on the books. Apparently, the CRTC has never actually taken any action against a station pursuant to that rule. One of the arguments for the amendment, in fact, was that the CRTC lacked enforcement capability, and had never enforced the rule anyway. But the CRTC does have the ability to revoke a station's license--which might give a station owner at least a little pause before allowing its on-air talent to present unsupported theories as fact or get too overzealous in their conclusions or spin on the news.
But the question remains ... why don't we have a similar requirement here in the U.S.? Traditionally, both broadcast radio and television and cable television stations have been subject to regulation, including content regulation, by the FCC. Although that regulation originated from the fact that airwaves were extremely limited, and not accessible to everyone, the regulation continued even after the birth and expansion of cable television, because courts recognized that television and radio are "uniquely pervasive" in people's lives, in a way print media are not. Indecent speech is already prohibited on broadcast television and, at least in theory, on cable (although courts' opinions on the best remedies for enforcing that goal seem to vary). Until its repeal in 1987, both broadcast and cable stations were subject to the "Fairness Doctrine," which required the stations to present a balance of both sides of any controversial issue.
So given that we've long recognized that a broadcaster or cablecaster has power beyond an individual citizen or even print media, and therefore does not warrant quite the same "free speech" or "free press" rights without restriction (as the Canadian parliament just concluded) ... why can't we have a restriction on broadcasting (or cablecasting) false or misleading news?
One reason is probably the same reason the Fairness Doctrine no longer exists. It's laughable now, with the explosion of narrow-interest fringe websites and narrow-audience, right-wing and left-wing cable shows on Fox News and MSNBC, but in the deregulation atmosphere of the 1980s, the FCC's rationale for getting rid of the Fairness Doctrine was twofold: first, that the Fairness Doctrine inhibited the broadcasters' right to free speech, and second, that the free market was a better regulator of news content on television than the government. Specifically, the FCC said that individual media outlets would compete with each other for viewers, and that competition would necessarily involve establishing the accuracy, credibility, reliability and thoroughness of each story ... and that over time, the public would weed out news providers that proved to be inaccurate, unreliable, one-sided, or incredible.
One wonders, really, if the FCC had ever studied human behavior or the desire of people to have their individual points of view validated. Far from "weeding out" providers of one-sided, or even incredible information, we now revel in what New York Times columnist Nicholas Kristof once called "The Daily Me"--a selection of news outlets that never ever challenge our particular points of view.
Contrary to the FCC's theory, our particular public seems to reward, rather than punish, outrageous or one-sided news providers. And while that may make each of us feel nice and righteous as we pick and choose our news broadcasters and commentators, one would be hard-pressed to argue that it enhances the quality of our public--or even our personal--discourse. Especially given the questionable "truth" of many of the statements or inferences made on those highly targeted outlets. In theory, we could all fact-check everything we hear on the TV or radio, of course. But few people have the time to do that, even if they had the contacts or resources.
But forget about the Fairness Doctrine. Imagine, instead, if all those broadcasters were simply prohibited from broadcasting (or cablecasting) "false or misleading news." Is it unacceptable censorship to require someone to be basically honest in what they broadcast as "news"--and which we are more likely to accept as truth, because it comes from a serious and authoritative-sounding news anchor?
Think about it. We prohibit people from lying in court, because the consequences of those lies are serious. That's a form of censorship of free speech, but one we accept quite willingly. And while the consequences of what we hear on television and radio are not as instantly severe as in a court case, one could argue that the damage widely disseminated false information does to the goal of a well-informed public and a working, thriving democracy is significant as well. What's more, if we really thought everyone had the right to say whatever they wanted, regardless of truth or consequences, we wouldn't prohibit anyone from yelling "fire" in a crowded theatre that wasn't actually on fire. We wouldn't have slander or libel laws. We wouldn't have laws about hate speech. And we'd allow broadcasters and cablecasters to air all words and all images, no matter how indecent, at all times.
Ah. But what if a broadcaster or cablecaster didn't know the information was false? I suppose you could prohibit only knowingly airing false or misleading information. But on the other hand, if a station were at risk of sanction or license revocation for getting it wrong (even if the FCC rarely enforced the measure), it might motivate reporters and anchors to do a bit more fact checking--and even, perhaps, a bit more research into alternative viewpoints--before seizing on and running with a hot or juicy scoop or angle.
It's odd, really, that the idea of requiring news broadcasters to be fundamentally honest about the information they project across the nation and into our homes sounds radical. Surely we wouldn't argue that we want to be lied to and misled, would we?
Today’s empires are born on the web, and exert tremendous power in the material world.
Mark Zuckerberg hasn’t had the best week.
First, Facebook’s Free Basics platform was effectively banned in India. Then, a high-profile member of Facebook’s board of directors, the venture capitalist Marc Andreessen, sounded off about the decision to his nearly half-a-million Twitter followers with a stunning comment.
“Anti-colonialism has been economically catastrophic for the Indian people for decades,” Andreessen wrote. “Why stop now?”
After that, the Internet went nuts.
Andreessen deleted his tweet, apologized, and underscored that he is “100 percent opposed to colonialism” and “100 percent in favor of independence and freedom.” Zuckerberg, Facebook’s CEO, followed up with his own Facebook post to say Andreessen’s comment was “deeply upsetting” to him, and not representative of the way he thinks “at all.”
By announcing the first detection of gravitational waves, scientists have vindicated Einstein and given humans a new way to look at the universe.
More than a billion years ago, in a galaxy that sits more than a billion light-years away, two black holes spiraled together and collided. We can’t see this collision, but we know it happened because, as Albert Einstein predicted a century ago, gravitational waves rippled out from it and traveled across the universe to an ultra-sensitive detector here on Earth.
This discovery, announced today by researchers with the Laser Interferometer Gravitational-wave Observatory (LIGO), marks another triumph for Einstein’s general theory of relativity. And more importantly, it marks the beginning of a new era in the study of the universe: the advent of gravitational-wave astronomy. The universe has just become a much more interesting place.
The number of American teens who excel at advanced math has surged. Why?
On a sultry evening last July, a tall, soft-spoken 17-year-old named David Stoner and nearly 600 other math whizzes from all over the world sat huddled in small groups around wicker bistro tables, talking in low voices and obsessively refreshing the browsers on their laptops. The air in the cavernous lobby of the Lotus Hotel Pang Suan Kaew in Chiang Mai, Thailand, was humid, recalls Stoner, whose light South Carolina accent warms his carefully chosen words. The tension in the room made it seem especially heavy, like the atmosphere at a high-stakes poker tournament.
Stoner and five teammates were representing the United States in the 56th International Mathematical Olympiad. They figured they’d done pretty well over the two days of competition. God knows, they’d trained hard. Stoner, like his teammates, had endured a grueling regime for more than a year—practicing tricky problems over breakfast before school and taking on more problems late into the evening after he completed the homework for his college-level math classes. Sometimes, he sketched out proofs on the large dry-erase board his dad had installed in his bedroom. Most nights, he put himself to sleep reading books like New Problems in Euclidean Geometry and An Introduction to Diophantine Equations.
By mining electronic medical records, scientists show the lasting legacy of prehistoric sex on modern humans’ health.
Modern humans originated in Africa, and started spreading around the world about 60,000 years ago. As they entered Asia and Europe, they encountered other groups of ancient humans that had already settled in these regions, such as Neanderthals. And sometimes, when these groups met, they had sex.
We know about these prehistoric liaisons because they left permanent marks on our genome. Even though Neanderthals are now extinct, every living person outside of Africa can trace between 1 and 5 percent of their DNA back to them. (I am 2.6 percent Neanderthal, if you were wondering, which pales in comparison to my colleague James Fallows at 5 percent.)
This lasting legacy was revealed in 2010 when the complete Neanderthal genome was published. Since then, researchers have been trying to figure out what, if anything, the Neanderthal sequences are doing in our own genome. Are they just passive hitchhikers, or did they bestow important adaptations on early humans? And are they affecting the health of modern ones?
If Bernie Sanders is serious about a political transformation in America, he needs a better plan.
If there’s one thing that fires up Bernie Sanders supporters—and makes his detractors roll their eyes—it’s his call for a “political revolution.” To his base, it’s the very point of his anti-establishment, anti-elite candidacy. To his critics, it’s the very embodiment of his campaign’s naïve impracticality and vagueness.
But now that voters in Iowa and New Hampshire have spoken, it’s time to take the idea of political revolution more seriously—more seriously, indeed, than Sanders himself appears to have. It’s time to ask: What exactly would it take?
It starts with Congress. And here it’s instructive to compare Sanders and Donald Trump. Both rely on broad, satisfying refrains of “We’re gonna”: We’re gonna break up the big banks. We’re gonna make Mexico build the wall. We’re gonna end the rule of Wall Street billionaires. We’re gonna make China stop ripping us off.
Once it was because they weren’t as well educated. What’s holding them back now?
Though headway has been made in bringing women's wages more in line with men's in the past several decades, that convergence seems to have stalled in more recent years. To help determine why, Francine D. Blau and Lawrence M. Kahn, the authors of a new study from the National Bureau of Economic Research, parse data on wages and occupations from 1980 to 2010. They find that as more women attended and graduated college and headed into the working world, education and professional experience levels stopped playing a significant role in the difference between men's and women's wages. Whatever remains of the discrepancy can't be explained by women not having basic skills and credentials. So what does explain it?
When four American women were murdered during El Salvador’s dirty war, a young U.S. official and his unlikely partner risked their lives to solve the case.
On December 1, 1980, two American Catholic churchwomen—an Ursuline nun and a lay missionary—sat down to dinner with Robert White, the U.S. ambassador to El Salvador. They worked in rural areas ministering to El Salvador’s desperately impoverished peasants, and White admired their commitment and courage. The talk turned to the government’s brutal tactics for fighting the country’s left-wing guerrillas, in a dirty war waged by death squads that dumped bodies in the streets and an army that massacred civilians. The women were alarmed by the incoming Reagan administration’s plans for a closer relationship with the military-led government. Because of a curfew, the women spent the night at the ambassador’s residence. The next day, after breakfast with the ambassador’s wife, they drove to San Salvador’s international airport to pick up two colleagues who were flying back from a conference in Nicaragua. Within hours, all four women would be dead.
After a pair of poor showings in New Hampshire, Chris Christie and Carly Fiorina drop out of the race.
The Republican race is headed to South Carolina with two fewer candidates. The day after finishing sixth and seventh in the New Hampshire primaries, New Jersey Governor Chris Christie and former Hewlett-Packard CEO Carly Fiorina announced on Wednesday that they were suspending their campaigns.
Fiorina was always a long shot—she was practically a political newcomer, having only run one unsuccessful Senate campaign. And while her record at HP was vulnerable to attack, Republican figures saw in her both private-sector experience and a woman who could counter Hillary Clinton’s monopoly on a “historic” woman’s candidacy. While many political professionals sniffed at Fiorina’s candidacy, remembering that 2010 Senate race, she broke out after a commanding performance in the undercard to the first Republican debate. That earned her a promotion to the main stage at the next debate, where she scored another victory. But it was all downhill from there. Dogged by questions of honesty and unable to earn media attention, her campaign faded quickly.
The hit new indie release is the opposite of action-packed, yet it’s compelling in its simplicity.
Solitude, it turns out, can be addictive. So I learned playing the new hit indie game Firewatch, where all the action amounts to you, the player, being alone in the woods. You’re a lookout assigned to a summer posting in the Shoshone National Forest of Wyoming in 1989, meaning your job consists of nothing more than wandering around, clearing brush, and calling in any fires you might spot. Most video games equip you with tools and weapons, complex missions, and action sequences. All Firewatch gives you is a map, a compass, and a walkie-talkie—but it’s still one of the most compelling video games I’ve ever played.
It’s the latest in a quiet movement of more psychological video games, ones that tap into the atmosphere and wonder of loneliness rather than the simpler thrills the medium usually provides. It’s tempting to trace this trend’s origins back to Minecraft, which launched in 2009 and became a worldwide phenomenon on the back of its extraordinary simplicity. But in Minecraft, you start armed only with your bare hands in a world of monsters, and can eventually upgrade into a city-builder armed with powerful tools. Firewatch is a more intimate affair: a short story, playable over a few hours, that succeeds first and foremost as an emotional experience.