Honesty seems like such a no-brainer of a requirement. But it's caused a great deal of controversy in Canada over the past few weeks--controversy heightened by the upcoming launch of a new, politically conservative Canadian television channel called Sun TV.
A Licensee shall not broadcast ... d) false or misleading news.
At first glance, it seems such an obvious, common-sense requirement that I was a little surprised that the Canadians had felt a need to put it in writing, or that anyone could possibly argue against it. But with a little more thought, I realized how profound the stricture really was. I also began to wonder why we don't have a similar requirement here in the U.S.--and how different our public discourse might be if we did.
The controversy over the Canadian rule erupted in January, when the Canadian Radio-television and Telecommunications Commission (CRTC), Canada's equivalent to our FCC, proposed amending the rule to prohibit only:
...any news that the licensee knows to be false or misleading and that endangers or is likely to endanger the lives, health or safety of the public.
The root of the proposed amendment apparently goes back 10 years to a Canadian Supreme Court ruling that affirmed the free speech right of a Holocaust denier named Ernst Zundel to espouse those views. The Canadian Joint Parliamentary Committee on the Scrutiny of Regulations subsequently asked the CRTC to review its "false and misleading news" prohibition to determine if it violated free-speech guarantees.
The CRTC dragged its feet for 10 years. But then, this January, the proposed amendment was announced. Why the sudden action after 10 years of inaction? That's part of the controversy. The CRTC chairman says they were ordered to do it by the regulatory committee, but one of the committee co-chairmen says that's not true.
The controversy was also heightened by the impending launch of a new, privately owned Canadian television station called Sun TV, now scheduled to go on-air April 18th. Sun TV is owned by Quebecor, the same company that owns the Toronto Sun tabloid newspaper, which has a reputation as a right-wing publication. The station is being promoted as a feisty, "controversially Canadian, hard-news" television version of the paper (according to Quebecor's president) and an outlet that will "take on mainstream media" (according to its vice president).
Critics accused the CRTC of looking to change the rules to give Sun TV more leeway in what it broadcasts. But both the CRTC and the parliamentary committee deny any connection between the two events. And it is true that the committee had been requesting a review of the rule for a decade. In any event, a huge public outcry ensued, and the parliamentary committee finally looked into the matter itself and concluded that a broadcast station did not have the same rights and freedoms as an individual and, further, that a broadcasting license was a privilege, not a right. The committee pointed out that stations already had to comply with numerous restrictions and conditions to get and maintain their licenses, including limits on the content of their broadcasts. Consequently, the CRTC withdrew its proposed amendment. Canada will continue to require stations to refrain from broadcasting "false or misleading news."
Or, at least, the rule will remain on the books. Apparently, the CRTC has never actually taken any action against a station pursuant to that rule. One of the arguments for the amendment, in fact, was that the CRTC lacked enforcement capability, and had never enforced the rule anyway. But the CRTC does have the ability to revoke a station's license--which might give a station owner at least a little pause before allowing its on-air talent to present unsupported theories as fact or get too overzealous in their conclusions or spin on the news.
But the question remains ... why don't we have a similar requirement here in the U.S.? Traditionally, both broadcast radio and television and cable television stations have been subject to regulation, including content regulation, by the FCC. Although that regulation originated from the fact that airwaves were extremely limited, and not accessible to everyone, the regulation continued even after the birth and expansion of cable television, because courts recognized that television and radio are "uniquely pervasive" in people's lives, in a way print media are not. Indecent speech is already prohibited on broadcast television and, at least in theory, on cable (although courts' opinions on the best remedies for enforcing that goal seem to vary). And until its repeal in 1987, both broadcast and cable stations were subject to the "Fairness Doctrine," which required them to present both sides of any controversial issue.
So given that we've long recognized that a broadcaster or cablecaster has power beyond an individual citizen or even print media, and therefore does not warrant quite the same "free speech" or "free press" rights without restriction (as the Canadian parliament just concluded) ... why can't we have a restriction on broadcasting (or cablecasting) false or misleading news?
One reason is probably the same reason the Fairness Doctrine no longer exists. It's laughable now, with the explosion of narrow-interest fringe websites and narrow-audience, right-wing and left-wing cable shows on Fox News and MSNBC, but in the deregulation atmosphere of the 1980s, the FCC's rationale for getting rid of the Fairness Doctrine was twofold: first, that the Fairness Doctrine inhibited the broadcasters' right to free speech, and second, that the free market was a better regulator of news content on television than the government. Specifically, the FCC said that individual media outlets would compete with each other for viewers, and that competition would necessarily involve establishing the accuracy, credibility, reliability and thoroughness of each story ... and that over time, the public would weed out news providers that proved to be inaccurate, unreliable, one-sided, or incredible.
One wonders, really, if the FCC had ever studied human behavior or the desire of people to have their individual points of view validated. Far from "weeding out" providers of one-sided, or even incredible information, we now revel in what New York Times columnist Nicholas Kristof once called "The Daily Me"--a selection of news outlets that never ever challenge our particular points of view.
Contrary to the FCC's theory, our particular public seems to reward, rather than punish, outrageous or one-sided news providers. And while that may make each of us feel nice and righteous as we pick and choose our news broadcasters and commentators, one would be hard-pressed to argue that it enhances the quality of our public--or even our personal--discourse. Especially given the questionable "truth" of many of the statements or inferences made on those highly targeted outlets. In theory, we could all fact-check everything we hear on the TV or radio, of course. But few people have the time to do that, even if they have the contacts or resources.
But forget about the Fairness Doctrine. Imagine, instead, if all those broadcasters were simply prohibited from broadcasting (or cablecasting) "false or misleading news." Is it unacceptable censorship to require someone to be basically honest in what they broadcast as "news"--and which we are more likely to accept as truth, because it comes from a serious and authoritative-sounding news anchor?
Think about it. We prohibit people from lying in court, because the consequences of those lies are serious. That's a form of censorship of free speech, but one we accept quite willingly. And while the consequences of what we hear on television and radio are not as instantly severe as in a court case, one could argue that the damage widely-disseminated false information does to the goal of a well-informed public and a working, thriving democracy is significant, as well. What's more, if we really thought everyone had the right to say whatever they wanted, regardless of truth or consequences, we wouldn't prohibit anyone from yelling "fire" in a crowded theatre that wasn't actually on fire. We wouldn't have slander or libel laws. We wouldn't have laws about hate speech. And we'd allow broadcasters and cablecasters to air all words and all images, no matter how indecent, at all times.
Ah. But what if a broadcaster or cablecaster didn't know the information was false? I suppose you could prohibit only knowingly airing false or misleading information. But on the other hand, if a station were at risk for sanction or a license revocation for getting it wrong (even if the FCC rarely enforced the measure), it might motivate reporters and anchors to do a bit more fact checking--and even, perhaps, a bit more research into alternative viewpoints--before seizing on and running with a hot or juicy scoop or angle.
It's odd, really, that the idea of requiring news broadcasters to be fundamentally honest about the information they project across the nation and into our homes sounds radical. Surely we wouldn't argue that we want to be lied to and misled, would we?
“Here is what I would like for you to know: In America, it is traditional to destroy the black body—it is heritage.”
Last Sunday the host of a popular news show asked me what it meant to lose my body. The host was broadcasting from Washington, D.C., and I was seated in a remote studio on the Far West Side of Manhattan. A satellite closed the miles between us, but no machinery could close the gap between her world and the world for which I had been summoned to speak. When the host asked me about my body, her face faded from the screen, and was replaced by a scroll of words, written by me earlier that week.
The host read these words for the audience, and when she finished she turned to the subject of my body, although she did not mention it specifically. But by now I am accustomed to intelligent people asking about the condition of my body without realizing the nature of their request. Specifically, the host wished to know why I felt that white America’s progress, or rather the progress of those Americans who believe that they are white, was built on looting and violence. Hearing this, I felt an old and indistinct sadness well up in me. The answer to this question is the record of the believers themselves. The answer is American history.
The Islamic State is no mere collection of psychopaths. It is a religious group with carefully considered beliefs, among them that it is a key agent of the coming apocalypse. Here’s what that means for its strategy—and for how to stop it.
What is the Islamic State?
Where did it come from, and what are its intentions? The simplicity of these questions can be deceiving, and few Western leaders seem to know the answers. In December, The New York Times published confidential comments by Major General Michael K. Nagata, the Special Operations commander for the United States in the Middle East, admitting that he had hardly begun figuring out the Islamic State’s appeal. “We have not defeated the idea,” he said. “We do not even understand the idea.” In the past year, President Obama has referred to the Islamic State, variously, as “not Islamic” and as al-Qaeda’s “jayvee team,” statements that reflected confusion about the group, and may have contributed to significant strategic errors.
In 1992, the neuroscientist Richard Davidson got a challenge from the Dalai Lama. By that point, he’d spent his career asking why people respond to, in his words, “life’s slings and arrows” in different ways. Why are some people more resilient than others in the face of tragedy? And is resilience something you can gain through practice?
The Dalai Lama had a different question for Davidson when he visited the Tibetan Buddhist spiritual leader at his residence in Dharamsala, India. “He said: ‘You’ve been using the tools of modern neuroscience to study depression, and anxiety, and fear. Why can’t you use those same tools to study kindness and compassion?’ … I did not have a very good answer. I said it was hard.”
New data shows that students whose parents make less money pursue more “useful” subjects, such as math or physics.
In 1780, John Adams wrote a letter to his wife, Abigail, in which he laid out his plans for what his children and grandchildren would devote their lives to. Having himself taken the time to master “Politicks and War,” two revolutionary necessities, Adams hoped his children would go into disciplines that promoted nation-building, such as “mathematicks,” “navigation,” and “commerce.” His plan was that in turn, those practical subjects would give his children’s children room “to study painting, poetry, musick, architecture, statuary, tapestry, and porcelaine.”
Two hundred and thirty-five years later, this progression--"from warriors to dilettantes," in the words of the literary scholar Geoffrey Galt Harpham--plays out much as Adams hoped it would: Once financial concerns have been covered by their parents, children have more latitude to study less pragmatic things in school. Kim Weeden, a sociologist at Cornell, looked at National Center for Education Statistics data for me after I asked her about this phenomenon, and her analysis revealed that, yes, the amount of money a college student's parents make does correlate with what that person studies. Kids from lower-income families tend toward "useful" majors, such as computer science, math, and physics. Those whose parents make more money flock to history, English, and performing arts.
Most adults can’t remember much of what happened to them before age 3 or so. What happens to the memories formed in those earliest years?
My first memory is of the day my brother was born: November 14, 1991. I can remember my father driving my grandparents and me over to the hospital in Highland Park, Illinois, that night to see my newborn brother. I can remember being taken to my mother’s hospital room, and going to gaze upon my only sibling in his bedside cot. But mostly, I remember what was on the television. It was the final two minutes of a Thomas the Tank Engine episode. I can even remember the precise story: “Percy Takes the Plunge,” which feels appropriate, given that I too was about to recklessly throw myself into the adventure of being a big brother.
In sentimental moments, I’m tempted to say my brother’s birth is my first memory because it was the first thing in my life worth remembering. There could be a sliver of truth to that: Research into the formation and retention of our earliest memories suggests that people’s memories often begin with significant personal events, and the birth of a sibling is a textbook example. But it was also good timing. Most people’s first memories date to when they were about 3.5 years old, and that was my age, almost to the day, when my brother was born.
Gentrification is pushing long-term residents out of urban neighborhoods. Can collective land ownership keep prices down permanently?
AUSTIN, Tex.--Not long ago, inner cities were riddled with crime and blight, and affluent white residents high-tailed it to the suburbs, seeking better schools, safer streets, and, in some cases, fewer minority neighbors.
But today, as affluent white residents return to center cities, people who have lived there for years are finding they can’t afford to stay.
Take the case of the capital city of Texas, where parts of East Austin, right next to downtown, are in the process of becoming whiter, and hip restaurants, coffee shops, and even a bar catering to bicyclists are opening. Much of Austin's minority population, meanwhile, is priced out, and so they're moving to far-out suburbs such as Pflugerville and Round Rock, where rents are affordable and commutes are long.
The singer’s violent revenge fantasy was intended to provoke outrage, and to get people to talk about her. It succeeds on both counts.
Of all the scandalized reactions to Rihanna’s music video for “Bitch Better Have My Money,” my favorite comes, as is not surprising for this sort of thing, from the Daily Mail. Labelling herself in the headline as a “concerned parent” (a term to transport one to the days of Tipper Gore’s crusade against lyrics if there ever was one), Sarah Vine opens her column by talking at length about how so very, very reluctant she was to watch Rihanna’s new clip. Then she basically goes frame-by-frame through the video, recounting her horror at what unfolds. “By the time it had finished, I wondered whether I ought not to report [Rihanna] to the police,” Vine writes. “Charges: pornography, incitement to violence, racial hatred.”
For centuries, experts have predicted that machines would make workers obsolete. That moment may finally be arriving. Could that be a good thing?
1. Youngstown, U.S.A.
The end of work is still just a futuristic concept for most of the United States, but it is something like a moment in history for Youngstown, Ohio, one its residents can cite with precision: September 19, 1977.
For much of the 20th century, Youngstown’s steel mills delivered such great prosperity that the city was a model of the American dream, boasting a median income and a homeownership rate that were among the nation’s highest. But as manufacturing shifted abroad after World War II, Youngstown steel suffered, and on that gray September afternoon in 1977, Youngstown Sheet and Tube announced the shuttering of its Campbell Works mill. Within five years, the city lost 50,000 jobs and $1.3 billion in manufacturing wages. The effect was so severe that a term was coined to describe the fallout: regional depression.
The unwillingness of the former secretary of state to take questions from the press contrasts sharply with Jeb Bush’s marked affinity for public disclosure.
Howard Kurtz reported on Sunday night that the Hillary Clinton campaign has decided to open itself to more press interviews. Kurtz quoted the campaign’s communications director, Jennifer Palmieri: “By not doing national interviews until now, Palmieri concedes, ‘we’re sacrificing the coverage. We’re paying a price for it.’”
Meanwhile, Jeb Bush chatted July 2 with the conservative website the Daily Caller. The Daily Caller interview broke an unusually protracted no-interview period for Bush. It had been more than two weeks since he appeared on The Tonight Show with Jimmy Fallon. Bush spoke that same day, June 17, to Sean Hannity's radio show and ABC News. Five days earlier, he'd spoken to Germany's Der Spiegel--altogether, five interviews in the month of June. That brought his total, since the beginning of February, to 39, according to the Bush campaign.*