Are print books really about to disappear, overtaken like horse-drawn carriages in the age of Detroit and the Ford Model T? Truth is, nobody knows. Nobody ever really knows what the future is going to hold, no matter how sure they sound in their predictions.
Certainly, for all the fuss made about the Kindle, more than 95% of book buyers are still opting for the print version ... except, possibly, in the hot romance and erotic fiction categories. Earlier this year, Peter Smith of IT World noted that "of the top 10 bestsellers under the 'Multiformat' category [of Fictionwise ebooks sold], nine are tagged 'erotica' and the last is 'dark fantasy.'" That's only one list, but it's an interesting side note that makes sense: just as with the internet and cable television, there's a particularly strong appeal to getting access to what Smith calls "salacious" content without having to face the check-out clerk with the goods in hand.
Nevertheless, the point remains that a growing number of readers are switching to ebooks in one format or another. So beyond the basic question of whether print books will go away (which I personally doubt, but again, nobody really knows), the questions I find more intriguing are whether and how digital reading changes the reading experience and, perhaps, even the brains that do the reading.
Electronic readers like the Kindle are too recent a development to have generated much specific, targeted research yet. But a collection of essays titled "Does the Brain Like E-Books?" appeared on the New York Times website this week, offering some fascinating information and viewpoints on the subject. The contributors were experts in English, neuroscience, child development, computer technology, and informatics. And while their viewpoints differed, there was general consensus on a few points:
1. Clearly, there are differences between the two reading experiences. There are things electronic books do better: access to new books in remote areas of the world, less lugging around, and easier searching for quotes or information after the fact. There are also things print books do better: footnote reading; the ability to focus solely on the text at hand, far from any electronic distraction; and, oh yeah, no battery or electronic-glitch issues.
To those factors, I would add two more. First, I think it's important to remember that Kindle doesn't actually give you a book. It gives you access to a book. For people who don't want to cart around old volumes or make multiple trips to the library, that might be considered a good thing. But at least one potential downside of that arrangement became painfully clear to many Kindle readers this summer, when Amazon reached into its customers' Kindle libraries and took back two books it realized it did not have the rights to sell. Ironically, the books were by George Orwell, including 1984, his novel about the perils of centralized information control. Access goes both ways.
Second, one of the Times essayists, Prof. Alan Liu of the University of California, Santa Barbara, said he didn't think anyone really made serendipitous discoveries while browsing the shelves of a physical library (so losing physical libraries wouldn't be a loss, at least in that sense). Perhaps not, since most people go to libraries with specific search goals in mind. But bookstores are another matter; there I'd disagree. I often browse the aisles of my local bookstores just to see what's new and what might catch my eye. Most of the books I buy, in fact, are ones I discovered while browsing ... something that, ironically, electronic "browsers" do not allow.
Browsing, to my way of thinking, is what I do in Filene's Bargain Basement. The clothes there are a jumbled mass, so even if you go in looking for a shirt, you might come out with a pair of slacks that just happened to be hanging nearby. The same goes for a bookstore. The same, in fact, for the print version of the New York Times I get every morning. I scan the pages to see what might catch my eye: sometimes a photo, sometimes a lead paragraph, sometimes a headline, sometimes a callout. Or sometimes I'll be reading one article and another on the same page will catch my attention, one I never would have sought out on my own. And my knowledge and understanding of the world are far better and broader for all those serendipitous juxtapositions.
Electronic media and browsers have many good qualities, but they're lousy for that kind of unspecific window shopping. Browsers don't browse; they help you run specific searches. Looking for a black coat, or that article Sam Smith wrote two months ago on synthetic sneaker soles? The internet is great. Not sure what you want? Heaven help you. So losing physical collections of books, whether in stores or on individual bookshelves, would make it harder to stumble onto those delightful side discoveries that take us out of our narrow fields of focus and interest and, potentially, broaden our minds.
2. In the case of adults, we probably process information similarly in electronic and print formats ... with two important distinctions. The first is that electronic books, with hyperlinks and connections to a world-wide web of side roads, offer far more distractions to the reader. When researching a paper, that can be useful. But it also tempts us to divert our attention from deeper immersion in a story or text, a temptation our brains are poorly equipped to resist. (Apparently we change tasks, on average, every three minutes when working in an internet-connected environment.)
"Frequent task-switching costs time and interferes with the concentration needed to think deeply about what you read," cautioned Sandra Aamodt, the former editor of Nature Neuroscience and another of the Times essayists.
The second feature of electronic reading, which may compound the first, is that there is evidently something about an electronic medium, with its "percentage done" scale and electronic noises and gizmos, that makes us crave and fixate on those little rewards. That's probably why electronic games are more addictive than board games. After a couple of rounds of solitaire with real cards, I'm done and ready to move on to something else. But I removed the solitaire software from my computer almost 20 years ago, when I realized I couldn't seem to tear myself away from it once I started playing.
Is our comprehension, and, more importantly, what Proust apparently called "the heart of reading" ("when we go beyond the author's wisdom and enter the beginning of our own," as one of the essayists put it), impaired by a heightened drive to make progress through a text? If so, that would be a bad thing, which makes it a point worth studying further.
3. Most adults, however, at least have the ability to process longer and deeper contemplative thoughts from what they read, even if they don't always exercise that ability. But according to Maryanne Wolf, a cognitive neuroscientist and child-development specialist at Tufts University, the ability to focus attention deeply and for a concerted length of time is learned, not innate. Children apparently have to develop neural pathways and circuits for reading, and those circuits are shaped by the demands of the reading material. Chinese children learning a more symbolic and visual writing system, for instance, develop different circuits than English-speaking children do.
So electronic reading, especially with hyperlinks, video embeds, and other distractions, could keep young readers from developing some important circuits. As Wolf put it in her essay:
"My greatest concern is that the young brain will never have the time (in milliseconds or in hours or in years) to learn to go deeper into the text after the first decoding, but rather will be pulled by the medium to ever more distracting information, sidebars, and now, perhaps videos (in the new vooks). The child's imagination and children's nascent sense of probity and introspection are no match for a medium that creates a sense of urgency to get to the next piece of stimulating information. the attention span of children may be one of the main reasons why an immersion in on-screen reading is so engaging, and it may also be why digital reading may ultimately prove antithetical to the long-in-development, reflective nature of the expert reading brain as we know it."
Interestingly enough, the one computer scientist in the group was of the opinion that the best use of electronic books and capabilities was to enhance print books, not to replace them. But it's all good food for thought ... and, hopefully, fodder for more research as electronic readers find their way into more households and hands.