Are print books really about to disappear, overtaken like horse-drawn carriages in the age of Detroit and the Ford Model T? Truth is, nobody knows. Nobody ever really knows what the future is going to hold, no matter how sure they sound in their predictions.
Certainly, for all the fuss made about the Kindle, more than 95% of book buyers are still opting for the print version ... except, possibly, in the hot romance and erotic fiction categories. Earlier this year, Peter Smith, of IT World, noted that "of the top 10 bestsellers under the 'Multiformat' category [of Fictionwise ebooks sold], nine are tagged 'erotica' and the last is 'dark fantasy.'" That's only one list, but it's an interesting side-note that makes sense: just as with the internet and cable television, there's a particularly strong appeal to getting access to what Smith calls "salacious" content without having to face the check-out clerk with the goods in hand.
Nevertheless, the point remains that a growing number of readers are switching over to ebooks in one format or another. So beyond the basic question of "will print books go away" (which I personally doubt, but again, nobody really knows the answer to), the questions I find more intriguing relate to whether and how digital reading changes the reading experience and, perhaps, even the brains that do the reading.
Electronic readers like Kindle are too recent a development to have generated much specific, targeted research yet. But a montage of essays titled "Does the Brain Like Ebooks?" that appeared on the New York Times website this week offered some fascinating information and viewpoints on the subject. The collection had contributions from experts in English, neuroscience, child development, computer technology and informatics. And while the viewpoints differed, there was some general consensus about a few points:
1. Clearly, there are differences in the two reading experiences. There are things electronic books do better (access to new books in remote areas of the world, less lugging around, and easier searching for quotes or information after the fact). There are also things print books do better (footnote reading, the ability to focus solely on the text at hand, far away from any electronic distraction, and--oh, yeah--no battery or electronic glitch issues).
To those factors, I would add two more: First -- I think it's important to remember that Kindle doesn't actually give you a book. It gives you access to a book. For people who don't want to cart around old volumes or make multiple trips to the library, that might be considered a good thing. But at least one potential downside to this feature became painfully clear to many Kindle readers this summer when Amazon reached into its customers' Kindle libraries and took back two books for which the company realized it did not possess the copyright. Ironically, the books were by George Orwell -- including 1984, his book about the perils of centralized information control. Access goes both ways.
Second ... one of the writers of the Times essays, Prof. Alan Liu at the University of California, Santa Barbara, said that he didn't think anyone really made serendipitous discoveries while browsing the shelves of a physical library (so losing a physical library wouldn't be a loss, at least in that sense). Perhaps not, because most people go to libraries with specific search goals in mind. But bookstores, on the other hand ... there I'd disagree. I often browse the aisles of my local bookstores, just to see what's new and what might catch my eye. Most of the books I buy, in fact, are items I discovered while browsing ... something that, ironically, electronic "browsers" do not allow.
Browsing, to my way of thinking, is what I do in Filene's Bargain Basement. The clothes there are a jumbled mass. So even if you go in looking, potentially, for a shirt, you might end up with a pair of slacks that just happened to be hanging nearby. Same with a bookstore. Same, in fact, with the print version of the New York Times I get every morning. I scan the pages just seeing what might catch my eye to read. Sometimes it's a photo that catches my eye, sometimes it's a leading paragraph, sometimes it's a headline, and sometimes it's a callout. Or, sometimes, I'll be reading one article and another on that same page will catch my attention--one I never would have sought out on my own. And my knowledge and understanding of the world is far better and broader for all those serendipitous juxtapositions.
Electronic media and browsers have many good qualities, but they're lousy for that kind of unspecific window shopping. Browsers don't browse. They help you do specific searches. Looking for a black coat, or that article Sam Smith wrote two months ago on synthetic sneaker soles? The internet is great. Not sure what you want? Heaven help you. So to lose physical collections of books, either in stores or on individual bookshelves, would make it harder to make those delightful side discoveries that take us out of our narrow fields of focus and interest and, potentially, broaden our minds.
2. In the case of adults, we probably process information similarly in both electronic and print formats ... with two important distinctions. The first distinction is that electronic books, with hyperlinks and connections to a world web of side-roads, offer far more distractions to the reader. In doing a research paper, this can be useful. But it also offers temptations to divert our attention from a deeper immersion in a story or text that our brains are poorly equipped to resist. (Apparently we change tasks, on average, every three minutes when working in an internet-connected environment.)
"Frequent task-switching costs time and interferes with the concentration needed to think deeply about what you read," cautioned Sandra Aamodt, the former editor of Nature Neuroscience and another of the Times essayists.
The second feature of electronic reading, which may compound this first effect, is that there is evidently something about an electronic medium, with its "percentage done" scale and electronic noises or gizmos, that makes us crave and focus on those rewards. That is probably why electronic games are more addictive than board games. After a couple of rounds of solitaire with real cards, I'm done and ready to move on to something else. But I removed the solitaire software from my computer almost 20 years ago when I realized that once I started playing, I couldn't seem to tear myself away from it.
Does a heightened drive to make progress through a text impair our comprehension and, more importantly, what Proust apparently called "the heart of reading"--"when we go beyond the author's wisdom and enter the beginning of our own," as one of the essayists put it? If so, that would be a bad thing, and it seems a point worth studying further.
3. Most adults, however, at least have the ability to process longer and deeper contemplative thoughts from what we read, even if we don't always exercise that ability. But according to Maryanne Wolf, a cognitive neuroscientist and child development specialist at Tufts University, that ability to focus attention deeply and for a concerted length of time is learned, not innate. Children apparently have to develop neural pathways and circuits for reading, and those circuits are affected by the demands of the reading material. Chinese children learning a more symbolic and visual language, for instance, develop different circuits than English-speaking children.
So electronic reading, especially with hyperlinks and video embeds and other potential distractions, could keep young readers from developing some important circuits. As Wolf put it in her essay:
"My greatest concern is that the young brain will never have the time (in milliseconds or in hours or in years) to learn to go deeper into the text after the first decoding, but rather will be pulled by the medium to ever more distracting information, sidebars, and now, perhaps videos (in the new vooks). The child's imagination and children's nascent sense of probity and introspection are no match for a medium that creates a sense of urgency to get to the next piece of stimulating information. The attention span of children may be one of the main reasons why an immersion in on-screen reading is so engaging, and it may also be why digital reading may ultimately prove antithetical to the long-in-development, reflective nature of the expert reading brain as we know it."
Interestingly enough, the one computer scientist in the group was of the opinion that the best use of electronic books and capabilities was to enhance print books, not to replace them. But it's all interesting food for thought ... and, hopefully, more research as electronic readers find their way into more households and hands.