Are print books really about to disappear, overtaken like horse-drawn carriages in the age of Detroit and the Ford Model T? Truth is, nobody knows. Nobody ever really knows what the future is going to hold, no matter how sure they sound in their predictions.
Certainly, for all the fuss made about the Kindle, more than 95% of book buyers are still opting for the print version ... except, possibly, in the hot romance and erotic fiction categories. Earlier this year, Peter Smith of ITworld noted that "of the top 10 bestsellers under the 'Multiformat' category [of Fictionwise ebooks sold], nine are tagged 'erotica' and the last is 'dark fantasy.'" That's only one list, but it's an interesting side note that makes sense: just as with the internet and cable television, there's a particularly strong appeal to getting access to what Smith calls "salacious" content without having to face the check-out clerk with the goods in hand.
Nevertheless, the point remains that a growing number of readers are switching over to ebooks in one format or another. So beyond the basic question of "will print books go away" (which I personally doubt, though again, nobody really knows), the questions I find more intriguing relate to whether and how digital reading changes the reading experience and, perhaps, even the brains that do the reading.
Electronic readers like Kindle are too recent a development to have generated much specific, targeted research yet. But a collection of essays titled "Does the Brain Like Ebooks?" that appeared on the New York Times website this week offered some fascinating information and viewpoints on the subject. The contributors included experts in English, neuroscience, child development, computer technology and informatics. And while their viewpoints differed, there was general agreement on a few points:
1. Clearly, there are differences in the two reading experiences. There are things electronic books do better (access to new books in remote areas of the world, less lugging around, and easier searching for quotes or information after the fact). There are also things print books do better (footnote reading; the ability to focus solely on the text at hand, far away from any electronic distraction; and--oh, yeah--no battery or electronic glitch issues).
To those factors, I would add two more. First, I think it's important to remember that Kindle doesn't actually give you a book. It gives you access to a book. For people who don't want to cart around old volumes or make multiple trips to the library, that might be considered a good thing. But at least one potential downside became painfully clear to many Kindle readers this summer, when Amazon reached into its customers' Kindle libraries and took back two books it turned out it did not have the rights to sell. Ironically, the books were by George Orwell -- including 1984, his novel about the perils of centralized information control. Access goes both ways.
Second ... one of the Times essayists, Prof. Alan Liu of the University of California, Santa Barbara, said he didn't think anyone really made serendipitous discoveries while browsing the shelves of a physical library (so losing physical libraries wouldn't be a loss, at least in that sense). Perhaps not, since most people go to libraries with specific search goals in mind. But with bookstores ... there I'd disagree. I often browse the aisles of my local bookstores just to see what's new and what might catch my eye. Most of the books I buy, in fact, are ones I discovered while browsing ... something that, ironically, electronic "browsers" do not allow.
Browsing, to my way of thinking, is what I do in Filene's Bargain Basement. The clothes there are a jumbled mass. So even if you go in looking for a shirt, you might come out with a pair of slacks that just happened to be hanging nearby. Same with a bookstore. Same, in fact, with the print version of the New York Times I get every morning. I scan the pages just to see what might be worth reading. Sometimes a photo catches my eye, sometimes a lead paragraph, sometimes a headline, and sometimes a callout. Or I'll be reading one article and another on the same page will catch my attention--one I never would have sought out on my own. And my knowledge and understanding of the world are far better and broader for all those serendipitous juxtapositions.
Electronic media and browsers have many good qualities, but they're lousy for that kind of unspecific window shopping. Browsers don't browse. They help you run specific searches. Looking for a black coat, or that article Sam Smith wrote two months ago on synthetic sneaker soles? The internet is great. Not sure what you want? Heaven help you. So to lose physical collections of books, whether in stores or on individual bookshelves, would make it harder to stumble on those delightful side discoveries that take us out of our narrow fields of focus and interest and, potentially, broaden our minds.
2. In the case of adults, we probably process information similarly in both electronic and print formats ... with two important distinctions. The first is that electronic books, with hyperlinks and connections to a world-wide web of side-roads, offer far more distractions to the reader. When you're writing a research paper, this can be useful. But it also offers temptations our brains are poorly equipped to resist, pulling our attention away from deeper immersion in a story or text. (Apparently we change tasks, on average, every three minutes when working in an internet-connected environment.)
"Frequent task-switching costs time and interferes with the concentration needed to think deeply about what you read," cautioned Sandra Aamodt, the former editor of Nature Neuroscience and another of the Times essayists.
The second feature of electronic reading, which may compound the first effect, is that there is evidently something about an electronic medium, with its "percentage done" scale and electronic noises or gizmos, that makes us crave and fixate on those little rewards. That's probably why electronic games are more addictive than board games. After a couple of rounds of solitaire with real cards, I'm done and ready to move on to something else. But I removed the solitaire software from my computer almost 20 years ago, when I realized I couldn't seem to tear myself away from it once I started playing.
Is our comprehension -- and, more importantly, what Proust apparently called "the heart of reading," the moment "when we go beyond the author's wisdom and enter the beginning of our own," as one of the essayists put it -- impaired by a heightened drive to make progress through a text? If so, that would be a bad thing, and it seems a point worth studying further.
3. Most adults, however, at least have the ability to draw longer and deeper contemplative thoughts from what we read, even if we don't always exercise it. But according to Maryanne Wolf, a cognitive neuroscientist and child development specialist at Tufts University, the ability to focus attention deeply and for a sustained length of time is learned, not innate. Children apparently have to develop neural pathways and circuits for reading, and those circuits are shaped by the demands of the reading material. Chinese children learning a more symbolic and visual written language, for instance, develop different circuits than English-speaking children do.
So electronic reading ... especially with hyperlinks and video embeds and other built-in distractions ... could keep young readers from developing some of those important circuits. As Wolf put it in her essay:
"My greatest concern is that the young brain will never have the time (in milliseconds or in hours or in years) to learn to go deeper into the text after the first decoding, but rather will be pulled by the medium to ever more distracting information, sidebars, and now, perhaps videos (in the new vooks). The child's imagination and children's nascent sense of probity and introspection are no match for a medium that creates a sense of urgency to get to the next piece of stimulating information. the attention span of children may be one of the main reasons why an immersion in on-screen reading is so engaging, and it may also be why digital reading may ultimately prove antithetical to the long-in-development, reflective nature of the expert reading brain as we know it."
Interestingly enough, the one computer scientist in the group was of the opinion that the best use of electronic books and their capabilities was to enhance print books, not to replace them. But it's all food for thought ... and, hopefully, fodder for more research as electronic readers find their way into more households and hands.