Are print books really about to disappear, overtaken like horse-drawn carriages in the age of Detroit and the Ford Model T? Truth is, nobody knows. Nobody ever really knows what the future is going to hold, no matter how sure they sound in their predictions.
Certainly, for all the fuss made about the Kindle, more than 95% of book buyers are still opting for the print version ... except, possibly, in the hot romance and erotic fiction categories. Earlier this year, Peter Smith, of ITworld, noted that "of the top 10 bestsellers under the 'Multiformat' category [of Fictionwise ebooks sold], nine are tagged 'erotica' and the last is 'dark fantasy.'" That's only one list, but it's an interesting side note that makes sense: just as with the internet and cable television, there's a particularly strong appeal to getting access to what Smith calls "salacious" content without having to face the checkout clerk with the goods in hand.
Nevertheless, the point remains that a growing number of readers are switching over to ebooks in one format or another. So beyond the basic question of "will print books go away?" (which I personally doubt, though again, nobody really knows), the questions I find more intriguing are whether and how digital reading changes the reading experience and, perhaps, even the brains that do the reading.
Electronic readers like the Kindle are too recent a development to have generated much specific, targeted research yet. But a collection of essays titled "Does the Brain Like E-Books?" that appeared on the New York Times website this week offered some fascinating information and viewpoints on the subject. It featured contributions from experts in English, neuroscience, child development, computer technology and informatics. And while their viewpoints differed, there was general consensus on a few points:
1. Clearly, there are differences between the two reading experiences. There are things electronic books do better (access to new books in remote areas of the world, less lugging around, and easier searching for quotes or information after the fact). There are also things print books do better (footnote reading; the ability to focus solely on the text at hand, far from any electronic distraction; and, oh yeah, no battery or glitch issues).
To those factors, I would add two more. First: it's important to remember that the Kindle doesn't actually give you a book. It gives you access to a book. For people who don't want to cart around old volumes or make multiple trips to the library, that might be considered a good thing. But at least one potential downside became painfully clear to many Kindle readers this summer, when Amazon reached into its customers' Kindle libraries and took back two books it realized it didn't have the rights to sell. Ironically, the books were by George Orwell -- including 1984, his novel about the perils of centralized information control. Access goes both ways.
Second ... one of the Times essayists, Prof. Alan Liu of the University of California, Santa Barbara, said he didn't think anyone really made serendipitous discoveries while browsing the shelves of a physical library (so losing physical libraries wouldn't be a loss, at least in that sense). Perhaps not, since most people go to libraries with specific search goals in mind. But bookstores ... there I'd disagree. I often browse the aisles of my local bookstores just to see what's new and what might catch my eye. Most of the books I buy, in fact, are ones I discovered while browsing ... something that, ironically, electronic "browsers" do not allow.
Browsing, to my way of thinking, is what I do in Filene's Bargain Basement. The clothes there are a jumbled mass, so even if you go in looking for a shirt, you might walk out with a pair of slacks that just happened to be hanging nearby. Same with a bookstore. Same, in fact, with the print version of the New York Times I get every morning. I scan the pages to see what catches my eye: sometimes a photo, sometimes a lead paragraph, sometimes a headline or a callout. Or I'll be reading one article and another on the same page will grab my attention--one I never would have sought out on my own. And my knowledge and understanding of the world are far better and broader for all those serendipitous juxtapositions.
Electronic media and browsers have many good qualities, but they're lousy for that kind of unspecific window shopping. Browsers don't browse; they help you run specific searches. Looking for a black coat, or that article Sam Smith wrote two months ago on synthetic sneaker soles? The internet is great. Not sure what you want? Heaven help you. So to lose physical collections of books, whether in stores or on individual bookshelves, would make it harder to stumble onto those delightful side discoveries that take us out of our narrow fields of focus and interest and, potentially, broaden our minds.
2. Adults probably process information similarly in both electronic and print formats ... with two important distinctions. The first is that electronic books, with their hyperlinks and connections to a worldwide web of side roads, offer far more distractions to the reader. When you're writing a research paper, this can be useful. But it also offers temptations, which our brains are poorly equipped to resist, to divert our attention from deeper immersion in a story or text. (Apparently we change tasks, on average, every three minutes when working in an internet-connected environment.)
"Frequent task-switching costs time and interferes with the concentration needed to think deeply about what you read," cautioned Sandra Aamodt, the former editor of Nature Neuroscience and another of the Times essayists.
The second distinction, which may compound the first, is that there is evidently something about an electronic medium, with its "percentage done" scale and electronic noises and gizmos, that makes us crave and focus on those little rewards. That's probably why electronic games are more addictive than board games. After a couple of rounds of solitaire with real cards, I'm done and ready to move on to something else. But I removed the solitaire software from my computer almost 20 years ago when I realized that, once I started playing, I couldn't seem to tear myself away.
Are our comprehension and, more importantly, what Proust apparently called "the heart of reading" -- the moment "when we go beyond the author's wisdom and enter the beginning of our own," as one of the essayists put it -- impaired by a heightened drive to make progress through a text? If so, that would be a bad thing, and a point worth studying further.
3. Most adults, however, at least have the ability to pursue longer and deeper contemplative thoughts about what they read, even if they don't always exercise it. But according to Maryanne Wolf, a cognitive neuroscientist and child development specialist at Tufts University, that ability to focus attention deeply and for a sustained length of time is learned, not innate. Children apparently have to develop neural pathways and circuits for reading, and those circuits are shaped by the demands of the reading material. Chinese children learning a more symbolic and visual written language, for instance, develop different circuits than English-speaking children do.
So electronic reading, especially with hyperlinks, video embeds, and other distractions, could keep young readers from developing some of those important circuits. As Wolf put it in her essay:
"My greatest concern is that the young brain will never have the time (in milliseconds or in hours or in years) to learn to go deeper into the text after the first decoding, but rather will be pulled by the medium to ever more distracting information, sidebars, and now, perhaps videos (in the new vooks). The child's imagination and children's nascent sense of probity and introspection are no match for a medium that creates a sense of urgency to get to the next piece of stimulating information. the attention span of children may be one of the main reasons why an immersion in on-screen reading is so engaging, and it may also be why digital reading may ultimately prove antithetical to the long-in-development, reflective nature of the expert reading brain as we know it."
Interestingly enough, the one computer scientist in the group was of the opinion that the best use of ebook technology was to enhance print books, not to replace them. But it's all food for thought ... and, hopefully, fuel for more research as electronic readers find their way into more households and hands.