Are print books really about to disappear, overtaken like horse-drawn carriages in the age of Detroit and the Ford Model T? Truth is, nobody knows. Nobody ever really knows what the future is going to hold, no matter how sure they sound in their predictions.
Certainly, for all the fuss made about the Kindle, more than 95% of book buyers are still opting for the print version ... except, possibly, in the hot romance and erotic fiction categories. Earlier this year, Peter Smith, of IT World, noted that "of the top 10 bestsellers under the 'Multiformat' category [of Fictionwise ebooks sold], nine are tagged 'erotica' and the last is 'dark fantasy.'" That's only one list, but it's an interesting side-note that makes sense: just as with the internet and cable television, there's a particularly strong appeal to getting access to what Smith calls "salacious" content without having to face the check-out clerk with the goods in hand.
Nevertheless, the point remains that a growing number of readers are switching over to ebooks in one format or another. So beyond the basic question of "will print books go away" (which I personally doubt, but, again, nobody really knows), the questions I find more intriguing are whether and how digital reading changes the reading experience and, perhaps, even the brains that do the reading.
Electronic readers like the Kindle are too recent a development to have generated much specific, targeted research yet. But a collection of essays titled "Does the Brain Like E-Books?" that appeared on the New York Times website this week offered some fascinating information and viewpoints on the subject. The contributors were experts in English, neuroscience, child development, computer technology and informatics. And while the viewpoints differed, there was some general consensus about a few points:
1. Clearly, there are differences in the two reading experiences. There are things electronic books do better (access to new books in remote areas of the world, less lugging around, and easier searching for quotes or information after the fact). There are also things print books do better (footnote reading; the ability to focus solely on the text at hand, far away from any electronic distraction; and, oh yeah, no battery or electronic glitch issues).
To those factors, I would add two more. First, I think it's important to remember that Kindle doesn't actually give you a book. It gives you access to a book. For people who don't want to cart around old volumes or make multiple trips to the library, that might be considered a good thing. But at least one potential downside to this feature became painfully clear to many Kindle readers this summer, when Amazon reached into its customers' Kindle libraries and took back two books that, the company realized, it did not have the rights to sell. Ironically, the books were by George Orwell, including 1984, his novel about the perils of centralized information control. Access goes both ways.
Second ... one of the writers of the Times essays, Prof. Alan Liu at the University of California, Santa Barbara, said that he didn't think anyone really made serendipitous discoveries while browsing the shelves of a physical library (so losing a physical library wouldn't be a loss, at least in that sense). Perhaps not, because most people go to libraries with specific search goals in mind. But bookstores, on the other hand ... there I'd disagree. I often browse the aisles of my local bookstores, just to see what's new and what might catch my eye. Most of the books I buy, in fact, are items I discovered while browsing ... something that, ironically, electronic "browsers" do not allow.
Browsing, to my way of thinking, is what I do in Filene's Bargain Basement. The clothes there are a jumbled mass. So even if you go in looking, potentially, for a shirt, you might end up with a pair of slacks that just happened to be hanging nearby. Same with a bookstore. Same, in fact, with the print version of the New York Times I get every morning. I scan the pages just to see what might be worth reading. Sometimes it's a photo that catches my eye, sometimes a leading paragraph, sometimes a headline, and sometimes a callout. Or, sometimes, I'll be reading one article and another on that same page will catch my attention, one I never would have sought out on my own. And my knowledge and understanding of the world are far better and broader for all those serendipitous juxtapositions.
Electronic media and browsers have many good qualities, but they're lousy for that kind of unspecific window shopping. Browsers don't browse. They help you do specific searches. Looking for a black coat, or that article Sam Smith wrote two months ago on synthetic sneaker soles? The internet is great. Not sure what you want? Heaven help you. So to lose physical collections of books, either in stores or on individual bookshelves, would make it harder to make those delightful side discoveries that take us out of our narrow fields of focus and interest and, potentially, broaden our minds.
2. In the case of adults, we probably process information similarly in both electronic and print formats ... with two important distinctions. The first is that electronic books, with hyperlinks and connections to a worldwide web of side roads, offer far more distractions to the reader. For someone writing a research paper, those connections can be useful. But they also offer temptations, which our brains are poorly equipped to resist, to pull us away from deeper immersion in a story or text. (Apparently we change tasks, on average, every three minutes when working in an internet-connected environment.)
"Frequent task-switching costs time and interferes with the concentration needed to think deeply about what you read," cautioned Sandra Aamodt, the former editor of Nature Neuroscience and another of the Times essayists.
The second feature of electronic reading, which may compound the first effect, is that there is evidently something about an electronic medium, with its "percentage done" scale and its electronic noises and gizmos, that makes us crave and focus on the reward of making progress. That is probably why electronic games are more addictive than board games. After a couple of rounds of solitaire with real cards, I'm done and ready to move on to something else. But I removed the solitaire software from my computer almost 20 years ago when I realized that I couldn't seem to tear myself away from it once I started playing.
Are our comprehension and, more importantly, what Proust apparently called "the heart of reading" ("when we go beyond the author's wisdom and enter the beginning of our own," as one of the essayists put it) impaired by a heightened drive to make progress through a text? If so, that would be a bad thing. So it seems a point worth studying further.
3. Most adults, however, at least have the ability to process longer and deeper contemplative thoughts from what we read, even if we don't always exercise that ability. But according to Maryanne Wolf, a cognitive neuroscientist and child-development specialist at Tufts University, that ability to focus attention deeply and for a concerted length of time is learned, not innate. Children apparently have to develop neural pathways and circuits for reading, and those circuits are shaped by the demands of the reading material. Chinese children learning a more symbolic and visual written language, for instance, develop different circuits than English-speaking children do.
So electronic reading, especially with hyperlinks, video embeds, and other potential distractions, could keep young readers from developing some important circuits. As Wolf put it in her essay:
"My greatest concern is that the young brain will never have the time (in milliseconds or in hours or in years) to learn to go deeper into the text after the first decoding, but rather will be pulled by the medium to ever more distracting information, sidebars, and now, perhaps videos (in the new vooks). The child's imagination and children's nascent sense of probity and introspection are no match for a medium that creates a sense of urgency to get to the next piece of stimulating information. the attention span of children may be one of the main reasons why an immersion in on-screen reading is so engaging, and it may also be why digital reading may ultimately prove antithetical to the long-in-development, reflective nature of the expert reading brain as we know it."
Interestingly enough, the one computer scientist in the group was of the opinion that the best use of electronic capabilities was to enhance print books, not to replace them. But it's all interesting food for thought ... and, hopefully, a spur to more research as electronic readers find their way into more households and hands.
Take a walk along West Florissant Avenue, in Ferguson, Missouri. Head south of the burned-out QuikTrip and the famous McDonald's, south of the intersection with Chambers, south almost to the city limit, to the corner of Ferguson Avenue and West Florissant. There, last August, Emerson Electric announced third-quarter sales of $6.3 billion. Just over half a mile to the northeast, four days later, Officer Darren Wilson killed Michael Brown. The 12 shots fired by Officer Wilson were probably audible in the company lunchroom.
Outwardly, at least, the City of Ferguson would appear to occupy an enviable position. It is home to a Fortune 500 firm. It has successfully revitalized a commercial corridor through its downtown. It hosts an office park filled with corporate tenants. Its coffers should be overflowing with tax dollars.
Freddie Gray's death on April 19 leaves many unanswered questions. But it is clear that when Gray was arrested in West Baltimore on the morning of April 12, he was struggling to walk. By the time he arrived at the police station a half hour later, he was unable to breathe or talk, suffering from wounds that would kill him.
Gray died Sunday from spinal injuries. Baltimore authorities say they're investigating how the 25-year-old was hurt—a somewhat perverse notion, given that it was while he was in police custody, and hidden from public view, that he apparently suffered the injury. How it happened remains unknown. It's even difficult to understand why officers arrested Gray in the first place. But with protesters taking to the streets of Baltimore since his death, the incident falls into a line of highly publicized, fatal encounters between black men and the police. Meanwhile, on Tuesday, a reserve sheriff's deputy in Tulsa, Oklahoma, pleaded not guilty to a second-degree manslaughter charge in the death of a man he shot; the deputy says he meant to fire his Taser, not his gun. Black men dying at the hands of the police is of course nothing new, but the nation is now paying attention and getting outraged.
After a five-month delay, Loretta Lynch made history last week. On Thursday, the Senate confirmed Lynch as the next U.S. attorney general, the first African American woman ever to hold this Cabinet position. Her long-stalled nomination sometimes seemed in doubt, held hostage to partisan jockeying between Democrats and Republicans. But one political bloc never gave up, relentlessly rallying its support behind Lynch: the black sorority.
During her initial hearing, the seats behind Lynch were filled with more than two dozen of her Delta Sigma Theta Sorority sisters arrayed in crimson-and-cream blazers and blouses, ensuring their visibility on the national stage. These Delta women—U.S. Representatives Marcia Fudge and Joyce Beatty among them—were there to lend moral support and show the committee that they meant business. The Deltas were not alone. The Lynch nomination also drew support from congressional representatives from other black sororities: Alpha Kappa Alpha members Terri Sewell and Sheila Jackson Lee took to the House floor to advocate for a vote while Sigma Gamma Rho members Corinne Brown and Robin Kelly and Zeta Phi Beta member Donna Edwards used social media and press conferences to campaign on Lynch’s behalf.
Where did the Islamic State come from, and what are its intentions? The simplicity of these questions can be deceiving, and few Western leaders seem to know the answers. In December, The New York Times published confidential comments by Major General Michael K. Nagata, the Special Operations commander for the United States in the Middle East, admitting that he had hardly begun figuring out the Islamic State’s appeal. “We have not defeated the idea,” he said. “We do not even understand the idea.” In the past year, President Obama has referred to the Islamic State, variously, as “not Islamic” and as al-Qaeda’s “jayvee team,” statements that reflected confusion about the group, and may have contributed to significant strategic errors.
Hours after a major earthquake wreaked havoc across his country, Nepali Information Minister Minendra Rijal appeared at a news conference on Saturday to announce that schools would be closed for the next five days. "We never imagined we'd face such devastation," he said.
But for geologists, Saturday's disaster—which has claimed over 2,400 lives—was sadly predictable.
"Physically and geologically what happened is exactly what we thought would happen," James Jackson, head of the earth-sciences department at the University of Cambridge, told the Associated Press.
Blessed with stunning natural scenery, Nepal is a popular tourist destination that attracts hundreds of thousands of travelers each year. But the source of the country's beauty is what makes it particularly vulnerable to earthquakes. Much of Nepal's population lives in a valley beneath the Himalayas, a mountain range formed by collisions between the Indian and Eurasian tectonic plates. These collisions—which occur when the Indian plate slides underneath its much larger neighbor—are what cause earthquakes. According to The Washington Post, a chunk of the earth measuring 75 by 37 miles shifted 10 feet in 30 seconds on Saturday, destroying much of what lay atop it.
I’m not a dog person. I prefer cats. Cats make you work to have a relationship with them, and I like that. But I have adopted several dogs, caving in to pressure from my kids. The first was Teddy, a rottweiler-chow mix whose bushy hair was cut into a lion's mane. Kids loved him, and he grew on me, too. Teddy was probably ten years old when we adopted him. Five years later he had multiple organs failing, and it was time to put him to sleep.
When I arrived, the vet said I could drop Teddy off. I was aghast. No. I needed to stay with him. As the vet prepped the syringe to put Teddy to sleep, I started sobbing. The vet gave me a couple of minutes to collect myself and say goodbye. I held Teddy's paw until he died. Honestly, I didn't think I was that attached.
A lot of Internet ink has been spilled over how lazy and entitled Millennials are, but when it comes to paying for a college education, work ethic isn't the limiting factor. The economic cards are stacked such that today’s average college student, without support from financial aid and family resources, would need to complete 48 hours of minimum-wage work a week to pay for his courses—a feat that would require superhuman endurance, or maybe a time machine.
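For readers who want to see how a figure like 48 hours a week can fall out of simple arithmetic, here is a rough back-of-the-envelope sketch. The federal minimum wage of $7.25 an hour is a matter of record, but the tuition figure and the length of the academic year below are placeholder assumptions chosen for illustration; they are not the inputs the article or Randy Olson's analysis actually used.

```python
# Back-of-the-envelope sketch of the hours-of-minimum-wage-work arithmetic.
# The tuition figure and academic-year length are hypothetical placeholders,
# not the numbers from the article or from Randy Olson's analysis.

FEDERAL_MINIMUM_WAGE = 7.25        # dollars per hour (federal rate at the time)
ANNUAL_TUITION_AND_FEES = 10_000   # assumed annual cost of courses (placeholder)
WEEKS_IN_ACADEMIC_YEAR = 30        # assumed two roughly 15-week semesters

def hours_per_week_needed(annual_cost: float, wage: float, weeks: int) -> float:
    """Hours of work per week needed to cover annual_cost at the given wage."""
    return annual_cost / (wage * weeks)

hours = hours_per_week_needed(ANNUAL_TUITION_AND_FEES,
                              FEDERAL_MINIMUM_WAGE,
                              WEEKS_IN_ACADEMIC_YEAR)
print(f"About {hours:.0f} hours of minimum-wage work per week")
# With these placeholder inputs the result lands in the mid-40s,
# the same ballpark as the article's 48-hours-a-week figure.
```

The point of the sketch is only that the order of magnitude is plausible: at the federal minimum wage, even a modest annual cost requires a near-full-time work schedule on top of a full course load.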
To take a close look at the tuition history of almost any institution of higher education in America is to confront an unfair reality: Each year’s crop of college seniors paid a little bit more than the class that graduated before it. The tuition crunch never fails to provide new fodder for ongoing analysis of the myths and realities of the American Dream. Last week, a graduate student named Randy Olson listened to his grandfather extol the virtues of putting oneself through college without family support. But paying for college without family support is a totally different proposition these days, Olson thought. It may have been feasible 30 years ago, or even 15 years ago, but it's much harder now.
Whenever a college student asks me, a veteran high-school English educator, about the prospects of becoming a public-school teacher, I never think it’s enough to say that the role is shifting from "content expert" to "curriculum facilitator." Instead, I describe what I think the public-school classroom will look like in 20 years, with a large, fantastic computer screen at the front, streaming one of the nation’s most engaging, informative lessons available on a particular topic. The "virtual class" will be introduced, guided, and curated by one of the country’s best teachers (a.k.a. a "super-teacher"), and it will include professionally produced footage of current events, relevant excerpts from powerful TED Talks, interactive games students can play against other students nationwide, and a formal assessment that the computer will immediately score and record.
In a few weeks, millions of college students will enter the real world with dreams of finding work that's meaningful and challenging—and preferably lucrative enough to live roommate-free in a major city. As they embark on their job searches, recent graduates are frequently given the vague advice to "go out and network."
But what exactly should this networking entail? What does one say to a perfect stranger whom one has cajoled into "grabbing coffee," while also telepathically conveying one's desire for a job?
Science has one piece of advice, which is this: Ask them for advice.
Research suggests that asking for advice, far from inconveniencing or annoying the advice-giver, appears to boost perceptions of the asker's intelligence.
In her new book No One Understands You and What To Do About It, Heidi Grant Halvorson tells readers a story about her friend, Tim. When Tim started a new job as a manager, one of his top priorities was communicating to his team that he valued each member’s input. So at team meetings, as each member spoke up about whatever project they were working on, Tim made sure he put on his “active-listening face” to signal that he cared about what each person was saying.
But after meeting with him a few times, Tim’s team got a very different message from the one he intended to send. “After a few weeks of meetings,” Halvorson explains, “one team member finally summoned up the courage to ask him the question that had been on everyone’s mind.” That question was: “Tim, are you angry with us right now?” When Tim explained that he wasn’t at all angry—that he was just putting on his “active-listening face”—his colleague gently explained that his active-listening face looked a lot like his angry face.