Are print books really about to disappear, overtaken like horse-drawn carriages in the age of Detroit and the Ford Model T? Truth is, nobody knows. Nobody ever really knows what the future is going to hold, no matter how sure they sound in their predictions.
Certainly, for all the fuss made about the Kindle, more than 95% of book buyers are still opting for the print version ... except, possibly, in the hot romance and erotic fiction categories. Earlier this year, Peter Smith, of IT World, noted that "of the top 10 bestsellers under the 'Multiformat' category [of Fictionwise ebooks sold], nine are tagged 'erotica' and the last is 'dark fantasy.'" That's only one list, but it's an interesting side-note that makes sense: just as with the internet and cable television, there's a particularly strong appeal to getting access to what Smith calls "salacious" content without having to face the check-out clerk with the goods in hand.
Nevertheless, the point remains that a growing number of readers are switching over to ebooks in one format or another. So beyond the basic question of "will print books go away" (which I personally doubt, but again, nobody really knows the answer to), the questions I find more intriguing relate to whether, and how, digital reading changes the reading experience and, perhaps, even the brains that do the reading.
Electronic readers like Kindle are too recent a development to have generated much specific, targeted research yet. But a montage of essays titled "Does the Brain Like Ebooks?" that appeared on the New York Times website this week offered some fascinating information and viewpoints on the subject. The collection had contributions from experts in English, neuroscience, child development, computer technology and informatics. And while the viewpoints differed, there was some general consensus about a few points:
1. Clearly, there are differences in the two reading experiences. There are things electronic books do better (access to new books in remote areas of the world, less lugging around, and easier searching for quotes or information after the fact). There are also things print books do better (footnote reading, the ability to focus solely on the text at hand, far away from any electronic distraction, and--oh, yeah. No battery or electronic glitch issues.)
To those factors, I would add two more: First -- I think it's important to remember that Kindle doesn't actually give you a book. It gives you access to a book. For people who don't want to cart around old volumes or make multiple trips to the library, that might be considered a good thing. But at least one potential downside to this feature became painfully clear to many Kindle readers this summer when Amazon reached into its customers' Kindle libraries and took back two books for which the company realized it did not possess the copyright. Ironically, the books were by George Orwell -- including 1984, his book about the perils of centralized information control. Access goes both ways.
Second ... one of the writers of the Times essays, Prof. Alan Liu at the University of California, Santa Barbara, said that he didn't think anyone really made serendipitous discoveries while browsing the shelves of a physical library (so losing a physical library wouldn't be a loss, at least in that sense). Perhaps not, because most people go to libraries with specific search goals in mind. But bookstores, on the other hand ... there I'd disagree. I often browse the aisles of my local bookstores, just to see what's new and what might catch my eye. Most of the books I buy, in fact, are items I discovered while browsing ... something that, ironically, electronic "browsers" do not allow.
Browsing, to my way of thinking, is what I do in Filene's Bargain Basement. The clothes there are a jumbled mass. So even if you go in looking, potentially, for a shirt, you might end up with a pair of slacks that just happened to be hanging nearby. Same with a bookstore. Same, in fact, with the print version of the New York Times I get every morning. I scan the pages just seeing what might catch my eye to read. Sometimes it's a photo that catches my eye, sometimes it's a leading paragraph, sometimes it's a headline, and sometimes it's a callout. Or, sometimes, I'll be reading one article and another on that same page will catch my attention--one I never would have sought out on my own. And my knowledge and understanding of the world are far better and broader for all those serendipitous juxtapositions.
Electronic media and browsers have many good qualities, but they're lousy for that kind of unspecific window shopping. Browsers don't browse. They help you do specific searches. Looking for a black coat, or that article Sam Smith wrote two months ago on synthetic sneaker soles? The internet is great. Not sure what you want? Heaven help you. So to lose physical collections of books, either in stores or on individual bookshelves, would make it harder to make those delightful side discoveries that take us out of our narrow fields of focus and interest and, potentially, broaden our minds.
2. In the case of adults, we probably process information similarly in both electronic and print formats ... with two important distinctions. The first distinction is that electronic books, with hyperlinks and connections to a worldwide web of side roads, offer far more distractions to the reader. In doing a research paper, this can be useful. But it also offers temptations to divert our attention from a deeper immersion in a story or text that our brains are poorly equipped to resist. (Apparently we change tasks, on average, every three minutes when working in an internet-connected environment.)
"Frequent task-switching costs time and interferes with the concentration needed to think deeply about what you read," cautioned Sandra Aamodt, the former editor of Nature Neuroscience and another of the Times essayists.
The second feature of electronic reading, which may compound this first effect, is that there is evidently something about an electronic medium, with its "percentage done" scale and electronic noises or gizmos, that makes us crave and focus on those rewards. Which is probably why electronic games are more addictive than board games. After a couple of rounds of solitaire with real cards, I'm done and ready to move on to something else. But I removed the solitaire software from my computer almost 20 years ago when I realized that I couldn't seem to tear myself away from it, once I started playing.
Is our comprehension--and, more importantly, what Proust apparently called "the heart of reading," "when we go beyond the author's wisdom and enter the beginning of our own," as one of the essayists put it--impacted by a heightened drive to make progress through a text? If so, that would be a bad thing. So it seems a point worth studying further.
3. Most adults, however, at least have the ability to process longer and deeper contemplative thoughts from what we read, even if we don't always exercise that ability. But according to Maryanne Wolf, a cognitive neuroscientist and child development specialist at Tufts University, that ability to focus attention deeply and for a concerted length of time is learned, not innate. Children apparently have to develop neural pathways and circuits for reading, and those circuits are affected by the demands of the reading material. Chinese children learning a more symbolic and visual language, for instance, develop different circuits than English-speaking children.
So electronic reading ... especially with hyperlinks and video embeds and other distractions, could potentially keep young readers from developing some important circuits. As Wolf put it in her essay:
"My greatest concern is that the young brain will never have the time (in milliseconds or in hours or in years) to learn to go deeper into the text after the first decoding, but rather will be pulled by the medium to ever more distracting information, sidebars, and now, perhaps videos (in the new vooks). The child's imagination and children's nascent sense of probity and introspection are no match for a medium that creates a sense of urgency to get to the next piece of stimulating information. The attention span of children may be one of the main reasons why an immersion in on-screen reading is so engaging, and it may also be why digital reading may ultimately prove antithetical to the long-in-development, reflective nature of the expert reading brain as we know it."
Interestingly enough, the one computer scientist in the group was of the opinion that the best use of electronic books and capabilities was to enhance print books, not to replace them. But it's all interesting food for thought ... and, hopefully, more research as electronic readers find their way into more households and hands.
Two hundred fifty years of slavery. Ninety years of Jim Crow. Sixty years of separate but equal. Thirty-five years of racist housing policy. Until we reckon with our compounding moral debts, America will never be whole.
And if thy brother, a Hebrew man, or a Hebrew woman, be sold unto thee, and serve thee six years; then in the seventh year thou shalt let him go free from thee. And when thou sendest him out free from thee, thou shalt not let him go away empty: thou shalt furnish him liberally out of thy flock, and out of thy floor, and out of thy winepress: of that wherewith the LORD thy God hath blessed thee thou shalt give unto him. And thou shalt remember that thou wast a bondman in the land of Egypt, and the LORD thy God redeemed thee: therefore I command thee this thing today.
— Deuteronomy 15:12–15
Besides the crime which consists in violating the law, and varying from the right rule of reason, whereby a man so far becomes degenerate, and declares himself to quit the principles of human nature, and to be a noxious creature, there is commonly injury done to some person or other, and some other man receives damage by his transgression: in which case he who hath received any damage, has, besides the right of punishment common to him with other men, a particular right to seek reparation.
Bernie Sanders and Jeb Bush look abroad for inspiration, heralding the end of American exceptionalism.
This election cycle, two candidates have dared to touch a third rail in American politics.
Not Social Security reform. Not Medicare. Not ethanol subsidies. The shibboleth that politicians are suddenly willing to discuss is the idea that America might have something to learn from other countries.
The most notable example is Bernie Sanders, who renewed his praise for Western Europe in a recent interview with Ezra Klein. “Where is the UK? Where is France? Germany is the economic powerhouse in Europe,” Sanders said. “They provide health care to all of their people, they provide free college education to their kids.”
On ABC’s This Week in May, George Stephanopoulos asked Sanders about this sort of rhetoric. “I can hear the Republican attack ad right now: ‘He wants America to look more like Scandinavia,’” the host said. Sanders didn’t flinch:
Even when a dentist kills an adored lion, and everyone is furious, there’s loftier righteousness to be had.
Now is the point in the story of Cecil the lion—amid non-stop news coverage and passionate social-media advocacy—when people get tired of hearing about Cecil the lion. Even if they hesitate to say it.
But Cecil fatigue is only going to get worse. On Friday morning, Zimbabwe’s environment minister, Oppah Muchinguri, called for the extradition of the man who killed him, the Minnesota dentist Walter Palmer. Muchinguri would like Palmer to be “held accountable for his illegal action”—paying a reported $50,000 to kill Cecil with an arrow after luring him away from protected land. And she’s far from alone in demanding accountability. This week, the Internet has served as a bastion of judgment and vigilante justice—just like usual, except that this was a perfect storm directed at a single person. It might be called an outrage singularity.
Writing used to be a solitary profession. How did it become so interminably social?
Whether we’re behind the podium or awaiting our turn, numbing our bottoms on the chill of metal foldout chairs or trying to work some life into our terror-stricken tongues, we introverts feel the pain of the public performance. This is because there are requirements to being a writer. Other than being a writer, I mean. Firstly, there’s the need to become part of the writing “community”, which compels every writer who craves self-respect and success to attend community events, help to organize them, buzz over them, and—despite blitzed nerves and staggering bowels—present and perform at them. We get through it. We bully ourselves into it. We dose ourselves with beta blockers. We drink. We become our own worst enemies for a night of validation and participation.
Forget credit hours—in a quest to cut costs, universities are simply asking students to prove their mastery of a subject.
MANCHESTER, Mich.—Had Daniella Kippnick followed in the footsteps of the hundreds of millions of students who have earned university degrees in the past millennium, she might be slumping in a lecture hall somewhere while a professor droned. But Kippnick has no course lectures. She has no courses to attend at all. No classroom, no college quad, no grades. Her university has no deadlines or tenure-track professors.
Instead, Kippnick makes her way through different subject matters on the way to a bachelor’s in accounting. When she feels she’s mastered a certain subject, she takes a test at home, where a proctor watches her from afar by monitoring her computer and watching her over a video feed. If she proves she’s competent—by getting the equivalent of a B—she passes and moves on to the next subject.
During the multi-country press tour for Mission: Impossible – Rogue Nation, not even Jon Stewart has dared ask Tom Cruise about Scientology.
During the media blitz for Mission: Impossible – Rogue Nation over the past two weeks, Tom Cruise has seemingly been everywhere. In London, he participated in a live interview at the British Film Institute with the presenter Alex Zane, the movie’s director, Christopher McQuarrie, and a handful of his fellow cast members. In New York, he faced off with Jimmy Fallon in a lip-sync battle on The Tonight Show and attended the Monday night premiere in Times Square. And, on Tuesday afternoon, the actor recorded an appearance on The Daily Show With Jon Stewart, where he discussed his exercise regimen, the importance of a healthy diet, and how he still has all his own hair at 53.
Stewart, who during his career has won two Peabody Awards for public service and the Orwell Award for “distinguished contribution to honesty and clarity in public language,” represented the most challenging interviewer Cruise has faced on the tour, during a challenging year for the actor. In April, HBO broadcast Alex Gibney’s documentary Going Clear, a film based on the book of the same title by Lawrence Wright exploring the Church of Scientology, of which Cruise is a high-profile member. The movie alleges, among other things, that the actor personally profited from slave labor (church members who were paid 40 cents an hour to outfit the star’s airplane hangar and motorcycle), and that his former girlfriend, the actress Nazanin Boniadi, was punished by the Church by being forced to do menial work after telling a friend about her relationship troubles with Cruise. For Cruise “not to address the allegations of abuse,” Gibney said in January, “seems to me palpably irresponsible.” But in The Daily Show interview, as with all of Cruise’s other appearances, Scientology wasn’t mentioned.
The Wall Street Journal’s eyebrow-raising story of how the presidential candidate and her husband accepted cash from UBS without any regard for the appearance of impropriety that it created.
The Swiss bank UBS is one of the biggest, most powerful financial institutions in the world. As secretary of state, Hillary Clinton intervened to help it out with the IRS. And after that, the Swiss bank paid Bill Clinton $1.5 million for speaking gigs. The Wall Street Journal reported all that and more Thursday in an article that highlights huge conflicts of interest that the Clintons have created in the recent past.
The piece begins by detailing how Clinton helped the global bank.
“A few weeks after Hillary Clinton was sworn in as secretary of state in early 2009, she was summoned to Geneva by her Swiss counterpart to discuss an urgent matter. The Internal Revenue Service was suing UBS AG to get the identities of Americans with secret accounts,” the newspaper reports. “If the case proceeded, Switzerland’s largest bank would face an impossible choice: Violate Swiss secrecy laws by handing over the names, or refuse and face criminal charges in U.S. federal court. Within months, Mrs. Clinton announced a tentative legal settlement—an unusual intervention by the top U.S. diplomat. UBS ultimately turned over information on 4,450 accounts, a fraction of the 52,000 sought by the IRS.”
Most of the big names in futurism are men. What does that mean for the direction we’re all headed?
In the future, everyone’s going to have a robot assistant. That’s the story, at least. And as part of that long-running narrative, Facebook just launched its virtual assistant, calling it Moneypenny, after the secretary from the James Bond films. Which means the symbol of our march forward, once again, ends up being a nod back. In this case, Moneypenny is a throwback to an age when Bond’s womanizing was a symbol of manliness and many women were, no matter what they wanted to be doing, secretaries.
Why can’t people imagine a future without falling into the sexist past? Why does the road ahead keep leading us back to a place that looks like the Tomorrowland of the 1950s? Well, when it comes to Moneypenny, here’s a relevant data point: More than two-thirds of Facebook employees are men. That’s a ratio reflected among another key group: futurists.
An attack on an American-funded military group epitomizes the Obama Administration’s logistical and strategic failures in the war-torn country.
Last week, the U.S. finally received some good news in Syria: After months of prevarication, Turkey announced that the American military could launch airstrikes against Islamic State positions in Syria from its base in Incirlik. The development signaled that Turkey, a regional power, had at last agreed to join the fight against ISIS.
The announcement provided a dose of optimism in a conflict that has, in the last four years, killed over 200,000 and displaced millions more. Days later, however, the positive momentum screeched to a halt. Earlier this week, fighters from the al-Nusra Front, an Islamist group aligned with al-Qaeda, reportedly captured the commander of Division 30, a Syrian militia that receives U.S. funding and logistical support, in the countryside north of Aleppo. On Friday, the offensive escalated: Al-Nusra fighters attacked Division 30 headquarters, killing five and capturing others. According to Agence France Presse, the purpose of the attack was to obtain sophisticated weapons provided by the Americans.
Jim Gilmore joins the race, and the Republican field jockeys for spots in the August 6 debate in Cleveland.
After decades as the butt of countless jokes, it’s Cleveland’s turn to laugh: Seldom have so many powerful people been so desperate to get to the Forest City. There’s one week until the Republican Party’s first primary debate of the cycle on August 6, and now there’s a mad dash to get into the top 10 and qualify for the main event.
With former Virginia Governor Jim Gilmore filing papers to run for president on July 29, there are now 17 “major” candidates vying for the GOP nomination, though that’s an awfully imprecise descriptor. It takes in candidates with lengthy experience and a good chance at the White House, like Scott Walker and Jeb Bush; at least one person who is polling well but is manifestly unserious, namely Donald Trump; and people with long experience but no chance at the White House, like Gilmore. Yet it also excludes other people with long experience but no chance at the White House, such as former IRS Commissioner Mark Everson.