Are print books really about to disappear, overtaken like horse-drawn carriages in the age of Detroit and the Ford Model T? Truth is, nobody knows. Nobody ever really knows what the future is going to hold, no matter how sure they sound in their predictions.
Certainly, for all the fuss made about the Kindle, more than 95% of book buyers are still opting for the print version ... except, possibly, in the hot romance and erotic fiction categories. Earlier this year, Peter Smith, of IT World, noted that "of the top 10 bestsellers under the 'Multiformat' category [of Fictionwise ebooks sold], nine are tagged 'erotica' and the last is 'dark fantasy.'" That's only one list, but it's an interesting side-note that makes sense: just as with the internet and cable television, there's a particularly strong appeal to getting access to what Smith calls "salacious" content without having to face the check-out clerk with the goods in hand.
Nevertheless, the point remains that a growing number of readers are switching over to ebooks in one format or another. So beyond the basic question of "will print books go away" (which I personally doubt, but again, nobody really knows the answer), the questions I find more intriguing relate to whether and how digital reading changes the reading experience and, perhaps, even the brains that do the reading.
Electronic readers like Kindle are too recent a development to have generated much specific, targeted research yet. But a collection of essays titled "Does the Brain Like E-Books?" that appeared on the New York Times website this week offered some fascinating information and viewpoints on the subject. The contributors were experts in English, neuroscience, child development, computer technology and informatics. And while their viewpoints differed, there was general consensus on a few points:
1. Clearly, there are differences in the two reading experiences. There are things electronic books do better (access to new books in remote areas of the world, less lugging around, and easier searching for quotes or information after the fact). There are also things print books do better (footnote reading; the ability to focus solely on the text at hand, far away from any electronic distraction; and, oh yeah, no battery or electronic glitch issues).
To those factors, I would add two more: First -- I think it's important to remember that Kindle doesn't actually give you a book. It gives you access to a book. For people who don't want to cart around old volumes or make multiple trips to the library, that might be considered a good thing. But at least one potential downside to this feature became painfully clear to many Kindle readers this summer when Amazon reached into its customers' Kindle libraries and took back two books for which the company realized it did not possess the copyright. Ironically, the books were by George Orwell -- including 1984, his book about the perils of centralized information control. Access goes both ways.
Second ... one of the writers of the Times essays, Prof. Alan Liu at the University of California, Santa Barbara, said that he didn't think anyone really made serendipitous discoveries while browsing the shelves of a physical library (so losing a physical library wouldn't be a loss, at least in that sense). Perhaps not, because most people go to libraries with specific search goals in mind. But bookstores, on the other hand ... there I'd disagree. I often browse the aisles of my local bookstores, just to see what's new and what might catch my eye. Most of the books I buy, in fact, are items I discovered while browsing ... something that, ironically, electronic "browsers" do not allow.
Browsing, to my way of thinking, is what I do in Filene's Bargain Basement. The clothes there are a jumbled mass, so even if you go in looking for a shirt, you might end up with a pair of slacks that just happened to be hanging nearby. Same with a bookstore. Same, in fact, with the print version of the New York Times I get every morning. I scan the pages just to see what might catch my eye: sometimes it's a photo, sometimes a leading paragraph, sometimes a headline, and sometimes a callout. Or, sometimes, I'll be reading one article and another on that same page will catch my attention--one I never would have sought out on my own. And my knowledge and understanding of the world are far better and broader for all those serendipitous juxtapositions.
Electronic media and browsers have many good qualities, but they're lousy for that kind of unspecific window shopping. Browsers don't browse. They help you do specific searches. Looking for a black coat, or that article Sam Smith wrote two months ago on synthetic sneaker soles? The internet is great. Not sure what you want? Heaven help you. So to lose physical collections of books, either in stores or on individual bookshelves, would make it harder to make those delightful side discoveries that take us out of our narrow fields of focus and interest and, potentially, broaden our minds.
2. In the case of adults, we probably process information similarly in both electronic and print formats ... with two important distinctions. The first is that electronic books, with hyperlinks and connections to a world-wide web of side roads, offer far more distractions to the reader. In doing a research paper, this can be useful. But it also offers temptations to divert our attention from a deeper immersion in a story or text--temptations our brains are poorly equipped to resist. (Apparently we change tasks, on average, every three minutes when working in an internet-connected environment.)
"Frequent task-switching costs time and interferes with the concentration needed to think deeply about what you read," cautioned Sandra Aamodt, the former editor of Nature Neuroscience and another of the Times essayists.
The second feature of electronic reading, which may compound this first effect, is that there is evidently something about an electronic medium, with its "percentage done" scale and electronic noises or gizmos, that makes us crave and focus on those rewards. Which is probably why electronic games are more addictive than board games. After a couple of rounds of solitaire with real cards, I'm done and ready to move on to something else. But I removed the solitaire software from my computer almost 20 years ago when I realized that I couldn't seem to tear myself away from it, once I started playing.
Is our comprehension, and more importantly what Proust apparently called "the heart of reading" ("when we go beyond the author's wisdom and enter the beginning of our own," as one of the essayists put it), impacted by a heightened drive to make progress through a text? If so, that would be a bad thing, and a point worth studying further.
3. Most adults, however, at least have the ability to process longer and deeper contemplative thoughts from what we read, even if we don't always exercise that ability. But according to Maryanne Wolf, a cognitive neuroscientist and child development specialist at Tufts University, that ability to focus attention deeply and for a concerted length of time is learned, not innate. Children apparently have to develop neural pathways and circuits for reading, and those circuits are affected by the demands of the reading material. Chinese children learning a more symbolic and visual language, for instance, develop different circuits than English-speaking children.
So electronic reading, especially with hyperlinks and video embeds and other built-in distractions, could keep young readers from developing some important circuits. As Wolf put it in her essay:
"My greatest concern is that the young brain will never have the time (in milliseconds or in hours or in years) to learn to go deeper into the text after the first decoding, but rather will be pulled by the medium to ever more distracting information, sidebars, and now, perhaps, videos (in the new vooks). The child's imagination and children's nascent sense of probity and introspection are no match for a medium that creates a sense of urgency to get to the next piece of stimulating information. The attention span of children may be one of the main reasons why an immersion in on-screen reading is so engaging, and it may also be why digital reading may ultimately prove antithetical to the long-in-development, reflective nature of the expert reading brain as we know it."
Interestingly enough, the one computer scientist in the group was of the opinion that the best use of electronic books and capabilities was to enhance print books, not to replace them. But it's all food for thought ... and, hopefully, fodder for more research as electronic readers find their way into more households and hands.
Forget credit hours—in a quest to cut costs, universities are simply asking students to prove their mastery of a subject.
MANCHESTER, Mich.—Had Daniella Kippnick followed in the footsteps of the hundreds of millions of students who have earned university degrees in the past millennium, she might be slumping in a lecture hall somewhere while a professor droned. But Kippnick has no course lectures. She has no courses to attend at all. No classroom, no college quad, no grades. Her university has no deadlines or tenure-track professors.
Instead, Kippnick makes her way through different subjects on the way to a bachelor’s in accounting. When she feels she’s mastered one, she takes a test at home, where a proctor watches her from afar by monitoring her computer and watching her over a video feed. If she proves she’s competent—by getting the equivalent of a B—she passes and moves on to the next subject.
Bernie Sanders and Jeb Bush look abroad for inspiration, heralding the end of American exceptionalism.
This election cycle, two candidates have dared to touch a third rail in American politics.
Not Social Security reform. Not Medicare. Not ethanol subsidies. The shibboleth that politicians are suddenly willing to discuss is the idea that America might have something to learn from other countries.
The most notable example is Bernie Sanders, who renewed his praise for Western Europe in a recent interview with Ezra Klein. “Where is the UK? Where is France? Germany is the economic powerhouse in Europe,” Sanders said. “They provide health care to all of their people, they provide free college education to their kids.”
On ABC’s This Week in May, George Stephanopoulos asked Sanders about this sort of rhetoric. “I can hear the Republican attack ad right now: ‘He wants America to look more like Scandinavia,’” the host said. Sanders didn’t flinch:
The Wall Street Journal’s eyebrow-raising story of how the presidential candidate and her husband accepted cash from UBS without any regard for the appearance of impropriety that it created.
The Swiss bank UBS is one of the biggest, most powerful financial institutions in the world. As secretary of state, Hillary Clinton intervened to help it out with the IRS. And after that, the Swiss bank paid Bill Clinton $1.5 million for speaking gigs. The Wall Street Journal reported all that and more Thursday in an article that highlights huge conflicts of interest that the Clintons have created in the recent past.
The piece begins by detailing how Clinton helped the global bank.
“A few weeks after Hillary Clinton was sworn in as secretary of state in early 2009, she was summoned to Geneva by her Swiss counterpart to discuss an urgent matter. The Internal Revenue Service was suing UBS AG to get the identities of Americans with secret accounts,” the newspaper reports. “If the case proceeded, Switzerland’s largest bank would face an impossible choice: Violate Swiss secrecy laws by handing over the names, or refuse and face criminal charges in U.S. federal court. Within months, Mrs. Clinton announced a tentative legal settlement—an unusual intervention by the top U.S. diplomat. UBS ultimately turned over information on 4,450 accounts, a fraction of the 52,000 sought by the IRS.”
Who can devise the most convoluted way to wipe out the Islamic State?
Everyone with a stake in Middle Eastern geopolitics publicly declares that ISIS must be defeated. Yet opinions range widely on how this should be achieved.
Saudi Arabia, for example, believes ISIS cannot be defeated unless Syrian President Bashar al-Assad is removed from power. Turkey has just convinced NATO nations that the war against ISIS can only be won if Turkey’s traditional Kurdish opponents are neutralized first. Israel sees only one way to defeat ISIS: destroy Iran’s nuclear program and clip its wings regionally.
So what explains these apparently contradictory aims? The cynical view would be that all these parties are less interested in defeating ISIS than in achieving their own regional goals, and that they’re only pretending to be concerned about wiping out the group. Clearly, however, there is no place for cynicism in Middle Eastern politics. Everyone involved in the region is known to be sincere, albeit in radically different ways.
A hawkish senator doesn’t apply the lessons of Iraq.
Earlier this week, Senator Lindsey Graham, a hawkish Republican from South Carolina, used a Senate Armed Services Committee hearing to stage a theatrical display of his disdain for the Obama administration’s nuclear deal with Iran.
The most telling part of his time in the spotlight came when he pressed Defense Secretary Ashton Carter to declare who would win if the United States and Iran fought a war:
Here’s a transcript of the relevant part:
Graham: Could we win a war with Iran? Who wins the war between us and Iran? Who wins? Do you have any doubt who wins?
Carter: No. The United States.
Graham: We. Win.
Little more than a decade ago, when Senator Graham urged the invasion of Iraq, he may well have asked a general, “Could we win a war against Saddam Hussein? Who wins?” The answer would’ve been the same: “The United States.” And the U.S. did rout Hussein’s army. It drove the dictator into a hole, and he was executed by the government that the United States installed. And yet, the fact that the Iraqi government of 2002 lost the Iraq War didn’t turn out to mean that the U.S. won it. It incurred trillions in costs; thousands of dead Americans; thousands more with missing limbs and post-traumatic stress disorder and years of deployments away from spouses and children; and in the end, a broken Iraq with large swaths of its territory controlled by ISIS, a force the Iraqis cannot seem to defeat. That’s what happened last time a Lindsey Graham-backed war was waged.
Most of the big names in futurism are men. What does that mean for the direction we’re all headed?
In the future, everyone’s going to have a robot assistant. That’s the story, at least. And as part of that long-running narrative, Facebook just launched its virtual assistant. They’re calling it Moneypenny, after the secretary from the James Bond films. Which means the symbol of our march forward, once again, ends up being a nod back. In this case, Moneypenny is a throwback to an age when Bond’s womanizing was a symbol of manliness and many women were, no matter what they wanted to be doing, secretaries.
Why can’t people imagine a future without falling into the sexist past? Why does the road ahead keep leading us back to a place that looks like the Tomorrowland of the 1950s? Well, when it comes to Moneypenny, here’s a relevant data point: More than two-thirds of Facebook employees are men. That’s a ratio reflected among another key group: futurists.
An alpenhorn performance in Switzerland, a portrait of Vladimir Putin made of spent ammunition from Ukraine, fireworks in North Korea, Prince Charles surprised by an eagle, wildfire in California, protests in the Philippines and Turkey, a sunset in Crimea, and much more.
The Vermont senator’s revolutionary zeal has met its moment.
There’s no way this man could be president, right? Just look at him: rumpled and scowling, bald pate topped by an entropic nimbus of white hair. Just listen to him: ranting, in his gravelly Brooklyn accent, about socialism. Socialism!
And yet here we are: In the biggest surprise of the race for the Democratic presidential nomination, this thoroughly implausible man, Bernie Sanders, is a sensation.
He is drawing enormous crowds—11,000 in Phoenix, 8,000 in Dallas, 2,500 in Council Bluffs, Iowa—the largest turnout of any candidate from any party in the first-to-vote primary state. He has raised $15 million in mostly small donations, to Hillary Clinton’s $45 million—and unlike her, he did it without holding a single fundraiser. Shocking the political establishment, it is Sanders—not Martin O’Malley, the fresh-faced former two-term governor of Maryland; not Joe Biden, the sitting vice president—to whom discontented Democratic voters looking for an alternative to Clinton have turned.
Even when they’re adopted, the children of the wealthy grow up to be just as well-off as their parents.
Lately, it seems that every new study about social mobility further corrodes the story Americans tell themselves about meritocracy; each one provides more evidence that comfortable lives are reserved for the winners of what sociologists call the birth lottery. But, recently, there have been suggestions that the birth lottery’s outcomes can be manipulated even after the fluttering ping-pong balls of inequality have been drawn.
What appears to matter—a lot—is environment, and that’s something that can be controlled. For example, one study out of Harvard found that moving poor families into better neighborhoods greatly increased the chances that children would escape poverty when they grew up.
While it’s well documented that the children of the wealthy tend to grow up to be wealthy, researchers are still working out how and why that happens. Perhaps they grow up to be rich because they genetically inherit certain skills and preferences, such as a tendency to tuck away money into savings. Or perhaps it’s mostly because wealthier parents invest more in their children’s education and help them get well-paid jobs. Is it more nature, or more nurture?
Jim Gilmore joins the race, and the Republican field jockeys for spots in the August 6 debate in Cleveland.
After decades as the butt of countless jokes, it’s Cleveland’s turn to laugh: Seldom have so many powerful people been so desperate to get to the Forest City. There’s one week until the Republican Party’s first primary debate of the cycle on August 6, and now there’s a mad dash to get into the top 10 and qualify for the main event.
With former Virginia Governor Jim Gilmore filing papers to run for president on July 29, there are now 17 “major” candidates vying for the GOP nomination, though that’s an awfully imprecise descriptor. It takes in candidates with lengthy experience and a good chance at the White House, like Scott Walker and Jeb Bush; at least one person who is polling well but is manifestly unserious, namely Donald Trump; and people with long experience but no chance at the White House, like Gilmore. Yet it also excludes other people with long experience but no chance at the White House, such as former IRS Commissioner Mark Everson.