The Autumn of the Multitaskers
Neuroscience is confirming what we all suspect: Multitasking is dumbing us down and driving us crazy. One man’s odyssey through the nightmare of infinite connectivity
Illustrations by Istvan Banyai
I think your suggestion is, Can we do two things at once? Well, we’re of the view that we can walk and chew gum at the same time.
—Richard Armitage, deputy secretary of state, on the wars in Afghanistan and Iraq, June 2, 2004 (Armitage announced his resignation on November 16, 2004.)
To do two things at once is to do neither.
—Publilius Syrus, Roman slave, first century B.C.
In the midwestern town where I grew up (a town so small that the phone line on our block was a “party line” well into the 1960s, meaning that we shared it with our neighbors and couldn’t use it while one of them was using it, unless we wanted to quietly listen in—with their permission, naturally, and only if we were feeling awfully lonesome—while they chatted with someone else), there were two skinny brothers in their 30s who built a car that could drive into the river and become a fishing boat.
My pals and I thought the car-boat was a wonder. A thing that did one thing but also did another thing—especially the opposite thing, but at least an unrelated thing—was our idea of a great invention and a bold stride toward the future. Where we got this idea, I’ll never know, but it caused us to envision a world to come teeming with crossbred, hyphenated machines. Refrigerator-TV sets. Dishwasher-air conditioners. Table saw-popcorn poppers. Camera-radios.
With that last dumb idea, we were getting close to something, as I’ve noted every time I’ve dropped or fumbled my cell phone and snapped a picture of a wall or the middle button of my shirt. Impressive. Ingenious. Yet juvenile. Arbitrary. And why a substandard camera, anyway? Why not an excellent electric razor?
Because (I told myself at the cell-phone store in the winter of 2003, as I handled a feature-laden upgrade that my new contract entitled me to purchase at a deep discount that also included a rebate) there may come a moment on a plane or in a subway station or at a mall when I and the other able-bodied males will be forced to subdue a terrorist, and my color snapshot of his trussed-up body will make the front page of USA Today and appear at the left shoulder of all the superstars of cable news.
While I waited for my date with citizen-journalist destiny, I took a lot of self-portraits in my Toyota and forwarded them to a girlfriend in Colorado, who reciprocated from her Jeep. Neither one of us almost died. For months. But then, one night on a snowy two-lane highway, while I was crossing Wyoming to see my girl’s real face, my phone made its chirpy you-have-a-picture noise, and I glanced down in its direction while also, apparently, swerving off the pavement and sailing over a steep embankment toward a barbed-wire fence.
It was interesting to me—in retrospect, after having done some reading about the frenzied activity of the multitasking brain—how late in the process my prefrontal cortex, where our cognitive switchboards hide, changed its focus from the silly phone (Where did it go? Did it slip between the seats? I wonder if this new photo is a nude shot or if it’s another one from the topless series that seemed like such a breakthrough a month ago but now I’m getting sick of) to the important matter of a steel fence post sliding spear-like across my hood …
(But her arms are too short to shoot a nude self-portrait with a camera phone. She’d have to do it in a mirror …)
The laminated windshield glass must have been high quality; the point of the post bounced off it, leaving only a star-shaped surface crack. But I was still barreling toward sagebrush, and who knew what rocks and boulders lay in wait …
Then the phone trilled out its normal ringtone.
Five minutes later, I’d driven out of the field and gunned it back up the embankment onto the highway and was proceeding south, heart slowing some, satellite radio tuned to a soft-rock channel called the Heart, which was playing lots of soothing Céline Dion.
“I just had an accident trying to see your picture.”
“Will you get here in time to take me out to dinner?”
“I almost died.”
“Well, you sound fine.”
“Fine’s not a sound.”
I never forgave her for that detachment. I never forgave myself for buying a camera phone.
The abiding, distinctive feature of all crashes, whether in stock prices, housing values, or hit-TV-show ratings, is that they startle but don’t surprise. When the euphoria subsides, when the volatile graph lines of excitability flatten and then curve down, people realize, collectively and instantly (and not infrequently with some relief), that they’ve been expecting this correction. The signs were everywhere, the warnings clear, the researchers in rough agreement, and the stories down at the bar and in the office (our own stories included) revealed the same anxieties.
Which explains why the busts and reversals we deem inevitable are also the least preventable, and why they startle us, if briefly, when they come—because they were inevitable for so long that they should have come already. That they haven’t, we reason, can mean only one of two things. Thanks to technology or some other magic, we’ve entered a new age when the laws of cause and effect (as propounded by Isaac Newton and Adam Smith) have yielded to the principle of dream-and-make-it-happen (as manifested by Steve Jobs and Oprah). Either that, or the thing that went up and up and up and hasn’t come down, though it should have long ago, is being held aloft by our decision to forget it’s up there and to carry on as though it weren’t.
But on to the next inevitable contraction that everybody knows is coming, believes should have come a couple of years ago, and suspects can be postponed only if we pay no attention to the matter and stay very, very busy. I mean the end of the decade we may call the Roaring Zeros—these years of overleveraged, overextended, technology-driven, and finally unsustainable investment of our limited human energies in the dream of infinite connectivity. The overdoses, freak-outs, and collapses that converged in the late ’60s to wipe out the gains of the wide-eyed optimists who set out to “Be Here Now” but ended up making posters that read “Speed Kills” are finally coming for the wired utopians who strove to “Be Everywhere at Once” but lost a measure of innocence, or should have, when their manic credo convinced us we could fight two wars at the same time.
The Multitasking Crash.
The Attention-Deficit Recession.
We all remember the promises. The slogans. They were all about freedom, liberation. Supposedly we were in handcuffs and wanted out of them. The key that dangled in front of us was a microchip.
“Where do you want to go today?” asked Microsoft in a mid-1990s ad campaign. The suggestion was that there were endless destinations—some geographic, some social, some intellectual—that you could reach in milliseconds by loading the right devices with the right software. It was further insinuated that where you went was purely up to you, not your spouse, your boss, your kids, or your government. Autonomy through automation.
This was the embryonic fallacy that grew up into the monster of multitasking.
Human freedom, as classically defined (to think and act and choose with minimal interference by outside powers), was not a product that firms like Microsoft could offer, but they recast it as something they could provide. A product for which they could raise the demand by refining its features, upping its speed, restyling its appearance, and linking it up with all the other products that promised freedom, too, but had replaced it with three inferior substitutes that they could market in its name:
Efficiency, convenience, and mobility.
For proof that these bundled minor virtues don’t amount to freedom but are, instead, a formula for a period of mounting frenzy climaxing with a lapse into fatigue, consider that “Where do you want to go today?” was really manipulative advice, not an open question. “Go somewhere now,” it strongly recommended, then go somewhere else tomorrow, but always go, go, go—and with our help. But did any rebel reply, “Nowhere. I like it fine right here”? Did anyone boldly ask, “What business is it of yours?” Was anyone brave enough to say, “Frankly, I want to go back to bed”?
Maybe a few of us. Not enough of us. Everyone else was going places, it seemed, and either we started going places, too—especially to those places that weren’t places (another word they’d redefined) but were just pictures or documents or videos or boxes on screens where strangers conversed by typing—or else we’d be nowhere (a location once known as “here”) doing nothing (an activity formerly labeled “living”). What a waste this would be. What a waste of our new freedom.
Our freedom to stay busy at all hours, at the task—and then the many tasks, and ultimately the multitask—of trying to be free.
While the president continued talking on the phone (Ms. Lewinsky understood that the caller was a Member of Congress or a Senator), she performed oral sex on him.
—The Starr Report, 1998
It isn’t working, it never has worked, and though we’re still pushing and driving to make it work and puzzled as to why we haven’t stopped yet, which makes us think we may go on forever, the stoppage or slowdown is coming nonetheless, and when it does, we’ll be startled for a moment, and then we’ll acknowledge that, way down deep inside ourselves (a place that we almost forgot even existed), we always knew it couldn’t work.
The scientists know this too, and they think they know why. Through a variety of experiments, many using functional magnetic resonance imaging to measure brain activity, they’ve torn the mask off multitasking and revealed its true face, which is blank and pale and drawn.
Multitasking messes with the brain in several ways. At the most basic level, the mental balancing acts that it requires—the constant switching and pivoting—energize regions of the brain that specialize in visual processing and physical coordination and simultaneously appear to shortchange some of the higher areas related to memory and learning. We concentrate on the act of concentration at the expense of whatever it is that we’re supposed to be concentrating on.
What does this mean in practice? Consider a recent experiment at UCLA, where researchers asked a group of 20-somethings to sort index cards in two trials, once in silence and once while simultaneously listening for specific tones in a series of randomly presented sounds. The subjects’ brains coped with the additional task by shifting responsibility from the hippocampus—which stores and recalls information—to the striatum, which takes care of rote, repetitive activities. Thanks to this switch, the subjects managed to sort the cards just as well with the musical distraction—but they had a much harder time remembering what, exactly, they’d been sorting once the experiment was over.
Even worse, certain studies find that multitasking boosts the level of stress-related hormones such as cortisol and adrenaline and wears down our systems through biochemical friction, prematurely aging us. In the short term, the confusion, fatigue, and chaos merely hamper our ability to focus and analyze, but in the long term, they may cause it to atrophy.
The next generation, presumably, is the hardest-hit. They’re the ones way out there on the cutting edge of the multitasking revolution, texting and instant messaging each other while they download music to their iPods and update their Facebook pages and complete a homework assignment and keep an eye on the episode of The Hills flickering on a nearby television. (A recent study from the Kaiser Family Foundation found that 53 percent of students in grades seven through 12 report consuming some other form of media while watching television; 58 percent multitask while reading; 62 percent while using the computer; and 63 percent while listening to music. “I get bored if it’s not all going at once,” said a 17-year-old quoted in the study.) They’re the ones whose still-maturing brains are being shaped to process information rather than understand or even remember it.
This is the great irony of multitasking—that its overall goal, getting more done in less time, turns out to be chimerical. In reality, multitasking slows our thinking. It forces us to chop competing tasks into pieces, set them in different piles, then hunt for the pile we’re interested in, pick up its pieces, review the rules for putting the pieces back together, and then attempt to do so, often quite awkwardly. (Fact, and one more reason the bubble will pop: A brain attempting to perform two tasks simultaneously will, because of all the back-and-forth stress, exhibit a substantial lag in information processing.)
Productive? Efficient? More like running up and down a beach repairing a row of sand castles as the tide comes rolling in and the rain comes pouring down. Multitasking, a definition: “The attempt by human beings to operate like computers, often done with the assistance of computers.” It begins by giving us more tasks to do, making each task harder to do, and dimming the mental powers required to do them. It finishes by making us forget exactly how on earth we did them (assuming we didn’t give up, or “multiquit”), which makes them harder to do again.
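That definition, humans trying to operate like computers, can even be run as a computation. The sketch below is a toy model, not science: every quantity in it (a hypothetical pair of 100-unit tasks, 10-unit time slices, a flat 5-unit penalty per context switch) is invented for illustration. It shows only how a per-switch tax compounds into the substantial processing lag the researchers describe.

```python
# Toy model of "dual-task interference". Every number here is invented
# purely for illustration: two hypothetical tasks of 100 work units each,
# and a flat 5-unit penalty charged at every context switch.

def sequential(work_a, work_b):
    """Finish task A completely, then task B: no switching penalty."""
    return work_a + work_b

def interleaved(work_a, work_b, slice_size, switch_cost):
    """Alternate between the tasks in small slices, paying per switch."""
    remaining = [work_a, work_b]
    current = 0
    total = 0
    while any(r > 0 for r in remaining):
        step = min(slice_size, remaining[current])
        remaining[current] -= step
        total += step
        if remaining[1 - current] > 0:  # switch only if the other task is unfinished
            total += switch_cost
            current = 1 - current
    return total

print(sequential(100, 100))          # 200: the work itself
print(interleaved(100, 100, 10, 5))  # 295: the same work plus 19 switch penalties
```

In this toy model, halving the slice size nearly doubles the number of switches, so finer-grained juggling makes the overhead worse, not better.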
Much of the problem is the metaphor. Or perhaps it’s our need for metaphors in general, particularly when the subject is our minds and the comparison seems based on science. In the days of rudimentary chemistry, the mind was thought to be a beaker of swirling volatile essences. Then came classical physical mechanics, and the mind was regarded as a clocklike thing, with springs and wheels. Then it was steam-driven, maybe. A combustion chamber. Then came electricity and Freud, and it was a dynamo of polarized energies—the id charged one way, the superego the other.
Now, in the heyday of the microchip, the brain is a computer. A CPU.
Except that it’s not a CPU. It’s whatever that thing is that’s driven to misconstrue itself—over and over, century after century—as a prototype, rendered in all-too-vulnerable tissue, of our latest marvel of technology. And before the age of modern technology, theology. Further back than that, it’s hard to voyage, since there was a period, common sense suggests, when we didn’t even know we had brains. Or minds. Or spirits. Humans just sort of did stuff. And what they did was not influenced by metaphors about what they ought to be capable of doing but very well might not be equipped for (assuming you wanted to do it in the first place), like editing a playlist to e-mail to the lover whose husband you’re interviewing on the phone about the movie he made that you’re discussing in the blog entry you’re posting tomorrow morning and are one-quarter watching certain parts of as you eat salad and carry on the call.
Would it be possible someday—through drugs, maybe, or esoteric Buddhism, or some profound, postapocalyptic languor—to stop coming up with ideas of what we are and then laboring to live up to them?
The great spooky splendor of the brain, of course, is that no matter what we think it fundamentally resembles—even a small ethereal colosseum where angels smite demons and demons play dead, then suddenly spit fire into the angels’ faces—it does a good job, a great job, of seeming to resemble it.
For a while.
I do like to read a book while having sex. And talk on the phone. You can get so much done.
—Jennifer Connelly, movie star, 2005
After the near-fatal consequences of my 2003 decision to buy a phone with a feature I didn’t need, life went on, and rather rapidly, since multitasking eats up time in the name of saving time, rushing you through your two-year contract cycle and returning you to the company store with a suspicion that you didn’t accomplish all you hoped to after your last optimistic, euphoric visit.
“Which of the ones that offer rebates don’t have cameras in them?”
“The decent models all do. The best ones now have video capabilities. You can shoot little movies.”
I wanted to ask, Of what? Oncoming barbed wire? The salesman was a believer, though—a zealot.
“Oh, yeah,” he said, “as well as GPS-based, turn-by-turn navigation systems. Which are cool if you drive a lot.”
“You have to look down at the screen, though.”
“They’re paid subscription services, you need to know, but we’re giving away the first month free, and even after that, the rates are reasonable.”
I shook my head. I was turning down whiz-bang features for the first time, and so had some of my friends, one of whom had sprung for a new BlackBerry that he’d holed up in his office to learn to use. He’d emerged a week later looking demoralized, muttering about getting old, although he’d just turned 34.
“Those little ones there—the ones that aren’t so slim, that you give away free.”
“That too is an option. Mostly they’re aimed at kids, though. Adolescents.”
I wanted one anyway. I’d caught air in my Land Cruiser off a sheer embankment, lost my girlfriend, chucked my dream of snapping a hog-tied terrorist, and once, because of another girl—a jealous type who never trusted that I was where I said I was—I’d been forced to send on a shot of L.A. palm trees to prove that I was not in Oregon meeting up with yet another girl whom I’d drunk coffee with after a poetry reading and who must have been bombed a few weeks later when she sent me a text message at 3 a.m. while I was sleeping beside the jealous girl. My bedmate heard the ring, crept out of bed, and read the message, then woke me up and demanded that I explain why it seemed to suggest we’d shared more than double espressos—an effect curiously enhanced by the note’s thumb-typed dyslexic style: Thuoght I saw thoes parkly eyes this aft, that sensaul deivlish mouth, and it took me rihgt in again, like vapmires do.
“I’ll take the fat little free one,” I told the salesman.
“The thing’s inert. It does nothing. It’s a pet rock.”
I informed him that I was old enough to have actually owned a pet rock once and that I missed it.
Here’s the worst of the chilling little thoughts that have come to me during microtasking seize-ups: For every driver who’s ever died while talking on a cell phone (researchers at the Harvard Center for Risk Analysis estimate that some 2,600 deaths and 330,000 injuries may be caused by drivers on cell phones each year), there was someone on the other end who, chances are, was too distracted to notice. Too busy cooking, NordicTracking, fluffing up his online dating profile, or—most hauntingly of all, I’d think, for a listener destined to discover that the acoustic chaos he’d interpreted as the other phone going out of range, or perhaps as a network-wide disturbance triggered by a solar flare, was actually a death, a human death, a death he had some role in—sitting on the toilet.
Or would watching streaming pornography be worse?
Not that both of these activities can’t be performed on the same computer screen. And often are—you can bet on it. In bathrooms. Even airport bathrooms, on occasion. In some of which, via radio, the latest business headlines can be monitored, permitting (in theory and therefore in fact, because, as the First Law of Multitasking dictates, any two or eight or 16 processes that can overlap must overlap) the squatting day trader viewing the dirty Webcast (while on the phone with someone, don’t forget) to learn that the company he just bought stock in has entered merger talks with his own employer and surged almost 20 percent in under three minutes!
“Guess how much richer I’ve gotten while we’ve been yakking?” he says into his cell, breaking his own rule about pretending that when he’s on the phone, he’s on the phone. Exclusively. Fully. With his entire being.
Must be driving through a tunnel.
I’ve been fired, I’ve been insulted in front of co-workers, but the time I flew thousands of miles to meet a boss who spent our first and only hour together politely nodding at my proposals while thumbing out messages on a new device, whose existence neither of us acknowledged and whose screen he kept tilted so I couldn’t see it, still feels, five years later, like the low point of my career.
This is the perfect “one plus one equals three” opportunity.
—Robert Pittman, president and COO of America Online, on the merger between AOL and Time Warner, 2000
There may be a financial cost to multitasking as well. The sum is extremely large and hard to vouch for, the esoteric algorithm that yielded it a puzzle to all but its creator, possibly, but it’s one of those figures that’s fun to quote in bars.
Six hundred and fifty billion dollars. That’s what we might call our National Attention Deficit, according to Jonathan B. Spira, who’s the chief analyst at a business-research firm called Basex and has estimated the per annum cost to the economy of multitasking-induced disruptions. (He obtained the figure by surveying office workers across the country, who reported that some 28 percent of their time was wasted dealing with multitasking-related transitions and interruptions.)
That $650 billion reflects just one year’s loss. This means that the total debt is vastly higher, since personal digital assistants (the devices that, in my opinion, turned multitasking from a habit into a pathology, which the advent of Bluetooth then rendered fatal and the spread of wireless broadband made communicable) are several annums old. This puts our shortfall somewhere in the trillions— even before we add in the many billions that vanished when Time Warner and AOL joined their respective corporate missions—so ably accomplished when the firms were separate—into one colossal mission impossible.
And don’t forget to add Enron to the tab, a company that seemed to master so many enterprises, trading everything from energy to weather futures, that the Wall Street analysts’ brains froze up trying to “recontext” (another science term) what looked at first like a capitalist dynamo as the street-corner con that it turned out to be. Reports suggest that the illusion depended nearly as much on cunning set design as it did on phony accounting. The towering stack of Broadway stages that Enron called its headquarters—with its profusion of workstations, trading boards, copiers, speakerphones, fax machines, and shredders—made visiting banker-broker types go snow-blind. When the fraud was exposed, the press accused the moneymen of overestimating Enron. In truth, they’d underestimated Enron, whose hectic multitasking front concealed the managers’ Zenlike focus on one proficiency, and only one.
Which is easy to practice on an audience whose brains are already half dormant from the stress of scheduling flights on fractionally owned jets and changing the tilt and speed of treadmills according to the shifting readouts of miniature biofeedback monitors strapped around their upper arms.
What has the madness of multitasking cost us? The better question might be: What hasn’t it?
And the IOUs keep coming, signed at the bottom with millions of our names. We issued this currency. We’re the Federal Reserve of the attention economy, the central bank of overcommitment, keeping the system liquid with adrenaline. The problem is that we, the bankers, are also the borrowers. That’s multitasking for you. It moves in circles. Circles that we run around ourselves, as we try to pay off the debts we owe ourselves with funny money engraved with our own faces.
Here’s one item from my ledger:
Cost of pitying Kevin Federline while organizing business trip online and attaching computer peripheral: $279.
Federline—I know. A mayfly on the multimedia river who, now that he has mated, deserves to break back up into pixels. That he hasn’t means pixels are far too cheap and plentiful, particularly on the AOL welcome page, where for several months last year Federline’s image was regularly positioned beside the icon I click to get my e-mail. With practice, I learned to sweep past him the way the queen sweeps past her guards, but one afternoon his picture triggered a brainslide that buried half my day.
What the avalanche overwhelmed was a mental function that David E. Meyer, a psychology professor at the University of Michigan, calls “adaptive executive control.” Thanks to Federline, I lost my ability, as Meyer would say, to “schedule task processes appropriately” and to “obey instructions about their relative priorities.”
Meyer, it’s worth noting, is a relative optimist among the researchers studying multitasking, since he’s convinced that some people can learn, with enough practice, to perform two tasks simultaneously as successfully as if they were doing them sequentially. But “enough practice” turns out to mean at least 2,000 tries, and I had just the one chance at the cheap fare to San Francisco that I’d turned on my laptop to reserve, only to be distracted by the picture of Federline winking at me from one browser window over.
The photo, a link explained, was taken while Federline was taping a TV show and happened to peer down at his phone, only to learn that what’s-her-hair, his wife, the psycho, bad-mother rehab-escapee (I had last caught up on her misadventures weeks or months before, while waiting out an eBay auction for an auxiliary hard drive “still in box”), had sent him a text message asking for a divorce. Federline’s face looked as raw as a freshly unbandaged plastic-surgery patient’s, but the aspect of the photo that grabbed me (as the promotional fare hovered in the ether, still unbooked and up for grabs) was the idea I suddenly entertained about its origins. The picture of Federline in cell-phone shock had been snapped on the sly by another phone, I sensed, and possibly by a hanger-on whom Federline regarded as a “bro.” It also seemed plausible that after the taping, Federline bought dinner for this Judas—who, in my reconstruction of events, had already beamed the spy shot to a tabloid and been wired big money in return. If so, he was probably richer than Federline, who depended for funds on the wife who’d just dumped him.
This thought sequence caused me to remember the hard drive—still sitting unopened in a closet—that I’d bought in that Internet auction way back when, while catching up on the Hollywood gossip news. Here’s the mental flowchart: Federline dumped > story about his prenuptial with Britney Spears > story was read during eBay auction > time to get some use out of my purchase.
Removing the hard drive from its shell of molded Styrofoam sloppily wrapped in masking tape stirred serious doubts about the seller’s claim that the gadget was unused. This put me in a quandary. Should I send the hard drive back? Blackball its seller on a message board? Best to test it first. I riffled through drawers to find the proper cable, plugged the device into a USB port, and only then became aware of the fluorescent Post-it note stuck in the corner of my laptop screen. “Grab discount SF fare,” the note read. Where had it gone? Where had I gone, rather? How could a piece of paper in a color specially formulated to signal the brain Important! Don’t Ignore! be upstaged by a picture of a sad minor celebrity? If the Post-it note had been a road sign warning of a hairpin mountain curve and Federline’s photo a radio interview, I and my car would be rolling down a cliff now.
Back to the San Francisco ticket, then. I brought up the main Expedia/Orbitz/Travelocity page and typed in the code for the San Francisco airport, which I couldn’t believe I got wrong. To fix it, I was forced to use one of those drop-down alphabetized lists that the highlight line always moves too fast through, meaning I click my mouse several entries too late. Seattle this time. I scrolled back up.
All tickets sold out.
The scientists call this ruinous mental lurching “dual task interference,” or just plain bottlenecking. I call it the reason Kevin Federline cost me a cheap flight to San Francisco. (It also explains, perhaps, why sexual threesomes are often disappointing.)
I just wish the military understood the concept. They might understand then why “walking and chewing gum” in Afghanistan and Iraq is no way to catch bin Laden.
My hunch is that when we look back on it someday, at our juggling of electronic lives and the array of subtly different personas that each one encourages (we’re terse when texting, freewheeling on the phone, and in some middle state while e-mailing), the spectacle will appear as quaint and stylized as those scenes in old movies of stiff-backed lady operators, hair in bobby pins, rapidly swapping phone jacks from hole to hole as they connect Chicago to Miami, reporter to city desk, businessman to mistress. Such scenes were, for a time, cinematic shorthand for the frenzy of modern life, but then communications technology changed, and those operators lost their jobs.
We’ve got to be patient and committed [in Iraq], but we’ve got to multitask … We’ve got to talk about Iran—Iran is more dangerous than Iraq—and we have got to get the job done in Afghanistan and in Pakistan.
—Rudolph Giuliani, Republican presidential candidate, July 2007
The night the bubble finally popped for me began when I pushed a button on my hospital bed to summon the gray-haired night nurse. To convey my appreciation when she arrived and to help establish a relationship that I hoped would lead her to agree with me that my morphine drip was far too slow, I did as the gurus of management urge executives to do when they engage in important negotiations. I “reallocated” my “presence” and “enriched” my “medium.” I removed my headphones, closed my book, aimed the remote and clicked off the TV, and looked the old woman in the eye.
“What?” she said.
Her question came too quickly. Because of the way the human brain works—always lagging slightly, always falling a bit behind itself when it has to drop many things, one thing at a time, and refocus on a new thing—my attention had not yet caught up with my expression. Also, perhaps because of the way that morphine works, I was unnaturally aware of the mechanisms inside my mind. I could actually feel the neurological switching, the mental grinding of fine, tiny gears that makes multitasking such an inefficient, slow, error-prone, tiring way to get things done.
“Still hurts,” I finally said. “Wondering if you’d shorten up the intervals.” I left out the I’s, text message–style, because that’s how people in agony communicate. Teenagers, too, but aren’t they also in agony, with the shy self-consciousness of partials who don’t show all their cards, out of fear that they haven’t yet drawn many worth playing?
The nurse made a face that the gurus would call “equivocal”—meaning that it can support conflicting interpretations, even in a real-time, face-to-face, “presence-rich” exchange—and then glanced down at the iPod on my blanket.
“Book on tape,” I said.
“You can do those both at once?” She eyed the real book lying on my lap.
“Same one,” I said. “I like to double up.”
“Why?”

I had no answer. I had a comeback—Because I can, because it’s possible—but a comeback is just a way to keep things rolling when perhaps they ought to stop. When the nurse looked away and punched in new instructions on the keypad attached to my IV stand, I heard her thinking, No wonder this guy has kidney stones. No wonder he’s so hungry for narcotics. She turned around in time to see my hands moving from the book they’d just reopened to the tangled wires of the earphones.
“I’m grateful that you came so quickly and showed such understanding,” I said, not textishly, relaxing my syntax to suit the expectations of the elderly.
“Maybe more dope will be just the thing,” the nurse said, shedding equivocation with every word, as a dreamy warmth spread through my limbs and she soft-stepped out and shut the door. When I woke in the wee hours, my book, in both its forms, had slid off the bed onto the floor, the TV remote was lost among the blankets, and the blinking “sleep” indicator of the laptop computer I’ve failed to mention (delivered to my bedside by a friend who’d shared my delusion that even 25-bed Montana hospitals must offer wireless Internet these days) was exhaling onto the walls a lovely blue light that tempted me never to boot it up again.
That night, last May, as I drowsed and passed my stones, the mania left me, and it hasn’t returned.
What happened to the skinny brothers’ car-boat was that it sank the third time they took it fishing. It cracked down the length of its hull, took on water, then nose-dived into the sandy bottom, leaving its revved-up rear propeller sticking up two feet out of the river, furiously churning air until its creators returned in a canoe and whacked it silent with a crowbar.
The catastrophe, visible from half the town, was the talk of the party line that night, with most of the grown-ups joining in one pooled call that was still humming when I was sent to bed.
“Where do you want to go today?” Microsoft asked us.
Now that I no longer confuse freedom with speed, convenience, and mobility, my answer would be: “Away. Just away. Someplace where I can think.”