The White Cane as Technology

A conversation with scholar Georgina Kleege about what her cane tells her, how tech designers should think about visual impairments, and why "Bluetooth shoes for the blind" are a terrible idea

A couple of months ago, a cartoon appeared in the New Yorker magazine depicting people crossing a busy New York street, staring at their smartphones, swinging white canes to sense their surroundings.

The image got me thinking about the purported "inattentional blindness" induced by smartphones, and the poorly understood functions of the white cane. So I emailed with Georgina Kleege, a literature scholar, professor at UC Berkeley, and a daily user of both a smartphone and a white cane. I asked her about these technologies, 20/20 vision, app design, and rethinking navigational smarts.

Sara Hendren: Whenever I see someone using a cane on the street, I perceive it as such an elegant wayfinding device—that is, when paired with the magnificent sensing system of the human body. Can you talk a little bit about how a person learns to deploy a cane for maximal sensitivity?

Georgina Kleege: I think there's a popular misconception that blind people use a cane as an extension of the hand to feel the space around us. But, along with my cane, I use hearing, touch, and sometimes even olfactory perception in combination to get me where I want to go.

The cane’s tip sweeps the ground before my feet to alert me to obstacles and curbs, and to announce details about the texture of the surface underfoot. On regular routes, changes in the pavement’s texture signal that I am approaching a destination or turning point. But while I attend to this tactile information I am very conscious of sounds: both the echoes of the sound the cane makes, which can sometimes tell me something about my surroundings, and the sound of traffic, children at play in a schoolyard, footsteps behind or coming toward me, music playing at a corner bar, and so forth. Restaurants, bakeries, flower stands, drugstores, and bookshops all exhale their particular scents. To take advantage of all this information, I direct my attention outward in all directions, creating a sort of sphere of perception to surround me as I move.

The cartoon seems to imply that a dependence on phones for information and social interaction necessitates a new prosthetic. But my guess is that a smartphone and a cane are an interesting combination when used in tandem. Do you use one more than the other, or both in different ways, or something else?

Since I need to rely on my hearing to get around, I tend not to use my phone when I'm in motion. I sometimes use GPS navigation with turn-by-turn directions spoken out loud, but it can be tricky if I'm also listening for traffic sounds and other people I might walk into. When I use GPS, I prefer to get the maximum amount of information; I want to hear all the street names, all the businesses I pass. I retain a memory of this for future reference: "Oh, there's a Thai restaurant across the street from that movie theater," that kind of thing.

Not long ago, I came across a project: "Bluetooth shoes for the blind." The designers put sensors in the soles of a pair of shoes; a blind person would type a destination into the smartphone's GPS, and the shoes would vibrate to signal when to make a turn. The inventors admitted that these shoes would not help with maneuvering through crowded city streets. The shoes also can't distinguish between a curb and an open manhole. So I say—who are they kidding? It's an example of a kind of technology that's supposed to be attractive because it would replace the cane, making the blindness less visible, and allowing the blind person to "pass" more successfully as sighted.

It's also an example of the well-meaning but utterly wrong-headed notion that "ubiquitous computing" will save the world. I see a troubling number of technologies now touting themselves as prototypes for blind or deaf users—as an afterthought application. They've figured out a gadget that's tactile in its sensing and response functions, say, and then they think: Great! Now—what's the application? And the go-to becomes disability tech. It’s a more challenging, sustained effort to look closely at the tools in use already, especially ones that support the dynamic work of all human bodies in making choices and judgments as complex as those required by spatial navigation.

And yes, "passing" is always driving too much of the design impetus around disability tech. I'm curious if there are new tools or ideas that do seem promising to you?

In general, I find that the pared-down format of apps versus their corresponding websites actually makes them easier to use. On a website for a bank or whatever, I sometimes have to tab around quite a bit to find what I'm looking for. But on the phone there's less information displayed on each page, so it's easier to find.

I have one blind-specific app that's pretty useful. It's called BlindSquare, and it's the equivalent of Foursquare, in that it tells me about businesses and points of interest near my current location, or a location I plan to visit. My favorite function is the "Look Around" feature. I can turn around and point the phone in different directions and it will tell me what's there. There's also a "Simulate Location" feature that lets me do this from a distance. So if I'm going to meet you at a coffee shop, I can check out what other businesses are nearby before I actually get there.

Since the screenreader is built into the iPhone rather than something that needs to be added on, I often recommend it to sighted people who find it hard to read the teeny tiny print on a web page, or in an iBook, etc. The same goes for the dictation function: it may have originally been put there for a certain population, but why shouldn't everyone use it?

There are now some apps that let the user take a picture of something, a restaurant menu, a food package in the supermarket, and have the text read out loud. So far these are add-ons and thus expensive. And I've heard that they don't work so well, but I hold out hope.

You've written critically about the oversimplified notion of blindness as a state of having no vision at all, when in fact only 10-20 percent of people who are "legally blind"—in cultures where such a measure exists—see nothing. Most people, in other words, use the vision they have in connection to the sphere of perception you were describing. Some see high-contrast lights and darks, for example, or objects directly in front of them but not in their peripheral vision. And this is a spectrum that mirrors that of sightedness too. There's a really big difference between someone seeing with 20/40 vision versus someone seeing at 20/10—and to possess 20/20 vision, which is often associated with "seeing clearly" in every sense, is actually just to be statistically average. What implications does a more nuanced understanding of the relativity of vision have for technology design, either so-called "assistive" tech or personal devices?

Sara Hendren is an artist, researcher, and writer based in Cambridge, Massachusetts. She edits Abler and lectures at the Rhode Island School of Design.
