When Robots Hallucinate
What do Google's trippy neural network-generated images tell us about the human mind?

When a collection of artificial brains at Google began generating psychedelic images from otherwise ordinary photos, engineers compared what they saw to dreamscapes. They named their image-generation technique Inceptionism and called the code used to power it Deep Dream.
But many of the people who saw the images reacted the same way: These things didn’t come from a dream world. They came from an acid trip.
The computer-made images feature scrolls of color, swirling lines, stretched faces, floating eyeballs, and uneasy waves of shadow and light. The machines seemed to be hallucinating, and in a way that appeared uncannily human.

The idea behind the project was to test the extent to which a neural network had learned to recognize various animals and landscapes by asking the computer to describe what it saw. So, instead of just showing a computer a picture of a tree and saying, "tell me what this is," engineers would show the computer an image and say, "enhance whatever it is you see."
That’s how an ordinary photograph turned into a psychedelic one. [Image pair: the original photo and its Deep Dream rendering]

Google’s engineers say the effect is not unlike the way a person might find meaning in a cloudscape. When asked to look for something recognizable, people—and computers, it turns out—identify and “over-interpret” the outlines of things they already know.
“This network was trained mostly on images of animals, so naturally it tends to interpret shapes as animals. But because the data is stored at such a high abstraction, the results are an interesting remix of these learned features,” wrote Google engineers Alexander Mordvintsev, Christopher Olah, and Mike Tyka in a blog post. “The results vary quite a bit with the kind of image, because the features that are entered bias the network towards certain interpretations. For example, horizon lines tend to get filled with towers and pagodas. Rocks and trees turn into buildings. Birds and insects appear in images of leaves.”
And because neural networks assess images in layers—by color, by the sorts of lines or shapes depicted, and so on—the complexity of the image generated depended on which layer the engineers asked the computer to enhance. The lowest layers are the contours of things—lines and shadows—whereas the highest layers are where more sophisticated imagery emerges. “For example, lower layers tend to produce strokes or simple ornament-like patterns, because those layers are sensitive to basic features such as edges and their orientations,” the engineers wrote.
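The mechanism the engineers describe, amplifying whatever a chosen layer already responds to, is at its core gradient ascent on the input image itself. A minimal sketch of that loop, using a tiny random two-layer network as a stand-in for a trained model (all weights, shapes, and the `layer` parameter here are illustrative, not Google’s actual code):

```python
import numpy as np

# Toy sketch of Deep Dream's core loop: repeatedly nudge the INPUT so that a
# chosen layer's activations grow stronger. The random two-layer ReLU network
# below is a stand-in for a trained model such as Inception.

rng = np.random.default_rng(0)
W1 = rng.standard_normal((32, 64)) * 0.1   # "lower layer": edges, strokes
W2 = rng.standard_normal((16, 32)) * 0.1   # "higher layer": richer features

def activations(x):
    h1 = np.maximum(W1 @ x, 0.0)           # ReLU activations, layer 1
    h2 = np.maximum(W2 @ h1, 0.0)          # ReLU activations, layer 2
    return h1, h2

def dream_step(x, layer=2, lr=0.1):
    """One gradient-ascent step on 0.5 * ||h_layer||^2 with respect to x."""
    h1, h2 = activations(x)
    if layer == 1:
        grad = W1.T @ (h1 * (h1 > 0))                  # backprop through ReLU
    else:
        grad = W1.T @ ((W2.T @ (h2 * (h2 > 0))) * (h1 > 0))
    # Normalize the step size, as Deep Dream does, so updates stay stable.
    return x + lr * grad / (np.abs(grad).mean() + 1e-8)

x = rng.standard_normal(64)                 # stand-in for an input image
before = np.linalg.norm(activations(x)[1])
for _ in range(20):
    x = dream_step(x, layer=2)
after = np.linalg.norm(activations(x)[1])
print(after > before)                       # the chosen layer's response grows
```

Asking for `layer=1` instead of `layer=2` amplifies the simpler, edge-like features, which is why enhancing lower layers yields strokes and ornament-like patterns while higher layers yield animals and buildings.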
Those simple patterns, when enhanced by Deep Dream, end up looking distorted and otherworldly. But how they came to look that way still doesn't answer the question raised by the kinds of images Google's computers came up with: Why would a neural network dream up scenes that mirror the hallucinations people experience when they're tripping on psychedelic drugs?
“An important thing to remember is that all normal sensory perception in humans is hallucinations constrained by sensory input,” said Lucas Sjulson, a research assistant professor at New York University’s Langone Neuroscience Institute. “So our hallucinations correspond to some degree to what's actually in the outside world. But perceptions are all internally generated.”
In other words, all human perception is generated in the brain, not in the actual world, even when the thing you’re perceiving actually exists. "People think of your eyeball like a camera, but it's not a camera," Sjulson said. Your eyes may enable you to see, but your brain ultimately makes sense of whatever it is you’re seeing—whether it’s the coffee mug actually sitting on the desk next to you or the kaleidoscope of fractal imagery imposed on it by your brain.

When people take drugs like LSD, the drugs provoke a part of the brain’s cortex that “leads to the generability of these sorts of patterns,” Sjulson said. So it makes sense that asking a computer to obsess over one layer of imagery that it would normally perceive as multilayered would produce a similar visual effect. “I think that this is probably an example of some sort of similar phenomenon. If you look at what the brain does, the brain evolved over long periods of time to solve problems, and it does so in a highly optimized way. Things are learned with humans developmentally through evolution and then also through visual experience.”
That’s how people are training computers to see, too: through visual experience. How the neural network is seeing, then, may be more revealing than what it sees. Which is, of course, what Google engineers set out to explore in the first place.
“We actually ‘see’ things that aren’t there all the time,” said Jeffrey Guss, a psychiatrist at NYU who has studied how treatments involving psilocybin, the psychoactive agent found in some mushrooms, may help cancer patients. “Our visual cortex—not our eyes—are programmed to look for recognizable patterns … to see something in the information that our eyes provide. There are dozens of psychology experiments that show we often see what we expect to see, what we're told we are going to see, rather than what is actually there."
Another way to think about hallucinating is as a kind of connective tissue between what we see and what our brain expects. The fact that hallucinations themselves are, at times, surprising to the person experiencing them doesn’t change the fact that they can represent the brain’s attempt to grasp for meaning. That doesn’t, however, mean that the images or shapes that appear are meaningful in and of themselves. “While visual hallucinations are sometimes a part of psychedelic experiences, we don’t really consider them terribly important in the big picture of how we use them or think about them,” Guss said. “We’re much more drawn to the ways that they alter meaning and provide a unique experience of the self than the visuals, which are usually seen as entertaining and interesting, but not with that much intrinsic meaning.”
Although hallucinations are often associated with drug culture, people routinely have bizarre visual experiences even when they aren’t under the influence. In his book, Hallucinations, the late neurologist Oliver Sacks argued they are a far more common experience than many people realize. “In other cultures, hallucinations have been regarded as gifts from the gods or the Muses, but in modern times they seem to carry an ominous significance in the public (and also the medical) mind, as portents of severe mental or neurological disorders,” he wrote in The New York Times in 2012. “Having hallucinations is a fearful secret for many people—millions of people—never to be mentioned, hardly to be acknowledged to oneself, and yet far from uncommon.”
In a 2009 TED talk, Sacks recalled his conversation with a 95-year-old woman who was blind but worried she was losing her mind when she began seeing bizarre things.
So I said, "What sort of things?" And she said, "People in Eastern dress, in drapes, walking up and down stairs. A man who turns towards me and smiles. But he has huge teeth on one side of his mouth. Animals too. I see a white building. It's snowing, a soft snow. I see this horse with a harness, dragging the snow away. Then, one night, the scene changes. I see cats and dogs walking towards me. They come to a certain point and then stop. Then it changes again. I see a lot of children. They are walking up and down stairs. They wear bright colors, rose and blue, like Eastern dress.”
Sometimes, she said, before the people come on, she may hallucinate pink and blue squares on the floor, which seem to go up to the ceiling. I said, "Is this like a dream?" And she said, "No, it's not like a dream. It's like a movie." She said, "It's got color. It's got motion. But it's completely silent, like a silent movie." And she said that it's a rather boring movie. She said, "All these people with Eastern dress, walking up and down, very repetitive, very limited."
In human brains, bizarre visual perceptions are associated with problems in the eyes or the brain, and with conditions such as migraines, fever, and seizures. In computer brains, such imagery suggests that artificial brains are more human than they may seem. “The fact that humans report that Google’s Inceptionism looks to them like what they see when they hallucinate on LSD or other drugs suggests that the machinery ‘under the hood’ in our brains is similar in some way to deep neural networks,” said Jeff Clune, an assistant professor of computer science at the University of Wyoming.
Of course, that’s only true if the Deep Dream images actually reflect what people see when they are hallucinating. Clune says he would love to see the idea scientifically tested, “before we put too much stock in it.” But the fact that so many people say that Google’s images look to them like a drug-induced hallucination suggests the resemblance is real. “If that’s what humans report happens when they trip, that suggests that drugs like LSD and mushrooms are doing something similar,” Clune said. “Making the brain reimagine what it sees to cause neurons in particular layers of the visual cortex to fire more and more and more.”
By replicating the architecture of the brain in computer form, then, scientists may better understand the human way of seeing the world—both as it is, and as it appears to be.