Inside the Science That Plays Your Brain Like a Videotape

Our senses take in a vast array of sights and sounds every day, sparking activity in our neural circuits in corresponding patterns. Now, scientists are developing ways to decode these patterns and recreate the original audiovisual stimuli -- not unlike the video playback of a VHS tape. This episode of Science Bytes, produced by Kikim Media for PBS and the Public Library of Science, explores some of the research in this field. Last year, scientists at U.C. Berkeley demonstrated that they could reconstruct the YouTube videos their research subjects were watching by analyzing fMRI scans of their brain activity. Jack Gallant, an associate professor of psychology at Berkeley's Vision Science lab, describes how they did it. Later in the segment, Bob Knight, a professor of neuroscience and psychology also at Berkeley, explains how his team reconstructed the audio that patients heard during brain surgery for epilepsy. Using electrodes that measure electrical activity directly on the cortex, they decoded that activity and played back the sounds the patients had heard -- with spooky accuracy.
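
The audio work relies on the same stimulus-reconstruction logic: learn a mapping between measured neural activity and a representation of the sound, then apply it to new recordings. Below is a minimal, hypothetical sketch of that idea, assuming a ridge-regression decoder from electrode activity to an audio spectrogram; the function names, variable names, and model choice are illustrative assumptions, not the actual analysis pipeline used in the study.

```python
# Hypothetical sketch: decode a heard-audio spectrogram from cortical recordings.
# Names and the ridge-regression decoder are illustrative assumptions only.
import numpy as np
from sklearn.linear_model import Ridge


def fit_decoder(cortical_activity, spectrogram, alpha=1.0):
    """Learn a linear map from electrode activity to the audio spectrogram.

    cortical_activity: (n_timepoints, n_electrodes) band-limited power per electrode
    spectrogram:       (n_timepoints, n_freq_bins) spectrogram of the audio that was heard
    """
    decoder = Ridge(alpha=alpha)
    decoder.fit(cortical_activity, spectrogram)
    return decoder


def reconstruct_spectrogram(decoder, new_activity):
    """Predict the spectrogram of new audio from new cortical activity.

    A phase-reconstruction step (e.g., Griffin-Lim) would then be needed to
    turn the predicted magnitude spectrogram back into a playable waveform.
    """
    return decoder.predict(new_activity)
```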

The episode doesn't linger on the stimuli and the resulting videos Gallant's lab recreated, but the shot-by-shot comparison is amazing (below). It's not hard to imagine a cyborg future in which we can "make movies with our brains," to quote music video directors Daniel Scheinert and Daniel Kwan.

On their YouTube page, the researchers describe how they built a library of random YouTube videos and then mapped them to the responses in their subjects' brains (a rough code sketch of the pipeline follows their description):

The procedure is as follows:

[1] Record brain activity while the subject watches several hours of movie trailers.

[2] Build dictionaries (i.e., regression models) that translate between the shapes, edges and motion in the movies and measured brain activity. A separate dictionary is constructed for each of several thousand points at which brain activity was measured. (For experts: The real advance of this study was the construction of a movie-to-brain activity encoding model that accurately predicts brain activity evoked by arbitrary novel movies.)

[3] Record brain activity to a new set of movie trailers that will be used to test the quality of the dictionaries and reconstructions.

[4] Build a random library of ~18,000,000 seconds (5000 hours) of video downloaded at random from YouTube. (Note these videos have no overlap with the movies that subjects saw in the magnet). Put each of these clips through the dictionaries to generate predictions of brain activity. Select the 100 clips whose predicted activity is most similar to the observed brain activity. Average these clips together. This is the reconstruction.
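
That description maps fairly directly onto a small amount of code. Here is a minimal sketch, assuming simple least-squares "dictionaries" and correlation-based matching; all function and variable names are illustrative placeholders rather than the Gallant lab's actual code, and the feature extraction (shapes, edges, motion) and hemodynamic modeling that real fMRI data would require are omitted.

```python
# Illustrative sketch of the encoding-model reconstruction procedure described above.
import numpy as np
from numpy.linalg import lstsq


def fit_voxel_dictionaries(train_features, train_activity):
    """Fit one linear "dictionary" (regression model) per voxel.

    train_features: (n_timepoints, n_features) visual features of the training movies
    train_activity: (n_timepoints, n_voxels) measured fMRI responses
    Returns a weight matrix of shape (n_features, n_voxels).
    """
    weights, _, _, _ = lstsq(train_features, train_activity, rcond=None)
    return weights


def reconstruct(observed_activity, library_features, library_clips, weights, top_k=100):
    """Average the top_k library clips whose predicted activity best matches the observation.

    observed_activity: (n_voxels,) brain activity evoked by the unseen test clip
    library_features:  (n_clips, n_features) features of each one-second library clip
    library_clips:     (n_clips, n_frames, height, width) the library clips themselves
    """
    predicted = library_features @ weights  # (n_clips, n_voxels) predicted responses

    # Score each library clip by the correlation between its predicted
    # response pattern and the actually observed response pattern.
    pred_z = (predicted - predicted.mean(axis=1, keepdims=True)) / predicted.std(axis=1, keepdims=True)
    obs_z = (observed_activity - observed_activity.mean()) / observed_activity.std()
    scores = pred_z @ obs_z / obs_z.size

    best = np.argsort(scores)[-top_k:]       # indices of the best-matching clips
    return library_clips[best].mean(axis=0)  # their average is the reconstruction
```

The key design point, as the researchers note, is that the model runs forward -- from movie features to predicted brain activity -- so reconstruction reduces to searching a large library for clips whose predicted responses best match what was actually measured.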

For more videos from Kikim Media, visit http://www.kikim.com.

Kasia Cieplak-Mayr von Baldegg is the executive producer for video at The Atlantic.

Cieplak-Mayr von Baldegg's work in media spans documentary television, advertising, and print. As a producer in the Viewer Created Content division of Al Gore's Current TV, she acquired and produced short documentaries by independent filmmakers around the world. Post-Current, she worked as a producer and strategist at Urgent Content, developing consumer-created and branded nonfiction campaigns for clients including Cisco, Ford, and GOOD Magazine. She studied filmmaking and digital media at Harvard University, where she was co-creator and editor in chief of H BOMB Magazine.
