This music video for BELL uses a hacked Kinect to trace her facial expressions and create eerie interactive projections that respond to her movements. Tiny dots of light jump when she smiles or raises her eyebrows, and neon geometric shapes bloom around her eyes. Projection mapping, also known as 3D projection, is the technique of mapping projected video onto a three-dimensional surface rather than the white rectangle of a screen.

The visuals were developed by Zach Lieberman, Francisco Zamorano, Andy Wallace, and Michelle Calabro, and the project uses FaceTracker software by Jason Saragih. Lieberman is an artist and a professor at Parsons School of Design who has developed numerous visionary projects involving interactive projection and motion tracking. The EyeWriter, for example, is a collaborative project to develop technology that allows people suffering from ALS to draw with their eyes.

In an interview below, Lieberman shares some thoughts on the making of the video, the cyborg future, and what he's working on next.

The Atlantic: How did you get into working with interactive video projections? What interests you about it as an artistic medium?

Zach Lieberman: I am an artist who works with technology, creating custom software for performances and installations. A lot of my work has to do with bodies, gestures and movement. Recently, I've been looking at writing software that can track facial features and create a live "mask," and one extension of that research was to begin projecting visuals onto the face that change live, almost as a new form of face painting.
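The idea Lieberman describes, tracking facial features and then driving visuals from them, can be sketched with off-the-shelf tools. The snippet below is a hypothetical illustration using OpenCV's stock face detector rather than the FaceTracker software behind the video; it draws an on-screen "mask" that follows a webcam face, whereas the actual piece projected the imagery back onto the performer.

```python
import cv2

# Illustrative sketch only (not the software used in the video): locate a face
# in each webcam frame and draw a simple "mask" that follows it.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

cap = cv2.VideoCapture(0)  # default webcam; assumes one is attached
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in detector.detectMultiScale(gray, 1.1, 5):
        # A stand-in "mask": an ellipse over the face and dots near the eyes.
        cv2.ellipse(frame, (x + w // 2, y + h // 2), (w // 2, int(h * 0.6)),
                    0, 0, 360, (255, 0, 255), 2)
        cv2.circle(frame, (x + w // 3, y + h // 3), 4, (0, 255, 255), -1)
        cv2.circle(frame, (x + 2 * w // 3, y + h // 3), 4, (0, 255, 255), -1)
    cv2.imshow("live mask", frame)
    if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
        break
cap.release()
cv2.destroyAllWindows()
```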

How did you come to collaborate with BELL? What was the inspiration for the video?

I met her at a party, and she started following my work online. She saw some of the face projection research I had been doing and thought it would work well with her song "Chase No Face," which, if you Google it, you'll discover takes its name from a cat with no face, the inspiration for the song. It seemed like a really good fit, to think about creating live and dynamic faces.

For someone unfamiliar with the software and the Kinect, how would you describe the process of hacking it to achieve these visuals?

The Kinect is a 3D camera, which means that in addition to sensing color pixels, like a traditional camera, it also senses depth information for each pixel. So each pixel has color info and depth, and you can use that info to track much better than with a normal camera.
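That per-pixel depth is easier to appreciate with a toy example. The sketch below is not the project's code (which was built on FaceTracker and related tools); it simply fabricates a Kinect-style depth frame with made-up sizes and distances, then isolates the nearest object with a single threshold, something that is far more fragile with color alone.

```python
import numpy as np

# Hypothetical depth frame: each pixel stores distance from the camera in mm.
# Here a "head" ~800 mm away sits in front of a wall ~3000 mm away.
height, width = 480, 640
depth = np.full((height, width), 3000, dtype=np.uint16)  # background wall
depth[140:340, 240:400] = 800                             # subject's head

# Because every pixel carries depth, isolating the nearest object is a
# simple threshold rather than color or edge detection.
NEAR_LIMIT_MM = 1200
mask = depth < NEAR_LIMIT_MM

# The mask's centroid gives a rough head position that projected visuals
# could follow from frame to frame.
ys, xs = np.nonzero(mask)
cx, cy = xs.mean(), ys.mean()
print(f"tracked region covers {mask.sum()} pixels, centered near ({cx:.0f}, {cy:.0f})")
```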

Our day-to-day physical and mental lives are increasingly infused with technology and digital media. How long do we have before we are all officially cyborgs?

I think it will be some time, but you are already seeing convergence happening all around us. I think we need to find creative ways to use this technology, so we don't just use everything in the way the companies want us to, but use it in unexpected ways, in order to express ourselves and create our own dreams. 

What’s next for you?

I've been working on some commercial projects recently with YesYesNo, a company I co-founded. We've been doing some data visualization work trying to turn running data, GPS data from Nike+ devices, into artwork, so that by running, for example, you paint a painting. We've got some similar projects lined up for the fall. I'm also a professor at Parsons School of Design, in the design and technology program, and I'm getting ready for the start of a new semester -- always an exciting time.

Night Lights, created by YesYesNo two years ago, was a building-sized interactive projection in Auckland, New Zealand. 

To see more of Zach Lieberman's work, visit http://thesystemis.com/.
