Inspired by masked rituals and mirror neurons, Karolina Sobecka's interactive installation is an experiment in mimicry, empathy, and communication across species boundaries. The premise of All the Universe Is Full of the Lives of Perfect Creatures is simple: the face of an animal -- ranging from a wolf to a rat -- copies the viewer's facial expressions like a living, responsive mask. The artist created this eerie "blended reality," blurring the line between real and virtual, with face-tracking technology and a one-way mirror. In the interview below, she discusses the creative themes at work in the piece and the impressive range of software involved.
The Atlantic: How did you get into art and digital and interactive art specifically?
Karolina Sobecka: I've always been drawn to making things, and an art education was one way to get to do that for a living. I gravitated to time-based formats, and then to interactivity. I work with emerging technologies because they are an amazing tool -- allowing for new kinds of expression and aesthetics -- but also because they have become pervasive in our culture, and I think it's important to be able to use this new language in which popular culture is expressed. Interactive projects are a kind of 'procedural representation' (to use a term coined by Ian Bogost). They let us represent a system, a situation, the relationships between elements, rather than a static view. Often the viewer is one of those elements, and he or she actively completes the representation through their actions. Most of these projects are prompted by curiosity, and interactivity allows for really interesting explorations.
What was the inspiration for this project?
This project follows a few others that are about interaction itself. I'm interested in how we communicate with one another, and one might argue that interactivity is a kind of communication, a non-verbal conversation. Mimicry is a very basic way of trying to communicate with and understand each other. Studies show that it is automatic and pervasive, and has a huge influence on social psychology. This idea of 'being in someone else's skin' as a way of understanding them has been with us for a long time -- take, for example, masked rituals and performances. Recently a neural mechanism was discovered that explains how we gain experiential insight into other minds: mirror neurons. These special neurons activate when we perform an action as well as when we watch someone else perform it. They have been implicated not only in motor mimicry, but also in 'theory of mind' concepts such as emotional recognition or contagion, empathy, and self-awareness. Emotional contagion is based on interpreting the emotional state of another being as expressed through their physical features. Emotions have typical facial characteristics, and mirror neurons 'map' the facial features of another person onto the corresponding areas of our own brain. So this mechanism suggests that masked rituals might be far more than a symbolic performance -- that they might actually be a kind of 'embodied simulation' of other creatures.
Using a mirror in the installation was also a result of my interest in the combination of the virtual and physical worlds -- inserting a layer of imagination into a physical reality. The chain of causes and effects remains in place, although slightly augmented. The familiar is transformed into the uncanny, prompting us to see the mechanics of perception, interaction, and relationships with others anew. “In a sense, mirrors are the best ‘virtual reality’ system that we can build,” said Marco Bertamini of the University of Liverpool. Yet they are part of our physical reality. This makes them ideal to use in this kind of 'blended' reality experiment.
How would you describe the setup to someone who is unfamiliar with the technology?
Behind the mirror there's a monitor, a small computer, and a camera that looks out onto the viewers. Two applications run on the computer. One is a video tracker that analyzes the real-time video from the camera, recognizes faces in the footage, and works out each face's 'architecture' -- which then allows it to identify facial features and expressions. The information about the face's position, rotation, and expression is sent as input to the second application, the game. The 'game' is built with the same technology as many other video games -- except in this case the interactivity consists of the animal's behavior trying to mimic, or correspond to, what the video tracker has recognized.
The video tracker is built with openFrameworks, a C++ toolkit (openframeworks.cc), and uses Jason Saragih's FaceTracker library together with Kyle McDonald's ofxFaceTracker addon. The game is built with the Unity3d game engine.
How does the mirror work?
The mirror is a half-silvered (sometimes called one-way) mirror, the kind used in interrogation rooms. The surface reflects about half the light that strikes it and lets the other half through. Whether the surface looks reflective or transparent depends on the balance of light on the two sides. In this installation, the light from an LCD monitor behind the mirror appears on its surface.
How did you craft the animals, and their expressions? What software did you use?
The animals are modeled and animated in Blender, an open-source 3D application. They are then imported into Unity3d, where their movement and the blending of their animations are scripted to correspond to the input from the video tracker. The animals represent a spectrum of domestication (from a wolf through a goat to a rat). While working on it I realized how much more expressive the predators seem compared to, for example, the ruminants.
What's next for you?
I'm working on a few other projects, a series of accessories for environmental awareness, Amateur Human, and other interactive installations that will be on view at the San Francisco Film Society's Kinotek exhibition in April.
For more work by Karolina Sobecka, visit http://www.gravitytrap.com/.