Microsoft's Kinect, an inexpensive motion and depth-sensitive camera designed for the Xbox 360 video game console, has sparked an amazing range of "Kinect hacks" since its release in 2010. Developers, artists, and musicians have reprogrammed and repurposed the technology for everything from goofy DIY projects to face tracking and projection mapping. Chris Vik, an Australian composer and musician, developed his own program to translate his gestures and movements into live music. The results are mesmerizing dancelike performances of intricately looped electronic music, and his latest mind-boggling show featured an original piece composed for the historic, 32-foot-tall organ in the Melbourne Town Hall.

Vik composed and performed Carpe Zythum with singer Elise Richards at the town hall in November 2011. When it was built, the organ comprised more than 300 miles of wire, 3,000 magnets, and 6,494 pipes, according to this original brochure from 1928 (pdf via the Organ Historical Trust of Australia, including the image below). Thanks to recently added MIDI capabilities, the 83-year-old instrument can be controlled electronically via Vik's custom-built Kinectar software. The show, captured on video by Unkle Nicnac Films, is an eerie blend of old and new technology with a fittingly gothic vibe. Vik shares another live performance video and talks about the evolution of his work in the interview below.

The Atlantic: How did you get into working at this intersection of technology, music, and performance?

Chris Vik: Since I was 12 or 13 (about 15 years) I've been creating computer-based music and working on amateur programming projects -- I've always had a pretty solid passion for both music and technology. In the final year of my BA in Fine Arts (sound specialization) at RMIT, I embarked on a project based around crowd interaction for an art installation. The idea was to have people enter the room, with the space being aware of their movements; as they began exploring, they would hopefully realize they were actually creating the music simply by moving around.

This project never quite eventuated, as during my time researching various camera/sensor technologies, I was introduced to Microsoft's Kinect gaming sensor and I got sidetracked. It wasn't ideal for the project I had in mind, but I saw the potential of using it for music creation. It could be used to easily (and affordably) track the human form and spit out the body's joint positions in 3D space as numbers. Being the nerdy musician that I am, I realized that numbers can be easily applied to create music by controlling digital synthesizers. After toying around with my very first experiment, I uploaded a video on YouTube (see Dubstep Bassline using Xbox Kinect) and within a couple of weeks was contacted by Microsoft to perform at an event in Sydney. That was a year ago -- since then I've done a number of performances all around Australia, collaborated with singers, dancers, visual artists and even guitarists to explore the possibilities of using motion tracking technology within the Arts.
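The core idea Vik describes -- the Kinect reporting each skeleton joint as coordinates in 3D space, and those raw numbers being rescaled into values a digital synthesizer understands -- can be sketched in a few lines. This is a minimal illustration of the general technique, not Kinectar's actual code; the function names and ranges are assumptions.

```python
def scale(value, in_lo, in_hi, out_lo, out_hi):
    """Linearly rescale value from [in_lo, in_hi] to [out_lo, out_hi], clamped."""
    value = max(in_lo, min(in_hi, value))
    t = (value - in_lo) / (in_hi - in_lo)
    return out_lo + t * (out_hi - out_lo)

def hand_to_midi_note(hand_y, floor_y=0.0, head_y=2.0):
    """Map a hand's vertical position (metres, hypothetical range) to a
    MIDI note number -- here spanning roughly C2 (36) to C7 (96)."""
    return round(scale(hand_y, floor_y, head_y, 36, 96))

# A hand held at roughly chest height lands in the middle of the range:
note = hand_to_midi_note(1.2)  # → 72 (C5)
```

In practice the same rescaling step applies to any joint and any synth parameter, which is what makes the "numbers in, music out" approach so flexible.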

Vik performs at Microsoft's REMIX11 conference

How would you explain Kinectar to someone unfamiliar with the Kinect?

Kinectar lets computer-based musicians explore the potential of using human movement to control and create music. Although I've been writing electronic music for over 15 years, I have no skills in playing the keyboard (which is the most common way of playing synthesizers live), which I've found really limiting. I suppose there might be a lot of people out there that can sympathize with this, and I feel this was what ended up influencing the direction of my software. The key point to Kinectar is that it isn't a gimmicky program that does one thing, like a one-dimensional game -- instead it gives a user complete creative control over how they want to use their movements to control a sound. It's a tool that allows people to explore this very exciting technology that I see as the future of not only music, but human-computer interaction as a whole.

Watching your live performances (particularly the REMIX11 show above) the system seems super responsive. How do you go about designing the motion cues for the Kinect?

Keeping the movement cues simple to start with is very important. First I make a sound on the computer, then in my head I imagine how those sounds could be represented by movement. So say I want a piano sound to play louder and softer: I start attaching a particular movement of my hand to the velocity of the note that's being played until I find something that feels natural to use. Then I practice that movement for a short time, go back to picking a new parameter of the sound to alter with another movement, and repeat the process. This way I'm always practicing all of the movements together, mastering one layer of parameters at a time.
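The gesture-to-velocity pairing Vik describes here -- a bigger movement playing a louder note -- might be sketched as below. This is a hedged illustration of the mapping, not Kinectar itself; the displacement range and function name are assumptions.

```python
def gesture_velocity(displacement_m, max_reach_m=0.8):
    """Map how far the hand has moved (metres) since the note began to a
    MIDI velocity in 1..127, so a small flick plays softly and a full
    arm sweep plays near full volume. max_reach_m is an assumed arm span."""
    frac = max(0.0, min(1.0, displacement_m / max_reach_m))
    return max(1, round(frac * 127))

soft = gesture_velocity(0.1)  # → 16 (a small flick)
loud = gesture_velocity(0.8)  # → 127 (a full sweep)
```

Each additional movement-to-parameter pair he layers on would be another small mapping like this one, practiced alongside the others.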

The latency of the sensor system overall is actually pretty small (counted in milliseconds), but for a musician, that can be a noticeable amount of time. This is why playing rhythmic instruments like drums can be slightly problematic. That's not to say that drums can't be done, only it's probably better to stick with instruments that can allow a bit of latency, or that can be synchronized with the music in the background.

What motivated you to make your software available on your site for free? Are other people working with it now?

I love the idea of the Internet. If there wasn't a platform like this to share ideas on, none of what I've done would have existed. The Internet allows the freedom for people to share anything and everything, and in my eyes, it has sped up the evolutionary process of technological development to an extraordinary pace -- people imagine it, and it happens.

For me to allow other people in the world to have the opportunity to expand on the possibilities of this technology really excites me. I was studying the final year of my BA in Fine Arts last year and was using my software as a tool for the projects in that course, so I had the time and focus to work on this passion and really see it through.

I have people contacting me every day of the week, letting me know how they've been using my software. These range from university projects and performing artists to technology hobbyists, visual artists, and even teachers working with children, using my software as a platform for them to explore their creativity through a completely new and futuristic medium. I love to hear about what people are doing with Kinectar, and it just motivates me to keep following my passion.

What's next for you? And are there other giant musical instruments you plan to play via Kinectar?

I'm certainly the kind of person that's always got something in the pipeline. I have a few videos up my sleeve that should be coming out in the next few weeks that will hopefully continue to show off the potential behind Kinectar. I'm involved as a developer in a couple of very ambitious arts projects, but both are still in their early days. I also write electronic music under the name Synaecide, and have plans to tour my live sets using my Kinectar system shortly. I'll be traveling to the USA in June/July for some performances, development work and hopefully some technical workshops and talks. I'll be putting up new videos of my experiments regularly, which you can find by searching "chris vik" on YouTube or on my channel.

For more work by Chris Vik, visit

Via The Verge.
