Kinect Delaunay

April 9th, 2013

Some fun with Delaunay triangulation and Kinect 3D meshes. I really love the pencil-drawn look of the images.

Kinect Delaunay from Kamen Dimitrov on Vimeo.

For the curious, here is the Openframeworks project on GitHub: https://github.com/kamend/KinectDelaunay

Absolut Blank

December 6th, 2012

I am happy to share our little documentary from the installation we did for the Absolut Blank campaign in Bulgaria. This is probably my favorite project of the year, and I'd really like to thank Absolut for giving us the freedom to do everything the way we wanted!

ABSOLUT BLANK | PHORMATIK from PHORMATIK on Vimeo.

Of course everything interactive was done inside Openframeworks!

My experience with Kinect installations

November 27th, 2012

I have spent the last year building a couple of different Kinect installations, so in this post I would like to share some tips and points, which might seem rather obvious but can easily be forgotten when designing and building your interaction software and patterns.

The Kinect IR camera does not work correctly in daylight because the Sun emits a lot of IR light. I was really surprised the first time I ran into this problem, and then I felt rather stupid :)

The IR light emitted by the laser gets reflected by leather clothes and shoes, so don't be surprised when someone dressed in leather has holes in their depth image or has trouble with skeleton tracking.

Define your area of interaction; I think this is really important! You can use LED strips to mark the area where the user gets detected and where the user must stay. The last time we used three flashlights pointing at each other; people thought they were the motion sensors and were pretty impressed, so you can definitely experiment there. Defining strict borders is good because people will know immediately where they have to stay to interact, and other people won't interfere with the camera. Another cool trick, if you are using OpenNI and you want to cut off parts of the left and right side, is to stick tape on the sides of the IR camera.
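If you prefer doing part of this in software, here is a minimal sketch of the idea, assuming ofxKinect; the margins and distance limits are just made-up example values:

```cpp
// Minimal sketch: ignore depth readings outside a chosen interaction zone.
// Assumes ofxKinect (640x480 depth image); the bounds are example values only.
#include "ofxKinect.h"

bool isInsideInteractionZone(ofxKinect& kinect, int x, int y) {
    const int   depthWidth  = 640;
    const int   leftMargin  = 80;    // crop the left/right edges of the depth image
    const int   rightMargin = 80;
    const float nearLimitMm = 800;   // ignore anything closer than ~0.8 m
    const float farLimitMm  = 2500;  // ignore anything farther than ~2.5 m

    if (x < leftMargin || x > depthWidth - rightMargin) return false;

    float distMm = kinect.getDistanceAt(x, y); // 0 means no valid reading
    return distMm > nearLimitMm && distMm < farLimitMm;
}
```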

If you are using skeleton tracking, it usually takes around 5 seconds until the user's skeleton gets detected and calibrated, so you should provide some kind of feedback during that time: a loading bar or some kind of message that the user has been detected. People expect immediate feedback!

Mind short people and kids! This was my first mistake in our first installation: people had to raise their hands in order to activate some elements of the installation. Software-wise, I was just detecting when the blob passed a certain vertical threshold, so when some kids came to try it, they could not reach the threshold.
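The fix is simple: make the threshold relative to each person instead of a fixed row in the image. A rough sketch, assuming you have the user's blob bounding box from a contour finder (the 0.15 factor is just an example value):

```cpp
// Sketch: per-user hand-raise detection that scales with the person's own size,
// instead of one fixed row that short people and kids can't reach.
// Assumes blob is the bounding box of the user's blob (e.g. from a contour
// finder) and restingTopY is the blob's top captured when the user first appeared.
#include "ofMain.h"

bool isHandRaised(const ofRectangle& blob, float restingTopY) {
    // In image coordinates y grows downward, so a raised hand makes the blob's
    // top edge move to a smaller y. Require it to rise by ~15% of the person's
    // own blob height (0.15 is an example value).
    float riseNeeded = blob.getHeight() * 0.15f;
    return blob.getTop() < restingTopY - riseNeeded;
}
```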

People love their “shadow”! There is something magical about seeing your silhouette in a different color or form; people just love that, especially if they are at a party. They will dance their asses off :) So even though rigged characters are sometimes fun, consider implementing some blobs.
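A blob silhouette is only a few lines if you threshold the depth image and trace it with the stock ofxOpenCv contour finder; here is a rough sketch (the area limits and color are placeholders):

```cpp
// Sketch: draw the users' silhouettes as flat colored blobs.
// Assumes binaryDepth is the Kinect depth image already thresholded into a
// binary grayscale image; area limits are placeholder values.
#include "ofxOpenCv.h"

void drawSilhouettes(ofxCvGrayscaleImage& binaryDepth) {
    ofxCvContourFinder contourFinder;
    // minArea, maxArea, max number of blobs, find holes
    contourFinder.findContours(binaryDepth, 2000, 640 * 480, 4, false);

    ofSetColor(255, 0, 120); // the "shadow" color
    for (auto& blob : contourFinder.blobs) {
        ofBeginShape();
        for (auto& pt : blob.pts) {
            ofVertex(pt.x, pt.y);
        }
        ofEndShape(true);
    }
}
```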

I will try to add some more tips along the way, but if somebody has something else to share, please feel free to comment!

Kinect Triangulation to SVG (FREE OSX App)

September 11th, 2012

Here is a little tool for OSX that I made with Openframeworks. It's an app for turning triangulated Kinect blobs into SVG files. Please note that you need a Kinect camera plugged in to use it!

You can download it here: KinectToSVG.zip

Usage:

1) Unzip it into a folder and do not delete the “data/” folder.
2) Plug in your Kinect.
3) Stay close to it, around 1-1.5 m.
4) Press ‘space’ to save your SVG.
5) Your SVG is saved in the “data/” folder as “mesh.svg”; the file gets overwritten every time you save!
6) Have fun!

* Props to Marek Bereza and his Kinect blob triangulation code found at https://github.com/HellicarAndLewis/Triptych
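If you want to roll your own exporter, here is a rough sketch of the general idea (not the actual code of the app above): write every triangle of an ofMesh out as an SVG polygon.

```cpp
// Sketch: dump an indexed triangle mesh as an SVG of <polygon> elements.
// This is not the code of the app above, just an illustration of the idea;
// the z coordinate is simply dropped when projecting to 2D.
#include "ofMain.h"

void saveMeshAsSVG(ofMesh& mesh, const string& path) {
    ofBuffer svg;
    svg.append("<svg xmlns=\"http://www.w3.org/2000/svg\" width=\"640\" height=\"480\">\n");

    vector<ofIndexType>& indices = mesh.getIndices();
    for (size_t i = 0; i + 2 < indices.size(); i += 3) {
        ofVec3f a = mesh.getVertex(indices[i]);
        ofVec3f b = mesh.getVertex(indices[i + 1]);
        ofVec3f c = mesh.getVertex(indices[i + 2]);
        svg.append("<polygon points=\"" +
                   ofToString(a.x) + "," + ofToString(a.y) + " " +
                   ofToString(b.x) + "," + ofToString(b.y) + " " +
                   ofToString(c.x) + "," + ofToString(c.y) +
                   "\" fill=\"none\" stroke=\"black\"/>\n");
    }

    svg.append("</svg>\n");
    ofBufferToFile(path, svg); // e.g. "mesh.svg" inside the data/ folder
}
```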

Long Exposure Kinect Clouds

June 25th, 2012

I have been thinking about this idea for a long time, so tonight I thought I would give it a quick go. I think the results turned out pretty interesting: unique Kinect point clouds and colors. The setup is pretty much self-explanatory: a projector beaming the Kinect cloud onto a white wall using custom software I made with Openframeworks, my DSLR pointed directly at the wall and connected via USB so I could shoot remotely, and of course me, making a fool of myself in front of the Kinect sensor.
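For context, the point cloud part of such a sketch is only a few lines in Openframeworks; here is a rough version assuming ofxKinect (the step and point sizes are arbitrary):

```cpp
// Rough sketch of drawing the Kinect point cloud with ofxKinect.
// Step size and point size are arbitrary example values.
#include "ofxKinect.h"

void drawPointCloud(ofxKinect& kinect) {
    ofMesh cloud;
    cloud.setMode(OF_PRIMITIVE_POINTS);

    const int width = 640, height = 480;
    int step = 2; // sample every 2nd pixel of the depth image
    for (int y = 0; y < height; y += step) {
        for (int x = 0; x < width; x += step) {
            if (kinect.getDistanceAt(x, y) > 0) { // skip pixels with no reading
                cloud.addColor(kinect.getColorAt(x, y));
                cloud.addVertex(kinect.getWorldCoordinateAt(x, y));
            }
        }
    }

    glPointSize(2);
    ofPushMatrix();
    ofScale(1, -1, -1); // flip so the cloud appears upright
    cloud.drawVertices();
    ofPopMatrix();
}
```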

My personal favourite:

The next step is calibrating the Kinect with the projector and beaming the clouds directly onto myself; results coming soon!

FLORA – Interactive Installation

May 28th, 2012

Here is a short documentary of the installation we did for the Burgas Contemporary Festival 2012. The installation utilizes Openframeworks for receiving and sending motion sensor data, Resolume for visualisation, and MadMapper + Syphon for spreading and mapping the whole content over the building.

Real-Time Motion Tracking in Cinema 4D

April 24th, 2012

Recently I have been playing around with Cinema 4D and found a really nice tool that does skeleton tracking with the Kinect and sends all joint coordinates through OSC to a Cinema 4D plug-in. The plug-in automatically creates a Null Object for every joint and records all movements as keyframes. The tool is called NI Mate, made by Delicode. There are also plug-ins for Blender and Animata.

Here is a short example of me making a fool of myself in front of the Kinect :)

And here is a pre-recorded Cinema 4D project, if you want to play around and don't have a Kinect device at hand.

NOTE (May 9, 2012 at 7:51 am): Another alternative to Delicode NI Mate (which is now priced around 200 EUR) is using OSCeleton and KiCapOSC (both open source).
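For reference, tools like OSCeleton send the joints as plain OSC messages, so you can also pull them straight into your own Openframeworks sketch with ofxOsc. A rough sketch, assuming the usual "/joint" message layout (joint name, user id, x, y, z) and the default port; double-check against your sender's actual format:

```cpp
// Sketch: read skeleton joints sent over OSC (e.g. by OSCeleton) with ofxOsc.
// Assumes the common "/joint" layout: joint name (string), user id (int),
// then x, y, z (floats) -- verify against your sender before relying on it.
#include "ofMain.h"
#include "ofxOsc.h"

class JointReceiver {
public:
    void setup() {
        receiver.setup(7110); // assumed default OSCeleton port
    }

    void update() {
        while (receiver.hasWaitingMessages()) {
            ofxOscMessage m;
            receiver.getNextMessage(&m);
            if (m.getAddress() == "/joint") {
                string joint = m.getArgAsString(0);
                int userId   = m.getArgAsInt32(1);
                ofVec3f pos(m.getArgAsFloat(2), m.getArgAsFloat(3), m.getArgAsFloat(4));
                joints[joint] = pos; // keep the latest position per joint
            }
        }
    }

    map<string, ofVec3f> joints;

private:
    ofxOscReceiver receiver;
};
```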

CENC – Mapping Festival 2011

November 22nd, 2011

This is probably one of the most inspiring and creative Kinect applications I have seen!