Experimental Work > Kinect, Processing, 2ndLife

I recommend you turn down the sound, which is high pitched…

Experimentation (2012) from Sachiko Hayashi on Vimeo.

Experimentation is a project that combines Kinect, Processing, and Second Life. It centres on a gestural interface through which three components (avatar, sound, visuals) are brought together in real time in a mixed-reality performance. The gestural interface, built with Kinect and Processing, eliminates pre-programmed (pre-animated) and pre-recorded sequences for the avatar's movement, the sound, and the visuals.
Kinect captures the performer's real-life body movement to control the avatar's movement in Second Life. Kinect is also used for the real-life audio-visual part of the performance: it captures live imagery of the performer and her surroundings, which is manipulated in real time with code written in Processing. The sound is generated in real time from computer glitch/feedback, also controlled through the gestural interface. The audio-visual performance is then streamed into Second Life and combined with the movement of the avatar in real time. Because the avatar is driven by the gestural interface and the real-life surroundings are folded into the piece, each performance becomes truly unique.
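The project's Processing code is not published with this post, so the following is only a minimal sketch of the general idea: Kinect skeleton tracking via the SimpleOpenNI library, with the performer's right hand driving a simple visual element in real time. The library choice, the joint used, and the mapping are my assumptions for illustration, not the artist's actual setup (SimpleOpenNI's API also differs slightly between versions; this follows the 1.96 conventions).

// Minimal sketch (an assumption, not the project's actual code):
// Kinect skeleton tracking via SimpleOpenNI, with the performer's
// right hand driving a simple real-time visual element.
import SimpleOpenNI.*;

SimpleOpenNI kinect;

void setup() {
  size(640, 480);
  kinect = new SimpleOpenNI(this);
  kinect.enableDepth();   // depth stream
  kinect.enableUser();    // skeleton tracking (SimpleOpenNI 1.96 API)
}

void draw() {
  kinect.update();
  image(kinect.depthImage(), 0, 0);   // live depth image as backdrop

  int[] users = kinect.getUsers();
  for (int i = 0; i < users.length; i++) {
    int userId = users[i];
    if (!kinect.isTrackingSkeleton(userId)) continue;

    // Right-hand joint in real-world coordinates, projected to screen space.
    PVector hand = new PVector();
    PVector handOnScreen = new PVector();
    kinect.getJointPositionSkeleton(userId, SimpleOpenNI.SKEL_RIGHT_HAND, hand);
    kinect.convertRealWorldToProjective(hand, handOnScreen);

    // Hand height controls the size of a visual element in real time.
    float diameter = map(handOnScreen.y, 0, height, 200, 20);
    noStroke();
    fill(255, 80, 80, 180);
    ellipse(handOnScreen.x, handOnScreen.y, diameter, diameter);
  }
}

// Begin skeleton tracking as soon as a user appears.
void onNewUser(SimpleOpenNI context, int userId) {
  context.startTrackingSkeleton(userId);
}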
Experimentation is my first real-time mixed-reality virtual performance and was made possible with support from HUMlab, UmeƄ University, Sweden. This machinima was made from documentation material recorded in HUMlab's H3 location; the Second Life location is the HUMlab sim.

concept, image, sound, programming and performance: Sachiko Hayashi

Tim the Enchanter Kinect


A project for the Programming Usable Interfaces Prototyping Lab at the Carnegie Mellon Human-Computer Interaction Institute.

Also see: Comic Kinect: for all those who wish life was a comic book
http://youtu.be/9QQW_sPzNNM
A Kinect hack project for Interactive Art and Computational Design at Carnegie Mellon University.
http://golancourses.net/2011spring/projects/project-3-interaction/

Transform yourself to Ultra Seven by Kinect

http://youtu.be/Uuq9SCL_LXY
The code for the project shown at http://www.youtube.com/watch?v=RUG-Uvq-J-w is now available for download at http://code.google.com/p/kinect-ultra/.

Kinect Interactive Collage Machine


http://www.giusepperagazzini.com
"INTERACTIVE COLLAGE MACHINE", an interactive collage controlled by Kinect in real time. Music: Boh blues (outro) by Diego Perugini.

The video was made using Kinect and Quartz Composer, with the amazing "Tryplex toolkit" by onesecond. Many thanks to Sebastian Kox of oneseconds, developer/interaction designer of the QC toolkit, for his fantastic help.
onesecond.com
code.google.com/p/tryplex/

The Kinect Interactive Collage Machine is an artistic motion-capture performance that lets users dive into a world of changing art, with their own actions driving the visual changes. The video by YouTube user giusepperagazzini shows the interactive collages at work and how certain gestures feed into the ever-changing graphics of the hack. In the video, the user dances and the figure in the program imitates them; certain arm gestures trigger visual changes as the user continues to dance. Other environments provide virtual instruments the user can interact with.
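The piece itself is built in Quartz Composer with the Tryplex toolkit, but the core idea of an arm gesture triggering a visual change can be sketched, purely for illustration, in Processing with SimpleOpenNI. The gesture chosen here (right hand raised above the head), the joints used, and the scene-switching logic are assumptions, not a reconstruction of the actual patch.

// Illustrative sketch (not the artist's Quartz Composer patch):
// advance to a new "collage" state whenever the right hand rises
// above the head, using SimpleOpenNI skeleton joints.
import SimpleOpenNI.*;

SimpleOpenNI kinect;
int scene = 0;                 // index of the current visual state
boolean handWasRaised = false; // previous frame's gesture state

void setup() {
  size(640, 480);
  kinect = new SimpleOpenNI(this);
  kinect.enableDepth();
  kinect.enableUser();
}

void draw() {
  kinect.update();

  // Tint the live depth image differently for each "scene",
  // as a stand-in for swapping collage imagery.
  tint((scene * 60) % 200 + 55, 120, 200);
  image(kinect.depthImage(), 0, 0);

  int[] users = kinect.getUsers();
  for (int i = 0; i < users.length; i++) {
    int userId = users[i];
    if (!kinect.isTrackingSkeleton(userId)) continue;

    PVector hand = new PVector();
    PVector head = new PVector();
    kinect.getJointPositionSkeleton(userId, SimpleOpenNI.SKEL_RIGHT_HAND, hand);
    kinect.getJointPositionSkeleton(userId, SimpleOpenNI.SKEL_HEAD, head);

    // In real-world coordinates y increases upward, so the gesture
    // fires on the transition from "below the head" to "above the head".
    boolean handIsRaised = hand.y > head.y;
    if (handIsRaised && !handWasRaised) {
      scene++;                 // switch to the next visual state
    }
    handWasRaised = handIsRaised;
  }
}

void onNewUser(SimpleOpenNI context, int userId) {
  context.startTrackingSkeleton(userId);
}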