Interactive Fiction > Inanimate Alice

Here’s a terrific example of digital narrative for young audiences…


http://www.inanimatealice.com/

‘Inanimate Alice’ tells the story of Alice, a young girl growing up in the first half of the 21st century, and her imaginary digital friend, Brad. Over ten episodes, each a self-contained story, we see Alice grow from an eight-year-old living with her parents in a remote region of Northern China to a talented mid-twenties animator and designer with the biggest games company in the world.

Hour of Code > Khan Academy Tutorials

Welcome to our Hour of Code on Khan Academy!

Khan Academy is a not-for-profit with the goal of changing education for the better by providing a free, world-class education for anyone, anywhere. Khan Academy’s materials and resources are available to you completely free of charge.

https://www.khanacademy.org/cs/programming
Drawing and animation
Programming is how we tell computers what we want them to do, whether that’s building iPhone apps, video games, or websites… At Khan Academy, you can use our programming environment to build graphics, animations, and interactive visualizations. If you’ve never programmed before, follow these tutorials to learn how!
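To give a flavour of what those tutorials build, here is a minimal drawing-and-animation sketch. It is written in standard Processing syntax (the same environment that appears in the Experimental Work section below); Khan Academy’s own editor uses a JavaScript-flavoured version of the same drawing API, so the calls (background, fill, ellipse) look almost identical there. The ball, its colours, and its speed are just illustrative choices, not anything from a specific tutorial.

// A minimal drawing-and-animation sketch of the kind the tutorials start with.
float x = 0;  // horizontal position of the ball

void setup() {
  size(400, 400);
}

void draw() {
  background(200, 230, 255);          // clear to a pale blue each frame
  fill(255, 160, 0);
  ellipse(x, height / 2.0, 50, 50);   // draw the ball
  x = x + 2;                          // move it a little every frame
  if (x > width) {
    x = -25;                          // wrap around to the left edge
  }
}

Run it and the ball drifts across the canvas, redrawn every frame by draw() — the same draw-loop idea the Khan Academy tutorials introduce first.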

Experimental Work > Kinect, Processing, Second Life

I recommend you turn down the sound, which is high-pitched…

Experimentation (2012) from Sachiko Hayashi on Vimeo.

Experimentation is a project built with Kinect, Processing, and Second Life. It centres on a gestural interface through which three components (avatar, sound, and visuals) are brought together in real time in a mixed-reality performance. The gestural interface, implemented with Kinect and Processing, removes the need for pre-programmed (pre-animated) or pre-recorded sequences of avatar movement, sound, and visuals.
The Kinect captures real-life body movement to drive the avatar’s movement in Second Life. It also feeds the real-life audio-visual performance: the Kinect captures imagery of the performer and her surroundings, which is manipulated in real time with Processing. The sound is generated in real time from computer glitch and feedback, again controlled through the gestural interface. The audio-visual performance is then streamed into Second Life and combined with the avatar’s movement in real time. Because the avatar is driven gesturally and the real-life surroundings are folded in, every performance is unique.
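The code behind the piece isn’t published, but the real-time image manipulation it describes is the kind of thing a short Processing sketch can illustrate. The sketch below is only a rough illustration under stated assumptions: it uses Processing’s bundled Video library and an ordinary webcam in place of the Kinect, and applies a simple glitch-style effect to the live feed every frame. In the actual work, a Kinect library would additionally supply depth and skeleton data for driving the Second Life avatar, and the result would be streamed into Second Life; neither step is shown here.

// Assumes Processing 3+ with the Video library installed and a webcam attached.
import processing.video.*;

Capture cam;  // live camera feed (stands in for the Kinect's RGB stream)

void setup() {
  size(640, 480);
  cam = new Capture(this, width, height);
  cam.start();
}

void draw() {
  if (cam.available()) {
    cam.read();                      // grab the latest frame
  }
  image(cam, 0, 0);                  // draw the live feed

  // Glitch-style manipulation: re-copy a few horizontal slices of the
  // frame with a random horizontal offset, recomputed every frame.
  for (int i = 0; i < 8; i++) {
    int y = int(random(height));
    int h = int(random(4, 24));
    int offset = int(random(-40, 40));
    copy(0, y, width, h, offset, y, width, h);
  }
}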
Experimentation is my first real-time mixed-reality virtual performance, made possible with support from HUMlab, Umeå University, Sweden. This machinima was made from documentation material recorded in HUMlab’s H3 space; the Second Life location is the HUMlab sim.

concept, image, sound, programming and performance: Sachiko Hayashi