Interfication - video-based interactive installation
Video-based interactive installation, presented at the 8th International Conference on Entertainment Computing, CNAM, Paris, 2009 (Video 1).
It analyzes motion captured by a video camera and changes the behaviour of objects on the screen. All possible object states were pre-rendered as several video files and shown on the screen when needed.
In this work, I tested the idea of creating an interactive media work from a set of pre-rendered videos. For example, a flower can be represented by videos of growth, death, and shaking in the wind to the left and to the right (Fig. 1). When a viewer steps in front of the camera, the flower grows and shakes. When the viewer walks away, it dies (see Fig. 2 for the full list of sequences; note that there are two versions of the flower's motion, left and right).
Alternatively, a “virtual kinetic object” can be represented as a video loop whose playback speed depends on the speed of the viewer’s motion.
The idea appeared from my desire to create interactive art, though I had no experience in making impressive generative graphics. So the solution I found was quite straightforward: ask 3D designers to render high-quality videos, and mix them in an interactive manner.
The Interfication app was made with SDL/OpenGL.
It renders a static background with an object on top of it.
All object videos were made in 3D editors and saved as RGB+Alpha TIFF sequences. I converted them into special video files using my custom format, which stores only the blocks of each frame with non-zero alpha values (Fig. 2). This approach speeds up rendering with the alpha channel.
Remark: today, we could use the HAP codec for such purposes.
To analyze the data from the webcam, I used a custom algorithm that computes the horizontal speed of the object’s motion.
Remark: today, we could use an optical flow algorithm, for example the Farnebäck algorithm implemented in OpenCV.
Internally, the main app was made in C++ Builder (Figs. 3, 4); it starts and controls the SDL/OpenGL window. All the basic parts of the app were implemented as dynamic libraries (DLLs), see Fig. 5.
Part of the research was supported by grant 09-01-00523 from the Russian Foundation for Basic Research and the Presidium of the Russian Academy of Sciences fundamental program 29, project P(29)7-2. I would like to thank G. Malyshev and P. Zakrevskiy for the 3D animation, S. Zamuraev for fruitful discussions, I. Ilyin for help with programming, S. Perevalova for help with the demo, and A. Poptsova for text correction.
Rapid Interactive Installation Development Using Robust Computer Vision and Image-Based Rendering. ICEC 2009: 298–299. Download article
Soon after the Interfication project, I discovered openFrameworks and found that all the modules I had developed were already implemented in oF. So I switched to openFrameworks, and my next project was made with it.
Nevertheless, the ideas behind the Interfication project were continued.
First, the technique of packing code into a DLL. After several years of experience with openFrameworks, I started making DLLs from openFrameworks projects! For example, I made OSC and TCP/IP modules for Kuflex’s Unreal Engine projects Another World and Kusmos2. I also packaged Kuflex’s tracker software, which works with depth cameras, into a DLL that connects to other openFrameworks projects very simply.
Next, the idea of using pre-rendered video was later continued in the projects “Interactive Guides” and “I’m a Conductor”, which use Kinect and gesture recognition instead of a webcam.
Finally, the idea of a separate GUI-based app that controls the rendering app: I continued it in 2011, and again in 2018, by developing Configurers for this purpose.
Video 1. Installation demo
Fig. 1. Graph of video sequences representing flower’s behavior
Fig. 2. Flower image sequences represented as files
Fig. 3. GUI-based app which controls installation’s OpenGL window
Fig. 4. Structure of the app
Fig. 5. Custom modules used in the Interfication project