Reactable UiO

Semester Assignment, Kyrre Glette

About

The Reactable concept is an electronic instrument built around a tangible interface with visual feedback. Users place objects on, and touch, a screen in order to control sound generation. The Reactable UiO project aims to reproduce this concept with the same kind of tangibles for the controller interface, but with a different mapping and sound generation solution.

Scope of the work

The goal of the Reactable UiO project was to make a more or less accurate replica of the original Reactable proposed by the Music Technology Group at the Universitat Pompeu Fabra, Spain. This involved building a physical table structure with the components necessary to make the table work as intended. The vision/tracking software was provided by the reacTIVision project, while the control logic, sound generation and visualization had to be programmed.

The project has been carried out in cooperation with Viet Phi Uy Hoang. Although we have both participated in all tasks, my primary responsibilities have been the visualization and control software and the electronic equipment (illumination, projector). This report focuses on my work, while the Wiki page will contain up-to-date and comprehensive documentation of the project. The project was started entirely from scratch, so a major part of the work consisted of searching for and ordering suitable components, and constructing the table from them.

How it works

The Reactable mainly consists of a camera which tracks the movements of the tangibles on the surface, and a projector which projects graphical feedback onto the same surface. So that the projected image does not disturb the camera's tracking of the objects, the tracking is performed in the infrared range: the tangibles are illuminated with infrared light, and the camera is fitted with a filter that blocks visible light. By the same token, the infrared illumination does not disturb the users' perception of the projected image. On the software side, the reacTIVision image processing software sends OSC messages to self-developed programs which generate the visuals and sound.

reactivision
Reactable and reacTIVision setup, image from reacTIVision page

Vision / tracking

Tracking the surface requires a camera which can detect infrared (IR) light and has a wide enough viewing angle to cover the entire surface. Most camera lenses have an infrared filter coating which blocks most IR light. In addition, a filter which blocks visible light is needed to avoid interference from the image projected onto the surface.

Camera
We are using a Unibrain Fire-i camera, since it was available in the lab. It captures 640x480 monochrome video at 30 FPS and has a FireWire interface. The lens is replaceable, and lenses without IR coating are available. We have ordered and tried non-IR-coated 4.3 mm and 2.1 mm (wide-angle) lenses.

ir coating comparison
left: lens with IR coating, right: lens without IR coating

At the moment we are using the 4.3 mm lens, since there are problems with detecting the objects with the 2.1 mm lens. This is probably because not enough light is available and the tangibles appear smaller in the image. With the current illumination, the camera sensor must be set to maximum exposure time and a high gain, which indicates that not enough light reaches the sensor. A long exposure time increases the reaction time, and a high gain introduces noise into the image. Overall, we feel we would have needed a camera with one or more of the following: a better (larger) sensor (for less noise), better optics (for more light and better definition), and possibly a higher resolution. However, because of the long delivery time of the Unibrain lenses, this need did not become apparent until the final stage of the project.
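The trade-off between the two lenses can be sketched with a simple thin-lens model. Note that the sensor width used below is an assumption (a nominal 1/4" sensor, roughly 3.6 mm horizontally) and should be checked against the actual Fire-i specification:

```python
import math

def field_of_view_deg(sensor_mm: float, focal_mm: float) -> float:
    """Angular field of view for a simple thin-lens model."""
    return math.degrees(2 * math.atan(sensor_mm / (2 * focal_mm)))

def required_distance_mm(surface_mm: float, sensor_mm: float, focal_mm: float) -> float:
    """Camera-to-surface distance needed to image surface_mm of surface.

    Similar triangles: surface / distance = sensor / focal.
    """
    return surface_mm * focal_mm / sensor_mm

# Assumed ~3.6 mm wide sensor; 900 mm surface to cover.
SENSOR = 3.6
for focal in (4.3, 2.1):
    fov = field_of_view_deg(SENSOR, focal)
    dist = required_distance_mm(900, SENSOR, focal)
    print(f"{focal} mm lens: FOV {fov:.0f} deg, camera distance {dist:.0f} mm")
```

Under these assumptions the 4.3 mm lens needs roughly a metre of distance to cover the 90 cm surface, while the 2.1 mm lens manages at about half that, at the cost of the tangibles occupying fewer pixels, which matches what we observed.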

IR pass / visible light cut-off filter
Since the IR illumination is at 850 nm, the ideal would be a filter which only lets this wavelength pass. We have ordered such a filter, which it should be possible to place on top of the camera with the help of a filter mount adapter. Due to shipping problems we are currently using a cut-out piece of photographic film negative, which filters out most (but not all) of the visible light. A commercial filter is expected to give a cleaner result.

Infrared illumination

The challenge with illumination is providing enough light for the camera sensor to produce recognizable images of good quality. At the same time, it is important that the light is distributed evenly and that highlights on the surface are avoided. After trying some QED222 5 mm IR LEDs, we found that even a large number of these would not give a sufficient amount of light.

led comparison
left: side illumination by 21 QED222 IR LEDs, right: side illumination by 1 high-power IR LED

We then decided to go for a smaller number of high-power LED emitters instead. These are rated at 3.2-3.5 V and 350 mA, although according to the product sheet a customer suggests they can handle up to 700 mA. We are currently feeding them around 500 mA. Such LEDs generate heat, so a thermal dissipation system was constructed by cutting pieces from a large aluminium heat dissipator and drilling holes for fixing the LED star with screws. Some thermal conducting paste was applied. At the moment 5 such emitters are used. In addition, some simple diffusers were mounted.
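To give a feel for the currents and heat involved, here is a minimal calculation sketch. The 12 V supply and the three-emitters-in-series string are illustrative assumptions, not our actual wiring; the nominal 3.4 V forward voltage is the midpoint of the rated range:

```python
def led_string(v_supply: float, v_forward: float, i_target: float, n_series: int = 1):
    """Series resistor value and power figures for a simple resistor-limited LED string."""
    v_drop = v_supply - n_series * v_forward   # voltage the resistor must absorb
    r = v_drop / i_target                      # Ohm's law: R = V / I
    p_resistor = v_drop * i_target             # heat wasted in the resistor
    p_led = v_forward * i_target               # heat each emitter must shed
    return r, p_resistor, p_led

# Assumed 12 V supply, three emitters in series at ~3.4 V, driven at 500 mA.
r, p_resistor, p_led = led_string(12.0, 3.4, 0.5, n_series=3)
```

At 500 mA each emitter dissipates about 1.7 W, which is why mounting them on an aluminium heat dissipator with thermal paste is necessary; a proper constant-current driver would avoid wasting the resistor's share as heat.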

led setup
High-power LED mounted on aluminium heat dissipator

We found that pointing the LEDs directly at the surface either gave reflection highlights on the surface, or good but too uneven lighting with our 4-5 LEDs. Light distribution might have been better with a few more (possibly 8) emitters. For now we have therefore decided to point the LEDs away from the surface, giving a more "ambient" illumination setup. Some aluminium foil was added to the inside walls in an attempt to diffuse the light more evenly inside the box.

Surface projection

We needed a projection which could cover a circle with a diameter of 90 cm, which means that the shortest dimension (the height) of the image needed to be at least this size. With a box height of about 90 cm, this calls for either an extremely short-throw projector or a mirror setup (which has the effect of increasing the available throw distance for the projector). Since the projector in our lab did not give a large enough image even with a mirror setup, we needed to find and buy a new one.
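As a rough check of the geometry, assuming a 4:3 image and treating a mirror as simply doubling the optical path:

```python
def image_height(image_width_mm: float, aspect=(4, 3)) -> float:
    """Image height for a given width and aspect ratio."""
    return image_width_mm * aspect[1] / aspect[0]

def throw_ratio(throw_mm: float, image_width_mm: float) -> float:
    """Projector throw ratio: throw distance divided by image width."""
    return throw_mm / image_width_mm

# A 900 mm image height at 4:3 means a 1200 mm wide image.
width = 1200.0
assert image_height(width) >= 900

# Direct projection inside a ~900 mm tall box:
direct = throw_ratio(900, width)     # 0.75 -- only extreme short-throw models
# A mirror roughly doubles the available path:
mirrored = throw_ratio(1800, width)  # 1.5 -- within reach of short-throw models
```

A required throw ratio of about 0.75 is what pushed us towards ultra-short-throw projectors; the 1200 mm figure is an assumption based on a 4:3 panel.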

We chose a Hitachi CP-A100 projector, which has an extremely short throw thanks to a built-in mirror system. This made it unnecessary to set up a mirror projection system ourselves. The projector performs well; however, it should be noted that the offset from the bottom of the projector to the bottom of the image can be a problem. We solved this by placing the projector in an "add-on box" on one of the table walls.

The projection calculator has been useful when looking for candidate projectors. Other short-throw candidates were the Optoma EX-525ST, Epson EMP-400W and BenQ MP771.

Software

We are using the reacTIVision software for tracking the objects on the surface. It sends Open Sound Control (OSC) messages via UDP to client software. The messages comply with the TUIO protocol and contain information such as the position and orientation of the objects on the surface. On the client side, easy-to-use TUIO client libraries are available for several programming languages.
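As an illustration (in Python rather than Processing, and without the actual OSC transport), here is a simplified parse of the argument list of a TUIO 1.x /tuio/2Dobj "set" message. The full profile also carries velocity, rotation speed and acceleration fields, which are ignored in this sketch:

```python
from dataclasses import dataclass

@dataclass
class TuioObject:
    session_id: int   # unique per object appearance on the surface
    fiducial_id: int  # which fiducial symbol is printed on the tangible
    x: float          # position, normalised to 0..1
    y: float
    angle: float      # orientation in radians

def parse_2dobj_set(args) -> TuioObject:
    """Parse a /tuio/2Dobj 'set' argument list (simplified).

    Full profile is: s i x y a X Y A m r -- only the first five are kept here.
    """
    s, i, x, y, a = args[:5]
    return TuioObject(s, i, x, y, a)

# Hypothetical message as delivered by an OSC library: command + arguments.
msg = ("set", 4, 12, 0.25, 0.75, 1.57, 0.0, 0.0, 0.0, 0.0, 0.0)
if msg[0] == "set":
    obj = parse_2dobj_set(msg[1:])
```

In practice a TUIO client library handles this parsing (plus the alive/fseq bookkeeping) and simply hands the application object-add, update and remove callbacks.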

Processing Prototype Playground Program (PPPP)

We have implemented a prototype program in Processing which receives TUIO messages from the reacTIVision software via the Processing TUIO client. The purpose of the program is to provide a simple and fun interface for trying out some of the Reactable functionality, and the concept could later be extended to a more powerful music generator for live use.

software
Prototype software

Its main concepts are the "radar percussion sequencer" and the "loop blocks". The radar sweeps a line around the outer part of the surface, and when it hits a percussion marker, a sound is triggered. The inner area holds markers which trigger loops. We plan to extend this later with effects which can influence other markers. We are currently using the Minim library in Processing for generating sound; however, the sound is somewhat unstable, and we plan to move the sound processing to Max/MSP by sending OSC control messages from Processing.
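The core of the radar trigger is deciding, each frame, whether the sweeping line has passed a marker's angle. A minimal sketch of that test (in Python rather than Processing; the revolution period and frame rate are illustrative values, not measured from the prototype):

```python
import math

def crossed(prev_angle: float, new_angle: float, marker_angle: float) -> bool:
    """True if the radar line swept past marker_angle during this frame."""
    sweep = (new_angle - prev_angle) % (2 * math.pi)    # angle covered this frame
    offset = (marker_angle - prev_angle) % (2 * math.pi)  # marker ahead of the line
    return 0 < offset <= sweep

# One revolution in 2 s at 30 fps -> angular step per frame:
STEP = 2 * math.pi / (2.0 * 30)
marker = 1.0  # a percussion tangible sitting at ~1 rad
angle, hits = 0.0, 0
for _ in range(60):  # simulate exactly one revolution
    new = angle + STEP
    if crossed(angle, new, marker):
        hits += 1  # this is where the percussion sample would be triggered
    angle = new
```

Testing "did the sweep cross the marker" rather than "is the line near the marker" guarantees exactly one trigger per revolution, regardless of frame rate jitter.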

OpenGL is used for visualizing the feedback (blocks around the tangible boxes, the radar line, and more). With a graphics accelerator present, this makes it possible to scale up to high resolutions and high-quality visual effects while keeping a high frame rate and a low CPU load. This matters in particular when the vision software, which also requires CPU power, runs on the same computer.

Conclusion and future work

At the moment we have a working prototype which can be seen as a proof of concept. Searching for and purchasing parts took more time than expected, so less time than planned was spent on software development. The prototype is a first step towards a Reactable-like device for the fourMs lab, but it would be desirable for the project to be taken further, from an "alpha" state to a "beta" or more finished state. Only then will the real possibilities of the instrument open up, allowing for interesting experiments in user interface, mapping and sound generation.

The following points are the most immediate possible improvements:

References