EyeCon

Introduction:

Human movement in general, and dance in particular, is highly complex. As humans we perceive it subjectively and are able to interpret its functional or expressive content. As an engineer, I have to look at it from a different perspective: I can try to break it down into a set of parameters, such as the overall speed of the movement, the exact position of the limbs in space, or the body's position in the room. That is exactly what EyeCon does: it breaks movement down into a set of abstracted features and then lets you individually assign digital media to each of those movement parameters.

The systems by which EyeCon senses movement are represented through graphic structures called Elements. The Elements are superimposed onto the live video image in the Video Window, where they can be scaled and manipulated. Each Element is then assigned its individual properties, such as the volume control of a particular sound or triggers for external media.

Types of Elements*


TOUCHLINES

Touchlines are lines drawn on the video image in the computer. They act as triggers for the presence or absence of body parts or objects, but they can also be scaled so that different places along the line have different effects. In this sense they provide an easy way to track position along a line.
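
As a rough illustration, here is a minimal Python sketch of how a touchline could be evaluated, assuming a binary motion mask (True where pixels changed) has already been computed; the function and parameter names are hypothetical, not EyeCon's actual API.

    import numpy as np

    def touchline_hit(mask, p0, p1, samples=100):
        """Sample the mask along the line p0->p1; return the normalized
        position (0.0-1.0) of the first touched sample, or None."""
        (x0, y0), (x1, y1) = p0, p1
        for i in range(samples + 1):
            t = i / samples
            x = int(round(x0 + t * (x1 - x0)))
            y = int(round(y0 + t * (y1 - y0)))
            if mask[y, x]:          # body part present at this point
                return t            # position along the line scales the effect
        return None

    # Example: a 480x640 mask with activity at the middle of a horizontal line.
    mask = np.zeros((480, 640), dtype=bool)
    mask[240, 320] = True
    print(touchline_hit(mask, (0, 240), (639, 240)))   # -> 0.5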


DYNAMIC FIELDS

Dynamic Fields are boxes drawn on the image that respond to the total movement dynamic within them. They can be used to trigger sounds and images, or to control the volume and pitch of sound files so that, for example, the faster the dancer gestures or moves, the louder the sound or the higher the pitch. As you might imagine, because they function so intuitively, Dynamic Fields are the most accessible and easiest Element to use.
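
To make the idea concrete, here is a minimal Python sketch of a dynamic field, assuming two consecutive grayscale frames as NumPy arrays; the names and the 0-127 output range (MIDI volume) are illustrative assumptions, not EyeCon internals.

    import numpy as np

    def dynamic_field(prev, curr, box, threshold=20, gain=4.0):
        """Return an activity value 0-127 for the region box = (x, y, w, h)."""
        x, y, w, h = box
        diff = np.abs(curr[y:y+h, x:x+w].astype(int) - prev[y:y+h, x:x+w].astype(int))
        moving = np.count_nonzero(diff > threshold)     # pixels that changed
        activity = moving / (w * h)                     # fraction of the field in motion
        return min(127, int(activity * gain * 127))     # faster movement -> louder

    prev = np.zeros((480, 640), dtype=np.uint8)
    curr = prev.copy()
    curr[100:200, 100:200] = 255                        # simulated movement
    print(dynamic_field(prev, curr, (50, 50, 300, 300)))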

FEATURE FIELDS

Feature Fields look at first like Dynamic Fields, but they function quite differently. For example, Feature Fields can measure the dancer's overall size (how expanded or contracted they are) or how close two dancers are to one another. They can be used to analyze shape (width compared to height) or to find the topmost, leftmost, or rightmost point on the body. Finally, there is a set of controls for sensing the direction of movement, so that a step to the right can sound different from a step to the left, reaching up different from reaching down, and so on.
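
The following Python sketch shows the kinds of features such a field might extract from a binary silhouette mask; the feature names are assumptions based on the description above, not EyeCon's actual parameter list.

    import numpy as np

    def body_features(mask):
        ys, xs = np.nonzero(mask)
        if len(xs) == 0:
            return None
        width  = xs.max() - xs.min() + 1
        height = ys.max() - ys.min() + 1
        return {
            "expansion": mask.sum() / mask.size,   # how expanded/contracted the body is
            "shape":     width / height,           # width compared to height
            "topmost":   int(ys.min()),            # highest point on the body
            "leftmost":  int(xs.min()),
            "rightmost": int(xs.max()),
        }

    mask = np.zeros((480, 640), dtype=bool)
    mask[100:400, 300:360] = True                  # a tall, narrow silhouette
    print(body_features(mask))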


POLYGONAL FIELDS

Polygonal Fields act like Dynamic Fields. Their advantage is that they can take any shape and are not restricted to rectangles.
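
One way to restrict analysis to an arbitrary polygon is to rasterize it to a pixel mask, as in this Python sketch using matplotlib's Path (an assumed implementation choice; EyeCon's internals may differ).

    import numpy as np
    from matplotlib.path import Path

    def polygon_mask(shape, vertices):
        """Return a boolean mask that is True inside the polygon."""
        h, w = shape
        ys, xs = np.mgrid[0:h, 0:w]
        points = np.column_stack([xs.ravel(), ys.ravel()])
        return Path(vertices).contains_points(points).reshape(h, w)

    # A triangular field; frame differencing is then applied only inside it.
    mask = polygon_mask((480, 640), [(320, 50), (600, 400), (40, 400)])
    print(mask.sum(), "pixels inside the field")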

POSITION TRACKERS

Position Trackers track the position of one or more persons as they move around the video image. With an overhead camera, for example, you can track people's locations as they move within a room, so the environment can be made to respond to different persons differently. Touchlines and Dynamic Fields can be attached to the tracker, so that a given array of controllers moves with the performer around the space. Finally, this feature permits color-specific tracking, which makes it possible to distinguish between dancers by the color of their costumes.
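
Here is a simplified Python sketch of the basic idea for a single person, assuming a binary silhouette mask from an overhead camera; multi-person and color-specific tracking would additionally need blob labeling or color filtering, which this deliberately leaves out.

    import numpy as np

    def track_position(mask):
        """Centroid of the silhouette = the person's position in the image."""
        ys, xs = np.nonzero(mask)
        if len(xs) == 0:
            return None
        return float(xs.mean()), float(ys.mean())

    def attached_field(position, offset=(-50, -50), size=(100, 100)):
        """A dynamic field whose box follows the tracked performer."""
        x, y = position
        return int(x + offset[0]), int(y + offset[1]), size[0], size[1]

    mask = np.zeros((480, 640), dtype=bool)
    mask[200:260, 300:340] = True                  # a person seen from above
    pos = track_position(mask)
    print(pos, attached_field(pos))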

Types of output

Sounds

External synthesizers or other musical equipment
Internal synthesizer (MIDI). Virtually all computers have a built-in synthesizer chip.
*.wav files may be triggered, volume-controlled, panned, pitched, etc. (see the MIDI mapping sketch below)
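
As a hedged illustration of mapping a field's activity to MIDI, here is a Python sketch using the third-party mido library (an assumed choice; EyeCon handles MIDI internally). Controller 7 is the standard MIDI channel-volume controller.

    import mido

    def activity_to_midi(activity):                # activity in 0.0-1.0
        """Map a field's activity onto MIDI volume (controller 7)."""
        value = max(0, min(127, int(activity * 127)))
        return mido.Message('control_change', control=7, value=value)

    msg = activity_to_midi(0.44)
    print(msg)                                     # inspect the message without a port
    # With a real synthesizer attached, you would open a port and send:
    #   with mido.open_output() as port:
    #       port.send(msg)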


Images

Still pictures (digital pictures, *.jpg, *.bmp, etc.)

Animations

FLASH animations

Movies

*.avi files (movies) may be played with control over starting and stopping points, direction (i.e., forward or backward motion) and speed.
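
For illustration, here is a hedged Python sketch of such controlled playback using OpenCV (an assumed stand-in; EyeCon's own player is not exposed), where start point, direction, and speed are all expressed through the frame index. The file name is hypothetical.

    import cv2

    cap = cv2.VideoCapture("clip.avi")             # hypothetical file name
    start, end, speed = 100, 200, -2.0             # backwards at double speed
    frame_pos = float(end if speed < 0 else start)

    while start <= frame_pos <= end:
        cap.set(cv2.CAP_PROP_POS_FRAMES, int(frame_pos))
        ok, frame = cap.read()
        if not ok:
            break
        cv2.imshow("playback", frame)
        if cv2.waitKey(30) & 0xFF == 27:           # Esc to quit
            break
        frame_pos += speed                         # the sign sets the direction
    cap.release()
    cv2.destroyAllWindows()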


Data link

By linking EyeCon to external systems, whether running on the same machine or on a different one, it is possible to control a great variety of secondary media. This is done via a data link or network; the additional computer does not need to be a PC. External software and hardware that can be controlled by EyeCon include software and hardware synthesizers, MIDI-compatible lighting boards, and programs such as Director, Max/MSP, Reaktor, and Isadora. The data link can run over USB or Ethernet and can use the MIDI or OSC protocols.
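
As an example of the OSC side of such a data link, here is a minimal Python sketch using the third-party python-osc package; the address pattern and port number are illustrative assumptions, not fixed EyeCon conventions.

    from pythonosc.udp_client import SimpleUDPClient

    # e.g. Isadora or Max/MSP listening on port 8000 of the same machine
    client = SimpleUDPClient("127.0.0.1", 8000)
    client.send_message("/eyecon/field/1", 0.73)   # send one field's current value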


Technical background

EyeCon works by comparing the individual pixels of two different video frames and analyzing them for differences in brightness or color. The difference between these two frames is time: they show the same scene, but one is the present and one is the past. In some cases only 0.04 seconds may separate them, but for a body part in motion this is enough to determine that movement has occurred. Generally speaking, not all of the pixels in the two images are compared, but only those marked off by EyeCon's Elements.
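
The following Python sketch illustrates this frame-differencing idea with NumPy: two grayscale frames 0.04 seconds apart (i.e., at 25 frames per second) are compared pixel by pixel, and a brightness change above a threshold is read as motion.

    import numpy as np

    def motion_mask(past, present, threshold=20):
        """Compare two frames pixel by pixel; True where brightness changed."""
        diff = np.abs(present.astype(int) - past.astype(int))
        return diff > threshold

    past = np.random.randint(0, 30, (480, 640), dtype=np.uint8)   # static scene
    present = past.copy()
    present[220:260, 300:380] = 200                # a moving arm, say
    mask = motion_mask(past, present)
    print(mask.sum(), "pixels changed between the two frames")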


* - A complete list of Elements, along with technical information about each one, can be found in our Table of the Elements.