System Architecture

Pulsefield System Design

The Pulsefield consists of several loosely coupled subsystems that communicate over a local area network using Open Sound Control (OSC).  This makes it easy to combine additional subsystems and existing commercial software with the core system.

Correlation Analyzer

Several cameras (typically six) are connected via Ethernet to the first subsystem, the correlation analyzer.  It analyzes the incoming images and makes a visibility decision at each of a set of preprogrammed positions: whether the background (the opposite side of the Pulsefield) is visible there or not.   These positions usually correspond to the locations of the LEDs on the opposite side.  With 960 LEDs, this results in a 960×6 matrix of visibility data points, each of which is in one of three states: visible, blocked, or indeterminate.   These visibility data are broadcast over the LAN using OSC as each new frame is acquired, at around 20 frames/second.
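A minimal sketch of how such a per-frame visibility matrix might be held in memory follows; the class and member names are illustrative assumptions, not the actual Pulsefield source.

```java
// Illustrative sketch of the per-frame visibility matrix (names are assumptions).
public class VisibilityFrame {
    public enum Visibility { VISIBLE, BLOCKED, INDETERMINATE }

    public static final int NUM_LEDS = 960;    // LEDs around the perimeter
    public static final int NUM_CAMERAS = 6;   // typical camera count

    // vis[led][camera]: can this camera see the background LED at this position?
    private final Visibility[][] vis = new Visibility[NUM_LEDS][NUM_CAMERAS];

    public void set(int led, int camera, Visibility v) { vis[led][camera] = v; }
    public Visibility get(int led, int camera)         { return vis[led][camera]; }
}
```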

Target Hypotheses and Controller

The visibility data are used by this subsystem to form hypotheses about the locations of people within the Pulsefield.   Effectively, rays are drawn between each camera and each LED, one set for the visible paths and a separate set for the blocked paths.

Blocked rays
Visible rays
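A minimal geometric sketch of these rays, assuming camera and LED positions are known in a common 2-D coordinate frame; the class name, the candidate test, and the use of half the minimum person diameter as the distance threshold are assumptions for illustration.

```java
// Illustrative geometry sketch (not the actual Pulsefield code): each visibility
// reading corresponds to the line segment from a camera to an LED on the far side.
import java.awt.geom.Line2D;
import java.awt.geom.Point2D;

public class RayBuilder {
    // Positions are assumed to be in metres in the plane of the camera/LED ring.
    public static Line2D.Double ray(Point2D.Double camera, Point2D.Double led) {
        return new Line2D.Double(camera, led);
    }

    // A point is a potential person location only if no *visible* ray passes
    // within half a person's minimum diameter of it (otherwise that camera
    // could still see the background there).
    public static boolean isCandidate(Point2D.Double p,
                                      Iterable<Line2D.Double> visibleRays,
                                      double minPersonDiameter) {
        for (Line2D.Double r : visibleRays) {
            if (r.ptSegDist(p) < minPersonDiameter / 2) {
                return false;
            }
        }
        return true;
    }
}
```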

The software identifies potential locations for people as areas where no visible (green) rays cross, i.e., locations where an object would block the view of the other side from all the cameras.  These potential locations are then compared with the hypotheses for the locations of people in the Pulsefield from the previous frame.   Using reasonable parameters for the minimum diameter of a person and the maximum distance someone could move between frames (i.e., in 50 milliseconds), the best set of positions for the people is formed, each carrying a unique persistent ID from the previous frame.  If a particular person from a prior frame is not located, they are assumed to remain where they were last seen for a fixed number of frames; if they haven't reappeared by then, they are assumed to have exited or ducked below the level of the cameras.   Entries of new people are handled by assuming that new targets can form only near the entrance to the Pulsefield, since the sides are blocked by the structure.
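The frame-to-frame matching could look roughly like the sketch below. This is a hedged approximation of the logic described above, using a greedy nearest-neighbour assignment; the real Pulsefield tracker may differ, and the speed limit, timeout, and entrance radius are illustrative values.

```java
// Hedged sketch of frame-to-frame target tracking (names and thresholds are illustrative).
import java.awt.geom.Point2D;
import java.util.*;

public class Tracker {
    static final double MAX_SPEED_M_S = 5.0;          // assumed maximum human speed
    static final double FRAME_DT = 0.05;              // 20 frames/second
    static final double MAX_MOVE = MAX_SPEED_M_S * FRAME_DT;
    static final int MAX_MISSING_FRAMES = 20;         // ~1 s before a target is dropped

    static class Target {
        final int id;
        Point2D.Double pos;
        int missingFrames = 0;
        Target(int id, Point2D.Double pos) { this.id = id; this.pos = pos; }
    }

    private final List<Target> targets = new ArrayList<>();
    private int nextId = 1;

    /** Match candidate locations from the current frame to existing targets. */
    public void update(List<Point2D.Double> candidates, Point2D.Double entrance) {
        Set<Point2D.Double> unclaimed = new LinkedHashSet<>(candidates);
        for (Target t : targets) {
            Point2D.Double best = null;
            for (Point2D.Double c : unclaimed) {
                if (c.distance(t.pos) <= MAX_MOVE &&
                        (best == null || c.distance(t.pos) < best.distance(t.pos))) {
                    best = c;
                }
            }
            if (best != null) {                        // target found again this frame
                t.pos = best;
                t.missingFrames = 0;
                unclaimed.remove(best);
            } else {                                   // hold last position for a while
                t.missingFrames++;
            }
        }
        // Drop targets that have been missing too long (exited or ducked).
        targets.removeIf(t -> t.missingFrames > MAX_MISSING_FRAMES);
        // Remaining candidates near the entrance become new targets.
        for (Point2D.Double c : unclaimed) {
            if (c.distance(entrance) < 1.0) {          // 1 m entrance zone (assumed)
                targets.add(new Target(nextId++, c));
            }
        }
    }
}
```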

Once the positions of the targets have been determined for the current frame, they are broadcast via OSC to any listeners.  These broadcasts include the location, velocity, approximate size, ID, frame number, and current time for each target.
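For illustration, a single target update might be sent like this, assuming the JavaOSC library (com.illposed.osc, 0.4-style API); the "/pf/update" address, the argument order, and the broadcast address and port are assumptions rather than the documented Pulsefield protocol.

```java
import com.illposed.osc.OSCMessage;
import com.illposed.osc.OSCPortOut;
import java.net.InetAddress;
import java.util.ArrayList;
import java.util.List;

public class TargetBroadcaster {
    public static void main(String[] args) throws Exception {
        // Broadcast on the LAN; the address and port are illustrative.
        OSCPortOut out = new OSCPortOut(InetAddress.getByName("192.168.1.255"), 7000);

        int frame = 1234, id = 3;
        float x = 1.2f, y = -0.4f, vx = 0.1f, vy = 0.0f, size = 0.35f;

        List<Object> oscArgs = new ArrayList<>();
        oscArgs.add(frame);                              // frame number
        oscArgs.add(id);                                 // persistent target ID
        oscArgs.add(x);  oscArgs.add(y);                 // location (m)
        oscArgs.add(vx); oscArgs.add(vy);                // velocity (m/s)
        oscArgs.add(size);                               // approximate diameter (m)
        oscArgs.add((float) (System.currentTimeMillis() / 1000.0)); // current time (s)

        out.send(new OSCMessage("/pf/update", oscArgs)); // one message per target
        out.close();
    }
}
```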

This module also acts as a central controller for the entire system.  OSC messages received here can control overall system operation.   For example, messages from TouchOSC are received and either acted upon, if they affect this module, or dispatched to the appropriate other module.  This allows remote control of the parameters and overall operation of the system.
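The dispatching might be as simple as routing on the OSC address prefix, as in the sketch below; the prefixes and module names are illustrative assumptions, not the actual Pulsefield address scheme.

```java
import java.util.List;

// Minimal sketch of controller-style dispatch of incoming OSC messages.
public class ControlDispatcher {
    interface OscSender { void send(String address, List<Object> args); }

    private final OscSender videoApps;  // forwards to the video/app subsystem
    private final OscSender ledServer;  // forwards to the LED server

    public ControlDispatcher(OscSender videoApps, OscSender ledServer) {
        this.videoApps = videoApps;
        this.ledServer = ledServer;
    }

    public void handle(String address, List<Object> args) {
        if (address.startsWith("/tracker/")) {
            applyLocalSetting(address, args);        // affects this module directly
        } else if (address.startsWith("/app/")) {
            videoApps.send(address, args);           // e.g. app selection from TouchOSC
        } else if (address.startsWith("/led/")) {
            ledServer.send(address, args);
        }
    }

    private void applyLocalSetting(String address, List<Object> args) {
        // Adjust local tracking parameters, e.g. the minimum person diameter.
    }
}
```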

Video Apps

Tracking messages from the target identification subsystem are received by a Java video program that supports various “Apps”, which determine the video, LED display, and sound output based on target positions.   OSC messages also control the choice of app and various parameters.   These apps, although written entirely in Java, use many of the library modules and the overall event loop from Processing.  Some of those apps can be viewed on the Apps page of this site.   The video generated by these apps is displayed on a large video screen behind the Pulsefield.
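An app hooked into the Processing event loop might expose an interface along these lines; the interface and method names are assumptions for illustration, not the actual Pulsefield app API.

```java
import processing.core.PGraphics;
import java.util.Map;

// Illustrative app interface; names and signatures are assumptions.
public interface PulseApp {
    /** Called once per frame with the latest tracked targets (ID -> {x, y} position). */
    void update(Map<Integer, float[]> targets, long frame);

    /** Draw this app's visuals into the canvas shown on the screen behind the Pulsefield. */
    void draw(PGraphics canvas);

    /** React to OSC control messages, e.g. app selection or parameters from the iPad. */
    void handleMessage(String address, Object[] args);
}
```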

Music Output

Most of the Apps also have an auditory component, which is played on speakers around the Pulsefield.  The sound or music is generated from MIDI or OSC messages sent by the App, usually in concert with the video display.  For most of the Apps, the sound generation occurs in Ableton Live, Max/MSP, or Apple's built-in sound synthesis.   For Ableton, OSC messages are directed to a LiveOSC plugin that allows extensive control of Ableton's tracks, playback, and device parameters via OSC commands.   For example, some of the Apps control Ableton macros, which are assumed to be on the first device in the device chain.   These macros, in turn, can be programmed within Ableton to control other device parameters.
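As a sketch of the idea, a person's x-position could be mapped onto a rack macro and sent to LiveOSC as below. The "/live/device" address and its (track, device, parameter, value) argument order are assumptions about the LiveOSC plugin in use and should be checked against its documentation; the JavaOSC calls are the same as in the earlier example.

```java
import com.illposed.osc.OSCMessage;
import com.illposed.osc.OSCPortOut;
import java.util.ArrayList;
import java.util.List;

public class MacroControl {
    /** Map a tracked x-position onto Macro 1 of the first device (the rack) on a track. */
    public static void setMacroFromPosition(OSCPortOut live, float x,
                                            float minX, float maxX) throws Exception {
        // Scale the position into Live's 0..127 macro range and clamp it.
        float value = 127f * (x - minX) / (maxX - minX);
        value = Math.max(0f, Math.min(127f, value));

        List<Object> args = new ArrayList<>();
        args.add(0);          // track index (assumed)
        args.add(0);          // device index: first device in the chain (the rack)
        args.add(1);          // parameter index of Macro 1 (assumed)
        args.add(value);
        live.send(new OSCMessage("/live/device", args));
    }
}
```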

Ableton device rack showing Macros which are controlled by OSC messages from the Pulsefield Apps

For more flexible music synthesis, Max can be used, either standalone or as a Max for Live plugin.   Max includes native objects for receiving and sending OSC messages.   Similarly, SuperCollider, ChucK, or any other music synthesis program can be controlled and played from the Pulsefield Apps.

LED Controller

The LED display around the inside of the Pulsefield performs several functions: a distinct background for the camera image analysis, an automated calibration mechanism for the cameras' lenses and orientations, illumination of the space, and one of the interactive feedback elements.   Each of the 960 LEDs is individually controllable, and the entire display can be completely updated at the full frame rate of 20 frames/second (see the Hardware section of this site for more details).   The LED Server subsystem receives OSC commands from the other subsystems and uses them to determine the RGB value of each LED.  During operation, the actual pattern depends on the current App, but typically the LEDs near each person are lit with a color unique to that person.  For example, the first person to enter may be assigned RED, and then wherever they go, the LEDs near them are illuminated RED.   The next person may be MAGENTA, and their color will likewise be overlaid on the display.   In addition, the LEDs can be made to pulse with the music being played, even to the degree that each person's color pulses with the music track that they are individually controlling: a separate VU meter for each person in the Pulsefield.
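One way the LED server could map tracked people onto the ring is sketched below; the class names, the 2 m "near" radius, and the last-person-wins handling of overlaps are assumptions for illustration, not the actual implementation.

```java
// Hedged sketch: light the LEDs near each person with that person's colour,
// scaled by that person's current audio level (the VU-meter effect described above).
import java.awt.Color;
import java.awt.geom.Point2D;
import java.util.List;

public class LedMapper {
    public static final int NUM_LEDS = 960;

    static class Person {
        Point2D.Double pos;   // tracked position in the plane of the ring
        Color color;          // unique colour assigned on entry (RED, MAGENTA, ...)
        float audioLevel;     // 0..1 level of the music track this person controls
    }

    /** ledPositions[i] is the physical location of LED i around the ring. */
    public static Color[] render(List<Person> people, Point2D.Double[] ledPositions) {
        Color[] out = new Color[NUM_LEDS];
        for (int i = 0; i < NUM_LEDS; i++) {
            out[i] = Color.BLACK;
            for (Person p : people) {
                if (ledPositions[i].distance(p.pos) < 2.0) {   // "near" this person (assumed)
                    float s = p.audioLevel;
                    out[i] = new Color((int) (p.color.getRed() * s),
                                       (int) (p.color.getGreen() * s),
                                       (int) (p.color.getBlue() * s));
                }
            }
        }
        return out;
    }
}
```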

iPad Controller

Although the Pulsefield, once started, can operate without intervention, finer control and monitoring are possible through an iPad running a custom template in the TouchOSC application.

iPad Controller showing the Control page.
iPad Controller showing the Grid page. This shows the positions of the occupants on an overlaid grid in the Pulsefield. The current song and track controls are also displayed and updated as people move between grid positions.
