
Flyfire, a project initiated by the SENSEable City Laboratory in collaboration with ARES Lab (the Aerospace Robotics and Embedded Systems Laboratory), aims to transform any ordinary space into a highly immersive and interactive display environment. In its first implementation, the Flyfire project explores the capabilities of this display system using a large number of self-organizing micro helicopters. Each helicopter carries small LEDs and acts as a smart pixel. Through precisely controlled, synchronized movements, the helicopters form an elastic display surface for any desired scenario.

With ARES Lab's self-stabilizing and precise control technology, the motion of the pixels is adaptable in real time. The Flyfire canvas can transform itself from one shape to another or morph a two-dimensional photographic image into an articulated shape. Because the pixels are physically engaged in transitioning images from one state to another, the Flyfire canvas delivers a spatially animated viewing experience. Flyfire serves as an initial step toward exploring the possibilities of this free-form display: a swarm of pixels in space. For more information, please contact: senseable-fly@mit.edu
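The core idea of flying pixels that morph an image into a shape can be illustrated with a minimal sketch. This is purely hypothetical code, not the Flyfire team's implementation: it assumes a grayscale image mapped onto a flat canvas of "pixels" (one helicopter per bright cell) and a simple linear interpolation of each pixel toward a target position to animate a shape transition.

```python
from dataclasses import dataclass

@dataclass
class Pixel:
    """One flying pixel: a position in space plus an LED brightness."""
    x: float
    y: float
    z: float
    brightness: float

def image_to_pixels(image, spacing=1.0, threshold=0.1):
    """Map a 2D grayscale image (rows of values in 0..1) onto a flat
    canvas of flying pixels, skipping cells too dark to need one."""
    pixels = []
    for row, line in enumerate(image):
        for col, value in enumerate(line):
            if value > threshold:
                pixels.append(Pixel(col * spacing, row * spacing, 0.0, value))
    return pixels

def morph(pixels, targets, t):
    """Linearly interpolate each pixel toward its target (x, y, z);
    t=0 gives the start shape, t=1 the end shape."""
    return [
        Pixel(p.x + t * (tx - p.x),
              p.y + t * (ty - p.y),
              p.z + t * (tz - p.z),
              p.brightness)
        for p, (tx, ty, tz) in zip(pixels, targets)
    ]

# A tiny 2x2 image: three bright cells become pixels, the dark one is skipped.
image = [[1.0, 0.8],
         [0.05, 0.9]]
canvas = image_to_pixels(image)

# Target shape: lift every pixel straight up to z = 2.0.
targets = [(p.x, p.y, 2.0) for p in canvas]
halfway = morph(canvas, targets, 0.5)  # each pixel is now at z = 1.0
```

A real system would replace the straight-line interpolation with trajectory planning and collision avoidance, but the pixel-to-position mapping is the same basic step.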


