
B-wind! v2.0 roadmap

The work we did at the AZ Residency allowed us to build a fully functional version of B-wind! adapted to fit a specific exhibition space.
We decided to call that version 1.0 to reflect the fact that, for the first time, all the little parts that make the installation work were at a functional stage. Our goal now is to evolve this version in response to our ongoing research work and to adapt it to the challenges of new exhibition spaces. To reflect that, we settled on a version naming scheme for the installation, where minor releases accompany our research work and major releases happen around public exhibitions.
So v2.0 will showcase all the work done on the installation until the next exhibition comes along. The first major version gave us a pretty good picture of how the different parts that make up the system relate to one another, and allowed us to simplify and abstract a lot of the complexity shown in previous system architecture diagrams (see above).
Currently, there are three main drivers for our research work towards the next major version of B-wind!: expressiveness, flexibility and efficiency.


Making someone feel like being the wind requires good action feedback, low control latency and expressive visual cues.
In this context, good action feedback means making the wind generators at the remote location behave more expressively. For this to happen we need to focus on wind generator design and control. We'll search for better and more powerful fans, and look into how we can tweak the number of fans and their placement to get more expressive motion from the trees on the video stream.
Low control latency is also key to a good feeling of control. B-wind! has a complex control feedback loop with many points where we can minimize latency. We will try to minimize motion tracking time and control signal generation at the installation, and research latency prediction and compensation strategies to help with network lag to and from the remote location.
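As a rough illustration of one such compensation strategy, here is a minimal constant-velocity predictor in Python: it extrapolates the last tracked position forward by the measured network lag, so the fan control signal can anticipate where the participant will be. This is a hedged sketch with hypothetical names and interface, not the installation's actual code.

```python
class LatencyCompensator:
    """Constant-velocity (dead-reckoning) predictor for tracked positions.
    Illustrative only: a real system would smooth velocity estimates and
    clamp predictions to plausible motion."""

    def __init__(self):
        self.prev_pos = None
        self.prev_t = None
        self.velocity = (0.0, 0.0)

    def update(self, pos, t):
        """Record the latest tracked position (x, y) at time t (seconds)
        and re-estimate velocity from the last two samples."""
        if self.prev_pos is not None and t > self.prev_t:
            dt = t - self.prev_t
            self.velocity = tuple(
                (p - q) / dt for p, q in zip(pos, self.prev_pos)
            )
        self.prev_pos, self.prev_t = pos, t

    def predict(self, lag):
        """Extrapolate the position `lag` seconds into the future,
        e.g. by the measured round-trip time to the remote location."""
        return tuple(p + v * lag for p, v in zip(self.prev_pos, self.velocity))
```

For example, a participant tracked moving one unit per second along x and two along y would be predicted half a unit further along after half a second of lag.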
We will also look into making the visual cues presented to participants at the installation a bit more expressive. Participants have found the current fluid-based distortion perhaps too subtle for the desired effect of power and control, so we will try to make the graphics a bit more impressive.


The B-wind! setup has to be flexible to adapt to different exhibition spaces. Our recent experience taught us which parts need to be more flexible, both at the hardware and software level.
B-wind! hardware support needs to be more flexible to allow us to adapt to different room sizes and projection surfaces. This normally means using more than one camera for user input and more than one projector for graphics output.
We will also soon launch a project to kickstart the RTiVISS surveillance kit, and we will try to redesign the remote part of B-wind! to use it for remote wind generation and video acquisition.
At the software level, we will focus on making the code more modular and implementing external configuration and tweaking of setup parameters.
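To give an idea of what external configuration could look like, here is a minimal Python sketch that merges exhibition-specific overrides onto built-in defaults; the parameter names below are hypothetical placeholders, not the actual setup parameters.

```python
import json

# Hypothetical setup parameters, for illustration only.
DEFAULTS = {
    "projection_surfaces": 1,
    "cameras": 1,
    "fan_channels": 4,
    "tracking_threshold": 0.2,
}

def merge_setup(overrides):
    """Merge per-venue overrides onto the defaults, rejecting unknown
    keys so a typo in a config file fails loudly at setup time."""
    unknown = set(overrides) - set(DEFAULTS)
    if unknown:
        raise KeyError(f"unknown setup parameters: {sorted(unknown)}")
    return {**DEFAULTS, **overrides}

def load_setup(path):
    """Load exhibition-specific overrides from a JSON file and merge them."""
    with open(path) as f:
        return merge_setup(json.load(f))
```

The point of the hard failure on unknown keys is that setup time at an exhibition space is exactly when you want misconfiguration to surface, not mid-show.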


B-wind! has a lot of moving parts. Getting all of them to work together has been a tough but rewarding challenge. Now that everything is working, we need to optimize it to run really fast. Modern hardware has a lot of computing power embedded both in the CPU cores and the GPU, and currently we're not making much use of it. We will work to better separate parts of the code that can be threaded effectively and to make better use of the graphics and stream processing power of the GPU.
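As a toy illustration of the kind of separation we have in mind, the sketch below decouples a capture stage from a processing stage with a bounded queue, so each can run concurrently. The real pipeline lives in openFrameworks, so treat this as the shape of the idea, not an implementation.

```python
import queue
import threading

def run_pipeline(frames, process):
    """Feed frames from a producer into a consumer thread through a
    bounded queue, so a momentarily slow stage applies backpressure
    instead of piling up unbounded work."""
    q = queue.Queue(maxsize=4)
    results = []

    def consumer():
        while True:
            frame = q.get()
            if frame is None:          # sentinel: capture finished
                break
            results.append(process(frame))

    t = threading.Thread(target=consumer)
    t.start()
    for frame in frames:               # stands in for the capture loop
        q.put(frame)
    q.put(None)
    t.join()
    return results
```

In the installation the stages would be video decoding, motion tracking, and graphics, each holding a core while the GPU handles the fluid and warping work.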

Uff, that's a lot of ground to cover :) Stay tuned for B-wind! updates in the near future.


B-wind! @ O Espaço do Tempo v1.0

This is the first major release of the RTiVISS experience B-wind!, just in time for the AZ Labs exhibition @ O Espaço do Tempo. It reflects the work done over the two residency periods, and adaptations to fit available resources and space during the installation setup.

V1.0 includes multiple projection screens, inviting visitors to flow across the installation space. It uses the available wired local network to cut streaming video latency, and updated fan control code that drives an Arduino-based DMX controller for the array of fans. We also set up a camera to capture the installation space and broadcast it live online for project documentation and remote maintenance.
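For the curious: a DMX512 frame is just a zero start code followed by up to 512 channel bytes. A hedged sketch of how per-fan wind intensities could be packed into such a frame (the channel layout here is hypothetical, not our controller's actual mapping):

```python
def dmx_frame(fan_levels, first_channel=1):
    """Build a 513-byte DMX512 frame (start code plus 512 channel slots)
    with each fan's wind intensity (0.0 to 1.0) written to consecutive
    dimmer channels starting at `first_channel`."""
    frame = bytearray(513)             # slot 0 is the DMX start code (0x00)
    for i, level in enumerate(fan_levels):
        # clamp and scale intensity to the 0-255 DMX channel range
        frame[first_channel + i] = max(0, min(255, round(level * 255)))
    return bytes(frame)
```

The Arduino side would then shift this frame out over RS-485 at DMX timing; on the PC side one frame per control update is enough for fans, which react far more slowly than stage lights.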
Since the exhibition space is part of a convent surrounded by trees inside the walls of a castle, we managed to set up the remote location part of the installation just outside the exhibition space. This allowed us to optimize resource use — wired ethernet, power cables and DMX control equipment — and to make sure that sensitive equipment, like the IP camera, was kept inside the exhibition space at all times.
Having the two spaces so close together had an interesting effect. The fans rotating outside aroused curiosity that brought people to experience the interactive installation, and the experience of controlling a space so close by inspired people to go outside and see the trees and the fans working.

Now the exhibition is over, but B-wind! development doesn't stop here: we still have loads of information to process (ouch!).
The whole development experience was very intense and we certainly learned a lot from it. We also got some very good information from the feedback of installation users, and from the video recordings of their interactions.
We're already hard at work improving the project for future presentations. Stay tuned for the upcoming B-wind! 2.0!


B-wind v0.2 system architecture map

This is the updated system architecture for B-wind!, developed together with team members Pedro and Maurício during the second half of the AZ residency. Nodes marked with dashed lines represent features that we're planning to develop in the next stage.

Currently, the system involves four main components:

  1. Real-time video capture and streaming at a remote location using a wireless IP Camera. The video stream is processed in openFrameworks using a custom GStreamer pipeline.

  2. Real-time motion tracking of the user at the installation, implemented using OpenCV.

  3. A working prototype of an Arduino based fan controller that can turn on and rotate a 12v fan according to motion tracking.

  4. Particle effects generated from the user's performance at the local installation, where motion information is used to displace a fluid simulation grid which is overlaid on the streamed video of the remote location.
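The motion-to-fluid coupling in component 4 can be sketched as a tiny toy: per-cell frame differencing drives a displacement grid that slowly relaxes, and that grid would later warp the streamed video. This is an illustrative sketch in plain Python, not the openFrameworks/OpenCV implementation.

```python
def update_displacement(grid, prev_frame, frame, gain=0.5, decay=0.9):
    """One update step of a toy displacement field. `grid`, `prev_frame`
    and `frame` are equal-sized 2D lists; motion energy (absolute frame
    difference) is injected with `gain`, while `decay` makes the field
    relax back toward rest, like a fluid settling."""
    new = []
    for row_g, row_p, row_f in zip(grid, prev_frame, frame):
        new.append([
            g * decay + gain * abs(f - p)
            for g, p, f in zip(row_g, row_p, row_f)
        ])
    return new
```

In the real system the equivalent field feeds both the video warp and the particle emission, so strong gestures leave a visible wake in the remote scene.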

We still have to hurry: there's work to be done to adapt the system to the available technical resources and to the characteristics of the exhibition space. Go B-wind! team, go!


B-wind v0.1 system architecture map

Here's the first version of the system architecture for B-wind!, created during the first half of the AZ residency with feedback from Pedro, Maurício, Sérgio and Jorge; dashed-line nodes will be developed in the next stage!