CBA Product Demo

We recently completed the first stage of the CBA Product Demo application, which was displayed at CeBIT Sydney, 5-7 May 2014. The application is designed to showcase the latest innovative products from CBA (the Commonwealth Bank of Australia). It currently showcases the bank's secure payment devices Albert, Leo and Emmy, along with its Daily IQ iPad application.

Product Demo Three Way Split

The application in its current state has been developed for display via a single projector at a resolution of 1920×1080 and a physical size of 7 × 4 meters. Because of this large display area, the application can divide itself into three separate areas, effectively spawning two new instances of the application that run completely independently of the first (a sketch of this split appears below the captions).
However, the next stage of development will require the application to support up to four instances at any given time, displayed across two edge-blended projectors at a combined resolution of 3648×1080 due to the reduced projector throw distance.
3D product explorer of the CBA Albert device running in single-instance mode.
3D product explorer of the CBA Leo device running in single-instance mode.
3D carousel showcasing CBA’s Daily IQ iPad application running in single-instance mode.
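
To make the split concrete, here is a minimal sketch (all names hypothetical, not our production code) of dividing the full canvas into equal side-by-side viewports, one per instance:

    import flash.geom.Rectangle;

    // Divide the full projector canvas into N side-by-side viewport
    // rectangles, one per application instance.
    function splitViewports(totalWidth:int, totalHeight:int, instances:int):Vector.<Rectangle>
    {
        var viewports:Vector.<Rectangle> = new Vector.<Rectangle>();
        var instanceWidth:int = totalWidth / instances; // 1920 / 3 = 640 px each
        for (var i:int = 0; i < instances; i++)
            viewports.push(new Rectangle(i * instanceWidth, 0, instanceWidth, totalHeight));
        return viewports;
    }

    var threeWay:Vector.<Rectangle> = splitViewports(1920, 1080, 3); // current single projector
    var fourWay:Vector.<Rectangle>  = splitViewports(3648, 1080, 4); // next stage, edge blended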

Technology

During the prototyping phase of the project we experimented with a Microsoft Kinect as an input device. It worked well; however, since we were only really interested in hand movement in this case, we found the Leap Motion gave us more accurate control.
The Brix i7-477 was selected to run the whole experience. This little guy fits in the palm of your hand and still runs our application in full HD at a silky-smooth 60 fps.

The Leap Motion is an infrared sensor that supports hand and finger motions as input and does not require any physical contact. This allows the user to interact with the content simply by waving their hand in front of them. logotype’s open-source native extension was used to hook into the Leap Motion data.
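
Hooking into the sensor data through the extension looks roughly like this (a simplified sketch based on the library's examples; exact class and event names may vary between versions):

    import com.leapmotion.leap.Controller;
    import com.leapmotion.leap.Frame;
    import com.leapmotion.leap.Vector3;
    import com.leapmotion.leap.events.LeapEvent;

    var controller:Controller = new Controller();
    controller.addEventListener(LeapEvent.LEAPMOTION_FRAME, onFrame);

    function onFrame(event:LeapEvent):void
    {
        var frame:Frame = event.frame;
        if (frame.hands.length == 0)
            return;
        // Palm position in millimetres, relative to the centre of the sensor.
        var palm:Vector3 = frame.hands[0].palmPosition;
        // ...map palm.x / palm.y to the on-screen "virtual cursor" here.
    }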

We’ll be putting together a proper showcase video within the next few days, but in the meantime here is a short compilation video of the testing and deployment of both the “Product Demo” and “Bit Data” apps, automatically created by Google+’s “Auto Awesome” video creator.

Challenges

Headless Display

Annoyingly, it seems that the Intel HD family of GPUs (which the Brix utilize) suffer from an issue where the GPU doesn’t run unless a monitor is plugged into the PC. This means that if you remote into a PC with one of these GPUs while it isn’t connected to a monitor, all you see is a black screen, which is a problem since we remote in from time to time to make updates or provide support. Because two of the PCs used in this experience are only used to relay the Leap Motion sensor data (and therefore run without monitors), the issue caused our applications to fail to initialize.

Amazingly, there appears to be no software solution to this issue. At the time of writing, the only way to resolve it is to either attach a monitor or use a dummy VGA or HDMI adaptor, as seen above.

Multiple App Instances

From a development perspective, one of the most challenging aspects of this project has been implementing multiple isolated instances of the experience within the one AIR application. We achieved this by creating multiple robotlegs contexts and assigning each one its own Stage3D context. The biggest limitation of this approach is that only four Stage3D contexts are available when publishing a desktop application. Interestingly, you can run two applications at the same time that each use four Stage3D contexts, so it’s obviously not a hardware limitation; rather, it has probably been put there as a safeguard against overloading low-end machines.
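
Boiled down, the approach looks something like the sketch below (robotlegs 2 style; the sizes and helper names are illustrative, not our production code):

    import flash.display.Sprite;
    import flash.display.Stage3D;
    import flash.events.Event;
    import robotlegs.bender.bundles.mvcs.MVCSBundle;
    import robotlegs.bender.extensions.contextView.ContextView;
    import robotlegs.bender.framework.api.IContext;
    import robotlegs.bender.framework.impl.Context;

    // One Stage3D context plus one robotlegs context per instance.
    // root must already be on the display list; desktop AIR caps
    // stage.stage3Ds at four.
    function createInstance(index:int, root:Sprite, width:int, height:int):void
    {
        var stage3D:Stage3D = root.stage.stage3Ds[index];
        stage3D.x = index * width; // shift this instance's viewport sideways
        stage3D.addEventListener(Event.CONTEXT3D_CREATE, function(e:Event):void
        {
            stage3D.context3D.configureBackBuffer(width, height, 0, true);

            // A separate robotlegs context keeps the instance fully isolated.
            var view:Sprite = new Sprite();
            root.addChild(view);
            var context:IContext = new Context()
                .install(MVCSBundle)
                .configure(new ContextView(view));
        });
        stage3D.requestContext3D();
    }

    // Three isolated instances across a 1920 px wide stage.
    for (var i:int = 0; i < 3; i++)
        createInstance(i, this, 640, 1080);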

Leap Motion Positioning

After watching people use the application for the duration of the CeBIT exhibition, it became evident that some people struggled with the new form of interaction. Part of this may have been due to the placement of the Leap Motion, about half a meter in front of the user at roughly waist height. This was done so that when the user held their hand directly in front of them, the “virtual cursor” would appear in the middle of the screen, giving them the freedom to move in any direction without leaving the sensor’s field of view. However, a good proportion of users couldn’t resist putting their hand a few centimeters from the sensor, which resulted in the “virtual cursor” being positioned off screen. Since the exhibition, an exponential remapping of the “virtual cursor” position has been added: the further the cursor is from the center, the less each movement shifts it, so the cursor always stays on screen.
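
The remap itself is small; here is a sketch with illustrative constants (palmX stands in for the hand’s horizontal offset reported by the sensor):

    // Squash a raw offset so the mapped cursor asymptotically approaches,
    // but never passes, maxOffset. 1 - e^(-k|x|) rises quickly near the
    // centre and flattens towards 1 further out.
    function remapAxis(raw:Number, maxOffset:Number, falloff:Number):Number
    {
        var sign:Number = raw < 0 ? -1 : 1;
        return sign * maxOffset * (1 - Math.exp(-falloff * Math.abs(raw)));
    }

    // e.g. palm x offset in millimetres to a cursor x that stays
    // within a 1920 px wide screen (centre at 960).
    var cursorX:Number = 960 + remapAxis(palmX, 960, 0.01);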

Digital Credits

Jamie Foulston – Art Direction
Lee Hung Nguyen – Digital Design
Lyndon Hale – Creative Direction
Pete Shand – Lead Developer
Michal Moczynski – Support Dev / Testing
Jonathan Kafkaris – Support Dev / Testing
Jon Rout – Digital Producer
Jake Soper – Digital Producer