Big Life & VR (Legacy)

Public Health Multi-Platform

Big Life started out as a simple set of calculators: you can go online and calculate your sodium intake, or your life expectancy. Or both. But when the Ottawa Hospital Research Institute approached us about doing a redesign, a whole ecosystem was born.

Prototypes are some of our favourite things to build. We provided deep design research and insights showing how to surface and track user data using a combination of a responsive website, mobile applications and sensor inputs. We didn’t want to overwhelm users with lots of itty-bitty data points, so we kept the calculator simple, engaging and easy to use. Ultimately, we provided a full range of services, from early concepting and storyboards all the way through to defining the interaction design and visual tone for the brand.

It’s true – measuring sodium and thinking about life expectancy is pretty serious business. But that doesn’t mean the tools have to be.

An Immersive Narrative Experience (R&D)

This project combines our passion for narrative with our expertise in virtual reality and emerging technologies. It brings together virtual reality, positional/motion tracking, volumetric depth mapping, voxels and 3D modelling in a fully immersive experience. The viewer (donning a VR headset) follows a single storyline, but can see and feel it from different characters’ points of view.

To allow seamless ‘mind swapping’ between the characters, we developed an interactive targeting system that enables instantaneous swaps without overtly gamifying the content. One thing we have become aware of through this exciting project: the sheer volume of data makes the experience expensive to run, so part of our approach is to pare down the requirements until it runs well on lower-end hardware – the democratization of VR!

The experience is also intended to provoke discussion around the use of VR systems for everyday work and play: how, and with what, would the wearer interact? Would gesture-based interactions play a deep role? How do we get around the lack of tactile feedback? How do we convey an object’s position within a three-dimensional depth of field? While we do not have definitive answers yet, exploring the medium brings us ever nearer.
