Leap Motion was the first company to bring a high-precision gesture-tracking device to the consumer market. Gravillux was an early interactive art experience for iOS, designed by Scott Snibbe. The question was how to synthesize a new gesture-driven experience that leveraged the best of both.
Our challenge was to design a gesture-responsive interface so users could intuitively interact with the app’s settings, define the appearance of their preferred star field, and have fun creating their own gravity-inspired artworks.
Gravillux was originally developed as an interactive artwork with interaction performed using a computer mouse or touch pen. The software starts with a Cartesian grid that transforms into organic systems reminiscent of galaxies or subatomic particles. As people use their hands to “push” and “pull” on invisible gravitational lines, the stars fold into different formations, generating complex patterns that twist, wrap or explode across the screen.
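The core mechanic described above can be sketched in a few lines. This is a minimal illustration, not Gravillux's actual implementation: the grid dimensions, softening constant, and inverse-square force law are all assumptions made for the sake of the example.

```python
import math

GRID_W, GRID_H = 8, 8
SPACING = 10.0

def make_grid():
    """Particles begin at rest on a Cartesian grid."""
    return [{"x": i * SPACING, "y": j * SPACING, "vx": 0.0, "vy": 0.0}
            for i in range(GRID_W) for j in range(GRID_H)]

def apply_gravity(particles, cx, cy, strength=50.0, dt=0.1):
    """One integration step: inverse-square pull toward the hotspot
    at (cx, cy). A negative strength repels instead of attracts."""
    for p in particles:
        dx, dy = cx - p["x"], cy - p["y"]
        dist2 = dx * dx + dy * dy + 1e-6  # soften to avoid division by zero
        dist = math.sqrt(dist2)
        accel = strength / dist2
        p["vx"] += accel * dx / dist * dt
        p["vy"] += accel * dy / dist * dt
        p["x"] += p["vx"] * dt
        p["y"] += p["vy"] * dt

# "Pushing" on the grid: run a few steps with a hotspot near the center.
particles = make_grid()
for _ in range(100):
    apply_gravity(particles, cx=35.0, cy=35.0)
```

Because velocity accumulates each step, the particles overshoot the hotspot and swing back, which is what produces the twisting, orbit-like patterns rather than a static collapse.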
I transformed Gravillux into an app for Leap Motion by developing an interface that could interpret hand gestures. Users could select the grid density, choose different colors, and even pick a track from their music library, which made the particles dance in time to the music.
Making particles move in response to hand gestures feels uniquely empowering. The patterns appear to map invisible forces that flow from the users’ fingertips. The application was distributed on the Leap App store and helped introduce thousands of new users to the magic of gesture-controlled interfaces.
Our approach was to onboard new users with gestures they would make instinctively, then broaden the range as they gained confidence and got deeper into the experience. At the start, users point and tap their fingers to choose the colors and density of their star field.
Once the grid is set up, they can shape it using one or both hands. Pushing a finger forward generates gravitational hotspots; pulling back repels the particles outward. The system also recognizes open and closed fists. Swiping an open hand empties the screen altogether, resetting the canvas for another round.
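The gestural vocabulary above amounts to a small mapping from recognized gestures to canvas actions. The sketch below is illustrative only: the gesture names, the flat dictionary input, and the forward/back velocity threshold are assumptions, and the real Leap Motion SDK delivers far richer hand and finger frames than this.

```python
def interpret(gesture):
    """Map a recognized gesture to a canvas action.

    `gesture` is an assumed simplified dict with a 'type' key and,
    for finger gestures, 'x', 'y', and 'z_velocity' (positive means
    the finger is pushing toward the screen).
    """
    if gesture["type"] == "finger" and gesture["z_velocity"] > 0:
        # Finger pushed forward: create an attracting hotspot.
        return ("attract", gesture["x"], gesture["y"])
    if gesture["type"] == "finger" and gesture["z_velocity"] < 0:
        # Finger pulled back: repel particles outward from that point.
        return ("repel", gesture["x"], gesture["y"])
    if gesture["type"] == "open_hand_swipe":
        # Open-hand swipe: clear the screen and reset the grid.
        return ("reset", None, None)
    return ("none", None, None)
```

Keeping the mapping this small is what makes the onboarding strategy work: a first-time user only ever needs one gesture at a time, and new ones can be added to the dispatch without disturbing the rest.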
As Creative Director on the project, I was responsible for the final look and feel of the app. I designed the gestural language and the interface that explains how to use each gesture, and I directed several user-testing sessions that helped the engineers improve the performance and usability of the gestural interface.