Do you remember the way Tom Cruise, playing John Anderton in Minority Report, used to explore the files of his investigations?
Here is a short reminder:
The interaction John has with the images on that impressive holographic screen is really cool: it's as if the computer could read his mind!
More than once I have spoken with customers who had been told that this type of navigation was possible.
The main problem I face in those situations is the feeling, indeed the certainty, that these customers have been deluded by someone just trying to sell them a service. They have been victims of a deception, and bringing back down to earth someone who has been flying on the wings of that deception is as thankless a job as telling a four-year-old that Santa Claus doesn't exist.
Unfortunately, Minority Report is a film: all those images were added in post-production based on a script, and Tom Cruise was looking at a green screen while performing those actions. Yes, it's a sad reality, but it's true.
After this necessary premise, I want to tell you how, in the real world, my team and I managed to achieve
a human-machine interaction without any visible device.
Thanks to a dear friend of mine, Carlo Basci, the Roncaglia Group came to me with the idea of moving the new Dyson V11 on a 4-meter screen without any physical interface. A beautiful challenge, which I could not resist. The goal was to let the audience interact with a 3D model rendered in real time on a big screen: an innovative and very effective way to present a new product while involving the public.
It took weeks of research and evaluation, since there are several different systems that can achieve this result. We considered the Kinect, other cameras, and even programming the gesture recognition from scratch. When evaluating this type of installation, it's tempting to think you can do everything "in house". That is certainly possible, but programming a gesture engine from scratch takes months of development hours and very specific, and therefore very expensive, coding skills.
The first hurdle is accepting that, to stay within a reasonable budget for a temporary installation, you need to rely on off-the-shelf solutions.
Having accepted this, you have to focus on which products are out there and on the most effective ways to combine them. Creativity, then, becomes knowing how to combine things that already exist, with all their limits and constraints, rather than imagining how they should work in a perfect world while forgetting the time (and therefore money) needed to build something.
The ideal tool seemed to be the Microsoft Kinect.
The Kinect libraries offer an excellent analysis of a person's movements, and several libraries can interpret gestures and pass the data on to the navigation of a rendering engine. However, we discarded this solution for a single, decisive problem: you cannot choose which person, among those in the Kinect's field of view, the system should lock onto. Since our aim was an experience dedicated to one person at a time, this characteristic led us to rule out this approach. We also evaluated other devices, but their behavior turned out to be very similar to the Kinect's. The main reason may be that all consumer devices are designed for the living room and therefore aren't meant to work in crowded places.
At the end of this winding path, the choice fell on the Christie Airscan.
It is a device of industrial derivation that, through a laser scan, creates a touch surface "in the air". Once collected, the touch data is passed to software, developed by my friends at Hotminds, which loads the model of the Dyson V11 and renders it in real time. The software we used is called FLY; it was created for industrial applications and is used by our customers as a sales tool at trade fairs where it is impossible to bring large machinery. Thanks to FLY, exhibitors can illustrate a product interactively, helping the customer understand the processes inside the machines and appreciate all the details and their complexity.
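To give a rough idea of how "in-air" touch data can drive the navigation of a rendered model, here is a minimal sketch of an orbit controller that turns touch-drag deltas into yaw and pitch angles. The event format, class names, and sensitivity value are all my own assumptions for illustration; this is not the actual Airscan or FLY API.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class TouchEvent:
    """A touch point on the in-air surface, normalized to 0..1 on each axis."""
    x: float
    y: float


class OrbitController:
    """Converts successive touch positions into yaw/pitch for a 3D model."""

    def __init__(self, sensitivity_deg: float = 180.0):
        self.sensitivity = sensitivity_deg  # degrees per full-surface drag
        self.yaw = 0.0
        self.pitch = 0.0
        self._last: Optional[TouchEvent] = None

    def on_touch(self, ev: TouchEvent) -> None:
        # Rotate by the delta from the previous touch sample, if any.
        if self._last is not None:
            self.yaw += (ev.x - self._last.x) * self.sensitivity
            self.pitch += (ev.y - self._last.y) * self.sensitivity
            # Clamp pitch so the camera never flips over the model.
            self.pitch = max(-89.0, min(89.0, self.pitch))
        self._last = ev

    def on_release(self) -> None:
        # Forget the last position so the next touch doesn't jump the model.
        self._last = None
```

The key detail is resetting the last-seen position on release: with a laser-scanned surface there is no physical contact, so a visitor's hand re-entering the plane somewhere else must not make the model leap.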
The final result of the installation is this:
It certainly isn't the marvelous interaction of Minority Report but, considering that this is real life, I'll let you judge whether we did a good job.
Enjoy the video!