As I was jumping around with excitement (wasn't really jumping around .. okay, maybe a bit), out came this little black-grey thingamajig that kinda reminds you of a smaller iPhone, purely because of the cleanness of its design. Since Kim had already installed all the software needed from their website, I got to exploring the capabilities of this little box that promises to revolutionise the way we interact with our computers. Actually, the entire description of the product reminds me of something Apple would say to sell you a phone that does something other phones have been doing for years already, like making phone calls or taking pictures.
But nonetheless, this little box managed to put a smile on my face once I started to play around with it in Google Earth. It took me back a good few years, to the point when I would imagine my palm being a plane or a helicopter, but this time I actually had visual feedback on my movements. Thanks to Google Earth's depth of field for major cities, I was dodging buildings, flying along the streets of New York and shooting random stuff using my thumb as a trigger (I really got into the whole being-a-kid thing).
Fiddling around with it
It actually took a bit of time to get used to the smoothness of the control and to stop the world from completing 365 rotations in 3.7 seconds, but once I got the hang of it, controlling the application made a lot of sense and was, as I've mentioned, pretty enjoyable.
The first thing that came to mind after playing around for a few minutes was how cool it would be to control an AR Drone using just your hand. Apparently I wasn't the only one with that idea; check these guys out: AR Drone with Leap Motion. But I would still love to do it myself; it looks like fun, and it wouldn't be that hard to implement.
I enjoyed the tutorials that come with the software, especially the one where you get to play around with a mass of light and colour. As I am easily impressed, I kept playing with it for an insane amount of time just to see how it interpreted the movements I was making.
Some of its features
After fiddling with it for a while, the tracking system started to remind me of the Kinect sensor, in the sense that it worked really well as long as certain conditions were fulfilled. The Kinect sensor behaves best when the person is standing directly in front of it, and it is not that good at tracking someone who turns 90 degrees and is seen from the side. The same goes for the Leap Motion: if the hand is held at a steeper angle, the sensor has difficulty tracking fingers, which makes sense when you consider how the whole thing is built. It does not recognise whether the hand in its field of view is a left or a right hand, and it sometimes loses one or more fingers from tracking, but otherwise it has some nice features.
The user can work with one hand, two hands or even a pointing tool that the software recognises. Scaling can be done with one hand by hovering it at different heights above the sensor, or with two hands by bringing them closer together or placing them further apart, which is a pretty cool feature if the application contains 3D objects that need manipulating. A whole bunch of other gestures are implemented too, such as rotating, swiping, tapping the screen and clicking a keyboard, all of which could come in handy for interactive applications.
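Just to illustrate the idea behind the two-hand scaling gesture (this is my own rough sketch in plain Python, not the Leap SDK's actual API — the function names and the palm positions below are made up for the example), the software essentially has to turn the changing distance between your palms into a scale factor:

```python
import math

def distance(p1, p2):
    """Euclidean distance between two 3D points (x, y, z) in millimetres."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p1, p2)))

def scale_factor(baseline_hands, current_hands):
    """Two-hand scaling: moving the hands apart scales up,
    bringing them together scales down. Each argument is a pair
    of palm positions, one per hand."""
    baseline = distance(*baseline_hands)
    current = distance(*current_hands)
    return current / baseline

# Hands start 100 mm apart, then move to 200 mm apart:
start = ((-50.0, 200.0, 0.0), (50.0, 200.0, 0.0))
later = ((-100.0, 200.0, 0.0), (100.0, 200.0, 0.0))
print(scale_factor(start, later))  # 2.0, i.e. the object doubles in size
```

The real tracking data would of course come in per frame from the sensor, but the mapping from hand distance to object size is this simple at its core.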
The waiting begins
Since I wanted to learn a bit of Python, I downloaded the SDK and started checking out the sample provided on their website, which contains all the information needed to start developing your own applications. As far as I'm concerned, I can't wait to try out some ideas, and hopefully some great projects will come as soon as possible for this little box filled with promises and possibilities.