Leap Motion. Truth be told. Your finger is not a mouse.

Sorry to disappoint, but I just did some experimentation and I am pretty sure about this now: my finger is not a mouse. Fingers are great for quickly pointing to a vague area, and they are perfect for gestures, but they are not that great for hitting a tiny, Retina-sized area on the screen with pixel precision. Sadly.

R.I.P. my “Miraculous Mouse” App (I even had a name …)

The very first thing I wanted to do with the LeapMotion was to build a mouse wrapper tool that emulates the mouse and its Mac OS X gestures. And surprise: looking at what others are doing with their Leap dev kit, this idea turns out not to be that original. Nevermind. Won’t stop me. I opened Xcode and started coding.

The Basic UX Concept

Priority: obviousness. The mouse pointer changes its size according to the distance between the pointing finger and the screen. Clicks are triggered by tapping towards the screen. A drag starts by holding the pointing finger close to the screen and ends by pulling it back. As visual feedback, the mouse cursor turns red to signal a click event, or a lighter red when the finger gets close to the “click area”.
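The feedback states above could be sketched roughly like this. This is a minimal illustration with made-up thresholds and cursor sizes (`CLICK_Z`, `NEAR_Z` and the size formula are my assumptions), not actual Leap SDK code:

```python
# Hypothetical cursor-feedback mapping: distances are assumed values in mm,
# not anything from the Leap SDK.
CLICK_Z = 30.0   # finger closer than this to the screen plane -> click zone
NEAR_Z = 80.0    # closer than this -> "getting close", cursor turns light red

def cursor_state(finger_z_mm):
    """Map finger distance to the screen plane -> (color, action, size_px)."""
    # Cursor grows with distance, hinting how far away the finger is.
    size = min(64.0, 16.0 + finger_z_mm / 5.0)
    if finger_z_mm <= CLICK_Z:
        return ("red", "click", size)        # full red: click triggered
    if finger_z_mm <= NEAR_Z:
        return ("light-red", "near-click", size)
    return ("default", "point", size)
```

The single distance-to-state function keeps the feedback rules in one place, so the thresholds are easy to tune during testing.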

Before I went ahead with the trickier features, I ran some basic tests to validate the usability of my brand-new finger-based pointing device.

Shattered Dreams

Some tests with a quickly written Leap-controlled mouse pointer uncovered some flaws in my plan to disrupt the mouse industry with a swipe.

All three tests were basically disappointing. Please go to Vimeo to watch the video in HD.

  1. Keeping the pointer steadily over a small area on the screen → The unfiltered signal from the Leap comes with some noise/shakiness. Most interestingly, this is mainly caused by my finger’s inability to hold still on a pixel level, not by any tracking noise from the LeapMotion device.
  2. Quickly focusing on a spot on the screen → Takes longer than expected; significantly longer than with a mouse or trackpad. Again there are issues with my finger’s shakiness. Once again I prove to be human.
  3. Performing a potential click gesture by pushing towards the screen → When moving along the z-axis, x and y are affected as well. Completely locking x/y while moving along one axis is not humanly possible, so additional filtering will be needed to achieve this programmatically.
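The shakiness from tests 1 and 2 can be damped in software. A minimal sketch of one common approach, an exponential moving average over the raw x/y samples (my own illustration, not Leap SDK code; the `alpha` value is an assumption to be tuned):

```python
class PointerSmoother:
    """Exponential moving average over raw (x, y) tracking samples.

    alpha close to 0 -> heavy smoothing (steadier pointer, more lag);
    alpha close to 1 -> light smoothing (snappier pointer, more shake).
    """

    def __init__(self, alpha=0.2):
        self.alpha = alpha
        self._x = None
        self._y = None

    def update(self, raw_x, raw_y):
        if self._x is None:                     # first sample: no history yet
            self._x, self._y = raw_x, raw_y
        else:
            self._x += self.alpha * (raw_x - self._x)
            self._y += self.alpha * (raw_y - self._y)
        return self._x, self._y
```

The trade-off is inherent: the more the jitter is averaged away, the more the pointer lags behind a fast deliberate movement, which is exactly why tuning `alpha` against real finger input matters.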

And Here’s the Good News

All three issues can be solved or significantly improved programmatically: noise filtering, averaging, more advanced gesture-recognition methods, and so on. It’s not the LeapMotion but the noise introduced by human factors that needs to be balanced out.
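For issue 3, one possible approach (my assumption, not anything from the Leap SDK) is to freeze the x/y position as soon as a fast forward movement along the z-axis is detected, so the unintended lateral drift during the push is simply ignored:

```python
# Assumed threshold: how fast (mm/s) the finger must move towards the
# screen before we treat the movement as a push.
PUSH_VELOCITY = 120.0

class ClickDetector:
    """Freeze x/y while the finger is pushing fast along the z-axis."""

    def __init__(self, push_velocity=PUSH_VELOCITY):
        self.push_velocity = push_velocity
        self.locked_xy = None

    def update(self, x, y, z_velocity):
        """Negative z_velocity means movement towards the screen."""
        pushing = z_velocity < -self.push_velocity
        if pushing and self.locked_xy is None:
            self.locked_xy = (x, y)       # lock position at push start
        elif not pushing:
            self.locked_xy = None         # push over: follow the finger again
        return (x, y) if self.locked_xy is None else self.locked_xy
```

In practice this would sit behind the smoothing stage, so the click lands where the finger was aiming before the push began.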

Found Wisdom

In the end it comes down to your finger. Your arm. Your muscles. Achieving the same speed and accuracy of a mouse with your finger? WON’T HAPPEN. This also means that my mouse wrapper app might not be such a good idea in the end.

I need to switch the concept from a mouse emulation to something that removes the need for a mouse at a more basic level. Thinking about gestures for resizing and moving windows directly without explicit pointing. Maybe even a good way to combine the keyboard and the Leap in a meaningful way.

I am convinced that gestures are the future, especially ones with implicit spatial awareness. I am sure physics games will be great fun, too. However, I am still focusing on use cases that can provide real-life productivity enhancements.

Experimentation will continue.

Project 12.02.2013

Leap Motion. Put your hands up in the air!

I just received my Leap Motion development unit. I AM excited. Now the one thing I really need is loads of time to dig deep into the framework and come back with some fun use cases.


Very First Impressions

Tracking Area

In contrast to e.g. the Kinect, LeapMotion tracks the interaction space from below your hands. This makes its usage area quite unique.

⊕ Perfect for getting precise pointing information to a screen in front of it. Hey, that sounds quite useful.

⊕ Tracking the angle of the pointing object.

⊕ Tracking multiple objects simultaneously. Get ready for your 10 fingers independently gesturing.

⊖ Naturally it cannot track occluded objects, e.g. crossed fingers or overlapping hands. This is no surprise, but it turns out to be a serious limitation – think of making a zooming/pinching gesture towards the screen: tracking that will most likely fail or be very unreliable.


Responsiveness

This is the most impressive aspect. I don’t have the final hardware/software version, but it already is FAST. It is definitely the most responsive 3D tracking I’ve seen so far. The processed response on screen is almost immediate, with much less latency than e.g. the Kinect.

This is really great to see, as responsiveness will definitely be crucial for LeapMotion’s acceptance and efficiency.


Precision

My first tests show that the Leap is astoundingly precise. I would compare its ability to quickly point at a very specific area on the screen with a trackpad’s – maybe a bit slower, but much more immediate. Depending on the chosen tracking targets (fingers, palm, normalized center, etc.) the pointer can flicker a bit, but that seems to be controllable with more advanced handling in software. Again compared to the Kinect, you’ll see much less tracking noise. Arm fatigue might become an issue, but I need to spend more time testing to find out.


Hardware & Software

Obviously hardware and software go hand in hand to make the LeapMotion a usable input device. My first impression hints at a very well designed piece of hardware. The software and SDK still seem to have some rougher edges, but they are evolving rapidly. More details on this once I’ve had some more hands-on experience with the code.

Project 10.02.2013