Sorry to disappoint, but I just did some experimentation and I am pretty sure about this now: my finger is not a mouse. Fingers are great for quickly pointing to a vague area, and they are perfect for gestures, but they are not that glorious for pointing at a tiny Retina-sized area on the screen with pixel precision. Sadly.
R.I.P. my “Miraculous Mouse” App (I even had a name …)
The very first thing I wanted to do with the LeapMotion was to build a mouse wrapper tool that emulates the mouse and its Mac OS X gestures. And surprise: looking at what others are doing with their Leap dev kits, this idea turns out to be not that original. Never mind. Won’t stop me. I opened Xcode and started coding.
The Basic UX Concept
Priority: obviousness. The mouse pointer changes its size according to the distance of the pointing finger from the screen. Clicks are triggered by tapping towards the screen. Dragging works by holding the pointing finger close to the screen and releasing by pulling it back. As visual feedback, the mouse cursor turns red to signal a click event, or a lighter red when getting close to the “click area”.
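The feedback mapping above can be sketched as a tiny pure function, assuming a finger-to-screen distance in millimetres as input. All names and thresholds here are hypothetical illustrations, not taken from the actual app or the Leap SDK:

```python
# Hypothetical sketch of the cursor feedback described above.
# distance_mm is the assumed finger-to-screen distance input.

CLICK_ZONE = 10.0   # closer than this counts as a click (assumed threshold)
WARN_ZONE = 30.0    # approaching the click area -> lighter red (assumed)

def cursor_feedback(distance_mm, min_size=8, max_size=32, max_range=150.0):
    """Return (cursor_size_px, color) for a given finger distance."""
    # Cursor shrinks as the finger approaches the screen.
    t = max(0.0, min(distance_mm / max_range, 1.0))
    size = round(min_size + t * (max_size - min_size))
    if distance_mm < CLICK_ZONE:
        color = "red"        # click event
    elif distance_mm < WARN_ZONE:
        color = "light-red"  # getting close to the click area
    else:
        color = "default"
    return size, color
```

A real implementation would feed this from the tracker’s per-frame finger position and draw the cursor accordingly; the sketch only captures the size/color mapping.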
Before I went ahead with the trickier features, I did some basic tests to validate the usability of my brand-new finger-based pointing device.
Some tests with a quickly written Leap-controlled mouse pointer uncovered some flaws in my plan to disrupt the mouse industry with a swipe.
All three tests were basically disappointing. Please go to Vimeo to watch the video in HD.
- Keeping the pointer steadily over a small area on the screen → The unfiltered signal from the Leap comes with some noise/shakiness. Most interestingly, this is mainly caused by my finger’s inability to hold still at pixel level, not by any tracking noise from the LeapMotion device.
- Quickly focusing on a spot on the screen → Takes longer than expected, significantly longer than using a mouse or trackpad, again partly due to my finger’s shakiness. Once again I prove to be human.
- Simulating a click gesture by pushing towards the screen → When performing the movement along the z-axis, x and y are affected as well. Completely locking x/y while moving along a single axis by hand is not possible, so additional filtering will be needed to achieve this programmatically.
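The third issue could be filtered out by freezing x/y whenever the finger picks up speed along the z-axis, so a click push doesn’t drag the pointer off target. A minimal sketch, with a made-up speed threshold and class name (nothing here comes from the Leap SDK):

```python
# Hypothetical x/y lock: while the finger moves fast along z (a push),
# keep reporting the last stable x/y so the pointer doesn't drift.

Z_PUSH_SPEED = 0.5  # assumed threshold for "this is a click push", per frame

class PointerLock:
    def __init__(self):
        self.stable_xy = (0.0, 0.0)
        self.prev_z = None

    def update(self, x, y, z):
        """Feed one raw (x, y, z) sample; returns the x/y to display."""
        z_speed = 0.0 if self.prev_z is None else abs(z - self.prev_z)
        self.prev_z = z
        if z_speed > Z_PUSH_SPEED:
            # Fast z movement: hold the pointer at its last stable spot.
            return self.stable_xy
        self.stable_xy = (x, y)
        return self.stable_xy
```

In practice the threshold would need tuning against real hand data, and some hysteresis would help so the lock doesn’t flicker on and off mid-push.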
And Here’s the Good News
All three issues can be solved, or at least significantly improved, in software: adding noise filtering and averaging, using more advanced gesture-recognizer methods, and so on. It’s not the LeapMotion but the noise introduced by human factors that needs to be balanced out.
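The averaging mentioned above could be as simple as an exponential moving average over the raw finger position. A minimal sketch, assuming a stream of raw (x, y) samples per frame (function and parameter names are mine):

```python
# Minimal exponential moving average to damp finger shakiness.
# alpha near 0 -> heavy smoothing (more lag); alpha near 1 -> raw signal.

def smooth_positions(samples, alpha=0.3):
    """Yield smoothed (x, y) points from raw tracker samples."""
    sx = sy = None
    for x, y in samples:
        if sx is None:
            sx, sy = x, y  # seed the filter with the first sample
        else:
            sx = alpha * x + (1 - alpha) * sx
            sy = alpha * y + (1 - alpha) * sy
        yield (sx, sy)
```

The trade-off is the classic one: the more the jitter is smoothed away, the more the cursor lags behind the finger, which is exactly why pixel-precise pointing stays hard.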
In the end it comes down to your finger. Your arm. Your muscles. Achieving the same speed and accuracy as a mouse with your finger? WON’T HAPPEN. This also means that my mouse wrapper app might not be such a good idea after all.
I need to switch the concept from a mouse emulation to something that removes the need for a mouse at a more basic level. Thinking about gestures for resizing and moving windows directly without explicit pointing. Maybe even a good way to combine the keyboard and the Leap in a meaningful way.
I am convinced that gestures are the future, especially ones with implicit spatial awareness. I am sure physics games will be great fun, too. However, I am still focusing on use cases that can provide real-life productivity enhancements.
Experimentation will continue.