Coffee & Synths. In preparation of …


I just prepared the tech setup for next week’s Coffee & Synths session. I’m trying to keep it as minimal and portable as possible, while avoiding computers as a point of failure and distraction. Currently all parts fit in a small basket and can be carried around easily. So the only thing we depend on infrastructure-wise is access to a single power outlet. That’s it.


Main Parts

  • 8 Channel Mixer
  • 4-Channel Stereo Headphone Amp
  • Stereo Audio Recorder

During the session the mixer and the headphone amp will be accessible to everybody. All relevant connectors and channels are color-coded, so you will know immediately which knobs to turn to adjust your input or headphone level.

I can’t wait. This will be great fun. If you are interested and haven’t joined yet, you can do so here.

Coffee & Synths I. You are invited.


Let’s combine the two greatest inventions in the world. Coffee & Synths! Bring your iPhone, OP-1, Animoog, Nintendo DS, MPC or any other mobile music device or app and join our open session!


When

Saturday afternoon, March 16th, 14:00 – 16:00, Agora, Kurz vor Eden Café, Berlin, free. This will be a one-time experiment. But if it works, we might do it again. Who knows.

Participate!

All you need is …

  • Your portable instrument of choice
  • Headphones

Join the event on Facebook

How it works

We meet at the wonderful Agora Café, create random groups of 3 people per session, put headphones on and jam out!

Everyone is welcome. We are all beginners in some way, anyhow.

Everybody brings their own sound device – this can be [insert endless list of mobile music making gadgets here], really anything that produces sound, is small enough to fit on a coffee table and has a headphone jack. We attach all devices to a little mixer so that everybody at the table gets the same mixed signal. Every session lasts exactly 10 minutes, then we form a new random group and repeat. Other people in the room won’t hear a thing, but the session will be recorded, so you can listen to and share the sounds afterwards.

In detail …

  1. All participants meet at 14:00
  2. We form a session of 3 people, randomly chosen from the ones who join
  3. To avoid confusion, in each session at most one participant takes the rhythm/drums part. He/she defines the tempo of the session.
  4. We connect everyone’s device to the mixer.
  5. Everyone puts on their headphones and adjusts their volumes.
  6. We start the clock.
  7. ||| Play ||| No style defined. Noise, ambient, electronica – anything can happen. All live and improvised.
  8. The session ends after exactly 10 minutes. There will be a countdown clock on the table, visible to everyone.
  9. From all people who did not participate in the session, again 3 people are chosen randomly for the next one.
  10. We repeat steps 2–9 until it’s 16:00.
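Just for fun, the group rotation in the steps above can be sketched in a few lines of Python. The names are made up, and the fairness rule – the three people who have waited longest go next, instead of a purely random pick – is a simplifying assumption for illustration:

```python
import random
from collections import deque

participants = ["Ada", "Ben", "Chris", "Dana", "Eli", "Finn", "Gabi"]

# Shuffle once at the start, then rotate through the queue.
queue = deque(random.sample(participants, len(participants)))

def next_group(n: int = 3):
    """Take the n people who have waited longest; they rejoin at the back."""
    group = [queue.popleft() for _ in range(n)]
    queue.extend(group)
    return group
```

With this rotation nobody plays twice before everyone has had a turn.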

We run only one session at a time, so you will have some time to relax until it’s your turn again. But don’t worry: Agora Café is a cozy place to simply hang out, too.

Yes, ruuules …

  • No prerecorded sequences please. Everything is played live and improvised.
  • One device per session. Pick yours wisely.
  • No syncing.
  • No acoustic instruments.
  • No cords. No hassle. Keep your setup battery powered and as simple as possible.
  • No laptops (on the table).

Questions? Post them to the group, on Twitter, or reach me via mail.

Leap Motion. Truth be told. Your finger is not a mouse.

Sorry to disappoint, but I just did some experimenting and I am pretty sure about this now: my finger is not a mouse. Fingers are great for quickly pointing to a vague area, and they are perfect for gestures, but they are not that glorious for pointing at a tiny, retina-sized area on the screen with pixel precision. Sadly.

R.I.P. my “Miraculous Mouse” App (I even had a name …)

The very first thing I wanted to do with the Leap Motion was to build a mouse wrapper tool that emulates the mouse and its Mac OS X gestures. And surprise: looking at what others are doing with their Leap Dev Kit, this idea turns out to be not that original. Never mind. Won’t stop me. I opened Xcode and started coding.

The Basic UX Concept

Priority: obviousness. The mouse pointer changes its size according to the distance between the pointing finger and the screen. Clicks are triggered by tapping towards the screen; dragging, by holding the pointing finger close to the screen and then releasing by moving it back. As visual feedback the mouse cursor turns red to signal a click event, or a lighter red when getting close to the “click area”.
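The feedback logic could look roughly like this. A Python sketch, not the actual app code – all distance thresholds and cursor sizes are made-up illustrative values:

```python
# Hypothetical thresholds (mm) and cursor sizes (px) – illustrative only.
CLICK_Z = 40.0   # finger closer than this triggers a click
NEAR_Z = 80.0    # "getting close" zone: cursor turns light red
FAR_Z = 250.0    # beyond this the cursor stays at maximum size

MIN_SIZE, MAX_SIZE = 8, 48  # cursor diameter in pixels

def cursor_state(z_mm: float) -> tuple[int, str]:
    """Map finger-to-screen distance to (cursor size, cursor color)."""
    # Clamp the distance into the interactive range.
    z = max(CLICK_Z, min(z_mm, FAR_Z))
    # The cursor shrinks linearly as the finger approaches the screen.
    t = (z - CLICK_Z) / (FAR_Z - CLICK_Z)
    size = round(MIN_SIZE + t * (MAX_SIZE - MIN_SIZE))
    if z_mm <= CLICK_Z:
        return size, "red"        # click triggered
    if z_mm <= NEAR_Z:
        return size, "light-red"  # close to the click area
    return size, "normal"
```

A far-away finger gives a large, neutral cursor; crossing NEAR_Z turns it light red, and crossing CLICK_Z fires the click at minimum size.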

Before I went ahead with the more tricky features I did some basic tests to validate the usability of my brand new finger based pointing device.

Shattered Dreams

Some tests with a quickly written Leap-controlled mouse pointer uncovered some flaws in my plan to disrupt the mouse industry with a swipe.

All three test results were basically disappointing. Please go to Vimeo to watch the video in HD.

  1. Keeping the mouse steadily over a small area on the screen → The unfiltered signal from the Leap comes with some noise/shakiness. Most interestingly, this is mainly caused by my finger’s inability to hold still on a pixel level, not by any tracking noise from the Leap Motion device.
  2. Quickly focusing on a spot on the screen → Takes longer than expected. Significantly longer than with a mouse or trackpad. There are also issues with my finger’s shakiness. Once again I prove to be human.
  3. Performing a potential click gesture by pushing towards the screen → When moving along the z-axis, x and y are affected as well. Completely locking x/y while moving along one axis is not possible. Again, additional filtering will be needed to achieve this programmatically.

And Here’s the Good News

All three issues can be solved or significantly improved programmatically: adding noise filtering, calculating averages, using more advanced gesture recognition methods, etc. It’s not the Leap Motion but noise introduced by human factors that needs to be balanced out.
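As a minimal sketch of the filtering idea, an exponential moving average over the raw (x, y) pointer signal already smooths out most of the finger jitter. The alpha value below is illustrative; in practice it trades jitter against pointer lag:

```python
class PointerSmoother:
    """Exponential moving average over a raw 2D pointer signal."""

    def __init__(self, alpha: float = 0.2):
        self.alpha = alpha  # 0 < alpha <= 1; lower = smoother but laggier
        self.x = None
        self.y = None

    def update(self, raw_x: float, raw_y: float) -> tuple[float, float]:
        if self.x is None:
            # First sample: no history yet, pass it through.
            self.x, self.y = raw_x, raw_y
        else:
            # Move a fraction alpha of the way towards the new sample.
            self.x += self.alpha * (raw_x - self.x)
            self.y += self.alpha * (raw_y - self.y)
        return self.x, self.y

smoother = PointerSmoother(alpha=0.2)
# A noisy signal hovering around (100, 100) converges towards it:
for raw in [(103, 98), (97, 102), (101, 99), (100, 101)]:
    x, y = smoother.update(*raw)
```

The same filter applied per axis, with a much smaller alpha on x/y during a z-axis push, would also help with the click-gesture problem from test 3.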

Found Wisdom

In the end it comes down to your finger. Your arm. Your muscles. Achieving the same speed and accuracy as a mouse with your finger? WON’T HAPPEN. This also means that my mouse wrapper app might not be such a good idea in the end.

I need to switch the concept from a mouse emulation to something that removes the need for a mouse at a more basic level. Thinking about gestures for resizing and moving windows directly without explicit pointing. Maybe even a good way to combine the keyboard and the Leap in a meaningful way.

I am convinced that gestures are the future, especially ones with implicit spatial awareness. I am sure physics games will be absolute fun, too. However, I am still focusing on use cases that can provide real-life productivity enhancements.

Experimentation will continue.

Leap Motion. Put your hands up in the air!

I just received my Leap Motion development unit. I AM excited. Now the one thing I really need is loads of time to dig deep into the framework and come back with some fun use cases.


Very First Impressions

Tracking Area

In contrast to e.g. the Kinect, LeapMotion tracks the interaction space from below your hands. This makes its usage area quite unique.

⊕ Perfect for getting precise pointing information to a screen in front of it. Hey, that sounds quite useful.

⊕ Tracking the angle of the pointing object.

⊕ Tracking multiple objects simultaneously. Get ready for your 10 fingers independently gesturing.

⊖ Naturally it cannot track occluded objects, e.g. crossed fingers or overlapping hands. This is no surprise, but it turns out to be a serious limitation – think about making a zooming/pinching gesture towards the screen – tracking that will most likely fail or be very unreliable.

Speed

This is the most impressive aspect. I don’t have the final hardware/software version but it already is FAST. It definitely is the most responsive 3D tracking I’ve seen so far. The processed response onscreen is almost immediate, much less latency than e.g. with the Kinect.

This is really great to see, as responsiveness will definitely be crucial for the Leap Motion’s acceptance and efficiency.

Precision

My first tests show that the Leap is astoundingly precise. I would compare its ability to quickly point at a very specific area on the screen with a trackpad – maybe a bit slower than that, but much more immediate. Depending on the chosen tracking targets (fingers, palm, normalized center, etc.) the pointer can flicker a bit, but that seems to be controllable with more advanced handling in software. Again, compared to the Kinect, you’ll see much less tracking noise. Arm fatigue might become an issue, but I need to spend more time testing to find out.


Obviously, hardware and software go hand in hand to make the Leap Motion a usable input device. My first impression hints at a very well designed piece of hardware. The software and SDK still seem to have some rougher edges, but they are evolving rapidly. More details on this after I’ve had some more hands-on experience with the code.
