I've seen some interesting alternative input technologies that will be coming to computer users this year. I'll share video demos of the two I'm most excited about: the Leap Motion and the Myo armband.
The Leap Motion is a small camera-based sensor that connects to your computer and “watches” an area above your desk for hand movement, translating that movement into control of your computer. You can, for example, wave your hand to scroll a page, turn your fingers to control volume, and pinch and zoom images by literally pinching the air. Most of the gestures seem to be translated from current touchscreen technology, but I’m very excited about the opportunity to develop a new interface language with a product like this. The Leap Motion should be available for purchase this May for $79.
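To make the idea concrete, here's a minimal sketch of how recognized gestures might be dispatched to computer controls. The gesture names and actions here are my own illustration, not the actual Leap Motion SDK:

```python
# Hypothetical sketch: mapping recognized gestures to computer controls.
# These gesture names and actions are illustrative, not the real Leap Motion API.

GESTURE_ACTIONS = {
    "swipe": lambda amount: f"scroll by {amount}",
    "circle": lambda amount: f"set volume to {amount}",
    "pinch": lambda amount: f"zoom to {amount}x",
}

def handle_gesture(name, amount):
    """Dispatch a recognized gesture to its action; ignore unknown gestures."""
    action = GESTURE_ACTIONS.get(name)
    return action(amount) if action else None

print(handle_gesture("swipe", 3))   # scroll by 3
print(handle_gesture("pinch", 2))   # zoom to 2x
```

The interesting design question, to my mind, is what goes in that mapping: as the post notes, today's mappings are mostly borrowed from touchscreens, and the opportunity is to invent new ones.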
Here’s a quick video to demonstrate how it looks and works.
The second is a newer entry into this field, the Myo armband. This device uses a very different method to achieve a very similar end. Instead of “watching” your hands with cameras, the Myo is an armband that monitors the muscle movements of your forearm and translates those impulses into a way of controlling the technologies around you.
Two features of the Myo are of greatest interest to me. The first is that, as a control mechanism, it's personal: you are wearing it, and it can get smarter as it learns from you. Not many types of input devices can make that claim. The second is that, because of its sensitivity, it can pick up micro-movements of your muscles before you are even aware of them, allowing for imperceptible controls. As with the input devices that Vernor Vinge described in Rainbows End, microscopic movements can be translated into rich and robust interactions with your technology. Here's a video of the Myo in action.
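For the curious, the core signal-processing idea can be sketched in a few lines: watch a stream of muscle-signal samples and emit a discrete "command" event each time the signal rises above a threshold. This is a simplified illustration of threshold-based activation detection, not the real Myo software:

```python
# Hypothetical sketch: turning a stream of forearm-muscle signal samples
# into discrete command events by detecting threshold crossings.
# Illustrative only; not the actual Myo API or algorithm.

def detect_activations(samples, threshold=0.5):
    """Return the indices where the signal first rises above the threshold."""
    events = []
    above = False
    for i, sample in enumerate(samples):
        if sample >= threshold and not above:
            events.append(i)   # rising edge: a new muscle activation
            above = True
        elif sample < threshold:
            above = False      # signal dropped; re-arm for the next event
    return events

signal = [0.1, 0.2, 0.7, 0.8, 0.3, 0.6, 0.1]
print(detect_activations(signal))  # [2, 5]
```

The real trick, presumably, is setting that threshold low enough to catch the micro-movements described above while still filtering out noise.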
Is there a place for these sorts of interfaces in libraries? I think so, but I would love to hear where you think they could be used.