1) Does this actually exist? The video looks SWEET, but feels like it's way too computer generated and too flawless (in the video) to be true.
2) I believe gesture control will never fully take over our computing needs. Reason being, the feedback response from our skin's nerve endings is quicker than the feedback from our eyes. In other words - 'gorilla arm' - http://en.wikipedia.org/wiki/Touchscreen#.22Gorilla_arm.22 Furthermore, we continue to assume that touch-based and gesture-based systems will replace all machines and computers, when I believe this is simply not true. Our muscles were not built to work efficiently by "waving into the air". For some applications this is true, but not all.
And for your second question: "gorilla arm" depends on where you have your hands: if you keep them below your heart, you can gesture almost indefinitely without fatigue.
This is obvious for people who use a lot of hand gestures when talking, or those who speak with sign language.
The main issue I see with gesture controls is that compared to traditional inputs, you are doing more work and making bigger movements. For certain situations, like gaming, where 1:1 movement can be immersive this makes sense. For others it does not.
I do think a system that allowed you to use small hand and finger movements to control your devices would make sense. The large gestures don't seem to be optimal in most cases.
I think the better question is: is it really as seamless/flawless as the video portrays?
Some of the clips imply integration with systems that largely don't exist in the way they're showing them (ski HUDs are real, but not like that). Another notable omission is in the video game sequence - we never see them turn the camera in any direction. I don't think that would be possible with this system alone.
Yeah, I too hope it is more than just vaporware. That hesitation doesn't actually come from their presentation specifically (aside from it maybe looking too polished), but from general experience with similarly styled launches.
The faster they get this in the hands of real people/reviewers, the better. It looks great. Nintendo would love it, hehe.
They must have shown some demo of some prototype. At least I don't see how else they could have passed YC interview.
Either way I really like the approach. Start with some sensor that has good ergonomics (wrist band), do some awesome machine learning trick and get a gadget from the future as a result.
Similar products have existed for a long time, but mostly "in the lab." Nonetheless, it is cool, although I suspect there would be a lot of accidental gesturing -- there are many ways to prevent accidental gestures, but all at the expense of usability.
Who cares? Put that thing on with the Oculus Rift. You'll want to capture the full range of motion, because you'll determine gestures by their interactions with the virtual environment.
I wish I could upvote you more than once. People will look at this video and think "oh, they've solved the big problem!" when in fact they've solved half of the big problem. Distinguishing between intentional and unintentional gesture has been the academic elephant in the room for many years now, and no one has come up with a good solution yet.
Their FAQ states they've resolved this issue by leveraging unnatural gestures. There's no way to tell if their solution is any good but at least this should give the public an opportunity to decide which applications the tech is ready for and which it isn't. Things that tend to be difficult to define "in the lab."
For one-off gestures, you could also very well have a quick "trigger" gesture that enables the device and gets it going. When you are done, use the trigger again.
I'm thinking it would be easier to create a distinguishing 'attention' gesture than it would be with other things. The finger exercise thing comes to mind. Also much easier to do than a voice activated thing like calling out to Siri in a restaurant or something similar.
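The trigger/attention idea in the two comments above is easy to sketch: gate everything behind an explicit wake gesture that also puts the device back to sleep. The gesture names and classifier output here are hypothetical, just to show the gating logic.

```python
# Sketch: gating gesture events behind an explicit "attention" trigger.
# "double_tap", "wave", and "fist" are made-up gesture labels standing
# in for whatever the device's classifier actually emits.

class GestureGate:
    """Pass gestures through only while the device has been 'woken up'
    by a trigger gesture; the same trigger puts it back to sleep."""

    TRIGGER = "double_tap"  # hypothetical wake/sleep gesture

    def __init__(self):
        self.active = False

    def handle(self, gesture):
        if gesture == self.TRIGGER:
            self.active = not self.active   # toggle armed state
            return None                     # the trigger itself is consumed
        return gesture if self.active else None

gate = GestureGate()
stream = ["wave", "double_tap", "fist", "wave", "double_tap", "fist"]
accepted = [g for g in (gate.handle(s) for s in stream) if g]
# only the gestures between the two triggers get through
```

The trigger should be an "unnatural" gesture (as the FAQ reportedly puts it) so everyday arm movement doesn't accidentally arm the device.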
They show the band fairly far up the arm. Presumably to get a good set of distinguishing signals. But I wonder how hard it would be with enough DSP to build this into a form fitting wrap around watch band? The combination of gesture articulation and appli-wrist-watch is a compelling one.
I would love to see a schematic for the device or a teardown. I designed an electrooculogram in college and I'm guessing this is a similar electromyograph-type device. Measuring any signal on the body is challenging and accounting for humidity/noise/etc is difficult.
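For anyone curious what "measuring a signal on the body" involves downstream of the electrodes, the classic first step in EMG processing is rectification followed by smoothing to get an activation envelope. This is a generic textbook sketch, not how this particular device works; real pipelines also band-pass filter and notch out mains hum, and the window size here is an arbitrary assumption.

```python
# Illustrative EMG envelope extraction: full-wave rectify the raw
# signal, then smooth it with a short moving average.

def emg_envelope(samples, window=4):
    """Rectify the raw signal and smooth with a moving average."""
    rectified = [abs(s) for s in samples]
    env = []
    for i in range(len(rectified)):
        lo = max(0, i - window + 1)          # window shrinks at the start
        env.append(sum(rectified[lo:i + 1]) / (i + 1 - lo))
    return env

raw = [0.1, -0.8, 0.9, -0.7, 0.2, -0.1]      # toy raw EMG samples
print(emg_envelope(raw))
```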
Someone should make augmented reality games using fake martial-arts gestures. For older folks, there's the climactic battle scenes of "Big Trouble in Little China." For younger folks, there are the 'jutsu' from the Naruto anime/manga.
Preordered. Says I'm number 2,463 in the queue FWIW.
Been thinking for a while now that subtle enhancements like plain rings or bracelets would come to be popular for interacting with home automation, computers and so on. The sort of things that might not be as polarising or awkward as something like Google Glass and that you could wear at almost all times. This concept looks really smart and the potential is huge.
I assume you could wear two and significantly up your gesture potential in the way that chorded keyboards work (en.wikipedia.org/wiki/Chorded_keyboard)? Could even replace keyboards entirely if the learning curve wasn't too high?
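Rough arithmetic on the two-band chording idea: if each band distinguishes g gestures on its own (20 is the figure mentioned elsewhere in this thread, not a confirmed spec), wearing two multiplies the symbol space, the same way chording multiplies a keyboard's.

```python
# Symbol-space arithmetic for two armbands used as a chorded input.
g = 20                   # distinct gestures per band (assumed figure)
print(g * g)             # two-hand "chords": 400 combinations

# A five-key chorded keyboard, for comparison, offers 2**5 - 1 = 31
# non-empty chords, enough for the alphabet.
print(2 ** 5 - 1)
```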
Any YC people or Thalmic employees able to talk about what the prototypes are like now?
Here is a comment from reddit on this same device:
"Researcher in EMG systems for control here. I actually hadn't come across this device before, though.
It may look like futuristic wizardry, but everyone and their mother has been working on EMG signal classification algorithms for prosthesis and the like.
The video is definitely a farfetched depiction of what it's like to actually use the device, but its cool to see stuff like this being put out there for people to throw their money at. I wouldn't be surprised if my PI ends up buying a couple for the lab, actually."
Optical tracking is much more precise than EMG-based posture estimation. Unless these guys made a leap of at least one order of magnitude over the current state of the art, expect the LEAP to be much better in terms of gesture recognition.
Furthermore, this thing only allows the recognition of 20 gestures. That is about log2(20) ≈ 4.3 bits every few seconds. On the other hand, the LEAP will give you at least 10 finger positions plus rotations (that's well over 10 bits per frame) every few milliseconds.
I think if people will get used to LEAP based HCI, they will not look at Myo, no matter if you can go to the kitchen with it.
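The back-of-envelope bandwidth comparison in the comment above can be made concrete. Both the "one gesture every ~2 seconds" and the Leap frame-rate/bits-per-frame figures are the commenter's rough assumptions, not measured numbers.

```python
# Information-rate sketch: discrete gesture vocabulary vs. continuous
# hand tracking, using the thread's own rough figures.
import math

myo_bits_per_symbol = math.log2(20)        # 20 gestures -> ~4.32 bits each
myo_rate = myo_bits_per_symbol / 2.0       # assume one gesture every ~2 s

# Suppose the Leap reports 10 fingertips at ~100 Hz and each frame
# carries at least 10 bits of usable information (a deliberately
# conservative stand-in for continuous positions plus rotations).
leap_rate = 10 * 100                       # bits per second

print(round(myo_rate, 2), leap_rate)
```

The exact numbers are debatable, but the gap of a couple of orders of magnitude is the point being made.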
I gotta say, the way it works opens a much more complete set of gestures options, and the fact that you don't have to stand in front/on top of something for it to work is pretty great. This is pretty cool.
I personally like the fact that you don't need to have your arms lifted out in front of you for many different types of actions. (And if you want to, should that feel more natural, you have that option.)
I cannot wait for this. I hope it's trivial/possible to pair it with multiple devices. Either way, a year from now, I fully expect to be wearing two of these at all times. :)
I feel like we just keep swinging back and forth between handheld input devices and cameras for gesture control. I thought cameras were cool because they didn't require me to put some weird thing on to use my computer, but now this is cool because it works at a higher resolution than gesture-control cameras. Who's going to up the ante?
One issue that I see with this, as with the Leap to a lesser degree, is how to signal the start and end of a gesture to the device. With a touchscreen this is pretty straightforward, but here it seems running a classifier on the entire stream of all your movements might give a lot of false positives? Or am I missing something?
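One common mitigation for the segmentation problem raised above is to only "fire" a gesture when the classifier stays confident for several consecutive windows, so single noisy spikes in the stream are ignored. The threshold and run length here are arbitrary assumptions.

```python
# Debouncing a stream of per-window classifier confidences: a gesture
# fires only after `runs` consecutive windows above `threshold`.

def debounce(confidences, threshold=0.9, runs=3):
    """Return the window indices at which a gesture fires."""
    streak, fired = 0, []
    for i, c in enumerate(confidences):
        streak = streak + 1 if c >= threshold else 0
        if streak == runs:          # fire once per sustained run
            fired.append(i)
    return fired

# The noisy spike at index 1 is ignored; the sustained gesture
# (indices 4-6) fires at index 6.
print(debounce([0.2, 0.95, 0.3, 0.1, 0.92, 0.94, 0.97, 0.5]))
```

This trades latency (a few windows' delay) for fewer false positives, which is exactly the usability tension other comments here describe.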
Right now we have more than one Leap Motion in our office, and it's pretty cool to do some stuff with it, but today we saw your gadget and video and we were very impressed.
I really like the slogan 'WAVE GOODBYE TO CAMERA BASED GESTURE CONTROL' - because your idea can be used everywhere, while camera gestures only work within reach of a computer/console.
So I'm really looking forward to the final version, and to holding it in my hand. I hope your SDK is awesome from scratch. It would suck if all developers had to wait for months until a good version is available.
What I'm wondering: in the video the design of the hardware is different from the sales screens. And I like the velcro version much more, because the other one does not look like 'one size fits all' and might slide off on most arms.
The Leap Motion is still an interesting device because of its sensitivity and ability to resolve intricate finger positions/orientations. It's already in developers' hands and half the price of this device. With future iterations, I would expect to see the Leap integrated into tablets and phones, so it could be everywhere as well.
I've experimented placing the Leap on my chest as if it were a pendant, facing out. It's pretty comfortable to gesture this way. For a presentation it wouldn't be the most expansive way to gesture, but it could be a discreet way to interface with something like Google Glass.
The muscles that control your fingers are way back there up your forearm, and transmit the force to the fingers by tendons that go through the wrist. Try grabbing your forearm where their band sits and making and releasing a fist.