The Leap device of the headline looks very interesting. I have posted in Reg forums before about the Kinect not fully reaching its potential for content-creation applications, but this Leap device appears to have caught a fair few people's attention, and generated some interesting discussions, from the specific (does it work with X? could it do Y?) to the more general:
For now, they state that their focus is on hand/finger tracking (I've seen 3rd-party demos of the Kinect doing similar) rather than the "3D object scanner on the cheap" that has also been demoed using the Kinect. Interesting times if you're into computers and making things.
They don't say exactly how it works at this time, but it seems to do its thing by different means than the Kinect, and with far cheaper hardware.
I was under the impression that MS was considering contact-less gesture controls for future laptops, but I haven't heard anything lately due to all the noise around Win8. Oh, re OS choice, Leap say "At this time, our focus is on Windows and OS X, with Linux being on our agenda." It is encouraging that they are still platform-agnostic, and haven't been bought by a big player, as FingerWorks (multi-touch gesture-based input devices) was by Apple.
Fingers crossed that it is as good as it seems and it takes off.
The Leap Motion is interesting tech, but appears to only track hands/fingers. This hardly makes it a Kinect killer, as the Kinect, when coupled with libraries such as OpenNI, can track multiple people as full skeletons. I guess it really depends on what you're doing, though, and what user-input needs your applications have. The Leap might be great as a touch-screen replacement for situations where you really don't want people to touch the screen (e.g. workshops where engineers have greasy, oily hands but still want to scroll through online manuals, control diagnostic software, browse pr0n sites, etc.).
The Kinect was massively downgraded from what the technology (technologies? they bought a few) was originally capable of. It wasn't a case of limited technology, but of limited cost. Now that Microsoft has had time to build up larger and larger libraries, and to optimise the software to match the hardware, I think other 3D tracking devices are going to struggle to catch up, unless they take a very different approach.