As you know, the devvvvs never sleep and constantly come up with implementations of random stuff, which you have to catch up with.
We have been accepted as one of the first to get a free development version of the Leap Motion device. The device is not yet available, but if you're lucky and also got one, you can use it with vvvv right away.
Leap will ship another 10k dev devices in the next week, so you might be lucky. Apply here: http://developer.leapmotion.com
Otherwise this is just a public announcement that the Leap SDK found its way into our code base.
Comments:
The difference between this and the Kinect is really the accuracy, and hand gestures are obviously central, as this video explains:
http://www.youtube.com/watch?v=sVIdFkavgJc&feature=endscreen&NR=1
The Cobra sounds very promising, but it's nowhere to be found, and the company Canesta has been bought by Microsoft: http://en.wikipedia.org/wiki/Canesta
But I agree that Leap is delaying their product…
Finally got my dev device (and surely lots of you did too). Could you update the Leap node to support the current SDK version? Unfortunately it does not work right now.
Thanks in advance, and thanks for making the plugin available so fast! It would be great if the plugin were working by NODE, because there will be at least a few Leaps available for testing.
I finally received the Leap as well, but I'm having a problem with the Display/Screen ID: it reports my screen ID as 0 while the Leap is looking for screen 1, so nothing happens. I'll have to try a dual-screen setup later to see if I can resolve this. The problem could be the different outputs of my graphics card (an Nvidia Quadro 4800), the monitor in my office at the university, or the graphics card configuration.
I'm a bit disappointed to hit this problem right from the start; I can't wait to get it working. I'll try the Mac SDK later to see if it's a different story on a MacBook Pro (I think it will work there).
And how is it in terms of accuracy and latency? How many hands can it track at the same time?
Can you bring this thing to NODE?
Extremely good… the best I've ever experienced, really. I tried it with woei with up to 4 hands. It was getting a bit jerky then, but that could have been the lighting conditions.
Sure, we will have it at NODE13 in the hackspace…
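For anyone wondering what the node actually gets from the device: the Leap SDK simply delivers a list of currently tracked hands per frame, so the hand count is whatever the tracker sees at that moment. Below is a minimal sketch against the early C# API, using the class and property names from the SDK's own sample code (the vvvv plugin may wrap this differently):

    using System;
    using Leap;

    // Minimal listener that prints how many hands the device is tracking per frame.
    class HandCountListener : Listener
    {
        public override void OnFrame(Controller controller)
        {
            Frame frame = controller.Frame();
            Console.WriteLine("Hands tracked: " + frame.Hands.Count);
            foreach (Hand hand in frame.Hands)
            {
                // Palm position in millimetres, relative to the device origin.
                Console.WriteLine("  palm: " + hand.PalmPosition);
            }
        }
    }

    class Program
    {
        static void Main()
        {
            var listener = new HandCountListener();
            var controller = new Controller();
            controller.AddListener(listener);

            Console.WriteLine("Press Enter to quit...");
            Console.ReadLine();

            controller.RemoveListener(listener);
            controller.Dispose();
        }
    }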
http://www.youtube.com/watch?v=_UHihaQOB3g <– at 0:49, are they saving this feature for later use? Hmm…
Anyway, this is a great tool for interactive installations that require high precision! Does it work like the Kinect sensor, as in this video:
http://www.youtube.com/watch?v=5_PVx1NbUZQ
or is it more like two standard cameras?
I'm traveling from Argentina just for NODE13, so I will be happy to meet all of you and to see this in the hackspace! :D