The sample is available as open source on GitHub. The project contains sample code that shows how to send MQTT commands to IBM Bluemix when gestures are recognized via Leap Motion.
Right now four gestures can be recognized: swipe left, swipe right, screen tap, and circle. The functionality is best explained with pictures; check out the pictures in the screenshots directory.
This is the device in front of a MacBook.
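The gesture handling can be sketched as follows. The gesture fields (`type`, `direction`) follow the Leap Motion JavaScript API (leapjs), while the command strings and the MQTT topic are placeholders for illustration, not the project's actual protocol.

```javascript
// Sketch: classify a Leap Motion gesture object into a car command.
// The gesture fields (type, direction) follow the Leap Motion JavaScript
// API (leapjs); the command strings are hypothetical placeholders.
function gestureToCommand(gesture) {
  switch (gesture.type) {
    case 'swipe':
      // direction[0] is the horizontal component of the swipe vector:
      // positive means a swipe to the right, negative a swipe to the left
      return gesture.direction[0] > 0 ? 'swipeRight' : 'swipeLeft';
    case 'screenTap':
      return 'screenTap';
    case 'circle':
      return 'circle';
    default:
      return null; // ignore gestures the demo does not use
  }
}

// In the real flow, the recognized command would then be published as an
// MQTT message to Bluemix, e.g. with the npm 'mqtt' package
// (topic name is an assumption):
//   client.publish('leapmotion/gestures', JSON.stringify({ gesture: cmd }));
```

The classification itself is a pure function, so it can run and be tested without the Leap Motion hardware attached.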
I’ve extended the Node-RED flow to receive commands from Leap Motion and forward them to the cars.
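The forwarding step can be sketched as the body of a Node-RED "function" node: it takes an incoming gesture message and rewrites it into a command message for the car. The topic name and payload format below are assumptions for illustration, not the flow's actual wiring.

```javascript
// Sketch of the logic inside a Node-RED "function" node that forwards
// Leap Motion gestures to a car command topic. Topic and payload shape
// are hypothetical.
function forwardToCar(msg) {
  // incoming payload, e.g. { gesture: 'swipeLeft' }
  const mapping = {
    swipeLeft:  { cmd: 'changeLane', offset: -1 },
    swipeRight: { cmd: 'changeLane', offset: 1 },
    screenTap:  { cmd: 'setSpeed', speed: 0 },   // stop the car
    circle:     { cmd: 'setSpeed', speed: 500 }  // start driving
  };
  const command = mapping[msg.payload.gesture];
  if (!command) return null; // drop unknown gestures
  return { topic: 'iot-2/cmd/car/fmt/json', payload: command };
}
```

Returning `null` from a function node ends the flow for that message, which is a simple way to filter out gestures the cars have no command for.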
Here is the series of blog articles about the Anki Overdrive demos with Bluemix:
- Anki Overdrive Cars with Bluemix Demo – Slides and Architecture
- Collision Prevention for Anki Overdrive Cars with Bluemix
- Steering Anki Overdrive Cars via Speech Recognition on Bluemix
- Steering Anki Overdrive Cars via Kinect and Bluemix
- Steering Anki Overdrive Cars via Leap Motion Gestures and Bluemix