Here is another example of how to steer Anki Overdrive cars via IBM Bluemix. In addition to a local controller, Watson speech recognition, and a Kinect, I've added gesture recognition via a Leap Motion.
The project is available as open source on GitHub and contains sample code that shows how to send MQTT commands to IBM Bluemix when gestures are recognized via the Leap Motion.
Right now four gestures can be recognized: swipe left, swipe right, screen tap and circle. The functionality is best explained with pictures; check out the screenshots directory.
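To give an idea of how the four gestures could be mapped to car commands, here is a minimal sketch in JavaScript. The gesture type names ("swipe", "screenTap", "circle") follow the Leap Motion JavaScript API; the command names and the mapping itself are assumptions for illustration, not necessarily the ones used in the actual sample code.

```javascript
// Hypothetical mapping from recognized Leap Motion gestures to car commands.
// Gesture type names come from the Leap Motion API; the command names
// (changeLaneLeft, changeLaneRight, brake, accelerate) are illustrative.
function gestureToCommand(gesture) {
  switch (gesture.type) {
    case "swipe":
      // direction[0] is the horizontal component of the swipe vector:
      // positive means a swipe to the right, negative a swipe to the left
      return gesture.direction[0] > 0
        ? { cmd: "changeLaneRight" }
        : { cmd: "changeLaneLeft" };
    case "screenTap":
      return { cmd: "brake" };
    case "circle":
      return { cmd: "accelerate" };
    default:
      // ignore gestures the demo does not handle
      return null;
  }
}
```

In a real controller this function would be called from the Leap Motion frame callback, and the returned command would be published via MQTT to Bluemix.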
This is the device in front of a MacBook.
I’ve extended the Node-RED flow to receive commands from Leap Motion and forward them to the cars.
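As a rough sketch of what the forwarding step in such a Node-RED flow might look like, the following function could live in a function node between the MQTT input and output nodes. The topic format and payload structure are assumptions modeled on IBM IoT command topics, not the exact format used in the demo.

```javascript
// Sketch of the body of a Node-RED function node that turns an incoming
// Leap Motion command message into an MQTT message for a car.
// The topic pattern and payload envelope are assumptions for illustration.
function buildCarMessage(msg) {
  var command = msg.payload && msg.payload.cmd;
  if (!command) {
    // drop messages that carry no command
    return null;
  }
  return {
    // hypothetical IoT-style command topic, e.g. iot-2/cmd/changeLaneLeft/fmt/json
    topic: "iot-2/cmd/" + command + "/fmt/json",
    payload: JSON.stringify({ d: { cmd: command } })
  };
}

// Inside the function node, the body would simply be:
//   return buildCarMessage(msg);
```

Returning `null` from a function node stops the flow for that message, so unrecognized input is silently dropped rather than forwarded to the cars.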
For more details, see my series of blog articles about the Anki Overdrive with Bluemix demos.