Drones Fly ‘Hands Free’ with Gestural Technology

Drone controlled by perceptual computing

Software developers use 3-D camera, open source application to fly drone with hand motions.

Drones are buzzing in the skies everywhere from Afghanistan to amateur sporting events. Typically, the unmanned aerial vehicles are controlled with a computer keyboard, game controller or smartphone, but a pair of software developers have discarded these familiar interfaces to create a “hands-free” flying experience.

“People find that it is a really intuitive way to control the drone because it is like a virtual airplane yoke,” said software developer Martin Förtsch.

Germany-based Förtsch and his partner Thomas Endres developed an application with Java code and the Intel Perceptual Computing Software Development Kit that uses a 3-D camera and a Wi-Fi connection to pilot a Parrot AR drone with hand motions.

“To start the drone I just have to lift my hands to the camera and give two thumbs up,” said Förtsch, standing in front of an Ultrabook computer with a Creative Labs Senz3D camera mounted atop the screen.

Earlier this year, the Parroteer virtual flight controller was named the most innovative app at an Intel-sponsored hackathon in Munich. The pair said they plan to make their open source code available on GitHub.

The four-propeller drone is an off-the-shelf Parrot AR.Drone 2.0. Out of the box, the French-made multi-sensor, camera-equipped drones are controlled with iOS or Android smartphones, but Förtsch and Endres wanted to develop a “cooler” new control interface. Initially, they used a laptop connected to a Leap Motion finger-sensing device but switched to the Intel Perceptual Computing SDK and Creative Labs 3-D camera when they decided to participate in the Munich hackathon.

Drone controlled by 3D camera and Ultrabook

“Our software interprets the signals coming from the Creative Interactive Gesture Camera and sends messages to the drone to control it in all directions,” said Endres.
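For the AR.Drone family, the “messages” Endres mentions are plain-text AT commands sent over UDP, as described in Parrot's developer documentation. The sketch below is a hypothetical illustration of that protocol layer, not the Parroteer source; the class name and method choices are assumptions.

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.nio.charset.StandardCharsets;

// Hypothetical sketch of an AR.Drone 2.0 command sender. The drone listens
// for plain-text "AT" commands on UDP port 5556; each command carries an
// increasing sequence number.
public class DroneLink {
    private int seq = 1;

    /** AT*REF toggles take-off/landing via a bit field (290718208 = take off). */
    public String takeOffCommand() {
        return String.format("AT*REF=%d,290718208\r", seq++);
    }

    /** AT*PCMD encodes roll/pitch/gaz/yaw floats as their raw int bits. */
    public String moveCommand(float roll, float pitch, float gaz, float yaw) {
        return String.format("AT*PCMD=%d,1,%d,%d,%d,%d\r", seq++,
                Float.floatToIntBits(roll), Float.floatToIntBits(pitch),
                Float.floatToIntBits(gaz), Float.floatToIntBits(yaw));
    }

    /** Sends one command to the drone's default access-point address. */
    public void send(String cmd) throws Exception {
        byte[] data = cmd.getBytes(StandardCharsets.US_ASCII);
        try (DatagramSocket socket = new DatagramSocket()) {
            socket.send(new DatagramPacket(data, data.length,
                    InetAddress.getByName("192.168.1.1"), 5556));
        }
    }
}
```

In this scheme, each recognized gesture ultimately becomes one small UDP packet, which is why the control loop can stay responsive over an ordinary Wi-Fi link.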

“It is a 3-D depth camera so it can identify hand gestures and identify the coordinates of where your hands move,” said Förtsch. “The positions of my hands are saved as reference coordinates and every other hand movement is relative to those reference coordinates.”
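The reference-coordinate scheme Förtsch describes can be sketched roughly as follows. This is a hypothetical illustration under assumed axis mappings (hands tilting together as pitch, one hand above the other as roll, like a yoke), not the actual Parroteer code.

```java
// Hypothetical sketch: hand positions at start-up become a reference, and
// later positions are interpreted as offsets mapped onto drone control axes.
public class GestureMapper {
    private final double refLeftY, refRightY;

    public GestureMapper(double leftY, double rightY) {
        // Save the initial hand heights as the neutral reference.
        this.refLeftY = leftY;
        this.refRightY = rightY;
    }

    /** Both hands moving up or down together -> pitch. */
    public double pitch(double leftY, double rightY) {
        return clamp(((leftY - refLeftY) + (rightY - refRightY)) / 2.0);
    }

    /** One hand above the other, like turning a yoke -> roll. */
    public double roll(double leftY, double rightY) {
        return clamp((rightY - refRightY) - (leftY - refLeftY));
    }

    // Keep outputs in the [-1, 1] range a drone control API typically expects.
    private static double clamp(double v) {
        return Math.max(-1.0, Math.min(1.0, v));
    }
}
```

Because every movement is measured relative to the saved reference, the pilot can stand anywhere in the camera's field of view; only the offsets matter.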

Describing how they ported earlier code using the Perceptual Computing SDK, Förtsch said, “Normally you would implement your software using C or C++ libraries, but we are Java experts so we used Java bindings.”

Implementing the drone API in Java took less than 100 hours, and the pair built the prototype gesture control using the SDK in a day. Fine-tuning precision and responsiveness required additional time.

Förtsch and Endres were at the recent Intel Developer Forum in San Francisco demonstrating their creation for thousands of hardware and software developers.