2017


A framework for natural human-drone communication covering control, intent, and social interaction.



Natural Human Drone Interactions


More than 770,000 U.S. drone registrations were filed in about 15 months, with over 100,000 drones registered in April 2017 alone. A major challenge arising from this increase is the need for both public acceptance and autonomous drone operation in complex and messy environments, especially in direct engagement with humans. Unlike pixels on a screen, drones are physical objects that occupy the same space as humans and can interact with us through lighting, motion, sound, or physical contact.

The project creates a richer interaction vocabulary between humans and flying robots, taking into account human emotion recognition and translating the aerial robot's physical characteristics into interpretable motions. A Parrot AR drone was used: a low-cost and highly customisable quadcopter. Flight animation gestures such as roll, pitch, yaw, angular and vertical speed, as well as takeoff, landing, and LED animations, can be controlled directly through its API using NodeJS.
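As a rough sketch of what that looks like in practice, the snippet below scripts a takeoff, a turn, an LED animation, and a landing. It assumes the community node-ar-drone npm package (not necessarily the exact client used in the project) and a drone reachable over its default Wi-Fi connection:

```js
// Minimal flight script using the node-ar-drone client (an assumption here).
const arDrone = require('ar-drone');
const client = arDrone.createClient();

client.takeoff();

client
  .after(4000, function () { this.clockwise(0.5); })                     // yaw in place
  .after(2000, function () { this.animateLeds('blinkGreenRed', 5, 2); }) // LED animation
  .after(2000, function () { this.stop(); this.land(); });               // hover, then land
```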




Autonomy


I prototyped aerial robotics choreographies based on human reactions. The video feed from the quadcopter's front camera was captured and analysed by the Affectiva Emotion Analysis SDK. Five basic human facial expressions, namely joy, anger, surprise, sadness, and fear, were detected and the result was sent back to the computer. Five corresponding drone reactions, implemented as animated software scripts, were then initiated as a response. Instead of relying on anthropomorphism, the drone communicates with what it has: movement, that is, its position and direction relative to the user, speed, rotation, angles, and altitude, as well as light.
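A mapping from detected expression to reaction script might look roughly like the sketch below. The reaction parameters are illustrative assumptions, and the Affectiva analysis step is reduced here to a plain string label handed to a dispatcher:

```js
// Sketch: dispatch one of five drone reactions for a detected emotion label.
// Assumes `client` is a connected node-ar-drone client and that the emotion
// analysis step yields one of these five strings.
const reactions = {
  joy:      (c) => { c.up(0.4); c.animateLeds('blinkGreen', 5, 2); },
  anger:    (c) => { c.back(0.3); c.animateLeds('redSnake', 5, 2); },
  surprise: (c) => { c.front(0.1); },
  sadness:  (c) => { c.down(0.3); c.counterClockwise(0.1); },
  fear:     (c) => { c.down(0.5); c.back(0.4); }
};

function react(client, emotion) {
  const run = reactions[emotion];
  if (!run) return;
  run(client);
  setTimeout(() => client.stop(), 3000); // settle back into a hover
}
```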

For example, when someone is afraid, the drone slows down, drops low to the ground, makes no fast movements, and backs off. When someone is surprised, the drone approaches slowly as if investigating, backs off, and then approaches again. These interactions were inspired by cybernetics (Braitenberg vehicles), dance (the Labanotation system), animation principles, human-robot interaction research, and the interaction between falconers and their birds of prey.
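The surprised reaction, for instance, can be scripted as a small timed choreography. This is a minimal sketch assuming a connected node-ar-drone client, with illustrative speeds and timings:

```js
// Sketch of the "surprised" choreography: approach slowly, back off, approach again.
function reactToSurprise(client) {
  client.front(0.1);                                // creep towards the person
  client
    .after(2000, function () { this.back(0.2); })   // back off
    .after(2000, function () { this.front(0.1); })  // approach once more
    .after(2000, function () { this.stop(); });     // settle back into a hover
}
```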


Drone reactions: Happy, Angry, Sad, Surprised, Fearful

Synthetic Temperaments for Autonomous Systems


If the drone's behaviour is interpretable by humans, it could dynamically change their emotional state throughout the interaction. Such a development in drone technology may lead to better public acceptance of the technology, and may similarly allow the technology itself to navigate its operational environment by guiding humans towards more predictable reactions to its actions. Emergent use cases could include the animation of flying characters, the creation of flying robot actors and synthetic swarms, drone robotic assistants, or drones (co)operating autonomously with humans.

In a world in which we are surrounded by more and more drones, capable of buzzing our heads at 90 mph, humans need better options than rolling over and playing dead. Could this be a new way to coexist harmoniously with our flying companions?


Press

Fast CoDesign
Prosthetic Knowledge
Wired
Deutsche Welle
Engadget
Gizmodo
PSFK
Best Video at the Human-Robot Interaction Conference 2018, IEEE Spectrum


Exhibitions

Re:Publica Festival


Publications

Social Interaction with Drones using Human Emotion Recognition.
In HRI '18 Companion: ACM/IEEE International Conference on Human-Robot Interaction, March 5–8, 2018, Chicago, IL, USA. ACM, New York, NY, 2 pages. DOI: https://doi.org/10.1145/3173386.3176966
