Tales of Mind-Controlled Drones and Games

Controlling a robot with your mind sounds like something right out of Big Hero 6, right? Well, in today's blog post I'll cover how I wrote the code to do it myself with an EEG and an A.R. Drone, and how I later used the same logic to control a video game for my senior design project.

Background

Back when I was in college I was an honors student. As an honors student, to get honors credit for an otherwise normally offered class, you'd need to talk with the professor to come up with a special project worth at least 30 hours of work, if I remember correctly. You often needed to do this to meet the minimum honors credit requirement to stay in the program (which was worth doing so you'd have first dibs on classes and dorming).

I was looking for a cool idea for a conversion project, and after some back and forth with my advisor and professor, I had an interesting lead on a commercially available EEG headset. It was also suggested that I program a drone for a project dedicated to exploring how path planning works (plus drones were becoming all the rage in 2015). What I ended up doing was -- you guessed it -- combining aspects of both to make a mind-controlled drone.

I won't delve into the nitty-gritty here, but I will link you to all the relevant repos, documents, articles, and videos!


Fun fact: at UConn's annual Invention Convention, an editor of MultiRotor Pilot saw my demo and asked to do an article about it. I even ended up on the cover, as you can see above. Check out the article here!

The Repos

EmoJArduino

This was the code for my initial attempt at controlling something as simple as an LED with the EEG headset. I already knew how to toggle an LED on and off with an Arduino from my freshman-year Intro to Computer Science class, so this was the next logical step. I wanted to use Java as the main language since it was the one I was most familiar with.
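
To give you an idea of what that first step looks like conceptually, here's a minimal sketch of the Java side. It assumes the jSSC serial library and an Arduino sketch that turns its LED on when it reads a '1' and off on a '0'; the port name and command characters are placeholders, not necessarily what the repo uses:

```java
import jssc.SerialPort;
import jssc.SerialPortException;

// Hypothetical sketch: toggle an Arduino-driven LED from Java over serial.
// Assumes the Arduino side turns its LED on when it reads '1' and off on '0'.
public class LedToggle {
    public static void main(String[] args) throws SerialPortException, InterruptedException {
        SerialPort port = new SerialPort("COM3"); // port name is machine-specific
        port.openPort();
        port.setParams(SerialPort.BAUDRATE_9600, SerialPort.DATABITS_8,
                       SerialPort.STOPBITS_1, SerialPort.PARITY_NONE);

        // Blink the LED a few times, standing in for "EEG event received".
        for (int i = 0; i < 5; i++) {
            port.writeString("1"); // LED on
            Thread.sleep(500);
            port.writeString("0"); // LED off
            Thread.sleep(500);
        }
        port.closePort();
    }
}
```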

MindControlledUAV

This is the bulk of what I worked on for the mind-controlled drone project, also in Java. There are two implementations: Main.java and ImprovedMain.java. The first uses a program called EmoKey (provided as part of the Emotiv SDK) while the second uses the API I made, mentioned below. Definitely check the repo's README; it's fairly detailed.
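
To give a flavor of the EmoKey approach: EmoKey translates detected EEG events into keystrokes, so the Java side mostly just needs a focused window with a key listener. The sketch below is a hypothetical reconstruction, and the Drone class and key bindings are stand-ins for whatever the repo actually wires up:

```java
import java.awt.event.KeyAdapter;
import java.awt.event.KeyEvent;
import javax.swing.JFrame;

// Hypothetical sketch of the EmoKey approach: EmoKey (from the Emotiv SDK)
// turns detected EEG events into keystrokes, so the Java side only needs a
// focused window with a key listener. Drone.* are stand-ins for real calls.
public class EmoKeyDroneControl {
    public static void main(String[] args) {
        JFrame frame = new JFrame("EmoKey target window - keep focused");
        frame.addKeyListener(new KeyAdapter() {
            @Override
            public void keyPressed(KeyEvent e) {
                switch (e.getKeyChar()) {
                    case 't': Drone.takeOff(); break; // e.g. EmoKey maps "smirk" -> 't'
                    case 's': Drone.spin();    break; // e.g. "push" -> 's'
                    case 'l': Drone.land();    break; // e.g. "raise brow" -> 'l'
                }
            }
        });
        frame.setSize(300, 100);
        frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
        frame.setVisible(true);
    }

    // Placeholder for whatever drone library the real project calls into.
    static class Drone {
        static void takeOff() { System.out.println("take off"); }
        static void spin()    { System.out.println("spin"); }
        static void land()    { System.out.println("land"); }
    }
}
```

The appeal of EmoKey is that you never touch the SDK directly; the tradeoff is that everything gets squeezed through keystrokes, which is part of what the improved, API-based version gets around.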

Emotiv-EncogML

This was one of the more interesting (and frustrating) parts of my thesis. The logic Emotiv uses to classify facial expressions, moods, and mental commands is closed source, and I wanted to see if I could improve on it with machine learning. In this project I grab raw EEG sensor data and use both artificial neural networks (ANNs) and support vector machines (SVMs) to try to classify the data.
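
For a sense of what the ANN half looks like, here's a minimal Encog sketch. It assumes each training row is one sample across the EPOC's 14 channels with a made-up two-class, one-hot label; the actual features, labels, and training setup in the repo differ, and the SVM side is left out here:

```java
import org.encog.Encog;
import org.encog.engine.network.activation.ActivationSigmoid;
import org.encog.ml.data.MLDataSet;
import org.encog.ml.data.basic.BasicMLDataSet;
import org.encog.neural.networks.BasicNetwork;
import org.encog.neural.networks.layers.BasicLayer;
import org.encog.neural.networks.training.propagation.resilient.ResilientPropagation;

// Hypothetical sketch: classify EEG samples with an Encog feedforward ANN.
// 14 inputs = one value per EPOC electrode; 2 outputs = two made-up classes
// (e.g. "neutral" vs. "push"). Real features and labels come from recordings.
public class EegAnnSketch {
    public static void main(String[] args) {
        double[][] input = new double[100][14]; // placeholder feature rows
        double[][] ideal = new double[100][2];  // placeholder one-hot labels
        // ... fill input/ideal from recorded raw EEG data here ...

        BasicNetwork network = new BasicNetwork();
        network.addLayer(new BasicLayer(null, true, 14));                    // input layer
        network.addLayer(new BasicLayer(new ActivationSigmoid(), true, 10)); // hidden layer
        network.addLayer(new BasicLayer(new ActivationSigmoid(), false, 2)); // output layer
        network.getStructure().finalizeStructure();
        network.reset();

        MLDataSet trainingSet = new BasicMLDataSet(input, ideal);
        ResilientPropagation train = new ResilientPropagation(network, trainingSet);
        int epoch = 0;
        do {
            train.iteration();
            epoch++;
        } while (train.getError() > 0.01 && epoch < 5000);
        train.finishTraining();

        System.out.println("Final training error: " + train.getError());
        Encog.getInstance().shutdown();
    }
}
```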

Emotiv-JSON-API

This is the final mind-controlled project I worked on. It was an integral part of both the final version of my mind-controlled drone project and my team's mind-controlled video game project for senior design. It is an API for receiving Emotiv EPOC EEG events in JSON format over TCP sockets. Be sure to read the README for this one as well.
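
Here's a hedged sketch of what a consumer of the API could look like: connect to the TCP socket and read one JSON event per line. The host, port, and example event shape are placeholders; the README documents the actual protocol:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.net.Socket;

// Hypothetical sketch of the consumer side: connect to the API's TCP socket
// and read one JSON event per line. Host, port, and the event format shown
// in the comment are placeholders; see the repo's README for the real ones.
public class EmotivEventClient {
    public static void main(String[] args) throws IOException {
        try (Socket socket = new Socket("localhost", 4444);
             BufferedReader in = new BufferedReader(
                     new InputStreamReader(socket.getInputStream()))) {
            String line;
            while ((line = in.readLine()) != null) {
                // e.g. {"type":"mentalCommand","action":"push","power":0.83}
                System.out.println("EEG event: " + line);
            }
        }
    }
}
```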

The Tech

Headsets

Below are the headsets I've used. I had the best results with the EPOC/EPOC+, since I found their saline-soaked felt pads picked up signals more reliably than the semi-dry polymer sensors used by the Insight. The pads dry out over time, but they've usually lasted through a day of demos, whether at UConn's open houses or the annual Invention Convention.


These are 2 of the 3 Emotiv EEGs I ended up buying over the course of my college career. The first image is the EPOC, my first EEG headset. It looks near-identical to the EPOC+, my second EEG headset. Both have 14 saline-soaked felt pads that detect electrical signals. The next image is my third EEG headset, the Insight, which can do the same using semi-dry polymer sensors.

Drone

I only used one drone for this project: the A.R. Drone Power Edition. I opted for it because of the longer battery life, which came in very handy during a day of demos.

The Demo

After laboring away at the code for long hours hunting down socket bugs, I finally got a working prototype in early 2015. I was able to demo it at UConn's spring open house to get prospective students interested in the field, especially as a student who'd had no programming background before college. It was such a success that from then on I participated in every open house up to graduation.

In the above video, I control a drone with mental commands and facial expressions:

  • smirk -> lift off
  • push -> spin
  • raise brow -> land

Presentations and Papers

Thesis: "A General Protocol and Application Programming Interface for Wireless Electroencephalographic Communication Systems"

This was my University Scholar thesis. Below is the summary used in my thesis idea submission:

Electroencephalographs (EEGs) are input devices available as commercial wireless headsets that are capable of picking up signals associated with facial expressions, emotions or even focused thoughts. These could be used by physically impaired people (e.g., paraplegics or victims of ALS) to control prostheses or wheelchairs, restore mobility, and increase independence. Yet, high-level and reusable protocols to program and control these EEGs are seriously lacking and impeding the development of sophisticated modular control applications. The objective of this project is to fill this void and demonstrate the flexibility of such protocols and APIs through a mind-controlled UAV project.

You can read more of the proposal here or the full thesis paper here on Overleaf.

Wireless Electroencephalographic Device Incorporation Into Gameplay: Senior Design Project


Me controlling a video game that utilizes my Emotiv-JSON-API with an EEG and a Kinect

This was my senior design (capstone) project as a student in the UConn School of Engineering. My team and I focused on developing a new paradigm of gaming in which a user plays a video game with their mind, with the aid of a wireless EEG. The gameplay is meant to be intuitive and button-free: the game reads the user's movements, focused thoughts, and mood through a Kinect One and an Emotiv EPOC+ to interact with the game world.

Read the write-up here on Overleaf or see the project poster here.

Wrap-up

I dedicated a large chunk of my project time in college to working on mind-controlled stuff. Not only is it a super fun talking point, both in general and in interviews, but I was able to make a real impact with it. I managed to inspire kids to learn programming and got high school students excited about earning a degree in computer science. The fact that new CS freshmen recognized me later on as the "drone girl" meant I left a lasting impact on them and possibly even convinced some of them to try out the major. Plus it got me noticed by Google recruiters, which surprisingly ended up panning out for me. But I'll have to save that story for another post. Until next time.
