Leap2Arduino2Tx – Controlling a Drone Using the Leap Motion

Arvid Bräne - June 1st, 2015 - #Code, #Multirotors, #Open Source, #Studies

During the first half of 2015 I spent a lot of time on a school project I called Leap2Arduino2Tx (easy name to remember, right?) in a course called Digital Vision at Umeå University. The goal of the project was to be able to control a multirotor (or a drone, whatever you want to call it) by only using gestures from one hand. It might sound like something out of Minority Report, but the solution is actually quite simple!

Another goal of the project was to do this without any hardware modifications to the existing transmitter (in my case a Turnigy 9X). The reason for this is mostly that I want people with very little prior experience with electronics to be able to recreate this project, but also because I’m a student with a limited budget. In this article I will cover a basic overview of the general parts of my solution; if you want to check out the code yourself you are more than welcome! Below is a short video demonstrating how the controls work:

Solution

My solution consists of six parts: a hand, a Leap Motion, a computer (preferably a Mac), an Arduino (preferably an Arduino Uno or better), a transmitter (preferably a Turnigy 9X) and a drone of your choice; see the image below.

Parts

The Leap Motion analyses the hand and sends data over USB to a computer, which crunches the numbers and sends them over USB to an Arduino. The Arduino converts the numbers into a PPM pulse that is transferred, via a cable plugged into the trainer port, to the transmitter, which in turn sends the commands to the drone over 2.4 GHz. Simple!

This solution might look complicated, but it’s really just a long chain of converting numbers to other numbers; let’s break it down:

  1. The user moves his/her hand (like in the video above) above the Leap Motion.
  2. The Leap Motion tracks the user’s hand using a depth sensor and figures out what motion the user is making with his/her hand, more than 60 times per second.
  3. The Leap Motion sends this data over USB to the computer, where a Python program constantly reads it and translates it into the same kind of channel values that the transmitter transmits.
  4. The computer then sends these numbers (as CSV) via USB to an Arduino, which reads them and converts them into a (digital) PPM signal; see the sketch after this list for what these two steps could look like.
  5. Using one of its digital output pins, the Arduino then sends this signal to the transmitter via the 3.5 mm trainer port on the back of the transmitter.
  6. The transmitter (which is put into buddy-box mode) reads this signal (around every 22 ms) and sends the commands to the drone via the standard 2.4 GHz band.
  7. The receiver on the drone takes these numbers and converts them to a PWM signal (if you are using PWM, of course), which the control chip on your drone then interprets.
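To make steps 2–4 concrete, here is a minimal sketch of what the host-side loop could look like, assuming the Leap Motion SDK’s Python bindings and pyserial. The axis choices, the 1000–2000 µs channel range, the helper names and the serial port are my own illustrative guesses, not necessarily what the code on GitHub does:

    # Minimal sketch of the host-side loop (steps 2-4). Assumes the Leap
    # Motion SDK's Python bindings and pyserial; channel ranges, axis
    # choices and the serial port are illustrative guesses.
    import time
    import serial  # pyserial
    import Leap    # Leap Motion SDK

    def scale(value, lo, hi):
        """Clamp value to [lo, hi] and map it onto a 1000-2000 us channel."""
        value = max(lo, min(hi, value))
        return int(1000 + 1000 * (value - lo) / (hi - lo))

    def hand_to_channels(hand):
        """Translate the hand pose into throttle/roll/pitch/yaw values."""
        throttle = scale(hand.palm_position.y, 100.0, 400.0)  # height above device
        roll     = scale(hand.palm_normal.roll, -0.5, 0.5)    # tilt sideways
        pitch    = scale(hand.direction.pitch, -0.5, 0.5)     # tilt forwards/backwards
        yaw      = scale(hand.direction.yaw, -0.5, 0.5)       # rotation
        return throttle, roll, pitch, yaw

    controller = Leap.Controller()
    arduino = serial.Serial('/dev/tty.usbmodem1411', 115200)  # hypothetical port

    while True:
        frame = controller.frame()
        if not frame.hands.is_empty:
            # Step 4: one CSV line per update, e.g. "1500,1500,1500,1500"
            line = ','.join(str(c) for c in hand_to_channels(frame.hands[0]))
            arduino.write(line + '\n')
        time.sleep(0.02)  # roughly the transmitter's ~22 ms frame rate

The 1000–2000 µs range mirrors what RC transmitters typically expect per channel, which is why the translation in step 3 is little more than clamping and scaling.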

Below is a short video showing the necessary setup steps to get Leap2Arduino2Tx to work; detailed instructions can be found on the GitHub page.

Drawbacks

Since this is more of an MVP than a finished product it has some limitations and drawbacks. Here are a few:

  • The drone I used to test this is a small 250 mm racing drone designed to go fast. It has no GPS, which means it can’t hold its position by itself, making it hard to control. The size of the quadrocopter is also an issue since it’s very responsive in its movements and easily influenced by the wind. This could be solved by building the solution for a more capable drone, such as a DJI Phantom.
  • The Arduino is limited by its USB bus; it’s very hard to send large amounts of data and do precise PPM conversions in real time. This could be solved by excluding the Arduino, doing the PPM conversion on the computer itself (something I didn’t have time to look into during the course), and sending the signal via USB directly to the transmitter. The sketch after this list shows what a single PPM frame looks like.
  • I chose Python as the language for three reasons: I had some basic prior experience with it, it would be easy to make a GUI to show the stick movements, and the Leap SDK was available in Python. Since Python is a high-level programming language it requires a lot of resources (resources my five-year-old MacBook Pro didn’t really have), especially when using the Leap SDK, which slows everything down.
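To give an idea of what the PPM conversion in the second point involves, here is an illustrative sketch of how four channel values pack into one frame. The 300 µs separator pulse and the 22.5 ms frame length are typical trainer-port conventions and an assumption on my part, not values measured on the Turnigy 9X:

    # Illustrative only: packing four RC channel values (in microseconds)
    # into one PPM frame. Frame length and separator width are typical
    # conventions, not measured from the Turnigy 9X trainer port.
    FRAME_US = 22500  # total frame length, roughly the 22 ms mentioned above
    SEP_US = 300      # fixed separator pulse in front of every channel

    def ppm_frame(channels):
        """Return one frame as a list of (level, duration in us) pairs."""
        pulses = []
        used = 0
        for ch in channels:
            pulses.append(('low', SEP_US))        # separator pulse
            pulses.append(('high', ch - SEP_US))  # separator + gap = channel value
            used += ch
        pulses.append(('low', SEP_US))            # final separator
        pulses.append(('high', FRAME_US - used - SEP_US))  # sync gap
        return pulses

    # All four sticks centred at 1500 us:
    for level, us in ppm_frame([1500, 1500, 1500, 1500]):
        print('%-4s %5d us' % (level, us))

Everything not used by the channels becomes the long sync gap at the end of the frame, which is how the transmitter knows where one frame ends and the next begins.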

Conclusion

My solution is definitely not something I would use in its current state (see the drawbacks above), but with that being said, I think it has huge potential as a proof of concept for others to build upon.

During the project I emailed some large drone companies (3D Robotics, DJI, etc.) to see if they wanted to help me in my research by giving or lending me a drone (I also included a link to the GitHub page so they could see what I was up to). Unfortunately, none of these companies replied, so I had to continue the work with my small racing drone. About a month after I sent these emails, DJI released this video of the company using a Leap Motion to control one of their drones. I’m not accusing them of stealing my idea (which, by the way, has kind of been done before), I just find it interesting that they decided to do it just now. The Leap Motion has been out for a couple of years and DJI has been making drones for even longer, so why the sudden interest?

If you feel like this is something you would try to replicate or build upon, you have my permission and blessing to check out the project on GitHub. If you want to dive even deeper into this project you might want to check out the Keynote or the Project Report.

Macintosh 128K

May 2013 - #Code, #Open Source, #University, #Web, #Webdesign

Idag & Inatt is often referred to as “the most important event of the year” by the students of Interaction and Design, so when Albin Hübsch and I were put in charge of the event website we knew we had to deliver. Since the theme was set to “Retro” we decided to look at some old tech to get inspired. After a lot of research we decided on using the iconic design of the Macintosh 128K from 1984. I was also in charge of creating the logotype for the event, and after consulting the rest of the PR group we decided on using an old cassette tape (more on that here), which was a perfect match to the old Mac.

The “Hardware”

Making a Macintosh in Photoshop and then putting an “interactive iframe” on top is as simple as it is ineffective, so we agreed on making it a bit harder: no images, only CSS & HTML. The Mac chassis (and the floppy disk) is created using over 100 divs, all styled, rotated and positioned to give the user a realistic feel. For the boot-up process, some simple JavaScript generates a couple of printouts and then pushes the floppy disk into the slot.

The “Software”

After the computer has been turned on and finished its startup procedures, the “application” enters fullscreen mode. The user is then presented with a graphical user interface almost identical to that of the old Macs, all built on web technologies (such as jQuery UI, CSS, HTML, etc.). Yes, we “had” to use some images here. The possibilities here are pretty much endless, but we chose to only include a Registration “application”, a Nomination “application”, and an Asteroids game that we found here.

Finally

At the moment the code is quite cluttered and uncommented, but we have plans to release it on GitHub in the near future, structured and commented. For now, the project can be found, and tested(!), here.