One of the toys that the Computing Club has to play with is an AR.Drone 1.0. This is a pre-built WiFi-enabled quadrocopter manufactured by Parrot. There are official iOS and Android applications for remotely controlling the quadrocopter. The AR.Drone also streams a live video feed from its onboard camera to the controller. Flying the drone around from an app is fun enough, but where things get really interesting for the Computing Club is programming it to do things! Over the last few months undergraduates have been tinkering with the drone, making it do various things using the open-source javadrone API.

Kirill Sidorov and I, organisers of the Computing Club this academic year, were asked to prepare a demo for an upcoming School of Computer Science & Informatics Open Day. The aim of these open days is to enthuse A-Level students who are considering studying Computer Science. We needed something that was interactive and fun, but also allowed us to highlight some of the concepts of computer science and what makes it interesting. We decided on a motion-tracking AR.Drone demo: we'd use the on-board camera to have the drone follow an individual holding a target. There's some neat computer science here – control and computer vision in particular – and it also demonstrates the power of using software to program real-world devices. It also meant we could build on the work done by Computing Club students and bring them in to chat to visitors at the Open Day.

Conveniently, a few days before the first Open Day (17 April) was the two-day "Open Sauce" Hackathon. Kirill and I were attending anyway to help with the student-organised event, so we took advantage of the fruitful combination of hackathon ambience, energy drinks, and free food to build the demo over those two days. The repository is hosted on GitHub. The original output from the Hackathon is in this branch (warning: gnarled, hackathon-quality code). This was tweaked and (slightly) refactored over the following days in preparation for the Open Day, resulting in this.

Building the AR.Drone demo at the 2013 "Open Sauce" hackathon.

The target we used during the hackathon was a ping-pong paddle wrapped in an A4 sheet of paper coloured with pink highlighter. In hindsight, the lighting conditions of the venue were very consistent, making it a favourable test environment. Kirill prototyped frame-by-frame video processing in MATLAB to extract the target, and then translated it to native Java. I handled the interaction with the AR.Drone and the control loop. We also implemented a fairly crude but useful GUI to view the raw and processed image streams, debug some control parameters, and initiate take-off and landing (typically an emergency one). The javadrone API made controlling the drone straightforward, and even allowed us to implement some nifty features like changing the drone's LED colours when the target is lost.
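The per-frame detection boils down to a colour threshold followed by a centroid computation. A minimal Java sketch of the idea – the class name, pixel-count cut-off, and "pink" thresholds here are illustrative, not our actual code:

```java
import java.awt.Color;
import java.awt.image.BufferedImage;

// Illustrative per-frame colour-threshold detector for a pink target.
public class TargetDetector {
    // Returns {centroidX, centroidY, extent} where extent is the matching
    // pixel count, or null if the target appears to be lost.
    public static double[] detect(BufferedImage frame) {
        long sumX = 0, sumY = 0, count = 0;
        float[] hsb = new float[3];
        for (int y = 0; y < frame.getHeight(); y++) {
            for (int x = 0; x < frame.getWidth(); x++) {
                int rgb = frame.getRGB(x, y);
                Color.RGBtoHSB((rgb >> 16) & 0xFF, (rgb >> 8) & 0xFF, rgb & 0xFF, hsb);
                // "Pink highlighter": hue near magenta, reasonably saturated and bright.
                boolean pink = hsb[0] > 0.85f && hsb[1] > 0.4f && hsb[2] > 0.5f;
                if (pink) { sumX += x; sumY += y; count++; }
            }
        }
        if (count < 30) return null;  // too few pixels: treat the target as lost
        return new double[] { (double) sumX / count, (double) sumY / count, count };
    }
}
```

The returned extent doubles as a crude distance estimate: the closer the target, the more pixels it covers.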

The image component outputs the location (a pixel coordinate) and extent (a measure proportional to the target's size in view) of the target in the camera's view. This information drives our three control variables:

  1. Forward/back tilt for moving forwards and backwards to maintain a particular distance from the target.
  2. Left/right rotation to keep the target horizontally centred.
  3. Vertical ascent/descent to keep the camera and target at the same height.
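A simple proportional mapping from the detector's output to these three variables might look like the following sketch. The gains, set-points, and sign conventions are made up for illustration; the clamping reflects the drone's progressive commands, which are floats in roughly the [-1, 1] range:

```java
// Illustrative proportional mapping from target observation to control commands.
public class ProportionalController {
    static final double CENTRE_X = 160.0, CENTRE_Y = 120.0; // 320x240 stream
    static final double TARGET_EXTENT = 400.0;  // desired apparent size (pixels)
    static final double K_PITCH = 0.002, K_YAW = 0.004, K_GAZ = 0.005;

    // Returns {pitch, yaw, gaz}: forward/back tilt, left/right rotation,
    // and vertical speed, each clamped to [-1, 1].
    public static double[] command(double targetX, double targetY, double extent) {
        double pitch = clamp(K_PITCH * (extent - TARGET_EXTENT)); // too close => back off
        double yaw   = clamp(K_YAW   * (targetX - CENTRE_X));     // re-centre horizontally
        double gaz   = clamp(K_GAZ   * (CENTRE_Y - targetY));     // climb if target sits high
        return new double[] { pitch, yaw, gaz };
    }

    static double clamp(double v) { return Math.max(-1.0, Math.min(1.0, v)); }
}
```

With the target dead-centre and at the desired extent, all three commands are zero and the drone hovers.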

We didn't have much time to fully explore the handling of the drone with respect to these control variables, but experimenting with a few simple linear controllers and a PID or two resulted in decent tracking, as undergraduate George Sale demonstrates in this video:
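For reference, the "PID or two" is just the textbook controller; a minimal Java version (placeholder gains, not the values we used):

```java
// Textbook PID controller: output = kp*e + ki*integral(e) + kd*de/dt.
public class Pid {
    private final double kp, ki, kd;
    private double integral = 0.0, prevError = Double.NaN;

    public Pid(double kp, double ki, double kd) {
        this.kp = kp; this.ki = ki; this.kd = kd;
    }

    // error = setpoint - measurement; dt = seconds since the last update.
    public double update(double error, double dt) {
        integral += error * dt;
        // No derivative term on the very first update (no previous sample).
        double derivative = Double.isNaN(prevError) ? 0.0 : (error - prevError) / dt;
        prevError = error;
        return kp * error + ki * integral + kd * derivative;
    }
}
```

One such controller per control variable, fed the error between the target's observed and desired position/extent each frame, is all the "control loop" amounts to.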

(As shown in the video, as well as this other one, pretty much every flight ended up with a haywire drone and me initiating a forced landing.)

That was the hackathon; the Open Day proved much more challenging. In our hackathon experiments, the specificity of our target detection was excellent. Specificity was our primary concern, since a false-positive target detection puts bystanders wearing unfortunately coloured clothing on the receiving end of multi-bladed drone fury. The Open Day venue had very uneven lighting, with patchy artificial lights and a large window in one corner that would temporarily flood the camera depending on the drone's angle. This caused the colour profile of the paddle to change drastically with the drone's position and angle and the target's location.

To deal with this, our first trick was to change the target. Significant variation in light reflection between dimly lit and brightly lit areas meant large changes in the target's brightness and hue. By switching to a backlit target we could ensure fairly consistent brightness, irrespective of ambient light. Using a bike light, a home-made filter (highlighted A4 paper), a diffuser (coffee filter paper), and a filter assembly (polystyrene cup), we hacked together the following target:

(Yes, we effectively built a cheap Playstation Move controller.)

The resulting target had a very consistent and distinct appearance. After this there were just a few camera-related issues to tackle; in particular:

  • Although the camera resolution is 640x480, the drone only streams 320x240 back to the laptop. Nothing much to say here, except that it's surprising (802.11g is capable of the bandwidth and latency) and inconvenient.
  • Either the camera hardware or drone firmware was doing some unwanted brightness auto-adjustment which we had to un-adjust back on the laptop.
  • The lens quality is poor. We had to discard everything outside a centre 320px-wide circle to cull corner artefacts.
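The last point is cheap to implement: blank every pixel outside a centred circle before running detection. A sketch of the idea ("320px-wide" means a radius of 160 on the 320x240 stream; the method name is ours, not from the demo code):

```java
import java.awt.image.BufferedImage;

// Discard lens corner artefacts by keeping only a centred circular region.
public class LensMask {
    // Blacks out every pixel outside a centred circle of the given radius.
    public static void maskOutsideCircle(BufferedImage frame, int radius) {
        int cx = frame.getWidth() / 2, cy = frame.getHeight() / 2;
        for (int y = 0; y < frame.getHeight(); y++) {
            for (int x = 0; x < frame.getWidth(); x++) {
                int dx = x - cx, dy = y - cy;
                if (dx * dx + dy * dy > radius * radius) {
                    frame.setRGB(x, y, 0);  // black never matches the pink threshold
                }
            }
        }
    }
}
```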

And then, finally, we were left with a superb signal and a negligible false-positive rate.

Target triumph! Left panel: raw stream. Right panel: processed video stream; red pixels and white circle indicate detected target.

The control still needs a lot of work, but the drone flies and reacts well. It's enjoyable watching people have a go at it. Initially people are very tentative. This is unsurprising; the drone's forward/back lunging can be vicious at first (although it usually stabilises before quite reaching the volunteer). After a few goes, they're eventually able to start taking it on tours around the demo area, almost like walking a dog; albeit a dog that is noisier, worse behaved, and hovering in mid-air.

2013 "Open Sauce" Hackathon Round-up

Posted by mattjw in Uncategorized

Last weekend's "Open Sauce" Hackathon was a big success. In addition to the funding I mentioned in my previous post, GitHub also got in touch a day before the event to bolster each prize category with one-year bronze and silver accounts.

There's a write-up and more photos at the CSCF website, so please navigate there for more information. I'll also maintain a list of other individuals' posts below.

Hackathon 2013 group photo.

While at the event, Kirill Sidorov and I, Computer Club co-organisers, also took the opportunity to write the software for a motion-tracking quadrocopter demo we'd been asked to prepare for the School's upcoming Open Day. Write-up to follow on this blog.

Thanks to all the judges, undergraduate organisers, sponsors, and attendees for making it a great event!

Elsewhere:

Open Sauce Hackathon 2013

Posted by mattjw in Uncategorized

Last year I attended the inaugural School of Computer Science & Informatics "Open Sauce" Hackathon as a participant. It was a hugely successful event, and good fun to work with Mark and Chris in building Motion Kitty Pi, a prototype Spotify home music streaming service for Raspberry Pi (with motion-triggered playback!). Not only was the event a success, it was superbly organised by undergraduates in the School's Computer Club. Click here for a report on last year's event.

The 2012 hackathon.

Now, as a lecturer and co-runner of the Computer Club, I get to assist the undergraduates in organising this year's Hackathon, and it's shaping up to be even better than last year's! They've done an excellent job of organising and promoting the event, with over 40 attendees already registered. Among these are undergraduate students from Cardiff University and other institutions, PhD students, staff members, and local professionals.

As with last year, the School is supporting the event with facilities and a contribution to the prize fund. What makes this year even more impressive is the amount of external sponsorship the students have secured. Box UK are very kindly providing the ever-important food and (energy) drinks for the two-day event, and a total of £500-worth of prizes is being contributed by Linode, DigiStump, and eysys. On top of that, John Greenaway (Cardiff University Information Services), Richard Gaywood, Stuart Allen (Cardiff University School of CS&I), and Humphrey Sheil (eysys) will be on-hand to judge the final projects.

So: free food, free drink, big prizes, and, importantly, building something cool with friends. What more could you want in an event? Get more information or sign up if you haven't! And well done to Joe, Henry, Geraint, James, and all the organisers!