Tuesday, December 24, 2013

Missed target, found hooliganism on drone mapping mission

A small unmanned aircraft system for mapping, developed for a National Science Foundation grant to improve STEM education.

In north Louisiana, there’s a fantastic little place known for Muscadine grapes, pecans, and on occasion, alligators.

Muscadines always get turned into jelly, an excellent topping for southern-style biscuits (my wife says she’s still working on mastering the family biscuit recipe, but she’s produced the finest biscuits I’ve ever had the pleasure of eating).

Pecans, a grocery bag of which could fetch a gold bar by northern exchange rates, are so plentiful that they must be given away. They’re so much a part of my wife’s heritage that we decided to get married beneath those pecan trees.

And when the water is high in the backyard bayou, a small alligator sometimes will make itself at home. If the alligator is lovingly cared for and becomes fat, a portion of that alligator eventually will make its way to our dinner table in the form of a delightful alligator sauce piquant (wife insists that it be cooked like shrimp, and not burnt to a crisp as it’s done here in the North).

So before we went down to visit my wife’s family farm for Thanksgiving, I decided to pack up one of the “drones” I’ve developed at the National Science Foundation grant where I work, to see if I couldn’t map out this interesting place.



Why map?

Aerial photo of flooding in Colorado by Falcon UAV.

First, a little background on drone mapping. Since early 2012, I’ve been working with high school students in Illinois to build small unmanned aircraft capable of mapping out natural habitats, farms, and interesting bits of geology and geography. We’ve come a long way since our beginning, but some development challenges remain.

Governments have used unmanned aircraft systems, more commonly known as drones, to make measurements and gain important information during the Fukushima nuclear crisis. While it hasn’t been done just yet, drones also could be used by journalists to measure the extent of natural and man-made disasters, and to compare those measurements with government reports (or produce the numbers where governments can’t or won’t).

Mapping drones also could help citizens on the ground understand what is happening to their communities, such as when Falcon UAV deployed a system during the fall 2013 flooding in Colorado.

About the drone.

The autopilot, which was configured to trigger a hacked camera, was tested on the workbench before being installed in the aircraft.


Regardless of whether the application is research, STEM education, or journalism, a drone used for mapping doesn’t have to be complex. It just has to hold a camera in the air. Everything else is designed to make sure the camera is in the right place at the right time.

The drone I brought to Louisiana began life as an electric, balsa-frame RC airplane with a 1.7-meter wingspan. What turns it into a mapping machine, however, is a GPS-guided, Arduino-based, sensor-packed autopilot linked to a hacked point-and-shoot camera.

Million-dollar mapping airplanes use much, much better sensors, but don’t let this consumer-grade camera fool you. When you’re flying at 100 meters above the ground, your ground sample distance can rival the best of them.

An earlier mosaic of the test site produced by the same plane, with a different camera.





This “Turkey flight” wasn’t the system’s maiden voyage. In mid-2012, I flew a very similar setup to create an aerial mosaic of the test site. Due to a faulty GPS and camera lens distortion, that mission didn’t yield much in the way of geospatial data – i.e., a model of the ground in which to accurately measure features or changes.

Even with those setbacks, however, the experience provided valuable “stick time” and critical lessons on planning and executing a drone mapping mission.


The drone's camera was installed using a hook-and-loop strap, and a standard camera bolt.

Besides a new GPS and camera, there were only a few improvements made to this unmanned aircraft. As previously mentioned on this website, I designed and laser-cut a quick-release module to house the autopilot, GPS receiver, and the lithium polymer battery (with the help of the CU Community Fab Lab). All equipment was thoroughly tested on the ground before the flight.

Mission planning: how slow can you go?


Perhaps the most critical variable in planning a drone mapping mission is the stall speed of the aircraft. Below this speed, the lift generated by the airplane’s wings becomes less than the aircraft’s weight, and thus the airplane begins to fall.

V = √( 2 W g / (ρ S Clmax) )

V = velocity in m/s
W = mass in kg
g = acceleration due to gravity (9.8 m/s²)
ρ = air density in kg/m³
S = wing area in m²
Clmax = coefficient of lift at stall

Because things like air density, mass, angle of attack, and wing characteristics all affect the stall speed, the calculation can be somewhat involved. Aviation.co.uk has a great manual if you'd like to dig into specifics (pdf). But if you’re willing to make some assumptions, the calculation can be simplified.

V = 3.7 √(W/S)

V = velocity in mi/h
W = weight in oz
S = wing area in ft²

Using the super-accurate method of stepping on the bathroom scale with the aircraft, I found that my drone probably weighed less than I had expected – about 6.2 pounds, or 99.2 ounces. Using that number, and the area of the wing, I found that 16.5 miles per hour (or 7.4 meters per second) was probably the slowest I could fly without having the plane drop from the sky.
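The rule-of-thumb calculation is short enough to script. Here’s a sketch of the simplified formula above; the weight is the number from the bathroom scale, but the wing area is my back-calculated assumption (about 5 ft²), not a stated spec for this airframe.

```python
import math

# Weight from the bathroom-scale method; wing area is an ASSUMPTION (~5 ft^2),
# back-calculated rather than measured.
weight_oz = 99.2
wing_area_ft2 = 5.0

# Simplified model-aviation rule of thumb: V_stall [mph] = 3.7 * sqrt(wing loading in oz/ft^2)
v_stall_mph = 3.7 * math.sqrt(weight_oz / wing_area_ft2)
v_stall_ms = v_stall_mph * 0.44704  # mph -> m/s

print(f"stall speed ≈ {v_stall_mph:.1f} mph ({v_stall_ms:.1f} m/s)")
```

With those inputs, the script lands on the same roughly 16.5 mph (7.4 m/s) floor.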

The stall speed determines most of the other critical parameters of a mapping mission. GIS experts use as many overlapping 2D images as possible to generate an accurate 3D model of terrain. So the slower you can fly, the more images you can take, and the better your model will be.

When flying at a higher altitude, each image will cover more land. However, that comes at the expense of ground sample distance (GSD): the resulting map will not be as detailed as one created from images obtained at a lower altitude. Mission planning always is a compromise between speed, area, and precision.
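The altitude-versus-detail trade-off follows from simple pinhole-camera geometry. The camera numbers below are assumptions for a generic 1/2.3" point-and-shoot, not the actual specs of the camera on this drone; only the 100-meter altitude comes from the mission.

```python
# Rough ground-sample-distance estimate from pinhole geometry.
# Camera values are ASSUMPTIONS for a typical small-sensor point-and-shoot.
sensor_width_mm = 6.17   # assumed sensor width
focal_length_mm = 5.0    # assumed true focal length
image_width_px = 4000    # assumed horizontal resolution
altitude_m = 100.0       # cruising altitude from the mission plan

# GSD = sensor_width * altitude / (focal_length * image_width)
gsd_cm = (sensor_width_mm * altitude_m * 100.0) / (focal_length_mm * image_width_px)
print(f"GSD ≈ {gsd_cm:.1f} cm/pixel")  # halve the altitude and you halve the GSD
```

A few centimeters per pixel at 100 meters is exactly why a cheap camera can compete with satellite imagery for small sites.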

Mission planning: flight planning and configuration.

A screenshot of the mission planner, showing the flight plan for the Turkey mapping missions. The large circles around these waypoints, which have radii of 91 meters, denote the area the drone would need to fly within to consider the waypoint reached. It would prove to be a challenge for the mapping drone.

Coming up with a lawnmower-style flight pattern previously required doing some trigonometry (determine the image footprint for a given altitude, factor in overlap, find the time delay between images and the width of each swath). Fortunately, there’s now a feature in the autopilot configuration app that does those calculations automatically.

After you specify your drone’s camera model and cruising altitude, the app will create a flight plan with the appropriate width between waypoints, and will configure the autopilot to trigger the camera at the necessary distances.
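The swath-and-overlap arithmetic the app automates can be sketched in a few lines. The sensor and focal-length values here are the same generic point-and-shoot assumptions as before; interestingly, with a 50% overlap they land close to the 61.7-meter trigger spacing the planner produced for this mission.

```python
# Sketch of the "lawnmower" spacing math a mission planner performs.
# Camera values are ASSUMPTIONS; only the 100 m altitude comes from the mission.
sensor_width_mm = 6.17
focal_length_mm = 5.0
altitude_m = 100.0
overlap = 0.5  # assumed 50% overlap between consecutive frames

footprint_m = sensor_width_mm / focal_length_mm * altitude_m  # ground width of one image
trigger_spacing_m = footprint_m * (1.0 - overlap)             # distance between shutter triggers

print(f"image footprint ≈ {footprint_m:.1f} m, trigger every {trigger_spacing_m:.1f} m")
```

Raise the overlap and the trigger spacing shrinks, which means more images per pass and a slower minimum ground speed for the camera to keep up.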

This feature is nice, but there’s still much more to consider in planning a drone mapping mission, and some of it might only come from experience. For example, what altitude are you comfortable flying at? Too high, and the drone will become difficult to see (and thus difficult to control in the event of a malfunction) or will exceed the 400-foot, FAA-mandated ceiling. Too low, and you could clip a tree.

Additionally, greater distance means mapping a greater area, but again, you risk going beyond visual line of sight. You might also exceed the range for your ground control station (in this case, your radio controller), but fortunately the autopilot has a failsafe feature that will make the drone return and circle above the “home” location when the control link is lost (when configured correctly). And generally speaking, most consumer or hobby-grade radios have a range that goes beyond visual line of sight.

For this mission, I set my cruising altitude at 100 meters, and my farthest waypoint about 170 meters from myself (“home”). Based on the camera and altitude, the mission planner’s automatic mapping application configured the autopilot to take photos every 61.7 meters. Read on, and you’ll find out if those were the best mission parameters.

Executing the mission.

One type of visualization produced from the flight data. When the track color changes, it indicates that the flight mode has been switched from manual to automatic, or automatic to manual.

On Thanksgiving day, after a couple of flights to test all the aircraft’s systems, I made a run above the creek at the back of the property.  To better understand what the airplane was doing, I extracted the .LOG flight data from the autopilot, and converted it into two different geospatial files: a .KMZ and a .GPX file.

I opened both files in Google Earth, and then let GE play the .GPX file in real time on top of the .KMZ data. Below is a visualization of that flight. The grey walls represent the autopilot’s programmed waypoints and flight path.
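If you’d rather script the inspection step than click through Google Earth, a .GPX track is just XML, readable with nothing but the standard library. This is a minimal sketch using a made-up two-point track; a real file converted from the flight log would carry thousands of points.

```python
import xml.etree.ElementTree as ET

# A tiny, MADE-UP GPX track standing in for a real flight-log export.
gpx = """<?xml version="1.0"?>
<gpx xmlns="http://www.topografix.com/GPX/1/1" version="1.1" creator="example">
  <trk><trkseg>
    <trkpt lat="32.500" lon="-92.100"><ele>30.0</ele></trkpt>
    <trkpt lat="32.501" lon="-92.101"><ele>80.0</ele></trkpt>
  </trkseg></trk>
</gpx>"""

# GPX 1.1 elements live in a default namespace, so queries must be qualified.
ns = {"gpx": "http://www.topografix.com/GPX/1/1"}
root = ET.fromstring(gpx)
track = [(float(p.get("lat")), float(p.get("lon")), float(p.find("gpx:ele", ns).text))
         for p in root.findall(".//gpx:trkpt", ns)]
print(track)
```

From a list of (lat, lon, elevation) tuples like this, it’s easy to spot things like the sub-30-meter altitude drop described below.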



Once automatic flight was engaged, the plane clipped the first waypoint, and made a wide turn for the second and third, missing the programmed flight path. From the ground, it appeared as if the plane was flying too low and might clip a tree (flight data showed the plane dropped below 30 meters). I aborted automatic flight, swung the plane over the field, and climbed to about 80 meters before trying the run again.

I didn’t know this at the time, but the autopilot will not restart a mission from the beginning unless the power is cycled. In other words, when you flip back into automatic flight, the plane will head to the next waypoint on its list.

So when I turned automatic flight on for the second time, the plane made another wide turn over the field and headed for waypoint 4, instead of restarting the mission and aiming for waypoint 1. This was not obvious from the ground.

After waypoint 4, the plane gently banked to hit waypoint 5. At this point, the plane was out of waypoints, and was programmed to “loiter” in a circular path above the location where it first obtained a GPS lock (“home”).

It was very clear from the ground that the plane was done with its mission, because the plane was orbiting above my head. I briefly flipped into manual, and back into automatic flight, in an attempt to get the mission to restart (which as previously mentioned, doesn’t work).

After the flip, however, the plane’s wide loiter orbit made it appear from the ground as if it were still attempting to map the bayou. When the plane eventually returned to circle above my head, I concluded the mission was complete, switched into manual control, and brought the aircraft in for landing.

Post-flight analysis.

Near-vertical aerial photo taken of the bayou at the property's edge, on Thanksgiving day 2013. A rather large rainstorm swept through the area two days before, so the bayou's water level is slightly higher than average.

Something seemed off when I perused the camera’s SD card. I only had 33 photos, and no two photos seemed to have any overlapping features, which would be necessary to generate a 3D model of the ground.

This aerial photo was taken quite close to the treetops. Too close for comfort, perhaps. However it also is one of the highest-resolution pictures I have of the little bayou.


The autopilot stores the GPS time, latitude, longitude, and airplane orientation each time it triggers the camera, so it wasn’t hard to find out what happened. I opened the flight data in Excel, filtered the camera trigger data, and found that the autopilot attempted to trigger the camera 66 times during the mission.

Due to the CHDK macro on the camera, and the autopilot’s triggering method, the camera was only taking an image on every other trigger.
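The filtering I did in Excel can also be done in a couple of lines of code. The log lines below are a simplified, made-up stand-in for the autopilot’s actual log format; the point is just separating camera-trigger messages from everything else and comparing the count against the photos on the card.

```python
# MADE-UP, simplified log lines; a real autopilot log carries more fields per message.
log_lines = [
    "GPS, 163512, 32.500, -92.100, 101.2",
    "CAM, 163513, 32.500, -92.100, 100.8",
    "ATT, 163513, 2.1, -0.4, 181.0",
    "CAM, 163515, 32.501, -92.101, 100.5",
]

# Keep only camera-trigger messages, then compare against the card's photo count.
triggers = [line for line in log_lines if line.startswith("CAM")]
print(f"{len(triggers)} camera triggers logged")
```

On the real flight this comparison was 66 triggers to 33 photos, a clean two-to-one ratio that pointed straight at the every-other-trigger behavior.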

Near the center-left of this image, covered in the shade of the treeline, are the family-famous Muscadine vines.


I ran the images through mosaicking and ground modeling programs, but none of those attempts yielded any usable mosaics or ground models.

An image over the highway, at the south-east end of the property. Near the center-right of the image, up and to the right of the utility pole, a brown track reveals a spot where water washes over the highway and down into the property.

Salvaging data and uncovering shenanigans.


It was a little disheartening to go through all that work and not end up with any of the “good stuff.” But then I remembered Mapknitter, a website which allows kite and balloon photographers to manually stretch and stitch their aerial images into a mosaic over satellite photos.

Below is a Mapknitter project with some of my drone’s images, stitched over existing Google Earth satellite photos. Not all images could be salvaged, especially oblique images taken when the plane was banked. Some images of neighbors’ properties also were omitted.





This method doesn’t produce accurate geospatial data, but it’s a way to salvage some of the mapping drone’s data.  These mosaics still can tell an interesting story about the ground, and in this case, something interesting was uncovered on the southeast end of the property.

This image obtained from the mapping drone is evidence of some tomfoolery happening at the far end of the property.

See the elliptical ruts in the dirt? Those aren’t natural features, but rather evidence of some pickup-truck driving hooligans making donuts on someone else’s land.

My wife’s family was not surprised.

The big takeaways.


It probably was foolish to assume that a universal autopilot, which could be configured fairly easily to automatically fly anything from octocopters to foam wings, wouldn’t need some fine tuning to suit my particular aircraft.

This small unmanned aircraft system worked very well in many respects. Running its default settings, the autopilot didn’t wildly oscillate the aircraft into a stall while navigating between waypoints; its documentation is quite clear that those defaults are purposefully conservative. Having said that, the turning and waypoint radii need to be reduced, and the shutter interval needs to be halved.

While I overestimated the ability of the autopilot to adapt to this particular plane and mapping mission, I underestimated the ability of the plane to handle the job.

I marked off an area of about 7,030 square meters to map, but the drone easily could have mapped 125,000 square meters (almost 30 acres) or more. Keeping the aircraft confined to such a small space made it difficult to accurately navigate through waypoints.

If I had to produce a list of three big takeaways from this experience, it would look like the following:

  1. Each drone is a unique and special snowflake. The autopilot might be "open source" and "universal," but for each system, it must be tested and tuned well before deployment.
  2. Drones are far more capable than we are. For many applications, the limit of a drone's practical ability is no longer battery time, sensors, or optics, but rather regulations and the frailty of the human operator. I could have flown this aircraft well above the 400-foot regulatory limit, and the 2-3 mile range of the 72 MHz radio equipment could have carried the plane well beyond my visual line of sight (VLOS). Underestimating the ability of the aircraft means difficulty in executing small-scale mapping missions.
  3. Even botched missions can produce interesting data. Couldn't get the mapping data you were hoping for? Sometimes it's not all about 3D models and measurements. The ground changes all the time, and commercial satellite photos are low-resolution and few and far between. At the very least, the operator will gain valuable experience with the unmanned aircraft.

Since returning from the Turkey Flight, high school students here in the central Illinois Drones for Schools STEM program have examined the data and used it to inform their own mapping drone development.

It probably will be another six months before I have another opportunity to do any mapping in Louisiana, but there are plenty of interesting places to map here.

This flight was conducted in accordance with regulations prescribed in Federal Aviation Administration advisory circular (AC) 91-57, by an insured, card-holding member of the Academy of Model Aeronautics, on private land with the permission of the property-holder.

2 comments:

  1. Matthew, your article is quite interesting. I'm edging into similar projects myself with quadcopters but don't yet have waypoint flying capability; it's all by the seat of my pants, both simple LOS and some FPV. But, your experience and conclusions give me a lot to think about. Keep up the good work!

    Carl

    1. I'm glad you enjoyed the post, Carl. Learning how to fly a quadcopter without an autopilot as an aid is a pretty important thing to do. If the GPS were to ever go out on such a system, only the pilot's wits would likely prevent an accident.
