There is an ongoing project at LiU that aims to help park guards in Ngulia, Kenya, to protect the rhinoceroses in their reserve. With the help of drones, the tasks of detecting poachers and monitoring the animals can be simplified. This project aimed to produce software that lets a drone autonomously search an area, detect any rhinos in the area and keep track of them. The result of the project can be seen in the video below.
The general communication is handled by ROS (Robot Operating System), which manages simultaneous computations in different modules. The computer and the drone communicate via WiFi. The image processing uses the video stream from the drone camera to detect targets by color and to estimate their coordinates; the computer vision library OpenCV is used for this. These estimates are then used to track the animals. Tracking is performed with a Kalman filter that handles multiple targets. When the image processing has detected a target, its estimated coordinates are updated based on a constant position model.
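The constant position model can be sketched as a minimal 2D Kalman filter, one per target: the dynamics are the identity, so a target's covariance grows while it is unobserved and shrinks on each camera detection. The class name and the noise values below are illustrative assumptions, not the project's actual implementation.

```python
import numpy as np

class ConstantPositionKF:
    """Minimal 2D constant-position Kalman filter for one target.

    The state is the target's (x, y) position. The motion model is the
    identity with additive process noise, so the position uncertainty
    grows between detections and shrinks on each measurement update.
    """

    def __init__(self, x0, P0=10.0, q=0.5, r=1.0):
        self.x = np.asarray(x0, dtype=float)   # position estimate
        self.P = np.eye(2) * P0                # estimate covariance
        self.Q = np.eye(2) * q                 # process noise
        self.R = np.eye(2) * r                 # measurement noise

    def predict(self):
        # Constant position model: the estimate stays, covariance grows.
        self.P = self.P + self.Q

    def update(self, z):
        # Standard Kalman update with H = I (position measured directly).
        z = np.asarray(z, dtype=float)
        S = self.P + self.R                    # innovation covariance
        K = self.P @ np.linalg.inv(S)          # Kalman gain
        self.x = self.x + K @ (z - self.x)
        self.P = (np.eye(2) - K) @ self.P
```

In a multi-target setting each filter instance would be fed only the detections associated with its target, with new filters spawned for unmatched detections.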
The operator can mark an area on a map for the drone to search for animals. The planning algorithm computes an initial nearest neighbour tour and then improves it with a Lin-Kernighan heuristic, solving a Traveling Salesman Problem over the scan points to obtain full scanning coverage and detect possible targets within the area.
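The planning steps can be sketched as follows, assuming the scan points are given as 2D coordinates: a greedy nearest neighbour tour, followed here by plain 2-opt edge exchanges as a simple stand-in for the full Lin-Kernighan heuristic.

```python
import math

def nearest_neighbour_tour(points, start=0):
    """Greedy initial TSP tour over the scan waypoints.

    points: list of (x, y) waypoints covering the search area.
    Returns a list of indices, always continuing to the closest
    unvisited point.
    """
    unvisited = set(range(len(points)))
    unvisited.remove(start)
    tour = [start]
    while unvisited:
        last = points[tour[-1]]
        nxt = min(unvisited, key=lambda i: math.dist(last, points[i]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

def two_opt(points, tour):
    """Improve a tour with 2-opt edge exchanges, the simplest member of
    the Lin-Kernighan family: reverse a segment whenever doing so
    shortens the closed tour. Repeats until no exchange helps."""
    improved = True
    while improved:
        improved = False
        n = len(tour)
        for i in range(1, n - 1):
            for j in range(i + 1, n):
                a, b = points[tour[i - 1]], points[tour[i]]
                c, d = points[tour[j]], points[tour[(j + 1) % n]]
                if (math.dist(a, c) + math.dist(b, d)
                        < math.dist(a, b) + math.dist(c, d) - 1e-12):
                    tour[i:j + 1] = reversed(tour[i:j + 1])
                    improved = True
    return tour
```

The full Lin-Kernighan heuristic generalises this by chaining variable-depth edge exchanges, but the structure (build a cheap tour, then repair crossings) is the same.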
When the drone has performed an area search, it has hopefully found a few targets. During target tracking it works to minimize the uncertainty of each target's estimated position by minimizing a cost function. The cost function takes into account the uncertainty of all targets together with the distance from the drone to each target.
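The trade-off such a cost function encodes could look roughly like the sketch below. The functional form, the weight and the helper name are illustrative assumptions (the text does not specify them); the idea is only that a large covariance trace makes a target cheaper to visit, while distance makes it more expensive.

```python
import math

def choose_next_target(drone_pos, targets, w_dist=0.1):
    """Pick the target whose visit best trades uncertainty against travel.

    targets: list of (target_id, position, trace_of_covariance) tuples.
    The weight w_dist and the linear form are illustrative assumptions:
    a high covariance trace (uncertain target) lowers the cost of
    visiting a target, while the flight distance raises it.
    """
    def cost(t):
        _, pos, trace_P = t
        return w_dist * math.dist(drone_pos, pos) - trace_P
    return min(targets, key=cost)[0]
```

Revisiting the chosen target resets its covariance via the Kalman update, after which another target becomes the most urgent, so the drone naturally cycles between the animals.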
If a target has been lost, which occurs when the uncertainty of its position grows beyond a set threshold, the operator can choose to perform target finding. Target finding follows an Archimedean spiral out from a starting point chosen by the operator. The drone then circles this point in a widening spiral and will hopefully find the lost target.
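Waypoints for such a search might be generated as in the sketch below, assuming an Archimedean spiral r = bθ whose arm spacing would be matched to the camera footprint so no ground is skipped between turns. All parameter names and values are illustrative.

```python
import math

def spiral_waypoints(center, spacing, n_points, dtheta=0.3):
    """Waypoints along an Archimedean spiral r = (spacing / 2*pi) * theta.

    center:  (x, y) starting point chosen by the operator;
    spacing: radial distance between successive spiral arms;
    dtheta:  angular step (radians) between consecutive waypoints.
    """
    b = spacing / (2.0 * math.pi)   # radial growth per radian
    pts = []
    for k in range(n_points):
        theta = k * dtheta
        r = b * theta
        pts.append((center[0] + r * math.cos(theta),
                    center[1] + r * math.sin(theta)))
    return pts
```

Each full turn moves the drone exactly `spacing` further from the center, so the covered annuli tile the search region outward from the last known position.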
The project was first implemented in a simulation environment using Gazebo. In the simulation the targets were simplified as red spheres moving randomly across a flat ground plane. The drone was modelled as a generic drone with a downward-facing camera, a WiFi transmitter and a GPS receiver.
When all requirements were fulfilled in simulation, the code was deployed on a physical drone, an AR Drone 2.0, which has the same features as the drone used in the simulation. The only difference was that instead of GPS, the indoor positioning system in Visionen at LiU was used for positioning. This system yields a more accurate position than GPS.
Below you can find the links to the documents that were written during the course.
|Project leader: Henrik Lillberg, Mail: firstname.lastname@example.org|
|Software: Daniel Arnström, Mail: email@example.com|
|Integration: Adam Bergenhem, Mail: firstname.lastname@example.org|
|Hardware: Joakim Ekström, Mail: email@example.com|
|Design: Tim Fornell, Mail: firstname.lastname@example.org|
|Documentation: Jacob Holmberg, Mail: email@example.com|
|Deliveries: Gustav Magnusson, Mail: firstname.lastname@example.org|
|Tests: Peter Mrad, Mail: email@example.com|
|Information: Johan Svensson, Mail: firstname.lastname@example.org|
The project is part of the course TSRT10 - CDIO Project Course in Automatic Control, carried out at Linköping University during the autumn term of 2017. In addition to the project group, the following people assisted during the course:
|Client: Christian A. Naesseth, Linköpings Universitet|
|Telephone: +46 13 281087, Mail: email@example.com|
|Customer: Gustaf Hendeby, Linköpings Universitet|
|Telephone: +46 13 285815, Mail: firstname.lastname@example.org|
|Examiner: Daniel Axehill, Linköpings Universitet|
|Telephone: +46 13 284042, Mail: email@example.com|
|Supervisor: Fredrik Ljungberg, Linköpings Universitet|