
Duration: May 2019 - Sept 2019

Purpose: Project conducted at the UAV Lab at the Indian Institute of Science, Bangalore as part of the Wipro-IISc Research and Innovation Network.

Skills: Robot Operating System (ROS), ArUco Markers, Control Systems, Kalman Filtering, Offboard Control

 

This project was my introduction to UAVs.

Similar research has been carried out in various parts of the world with significantly better results. Given the limitations we faced, however (no setup for safe testing, limited funding, and our own inexperience combined with limited technical guidance), this project was a personal success that taught me a great deal about the inner workings of autonomous drone systems.


Approach


There are many approaches to landing on a target, including GPS-based waypoint navigation, infrared, laser, and radio beacons. All of these have their limitations, however. GPS-based navigation is prone to errors of about 2 to 5 m, which does not bode well for precision landing. Infrared performs poorly in direct sunlight, and lasers are very power hungry, with enough latency to make them inaccurate on a fast-moving vehicle such as a drone.

This is where vision comes in. Cameras offer great potential at a very low cost compared to the other options, and their latency is limited only by the processing power of the onboard computer. With the advent of powerful development boards and lightweight GPUs, this latency is negligible for most tasks, making vision-based navigation a very attractive prospect.


Challenges


Four key challenges specific to landing on a target were identified:

  1. The target must be correctly identified and continuously tracked by the UAV for the duration of the landing procedure.

  2. As the drone comes in for landing and its height decreases, the camera's field of view shrinks, increasing the risk of losing the target.

  3. With outdoor landing, wind plays an influential role, forcing any system to be very robust to variations.

  4. The drone must be equipped with a camera, an onboard microprocessor and a higher capacity battery capable of powering both the processor and the drone. Any additional weight makes motions less precise due to the added factor of inertia.


Hardware Setup


The setup uses a 3DR Iris drone with a Pixhawk 2 Cube Black flight controller. A downward-facing Logitech USB camera is mounted at the front of the drone for monocular vision and connected to an onboard Odroid XU4, which carries out all the processing. The drone also carries a wireless router for communication with a ground station for monitoring. The whole system is powered by a single four-cell 6000 mAh lithium polymer battery that provides about 10 minutes of flight time. Including the battery, the total weight comes to about 2 kg.


Detection


The target is differentiated and tracked using AprilTag bundles. A series of bundles of descending sizes allows the target to be tracked even as the height decreases. The drone follows the target using a velocity-based PID controller that returns the desired drone velocity based on the distance to the target.
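As a minimal sketch of such a velocity controller, the snippet below maps the marker's position error to a clamped velocity setpoint. The gains, output limit, and time step are illustrative assumptions, not the values used in the project:

```python
class PID:
    """Simple PID controller returning a velocity setpoint from position error."""

    def __init__(self, kp, ki, kd, limit):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.limit = limit              # clamp output to the vehicle's speed limit
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt):
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        out = self.kp * error + self.ki * self.integral + self.kd * derivative
        return max(-self.limit, min(self.limit, out))


# One controller per horizontal axis; the error is the target's offset (m)
# from the drone, as estimated from the detected tag pose.
pid_x = PID(kp=0.8, ki=0.02, kd=0.15, limit=3.0)   # gains are illustrative
vx = pid_x.update(error=1.2, dt=0.05)              # desired forward velocity (m/s)
```

In an offboard-control setup, the resulting setpoint would be published as a velocity command to the flight controller at each camera frame.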



Decision Flow


Any autonomous system is defined by a good decision flow that dictates what the system should do in every situation. For the landing, a three-stage procedure was used that ensures the target is tracked at all times as the drone descends, so that it does not miss the target.
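A staged flow of this kind can be sketched as a small state machine. The stage names, the reacquisition behaviour on target loss, and the height threshold below are assumptions for illustration, not the project's actual stages:

```python
from enum import Enum, auto


class Stage(Enum):
    APPROACH = auto()   # fly toward the target at cruise altitude
    DESCEND = auto()    # reduce height while keeping the target centred
    LAND = auto()       # commit to touchdown once low enough


def next_stage(stage, target_visible, height, centred):
    """Return the next landing stage (thresholds are illustrative)."""
    if not target_visible:
        return Stage.APPROACH            # climb back and reacquire; never descend blind
    if stage is Stage.APPROACH and centred:
        return Stage.DESCEND
    if stage is Stage.DESCEND and height < 0.3:
        return Stage.LAND
    return stage


s = next_stage(Stage.APPROACH, target_visible=True, height=5.0, centred=True)
```

The key property this structure enforces is the one described above: the drone only ever moves toward landing while the target is in view, and falls back to a safe stage the moment tracking is lost.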



Outcomes


The drone is able to track targets moving at up to 3 m/s and has successfully landed on a target moving linearly at 1.5 m/s. However, PID control is not the optimal solution: it performs poorly under system non-linearities such as wind or sudden changes in direction, and becomes unstable at higher speeds. A fuzzy-logic-based controller would improve accuracy, especially when paired with a Kalman filter for better state estimation.
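To illustrate the Kalman filtering idea, the sketch below runs a one-dimensional constant-velocity filter on position measurements of the kind a tag detector produces. All noise values and the frame rate are assumptions; a real implementation would track both horizontal axes:

```python
import numpy as np

# 1-D constant-velocity Kalman filter for target state estimation.
# State x = [position, velocity]; measurement z = position from the tag detector.
dt = 0.05                                  # assumed camera frame interval (s)
F = np.array([[1.0, dt], [0.0, 1.0]])      # constant-velocity state transition
H = np.array([[1.0, 0.0]])                 # only position is measured
Q = np.diag([1e-4, 1e-2])                  # process noise (assumed)
R = np.array([[0.05]])                     # measurement noise (assumed)

x = np.zeros((2, 1))                       # initial state estimate
P = np.eye(2)                              # initial covariance


def kf_step(x, P, z):
    """One predict + update cycle; returns the new estimate and covariance."""
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update
    y = z - H @ x                          # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P


# Feed position measurements of a target moving at a constant 1.5 m/s.
for k in range(100):
    z = np.array([[1.5 * k * dt]])
    x, P = kf_step(x, P, z)
# x[1] now approximates the target's velocity.
```

The benefit for landing is that the filter supplies a smoothed velocity estimate even between detections, which a controller can use to lead the target rather than chase its last observed position.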
