The U-TRACKR system provides real-time navigation for UAVs operating within a defined area. The key objective is to locate the position of a UAV in an indoor setting, where GPS is unreliable, using only image sequences from a small set of fixed cameras. Ultimately, the system will track and model the trajectory of autonomous UAVs using image processing and photogrammetry. The team has implemented time synchronization between the cameras, space resection, and position calculation to obtain the coordinates of the UAV.
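As a rough illustration of the position-calculation step (a generic least-squares ray intersection, not necessarily the team's exact space resection and intersection algorithm), each camera defines a ray from its center through the detected object, and the 3D position is the point closest to all rays. A minimal sketch with NumPy, assuming camera centers and ray directions are already known from calibration:

```python
import numpy as np

def intersect_rays(centers, directions):
    """Least-squares 3D point closest to a set of camera rays.

    centers: (N, 3) camera centers; directions: (N, 3) ray directions.
    Solves sum_i (I - d_i d_i^T)(x - c_i) = 0 for x, where each
    (I - d_i d_i^T) projects onto the plane orthogonal to ray i.
    """
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for c, d in zip(np.asarray(centers, float), np.asarray(directions, float)):
        d = d / np.linalg.norm(d)          # unit direction
        P = np.eye(3) - np.outer(d, d)     # projector orthogonal to the ray
        A += P
        b += P @ c
    return np.linalg.solve(A, b)

# Two cameras at frame corners; both rays pass through the point (1, 1, 1):
p = intersect_rays(
    centers=[[0, 0, 0], [2, 0, 0]],
    directions=[[1, 1, 1], [-1, 1, 1]],
)  # → approximately [1., 1., 1.]
```

With four cameras, the same solve simply accumulates four projector terms, which also averages out small detection errors in any single view.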
The system consists of four inexpensive Raspberry Pi camera modules fixed at the corners of a stable metal frame. The camera angles were chosen so that the entire area of the frame is covered. A laptop with sufficient processing power performs the real-time calculations, manages synchronization between the four Raspberry Pi camera modules, and runs the OpenCV software for object detection with minimal delay.
About the Team

The Designer
The Animator
The Brains
The Programmer
For outdoor applications, UAVs are hindered by flying regulations and noise pollution control. However, UAVs have great potential in many indoor applications. The U-TRACKR system improves the efficiency and cost-effectiveness of UAVs by providing positioning technology and indoor navigation. Our system can be used in warehouse applications, where UAVs have the advantages of a flexible flight path, automation, and the ability to enter environments that are dangerous to human life. The U-TRACKR system addresses the need for employee safety, increased productivity, and reduced company expenses allocated to damaged UAVs. The U-TRACKR can also be used to track autonomous ground vehicles in shipping warehouses. For example, Amazon uses Kiva robots to support its Prime service, which promises delivery of items within two days. However, these industrial robots come at a cost, and warehouses have to be designed around them: floors need custom grids so that the robots can move, and workers are at risk of being hit if they are in the way. The U-TRACKR system provides a solution to these flaws and enables trajectory control in indoor, GPS-denied spaces for these otherwise blind robots.
The scope of application can also be expanded to animals in captivity. Currently, zoos use techniques such as attaching acoustic tags or GPS tags to animals to carry out research, capture movements, and study behavioral patterns. These techniques are invasive and may not work in indoor areas with GPS limitations. This project minimizes human interaction and disturbances, and provides a non-invasive solution for the study of animal lifestyle and behavior. The U-TRACKR can also be used to compare the popularity of indoor exhibits and stores. Some museums and art galleries presently use IoT-based indoor location tracking, with visitor heat maps, to determine and compare the popularity of exhibitions. The U-TRACKR can provide real-time data collection on human behavior trends by using the TensorFlow machine learning library.
The software deliverables for the Primary Processing System for U-TRACKR consist of three components: Python, OpenCV, and the U-TRACKR Codebase.
This project uses five Python libraries: NumPy, SymPy, imutils, picamera[array], and paramiko. NumPy is used to define the HSV array values for object tracking with OpenCV. SymPy is used for position calculation, specifically in the space resection and intersection algorithm. The imutils library resizes the video stream from each camera output to an appropriate size. The picamera[array] and paramiko libraries are used to interface between the primary processing system and each Raspberry Pi Zero.
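To show how NumPy-defined HSV bounds drive the object detection, the sketch below reproduces the thresholding that `cv2.inRange` would perform in the OpenCV pipeline, using NumPy alone. The HSV bounds here are hypothetical placeholders, not the project's tuned values:

```python
import numpy as np

# Hypothetical HSV bounds for a brightly coloured marker on the UAV;
# real values would be tuned to the marker and lighting conditions.
LOWER_HSV = np.array([35, 80, 60], dtype=np.uint8)
UPPER_HSV = np.array([85, 255, 255], dtype=np.uint8)

def hsv_mask(hsv_frame):
    """Binary mask of pixels falling inside [LOWER_HSV, UPPER_HSV].

    Same logic as cv2.inRange(hsv_frame, LOWER_HSV, UPPER_HSV), scaled
    to 0/1. hsv_frame: (H, W, 3) uint8 array in HSV colour space.
    """
    in_range = (hsv_frame >= LOWER_HSV) & (hsv_frame <= UPPER_HSV)
    return in_range.all(axis=-1).astype(np.uint8)

# Toy 1x2 "frame": first pixel inside the bounds, second outside.
frame = np.array([[[60, 200, 200], [0, 0, 0]]], dtype=np.uint8)
mask = hsv_mask(frame)  # → array([[1, 0]], dtype=uint8)
```

In the real pipeline the mask would be computed per frame on the imutils-resized stream, and the marker's pixel centroid fed into the position calculation.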
We would like to give a special thanks to:
Supervisor, Dr. Costas Armenakis
Graduate Student, Erin Du
Industry Advisor, Lui Tai
Professors, Franz Newland, James Smith, and Hossam Sadek