The 28th International Aerial Robotics Competition, Kunming, China, August 2019.
The International Aerial Robotics Competition (IARC), organized by the Association for Unmanned Vehicle Systems International (AUVSI), is the world's longest-running collegiate aerial robotics challenge. Marking its 33rd anniversary in 2024, the IARC advances the state of the art by posing mission challenges deemed "impossible" at the time of their introduction, the latest being the 10th mission.
With a history of pushing technological boundaries, the competition continues to challenge top engineering students globally to develop next-generation aerial robots. Numerous teams have demonstrated systems that garnered significant industry and government interest, leading to sponsorship and commercial opportunities.
The detailed rules for the 2019 IARC Mission 8 can be found in the Official Rules for the International Aerial Robotics Competition Mission 8, and for more information about the IARC you can visit the official website of the AUVSI Foundation International Aerial Robotics Competition.
The mission environment is an indoor field of 28 x 15 meters, containing three bunkers about 2 meters high and four sentry drones that can damage the humans' helmets by laser scanning. The goal of the mission is for multiple cooperating drones to search the boxes, splice the QR code fragments, identify the password, use the password to obtain the target component from the box, and treat the human, all without being eliminated by the sentries, and to finish within 8 minutes.
Compared with previous missions, Mission 8 introduces several new elements. For the first time, a human is brought into the mission, emphasizing human-machine collaboration and non-electronic human-machine interaction. Second, multiple drones are used to perform multi-target tasks for the first time, highlighting technologies such as multi-UAV collaboration and multi-sensor information fusion. Furthermore, our personnel and drones operate for the first time in a hostile environment of sentinel robots with a clear tendency to attack and interfere, which challenges the cooperative ability of the whole system. In addition, the mission environment is indoors, so the drones lack global positioning information.
By analyzing the goals, requirements and difficulties of the mission, we identified the following problems to solve:
1. Target assignment for multiple UAVs in a room without global positioning information: how to allocate targets, and how to ensure that the UAVs remain safe and do not collide with each other while searching and tracking.
2. The human-aircraft interaction method under the non-electronic interaction requirement: how to design the interaction mode and instruction set so that human and machine can interact efficiently.
3. Stability of the communication link under severe indoor wireless interference: how to keep the human-aircraft interaction stable so that information flows reliably in both directions.
4. Efficiency of human-machine collaboration under enemy sentinel interference and a limited task time: how to improve the collaboration efficiency of the human-machine team so that the task can be completed quickly enough.
To solve the problems above, we built a human-in-the-loop multi-UAV collaboration system covering the hardware, the software, the communication links, and the human-machine cooperation strategy. The picture below shows the overall framework of the system for two aircraft, including the composition of the hardware system and the design of the communication links.
In the diagram, dotted arrows denote wireless communication and solid arrows denote wired communication.
To improve the efficiency of team cooperation, the operator must be able to communicate with multiple aircraft and issue instructions while still avoiding threats and observing the environment, so we chose to pass instructions to the machines through voice signals.
To cope with radio-frequency interference in the mission environment, we chose 2.4 GHz Wi-Fi for communication between processors that are close to each other, and DJI's 5.8 GHz OcuSync for communication between the personnel's handheld device and the remote aircraft.
For communication between the remote laser system and the personnel's handheld device, we use a 433 MHz wireless data link: the server sends the switch command through its USB port to the 433 MHz module of the airborne laser system to control the laser.
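As a minimal sketch of this path, the snippet below frames a laser on/off command and writes it to the ground-side 433 MHz module over a USB serial port via pyserial. The byte framing, port name, and baud rate are our illustrative assumptions; the actual protocol used in the competition system is not documented here.

```python
# Hypothetical one-byte-payload framing for the laser switch command;
# header/footer bytes and the payload values are illustrative assumptions.
CMD_LASER_ON = b"\xa5\x01\x5a"
CMD_LASER_OFF = b"\xa5\x00\x5a"

def build_switch_command(on: bool) -> bytes:
    """Frame the switch command carried over the 433 MHz data link."""
    return CMD_LASER_ON if on else CMD_LASER_OFF

def send_switch(port: str, on: bool, baud: int = 9600) -> None:
    """Write the framed command to the ground-side 433 MHz module
    attached to the server's USB port (e.g. /dev/ttyUSB0)."""
    import serial  # pyserial; assumed installed on the handheld server
    with serial.Serial(port, baudrate=baud, timeout=1.0) as link:
        link.write(build_switch_command(on))
```

On the airborne side, the paired 433 MHz module would forward the received bytes to the laser controller, which toggles the laser accordingly.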
We developed Android applications for both the client and the server, including the UI and various function modules. The software architecture is shown below.
The server program consists mainly of a Main activity containing a UI thread, a Wi-Fi listener thread, connection threads, a voice-recognition timer module, and a single-thread QR-code decoding pool. The UI thread handles the display, interaction and logic of the interactive interface, while the Wi-Fi listener thread monitors client connection requests and establishes a connection thread for each client. The voice-recognition module continuously outputs recognized voice commands and passes them to the connection threads, which distribute the instructions to the clients; when a connection thread receives a key picture (a QR-code fragment) from a client, it notifies the QR-code decoding thread to decode it.
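The threading structure described above can be sketched as follows. This is an illustrative Python model, not the Android code: connection threads register an outgoing queue per client, recognized voice commands are fanned out to every client, and QR-code fragments are decoded on a single-worker pool so they are processed strictly in arrival order. All class and method names here are ours.

```python
import queue
import threading
from concurrent.futures import ThreadPoolExecutor

class ServerHub:
    """Sketch of the server side: per-client outgoing queues,
    voice-command fan-out, and a single-thread QR decode pool."""

    def __init__(self, decode_fn):
        self._clients = {}            # client id -> outgoing command queue
        self._lock = threading.Lock()
        # max_workers=1 mirrors the "single-thread pool": fragments
        # are decoded one at a time, in the order they arrive.
        self._decoder = ThreadPoolExecutor(max_workers=1)
        self._decode_fn = decode_fn
        self.fragments = []           # decoded QR-code fragments so far

    def register(self, client_id):
        """Called by the listener thread when a client connects."""
        q = queue.Queue()
        with self._lock:
            self._clients[client_id] = q
        return q

    def distribute(self, voice_command):
        """Called by the voice-recognition timer with each result;
        every connected client receives the command."""
        with self._lock:
            for q in self._clients.values():
                q.put(voice_command)

    def on_fragment(self, image):
        """Called by a connection thread when a key picture arrives."""
        return self._decoder.submit(self._decode, image)

    def _decode(self, image):
        result = self._decode_fn(image)
        self.fragments.append(result)
        return result
```

A connection thread would then drain its client's queue and forward each command over the socket, while the UI thread observes `fragments` for display.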
The Main activity of the client application consists of four modules: the UI thread, the Control thread, the Detection thread, and the Wi-Fi connection thread. The UI thread displays the aircraft's status information and its first-person video feed; the Control thread sends control instructions to the aircraft; the Detection thread outputs detection and position results for targets such as boxes and QR codes; and the Wi-Fi connection thread receives the voice instructions sent from the server and sends key information, such as QR-code fragments, back to the server for decoding.
To fully exploit and combine the strengths of humans and machines and improve task-execution efficiency through human-machine cooperation, we designed a human-machine combination strategy based on hierarchical finite state machines (HFSM). Voice commands serve as the transition conditions of the upper-layer states, and each upper-layer state runs a pre-defined automatic state machine and closed-loop control algorithm, exploiting the machine's efficiency at executing specific tasks. In addition, to handle unexpected situations, some voice commands also serve as transition conditions for the lower-layer states, so that human intervention and control can be exercised at the level of drone motion as well.
We designed three levels of instructions: basic commands, motion-control commands, and task-level commands. The set of human voice commands used in the task is shown in the following figure.
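A minimal sketch of the two-layer strategy above: task-level voice commands drive transitions between upper-layer states (each of which would run its own automatic sub-machine and closed-loop controller), while motion-control commands bypass the upper layer and intervene directly at the motion level. The state names and command words here are illustrative, not the actual command set.

```python
class HFSM:
    """Two-layer state machine sketch. Upper layer: task states driven
    by task-level voice commands. Lower layer: motion overrides driven
    by motion-control voice commands (for unexpected situations)."""

    # (current upper state, task-level command) -> next upper state
    TASK_TRANSITIONS = {
        ("IDLE", "search"): "SEARCH",
        ("SEARCH", "track"): "TRACK",
        ("TRACK", "return"): "RETURN",
    }
    # Motion-control commands act on the lower layer directly.
    MOTION_COMMANDS = {"up", "down", "left", "right", "stop"}

    def __init__(self):
        self.upper = "IDLE"         # current upper-layer task state
        self.motion_override = None  # active lower-layer intervention

    def on_voice(self, command):
        if command in self.MOTION_COMMANDS:
            # Human intervenes at the drone-motion level.
            self.motion_override = command
        else:
            nxt = self.TASK_TRANSITIONS.get((self.upper, command))
            if nxt is not None:
                self.upper = nxt
                # A new task state resumes autonomous control.
                self.motion_override = None
```

Inside each upper-layer state, the pre-defined automatic state machine would step on its own; `motion_override`, when set, takes precedence over its motion outputs.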
Before the aircraft track their targets, each aircraft must be assigned one. The pseudocode of the target-allocation algorithm is given in Algorithm 1. During allocation, each machine first selects the fittest target in its field of view. Because most practical flight platforms can only avoid obstacles in the forward and backward directions, lateral movement is costly and risky while the aircraft are moving, especially when several aircraft move together in an environment without global positioning information. Therefore, for each aircraft, the target closer to the center of its field of view is easier to track and should be selected. The operator then sends a voice command to confirm the choice and trigger tracking, or adjusts the aircraft's position and yaw angle to steer its field of view toward a more appropriate target. Through this process, we solved the problem of multi-aircraft target assignment without global positioning information, with a human in the loop.
Below is the pseudocode of the algorithm.
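The core selection rule can be sketched as follows; this is our reading of the centering criterion described above, not a verbatim transcription of Algorithm 1, and the detection-box format is an assumption.

```python
def select_target(detections, image_width):
    """Pick the detection whose horizontal center is nearest the image
    center. Lateral movement is costly without global positioning, so
    the most centered target is the cheapest and safest to track.

    detections -- list of (target_id, x_min, x_max) boxes in pixels
                  (hypothetical format); returns the chosen target id,
                  or None when nothing is in view.
    """
    if not detections:
        return None
    cx = image_width / 2.0
    best = min(detections, key=lambda d: abs((d[1] + d[2]) / 2.0 - cx))
    return best[0]
```

The operator's voice confirmation then either accepts this candidate and triggers tracking, or the operator re-aims the aircraft so a different target becomes the most centered one.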
Below are system function demonstration clips from the competition site and the debugging process.
After nearly a year of preparation, including initial team formation, recruitment and training, the main system design and verification, hardware and software development and integration, and the final debugging stage, we designed and implemented a human-drone collaborative multi-UAV system for IARC Mission 8 from scratch.
In August 2019, we participated in the Asia-Pacific regional competition at the Beihang University Yunnan Innovation Institute in Kunming, China, where we successfully completed the Mission 8 challenge and won the Third Prize (rankings were determined by task completion time).
First and foremost, I extend my deepest gratitude to my advisor, Prof. HE Fenghua, for her trust and mentorship, and for granting me the invaluable opportunity to serve as team captain for the Mission 8 challenge. My sincere thanks also go to senior members HAO Ning and YAO Haodi for their comprehensive support and guidance throughout the competition.
Second, I wish to thank my teammates NIU Yinbao, LIN Zhaochen, and LYU Zibo for their seamless collaboration and tremendous dedication during the event. As my roommates and fellow graduate students, they also have my equal gratitude for their unwavering support and companionship throughout our postgraduate journey.
Lastly, I express profound appreciation to junior teammates XU Qinzhe, CUI Bohan, DENG Tianchen, and others for their wholehearted commitment to the competition.