VIO data from the camera. The detection process is facilitated by TensorFlow Lite running a pre-trained deep learning model, specifically a convolutional neural network (CNN). This type of model is designed to recognize patterns across various media, including video, audio, and text [55, 56].
Upon detecting an AprilTag, the camera places five reference
points: four at the corners and one in the center. These points
are then used to calculate the position of the tag relative to the
drone using trigonometric functions, enabling precise localiza-
tion and navigation.
Utilizing this data for relocalizing the drone involves more
complexity than simply inputting Cartesian coordinates (XYZ).
In autopilot mode, the drone does not navigate using XYZ
coordinates; instead, it relies on a different orientation
convention known as Euler angles, or RPY (roll, pitch, yaw). The
conversion between these two systems is feasible by relating
the Euler angle rates to the body's angular velocity. However,
a challenge arises because there are 12 possible Euler angle
sequences, and it is not immediately clear which sequence
is being used for the coordinate system conversion. Further
research indicates that the ZYX sequence is commonly
employed for most drones, leading to the following rotation
matrix:
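Assuming the standard aerospace (yaw–pitch–roll) convention, with yaw $\psi$, pitch $\theta$, and roll $\phi$, the ZYX rotation matrix takes the well-known form

$$
R_{ZYX} = R_z(\psi)\,R_y(\theta)\,R_x(\phi) =
\begin{bmatrix}
c_\psi c_\theta & c_\psi s_\theta s_\phi - s_\psi c_\phi & c_\psi s_\theta c_\phi + s_\psi s_\phi \\
s_\psi c_\theta & s_\psi s_\theta s_\phi + c_\psi c_\phi & s_\psi s_\theta c_\phi - c_\psi s_\phi \\
-s_\theta & c_\theta s_\phi & c_\theta c_\phi
\end{bmatrix}
$$

where $c_\alpha = \cos\alpha$ and $s_\alpha = \sin\alpha$.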
Understanding these conventions is crucial for compre-
hending how the drone calculates its path to relocalize above
the AprilTag. The drone achieves this by rotating its axis and
determining its position over time based on angular velocity.
This underscores the importance of specifying the correct
commands and orientations when working with the Robot Operating System (ROS) and configuring parameters that enable the
drone to hover above the AprilTag before landing. Additionally,
it is essential for developing algorithms that manage scouting
and landing operations. By leveraging the AprilTag detector
service through the Modal Pipe Architecture, the RPY values
can be extracted and utilized within ROS.
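As an illustration of how RPY values relate to the quaternion orientation that MAVROS-style interfaces typically report, the following is a minimal, self-contained Python sketch of a ZYX quaternion-to-RPY conversion (the function name and usage are illustrative, not part of any VOXL or ROS API):

```python
import math

def quaternion_to_rpy(x, y, z, w):
    """Convert a unit quaternion to (roll, pitch, yaw) in radians, ZYX convention."""
    roll = math.atan2(2.0 * (w * x + y * z), 1.0 - 2.0 * (x * x + y * y))
    # Clamp the argument to guard against numerical drift outside [-1, 1]
    pitch = math.asin(max(-1.0, min(1.0, 2.0 * (w * y - z * x))))
    yaw = math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))
    return roll, pitch, yaw

# A pure 90-degree yaw rotation: quaternion (0, 0, sin 45°, cos 45°)
r, p, y = quaternion_to_rpy(0.0, 0.0, math.sin(math.pi / 4), math.cos(math.pi / 4))
print(round(r, 6), round(p, 6), round(math.degrees(y), 1))  # 0.0 0.0 90.0
```

The same conversion applies regardless of whether the quaternion arrives over a Modal Pipe or a ROS topic; only the transport differs.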
6.2. AprilTag Detection
Data was collected using the drone to assess its AprilTag detec-
tion capabilities in an indoor environment (see Figure 8). The
drone utilized a tag-detection system, which continuously ran
to capture AprilTag location data. Upon detecting the AprilTag’s
size and position, the drone processed its own RPY data to
navigate toward the tag. For this experiment, an AprilTag measuring 6 in. × 6 in. was used. The minimum and maximum distances at which the drone could successfully detect the AprilTag were recorded.

Figure 8. Visual points from drone to AprilTag: (a) graphical representation of AprilTag detection range; (b) experimental setup; and (c) drone camera view. [Axes: X (pitch), Y (roll), and Z (yaw), in meters; drone position at the origin (0, 0, 0); VOXL m500 drone and AprilTag locations marked.]

APRIL 2025 MATERIALS EVALUATION 45
The coordinate system was established relative to the
drone’s position, and the location of the AprilTag was
measured accordingly. The drone’s ability to detect the
AprilTag was evaluated across various positions, encompassing
nearly all possible placements of the tag. This testing consisted
of 32 trials focusing on AprilTag detection accuracy and the
drone’s RPY adjustments relative to the tag. Detection was suc-
cessful in 21 of the 32 tests, highlighting the system’s ability to
detect and navigate to the AprilTag based on coordinate system
alignment. Additionally, 34 tests were conducted to identify the
minimum and maximum detectable distances of the AprilTag.
Detection was successful in 26 of these tests, providing a quan-
tified range of the drone’s AprilTag detection capability. It was
observed that the drone has a specific detection range within
which it can reliably identify the AprilTag. Figure 8 provides
a graphical representation of this detection boundary, illus-
trating the area within which the drone successfully detected
the AprilTag at any given point. The figure shows the drone's AprilTag detection range, with the drone's fixed position taken as the origin (0, 0, 0) of the 3D coordinate system; distances are measured in meters.
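As a quick sanity check on the trial counts reported above, the detection results translate into the following success rates:

```python
# Success rates from the reported trials (Section 6.2)
position_trials, position_successes = 32, 21   # detection/RPY-adjustment tests
distance_trials, distance_successes = 34, 26   # min/max detection-distance tests

position_rate = position_successes / position_trials
distance_rate = distance_successes / distance_trials
print(f"position tests: {position_rate:.1%}")  # position tests: 65.6%
print(f"distance tests: {distance_rate:.1%}")  # distance tests: 76.5%
```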
6.3. MAVLink and ROS Implementation
A notable challenge encountered during the integration
process was the limitation of the drone’s onboard computer,
which restricts the direct installation of packages or applica-
tions without utilizing Docker. In a typical ARM Linux environ-
ment, the ‘sudo apt install’ command would suffice to install
necessary packages. However, for this specific drone, the pro-
cedure requires creating a designated directory and download-
ing a compatible Docker image, such as “voxl-cross.” After con-
figuring the image, it must be built into a tar.gz file. This file is
then transferred to the onboard computing unit using a wired
connection and a debugging tool to push the file into the root
shell directory. It is crucial to ensure that the Docker daemon
is running as a background service before proceeding with
building and deploying the image.
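The workflow described above might look like the following command sequence (the working directory, image tag, and the use of `adb` as the wired debugging tool are illustrative assumptions, not an exact VOXL procedure):

```sh
# Ensure the Docker daemon is running before building or deploying
sudo systemctl start docker

# Create a working directory and fetch a compatible cross-build image
mkdir -p ~/voxl-workspace && cd ~/voxl-workspace
docker pull voxl-cross:latest            # image name from the text; tag assumed

# Package the configured image as a tar.gz archive
docker save voxl-cross:latest | gzip > voxl-cross.tar.gz

# Push the archive to the onboard computer over the wired debug link,
# then load it from the root shell on the device
adb push voxl-cross.tar.gz /root/        # destination path assumed
```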
Unlike MAVLink or ROS alone, MAVROS operates as an active command-line interface, functioning as a front-end application that communicates directly with the flight controller through pipelines connected to the vision processing hub.
The integration of MAVROS is particularly important because
it enables direct communication with the flight controller,
allowing for arming and disarming of the drone without relying
on external signals from a transmitter or QGroundControl.
In previous testing phases, the drone’s autopilot and
AprilTag detection capabilities were evaluated using MAVLink
and vision-processing-hub demonstration packages [57]. With
the implementation of MAVROS, the process begins by initiating roscore and the necessary nodes for flight, followed
by executing the required packages to facilitate communica-
tion with the PX4 software on the flight controller. Once the
connection between PX4 and MAVROS is established, the ROS
script is initiated to arm and control the drone, all of which is
executed with a single command from the terminal. Despite
the generally smooth operation, sporadic communication
errors between MAVROS and PX4 were encountered, indicat-
ing unexpected command inputs. However, these errors did
not significantly impact the overall flight performance.
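The startup sequence described here can be summarized roughly as follows (the FCU connection URL and the package/script names are placeholders for a specific setup, not values from this project):

```sh
roscore &                                   # start the ROS master
roslaunch mavros px4.launch fcu_url:=udp://:14540@127.0.0.1:14557
rosrun flight_control arm_and_hover.py      # package and script names are illustrative
```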
6.4. Drone Relocation Results
To test the drone’s ability to relocate using AprilTag detec-
tion, the drone was taken to an open field, and a QR code
was placed on a flat surface. The objective was for the drone
to detect the QR code mid-flight and autonomously land
on it. Initially, the drone was set to position flight mode and
manually controlled using a transmitter to fly closer to the QR
code. Upon detecting the QR code, the drone was switched to
off-board flight mode (see Figures 9a and 9b). In this mode, the
drone accurately determined the position of the QR code and
hovered steadily at a height of 1.5 m (5 ft) above it. After main-
taining a stable hover, the drone was switched back to position
flight mode and landed on the QR code.
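The mode-switching sequence above can be sketched as a small state machine (the state labels are shorthand for this description, not PX4 mode identifiers):

```python
def next_mode(mode, tag_detected=False, hover_stable=False):
    """Advance the landing sequence described in the text by one step."""
    if mode == "POSITION_MANUAL" and tag_detected:
        return "OFFBOARD_HOVER"      # tag seen: switch to off-board mode, hover at 1.5 m
    if mode == "OFFBOARD_HOVER" and hover_stable:
        return "POSITION_LAND"       # stable hover: back to position mode, then land
    return mode                      # otherwise, stay in the current mode

mode = "POSITION_MANUAL"
mode = next_mode(mode, tag_detected=True)
mode = next_mode(mode, hover_stable=True)
print(mode)  # POSITION_LAND
```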
To rigorously evaluate the system’s AprilTag detection and
precision landing capabilities, extensive testing was conducted
in both indoor and outdoor environments. Nine outdoor flight
tests were performed to assess the impact of environmen-
tal factors, including wind, shadow, and varying detection
distances. During some trials, vibrations were intentionally
introduced on the branch structure to simulate wind effects.
Despite these disturbances, the drone reliably detected the
AprilTag and executed precise landings, demonstrating the
system’s robustness under dynamic conditions (see Figure 9c).
7. Future Work
The next steps in this project focus on enhancing the auton-
omous capabilities of the drone system. A key area for devel-
opment is refining the use of MAVROS to create software that
can autonomously plan flight paths and incorporate algo-
rithms to assist the drone in locating its designated landing
site. Once the drone can reliably navigate its flight path inde-
pendently, the integration of a second drone into the system
will be pursued to explore coordinated swarm operations.
Additionally, there will be a focus on integrating an automatic
ME | BIOINSPIRED DRONE VERTIPORTS