Based on experimentally collected charging and discharging data for
both the power bank and drone batteries, it was estimated that
the vertiport’s power bank could support up to eight drones
continuously, along with other vertiport components.
The estimation assumes that a swarm of eight drones
actively conducted surveillance tasks over a 5 h period. At
the start of the estimation at 10:00, the power bank level
was assumed to be ~50%. During this time, the solar panels
were actively generating power to charge the power bank.
After 15 min of charging (from 10:00 to 10:15) without drones
connected, the power bank level increased to 69%. At 10:15,
all eight drones returned to the vertiport with battery levels
between 10% and 15%, initiating simultaneous charging.
From 10:15 to 10:45 (30 min), the drones charged sufficiently
to conduct another 30-min surveillance task, reducing the
power bank level to ~9%. After the drones departed for their
next mission, the power bank recharged, reaching ~60% by
11:15 through the solar charging system. At this time, the drones
returned, and the process repeated. During this cycle, the
power bank level dropped from 60% to 10%.
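The cycle above lends itself to a back-of-envelope check. The sketch below treats the power bank level as simple bookkeeping; the three rates are inferred only from the percentages quoted in the text and are illustrative assumptions, not measured charge curves.

```python
# Rough bookkeeping model of one charge/discharge cycle described above.
# Rates are inferred solely from the percentages quoted in the text:
#   solar-only charging:   50% -> 69% in 15 min
#   eight drones charging: 69% -> ~9% in 30 min (net drain)
#   solar recharge:        ~9% -> ~60% in 30 min

SOLAR_ONLY_RATE = (69 - 50) / 15   # %/min, no drones connected
DRONE_DRAIN_RATE = (69 - 9) / 30   # %/min net drain while charging 8 drones
RECHARGE_RATE = (60 - 9) / 30      # %/min after the drones depart

def simulate_cycle(level: float) -> float:
    """Advance the power bank level through one 75 min cycle."""
    level += 15 * SOLAR_ONLY_RATE   # 15 min of solar-only charging
    level -= 30 * DRONE_DRAIN_RATE  # 30 min charging eight drones
    level += 30 * RECHARGE_RATE     # 30 min of solar recharge
    return max(0.0, min(100.0, level))

print(f"Power bank level after one cycle: {simulate_cycle(50.0):.0f}%")
```

Starting from the assumed 50% at 10:00, the model lands back near 60% at 11:30, consistent with the repeating pattern described above.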
The drop in the power bank level was not consistent
throughout the process because the solar panels supplied power
to both the drone batteries and the power bank. At 13:45, the
drop in the power bank level was at its smallest (~26%), coinciding
with the highest power generation rate of the solar panels
due to the optimal orientation of the sun. The solar panels
operate at maximum capacity when the sun’s orientation is
perpendicular to them. As the sun’s position changed, the power
generation rate decreased, causing the power bank’s energy
depletion rate to increase gradually. Throughout the process,
the solar panels remained horizontally aligned with the ground
surface, with no adjustments made to their orientation.
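The dependence of generation rate on sun position can be illustrated with a simple incidence model. This is a sketch, not the authors’ model: for a horizontal panel, the incidence angle equals the solar zenith angle, so output scales with the sine of the solar elevation. The 100 W rating is an arbitrary example value.

```python
import math

def horizontal_panel_output(rated_watts: float, sun_elevation_deg: float) -> float:
    """Instantaneous output of a fixed, horizontal solar panel.

    For a panel lying flat on the ground, the incidence angle equals the
    solar zenith angle, so output scales with sin(elevation), i.e.
    cos(zenith). Output is clamped to zero when the sun is below the horizon.
    """
    return rated_watts * max(0.0, math.sin(math.radians(sun_elevation_deg)))

# Generation peaks near the highest solar elevation and tapers on either side:
for elevation in (30, 60, 75, 60, 30):
    print(f"elevation {elevation:2d} deg -> {horizontal_panel_output(100, elevation):5.1f} W")
```

This reproduces the qualitative behavior reported above: maximum generation near 13:45 when the sun is closest to perpendicular, with the depletion rate of the power bank increasing as the elevation falls away from that peak.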
6. Drone Integration to Vertiport and Precision Landing
The drone used for this project is equipped with a flight system
that combines a flight controller and an onboard computing
unit. The onboard processor handles all data collected
from the drone’s sensors and cameras and runs the necessary
software to manage communication and configuration for
flight operations. The flight controller operates with open-
source firmware, which can only be accessed via the terminal
on the onboard processor unless a direct-wired connection is
established with the flight controller’s port, typically required
only for firmware updates via image flashing.
The primary reason for selecting this drone was its high
functionality and the robust usability of its autopilot features.
These capabilities are enabled by a modular software
architecture running on the onboard computing unit. This architecture
consists of services, pipelines, and tools. The services include
software that handles the processing, calculation,
configuration, and maintenance of the drone’s individual components.
For example, the camera server manages optical input,
processes the data based on the task, and configures the cameras
for identification and filtering. Tools are used to manipulate or
inspect services, such as examining the drone’s IMU (inertial
measurement unit) servers or battery status through the PX4
software. The pipelines, a critical component of this
architecture, function as data carriers between different services
that request specific information. This architecture is
particularly valuable to the project as it allows for the integration of
data from these pipelines into custom software applications.
An example of this architecture in action is a visual-inertial
odometry (VIO) system, which requires camera data from the
camera processing module and IMU data from the inertial
measurement unit module to calculate VIO, demonstrating the
interconnected functionality of the system.
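The services-and-pipes pattern described above can be sketched in a few lines. This is a minimal illustration of the idea, modeling pipes as in-process queues; all names are illustrative, and this is not the actual Modal Pipe Architecture API.

```python
# Minimal sketch of a services-and-pipes pattern: producer services
# publish onto named pipes (modeled here as queues) and consumer
# services read from whichever pipes they subscribe to.
from queue import Queue

class Pipe:
    """A named data carrier between services."""
    def __init__(self, name: str):
        self.name = name
        self._q: Queue = Queue()

    def publish(self, sample):
        self._q.put(sample)

    def read(self):
        return self._q.get()

# Two producer services, as in the VIO example: camera frames and IMU samples.
camera_pipe = Pipe("camera")
imu_pipe = Pipe("imu")
camera_pipe.publish({"frame_id": 1, "pixels": "..."})
imu_pipe.publish({"gyro": (0.01, 0.0, 0.02), "accel": (0.0, 0.0, 9.8)})

# A consumer service (e.g. VIO) pulls from both pipes it needs.
frame = camera_pipe.read()
imu = imu_pipe.read()
print(frame["frame_id"], imu["gyro"])
```

The key design property, mirrored here, is that the consumer never talks to the camera or IMU hardware directly; it only names the pipes it needs, which is what makes it possible to tap the same data streams from custom software.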
6.1. AprilTag Relocation
To land precisely at any specific location, we use AprilTags
detected by the drones using a built-in deep learning
algorithm. Building on the understanding of the drone’s
software and hardware capabilities, the focus shifts to deep
neural networks and AprilTag detection. The drone is already
equipped with software to detect and localize AprilTags using
Bioinspired Drone Vertiports | Materials Evaluation, April 2025

Figure 7. Estimated charging pattern for eight drones: battery percentage (%) vs. time of day, 10:00–15:00.
VIO data from the camera. The detection process is
facilitated by a pre-trained deep learning model, specifically
a convolutional neural network (CNN), running on TensorFlow Lite.
This type of deep learning AI is designed to recognize patterns
across various media, including video, audio, and text [55, 56].
Upon detecting an AprilTag, the camera places five reference
points: four at the corners and one in the center. These points
are then used to calculate the position of the tag relative to the
drone using trigonometric functions, enabling precise
localization and navigation.
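The trigonometry involved can be illustrated with a pinhole-camera sketch. Real detectors solve the full pose from all five reference points; the sketch below shows only range and bearing from the tag’s center point and apparent size, and the focal length and image-center values are assumed example numbers.

```python
import math

def tag_range_and_bearing(center_px, tag_side_px, tag_side_m, focal_px, cx, cy):
    """Estimate distance and direction to an AprilTag from its image footprint.

    Pinhole-camera sketch of the trigonometry:
      distance  = focal_px * tag_side_m / tag_side_px
      bearing   = atan2(u - cx, focal_px)  (horizontal angle to tag center)
      elevation = atan2(cy - v, focal_px)  (vertical angle to tag center)
    """
    u, v = center_px
    distance = focal_px * tag_side_m / tag_side_px
    bearing = math.atan2(u - cx, focal_px)
    elevation = math.atan2(cy - v, focal_px)
    return distance, bearing, elevation

# A 0.15 m tag spanning 60 px, seen through a 600 px focal length,
# centered in a 640x480 image (principal point at 320, 240):
d, b, e = tag_range_and_bearing((320, 240), 60, 0.15, 600, 320, 240)
print(f"distance {d:.2f} m, bearing {math.degrees(b):.1f} deg")
```

A tag centered in the image yields zero bearing and elevation; as the tag drifts off-center, the angles grow, which is the error signal the drone uses to steer back over the tag.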
Utilizing this data to relocalize the drone involves more
complexity than simply inputting Cartesian coordinates (XYZ).
In autopilot mode, the drone does not navigate using XYZ
coordinates; instead, it relies on a different orientation
convention known as Euler angles or RPY (roll, pitch, yaw). The
conversion between these two representations is feasible through
the application of Euler angle rotations. However,
a challenge arises because there are 12 possible Euler angle
sequences, and it is not immediately clear which sequence
is being used for the coordinate system conversion. Further
research indicates that the ZYX sequence is commonly
employed for most drones, leading to the following rotation
matrix (with roll φ, pitch θ, and yaw ψ):

\[
R_{ZYX} = R_z(\psi)\,R_y(\theta)\,R_x(\phi) =
\begin{bmatrix}
\cos\psi\cos\theta & \cos\psi\sin\theta\sin\phi - \sin\psi\cos\phi & \cos\psi\sin\theta\cos\phi + \sin\psi\sin\phi \\
\sin\psi\cos\theta & \sin\psi\sin\theta\sin\phi + \cos\psi\cos\phi & \sin\psi\sin\theta\cos\phi - \cos\psi\sin\phi \\
-\sin\theta & \cos\theta\sin\phi & \cos\theta\cos\phi
\end{bmatrix}
\]
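As a minimal sketch, the ZYX (yaw-pitch-roll) rotation can be built directly in code. The convention R = Rz(yaw) · Ry(pitch) · Rx(roll) is assumed, with angles in radians; this is illustrative and not taken from the drone’s firmware.

```python
import math

def rotation_zyx(roll: float, pitch: float, yaw: float):
    """Build the ZYX rotation matrix from RPY angles (radians).

    R = Rz(yaw) @ Ry(pitch) @ Rx(roll), mapping body-frame vectors
    into the world frame under the assumed convention.
    """
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ]

def rotate(R, v):
    """Apply a 3x3 rotation matrix to a 3-vector."""
    return [sum(R[i][j] * v[j] for j in range(3)) for i in range(3)]

# A 90-degree yaw maps the body x axis onto the world y axis:
R = rotation_zyx(0.0, 0.0, math.pi / 2)
print([round(c, 3) for c in rotate(R, [1.0, 0.0, 0.0])])
```

Picking the wrong one of the 12 sequences produces a rotation matrix that disagrees with the flight controller’s, which is exactly why confirming the ZYX convention matters before commanding relocalization.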
Understanding these conventions is crucial for
comprehending how the drone calculates its path to relocalize above
the AprilTag. The drone achieves this by rotating its axes and
determining its position over time based on angular velocity.
This underscores the importance of specifying the correct
commands and orientations when working with the Robot
Operating System (ROS) and configuring parameters that enable the
drone to hover above the AprilTag before landing. Additionally,
it is essential for developing algorithms that manage scouting
and landing operations. By leveraging the AprilTag detector
service through the Modal Pipe Architecture, the RPY values
can be extracted and utilized within ROS.
6.2. AprilTag Detection
Data was collected using the drone to assess its AprilTag
detection capabilities in an indoor environment (see Figure 8). The
drone utilized a tag-detection system, which continuously ran
to capture AprilTag location data. Upon detecting the AprilTag’s
size and position, the drone processed its own RPY data to
navigate toward the tag. For this experiment, an AprilTag
Figure 8. Visual points from drone to AprilTag: (a) graphical representation of AprilTag detection range, (b) experimental setup, and (c) drone camera view. Axes in meters: X (pitch), Y (roll), Z (yaw); the drone (VOXL m500) is at (0, 0, 0), with the AprilTag positions marked.