he observed orderly arranged pearls
in the cornea of a grey drone fly. These
NCEs inspired the development of arti-
ficial compound eyes (ACEs) based on
planar microlens arrays, curved microlens
arrays, and metasurfaces. However, none
of these ACEs could match the NCEs
in achieving real-time panoramic
direct imaging and dynamic motion
detection simultaneously. The main
challenge with curved microlens array-
based ACEs is how to transmit light rays
collected by many microlenses on a
curved surface to a flat imaging sensor
(e.g., a CMOS chip) while preserving their
spatial relationships.
In a paper recently published in Light:
Science & Applications, a team of scien-
tists led by Professor Xuming Zhang
from the Department of Applied Physics
at the Photonics Research Institute (PRI),
and the Research Institute for Advanced
Manufacturing (RIAM), both at The Hong
Kong Polytechnic University in China,
has developed a biomimetic ACE as
a panoramic camera, called ACEcam.
This ACEcam offers a 180° field of view
(FOV), compared to the 150°–180° FOV
typical of most arthropods, making it
particularly well-suited for applications in
surveillance. Its real-time, distortion-free
panoramic imaging eliminates the need
for redundant post-processing, making
ACEcam suitable for imaging and
distance measurement among moving
objects in real-world scenarios. The
nearly infinite depth of field enhances
realism in augmented reality experiences.
Additionally, its translational and rota-
tional motion perception, coupled with
ultrafast angular motion detection (up to
5.6 × 10⁶ degrees per second), positions
ACEcam for use in kinestate tracking
and motion control for machines ranging
from cars to high-speed airplanes and
even spacecraft. The combination of
these features also positions ACEcam for
niche applications, such as the integra-
tion into obstacle avoidance systems for
high-speed unmanned aerial vehicles.
A single ACEcam also replaces multiple
lenses, eliminating excess weight and
size; the compact design makes it suit-
able for endoscopy.
In the proposed ACEcam, lensed plastic
optical fibers serve as artificial ommatidia.
By adding a conical microlens to the distal
end of the fiber, the optical fiber mimics
an ommatidium, collecting and transmit-
ting light to the sensing unit. A bundle of
lensed plastic optical fibers evenly distrib-
uted on a hemispherical surface replicates
NCEs, and the ACEcam demonstrates
exceptional static imaging and dynamic
motion detection capabilities.
The team designed a conical microlens
for the distal end of the plastic optical
fiber to reduce the light acceptance
angle and increase the angular resolution. Their simu-
lations showed that a half-apex angle of
35° is optimal, reducing the fiber’s accep-
tance angle from 60° to 45°. The rounded
tip of the conical microlens ensures light
information in the central angular range
isn’t lost.
The conical-microlens optical fibers
were fabricated in batches through 3D
printing, electroplating, and molding.
Each batch produced approximately 200
optical fibers, each with a smooth surface
and rounded tip.
Then the team arranged 271 lensed
optical fibers on a 3D-printed perforated
dome, with the bare ends placed into a
perforated planar buncher. Light from
the bare ends was projected onto a
flat imaging sensor via an imaging lens.
The dome, buncher, imaging lens, and
flat imaging sensor chip were housed
in a hollow tube, with the dome’s black
color absorbing stray light, much like
the pigment cells in NCEs do, to prevent
crosstalk. The fiber bundle ensures
that light is confined, preventing ghost
images. This setup transmits light from
the curved surface to a flat image sensor,
replicating the ommatidia in an NCE. The
final images are processed digitally after
projection onto the sensor.
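The curved-to-flat mapping can be pictured as a fixed lookup: each fiber contributes one reading whose viewing direction on the hemisphere is known in advance, so the per-fiber intensities can be reassembled into a panorama. The following Python snippet is only an illustrative sketch of that idea, not the authors' calibration or reconstruction code; the golden-spiral fiber directions, random intensities, and coarse 10° grid are all invented values.

```python
import numpy as np

# Illustrative sketch: remap 271 per-fiber readings into a coarse
# azimuth/elevation panorama using precomputed viewing directions.
rng = np.random.default_rng(0)
n_fibers = 271

# Assumed directions: roughly even hemisphere coverage via a
# golden-spiral layout; theta = polar angle from the optical axis.
k = np.arange(n_fibers)
theta = np.arccos(1 - k / (n_fibers - 1))            # 0..90 degrees
phi = (k * np.pi * (3 - np.sqrt(5))) % (2 * np.pi)   # azimuth

# One intensity per fiber, as read off the flat sensor at that
# fiber's bare-end position (random stand-in data here).
intensity = rng.random(n_fibers)

# Paint each reading into a 10-degree elevation/azimuth grid,
# averaging where several fibers share a cell.
pano = np.zeros((9, 36))
counts = np.zeros_like(pano)
row = np.minimum((np.degrees(theta) // 10).astype(int), 8)
col = (np.degrees(phi) // 10).astype(int) % 36
np.add.at(pano, (row, col), intensity)
np.add.at(counts, (row, col), 1)
pano = np.divide(pano, counts, out=np.zeros_like(pano),
                 where=counts > 0)
```

Because the mapping is fixed by the geometry, this step is a cheap table lookup rather than a lens-distortion correction, which is one way to understand the "distortion-free, no redundant post-processing" claim.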
“ACEcam holds the promise of
becoming the cornerstone for future
ACEs, owing to its synergy with diverse
disciplines,” the scientists concluded.
“For instance, the imaging optical fiber
bundles can emulate natural ommatidia
to replicate optical and neural super-
position observed in NCEs, potentially
enhancing ACEcam’s imaging resolution
and dynamic perception speed.” They
suggested that the integration of opto-
fluidic lenses with ACEcam presents an
opportunity to harness the advantages
of both arthropods’ compound eyes and
vertebrate monocular eyes.
The paper was coauthored by Heng
Jiang, Chi Chung Tsoi, Weixing Yu,
Mengchao Ma, Mingjie Li, and Zuankai
Wang along with Zhang, and can be
accessed at https://doi.org/10.1038/
s41377-024-01580-5.
Concept and principle of the artificial compound eye for a panoramic camera (ACEcam) that uses
conical-microlens optical fibers to mimic natural ommatidia.
APRIL 2025 | MATERIALS EVALUATION
CREDIT: LIGHT: SCIENCE & APPLICATIONS
EDGE COMPUTING MAKES NDE SMARTER
Traditional nondestructive evaluation
(NDE) has made little use of edge
computing, but that is about to change.
The use of artificial intelligence (AI) and
machine learning (ML) requires significant
computational power. As AI/ML adoption
grows, the computational demands on
inspection equipment increase rapidly.
This is especially true for ultrasonics, but
the same trend is seen in other NDE
methods as well.
While cloud computing offers power,
it’s cumbersome for handling large,
sensitive datasets. Standard laptops
and off-the-shelf NDE equipment lack
the necessary computational resources.
The emerging solution is to add a sepa-
rate computing unit to the system: an
edge computer. This has surprising
implications.
Background
Traditional inspections are done by
keying in the settings on the screen
of the NDT machine. The equipment
may employ sophisticated electronics
(such as field-programmable gate arrays,
FPGAs), but it must remain versatile:
configurable and adjustable for any
potential task the inspectors might face.
As inspections grow more complex, it’s
becoming increasingly problematic to
configure or program these machines.
This leads to overly complex procedures,
where human inspectors need to
manage the minute details of the inspec-
tion settings and oversee the process.
Then there are highly automated
inspection cells, which complete
predefined inspection routines with
the help of complicated mechanics or
robotics, acquiring data in a fully auto-
mated way. However, even in these
cases, the underlying NDE electronics
are often built on a general-purpose
NDE device, with the smart features
focused more on the mechanics than the
NDE itself.
Inspections are getting more
complex, with more channels, more
elements, and more data. While more
data improves inspections, it also places
a significant burden on the evalua-
tion process. Increasingly, advanced
signal processing techniques like total
focusing method (TFM) beamforming
are used, further raising computational
requirements.
All of this leads to overcomplicated
inspection software and procedures for
inspectors. The complexity then trans-
lates to reduced focus on inspection
targets and increased risk of human error.
Progress Up Until Now
Edge computing turns out to be the way
to deploy AI locally and easily on top of
existing inspections. These small
boxes contain impressive computa-
tional power and can be connected
to the inspector’s PC or local network,
processing the huge data files produced
by modern techniques in seconds.
Additionally, software installations,
updates, and version control can be
handled directly on the box, avoiding
the typical hassle of installing additional
software on corporate computers. (This
was the immediate issue that brought
edge computers to the industry.) But
there’s more.
As soon as the computational power
is available, it can also be used to
improve signal processing. This enables
the use of many techniques that have
been known for a long time but were
too complex to be used in practice,
such as the synthetic aperture focusing
technique (SAFT) in ultrasonic testing.
Sophisticated signal processing can
be provided in software tailored to the
needs of the procedure.
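Both SAFT and TFM are, at their core, delay-and-sum reconstructions: for every image pixel, sum the recorded signals at the time of flight from transmitter to pixel to receiver. The sketch below is a minimal, purely illustrative TFM over synthetic full matrix capture (FMC) data; the array geometry, wave speed, sampling rate, and point scatterer are invented values, not any instrument's implementation.

```python
import numpy as np

def tfm(fmc, elem_x, fs, c, xs, zs):
    """Delay-and-sum TFM: fmc[tx, rx, sample] -> image[z, x]."""
    n = len(elem_x)
    img = np.zeros((len(zs), len(xs)))
    tx_idx = np.arange(n)[:, None]
    rx_idx = np.arange(n)[None, :]
    for iz, z in enumerate(zs):
        for ix, x in enumerate(xs):
            tof = np.hypot(elem_x - x, z) / c          # one-way times
            samp = ((tof[:, None] + tof[None, :]) * fs).astype(int)
            samp = np.clip(samp, 0, fmc.shape[2] - 1)
            img[iz, ix] = abs(fmc[tx_idx, rx_idx, samp].sum())
    return img

# Synthetic demo: 8-element array, one point scatterer at (0, 20 mm).
c, fs = 6000.0, 100e6                 # assumed wave speed, sampling rate
elem_x = np.linspace(-0.01, 0.01, 8)  # element x positions (m)
scat = (0.0, 0.02)
tof = np.hypot(elem_x - scat[0], scat[1]) / c
fmc = np.zeros((8, 8, 2000))
for t in range(8):
    for r in range(8):
        fmc[t, r, int((tof[t] + tof[r]) * fs)] = 1.0   # impulse echoes

xs = np.linspace(-0.005, 0.005, 21)
zs = np.linspace(0.015, 0.025, 21)
img = tfm(fmc, elem_x, fs, c, xs, zs)
iz, ix = np.unravel_index(img.argmax(), img.shape)      # peak = scatterer
```

The peak of `img` lands at the scatterer position. The nested per-pixel loop over all transmit-receive pairs also makes plain why these methods are computationally heavy, and why offloading them to an edge computer is attractive.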
Edge computers can also be directly
connected to traditional NDE equip-
ment, such as ultrasonic or eddy current
machines, or cameras for visual inspec-
tion. This allows the edge computer to
take over tasks traditionally assigned to
the machine’s electronics, enabling the
freedom to use even the most sophis-
ticated signal processing, filtering, and
reconstruction algorithms.
Taken together, this enables us to
build systems that process and automat-
ically evaluate data as it’s acquired and
generate reports on findings as soon as
the acquisition is complete. It allows us
to upgrade traditional rule-based auto-
mated inspection systems used in manu-
facturing to more robust smart systems
that reduce false calls, improve sensitivity,
and, most importantly, enable better
discrimination between benign indica-
tions and unacceptable defects.
SCANNER | NDE OUTLOOK
Edge computing devices are commonly used to implement local AI/ML evaluation on top of
existing inspections.
CREDIT: TRUEFLAW LTD.