he observed orderly arranged pearls
in the cornea of a grey drone fly. These
NCEs inspired the development of arti-
ficial compound eyes (ACEs) based on
planar microlens arrays, curved microlens
arrays, and metasurfaces. However, none
of these ACEs could match the NCEs
in achieving both real-time panoramic
direct imaging and dynamic motion
detection simultaneously. The main
challenge with curved microlens array-
based ACEs is how to transmit light rays
collected by many microlenses on a
curved surface to a flat imaging sensor
(e.g., a CMOS chip) while preserving their
spatial relationships.
In a paper recently published in Light:
Science & Applications, a team of scientists
led by Professor Xuming Zhang
from the Department of Applied Physics
at the Photonics Research Institute (PRI),
and the Research Institute for Advanced
Manufacturing (RIAM), both at The Hong
Kong Polytechnic University in China,
has developed a biomimetic ACE as
a panoramic camera, called ACEcam.
This ACEcam offers a 180° field of view
(FOV), compared to the 150°–180° FOV
typical of most arthropods, making it
particularly well-suited for applications in
surveillance. Its real-time, distortion-free
panoramic imaging eliminates the need
for redundant post-processing, making
ACEcam suitable for imaging and
distance measurement among moving
objects in real-world scenarios. The
nearly infinite depth of field enhances
realism in augmented reality experiences.
Additionally, its translational and rota-
tional motion perception, coupled with
ultrafast angular motion detection (up to
5.6 × 10⁶ degrees per second), positions
ACEcam for use in kinestate tracking
and motion control for machines ranging
from cars to high-speed airplanes and
even spacecraft. The combination of
these features also positions ACEcam for
niche applications, such as integration
into obstacle avoidance systems for
high-speed unmanned aerial vehicles.
Its wide FOV also reduces the need for
multiple lenses, eliminating excess weight
and size, and the compact design makes
ACEcam suitable for endoscopy.
In the proposed ACEcam, lensed plastic
optical fibers serve as artificial ommatidia.
By adding a conical microlens to the distal
end of the fiber, the optical fiber mimics
an ommatidium, collecting and transmit-
ting light to the sensing unit. A bundle of
lensed plastic optical fibers evenly distrib-
uted on a hemispherical surface replicates
NCEs, and the ACEcam demonstrates
exceptional static imaging and dynamic
motion detection capabilities.
The team designed a conical microlens
for the distal end of the plastic optical
fiber to reduce the light acceptance angle
and increase the angular resolution. Their
simulations showed that a half-apex angle of
35° is optimal, reducing the fiber’s accep-
tance angle from 60° to 45°. The rounded
tip of the conical microlens ensures light
information in the central angular range
isn’t lost.
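For context on those numbers: the acceptance cone of a bare step-index fiber is set by its numerical aperture, NA = √(n_core² − n_clad²), with acceptance half-angle arcsin(NA) in air. The sketch below uses assumed refractive indices loosely typical of a PMMA-core plastic fiber (the 1.49/1.40 pair is illustrative, not taken from the paper); they happen to yield a full acceptance angle close to the 60° quoted above.

```python
import math

def acceptance_half_angle_deg(n_core: float, n_clad: float) -> float:
    """Acceptance half-angle (degrees, in air) of a step-index fiber:
    NA = sqrt(n_core^2 - n_clad^2), theta_max = arcsin(NA)."""
    na = math.sqrt(n_core**2 - n_clad**2)
    return math.degrees(math.asin(min(na, 1.0)))

# Assumed indices for a PMMA-core plastic optical fiber (illustrative):
half = acceptance_half_angle_deg(1.49, 1.40)
print(round(2 * half, 1))  # full acceptance angle, ≈ 61.3°
```

The conical microlens then trims this cone, down to 45° per the simulations above.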
The conical-microlens optical fibers
were fabricated in batches through 3D
printing, electroplating, and molding.
Each batch produced approximately 200
optical fibers, each with a smooth surface
and rounded tip.
Then the team arranged 271 lensed
optical fibers on a 3D-printed perforated
dome, with the bare ends placed into a
perforated planar buncher. Light from
the bare ends was projected onto a
flat imaging sensor via an imaging lens.
The dome, buncher, imaging lens, and
flat imaging sensor chip were housed
in a hollow tube, with the dome’s black
color absorbing stray light to prevent
crosstalk, much as pigment cells do in
NCEs. The fiber bundle ensures
that light is confined, preventing ghost
images. This setup transmits light from
the curved surface to a flat image sensor,
replicating the ommatidia in an NCE. The
final images are processed digitally after
projection onto the sensor.
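The "evenly distributed" fiber layout on the dome can be pictured with a standard construction. The sketch below uses a Fibonacci-spiral placement to spread 271 points near-uniformly over a hemisphere; this construction and the function name are illustrative assumptions, not the authors' actual hole pattern.

```python
import math

def hemisphere_fiber_directions(n: int = 271):
    """Spread n fiber axes near-uniformly over a unit hemisphere
    using a Fibonacci spiral. z is sampled uniformly in [0, 1],
    which gives equal-area spacing (Archimedes' hat-box theorem)."""
    golden = (1 + 5 ** 0.5) / 2
    pts = []
    for i in range(n):
        z = 1 - i / (n - 1)               # pole (z = 1) down to equator (z = 0)
        r = math.sqrt(max(0.0, 1 - z * z))
        phi = 2 * math.pi * i / golden    # golden-angle increment in azimuth
        pts.append((r * math.cos(phi), r * math.sin(phi), z))
    return pts

directions = hemisphere_fiber_directions(271)
# Each direction is a unit vector with z >= 0, so the 271 fibers
# together cover a 180-degree field of view.
```

Because each point doubles as the pointing direction of one ommatidium, keeping the planar buncher indexed in the same order as the dome is what preserves the spatial relationships described above.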
“ACEcam holds the promise of
becoming the cornerstone for future
ACEs, owing to its synergy with diverse
disciplines,” the scientists concluded.
“For instance, the imaging optical fiber
bundles can emulate natural ommatidia
to replicate optical and neural super-
position observed in NCEs, potentially
enhancing ACEcam’s imaging resolution
and dynamic perception speed.” They
suggested that the integration of opto-
fluidic lenses with ACEcam presents an
opportunity to harness the advantages
of both arthropods’ compound eyes and
vertebrate monocular eyes.
The paper was coauthored by Heng
Jiang, Chi Chung Tsoi, Weixing Yu,
Mengchao Ma, Mingjie Li, and Zuankai
Wang along with Zhang, and can be
accessed at https://doi.org/10.1038/
s41377-024-01580-5.
Concept and principle of the artificial compound eye for a panoramic camera (ACEcam) that uses
conical-microlens optical fibers to mimic natural ommatidia.
APRIL 2025 • MATERIALS EVALUATION
CREDIT: LIGHT: SCIENCE & APPLICATIONS