Drawing on over 20 years and thousands of miles of
sewer inspection video data, PipeAId has
built and trained advanced computer
vision algorithms to identify and code
sewer defects and features according
to NASSCO standards. Every defect is
geospatially mapped through a digital
twin of the collection system, providing a
highly accurate, consistent, and detailed
representation of system conditions. This
empowers utilities to make data-driven
decisions, effectively manage assets, and
optimize maintenance and repair budgets.
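As a rough illustration of what a NASSCO-coded, geospatially referenced defect record might look like, here is a minimal Python sketch. The schema, field names, example "CL" (longitudinal crack) code, and coordinates are assumptions for illustration, not PipeAId's actual data model.

```python
from dataclasses import dataclass

@dataclass
class DefectObservation:
    """One coded defect from inspection video (hypothetical schema)."""
    nassco_code: str    # e.g., "CL" for a longitudinal crack (NASSCO PACP)
    pipe_id: str        # asset ID of the inspected pipe segment
    distance_ft: float  # distance along the pipe from the upstream manhole
    grade: int          # condition grade, 1 (minor) to 5 (most severe)
    latitude: float     # position interpolated along the pipe's GIS geometry
    longitude: float

def summarize(d: DefectObservation) -> str:
    """Render one observation for a digital-twin or GIS layer."""
    return (f"{d.nassco_code} grade {d.grade} on {d.pipe_id} "
            f"at {d.distance_ft:.1f} ft ({d.latitude:.6f}, {d.longitude:.6f})")

# Example: a longitudinal crack 42 ft downstream of the access manhole.
print(summarize(DefectObservation("CL", "SEG-1043", 42.0, 3, 33.4484, -112.0740)))
```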
“This partnership is a testament to our
focus on data quality and commitment to
unlocking the power of data through an
integration-first approach,” said Andrew
Stauffer, CEO at PipeAId.
PipeAId’s AI-powered capabilities are
now available to over 500 municipalities,
utilities, and inspection contractors using
PipeLogix. Once customers purchase the
PipeAId integration, it is activated without
the need for new software training.
PipeAId pricing follows a pay-as-you-go
model based on linear feet and includes
comprehensive quality assurance.
EXTENDE RENEWS QUALIOPI CERTIFICATION
EXTENDE has successfully renewed its
Qualiopi certification, which is now valid
until November 2027.
Qualiopi is a French quality standard
for training centers. In addition to the
company’s ISO 9001 certification, the
Qualiopi certification underscores the high quality of EXTENDE's "CIVA" and "Reliability in NDE" training programs.
For EXTENDE’s French customers, this
certification means that the company’s
training programs can now be included
in any OPCO's reference catalog, potentially enabling funding support for course registration.
ADVANCED LEGGED ROBOT CREATED FOR SAFER, FASTER PIPELINE INSPECTION AND REPAIR
Tmsuk Corporation (Kyoto, Japan)
unveiled the SPD-X, a legged robot
designed for pipeline inspection, in
January. Capable of fitting through
pipes with diameters larger than 7.87 in.
(20 cm), the SPD-X can travel at speeds
of up to 0.19 mph (0.3 kph) using its
16 legs.
Developed specifically for pipeline
inspection and repair, the SPD-X is a
significant upgrade from Tmsuk’s earlier
SPD1 model, offering improved stability
and functionality. Compared to the SPD1, the SPD-X is smaller in diameter, longer, and heavier. Despite the added weight, it maintains the same top speed and is now fully waterproof. The SPD-X also features an enhanced camera offering 360° vision at 12.3 degrees, a significant improvement over the SPD1's 62.2 degrees. This advanced camera allows the robot to navigate more challenging conditions, such as pipe joint misalignment or sediment buildup.
Like its predecessor, the SPD-X is
controlled remotely via a gaming pad–
inspired controller. However, its additional legs allow it to tackle obstacles more efficiently. The robot is also
designed to access narrow spaces that
are otherwise hazardous or impossible
for humans to enter.
The SPD-X is part of a growing global
network of “pipebots”—robots designed
to maintain pipeline integrity. In the US
alone, a water main ruptures approximately every two minutes, resulting in a
daily loss of six billion gallons of treated
water. Traditional inspection and repair
methods are time-consuming and
labor-intensive, but robots like the SPD-X
could significantly reduce both the time
and cost of such operations.
Tmsuk’s development of the SPD-X
included extensive 3D simulations to
predict its performance in various environments, including its ability to handle sediment buildup in pipes. Real-world testing
followed, with improvements based on
feedback from SPD1 users in 2022. These
updates addressed challenges like accommodating both larger and smaller pipe
diameters, handling pipe blockages, and
improving water flow.
Tmsuk plans to continue developing
robots that can operate in areas humans
cannot reach, perform dangerous tasks,
and function in environments where
traditional machinery would struggle.
The company is also focused on refining
the SPD-X to perform more advanced
tasks, helping to address labor shortages
and improve safety in hazardous work
conditions.
Tmsuk's SPD-X pipebot, an upgrade from its predecessor SPD1 model, has an enhanced camera and additional legs that allow it to maneuver better through challenging areas. (Credit: Tmsuk Corporation)
NEW BIOMIMETIC PANORAMIC CAMERA MIMICS INSECT EYES FOR REAL-TIME IMAGING AND MOTION DETECTION
Natural compound eyes (NCEs) were first studied by Robert Hooke in 1664 after he observed orderly arranged pearls
in the cornea of a grey drone fly. These
NCEs inspired the development of artificial compound eyes (ACEs) based on
planar microlens arrays, curved microlens
arrays, and metasurfaces. However, none
of these ACEs could match the NCEs
in achieving both real-time panoramic
direct imaging and dynamic motion
detection simultaneously. The main
challenge with curved microlens array-
based ACEs is how to transmit light rays
collected by many microlenses on a
curved surface to a flat imaging sensor
(e.g., a CMOS chip) while preserving their
spatial relationships.
In a paper recently published in Light: Science & Applications, a team of scientists led by Professor Xuming Zhang from the Department of Applied Physics at the Photonics Research Institute (PRI) and the Research Institute for Advanced Manufacturing (RIAM), both at The Hong Kong Polytechnic University in China, has developed a biomimetic ACE as a panoramic camera, called ACEcam.
This ACEcam offers a 180° field of view
(FOV), compared to the 150°–180° FOV
typical of most arthropods, making it
particularly well-suited for applications in
surveillance. Its real-time, distortion-free
panoramic imaging eliminates the need
for redundant post-processing, making
ACEcam suitable for imaging and
distance measurement among moving
objects in real-world scenarios. The
nearly infinite depth of field enhances
realism in augmented reality experiences.
Additionally, its translational and rotational motion perception, coupled with ultrafast angular motion detection (up to 5.6 × 10⁶ degrees per second), positions ACEcam for use in kinestate tracking and motion control for machines ranging from cars to high-speed airplanes and even spacecraft. The combination of these features also positions ACEcam for niche applications, such as integration into obstacle avoidance systems for high-speed unmanned aerial vehicles.
This reduces the need for multiple lenses,
eliminating excess weight and size, and
the compact design makes ACEcam suit-
able for endoscopy.
In the proposed ACEcam, lensed plastic
optical fibers serve as artificial ommatidia.
By adding a conical microlens to the distal
end of the fiber, the optical fiber mimics
an ommatidium, collecting and transmitting light to the sensing unit. A bundle of lensed plastic optical fibers evenly distributed on a hemispherical surface replicates
NCEs, and the ACEcam demonstrates
exceptional static imaging and dynamic
motion detection capabilities.
The team designed a conical microlens
onto the distal end of the plastic optical
fiber to reduce the light acceptance angle and increase the angular resolution. Their simulations showed that a half-apex angle of 35° is optimal, reducing the fiber's acceptance angle from 60° to 45°. The rounded
tip of the conical microlens ensures light
information in the central angular range
isn’t lost.
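For readers curious about the geometry behind those numbers: a bare fiber's full acceptance cone follows from its numerical aperture (NA) as θ = 2·arcsin(NA). The minimal sketch below is illustrative only; the NA value of 0.5 is an assumption typical of plastic optical fiber, chosen because it reproduces the 60° figure above, while the 45° result comes from the team's reported simulation.

```python
import math

def full_acceptance_angle_deg(numerical_aperture: float) -> float:
    """Full acceptance cone angle of a fiber in air: 2 * arcsin(NA)."""
    return 2 * math.degrees(math.asin(numerical_aperture))

# A typical PMMA plastic optical fiber has NA ~ 0.5 (an assumption),
# which gives the 60° full acceptance angle cited for the bare fiber.
print(full_acceptance_angle_deg(0.5))  # ~60.0

# The 45° cone achieved with the 35° half-apex conical microlens would
# correspond to an effective NA of sin(22.5°), roughly 0.38.
print(math.sin(math.radians(45 / 2)))  # ~0.383
```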
The conical-microlens optical fibers
were fabricated in batches through 3D
printing, electroplating, and molding.
Each batch produced approximately 200
optical fibers, each with a smooth surface
and rounded tip.
Then the team arranged 271 lensed
optical fibers on a 3D-printed perforated
dome, with the bare ends placed into a
perforated planar buncher. Light from
the bare ends was projected onto a
flat imaging sensor via an imaging lens.
The dome, buncher, imaging lens, and
flat imaging sensor chip were housed
in a hollow tube, with the dome's black color absorbing stray light to prevent crosstalk, much like the pigment cells in NCEs. The fiber bundle ensures
that light is confined, preventing ghost
images. This setup transmits light from
the curved surface to a flat image sensor,
replicating the ommatidia in an NCE. The
final images are processed digitally after
projection onto the sensor.
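Conceptually, each fiber carries light from one known direction on the dome to one known hole in the buncher, so reconstructing the panorama is pure bookkeeping rather than optical correction. The sketch below illustrates that idea under stated assumptions: the 271-fiber count matches the article, but the Fibonacci-spiral layout and the placeholder readout are illustrative stand-ins, not the team's actual design.

```python
import math

N_FIBERS = 271  # number of lensed fibers on the dome, per the article

def dome_directions(n: int = N_FIBERS) -> list[tuple[float, float, float]]:
    """Roughly even unit vectors over a hemisphere (Fibonacci spiral),
    standing in for the lensed fiber tips on the 3D-printed dome."""
    golden = math.pi * (3 - math.sqrt(5))  # golden angle in radians
    dirs = []
    for i in range(n):
        z = 1 - i / n               # z in (0, 1]: upper hemisphere only
        r = math.sqrt(1 - z * z)
        theta = golden * i
        dirs.append((r * math.cos(theta), r * math.sin(theta), z))
    return dirs

# Fiber i views direction dirs[i]; its bare end sits in a fixed, known
# hole i of the planar buncher. Reading the sensor under hole i therefore
# yields the scene intensity from dirs[i]: the curved-to-flat mapping is
# preserved purely by indexing, with no distortion correction afterward.
dirs = dome_directions()
sensor_readout = [0.0] * N_FIBERS  # placeholder intensities from the chip
panorama = {dirs[i]: sensor_readout[i] for i in range(N_FIBERS)}
print(len(panorama), "direction-intensity samples")
```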
“ACEcam holds the promise of
becoming the cornerstone for future
ACEs, owing to its synergy with diverse
disciplines,” the scientists concluded.
“For instance, the imaging optical fiber
bundles can emulate natural ommatidia
to replicate optical and neural superposition observed in NCEs, potentially
enhancing ACEcam’s imaging resolution
and dynamic perception speed.” They
suggested that the integration of optofluidic lenses with ACEcam presents an
opportunity to harness the advantages
of both arthropods’ compound eyes and
vertebrate monocular eyes.
The paper was coauthored by Heng
Jiang, Chi Chung Tsoi, Weixing Yu,
Mengchao Ma, Mingjie Li, and Zuankai
Wang along with Zhang, and can be
accessed at https://doi.org/10.1038/s41377-024-01580-5.
Concept and principle of the artificial compound eye for a panoramic camera (ACEcam) that uses conical-microlens optical fibers to mimic natural ommatidia. (Credit: Light: Science & Applications)