ABSTRACT

Drones are increasingly used during routine inspections of bridges to improve data consistency, work efficiency, inspector safety, and cost effectiveness. Most drones, however, are operated manually within a visual line of sight and are thus unable to inspect long-span bridges that are not completely visible to their operators. In this paper, aerial nondestructive evaluation (aNDE) is envisioned for elevated structures such as bridges, buildings, dams, nuclear power plants, and tunnels. To enable aerial nondestructive testing (aNDT), a human-robot system is created to integrate haptic sensing and dexterous manipulation into a drone or a structural crawler in augmented/virtual reality (AR/VR) for beyond-visual-line-of-sight (BVLOS) inspection of bridges. Some of the technical challenges and potential solutions associated with aNDT&E are presented. Example applications of the advanced technologies are demonstrated on simulated bridge decks with stipulated conditions. The developed human-robot system can transform current on-site inspection into future tele-inspection, minimizing the impact on traffic passing over the bridges. The automated tele-inspection can save as much as 75% in time and 95% in cost.

KEYWORDS: robotic platform, aerial nondestructive testing and evaluation, beyond-visual-line-of-sight inspection, augmented reality

Introduction

In the United States, there are currently more than 617,000 bridges in the National Bridge Inventory. According to the 2021 American Society of Civil Engineers (ASCE) Infrastructure Report Card, more than 42% of the bridges were at least 50 years old (the design life for most existing highway bridges), and 7.5% of the bridges were considered structurally deficient or in “poor” condition (ASCE 2021). These structurally deficient bridges supported 178 million trips every day, a potential safety concern. Overall, the bridges were rated C, with A being excellent and F being a complete failure.
Other types of elevated infrastructure, such as dams, levees, transit systems, and school buildings, are in even worse condition. The current practice of visual inspection is required biennially. Bridge inspection often requires the use of heavy lifting and access equipment, thus increasing operation time and direct costs. When access to the inspected area must be made from bridge decks, the indirect costs associated with road closure multiply. In such a case, both travelers and inspectors face a safety concern on high-volume highways. Moreover, visual inspection is quite subjective and often inconsistent (Moore et al. 2001). It can only detect damage that has advanced enough to become visually apparent. It is thus of economic, psychological, and social importance to develop an alternative platform for faster, safer, cheaper, and more consistent bridge inspection with minimum impact on traffic flow. In November 2012, a robot-assisted bridge inspection tool, referred to as RABIT, was developed as a product of the Federal Highway Administration (FHWA) Long-term Bridge Performance Program (LTBPP) and applied to survey bridge decks (Gucunski et al. 2013; La et al. 2013). The RABIT was equipped with six nondestructive evaluation (NDE) devices and cameras: (a) impact echo for delamination detection; (b) ultrasonic surface wave for concrete quality evaluation; (c) ground penetrating radar (GPR) for object mapping and deck deterioration assessment; (d) electrical resistivity for concrete corrosive environment characterization; and (e) two high-resolution, panoramic cameras for deck and surrounding area imaging. To extend autonomous inspection from deck elements to an entire bridge, the INSPIRE University Transportation Center (UTC) led by Missouri University of Science and Technology (Missouri S&T) has been developing advanced technologies to aid in next-generation bridge inspection and maintenance.
AERIAL NONDESTRUCTIVE TESTING AND EVALUATION (aNDT&E)
GENDA CHEN*†, LIUJUN LI†, HAIBIN ZHANG†, ZHENHUA SHI†, AND BO SHANG†
* Department of Civil, Architectural, and Environmental Engineering, Center for Intelligent Infrastructure, Missouri University of Science and Technology; 1-573-341-4462; gchen@mst.edu
† Department of Civil, Architectural, and Environmental Engineering, Center for Intelligent Infrastructure, Missouri University of Science and Technology
Materials Evaluation 81 (1): 67–73; https://doi.org/10.32548/2023.me-04300; ©2023 American Society for Nondestructive Testing

Once integrated, the overall system with the advanced
technologies is referred to as Bridge Inspection Robot Deployment Systems (BIRDS). Specifically, structural crawlers, uncrewed aerial vehicles (UAVs), and multimodal uncrewed vehicles provide mobile platforms for in-depth inspection of bridges. For example, a multimodal uncrewed vehicle, called BridgeBot, combines the traversing capability of crawlers and the flying capability of UAVs into one system for bridge inspection. The BridgeBot can fly to the underside of a bridge deck, attach to a bridge girder, and provide an inspection platform for installed cameras to take high-resolution images of deficient areas as conventional visual inspection would do. Thermal and hyperspectral imaging methods are being developed to assess concrete delamination and steel corrosion of reinforced concrete (RC) bridges. Together with other technologies such as GPR, they provide a suite of measurement tools and methods for the NDE of structural damage and deterioration conditions in RC and steel bridges. Innovative sensors such as UAV-based smart rocks for scour monitoring and integrated point and distributed optical fiber systems for strain and corrosion monitoring provide mission-critical data, such as the maximum scour depth, corrosion-induced steel mass loss, and live load-induced strains, to normalize the NDE data taken over time at spatially distributed points. This paper intends to provide an overview of a few advanced robotic platforms and the NDE technologies they can potentially support. It is organized into five parts. After this introduction, a concept of field operation in augmented reality (AR) is first envisioned. It is then followed by supporting robotic platforms that make aerial NDE a possibility. Next, example NDE technologies suitable for installation on UAVs and robotic platforms are discussed. Finally, a few remarks are made to conclude this study and pose questions that warrant further investigation.
BVLOS Bridge Inspection via Augmented/Virtual Reality

The INSPIRE UTC has developed a mixed reality (MR) interface that streamlines the inspection process, analysis, and documentation by automating access, visualization, comparison, and assessment, enabling seamless data use from inspection to maintenance in bridge asset management. The MR interface can also be applied to beyond-visual-line-of-sight (BVLOS) NDE on flying and/or climbing robotic platforms. Currently, the Federal Aviation Administration (FAA) does not have any established regulations on the BVLOS operation of uncrewed aircraft systems (UAS). To meet increasing demand for broadening drone applications in various industries, including infrastructure construction, survey, surveillance, inspection, and maintenance, the FAA formed an Aviation Rulemaking Committee (ARC) in 2021 to develop recommendations on the guidelines for BVLOS flights of UAS (ARC 2022). The ARC included representatives from government organizations, different industries, and academia. It is thus expected that BVLOS inspection of infrastructure will likely be allowed in the years to come. As bridges continue to deteriorate, biennial inspection becomes more critical and demanding than ever before. The current practice of visual inspection requires the presence of a crew of two inspectors at any bridge site, one for inspection and paperwork and the other for photographing bridge deterioration and areas of concern. In recent years, inspectors in some states have been equipped with mobile tablets (with a flat-screen interface) running a 3D model-based data entry application (Brooks and Ahlborn 2017). The 3D model markup and rendering are often inaccurate and cannot be manipulated by the inspectors to record and visualize defects and element-level data (e.g., defect location). This shortcoming can be overcome with the aid of digital technologies in three forms. Virtual reality (VR) immerses users in a digital environment.
Augmented reality (AR) overlays digital objects onto the physical world by anchoring virtual objects to the real world; however, there is no interaction between the digital and physical elements. On the other hand, MR not only enables superposition of the two worlds but also allows the user to interact with the digital objects (Karaaslan et al. 2019). In bridge applications, MR allows inspectors to recognize their surroundings and digital contents to interact with the real bridge in three dimensions (Maharjan et al. 2021). Some of the recent AR/VR/MR development works are summarized in two review papers (Mascareñas et al. 2021; Xu and Moreu 2021). An MR interface used in an app with a Microsoft AR headset, as illustrated in Figure 1, was recently developed by the INSPIRE UTC. The MR interface includes four main components: mixed reality, element inspection panel, function menu, and database. It will likely revolutionize the 3D data collection, storage, retrieval, and analysis (or general cloud-based data management) of an entire bridge as well as robot and sensor control through wireless communication. It will

Figure 1. Bridge inspection app workflow: a mixed reality interface. [Figure shows the app linking an element inspection panel, function menu, local database, and Azure SQL cloud database; a world locking system (multi-point anchoring) and rotational alignment register the digital bridge model to the physical bridge, with inspection entries indexed by element (Ne) and component (Nc), and with synced legacy data supporting component condition details, photograph, measure, control, and display functions.]
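The Figure 1 workflow indexes each inspection entry by element (Ne) and component (Nc) and syncs records between a local store and a cloud database. A minimal sketch of such a data model is shown below; the class names, fields, and condition labels are illustrative assumptions for this article, not the actual schema of the INSPIRE UTC app.

```python
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class InspectionEntry:
    """One defect record, keyed by element (Ne) and component (Nc) indices."""
    element_id: int                       # Ne: element within a component
    component_id: int                     # Nc: component within the bridge model
    condition: str                        # e.g., "good", "fair", "poor"
    photo_path: Optional[str] = None      # photograph captured via the headset
    measurement_mm: Optional[float] = None  # defect measurement, if taken


@dataclass
class BridgeRecord:
    """Digital bridge model whose entries would be synced to a cloud database."""
    name: str
    entries: List[InspectionEntry] = field(default_factory=list)

    def add_entry(self, entry: InspectionEntry) -> None:
        self.entries.append(entry)

    def component_conditions(self, component_id: int) -> List[str]:
        # Summarize conditions recorded for one component (Nc),
        # as in the "component condition details" view of Figure 1.
        return [e.condition for e in self.entries if e.component_id == component_id]


bridge = BridgeRecord("Demo Bridge")
bridge.add_entry(InspectionEntry(element_id=1, component_id=2, condition="fair"))
bridge.add_entry(InspectionEntry(element_id=3, component_id=2, condition="poor"))
print(bridge.component_conditions(2))  # ['fair', 'poor']
```

In practice, the local store and the Azure SQL database would hold equivalent records so that entries captured in the field can be reconciled with legacy data once connectivity is available.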