technologies is referred to as Bridge Inspection Robot Deployment Systems (BIRDS). Specifically, structural crawlers, uncrewed aerial vehicles (UAVs), and multimodal uncrewed vehicles provide mobile platforms for in-depth inspection of bridges. For example, a multimodal uncrewed vehicle called BridgeBot combines the traversing capability of crawlers and the flying capability of UAVs into one system for bridge inspection. The BridgeBot can fly to the underside of a bridge deck, attach to a bridge girder, and provide an inspection platform for installed cameras to take high-resolution images of deficient areas, as conventional visual inspection would do.

Thermal and hyperspectral imaging methods are being developed to assess concrete delamination and steel corrosion in reinforced concrete (RC) bridges. Together with other technologies such as ground penetrating radar (GPR), they provide a suite of measurement tools and methods for the NDE of structural damage and deterioration conditions in RC and steel bridges. Innovative sensors such as UAV-based smart rocks for scour monitoring and integrated point and distributed optical fiber systems for strain and corrosion monitoring provide mission-critical data, such as the maximum scour depth, corrosion-induced steel mass loss, and live load-induced strains, to normalize the NDE data taken over time at spatially distributed points.

This paper provides an overview of a few advanced robotic platforms and the NDE technologies they can potentially support. It is organized into five parts. After this introduction, a concept of field operation in augmented reality (AR) is first envisioned. It is then followed by the supporting robotic platforms that make aerial NDE a possibility. Next, example NDE technologies suitable for installation on UAVs and robotic platforms are discussed. Finally, a few remarks are made to conclude this study and pose questions that warrant further investigation.

BVLOS Bridge Inspection via Augmented/Virtual Reality

The INSPIRE UTC has developed a mixed reality (MR) interface that streamlines the inspection process, analysis, and documentation for seamless data use from inspection to maintenance in bridge asset management by automating access, visualization, comparison, and assessment. The goal is to apply the MR interface to beyond-visual-line-of-sight (BVLOS) NDE on flying and/or climbing robotic platforms.

Currently, the Federal Aviation Administration (FAA) has no established regulations for the BVLOS operation of uncrewed aircraft systems (UAS). To meet the increasing demand for broader drone applications in various industries, including infrastructure construction, survey, surveillance, inspection, and maintenance, the FAA formed an Aviation Rulemaking Committee (ARC) in 2021 to develop recommendations and guidelines for BVLOS flights of UAS (ARC 2022). The ARC was represented by government organizations, various industries, and academia. It is thus expected that BVLOS inspection of infrastructure will likely be allowed in the years to come.

As bridges continue to deteriorate, biennial inspection becomes more critical and demanding than ever before. The current practice of visual inspection requires a crew of two inspectors at any bridge site: one for inspection and paperwork, and the other for photographing bridge deterioration and areas of concern.
In recent years, inspectors in some states have been equipped with mobile tablets (with a flat-screen interface) running a 3D model-based data entry application (Brooks and Ahlborn 2017). The 3D model markup and rendering are often inaccurate and cannot be manipulated by the inspectors to record and visualize defects and element-level data (e.g., defect location). This shortcoming can be overcome with the aid of digital technologies in three forms. Virtual reality (VR) immerses users in a digital environment. Augmented reality (AR) overlays digital objects onto the physical world by anchoring virtual objects to real-world features; however, it provides no interaction between the digital and physical elements. MR, on the other hand, not only enables superposition of the two worlds but also allows the user to interact with the digital objects (Karaaslan et al. 2019). In bridge applications, MR allows inspectors to recognize their surroundings and digital contents to interact with the real bridge in three dimensions (Maharjan et al. 2021). Some of the recent AR/VR/MR development efforts are summarized in two review papers (Mascareñas et al. 2021; Xu and Moreu 2021).

An MR interface used in an app with a Microsoft AR headset, as illustrated in Figure 1, was recently developed by the INSPIRE UTC. The MR interface includes four main components: mixed reality, element inspection panel, function menu, and database. It will likely revolutionize the 3D data collection, storage, retrieval, and analysis (or general cloud-based data management) of an entire bridge as well as robot and sensor control through wireless communication.

Figure 1. Bridge inspection app workflow: a mixed reality interface.
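To make the database component more concrete, the short Python sketch below shows one way an inspection entry keyed by element and component numbers could be represented and staged locally as a .csv file for later cloud synchronization. This is a minimal illustration only; the class, field names, and example values (bridge ID, element/component numbers, condition state, file names) are hypothetical and do not represent the actual INSPIRE UTC schema.

```python
import csv
from dataclasses import dataclass, asdict, field
from datetime import date

@dataclass
class DefectEntry:
    """One discontinuity record tied to an element/component pair (hypothetical schema)."""
    bridge_id: str
    element_no: int          # element number (illustrative numbering)
    component_no: int        # component number within that element
    defect_type: str         # e.g., "spall", "crack", "corrosion stain"
    condition_state: int     # inspector-assigned condition state
    location_xyz: tuple      # model coordinates where the photo was anchored (m)
    photo_file: str
    inspected_on: str = field(default_factory=lambda: date.today().isoformat())

def export_entries(entries, path):
    """Stage entries in a local .csv file for later synchronization to a cloud database."""
    rows = [asdict(e) for e in entries]
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=rows[0].keys())
        writer.writeheader()
        writer.writerows(rows)

# Example usage with made-up values
entries = [DefectEntry("10th-St-Rolla", 12, 3, "spall", 2, (12.4, 5.1, 2.0), "IMG_0042.jpg")]
export_entries(entries, "inspection_entries.csv")
```

Keeping one flat record per discontinuity makes the locally staged file straightforward to map onto a cloud SQL table once connectivity is available.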
The MR interface will also provide an inspector with hands/voice-controlled robot operation following a flight mission plan, aNDT&E execution, and intraoperative hands-free access to complex data, real environments, and two-way communication.

As an engine of the MR interface, the World Locking System (WLS) toolkit locks holograms in place as the user walks around, so that "space pins" can be added at specific locations of the model to align it with corresponding features of the physical bridge. The MR bridge environment imports a high-resolution, 3D reconstructed, and georeferenced bridge model at 5 cm/pixel from a laser scanner and stores and visualizes the metadata (such as past inspection reports and photos of discontinuities, including their size, shape, and location). The region-of-interest (ROI) discontinuities can be compared and annotated as needed by retrieving the historical inspection data and appending the current inspection data. A database is established to automate the bridge inspection and reporting process according to the 2019 AASHTO Manual for Bridge Element Inspection. Therefore, the efficiency and accuracy of bridge element field inspection can be dramatically improved with the developed MR interface.

A point cloud model of the bridge on 10th Street in Rolla, Missouri, was established in SketchUp and Unity, as shown in Figure 2. The model texture and size were maintained. When the bridge was scaled 1:1 in Unity, it was the same size as the actual bridge. This allowed the bridge model to be overlaid on the actual bridge with virtual and physical features roughly aligned. The model was scaled, rotated, and repositioned in the X, Y, and Z directions, either manually or by entering an accurate desired value, to improve the accuracy of bridge alignment.

WLS and its space pin feature were implemented to align local features of the bridge. WLS used an alignment manager called the Frozen Engine to lock the world space. Space pins were added as small objects that could be individually positioned on physical objects at runtime, after which the Frozen Engine adjusted the view of the model to align. In this way, the bridge model was aligned more accurately and anchored to the real bridge for future revisits.

After the 3D bridge model was overlaid with the real bridge asset, the Photograph mode could capture discontinuity areas and localize them accordingly. The discontinuity pictures and their locations, preliminary bridge element category (subject to later review and confirmation), and service conditions were annotated as illustrated in Figure 2. The discontinuity metadata were saved to a .csv file, together with the bridge inspection legacy data, for cloud synchronization with the Azure SQL database.

The Measure mode allowed the user to select a start point and then raycast measurement points in sequence. Dimension measurement along specific surfaces was enabled for bridge element inspections. Similarly, quantity measurement of discontinuity areas or volumes was performed for each structural component or limit state in service.

The Control mode (to be developed) will enable the user to guide the navigation of the robotic platform according to a predefined mission plan and execute an aNDT&E task. The user will closely coordinate these tasks with the UAV/robotic platform and an on-site safety worker through wireless communication. The task will not be directly executed at the UAV or robotic platform but teleoperated by the user through haptic sensing and dexterous manipulation (Kim 2021).
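The multi-point anchoring described above can be viewed as a rigid registration problem: given a few corresponding point pairs (space pin locations picked on the digital model and on the physical bridge), find the rotation and translation that best map one set onto the other. The Python sketch below illustrates that underlying idea with the standard SVD-based (Kabsch) least-squares solution. It is a conceptual illustration only, not the WLS/Frozen Engine implementation, and the anchor coordinates are made up.

```python
import numpy as np

def estimate_rigid_transform(model_pts, physical_pts):
    """Least-squares rotation R and translation t mapping model points onto physical anchors."""
    P = np.asarray(model_pts, dtype=float)      # N x 3 points picked on the digital model
    Q = np.asarray(physical_pts, dtype=float)   # N x 3 corresponding physical pin locations
    p_mean, q_mean = P.mean(axis=0), Q.mean(axis=0)
    H = (P - p_mean).T @ (Q - q_mean)           # cross-covariance of centered point sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against a reflection solution
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = q_mean - R @ p_mean
    return R, t

# Example: three pins picked at girder/abutment corners (hypothetical coordinates, meters)
model_anchors    = [[0.0, 0.0, 0.0], [12.2, 0.0, 0.0], [12.2, 0.0, 2.4]]
physical_anchors = [[3.1, 5.0, 0.2], [15.2, 5.4, 0.1], [15.3, 5.4, 2.5]]
R, t = estimate_rigid_transform(model_anchors, physical_anchors)
aligned = (R @ np.asarray(model_anchors).T).T + t   # model anchors mapped near the physical pins
```

Adding pin pairs at well-separated bridge features over-determines the fit and averages out per-pin placement error, which is consistent with the observation that alignment accuracy improves as more space pins are placed.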
The Display mode allowed the discontinuity photo, the measured distance, and/or the flight and test scenario to be visualized for further analysis. Besides the local database, where information was fed directly from the application at runtime, an Azure cloud database based on SQL was accessible whenever an internet connection became available. This allowed the operator to work at a remote site and then synchronize the locally collected data to the cloud database. This practice is crucial to prevent data loss and to allow data access anytime and anywhere.

Figure 2. Hands-free bridge inspection enabled by augmented reality.

Robotic Platforms to Support aNDT&E

In current practice, a moving inspection platform that costs about US$1 million, as shown in Figure 3a, is often installed between girders in the superstructure of river-crossing bridges to provide access during bridge inspection and maintenance. Mobile platforms such as structural crawlers on concrete/masonry walls (Yang et al. 2019) and steel members (Nguyen and La 2019) have recently been developed to support bridge inspection with advanced data-driven evaluation technologies based on aNDT. For example, a four-wheel magnetically attached structural crawler, as shown in Figure 3b, that