Journal articles on the topic 'Sensors fusion for localisation'


Consult the top 50 journal articles for your research on the topic 'Sensors fusion for localisation.'


1

Ashokaraj, Immanuel, Antonios Tsourdos, Peter Silson, and Brian White. "SENSOR BASED ROBOT LOCALISATION AND NAVIGATION: USING INTERVAL ANALYSIS AND NONLINEAR KALMAN FILTERS." Transactions of the Canadian Society for Mechanical Engineering 29, no. 2 (June 2005): 211–27. http://dx.doi.org/10.1139/tcsme-2005-0014.

Abstract:
Multiple sensor fusion for robot localisation and navigation has attracted a lot of interest in recent years. This paper describes a sensor-based navigation and localisation approach for an autonomous mobile robot using an interval analysis (IA) based adaptive mechanism for the non-linear Kalman filter, namely the Extended Kalman Filter (EKF). The map used for this study is two-dimensional and assumed to be known a priori. The robot is equipped with inertial sensors (INS), encoders and ultrasonic sensors. A non-linear Kalman filter is used to estimate the robot's position from the inertial sensors and encoders, while the ultrasonic sensors feed an interval analysis algorithm for guaranteed robot localisation. Since the Kalman filter estimates may be affected by bias, drift, etc., we propose an adaptive mechanism using IA to correct these defects in the estimates. In the presence of landmarks, the complementary interval position information from the IA algorithm (with a uniform distribution), obtained using the ultrasonic sensors, is used to estimate and bound the errors in the non-linear Kalman filter position estimate (with a Gaussian distribution).
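To make the fusion idea above concrete, here is a minimal Python sketch (not the authors' implementation) of an EKF-style position estimate that is subsequently clipped into a guaranteed interval box such as one produced by an interval-analysis step; the constant-velocity model, noise values and the `interval_correct` helper are illustrative assumptions.

```python
import numpy as np

def ekf_predict(x, P, F, Q):
    """Standard Kalman prediction for a linear(ised) motion model."""
    return F @ x, F @ P @ F.T + Q

def ekf_update(x, P, z, H, R):
    """Standard Kalman measurement update."""
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

def interval_correct(x, P, box_lo, box_hi):
    """Adaptive correction: if the EKF position drifts outside the guaranteed
    interval box [box_lo, box_hi] from the IA step, clip it back into the box
    (a crude stand-in for the paper's adaptive mechanism)."""
    x = x.copy()
    x[:2] = np.clip(x[:2], box_lo, box_hi)
    return x, P

# Toy run: state = [px, py, vx, vy], odometry observes position only.
dt = 0.1
F = np.block([[np.eye(2), dt * np.eye(2)], [np.zeros((2, 2)), np.eye(2)]])
Q = 0.01 * np.eye(4)
H = np.hstack([np.eye(2), np.zeros((2, 2))])
R = 0.05 * np.eye(2)

x, P = np.zeros(4), np.eye(4)
x, P = ekf_predict(x, P, F, Q)
x, P = ekf_update(x, P, np.array([0.12, -0.03]), H, R)
x, P = interval_correct(x, P, np.array([0.0, -0.05]), np.array([0.1, 0.05]))
print(x)
```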
2

Meng, Lijun, Zhengang Guo, and Chenglong Ma. "Research on multiple damage localisation based on fusion of the Lamb wave ellipse algorithm and RAPID algorithm." Insight - Non-Destructive Testing and Condition Monitoring 66, no. 1 (January 1, 2024): 34–40. http://dx.doi.org/10.1784/insi.2024.66.1.34.

Abstract:
Current damage localisation methods often require many sensors and complex signal processing methods. This paper proposes a fusion algorithm based on elliptical localisation and the reconstruction algorithm for probabilistic inspection of damage (RAPID) to locate and image multiple damages. Experimental verification of the damage algorithm was conducted. An ultrasonic probe was used to excite Lamb signals on an aluminium alloy plate, the ultrasonic response signals at different positions within the plate under multiple damages were measured and the constructed algorithm was employed to image the damage location. In the experiment, this method improved localisation efficiency by excluding invalid sensing paths in the sensor network, saving 31.32% of computational time. When some sensors in the sensor network were damaged, this algorithm ensured a positioning accuracy with a positioning error of 5.83 mm. Finally, the algorithm was used to locate multiple damages in the sensor network and the results showed the good robustness of the algorithm.
3

Nikitenko, Agris, Aleksis Liekna, Martins Ekmanis, Guntis Kulikovskis, and Ilze Andersone. "Single Robot Localisation Approach for Indoor Robotic Systems through Integration of Odometry and Artificial Landmarks." Applied Computer Systems 14, no. 1 (June 1, 2013): 50–58. http://dx.doi.org/10.2478/acss-2013-0006.

Abstract:
We present an integrated approach for robot localisation that fuses artificial-landmark localisation data with odometric sensor and signal transfer function data, providing the means to support different practical application scenarios. The sensor data fusion handles asynchronous sensor data using the inverse Laplace transform. We demonstrate a simulation software system that ensures smooth integration of odometry-based and signal-transfer-based localisation into one approach.
4

Tibebu, Haileleol, Varuna De-Silva, Corentin Artaud, Rafael Pina, and Xiyu Shi. "Towards Interpretable Camera and LiDAR Data Fusion for Autonomous Ground Vehicles Localisation." Sensors 22, no. 20 (October 20, 2022): 8021. http://dx.doi.org/10.3390/s22208021.

Abstract:
Recent deep learning frameworks have drawn strong research interest for ego-motion estimation, as they demonstrate superior results compared to geometric approaches. However, due to the lack of multimodal datasets, most of these studies have focused primarily on single-sensor-based estimation. To overcome this challenge, we collect a unique multimodal dataset named LboroAV2 using multiple sensors, including a camera, light detection and ranging (LiDAR), ultrasound, an e-compass and a rotary encoder. We also propose an end-to-end deep learning architecture for the fusion of RGB images and LiDAR laser scan data for odometry applications. The proposed method contains a convolutional encoder, a compressed representation and a recurrent neural network. Besides feature extraction and outlier rejection, the convolutional encoder produces a compressed representation, which is used to visualise the network's learning process and to pass useful sequential information. The recurrent neural network uses this compressed sequential data to learn the relationship between consecutive time steps. We use the Loughborough autonomous vehicle (LboroAV2) and the Karlsruhe Institute of Technology and Toyota Technological Institute (KITTI) Visual Odometry (VO) datasets to experiment with and evaluate our results. In addition to visualising the network's learning process, our approach provides superior results compared to other similar methods. The code for the proposed architecture is released on GitHub and is publicly accessible.
5

Moretti, Michele, Federico Bianchi, and Nicola Senin. "Towards the development of a smart fused filament fabrication system using multi-sensor data fusion for in-process monitoring." Rapid Prototyping Journal 26, no. 7 (June 26, 2020): 1249–61. http://dx.doi.org/10.1108/rpj-06-2019-0167.

Abstract:
Purpose: This paper aims to illustrate the integration of multiple heterogeneous sensors into a fused filament fabrication (FFF) system and the implementation of multi-sensor data fusion technologies to support the development of a “smart” machine capable of monitoring the manufacturing process and part quality as it is being built.
Design/methodology/approach: Starting from off-the-shelf FFF components, the paper discusses the issues related to how the machine architecture and the FFF process itself must be redesigned to accommodate heterogeneous sensors and how data from such sensors can be integrated. The usefulness of the approach is discussed through illustration of detectable, example defects.
Findings: Through aggregation of heterogeneous in-process data, a smart FFF system developed upon the architectural choices discussed in this work has the potential to recognise a number of process-related issues leading to defective parts.
Research limitations/implications: Although the implementation is specific to a type of FFF hardware and type of processed material, the conclusions are of general validity for material extrusion processes of polymers.
Practical implications: Effective in-process sensing enables timely detection of process or part quality issues, thus allowing for early process termination or application of corrective actions, leading to significant savings for high value-added parts.
Originality/value: While most current literature on FFF process monitoring has focused on monitoring selected process variables, in this work a wider perspective is gained by aggregation of heterogeneous sensors, with particular focus on achieving co-localisation in space and time of the sensor data acquired within the same fabrication process. This allows for the detection of issues that no sensor alone could reliably detect.
6

Donati, Cesare, Martina Mammarella, Lorenzo Comba, Alessandro Biglia, Paolo Gay, and Fabrizio Dabbene. "3D Distance Filter for the Autonomous Navigation of UAVs in Agricultural Scenarios." Remote Sensing 14, no. 6 (March 11, 2022): 1374. http://dx.doi.org/10.3390/rs14061374.

Abstract:
In precision agriculture, remote sensing is an essential phase in assessing crop status and variability when considering both the spatial and the temporal dimensions. To this aim, the use of unmanned aerial vehicles (UAVs) is growing in popularity, allowing for the autonomous performance of a variety of in-field tasks which are not limited to scouting or monitoring. To enable autonomous navigation, however, a crucial capability lies in accurately locating the vehicle within the surrounding environment. This task becomes challenging in agricultural scenarios where the crops and/or the adopted trellis systems can negatively affect GPS signal reception and localisation reliability. A viable solution to this problem can be the exploitation of high-accuracy 3D maps, which provide important data regarding crop morphology, as an additional input of the UAVs’ localisation system. However, the management of such big data may be difficult in real-time applications. In this paper, an innovative 3D sensor fusion approach is proposed, which combines the data provided by onboard proprioceptive (i.e., GPS and IMU) and exteroceptive (i.e., ultrasound) sensors with the information provided by a georeferenced 3D low-complexity map. In particular, the parallel-cuts ellipsoid method is used to merge the data from the distance sensors and the 3D map. Then, the improved estimation of the UAV location is fused with the data provided by the GPS and IMU sensors, using a Kalman-based filtering scheme. The simulation results prove the efficacy of the proposed navigation approach when applied to a quadrotor that autonomously navigates between vine rows.
7

Kozłowski, Michał, Raúl Santos-Rodríguez, and Robert Piechocki. "Sensor Modalities and Fusion for Robust Indoor Localisation." ICST Transactions on Ambient Systems 6, no. 18 (December 12, 2019): 162670. http://dx.doi.org/10.4108/eai.12-12-2019.162670.

8

Merfels, Christian, and Cyrill Stachniss. "Sensor Fusion for Self-Localisation of Automated Vehicles." PFG – Journal of Photogrammetry, Remote Sensing and Geoinformation Science 85, no. 2 (March 7, 2017): 113–26. http://dx.doi.org/10.1007/s41064-017-0008-1.

9

Ciuffreda, Ilaria, Sara Casaccia, and Gian Marco Revel. "A Multi-Sensor Fusion Approach Based on PIR and Ultrasonic Sensors Installed on a Robot to Localise People in Indoor Environments." Sensors 23, no. 15 (August 5, 2023): 6963. http://dx.doi.org/10.3390/s23156963.

Abstract:
This work illustrates an innovative localisation sensor network that uses multiple PIR and ultrasonic sensors installed on a mobile social robot to localise occupants in indoor environments. The system presented aims to measure movement direction and distance to reconstruct the movement of a person in an indoor environment by using sensor activation strategies and data processing techniques. The data collected are then analysed using both a supervised (Decision Tree) and an unsupervised (K-Means) machine learning algorithm to extract the direction and distance of occupant movement from the measurement system, respectively. Tests in a controlled environment have been conducted to assess the accuracy of the methodology when multiple PIR and ultrasonic sensor systems are used. In addition, a qualitative evaluation of the system’s ability to reconstruct the movement of the occupant has been performed. The system proposed can reconstruct the direction of an occupant with an accuracy of 70.7% and uncertainty in distance measurement of 6.7%.
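A minimal sketch of the two learning steps named in the abstract, using scikit-learn on synthetic stand-in data; the feature layout, labels and cluster count are assumptions, not the paper's configuration.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Synthetic stand-in for the sensor features described in the abstract:
# PIR activation pattern (4 binary channels) labelled with a movement
# direction, plus ultrasonic range readings grouped into distance bands.
pir_features = rng.integers(0, 2, size=(200, 4))
directions = rng.integers(0, 4, size=200)          # 0..3 = N/E/S/W (assumed)

# Supervised step: a decision tree maps PIR activations to a direction class.
tree = DecisionTreeClassifier(max_depth=4).fit(pir_features, directions)
print("direction prediction:", tree.predict(pir_features[:3]))

# Unsupervised step: K-Means groups ultrasonic ranges into distance classes.
ranges_cm = rng.uniform(30, 400, size=(200, 1))
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(ranges_cm)
print("distance cluster of a 120 cm reading:", kmeans.predict([[120.0]]))
```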
10

Neuland, Renata, Mathias Mantelli, Bernardo Hummes, Luc Jaulin, Renan Maffei, Edson Prestes, and Mariana Kolberg. "Robust Hybrid Interval-Probabilistic Approach for the Kidnapped Robot Problem." International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems 29, no. 02 (April 2021): 313–31. http://dx.doi.org/10.1142/s0218488521500141.

Abstract:
For a mobile robot to operate in its environment it is crucial to determine its position with respect to an external reference frame using noisy sensor readings. A scenario in which the robot is moved to another position during its operation without being told, known as the kidnapped robot problem, complicates global localisation. In addition to that, sensor malfunction and external influences of the environment can cause unexpected errors, called outliers, that negatively affect the localisation process. This paper proposes a method based on the fusion of a particle filter with bounded-error localisation, which is able to deal with outliers in the measurement data. The application of our algorithm to solve the kidnapped robot problem using simulated data shows an improvement over conventional probabilistic filtering methods.
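The abstract's hybrid idea, i.e. gating particle-filter hypotheses with a bounded-error (interval) box, can be sketched as follows; the motion and measurement models, thresholds and re-seeding rule are illustrative assumptions rather than the authors' algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)

def particle_filter_step(particles, weights, motion, meas, box, meas_std=0.2):
    """One hybrid update: standard particle-filter weighting, with particles
    falling outside the bounded-error box (from interval localisation)
    treated as inconsistent and given zero weight."""
    particles = particles + motion + rng.normal(0, 0.05, particles.shape)
    # Probabilistic likelihood of a (possibly outlier-contaminated) position fix.
    weights = weights * np.exp(-np.sum((particles - meas) ** 2, axis=1)
                               / (2 * meas_std ** 2))
    # Set-membership gate: keep only particles inside the guaranteed box.
    inside = np.all((particles >= box[0]) & (particles <= box[1]), axis=1)
    weights = np.where(inside, weights, 0.0)
    if weights.sum() == 0.0:            # kidnapped / all gated out: re-seed
        particles = rng.uniform(box[0], box[1], particles.shape)
        weights = np.ones(len(particles))
    weights /= weights.sum()
    return particles, weights

particles = rng.uniform(-5, 5, size=(500, 2))
weights = np.ones(500) / 500
box = (np.array([0.5, 0.5]), np.array([2.0, 2.0]))   # from the interval method
particles, weights = particle_filter_step(
    particles, weights, motion=np.array([0.1, 0.0]),
    meas=np.array([1.2, 1.1]), box=box)
print("estimate:", np.average(particles, axis=0, weights=weights))
```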
11

Suhr, J. K., and H. G. Jung. "Sensor fusion‐based precise obstacle localisation for automatic parking systems." Electronics Letters 54, no. 7 (April 2018): 445–47. http://dx.doi.org/10.1049/el.2018.0196.

12

Uney, Murat, Bernard Mulgrew, and Daniel E. Clark. "A Cooperative Approach to Sensor Localisation in Distributed Fusion Networks." IEEE Transactions on Signal Processing 64, no. 5 (March 2016): 1187–99. http://dx.doi.org/10.1109/tsp.2015.2493981.

13

Kalisperakis, I., T. Mandilaras, A. El Saer, P. Stamatopoulou, C. Stentoumis, S. Bourou, and L. Grammatikopoulos. "A MODULAR MOBILE MAPPING PLATFORM FOR COMPLEX INDOOR AND OUTDOOR ENVIRONMENTS." ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XLIII-B1-2020 (August 6, 2020): 243–50. http://dx.doi.org/10.5194/isprs-archives-xliii-b1-2020-243-2020.

Abstract:
In this work we present the development of a prototype mobile mapping platform with a modular design and architecture that can be suitably modified to address both outdoor and indoor environments effectively. Our system is built on the Robot Operating System (ROS) and utilizes multiple sensors to capture images, pointclouds and 3D motion trajectories. These include synchronized cameras with wide-angle lenses, a lidar sensor, a GPS/IMU unit and a tracking optical sensor. We report on the individual components of the platform, its architecture, the integration and calibration of its components and the fusion of all recorded data, and provide initial 3D reconstruction results. The processing algorithms are based on existing implementations of SLAM (Simultaneous Localisation and Mapping) methods combined with SfM (Structure-from-Motion) for optimal estimation of orientations and 3D pointclouds. The scope of this work, which is part of an ongoing H2020 program, is to digitize the physical world, collect relevant spatial data and make digital copies available to experts and the public to cover a wide range of needs: remote access and viewing, processing, design, use in VR, etc.
14

Wu, Xinzhao, Peiqing Li, Qipeng Li, and Zhuoran Li. "Two-dimensional-simultaneous Localisation and Mapping Study Based on Factor Graph Elimination Optimisation." Sustainability 15, no. 2 (January 8, 2023): 1172. http://dx.doi.org/10.3390/su15021172.

Abstract:
A robust multi-sensor fusion simultaneous localization and mapping (SLAM) algorithm for complex road surfaces is proposed to improve recognition accuracy and reduce system memory occupation, aiming to enhance the computational efficiency of light detection and ranging in complex environments. First, a weighted signed distance function (W-SDF) map-based SLAM method is proposed. It uses a W-SDF map to capture the environment with less accuracy than the raster size but with high localization accuracy. The Levenberg–Marquardt method is used to solve the scan-matching problem in laser SLAM; it effectively alleviates the limitations of the Gauss–Newton method that may lead to insufficient local accuracy, and reduces localisation errors. Second, ground constraint factors are added to the factor graph, and a multi-sensor fusion localisation algorithm is proposed based on factor graph elimination optimisation. A sliding window is added to the chain factor graph model to retain the historical state information within the window and avoid high-dimensional matrix operations. An elimination algorithm is introduced to transform the factor graph into a Bayesian network to marginalize the historical states and reduce the matrix dimensionality, thereby improving the algorithm localisation accuracy and reducing the memory occupation. Finally, the proposed algorithm is compared and validated with two traditional algorithms based on an unmanned cart. Experiments show that the proposed algorithm reduces memory consumption and improves localisation accuracy compared to the Hector algorithm and Cartographer algorithm, has good performance in terms of accuracy, reliability and computational efficiency in complex pavement environments, and is better utilised in practical environments.
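The sliding-window elimination described above ultimately reduces to marginalising old states out of the factor-graph information system; below is a hedged numpy sketch of that Schur-complement step, using a toy chain of scalar poses rather than the paper's W-SDF and ground-constrained graph.

```python
import numpy as np

def marginalise_oldest(H, b, k):
    """Marginalise the first k state variables out of the Gaussian
    information system H x = b (H: information matrix, b: information
    vector) via the Schur complement; this is the linear-algebra core of
    sliding-window elimination in factor-graph optimisation."""
    H_mm, H_mr, H_rr = H[:k, :k], H[:k, k:], H[k:, k:]
    b_m, b_r = b[:k], b[k:]
    H_mm_inv = np.linalg.inv(H_mm)
    H_marg = H_rr - H_mr.T @ H_mm_inv @ H_mr
    b_marg = b_r - H_mr.T @ H_mm_inv @ b_m
    return H_marg, b_marg

# Toy information system over 4 scalar poses linked by chain (odometry) factors.
H = (np.diag([2.0, 3.0, 3.0, 2.0])
     + np.diag([-1.0, -1.0, -1.0], 1)
     + np.diag([-1.0, -1.0, -1.0], -1))
b = np.array([0.1, 0.0, 0.2, 0.3])
H2, b2 = marginalise_oldest(H, b, k=1)   # drop the oldest pose
print(np.linalg.solve(H2, b2))           # estimate of the remaining poses
```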
15

Sheng, Weihua, and Ravi K. Garimella. "Localisation of multiple mobile subjects using multidimensional scaling and sensor fusion." International Journal of Ad Hoc and Ubiquitous Computing 11, no. 4 (2012): 214. http://dx.doi.org/10.1504/ijahuc.2012.050431.

16

Lai, Tin. "A Review on Visual-SLAM: Advancements from Geometric Modelling to Learning-Based Semantic Scene Understanding Using Multi-Modal Sensor Fusion." Sensors 22, no. 19 (September 25, 2022): 7265. http://dx.doi.org/10.3390/s22197265.

Abstract:
Simultaneous Localisation and Mapping (SLAM) is one of the fundamental problems in autonomous mobile robots where a robot needs to reconstruct a previously unseen environment while simultaneously localising itself with respect to the map. In particular, Visual-SLAM uses various sensors from the mobile robot for collecting and sensing a representation of the map. Traditionally, geometric model-based techniques were used to tackle the SLAM problem, which tends to be error-prone under challenging environments. Recent advancements in computer vision, such as deep learning techniques, have provided a data-driven approach to tackle the Visual-SLAM problem. This review summarises recent advancements in the Visual-SLAM domain using various learning-based methods. We begin by providing a concise overview of the geometric model-based approaches, followed by technical reviews on the current paradigms in SLAM. Then, we present the various learning-based approaches to collecting sensory inputs from mobile robots and performing scene understanding. The current paradigms in deep-learning-based semantic understanding are discussed and placed under the context of Visual-SLAM. Finally, we discuss challenges and further opportunities in the direction of learning-based approaches in Visual-SLAM.
17

Butz, Tobias, Uwe Wurster, Gert F. Trommer, and Matthias Wankerl. "Simulated GPS space segment and sensor fusion for lane-level accurate localisation." ATZelektronik worldwide 7, no. 3 (June 2012): 10–15. http://dx.doi.org/10.1365/s38314-012-0088-z.

18

Zhang, Yong, Liyi Zhang, Jianfeng Han, Yi Yang, and Xinyuan Ma. "Sensor management based on collaborative information fusion algorithm for gas source localisation." International Journal of Sensor Networks 28, no. 4 (2018): 242. http://dx.doi.org/10.1504/ijsnet.2018.096478.

19

Ma, Xinyuan, Yi Yang, Jianfeng Han, Liyi Zhang, and Yong Zhang. "Sensor management based on collaborative information fusion algorithm for gas source localisation." International Journal of Sensor Networks 28, no. 4 (2018): 242. http://dx.doi.org/10.1504/ijsnet.2018.10017833.

20

Liu, Ang, Shiwei Lin, Jianguo Wang, and Xiaoying Kong. "A Novel Loosely Coupling Fusion Approach of Ultra-Wideband and Wheel Odometry for Indoor Localisation." Electronics 12, no. 21 (November 1, 2023): 4499. http://dx.doi.org/10.3390/electronics12214499.

Abstract:
Ultra-wideband (UWB) systems promise centimetre-level accuracy for indoor positioning, yet they remain susceptible to non-line-of-sight (NLOS) errors due to complex indoor environments. A fusion mechanism that integrates the UWB with an odometer sensor is introduced to address this challenge and achieve a high positioning accuracy. A sliding window method is applied to identify NLOS anchors effectively. The modified UWB-only positioning has an average error under 13 cm with an RMSE of 16 cm. Then, a loosely coupled approach named Dynamic Dimension Fusion (DDF) is designed to mitigate the odometer’s cumulative errors that achieve a remarkable average error and RMSE below 5 cm, notably superior to established unscented Kalman filter (UKF) fusion techniques. DDF utilises UWB data to correct the one-dimensional heading error of the odometer when the robot moves in a straight line and to correct both heading and mileage in two dimensions when the robot is turning. Comprehensive real-world experimental evaluations underscore the efficacy and robustness of this novel approach.
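A hedged sketch of the sliding-window NLOS-anchor test described in the abstract: per-anchor range residuals are buffered, and an anchor is excluded when the window shows a bias or excessive spread. The window length and thresholds are assumptions, not the paper's values.

```python
import numpy as np
from collections import deque

class NlosDetector:
    """Sliding-window residual test per UWB anchor: compare the measured
    range against the range predicted from the current pose estimate and
    flag the anchor as NLOS when the recent residuals look biased or noisy."""
    def __init__(self, window=20, bias_thresh=0.3, std_thresh=0.25):
        self.buffers = {}
        self.window = window
        self.bias_thresh = bias_thresh
        self.std_thresh = std_thresh

    def update(self, anchor_id, measured_range, predicted_range):
        buf = self.buffers.setdefault(anchor_id, deque(maxlen=self.window))
        buf.append(measured_range - predicted_range)
        if len(buf) < self.window:
            return True                      # not enough evidence yet: keep anchor
        r = np.asarray(buf)
        return abs(r.mean()) < self.bias_thresh and r.std() < self.std_thresh

det = NlosDetector()
for t in range(25):
    ok = det.update("A0", measured_range=5.4, predicted_range=5.0)
print("anchor A0 line-of-sight?", ok)        # biased residuals -> flagged NLOS
```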
21

Ochoa-de-Eribe-Landaberea, Aitor, Leticia Zamora-Cadenas, Oier Peñagaricano-Muñoa, and Igone Velez. "UWB and IMU-Based UAV’s Assistance System for Autonomous Landing on a Platform." Sensors 22, no. 6 (March 18, 2022): 2347. http://dx.doi.org/10.3390/s22062347.

Abstract:
This work presents a novel landing assistance system (LAS) capable of locating a drone for a safe landing after its inspection mission. The location of the drone is achieved by a fusion of ultra-wideband (UWB), inertial measurement unit (IMU) and magnetometer data. Unlike other typical landing assistance systems, the UWB fixed sensors are placed around a 2 × 2 m landing platform and two tags are attached to the drone. Since this type of set-up is suboptimal for UWB location systems, a new positioning algorithm is proposed for a correct performance. First, an extended Kalman filter (EKF) algorithm is used to calculate the position of each tag, and then both positions are combined for a more accurate and robust localisation. As a result, the obtained positioning errors can be reduced by 50% compared to a typical UWB-based landing assistance system. Moreover, due to the small demand of space, the proposed landing assistance system can be used almost anywhere and is deployed easily.
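The final step of combining the two per-tag estimates can be illustrated with a standard covariance-weighted (information-form) fusion; this is a generic sketch, not necessarily the exact combination rule used in the paper, and the numbers below are invented.

```python
import numpy as np

def fuse_two_estimates(p1, P1, p2, P2):
    """Covariance-weighted combination of two position estimates, e.g. the
    per-tag EKF outputs, using the standard information-form fusion."""
    I1, I2 = np.linalg.inv(P1), np.linalg.inv(P2)
    P = np.linalg.inv(I1 + I2)
    p = P @ (I1 @ p1 + I2 @ p2)
    return p, P

p1 = np.array([1.02, 0.48, 2.10])            # tag 1 position estimate (m)
P1 = np.diag([0.04, 0.04, 0.09])
p2 = np.array([0.95, 0.52, 2.02])            # tag 2 position estimate (m)
P2 = np.diag([0.02, 0.05, 0.12])
p, P = fuse_two_estimates(p1, P1, p2, P2)
print("fused drone position:", p)
```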
22

Shamsfakhr, Farhad, Andrea Motroni, Luigi Palopoli, Alice Buffi, Paolo Nepa, and Daniele Fontanelli. "Robot Localisation Using UHF-RFID Tags: A Kalman Smoother Approach †." Sensors 21, no. 3 (January 21, 2021): 717. http://dx.doi.org/10.3390/s21030717.

Abstract:
Autonomous vehicles enable the development of smart warehouses and smart factories with increased visibility, flexibility and efficiency. Thus, effective and affordable localisation methods for indoor vehicles are attracting interest for implementing real-time applications. This paper presents an Extended Kalman Smoother design to both localise a mobile agent and reconstruct its entire trajectory through a sensor fusion employing passive UHF-RFID technology. Extensive simulations are carried out by considering the smoother optimal window length and the effect of missing measurements from reference tags. Monte Carlo simulations are conducted for different vehicle trajectories and for different linear and angular velocities to evaluate the method accuracy. Then, an experimental analysis with a unicycle wheeled robot is performed in a real indoor scenario, showing position and orientation root mean square errors of 15 cm and 0.2 rad, respectively.
23

Yuan, Chuanqian, Ye Wang, and Jun Liu. "Research on multi-sensor fusion-based AGV positioning and navigation technology in storage environment." Journal of Physics: Conference Series 2378, no. 1 (December 1, 2022): 012052. http://dx.doi.org/10.1088/1742-6596/2378/1/012052.

Abstract:
Aiming at the problems of low positioning accuracy and large trajectory deviation when AGVs work in dynamic storage environments, an AGV positioning and navigation method based on multi-sensor data fusion is studied. The method is based on the unscented Kalman filter (UKF) and adaptive Monte Carlo localisation (AMCL) algorithms to fuse three kinds of sensor data, namely LiDAR, an inertial guidance system and an ultra-wideband positioning system, and finally outputs the accurate attitude of the AGV [1]. Data communication between the AGV and the host computer is carried out remotely over the TCP protocol, and the host computer remotely controls the AGV to achieve autonomous positioning and navigation after the sensor data are fused. Experiments are carried out on a MATLAB simulation platform, and the results show that the positioning and navigation system has high positioning accuracy and trajectory fitting, and has certain practicality in dynamic storage environments.
24

Tholen, Christoph, Iain Parnum, Robin Rofallski, Lars Nolle, and Oliver Zielinski. "Investigation of the Spatio-Temporal Behaviour of Submarine Groundwater Discharge Using a Low-Cost Multi-Sensor-Platform." Journal of Marine Science and Engineering 9, no. 8 (July 26, 2021): 802. http://dx.doi.org/10.3390/jmse9080802.

Abstract:
Submarine groundwater discharge (SGD) is an important pathway of nutrients into coastal areas. During the last decades, interest of researchers in SGDs has grown continuously. However, methods applied for SGD research usually focus on the aquifer or on the mixing processes on larger scales. The distribution of discharged water within the water column is not well investigated. Small remotely operated vehicles (ROV) equipped with environmental sensors can be used to investigate the spatial distribution of environmental parameters in the water column. Herein, a low-cost multi-sensor platform designed to investigate the spatial distribution of water quality properties is presented. The platform is based on an off-the-shelf underwater vehicle carrying various environmental sensors and a short-baseline localisation system. This contribution presents the results of SGD investigations in the area of Woodman Point (Western Australia). Various potential SGD plumes were detected using a skiff equipped with a recreational echo sounder. It was demonstrated that this inexpensive equipment could be used to detect and investigate SGDs in coastal areas. In addition, the low-cost multi-sensor platform was deployed to investigate the spatial distribution of environmental parameters including temperature (T), electric conductivity (EC), dissolved oxygen (DO), oxidation-reduction potential (ORP), pH, and dissolved organic matter fluorescence (FDOM). Three ROV surveys were conducted from different skiff locations. Analyses of the spatial distribution of the environmental parameters allowed the identification of nine potential SGD plumes. At the same locations, plumes were identified during the sonar surveys. In addition, fuzzy logic was used for the fusion of salinity, DO, and FDOM readings in order to enhance SGD detection capability of the designed multi-sensor system. The fuzzy logic approach identified 293 data points as potential within a SGD plume. Average minimum-distance between these points and the identified SGD plumes was 0.5 m and 0.42 m smaller than the minimum-distance average of the remaining data points of survey one and three respectively. It was shown that low-cost ROVs, equipped with environmental sensors, could be an important tool for the investigation of the spatio-temporal behaviour of SGD sites. This method allows continuous mapping of environmental parameters with a high spatial and temporal resolution. However, to obtain deeper insights into the influence of SGDs on the nearshore areas, this method should be combined with other well-established methods for SGD investigation, such as pore water sampling, remote sensing, or groundwater monitoring.
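The fuzzy fusion of salinity, dissolved oxygen and FDOM readings described above can be sketched with simple ramp memberships and a min (AND) rule; all membership breakpoints here are illustrative assumptions, not the paper's calibration.

```python
import numpy as np

def low(x, full_at, zero_at):
    """Decreasing ramp membership: 1 below `full_at`, 0 above `zero_at`."""
    return float(np.clip((zero_at - x) / (zero_at - full_at), 0.0, 1.0))

def high(x, zero_at, full_at):
    """Increasing ramp membership: 0 below `zero_at`, 1 above `full_at`."""
    return float(np.clip((x - zero_at) / (full_at - zero_at), 0.0, 1.0))

def sgd_plume_score(salinity_psu, do_mg_l, fdom_ppb):
    """Toy fuzzy rule in the spirit of the abstract: lower salinity, lower
    dissolved oxygen and higher FDOM than ambient suggest an SGD plume.
    All breakpoints are assumed values, not the paper's tuning."""
    return min(low(salinity_psu, 30.0, 35.0),     # ambient seawater ~35 PSU
               low(do_mg_l, 4.0, 7.0),
               high(fdom_ppb, 20.0, 50.0))        # Mamdani AND = min

print("plume score:", sgd_plume_score(31.0, 5.0, 45.0))
```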
25

Bosse, Stefan, Dennis Weiss, and Daniel Schmidt. "Supervised Distributed Multi-Instance and Unsupervised Single-Instance Autoencoder Machine Learning for Damage Diagnostics with High-Dimensional Data—A Hybrid Approach and Comparison Study." Computers 10, no. 3 (March 18, 2021): 34. http://dx.doi.org/10.3390/computers10030034.

Abstract:
Structural health monitoring (SHM) is a promising technique for in-service inspection of technical structures in a broad field of applications in order to reduce maintenance efforts as well as the overall structural weight. SHM is basically an inverse problem deriving physical properties such as damages or material inhomogeneity (target features) from sensor data. Often models defining the relationship between predictable features and sensors are required but not available. The main objective of this work is the investigation of model-free distributed machine learning (DML) for damage diagnostics under resource and failure constraints by using multi-instance ensemble and model fusion strategies and featuring improved scaling and stability compared with centralised single-instance approaches. The diagnostic system delivers two features: A binary damage classification (damaged or non-damaged) and an estimation of the spatial damage position in case of a damaged structure. The proposed damage diagnostics architecture should be able to be used in low-resource sensor networks with soft real-time capabilities. Two different machine learning methodologies and architectures are evaluated and compared posing low- and high-resolution sensor processing for low- and high-resolution damage diagnostics, i.e., a dedicated supervised trained low-resource and an unsupervised trained high-resource deep learning approach, respectively. In both architectures state-based recurrent artificial neural networks are used that process spatially and time-resolved sensor data from experimental ultrasonic guided wave measurements of a hybrid material (carbon fibre laminate) plate with pseudo defects. Finally, both architectures can be fused to a hybrid architecture with improved damage detection accuracy and reliability. An extensive evaluation of the damage prediction by both systems shows high reliability and accuracy of damage detection and localisation, even by the distributed multi-instance architecture with a resolution in the order of the sensor distance.
26

Adámek, Roman, Martin Brablc, Patrik Vávra, Barnabás Dobossy, Martin Formánek, and Filip Radil. "Analytical Models for Pose Estimate Variance of Planar Fiducial Markers for Mobile Robot Localisation." Sensors 23, no. 12 (June 20, 2023): 5746. http://dx.doi.org/10.3390/s23125746.

Abstract:
Planar fiducial markers are commonly used to estimate a pose of a camera relative to the marker. This information can be combined with other sensor data to provide a global or local position estimate of the system in the environment using a state estimator such as the Kalman filter. To achieve accurate estimates, the observation noise covariance matrix must be properly configured to reflect the sensor output’s characteristics. However, the observation noise of the pose obtained from planar fiducial markers varies across the measurement range and this fact needs to be taken into account during the sensor fusion to provide a reliable estimate. In this work, we present experimental measurements of the fiducial markers in real and simulation scenarios for 2D pose estimation. Based on these measurements, we propose analytical functions that approximate the variances of pose estimates. We demonstrate the effectiveness of our approach in a 2D robot localisation experiment, where we present a method for estimating covariance model parameters based on user measurements and a technique for fusing pose estimates from multiple markers.
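A hedged sketch of how a distance-dependent variance model can drive the fusion of several marker-based pose estimates; the quadratic variance form and its coefficients are assumptions standing in for the analytical models fitted in the paper.

```python
import numpy as np

def marker_position_variance(distance_m, a=1e-4, b=2.5e-3):
    """Assumed analytical variance model sigma^2(d) = a + b * d^2: the
    position noise of a fiducial-marker fix grows with camera-marker
    distance (coefficients are illustrative, not fitted values)."""
    return a + b * distance_m ** 2

def fuse_markers(positions, distances):
    """Inverse-variance weighted fusion of pose estimates from several
    fiducial markers, with weights taken from the variance model above."""
    w = 1.0 / np.array([marker_position_variance(d) for d in distances])
    positions = np.asarray(positions, dtype=float)
    return (w[:, None] * positions).sum(axis=0) / w.sum()

estimates = [[1.05, 2.01], [0.98, 1.96], [1.20, 2.15]]   # per-marker (x, y) in m
distances = [0.8, 1.2, 3.5]                              # camera-marker range in m
print("fused 2D pose:", fuse_markers(estimates, distances))
```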
27

García Daza, Iván, Mónica Rentero, Carlota Salinas Maldonado, Ruben Izquierdo Gonzalo, Noelia Hernández Parra, Augusto Ballardini, and David Fernandez Llorca. "Fail-Aware LIDAR-Based Odometry for Autonomous Vehicles." Sensors 20, no. 15 (July 23, 2020): 4097. http://dx.doi.org/10.3390/s20154097.

Abstract:
Autonomous driving systems are set to become a reality in transport systems and, so, maximum acceptance is being sought among users. Currently, the most advanced architectures require driver intervention when functional system failures or critical sensor operations take place, presenting problems related to driver state, distractions, fatigue, and other factors that prevent safe control. Therefore, this work presents a redundant, accurate, robust, and scalable LiDAR odometry system with fail-aware system features that can allow other systems to perform a safe stop manoeuvre without driver mediation. All odometry systems have drift error, making it difficult to use them for localisation tasks over extended periods. For this reason, the paper presents an accurate LiDAR odometry system with a fail-aware indicator. This indicator estimates a time window in which the system manages the localisation tasks appropriately. The odometry error is minimised by applying a dynamic 6-DoF model and fusing measures based on the Iterative Closest Points (ICP), environment feature extraction, and Singular Value Decomposition (SVD) methods. The obtained results are promising for two reasons: First, in the KITTI odometry data set, the ranking achieved by the proposed method is twelfth, considering only LiDAR-based methods, where its translation and rotation errors are 1.00 % and 0.0041 deg/m, respectively. Second, the encouraging results of the fail-aware indicator demonstrate the safety of the proposed LiDAR odometry system. The results depict that, in order to achieve an accurate odometry system, complex models and measurement fusion techniques must be used to improve its behaviour. Furthermore, if an odometry system is to be used for redundant localisation features, it must integrate a fail-aware indicator for use in a safe manner.
28

Dinh, To Xuan, Nguyen Thi Thu Huong, Nguyen Ngoc Tuan, and Nguyen Thanh Tien. "Motion planning and control of an autonomous mobile robot." Ministry of Science and Technology, Vietnam 65, no. 4 (December 15, 2023): 3–10. http://dx.doi.org/10.31276/vjste.65(4).03-10.

Abstract:
The application of autonomous mobile robots (AMRs) has gradually become crucial in smart factories due to the advantages of improving production efficiency and reducing labour costs. Motion planning has been a key part of AMR control development. This paper presents the motion planning and position tracking control systems of an omnidirectional-wheel AMR powered by a hybrid fuel cell and battery power source. First, the kinematic and dynamic models of the AMR are introduced. The navigation system comprises three loops, with the first loop being motor control, the second loop being position tracking control, and a motion planning layer on top. The position data of the AMR for feedback control are obtained through sensor fusion of data from the inertial measurement unit (IMU), encoder and ranging sensors with a simultaneous localisation and mapping (SLAM) algorithm. Motion planning is then applied to obtain an optimal path with the shortest distance and collision-free movement. In addition, the tracking algorithm is designed to drive the AMR to follow the optimal path and achieve high accuracy. The experimental results show a 30% improvement in tracking accuracy compared to traditional approaches and 8 hours of continuous operation, which is promising for industrial applications, and the results are satisfactory in terms of both accuracy and efficiency requirements.
29

Bassiri, Amin, Mohammadreza Asghari Oskoei, and Anahid Basiri. "Particle Filter and Finite Impulse Response Filter Fusion and Hector SLAM to Improve the Performance of Robot Positioning." Journal of Robotics 2018 (November 11, 2018): 1–9. http://dx.doi.org/10.1155/2018/7806854.

Abstract:
Indoor position estimation is essential for navigation; however, it is a challenging task, mainly due to indoor environments' (a) high noise-to-signal ratio, (b) low sampling rate and (c) sudden changes to the environment. This paper uses a hybrid filter algorithm for an indoor robot-positioning system, integrating a Particle Filter (PF) algorithm and a Finite Impulse Response (FIR) filter algorithm to assure the continuity of the positioning solution. Additionally, the Hector Simultaneous Localisation and Mapping (Hector SLAM) algorithm is used to map the environment and improve the accuracy of the navigation. The paper implements the hybrid algorithm, which integrates the PF, FIR and Hector SLAM, using an embedded laser scanner sensor. The hybrid algorithm coupled with Hector SLAM is tested in several scenarios to evaluate the performance of the system in terms of continuity and accuracy of the position estimation, and is compared with similar systems. The scenarios in which the system is tested include reduced laser sensor readings (low sampling rate), dynamic environments (changes in the location of the obstacles) and the kidnapped-robot situation. The results show that the system provides significantly better accuracy and continuity of the position estimation in all scenarios, even in comparison with similar hybrid systems, except where there is high and constant noise, where the performance of the hybrid filter and the simple PF is almost the same.
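One way to picture the PF/FIR hybrid described above is to fall back on a short FIR average of recent positions whenever the particle weights degenerate; the sketch below uses an effective-sample-size test and illustrative parameters that are not taken from the paper.

```python
import numpy as np
from collections import deque

rng = np.random.default_rng(2)

class PfWithFirFallback:
    """Sketch of a hybrid estimator: a particle filter is the primary
    estimator, and a short FIR buffer of recent accepted positions serves
    as a fallback when the particle weights degenerate (e.g. kidnapping
    or a burst of noise). Thresholds and window length are assumptions."""
    def __init__(self, n=300, fir_len=8):
        self.particles = rng.uniform(-5, 5, size=(n, 2))
        self.fir = deque(maxlen=fir_len)

    def step(self, measurement, meas_std=0.3):
        self.particles += rng.normal(0, 0.05, self.particles.shape)
        w = np.exp(-np.sum((self.particles - measurement) ** 2, axis=1)
                   / (2 * meas_std ** 2))
        n_eff = w.sum() ** 2 / (np.sum(w ** 2) + 1e-12)
        if n_eff < 0.05 * len(self.particles):          # degenerate PF
            estimate = np.mean(self.fir, axis=0) if self.fir else measurement
        else:
            w /= w.sum()
            estimate = np.average(self.particles, axis=0, weights=w)
            idx = rng.choice(len(self.particles), len(self.particles), p=w)
            self.particles = self.particles[idx]        # resample
        self.fir.append(estimate)
        return estimate

pf = PfWithFirFallback()
for z in ([0.0, 0.0], [0.1, 0.05], [0.2, 0.1]):
    est = pf.step(np.array(z))
print("latest estimate:", est)
```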
30

George, Anand, Niko Koivumäki, Teemu Hakala, Juha Suomalainen, and Eija Honkavaara. "Visual-Inertial Odometry Using High Flying Altitude Drone Datasets." Drones 7, no. 1 (January 4, 2023): 36. http://dx.doi.org/10.3390/drones7010036.

Abstract:
Positioning of unoccupied aerial systems (UAS, drones) is predominantly based on Global Navigation Satellite Systems (GNSS). Due to potential signal disruptions, redundant positioning systems are needed for reliable operation. The objective of this study was to implement and assess a redundant positioning system for high flying altitude drone operation based on visual-inertial odometry (VIO). A new sensor suite with stereo cameras and an inertial measurement unit (IMU) was developed, and a state-of-the-art VIO algorithm, VINS-Fusion, was used for localisation. Empirical testing of the system was carried out at flying altitudes of 40–100 m, which cover the common flight altitude range of outdoor drone operations. The performance of various implementations was studied, including stereo-visual-odometry (stereo-VO), monocular-visual-inertial-odometry (mono-VIO) and stereo-visual-inertial-odometry (stereo-VIO). The stereo-VIO provided the best results; the flight altitude of 40–60 m was the most optimal for the stereo baseline of 30 cm. The best positioning accuracy was 2.186 m for a 800 m-long trajectory. The performance of the stereo-VO degraded with the increasing flight altitude due to the degrading base-to-height ratio. The mono-VIO provided acceptable results, although it did not reach the performance level of the stereo-VIO. This work presented new hardware and research results on localisation algorithms for high flying altitude drones that are of great importance since the use of autonomous drones and beyond visual line-of-sight flying are increasing and will require redundant positioning solutions that compensate for potential disruptions in GNSS positioning. The data collected in this study are published for analysis and further studies.
31

Schwarz, D., R. H. Rasshofer, and E. M. Biebl. "Optimized tracking for cooperative sensor systems in multipath environments." Advances in Radio Science 6 (May 26, 2008): 71–75. http://dx.doi.org/10.5194/ars-6-71-2008.

Abstract:
In a cooperative sensor system for pedestrian protection, a pedestrian and other road users exchange data by means of radio frequency communication. In the proposed system, the pedestrian carries a transponder which is interrogated by a vehicle and sends an anonymous identification (ID) sequence. By decoding the ID, the interrogation unit in the vehicle detects the presence of the transponder. Evaluating the incident wave of the transponder's answer, a localisation is possible. In the proposed localization system, the measurement results can be distorted by multipath propagation. Multipath errors result if signals of the same transponder arrive simultaneously at the receiver unit from different directions. In this case, erroneous distances and angles are measured. Because the signals arriving from different directions contain the same transponder ID, they can be assigned to their origin. One of the challenges in post-processing for signal improvement is enhancing the selection of the correct position information by making assumptions about the pedestrian's movement and by knowing the vehicle's current driving parameters. Additionally, information contained in multipath signals is used to form a better estimate for the true position of the transponder. To overcome the problems related to multipath propagation effects, a method is proposed that estimates the origin of a multipath signal and maps the distorted position information back to the true position. A fusion of tracked direct positions and mapped multipath signals leads to an improvement in positioning accuracy.
32

Zhao, Zhongyi, Yongzhi Cui, and Shenyu Wang. "High-Precision Positioning Simulation and Experimental Research of Special Operation Vehicles Based on Network RTK." Computational Intelligence and Neuroscience 2023 (February 2, 2023): 1–15. http://dx.doi.org/10.1155/2023/2911846.

Abstract:
In driverless systems, high-precision positioning is one of the critical technologies that driverless cars need in order to achieve autonomy. This study analyzed the working principles of GNSS (global navigation satellite system) and SINS (strapdown inertial navigation system) and elaborated the principles of the least-squares method and the LAMBDA algorithm for integer ambiguity resolution. Based on network RTK positioning technology and the abovementioned theory, an unmanned automatic work vehicle was used as the research object, and the fusion positioning algorithm of the BDS/GPS system and an inertial sensor was used to propose a high-precision positioning technology for the unmanned automatic work vehicle. The combined navigation system model was studied and constructed, and relevant verification was carried out through simulation and experiment. The results were as follows: the pitch angle error was less than 0.1°, the roll angle error was less than 0.05°, the speed error was less than 0.2 m/s, and the position error was less than 2.1 m. The outcomes indicate that an integrated navigation and positioning algorithm for driverless vehicles can significantly enhance the localisation accuracy and reliability of navigation. The research results are of engineering value and practical applicability for the development of unmanned automatic special vehicle positioning systems.
33

Wijayathunga, Liyana, Alexander Rassau, and Douglas Chai. "Challenges and Solutions for Autonomous Ground Robot Scene Understanding and Navigation in Unstructured Outdoor Environments: A Review." Applied Sciences 13, no. 17 (August 31, 2023): 9877. http://dx.doi.org/10.3390/app13179877.

Abstract:
The capabilities of autonomous mobile robotic systems have been steadily improving due to recent advancements in computer science, engineering, and related disciplines such as cognitive science. In controlled environments, robots have achieved relatively high levels of autonomy. In more unstructured environments, however, the development of fully autonomous mobile robots remains challenging due to the complexity of understanding these environments. Many autonomous mobile robots use classical, learning-based or hybrid approaches for navigation. More recent learning-based methods may replace the complete navigation pipeline or selected stages of the classical approach. For effective deployment, autonomous robots must understand their external environments at a sophisticated level according to their intended applications. Therefore, in addition to robot perception, scene analysis and higher-level scene understanding (e.g., traversable/non-traversable, rough or smooth terrain, etc.) are required for autonomous robot navigation in unstructured outdoor environments. This paper provides a comprehensive review and critical analysis of these methods in the context of their applications to the problems of robot perception and scene understanding in unstructured environments and the related problems of localisation, environment mapping and path planning. State-of-the-art sensor fusion methods and multimodal scene understanding approaches are also discussed and evaluated within this context. The paper concludes with an in-depth discussion regarding the current state of the autonomous ground robot navigation challenge in unstructured outdoor environments and the most promising future research directions to overcome these challenges.
34

Hoppenot, Philippe, Etienne Colle, and Christian Barat. "Off-line localisation of a mobile robot using ultrasonic measurements." Robotica 18, no. 3 (May 2000): 315–23. http://dx.doi.org/10.1017/s0263574799002180.

Abstract:
In the context of assistance to disabled people for object manipulation and carrying, the paper focuses on localisation for mobile robot autonomy. In order to respect strong low-cost constraints, the perception system of the mobile robot uses sensors of low metrological quality: an ultrasonic ring and odometry. This poses new problems, in particular for localisation. Among different localisation techniques, we present only off-line localisation. With poor perception means, it is necessary to introduce a priori knowledge of sensor and environment models. To solve the localisation problem, the ultrasonic image is segmented by applying the Hough transform, which is well adapted to ultrasonic sensor characteristics. The segments are then matched with the room, modelled and assumed to be rectangular. Several positions are found. A first sort, based on a cost function, reduces the possibilities. The remaining ambiguities are removed by a neural network which plays the part of a classifier detecting the door in the environment. Improvements of the method are proposed to take into account obstacles and non-rectangular rooms. Experimental results show that the localisation operates even with one obstacle present.
35

Jurdak, Raja, Branislav Kusy, and Alban Cotillon. "Group-based Motion Detection for Energy-Efficient Localisation." Journal of Sensor and Actuator Networks 1, no. 3 (October 19, 2012): 183–216. http://dx.doi.org/10.3390/jsan1030183.

Abstract:
Long-term outdoor localization remains challenging due to the high energy profiles of GPS modules. Duty cycling the GPS module combined with inertial sensors can improve energy consumption. However, inertial sensors that are kept active all the time can also drain mobile node batteries. This paper proposes duty cycling strategies for inertial sensors to maintain a target position accuracy and node lifetime. We present a method for duty cycling motion sensors according to features of movement events, and evaluate its energy and accuracy profile for an empirical data trace of cattle movement. We further introduce the concept of group-based duty cycling, where nodes that cluster together can share the burden of motion detection to reduce their duty cycles. Our evaluation shows that both variants of motion sensor duty cycling yield up to 78% improvement in overall node power consumption, and that the group-based method yields an additional 20% power reduction during periods of low mobility.
36

Pan, Lei, Xi Zheng, Philip Kolar, and Shaun Bangay. "Object localisation through clustering unreliable ultrasonic range sensors." International Journal of Sensor Networks 27, no. 4 (2018): 268. http://dx.doi.org/10.1504/ijsnet.2018.093965.

37

Zheng, Xi, Philip Kolar, Shaun Bangay, and Lei Pan. "Object localisation through clustering unreliable ultrasonic range sensors." International Journal of Sensor Networks 27, no. 4 (2018): 268. http://dx.doi.org/10.1504/ijsnet.2018.10014898.

38

White, P. R., T. G. Leighton, D. C. Finfer, C. Powles, and O. N. Baumann. "Localisation of sperm whales using bottom-mounted sensors." Applied Acoustics 67, no. 11-12 (November 2006): 1074–90. http://dx.doi.org/10.1016/j.apacoust.2006.05.002.

39

Narita, Masaki, Keisuke Kamada, Kanayo Ogura, Bhed Bahadur Bista, and Toyoo Takata. "Countermeasures against darknet localisation attacks with packet sampling." Indonesian Journal of Electrical Engineering and Computer Science 19, no. 2 (August 1, 2020): 1036. http://dx.doi.org/10.11591/ijeecs.v19.i2.pp1036-1047.

Abstract:
The darknet monitoring system consists of network sensors widely deployed on the Internet to capture incoming unsolicited packets. A goal of this system is to analyse captured malicious packets and provide effective information to protect regular nonmalicious Internet users from malicious activities. To provide effective and reliable information, the location of sensors must be concealed. However, attackers launch localisation attacks to detect sensors in order to evade them. If the actual location of sensors is revealed, it is almost impossible to identify the latest tactics used by attackers. Thus, in a previous study, we proposed a packet sampling method, which samples incoming packets based on an attribute of the packet sender, to increase tolerance to a localisation attack and maintain the quality of information publicised by the system. We were successful in countering localisation attacks, which generate spikes on the publicised graph to detect a sensor. However, in some cases, with the previously proposed sampling method, spikes were clearly evident on the graph. Therefore, in this paper, we propose advanced sampling methods such that incoming packets are sampled based on multiple attributes of the packet sender. We present our improved methods and show promising evaluation results obtained via a simulation.
40

Kudela, Pawel, Wiesław M. Ostachowicz, and Arkadiusz Zak. "Sensor Triangulation for Damage Localisation in Composite Plates." Key Engineering Materials 413-414 (June 2009): 55–62. http://dx.doi.org/10.4028/www.scientific.net/kem.413-414.55.

Abstract:
The aim of this paper is the development of an algorithm for damage localisation in composite plates based on wave propagation signals registered by sensors. It is proposed to distribute the sensors uniformly over the area of a plate-like structure, performing triangulation. Next, the registered signals are processed and a damage influence map is created in each triangle separately in order to avoid problems connected with reflections from the boundaries of the structure. The proposed procedure has been verified on numerical as well as experimental signals.
41

Jose, Diniya, and Shoney Sebastian. "Taylor series method in TDOA approach for indoor positioning system." International Journal of Electrical and Computer Engineering (IJECE) 9, no. 5 (October 1, 2019): 3927. http://dx.doi.org/10.11591/ijece.v9i5.pp3927-3933.

Abstract:
Localisation technologies have always remained in the limelight of positioning science, as researchers have shown keen interest in knowing the exact positions of things. Ultrasonic sensors are mainly used for the localisation of mobile robots since they provide high accuracy. This paper presents the Taylor Series Method in a Time Difference of Arrival approach using ultrasonic sensors. Signals are sent from the sensors periodically. The time difference of arrival of the signals from the ultrasonic sensors is used by the receiver unit to estimate the location of the mobile unit. The equations formed using the Time Difference of Arrival are solved with the Taylor Series Method, which provides a more accurate result, since it gives less error compared to other methods and ignores the measurement errors.
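The Taylor Series Method for TDOA reduces to iteratively linearising the range-difference equations (a Gauss-Newton scheme); below is a self-contained sketch with assumed anchor positions and noise-free synthetic measurements, not the paper's experimental setup.

```python
import numpy as np

def tdoa_taylor_solve(sensors, tdoas, c=343.0, x0=None, iters=10):
    """Taylor-series (Gauss-Newton) solution of the TDOA equations:
    linearise the range differences around the current guess and solve the
    resulting least-squares system repeatedly. `sensors` are known anchor
    positions; `tdoas[i]` is the arrival-time difference between sensor i+1
    and sensor 0. The propagation speed and iteration count are assumptions."""
    sensors = np.asarray(sensors, dtype=float)
    x = np.mean(sensors, axis=0) if x0 is None else np.asarray(x0, float)
    d_meas = c * np.asarray(tdoas)               # measured range differences (m)
    for _ in range(iters):
        r = np.linalg.norm(sensors - x, axis=1)  # ranges to the current guess
        f = r[1:] - r[0]                         # predicted range differences
        # Jacobian of f with respect to the source position x
        J = (x - sensors[1:]) / r[1:, None] - (x - sensors[0]) / r[0]
        dx, *_ = np.linalg.lstsq(J, d_meas - f, rcond=None)
        x = x + dx
        if np.linalg.norm(dx) < 1e-6:
            break
    return x

anchors = [[0.0, 0.0], [4.0, 0.0], [4.0, 3.0], [0.0, 3.0]]
true_pos = np.array([1.5, 1.0])
ranges = np.linalg.norm(np.asarray(anchors) - true_pos, axis=1)
tdoas = (ranges[1:] - ranges[0]) / 343.0         # noise-free synthetic TDOAs
print("estimated position:", tdoa_taylor_solve(anchors, tdoas))
```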
42

Levy dit Vehel, Victor, Ange Haddjeri, and Osvanny Ramos. "Acoustic localisation in a two-dimensional granular medium." EPJ Web of Conferences 249 (2021): 15005. http://dx.doi.org/10.1051/epjconf/202124915005.

Abstract:
We focus on localising the source of acoustic emissions within a compressed two-dimensional granular ensemble of photoelastic disks, using as the main information the arrival times of the acoustic signal at 6 sensors located on the boundaries of the system. By estimating, thanks to the photoelasticity of the grains, the wave speed at every point of the structure, we are able to compute the arrival times from every point of the system to the sensors. A comparison of the arrival-time differences between every set of computed values and those from the actual measurements allows finding the source of the acoustic emissions.
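The arrival-time-difference matching described above can be illustrated with a brute-force grid search; the uniform wave-speed map and sensor layout below are simplifying assumptions in place of the photoelasticity-derived speed field.

```python
import numpy as np

def locate_source(sensor_xy, measured_dt, speed_map, grid_x, grid_y):
    """For every candidate grid point, predict the arrival-time differences
    to the sensors (here with a single effective wave speed per point, a
    strong simplification) and pick the point whose differences best match
    the measured ones."""
    best, best_err = None, np.inf
    for xi, x in enumerate(grid_x):
        for yi, y in enumerate(grid_y):
            d = np.linalg.norm(sensor_xy - np.array([x, y]), axis=1)
            t = d / speed_map[yi, xi]
            dt = t[1:] - t[0]                     # differences w.r.t. sensor 0
            err = np.sum((dt - measured_dt) ** 2)
            if err < best_err:
                best, best_err = (x, y), err
    return best

sensors = np.array([[0.0, 0.0], [0.3, 0.0], [0.3, 0.2], [0.0, 0.2],
                    [0.15, 0.0], [0.15, 0.2]])   # 6 boundary sensors (m), assumed
gx = np.linspace(0.0, 0.3, 31)
gy = np.linspace(0.0, 0.2, 21)
speeds = np.full((21, 31), 300.0)                # uniform toy wave speed (m/s)
src = np.array([0.10, 0.12])
t_true = np.linalg.norm(sensors - src, axis=1) / 300.0
print("estimated source:", locate_source(sensors, t_true[1:] - t_true[0],
                                          speeds, gx, gy))
```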
43

Lawal, Isah A., and Sophia Bano. "Deep Human Activity Recognition With Localisation of Wearable Sensors." IEEE Access 8 (2020): 155060–70. http://dx.doi.org/10.1109/access.2020.3017681.

44

Trusheim, P., Y. Chen, F. Rottensteiner, and C. Heipke. "COOPERATIVE LOCALISATION USING IMAGE SENSORS IN A DYNAMIC TRAFFIC SCENARIO." International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XLIII-B1-2021 (June 28, 2021): 117–24. http://dx.doi.org/10.5194/isprs-archives-xliii-b1-2021-117-2021.

Abstract:
Localisation is one of the key elements in navigation. Especially due to developments in automated driving, precise and reliable localisation becomes essential. In this paper, we report on different cooperation approaches in visual localisation with two vehicles driving in a convoy formation. Each vehicle is equipped with a multi-sensor platform consisting of front-facing stereo cameras and a global navigation satellite system (GNSS) receiver. In the first approach, the GNSS signals are used as eccentric observations for the projection centres of the cameras in a bundle adjustment, whereas the second approach uses markers on the front vehicle as dynamic ground control points (GCPs). As the platforms are moving and data acquisition is not synchronised, we use time-dependent platform poses. These time-dependent poses are represented by trajectories consisting of multiple 6 Degree of Freedom (DoF) anchor points between which linear interpolation takes place. In order to investigate the developed approach experimentally, in particular the potential of dynamic GCPs, we captured data using two platforms driving on a public road at normal speed. As a baseline, we determine the localisation parameters of one platform using only data of that platform. We then compute a solution based on image and GNSS data from both platforms. In a third scenario, the front platform is used as a dynamic GCP which can be related to the trailing platform by markers observed in the images acquired by the latter. We show that both cooperative approaches lead to significant improvements in the precision of the poses of the anchor points after bundle adjustment compared to the baseline. The improvement achieved due to the inclusion of dynamic GCPs is somewhat smaller than the one due to relating the platforms by tie points. Finally, we show that for an individual vehicle, the use of dynamic GCPs can compensate for the lack of GNSS data.
APA, Harvard, Vancouver, ISO, and other styles
45

Zhang, Yuchen, Bo Chen, and Li Yu. "Fusion estimation under binary sensors." Automatica 115 (May 2020): 108861. http://dx.doi.org/10.1016/j.automatica.2020.108861.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Sabra, Adham, and Wai-Keung Fung. "A Fuzzy Cooperative Localisation Framework for Underwater Robotic Swarms." Sensors 20, no. 19 (September 25, 2020): 5496. http://dx.doi.org/10.3390/s20195496.

Full text
Abstract:
This article proposes a holistic localisation framework for underwater robotic swarms that dynamically fuses multiple position estimates of an autonomous underwater vehicle using a fuzzy decision support system. A number of underwater localisation methods have been proposed in the literature for wireless sensor networks. The proposed navigation framework harnesses these established localisation methods in order to provide navigation aids in the absence of an acoustic exteroceptive navigation aid (i.e., an ultra-short baseline), and it can be extended to accommodate newly developed localisation methods by expanding the fuzzy rule base. Simplicity, flexibility, and scalability are the three main advantages inherent in the proposed localisation framework when compared to other traditional and commonly adopted underwater localisation methods, such as the Extended Kalman Filter. A physics-based simulation platform that considers the environment's hydrodynamics, an industrial-grade inertial measurement unit, and underwater acoustic communication characteristics is implemented in order to validate the proposed localisation framework on a swarm of 150 autonomous underwater vehicles. The proposed fuzzy-based localisation algorithm improves the mean localisation error and standard deviation of the entire swarm by 16.53% and 35.17%, respectively, when compared to Extended Kalman Filter based localisation with round-robin scheduling.
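A heavily simplified sketch of fusing several position estimates with fuzzy-style weights is given below. The membership function and inputs are invented for illustration and do not reproduce the authors' rule base.

```python
import numpy as np

def fuse_position_estimates(estimates, confidences):
    """Fuse multiple position estimates of one AUV using fuzzy-style weights.

    estimates:   (M, 3) candidate positions from M localisation methods
    confidences: (M,) crisp inputs in [0, 1] (e.g. signal quality, age of fix)
    """
    # Assumed piecewise-linear membership for 'reliable'; a real system would
    # evaluate a rule base over several inputs instead.
    reliable = np.clip((confidences - 0.2) / 0.6, 0.0, 1.0)
    if reliable.sum() == 0.0:
        return estimates.mean(axis=0)        # fall back to a plain average
    weights = reliable / reliable.sum()      # defuzzified, normalised weights
    return weights @ estimates               # weighted fusion of the estimates
```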
APA, Harvard, Vancouver, ISO, and other styles
47

Majewska, Katarzyna, Magdalena Mieloszyk, and Wieslaw Ostachowicz. "Application of FBGs Grids for Damage Detection and Localisation." Applied Mechanics and Materials 70 (August 2011): 375–80. http://dx.doi.org/10.4028/www.scientific.net/amm.70.375.

Full text
Abstract:
Structural Health Monitoring (SHM) technology allows a significant improvement in the safety of structures and in ecological awareness. SHM systems allow users to reduce the maintenance costs directly connected with the non-destructive techniques used for monitoring important structures. In this paper the authors present applications of Fibre Bragg Gratings (FBGs) for damage detection and damage localisation in an isotropic beam. FBG sensors are excellent tools for evaluating the condition of a structure due to their sensitivity, small size and weight, as well as their immunity to electromagnetic field interference.
APA, Harvard, Vancouver, ISO, and other styles
48

Qi, Ji Yuan, and Xiao Jun Hu. "Track Handing Fusion in Data Fusion and Database." Advanced Materials Research 468-471 (February 2012): 959–62. http://dx.doi.org/10.4028/www.scientific.net/amr.468-471.959.

Full text
Abstract:
Track messages are obtained from many different sensors, and for a single sensor the reliability may not be the same under different conditions. A reliability judgment matrix for each target is constructed based on the relative reliability of the information offered by the sensors. From it, the reliabilities of all the sensors for each target and their general reliability can be obtained; a numerical example is presented. The information from the most reliable sensor is then chosen and merged using a Kalman filter algorithm in track handling based on the data fusion technique, yielding the best estimated track. According to the data requirements of data fusion, a real-time database is established using object-oriented technology to meet the real-time demands of the fusion algorithm, and a history database is established using a relational database. The data in the data fusion system are managed through these two databases, so that the system can work efficiently.
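The two steps described above, picking the most reliable sensor per target and smoothing its track with a Kalman filter, can be sketched as follows. The constant-velocity model, noise settings and one-dimensional state are assumptions for the example, not the paper's configuration.

```python
import numpy as np

def most_reliable_sensor(reliability_matrix):
    """reliability_matrix[i, j]: relative reliability of sensor i for target j.
    Returns, for each target, the index of the sensor whose track to fuse."""
    return np.argmax(reliability_matrix, axis=0)

def kalman_track(zs, dt=1.0, q=1e-2, r=1.0):
    """1D constant-velocity Kalman filter over a sequence of position measurements."""
    F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition (position, velocity)
    H = np.array([[1.0, 0.0]])              # measure position only
    Q = q * np.eye(2)                       # assumed process noise
    R = np.array([[r]])                     # assumed measurement noise
    x, P = np.zeros(2), np.eye(2)
    track = []
    for z in zs:
        x, P = F @ x, F @ P @ F.T + Q                       # predict
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)                       # Kalman gain
        x = x + (K @ (np.atleast_1d(z) - H @ x)).ravel()     # update state
        P = (np.eye(2) - K @ H) @ P                          # update covariance
        track.append(x[0])
    return np.array(track)
```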
APA, Harvard, Vancouver, ISO, and other styles
49

Xu, Shi Jun, Li Hong, and Yong Hong Hu. "A Distributed Bayesian Fusion Algorithm Research." Advanced Materials Research 181-182 (January 2011): 1006–12. http://dx.doi.org/10.4028/www.scientific.net/amr.181-182.1006.

Full text
Abstract:
In this paper, the signal detection problem in which distributed sensors are used and a global decision is desired is considered. Local decisions from the sensors are fed to the data fusion center, which then yields a global decision based on a fusion rule. Based on the Bayesian-criterion data fusion theory for a distributed parallel structure, the fusion rule at the fusion center, the decision rules of the sensors and computer simulation results for two identical sensors, two different sensors and three identical sensors are presented. The simulation results show that the performance of the fusion system is improved compared with that of a single sensor. For the case of three identical sensors in the fusion system, the Bayesian risk is reduced by 26.5% compared with a single sensor.
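The parallel fusion rule described above can be illustrated with a Chair-Varshney style log-likelihood-ratio combiner, which is the standard Bayesian-optimal rule for this structure; the probabilities below are placeholders, not values from the paper.

```python
import numpy as np

def fuse_decisions(u, pd, pf, p1=0.5):
    """Bayesian fusion of binary local decisions (Chair-Varshney form).

    u:  array of local decisions in {0, 1}, one per sensor
    pd: per-sensor detection probabilities P(u_i = 1 | H1)
    pf: per-sensor false-alarm probabilities P(u_i = 1 | H0)
    p1: prior probability of the signal-present hypothesis H1
    """
    u, pd, pf = map(np.asarray, (u, pd, pf))
    # Log-likelihood-ratio contribution of each local decision.
    llr = np.where(u == 1, np.log(pd / pf), np.log((1 - pd) / (1 - pf)))
    threshold = np.log((1 - p1) / p1)        # minimum-probability-of-error threshold
    return int(llr.sum() > threshold)        # global decision: 1 = signal present
```

For two identical sensors with pd = 0.9 and pf = 0.1 that both report a detection, the rule declares the signal present for any prior p1 greater than about 0.012.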
APA, Harvard, Vancouver, ISO, and other styles
50

Hall, A., D. St Leger, A. Singh, and R. K. Lingam. "The utility of computed tomography and diffusion-weighted magnetic resonance imaging fusion in cholesteatoma: illustration with a UK case series." Journal of Laryngology & Otology 134, no. 2 (January 8, 2020): 178–83. http://dx.doi.org/10.1017/s0022215119002640.

Full text
Abstract:
Objective: Post-processing imaging techniques allow high-resolution computed tomography and diffusion-weighted magnetic resonance imaging of the temporal bone to be superimposed and viewed simultaneously (fusion imaging). This study aimed to highlight the practical utility of fusion imaging for disease localisation and evaluation in a UK case series of primary and post-operative cholesteatoma. Method: Fusion of computed tomography and diffusion-weighted magnetic resonance b1000 images was performed using specific software. Axial computed tomography images and coronal b1000 images were selected for fusion. Results: A case series of primary and post-operative cholesteatoma in which computed tomography and magnetic resonance imaging fusion assisted the management of both the patient pathway and surgical approach is reviewed. Conclusion: Computed tomography and magnetic resonance imaging fusion can assist in pre-operative surgical planning and patient counselling through assessment of disease in both primary and revision scenarios. It can also assist the operative surgeon through accurate localisation that can influence the operative technique and optimise operating theatre utilisation.
APA, Harvard, Vancouver, ISO, and other styles
