Academic literature on the topic 'Range-Only-SLAM (Simultaneous Localization and Mapping)'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Range-Only-SLAM (Simultaneous Localization and Mapping).'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Range-Only-SLAM (Simultaneous Localization and Mapping)"

1

Tsubouchi, Takashi. "Introduction to Simultaneous Localization and Mapping." Journal of Robotics and Mechatronics 31, no. 3 (June 20, 2019): 367–74. http://dx.doi.org/10.20965/jrm.2019.p0367.

Abstract:
Simultaneous localization and mapping (SLAM) forms the core of the technology that supports mobile robots. With SLAM, as a robot moves through a real environment, sensor data are imported to an on-board computer, and the robot’s physical location and a map of its surrounding environment are created. SLAM is a major topic in mobile robot research. Although the information, supported by a mathematical description, is derived from the real world, it is formulated in terms of probability theory when handled. The concept therefore contributes not only to research and development on mobile robots, but also to training in the mathematics and computer implementation of position estimation and map creation. This article focuses on SLAM technology, including a brief overview of its history, insights from the author, and, finally, a specific example in which the author was involved.
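Since the abstract stresses that SLAM is handled with probability theory, it may help to recall the standard recursive form of the online SLAM posterior (textbook notation, not taken from this article), in which the joint density over the current pose and the map is propagated with the motion model and corrected with the measurement model:

```latex
p(x_t, m \mid z_{1:t}, u_{1:t}) \;\propto\;
  p(z_t \mid x_t, m)
  \int p(x_t \mid x_{t-1}, u_t)\,
       p(x_{t-1}, m \mid z_{1:t-1}, u_{1:t-1})\, \mathrm{d}x_{t-1}
```

Here $x_t$ is the robot pose, $m$ the map, $u_{1:t}$ the controls, and $z_{1:t}$ the measurements; filters such as the EKF and particle filters differ mainly in how they approximate this density.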
2

Torres-González, A., J. R. Martinez-de Dios, A. Jiménez-Cano, and A. Ollero. "An Efficient Fast-Mapping SLAM Method for UAS Applications Using Only Range Measurements." Unmanned Systems 04, no. 02 (April 2016): 155–65. http://dx.doi.org/10.1142/s2301385016500035.

Abstract:
This paper deals with 3D Simultaneous Localization and Mapping (SLAM), where the UAS uses only range measurements to build a local map of an unknown environment and to self-localize in that map. In recent years, Range-Only (RO) SLAM has attracted significant interest: it is suitable for non-line-of-sight conditions and poor lighting, and is superior to visual SLAM in some problems. However, some issues constrain its applicability in practical cases, such as delays in map building and low map and UAS estimation accuracy. This paper proposes a 3D RO-SLAM scheme for UAS that specifically focuses on improving map-building delays and accuracy without compromising efficiency in the consumption of resources. The scheme integrates sonar measurements together with range measurements between the robot and beacons deployed in the scenario. The proposed scheme has two main advantages: (1) it integrates direct range measurements between the robot and the beacons as well as range measurements between beacons (called inter-beacon measurements), which significantly reduce map-building times and improve map and UAS localization accuracy; and (2) the SLAM scheme is endowed with a supervisory module that self-adapts the measurements integrated in SLAM, reducing computational, bandwidth, and energy consumption. Experimental validation in field experiments with an octorotor UAS showed that the proposed scheme improved map-building times by 72%, map accuracy by 40%, and UAS localization accuracy by 12%.
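To illustrate how direct robot-beacon ranges and inter-beacon ranges can enter the same estimator, the sketch below applies a generic EKF update to a single range measurement between any two 3D points stored in the state vector. It is a minimal sketch under an assumed state layout and noise value, not the authors' scheme (which additionally handles sonar fusion and the supervisory module).

```python
import numpy as np

def ekf_range_update(x, P, z, idx_a, idx_b, sigma_r=0.2):
    """Generic EKF update for one range measurement between two 3D points
    stored in the state vector x. For a robot-beacon range, idx_a indexes the
    robot-position block and idx_b a beacon block; for an inter-beacon range,
    both index beacon blocks. sigma_r is an assumed range-noise std (m)."""
    pa, pb = x[idx_a:idx_a + 3], x[idx_b:idx_b + 3]
    diff = pa - pb
    d = np.linalg.norm(diff)
    if d < 1e-6:
        return x, P  # degenerate geometry, skip this measurement

    # Measurement Jacobian: non-zero only in the two involved blocks.
    H = np.zeros((1, x.size))
    H[0, idx_a:idx_a + 3] = diff / d
    H[0, idx_b:idx_b + 3] = -diff / d

    r = z - d                          # innovation
    S = H @ P @ H.T + sigma_r**2       # innovation covariance (1x1)
    K = (P @ H.T) / S                  # Kalman gain
    x_new = x + (K * r).ravel()
    P_new = (np.eye(x.size) - K @ H) @ P
    return x_new, P_new
```

Because the same update handles both measurement types, adding inter-beacon ranges costs only extra one-dimensional updates while constraining the beacon geometry earlier than robot-beacon ranges alone.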
3

Kim, Jung-Hee, and Doik Kim. "Computationally Efficient Cooperative Dynamic Range-Only SLAM Based on Sum of Gaussian Filter." Sensors 20, no. 11 (June 10, 2020): 3306. http://dx.doi.org/10.3390/s20113306.

Abstract:
A cooperative dynamic range-only simultaneous localization and mapping (CDRO-SLAM) algorithm based on the sum of Gaussian (SoG) filter was recently introduced. The main characteristics of the CDRO-SLAM are (i) the integration of inter-node ranges as well as usual direct robot-node ranges to improve the convergence rate and localization accuracy and (ii) the tracking of any moving nodes under dynamic environments by resetting and updating the SoG variables. In this paper, an efficient implementation of the CDRO-SLAM (eCDRO-SLAM) is proposed to mitigate the high computational burden of the CDRO-SLAM due to the inter-node measurements. Furthermore, a thorough computational analysis is presented, which reveals that the computational efficiency of the eCDRO-SLAM is significantly improved over the CDRO-SLAM. The performance of the proposed eCDRO-SLAM is compared with those of several conventional RO-SLAM algorithms and the results show that the proposed efficient algorithm has a faster convergence rate and a similar map estimation error regardless of the map size. Accordingly, the proposed eCDRO-SLAM can be utilized in various RO-SLAM applications.
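A Sum of Gaussians (SoG) landmark representation for range-only SLAM is commonly initialized by spreading Gaussian hypotheses along the circle consistent with the first range reading; later readings reweight and prune the components. The 2D sketch below, with assumed component counts and noise values, only illustrates that idea and is not the CDRO/eCDRO-SLAM implementation.

```python
import numpy as np

def init_sog_from_range(robot_xy, first_range, n_hypotheses=12,
                        sigma_range=0.3, sigma_tangential=1.0):
    """Place n_hypotheses Gaussian components on the circle of radius
    first_range around the robot. Each component has a small radial variance
    (range noise) and a larger tangential variance, so the mixture
    approximates the ring-shaped likelihood of a range-only observation."""
    components = []
    for k in range(n_hypotheses):
        theta = 2.0 * np.pi * k / n_hypotheses
        direction = np.array([np.cos(theta), np.sin(theta)])
        mean = np.asarray(robot_xy) + first_range * direction
        # Covariance aligned with the radial/tangential directions.
        R = np.column_stack([direction, [-direction[1], direction[0]]])
        C = R @ np.diag([sigma_range**2, sigma_tangential**2]) @ R.T
        components.append({"weight": 1.0 / n_hypotheses, "mean": mean, "cov": C})
    return components
```

Once subsequent ranges concentrate the weight in a single component, the landmark can be collapsed into an ordinary Gaussian and updated at normal filter cost.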
4

Xu, S., Z. Ji, D. T. Pham, and F. Yu. "Simultaneous localization and mapping: swarm robot mutual localization and sonar arc bidirectional carving mapping." Proceedings of the Institution of Mechanical Engineers, Part C: Journal of Mechanical Engineering Science 225, no. 3 (September 10, 2010): 733–44. http://dx.doi.org/10.1243/09544062jmes2239.

Abstract:
This work primarily aims to study robot swarm global mapping in a static indoor environment. Due to the prerequisite estimation of the robots' own poses, it is upgraded to a simultaneous localization and mapping (SLAM) problem. Five techniques are proposed to solve the SLAM problem: extended Kalman filter (EKF)-based mutual localization, sonar arc bidirectional carving mapping, grid-oriented correlation, working-robot-group substitution, and a termination rule. The EKF mutual localization algorithm updates the pose estimates not only of the current robot but also of the landmark-functioned robots. The arc-carving mapping algorithm increases the azimuth resolution of sonar readings by using their free-space regions to shrink the possible regions. It is further improved in both accuracy and efficiency by the ideas of bidirectional carving, grid-oriented correlated-arc carving, working-robot-group substitution, and the termination rule. Software simulations and hardware experiments have verified the feasibility of the proposed SLAM philosophy when implemented in a typical medium-cluttered office by a team of three robots. Besides the combined effect, individual algorithm components have also been investigated.
5

Rashidi, Ali Jabar, and Saeed Mohammadloo. "Simultaneous Cooperative Localization for AUVs Using Range-Only Sensors." International Journal of Information Acquisition 08, no. 02 (June 2011): 117–32. http://dx.doi.org/10.1142/s0219878911002380.

Abstract:
The absence of GPS underwater makes navigation for autonomous underwater vehicles (AUVs) a challenge. Moreover, the use of static beacons in the form of a long baseline (LBL) array limits the operation area to a few square kilometers and requires substantial deployment effort before operations. In this paper, an algorithm for cooperative localization of AUVs is proposed. We describe a form of cooperative Simultaneous Localization and Mapping (SLAM). Each of the robots in the group is equipped with an Inertial Measurement Unit (IMU), and some of them have a range-only sonar sensor that can determine the relative distance to the others. Two estimators, in the form of a Kalman filter, process the available position information from all the members of the team and produce a pose estimate for every one of them. Simulation results are presented for a typical localization example of a three-AUV formation in a large environment following an indirect trajectory. The results show that the proposed method offers good localization accuracy while requiring only a small number of low-cost sensors per vehicle, which validates it as an economical and practical localization approach.
6

Hsu, Chen-Chien, Wei-Yen Wang, Tung-Yuan Lin, Yin-Tien Wang, and Teng-Wei Huang. "Enhanced Simultaneous Localization and Mapping (ESLAM) for Mobile Robots." International Journal of Humanoid Robotics 14, no. 02 (April 16, 2017): 1750007. http://dx.doi.org/10.1142/s0219843617500074.

Abstract:
FastSLAM, such as FastSLAM 1.0 and FastSLAM 2.0, is a popular algorithm to solve the simultaneous localization and mapping (SLAM) problem for mobile robots. In real environments, however, the execution speed of FastSLAM would be too slow to achieve real-time operation with satisfactory accuracy, because of excessive comparisons of the measurement with all the existing landmarks in the particles, particularly when the number of landmarks increases drastically. In this paper, an enhanced SLAM (ESLAM) is proposed, which uses not only odometer information but also sensor measurements to estimate the robot’s pose in the prediction step. Landmark information that has the maximum likelihood is then used to update the robot’s pose before updating the landmarks’ locations. Compared to existing FastSLAM algorithms, the proposed ESLAM algorithm has better performance in terms of computation efficiency as well as localization and mapping accuracy, as demonstrated in the illustrated examples.
7

Herranz, F., A. Llamazares, E. Molinos, M. Ocaña, and M. A. Sotelo. "WiFi SLAM algorithms: an experimental comparison." Robotica 34, no. 4 (July 18, 2014): 837–58. http://dx.doi.org/10.1017/s0263574714001908.

Abstract:
Localization and mapping in indoor environments, such as airports and hospitals, are key tasks for almost every robotic platform. Some researchers suggest the use of Range-Only (RO) sensors based on WiFi (Wireless Fidelity) technology with SLAM (Simultaneous Localization And Mapping) techniques to solve both problems. The current state of the art in RO SLAM is mainly focused on the filtering approach, while the study of smoothing approaches with RO sensors is quite incomplete. This paper presents a comparison between filtering algorithms, such as EKF and FastSLAM, and a smoothing algorithm, the SAM (Smoothing And Mapping). Experimental results are obtained in indoor environments using WiFi sensors. The results demonstrate the feasibility of the smoothing approach using WiFi sensors in an indoor environment.
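To make the filtering-versus-smoothing distinction concrete, a smoothing (SAM-style) formulation keeps every robot pose and access-point position in one nonlinear least-squares problem instead of marginalizing past poses. The residual sketch below (SciPy-based, 2D, headings omitted, all names and noise values assumed) shows that structure only; it is not the implementation compared in the paper.

```python
import numpy as np
from scipy.optimize import least_squares

def residuals(theta, n_poses, n_aps, odom, ranges, sigma_o=0.05, sigma_r=0.5):
    """theta stacks [x0, y0, ..., x_{n-1}, y_{n-1}, ax0, ay0, ...].
    odom:   list of (i, dx, dy) relative-motion constraints between poses i, i+1.
    ranges: list of (i, j, r) measured WiFi range between pose i and AP j."""
    poses = theta[:2 * n_poses].reshape(n_poses, 2)
    aps = theta[2 * n_poses:].reshape(n_aps, 2)
    res = []
    for i, dx, dy in odom:    # odometry (smoothing) factors
        res.extend(((poses[i + 1] - poses[i]) - np.array([dx, dy])) / sigma_o)
    for i, j, r in ranges:    # range-only factors
        res.append((np.linalg.norm(poses[i] - aps[j]) - r) / sigma_r)
    return np.array(res)

# Usage (hypothetical data): jointly optimize all poses and AP positions.
# sol = least_squares(residuals, theta0, args=(n_poses, n_aps, odom, ranges))
```

A filter such as the EKF or FastSLAM would instead process the same range factors sequentially and keep only the latest pose, which is the trade-off the experimental comparison examines.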
8

Wang, Hongling, Chengjin Zhang, Yong Song, and Bao Pang. "Information-Fusion Methods Based Simultaneous Localization and Mapping for Robot Adapting to Search and Rescue Postdisaster Environments." Journal of Robotics 2018 (2018): 1–13. http://dx.doi.org/10.1155/2018/4218324.

Abstract:
This paper develops the first application of unique information-fusion SLAM (IF-SLAM) methods for mobile robots performing simultaneous localization and mapping (SLAM) adapted to search and rescue (SAR) environments. Several fusion approaches are proposed: parallel measurement filtering, exploration-trajectory fusing, and the combination of sensor measurements with mobile-robot trajectories. Novel integration particle filter (IPF) and optimal improved EKF (IEKF) algorithms are derived for information-fusion systems to perform the SLAM task in SAR scenarios. The information-fusion architecture consists of multiple robots and multiple sensors (MAM); the robots carry on-board laser range finder (LRF) sensors, localization sonars, gyro odometry, a Kinect sensor, an RGB-D camera, and other proprioceptive sensors. This information-fusion SLAM (IF-SLAM) is compared with conventional methods, which indicates that the fused trajectory is more consistent with the estimated and real observation trajectories. Simulations and experiments of the SLAM process are conducted in both a cluttered indoor environment and an outdoor collapsed unstructured scenario, and the experimental results validate the effectiveness of the proposed information-fusion methods in improving SLAM performance in SAR scenarios.
9

McGarey, Patrick, Kirk MacTavish, François Pomerleau, and Timothy D. Barfoot. "TSLAM: Tethered simultaneous localization and mapping for mobile robots." International Journal of Robotics Research 36, no. 12 (October 2017): 1363–86. http://dx.doi.org/10.1177/0278364917732639.

Abstract:
Tethered mobile robots are useful for exploration in steep, rugged, and dangerous terrain. A tether can provide a robot with robust communications, power, and mechanical support, but also constrains motion. In cluttered environments, the tether will wrap around a number of intermediate ‘anchor points’, complicating navigation. We show that by measuring the length of tether deployed and the bearing to the most recent anchor point, we can formulate a tethered simultaneous localization and mapping (TSLAM) problem that allows us to estimate the pose of the robot and the positions of the anchor points, using only low-cost, nonvisual sensors. This information is used by the robot to safely return along an outgoing trajectory while avoiding tether entanglement. We are motivated by TSLAM as a building block to aid conventional, camera, and laser-based approaches to simultaneous localization and mapping (SLAM), which tend to fail in dark and/or dusty environments. Unlike conventional range-bearing SLAM, the TSLAM problem must account for the fact that the tether-length measurements are a function of the robot’s pose and all the intermediate anchor-point positions. While this fact has implications on the sparsity that can be exploited in our method, we show that a solution to the TSLAM problem can still be found and formulate two approaches: (i) an online particle filter based on FastSLAM and (ii) an efficient, offline batch solution. We demonstrate that either method outperforms odometry alone, both in simulation and in experiments using our TReX (Tethered Robotic eXplorer) mobile robot operating in flat-indoor and steep-outdoor environments. For the indoor experiment, we compare each method using the same dataset with ground truth, showing that batch TSLAM outperforms particle-filter TSLAM in localization and mapping accuracy, owing to superior anchor-point detection, data association, and outlier rejection.
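The abstract's point that a tether-length measurement depends on the robot pose and every intermediate anchor point can be made concrete with a simple measurement model: the predicted reading is the length of the polyline from the tether base through the anchors to the robot. The function below is a hypothetical illustration of that model, not the TSLAM estimator itself.

```python
import numpy as np

def expected_tether_length(anchor_points, robot_xy):
    """Predicted tether-length reading. anchor_points is a (k, 2) array
    ordered from the tether base through each intermediate anchor; robot_xy
    is the robot position. Because the sum runs over every segment, the
    measurement couples the robot with *all* anchor positions, which is what
    breaks the sparsity of ordinary range-bearing SLAM."""
    pts = np.vstack([anchor_points, robot_xy])
    segments = np.diff(pts, axis=0)
    return np.linalg.norm(segments, axis=1).sum()
```

In an estimator, the residual between this prediction and the measured deployed length (plus the bearing to the most recent anchor) would constrain the whole anchor chain at once.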
10

Wu, Ming, Lin Lin Li, Cheng Jian Li, Hong Qiao Wang, and Zhen Hua Wei. "Simultaneous Localization, Mapping and Detection of Objects for Mobile Robot Based on Information Fusion in Dynamic Environment." Advanced Materials Research 1014 (July 2014): 319–22. http://dx.doi.org/10.4028/www.scientific.net/amr.1014.319.

Abstract:
This paper presents a novel approach for simultaneous localization, mapping (SLAM), and detection of moving objects based on information fusion. We use different information sources, such as a laser range scanner and a monocular camera, to improve the ability to distinguish between objects and the background environment. The approach improves the accuracy of SLAM in complex environments, reduces the interference caused by objects, and enhances the practical utility of traditional SLAM methods. Moreover, the approach expands the fields of both research and application of SLAM in combination with target tracking methods. Results from real robot experiments show the effectiveness of our method.

Dissertations / Theses on the topic "Range-Only-SLAM (Simultaneous Localization and Mapping)"

1

Dahlin, Alfred. "Simultaneous Localization and Mapping for an Unmanned Aerial Vehicle Using Radar and Radio Transmitters." Thesis, Linköpings universitet, Reglerteknik, 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-110645.

Abstract:
The Global Positioning System (GPS) is a cornerstone in Unmanned Aerial Vehicle (UAV) navigation and is by far the most common way to obtain the position of a UAV. However, since there are many scenarios in which GPS measurements might not be available, the possibility of estimating the UAV position without the GPS would greatly improve the overall robustness of the navigation. This thesis studies the possibility of instead using Simultaneous Localisation and Mapping (SLAM) to estimate the position of a UAV using an Inertial Measurement Unit (IMU) and the directions towards ground-based radio transmitters without prior knowledge of their positions. Simulations using appropriately generated data provide a feasibility analysis, which shows promising position errors for outdoor trajectories over large areas, although with some issues regarding overall offset. The method seems to have potential, but further studies using measurements from a live flight are required in order to determine its true performance.
2

Herrera, Luis Ernesto Ynoquio. "Mobile Robot Simultaneous Localization and Mapping Using DP-SLAM with a Single Laser Range Finder." Pontifícia Universidade Católica do Rio de Janeiro, 2011. http://www.maxwell.vrac.puc-rio.br/Busca_etds.php?strSecao=resultado&nrSeq=34617@1.

Abstract:
Simultaneous Localization and Mapping (SLAM) is one of the most widely researched areas of Robotics. It addresses the mobile robot problem of generating a map without prior knowledge of the environment, while keeping track of its position. Although technology offers increasingly accurate position sensors, even small measurement errors can accumulate and compromise the localization accuracy. This becomes evident when programming a robot to return to its original position after traveling a long distance, based only on its sensor readings. Thus, to improve SLAM's performance it is necessary to represent its formulation using probability theory. The Extended Kalman Filter SLAM (EKF-SLAM) is a basic solution and, despite its shortcomings, it is by far the most popular technique. FastSLAM, on the other hand, solves some limitations of EKF-SLAM using an instance of the Rao-Blackwellized particle filter. Another successful solution is the DP-SLAM approach, which uses a grid representation and a hierarchical algorithm to build accurate 2D maps. All SLAM solutions require two types of sensor information: odometry and range measurements. Laser Range Finders (LRF) are popular range measurement sensors and, because of their accuracy, are well suited for odometry error correction. Furthermore, the odometer may even be eliminated from the system if multiple consecutive LRF scans are matched. This work presents a detailed implementation of these three SLAM solutions, focused on structured indoor environments. The implementation is able to map 2D environments, as well as 3D environments with planar terrain, such as in a typical indoor application. The 2D application automatically generates a stochastic grid map. The 3D problem, on the other hand, uses a point cloud representation of the map, instead of a 3D grid, to reduce the SLAM computational effort. The considered mobile robot uses only a single LRF, without any odometry information. A Genetic Algorithm is presented to optimize the matching of LRF scans taken at different instants. Such matching is able not only to map the environment but also to localize the robot, without the need for odometers or other sensors. A simulation program is implemented in Matlab to generate virtual LRF readings of a mobile robot in a 3D environment. Both simulated readings and experimental data from the literature are independently used to validate the proposed methodology, automatically generating 3D maps using just a single LRF.
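As background for the stochastic 2D grid maps mentioned above, occupancy grids are typically maintained with an additive per-cell log-odds update after each laser ray is traced. The constants below are assumed illustration values, not those used in the thesis.

```python
import numpy as np

L_OCC, L_FREE, L_PRIOR = 0.85, -0.4, 0.0   # assumed log-odds increments

def update_cell(logodds, hit):
    """Additive log-odds update for one grid cell: cells at the ray endpoint
    are pushed towards 'occupied', traversed cells towards 'free'."""
    return logodds + (L_OCC if hit else L_FREE) - L_PRIOR

def occupancy_probability(logodds):
    """Convert a cell's log-odds back to an occupancy probability."""
    return 1.0 - 1.0 / (1.0 + np.exp(logodds))
```

The scan-to-scan alignment that replaces odometry (optimized in the thesis with a Genetic Algorithm) decides which cells each ray traverses before this update is applied.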
3

Huang, Henry. "Bearing-only SLAM : a vision-based navigation system for autonomous robots." Queensland University of Technology, 2008. http://eprints.qut.edu.au/28599/.

Abstract:
To navigate successfully in a previously unexplored environment, a mobile robot must be able to estimate the spatial relationships of the objects of interest accurately. A Simultaneous Localization and Mapping (SLAM) system employs its sensors to build incrementally a map of its surroundings and to localize itself in the map simultaneously. The aim of this research project is to develop a SLAM system suitable for self propelled household lawnmowers. The proposed bearing-only SLAM system requires only an omnidirectional camera and some inexpensive landmarks. The main advantage of an omnidirectional camera is the panoramic view of all the landmarks in the scene. Placing landmarks in a lawn field to define the working domain is much easier and more flexible than installing the perimeter wire required by existing autonomous lawnmowers. The common approach of existing bearing-only SLAM methods relies on a motion model for predicting the robot’s pose and a sensor model for updating the pose. In the motion model, the error on the estimates of object positions is cumulated due mainly to the wheel slippage. Quantifying accurately the uncertainty of object positions is a fundamental requirement. In bearing-only SLAM, the Probability Density Function (PDF) of landmark position should be uniform along the observed bearing. Existing methods that approximate the PDF with a Gaussian estimation do not satisfy this uniformity requirement. This thesis introduces both geometric and probabilistic methods to address the above problems. The main novel contributions of this thesis are: 1. A bearing-only SLAM method not requiring odometry. The proposed method relies solely on the sensor model (landmark bearings only) without relying on the motion model (odometry). The uncertainty of the estimated landmark positions depends on the vision error only, instead of the combination of both odometry and vision errors. 2. The transformation of the spatial uncertainty of objects. This thesis introduces a novel method for translating the spatial uncertainty of objects estimated from a moving frame attached to the robot into the global frame attached to the static landmarks in the environment. 3. The characterization of an improved PDF for representing landmark position in bearing-only SLAM. The proposed PDF is expressed in polar coordinates, and the marginal probability on range is constrained to be uniform. Compared to the PDF estimated from a mixture of Gaussians, the PDF developed here has far fewer parameters and can be easily adopted in a probabilistic framework, such as a particle filtering system. The main advantages of our proposed bearing-only SLAM system are its lower production cost and flexibility of use. The proposed system can be adopted in other domestic robots as well, such as vacuum cleaners or robotic toys when terrain is essentially 2D.
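The range-uniform polar density described above can be realized directly in a particle-filter framework by drawing landmark hypotheses with a uniform range and a narrow Gaussian spread around the observed bearing. The sketch below uses assumed range bounds and noise values and is only an illustration of that density, not the thesis code.

```python
import numpy as np

def sample_landmark_hypotheses(robot_xy, robot_heading, bearing_meas,
                               n=500, r_min=0.5, r_max=20.0, sigma_bearing=0.02):
    """Draw landmark-position particles whose marginal over range is uniform
    (the requirement stated above for bearing-only initialization) and whose
    bearing is Gaussian around the observation. Bounds and noise are assumed."""
    rng = np.random.default_rng()
    ranges = rng.uniform(r_min, r_max, n)
    bearings = robot_heading + bearing_meas + rng.normal(0.0, sigma_bearing, n)
    xs = robot_xy[0] + ranges * np.cos(bearings)
    ys = robot_xy[1] + ranges * np.sin(bearings)
    return np.column_stack([xs, ys])
```

Unlike a single Gaussian (or a Gaussian mixture fitted along the ray), such a particle set keeps the range marginal exactly uniform until later observations constrain it.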
4

Weber, Richard. "Automatisierte Integration von funkbasierten Sensornetzen auf Basis simultaner Lokalisierung und Kartenerstellung." 2018. https://tud.qucosa.de/id/qucosa%3A75245.

Abstract:
The aim of this work is the development of a method for the automated integration of wireless sensor networks (WSN) into the respective application environment. Besides communication tasks, the sensor networks there primarily provide location information. Depot management in public transport is a typical application: permanently available position coordinates of buses and trams, as mobile objects in the traffic environment, enable more efficient operational management. The database in this work is formed, on the one hand, by geometric relationships in the sensor network, which for reasons of availability are described only by pairwise distances between the mobile objects and the anchors permanently installed in the environment. On the other hand, existing digital map material in the form of vector and raster maps, e.g. obtained from GIS services, is used. The arguments for automation are obvious. First, the effort of position calibration should not scale with the number of anchors installed, which can only be achieved through automation; this also eliminates symptomatic sources of error resulting from manual system integration. Second, automation should ensure real-time operation (e.g. recalibration and remote maintenance), eliminating costly maintenance and service. The developed method first estimates relative position information for anchors and mobile objects from the sensor data by means of Range-Only Simultaneous Localization and Mapping (RO-SLAM). The method then merges this information within a cooperative map creation step. From the relative, cooperative results and the available map material, an application-specific absolute spatial reference is finally established. Based on semi-real sensor data and defined test scenarios, the results of the method evaluation demonstrate the functionality and performance of the developed method. They contain qualifying statements and also show statistically reliable limits of accuracy.

Book chapters on the topic "Range-Only-SLAM (Simultaneous Localization and Mapping)"

1

Torres-González, Arturo, J. Ramiro Martínez-de Dios, and Aníbal Ollero. "Range-Only Simultaneous Localization and Mapping for Aerial Robots." In Encyclopedia of Robotics, 1–9. Berlin, Heidelberg: Springer Berlin Heidelberg, 2020. http://dx.doi.org/10.1007/978-3-642-41610-1_73-1.

2

Gaber, Heba, Mohamed Marey, Safaa Amin, and Mohamed F. Tolba. "Localization and Mapping for Indoor Navigation." In Handbook of Research on Machine Learning Innovations and Trends, 136–60. IGI Global, 2017. http://dx.doi.org/10.4018/978-1-5225-2229-4.ch007.

Abstract:
Mapping and exploration for the purpose of navigation in unknown or partially unknown environments is a challenging problem, especially in indoor environments where GPS signals cannot provide the required accuracy. This chapter discusses the main aspects of designing a Simultaneous Localization and Mapping (SLAM) system architecture with the ability to function in situations where map information or current positions are initially unknown or partially unknown and where environment modifications are possible. Achieving this capability makes these systems significantly more autonomous and ideal for a large range of applications, especially indoor navigation for humans and for robotic missions. This chapter surveys the existing algorithms and technologies used for localization and mapping and highlights the use of SLAM algorithms for indoor navigation. The proposed approach for the current research is also presented.
3

Gaber, Heba, Mohamed Marey, Safaa Amin, and Mohamed F. Tolba. "Localization and Mapping for Indoor Navigation." In Robotic Systems, 930–54. IGI Global, 2020. http://dx.doi.org/10.4018/978-1-7998-1754-3.ch046.

Abstract:
Mapping and exploration for the purpose of navigation in unknown or partially unknown environments is a challenging problem, especially in indoor environments where GPS signals cannot provide the required accuracy. This chapter discusses the main aspects of designing a Simultaneous Localization and Mapping (SLAM) system architecture with the ability to function in situations where map information or current positions are initially unknown or partially unknown and where environment modifications are possible. Achieving this capability makes these systems significantly more autonomous and ideal for a large range of applications, especially indoor navigation for humans and for robotic missions. This chapter surveys the existing algorithms and technologies used for localization and mapping and highlights the use of SLAM algorithms for indoor navigation. The proposed approach for the current research is also presented.

Conference papers on the topic "Range-Only-SLAM (Simultaneous Localization and Mapping)"

1

Lourenco, Pedro, Pedro Batista, Paulo Oliveira, Carlos Silvestre, and C. L. Philip Chen. "Sensor-based globally asymptotically stable range-only simultaneous localization and mapping." In 2013 IEEE 52nd Annual Conference on Decision and Control (CDC). IEEE, 2013. http://dx.doi.org/10.1109/cdc.2013.6760786.

2

Fabresse, F. R., F. Caballero, L. Merino, and A. Ollero. "Active perception for 3D range-only simultaneous localization and mapping with UAVs." In 2016 International Conference on Unmanned Aircraft Systems (ICUAS). IEEE, 2016. http://dx.doi.org/10.1109/icuas.2016.7502639.

3

Fabresse, F. R., F. Caballero, and A. Ollero. "Decentralized simultaneous localization and mapping for multiple aerial vehicles using range-only sensors." In 2015 IEEE International Conference on Robotics and Automation (ICRA). IEEE, 2015. http://dx.doi.org/10.1109/icra.2015.7140099.

4

Putra, Irham Arfakhsadz, and Prawito Prajitno. "Parameter Tuning of G-mapping SLAM (Simultaneous Localization and Mapping) on Mobile Robot with Laser-Range Finder 360° Sensor." In 2019 International Seminar on Research of Information Technology and Intelligent Systems (ISRITI). IEEE, 2019. http://dx.doi.org/10.1109/isriti48646.2019.9034573.

5

Atanasov, Nikolay, Sean L. Bowman, Kostas Daniilidis, and George J. Pappas. "A Unifying View of Geometry, Semantics, and Data Association in SLAM." In Twenty-Seventh International Joint Conference on Artificial Intelligence (IJCAI-18). California: International Joint Conferences on Artificial Intelligence Organization, 2018. http://dx.doi.org/10.24963/ijcai.2018/722.

Abstract:
Traditional approaches for simultaneous localization and mapping (SLAM) rely on geometric features such as points, lines, and planes to infer the environment structure. They make hard decisions about the (data) association between observed features and mapped landmarks to update the environment model. This paper makes two contributions to the state of the art in SLAM. First, it generalizes the purely geometric model by introducing semantically meaningful objects, represented as structured models of mid-level part features. Second, instead of making hard, potentially wrong associations between semantic features and objects, it shows that SLAM inference can be performed efficiently with probabilistic data association. The approach not only allows building meaningful maps (containing doors, chairs, cars, etc.) but also offers significant advantages in ambiguous environments.
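The contrast between hard and probabilistic data association can be seen in a small computation: for one measurement, a hard scheme keeps only the most likely landmark, whereas a probabilistic scheme weights every landmark by its normalized measurement likelihood and updates them all accordingly. The toy sketch below (Gaussian likelihoods, all names assumed) shows the weight computation only, not the paper's semantic-object models.

```python
import numpy as np

def soft_association_weights(z, predicted, covs):
    """For one measurement z, return a weight per mapped landmark proportional
    to the Gaussian likelihood of z under that landmark's predicted
    measurement z_hat with innovation covariance S. A hard (max-likelihood)
    scheme would instead keep only np.argmax(weights)."""
    weights = []
    for z_hat, S in zip(predicted, covs):
        r = z - z_hat
        norm = 1.0 / np.sqrt(np.linalg.det(2.0 * np.pi * S))
        weights.append(norm * np.exp(-0.5 * r @ np.linalg.solve(S, r)))
    w = np.array(weights)
    return w / (w.sum() + 1e-12)   # each landmark receives a weighted update
```

Weighting updates this way avoids committing to a single, possibly wrong association in ambiguous scenes, which is the advantage the abstract highlights.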
6

Andersson, Lars A. A., and Marcus Berglund. "Camera Based Concept for Enhancement of IRB Accuracy Using Fixed Features and SLAM." In ASME 8th Biennial Conference on Engineering Systems Design and Analysis. ASMEDC, 2006. http://dx.doi.org/10.1115/esda2006-95312.

Abstract:
A standard IRB (Industrial Robot) of today is not capable of positioning itself to within better than 2–5 mm relative to a given structure or object. In order to be able to perform high-precision manufacturing, the requirement is less than 0.5 mm. Existing solutions for 6DOF measurements to enhance the accuracy are not only expensive but also lack flexibility. An affordable, accurate concept for IRB closed-loop position feedback is presented here. This article proposes a method for using a camera together with fixed features to enhance the accuracy of an IRB in industrial manufacturing processes. The method is based on SLAM (Simultaneous Localization and Mapping), used to structure collected information about the features (mapping) and for the IRB to locate itself among the features (localization). The concept covers a scenario in which a set of features is placed within the working area. A camera is mounted on the IRB to observe (measure) the relative pose of the feature set. When the system is running, observation of even a single feature increases accuracy; in normal operation, more features are observed, which increases accuracy further. The system setup is a standard IRB, an ABB4400, on which a single camera with an IR flash is mounted and used as the sensor. The features are made out of reflectors that are placed in the production cell. The camera is also equipped with an IR filter to improve the signal-to-noise ratio, simplifying image processing and accelerating execution. Execution time is critical, since the goal is to sample at a rate sufficient to cover the dynamics of the robot.
