Academic literature on the topic 'Sensors fusion for localisation'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Sensors fusion for localisation.'
Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.
Journal articles on the topic "Sensors fusion for localisation"
Ashokaraj, Immanuel, Antonios Tsourdos, Peter Silson, and Brian White. "Sensor Based Robot Localisation and Navigation: Using Interval Analysis and Nonlinear Kalman Filters." Transactions of the Canadian Society for Mechanical Engineering 29, no. 2 (June 2005): 211–27. http://dx.doi.org/10.1139/tcsme-2005-0014.
Meng, Lijun, Zhengang Guo, and Chenglong Ma. "Research on multiple damage localisation based on fusion of the Lamb wave ellipse algorithm and RAPID algorithm." Insight - Non-Destructive Testing and Condition Monitoring 66, no. 1 (January 1, 2024): 34–40. http://dx.doi.org/10.1784/insi.2024.66.1.34.
Nikitenko, Agris, Aleksis Liekna, Martins Ekmanis, Guntis Kulikovskis, and Ilze Andersone. "Single Robot Localisation Approach for Indoor Robotic Systems through Integration of Odometry and Artificial Landmarks." Applied Computer Systems 14, no. 1 (June 1, 2013): 50–58. http://dx.doi.org/10.2478/acss-2013-0006.
Tibebu, Haileleol, Varuna De-Silva, Corentin Artaud, Rafael Pina, and Xiyu Shi. "Towards Interpretable Camera and LiDAR Data Fusion for Autonomous Ground Vehicles Localisation." Sensors 22, no. 20 (October 20, 2022): 8021. http://dx.doi.org/10.3390/s22208021.
Moretti, Michele, Federico Bianchi, and Nicola Senin. "Towards the development of a smart fused filament fabrication system using multi-sensor data fusion for in-process monitoring." Rapid Prototyping Journal 26, no. 7 (June 26, 2020): 1249–61. http://dx.doi.org/10.1108/rpj-06-2019-0167.
Donati, Cesare, Martina Mammarella, Lorenzo Comba, Alessandro Biglia, Paolo Gay, and Fabrizio Dabbene. "3D Distance Filter for the Autonomous Navigation of UAVs in Agricultural Scenarios." Remote Sensing 14, no. 6 (March 11, 2022): 1374. http://dx.doi.org/10.3390/rs14061374.
Kozłowski, Michał, Raúl Santos-Rodríguez, and Robert Piechocki. "Sensor Modalities and Fusion for Robust Indoor Localisation." ICST Transactions on Ambient Systems 6, no. 18 (December 12, 2019): 162670. http://dx.doi.org/10.4108/eai.12-12-2019.162670.
Merfels, Christian, and Cyrill Stachniss. "Sensor Fusion for Self-Localisation of Automated Vehicles." PFG – Journal of Photogrammetry, Remote Sensing and Geoinformation Science 85, no. 2 (March 7, 2017): 113–26. http://dx.doi.org/10.1007/s41064-017-0008-1.
Ciuffreda, Ilaria, Sara Casaccia, and Gian Marco Revel. "A Multi-Sensor Fusion Approach Based on PIR and Ultrasonic Sensors Installed on a Robot to Localise People in Indoor Environments." Sensors 23, no. 15 (August 5, 2023): 6963. http://dx.doi.org/10.3390/s23156963.
Neuland, Renata, Mathias Mantelli, Bernardo Hummes, Luc Jaulin, Renan Maffei, Edson Prestes, and Mariana Kolberg. "Robust Hybrid Interval-Probabilistic Approach for the Kidnapped Robot Problem." International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems 29, no. 02 (April 2021): 313–31. http://dx.doi.org/10.1142/s0218488521500141.
Dissertations / Theses on the topic "Sensors fusion for localisation"
Millikin, R. L. "Sensor fusion for the localisation of birds in flight." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 2002. http://www.collectionscanada.ca/obj/s4/f2/dsk3/ftp04/NQ65871.pdf.
Welte, Anthony. "Spatio-temporal data fusion for intelligent vehicle localization." Thesis, Compiègne, 2020. http://bibliotheque.utc.fr/EXPLOITATION/doc/IFD/2020COMP2572.
Localization is an essential basic capability for vehicles to be able to navigate autonomously on the road. This can be achieved through already available sensors and new technologies (lidars, smart cameras). These sensors, combined with highly accurate maps, result in greater accuracy. In this work, the benefits of storing and reusing information in memory (in data buffers) are explored. Localization systems need to perform high-frequency estimation, map matching, calibration and error detection. A framework composed of several processing layers is proposed and studied. A main filtering layer estimates the vehicle pose while other layers address the more complex problems. High-frequency state estimation relies on proprioceptive measurements combined with GNSS observations. Calibration is essential to obtain an accurate pose. By keeping state estimates and observations in a buffer, the observation models of these sensors can be calibrated. This is achieved using smoothed estimates in place of a ground truth. Lidars and smart cameras provide measurements that can be used for localization but raise matching issues with map features. In this work, the matching problem is addressed on a spatio-temporal window, resulting in a more detailed picture of the environment. The state buffer is adjusted using the observations and all possible matches. Although using mapped features for localization enables greater accuracy, this is only true if the map can be trusted. An approach using the post-smoothing residuals has been developed to detect changes and either mitigate or reject the affected features.
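The high-frequency estimation layer described in this abstract (proprioceptive prediction corrected by GNSS observations) can be illustrated with a minimal one-dimensional Kalman filter. This is a generic textbook sketch, not the thesis's implementation; the noise variances and the pairing of odometry speeds with GNSS fixes are invented for illustration:

```python
import numpy as np

def fuse_odometry_gnss(gnss_positions, odo_velocities, dt=0.1,
                       q_var=0.01, r_var=4.0):
    """1D constant-velocity Kalman filter: predict the position with the
    proprioceptive (odometry) speed, correct it with each GNSS fix."""
    x = gnss_positions[0]   # state: position along the path, init on first fix
    p = r_var               # state variance
    estimates = []
    for z, v in zip(gnss_positions[1:], odo_velocities):
        # prediction from proprioceptive data
        x += v * dt
        p += q_var
        # correction with the GNSS observation
        k = p / (p + r_var)          # Kalman gain
        x += k * (z - x)
        p *= (1.0 - k)
        estimates.append(x)
    return np.array(estimates)
```

In practice `q_var` and `r_var` would come from sensor calibration; here they are placeholders.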
Lilja, Robin. "A Localisation and Navigation System for an Autonomous Wheel Loader." Thesis, Mälardalens högskola, Akademin för innovation, design och teknik, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-12157.
Full textMatsumoto, Takeshi, and takeshi matsumoto@flinders edu au. "Real-Time Multi-Sensor Localisation and Mapping Algorithms for Mobile Robots." Flinders University. Computer Science, Engineering and Mathematics, 2010. http://catalogue.flinders.edu.au./local/adt/public/adt-SFU20100302.131127.
Khairallah, Mahmoud. "Flow-Based Visual-Inertial Odometry for Neuromorphic Vision Sensors." Electronic Thesis or Diss., université Paris-Saclay, 2022. http://www.theses.fr/2022UPAST117.
Rather than generating images constantly and synchronously, neuromorphic vision sensors (also known as event-based cameras) permit each pixel to provide information independently and asynchronously whenever a brightness change is detected. Consequently, neuromorphic vision sensors do not suffer from the problems of conventional frame-based cameras, such as image artifacts and motion blur. Furthermore, they can provide lossless data compression, higher temporal resolution and higher dynamic range. Hence, event-based cameras conveniently replace frame-based cameras in robotic applications requiring high maneuverability and varying environmental conditions. In this thesis, we address the problem of visual-inertial odometry using event-based cameras and an inertial measurement unit. Exploiting the consistency of event-based cameras with the brightness constancy condition, we discuss the feasibility of building a visual odometry system based on optical flow estimation. We develop our approach based on the assumption that event-based cameras provide edge-like information about the objects in the scene, and apply a line detection algorithm for data reduction. Line tracking allows us to gain more time for computations and provides a better representation of the environment than feature points. In this thesis, we show not only an approach for event-based visual-inertial odometry but also event-based algorithms that can be used as stand-alone algorithms or integrated into other approaches if needed.
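The brightness constancy condition mentioned in this abstract is the basis of classical optical flow estimation. A minimal dense least-squares (Lucas-Kanade style) sketch for a single patch, not the thesis's event-based line-tracking method, might look like:

```python
import numpy as np

def patch_flow(I0, I1):
    """Estimate one flow vector (u, v) for an image patch from brightness
    constancy, Ix*u + Iy*v + It = 0, solved in the least-squares sense."""
    Ix = np.gradient(I0, axis=1)   # horizontal image gradient
    Iy = np.gradient(I0, axis=0)   # vertical image gradient
    It = I1 - I0                   # temporal derivative between the frames
    A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)
    flow, *_ = np.linalg.lstsq(A, -It.ravel(), rcond=None)
    return flow                    # (u, v) in pixels per frame
```

On event data one would first accumulate events into an edge-like image; here the inputs are plain intensity patches.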
Salehi, Achkan. "Localisation précise d'un véhicule par couplage vision/capteurs embarqués/systèmes d'informations géographiques." Thesis, Université Clermont Auvergne (2017-2020), 2018. http://www.theses.fr/2018CLFAC064/document.
The fusion between sensors and databases whose errors are independent is the most reliable and therefore most widespread solution to the localization problem. Current autonomous and semi-autonomous vehicles, as well as augmented reality applications targeting industrial contexts, exploit large sensor and database graphs that are difficult and expensive to synchronize and calibrate. Thus, the democratization of these technologies requires exploring the possibility of exploiting low-cost and easily accessible sensors and databases. These information sources are naturally tainted by higher uncertainty levels, and many obstacles to their effective and efficient practical usage persist. Moreover, the recent but dazzling successes of deep neural networks in various tasks seem to indicate that they could be a viable and low-cost alternative to some components of current SLAM systems. In this thesis, we focused on large-scale localization of a vehicle in a georeferenced coordinate frame from a low-cost system, which is based on the fusion between a monocular video stream, 3d non-textured but georeferenced building models, terrain elevation models and data either from a low-cost GPS or from vehicle odometry. Our work targets the resolution of two problems. The first one is related to the fusion, via barrier term optimization, of VSLAM and positioning measurements provided by a low-cost GPS. This method is, to the best of our knowledge, the most robust against GPS uncertainties, but it is more demanding in terms of computational resources. We propose an algorithmic optimization of that approach based on the definition of a novel barrier term. The second problem is the data association problem between the primitives that represent the geometry of the scene (e.g. 3d points) and the 3d building models. Previous works in that area use simple geometric criteria and are therefore very sensitive to occlusions in urban environments.
We exploit deep convolutional neural networks in order to identify and associate elements from the map that correspond to 3d building model façades. Although our contributions are for the most part independent from the underlying SLAM system, we based our experiments on constrained key-frame based bundle adjustment. The solutions that we propose are evaluated on synthetic sequences as well as on real urban datasets. These experiments show important performance gains for VSLAM/GPS fusion, and considerable improvements in the robustness of building constraints to occlusions.
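The barrier-term fusion idea can be caricatured in two dimensions: the visual (VSLAM) cost pulls the pose one way while a log-barrier keeps it inside the GPS confidence region. The following is a generic constrained-optimisation sketch with invented values, not the thesis's barrier term:

```python
import numpy as np

def fuse_with_barrier(cost_grad, x0, gps, radius, mu=1e-3,
                      step=0.05, iters=500):
    """Minimise f(x) - mu*log(radius^2 - |x - gps|^2) by gradient
    descent: the log-barrier confines x to the GPS confidence disc."""
    x = np.array(x0, dtype=float)
    g = np.asarray(gps, dtype=float)
    for _ in range(iters):
        d = x - g
        slack = radius**2 - d @ d        # > 0 while inside the disc
        grad = cost_grad(x) + 2.0 * mu * d / slack
        x_new = x - step * grad
        dn = x_new - g
        if radius**2 - dn @ dn > 0.0:    # accept only feasible steps
            x = x_new
        else:
            step *= 0.5                  # shrink the step near the boundary
    return x
```

With a visual cost whose minimum lies outside the disc, the estimate settles just inside the boundary, which is the behaviour the barrier is meant to enforce.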
Héry, Elwan. "Localisation coopérative de véhicules autonomes communicants." Thesis, Compiègne, 2019. http://www.theses.fr/2019COMP2516.
To be able to navigate autonomously, a vehicle must be accurately localized relative to all obstacles, such as the roadside for lane keeping, and vehicles and pedestrians to avoid causing accidents. This PhD thesis deals with the benefit of cooperation in improving the localization of cooperative vehicles that exchange information. Autonomous navigation on the road is often based on coordinates provided in a Cartesian frame. In order to better represent the pose of a vehicle with respect to the lane in which it travels, we study curvilinear coordinates with respect to a path stored in a map. These coordinates generalize the curvilinear abscissa by adding a signed lateral deviation from the center of the lane and an orientation relative to the center of the lane, taking into account the direction of travel. These coordinates are studied with different track models and using different projections to perform map-matching. A first cooperative localization approach is based on these coordinates. The lateral deviation and the orientation relative to the lane can be known precisely from a perception of the lane borders, but for autonomous driving with other vehicles, it is important to maintain a good longitudinal accuracy. A one-dimensional data fusion method makes it possible to show the interest of cooperative localization in this simplified case where the lateral deviation, the curvilinear orientation and the relative positioning between two vehicles are accurately known. This case study shows that, in some cases, lateral accuracy can be propagated to other vehicles to improve their longitudinal accuracy. The correlation issues of the errors are taken into account with a covariance intersection filter. An ICP (Iterative Closest Point) minimization algorithm is then used to determine the relative pose between the vehicles from LiDAR points and a 2D polygonal model representing the shape of the vehicle.
Several correspondences of the LiDAR points with the model and different minimization approaches are compared. The propagation of absolute vehicle poses using relative poses with their uncertainties is done through non-linear equations that can have a strong impact on consistency. The different dynamic elements surrounding the ego-vehicle are estimated in a Local Dynamic Map (LDM) to enhance the static high-definition map describing the center of the lane and its border. In our case, the agents are only communicating vehicles. The LDM is composed of the state of each vehicle. The states are merged using an asynchronous algorithm, fusing available data at variable times. The algorithm is decentralized, each vehicle computing its own LDM and sharing it. As the position errors of the GNSS receivers are biased, a marking detection is introduced to obtain the lateral deviation from the center of the lane in order to estimate these biases. LiDAR observations processed with the ICP method enrich the fusion with constraints between the vehicles. Experimental results of this fusion show that the vehicles are localized more accurately with respect to each other while maintaining consistent poses.
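The covariance intersection filter mentioned in this abstract fuses two estimates whose error correlation is unknown, which is exactly the situation when cooperating vehicles exchange already-fused states. A minimal sketch (weight chosen by grid search; the covariances in the example are invented):

```python
import numpy as np

def covariance_intersection(x1, P1, x2, P2):
    """Fuse two estimates with unknown error correlation:
    P^-1 = w*P1^-1 + (1-w)*P2^-1, with w chosen by a simple grid
    search to minimise the trace of the fused covariance."""
    I1, I2 = np.linalg.inv(P1), np.linalg.inv(P2)
    best = None
    for w in np.linspace(0.01, 0.99, 99):
        P = np.linalg.inv(w * I1 + (1.0 - w) * I2)
        if best is None or np.trace(P) < best[0]:
            x = P @ (w * I1 @ x1 + (1.0 - w) * I2 @ x2)
            best = (np.trace(P), x, P)
    return best[1], best[2]
```

Unlike a naive Kalman update, the fused covariance stays consistent even if the two inputs share correlated errors, at the price of a more conservative result.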
Jacobson, Adam. "Bio-inspired multi-sensor fusion and calibration for robot place learning and recognition." Thesis, Queensland University of Technology, 2018. https://eprints.qut.edu.au/116179/1/Adam%20Jacobson%20Thesis.pdf.
Ericsson, John-Eric, and Daniel Eriksson. "Indoor Positioning and Localisation System with Sensor Fusion: An Implementation on an Indoor Autonomous Robot at ÅF." Thesis, KTH, Maskinkonstruktion (Inst.), 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-168841.
The thesis presents guidelines for how sensors and algorithms for indoor positioning and localisation systems with sensor fusion should be chosen. The guidelines are based on an extensive theory and state-of-the-art survey. Different scenarios are presented to exemplify methods for choosing sensors and algorithms for applications. Naturally, no combination is right or wrong, but some factors are worth keeping in mind when a system is designed. To exemplify the proposed guidelines, a Simultaneous Localisation and Mapping (SLAM) system and an Indoor Positioning System (IPS) were designed and implemented on an embedded robot platform. The implemented SLAM system was based on a FastSLAM2 algorithm with ultrasonic sensors, and the implemented IPS was based on a WiFi RSS profiling method using a Weibull distribution. The methods, sensors and infrastructure were chosen based on requirements derived from the stakeholder's wishes and on knowledge from the theory and state-of-the-art survey. A combination of SLAM and IPS, here called WiFi SLAM, was proposed to reduce the uncertainties of both methods. Unfortunately, no combination was implemented and tested, due to unexpected problems with the platform. The systems were simulated individually before implementation on the embedded platform. Results from these simulations indicated that the requirements could be met and gave an indication of the minimal set-up needed for implementation. Both implemented systems showed the expected accuracies during testing, and with more time better calibration could have been performed, which would probably have yielded better results. From the results it could be concluded that a combined WiFi SLAM solution would have improved the results in a larger test area than the one used: the IPS would have increased its precision, while the SLAM would have increased its robustness.
The thesis has shown that there is no exact way to find a perfect sensor and method solution. Most important, however, is the trade-off between time, cost and quality. Other important factors are determining the environment the system will operate in and whether the system is safety-critical. It was also shown that fused sensor data will outperform the result of a single sensor, and that there is no upper limit to the number of fused sensors. This requires, however, that the sensor fusion algorithm is well calibrated; otherwise the opposite can occur.
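The WiFi RSS profiling method with a Weibull distribution described above can be sketched as a fingerprinting classifier: each reference location stores per-access-point Weibull parameters, and localisation picks the location maximising the likelihood of a scan. The database contents, location and AP names, and the RSS shift below are all invented for illustration:

```python
import math

# Hypothetical fingerprint database: location -> AP -> (shape k, scale lam),
# modelling the distribution of (100 + RSS_dBm), shifted to be positive.
FINGERPRINTS = {
    "room_a": {"ap1": (3.0, 45.0), "ap2": (2.5, 70.0)},
    "room_b": {"ap1": (3.0, 70.0), "ap2": (2.5, 45.0)},
}

def weibull_logpdf(x, k, lam):
    """Log of the Weibull density (k/lam)*(x/lam)^(k-1)*exp(-(x/lam)^k)."""
    if x <= 0:
        return float("-inf")
    return (math.log(k / lam) + (k - 1.0) * math.log(x / lam)
            - (x / lam) ** k)

def locate(scan):
    """Pick the fingerprint location maximising the summed Weibull
    log-likelihood of the observed (shifted) RSS values."""
    def score(loc):
        return sum(weibull_logpdf(scan[ap], k, lam)
                   for ap, (k, lam) in FINGERPRINTS[loc].items())
    return max(FINGERPRINTS, key=score)
```

In a real system the shape and scale parameters would be fitted from calibration scans at each reference point.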
Ladhari, Maroua. "Architecture générique de fusion par approche Top-Down : application à la localisation d’un robot mobile." Thesis, Université Clermont Auvergne (2017-2020), 2020. http://www.theses.fr/2020CLFAC052.
The issue addressed in this thesis is the localization of a mobile robot. Equipped with low-cost sensors, the robot aims to exploit the maximum possible amount of information to meet an objective set beforehand. A data fusion problem is treated in such a way that, in each situation, the robot selects which information to use to localize itself continuously. The data we process are of different types. In our work, two properties of localization are desired: accuracy and confidence. In order to be controlled, the robot must know its position in a precise and reliable way. Indeed, accuracy refers to the degree of uncertainty related to the estimated position; it is returned by a fusion filter. If, in addition, the degree of certainty of being in this uncertainty zone is high, we have a good confidence contribution and the estimate is considered reliable. These two properties are generally related, which is why they are often represented together to characterize the returned estimate of the robot position. In this work, our objective is to optimize these two properties simultaneously. To take advantage of the different existing techniques for an optimal estimation of the robot position, we propose a top-down approach based on the exploitation of an environment map defined in an absolute reference frame. This approach uses an a priori selection of the most informative measurements among all possible measurement sources. The selection is made according to a given objective (of accuracy and confidence), the current robot state and the informational contribution of the data. As the data are noisy and imprecise, and may also be ambiguous and unreliable, these limitations must be taken into account in order to provide the most accurate and reliable robot position estimate. For this, spatial focusing and a Bayesian network are used to reduce the risk of misdetection.
However, in case of ambiguities, these misdetections may still occur. A backward process has been developed in order to react efficiently to these situations and thus achieve the set objectives. The main contributions of this work are, on the one hand, the development of a high-level, generic and modular multi-sensory localization architecture with a top-down process. We use the concept of a perceptual triplet, i.e. the combination of a landmark, a sensor and a detector, to designate each perceptual module. At each time step, a prediction and an update are performed. For the update, the system selects the most relevant triplet (in terms of accuracy and confidence) according to an informational criterion. In order to ensure an accurate and reliable localization, our algorithm has been written in such a way that ambiguity can be managed. On the other hand, the developed algorithm allows a robot to be located in an environment map. For this purpose, the possibility of bad detections due to ambiguity phenomena has been taken into account in the backward process. Indeed, this process allows a bad detection to be corrected and the returned position estimate to be improved to meet a desired objective.
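The triplet-selection step (choosing the most relevant landmark/sensor/detector combination against an accuracy-and-confidence objective) can be caricatured with scalar variances. The criterion below, an expected posterior variance weighted by detection probability, is a stand-in assumption, not the thesis's informational criterion, and the triplet names and numbers are invented:

```python
def select_triplet(prior_var, triplets):
    """Toy top-down selection: pick the perceptual triplet minimising the
    expected posterior variance, trading accuracy (measurement variance
    r_var) against confidence (detection probability p_detect)."""
    def expected_posterior(t):
        # scalar Kalman update if the detection succeeds
        post = prior_var * t["r_var"] / (prior_var + t["r_var"])
        # with probability (1 - p_detect) the update fails: keep the prior
        return t["p_detect"] * post + (1.0 - t["p_detect"]) * prior_var
    return min(triplets, key=expected_posterior)
```

A very precise but unreliable detector can thus lose to a coarser but dependable one, which is the accuracy/confidence trade-off the architecture formalises.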
Books on the topic "Sensors fusion for localisation"
Hucks, John A. Fusion of ground-based sensors for optimal tracking of military targets. Monterey, Calif: Naval Postgraduate School, 1989.
Buser, Rudolph G., Frank B. Warren, Society of Photo-optical Instrumentation Engineers, and University of Alabama in Huntsville Center for Applied Optics, eds. Infrared sensors and sensor fusion: 19-21 May, 1987, Orlando, Florida. Bellingham, Wash., USA: SPIE--the International Society for Optical Engineering, 1987.
Multi-sensor data fusion with MATLAB. Boca Raton: Taylor & Francis, 2010.
Loffeld, Otmar, Centre national de la recherche scientifique (France), and Society of Photo-optical Instrumentation Engineers, eds. Vision systems--sensors, sensor systems, and components: 10-12 June 1996, Besançon, France. Bellingham, Wash: SPIE--the International Society for Optical Engineering, 1996.
Networked multisensor decision and estimation fusion: Based on advanced mathematical methods. Boca Raton, FL: Taylor & Francis, 2012.
Loffeld, Otmar, Society of Photo-optical Instrumentation Engineers, European Optical Society, and Commission of the European Communities Directorate-General for Science, Research, and Development, eds. Sensors, sensor systems, and sensor data processing: June 16-17, 1997, Munich, FRG. Bellingham, Wash., USA: SPIE, 1997.
Find full textIntelligent Sensors, Sensor Networks & Information Processing Conference (2nd 2005 Melbourne, Vic.). Proceedings of the 2005 Intelligent Sensors, Sensor Networks & Information Processing Conference: 5-8 December, 2005, Melbourne, Australia. [Piscataway, N.J.]: IEEE, 2006.
Find full textIntelligent Sensors, Sensor Networks & Information Processing Conference (2nd 2005 Melbourne, Vic.). Proceedings of the 2005 Intelligent Sensors, Sensor Networks & Information Processing Conference: 5-8 December, 2005, Melbourne, Australia. [Piscataway, N.J.]: IEEE, 2006.
Find full textIEEE/AESS Dayton Chapter Symposium (15th 1998 Fairborn, OH). Sensing the world: Analog sensors & systems across the spectrum : the 15th Annual AESS/IEEE Dayton Section Symposium, Fairborn, OH, 14-15 May 1998. Piscataway, N.J: Institute of Electrical and Electronics Engineers, 1998.
Find full textGreen, Milford B. Mergers and acquisitions: Geographical and spatial perspectives. London: Routledge, 1990.
Find full textBook chapters on the topic "Sensors fusion for localisation"
Espinosa, Jose, Mihalis Tsiakkas, Dehao Wu, Simon Watson, Joaquin Carrasco, Peter R. Green, and Barry Lennox. "A Hybrid Underwater Acoustic and RF Localisation System for Enclosed Environments Using Sensor Fusion." In Towards Autonomous Robotic Systems, 369–80. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-96728-8_31.
Mitchell, H. B. "Image Sensors." In Image Fusion, 9–17. Berlin, Heidelberg: Springer Berlin Heidelberg, 2010. http://dx.doi.org/10.1007/978-3-642-11216-4_2.
Kiaer, Jieun. "Fusion, localisation, and hybridity." In Delicious Words, 54–70. New York: Routledge, 2020. http://dx.doi.org/10.4324/9780429321801-4.
Majumder, Bansari Deb, and Joyanta Kumar Roy. "Multifunction Data Fusion." In Multifunctional Sensors, 49–54. Boca Raton: CRC Press, 2023. http://dx.doi.org/10.1201/9781003350484-4.
Mitchell, H. B. "Sensors." In Data Fusion: Concepts and Ideas, 15–30. Berlin, Heidelberg: Springer Berlin Heidelberg, 2012. http://dx.doi.org/10.1007/978-3-642-27222-6_2.
Koch, Wolfgang. "Characterizing Objects and Sensors." In Tracking and Sensor Data Fusion, 31–52. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-39271-9_2.
Subramanian, Rajesh. "Additional Sensors and Sensor Fusion." In Build Autonomous Mobile Robot from Scratch using ROS, 457–96. Berkeley, CA: Apress, 2023. http://dx.doi.org/10.1007/978-1-4842-9645-5_9.
Suciu, George, Andrei Scheianu, Cristina Mihaela Bălăceanu, Ioana Petre, Mihaela Dragu, Marius Vochin, and Alexandru Vulpe. "Sensors Fusion Approach Using UAVs and Body Sensors." In Advances in Intelligent Systems and Computing, 146–53. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-77700-9_15.
Věchet, S., and J. Krejsa. "Sensors Data Fusion via Bayesian Network." In Recent Advances in Mechatronics, 221–26. Berlin, Heidelberg: Springer Berlin Heidelberg, 2010. http://dx.doi.org/10.1007/978-3-642-05022-0_38.
Wagner, Jakub, Paweł Mazurek, and Roman Z. Morawski. "Fusion of Data from Impulse-Radar Sensors and Depth Sensors." In Health Information Science, 205–24. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-96009-4_7.
Conference papers on the topic "Sensors fusion for localisation"
Jarvis, R. A. "Autonomous Robot Localisation By Sensor Fusion." In IEEE International Workshop on Emerging Technologies and Factory Automation. IEEE, 1992. http://dx.doi.org/10.1109/etfa.1992.683295.
Izri, Sonia, and Eric Brassart. "Uncertainties quantification criteria for multi-sensors fusion: Application to vehicles localisation." In Automation (MED 2008). IEEE, 2008. http://dx.doi.org/10.1109/med.2008.4602171.
Redzic, Milan, Conor Brennan, and Noel E. O'Connor. "Dual-sensor fusion for indoor user localisation." In the 19th ACM international conference. New York, New York, USA: ACM Press, 2011. http://dx.doi.org/10.1145/2072298.2071948.
Franken, Dietrich. "An approximate maximum-likelihood estimator for localisation using bistatic measurements." In 2018 Sensor Data Fusion: Trends, Solutions, Applications (SDF). IEEE, 2018. http://dx.doi.org/10.1109/sdf.2018.8547074.
Alvarado, Biel Piero, Fernando Matia, and Ramon Galan. "Improving indoor robots localisation by fusing different sensors." In 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, 2018. http://dx.doi.org/10.1109/iros.2018.8593667.
Ristic, Branko, Mark Morelande, Alfonso Farina, and S. Dulman. "On Proximity-Based Range-Free Node Localisation in Wireless Sensor Networks." In 2006 9th International Conference on Information Fusion. IEEE, 2006. http://dx.doi.org/10.1109/icif.2006.301734.
Corral-Plaza, David, Olaf Reich, Erik Hübner, Matthias Wagner, and Inmaculada Medina-Bulo. "A Sensor Fusion System Identifying Complex Events for Localisation Estimation." In International Conference on Applied Computing 2019. IADIS Press, 2019. http://dx.doi.org/10.33965/ac2019_201912c033.
Full textKhoder, Makkawi, Ait-Tmazirte Nourdine, El Badaoui El Najjar Maan, and Moubayed Nazih. "Fault Tolerant multi-sensor Data Fusion for vehicle localisation using Maximum Correntropy Unscented Information Filter and α-Rényi Divergence." In 2020 IEEE 23rd International Conference on Information Fusion (FUSION). IEEE, 2020. http://dx.doi.org/10.23919/fusion45008.2020.9190407.
Pagnottelli, S., S. Taraglio, P. Valigi, and A. Zanela. "Visual and laser sensory data fusion for outdoor robot localisation and navigation." In 2005 12th International Conference on Advanced Robotics. IEEE, 2005. http://dx.doi.org/10.1109/icar.2005.1507409.
Uney, Murat, Bernard Mulgrew, and Daniel Clark. "Cooperative sensor localisation in distributed fusion networks by exploiting non-cooperative targets." In 2014 IEEE Statistical Signal Processing Workshop (SSP). IEEE, 2014. http://dx.doi.org/10.1109/ssp.2014.6884689.
Reports on the topic "Sensors fusion for localisation"
Cadwallader, L. C. Reliability estimates for selected sensors in fusion applications. Office of Scientific and Technical Information (OSTI), September 1996. http://dx.doi.org/10.2172/425367.
Lane, Brandon, Lars Jacquemetton, Martin Piltch, and Darren Beckett. Thermal calibration of commercial melt pool monitoring sensors on a laser powder bed fusion system. Gaithersburg, MD: National Institute of Standards and Technology, July 2020. http://dx.doi.org/10.6028/nist.ams.100-35.
Beiker, Sven. Next-generation Sensors for Automated Road Vehicles. 400 Commonwealth Drive, Warrendale, PA, United States: SAE International, February 2023. http://dx.doi.org/10.4271/epr2023003.
Mobley, Curtis D. Determining the Scattering Properties of Vertically-Structured Nepheloid Layers from the Fusion of Active and Passive Optical Sensors. Fort Belvoir, VA: Defense Technical Information Center, September 2006. http://dx.doi.org/10.21236/ada630921.
Kulhandjian, Hovannes. Detecting Driver Drowsiness with Multi-Sensor Data Fusion Combined with Machine Learning. Mineta Transportation Institute, September 2021. http://dx.doi.org/10.31979/mti.2021.2015.
Kulhandjian, Hovannes. AI-based Pedestrian Detection and Avoidance at Night using an IR Camera, Radar, and a Video Camera. Mineta Transportation Institute, November 2022. http://dx.doi.org/10.31979/mti.2022.2127.