Academic literature on the topic "Sensors fusion for localisation"
Create an accurate citation in APA, MLA, Chicago, Harvard, and other styles
Consult the topical lists of articles, books, theses, conference papers, and other academic sources on the topic "Sensors fusion for localisation".
Next to each source in the reference list you will find an "Add to bibliography" button. Click it, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Vancouver, Chicago, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.
Journal articles on the topic "Sensors fusion for localisation"
Ashokaraj, Immanuel, Antonios Tsourdos, Peter Silson and Brian White. "Sensor Based Robot Localisation and Navigation: Using Interval Analysis and Nonlinear Kalman Filters". Transactions of the Canadian Society for Mechanical Engineering 29, no. 2 (June 2005): 211–27. http://dx.doi.org/10.1139/tcsme-2005-0014.
Meng, Lijun, Zhengang Guo and Chenglong Ma. "Research on multiple damage localisation based on fusion of the Lamb wave ellipse algorithm and RAPID algorithm". Insight - Non-Destructive Testing and Condition Monitoring 66, no. 1 (1 January 2024): 34–40. http://dx.doi.org/10.1784/insi.2024.66.1.34.
Nikitenko, Agris, Aleksis Liekna, Martins Ekmanis, Guntis Kulikovskis and Ilze Andersone. "Single Robot Localisation Approach for Indoor Robotic Systems through Integration of Odometry and Artificial Landmarks". Applied Computer Systems 14, no. 1 (1 June 2013): 50–58. http://dx.doi.org/10.2478/acss-2013-0006.
Tibebu, Haileleol, Varuna De-Silva, Corentin Artaud, Rafael Pina and Xiyu Shi. "Towards Interpretable Camera and LiDAR Data Fusion for Autonomous Ground Vehicles Localisation". Sensors 22, no. 20 (20 October 2022): 8021. http://dx.doi.org/10.3390/s22208021.
Moretti, Michele, Federico Bianchi and Nicola Senin. "Towards the development of a smart fused filament fabrication system using multi-sensor data fusion for in-process monitoring". Rapid Prototyping Journal 26, no. 7 (26 June 2020): 1249–61. http://dx.doi.org/10.1108/rpj-06-2019-0167.
Donati, Cesare, Martina Mammarella, Lorenzo Comba, Alessandro Biglia, Paolo Gay and Fabrizio Dabbene. "3D Distance Filter for the Autonomous Navigation of UAVs in Agricultural Scenarios". Remote Sensing 14, no. 6 (11 March 2022): 1374. http://dx.doi.org/10.3390/rs14061374.
Kozłowski, Michał, Raúl Santos-Rodríguez and Robert Piechocki. "Sensor Modalities and Fusion for Robust Indoor Localisation". ICST Transactions on Ambient Systems 6, no. 18 (12 December 2019): 162670. http://dx.doi.org/10.4108/eai.12-12-2019.162670.
Merfels, Christian and Cyrill Stachniss. "Sensor Fusion for Self-Localisation of Automated Vehicles". PFG – Journal of Photogrammetry, Remote Sensing and Geoinformation Science 85, no. 2 (7 March 2017): 113–26. http://dx.doi.org/10.1007/s41064-017-0008-1.
Ciuffreda, Ilaria, Sara Casaccia and Gian Marco Revel. "A Multi-Sensor Fusion Approach Based on PIR and Ultrasonic Sensors Installed on a Robot to Localise People in Indoor Environments". Sensors 23, no. 15 (5 August 2023): 6963. http://dx.doi.org/10.3390/s23156963.
Neuland, Renata, Mathias Mantelli, Bernardo Hummes, Luc Jaulin, Renan Maffei, Edson Prestes and Mariana Kolberg. "Robust Hybrid Interval-Probabilistic Approach for the Kidnapped Robot Problem". International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems 29, no. 02 (April 2021): 313–31. http://dx.doi.org/10.1142/s0218488521500141.
Theses on the topic "Sensors fusion for localisation"
Millikin, R. L. "Sensor fusion for the localisation of birds in flight". Thesis, National Library of Canada = Bibliothèque nationale du Canada, 2002. http://www.collectionscanada.ca/obj/s4/f2/dsk3/ftp04/NQ65871.pdf.
Texto completoWelte, Anthony. "Spatio-temporal data fusion for intelligent vehicle localization". Thesis, Compiègne, 2020. http://bibliotheque.utc.fr/EXPLOITATION/doc/IFD/2020COMP2572.
Texto completoLocalization is an essential basic capability for vehicles to be able to navigate autonomously on the road. This can be achieved through already available sensors and new technologies (Iidars, smart cameras). These sensors combined with highly accurate maps result in greater accuracy. In this work, the benefits of storing and reusing information in memory (in data buffers) are explored. Localization systems need to perform a high-frequency estimation, map matching, calibration and error detection. A framework composed of several processing layers is proposed and studied. A main filtering layer estimates the vehicle pose while other layers address the more complex problems. High-frequency state estimation relies on proprioceptive measurements combined with GNSS observations. Calibration is essential to obtain an accurate pose. By keeping state estimates and observations in a buffer, the observation models of these sensors can be calibrated. This is achieved using smoothed estimates in place of a ground truth. Lidars and smart cameras provide measurements that can be used for localization but raise matching issues with map features. In this work, the matching problem is addressed on a spatio-temporal window, resulting in a more detailed pictur of the environment. The state buffer is adjusted using the observations and all possible matches. Although using mapped features for localization enables to reach greater accuracy, this is only true if the map can be trusted. An approach using the post smoothing residuals has been developed to detect changes and either mitigate or reject the affected features
Lilja, Robin. "A Localisation and Navigation System for an Autonomous Wheel Loader". Thesis, Mälardalens högskola, Akademin för innovation, design och teknik, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-12157.
Matsumoto, Takeshi. "Real-Time Multi-Sensor Localisation and Mapping Algorithms for Mobile Robots". Flinders University, Computer Science, Engineering and Mathematics, 2010. http://catalogue.flinders.edu.au./local/adt/public/adt-SFU20100302.131127.
Texto completoKhairallah, Mahmoud. "Flow-Based Visual-Inertial Odometry for Neuromorphic Vision Sensors". Electronic Thesis or Diss., université Paris-Saclay, 2022. http://www.theses.fr/2022UPAST117.
Texto completoRather than generating images constantly and synchronously, neuromorphic vision sensors -also known as event-based cameras- permit each pixel to provide information independently and asynchronously whenever brightness change is detected. Consequently, neuromorphic vision sensors do not encounter the problems of conventional frame-based cameras like image artifacts and motion blur. Furthermore, they can provide lossless data compression, higher temporal resolution and higher dynamic range. Hence, event-based cameras conveniently replace frame-based cameras in robotic applications requiring high maneuverability and varying environmental conditions. In this thesis, we address the problem of visual-inertial odometry using event-based cameras and an inertial measurement unit. Exploiting the consistency of event-based cameras with the brightness constancy conditions, we discuss the availability of building a visual odometry system based on optical flow estimation. We develop our approach based on the assumption that event-based cameras provide edge-like information about the objects in the scene and apply a line detection algorithm for data reduction. Line tracking allows us to gain more time for computations and provides a better representation of the environment than feature points. In this thesis, we do not only show an approach for event-based visual-inertial odometry but also event-based algorithms that can be used as stand-alone algorithms or integrated into other approaches if needed
Salehi, Achkan. "Localisation précise d'un véhicule par couplage vision/capteurs embarqués/systèmes d'informations géographiques". Thesis, Université Clermont Auvergne (2017-2020), 2018. http://www.theses.fr/2018CLFAC064/document.
The fusion between sensors and databases whose errors are independent is the most reliable and therefore most widespread solution to the localization problem. Current autonomous and semi-autonomous vehicles, as well as augmented reality applications targeting industrial contexts, exploit large sensor and database graphs that are difficult and expensive to synchronize and calibrate. Thus, the democratization of these technologies requires the exploration of the possibility of exploiting low-cost and easily accessible sensors and databases. These information sources are naturally tainted by higher uncertainty levels, and many obstacles to their effective and efficient practical usage persist. Moreover, the recent but dazzling successes of deep neural networks in various tasks seem to indicate that they could be a viable and low-cost alternative to some components of current SLAM systems. In this thesis, we focused on large-scale localization of a vehicle in a georeferenced coordinate frame from a low-cost system, which is based on the fusion between a monocular video stream, 3d non-textured but georeferenced building models, terrain elevation models and data either from a low-cost GPS or from vehicle odometry. Our work targets the resolution of two problems. The first one is related to the fusion, via barrier term optimization, of VSLAM and positioning measurements provided by a low-cost GPS. This method is, to the best of our knowledge, the most robust against GPS uncertainties, but it is more demanding in terms of computational resources. We propose an algorithmic optimization of that approach based on the definition of a novel barrier term. The second problem is the data association problem between the primitives that represent the geometry of the scene (e.g. 3d points) and the 3d building models. Previous works in that area use simple geometric criteria and are therefore very sensitive to occlusions in urban environments. We exploit deep convolutional neural networks in order to identify and associate elements from the map that correspond to 3d building model façades. Although our contributions are for the most part independent from the underlying SLAM system, we based our experiments on constrained key-frame based bundle adjustment. The solutions that we propose are evaluated on synthetic sequences as well as on real urban datasets. These experiments show important performance gains for VSLAM/GPS fusion, and considerable improvements in the robustness of building constraints to occlusions.
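The barrier-term fusion of VSLAM and GPS described in this abstract can be sketched in a few lines: a log-barrier keeps the optimized position inside a GPS confidence region while a quadratic stand-in for the visual cost pulls it elsewhere. Everything below (the cost, radius, step size) is an invented toy, not the thesis's formulation.

```python
import numpy as np

# Toy 2D sketch of barrier-term fusion: a log-barrier confines the estimate
# to a GPS confidence disc while a quadratic "visual" cost pulls it away.
gps = np.array([2.0, 1.0])
radius = 1.5                              # assumed GPS confidence radius (m)
target = np.array([4.0, 1.0])             # minimizer of the "visual" cost
mu = 0.1                                  # barrier weight

def grad(x):
    slack = radius**2 - np.sum((x - gps) ** 2)   # > 0 strictly inside the disc
    return 2.0 * (x - target) + 2.0 * mu * (x - gps) / slack

x = gps.copy()                            # start strictly inside the barrier
for _ in range(2000):                     # plain gradient descent
    x = x - 0.01 * grad(x)
print("fused estimate:", x)               # pulled toward the disc boundary
```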
Héry, Elwan. "Localisation coopérative de véhicules autonomes communicants". Thesis, Compiègne, 2019. http://www.theses.fr/2019COMP2516.
To be able to navigate autonomously, a vehicle must be accurately localized relative to all obstacles: the roadside for lane keeping, and other vehicles and pedestrians to avoid accidents. This PhD thesis deals with the interest of cooperation for improving the localization of cooperative vehicles that exchange information. Autonomous navigation on the road is often based on coordinates provided in a Cartesian frame. In order to better represent the pose of a vehicle with respect to the lane in which it travels, we study curvilinear coordinates with respect to a path stored in a map. These coordinates generalize the curvilinear abscissa by adding a signed lateral deviation from the center of the lane and an orientation relative to the center of the lane taking into account the direction of travel. These coordinates are studied with different track models and using different projections for map-matching. A first cooperative localization approach is based on these coordinates. The lateral deviation and the orientation relative to the lane can be known precisely from a perception of the lane borders, but for autonomous driving among other vehicles, it is important to maintain good longitudinal accuracy. A one-dimensional data fusion method shows the interest of cooperative localization in this simplified case where the lateral deviation, the curvilinear orientation and the relative positioning between two vehicles are accurately known. This case study shows that, in some cases, lateral accuracy can be propagated to other vehicles to improve their longitudinal accuracy. The correlation of the errors is taken into account with a covariance intersection filter. An ICP (Iterative Closest Point) minimization algorithm is then used to determine the relative pose between the vehicles from LiDAR points and a 2D polygonal model representing the shape of the vehicle. Several correspondences of the LiDAR points with the model and different minimization approaches are compared. The propagation of absolute vehicle poses using relative poses with their uncertainties is done through non-linear equations that can have a strong impact on consistency. The different dynamic elements surrounding the ego-vehicle are estimated in a Local Dynamic Map (LDM) to enhance the static high-definition map describing the center of the lane and its border. In our case, the agents are only communicating vehicles. The LDM is composed of the state of each vehicle. The states are merged using an asynchronous algorithm, fusing available data at variable times. The algorithm is decentralized, each vehicle computing its own LDM and sharing it. As the position errors of the GNSS receivers are biased, a marking detection step is introduced to obtain the lateral deviation from the center of the lane in order to estimate these biases. LiDAR observations with the ICP method enrich the fusion with constraints between the vehicles. Experimental results of this fusion show that the vehicles are more accurately localized with respect to each other while maintaining consistent poses.
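Covariance intersection, mentioned in this abstract as the tool for handling correlated errors between cooperating vehicles, admits a short self-contained sketch. The two input estimates below are made up for illustration: one laterally accurate, one longitudinally accurate.

```python
import numpy as np

# Covariance intersection (CI) fuses two estimates whose cross-correlation
# is unknown, by a convex combination in information form; the weight omega
# is chosen here to minimize the trace of the fused covariance.
def covariance_intersection(xa, Pa, xb, Pb, n_grid=101):
    Ia, Ib = np.linalg.inv(Pa), np.linalg.inv(Pb)
    best = None
    for w in np.linspace(0.001, 0.999, n_grid):
        P = np.linalg.inv(w * Ia + (1.0 - w) * Ib)
        if best is None or np.trace(P) < best[0]:
            x = P @ (w * Ia @ xa + (1.0 - w) * Ib @ xb)
            best = (np.trace(P), x, P)
    return best[1], best[2]

xa, Pa = np.array([0.2, 0.0]), np.diag([0.04, 1.0])   # good lateral accuracy
xb, Pb = np.array([0.0, 0.3]), np.diag([1.0, 0.09])   # good longitudinal accuracy
x, P = covariance_intersection(xa, Pa, xb, Pb)
print("fused state:", x)
print("fused covariance:\n", P)
```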
Jacobson, Adam. "Bio-inspired multi-sensor fusion and calibration for robot place learning and recognition". Thesis, Queensland University of Technology, 2018. https://eprints.qut.edu.au/116179/1/Adam%20Jacobson%20Thesis.pdf.
Ericsson, John-Eric and Daniel Eriksson. "Indoor Positioning and Localisation System with Sensor Fusion: An Implementation on an Indoor Autonomous Robot at ÅF". Thesis, KTH, Maskinkonstruktion (Inst.), 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-168841.
Texto completoExamensjobbet presenterar riktlinjer för hur sensorer och algoritmer för inomhuspositionering och lokaliseringssystem med sensorfusion bör väljas. Riktlinjerna är baserade på en omfattande teori och state of the art undersökning. Olika scenarion presenteras för att ge exempel på metoder för att välja sensorer och algoritmer för applikationer. Självklart finns det inga kombinationer som är rätt eller fel, men vissa faktorer är bra att komma ihåg när ett system designas. För att ge exempel på de föreslagna riktlinjerna har ett “Simultaneous Localisation and Mapping” (SLAM) system samt ett Inomhus Positioneringssystem (IPS) designats och implementerats på en inbyggd robotplattform. Det implementerade SLAM systemet baserades på en FastSLAM2algoritm med ultraljudssensorer och det implementerade IPS baserades på en Wifi RSS profileringsmetod som använder en Weibullfördelning. Metoderna, sensorerna och infrastrukturenhar valts utifrån krav som framställts från önskningar av intressenten samt utifrån kunskap från teori och state of the art undersökningen. En kombination av SLAM och IPS har föreslagits och valts att kallas WiFi SLAM för att reducera osäkerheter från de båda metoderna. Tyvärr har ingen kombination implementerats och testats på grund av oväntade problem med plattformen. Systemen simulerades individuellt före implementationen på den inbyggda plattformen. Resultat från dessa simuleringar tydde på att kraven skulle kunna uppfyllas samt gav en indikation av den minsta “set-upen” som behövdes för implementering. Båda de implementerade systemen visade sig ha de förväntade noggrannheterna under testning och med mer tid kunde bättre kalibrering ha skett, vilket förmodligen skulle resulterat i bättre resultat. Från resultaten kunde slutsatsen dras att en kombinerad WiFi SLAM lösning skulle förbättrat resultatet i en större testyta än den som användes. IPS skulle ha ökat sin precision medan SLAM skulle ha ökat sin robusthet. Examensjobbet har visat att det inte finns något exakt sätt att hitta en perfekt sensor och metodlösning. Viktigast är dock viktningen mellan tid, kostnad och kvalitet. Andra viktigafaktorer är att bestämma miljön systemet skall operera i och om systemet är säkerhetskritiskt. Det visade sig även att fusionerad sensordata kommer överträffa resultatet från endast en sensor och att det inte finns någon maxgräns för antalet fusionerade sensorer. Det kräver dock att sensorfusionsalgoritmen är väl kalibrerad, annars kan det motsatta inträffa.
Ladhari, Maroua. "Architecture générique de fusion par approche Top-Down : application à la localisation d’un robot mobile". Thesis, Université Clermont Auvergne (2017-2020), 2020. http://www.theses.fr/2020CLFAC052.
The issue addressed in this thesis is the localization of a mobile robot. Equipped with low-cost sensors, the robot aims to exploit the maximum possible amount of information to meet an objective set beforehand. A data fusion problem is treated in such a way that in each situation the robot selects which information to use to localize itself continuously. The data we process are of different types. In our work, two properties of localization are desired: accuracy and confidence. In order to be controlled, the robot must know its position in a precise and reliable way. Indeed, accuracy refers to the degree of uncertainty related to the estimated position; it is returned by a fusion filter. If, in addition, the degree of certainty of being in this uncertainty zone is high, we have a good confidence contribution and the estimate is considered reliable. These two properties are generally related, which is why they are often represented together to characterize the returned estimate of the robot position. In this work, our objective is to optimize these two properties simultaneously. To take advantage of the different existing techniques for an optimal estimation of the robot position, we propose a top-down approach based on the exploitation of an environment map defined in an absolute reference frame. This approach uses an a priori selection of the best informative measurements among all possible measurement sources. The selection is made according to a given objective (of accuracy and confidence), the current robot state and the informational contribution of the data. As the data are noisy and imprecise, and may also be ambiguous and unreliable, these limitations must be taken into account in order to provide the most accurate and reliable robot position estimate. For this, spatial focusing and a Bayesian network are used to reduce the risk of misdetection. However, in case of ambiguities, misdetections may still occur. A backward process has been developed in order to react efficiently to these situations and thus achieve the set objectives. The main contributions of this work are, on one side, the development of a high-level, generic and modular multi-sensor localization architecture with a top-down process. We use the concept of a perceptual triplet, the set of landmark, sensor and detector, to designate each perceptual module. At each time step, a prediction and an update are performed. For the update step, the system selects the most relevant triplet (in terms of accuracy and confidence) according to an informational criterion. In order to ensure an accurate and reliable localization, our algorithm is written in such a way that ambiguity can be managed. On the other side, the developed algorithm localizes a robot in an environment map. For this purpose, the possibility of bad detections due to ambiguity phenomena is taken into account in the backward process. Indeed, this process allows, on the one hand, to correct a bad detection and, on the other hand, to improve the returned position estimate to meet a desired objective.
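The top-down triplet selection described in this abstract can be sketched as picking, among candidate (landmark, sensor, detector) observation models, the one whose Kalman update most reduces the pose covariance. The triplet names, matrices, and noise levels below are illustrative assumptions, not the thesis's actual models or criterion.

```python
import numpy as np

# Sketch of top-down selection: choose the perceptual triplet whose Kalman
# update yields the smallest posterior covariance trace.
P = np.diag([2.0, 0.5])                  # current 2D position covariance

triplets = {                             # name -> (H observation, R noise)
    "lane_marking/camera/edge_detector": (np.array([[0.0, 1.0]]), np.array([[0.05]])),
    "pole/lidar/cluster_detector":       (np.array([[1.0, 0.0]]), np.array([[0.2]])),
    "absolute_fix/gnss/raw":             (np.eye(2), np.diag([1.0, 1.0])),
}

def updated_trace(P, H, R):
    """Trace of the posterior covariance after a Kalman update with (H, R)."""
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    return np.trace((np.eye(P.shape[0]) - K @ H) @ P)

best = min(triplets, key=lambda name: updated_trace(P, *triplets[name]))
print("selected triplet:", best)
```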
Books on the topic "Sensors fusion for localisation"
Hucks, John A. Fusion of ground-based sensors for optimal tracking of military targets. Monterey, Calif: Naval Postgraduate School, 1989.
Buser, Rudolph G., Frank B. Warren, Society of Photo-optical Instrumentation Engineers, and University of Alabama in Huntsville, Center for Applied Optics, eds. Infrared sensors and sensor fusion: 19-21 May, 1987, Orlando, Florida. Bellingham, Wash., USA: SPIE--the International Society for Optical Engineering, 1987.
Buscar texto completoMulti-sensor data fusion with MATLAB. Boca Raton: Taylor & Francis, 2010.
Buscar texto completoOtmar, Loffeld, Centre national de la recherche scientifique (France) y Society of Photo-optical Instrumentation Engineers., eds. Vision systems--sensors, sensor systems, and components: 10-12 June 1996, Besançon, France. Bellingham, Wash: SPIE--the International Society for Optical Engineering, 1996.
Buscar texto completoNetworked multisensor decision and estimation fusion: Based on advanced mathematical methods. Boca Raton, FL: Taylor & Francis, 2012.
Buscar texto completoOtmar, Loffeld, Society of Photo-optical Instrumentation Engineers., European Optical Society y Commission of the European Communities. Directorate-General for Science, Research, and Development., eds. Sensors, sensor systems, and sensor data processing: June 16-17 1997, Munich, FRG. Bellingham, Wash., USA: SPIE, 1997.
Buscar texto completoIntelligent Sensors, Sensor Networks & Information Processing Conference (2nd 2005 Melbourne, Vic.). Proceedings of the 2005 Intelligent Sensors, Sensor Networks & Information Processing Conference: 5-8 December, 2005, Melbourne, Australia. [Piscataway, N.J.]: IEEE, 2006.
Buscar texto completoIEEE/AESS Dayton Chapter Symposium (15th 1998 Fairborn, OH). Sensing the world: Analog sensors & systems across the spectrum : the 15th Annual AESS/IEEE Dayton Section Symposium, Fairborn, OH, 14-15 May 1998. Piscataway, N.J: Institute of Electrical and Electronics Engineers, 1998.
Buscar texto completoGreen, Milford B. Mergers and acquisitions: Geographical and spatial perspectives. London: Routledge, 1990.
Buscar texto completoCapítulos de libros sobre el tema "Sensors fusion for localisation"
Espinosa, Jose, Mihalis Tsiakkas, Dehao Wu, Simon Watson, Joaquin Carrasco, Peter R. Green and Barry Lennox. "A Hybrid Underwater Acoustic and RF Localisation System for Enclosed Environments Using Sensor Fusion". In Towards Autonomous Robotic Systems, 369–80. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-96728-8_31.
Mitchell, H. B. "Image Sensors". In Image Fusion, 9–17. Berlin, Heidelberg: Springer Berlin Heidelberg, 2010. http://dx.doi.org/10.1007/978-3-642-11216-4_2.
Kiaer, Jieun. "Fusion, localisation, and hybridity". In Delicious Words, 54–70. New York: Routledge, 2020. http://dx.doi.org/10.4324/9780429321801-4.
Majumder, Bansari Deb and Joyanta Kumar Roy. "Multifunction Data Fusion". In Multifunctional Sensors, 49–54. Boca Raton: CRC Press, 2023. http://dx.doi.org/10.1201/9781003350484-4.
Mitchell, H. B. "Sensors". In Data Fusion: Concepts and Ideas, 15–30. Berlin, Heidelberg: Springer Berlin Heidelberg, 2012. http://dx.doi.org/10.1007/978-3-642-27222-6_2.
Koch, Wolfgang. "Characterizing Objects and Sensors". In Tracking and Sensor Data Fusion, 31–52. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-39271-9_2.
Subramanian, Rajesh. "Additional Sensors and Sensor Fusion". In Build Autonomous Mobile Robot from Scratch using ROS, 457–96. Berkeley, CA: Apress, 2023. http://dx.doi.org/10.1007/978-1-4842-9645-5_9.
Suciu, George, Andrei Scheianu, Cristina Mihaela Bălăceanu, Ioana Petre, Mihaela Dragu, Marius Vochin and Alexandru Vulpe. "Sensors Fusion Approach Using UAVs and Body Sensors". In Advances in Intelligent Systems and Computing, 146–53. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-77700-9_15.
Věchet, S. and J. Krejsa. "Sensors Data Fusion via Bayesian Network". In Recent Advances in Mechatronics, 221–26. Berlin, Heidelberg: Springer Berlin Heidelberg, 2010. http://dx.doi.org/10.1007/978-3-642-05022-0_38.
Wagner, Jakub, Paweł Mazurek and Roman Z. Morawski. "Fusion of Data from Impulse-Radar Sensors and Depth Sensors". In Health Information Science, 205–24. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-96009-4_7.
Texto completoActas de conferencias sobre el tema "Sensors fusion for localisation"
Jarvis, R. A. "Autonomous Robot Localisation By Sensor Fusion". In IEEE International Workshop on Emerging Technologies and Factory Automation. IEEE, 1992. http://dx.doi.org/10.1109/etfa.1992.683295.
Izri, Sonia and Eric Brassart. "Uncertainties quantification criteria for multi-sensors fusion: Application to vehicles localisation". In Automation (MED 2008). IEEE, 2008. http://dx.doi.org/10.1109/med.2008.4602171.
Redzic, Milan, Conor Brennan and Noel E. O'Connor. "Dual-sensor fusion for indoor user localisation". In the 19th ACM International Conference. New York, New York, USA: ACM Press, 2011. http://dx.doi.org/10.1145/2072298.2071948.
Franken, Dietrich. "An approximate maximum-likelihood estimator for localisation using bistatic measurements". In 2018 Sensor Data Fusion: Trends, Solutions, Applications (SDF). IEEE, 2018. http://dx.doi.org/10.1109/sdf.2018.8547074.
Alvarado, Biel Piero, Fernando Matia and Ramon Galan. "Improving indoor robots localisation by fusing different sensors". In 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, 2018. http://dx.doi.org/10.1109/iros.2018.8593667.
Ristic, Branko, Mark Morelande, Alfonso Farina and S. Dulman. "On Proximity-Based Range-Free Node Localisation in Wireless Sensor Networks". In 2006 9th International Conference on Information Fusion. IEEE, 2006. http://dx.doi.org/10.1109/icif.2006.301734.
Corral-Plaza, David, Olaf Reich, Erik Hübner, Matthias Wagner and Inmaculada Medina-Bulo. "A Sensor Fusion System Identifying Complex Events for Localisation Estimation". In International Conference on Applied Computing 2019. IADIS Press, 2019. http://dx.doi.org/10.33965/ac2019_201912c033.
Texto completoKhoder, Makkawi, Ait-Tmazirte Nourdine, El Badaoui El Najjar Maan y Moubayed Nazih. "Fault Tolerant multi-sensor Data Fusion for vehicle localisation using Maximum Correntropy Unscented Information Filter and α-Rényi Divergence". En 2020 IEEE 23rd International Conference on Information Fusion (FUSION). IEEE, 2020. http://dx.doi.org/10.23919/fusion45008.2020.9190407.
Texto completoPagnottelli, S., S. Taraglio, P. Valigi y A. Zanela. "Visual and laser sensory data fusion for outdoor robot localisation and navigation". En 2005 12th International Conference on Advanced Robotics. IEEE, 2005. http://dx.doi.org/10.1109/icar.2005.1507409.
Texto completoUney, Murat, Bernard Mulgrew y Daniel Clark. "Cooperative sensor localisation in distributed fusion networks by exploiting non-cooperative targets". En 2014 IEEE Statistical Signal Processing Workshop (SSP). IEEE, 2014. http://dx.doi.org/10.1109/ssp.2014.6884689.
Texto completoInformes sobre el tema "Sensors fusion for localisation"
Cadwallader, L. C. Reliability estimates for selected sensors in fusion applications. Office of Scientific and Technical Information (OSTI), September 1996. http://dx.doi.org/10.2172/425367.
Lane, Brandon, Lars Jacquemetton, Martin Piltch and Darren Beckett. Thermal calibration of commercial melt pool monitoring sensors on a laser powder bed fusion system. Gaithersburg, MD: National Institute of Standards and Technology, July 2020. http://dx.doi.org/10.6028/nist.ams.100-35.
Beiker, Sven. Next-generation Sensors for Automated Road Vehicles. Warrendale, PA: SAE International, February 2023. http://dx.doi.org/10.4271/epr2023003.
Mobley, Curtis D. Determining the Scattering Properties of Vertically-Structured Nepheloid Layers from the Fusion of Active and Passive Optical Sensors. Fort Belvoir, VA: Defense Technical Information Center, September 2006. http://dx.doi.org/10.21236/ada630921.
Kulhandjian, Hovannes. Detecting Driver Drowsiness with Multi-Sensor Data Fusion Combined with Machine Learning. Mineta Transportation Institute, September 2021. http://dx.doi.org/10.31979/mti.2021.2015.
Kulhandjian, Hovannes. AI-based Pedestrian Detection and Avoidance at Night using an IR Camera, Radar, and a Video Camera. Mineta Transportation Institute, November 2022. http://dx.doi.org/10.31979/mti.2022.2127.