Dissertations on the topic "Sensor Fusion and Tracking"
Consult the top 50 dissertations for research on the topic "Sensor Fusion and Tracking".
Browse dissertations from a range of disciplines and compile your bibliography correctly.
Mathew, Vineet. „Radar and Vision Sensor Fusion for Vehicle Tracking“. The Ohio State University, 2019. http://rave.ohiolink.edu/etdc/view?acc_num=osu1574441839857988.
Sikdar, Ankita. „Depth based Sensor Fusion in Object Detection and Tracking“. The Ohio State University, 2018. http://rave.ohiolink.edu/etdc/view?acc_num=osu1515075130647622.
Moemeni, Armaghan. „Hybrid marker-less camera pose tracking with integrated sensor fusion“. Thesis, De Montfort University, 2014. http://hdl.handle.net/2086/11093.
Lundquist, Christian. „Sensor Fusion for Automotive Applications“. Doctoral thesis, Linköpings universitet, Reglerteknik, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-71594.
Romine, Jay Brent. „Fusion of radar and imaging sensor data for target tracking“. Diss., Georgia Institute of Technology, 1995. http://hdl.handle.net/1853/13324.
Moody, Leigh. „Sensors, measurement fusion and missile trajectory optimisation“. Thesis, Cranfield University; College of Defence Technology; Department of Aerospace, Power and Sensors, 2003. http://hdl.handle.net/1826/778.
Andersson, Naesseth Christian. „Vision and Radar Sensor Fusion for Advanced Driver Assistance Systems“. Thesis, Linköpings universitet, Reglerteknik, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-94222.
Attalla, Daniela, and Alexandra Tang. „Drones in Arctic Environments: Snow Change Tracking Aid using Sensor Fusion“. Thesis, KTH, Mekatronik, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-235928.
The Arctic is a region exposed to major climate changes, which can be difficult to track. The goal of this work is to propose, develop and evaluate a concept in which researchers in Arctic areas benefit from using drone and sensor technology in their work on snow ablation. The thesis presents an alternative to measuring deployed reference stakes, using an integrated sensor system mounted on a drone. These reference stakes are drilled down, below the snow and ice surface, over a grid on the glaciers of the Arctic during the winter, and are then measured during the summer in order to study the amount of snow that melts over the year. Each measurement is thus made by physically walking to every single reference stake. The proposed concept estimates the height of the reference stakes using a forward-facing LiDAR mounted on a servo motor and a downward-facing ultrasonic sensor. The height is read out as the largest ultrasonic distance recorded while the forward-facing sensor system detects an object within a 3 m range. The results indicate that the proposed concept's height estimation of reference stakes is a potential solution in the problem area, provided that the system's roll and pitch angles are compensated for.
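The read-out logic of this concept — take the largest downward ultrasonic distance recorded while the forward LiDAR reports a return inside the 3 m gate, with roll/pitch compensation — can be sketched as follows. The function name, sample format, and tilt model are illustrative assumptions, not taken from the thesis:

```python
import math

def estimate_pole_height(samples, gate_m=3.0, roll=0.0, pitch=0.0):
    """Estimate reference-pole height from fused sensor samples.

    Each sample is (lidar_range_m, ultrasonic_height_m). The height is
    read out as the largest downward-facing ultrasonic distance observed
    while the forward LiDAR reports an object within the gate. Roll and
    pitch (rad) tilt the ultrasonic beam, so the slant range is projected
    onto the vertical axis before taking the maximum.
    """
    gated = [
        us * math.cos(roll) * math.cos(pitch)  # tilt compensation
        for lidar, us in samples
        if lidar <= gate_m                     # pole detected ahead
    ]
    return max(gated) if gated else None
```

For level flight (`roll = pitch = 0`) the projection is the identity, so the estimate is simply the largest gated ultrasonic reading.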
Andersson, Anton. „Offline Sensor Fusion for Multitarget Tracking using Radar and Camera Detection“. Thesis, KTH, Skolan för datavetenskap och kommunikation (CSC), 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-208344.
Considerable resources are being spent on the development of self-driving car systems, which may transform society in the coming decade. An important part of these systems is the processing and interpretation of sensor data and the creation of tracks for objects in the surroundings. In this master's thesis, an energy-minimisation method is studied together with radar and camera measurements. An energy is computed for the tracks, taking the measurements, the object's dynamics and several other factors into account. The tracks are chosen to minimise this energy using a gradient method; the lower the energy, the better the tracks are expected to match reality. Processing is done offline, as opposed to in real time; offline processing can be used when evaluating the performance of the sensors and of the real-time processing. It allows more computing power to be used and makes it possible to use data collected after the time instant in question. A study of the parameters of the energy-minimisation method is presented, together with adjustments of the original method. The method gives an improved result compared with the individual sensor measurements, and also compared with the real-time method currently used in the cars. The parameter study shows which components of the energy function improve the method's performance.
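The offline energy-minimisation idea (a data term fitting the detections plus a dynamics term rewarding smooth motion, minimised by gradient descent) can be sketched in one dimension. Everything here — the energy weights, step size, and function name — is an illustrative assumption, not the thesis's actual energy function:

```python
def smooth_track(measurements, lam=1.0, step=0.02, iters=500):
    """Offline 1-D track smoothing by gradient descent on an energy.

    E(x) = sum_i (x_i - z_i)^2                         (data term)
         + lam * sum_i (x_{i+1} - 2 x_i + x_{i-1})^2   (dynamics term)
    The track x is updated along -dE/dx; lower energy means a track
    that better balances the measurements against smooth motion.
    """
    x = list(measurements)
    n = len(x)
    for _ in range(iters):
        # gradient of the data term: 2 (x_i - z_i)
        grad = [2.0 * (x[i] - measurements[i]) for i in range(n)]
        # gradient of the dynamics term via second differences
        for i in range(1, n - 1):
            a = x[i + 1] - 2.0 * x[i] + x[i - 1]
            grad[i + 1] += 2.0 * lam * a
            grad[i] -= 4.0 * lam * a
            grad[i - 1] += 2.0 * lam * a
        x = [xi - step * gi for xi, gi in zip(x, grad)]
    return x
```

Because the whole measurement sequence is available offline, every point of the track is optimised jointly, unlike a causal real-time filter.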
Manyika, James. „An information-theoretic approach to data fusion and sensor management“. Thesis, University of Oxford, 1993. http://ora.ox.ac.uk/objects/uuid:6e6dd2a8-1ec0-4d39-8f8b-083289756a70.
Li, Lingjie Luo Zhi-Quan. „Data fusion and filtering for target tracking and identification /“. *McMaster only, 2003.
Gallagher, Jonathan G. „Likelihood as a Method of Multi Sensor Data Fusion for Target Tracking“. The Ohio State University, 2009. http://rave.ohiolink.edu/etdc/view?acc_num=osu1244041862.
Fallah, Haghmohammadi Hamidreza. „Fever Detection for Dynamic Human Environment Using Sensor Fusion“. Thesis, Université d'Ottawa / University of Ottawa, 2018. http://hdl.handle.net/10393/37332.
Johansson, Ronnie. „Information Acquisition in Data Fusion Systems“. Licentiate thesis, KTH, Numerical Analysis and Computer Science, NADA, 2003. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-1673.
By purposefully utilising sensors, for instance by a data fusion system, the state of some system-relevant environment might be adequately assessed to support decision-making. The ever increasing access to sensors offers great opportunities, but also incurs grave challenges. As a result of managing multiple sensors one can, e.g., expect to achieve a more comprehensive, resolved, certain and more frequently updated assessment of the environment than would be possible otherwise. Challenges include data association, treatment of conflicting information and strategies for sensor coordination.
We use the term information acquisition to denote the skill of a data fusion system to actively acquire information. The aim of this thesis is to instructively situate that skill in a general context, explore and classify related research, and highlight key issues and possible future work. It is our hope that this thesis will facilitate communication, understanding and future efforts for information acquisition.
The previously mentioned trend towards utilisation of large sets of sensors makes us especially interested in large-scale information acquisition, i.e., acquisition using many and possibly spatially distributed and heterogeneous sensors.
Information acquisition is a general concept that emerges in many different fields of research. In this thesis, we survey literature from, e.g., agent theory, robotics and sensor management. We, furthermore, suggest a taxonomy of the literature that highlights relevant aspects of information acquisition.
We describe a function, perception management (akin to sensor management), which realizes information acquisition in the data fusion process, and pertinent properties of its external stimuli, sensing resources, and system environment.
An example of perception management is also presented. The task is that of managing a set of mobile sensors that jointly track some mobile targets. The game-theoretic algorithm suggested for distributing the targets among the sensors proves to be more robust to sensor failure than a measurement-accuracy-optimal reference algorithm.
Keywords: information acquisition, sensor management, resource management, information fusion, data fusion, perception management, game theory, target tracking
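The target-distribution task in the example above can be illustrated, under strong simplifications (one target per sensor and known scalar tracking costs), by a brute-force optimal assignment. This is only a stand-in for the game-theoretic algorithm of the thesis; the cost matrix and function name are hypothetical:

```python
from itertools import permutations

def assign_targets(cost):
    """Distribute targets among sensors by exhaustive search.

    cost[s][t] is an assumed tracking cost (e.g. expected measurement
    error) for sensor s observing target t. Returns the one-to-one
    assignment of minimum total cost, as a tuple whose entry s is the
    target given to sensor s, together with that total cost.
    """
    n = len(cost)
    best, best_cost = None, float("inf")
    for perm in permutations(range(n)):
        c = sum(cost[s][perm[s]] for s in range(n))
        if c < best_cost:
            best, best_cost = perm, c
    return best, best_cost
```

Exhaustive search is exponential in the number of sensors; the appeal of the game-theoretic (and of Hungarian-style) approaches is precisely that they avoid this enumeration while remaining robust to sensor failure.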
Palaniappan, Ravishankar. „A SELF-ORGANIZING HYBRID SENSOR SYSTEM WITH DISTRIBUTED DATA FUSION FOR INTRUDER TRACKING AND SURVEILLANCE“. Doctoral diss., University of Central Florida, 2010. http://digital.library.ucf.edu/cdm/ref/collection/ETD/id/2407.
Ph.D.
School of Electrical Engineering and Computer Science
Engineering and Computer Science
Modeling and Simulation PhD
Fredriksson, Alfred, and Joakim Wallin. „Mapping an Auditory Scene Using Eye Tracking Glasses“. Thesis, Linköpings universitet, Reglerteknik, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-170849.
Beale, Gregory Thomas. „Radar and LiDAR Fusion for Scaled Vehicle Sensing“. Thesis, Virginia Tech, 2021. http://hdl.handle.net/10919/102932.
Der volle Inhalt der QuelleMaster of Science
Research and development platforms, often supported by robust prototypes, are essential for the development, testing, and validation of automated driving functions. Thousands of hours of safety and performance benchmarks must be met before any advanced driver assistance system (ADAS) is considered production-ready. However, full-scale testbeds are expensive to build, labor-intensive to design, and present inherent safety risks while testing. Scaled prototypes, developed to model system design and vehicle behavior in targeted driving scenarios, can minimize these risks and expenses. Scaled testbeds, more specifically, can improve the ease of safety testing of future ADAS systems and help visualize test results and system limitations, better than software simulations, to audiences with varying technical backgrounds. However, these testbeds are not without limitation. Although small-scale vehicles may accommodate on-board systems similar to their full-scale counterparts, as the vehicle scales down the resolution of perception sensors decreases, especially of on-board radars. With many automated driving functions relying on radar object detection, the scaled vehicle must host radar sensors that function appropriately at scale to support accurate vehicle and system behavior. However, traditional radar technology is known to have limitations when operating in small-scale environments. Sensor fusion, which is the process of merging data from multiple sensors, may offer a potential solution to this issue. Consequently, a sensor fusion approach is presented that augments the angular resolution of radar data in a scaled environment with a commercially available Light Detection and Ranging (LiDAR) system. With this approach, object tracking software designed to operate in full-scale vehicles with radars can operate more accurately when used in a scaled environment.
Using this improvement, small-scale system tests could confidently and quickly be used to identify safety concerns in ADAS functions, leading to a faster and safer product development cycle.
Larsson, Olof. „Visual-inertial tracking using Optical Flow measurements“. Thesis, Linköping University, Automatic Control, 2010. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-59970.
Visual-inertial tracking is a well-known technique for tracking with the combination of a camera and an inertial measurement unit (IMU). An issue with the straightforward approach is the need for known 3D points. To bypass this, 2D information can be used, without recovering depth, to estimate the position and orientation (pose) of the camera. This master's thesis investigates the feasibility of using Optical Flow (OF) measurements and indicates the benefits of this approach.
The 2D information is added using OF measurements. OF describes the visual flow of interest points in the image plane. Without the need to estimate the depth of these points, the computational complexity is reduced. With the increased 2D information, the 3D information required for the pose estimate decreases.
The use of 2D points for pose estimation has been verified with experimental data gathered by a real camera/IMU system. Several data sequences containing different trajectories are used to estimate the pose. It is shown that OF measurements can be used to improve visual-inertial tracking with a reduced need for 3D-point registrations.
Chavez, Garcia Ricardo Omar. „Multiple sensor fusion for detection, classification and tracking of moving objects in driving environments“. Thesis, Grenoble, 2014. http://www.theses.fr/2014GRENM034/document.
Advanced driver assistance systems (ADAS) help drivers to perform complex driving tasks and to avoid or mitigate dangerous situations. The vehicle senses the external world using sensors and then builds and updates an internal model of the environment configuration. Vehicle perception consists of establishing the spatial and temporal relationships between the vehicle and the static and moving obstacles in the environment. Vehicle perception is composed of two main tasks: simultaneous localization and mapping (SLAM) deals with modelling static parts; and detection and tracking of moving objects (DATMO) is responsible for modelling moving parts in the environment. In order to perform good reasoning and control, the system has to correctly model the surrounding environment. The accurate detection and classification of moving objects is a critical aspect of a moving object tracking system. Therefore, many sensors are part of a common intelligent vehicle system. Classification of moving objects is needed to determine the possible behaviour of the objects surrounding the vehicle, and it is usually performed at tracking level. Knowledge about the class of moving objects at detection level can help improve their tracking. Most of the current perception solutions consider classification information only as aggregate information for the final perception output. Also, management of incomplete information is an important requirement for perception systems. Incomplete information can originate from sensor-related causes, such as calibration issues and hardware malfunctions, or from scene perturbations, like occlusions, weather issues and object shifting. It is important to manage these situations by taking them into account in the perception process. The main contributions in this dissertation focus on the DATMO stage of the perception problem.
Precisely, we believe that by including the object's class as a key element of the object's representation and by managing the uncertainty from multiple sensor detections, we can improve the results of the perception task, i.e., obtain a more reliable list of moving objects of interest represented by their dynamic state and appearance information. Therefore, we address the problems of sensor data association and sensor fusion for object detection, classification, and tracking at different levels within the DATMO stage. Although we focus on a set of three main sensors: radar, lidar, and camera, we propose a modifiable architecture to include other types or numbers of sensors. First, we define a composite object representation to include class information as a part of the object state from early stages to the final output of the perception task. Second, we propose, implement, and compare two different perception architectures to solve the DATMO problem according to the level where object association, fusion, and classification information is included and performed. Our data fusion approaches are based on the evidential framework, which is used to manage and include the uncertainty from sensor detections and object classifications. Third, we propose an evidential data association approach to establish a relationship between two sources of evidence from object detections. We observe how the class information improves the final result of the DATMO component. Fourth, we integrate the proposed fusion approaches as a part of a real-time vehicle application. This integration has been performed in a real vehicle demonstrator from the interactIVe European project. Finally, we analysed and experimentally evaluated the performance of the proposed methods. We compared our evidential fusion approaches against each other and against a state-of-the-art method using real data from different driving scenarios.
These comparisons focused on the detection, classification and tracking of different moving objects: pedestrians, bikes, cars and trucks.
Högger, Andreas. „Dempster Shafer Sensor Fusion for Autonomously Driving Vehicles : Association Free Tracking of Dynamic Objects“. Thesis, KTH, Skolan för elektro- och systemteknik (EES), 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-187814.
Self-driving cars have given rise to several interesting research areas combining many different disciplines. One challenge is to give the vehicle a kind of eyes: by using additional sensors and combining their data, obstacles in the vehicle's path can be detected. This can naturally be used to improve the vehicle's planned route and thereby also reduce its environmental impact. Here, two interconnected Velodyne laser scanners are used to investigate this more closely, but the number of sensors can also be increased further. The alignment of the sensors is very sensitive and therefore requires exact coordinates, which cannot always be guaranteed. It is therefore investigated whether sensor fusion based on Dempster-Shafer theory can be used to handle errors and uncertainties; this is, however, not deployed in the test vehicle. Based on a fused occupancy grid map of occupied and free areas, moving objects and obstacles can be tracked to estimate their velocity, without requiring methods for object or track identification. Experiments have been carried out on real data, and in addition simulated measurements with a ground-truth reference are used. The algorithm used for the occupancy map runs on both CPUs and GPUs, which makes it possible to compare the two approaches and find the best-performing method for different applications.
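The Dempster-Shafer fusion mentioned in this abstract rests on Dempster's rule of combination. A minimal sketch, assuming a two-element frame ({occupied, free}) as in occupancy-grid fusion; the function name and the example masses are illustrative, not taken from the thesis:

```python
def combine_dempster(m1, m2):
    """Dempster's rule of combination for two mass functions.

    Masses are dicts mapping frozenset hypotheses to belief mass.
    Products of masses accumulate on the intersections of hypotheses;
    mass falling on the empty set (conflict) is discarded and the
    remainder renormalised.
    """
    combined = {}
    conflict = 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("total conflict: sources fully disagree")
    k = 1.0 - conflict
    return {h: m / k for h, m in combined.items()}
```

Assigning mass to the full frame {occupied, free} is how a sensor expresses ignorance, which is what makes the evidential representation attractive for fusing unreliable or miscalibrated sensors.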
Wilson, Dean A. „Analysis of tracking and identification characteristics of diverse systems and data sources for sensor fusion“. Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 2001. http://handle.dtic.mil/100.2/ADA392099.
Thesis advisor(s): Duren, Russ; Hutchins, Gary. "June 2001". Includes bibliographical references (p. 115-117). Also available online.
Rutkowski, Adam J. „A BIOLOGICALLY-INSPIRED SENSOR FUSION APPROACH TO TRACKING A WIND-BORNE ODOR IN THREE DIMENSIONS“. Case Western Reserve University School of Graduate Studies / OhioLINK, 2008. http://rave.ohiolink.edu/etdc/view?acc_num=case1196447143.
Marron, Monteserin Juan Jose. „Multi Sensor System for Pedestrian Tracking and Activity Recognition in Indoor Environments“. Scholar Commons, 2014. https://scholarcommons.usf.edu/etd/5068.
Der volle Inhalt der QuelleHomelius, Marcus. „Tracking of Ground Vehicles : Evaluation of Tracking Performance Using Different Sensors and Filtering Techniques“. Thesis, Linköpings universitet, Reglerteknik, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-148432.
Gale, Nicholas C. „FUSION OF VIDEO AND MULTI-WAVEFORM FMCW RADAR FOR TRAFFIC SURVEILLANCE“. Wright State University / OhioLINK, 2011. http://rave.ohiolink.edu/etdc/view?acc_num=wright1315857639.
Vincent, David E. „PORTABLE INDOOR MULTI-USER POSITION TRACKING SYSTEM FOR IMMERSIVE VIRTUAL ENVIRONMENTS USING SENSOR FUSION WITH MULTIDIMENSIONAL SCALING“. Miami University / OhioLINK, 2012. http://rave.ohiolink.edu/etdc/view?acc_num=miami1335634621.
Der volle Inhalt der QuelleChen, Yangsheng. „Ground Target Tracking with Multi-Lane Constraint“. ScholarWorks@UNO, 2009. http://scholarworks.uno.edu/td/925.
Hucks, John A. „Fusion of ground-based sensors for optimal tracking of military targets“. Thesis, Monterey, California. Naval Postgraduate School, 1989. http://hdl.handle.net/10945/27067.
Der volle Inhalt der QuelleMangette, Clayton John. „Perception and Planning of Connected and Automated Vehicles“. Thesis, Virginia Tech, 2020. http://hdl.handle.net/10919/98812.
Master of Science
Connected and Automated Vehicles are an emerging area of research that involves integrating computational components to enable autonomous driving. This work considers two of the major challenges in this area. The first half of this thesis considers how to design a perception system in the vehicle that can correctly track other vehicles and assess their relative importance in the environment. A sensor fusion system is designed which incorporates information from different sensor types to form a list of relevant target objects. The rest of this work considers the high-level problem of coordination between autonomous vehicles. A planning algorithm is demonstrated that plans the paths of multiple autonomous vehicles, is guaranteed to prevent collisions, and is empirically faster than existing planning methods.
Denman, Simon Paul. „Improved detection and tracking of objects in surveillance video“. Queensland University of Technology, 2009. http://eprints.qut.edu.au/29328/.
Der volle Inhalt der QuelleMalik, Zohaib Mansoor. „Design and implementation of temporal filtering and other data fusion algorithms to enhance the accuracy of a real time radio location tracking system“. Thesis, Högskolan i Gävle, Avdelningen för elektronik, matematik och naturvetenskap, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:hig:diva-13261.
Johansson, Ronnie. „Large-Scale Information Acquisition for Data and Information Fusion“. Doctoral thesis, Stockholm, 2006. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-3890.
Der volle Inhalt der QuelleLee, Yeongseon. „Bayesian 3D multiple people tracking using multiple indoor cameras and microphones“. Diss., Atlanta, Ga. : Georgia Institute of Technology, 2009. http://hdl.handle.net/1853/29668.
Committee Chair: Russell M. Mersereau; Committee Member: Biing Hwang (Fred) Juang; Committee Member: Christopher E. Heil; Committee Member: Georgia Vachtsevanos; Committee Member: James H. McClellan. Part of the SMARTech Electronic Thesis and Dissertation Collection.
Buaes, Alexandre Greff. „A low cost one-camera optical tracking system for indoor wide-area augmented and virtual reality environments“. reponame:Biblioteca Digital de Teses e Dissertações da UFRGS, 2006. http://hdl.handle.net/10183/7138.
In recent years the number of industrial applications for Augmented Reality (AR) and Virtual Reality (VR) environments has significantly increased. Optical tracking systems are an important component of AR/VR environments. In this work, a low-cost optical tracking system with adequate attributes for professional use is proposed. The system works in the infrared spectral region to reduce optical noise. A high-speed camera, equipped with a daylight-blocking filter and infrared flash strobes, transfers uncompressed grayscale images to a regular PC, where image pre-processing software and the PTrack tracking algorithm recognize a set of retro-reflective markers and extract their 3D position and orientation. Included in this work is comprehensive research on image pre-processing and tracking algorithms. A testbed was built to perform accuracy and precision tests. Results show that the system reaches accuracy and precision levels slightly worse than, but still comparable to, professional systems. Due to its modularity, the system can be expanded by using several one-camera tracking modules linked by a sensor fusion algorithm, in order to obtain a larger working range. A setup with two modules was built and tested, resulting in performance similar to the stand-alone configuration.
Wijk, Olle. „Triangulation Based Fusion of Sonar Data with Application in Mobile Robot Mapping and Localization“. Doctoral thesis, Stockholm : Tekniska högsk, 2001. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-3124.
O-larnnithipong, Nonnarit. „Hand Motion Tracking System using Inertial Measurement Units and Infrared Cameras“. FIU Digital Commons, 2018. https://digitalcommons.fiu.edu/etd/3905.
Der volle Inhalt der QuelleShaban, Heba Ahmed. „A Novel Highly Accurate Wireless Wearable Human Locomotion Tracking and Gait Analysis System via UWB Radios“. Diss., Virginia Tech, 2010. http://hdl.handle.net/10919/27562.
Ph. D.
Nicolini, Andrea. „Multipath tracking techniques for millimeter wave communications“. Master's thesis, Alma Mater Studiorum - Università di Bologna, 2019. http://amslaurea.unibo.it/17690/.
Stein, Sebastian. „Multi-modal recognition of manipulation activities through visual accelerometer tracking, relational histograms, and user-adaptation“. Thesis, University of Dundee, 2014. https://discovery.dundee.ac.uk/en/studentTheses/61c22b7e-5f02-4f21-a948-bf9e7b497120.
Der volle Inhalt der QuelleLamard, Laetitia. „Approche modulaire pour le suivi temps réel de cibles multi-capteurs pour les applications routières“. Thesis, Clermont-Ferrand 2, 2014. http://www.theses.fr/2014CLF22477/document.
This PhD work, carried out in collaboration with Institut Pascal and Renault, is in the field of advanced driver assistance systems, most of which aim to improve passenger safety. Sensor fusion makes the system's decisions more reliable. The goal of this PhD work was to develop a fusion system between a radar and a smart camera, improving obstacle detection in front of the vehicle. Our approach proposes a real-time, flexible fusion architecture using asynchronous data from the sensors without any prior knowledge about the application. Our fusion system is based on a multi-target tracking method. Probabilistic multi-target tracking was considered, and a filter based on random finite sets (modelling the targets) was selected and tested in real-time computation. The filter, named CPHD (Cardinalized Probability Hypothesis Density), succeeds in taking into account and correcting all sensor faults (missed detections, false alarms, and imprecision in the position and speed estimated by the sensors) and uncertainty about the environment (unknown number of targets). This system was improved by introducing management of the target type: pedestrian, car, truck and bicycle. A new system was proposed that explicitly solves camera occlusion issues by a probabilistic method taking this sensor imprecision into account. The use of smart sensors induces data correlation (due to pre-processed data). This issue was solved by correcting the estimation of sensor detection performance. A new tool was set up to complete the fusion system: it allows the estimation of all sensor parameters used by the fusion filter. Our system was tested in real situations in several experiments. Every contribution was qualitatively and quantitatively validated.
Mekonnen, Alhayat Ali. „Cooperative people detection and tracking strategies with a mobile robot and wall mounted cameras“. Phd thesis, Université Paul Sabatier - Toulouse III, 2014. http://tel.archives-ouvertes.fr/tel-01068355.
Karlsson, Johannes. „Wireless video sensor network and its applications in digital zoo“. Doctoral thesis, Umeå universitet, Institutionen för tillämpad fysik och elektronik, 2010. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-38032.
Der volle Inhalt der QuelleWiklund, Åsa. „Multiple Platform Bias Error Estimation“. Thesis, Linköping University, Department of Electrical Engineering, 2004. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-2126.
Sensor fusion has long been recognized as a means to improve target tracking. Sensor fusion deals with merging several signals into one to obtain a better and more reliable result; for that, the incoming data must be trusted to be correct and not contain unknown systematic errors. This thesis tries to find and estimate the size of the systematic errors that appear in a multi-platform environment where data is shared among the units. To be more precise, the error estimated within the scope of this thesis appears when platforms cannot determine their positions correctly and share target-tracking data with their own corrupted position as the basis for determining the target's position. The algorithms developed in this thesis use Kalman filter theory, including the extended Kalman filter and the information filter, to estimate the platform location bias error. Three algorithms are developed with satisfying results. Depending on time constraints and computational demands, any one of the algorithms could be preferred.
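The simplest instance of the Kalman-filter machinery this abstract refers to is a scalar filter estimating a constant position bias from residuals between shared and locally observed target positions. The model, noise values, and function name below are illustrative assumptions, not the thesis's algorithms:

```python
def estimate_bias(residuals, r=1.0, q=1e-4, p0=100.0):
    """Scalar Kalman filter estimating a constant platform position bias.

    Each residual is modelled as residual_k = bias + noise (variance r),
    with the bias itself nearly constant (process noise q). Starting
    from a vague prior (variance p0), the filter recursively tightens
    its estimate of the systematic offset.
    """
    x, p = 0.0, p0             # bias estimate and its variance
    for z in residuals:
        p += q                 # predict: bias assumed (nearly) constant
        k = p / (p + r)        # Kalman gain
        x += k * (z - x)       # update with the innovation
        p *= (1.0 - k)         # posterior variance shrinks
    return x, p
```

In the multi-platform setting the state would be a bias vector per platform and an extended Kalman filter or information filter would replace this scalar recursion, but the predict/update cycle is the same.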
Bebek, Ozkan. „ROBOTIC-ASSISTED BEATING HEART SURGERY“. Case Western Reserve University School of Graduate Studies / OhioLINK, 2008. http://rave.ohiolink.edu/etdc/view?acc_num=case1201289393.
Hachour, Samir. „Suivi et classification d'objets multiples : contributions avec la théorie des fonctions de croyance“. Thesis, Artois, 2015. http://www.theses.fr/2015ARTO0206/document.
This thesis deals with the multi-object tracking and classification problem. It was shown that belief functions allow the results of classical Bayesian methods to be improved. In particular, a recent approach dedicated to single-object classification is extended to the multi-object framework. It was shown that the assignment of detected observations to known objects is a fundamental issue in multi-object tracking and classification solutions. New assignment solutions based on belief functions are proposed in this thesis; they are shown to be more robust than the other credal solutions from the recent literature. Finally, the issue of multi-sensor classification, which requires a second phase of assignment, is addressed. In the latter case, two different multi-sensor architectures are proposed, one centralized and one distributed. Many comparisons illustrate the importance of this work, in both situations of constant and changing object classes.
Magnusson, Daniel. „A network based algorithm for aided navigation“. Thesis, Linköpings universitet, Institutionen för systemteknik, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-75235.
This master's thesis concerns the development of an algorithm for navigation, primarily for the SAAB JAS 39 Gripen fighter aircraft, in swarms of other units. The algorithm uses information from conventional navigation systems together with additional information from a radio data link that provides supporting information in the form of relative range measurements. Since the trusted GPS can be jammed, this group-tracking solution can increase navigation performance under such conditions. For simplicity, the simulations use simplified characteristics, with simple generated trajectories and measurements. This measurement information is then weighted together using filter theory from statistical sensor fusion. By using radio data links and the added information from external sources, that is, other aircraft and various types of landmarks that very often have good accuracy, navigation is supported when GPS is not usable, e.g. in GPS-hostile environments. A number of scenarios with operational relevance were simulated to verify and study these conditions and to provide results and conclusions.
© Daniel Magnusson.
Patel, Shwetak Naran. „Infrastructure mediated sensing“. Diss., Atlanta, Ga. : Georgia Institute of Technology, 2008. http://hdl.handle.net/1853/24829.
Committee Chair: Abowd, Gregory; Committee Member: Edwards, Keith; Committee Member: Grinter, Rebecca; Committee Member: LaMarca, Anthony; Committee Member: Starner, Thad.
Neumann, Markus. „Automatic multimodal real-time tracking for image plane alignment in interventional Magnetic Resonance Imaging“. Phd thesis, Université de Strasbourg, 2014. http://tel.archives-ouvertes.fr/tel-01038023.
Cook, Brandon M. „An Intelligent System for Small Unmanned Aerial Vehicle Traffic Management“. University of Cincinnati / OhioLINK, 2021. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1617106257481515.
Der volle Inhalt der QuelleKarimi, Majid. „Master ’s Programme in Information Technology: Using multiple Leap Motion sensors in Assembly workplace in Smart Factory“. Thesis, Högskolan i Halmstad, Akademin för informationsteknologi, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:hh:diva-32392.