A selection of scholarly literature on the topic "Sensor Fusion and Tracking"

Cite a source in APA, MLA, Chicago, Harvard, and other citation styles


Consult the lists of current articles, books, theses, reports, and other scholarly sources on the topic "Sensor Fusion and Tracking".

Next to every work in the list of references there is an "Add to bibliography" option. Use it, and the bibliographic reference for the chosen work will be formatted automatically in the required citation style (APA, MLA, Harvard, Chicago, Vancouver, etc.).

You can also download the full text of the scholarly publication in PDF format and read its online annotation, provided the relevant parameters are available in the work's metadata.

Journal articles on the topic "Sensor Fusion and Tracking"

1

Hyndhavi, M., et al. "Development of Vehicle Tracking Using Sensor Fusion". Information Technology in Industry 9, no. 2 (1 April 2021): 731–39. http://dx.doi.org/10.17762/itii.v9i2.406.

Abstract:
The development of vehicle tracking using sensor fusion is presented in this paper. Advanced driver assistance systems (ADAS) have become increasingly popular in recent years. These systems use sensor information for real-time control. To improve accuracy and robustness, especially in the presence of environmental noise such as varying lighting and weather conditions, sensor fusion has been the center of attention in recent studies. Faced with complex traffic conditions, a single sensor cannot meet the safety requirements of ADAS and autonomous driving. The common environment perception sensors are radar, camera, and lidar, each with its own pros and cons. Sensor fusion is a necessary technology for autonomous driving, providing a better view and understanding of the vehicle's surroundings. We mainly focus on highway scenarios that enable an autonomous car to comfortably follow other cars at various speeds while keeping a secure distance, and we combine the advantages of both sensors with a sensor fusion approach. The radar and vision sensor information is fused to produce robust and accurate measurements. Experimental results comparing radar-only tracking with fusion of camera and radar are presented. The algorithm is described along with simulation results obtained in MATLAB.
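The radar-camera fusion idea in this abstract can be sketched in a few lines: a scalar Kalman measurement update fuses an accurate radar range with a noisier camera range, and the fused variance drops below either sensor's. The one-dimensional setup and all noise values are invented for illustration; the paper's MATLAB pipeline is far richer.

```python
# Minimal 1-D sketch (hypothetical numbers): sequential Kalman measurement
# updates fuse a radar range and a camera range of a lead vehicle.

def kf_update(x, P, z, r):
    """Scalar Kalman measurement update of a range estimate."""
    k = P / (P + r)                      # Kalman gain
    return x + k * (z - x), (1.0 - k) * P

x, P = 0.0, 100.0                        # vague prior on range (m, m^2)
radar_z, radar_r = 50.2, 0.25            # radar: accurate range
cam_z, cam_r = 48.0, 4.0                 # camera: noisier range

x, P = kf_update(x, P, radar_z, radar_r)
x, P = kf_update(x, P, cam_z, cam_r)     # fused estimate and variance
```

Sequentially applying the update for each sensor is equivalent to a batch inverse-variance fusion of the two readings.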
2

Liu, Yan Ju, Chun Xiang Xie and Jian Hui Song. "Research on Fusion Tracking Technology in Heterogeneous Multi-Sensor". Advanced Materials Research 1056 (October 2014): 158–61. http://dx.doi.org/10.4028/www.scientific.net/amr.1056.158.

Abstract:
Fusion tracking with heterogeneous multi-sensor systems can determine both the distance and the angle to a target precisely. For the heterogeneous multi-sensor problem, data fusion and target tracking with radar, infrared, and laser sensors are studied: a Lagrange-based weighted fusion algorithm and an unscented Kalman filter are adopted to perform data fusion and tracking filtering for the target. Simulation results show that the radar/infrared/laser combination achieves data fusion and target tracking with significantly higher accuracy than radar or infrared/laser alone, yielding a better tracking result.
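The Lagrange-based weighted fusion mentioned above is usually the variance-minimising convex combination: with weights constrained to sum to one, the Lagrange-multiplier solution weights each sensor by its inverse variance. A sketch under invented sensor readings and variances:

```python
# Inverse-variance weighted fusion: minimising the fused variance subject to
# the weights summing to one (a Lagrange-multiplier problem) gives
# w_i proportional to 1 / sigma_i^2. Readings and variances are illustrative.

def optimal_weights(variances):
    inv = [1.0 / v for v in variances]
    s = sum(inv)
    return [i / s for i in inv]

def fuse(values, variances):
    w = optimal_weights(variances)
    fused = sum(wi * zi for wi, zi in zip(w, values))
    fused_var = 1.0 / sum(1.0 / v for v in variances)
    return fused, fused_var

# radar, infrared, laser range readings (m) and their variances
z, var = [100.4, 99.8, 100.1], [1.0, 0.5, 0.25]
fused, fused_var = fuse(z, var)
```

The fused variance 1/Σ(1/σ²) is always below the best single sensor's, which is the accuracy gain the simulations report.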
3

Qin, Y., Xue Hui Wang, Ming Jun Feng, Zhen Zhou and L. J. Wang. "Research of Asynchronous Multi-Type Sensors Data Fusion". Advanced Materials Research 142 (October 2010): 16–20. http://dx.doi.org/10.4028/www.scientific.net/amr.142.16.

Abstract:
A data fusion algorithm is established for estimating the state of a target tracking system with multiple sensor types. A Kalman filter is applied to each sensor's measurements to compute a target state estimate at every instant, and the mean square deviation of the fused estimate is smaller than that of any single sensor. The simulation results indicate that the synchronous data fusion method is effective for the multi-target tracking problem, and the asynchronous multi-sensor fusion process achieves good performance in practical control.
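One way to read the asynchronous case described above: each sensor's latest Kalman estimate is propagated with the motion model to a common fusion time, and the time-aligned estimates are then combined. The constant-velocity scalar sketch below uses invented times, states, and variances.

```python
# Align-then-fuse sketch for asynchronous sensors: propagate each sensor's
# last scalar position estimate to a common time, then combine by inverse
# variance. All numbers are made up for illustration.

def predict(pos, vel, var, dt, q=0.01):
    """Constant-velocity prediction of a scalar position estimate."""
    return pos + vel * dt, var + q * dt

t_fuse = 1.0
# (last update time, position, velocity, position variance)
sensor_a = (0.8, 10.1, 5.0, 0.30)
sensor_b = (0.5, 8.6, 5.0, 0.20)

preds = []
for t, p, v, var in (sensor_a, sensor_b):
    preds.append(predict(p, v, var, t_fuse - t))

# inverse-variance combination of the time-aligned estimates
w = [1.0 / var for _, var in preds]
fused = sum(wi * p for wi, (p, _) in zip(w, preds)) / sum(w)
fused_var = 1.0 / sum(w)
```

The fused variance again falls below either aligned estimate's, matching the abstract's mean-square-deviation claim.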
4

Li, Xin Yu, and Dong Yi Chen. "Sensor Fusion Based on Strong Tracking Filter for Augmented Reality Registration". Key Engineering Materials 467-469 (February 2011): 108–13. http://dx.doi.org/10.4028/www.scientific.net/kem.467-469.108.

Abstract:
Accurate tracking for augmented reality applications is a challenging task. Hybrid multi-sensor tracking is generally more stable than visual tracking alone. This paper presents a new tightly-coupled hybrid tracking approach combining a vision-based system with an inertial sensor. Based on multi-frequency sampling theory for measurement data synchronization, a strong tracking filter (STF) is used to smooth sensor data and estimate position and orientation. By adding a time-varying fading factor that adaptively adjusts the prediction error covariance of the filter, the method improves tracking performance for fast-moving targets. Experimental results show the efficiency and robustness of the proposed approach.
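The fading-factor mechanism can be sketched in scalar form: when the normalised innovation grows, a factor λ ≥ 1 inflates the predicted covariance so the filter re-weights fresh measurements. The rule below is a common simplified form with invented numbers, not the paper's exact STF derivation.

```python
# Strong-tracking sketch: a fading factor lambda >= 1, driven by the ratio of
# the squared innovation to its predicted variance, inflates the prediction
# covariance during manoeuvres. Values are illustrative.

def fading_factor(innovation, s_pred, r):
    """Scalar fading factor, floored at 1 so quiet periods are unaffected."""
    return max(1.0, innovation ** 2 / (s_pred + r))

P, Q, R = 1.0, 0.1, 0.5
innovation = 3.0                     # large residual: target manoeuvred
lam = fading_factor(innovation, P + Q, R)
P_pred_standard = P + Q              # ordinary covariance prediction
P_pred_strong = lam * P + Q          # fading-factor prediction
```

With the inflated covariance, the subsequent Kalman gain rises, so the filter tracks the manoeuvre instead of smoothing it away.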
5

Yi, Chunlei, Kunfan Zhang and Nengling Peng. "A multi-sensor fusion and object tracking algorithm for self-driving vehicles". Proceedings of the Institution of Mechanical Engineers, Part D: Journal of Automobile Engineering 233, no. 9 (August 2019): 2293–300. http://dx.doi.org/10.1177/0954407019867492.

Abstract:
Vehicles need to detect threats on the road, anticipate emerging dangerous driving situations, and take proactive action for collision avoidance. The study of target detection and recognition methods is therefore of practical value to a self-driving system. However, a single sensor has weaknesses, such as the poor weather adaptability of lidar and camera. In this article, we propose a novel spatial calibration method for multi-sensor systems based on rotation and translation of the coordinate system. The validity of the proposed spatial calibration method is tested through comparisons with the calibrated data. In addition, a target-level multi-sensor fusion and object tracking algorithm for detecting and recognizing targets is tested; the sensors comprise lidar, radar, and camera. The algorithm takes advantage of the strengths of the various sensors, such as target location from lidar, target velocity from radar, and target type from camera, and it achieves information redundancy and increased environmental adaptability. Compared with the results of a single sensor, the new approach is verified on real data to deliver accurate location, velocity, and recognition.
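Per point, the rotation-plus-translation calibration reduces to applying the extrinsic transform between sensor frames. A 2-D yaw-only sketch follows; the actual extrinsics are estimated rather than assumed, and real systems work in 3-D.

```python
import math

# Map a point from the lidar frame into the camera frame with a rotation R
# (here a single yaw angle) and a translation t. Yaw and t are illustrative.

def lidar_to_camera(point, yaw, t):
    """Rotate a 2-D lidar point by `yaw` radians, then translate by `t`."""
    x, y = point
    c, s = math.cos(yaw), math.sin(yaw)
    return (c * x - s * y + t[0], s * x + c * y + t[1])

p_cam = lidar_to_camera((1.0, 0.0), math.pi / 2, (0.5, -0.2))
```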
6

Chen, Bin, Xiaofei Pei and Zhenfu Chen. "Research on Target Detection Based on Distributed Track Fusion for Intelligent Vehicles". Sensors 20, no. 1 (20 December 2019): 56. http://dx.doi.org/10.3390/s20010056.

Abstract:
Accurate target detection is the basis of normal driving for intelligent vehicles. However, the sensors currently used for target detection each have defects at the perception level, which can be compensated for by sensor fusion technology. In this paper, the application of sensor fusion technology to intelligent vehicle target detection is studied with a millimeter-wave (MMW) radar and a camera. A target-level fusion hierarchy is adopted, and the fusion algorithm is divided into two tracking processing modules and one fusion center module based on a distributed structure. The measurement information output by the two sensors enters the tracking processing modules, and after processing by a multi-target tracking algorithm, local tracks are generated and transmitted to the fusion center module. In the fusion center module, a two-level association structure is designed based on regional collision association and weighted track association. The association between the two sensors' local tracks is completed, and a non-reset federated filter is used to estimate the state of the fused tracks. The experimental results indicate that the proposed algorithm accomplishes track association between the MMW radar and the camera, and that the fused track state estimation method performs excellently.
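The weighted track association step can be sketched as a covariance-weighted distance test between a radar local track and a camera local track; tracks inside the gate are associated. The gate value and the scalar track states below are invented.

```python
# Covariance-weighted track-to-track association sketch: the squared state
# difference is normalised by the combined uncertainty and compared with a
# gate. States, variances, and the gate are illustrative.

def weighted_distance(x_a, p_a, x_b, p_b):
    """Scalar track-to-track distance weighted by combined uncertainty."""
    return (x_a - x_b) ** 2 / (p_a + p_b)

def associate(x_a, p_a, x_b, p_b, gate=4.0):
    return weighted_distance(x_a, p_a, x_b, p_b) < gate

same = associate(10.0, 0.5, 10.6, 0.5)     # close tracks: associated
diff = associate(10.0, 0.5, 15.0, 0.5)     # far apart: rejected
```

In the vector case this becomes a Mahalanobis distance with the summed track covariances; the paper's regional collision test runs first as a cheap coarse gate.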
7

Shi, Yifang, Jee Woong Choi, Lei Xu, Hyung June Kim, Ihsan Ullah and Uzair Khan. "Distributed Target Tracking in Challenging Environments Using Multiple Asynchronous Bearing-Only Sensors". Sensors 20, no. 9 (7 May 2020): 2671. http://dx.doi.org/10.3390/s20092671.

Abstract:
In a tracking system with multiple asynchronous bearing-only (BO) sensors, two main challenges usually arise: (1) the presence of clutter measurements and target misdetection due to imperfect sensing; and (2) the out-of-sequence (OOS) arrival of locally transmitted information due to diverse sensor sampling intervals, internal processing times, or uncertain communication delays. This paper addresses both problems simultaneously by proposing a novel distributed tracking architecture consisting of local tracking and central fusion. To overcome the kinematic state unobservability problem in local tracking for a single BO sensor scenario, we propose a novel local integrated probabilistic data association (LIPDA) method for target measurement state tracking. The proposed approach eliminates most of the clutter measurement disturbance while increasing target measurement accuracy. In the central tracking, the fusion center uses the proposed distributed IPDA-forward prediction fusion and decorrelation (DIPDA-FPFD) approach to sequentially fuse the OOS information transmitted by each BO sensor. Track management is carried out at the local sensor level and also at the fusion center, using the recursively calculated probability of target existence as a track quality measure. The efficiency of the proposed methodology was validated by intensive numerical experiments.
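The track-quality measure mentioned above, the probability of target existence, can be sketched as a Bayes-style predict/update on detection events. This is a stripped-down flavour only: the detection probability, survival probability, and likelihood ratio are invented, and the real (L)IPDA update also accounts for gating and clutter density.

```python
# Toy probability-of-target-existence recursion: existence is predicted with
# a survival probability, then updated depending on whether a detection
# arrived. All parameters are illustrative, not from the paper.

def ipda_existence(p_exist, detected, p_d=0.9, p_transition=0.98,
                   likelihood_ratio=8.0):
    """One predict+update step for the probability of target existence."""
    p_pred = p_transition * p_exist                  # survival prediction
    if detected:
        num = p_pred * p_d * likelihood_ratio
        return num / (num + (1.0 - p_pred))
    num = p_pred * (1.0 - p_d)
    return num / (num + (1.0 - p_pred))

p = 0.5
for detected in [True, True, False, True]:
    p = ipda_existence(p, detected)
```

Detections push the existence probability up, misses pull it down, and thresholds on this quantity confirm or delete tracks.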
8

Deo, Ankur, Vasile Palade and Md Nazmul Huda. "Centralised and Decentralised Sensor Fusion-Based Emergency Brake Assist". Sensors 21, no. 16 (11 August 2021): 5422. http://dx.doi.org/10.3390/s21165422.

Abstract:
Many advanced driver assistance systems (ADAS) currently try to utilise multi-sensor architectures, where the driver assistance algorithm receives data from a multitude of sensors. As mono-sensor systems cannot provide reliable and consistent readings under all circumstances because of errors and other limitations, fusing data from multiple sensors ensures that the environmental parameters are perceived correctly and reliably for most scenarios, thereby substantially improving the reliability of multi-sensor-based automotive systems. This paper first highlights the significance of efficiently fusing data from multiple sensors in ADAS features. An emergency brake assist (EBA) system is showcased using multiple sensors, namely a light detection and ranging (LiDAR) sensor and a camera. The architectures of the proposed 'centralised' and 'decentralised' sensor fusion approaches for EBA are discussed along with their constituents, i.e., the detection algorithms, the fusion algorithm, and the tracking algorithm. The centralised and decentralised architectures are built and analytically compared, and the performance of the two fusion architectures for EBA is evaluated in terms of speed of execution, accuracy, and computational cost. While both fusion methods drive the EBA application at an acceptable frame rate (~20 fps or higher) on an Intel i5-based Ubuntu system, the experiments and analytical comparisons show that the decentralised fusion-driven EBA achieves higher accuracy, at the price of a higher computational cost, whereas the centralised fusion-driven EBA yields comparatively less accurate results but benefits from a higher frame rate and a lower computational cost.
9

Wöhle, Lukas, and Marion Gebhard. "SteadEye-Head—Improving MARG-Sensor Based Head Orientation Measurements Through Eye Tracking Data". Sensors 20, no. 10 (12 May 2020): 2759. http://dx.doi.org/10.3390/s20102759.

Abstract:
This paper presents the use of eye tracking data in Magnetic, Angular Rate, and Gravity (MARG)-sensor based head orientation estimation. The approach presented here can be deployed in any motion measurement that includes MARG and eye tracking sensors (e.g., rehabilitation robotics or medical diagnostics). The challenge in these mostly indoor applications is the presence of magnetic field disturbances at the location of the MARG-sensor. In this work, eye tracking data (visual fixations) are used to enable zero orientation change updates in the MARG-sensor data fusion chain. The approach is based on a MARG-sensor data fusion filter, an online visual fixation detection algorithm, as well as a dynamic angular rate threshold estimation for low latency and adaptive head motion noise parameterization. In this work we use an adaptation of Madgwick's gradient descent filter for MARG-sensor data fusion, but the approach could be used with any other data fusion process. The presented approach does not rely on additional stationary or local environmental references and is therefore self-contained. The proposed system is benchmarked against a Qualisys motion capture system, a gold standard in human motion analysis, showing improved heading accuracy for the MARG-sensor data fusion by up to a factor of 0.5 while magnetic disturbance is present.
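The fixation-gated "zero orientation change" idea can be sketched with a scalar heading integrator: while the eye tracker reports a fixation and the measured rate stays under a threshold, the heading is held, which removes the drift a gyro bias would otherwise accumulate. The bias and threshold values below are invented.

```python
# Heading is integrated from a (biased) gyro, but updates are suppressed
# while a visual fixation coincides with a sub-threshold angular rate,
# emulating a zero-orientation-change update. Numbers are illustrative.

def integrate_heading(rates, fixations, dt=0.01, threshold=0.05):
    heading = 0.0
    for rate, fixating in zip(rates, fixations):
        if fixating and abs(rate) < threshold:
            continue                 # zero-change update: hold heading
        heading += rate * dt
    return heading

bias = 0.02                          # rad/s gyro bias while the head is still
rates = [bias] * 100                 # stationary head, biased gyro readings
drift_free = integrate_heading(rates, [True] * 100)
drifting = integrate_heading(rates, [False] * 100)
```

With fixations available, the stationary segment accumulates no heading drift; without them, the bias integrates into a steadily growing error.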
10

Guo, Xiaoxiao, Yuansheng Liu, Qixue Zhong and Mengna Chai. "Research on Moving Target Tracking Algorithm Based on Lidar and Visual Fusion". Journal of Advanced Computational Intelligence and Intelligent Informatics 22, no. 5 (20 September 2018): 593–601. http://dx.doi.org/10.20965/jaciii.2018.p0593.

Abstract:
Multi-sensor fusion and target tracking are two key technologies for the environmental awareness system of autonomous vehicles. In this paper, a moving target tracking method based on the fusion of lidar and a binocular camera is proposed. First, the position information obtained by the two types of sensors is fused at the decision level using an adaptive weighting algorithm; the Joint Probabilistic Data Association (JPDA) algorithm is then applied to the fused result to achieve multi-target tracking. Tested on a campus curve and compared with the Extended Kalman Filter (EKF) algorithm, the experimental results show that this algorithm effectively overcomes the limitations of a single sensor and tracks more accurately.
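The decision-level adaptive weighting can be sketched by letting each sensor's weight follow its recent residuals: the smaller a sensor's residuals, the larger its weight in the fused position. The residual windows and readings below are invented.

```python
# Adaptive decision-level weighting sketch: each sensor is weighted by the
# inverse of its recent mean squared residual, then the weights are
# normalised and applied to the two position readings. Values are made up.

def adaptive_weights(residual_windows):
    """Weight each sensor by the inverse mean squared residual."""
    inv = [len(w) / sum(r * r for r in w) for w in residual_windows]
    s = sum(inv)
    return [i / s for i in inv]

lidar_res = [0.1, -0.2, 0.15]            # recent lidar residuals (m)
camera_res = [0.5, -0.6, 0.4]            # recent stereo camera residuals (m)
w_lidar, w_cam = adaptive_weights([lidar_res, camera_res])
fused = w_lidar * 20.3 + w_cam * 20.9    # fuse the two position readings
```

The fused position leans toward the currently more consistent sensor, which is the behaviour the abstract's adaptive weighting aims for before JPDA runs.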

Theses on the topic "Sensor Fusion and Tracking"

1

Mathew, Vineet. "Radar and Vision Sensor Fusion for Vehicle Tracking". The Ohio State University, 2019. http://rave.ohiolink.edu/etdc/view?acc_num=osu1574441839857988.

2

Sikdar, Ankita. "Depth based Sensor Fusion in Object Detection and Tracking". The Ohio State University, 2018. http://rave.ohiolink.edu/etdc/view?acc_num=osu1515075130647622.

3

Moemeni, Armaghan. "Hybrid marker-less camera pose tracking with integrated sensor fusion". Thesis, De Montfort University, 2014. http://hdl.handle.net/2086/11093.

Abstract:
This thesis presents a framework for a hybrid model-free marker-less inertial-visual camera pose tracking with an integrated sensor fusion mechanism. The proposed solution addresses the fundamental problem of pose recovery in computer vision and robotics and provides an improved solution for wide-area pose tracking that can be used on mobile platforms and in real-time applications. In order to arrive at a suitable pose tracking algorithm, an in-depth investigation was conducted into current methods and sensors used for pose tracking. Preliminary experiments were then carried out on hybrid GPS-Visual as well as wireless micro-location tracking in order to evaluate their suitability for camera tracking in wide-area or GPS-denied environments. As a result of this investigation a combination of an inertial measurement unit and a camera was chosen as the primary sensory inputs for a hybrid camera tracking system. After following a thorough modelling and mathematical formulation process, a novel and improved hybrid tracking framework was designed, developed and evaluated. The resulting system incorporates an inertial system, a vision-based system and a recursive particle filtering-based stochastic data fusion and state estimation algorithm. The core of the algorithm is a state-space model for motion kinematics which, combined with the principles of multi-view camera geometry and the properties of optical flow and focus of expansion, form the main components of the proposed framework. The proposed solution incorporates a monitoring system, which decides on the best method of tracking at any given time based on the reliability of the fresh vision data provided by the vision-based system, and automatically switches between visual and inertial tracking as and when necessary. The system also includes a novel and effective self-adjusting mechanism, which detects when the newly captured sensory data can be reliably used to correct the past pose estimates. 
The corrected state is then propagated through to the current time in order to prevent sudden pose estimation errors manifesting as a permanent drift in the tracking output. Following the design stage, the complete system was fully developed and then evaluated using both synthetic and real data. The outcome shows an improved performance compared to existing techniques, such as PTAM and SLAM. The low computational cost of the algorithm enables its application on mobile devices, while the integrated self-monitoring, self-adjusting mechanisms allow for its potential use in wide-area tracking applications.
4

Lundquist, Christian. "Sensor Fusion for Automotive Applications". Doctoral thesis, Linköpings universitet, Reglerteknik, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-71594.

Abstract:
Mapping stationary objects and tracking moving targets are essential for many autonomous functions in vehicles. In order to compute the map and track estimates, sensor measurements from radar, laser and camera are used together with the standard proprioceptive sensors present in a car. By fusing information from different types of sensors, the accuracy and robustness of the estimates can be increased. Different types of maps are discussed and compared in the thesis. In particular, road maps make use of the fact that roads are highly structured, which allows relatively simple and powerful models to be employed. It is shown how the information of the lane markings, obtained by a front looking camera, can be fused with inertial measurement of the vehicle motion and radar measurements of vehicles ahead to compute a more accurate and robust road geometry estimate. Further, it is shown how radar measurements of stationary targets can be used to estimate the road edges, modeled as polynomials and tracked as extended targets. Recent advances in the field of multiple target tracking have led to the use of finite set statistics (FISST) in a set theoretic approach, where the targets and the measurements are treated as random finite sets (RFS). The first order moment of an RFS is called the probability hypothesis density (PHD), and it is propagated in time with a PHD filter. In this thesis, the PHD filter is applied to radar data for constructing a parsimonious representation of the map of the stationary objects around the vehicle. Two original contributions, which exploit the inherent structure in the map, are proposed. A data clustering algorithm is suggested to structure the description of the prior and considerably improve the update in the PHD filter. Improvements in the merging step further simplify the map representation.
When it comes to tracking moving targets, the focus of this thesis is on extended targets, i.e., targets which potentially may give rise to more than one measurement per time step. An implementation of the PHD filter, which was proposed to handle data obtained from extended targets, is presented. An approximation is proposed in order to limit the number of hypotheses. Further, a framework to track the size and shape of a target is introduced. The method is based on measurement generating points on the surface of the target, which are modeled by an RFS. Finally, an efficient and novel Bayesian method is proposed for approximating the tire radii of a vehicle based on particle filters and the marginalization concept. This is done under the assumption that a change in the tire radius is caused by a change in tire pressure, thus obtaining an indirect tire pressure monitoring system. The approaches presented in this thesis have all been evaluated on real data from both freeways and rural roads in Sweden.
5

Romine, Jay Brent. "Fusion of radar and imaging sensor data for target tracking". Diss., Georgia Institute of Technology, 1995. http://hdl.handle.net/1853/13324.

6

Moody, Leigh. "Sensors, measurement fusion and missile trajectory optimisation". Thesis, Cranfield University; College of Defence Technology; Department of Aerospace, Power and Sensors, 2003. http://hdl.handle.net/1826/778.

Abstract:
When considering advances in "smart" weapons it is clear that air-launched systems have adopted an integrated approach to meet rigorous requirements, whereas air-defence systems have not. The demands on sensors, state observation, missile guidance, and simulation for air-defence are the subject of this research. Historical reviews of each topic and justification of the favoured techniques and algorithms are provided, using a nomenclature developed to unify these disciplines. Sensors selected for their enduring impact on future systems are described and simulation models provided. Complex internal systems are reduced to simpler models capable of replicating dominant features, particularly those that adversely affect state observers. Of the state observer architectures considered, a distributed system comprising ground-based target and own-missile tracking, data up-link, and on-board missile measurement and track fusion is the natural choice for air-defence. An IMM is used to process radar measurements, combining the estimates from filters with different target dynamics. The remote missile state observer combines up-linked target tracks and missile plots with IMU and seeker data to provide optimal guidance information. The performance of traditional PN and CLOS missile guidance is the basis against which on-line trajectory optimisation is judged. Enhanced guidance laws are presented that demand more from the state observers, stressing the importance of time-to-go and transport delays in strap-down systems employing staring array technology. Algorithms are presented for solving the guidance two-point boundary value problems created from the missile state observer output using gradient projection in function space. A simulation integrating these aspects was developed whose infrastructure, capable of supporting any dynamical model, is described in the air-defence context.
MBDA have extended this work creating the Aircraft and Missile Integration Simulation (AMIS) for integrating different launchers and missiles. The maturity of the AMIS makes it a tool for developing pre-launch algorithms for modern air-launched missiles from modern military aircraft.
7

Andersson Naesseth, Christian. "Vision and Radar Sensor Fusion for Advanced Driver Assistance Systems". Thesis, Linköpings universitet, Reglerteknik, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-94222.

Abstract:
The World Health Organization predicts that by the year 2030, road traffic injuries will be one of the top five leading causes of death. Many of these deaths and injuries can be prevented by driving cars properly equipped with state-of-the-art safety and driver assistance systems. Some examples are auto-brake and auto-collision avoidance, which are becoming more and more popular on the market today. A recent study by a Swedish insurance company has shown that on roads with speeds up to 50 km/h an auto-brake system can reduce personal injuries by up to 64 percent. In fact, in an estimated 40 percent of crashes, the auto-brake reduced the effects to the degree that no personal injury was sustained. It is imperative that these so-called Advanced Driver Assistance Systems, to be really effective, have good situational awareness. It is important that they have adequate information about the vehicle's immediate surroundings. Where are other cars, pedestrians or motorcycles relative to our own vehicle? How fast are they driving and in which lane? How is our own vehicle driving? Are there objects in the way of our own vehicle's intended path? These and many more questions can be answered by a properly designed system for situational awareness. In this thesis we design and evaluate, both quantitatively and qualitatively, sensor fusion algorithms for multi-target tracking. We use a combination of camera and radar information to perform fusion and find relevant objects in a cluttered environment. The combination of these two sensors is very interesting because of their complementary attributes. The radar system has high range resolution but poor bearing resolution. The camera system, on the other hand, has a very high bearing resolution. This is very promising, with the potential to substantially increase the accuracy of the tracking system compared to just using one of the two. We have also designed algorithms for path prediction and a first threat awareness logic, which are both qualitatively evaluated.
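The complementarity argued above can be made concrete in one step: trust the radar for range and the camera for bearing, then convert to a Cartesian position. The values are illustrative.

```python
import math

# Combine the radar's accurate range with the camera's accurate bearing into
# a single Cartesian position fix. Range and bearing values are invented.

radar_range = 50.0                    # m, trusted from the radar
camera_bearing = math.radians(10.0)   # rad, trusted from the camera

x = radar_range * math.cos(camera_bearing)
y = radar_range * math.sin(camera_bearing)
```

A full tracker would also propagate each sensor's noise covariance through this polar-to-Cartesian conversion rather than mixing the readings point-wise.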
8

Attalla, Daniela, and Alexandra Tang. "Drones in Arctic Environments: Snow Change Tracking Aid using Sensor Fusion". Thesis, KTH, Mekatronik, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-235928.

Abstract:
The Arctic is subject to rapid climate changes that can be difficult to track. This thesis aims to provide a use case in which researchers in the Arctic benefit from the incorporation of drones in their snow ablation research. It presents a way to measure ablation stakes with the help of a sensor fusion system mounted on a drone. Ablation stakes are stakes placed in a grid over glaciers in the Arctic, below the snow and ice surface, during the winter and then measured during the summer to keep track of the amount of snow that has melted throughout the mass balance year. Each measurement is currently taken by physically going to these stakes. The proposed solution estimates the heights of the ablation stakes using a forward-faced LiDAR on a servo motor and a downward-faced ultrasonic sensor. The stake height is interpreted as the highest ultrasonic distance recorded while the forward-faced sensor system detects an object within a 3 m distance. The results indicate that stake height estimation using the proposed concept is a potential solution for the researchers if the roll and pitch angles of the sensor system are compensated for.
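The stake-height rule stated in the abstract translates almost directly into code: while the forward-faced LiDAR sees an object inside the 3 m gate, the height is taken as the largest downward ultrasonic reading. The sample readings below are invented.

```python
# Stake-height sketch: gate the downward ultrasonic readings on the forward
# LiDAR detecting the stake within 3 m, then take the maximum. Sample
# sensor traces are illustrative.

def stake_height(lidar_m, ultra_m, gate=3.0):
    """Largest ultrasonic distance recorded while the LiDAR sees the
    stake inside the gate; None if the stake was never detected."""
    readings = [u for l, u in zip(lidar_m, ultra_m) if l < gate]
    return max(readings) if readings else None

lidar = [8.0, 4.2, 2.1, 1.5, 2.8, 6.0]    # forward LiDAR ranges (m)
ultra = [1.0, 1.1, 1.6, 1.9, 1.7, 1.2]    # downward ultrasonic readings (m)
height = stake_height(lidar, ultra)
```

The thesis adds roll and pitch compensation on top of this rule, since a tilted drone biases both the gate and the ultrasonic distance.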
9

Andersson, Anton. "Offline Sensor Fusion for Multitarget Tracking using Radar and Camera Detection". Thesis, KTH, Skolan för datavetenskap och kommunikation (CSC), 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-208344.

Abstract:
Autonomous driving systems are rapidly improving and may have the ability to change society in the coming decade. One important part of these systems is the interpretation of sensor information into trajectories of objects. In this master's thesis, we study an energy minimisation method with radar and camera measurements as inputs. An energy is associated with the trajectories; this takes the measurements, the objects' dynamics and more factors into consideration. The trajectories are chosen to minimise this energy, using a gradient descent method. The lower the energy, the better the trajectories are expected to match the real world. The processing is performed offline, as opposed to in real time. Offline tracking can be used in the evaluation of the sensors' and the real-time tracker's performance. Offline processing allows for the use of more computing power. It also gives the possibility to use data that was collected after the considered point in time. A study of the parameters of the used energy minimisation method is presented, along with variations of the initial method. The results of the method are an improvement over the individual inputs, as well as over the real-time processing currently used in the cars. The parameter study shows which components of the energy function improve the results.
Mycket resurser läggs på utveckling av självkörande bilsystem. Dessa kan komma att förändra samhället under det kommande decenniet. En viktig del av dessa system är behandling och tolkning av sensordata och skapande av banor för objekt i omgivningen. I detta examensarbete studeras en energiminimeringsmetod tillsammans med radar- och kameramätningar. En energi beräknas för banorna. Denna tar mätningarna, objektets dynamik och fler faktorer i beaktande. Banorna väljs för att minimera denna energi med hjälp av gradientmetoden. Ju lägre energi, desto bättre förväntas banorna att matcha verkligheten. Bearbetning sker offline i motsats till i realtid; offline-bearbetning kan användas då prestandan för sensorer och realtidsbehandlingen utvärderas. Detta möjliggör användning av mer datorkraft och ger möjlighet att använda data som samlats in efter den aktuella tidpunkten. En studie av de ingående parametrarna i den använda energiminimeringsmetoden presenteras, tillsammans med justeringar av den ursprungliga metoden. Metoden ger ett förbättrat resultat jämfört med de enskilda sensormätningarna, och även jämfört med den realtidsmetod som används i bilarna för närvarande. I parameterstudien visas vilka komponenter i energifunktionen som förbättrar metodens prestanda.
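The offline energy minimisation described above can be shown on a toy 1-D problem: the energy is a data term (distance to per-frame detections) plus a smoothness term on the trajectory, and plain gradient descent lowers it. The detections, smoothness weight, and step size are invented.

```python
# Toy trajectory-energy minimisation: E(traj) = sum of squared distances to
# the detections + lam * sum of squared frame-to-frame jumps, minimised by
# gradient descent. All numbers are illustrative.

def energy(traj, detections, lam=1.0):
    data = sum((t - d) ** 2 for t, d in zip(traj, detections))
    smooth = sum((traj[i + 1] - traj[i]) ** 2 for i in range(len(traj) - 1))
    return data + lam * smooth

def grad(traj, detections, lam=1.0):
    g = [2.0 * (t - d) for t, d in zip(traj, detections)]
    for i in range(len(traj) - 1):
        diff = 2.0 * (traj[i + 1] - traj[i])
        g[i] -= lam * diff
        g[i + 1] += lam * diff
    return g

detections = [0.0, 1.1, 1.9, 3.2]     # noisy per-frame positions
traj = detections[:]                   # initialise from the detections
e0 = energy(traj, detections)
for _ in range(200):                   # gradient descent steps
    g = grad(traj, detections)
    traj = [t - 0.1 * gi for t, gi in zip(traj, g)]
e1 = energy(traj, detections)
```

Offline, the whole trajectory is optimised at once, so each state can borrow evidence from later frames, which a causal real-time tracker cannot do.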
10

Manyika, James. „An information-theoretic approach to data fusion and sensor management“. Thesis, University of Oxford, 1993. http://ora.ox.ac.uk/objects/uuid:6e6dd2a8-1ec0-4d39-8f8b-083289756a70.

Annotation:
The use of multi-sensor systems entails a Data Fusion and Sensor Management requirement in order to optimize the use of resources and allow the synergistic operation of sensors. To date, data fusion and sensor management have largely been dealt with separately and primarily for centralized and hierarchical systems. Although work has recently been done in distributed and decentralized data fusion, very little of it has addressed sensor management. In decentralized systems, a consistent and coherent approach is essential and the ad hoc methods used in other systems become unsatisfactory. This thesis concerns the development of a unified approach to data fusion and sensor management in multi-sensor systems in general and decentralized systems in particular, within a single consistent information-theoretic framework. Our approach is based on considering information and its gain as the main goal of multi-sensor systems. We develop a probabilistic information update paradigm from which we derive directly architectures and algorithms for decentralized data fusion and, most importantly, address sensor management. Presented with several alternatives, the question of how to make decisions leading to the best sensing configuration or actions defines the management problem. We discuss the issues in decentralized decision making and present a normative method for decentralized sensor management based on information as expected utility. We discuss several ways of realizing the solution, culminating in an iterative method akin to bargaining for a general decentralized system. Underlying this is the need for a good sensor model detailing a sensor's physical operation and the phenomenological nature of measurements vis-a-vis the probabilistic information the sensor provides. Also, implicit in a sensor management problem is the existence of several sensing alternatives such as those provided by agile or multi-mode sensors.
With our application in mind, we detail such a sensor model for a novel Tracking Sonar with precisely these capabilities, making it ideal for managed data fusion. As an application, we consider vehicle navigation, specifically localization and map-building. Implementation is on the OxNav vehicle (JTR) which we are currently developing. The results show, firstly, how with managed data fusion, localization is greatly sped up compared to previously published work and, secondly, how synergistic operations such as sensor-feature assignment, hand-off and cueing can be realised decentrally. This implementation provides new ways of addressing vehicle navigation, while the theoretical results are applicable to a variety of multi-sensing problems.
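The annotation treats expected information gain as the utility driving sensor management. A minimal centralized sketch of that decision rule for Gaussian estimates follows; the linear measurement models, the noise covariances, and the greedy single-step selection are invented for illustration and are far simpler than the decentralized method the thesis develops.

```python
import numpy as np

def gaussian_entropy(P):
    # Differential entropy of a Gaussian with covariance P.
    d = P.shape[0]
    return 0.5 * np.log(((2.0 * np.pi * np.e) ** d) * np.linalg.det(P))

def posterior_cov(P, H, R):
    # Kalman covariance update for a linear sensor z = H x + v, v ~ N(0, R).
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    return (np.eye(P.shape[0]) - K @ H) @ P

def select_sensor(P, sensors):
    """Pick the sensor whose measurement yields the largest entropy reduction.

    sensors: list of (H, R) pairs. Returns (best index, list of gains).
    """
    gains = [gaussian_entropy(P) - gaussian_entropy(posterior_cov(P, H, R))
             for H, R in sensors]
    return int(np.argmax(gains)), gains
```

For Gaussian estimates the expected gain does not depend on the actual measurement value, which is what makes this pre-measurement ranking of sensing actions possible.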

Books on the topic "Sensor Fusion and Tracking"

1

Koch, Wolfgang. Tracking and Sensor Data Fusion. Berlin, Heidelberg: Springer Berlin Heidelberg, 2014. http://dx.doi.org/10.1007/978-3-642-39271-9.

2

Kim, Kyungsu. A comparison of nonlinear filters and multi-sensor fusion for tracking boost-phase ballistic missiles. Monterey, California: Naval Postgraduate School, 2009.

3

International Conference on Information Fusion (7th 2005 Philadelphia, Pa.). 2005 7th International Conference on Information Fusion (FUSION): Philadelphia, PA, 25-28 July, 2005. Piscataway, NJ: IEEE, 2005.

4

Kadar, Ivan. Signal processing, sensor fusion, and target recognition XIX: 5-7 April 2010, Orlando, Florida, United States. Bellingham, Wash: SPIE, 2010.

5

Hucks, John A. Fusion of ground-based sensors for optimal tracking of military targets. Monterey, Calif: Naval Postgraduate School, 1989.

6

Kadar, Ivan. Signal processing, sensor fusion, and target recognition XX: 25-27 April 2011, Orlando, Florida, United States. Edited by SPIE. Bellingham, Wash: SPIE, 2011.

7

International Conference on Information Fusion (9th 2006 Florence, Italy). 2006 9th International Conference on Information Fusion: Florence, Italy, 10-13 July 2006. Piscataway, NJ: IEEE Service Center, 2006.

8

Feraille, Olivier. Optimal sensor fusion for change detection. Manchester: UMIST, 1994.

9

Raol, J. R. Multi-sensor data fusion with MATLAB. Boca Raton: Taylor & Francis, 2010.


Book chapters on the topic "Sensor Fusion and Tracking"

1

Koch, Wolfgang. „Integration of Advanced Sensor Properties“. In Tracking and Sensor Data Fusion, 127–56. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-39271-9_7.

2

Nimier, V. „Soft Sensor Management for Multisensor Tracking Algorithm“. In Multisensor Fusion, 365–79. Dordrecht: Springer Netherlands, 2002. http://dx.doi.org/10.1007/978-94-010-0556-2_15.

3

Koch, Wolfgang. „Feed-Back to Acquisition: Sensor Management“. In Tracking and Sensor Data Fusion, 211–35. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-39271-9_10.

4

Koch, Wolfgang. „Characterizing Objects and Sensors“. In Tracking and Sensor Data Fusion, 31–52. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-39271-9_2.

5

Koch, Wolfgang. „Bayesian Knowledge Propagation“. In Tracking and Sensor Data Fusion, 53–82. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-39271-9_3.

6

Koch, Wolfgang. „Sequential Track Extraction“. In Tracking and Sensor Data Fusion, 83–88. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-39271-9_4.

7

Koch, Wolfgang. „On Recursive Batch Processing“. In Tracking and Sensor Data Fusion, 89–105. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-39271-9_5.

8

Koch, Wolfgang. „Aspects of Track-to-Track Fusion“. In Tracking and Sensor Data Fusion, 107–23. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-39271-9_6.

9

Koch, Wolfgang. „Integration of Advanced Object Properties“. In Tracking and Sensor Data Fusion, 157–85. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-39271-9_8.

10

Koch, Wolfgang. „Integration of Topographical Information“. In Tracking and Sensor Data Fusion, 187–210. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-39271-9_9.


Conference papers on the topic "Sensor Fusion and Tracking"

1

Wang, Xuezhi, Branko Ristic, Braham Himed and Bill Moran. „Joint passive sensor scheduling for target tracking". In 2017 20th International Conference on Information Fusion (Fusion). IEEE, 2017. http://dx.doi.org/10.23919/icif.2017.8009854.

2

Gao, Lin, Giorgio Battistelli, Luigi Chisci and Ping Wei. „Consensus-based joint target tracking and sensor localization". In 2017 20th International Conference on Information Fusion (Fusion). IEEE, 2017. http://dx.doi.org/10.23919/icif.2017.8009847.

3

Noonan, C. A. „Entropy measures of multi-sensor fusion performance“. In IEE Colloquium on Target Tracking and Data Fusion. IEE, 1996. http://dx.doi.org/10.1049/ic:19961362.

4

Nygards, Jonas, Viktor Deleskog and Gustaf Hendeby. „Decentralized Tracking in Sensor Networks with Varying Coverage". In 2018 21st International Conference on Information Fusion (FUSION 2018). IEEE, 2018. http://dx.doi.org/10.23919/icif.2018.8455669.

5

Welford, J. „Multi-sensor debris tracking“. In IET Seminar on Target Tracking and Data Fusion: Algorithms and Applications. IEE, 2008. http://dx.doi.org/10.1049/ic:20080052.

6

Williams, Elmer F. „IR sensor data fusion for target detection, identification, and tracking". In Acquisition, Tracking, and Pointing IV. SPIE, 1990. http://dx.doi.org/10.1117/12.2322205.

7

Coraluppi, Stefano, Craig Carthel and Andy Coon. „An MHT Approach to Multi-Sensor Passive Sonar Tracking". In 2018 21st International Conference on Information Fusion (FUSION 2018). IEEE, 2018. http://dx.doi.org/10.23919/icif.2018.8455402.

8

Lai, Hoe Chee, Rong Yang, Gee Wah Ng, Felix Govaers, Martin Ulmke and Wolfgang Koch. „Bearings-only tracking and Doppler-bearing tracking with inequality constraint". In 2017 Sensor Data Fusion: Trends, Solutions, Applications (SDF). IEEE, 2017. http://dx.doi.org/10.1109/sdf.2017.8126387.

9

Harris, C. J. „Multi sensor data fusion for real time aircraft collision“. In IEE Colloquium on Target Tracking and Data Fusion. IEE, 1996. http://dx.doi.org/10.1049/ic:19961359.

10

Judge, I. „RADIX - a solution to multiple sensor data fusion“. In IEE International Seminar Target Tracking: Algorithms and Applications. IEE, 2001. http://dx.doi.org/10.1049/ic:20010231.


Organizational reports on the topic "Sensor Fusion and Tracking"

1

Norcross, Richard J. HiCASS target tracking sensor study. Gaithersburg, MD: National Institute of Standards and Technology, 2005. http://dx.doi.org/10.6028/nist.ir.7220.

2

Garg, Devendra P., and Manish Kumar. Sensor Modeling and Multi-Sensor Data Fusion. Fort Belvoir, VA: Defense Technical Information Center, August 2005. http://dx.doi.org/10.21236/ada440553.

3

Akita, Richard, Robert Pap and Joel Davis. Biologically Inspired Sensor Fusion. Fort Belvoir, VA: Defense Technical Information Center, May 1999. http://dx.doi.org/10.21236/ada389747.

4

Baim, Paul. Dynamic Database for Sensor Fusion. Fort Belvoir, VA: Defense Technical Information Center, May 1999. http://dx.doi.org/10.21236/ada363915.

5

Hero, Alfred O., III, and Raviv Raich. Performance-driven Multimodality Sensor Fusion. Fort Belvoir, VA: Defense Technical Information Center, January 2012. http://dx.doi.org/10.21236/ada565491.

6

ROCKWELL INTERNATIONAL ANAHEIM CA. Multi-Sensor Feature Level Fusion. Fort Belvoir, VA: Defense Technical Information Center, May 1991. http://dx.doi.org/10.21236/ada237106.

7

Meyer, David, and Jeffrey Remmel. Distributed Algorithms for Sensor Fusion. Fort Belvoir, VA: Defense Technical Information Center, October 2002. http://dx.doi.org/10.21236/ada415039.

8

Carlson, J. J., A. M. Bouchard, G. C. Osbourn, R. F. Martinez, J. W. Bartholomew, J. B. Jordan, G. M. Flachs, Z. Bao and L. Zhu. Sensor-fusion-based biometric identity verification. Office of Scientific and Technical Information (OSTI), February 1998. http://dx.doi.org/10.2172/573302.

9

Connors, John J., Kevin Hill, David Hanekamp, William F. Haley, Robert J. Gallagher, Craig Gowin, Arthur R. Farrar et al. Sensor fusion for intelligent process control. Office of Scientific and Technical Information (OSTI), August 2004. http://dx.doi.org/10.2172/919114.

10

Hunn, Bruce P. The Human Factors of Sensor Fusion. Fort Belvoir, VA: Defense Technical Information Center, May 2008. http://dx.doi.org/10.21236/ada481551.

