To view the other types of publications on this topic, follow this link: Sensor Fusion and Tracking.

Journal articles on the topic "Sensor Fusion and Tracking"

Cite a source in APA, MLA, Chicago, Harvard, and other citation styles

Consult the top 50 journal articles for your research on the topic "Sensor Fusion and Tracking".

Next to every work in the bibliography, the option "Add to bibliography" is available. Use it, and the bibliographic reference for the selected work will be formatted automatically in the required citation style (APA, MLA, Harvard, Chicago, Vancouver, etc.).

You can also download the full text of the scholarly publication as a PDF and read its online abstract whenever the relevant parameters are available in the metadata.

Browse journal articles from a wide range of disciplines and compile your bibliography correctly.

1

Hyndhavi, M., et al. "DEVELOPMENT OF VEHICLE TRACKING USING SENSOR FUSION". INFORMATION TECHNOLOGY IN INDUSTRY 9, no. 2 (01.04.2021): 731–39. http://dx.doi.org/10.17762/itii.v9i2.406.

Abstract:
The development of vehicle tracking using sensor fusion is presented in this paper. Advanced driver assistance systems (ADAS) have become more popular in recent years. These systems use sensor information for real-time control. To improve accuracy and robustness, especially in the presence of environmental disturbances such as varying lighting and weather conditions, the fusion of sensors has been the center of attention in recent studies. Faced with complex traffic conditions, a single sensor cannot meet the safety requirements of ADAS and autonomous driving. The common environment perception sensors are radar, camera, and lidar, each with its own pros and cons. Sensor fusion is a necessary technology for autonomous driving, as it provides a better view and understanding of the vehicle's surroundings. We mainly focus on highway scenarios that enable an autonomous car to comfortably follow other cars at various speeds while keeping a safe distance, and we combine the advantages of both sensors with a sensor fusion approach. The radar and vision sensor information is fused to produce robust and accurate measurements. The experimental results compare tracking with radar alone against sensor fusion of camera and radar. The algorithm is described along with simulation results obtained using MATLAB.
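
For readers who want to experiment with the basic mechanism described above, here is a minimal, illustrative Python sketch (the paper itself works in MATLAB): a one-dimensional constant-velocity Kalman filter that sequentially fuses a radar and a camera position measurement. The motion model, noise values, and simulated data are assumptions for demonstration, not parameters from the paper.

```python
import numpy as np

# Minimal sketch: a 1-D constant-velocity Kalman filter that fuses a radar
# and a camera position measurement sequentially. All noise values are
# illustrative assumptions, not taken from the paper.

dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])        # state transition (position, velocity)
Q = np.diag([0.05, 0.1])                      # process noise covariance
H = np.array([[1.0, 0.0]])                    # both sensors observe position only
R_radar, R_camera = 0.5**2, 1.5**2            # assumed measurement variances

def predict(x, P):
    return F @ x, F @ P @ F.T + Q

def update(x, P, z, R):
    S = H @ P @ H.T + R                       # innovation covariance
    K = P @ H.T / S                           # Kalman gain (scalar measurement)
    x = x + (K * (z - H @ x)).reshape(2)
    P = (np.eye(2) - K @ H) @ P
    return x, P

x, P = np.array([0.0, 0.0]), np.eye(2) * 10.0
rng = np.random.default_rng(0)
true_pos, true_vel = 0.0, 8.0                 # a car moving at 8 m/s
for k in range(50):
    true_pos += true_vel * dt
    x, P = predict(x, P)
    x, P = update(x, P, true_pos + rng.normal(0, 0.5), R_radar)   # radar
    x, P = update(x, P, true_pos + rng.normal(0, 1.5), R_camera)  # camera
print("estimated position/velocity:", x, "true:", true_pos, true_vel)
```
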
2

Liu, Yan Ju, Chun Xiang Xie and Jian Hui Song. "Research on Fusion Tracking Technology in Heterogeneous Multi-Sensor". Advanced Materials Research 1056 (October 2014): 158–61. http://dx.doi.org/10.4028/www.scientific.net/amr.1056.158.

Abstract:
Heterogeneous multi-sensor fusion tracking can determine the distance and angle to a target precisely. For the heterogeneous multi-sensor case, data fusion and target tracking with radar, infrared and laser sensors are studied; a weighted fusion algorithm based on Lagrange and an unscented Kalman filter are adopted to perform data fusion and tracking filtering for the target. Simulation results show that the radar/infrared/laser combination can realize data fusion and tracking of the target with accuracy significantly higher than radar or infrared/laser alone, and thus a better tracking effect.
3

Qin, Y., Xue Hui Wang, Ming Jun Feng, Zhen Zhou and L. J. Wang. "Research of Asynchronous Multi-Type Sensors Data Fusion". Advanced Materials Research 142 (October 2010): 16–20. http://dx.doi.org/10.4028/www.scientific.net/amr.142.16.

Abstract:
A data fusion algorithm was established for estimating the state of a target tracking system with multiple sensor types. By applying a Kalman filter to the multi-sensor data to compute the target state estimate, an estimate of the target at each moment is obtained, and the mean square deviation of the fused estimate is smaller than that of any single sensor. The simulation results indicated that the synchronous data fusion method was effective for the multi-target tracking problem, and the asynchronous multi-sensor fusion process achieves a good control effect in practical control.
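
The claim that the fused estimate has a smaller mean square deviation than any single sensor can be illustrated with a small Python sketch of inverse-variance weighted fusion; the sensor standard deviations below are invented for demonstration and are not taken from the paper.

```python
import numpy as np

# Illustrative sketch (not from the paper): fusing two independent, unbiased
# estimates by inverse-variance weighting. The fused mean square deviation is
# smaller than that of either sensor alone.

rng = np.random.default_rng(1)
true_value = 10.0
sigma1, sigma2 = 2.0, 3.0                  # assumed standard deviations of the two sensors
z1 = true_value + rng.normal(0, sigma1, 100000)
z2 = true_value + rng.normal(0, sigma2, 100000)

w1 = (1 / sigma1**2) / (1 / sigma1**2 + 1 / sigma2**2)
w2 = 1.0 - w1
fused = w1 * z1 + w2 * z2                  # minimum-variance linear fusion

for name, est in [("sensor 1", z1), ("sensor 2", z2), ("fused", fused)]:
    print(name, "MSE:", np.mean((est - true_value) ** 2))
# Expected fused variance: 1 / (1/sigma1^2 + 1/sigma2^2) ≈ 2.77, below 4.0 and 9.0
```
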
4

Li, Xin Yu, and Dong Yi Chen. "Sensor Fusion Based on Strong Tracking Filter for Augmented Reality Registration". Key Engineering Materials 467-469 (February 2011): 108–13. http://dx.doi.org/10.4028/www.scientific.net/kem.467-469.108.

Abstract:
Accurate tracking for Augmented Reality applications is a challenging task. Hybrid multi-sensor tracking generally provides more stable results than visual tracking alone. This paper presents a new tightly-coupled hybrid tracking approach combining a vision-based system with an inertial sensor. Based on multi-frequency sampling theory for measurement data synchronization, a strong tracking filter (STF) is used to smooth sensor data and estimate position and orientation. By adding a time-varying fading factor that adaptively adjusts the prediction error covariance of the filter, the method improves tracking performance for fast-moving targets. Experimental results show the efficiency and robustness of the proposed approach.
5

Yi, Chunlei, Kunfan Zhang and Nengling Peng. "A multi-sensor fusion and object tracking algorithm for self-driving vehicles". Proceedings of the Institution of Mechanical Engineers, Part D: Journal of Automobile Engineering 233, no. 9 (August 2019): 2293–300. http://dx.doi.org/10.1177/0954407019867492.

Abstract:
Vehicles need to detect threats on the road, anticipate emerging dangerous driving situations and take proactive actions for collision avoidance. Therefore, the study on methods of target detection and recognition are of practical value to a self-driving system. However, single sensor has its weakness, such as poor weather adaptability with lidar and camera. In this article, we propose a novel spatial calibration method based on multi-sensor systems, and the approach utilizes rotation and translation of the coordinate system. The validity of the proposed spatial calibration method is tested through comparisons with the data calibrated. In addition, a multi-sensor fusion and object tracking algorithm based on target level to detect and recognize targets is tested. Sensors contain lidar, radar and camera. The multi-sensor fusion and object tracking algorithm takes advantages of various sensors such as target location from lidar, target velocity from radar and target type from camera. Besides, multi-sensor fusion and object tracking algorithm can achieve information redundancy and increase environmental adaptability. Compared with the results of single sensor, this new approach is verified to have the accuracy of location, velocity and recognition by real data.
6

Chen, Bin, Xiaofei Pei and Zhenfu Chen. "Research on Target Detection Based on Distributed Track Fusion for Intelligent Vehicles". Sensors 20, no. 1 (20.12.2019): 56. http://dx.doi.org/10.3390/s20010056.

Abstract:
Accurate target detection is the basis of normal driving for intelligent vehicles. However, the sensors currently used for target detection have types of defects at the perception level, which can be compensated by sensor fusion technology. In this paper, the application of sensor fusion technology in intelligent vehicle target detection is studied with a millimeter-wave (MMW) radar and a camera. The target level fusion hierarchy is adopted, and the fusion algorithm is divided into two tracking processing modules and one fusion center module based on the distributed structure. The measurement information output by two sensors enters the tracking processing module, and after processing by a multi-target tracking algorithm, the local tracks are generated and transmitted to the fusion center module. In the fusion center module, a two-level association structure is designed based on regional collision association and weighted track association. The association between two sensors’ local tracks is completed, and a non-reset federated filter is used to estimate the state of the fusion tracks. The experimental results indicate that the proposed algorithm can complete a tracks association between the MMW radar and camera, and the fusion track state estimation method has an excellent performance.
7

Shi, Yifang, Jee Woong Choi, Lei Xu, Hyung June Kim, Ihsan Ullah and Uzair Khan. "Distributed Target Tracking in Challenging Environments Using Multiple Asynchronous Bearing-Only Sensors". Sensors 20, no. 9 (07.05.2020): 2671. http://dx.doi.org/10.3390/s20092671.

Abstract:
In the multiple asynchronous bearing-only (BO) sensors tracking system, there usually exist two main challenges: (1) the presence of clutter measurements and the target misdetection due to imperfect sensing; (2) the out-of-sequence (OOS) arrival of locally transmitted information due to diverse sensor sampling interval or internal processing time or uncertain communication delay. This paper simultaneously addresses the two problems by proposing a novel distributed tracking architecture consisting of the local tracking and central fusion. To get rid of the kinematic state unobservability problem in local tracking for a single BO sensor scenario, we propose a novel local integrated probabilistic data association (LIPDA) method for target measurement state tracking. The proposed approach enables eliminating most of the clutter measurement disturbance with increased target measurement accuracy. In the central tracking, the fusion center uses the proposed distributed IPDA-forward prediction fusion and decorrelation (DIPDA-FPFD) approach to sequentially fuse the OOS information transmitted by each BO sensor. The track management is carried out at local sensor level and also at the fusion center by using the recursively calculated probability of target existence as a track quality measure. The efficiency of the proposed methodology was validated by intensive numerical experiments.
8

Deo, Ankur, Vasile Palade and Md Nazmul Huda. "Centralised and Decentralised Sensor Fusion-Based Emergency Brake Assist". Sensors 21, no. 16 (11.08.2021): 5422. http://dx.doi.org/10.3390/s21165422.

Abstract:
Many advanced driver assistance systems (ADAS) are currently trying to utilise multi-sensor architectures, where the driver assistance algorithm receives data from a multitude of sensors. As mono-sensor systems cannot provide reliable and consistent readings under all circumstances because of errors and other limitations, fusing data from multiple sensors ensures that the environmental parameters are perceived correctly and reliably for most scenarios, thereby substantially improving the reliability of the multi-sensor-based automotive systems. This paper first highlights the significance of efficiently fusing data from multiple sensors in ADAS features. An emergency brake assist (EBA) system is showcased using multiple sensors, namely, a light detection and ranging (LiDAR) sensor and camera. The architectures of the proposed ‘centralised’ and ‘decentralised’ sensor fusion approaches for EBA are discussed along with their constituents, i.e., the detection algorithms, the fusion algorithm, and the tracking algorithm. The centralised and decentralised architectures are built and analytically compared, and the performance of these two fusion architectures for EBA are evaluated in terms of speed of execution, accuracy, and computational cost. While both fusion methods are seen to drive the EBA application at an acceptable frame rate (~20 fps or higher) on an Intel i5-based Ubuntu system, it was concluded through the experiments and analytical comparisons that the decentralised fusion-driven EBA leads to higher accuracy; however, it has the downside of a higher computational cost. The centralised fusion-driven EBA yields comparatively less accurate results, but with the benefits of a higher frame rate and lesser computational cost.
9

Wöhle, Lukas, and Marion Gebhard. "SteadEye-Head—Improving MARG-Sensor Based Head Orientation Measurements Through Eye Tracking Data". Sensors 20, no. 10 (12.05.2020): 2759. http://dx.doi.org/10.3390/s20102759.

Abstract:
This paper presents the use of eye tracking data in Magnetic AngularRate Gravity (MARG)-sensor based head orientation estimation. The approach presented here can be deployed in any motion measurement that includes MARG and eye tracking sensors (e.g., rehabilitation robotics or medical diagnostics). The challenge in these mostly indoor applications is the presence of magnetic field disturbances at the location of the MARG-sensor. In this work, eye tracking data (visual fixations) are used to enable zero orientation change updates in the MARG-sensor data fusion chain. The approach is based on a MARG-sensor data fusion filter, an online visual fixation detection algorithm as well as a dynamic angular rate threshold estimation for low latency and adaptive head motion noise parameterization. In this work we use an adaptation of Madgwicks gradient descent filter for MARG-sensor data fusion, but the approach could be used with any other data fusion process. The presented approach does not rely on additional stationary or local environmental references and is therefore self-contained. The proposed system is benchmarked against a Qualisys motion capture system, a gold standard in human motion analysis, showing improved heading accuracy for the MARG-sensor data fusion up to a factor of 0.5 while magnetic disturbance is present.
10

Guo, Xiaoxiao, Yuansheng Liu, Qixue Zhong and Mengna Chai. "Research on Moving Target Tracking Algorithm Based on Lidar and Visual Fusion". Journal of Advanced Computational Intelligence and Intelligent Informatics 22, no. 5 (20.09.2018): 593–601. http://dx.doi.org/10.20965/jaciii.2018.p0593.

Abstract:
Multi-sensor fusion and target tracking are two key technologies for the environmental awareness system of autonomous vehicles. In this paper, a moving target tracking method based on the fusion of Lidar and binocular camera is proposed. Firstly, the position information obtained by the two types of sensors is fused at decision level by using adaptive weighting algorithm, and then the Joint Probability Data Association (JPDA) algorithm is correlated with the result of fusion to achieve multi-target tracking. Tested at a curve in the campus and compared with the Extended Kalman Filter (EKF) algorithm, the experimental results show that this algorithm can effectively overcome the limitation of a single sensor and track more accurately.
11

Girija, G., J. R. Raol, R. Appavu raj and Sudesh Kashyap. "Tracking filter and multi-sensor data fusion". Sadhana 25, no. 2 (April 2000): 159–67. http://dx.doi.org/10.1007/bf02703756.
12

Takyu, Osamu, Keiichiro Shirai, Mai Ohta and Takeo Fujii. "ID Insertion and Data Tracking with Frequency Offset for Physical Wireless Parameter Conversion Sensor Networks". Sensors 19, no. 4 (13.02.2019): 767. http://dx.doi.org/10.3390/s19040767.

Abstract:
As the applications of the internet of things are becoming widely diversified, wireless sensor networks require real-time data reception, accommodation of access from several sensors, and low power consumption. In physical wireless parameter conversion sensor networks (PhyC-SN), all the sensors use frequency shift keying as the modulation scheme and then access the channel to the fusion center, simultaneously. As a result, the fusion center can recognize the statistical tendency of all the sensing results at a time from the frequency spectrum of the received signal. However, the information source, i.e., the sensor, cannot be specified from the received signal because no ID-indicating sensor is inserted to the signal. The data-tracking technique for tracing the time continuity of the sensing results is available for decomposing the sequence of the sensing results per sensor but the error tracking, which is a wrong recognition between the sensing results and the sensor, occurs owing to the similarity of the sensing results. This paper proposes the sensing result separation technique using a fractional carrier frequency offset (CFO) for PhyC-SN. In the proposed scheme, the particular fractional CFO is assigned to each user and it is useful for the ID specifying sensor. The fractional CFO causes inter-carrier interference (ICI). The ICI cancellation of the narrowband wireless communications is proposed. The two types of data-tracking techniques are proposed and are selectively used by the fusion center. Since the proposed data-tracking technique is multi-dimensional, high accuracy of data tracking is achieved even under the similar tendency of the sensing results. Based on computer simulation, we elucidate the advantage of the proposed sensing results separation.
13

Ge, Bing, and Yi Yu. "Application of Multi-Sensors Data Fusion Technology in the Theodolite Tracking System". Applied Mechanics and Materials 392 (September 2013): 783–86. http://dx.doi.org/10.4028/www.scientific.net/amm.392.783.

Abstract:
The task of multi-sensor data fusion is to obtain a more precise estimate of the object state and light path than a single sensor can provide, by combining the information from different sensors. The paper puts forward the idea of applying data fusion to an O-E theodolite system, building on data fusion theory, the Kalman filter and estimation theory. Even when the object is lost or occluded, the theodolite keeps tracking it normally. In practice, the multi-sensor data fusion approach is validated to effectively improve the acquisition and tracking ability of the theodolite.
14

Luo, Junhai, Zhiyan Wang, Yanping Chen, Man Wu and Yang Yang. "An Improved Unscented Particle Filter Approach for Multi-Sensor Fusion Target Tracking". Sensors 20, no. 23 (30.11.2020): 6842. http://dx.doi.org/10.3390/s20236842.

Abstract:
In this paper, a new approach of multi-sensor fusion algorithm based on the improved unscented particle filter (IUPF) and a new multi-sensor distributed fusion model are proposed. Additionally, we employ a novel multi-target tracking algorithm that combines the joint probabilistic data association (JPDA) algorithm and the IUPF algorithm. To improve the real-time performance of the UPF algorithm for the maneuvering target, minimum skew simplex unscented transform combined with a scaled unscented transform is utilized, which significantly reduces the calculation of UPF sample selection. Moreover, a self-adaptive gain modification coefficient is defined to solve the low accuracy problem caused by the sigma point reduction, and the problem of particle degradation is solved by modifying the weights calculation method. In addition, a new multi-sensor fusion model is proposed, which better integrates radar and infrared sensors. Simulation results show that IUPF effectively improves real-time performance while ensuring the tracking accuracy compared with other algorithms. Besides, compared with the traditional distributed fusion architecture, the proposed new architecture makes better use of the advantages of radar and an infrared sensor and improves the tracking accuracy.
15

Abbas, Ash Mohammad. "Target Tracking in Wireless Sensor Networks". Journal of Computer Science and Technology 21, no. 1 (17.04.2021): e8. http://dx.doi.org/10.24215/16666038.21.e8.

Abstract:
A Wireless Sensor Network (WSN) consists of a group of tiny devices called sensors that communicate through wireless links. Sensors are used to collect data about some parameters and send the collected data for further processing to a designated station. The designated station is often called command and control center (CCC), fusion center (FC), or sink. Sensors forward the collected data to their leaders or cluster heads, which in turn send it to the centralized station. There are many applications of a WSN such as environmental monitoring, raising alarms for fires in forests and multi-storied buildings, monitoring habitats of wild animals, monitoring children in a kindergarten, support systems in playgrounds, monitoring indoor patients in a hospital, precision agriculture, detection of infiltration along international boundaries, tracking an object or a target, etc.
16

Kim, Taeklim, and Tae-Hyoung Park. "Extended Kalman Filter (EKF) Design for Vehicle Position Tracking Using Reliability Function of Radar and Lidar". Sensors 20, no. 15 (24.07.2020): 4126. http://dx.doi.org/10.3390/s20154126.

Abstract:
Detection and distance measurement using sensors is not always accurate. Sensor fusion makes up for this shortcoming by reducing inaccuracies. This study, therefore, proposes an extended Kalman filter (EKF) that reflects the distance characteristics of lidar and radar sensors. The sensor characteristics of the lidar and radar over distance were analyzed, and a reliability function was designed to extend the Kalman filter to reflect distance characteristics. The accuracy of position estimation was improved by identifying the sensor errors according to distance. Experiments were conducted using real vehicles, and a comparative experiment was done combining sensor fusion using a fuzzy, adaptive measure noise and Kalman filter. Experimental results showed that the study’s method produced accurate distance estimations.
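
The core idea of a distance-dependent reliability function can be sketched as follows; this hedged Python example simply lets each sensor's measurement variance grow with distance and fuses the two range readings by inverse-variance weighting. The coefficient values and the fusion step are illustrative assumptions, not the reliability functions or the EKF from the paper.

```python
# Hedged sketch of the idea only: the measurement noise assigned to each sensor
# grows with distance, and the fusion weights the sensors accordingly.
# The coefficients below are assumptions, not the paper's reliability functions.

def radar_variance(d):
    # radar assumed comparatively better at long range
    return 0.5 + 0.0005 * d**2

def lidar_variance(d):
    # lidar assumed very accurate up close, degrading faster with distance
    return 0.02 + 0.002 * d**2

def fuse_range(z_radar, z_lidar):
    """Inverse-variance fusion of one radar and one lidar range measurement."""
    d = 0.5 * (z_radar + z_lidar)                 # rough distance for the noise lookup
    r_r, r_l = radar_variance(d), lidar_variance(d)
    w_r = (1 / r_r) / (1 / r_r + 1 / r_l)
    fused = w_r * z_radar + (1 - w_r) * z_lidar
    fused_var = 1 / (1 / r_r + 1 / r_l)
    return fused, fused_var

print(fuse_range(10.2, 10.05))    # short range: lidar dominates
print(fuse_range(80.5, 79.0))     # long range: radar gets more weight
```
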
17

Yao, Ya Chuan, Yi Yao, Ren Yi Zhang and Qiang Han. "Design of the Multi-Sensor Target Tracking System Based on Data Fusion". Advanced Materials Research 219-220 (March 2011): 1407–10. http://dx.doi.org/10.4028/www.scientific.net/amr.219-220.1407.

Abstract:
The paper is about designing a fusion tracking system based on multi-sensor information processing combined with data fusion. It addresses the problem of making the multiple sensor nodes of a target tracking system work collaboratively, so that the wireless sensor network can process a large amount of instantaneous data in time. Practical tests and simulations demonstrate its practicability.
18

Jung, Kyung-Hoon, Jung-Min Kim, Jung-Je Park, Sung-Shin Kim and Sun-Il Bae. "Line Tracking Method of AGV using Sensor Fusion". Journal of Korean Institute of Intelligent Systems 20, no. 1 (25.02.2010): 54–59. http://dx.doi.org/10.5391/jkiis.2010.20.1.054.
19

Faulkner, Karen, James Mark William Brownjohn, Ying Wang and Farhad Huseynov. "Tracking bridge tilt behaviour using sensor fusion techniques". Journal of Civil Structural Health Monitoring 10, no. 4 (29.04.2020): 543–55. http://dx.doi.org/10.1007/s13349-020-00400-9.
20

Klein, G. S. W., and T. W. Drummond. "Tightly integrated sensor fusion for robust visual tracking". Image and Vision Computing 22, no. 10 (September 2004): 769–76. http://dx.doi.org/10.1016/j.imavis.2004.02.007.
21

Ma, Liang, Kai Xue and Ping Wang. "Multitarget Tracking with Spatial Nonmaximum Suppressed Sensor Selection". Mathematical Problems in Engineering 2015 (2015): 1–10. http://dx.doi.org/10.1155/2015/148081.

Abstract:
Multitarget tracking is one of the most important applications of sensor networks, yet it is an extremely challenging problem since multisensor multitarget tracking itself is nontrivial and the difficulty is further compounded by sensor management. Recently, random finite set based Bayesian framework has opened doors for multitarget tracking with sensor management, which is modelled in the framework of partially observed Markov decision process (POMDP). However, sensor management posed as a POMDP is in essence a combinatorial optimization problem which is NP-hard and computationally unacceptable. In this paper, we propose a novel sensor selection method for multitarget tracking. We first present the sequential multi-Bernoulli filter as a centralized multisensor fusion scheme for multitarget tracking. In order to perform sensor selection, we define the hypothesis information gain (HIG) of a sensor to measure its information quantity when the sensor is selected alone. Then, we propose spatial nonmaximum suppression approach to select sensors with respect to their locations and HIGs. Two distinguished implementations have been provided using the greedy spatial nonmaximum suppression. Simulation results verify the effectiveness of proposed sensor selection approach for multitarget tracking.
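
A rough Python sketch of the greedy spatial non-maximum suppression step described above is given below: sensors are ranked by an information-gain score and kept only if no already-selected sensor lies within a suppression radius. The sensor positions, gains, and radius are invented illustration values, and the hypothesis information gain itself is not computed here.

```python
import math

# Greedy spatial non-maximum suppression for sensor selection (illustrative only).
# Sensors are visited in order of decreasing gain; a sensor is selected unless a
# previously selected sensor lies within the suppression radius.

sensors = [
    {"id": "s1", "pos": (0.0, 0.0), "gain": 3.2},
    {"id": "s2", "pos": (1.0, 0.5), "gain": 2.9},   # close to s1 -> suppressed
    {"id": "s3", "pos": (8.0, 2.0), "gain": 2.5},
    {"id": "s4", "pos": (8.5, 1.5), "gain": 1.1},   # close to s3 -> suppressed
    {"id": "s5", "pos": (4.0, 9.0), "gain": 0.7},
]

def select_sensors(sensors, radius):
    selected = []
    for s in sorted(sensors, key=lambda s: s["gain"], reverse=True):
        if all(math.dist(s["pos"], t["pos"]) > radius for t in selected):
            selected.append(s)
    return [s["id"] for s in selected]

print(select_sensors(sensors, radius=3.0))   # -> ['s1', 's3', 's5']
```
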
22

Qi, Ji Yuan, and Xiao Jun Hu. "Track Handing Fusion in Data Fusion and Database". Advanced Materials Research 468-471 (February 2012): 959–62. http://dx.doi.org/10.4028/www.scientific.net/amr.468-471.959.

Abstract:
Track messages are obtained from many different sensors. For a single sensor, the reliability may not be the same under different conditions. A reliability judgment matrix for each target is constructed based on the relative reliability of the information offered by the sensors. From it, the reliabilities of all the sensors for each target and their overall reliability can be obtained. A numerical example is presented. The information of the most reliable sensors is chosen and merged with a Kalman filter algorithm in track handling based on the data fusion technique, yielding the best track estimate. To meet the data requirements of the fusion algorithm, a real-time database is established using object-oriented technology, and a history database is established with a relational database. The data in the data fusion system are managed through these two databases, so the data fusion system can work efficiently.
23

Mao, Yao, Wei Ren, Yong Luo and Zhijun Li. "Optimal Design Based on Closed-Loop Fusion for Velocity Bandwidth Expansion of Optical Target Tracking System". Sensors 19, no. 1 (02.01.2019): 133. http://dx.doi.org/10.3390/s19010133.

Abstract:
Micro-electro-mechanical system (MEMS) gyro is one of the extensively used inertia sensors in the field of optical target tracking (OTT). However, velocity closed-loop bandwidth of the OTT system is limited due to the resonance and measurement range issues of MEMS gyro. In this paper, the generalized sensor fusion framework, named the closed-loop fusion (CLF), is analyzed, and the optimal design principle of filter is proposed in detail in order to improve measurement of the bandwidth of MEMS gyro by integrating information of MEMS accelerometers. The fusion error optimization problem, which is the core issue of fusion design, can be solved better through the feedback compensation law of CLF framework and fusion filter optimal design. Differently from conventional methods, the fusion filter of CLF can be simply and accurately designed, and the determination of superposition of fusion information can also be effectively avoided. To show the validity of the proposed method, both sensor fusion simulations and closed-loop experiments of optical target tracking system have yielded excellent results.
24

Lee, Deok Jin, Kil To Chong and Dong Pyo Hong. "Target Tracking in Sensor Networks Using Additive Divided Difference Information Filtering Method". Applied Mechanics and Materials 433-435 (October 2013): 503–9. http://dx.doi.org/10.4028/www.scientific.net/amm.433-435.503.

Abstract:
This paper represents a new multiple sensor information fusion algorithm in distributed sensor networks using an additive divided difference information filter for nonlinear estimation and tracking applications. The newly proposed multi-sensor fusion algorithm is derived by utilizing an efficient new additive divided difference filtering algorithm with embedding statistical error propagation method into an information filtering architecture. The new additive divided difference information filter achieves not only the accurate nonlinear estimation solution, but also the flexibility of multiple information fusion in distributed sensor networks. Performance comparison of the proposed filter with the nonlinear information filters is demonstrated through a target-tracking application.
25

Zhang, Xi Bin. "Two-Sensor Information Fusion with Out-of-Sequence Measurement". Advanced Materials Research 468-471 (February 2012): 1158–62. http://dx.doi.org/10.4028/www.scientific.net/amr.468-471.1158.

Abstract:
In sensor tracking systems, out-of-sequence measurements (OOSM) frequently occur due to communication delays. The system needs to incorporate the OOSM in order to improve precision. An optimal information fusion estimate weighted by scalars is presented for two-sensor information fusion with OOSM; it accounts for the correlation between local errors and avoids computing the weighting matrix, which makes it easy to apply in real time.
26

Koksal, N., M. Jalalmaab and B. Fidan. "Adaptive Linear Quadratic Attitude Tracking Control of a Quadrotor UAV Based on IMU Sensor Data Fusion". Sensors 19, no. 1 (22.12.2018): 46. http://dx.doi.org/10.3390/s19010046.

Abstract:
In this paper, an infinite-horizon adaptive linear quadratic tracking (ALQT) control scheme is designed for optimal attitude tracking of a quadrotor unmanned aerial vehicle (UAV). The proposed control scheme is experimentally validated in the presence of real-world uncertainties in quadrotor system parameters and sensor measurement. The designed control scheme guarantees asymptotic stability of the close-loop system with the help of complete controllability of the attitude dynamics in applying optimal control signals. To achieve robustness against parametric uncertainties, the optimal tracking solution is combined with an online least squares based parameter identification scheme to estimate the instantaneous inertia of the quadrotor. Sensor measurement noises are also taken into account for the on-board Inertia Measurement Unit (IMU) sensors. To improve controller performance in the presence of sensor measurement noises, two sensor fusion techniques are employed, one based on Kalman filtering and the other based on complementary filtering. The ALQT controller performance is compared for the use of these two sensor fusion techniques, and it is concluded that the Kalman filter based approach provides less mean-square estimation error, better attitude estimation, and better attitude control performance.
27

Langstrand, Jens-Patrick, Hoa T. Nguyen and Michael Hildebrandt. "Synopticon: Sensor Fusion for Real-Time Gaze Detection and Analysis". Proceedings of the Human Factors and Ergonomics Society Annual Meeting 62, no. 1 (September 2018): 311–15. http://dx.doi.org/10.1177/1541931218621072.

Abstract:
Synopticon is a software platform that fuses data from position tracking, eye tracking, and physiological sensors. Synopticon was developed to produce real-time digital representations of users. These “digital twins” can be visualized, or used by other algorithms to detect the behavioural, cognitive or emotional state of the user. Synopticon provides 3D modelling tools based on position tracking data to define areas of interest (AOI) in the environment. By projecting the combined eye-and position-data into the 3D model, Synopticon can automatically detect when a user is looking at an AOI, generates real-time heat maps, and compiles statistical information. The demonstration will show how to set up and calibrate a combined position tracking and eye tracking system, and explain how Synopticon addresses some of the limitations of current eye tracking technology.
28

Shi, Zhenlian, Yanfeng Sun, Linxin Xiong, Yongli Hu and Baocai Yin. "A Multisource Heterogeneous Data Fusion Method for Pedestrian Tracking". Mathematical Problems in Engineering 2015 (2015): 1–10. http://dx.doi.org/10.1155/2015/150541.

Abstract:
Traditional visual pedestrian tracking methods perform poorly when faced with problems such as occlusion, illumination changes, and complex backgrounds. In principle, collecting more sensing information should resolve these issues. However, it is extremely challenging to properly fuse different sensing information to achieve accurate tracking results. In this study, we develop a pedestrian tracking method for fusing multisource heterogeneous sensing information, including video, RGB-D sequences, and inertial sensor data. In our method, a RGB-D sequence is used to position the target locally by fusing the texture and depth features. The local position is then used to eliminate the cumulative error resulting from the inertial sensor positioning. A camera calibration process is used to map the inertial sensor position onto the video image plane, where the visual tracking position and the mapped position are fused using a similarity feature to obtain accurate tracking results. Experiments using real scenarios show that the developed method outperforms the existing tracking method, which uses only a single sensing dataset, and is robust to target occlusion, illumination changes, and interference from similar textures or complex backgrounds.
29

Tian, Qinglin, Kevin I.-Kai Wang and Zoran Salcic. "An INS and UWB Fusion-Based Gyroscope Drift Correction Approach for Indoor Pedestrian Tracking". Sensors 20, no. 16 (10.08.2020): 4476. http://dx.doi.org/10.3390/s20164476.

Abstract:
Information fusion combining inertial navigation and radio frequency (RF) technologies, is commonly applied in indoor positioning systems (IPSs) to obtain more accurate tracking results. The performance of the inertial navigation system (INS) subsystem is affected by sensor drift over time and the RF-based subsystem aims to correct the position estimate using a fusion filter. However, the inherent sensor drift is usually not corrected during fusion, which leads to increasingly erroneous estimates over a short period of time. Among the inertial sensor drifts, gyroscope drift has the most significant impact in determining the correct orientation and accurate tracking. A gyroscope drift correction approach is proposed in this study and is incorporated in an INS and ultra-wideband (UWB) fusion IPS where only distance measurements from UWB subsystem are used. The drift correction approach is based on turn detection to account for the fact that gyroscope drift is accumulated during a turn. Practical pedestrian tracking experiments are conducted to demonstrate the accuracy of the drift correction approach. With the gyroscope drift corrected, the fusion IPS is able to provide more accurate tracking performance and achieve up to 64.52% mean position error reduction when compared to the INS only tracking result.
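
The turn-detection-based drift handling can be illustrated, very loosely, with the Python sketch below: the gyroscope bias is re-estimated only from samples taken while no turn is detected and is subtracted before the heading is integrated. This is a simplified stand-in for the paper's INS/UWB fusion approach; the thresholds, noise levels, and simulated walk are assumptions.

```python
import numpy as np

# Simplified sketch of the idea only: gyroscope bias (drift) is re-estimated
# from samples collected while no turn is detected, and subtracted before the
# heading is integrated. Thresholds and the bias value are assumptions.

dt = 0.01
turn_threshold = 0.3          # rad/s; above this we assume the pedestrian is turning
true_bias = 0.02              # rad/s constant gyro bias (simulated)

rng = np.random.default_rng(2)
# simulate 10 s straight walk, 3 s turn at 1 rad/s, 10 s straight walk
true_rate = np.concatenate([np.zeros(1000), np.full(300, 1.0), np.zeros(1000)])
gyro = true_rate + true_bias + rng.normal(0, 0.05, true_rate.size)

bias, heading_raw, heading_corr = 0.0, 0.0, 0.0
still_samples = []
for w in gyro:
    if abs(w - bias) < turn_threshold:        # no turn detected
        still_samples.append(w)
        bias = np.mean(still_samples[-500:])  # running bias estimate
    heading_raw += w * dt                     # uncorrected integration drifts
    heading_corr += (w - bias) * dt           # drift-corrected integration

print("true heading change:", np.sum(true_rate) * dt)
print("raw integration:    ", heading_raw)
print("bias-corrected:     ", heading_corr)
```
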
30

Jin, Xue Bo, Jiang Feng Wang, Hui Yan Zhang and Li Hong Cao. "Variable Number of Multi-Sensor Fusion for Indoor RFID Tracking System". Applied Mechanics and Materials 385-386 (August 2013): 654–57. http://dx.doi.org/10.4028/www.scientific.net/amm.385-386.654.

Abstract:
We present a new method for accurately tracking persons indoors with active RFID technology. To deal with the nonlinear measurement model, the extended Kalman filter (EKF) is used to estimate the target trajectory. The paper develops a fusion estimation algorithm for the common indoor tracking problem with readers at arbitrary locations, and fusion estimation for a multi-sensor system with a variable number of sensors. Simulations show that the algorithm can adaptively adjust the model parameters while tracking and achieves good estimation performance for indoor RFID tracking.
31

Tørdal, Sondre Sanden, and Geir Hovland. "Relative Vessel Motion Tracking using Sensor Fusion, Aruco Markers, and MRU Sensors". Modeling, Identification and Control: A Norwegian Research Bulletin 38, no. 2 (2017): 79–93. http://dx.doi.org/10.4173/mic.2017.2.3.
32

Chen, Joy Iong-Zong. "An Algorithm of Mobile Sensors Data Fusion Tracking for Wireless Sensor Networks". Wireless Personal Communications 58, no. 2 (03.12.2009): 197–214. http://dx.doi.org/10.1007/s11277-009-9888-8.
33

Martínez-Barberá, Humberto, Pablo Bernal-Polo and David Herrero-Pérez. "Sensor Modeling for Underwater Localization Using a Particle Filter". Sensors 21, no. 4 (23.02.2021): 1549. http://dx.doi.org/10.3390/s21041549.

Abstract:
This paper presents a framework for processing, modeling, and fusing underwater sensor signals to provide a reliable perception for underwater localization in structured environments. Submerged sensory information is often affected by diverse sources of uncertainty that can deteriorate the positioning and tracking. By adopting uncertain modeling and multi-sensor fusion techniques, the framework can maintain a coherent representation of the environment, filtering outliers, inconsistencies in sequential observations, and useless information for positioning purposes. We evaluate the framework using cameras and range sensors for modeling uncertain features that represent the environment around the vehicle. We locate the underwater vehicle using a Sequential Monte Carlo (SMC) method initialized from the GPS location obtained on the surface. The experimental results show that the framework provides a reliable environment representation during the underwater navigation to the localization system in real-world scenarios. Besides, they evaluate the improvement of localization compared to the position estimation using reliable dead-reckoning systems.
34

Li, Juan, Hui Juan Hao and Mao Li Wang. "The Particle Filter Algorithm Research for Target Tracking Based on Information Fusion". Advanced Materials Research 628 (December 2012): 440–44. http://dx.doi.org/10.4028/www.scientific.net/amr.628.440.

Abstract:
This paper studies particle filter algorithms for target tracking based on information fusion, combining the traditional Kalman filter with the particle filter. For multi-sensor, multi-target tracking systems with complex application backgrounds, which are nonlinear and non-Gaussian, the paper proposes an effective particle filtering algorithm based on information fusion for distributed sensors. This algorithm helps to alleviate particle degradation and particle impoverishment and achieves high precision in target tracking.
35

Amamra, Abdenour, and Nabil Aouf. "Real-time multiview data fusion for object tracking with RGBD sensors". Robotica 34, no. 8 (01.12.2014): 1855–79. http://dx.doi.org/10.1017/s026357471400263x.

Abstract:
This paper presents a new approach to accurately track a moving vehicle with a multiview setup of red–green–blue depth (RGBD) cameras. We first propose a correction method to eliminate a shift, which occurs in depth sensors when they become worn. This issue could not be otherwise corrected with the ordinary calibration procedure. Next, we present a sensor-wise filtering system to correct for an unknown vehicle motion. A data fusion algorithm is then used to optimally merge the sensor-wise estimated trajectories. We implement most parts of our solution in the graphic processor. Hence, the whole system is able to operate at up to 25 frames per second with a configuration of five cameras. Test results show the accuracy we achieved and the robustness of our solution to overcome uncertainties in the measurements and the modelling.
36

Xue, Guangyue, Xuemei Ren, Kexin Xing and Qiang Chen. "Discrete-Time Sliding Mode Control Coupled with Asynchronous Sensor Fusion for Rigid-Link Flexible-Joint Manipulators". Complexity 2021 (26.07.2021): 1–12. http://dx.doi.org/10.1155/2021/9927850.

Abstract:
This paper proposes a novel discrete-time terminal sliding mode controller (DTSMC) coupled with an asynchronous multirate sensor fusion estimator for rigid-link flexible-joint (RLFJ) manipulator tracking control. A camera is employed as external sensors to observe the RLFJ manipulator’s state which cannot be directly obtained from the encoders since gear mechanisms or flexible joints exist. The extended Kalman filter- (EKF-) based asynchronous multirate sensor fusion method deals with the slow sampling rate and the latency of camera by using motor encoders to cover the missing information between two visual samples. In the proposed control scheme, a novel sliding mode surface is presented by taking advantage of both the estimation error and tracking error. It is proved that the proposed controller achieves convergence results for tracking control in the theoretical derivation. Simulation and experimental studies are included to validate the effectiveness of the proposed approach.
37

Gitahi, J., M. Hahn, M. Storz, C. Bernhard, M. Feldges and R. Nordentoft. "MULTI-SENSOR TRAFFIC DATA FUSION FOR CONGESTION DETECTION AND TRACKING". ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XLIII-B1-2020 (06.08.2020): 173–80. http://dx.doi.org/10.5194/isprs-archives-xliii-b1-2020-173-2020.

Abstract:
Abstract. Traffic management applications including congestion detection and tracking rely on traffic from multiple sources to model the traffic conditions. The sources are either stationary sensors which include inductive loop detectors (ILD), radar stations and Bluetooth/WiFi/BLE sensors or Floating Car Data (FCD) from moving vehicles which transmit their locations and speeds. The different sources have their inherent strengths and weaknesses but when used together, they have the potential to provide traffic information with increased robustness. Multi-sensor data fusion has the potential to enhance the estimation of traffic state in real-time by reducing the uncertainty of individual sources, extending the temporal and spatial coverage and increasing the confidence of data inputs. In this study, we fuse data from different FCD providers to improve travel time and average segment speeds estimation. We use data from INRIX, HERE and TomTom FCD commercial services and fuse the speeds based on their confidence values and granularity on virtual sub-segments of 250 m. Speeds differences between each pair of datasets are evaluated by calculating the absolute mean and standard deviation of differences. The evaluation of systematic differences is also performed for peak periods depending on the day of the week. INRIX FCD speeds are compared with ground truth spot speeds where both datasets are measured at a 1-minute interval which show good agreement with an error rate of between 8–20%. Some issues that affect FCD accuracy which include data availability and reliability problems are identified and discussed.
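
A toy Python sketch of confidence-weighted speed fusion on virtual sub-segments is shown below. The provider names mirror the study, but the speeds, confidence values, and the exact weighting scheme are assumptions made for illustration.

```python
# Toy sketch of confidence-weighted speed fusion on 250 m virtual sub-segments.
# Provider names match the study, but the speeds and confidence values are
# invented; the actual weighting scheme in the paper may differ.

observations = {
    # sub_segment_id: list of (provider, speed_kmh, confidence in [0, 1])
    0: [("INRIX", 82.0, 0.9), ("HERE", 78.0, 0.7), ("TomTom", 85.0, 0.8)],
    1: [("INRIX", 45.0, 0.6), ("TomTom", 52.0, 0.9)],
    2: [("HERE", 96.0, 0.5)],
}

def fuse_segment_speeds(observations):
    fused = {}
    for seg, obs in observations.items():
        total_weight = sum(conf for _, _, conf in obs)
        fused[seg] = sum(speed * conf for _, speed, conf in obs) / total_weight
    return fused

for seg, speed in fuse_segment_speeds(observations).items():
    print(f"sub-segment {seg}: fused speed {speed:.1f} km/h")
```
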
38

Dimitrievski, Martin, David Van Hamme, Peter Veelaert and Wilfried Philips. "Cooperative Multi-Sensor Tracking of Vulnerable Road Users in the Presence of Missing Detections". Sensors 20, no. 17 (26.08.2020): 4817. http://dx.doi.org/10.3390/s20174817.

Abstract:
This paper presents a vulnerable road user (VRU) tracking algorithm capable of handling noisy and missing detections from heterogeneous sensors. We propose a cooperative fusion algorithm for matching and reinforcing of radar and camera detections using their proximity and positional uncertainty. The belief in the existence and position of objects is then maximized by temporal integration of fused detections by a multi-object tracker. By switching between observation models, the tracker adapts to the detection noise characteristics making it robust to individual sensor failures. The main novelty of this paper is an improved imputation sampling function for updating the state when detections are missing. The proposed function uses a likelihood without association that is conditioned on the sensor information instead of the sensor model. The benefits of the proposed solution are two-fold: firstly, particle updates become computationally tractable and secondly, the problem of imputing samples from a state which is predicted without an associated detection is bypassed. Experimental evaluation shows a significant improvement in both detection and tracking performance over multiple control algorithms. In low light situations, the cooperative fusion outperforms intermediate fusion by as much as 30%, while increases in tracking performance are most significant in complex traffic scenes.
39

Fong, Li Wei. "Decoupled Adaptive Tracking Algorithm for Multi-Sensor Measurement Fusion". Applied Mechanics and Materials 229-231 (November 2012): 1235–38. http://dx.doi.org/10.4028/www.scientific.net/amm.229-231.1235.

Abstract:
A decoupled adaptive tracking filter is developed for centralized measurement fusion to track the same maneuvering target to improve the tracking accuracy. The proposed approach consists of a dual-band Kalman filter and a two-category Bayesian classifier. Based upon data compression and decoupling techniques, two parallel decoupled filters are obtained for lessening computation. The Bayesian classification scheme is employed which involves switching between high-level-band filter and low-level-band filter to continuously resist different target maneuver turns. The simulation results are presented which demonstrate the effectiveness of the proposed method.
40

Galajda, Pavol, Alena Galajdova, Stanislav Slovak, Martin Pecovsky, Milos Drutarovsky, Marek Sukop and Ihab BA Samaneh. "Robot vision ultra-wideband wireless sensor in non-cooperative industrial environments". International Journal of Advanced Robotic Systems 15, no. 4 (01.07.2018): 172988141879576. http://dx.doi.org/10.1177/1729881418795767.

Abstract:
In this article, the ultra-wideband technology for localization and tracking of the robot gripper (behind the obstacles) in industrial environments is presented. We explore the possibilities of ultra-wideband radar sensor network employing the centralized data fusion method that can significantly improve tracking capabilities in a complex environment. In this article, we present ultra-wideband radar sensor network hardware demonstrator that uses a new wireless ultra-wideband sensor with an embedded controller to detect and track online or off-line movement of the robot gripper. This sensor uses M-sequence ultra-wideband radars front-end and low-cost powerful processors on a system on chip with the advanced RISC machines (ARM) architecture as a main signal processing block. The ARM-based single board computer ODROID-XU4 platform used in our ultra-wideband sensor can provide processing power for the preprocessing of received raw radar signals, algorithms for detection and estimation of target’s coordinates, and finally, compression of data sent to the data fusion center. Data streams of compressed target coordinates are sent from each sensor node to the data fusion center in the central node using standard the wireless local area network (WLAN) interface that is the feature of the ODROID-XU4 platform. The article contains experimental results from measurements where sensors and antennas are located behind the wall or opaque material. Experimental testing confirmed capability of real-time performance of developed ultra-wideband radar sensor network hardware and acceptable precision of software. The introduced modular architecture of ultra-wideband radar sensor network can be used for fast development and testing of new real-time localization and tracking applications in industrial environments.
41

Sun, Hui Qin. "Design of Indoor Large-Scale Multi-Target Precise Positioning and Tracking System". Advanced Materials Research 1049-1050 (October 2014): 1233–36. http://dx.doi.org/10.4028/www.scientific.net/amr.1049-1050.1233.

Abstract:
This paper aims to build an indoor large-scale multi-target precise positioning and tracking system. It reports in-depth research on vision-based optical tracking technology, inertial tracking technology and multi-sensor data fusion technology, addressing key technologies in graphics, imaging, data fusion and tracking, in order to develop a highly versatile, real-time and robust wide-range high-precision optical tracking system for indoor large-scale multi-target positioning and tracking.
42

Demars, Casey, Michael Roggemann, Adam Webb and Timothy Havens. "Target Localization and Tracking by Fusing Doppler Differentials from Cellular Emanations with a Multi-Spectral Video Tracker". Sensors 18, no. 11 (30.10.2018): 3687. http://dx.doi.org/10.3390/s18113687.

Abstract:
We present an algorithm for fusing data from a constellation of RF sensors detecting cellular emanations with the output of a multi-spectral video tracker to localize and track a target with a specific cell phone. The RF sensors measure the Doppler shift caused by the moving cellular emanation and then Doppler differentials between all sensor pairs are calculated. The multi-spectral video tracker uses a Gaussian mixture model to detect foreground targets and SIFT features to track targets through the video sequence. The data is fused by associating the Doppler differential from the RF sensors with the theoretical Doppler differential computed from the multi-spectral tracker output. The absolute difference and the root-mean-square difference are computed to associate the Doppler differentials from the two sensor systems. Performance of the algorithm was evaluated using synthetically generated datasets of an urban scene with multiple moving vehicles. The presented fusion algorithm correctly associates the cellular emanation with the corresponding video target for low measurement uncertainty and in the presence of favorable motion patterns. For nearly all objects the fusion algorithm has high confidence in associating the emanation with the correct multi-spectral target from the most probable background target.
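
The association step described above can be sketched in a few lines of Python: for each video-tracked target, a theoretical Doppler-differential vector (one value per RF sensor pair) is compared with the measured vector, and the target with the smallest RMS difference is associated with the emanation. All numbers below are invented for illustration.

```python
import numpy as np

# Hedged sketch of the association step only: pick the video-tracked target
# whose theoretical Doppler differentials best match the measured ones
# (smallest RMS difference). All values are invented for illustration.

measured = np.array([12.5, -3.1, 7.8, -9.4])          # Hz, one per RF sensor pair

theoretical = {
    "vehicle_A": np.array([12.0, -2.8, 8.1, -9.0]),   # from the multi-spectral tracker
    "vehicle_B": np.array([-4.0, 6.5, -1.2, 3.3]),
    "vehicle_C": np.array([20.5, -11.0, 14.2, -18.7]),
}

def associate(measured, theoretical):
    scores = {tid: float(np.sqrt(np.mean((measured - d) ** 2)))
              for tid, d in theoretical.items()}
    best = min(scores, key=scores.get)
    return best, scores

best, scores = associate(measured, theoretical)
print("RMS differences:", scores)
print("emanation associated with:", best)
```
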
43

Umetani, Tomohiro, Naomichi Kuga, Misahiro Tanaka, Masahiro Wada and Minoru Ito. "Sensor Fusion of Vision System and Thermal Imaging Sensor for Target Tracking". Proceedings of the ISCIE International Symposium on Stochastic Systems Theory and its Applications 2012 (05.05.2012): 165–68. http://dx.doi.org/10.5687/sss.2012.165.
44

Koch, Wolfgang. "On exploiting 'negative' sensor evidence for target tracking and sensor data fusion". Information Fusion 8, no. 1 (January 2007): 28–39. http://dx.doi.org/10.1016/j.inffus.2005.09.002.
45

Sheng, Xueli, Yang Chen, Longxiang Guo, Jingwei Yin and Xiao Han. "Multitarget Tracking Algorithm Using Multiple GMPHD Filter Data Fusion for Sonar Networks". Sensors 18, no. 10 (21.09.2018): 3193. http://dx.doi.org/10.3390/s18103193.

Abstract:
Multitarget tracking algorithms based on sonar usually run into detection uncertainty, complex channel and more clutters, which cause lower detection probability, single sonar sensors failing to measure when the target is in an acoustic shadow zone, and computational bottlenecks. This paper proposes a novel tracking algorithm based on multisensor data fusion to solve the above problems. Firstly, under more clutters and lower detection probability condition, a Gaussian Mixture Probability Hypothesis Density (GMPHD) filter with computational advantages was used to get local estimations. Secondly, this paper provided a maximum-detection capability multitarget track fusion algorithm to deal with the problems caused by low detection probability and the target being in acoustic shadow zones. Lastly, a novel feedback algorithm was proposed to improve the GMPHD filter tracking performance, which fed the global estimations as a random finite set (RFS). In the end, the statistical characteristics of OSPA were used as evaluation criteria in Monte Carlo simulations, which showed this algorithm’s performance against those sonar tracking problems. When the detection probability is 0.7, compared with the GMPHD filter, the OSPA mean of two sensor and three sensor fusion was decrease almost by 40% and 55%, respectively. Moreover, this algorithm successfully tracks targets in acoustic shadow zones.
46

Suhr, Jae Kyu, and Ho Gi Jung. "Sensor Fusion-Based Vacant Parking Slot Detection and Tracking". IEEE Transactions on Intelligent Transportation Systems 15, no. 1 (February 2014): 21–36. http://dx.doi.org/10.1109/tits.2013.2272100.
47

Dallil, Ahmed, Mourad Oussalah and Abdelaziz Ouldali. "Sensor Fusion and Target Tracking Using Evidential Data Association". IEEE Sensors Journal 13, no. 1 (January 2013): 285–93. http://dx.doi.org/10.1109/jsen.2012.2213892.
48

Nashashibi, Fawzi. "Vehicle tracking using a generic multi-sensor fusion approach". International Journal of Vehicle Information and Communication Systems 2, no. 1/2 (2009): 99. http://dx.doi.org/10.1504/ijvics.2009.027748.
49

Chen, Y., and Y. Rui. "Real-Time Speaker Tracking Using Particle Filter Sensor Fusion". Proceedings of the IEEE 92, no. 3 (March 2004): 485–94. http://dx.doi.org/10.1109/jproc.2003.823146.
50

Jia, Zhen, Arjuna Balasuriya and Subhash Challa. "Sensor fusion-based visual target tracking for autonomous vehicles". Artificial Life and Robotics 12, no. 1-2 (March 2008): 317–28. http://dx.doi.org/10.1007/s10015-007-0499-8.