Academic literature on the topic 'Sensor fusion'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Sensor fusion.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Sensor fusion"

1

Kim, Gon Woo, Ji Min Kim, Nosan Kwak, and Beom Hee Lee. "Hierarchical sensor fusion for building a probabilistic local map using active sensor modules." Robotica 26, no. 3 (May 2008): 307–22. http://dx.doi.org/10.1017/s026357470700392x.

Full text
Abstract:
An algorithm for three-level hierarchical sensor fusions has been proposed and applied to environment map building with enhanced accuracy and efficiency. The algorithm was realized through the two new types of sensor modules, which are composed of a halogen lamp-based active vision sensor and a semicircular ultrasonic (US) and infrared (IR) sensor system. In the first-level fusion, the US and IR sensor information is utilized in terms of the geometric characteristics of the sensor location. In the second-level fusion, the outputs from the US and IR sensors are combined with the sheet of halogen light through a proposed rule base. In the third-level fusion, local maps from the first- and second-level fusion are updated in a probabilistic way for a very accurate environment local map. A practical implementation has been carried out to demonstrate the efficiency and accuracy of the proposed hierarchical sensor fusion algorithm in environment map building.
APA, Harvard, Vancouver, ISO, and other styles
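A minimal sketch of the third-level (probabilistic map update) idea from the abstract above, written in Python. The inverse sensor model probabilities, the independence assumption between the ultrasonic (US) and infrared (IR) readings, and the log-odds formulation are illustrative assumptions, not the paper's actual models.

```python
import numpy as np

def log_odds(p):
    """Convert a probability to log-odds."""
    return np.log(p / (1.0 - p))

def update_cell(prior_occ, us_hit, ir_hit, p_us=0.7, p_ir=0.8):
    """Fuse one ultrasonic and one infrared reading into a single map cell.

    prior_occ     : prior probability that the cell is occupied
    us_hit/ir_hit : True if the sensor reported an obstacle in the cell
    p_us/p_ir     : assumed probabilities that a reported hit is correct
    """
    l = log_odds(prior_occ)
    # Independent inverse sensor models: a hit raises the log-odds, a miss lowers it.
    l += log_odds(p_us) if us_hit else log_odds(1.0 - p_us)
    l += log_odds(p_ir) if ir_hit else log_odds(1.0 - p_ir)
    return 1.0 / (1.0 + np.exp(-l))   # back to a probability

# Both sensors report an obstacle in a cell that starts at a 0.5 prior.
print(update_cell(0.5, us_hit=True, ir_hit=True))   # roughly 0.90
```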
2

Jalil Piran, Mohammad, Amjad Ali, and Doug Young Suh. "Fuzzy-Based Sensor Fusion for Cognitive Radio-Based Vehicular Ad Hoc and Sensor Networks." Mathematical Problems in Engineering 2015 (2015): 1–9. http://dx.doi.org/10.1155/2015/439272.

Full text
Abstract:
In wireless sensor networks, sensor fusion is employed to integrate the acquired data from diverse sensors to provide a unified interpretation. The best and most salient advantage of sensor fusion is to obtain high-level information in both statistical and definitive aspects, which cannot be attained by a single sensor. In this paper, we propose a novel sensor fusion technique based on fuzzy theory for our earlier proposed Cognitive Radio-based Vehicular Ad Hoc and Sensor Networks (CR-VASNET). In the proposed technique, we considered four input sensor readings (antecedents) and one output (consequent). The employed mobile nodes in CR-VASNET are supposed to be equipped with diverse sensors, which cater to our antecedent variables, for example, the Jerk, Collision Intensity, Temperature, and Inclination Degree. Crash_Severity is considered as the consequent variable. The processing and fusion of the diverse sensory signals are carried out by a fuzzy logic scenario. The accuracy and reliability of the proposed protocol, demonstrated by the simulation results, introduce it as an applicable system for reducing the casualty rate of vehicle crashes.
APA, Harvard, Vancouver, ISO, and other styles
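The fuzzy fusion described in the abstract above can be illustrated with a zero-order Sugeno-style sketch: four normalized antecedent readings are mapped to a crash-severity score through a small rule base. The membership functions, rule base, and normalization are assumptions chosen for brevity; the paper's actual fuzzy system differs.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    return float(np.clip(min((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0))

def crash_severity(jerk, collision_intensity, temperature, inclination):
    """Fuse four normalized readings (0..1) into a severity score (0..1)."""
    high = lambda x: tri(x, 0.4, 1.0, 1.6)    # membership in 'high'
    low = lambda x: tri(x, -0.6, 0.0, 0.6)    # membership in 'low'

    # Illustrative rules: firing strength = min of the antecedent memberships,
    # each rule outputs a crisp severity level (zero-order Sugeno).
    rules = [
        (min(high(jerk), high(collision_intensity)), 1.0),  # hard impact
        (min(high(temperature), high(inclination)), 0.7),   # fire / rollover risk
        (min(low(jerk), low(collision_intensity)), 0.1),    # no crash
    ]
    w = np.array([strength for strength, _ in rules])
    z = np.array([level for _, level in rules])
    return float(np.dot(w, z) / w.sum()) if w.sum() > 0 else 0.0

print(crash_severity(jerk=0.9, collision_intensity=0.8,
                     temperature=0.3, inclination=0.2))
```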
3

Abbas, Jabbar, Amin Al-Habaibeh, and Dai Zhong Su. "Sensor Fusion for Condition Monitoring System of End Milling Operations." Key Engineering Materials 450 (November 2010): 267–70. http://dx.doi.org/10.4028/www.scientific.net/kem.450.267.

Full text
Abstract:
This paper describes the utilisation of a multi-sensor fusion model using force, vibration, acoustic emission, strain and sound sensors for monitoring tool wear in end milling operations. The paper applies the ASPS (Automated Sensor and Signal Processing Selection) approach for signal processing and sensor selection [1]. The sensory signals were processed using different signal processing methods to create a wide range of Sensory Characteristic Features (SCFs). The sensitivity of these SCFs to tool wear is investigated. The results indicate that, using the suggested approach, the sensor fusion system is capable of detecting machining faults in comparison to a single sensor.
APA, Harvard, Vancouver, ISO, and other styles
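A possible reading of the sensitivity ranking mentioned above, in Python: compute a few Sensory Characteristic Features per signal window and rank them by correlation with the measured tool wear. The feature set and the use of Pearson correlation as the sensitivity measure are assumptions; the paper's own ASPS criterion may differ.

```python
import numpy as np

def features(signal):
    """A few illustrative Sensory Characteristic Features (SCFs) of one window."""
    return {
        "rms": np.sqrt(np.mean(signal ** 2)),
        "peak": np.max(np.abs(signal)),
        "std": np.std(signal),
        "mean_abs": np.mean(np.abs(signal)),
    }

def rank_scfs(windows, tool_wear):
    """Rank SCFs by |Pearson correlation| with tool wear across cutting tests."""
    table = {name: [] for name in features(windows[0])}
    for w in windows:
        for name, value in features(w).items():
            table[name].append(value)
    scores = {name: abs(np.corrcoef(vals, tool_wear)[0, 1])
              for name, vals in table.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Synthetic example: the vibration amplitude grows with flank wear.
rng = np.random.default_rng(0)
wear = np.linspace(0.05, 0.4, 10)
windows = [a * rng.standard_normal(1000) for a in 1.0 + 5.0 * wear]
print(rank_scfs(windows, wear))
```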
4

Duan, Bo. "Sensor and sensor fusion technology in autonomous vehicles." Applied and Computational Engineering 52, no. 1 (March 27, 2024): 132–37. http://dx.doi.org/10.54254/2755-2721/52/20241470.

Full text
Abstract:
The perception and navigation of autonomous vehicles heavily rely on the utilization of sensor technology and the integration of sensor fusion techniques, which play an essential role in ensuring a secure and proficient understanding of the vehicle's environment. This paper highlights the significance of sensors in autonomous vehicles and how sensor fusion techniques enhance their capabilities. Firstly, the paper introduces the different types of sensors commonly used in autonomous vehicles and explains their principles of operation, strengths, and limitations in capturing essential information about the vehicle's environment. Next, the paper discusses various sensor fusion algorithms, such as Kalman filters and particle filters. Furthermore, the paper explores the challenges associated with sensor fusion and addresses the issue of handling sensor failures or uncertainties. The benefits of sensor fusion technology in autonomous vehicles are also presented. These include improved perception of the environment, enhanced object recognition and tracking, better trajectory planning, and enhanced safety through redundancy and fault tolerance. Lastly, the paper discusses the advancements and highlights the integration of artificial intelligence and machine learning techniques to optimize sensor fusion algorithms and improve the overall autonomy of the vehicle. Following thorough analysis, the deduction can be made that sensor and sensor fusion technology assume a critical function in facilitating efficient and secure autonomous vehicle navigation within intricate surroundings.
APA, Harvard, Vancouver, ISO, and other styles
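Since the abstract above names Kalman filters as a core fusion algorithm, here is a minimal one-dimensional example in Python: a constant-velocity Kalman filter sequentially fuses radar and camera range measurements of a lead vehicle. The noise variances, sampling time, and motion model are illustrative assumptions.

```python
import numpy as np

def kalman_fuse(radar_z, camera_z, dt=0.1, r_radar=0.5**2, r_camera=2.0**2, q=0.1):
    """Fuse two range-measurement streams with a constant-velocity Kalman filter."""
    x = np.array([radar_z[0], 0.0])            # state: [range, range-rate]
    P = np.eye(2) * 10.0
    F = np.array([[1.0, dt], [0.0, 1.0]])      # constant-velocity motion model
    H = np.array([[1.0, 0.0]])                 # both sensors observe range only
    Q = q * np.eye(2)
    estimates = []
    for zr, zc in zip(radar_z, camera_z):
        x, P = F @ x, F @ P @ F.T + Q          # predict
        for z, r in ((zr, r_radar), (zc, r_camera)):   # sequential measurement updates
            S = H @ P @ H.T + r
            K = P @ H.T / S
            x = x + (K * (z - H @ x)).ravel()
            P = (np.eye(2) - K @ H) @ P
        estimates.append(x[0])
    return np.array(estimates)

true_range = 30.0 - 0.5 * np.arange(50) * 0.1     # lead vehicle slowly closing in
rng = np.random.default_rng(1)
est = kalman_fuse(true_range + rng.normal(0, 0.5, 50),
                  true_range + rng.normal(0, 2.0, 50))
print(est[-5:])
```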
5

Yeong, De Jong, Gustavo Velasco-Hernandez, John Barry, and Joseph Walsh. "Sensor and Sensor Fusion Technology in Autonomous Vehicles: A Review." Sensors 21, no. 6 (March 18, 2021): 2140. http://dx.doi.org/10.3390/s21062140.

Full text
Abstract:
With the significant advancement of sensor and communication technology and the reliable application of obstacle detection techniques and algorithms, automated driving is becoming a pivotal technology that can revolutionize the future of transportation and mobility. Sensors are fundamental to the perception of vehicle surroundings in an automated driving system, and the use and performance of multiple integrated sensors can directly determine the safety and feasibility of automated driving vehicles. Sensor calibration is the foundation block of any autonomous system and its constituent sensors and must be performed correctly before sensor fusion and obstacle detection processes may be implemented. This paper evaluates the capabilities and the technical performance of sensors which are commonly employed in autonomous vehicles, primarily focusing on a large selection of vision cameras, LiDAR sensors, and radar sensors and the various conditions in which such sensors may operate in practice. We present an overview of the three primary categories of sensor calibration and review existing open-source calibration packages for multi-sensor calibration and their compatibility with numerous commercial sensors. We also summarize the three main approaches to sensor fusion and review current state-of-the-art multi-sensor fusion techniques and algorithms for object detection in autonomous driving applications. The current paper, therefore, provides an end-to-end review of the hardware and software methods required for sensor fusion object detection. We conclude by highlighting some of the challenges in the sensor fusion field and propose possible future research directions for automated driving systems.
APA, Harvard, Vancouver, ISO, and other styles
6

Ishikawa, Masatoshi, and Hiro Yamasaki. "Special issue "Sensor Fusion". Sensor Fusion Project." Journal of the Robotics Society of Japan 12, no. 5 (1994): 650–55. http://dx.doi.org/10.7210/jrsj.12.650.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Ye, Chunxuan, Zinan Lin, Alpaslan Demir, and Yan Li. "Performance Analysis of Different Types of Sensor Networks for Cognitive Radios." Journal of Electrical and Computer Engineering 2012 (2012): 1–15. http://dx.doi.org/10.1155/2012/632762.

Full text
Abstract:
We consider the problem of using multiple sensors to detect whether a certain spectrum is occupied or not. Each sensor sends its spectrum sensing result to a data fusion center, which combines all the results for an overall decision. With the existence of wireless fading on the channels from sensors to data fusion center, we examine three different mechanisms on the transmissions from sensors to data fusion center: (1) direct transmissions; (2) transmissions with the assistance of relays; and (3) transmissions with the assistance of an intermediate fusion helper which fuses the sensing results from the sensors and sends the fused result to the data fusion center. For each mechanism, we analyze the correct probability of the overall decision made by the data fusion center. Our evaluation establishes that a sensor network with an intermediate fusion helper performs almost as well as a sensor network with relays, while providing energy and spectral advantages. Both networks significantly outperform the sensor network without relay or intermediate fusion helper. Such analysis facilitates the design of sensor networks. Furthermore, we generalize this evaluation to sensor networks with an arbitrary number of sensors and to sensor networks applying various information combining rules. Our simulations validate the analytical results.
APA, Harvard, Vancouver, ISO, and other styles
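A small Monte Carlo sketch of the direct-transmission case analysed above: each sensor makes a binary spectrum-occupancy decision, the report may be flipped by the fading channel, and the data fusion center applies a majority rule. The detection, false-alarm, and channel-error probabilities are illustrative assumptions.

```python
import numpy as np

def correct_decision_prob(n_sensors=5, p_d=0.9, p_f=0.1, p_err=0.05,
                          occupied=True, trials=100_000, seed=0):
    """Estimate the probability that a majority-vote fusion center decides correctly."""
    rng = np.random.default_rng(seed)
    p_one = p_d if occupied else p_f                   # P(local decision = 'occupied')
    local = rng.random((trials, n_sensors)) < p_one
    flips = rng.random((trials, n_sensors)) < p_err    # reporting-channel bit errors
    received = local ^ flips
    decision = received.sum(axis=1) > n_sensors / 2    # majority combining rule
    return float(np.mean(decision == occupied))

print(correct_decision_prob(occupied=True))    # P(correct | spectrum occupied)
print(correct_decision_prob(occupied=False))   # P(correct | spectrum idle)
```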
8

Senel, Numan, Gordon Elger, Klaus Kefferpütz, and Kristina Doycheva. "Multi-Sensor Data Fusion for Real-Time Multi-Object Tracking." Processes 11, no. 2 (February 7, 2023): 501. http://dx.doi.org/10.3390/pr11020501.

Full text
Abstract:
Sensor data fusion is essential for environmental perception within smart traffic applications. By using multiple sensors cooperatively, the accuracy and probability of the perception are increased, which is crucial for critical traffic scenarios or under bad weather conditions. In this paper, a modular real-time capable multi-sensor fusion framework is presented and tested to fuse data on the object list level from distributed automotive sensors (cameras, radar, and LiDAR). The modular multi-sensor fusion architecture receives an object list (untracked objects) from each sensor. The fusion framework combines classical data fusion algorithms, as it contains a coordinate transformation module, an object association module (Hungarian algorithm), an object tracking module (unscented Kalman filter), and a movement compensation module. Due to the modular design, the fusion framework is adaptable and does not rely on the number of sensors or their types. Moreover, the method continues to operate because of this adaptable design in case of an individual sensor failure. This is an essential feature for safety-critical applications. The architecture targets environmental perception in challenging time-critical applications. The developed fusion framework is tested using simulation and public domain experimental data. Using the developed framework, sensor fusion is obtained well below 10 milliseconds of computing time using an AMD Ryzen 7 5800H mobile processor and the Python programming language. Furthermore, the object-level multi-sensor approach enables the detection of changes in the extrinsic calibration of the sensors and potential sensor failures. A concept was developed to use the multi-sensor framework to identify sensor malfunctions. This feature will become extremely important in ensuring the functional safety of the sensors for autonomous driving.
APA, Harvard, Vancouver, ISO, and other styles
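A sketch of the object-association step in the framework described above, using the Hungarian algorithm on a Euclidean cost matrix via SciPy. The 2-D object representation and the gating threshold are assumptions; the full framework additionally performs coordinate transformation, unscented Kalman filter tracking, and movement compensation.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def associate(tracks, detections, gate=3.0):
    """Match tracked object positions to new detections (both Nx2 arrays).

    Returns (track_index, detection_index) pairs whose Euclidean distance
    is below the gating threshold; everything else stays unmatched.
    """
    cost = np.linalg.norm(tracks[:, None, :] - detections[None, :, :], axis=2)
    rows, cols = linear_sum_assignment(cost)          # Hungarian algorithm
    return [(r, c) for r, c in zip(rows, cols) if cost[r, c] < gate]

tracks = np.array([[10.0, 2.0], [25.0, -1.0], [40.0, 5.0]])
detections = np.array([[24.6, -0.8], [10.3, 2.2], [80.0, 0.0]])
print(associate(tracks, detections))   # [(0, 1), (1, 0)]; the distant detection stays unmatched
```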
9

Vervečka, Martynas. "Sensor Network Data Fusion Methods." Mokslas - Lietuvos ateitis 2, no. 1 (February 28, 2010): 50–53. http://dx.doi.org/10.3846/mla.2010.011.

Full text
Abstract:
Sensor network data fusion is widely used in warfare, in areas such as automatic target recognition, battlefield surveillance, automatic vehicle control, multiple target surveillance, etc. Examples of non-military use are medical equipment status monitoring and the intelligent home. The paper describes sensor network topologies, the advantages of sensor networks over isolated sensors, the most common network topologies, and their advantages and disadvantages.
APA, Harvard, Vancouver, ISO, and other styles
10

Granshaw, Stuart I. "Sensor fusion." Photogrammetric Record 35, no. 169 (March 2020): 6–9. http://dx.doi.org/10.1111/phor.12311.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Dissertations / Theses on the topic "Sensor fusion"

1

Kangerud, Jim. "Sensor Fusion : Applying sensor fusion in a district heating substation." Thesis, Blekinge Tekniska Högskola, Avdelningen för interaktion och systemdesign, 2005. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-4884.

Full text
Abstract:
Many machines these days have sensors to collect information from the world they inhabit. The correctness of this information is crucial for correct operation. However, sensors are not always reliable, since they are sometimes affected by some type of noise and thus give incorrect information. Another drawback might be a lack of information due to a shortage of existing sensors. Sensor fusion tries to overcome these drawbacks by integrating or combining information from multiple sensors. The heating of a building is a slow and time-consuming process, i.e. neither the flow nor the energy consumption is subject to drastic changes. The tap water system, i.e. the heating of tap water, can on the other hand be the source of severe changes in both flow and energy consumption. This is because the flow in the tap water system is stochastic: at any given time a tap may be opened or closed and thereby drastically change the flow. The purpose of this thesis is to investigate whether it is possible to use sensor fusion to get accurate continuous flow values from a district heating substation. This is done by integrating different sensor fusion algorithms in a district heating substation simulator.
APA, Harvard, Vancouver, ISO, and other styles
2

Barro, Alessandro. "Indirect TPMS improvement: sensor fusion with ultrasound parking sensors." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2021. http://amslaurea.unibo.it/23765/.

Full text
Abstract:
Pre-feasibility analysis on optimizing the performance of the indirect tyre pressure monitoring system through sensor fusion with a new generation of ultrasound parking sensors: from the initial idea to the development of macro project specifications and a macro business case, with a definition of the possible new scenario in terms of performance, cost, and perceived quality.
APA, Harvard, Vancouver, ISO, and other styles
3

Lundquist, Christian. "Sensor Fusion for Automotive Applications." Doctoral thesis, Linköpings universitet, Reglerteknik, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-71594.

Full text
Abstract:
Mapping stationary objects and tracking moving targets are essential for many autonomous functions in vehicles. In order to compute the map and track estimates, sensor measurements from radar, laser and camera are used together with the standard proprioceptive sensors present in a car. By fusing information from different types of sensors, the accuracy and robustness of the estimates can be increased. Different types of maps are discussed and compared in the thesis. In particular, road maps make use of the fact that roads are highly structured, which allows relatively simple and powerful models to be employed. It is shown how the information of the lane markings, obtained by a front looking camera, can be fused with inertial measurement of the vehicle motion and radar measurements of vehicles ahead to compute a more accurate and robust road geometry estimate. Further, it is shown how radar measurements of stationary targets can be used to estimate the road edges, modeled as polynomials and tracked as extended targets. Recent advances in the field of multiple target tracking lead to the use of finite set statistics (FISST) in a set theoretic approach, where the targets and the measurements are treated as random finite sets (RFS). The first order moment of a RFS is called probability hypothesis density (PHD), and it is propagated in time with a PHD filter. In this thesis, the PHD filter is applied to radar data for constructing a parsimonious representation of the map of the stationary objects around the vehicle. Two original contributions, which exploit the inherent structure in the map, are proposed. A data clustering algorithm is suggested to structure the description of the prior, considerably improving the update in the PHD filter. Improvements in the merging step further simplify the map representation. When it comes to tracking moving targets, the focus of this thesis is on extended targets, i.e., targets which potentially may give rise to more than one measurement per time step. An implementation of the PHD filter, which was proposed to handle data obtained from extended targets, is presented. An approximation is proposed in order to limit the number of hypotheses. Further, a framework to track the size and shape of a target is introduced. The method is based on measurement generating points on the surface of the target, which are modeled by an RFS. Finally, an efficient and novel Bayesian method is proposed for approximating the tire radii of a vehicle based on particle filters and the marginalization concept. This is done under the assumption that a change in the tire radius is caused by a change in tire pressure, thus obtaining an indirect tire pressure monitoring system. The approaches presented in this thesis have all been evaluated on real data from both freeways and rural roads in Sweden.
SEFS -- IVSS
VR - ETT
APA, Harvard, Vancouver, ISO, and other styles
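A simplified sketch of the indirect tire pressure idea from the thesis above: the wheel's angular speed times its radius should match a reference vehicle speed, so a persistent drop in the estimated radius hints at pressure loss. A plain bootstrap particle filter is used here instead of the thesis's marginalized formulation, and all noise levels are assumptions.

```python
import numpy as np

def estimate_radius(omega, v_ref, n_particles=500, r0=0.30,
                    sigma_v=0.3, sigma_walk=1e-4, seed=2):
    """Track the tire radius r [m] from wheel speed omega [rad/s] and vehicle speed v_ref [m/s]."""
    rng = np.random.default_rng(seed)
    particles = r0 + 0.01 * rng.standard_normal(n_particles)
    estimates = []
    for w, v in zip(omega, v_ref):
        particles += sigma_walk * rng.standard_normal(n_particles)   # slow random walk on r
        # Likelihood: how well omega * r explains the reference speed
        weights = np.exp(-0.5 * ((v - w * particles) / sigma_v) ** 2)
        weights /= weights.sum()
        estimates.append(float(np.dot(weights, particles)))
        idx = rng.choice(n_particles, size=n_particles, p=weights)   # multinomial resampling
        particles = particles[idx]
    return np.array(estimates)

# Synthetic data: true radius 0.31 m, vehicle travelling at about 20 m/s
rng = np.random.default_rng(3)
v = 20.0 + 0.1 * rng.standard_normal(200)
omega = v / 0.31 + 0.5 * rng.standard_normal(200)
print(estimate_radius(omega, v)[-1])   # close to 0.31
```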
4

Feng, Shimin. "Sensor fusion with Gaussian processes." Thesis, University of Glasgow, 2014. http://theses.gla.ac.uk/5626/.

Full text
Abstract:
This thesis presents a new approach to multi-rate sensor fusion for (1) user matching and (2) position stabilisation and lag reduction. The Microsoft Kinect sensor and the inertial sensors in a mobile device are fused with a Gaussian Process (GP) prior method. We present a Gaussian Process prior model-based framework for multisensor data fusion and explore the use of this model for fusing mobile inertial sensors and an external position sensing device. The Gaussian Process prior model provides a principled mechanism for incorporating the low-sampling-rate position measurements and the high-sampling-rate derivatives in multi-rate sensor fusion, which takes account of the uncertainty of each sensor type. We explore the complementary properties of the Kinect sensor and the built-in inertial sensors in a mobile device and apply the GP framework for sensor fusion in the mobile human-computer interaction area. The Gaussian Process prior model-based sensor fusion is presented as a principled probabilistic approach to dealing with position uncertainty and the lag of the system, which are critical for indoor augmented reality (AR) and other location-aware sensing applications. The sensor fusion helps increase the stability of the position and reduce the lag. This is of great benefit for improving the usability of a human-computer interaction system. We develop two applications using the novel and improved GP prior model. (1) User matching and identification. We apply the GP model to identify individual users, by matching the observed Kinect skeletons with the sensed inertial data from their mobile devices. (2) Position stabilisation and lag reduction in a spatially aware display application for user performance improvement. We conduct a user study. Experimental results show the improved accuracy of target selection, and reduced delay from the sensor fusion system, allowing the users to acquire the target more rapidly, and with fewer errors in comparison with the Kinect filtered system. They also reported improved performance in subjective questions. The two applications can be combined seamlessly in a proxemic interaction system as identification of people and their positions in a room-sized environment plays a key role in proxemic interactions.
APA, Harvard, Vancouver, ISO, and other styles
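A compact Gaussian Process regression sketch related to the thesis above: the posterior mean of a GP with a squared-exponential kernel smooths low-rate, noisy position samples. Fusing the high-rate inertial derivatives requires derivative observations in the covariance function, which is omitted here; the kernel and its hyperparameters are assumptions.

```python
import numpy as np

def rbf(a, b, length=0.3, variance=1.0):
    """Squared-exponential covariance between two sets of 1-D inputs."""
    d = a[:, None] - b[None, :]
    return variance * np.exp(-0.5 * (d / length) ** 2)

def gp_posterior_mean(t_train, y_train, t_query, noise=0.05):
    """GP posterior mean for noisy observations y_train taken at times t_train."""
    K = rbf(t_train, t_train) + noise**2 * np.eye(len(t_train))
    alpha = np.linalg.solve(K, y_train)
    return rbf(t_query, t_train) @ alpha

# Low-sampling-rate position measurements of a smooth motion
rng = np.random.default_rng(4)
t = np.linspace(0.0, 2.0, 15)
y = np.sin(np.pi * t) + 0.05 * rng.standard_normal(t.size)
t_fine = np.linspace(0.0, 2.0, 200)
print(gp_posterior_mean(t, y, t_fine)[:5])
```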
5

Howard, Shaun Michael. "Deep Learning for Sensor Fusion." Case Western Reserve University School of Graduate Studies / OhioLINK, 2017. http://rave.ohiolink.edu/etdc/view?acc_num=case1495751146601099.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Sobrinho, Carlos Eduardo dos Reis Rodrigues. "Sensor fusion in humanoid robots." Master's thesis, Universidade de Aveiro, 2012. http://hdl.handle.net/10773/11052.

Full text
Abstract:
Master's degree in Electronics and Telecommunications Engineering
The technology of sensor fusion combines pieces of information coming from different sources/sensors, resulting in enhanced overall information accuracy when compared with systems that rely on individual sources/sensors. Different sensor fusion methods have been developed in order to optimize the overall system output. End results include the inertial unit, which fuses two different sensor families to give a more accurate/better estimate of the sensory data, and the self-localization of the robot, which should be able to evaluate its position and consequently the position of its team members. A walk-through, from the algorithm phase to the implementation, is given in this thesis, along with some mathematical background necessary to comprehend the concepts introduced and a description of the auxiliary tools that were built for the Portuguese Team to help accomplish the objective: a first presence in the Standard Platform League at RoboCup 2012.
APA, Harvard, Vancouver, ISO, and other styles
7

Brandimarti, Alberto. "Sensor Data Fusion e applicazioni." Bachelor's thesis, Alma Mater Studiorum - Università di Bologna, 2014. http://amslaurea.unibo.it/6620/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Holmberg, Per. "Sensor Fusion with Coordinated Mobile Robots." Thesis, Linköping University, Department of Electrical Engineering, 2003. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-1717.

Full text
Abstract:

Robust localization is a prerequisite for mobile robot autonomy. In many situations the GPS signal is not available and thus an additional localization system is required. A simple approach is to apply localization based on dead reckoning by use of wheel encoders, but it results in large estimation errors. With exteroceptive sensors such as a laser range finder, natural landmarks in the environment of the robot can be extracted from raw range data. Landmarks are extracted with the Hough transform and a recursive line segment algorithm. By applying data association and Kalman filtering along with process models, the landmarks can be used in combination with wheel encoders for estimating the global position of the robot. If several robots can cooperate, better position estimates are to be expected, because robots can be seen as mobile landmarks and one robot can supervise the movement of another. The centralized Kalman filter presented in this master thesis systematically treats robots and extracted landmarks such that benefits from several robots are utilized. Experiments in different indoor environments with two different robots show that long distances can be traveled while the positional uncertainty is kept low. The benefit from cooperating robots in the sense of reduced positional uncertainty is also shown in an experiment.

Except for localization algorithms a typical autonomous robot task in the form of change detection is solved. The change detection method, which requires robust localization, is aimed to be used for surveillance. The implemented algorithm accounts for measurement- and positional uncertainty when determining whether something in the environment has changed. Consecutive true changes as well as sporadic false changes are detected in an illustrative experiment.

APA, Harvard, Vancouver, ISO, and other styles
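A small Hough-transform sketch of the landmark-extraction step described above: 2-D laser points vote in a (theta, rho) accumulator and the strongest line is returned. The bin sizes and single-peak extraction are simplifications; the thesis combines this with a recursive line-segment algorithm and Kalman filtering.

```python
import numpy as np

def hough_strongest_line(points, n_theta=180, rho_res=0.05):
    """Return (theta, rho) of the strongest line through 2-D points (Nx2 array)."""
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    rho = points[:, 0:1] * np.cos(thetas) + points[:, 1:2] * np.sin(thetas)
    rho_max = np.abs(rho).max() + rho_res
    acc = np.zeros((n_theta, int(2 * rho_max / rho_res) + 2), dtype=int)
    rho_idx = np.round((rho + rho_max) / rho_res).astype(int)
    for j in range(n_theta):                       # vote per angle bin
        np.add.at(acc[j], rho_idx[:, j], 1)
    t, r = np.unravel_index(np.argmax(acc), acc.shape)
    return thetas[t], r * rho_res - rho_max

# Range scan of a straight wall satisfying x*cos(45 deg) + y*sin(45 deg) = 2
rng = np.random.default_rng(5)
s = rng.uniform(-3.0, 3.0, 200)
wall = np.column_stack([2 / np.sqrt(2) + s, 2 / np.sqrt(2) - s])
theta, rho = hough_strongest_line(wall + 0.01 * rng.standard_normal((200, 2)))
print(np.degrees(theta), rho)   # about 45 degrees and rho close to 2
```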
9

Lundquist, Christian. "Automotive Sensor Fusion for Situation Awareness." Licentiate thesis, Linköping University, Linköping University, Automatic Control, 2009. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-51226.

Full text
Abstract:

The use of radar and camera for situation awareness is gaining popularity in automotive safety applications. In this thesis situation awareness consists of accurate estimates of the ego vehicle's motion, the position of the other vehicles and the road geometry. By fusing information from different types of sensors, such as radar, camera and inertial sensor, the accuracy and robustness of those estimates can be increased.

Sensor fusion is the process of using information from several different sensors to compute an estimate of the state of a dynamic system, that in some sense is better than it would be if the sensors were used individually. Furthermore, the resulting estimate is in some cases only obtainable through the use of data from different types of sensors. A systematic approach to handle sensor fusion problems is provided by model based state estimation theory. The systems discussed in this thesis are primarily dynamic and they are modeled using state space models. A measurement model is used to describe the relation between the state variables and the measurements from the different sensors. Within the state estimation framework a process model is used to describe how the state variables propagate in time. These two models are of major importance for the resulting state estimate and are therefore given much attention in this thesis. One example of a process model is the single track vehicle model, which is used to model the ego vehicle's motion. In this thesis it is shown how the estimate of the road geometry obtained directly from the camera information can be improved by fusing it with the estimates of the other vehicles' positions on the road and the estimate of the radius of the ego vehicle's currently driven path.

The positions of stationary objects, such as guardrails, lampposts and delineators are measured by the radar. These measurements can be used to estimate the border of the road. Three conceptually different methods to represent and derive the road borders are presented in this thesis. Occupancy grid mapping discretizes the map surrounding the ego vehicle and the probability of occupancy is estimated for each grid cell. The second method applies a constrained quadratic program in order to estimate the road borders, which are represented by two polynomials. The third method associates the radar measurements to extended stationary objects and tracks them as extended targets.

The approaches presented in this thesis have all been evaluated on real data from both freeways and rural roads in Sweden.


IVSS - SEFS
APA, Harvard, Vancouver, ISO, and other styles
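A log-odds occupancy-grid sketch corresponding to the first road-border method mentioned above: each radar detection of a stationary object raises the occupancy of its cell and lowers the cells along the ray towards it. The grid resolution and inverse sensor model values are assumptions.

```python
import numpy as np

class OccupancyGrid:
    """Minimal log-odds occupancy grid centred on the ego vehicle."""

    def __init__(self, size=100, resolution=0.5, l_occ=0.85, l_free=-0.4):
        self.grid = np.zeros((size, size))   # log-odds, 0 means unknown (p = 0.5)
        self.res = resolution
        self.l_occ, self.l_free = l_occ, l_free
        self.origin = size // 2

    def _cell(self, x, y):
        return (self.origin + int(round(x / self.res)),
                self.origin + int(round(y / self.res)))

    def update(self, sensor_xy, hit_xy):
        """Integrate one detection: free space along the ray, occupied at the hit."""
        sensor, hit = np.asarray(sensor_xy, float), np.asarray(hit_xy, float)
        hit_cell = self._cell(*hit)
        n = max(int(np.linalg.norm(hit - sensor) / self.res), 1)
        for t in np.linspace(0.0, 1.0, n, endpoint=False):
            cell = self._cell(*(sensor + t * (hit - sensor)))
            if cell != hit_cell:                 # do not mark the hit cell as free
                self.grid[cell] += self.l_free
        self.grid[hit_cell] += self.l_occ

    def probabilities(self):
        return 1.0 / (1.0 + np.exp(-self.grid))

grid = OccupancyGrid()
grid.update(sensor_xy=(0.0, 0.0), hit_xy=(10.0, 4.0))   # e.g. a guardrail reflection
print(grid.probabilities()[grid._cell(10.0, 4.0)])      # about 0.70 after one hit
```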
10

Ehsanibenafati, Aida. "Visualization Tool for Sensor Data Fusion." Thesis, Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-5677.

Full text
Abstract:
In recent years researchers have focused on the development of techniques for multi-sensor data fusion systems. Data fusion systems process data from multiple sensors to develop improved estimates of the position, velocity, attributes and identity of entities such as the targets or entities of interest. Visualizing sensor data, from fused data down to raw data from each sensor, helps analysts interpret the data and assess the sensor data fusion platform, an evolving situation or threats. Immersive visualization has emerged as an ideal solution for the exploration of sensor data and provides opportunities for improvement in multi-sensor data fusion. The thesis aims to investigate possibilities of applying information visualization to the sensor data fusion platform at Volvo. A visualization prototype is also developed to enable multiple users to interactively visualize the Sensor Data Fusion platform in real time, mainly in order to demonstrate, evaluate and analyze the platform functionality. In this industrial study two research methodologies were used: a case study and an experiment for evaluating the results. First, a case study was conducted in order to find the best visualization technique for visualizing the sensor data fusion platform. Second, an experiment was conducted to evaluate the usability of the prototype that had been developed and to make sure the user requirements were met. The visualization tool enabled us to study the effectiveness and efficiency of the visualization techniques used. The results confirm that the visualization method used is effective and efficient for visualizing the sensor data fusion platform.
APA, Harvard, Vancouver, ISO, and other styles

Books on the topic "Sensor fusion"

1

Koch, Wolfgang. Tracking and Sensor Data Fusion. Berlin, Heidelberg: Springer Berlin Heidelberg, 2014. http://dx.doi.org/10.1007/978-3-642-39271-9.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Feraille, Olivier. Optimal sensor fusion for change detection. Manchester: UMIST, 1994.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
3

Hager, Gregory D. Task-Directed Sensor Fusion and Planning. Boston, MA: Springer US, 1990. http://dx.doi.org/10.1007/978-1-4613-1545-2.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Raol, J. R. Multi-sensor data fusion with MATLAB. Boca Raton: Taylor & Francis, 2010.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
5

Raol, J. R. Multi-sensor data fusion with MATLAB. Boca Raton: CRC Press, 2010.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
6

Zhang, Xinyu, Jun Li, Zhiwei Li, Huaping Liu, Mo Zhou, Li Wang, and Zhenhong Zou. Multi-sensor Fusion for Autonomous Driving. Singapore: Springer Nature Singapore, 2023. http://dx.doi.org/10.1007/978-981-99-3280-1.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Knovel (Firm), ed. Multi-sensor data fusion: An introduction. Berlin: Springer Verlag, 2007.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
8

Loffeld, Otmar, Society of Photo-Optical Instrumentation Engineers, European Optical Society, and Commission of the European Communities, Directorate-General for Science, Research, and Development, eds. Sensors, sensor systems, and sensor data processing: June 16–17, 1997, Munich, FRG. Bellingham, Wash., USA: SPIE, 1997.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
9

Klein, Lawrence A. Sensor and data fusion concepts and applications. Bellingham, Wash., USA: SPIE Optical Engineering Press, 1993.

Find full text
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "Sensor fusion"

1

Varshney, Pramod K. "Sensor Fusion." In Computer Vision, 1–3. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-03243-2_301-1.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Varshney, Pramod K. "Sensor Fusion." In Computer Vision, 719–21. Boston, MA: Springer US, 2014. http://dx.doi.org/10.1007/978-0-387-31439-6_301.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Muri, Harald I., Markus Wahl, Jacob J. Lamb, Rolf K. Snilsberg, and Dag R. Hjelme. "Sensor Fusion." In Micro-Optics and Energy, 53–57. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-43676-6_5.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Stanley, Michael, and Jongmin Lee. "Sensor Fusion." In Sensor Analysis for the Internet of Things, 29–63. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-031-01526-7_3.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Varshney, Pramod K. "Sensor Fusion." In Computer Vision, 1134–36. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-63416-2_301.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Fan, Shuxiang, and Changying Li. "Sensor Fusion." In Encyclopedia of Smart Agriculture Technologies, 1–15. Cham: Springer International Publishing, 2023. http://dx.doi.org/10.1007/978-3-030-89123-7_142-1.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Fan, Shuxiang, and Changying Li. "Sensor Fusion." In Encyclopedia of Digital Agricultural Technologies, 1224–38. Cham: Springer International Publishing, 2023. http://dx.doi.org/10.1007/978-3-031-24861-0_142.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Fan, Shuxiang, and Changying Li. "Sensor Fusion." In Encyclopedia of Smart Agriculture Technologies, 1–15. Cham: Springer International Publishing, 2023. http://dx.doi.org/10.1007/978-3-030-89123-7_142-2.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Subramanian, Rajesh. "Additional Sensors and Sensor Fusion." In Build Autonomous Mobile Robot from Scratch using ROS, 457–96. Berkeley, CA: Apress, 2023. http://dx.doi.org/10.1007/978-1-4842-9645-5_9.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Yang, Guang-Zhong, Javier Andreu-Perez, Xiaopeng Hu, and Surapa Thiemjarus. "Multi-sensor Fusion." In Body Sensor Networks, 301–54. London: Springer London, 2014. http://dx.doi.org/10.1007/978-1-4471-6374-9_8.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Sensor fusion"

1

Yuan, Jane Xiaojing, and Fernando Figueroa. "Intuitive Intelligent Sensor Fusion With Highly Autonomous Sensors." In ASME 2001 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 2001. http://dx.doi.org/10.1115/imece2001/dsc-24502.

Full text
Abstract:
The objective of sensor fusion is to synergistically combine different sources of sensory information into one representational format to provide more complete and precise interpretation of the system. A generic sensor fusion framework based on a highly autonomous sensor (HAS) model is presented. The framework provides freedom to choose different data fusion methods and combine them together to achieve better performance. In the context of HAS's, this paper describes a hierarchical decentralized sensor-fusion approach based on a qualitative theory to interpret measurements, and on qualitative procedures to reason and make decisions based on the measurement interpretations. In this manner, heuristic fusion methods are applied at a high-qualitative level as well as at a numerical level when necessary. This approach implements intuitive (effective) methods to monitor, diagnose, and compensate processes/systems and their sensors.
APA, Harvard, Vancouver, ISO, and other styles
2

Wen, Yao-Jung, Alice M. Agogino, and Kai Goebel. "Fuzzy Validation and Fusion for Wireless Sensor Networks." In ASME 2004 International Mechanical Engineering Congress and Exposition. ASMEDC, 2004. http://dx.doi.org/10.1115/imece2004-60964.

Full text
Abstract:
Miniaturized, distributed, networked sensors — called motes — promise to be smaller, less expensive and more versatile than other sensing alternatives. While these motes may have less individual reliability, high accuracy for the overall system is still desirable. Sensor validation and fusion algorithms provide a mechanism to extract pertinent information from massively sensed data and identify incipient sensor failures. Fuzzy approaches have proven to be effective and robust in challenging sensor validation and fusion applications. The algorithm developed in this paper — called mote-FVF (fuzzy validation and fusion) — uses a fuzzy approach to define the correlation among sensor readings, assign a confidence value to each of them, and perform a fused weighted average. A sensor network implementing mote-FVF for monitoring the illuminance in a dimmable fluorescent lighting environment empirically demonstrates the timely response of the algorithm to sudden changes in normal operating conditions while correctly isolating faulty sensor readings.
APA, Harvard, Vancouver, ISO, and other styles
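A sketch of the validation-and-fusion idea above: each mote reading receives a confidence value from its fuzzy agreement with the other readings, and the fused value is a confidence-weighted average. The Gaussian agreement function and its width stand in for the paper's fuzzy correlation definition and are assumptions.

```python
import numpy as np

def fuzzy_validate_and_fuse(readings, width=50.0):
    """Fuse redundant sensor readings while down-weighting outliers.

    readings : illuminance values (lux) reported by neighbouring motes
    width    : assumed scale on which two readings still 'agree'
    """
    readings = np.asarray(readings, dtype=float)
    diff = readings[:, None] - readings[None, :]
    agreement = np.exp(-0.5 * (diff / width) ** 2)   # fuzzy pairwise agreement
    np.fill_diagonal(agreement, 0.0)
    confidence = agreement.sum(axis=1) / (len(readings) - 1)
    fused = float(np.dot(confidence, readings) / confidence.sum())
    return fused, confidence

fused, conf = fuzzy_validate_and_fuse([510, 498, 505, 820])   # one faulty mote
print(round(fused, 1), np.round(conf, 2))   # fused stays near 505; the 820 reading gets near-zero confidence
```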
3

Yoder, Valerie J., Steven W. Havens, Arthur J. Na, and Rachel E. Weingrad. "Sensor Fusion for Industrial Applications Using Transducer Markup Language." In ASME 2006 International Manufacturing Science and Engineering Conference. ASMEDC, 2006. http://dx.doi.org/10.1115/msec2006-21116.

Full text
Abstract:
Manufacturing processes would greatly benefit from fusing data from many disparate sensors, but systems today do not fully exploit available sensor data. Disparate sensors could include Coordinate Measurement Machines (CMM), laser surface scanners, micro sensors, cameras, acoustic devices, thermocouples, or other various devices which provide measurement or visual data. Often, sensor data requires separate customized software for each type of sensor system, as opposed to having common tools for use across a wide array of sensor systems. This process of stove-piping requires proprietary software for analysis and display of each sensor type, and inhibits interoperability. There are several challenges to sensor fusion which need to be addressed. First, many sensors providing data are heterogeneous in phenomena detection and operation, providing measurements of different target attributes. This makes the measurements very difficult to fuse directly. Second, these disparate sensors are asynchronous in time. The collection, integration, buffering and transmitting time can each affect the way time is calculated and stored by the sensor. Transducer Markup Language (TML), developed by IRIS Corporation, addresses these challenges. This paper describes TML and addresses examples of industrial applications of TML-enabled transducer networks.
APA, Harvard, Vancouver, ISO, and other styles
4

Ma, Wen, Hongyan Zhu, and Yan Lin. "Multi-Sensor Passive Localization Based on Sensor Selection." In 2019 22th International Conference on Information Fusion (FUSION). IEEE, 2019. http://dx.doi.org/10.23919/fusion43075.2019.9011312.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Cormack, David, and James R. Hopgood. "Sensor Registration and Tracking from Heterogeneous Sensors with Belief Propagation." In 2019 22th International Conference on Information Fusion (FUSION). IEEE, 2019. http://dx.doi.org/10.23919/fusion43075.2019.9011389.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Shiraishi, Masatake, Makoto Kikuchi, and Hideyasu Sumiya. "Workpiece Quality Estimation in Turning by Quasi-Sensor Fusion." In ASME 2000 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 2000. http://dx.doi.org/10.1115/imece2000-1808.

Full text
Abstract:
Two types of sensor fusion approaches are reported for quality assurance based on the decision making process via the sensor fusion technique: quasi-sensor fusion with several data-processed signals from a single sensor and general sensor fusion using multiple sensors. The former method may be expected to provide new information about an object to be measured from a single sensor, i.e., a strain gauge in this study. The latter method with a tool dynamometer and a strain gauge is, on the other hand, important on a practical production floor for real in-process measurement. Experimental results show the effectiveness of both techniques.
APA, Harvard, Vancouver, ISO, and other styles
7

Roussel, Stephane, Hemanth Porumamilla, Charles Birdsong, Peter Schuster, and Christopher Clark. "Enhanced Vehicle Identification Utilizing Sensor Fusion and Statistical Algorithms." In ASME 2009 International Mechanical Engineering Congress and Exposition. ASMEDC, 2009. http://dx.doi.org/10.1115/imece2009-12012.

Full text
Abstract:
Several studies in the area of vehicle detection and identification involve the use of probabilistic analysis and sensor fusion. While several sensors utilized for identifying vehicle presence and proximity have been researched, their effectiveness in identifying vehicle types has remained inadequate. This study presents the utilization of an ultrasonic sensor coupled with a magnetic sensor and the development of statistical algorithms to overcome this limitation. Mathematical models of both the ultrasonic and magnetic sensors were constructed to first understand the intrinsic characteristics of the individual sensors and also to provide a means of simulating the performance of the combined sensor system and to facilitate algorithm development. Preliminary algorithms that utilized this sensor fusion were developed to make inferences relating to vehicle proximity as well as type. It was noticed that while it helped alleviate the limitations of the individual sensors, the algorithm was affected by high occurrences of false positives. Also, since sensors carry only partial information about the surrounding environment and their measured quantities are partially corrupted with noise, probabilistic techniques were employed to extend the preliminary algorithms to include these sensor characteristics. These statistical techniques were utilized to reconstruct partial state information provided by the sensors and to also filter noisy measurement data. This probabilistic approach helped to effectively utilize the advantages of sensor fusion to further enhance the reliability of inferences made on vehicle identification. In summary, the study investigated the enhancement of vehicle identification through the use of sensor fusion and statistical techniques. The algorithms developed showed encouraging results in alleviating the occurrences of false positive inferences. One of the several applications of this study is in the use of ultrasonic-magnetic sensor combination for advanced traffic monitoring such as smart toll booths.
APA, Harvard, Vancouver, ISO, and other styles
8

Neumayer, Markus, Thomas Bretterklieber, and Thomas Suppan. "Sensor Fusion Concept for Improved Rotational Speed Measurement in Small Engines." In Small Engine Technology Conference & Exposition. 10-2 Gobancho, Chiyoda-ku, Tokyo, Japan: Society of Automotive Engineers of Japan, 2020. http://dx.doi.org/10.4271/2019-32-0519.

Full text
Abstract:
Future developments for small engines, e.g. engines for handheld working tools like chain saws, require the integration of ECU systems for engine control. For small engines, often only a rotational speed sensor is available. The application of additional engine sensors is in many cases unwanted, e.g. due to cost aspects and additional wiring. The lack of sensor data requires tailored control strategies and signal processing techniques to infer information about the engine from the sensor data. E.g. for rotational speed sensors the Δω method has been proposed, where the load is estimated from the temporal variation of the rotational speed. This approach requires a rotational speed sensor with sufficient angular resolution. In this paper we present a simulation study for a sensor fusion concept to improve the temporal resolution of engine speed measurements for low cost engines by means of an additional vibration sensor. The rotational sensor of the engine is assumed to have insufficient resolution to determine variations of the rotational speed over an engine revolution. However, variations of the rotational speed of the engine also cause vibrations of the engine chassis. A vibration sensor can be used to pick up the vibration signal with high temporal resolution. As the transfer function between the variation of the rotational speed and the sensor readings is only approximately known, a sensor fusion concept for the rotational speed sensor and the acceleration sensor has to be applied, which combines the different measurements, while simultaneously estimating the unknown transfer function. We will use an extended Kalman filter for the data fusion and an autoregressive model for the unknown transfer function.
APA, Harvard, Vancouver, ISO, and other styles
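A simplified linear Kalman-filter sketch of the fusion concept above: a high-rate, vibration-derived speed-increment signal drives the prediction, and the low-rate rotational-speed samples correct the accumulated drift. The paper proposes an extended Kalman filter that also estimates an autoregressive transfer-function model; that part is omitted, and all noise values here are assumptions.

```python
import numpy as np

def fuse_speed(vibration_delta, encoder_speed, encoder_every=20, q=0.5, r=2.0):
    """Reconstruct high-rate engine speed [rad/s].

    vibration_delta : high-rate speed increments inferred from the vibration sensor
    encoder_speed   : coarse rotational-speed samples, one per `encoder_every` steps
    """
    x, P = float(encoder_speed[0]), 1.0
    estimates = []
    for k, dw in enumerate(vibration_delta):
        x, P = x + dw, P + q                     # predict with the vibration increment
        if k % encoder_every == 0:               # low-rate encoder measurement arrives
            z = encoder_speed[k // encoder_every]
            K = P / (P + r)
            x, P = x + K * (z - x), (1.0 - K) * P
        estimates.append(x)
    return np.array(estimates)

# Synthetic cycle: speed oscillates over the revolutions around 300 rad/s
rng = np.random.default_rng(6)
true = 300.0 + 20.0 * np.sin(np.linspace(0.0, 8.0 * np.pi, 400))
delta = np.diff(true, prepend=true[0]) + 0.2 * rng.standard_normal(400)
coarse = true[::20] + 1.0 * rng.standard_normal(20)
print(fuse_speed(delta, coarse)[:5])
```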
9

Helgesen, Oystein Kaarstad, Edmund Forland Brekke, Hakon Hagen Helgesen, and Oystein Engelhardtsen. "Sensor Combinations in Heterogeneous Multi-sensor Fusion for Maritime Target Tracking." In 2019 22th International Conference on Information Fusion (FUSION). IEEE, 2019. http://dx.doi.org/10.23919/fusion43075.2019.9011297.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Shen, Kai, Zhongliang Jing, Peng Dong, Yinshuai Sun, and Jiyuan Cai. "Consensus and EM Based Sensor Registration in Distributed Sensor Networks." In 2018 21st International Conference on Information Fusion (FUSION 2018). IEEE, 2018. http://dx.doi.org/10.23919/icif.2018.8455802.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "Sensor fusion"

1

Garg, Devendra P., and Manish Kumar. Sensor Modeling and Multi-Sensor Data Fusion. Fort Belvoir, VA: Defense Technical Information Center, August 2005. http://dx.doi.org/10.21236/ada440553.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Akita, Richard, Robert Pap, and Joel Davis. Biologically Inspired Sensor Fusion. Fort Belvoir, VA: Defense Technical Information Center, May 1999. http://dx.doi.org/10.21236/ada389747.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Baim, Paul. Dynamic Database for Sensor Fusion. Fort Belvoir, VA: Defense Technical Information Center, May 1999. http://dx.doi.org/10.21236/ada363915.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Meyer, David, and Jeffrey Remmel. Distributed Algorithms for Sensor Fusion. Fort Belvoir, VA: Defense Technical Information Center, October 2002. http://dx.doi.org/10.21236/ada415039.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Hero, Alfred O., III, and Raviv Raich. Performance-driven Multimodality Sensor Fusion. Fort Belvoir, VA: Defense Technical Information Center, January 2012. http://dx.doi.org/10.21236/ada565491.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Rockwell International, Anaheim, CA. Multi-Sensor Feature Level Fusion. Fort Belvoir, VA: Defense Technical Information Center, May 1991. http://dx.doi.org/10.21236/ada237106.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Connors, John J., Kevin Hill, David Hanekamp, William F. Haley, Robert J. Gallagher, Craig Gowin, Arthur R. Farrar, et al. Sensor fusion for intelligent process control. Office of Scientific and Technical Information (OSTI), August 2004. http://dx.doi.org/10.2172/919114.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Carlson, J. J., A. M. Bouchard, G. C. Osbourn, R. F. Martinez, J. W. Bartholomew, J. B. Jordan, G. M. Flachs, Z. Bao, and L. Zhu. Sensor-fusion-based biometric identity verification. Office of Scientific and Technical Information (OSTI), February 1998. http://dx.doi.org/10.2172/573302.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Hunn, Bruce P. The Human Factors of Sensor Fusion. Fort Belvoir, VA: Defense Technical Information Center, May 2008. http://dx.doi.org/10.21236/ada481551.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Bharadwaj, Arjun, and Jerry M. Mendel. Fuzzy Logic for Unattended Ground Sensor Fusion. Fort Belvoir, VA: Defense Technical Information Center, January 2006. http://dx.doi.org/10.21236/ada444339.

Full text
APA, Harvard, Vancouver, ISO, and other styles