Academic literature on the topic 'Detector based on Bayes Theory'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Detector based on Bayes Theory.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Detector based on Bayes Theory"

1

Doering, Dionísio, and Adalberto Schuck Junior. "A Novel Method for Generating Scale Space Kernels Based on Wavelet Theory." Revista de Informática Teórica e Aplicada 15, no. 2 (December 12, 2008): 121–38. http://dx.doi.org/10.22456/2175-2745.7024.

Full text
Abstract:
The linear scale-space kernel is a Gaussian or Poisson function. These functions were chosen based on several axioms. This representation creates a good basis for visualization when there is no information (in advance) about which scales are more important. These kernels have some deficiencies; for example, their support region extends from minus to plus infinity. To address these issues, several other scale-space kernels have been proposed. In this paper we present a novel method for creating scale-space kernels from one-dimensional wavelet functions. To do so, we present the fundamental scale-space and wavelet equations and the relationship between them. We also describe three different methods for generating two-dimensional functions from one-dimensional functions. We then show results obtained from a scale-space blob detector using the original and two new scale-space bases (Haar and Biorthogonal 4.4), and a comparison between the edges detected using the Gaussian kernel and the Haar kernel for a noisy image. Finally, we compare the scale-space Haar edge detector with the Canny edge detector on an image containing one known square, reporting the Mean Square Error (MSE) of the edges detected by both algorithms.
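The abstract's pipeline reduces to a few concrete ingredients: sampling a 1-D kernel, lifting it to 2-D, and scoring edge maps with MSE. A minimal illustrative sketch of those pieces (not the authors' implementation; the function names are ours, and the Gaussian is truncated at about 3σ because, as the abstract notes, its true support is infinite):

```python
import math

def gaussian_kernel_1d(sigma, radius=None):
    """Sample a 1-D Gaussian, the classical linear scale-space kernel.
    The true Gaussian has infinite support, so it is truncated in practice."""
    if radius is None:
        radius = int(3 * sigma)
    k = [math.exp(-(x * x) / (2 * sigma * sigma)) for x in range(-radius, radius + 1)]
    s = sum(k)
    return [v / s for v in k]  # normalize so the kernel sums to 1

def kernel_2d_from_1d(k):
    """Outer product: one common way to build a 2-D kernel from a 1-D one."""
    return [[a * b for b in k] for a in k]

def edge_mse(edges_a, edges_b):
    """Mean Square Error between two equally sized binary edge maps."""
    flat_a = [v for row in edges_a for v in row]
    flat_b = [v for row in edges_b for v in row]
    return sum((a - b) ** 2 for a, b in zip(flat_a, flat_b)) / len(flat_a)
```

The outer product used in `kernel_2d_from_1d` is only one of the three 1-D-to-2-D constructions the paper discusses.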
APA, Harvard, Vancouver, ISO, and other styles
2

Gotoh, Masayuki, and Shigeichi Hirasawa. "Statistical model selection based on Bayes decision theory and its application to change detection problem." International Journal of Production Economics 60-61 (April 1999): 629–38. http://dx.doi.org/10.1016/s0925-5273(98)00186-8.

3

Sun, Shuang, Li Liang, Ming Li, and Xin Li. "Multidamage Detection of Bridges Using Rough Set Theory and Naive-Bayes Classifier." Mathematical Problems in Engineering 2018 (May 27, 2018): 1–13. http://dx.doi.org/10.1155/2018/6752456.

Abstract:
This paper introduces a two-stage detection method to solve the multidamage problem in bridges. Vibration analysis is conducted to acquire the dynamic fingerprints, which are regarded as information sources. Bayesian fusion is used to integrate these sources and preliminarily locate the damage. Then the RSNB method, which combines rough set theory and the Naive-Bayes classifier, is proposed to simplify the sample dimensions and fuse the remaining attributes for damage extent detection. A numerical simulation of a real structure, the Sishui Bridge in Shenyang, China, is conducted to validate the effectiveness of the proposed detection method. The data-fusion-based method is compared with a single-valued index method at the damage localization stage, and the proposed RSNB method is compared with the Back Propagation Neural Network (BPNN) method at the damage quantification stage. The results show that the proposed two-stage damage detection method performs better in terms of transparency, accuracy, efficiency, noise robustness, and stability. Furthermore, an ambient excitation modal test was carried out on the bridge to obtain the vibration responses and assess the damage condition with the proposed method. This novel approach is applicable for early damage detection and provides a basis for bridge management and maintenance.
4

Rastiveis, H. "DECISION LEVEL FUSION OF LIDAR DATA AND AERIAL COLOR IMAGERY BASED ON BAYESIAN THEORY FOR URBAN AREA CLASSIFICATION." ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XL-1-W5 (December 11, 2015): 589–94. http://dx.doi.org/10.5194/isprsarchives-xl-1-w5-589-2015.

Abstract:
Airborne Light Detection and Ranging (LiDAR) generates high-density 3D point clouds that provide comprehensive information about object surfaces. Combining these data with aerial/satellite imagery is quite promising for improving land cover classification. In this study, fusion of LiDAR data and aerial imagery based on Bayesian theory in a three-level fusion algorithm is presented. In the first level, pixel-level fusion, the proper descriptors for both LiDAR and image data are extracted. In the next level, feature-level fusion, the extracted features are used to classify the area into six classes, “Buildings”, “Trees”, “Asphalt Roads”, “Concrete Roads”, “Grass”, and “Cars”, using the Naïve Bayes classification algorithm. This classification is performed with three different strategies: (1) using only LiDAR data, (2) using only image data, and (3) using all extracted features from both LiDAR and image. The results of the three classifiers are integrated in the last phase, decision-level fusion, based on the Naïve Bayes algorithm. To evaluate the proposed algorithm, a high-resolution color orthophoto and LiDAR data over the urban areas of Zeebruges, Belgium, were used. The results obtained from the decision-level fusion phase revealed an improvement in overall accuracy and kappa coefficient.
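The decision-level step in abstracts like this one is typically the generic Bayes fusion rule: multiply each classifier's class posteriors together, discounting the prior so it is counted only once. A hedged sketch of that rule (not the paper's exact pipeline; the function name and the two-classifier example are illustrative):

```python
import math

def naive_bayes_fusion(priors, classifier_posteriors):
    """Decision-level fusion under a conditional-independence assumption:
        P(c | x_1..x_k)  ∝  P(c) * prod_i [ P(c | x_i) / P(c) ]
    priors: list of class priors P(c)
    classifier_posteriors: one posterior list P(c | x_i) per classifier."""
    n = len(priors)
    log_scores = []
    for c in range(n):
        s = math.log(priors[c])
        for post in classifier_posteriors:
            # each classifier contributes its posterior, with the prior divided out
            s += math.log(post[c]) - math.log(priors[c])
        log_scores.append(s)
    # normalize in log space for numerical stability
    m = max(log_scores)
    unnorm = [math.exp(s - m) for s in log_scores]
    z = sum(unnorm)
    return [u / z for u in unnorm]
```

With uniform priors and two classifiers voting 0.8 and 0.6 for the first class, the fused posterior sharpens toward that class, which is the usual effect of multiplying independent evidence.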
5

Chetouani, Yahya. "Model selection and fault detection approach based on Bayes decision theory: Application to changes detection problem in a distillation column." Process Safety and Environmental Protection 92, no. 3 (May 2014): 215–23. http://dx.doi.org/10.1016/j.psep.2013.02.004.

6

Podlech, Steffen. "Autofocus by Bayes Spectral Entropy Applied to Optical Microscopy." Microscopy and Microanalysis 22, no. 1 (January 13, 2016): 199–207. http://dx.doi.org/10.1017/s1431927615015652.

Abstract:
This study introduces a passive autofocus method based on image analysis calculating the Bayes spectral entropy (BSE). The method is applied to optical microscopy and, together with the specific construction of the opto-mechanical unit, allows the analysis of large samples with complicated surfaces without subsampling. This paper provides a short overview of the relevant theory of calculating the normalized discrete cosine transform when analyzing obtained images, in order to find the BSE measure. Furthermore, it is shown that the BSE measure is a strong indicator for determining the focal position of the optical microscope. To demonstrate the strength and robustness of the microscope system, tests were performed using a 1951 USAF resolution test chart to determine the in-focus position of the microscope. Finally, the method and the optical microscope system are applied to analyze an optical grating (100 lines/mm), demonstrating the detection of the focal position. The paper concludes with an outlook on potential applications of the presented system within quality control and surface analysis.
7

Du, Na, Qiaoning Zhang, and X. Jessie Yang. "Evaluating effects of automation reliability and reliability information on trust, dependence and dual-task performance." Proceedings of the Human Factors and Ergonomics Society Annual Meeting 62, no. 1 (September 2018): 174. http://dx.doi.org/10.1177/1541931218621041.

Abstract:
The use of automated decision aids could reduce human exposure to dangers and enable human workers to perform more challenging tasks. However, automation is problematic when people fail to trust and depend on it appropriately. Existing studies have shown that system designs that provide users with likelihood information, including automation certainty, reliability, and confidence, could facilitate trust-reliability calibration, the correspondence between a person’s trust in the automation and the automation’s capabilities (Lee & Moray, 1994), and improve human–automation task performance (Beller et al., 2013; Wang, Jamieson, & Hollands, 2009; McGuirl & Sarter, 2006). While revealing reliability information has been proposed as a design solution, the concrete effects of such information disclosure still vary (Wang et al., 2009; Fletcher et al., 2017; Walliser et al., 2016). Clear guidelines that would allow display designers to choose the most effective reliability information to facilitate human decision performance and trust calibration do not appear to exist. The present study, therefore, aimed to reconcile the existing literature by investigating if and how different methods of calculating reliability information affect their effectiveness at different automation reliability levels. A human subject experiment was conducted with 60 participants. Each participant performed a compensatory tracking task and a threat detection task simultaneously with the help of an imperfect automated threat detector. The experiment adopted a 2×4 mixed design with two independent variables: automation reliability (68% vs. 90%) as a within-subject factor and reliability information as a between-subjects factor.
Reliability information for the automated threat detector was calculated using different methods based on signal detection theory and the conditional probability formula of Bayes’ Theorem (H: hits; CR: correct rejections; FA: false alarms; M: misses): Overall reliability = P(H + CR | H + FA + M + CR). Positive predictive value = P(H | H + FA); negative predictive value = P(CR | CR + M). Hit rate = P(H | H + M); correct rejection rate = P(CR | CR + FA). There was also a control condition in which participants were not informed of any reliability information but only told that the alerts from the automated threat detector might or might not be correct. The dependent variables of interest were participants’ subjective trust in automation and objective measures of their display-switching behaviors. The results of this study showed that as the automated threat detector became more reliable, participants’ trust in and dependence on the threat detector increased significantly, and their detection performance improved. More importantly, there were significant differences in participants’ trust, dependence, and dual-task performance when reliability information was calculated by different methods. Specifically, when the overall reliability of the automated threat detector was 90%, revealing the positive and negative predictive values of the automation significantly helped participants calibrate their trust in and dependence on the detector, and led to the shortest reaction time for the detection task. However, when the overall reliability of the automated threat detector was 68%, positive and negative predictive values did not lead to a significant difference in participants’ compliance with the detector. In addition, our results demonstrated that the disclosure of hit rate and correct rejection rate, or of overall reliability, did not seem to aid human-automation team performance and trust-reliability calibration.
An implication of the study is that users should be made aware of system reliability, especially of positive/negative predictive values, to engender appropriate trust in and dependence on the automation. This can be applied to the interface design of automated decision aids. Future studies should examine whether the positive and negative predictive values are still the most effective pieces of information for trust calibration when the criterion of the automated threat detector becomes liberal.
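The five reliability figures quoted in this abstract follow directly from the four signal-detection counts; a small sketch (the function name is ours) makes the definitions concrete:

```python
def reliability_measures(h, fa, m, cr):
    """The five reliability figures defined in the abstract, computed from
    signal-detection counts: h hits, fa false alarms, m misses, cr correct
    rejections."""
    total = h + fa + m + cr
    return {
        "overall_reliability": (h + cr) / total,    # P(H + CR | H + FA + M + CR)
        "positive_predictive_value": h / (h + fa),  # P(H | H + FA)
        "negative_predictive_value": cr / (cr + m), # P(CR | CR + M)
        "hit_rate": h / (h + m),                    # P(H | H + M)
        "correct_rejection_rate": cr / (cr + fa),   # P(CR | CR + FA)
    }
```

For example, with h=30, fa=20, m=10, cr=40, overall reliability is 0.70 while the hit rate is 0.75 and the correct rejection rate is about 0.67, which illustrates why the different disclosure methods studied here can diverge.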
8

Priyadarsini, Pullagura, Ravi Kanth Motupalli, and Joel Sunny Deol Gosu. "A Hybrid Approach for the Analysis of Feature Selection using Information Gain and BAT Techniques on The Anomaly Detection." Turkish Journal of Computer and Mathematics Education (TURCOMAT) 12, no. 5 (April 11, 2021): 656–66. http://dx.doi.org/10.17762/turcomat.v12i5.1063.

Abstract:
Every day, millions of people in many institutions communicate with each other on the Internet. The past two decades have witnessed unprecedented levels of Internet use by people around the world. Almost alongside these rapid developments in the internet space, an ever-increasing incidence of attacks carried out on the internet has been consistently reported every minute. In such a difficult environment, Anomaly Detection Systems (ADS) play an important role in monitoring and analyzing daily internet activities for security breaches and threats. However, the analytical data routinely generated from computer networks are usually of enormous size and of little use. This creates a major challenge for ADSs, which must examine all the features of a given dataset to identify intrusive patterns. Feature selection is an important factor in modeling anomaly-based intrusion detection systems. An irrelevant feature can lead to overfitting, which in turn negatively affects the modeling power of classification algorithms. The objective of this study is to analyze and select the most discriminating input features for the construction of efficient and computationally inexpensive schemes for an ADS. In the first step, a heuristic algorithm called IG-BA is proposed for dimensionality reduction by selecting the optimal subset based on the concept of entropy. Then, the relevant and meaningful features are selected before a number of classifiers are applied. Experiments were conducted on the CICIDS-2017 dataset by applying (1) Random Forest (RF), (2) Bayes Network (BN), (3) Naive Bayes (NB), (4) J48, and (5) Random Tree (RT), with results showing better detection precision and faster execution time. The proposed heuristic algorithm outperforms existing ones, as it is both more accurate in detection and faster.
Moreover, the Random Forest algorithm emerges as the best classifier for the feature selection technique, scoring over the others by virtue of its accuracy in the optimal selection of features.
9

Kabanda, Gabriel. "Bayesian Network Model for a Zimbabwean Cybersecurity System." Oriental journal of computer science and technology 12, no. 4 (January 3, 2020): 147–67. http://dx.doi.org/10.13005/ojcst12.04.02.

Abstract:
The purpose of this research was to develop a structure for a network intrusion detection and prevention system based on the Bayesian Network for use in Cybersecurity. The phenomenal growth in the use of internet-based technologies has resulted in complexities in cybersecurity, subjecting organizations to cyberattacks. Bayesian Networks (BNs) are graphical probabilistic models for multivariate analysis; they are directed acyclic graphs with an associated probability distribution function. The research determined a cybersecurity framework appropriate for a developing nation; evaluated network detection and prevention systems that use Artificial Intelligence paradigms such as finite automata, neural networks, genetic algorithms, fuzzy logic, support-vector machines, or diverse data-mining-based approaches; analysed Bayesian Networks, which can be represented as graphical models and are directional so as to represent cause-effect relationships; and developed a Bayesian Network model that can handle complexity in cybersecurity. The theoretical framework on Bayesian Networks was largely informed by the NIST Cybersecurity Framework, general deterrence theory, game theory, complexity theory, and data mining techniques. The Pragmatism paradigm used in this research is, as a philosophy, intricately related to Mixed Method Research (MMR). A mixed-method approach was used in this research, largely quantitative, with a survey and an experiment as the research design, supported by qualitative Focus Group discussions. The performance of Support Vector Machines, Artificial Neural Network, K-Nearest Neighbour, Naive-Bayes, and Decision Tree algorithms was discussed.
Alternative improved solutions discussed include the use of machine learning algorithms specifically Artificial Neural Networks (ANN), Decision Tree C4.5, Random Forests and Support Vector Machines (SVM).
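A Bayesian Network, as defined in this abstract, factors a joint distribution along a directed acyclic graph. A toy sketch of that chain rule (the two-node "attack → alert" network and all of its numbers are hypothetical, not from the study):

```python
def joint_probability(assignment, parents, cpts):
    """Chain rule for a Bayesian network (a directed acyclic graph):
    P(x1..xn) = product over variables of P(x_i | parents(x_i))."""
    p = 1.0
    for var, cpt in cpts.items():
        # look up this variable's conditional probability given its parents
        parent_values = tuple(assignment[pa] for pa in parents[var])
        p *= cpt[parent_values][assignment[var]]
    return p

# Hypothetical two-node intrusion example: Attack -> Alert.
parents = {"attack": (), "alert": ("attack",)}
cpts = {
    "attack": {(): {True: 0.01, False: 0.99}},
    "alert": {(True,): {True: 0.90, False: 0.10},
              (False,): {True: 0.05, False: 0.95}},
}
```

Here `joint_probability({"attack": True, "alert": True}, parents, cpts)` multiplies P(attack) by P(alert | attack), which is exactly the factorization the abstract's "associated probability distribution function" refers to.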
10

Kumari, Kandala Srujana, et al. "Performance Analysis of Diabetes Mellitus Using Machine Learning Techniques." Turkish Journal of Computer and Mathematics Education (TURCOMAT) 12, no. 6 (April 10, 2021): 225–30. http://dx.doi.org/10.17762/turcomat.v12i6.1297.

Abstract:
Diabetes is a common disease of the human body caused by a set of metabolic disorders in which blood sugar levels remain high for a long period. It affects various organs in the human body and damages many body systems, especially the kidneys. Early detection can save lives. To achieve this goal, this study focuses specifically on the use of machine learning techniques for the many risk factors associated with this disease. Machine learning methods achieve effective results by creating predictive models based on medical diagnostic data collected on diabetes. Learning from such data can help in predicting diabetes. In this study, we used four popular machine learning algorithms, namely Support Vector Machine (SVM), Naive Bayes (NB), K-Nearest Neighbors (KNN), and Decision Tree C4.5 (DT), on diagnostic data from adult diabetes patients. The results of our experiments show that the C4.5 decision tree achieves greater accuracy compared to the other machine learning methods.
More sources

Dissertations / Theses on the topic "Detector based on Bayes Theory"

1

Lopez, Paola Johana Saboya. "Uma contribuição ao problema de detecção de ruídos impulsivos para power line communication." Universidade Federal de Juiz de Fora (UFJF), 2013. https://repositorio.ufjf.br/jspui/handle/ufjf/4155.

Abstract:
This dissertation proposes and evaluates five techniques for impulsive noise detection in order to improve digital communications through power line channels (Power Line Communications, PLC). The input signals for the proposed detection techniques are impulsive noise signals in the discrete-time domain, the Discrete Wavelet Transform (DWT) domain, and the Discrete Fourier Transform (DFT) domain; the techniques make use of feature extraction and selection methods, as well as detection methods supported by Bayes theory and Multi-layer Perceptron neural networks. Comparative analyses show the advantages and disadvantages of each proposed technique and their relevance to solving the impulsive noise detection problem.
2

Park, Changyi. "Generalization error rates for margin-based classifiers." Connect to resource, 2005. http://rave.ohiolink.edu/etdc/view?acc%5Fnum=osu1124282485.

Abstract:
Thesis (Ph. D.)--Ohio State University, 2005.
Title from first page of PDF file. Document formatted into pages; contains ix, 63 p.; also includes graphics (some col.). Includes bibliographical references (p. 60-63). Available online via OhioLINK's ETD Center
3

Arroyo, Negrete Elkin Rafael. "Continuous reservoir model updating using an ensemble Kalman filter with a streamline-based covariance localization." Texas A&M University, 2006. http://hdl.handle.net/1969.1/4859.

Abstract:
This work presents a new approach that combines the comprehensive capabilities of the ensemble Kalman filter (EnKF) and the flow path information from streamlines to eliminate and/or reduce some of the problems and limitations of the use of the EnKF for history matching reservoir models. The recent use of the EnKF for data assimilation and assessment of uncertainties in future forecasts in reservoir engineering seems promising. The EnKF provides ways of incorporating any type of production data or time-lapse seismic information in an efficient way. However, the use of the EnKF in history matching comes with its share of challenges and concerns. The overshooting of parameters leading to loss of geologic realism, a possible increase in the material balance errors of the updated phase(s), and limitations associated with non-Gaussian permeability distributions are some of the most critical problems of the EnKF. The use of a larger ensemble size may mitigate some of these problems but is prohibitively expensive in practice. We present a streamline-based conditioning technique that can be implemented with the EnKF to eliminate or reduce the magnitude of these problems, allowing the use of a reduced ensemble size and thereby leading to significant savings in time during field-scale implementation. Our approach involves no extra computational cost and is easy to implement. Additionally, the final history-matched model tends to preserve most of the geological features of the initial geologic model. A quick look at the procedure is provided that enables the implementation of this approach into current EnKF implementations. Our procedure uses the streamline path information to condition the covariance matrix in the Kalman update. We demonstrate the power and utility of our approach with synthetic examples and a field case.
Our results show that, using the conditioning technique presented in this thesis, the overshooting/undershooting problems disappear and the limitation of working with non-Gaussian distributions is reduced. Finally, an analysis of the scalability of a parallel implementation of our computer code is given.
4

Higgins, Paul Anthony. "Reducing uncertainty in new product development." Queensland University of Technology, 2008. http://eprints.qut.edu.au/20273/.

Abstract:
Research and Development engineering is the cornerstone of humanity’s evolution. It is perceived to be a systematic creative process which ultimately improves the living standard of a society through the creation of new applications and products. The commercial paradigm that governs project selection, resource allocation and market penetration prevails when the focus shifts from pure research to applied research. Furthermore, the road to success through commercialisation is difficult for most inventors, especially in a vast and isolated country such as Australia, which is located a long way from wealthy and developed economies. While market-leading products are considered unique, the actual process to achieve these products is essentially the same; progressing from an idea, through development to an outcome (if successful). Unfortunately, statistics indicate that only 3% of ‘ideas’ are significantly successful, 4% are moderately successful, and the remainder ‘evaporate’ (Michael Quinn, Chairman, Innovation Capital Associates Pty Ltd). This study demonstrates and analyses two techniques developed by the author which reduce uncertainty in the engineering design and development phase of new product development and therefore increase the probability of a successful outcome. This study expands the existing knowledge of the engineering design and development stage in the new product development process and is couched in the identification of practical methods, which have been successfully used to develop new products by Australian Small Medium Enterprise (SME) Excel Technology Group Pty Ltd (ETG). Process theory is the term most commonly used to describe scientific study that identifies occurrences that result from a specified input state to an output state, thus detailing the process used to achieve an outcome. The thesis identifies relevant material and analyses recognised and established engineering processes utilised in developing new products.
The literature identified that case studies are a particularly useful method for supporting problem-solving processes in settings where there are no clear answers or where problems are unstructured, as in New Product Development (NPD). This study describes, defines, and demonstrates the process of new product development within the context of historical product development and a ‘live’ case study associated with an Australian Government START grant awarded to Excel Technology Group in 2004 to assist in the development of an image-based vehicle detection product. This study proposes two techniques which reduce uncertainty and thereby improve the probability of a successful outcome. The first technique provides a predicted project development path or forward engineering plan which transforms the initial ‘fuzzy idea’ into a potential and achievable outcome. This process qualifies the ‘fuzzy idea’ as a potential, rationale or tangible outcome which is within the capability of the organisation. Additionally, this process proposes that a tangible or rationale idea can be deconstructed in reverse engineering process in order to create a forward engineering development plan. A detailed structured forward engineering plan reduces the uncertainty associated with new product development unknowns and therefore contributes to a successful outcome. This is described as the RETRO technique. The study recognises however that this claim requires qualification and proposes a second technique. The second technique proposes that a two dimensional spatial representation which has productivity and consumed resources as its axes, provides an effective means to qualify progress and expediently identify variation from the predicted plan. This spatial representation technique allows a quick response which in itself has a prediction attribute associated with directing the project back onto its predicted path. 
This process involves a coterminous comparison between the predicted development path and the evolving actual project development path. A consequence of this process is verification of progress or the application of informed, timely and quantified corrective action. This process also identifies the degree of success achieved in the engineering design and development phase of new product development, where success is defined as achieving a predicted outcome. This spatial representation technique is referred to as NPD Mapping. The study demonstrates that these are useful techniques which aid SMEs in achieving successful new product outcomes because the techniques are easily administered, measure and represent relevant development-process-related elements and functions, and enable expedient, quantified responsive action when the evolving path varies from the predicted path. These techniques go beyond timeline representations such as GANTT charts and PERT analysis, and represent the base variables of consumed resource and productivity/technical achievement in a manner that facilitates higher-level interpretation of time, effort, degree of difficulty, and product complexity in order to facilitate informed decision making. This study presents, describes, analyses and demonstrates an SME-focused engineering development technique, developed by the author, that begins with a ‘fuzzy idea’ in the mind of the inventor and concludes with a successful new product outcome that is delivered on time and within budget. Further research on a wider range of SME organisations undertaking new product development is recommended.
5

Salehian, Bahram. "Bimodal adaptive hypermedia and interactive multimedia a web-based learning environment based on Kolb's theory of learning style." Thèse, 2003. http://hdl.handle.net/1866/14557.

6

Lin, Yu-Shan, and 林于珊. "Using Hierarchical Bayes Choice-Based Conjoint Analysis for Taiwanese Mobile Payment Market Analysis: The Theory of Two-Sided Market." Thesis, 2018. http://ndltd.ncl.edu.tw/handle/j9tf5g.

Abstract:
Master's thesis, Graduate Institute of Technology Management, National Chung Hsing University, academic year 106 (2017–2018).
In recent years, mobile payment services have become widely used in daily life. They not only change payment methods for consumers but also influence merchants’ business models. The arrival of Apple Pay, Samsung Pay, and Google Pay in Taiwan has increased competition in the mobile payment industry. How to keep up with this trend under sheer competition from global mobile payment players becomes an important issue for Taiwanese mobile payment service providers. This study uses Hierarchical Bayes Choice-Based Conjoint (HB-CBC) analysis to determine the preferences of consumers and merchants regarding mobile payments. This method allows respondents to experience a simulated consumption situation and captures their trade-offs between product attributes. Based on individual preferences, market segments of Taiwanese mobile payments can be identified, and mobile payment service providers can offer appropriate services to their target segments. With a clearly defined target audience, Taiwanese mobile payment providers can develop product strategies to keep up with competitors and sustain continuous market growth.

Books on the topic "Detector based on Bayes Theory"

1

Gelman, Andrew, and Deborah Nolan. Probability. Oxford University Press, 2017. http://dx.doi.org/10.1093/oso/9780198785699.003.0008.

Abstract:
This chapter contains many classroom activities and demonstrations to help students understand basic probability calculations, including conditional probability and Bayes’ rule. Many of the activities alert students to misconceptions about randomness. They create dramatic settings in which the instructor discerns real coin flips from fake ones, students modify dice and coins in order to load them, and students “accused” of lying based on the outcome of an inaccurate simulated lie detector face their classmates. Additionally, probability models of real outcomes offer good value: first we can do the probability calculations, and then we can go back and discuss the potential flaws of the model.
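The lie-detector activity this abstract describes is a direct application of Bayes' rule; a small illustrative sketch (the function name and parameter values are hypothetical, not from the book):

```python
def posterior_lying(prior, sensitivity, specificity):
    """Bayes' rule for the simulated lie-detector activity:
    P(lying | detector says 'lying'), given
    prior = P(lying), sensitivity = P(+ | lying),
    specificity = P(- | truthful)."""
    # total probability of a 'lying' verdict: true positives + false positives
    p_positive = sensitivity * prior + (1 - specificity) * (1 - prior)
    return sensitivity * prior / p_positive
```

For instance, with a 10% prior of lying and a detector that is 90% accurate both ways, a 'lying' verdict still leaves only a 50% posterior probability of lying, the kind of counterintuitive result the classroom demonstration is built around.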
2

Canarutto, Daniel. Gauge Field Theory in Natural Geometric Language. Oxford University Press, 2020. http://dx.doi.org/10.1093/oso/9780198861492.001.0001.

Abstract:
This monograph addresses the need to clarify basic mathematical concepts at the crossroad between gravitation and quantum physics. Selected mathematical and theoretical topics are presented within a not-too-short, integrated approach that exploits standard and non-standard notions in natural geometric language. The role of structure groups can be regarded as secondary even in the treatment of the gauge fields themselves. Two-spinors yield a partly original 'minimal geometric data' approach to Einstein-Cartan-Maxwell-Dirac fields. The gravitational field is jointly represented by a spinor connection and by a soldering form (a 'tetrad') valued in a vector bundle naturally constructed from the assumed 2-spinor bundle. We give a presentation of electroweak theory that dispenses with group-related notions, and we introduce a non-standard, natural extension of it. Also within the 2-spinor approach we present: a non-standard view of gauge freedom; a first-order Lagrangian theory of fields with arbitrary spin; an original treatment of Lie derivatives of spinors and spinor connections. Furthermore we introduce an original formulation of Lagrangian field theories based on covariant differentials, which works in the classical and quantum field theories alike and simplifies calculations. We offer a precise mathematical approach to quantum bundles and quantum fields, including ghosts, BRST symmetry and anti-fields, treating the geometry of quantum bundles and their jet prolongations in terms of Frölicher's notion of smoothness. We propose an approach to quantum particle physics based on the notion of detector, and illustrate the basic scattering computations in that context.
3

Mee, Nicholas. The Cosmic Mystery Tour. Oxford University Press, 2019. http://dx.doi.org/10.1093/oso/9780198831860.001.0001.

Abstract:
The Cosmic Mystery Tour is a brief account of modern physics and astronomy presented in a broad historical and cultural context. The book is attractively illustrated and aimed at the general reader. Part I explores the laws of physics including general relativity, the structure of matter, quantum mechanics and the Standard Model of particle physics. It discusses recent discoveries such as gravitational waves and the project to construct LISA, a space-based gravitational wave detector, as well as unresolved issues such as the nature of dark matter. Part II begins by considering cosmology, the study of the universe as a whole, and how we arrived at the theory of the Big Bang and the expanding universe. It looks at the remarkable objects within the universe such as red giants, white dwarfs, neutron stars and black holes, and considers the expected discoveries from new telescopes such as the Extremely Large Telescope in Chile, and the Event Horizon Telescope, currently aiming to image the supermassive black hole at the galactic centre. Part III considers the possibility of finding extraterrestrial life, from the speculations of science fiction authors to the ongoing search for alien civilizations known as SETI. Recent developments are discussed: space probes to the satellites of Jupiter and Saturn; the discovery of planets in other star systems; the citizen science project SETI@Home; Breakthrough Starshot, the project to develop technologies to send spacecraft to the stars. It also discusses the Fermi paradox, which argues that we might actually be alone in the cosmos.

Book chapters on the topic "Detector based on Bayes Theory"

1

Sharma, Rajnikant, Josiah Yoder, Hyukseong Kwon, and Daniel Pack. "Vision Based Mobile Target Geo-localization and Target Discrimination Using Bayes Detection Theory." In Springer Tracts in Advanced Robotics, 59–71. Berlin, Heidelberg: Springer Berlin Heidelberg, 2014. http://dx.doi.org/10.1007/978-3-642-55146-8_5.

2

Zhu, Zhihui, Lanlan Wu, Lirong Xiong, Daodong Hu, and Youxian Wen. "Cracked-Shell Detection of Preserved Eggs Based on Acoustic Response and Bayes Theory in Mechanical Engineering." In Advances in Mechanical and Electronic Engineering, 71–77. Berlin, Heidelberg: Springer Berlin Heidelberg, 2012. http://dx.doi.org/10.1007/978-3-642-31507-7_13.

3

Savchuk, Vladimir, and Chris P. Tsokos. "The Methods of Parametric Bayes Estimation Based on Censored Samples." In Bayesian Theory and Methods with Applications, 47–77. Paris: Atlantis Press, 2011. http://dx.doi.org/10.2991/978-94-91216-14-5_3.

4

Qin, Jie, Qun Si, Huijuan Yan, and Fuliang Yan. "A Trojan Detector Generating Algorithm Based on Chaotic Theory." In Advanced Intelligent Computing, 502–8. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-24728-6_68.

5

Zhang, Xuepeng, Shengdong Li, Xiucong Li, Xinyi Lin, and Xiaoyuan Lv. "Service Quality Evaluation of Railway Freight Transportation Network Based on Bayes Theory." In Advances in Smart Vehicular Technology, Transportation, Communication and Applications, 89–95. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-70730-3_11.

6

Ye, Meng, Huaxu Zhou, Yaodong Ju, Guanjin Huang, Miaogeng Wang, Xuhui Zhang, and Meiling Dai. "Congestion Link Inference Algorithm of Power Data Network Based on Bayes Theory." In Advances in Intelligent Systems and Computing, 899–907. Singapore: Springer Singapore, 2020. http://dx.doi.org/10.1007/978-981-15-8462-6_104.

7

Nakazawa, M., N. Ueda, T. Taniguchi, and A. Sekiguchi. "A New Adjustment Code Based on the Bayes’ Theory Combined with the Monte-Carlo Technique." In Reactor Dosimetry, 649–56. Dordrecht: Springer Netherlands, 1985. http://dx.doi.org/10.1007/978-94-009-5378-9_64.

8

Nakazawa, M., N. Ueda, T. Taniguchi, and A. Sekiguchi. "A New Adjustment Code Based on the Bayes’ Theory Combined with the Monte-Carlo Technique." In Reactor Dosimetry, 649–56. Dordrecht: Springer Netherlands, 1985. http://dx.doi.org/10.1007/978-94-010-9726-0_12.

9

Zhu, Ying, Ping Zou, Jun-lie Wang, and Jia-wei Chen. "Based on the Theory of a Certain Type of Black-Box Testing Electronic Equipment Within the Field of Research and Development Detector." In Proceedings of the First Symposium on Aviation Maintenance and Management-Volume I, 151–56. Berlin, Heidelberg: Springer Berlin Heidelberg, 2014. http://dx.doi.org/10.1007/978-3-642-54236-7_17.

10

Natesan, P., P. Balasubramanie, and G. Gowrison. "AdaBoost Algorithm with Single Weak Classifier in Network Intrusion Detection." In Network Security Attacks and Countermeasures, 259–69. IGI Global, 2016. http://dx.doi.org/10.4018/978-1-4666-8761-5.ch011.

Abstract:
Recently, machine learning based intrusion detection systems have been the subject of extensive research because they can perform both misuse detection and anomaly detection. In this paper, we propose an AdaBoost-based algorithm for a network intrusion detection system with a single weak classifier. In this algorithm, classifiers such as Bayes Net, Naïve Bayes and decision trees are used as weak classifiers. The KDDCup99 dataset is used in these experiments to demonstrate that boosting can greatly improve the classification accuracy of weak classification algorithms. Our approach achieves a higher detection rate with low false alarm rates and is scalable to large datasets, resulting in an effective intrusion detection system.
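As a rough sketch of the boosting mechanism the abstract describes (not the paper's implementation, which uses Bayes Net, Naïve Bayes and decision-tree weak classifiers on KDDCup99), a minimal AdaBoost built on one-feature threshold stumps might look like:

```python
import numpy as np

def adaboost_stumps(X, y, n_rounds=20):
    """Minimal AdaBoost with one-feature threshold stumps; labels y in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)           # sample weights, uniform at the start
    stumps = []                       # learned (feature, threshold, polarity, alpha)
    for _ in range(n_rounds):
        best, best_err = None, np.inf
        # Exhaustively pick the stump with the lowest weighted error.
        for f in range(X.shape[1]):
            for t in np.unique(X[:, f]):
                for pol in (1, -1):
                    pred = pol * np.where(X[:, f] <= t, 1, -1)
                    err = w[pred != y].sum()
                    if err < best_err:
                        best_err, best = err, (f, t, pol)
        best_err = min(max(best_err, 1e-10), 1 - 1e-10)   # avoid log(0)
        alpha = 0.5 * np.log((1 - best_err) / best_err)   # weak learner's vote
        f, t, pol = best
        pred = pol * np.where(X[:, f] <= t, 1, -1)
        w *= np.exp(-alpha * y * pred)    # up-weight misclassified samples
        w /= w.sum()
        stumps.append((f, t, pol, alpha))
    return stumps

def adaboost_predict(stumps, X):
    """Sign of the alpha-weighted vote of all stumps."""
    score = np.zeros(len(X))
    for f, t, pol, alpha in stumps:
        score += alpha * pol * np.where(X[:, f] <= t, 1, -1)
    return np.sign(score)
```

The re-weighting step is what lets even a single type of weak classifier, retrained each round, concentrate on the hard-to-classify connections.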

Conference papers on the topic "Detector based on Bayes Theory"

1

Shrivastava, Rajesh Kumar, Saradhi Ramakrishna, and Chittaranjan Hota. "Game Theory based Modified Naïve-bayes Algorithm to detect DoS attacks using Honeypot." In 2019 IEEE 16th India Council International Conference (INDICON). IEEE, 2019. http://dx.doi.org/10.1109/indicon47234.2019.9030355.

2

Hellsten, Hans, Renato Machado, Mats I. Pettersson, Viet Thuy Vu, and Patrik Dammert. "Experimental results on change detection based on Bayes probability theorem." In IGARSS 2015 - 2015 IEEE International Geoscience and Remote Sensing Symposium. IEEE, 2015. http://dx.doi.org/10.1109/igarss.2015.7325764.

3

Akasaki, Satoshi, Naoki Yoshinaga, and Masashi Toyoda. "Early Discovery of Emerging Entities in Microblogs." In Twenty-Eighth International Joint Conference on Artificial Intelligence {IJCAI-19}. California: International Joint Conferences on Artificial Intelligence Organization, 2019. http://dx.doi.org/10.24963/ijcai.2019/678.

Abstract:
Keeping up to date on emerging entities that appear every day is indispensable for various applications, such as social-trend analysis and marketing research. Previous studies have attempted to detect unseen entities not registered in a particular knowledge base as emerging entities, and consequently also find non-emerging entities, since the absence of an entity from a knowledge base does not guarantee its emergence. We therefore introduce a novel task of discovering truly emerging entities when they have just been introduced to the public through microblogs, and propose an effective method based on time-sensitive distant supervision, which exploits the distinctive early-stage contexts of emerging entities. Experimental results with a large-scale Twitter archive show that the proposed method achieves 83.2% precision on the top 500 discovered emerging entities, outperforming baselines based on unseen-entity recognition with burst detection. Besides notable emerging entities, our method can discover massive numbers of long-tail and homographic emerging entities. An evaluation of relative recall shows that the method detects 80.4% of emerging entities newly registered in Wikipedia; 92.8% of them are discovered earlier than their registration in Wikipedia, and the average lead time is more than one year (578 days).
4

Dai, Leilei, Yulong Pei, and Shaohua Ping. "Road Traffic Safety Micro-Evaluation Based on Bayes Theory." In Ninth International Conference of Chinese Transportation Professionals (ICCTP). Reston, VA: American Society of Civil Engineers, 2009. http://dx.doi.org/10.1061/41064(358)112.

5

Yang, Xiuxin, Anh Dinh, and Li Chen. "A wearable real-time fall detector based on Naive Bayes classifier." In 2010 IEEE 23rd Canadian Conference on Electrical and Computer Engineering - CCECE. IEEE, 2010. http://dx.doi.org/10.1109/ccece.2010.5575129.

6

Zheng, Yang, Ting Peng, Yuebin Chen, Jianpei Chen, and Lin Gao. "Cooperative sensing of improved energy detector based on minimum Bayes risk." In 11th International Conference on Wireless Communications, Networking and Mobile Computing (WiCOM 2015). Institution of Engineering and Technology, 2015. http://dx.doi.org/10.1049/cp.2015.0660.

7

Wu, Haiyuan, Qian Chen, and M. Yachida. "A fuzzy-theory-based face detector." In Proceedings of 13th International Conference on Pattern Recognition. IEEE, 1996. http://dx.doi.org/10.1109/icpr.1996.546979.

8

Fleming, Karl N., and Bengt O. Y. Lydell. "Use of Markov Piping Reliability Models to Evaluate Time Dependent Frequencies of Loss of Coolant Accidents." In 12th International Conference on Nuclear Engineering. ASMEDC, 2004. http://dx.doi.org/10.1115/icone12-49172.

Abstract:
Markov model theory has been applied to develop a method to evaluate the influence of alternative strategies for in-service inspection and leak detection on the frequency of leaks and ruptures in nuclear power plant piping systems [1–4]. This approach to quantification of pipe rupture frequency was originally based on a Bayes' uncertainty analysis approach to derive piping system failure rates from a combination of service experience data and some simple reliability models [5–7]. More recently the Markov model approach has been used in conjunction with probabilistic fracture mechanics methods in the study of flow accelerated corrosion [8]. One interesting property of the Markov model is its capability to evaluate time-dependent rupture frequencies via the model's hazard rate. In this paper this time-dependent modeling capability is used to investigate the age-related and time-dependent frequencies of loss of coolant accident (LOCA) initiating events. A case is made that plant-age-dependent LOCA frequencies should be used in lieu of other metrics commonly used in probabilistic risk assessments and in risk-informed in-service inspection evaluations. Such more commonly used metrics include the assumed constant failure rate method and the lifetime average rupture probability. Both of these methods are shown to provide optimistic estimates of LOCA frequencies for plants in the latter part of their design lifetimes, which most operating plants are approaching.
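The time-dependent hazard-rate idea described above can be illustrated with a small sketch; the three-state piping model (intact, leaking, ruptured) and all transition rates below are hypothetical illustrations, not values from the paper:

```python
import numpy as np

# Hypothetical 3-state piping model: 0 = intact, 1 = leaking, 2 = ruptured.
# Rates (per year): phi = leak occurrence, rho = rupture given a leak,
# mu = leak repair (leak detection returns the system to the intact state).
phi, rho, mu = 1e-3, 1e-2, 0.5
Q = np.array([[-phi,         phi,  0.0],
              [  mu, -(mu + rho),  rho],
              [ 0.0,         0.0,  0.0]])   # rupture is absorbing

def rupture_hazard(t, dt=0.01):
    """Hazard rate h(t) = f(t) / (1 - F(t)) for entering the rupture state."""
    p = np.array([1.0, 0.0, 0.0])           # start in the intact state
    for _ in range(round(t / dt)):
        p = p + dt * (p @ Q)                # forward-Euler step of dp/dt = pQ
    f = p[1] * rho                          # rupture probability density at t
    return f / (1.0 - p[2])                 # conditional on no rupture so far
```

Because leaks must accumulate before ruptures become likely, the hazard starts near zero and rises toward its asymptote, which is the behavior that makes constant-failure-rate and lifetime-average metrics optimistic for plants late in their design lifetimes.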
9

Zhang, Bo-ping, and Cui-shan Zhang. "Research of Software Reliability Estimation Method Based on Bayes theory." In 2009 IEEE International Conference on Intelligent Computing and Intelligent Systems (ICIS 2009). IEEE, 2009. http://dx.doi.org/10.1109/icicisys.2009.5358235.

10

Yang, Shaoshi, and Lajos Hanzo. "Exact Bayes' theorem based probabilistic data association for iterative MIMO detection and decoding." In 2013 IEEE Global Communications Conference (GLOBECOM 2013). IEEE, 2013. http://dx.doi.org/10.1109/glocom.2013.6831350.
