Journal articles on the topic 'Bayesian intelligence'

Consult the top 50 journal articles for your research on the topic 'Bayesian intelligence.'


You can also download the full text of each academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles in a wide variety of disciplines and organise your bibliography correctly.

1

Zelterman, Daniel. "Bayesian Artificial Intelligence." Technometrics 47, no. 1 (February 2005): 101–2. http://dx.doi.org/10.1198/tech.2005.s836.

2

Ramoni, Marco F. "Bayesian Artificial Intelligence." Journal of the American Statistical Association 100, no. 471 (September 2005): 1096–97. http://dx.doi.org/10.1198/jasa.2005.s39.

3

Jensen, Finn V. "Bayesian Artificial Intelligence." Pattern Analysis and Applications 7, no. 2 (May 26, 2004): 221–23. http://dx.doi.org/10.1007/s10044-004-0214-5.

4

Vreeswijk, Gerard A. W. "Book Review: Bayesian Artificial Intelligence." Artificial Intelligence and Law 11, no. 4 (2003): 289–98. http://dx.doi.org/10.1023/b:arti.0000045970.25670.25.

5

Pascual-Garcia, Erica, and Guillermo De la Torre-Gea. "Bayesian Analysis to the experiences of corruption through Artificial Intelligence." International Journal of Trend in Scientific Research and Development 2, no. 2 (February 28, 2018): 103–7. http://dx.doi.org/10.31142/ijtsrd2443.

6

Muhsina, Elvanisa Ayu, and Nurochman Nurochman. "SISTEM PAKAR REKOMENDASI PROFESI BERDASARKAN MULTIPLE INTELLIGENCES MENGGUNAKAN TEOREMA BAYESIAN." JISKA (Jurnal Informatika Sunan Kalijaga) 2, no. 1 (August 29, 2017): 16. http://dx.doi.org/10.14421/jiska.2017.21-03.

Abstract:
Intelligence is perhaps one of the most logical ways to determine how smart a person is. That fact has long been a problem in employment, because many attractive jobs demand a high GPA, yet an employee with a high GPA does not always fit the skills and role of the position and may be unable to understand and maintain their performance. This expert system is needed to recommend a profession based on intelligence. The research uses a Bayes'-theorem calculation to find probability values and job recommendations. The user's MI (Multiple Intelligences) values, the probability of each MI given a job, and the prior probability of each job without any evidence produce the calculation variables. Testing shows that the system's output recommendations match the expert's recommendations in 81.25% of cases, and 100% of users state that the system runs well. The usability test shows that 80% of users strongly agree, 15.7% agree, and 4.3% are neutral. Keywords: Multiple Intelligences, Profession, Bayesian theorem
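The Bayes'-theorem calculation this abstract describes can be sketched as follows. All professions, intelligences, and probability values below are illustrative placeholders, not the paper's data:

```python
# A minimal sketch of a Bayes'-theorem profession recommender: combine a
# prior over professions with per-intelligence likelihoods, then normalize.
# All names and numbers are illustrative, not taken from the paper.

def recommend(priors, likelihoods, observed):
    """Rank professions by posterior probability given observed intelligences."""
    scores = {}
    for job, prior in priors.items():
        score = prior
        for mi in observed:
            # P(mi | job); a small floor avoids zeroing out unseen intelligences
            score *= likelihoods[job].get(mi, 1e-6)
        scores[job] = score
    total = sum(scores.values())  # normalize so posteriors sum to 1
    return sorted(((job, s / total) for job, s in scores.items()),
                  key=lambda kv: kv[1], reverse=True)

priors = {"teacher": 0.5, "engineer": 0.5}
likelihoods = {
    "teacher": {"linguistic": 0.8, "logical": 0.3},
    "engineer": {"linguistic": 0.2, "logical": 0.9},
}
ranking = recommend(priors, likelihoods, ["linguistic", "logical"])
print(ranking)
```

Here the highest-scoring profession would be the system's recommendation, mirroring the abstract's use of MI values as evidence.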
7

Terziyan, Vagan. "A Bayesian Metanetwork." International Journal on Artificial Intelligence Tools 14, no. 03 (June 2005): 371–84. http://dx.doi.org/10.1142/s0218213005002156.

Abstract:
The Bayesian network (BN) is known to be one of the most solid probabilistic modeling tools. BN theory already provides several useful modifications of the classical network. Among these are context-enabled networks, such as multilevel networks or recursive multinets, which provide separate BN modelling for different combinations of contextual feature values. The main contribution of this paper is the multilevel probabilistic meta-model (Bayesian Metanetwork), which is an extension of the traditional BN and a modification of recursive multinets. It assumes that interoperability between component networks can be modeled by another BN. A Bayesian Metanetwork is a set of BNs layered on each other in such a way that the conditional or unconditional probability distributions associated with the nodes of each network depend on the probability distributions associated with the nodes of the next network. We treat the parameters (probability distributions) of a BN as random variables and allow conditional dependencies between these probabilities. Several cases of two-level Bayesian Metanetworks are presented, consisting of interrelated predictive and contextual BN models.
8

Paté-Cornell, Elisabeth. "Fusion of Intelligence Information: A Bayesian Approach." Risk Analysis 22, no. 3 (June 2002): 445–54. http://dx.doi.org/10.1111/0272-4332.00056.

9

Angelopoulos, Nicos, and James Cussens. "Bayesian learning of Bayesian networks with informative priors." Annals of Mathematics and Artificial Intelligence 54, no. 1-3 (November 2008): 53–98. http://dx.doi.org/10.1007/s10472-009-9133-x.

10

Sanghai, S., P. Domingos, and D. Weld. "Relational Dynamic Bayesian Networks." Journal of Artificial Intelligence Research 24 (December 2, 2005): 759–97. http://dx.doi.org/10.1613/jair.1625.

Abstract:
Stochastic processes that involve the creation of objects and relations over time are widespread, but relatively poorly studied. For example, accurate fault diagnosis in factory assembly processes requires inferring the probabilities of erroneous assembly operations, but doing this efficiently and accurately is difficult. Modeled as dynamic Bayesian networks, these processes have discrete variables with very large domains and extremely high dimensionality. In this paper, we introduce relational dynamic Bayesian networks (RDBNs), which are an extension of dynamic Bayesian networks (DBNs) to first-order logic. RDBNs are a generalization of dynamic probabilistic relational models (DPRMs), which we had proposed in our previous work to model dynamic uncertain domains. We first extend the Rao-Blackwellised particle filtering described in our earlier work to RDBNs. Next, we lift the assumptions associated with Rao-Blackwellization in RDBNs and propose two new forms of particle filtering. The first one uses abstraction hierarchies over the predicates to smooth the particle filter's estimates. The second employs kernel density estimation with a kernel function specifically designed for relational domains. Experiments show these two methods greatly outperform standard particle filtering on the task of assembly plan execution monitoring.
11

Tang, Xiao-liang, and Min Han. "Semi-supervised Bayesian ARTMAP." Applied Intelligence 33, no. 3 (February 14, 2009): 302–17. http://dx.doi.org/10.1007/s10489-009-0167-x.

12

Klami, Arto. "Bayesian object matching." Machine Learning 92, no. 2-3 (April 30, 2013): 225–50. http://dx.doi.org/10.1007/s10994-013-5357-4.

13

Rosman, Benjamin, Majd Hawasly, and Subramanian Ramamoorthy. "Bayesian policy reuse." Machine Learning 104, no. 1 (February 22, 2016): 99–127. http://dx.doi.org/10.1007/s10994-016-5547-y.

14

Francis, George, and Emil O. W. Kirkegaard. "National Intelligence and Economic Growth: A Bayesian Update." Mankind Quarterly 63, no. 1 (2022): 9–78. http://dx.doi.org/10.46469/mq.2022.63.1.2.

15

Jun, Sunghae. "Frequentist and Bayesian Learning Approaches to Artificial Intelligence." International Journal of Fuzzy Logic and Intelligent Systems 16, no. 2 (June 30, 2016): 111–18. http://dx.doi.org/10.5391/ijfis.2016.16.2.111.

16

Garbolino, Paolo. "Bayesian theory and artificial intelligence: The quarrelsome marriage." International Journal of Man-Machine Studies 27, no. 5-6 (November 1987): 729–42. http://dx.doi.org/10.1016/s0020-7373(87)80027-5.

17

Delcroix, Véronique, Mohamed-Amine Maalej, and Sylvain Piechowiak. "Bayesian Networks versus Other Probabilistic Models for the Multiple Diagnosis of Large Devices." International Journal on Artificial Intelligence Tools 16, no. 03 (June 2007): 417–33. http://dx.doi.org/10.1142/s0218213007003345.

Abstract:
Multiple diagnosis methods using Bayesian networks are rooted in numerous research projects about model-based diagnosis. Some of this research exploits probabilities to make a diagnosis. Many Bayesian network applications are used for medical diagnosis or for the diagnosis of technical problems in small or moderately large devices. This paper explains in detail the advantages of using Bayesian networks as graphic probabilistic models for diagnosing complex devices, and then compares such models with other probabilistic models that may or may not use Bayesian networks.
18

Liu, Wei-Yi, and Kun Yue. "Bayesian Network with Interval Probability Parameters." International Journal on Artificial Intelligence Tools 20, no. 05 (October 2011): 911–39. http://dx.doi.org/10.1142/s0218213011000449.

Abstract:
Interval data are widely used in real applications to represent the values of quantities in uncertain situations. However, the implied probabilistic causal relationships among interval-valued variables with interval data cannot be represented and inferred by general Bayesian networks with point-based probability parameters. Thus, it is desired to extend the general Bayesian network with effective mechanisms of representation, learning and inference of probabilistic causal relationships implied in interval data. In this paper, we define the interval probabilities, the bound-limited weak conditional interval probabilities and the probabilistic description, as well as the multiplication rules. Furthermore, we propose the method for learning the Bayesian network structure from interval data and the algorithm for corresponding approximate inferences. Experimental results show that our methods are feasible, and we conclude that the Bayesian network with interval probability parameters is the expansion of the general Bayesian network.
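As a toy illustration of how interval parameters generalize point-valued ones, naive interval arithmetic on probability bounds looks like this. The paper's bound-limited multiplication rules are more careful than this sketch, and the numeric bounds are illustrative:

```python
# Naive interval arithmetic on probability bounds. The paper's bound-limited
# weak conditional interval probabilities impose tighter constraints; this
# only shows how point-valued parameters arise as degenerate intervals.

def interval_mul(a, b):
    """Multiply two probability intervals (lo, hi), clipping to [0, 1]."""
    lo = max(0.0, a[0] * b[0])
    hi = min(1.0, a[1] * b[1])
    return lo, hi

# P(A) known only to lie in [0.2, 0.4]; P(B | A) known to lie in [0.5, 0.7]
p_joint = interval_mul((0.2, 0.4), (0.5, 0.7))  # bounds on P(A, B)
print(p_joint)

# A point probability is just the degenerate interval [p, p]
point = interval_mul((0.3, 0.3), (0.6, 0.6))
print(point)
```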
19

Tang, Kewei, Zhixun Su, Jie Zhang, Lihong Cui, Wei Jiang, Xiaonan Luo, and Xiyan Sun. "Bayesian rank penalization." Neural Networks 116 (August 2019): 246–56. http://dx.doi.org/10.1016/j.neunet.2019.04.018.

20

Monderer, D., and M. Tennenholtz. "Dynamic Non-Bayesian Decision Making." Journal of Artificial Intelligence Research 7 (November 1, 1997): 231–48. http://dx.doi.org/10.1613/jair.447.

Abstract:
The model of a non-Bayesian agent who faces a repeated game with incomplete information against Nature is an appropriate tool for modeling general agent-environment interactions. In such a model the environment state (controlled by Nature) may change arbitrarily, and the feedback/reward function is initially unknown. The agent is not Bayesian; that is, he forms a prior probability neither on the state selection strategy of Nature nor on his reward function. A policy for the agent is a function which assigns an action to every history of observations and actions. Two basic feedback structures are considered. In one of them -- the perfect monitoring case -- the agent is able to observe the previous environment state as part of his feedback, while in the other -- the imperfect monitoring case -- all that is available to the agent is the reward obtained. Both of these settings refer to partially observable processes, where the current environment state is unknown. Our main result refers to the competitive ratio criterion in the perfect monitoring case. We prove the existence of an efficient stochastic policy that ensures that the competitive ratio is obtained at almost all stages with an arbitrarily high probability, where efficiency is measured in terms of rate of convergence. It is further shown that such an optimal policy does not exist in the imperfect monitoring case. Moreover, it is proved that in the perfect monitoring case there does not exist a deterministic policy that satisfies our long run optimality criterion. In addition, we discuss the maxmin criterion and prove that a deterministic efficient optimal strategy does exist in the imperfect monitoring case under this criterion. Finally we show that our approach to long-run optimality can be viewed as qualitative, which distinguishes it from previous work in this area.
21

Hubin, Aliaksandr, Geir Storvik, and Florian Frommlet. "Flexible Bayesian Nonlinear Model Configuration." Journal of Artificial Intelligence Research 72 (November 22, 2021): 901–42. http://dx.doi.org/10.1613/jair.1.13047.

Abstract:
Regression models are used in a wide range of applications providing a powerful scientific tool for researchers from different fields. Linear, or simple parametric, models are often not sufficient to describe complex relationships between input variables and a response. Such relationships can be better described through flexible approaches such as neural networks, but this results in less interpretable models and potential overfitting. Alternatively, specific parametric nonlinear functions can be used, but the specification of such functions is in general complicated. In this paper, we introduce a flexible approach for the construction and selection of highly flexible nonlinear parametric regression models. Nonlinear features are generated hierarchically, similarly to deep learning, but have additional flexibility on the possible types of features to be considered. This flexibility, combined with variable selection, allows us to find a small set of important features and thereby more interpretable models. Within the space of possible functions, a Bayesian approach, introducing priors for functions based on their complexity, is considered. A genetically modified mode jumping Markov chain Monte Carlo algorithm is adopted to perform Bayesian inference and estimate posterior probabilities for model averaging. In various applications, we illustrate how our approach is used to obtain meaningful nonlinear models. Additionally, we compare its predictive performance with several machine learning algorithms.
22

Yap, Ghim-Eng, Ah-Hwee Tan, and Hwee-Hwa Pang. "Explaining inferences in Bayesian networks." Applied Intelligence 29, no. 3 (October 7, 2007): 263–78. http://dx.doi.org/10.1007/s10489-007-0093-8.

23

Kyburg, Henry E. "Bayesian and non-bayesian evidential updating." Artificial Intelligence 31, no. 3 (March 1987): 271–93. http://dx.doi.org/10.1016/0004-3702(87)90068-3.

24

Frühwirth-Schnatter, Sylvia. "On fuzzy Bayesian inference." Fuzzy Sets and Systems 60, no. 1 (November 1993): 41–58. http://dx.doi.org/10.1016/0165-0114(93)90288-s.

25

Jamil, Waqas, and Abdelhamid Bouchachia. "Online Bayesian shrinkage regression." Neural Computing and Applications 32, no. 23 (May 27, 2020): 17759–67. http://dx.doi.org/10.1007/s00521-020-04947-y.

26

Jing, Yushi, Vladimir Pavlović, and James M. Rehg. "Boosted Bayesian network classifiers." Machine Learning 73, no. 2 (August 15, 2008): 155–84. http://dx.doi.org/10.1007/s10994-008-5065-7.

27

Khan, Suleiman A., Eemeli Leppäaho, and Samuel Kaski. "Bayesian multi-tensor factorization." Machine Learning 105, no. 2 (June 10, 2016): 233–53. http://dx.doi.org/10.1007/s10994-016-5563-y.

28

De Sousa Ribeiro, Fabio, Francesco Calivá, Mark Swainson, Kjartan Gudmundsson, Georgios Leontidis, and Stefanos Kollias. "Deep Bayesian Self-Training." Neural Computing and Applications 32, no. 9 (July 10, 2019): 4275–91. http://dx.doi.org/10.1007/s00521-019-04332-4.

29

Sun, Xingping, Chang Chen, Lu Wang, Hongwei Kang, Yong Shen, and Qingyi Chen. "Hybrid Optimization Algorithm for Bayesian Network Structure Learning." Information 10, no. 10 (September 24, 2019): 294. http://dx.doi.org/10.3390/info10100294.

Abstract:
Since the beginning of the 21st century, research on artificial intelligence has made great progress, and Bayesian networks have gradually become one of its hotspots and important achievements. Establishing an effective Bayesian network structure is the foundation and core of learning and applying Bayesian networks. In Bayesian network structure learning, the traditional method of constructing the network structure from expert knowledge is gradually being replaced by methods that learn the structure from data. However, because of the large number of possible network structures, the search space is very large, and methods that learn the structure from training data usually suffer from low precision or high complexity, so the learned structure can differ greatly from the real one, which strongly affects the reasoning and practical application of Bayesian networks. To solve this problem, a hybrid-optimization artificial bee colony algorithm is discretized and applied to structure learning, and a hybrid optimization technique for Bayesian network structure learning is proposed. Experimental simulation results show that the proposed hybrid optimization structure learning algorithm yields better structures and better convergence.
30

Fujimaki, Ryohei, Takehisa Yairi, and Kazuo Machida. "Sparse Bayesian Learning for Nonstationary Data Sources." Transactions of the Japanese Society for Artificial Intelligence 23 (2008): 50–57. http://dx.doi.org/10.1527/tjsai.23.50.

31

Ordyniak, S., and S. Szeider. "Parameterized Complexity Results for Exact Bayesian Network Structure Learning." Journal of Artificial Intelligence Research 46 (March 5, 2013): 263–302. http://dx.doi.org/10.1613/jair.3744.

Abstract:
Bayesian network structure learning is the notoriously difficult problem of discovering a Bayesian network that optimally represents a given set of training data. In this paper we study the computational worst-case complexity of exact Bayesian network structure learning under graph theoretic restrictions on the (directed) super-structure. The super-structure is an undirected graph that contains as subgraphs the skeletons of solution networks. We introduce the directed super-structure as a natural generalization of its undirected counterpart. Our results apply to several variants of score-based Bayesian network structure learning where the score of a network decomposes into local scores of its nodes. Results: We show that exact Bayesian network structure learning can be carried out in non-uniform polynomial time if the super-structure has bounded treewidth, and in linear time if in addition the super-structure has bounded maximum degree. Furthermore, we show that if the directed super-structure is acyclic, then exact Bayesian network structure learning can be carried out in quadratic time. We complement these positive results with a number of hardness results. We show that both restrictions (treewidth and degree) are essential and cannot be dropped without losing uniform polynomial time tractability (subject to a complexity-theoretic assumption). Similarly, exact Bayesian network structure learning remains NP-hard for "almost acyclic" directed super-structures. Furthermore, we show that the restrictions remain essential if we do not search for a globally optimal network but aim to improve a given network by means of at most k arc additions, arc deletions, or arc reversals (k-neighborhood local search).
32

Codetta-Raiteri, Daniele. "Editorial for the Special Issue on “Bayesian Networks: Inference Algorithms, Applications, and Software Tools”." Algorithms 14, no. 5 (April 27, 2021): 138. http://dx.doi.org/10.3390/a14050138.

33

Hanif, Ayub, and Robert Elliott Smith. "State Space Modeling & Bayesian Inference with Computational Intelligence." New Mathematics and Natural Computation 11, no. 01 (March 2015): 71–101. http://dx.doi.org/10.1142/s1793005715500040.

Abstract:
Recursive Bayesian estimation using sequential Monte Carlo methods is a powerful numerical technique for understanding the latent dynamics of nonlinear non-Gaussian dynamical systems. It enables us to reason under uncertainty and addresses shortcomings of deterministic systems and control theories, which do not provide sufficient means of performing analysis and design. In addition, parametric techniques such as the Kalman filter and its extensions, though computationally efficient, do not reliably compute states and cannot be used to learn stochastic problems. We review recursive Bayesian estimation using sequential Monte Carlo methods, highlighting open problems, primary among which are weight degeneracy and sample impoverishment. We proceed to detail synergistic computational-intelligence sequential Monte Carlo methods which address these problems. We find that imbuing sequential Monte Carlo methods with computational intelligence has many advantages across a wide range of application and problem domains.
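The baseline this survey builds on, a bootstrap particle filter with multinomial resampling (resampling being the standard counter to weight degeneracy), can be sketched for a toy one-dimensional random-walk model. The model and all parameters below are illustrative:

```python
import math
import random

# Bootstrap particle filter for a 1-D random-walk state observed in Gaussian
# noise: a toy version of recursive Bayesian estimation via sequential Monte
# Carlo. The resampling step is the standard remedy for weight degeneracy.

def particle_filter(observations, n=500, proc_sd=1.0, obs_sd=1.0, seed=0):
    rng = random.Random(seed)
    particles = [rng.gauss(0.0, 1.0) for _ in range(n)]
    estimates = []
    for y in observations:
        # 1. Propagate each particle through the transition model
        particles = [x + rng.gauss(0.0, proc_sd) for x in particles]
        # 2. Weight particles by the Gaussian observation likelihood
        weights = [math.exp(-0.5 * ((y - x) / obs_sd) ** 2) for x in particles]
        total = sum(weights)
        weights = [w / total for w in weights]
        estimates.append(sum(w * x for w, x in zip(weights, particles)))
        # 3. Multinomial resampling: discard low-weight particles
        particles = rng.choices(particles, weights=weights, k=n)
    return estimates

obs = [0.2, 0.9, 2.1, 2.8, 4.1]  # noisy readings of an upward-drifting state
est = particle_filter(obs)
print(est[-1])
```

The computational-intelligence variants the abstract surveys replace or augment step 3 to fight sample impoverishment.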
34

Yan, Dingqi, Qi Zhou, Jianzhou Wang, and Na Zhang. "Bayesian regularisation neural network based on artificial intelligence optimisation." International Journal of Production Research 55, no. 8 (September 29, 2016): 2266–87. http://dx.doi.org/10.1080/00207543.2016.1237785.

35

Nadiri, Ata Allah, Nima Chitsazan, Frank T. C. Tsai, and Asghar Asghari Moghaddam. "Bayesian Artificial Intelligence Model Averaging for Hydraulic Conductivity Estimation." Journal of Hydrologic Engineering 19, no. 3 (March 2014): 520–32. http://dx.doi.org/10.1061/(asce)he.1943-5584.0000824.

36

Schocken, S., and P. R. Kleindorfer. "Artificial intelligence dialects of the Bayesian belief revision language." IEEE Transactions on Systems, Man, and Cybernetics 19, no. 5 (1989): 1106–21. http://dx.doi.org/10.1109/21.44027.

37

Durante, Daniele, Sally Paganin, Bruno Scarpa, and David B. Dunson. "Bayesian modelling of networks in complex business intelligence problems." Journal of the Royal Statistical Society: Series C (Applied Statistics) 66, no. 3 (July 27, 2016): 555–80. http://dx.doi.org/10.1111/rssc.12168.

38

Motzek, Alexander, and Ralf Möller. "Indirect Causes in Dynamic Bayesian Networks Revisited." Journal of Artificial Intelligence Research 59 (May 27, 2017): 1–58. http://dx.doi.org/10.1613/jair.5361.

Abstract:
Modeling causal dependencies often demands cycles at a coarse-grained temporal scale. If Bayesian networks are to be used for modeling uncertainties, cycles are eliminated with dynamic Bayesian networks, spreading indirect dependencies over time and enforcing an infinitesimal resolution of time. Without a "causal design," i.e., without anticipating indirect influences appropriately in time, we argue that such networks return spurious results. By identifying activator random variables, we propose activator dynamic Bayesian networks (ADBNs) which are able to rapidly adapt to contexts under a causal use of time, anticipating indirect influences on a solid mathematical basis using familiar Bayesian network semantics. ADBNs are well-defined dynamic probabilistic graphical models allowing one to model cyclic dependencies from local and causal perspectives while preserving a classical, familiar calculus and classically known algorithms, without introducing any overhead in modeling or inference.
39

Aussem, Alex. "Bayesian networks." Neurocomputing 73, no. 4-6 (January 2010): 561–62. http://dx.doi.org/10.1016/j.neucom.2009.11.001.

40

Imazawa, Kei, and Yoshiteru Katsumura. "Field Failure Prediction using Bayesian Network." Transactions of the Japanese Society for Artificial Intelligence 31, no. 2 (2016): L-D43_1–9. http://dx.doi.org/10.1527/tjsai.l-d43.

41

Duenas Santana, J. A. "Using Bayesian Networks for Quantifying Domino Effect Probability in a Hydrocarbon Processing Area." Petroleum & Petrochemical Engineering Journal 5, no. 3 (2021): 1–10. http://dx.doi.org/10.23880/ppej-16000274.

Abstract:
Accidents in process industries include fires, explosions, or toxic releases, depending on the properties of the spilled material and the ignition sources. One of the worst phenomena that may occur is the so-called domino effect, which has serious consequences for people, the environment, and the economy. That is why the European Commission defined domino effect prediction as a mandatory challenge for the years ahead. Quantifying the probability of a domino effect is a complex task because of the multiple, synergistic effects among all the accidents that should be included in the analysis. However, these techniques can be integrated with others to represent the occurrence of the domino effect reliably, and here artificial intelligence plays a vital role. Bayesian networks, as one class of artificial intelligence models, have been widely applied to determine the likelihood of a domino effect. This research aims to provide a guide for quantifying domino effect probability using Bayesian networks in a hydrocarbon processing area. For this purpose, a four-step model is proposed that integrates some classical risk analysis techniques with Bayesian networks, and the methodology is applied to an actual hydrocarbon storage and processing facility. The joint probability can reach 9.37% for process unit tank 703, which stores naphtha, so safety management plans in this area must be improved to reduce the current risk level. Finally, this research demonstrates how artificial intelligence techniques should be integrated with classical ones in order to obtain more reliable results.
42

Halpern, J. Y. "Conditional Plausibility Measures and Bayesian Networks." Journal of Artificial Intelligence Research 14 (June 1, 2001): 359–89. http://dx.doi.org/10.1613/jair.817.

Abstract:
A general notion of algebraic conditional plausibility measures is defined. Probability measures, ranking functions, possibility measures, and (under the appropriate definitions) sets of probability measures can all be viewed as defining algebraic conditional plausibility measures. It is shown that algebraic conditional plausibility measures can be represented using Bayesian networks.
43

Ahn, Jae Joon, Hyun Woo Byun, Kyong Joo Oh, and Tae Yoon Kim. "Bayesian forecaster using class-based optimization." Applied Intelligence 36, no. 3 (January 18, 2011): 553–63. http://dx.doi.org/10.1007/s10489-011-0275-2.

44

Sasu, L. M., and R. Andonie. "Bayesian ARTMAP for regression." Neural Networks 46 (October 2013): 23–31. http://dx.doi.org/10.1016/j.neunet.2013.04.006.

45

Mancuhan, Koray, and Chris Clifton. "Combating discrimination using Bayesian networks." Artificial Intelligence and Law 22, no. 2 (February 17, 2014): 211–38. http://dx.doi.org/10.1007/s10506-014-9156-4.

46

Khreisat, Laila. "Real Time Inference in Bayesian Networks: An Anytime Approach." International Journal on Artificial Intelligence Tools 14, no. 03 (June 2005): 477–89. http://dx.doi.org/10.1142/s0218213005002211.

Abstract:
One of the major challenges facing real time world applications that employ Bayesian networks, is the design and development of efficient inference algorithms. In this paper we present an approximate real time inference algorithm for Bayesian Networks. The algorithm is an anytime reasoning method based on probabilistic inequalities, capable of handling fully and partially quantified Bayesian networks. In our method the accuracy of the results improve gradually as computation time increases, providing a trade-off between resource consumption and output quality. The method is tractable in providing the initial answers, as well as complete in the limiting case.
47

Kawahara, Yoshinobu, Takehisa Yairi, and Kazuo Machida. "Spacecraft Diagnosis Method Using Dynamic Bayesian Networks." Transactions of the Japanese Society for Artificial Intelligence 21 (2006): 45–54. http://dx.doi.org/10.1527/tjsai.21.45.

48

Daly, R., and Q. Shen. "Learning Bayesian Network Equivalence Classes with Ant Colony Optimization." Journal of Artificial Intelligence Research 35 (June 30, 2009): 391–447. http://dx.doi.org/10.1613/jair.2681.

Abstract:
Bayesian networks are a useful tool in the representation of uncertain knowledge. This paper proposes a new algorithm called ACO-E, to learn the structure of a Bayesian network. It does this by conducting a search through the space of equivalence classes of Bayesian networks using Ant Colony Optimization (ACO). To this end, two novel extensions of traditional ACO techniques are proposed and implemented. Firstly, multiple types of moves are allowed. Secondly, moves can be given in terms of indices that are not based on construction graph nodes. The results of testing show that ACO-E performs better than a greedy search and other state-of-the-art and metaheuristic algorithms whilst searching in the space of equivalence classes.
49

Uhm, Daiho, Jea-Bok Ryu, and Sunghae Jun. "Patent Data Analysis of Artificial Intelligence Using Bayesian Interval Estimation." Applied Sciences 10, no. 2 (January 13, 2020): 570. http://dx.doi.org/10.3390/app10020570.

Abstract:
Technology analysis is one of the important tasks in technology and industrial management. Much information about technology is contained in patent documents, so patent data analysis is required for technology analysis. Existing patent analyses have relied on quantitative analysis of the collected patent documents; in technology analysis, however, expert prior knowledge should also be considered. In this paper, we study a patent analysis method using Bayesian inference, which considers the prior experience of experts and the likelihood function of the patent data at the same time. For keyword data analysis, we use Bayesian predictive interval estimation with count-data distributions such as the Poisson. Using the proposed models, we forecast the future trends of technological keywords of artificial intelligence (AI) in order to understand the future technology of AI, and we perform a case study to show how the proposed method can be applied in practice. We retrieve the patent documents related to AI technology and analyze them to find the technological trend of AI. From the results of the AI technology case study, we can find which technological keywords are more important or critical in the overall structure of the AI industry. Existing methods for patent keyword analysis depend only on the patent documents collected so far, but in technology analysis the prior knowledge of domain experts is as important as the collected documents. We therefore propose a method based on Bayesian inference for technology analysis using patent documents; our method combines patent data analysis with prior knowledge from domain experts.
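One common conjugate realisation of this idea, not necessarily the paper's exact model, is a Gamma prior on a keyword's yearly Poisson rate: the expert's expectation sets the prior, observed patent counts update it, and the predictive distribution of next year's count yields the interval. The prior values and counts below are illustrative:

```python
import math
import random

# Gamma-Poisson sketch of Bayesian keyword-trend estimation: a Gamma(a, b)
# prior on a keyword's yearly Poisson rate encodes expert knowledge, and
# observed yearly patent counts update it conjugately. All numbers are
# illustrative, not the paper's data.

def posterior(a, b, counts):
    """Conjugate update: Gamma(a, b) prior + Poisson counts -> Gamma posterior."""
    return a + sum(counts), b + len(counts)

def predictive_interval(shape, rate, level=0.95, draws=5000, seed=1):
    """Monte Carlo interval for the next count (a Gamma-Poisson mixture)."""
    rng = random.Random(seed)
    sims = []
    for _ in range(draws):
        lam = rng.gammavariate(shape, 1.0 / rate)  # sample a plausible rate
        # Poisson draw by CDF inversion (adequate for small rates)
        k, p, target = 0, math.exp(-lam), rng.random()
        cum = p
        while cum < target and k < 1000:
            k += 1
            p *= lam / k
            cum += p
        sims.append(k)
    sims.sort()
    return (sims[int((1 - level) / 2 * draws)],
            sims[int((1 + level) / 2 * draws) - 1])

# Expert prior: about 2 patents/year (Gamma mean a/b = 2); counts then rise
shape, rate = posterior(2.0, 1.0, [3, 5, 8, 12])
print(shape / rate)                  # posterior mean rate
print(predictive_interval(shape, rate))
```

The interval widens or narrows as the prior and the data agree or conflict, which is how expert knowledge enters the trend forecast.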
50

Lüdtke, Stefan, and Thomas Kirste. "Lifted Bayesian Filtering in Multiset Rewriting Systems." Journal of Artificial Intelligence Research 69 (December 7, 2020): 1203–54. http://dx.doi.org/10.1613/jair.1.12066.

Abstract:
We present a model for Bayesian filtering (BF) in discrete dynamic systems where multiple entities (inter)-act, i.e. where the system dynamics is naturally described by a Multiset rewriting system (MRS). Typically, BF in such situations is computationally expensive due to the high number of discrete states that need to be maintained explicitly. We devise a lifted state representation, based on a suitable decomposition of multiset states, such that some factors of the distribution are exchangeable and thus afford an efficient representation. Intuitively, this representation groups together similar entities whose properties follow an exchangeable joint distribution. Subsequently, we introduce a BF algorithm that works directly on lifted states, without resorting to the original, much larger ground representation. This algorithm directly lends itself to approximate versions by limiting the number of explicitly represented lifted states in the posterior. We show empirically that the lifted representation can lead to a factorial reduction in the representational complexity of the distribution, and in the approximate cases can lead to a lower variance of the estimate and a lower estimation error compared to the original, ground representation.