Dissertations / Theses on the topic 'Elicitation of expert belief'


Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 dissertations / theses for your research on the topic 'Elicitation of expert belief.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse dissertations / theses in a wide variety of disciplines and organise your bibliography correctly.

1

Briggs, Rachael (Rachael Amy). "Partial belief and expert testimony." Thesis, Massachusetts Institute of Technology, 2009. http://hdl.handle.net/1721.1/47829.

Full text
Abstract:
Thesis (Ph. D.)--Massachusetts Institute of Technology, Dept. of Linguistics and Philosophy, 2009.
Includes bibliographical references (p. [83]-86).
My dissertation investigates two questions from within a partial belief framework: First, when and how should deference to experts or other information sources be qualified? Second, how closely is epistemology related to other philosophical fields, such as metaphysics, ethics, and decision theory? Chapter 1 discusses David Lewis's "Big Bad Bug", an argument for the conclusion that the Principal Principle (the thesis that one's credence in a proposition A should equal one's expectation of A's chance, provided one has no inadmissible information) is incompatible with Humean Supervenience (the thesis that laws of nature, dispositions, and objective chances supervene on the distribution of categorical properties in the world, past, present, and future). I map out the logical structure of the Big Bad Bug, survey a range of possible responses to it, and argue that none of the responses are very appealing. Chapter 2 discusses Bas van Fraassen's Reflection principle, the thesis that one's current credence in a proposition A should equal one's expected future credence in A. Van Fraassen has formulated a diachronic Dutch book argument for Reflection, but other authors cite counterexamples to Reflection that appear to undermine the credibility of diachronic Dutch books. I argue that a suitably qualified version of Reflection gets around the counterexamples. I distinguish between Dutch books that reveal incoherence, like the diachronic Dutch book for conditionalization, and Dutch books that reveal a type of problem I call self-doubt. I argue that violating Reflection is a type of self-doubt rather than a type of incoherence.
Chapter 3 argues that the halfer and thirder solutions to Adam Elga's Sleeping Beauty problem correspond to two more general approaches to de se information. Which approach is right depends on which approach to decision theory is right. I use Dutch books and scoring rules to argue that causal decision theorists should favor the approach that corresponds to thirding, while evidential decision theorists should favor the approach that corresponds to halfing.
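For readers who want the two deference principles in symbols, they are commonly rendered roughly as follows, with Cr for credence and ch for objective chance; this is a standard textbook formulation, not notation quoted from the thesis.

```latex
% Principal Principle: credence defers to objective chance, given chance
% information and only admissible further evidence E.
\[
  Cr\big(A \mid ch(A) = x \,\wedge\, E\big) = x
\]
% Reflection: current credence defers to one's anticipated future credence.
\[
  Cr_{t}\big(A \mid Cr_{t'}(A) = x\big) = x \qquad \text{for } t' > t
\]
```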
by Rachael Briggs.
Ph.D.
APA, Harvard, Vancouver, ISO, and other styles
2

Selvidge, Jordan R. "Managing One-to-One Initiatives: Implementation Analysis Through Expert Elicitation." Digital Commons @ East Tennessee State University, 2016. https://dc.etsu.edu/etd/3143.

Full text
Abstract:
A qualitative phenomenological study was conducted to identify and analyze issues in the implementation of one-to-one computing initiatives and provide solutions for improvement. An understanding of the implementation process was developed through the analysis of data collected through 27 interviews with teacher experts in the field who have worked with the implementation of one-to-one programs. Teachers were purposely selected from the following groups: those who were completing their first year of teaching, those who had between two and ten years of teaching experience, and those who had eleven or more years of total teaching experience. This study is distinctive in addressing one-to-one initiatives both by placing importance on the utilization of negative knowledge and by simultaneously treating teacher perceptions as a valid reality. Issues associated with the implementation of one-to-one initiatives develop faster than traditional school structures are accustomed to responding to. Successful one-to-one management requires a responsive, interconnected, and efficient organizational structure. This research has significance for the improvement of one-to-one initiative implementation efforts. The findings contained in this research have the potential to benefit teachers, administrators, and other stakeholders associated with the implementation of one-to-one initiatives.
APA, Harvard, Vancouver, ISO, and other styles
3

Schneider, Mark. "Studies in risk perception and financial literacy: applications using subjective belief elicitation." Doctoral thesis, Faculty of Commerce, 2019. http://hdl.handle.net/11427/30349.

Full text
Abstract:
The concept of literacy has grown from “reading literacy” to now encompass many different domain-specific topics and skill sets, such as health literacy, financial literacy, and computer literacy. The way literacy is talked about, examined, measured, and communicated has also evolved. Literacy measures began as a simple metric of counting the number of individuals in a country that could read and dividing that count by the total population to compute the percentage of literate individuals. However, this approach ignores situations in which an illiterate person has access to a literate person who could read to them. This was the premise of research in development economics that introduced the measure of effective literacy, which accounts for potential positive externalities that could arise from access to a literate individual. This dissertation expands on the idea of effective literacy and introduces a concept of extended literacy, which applies to a decision-maker having access to an external scaffold during the decision-making process. The scaffolds considered include access to the internet, to an anonymous person as part of a group, and to a household member. The research presented here measures extended financial literacy under these various scaffolds. Financial literacy reflects an individual’s knowledge about financial matters, including the management of risks. The research assesses subjects’ knowledge about interest and inflation, budgeting, and longevity risk. The techniques used to measure literacy reflect state-of-the-art advances in subjective belief elicitation that allow for the recovery of each decision-maker’s entire underlying subjective distribution. This method generates a rich characterization of subjects’ beliefs and allows the construction of various measures of literacy, welfare, bias, and confidence with respect to a known, true answer. Using controlled laboratory and artefactual field experiments with real rewards and incentivized elicitation of beliefs, we find that these scaffolds reliably enhance literacy. We relate the notion of extended literacy to concepts in economics, cognitive science and philosophy, such as effective literacy, embedded literacy and embodied literacy.
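To make the contrast between a simple literacy rate and an access-based (effective or extended) rate concrete, here is a small illustrative sketch; the toy households, the names and the rule that one literate household member confers access are assumptions for illustration, not data or definitions from the thesis.

```python
# Toy illustration: simple literacy rate vs. an "effective" literacy rate
# that credits illiterate people who share a household with a literate person.
# The data and the crediting rule are illustrative assumptions only.

households = [
    {"alice": True, "bob": False},      # Bob gains access via Alice
    {"carol": False, "dan": False},     # no literate member
    {"eve": True},
]

people = [lit for hh in households for lit in hh.values()]
simple_rate = sum(people) / len(people)   # literate individuals / population

effective_count = 0
for hh in households:
    has_literate = any(hh.values())
    for lit in hh.values():
        effective_count += 1 if (lit or has_literate) else 0
effective_rate = effective_count / len(people)

print(f"simple literacy rate:    {simple_rate:.2f}")     # 0.40
print(f"effective literacy rate: {effective_rate:.2f}")  # 0.60
```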
APA, Harvard, Vancouver, ISO, and other styles
4

West, Daune. "Towards a subjective knowledge elicitation methodology for the development of expert systems." Thesis, University of Portsmouth, 1991. https://researchportal.port.ac.uk/portal/en/theses/towards-a-subjective-knowledge-elicitation-methodology-for-the-development-of-expert-systems(d63c460a-f71c-492d-9150-15c31becdb5b).html.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Alkhairy, Ibrahim H. "Designing and Encoding Scenario-based Expert Elicitation for Large Conditional Probability Tables." Thesis, Griffith University, 2020. http://hdl.handle.net/10072/390794.

Full text
Abstract:
This thesis focuses on the general problem of asking experts to assess the likelihood of many scenarios when there is insufficient time to ask about all possible scenarios. The challenge addressed here is one of experimental design: how to choose which scenarios are assessed, and how to use that limited data to extrapolate information about the scenarios that remain unasked? In a mathematical sense, this problem can be constructed as a problem of expert elicitation, where experts are asked to quantify conditional probability tables (CPTs). Experts may be relied on, for example, when empirical data are unavailable or limited. CPTs are used widely in statistical modelling to describe probabilistic relationships between an outcome and several factors. I consider two broad situations where CPTs are important components of quantitative models. Firstly, experts are often asked to quantify CPTs that form the building blocks of Bayesian Networks (BNs). In one case study, CPTs describe how the habitat suitability of feral pigs is related to various environmental factors, such as water quality and food availability. Secondly, CPTs may also support a sensitivity analysis for large computer experiments, by examining how some outcome changes as various factors are changed. Another case study uses CPTs to examine sensitivity to settings, for algorithms available through virtual laboratories, to map the geographic distribution of species such as the koala. An often-encountered problem is the sheer amount of information asked of the expert: the number of scenarios. Each scenario corresponds to a row of the CPT and concerns a particular combination of factors and the likely outcome. Currently most researchers arrange elicitation of CPTs by keeping the number of rows and columns in the CPT to a minimum, so that they need ask experts about no more than twenty or so scenarios. However, in some practical problems, CPTs may need to involve more rows and columns, for example involving more than two factors, or factors which can take on more than two or three possible values. Here we propose a new way of choosing the scenarios that underpin the elicitation strategy, taking advantage of experimental design to ensure adequate coverage of all scenarios and to make the best use of scarce resources, such as the experts’ valuable time. I show that this can essentially be constructed as a problem of how to better design the choice of scenarios to elicit from a CPT. The main advantage of these designs is that they explore more of the design space compared to usual design choices, like the one-factor-at-a-time (OFAT) design that underpins the popular encoding approach embedded in “CPT Calculator”. In addition, this work tailors an under-utilized scenario-based elicitation method to ensure that the expert’s uncertainty is captured together with their assessments of the likelihood of each possible outcome. I adopt the more intuitive Outside-In Elicitation method to elicit the expert’s plausible range of assessed values, rather than the more common and reverse-order approach of eliciting their uncertainty around their best guess. Importantly, this plausible range of values is more suitable for input into the new approach proposed here for encoding scenario-based elicitation: a Bayesian (rather than a Frequentist) interpretation. When only some scenarios from a large CPT are elicited, another challenge arises from the remaining CPT entries that are not elicited.
This thesis shows how to adopt a statistical model not only to interpolate the missing CPT entries but also to quantify the uncertainty for each scenario, which is new for these two situations: BNs and sensitivity analyses. For this purpose, I introduce the use of Bayesian generalized linear models (GLMs). The Bayesian updating framework also enables us to update the results of elicitation by incorporating empirical data. The idea is to utilise scenarios elicited from experts to construct an informative Bayesian “prior” model. Then the prior information (e.g. about scenarios) is combined with the empirical data (e.g. from computer model runs) to update the posterior estimates of plausible outcomes (affecting all scenarios). The main findings showed that Bayesian inference suits the small data problem of encoding the expert’s mental model underlying their assessments, allowing uncertainty to vary about each scenario. In addition, Bayesian inference provides rich feedback to the modeller and experts on the plausible influence of factors on the response, and on whether any information was gained on their interactions. That information could be pivotal to designing the next phase of elicitation about habitat requirements or another phase of computer models. In this way, the Bayesian paradigm naturally supports a sequential approach to gradually accruing information about the issue at hand. As summarised above, the novel statistical methodology presented in this thesis also contributes to computer science. Specifically, computation for Bayesian Networks and sensitivity analyses of large computer experiments can be re-designed to be more efficient. Here the expert knowledge is useful to complement the empirical data and inform a more comprehensive analysis.
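As a simplified illustration of the prior-plus-data idea described above (not the thesis's Bayesian GLM), the sketch below converts an expert's elicited plausible range for a single CPT entry into a Beta prior by moment matching and then updates it with hypothetical computer-model runs; all numbers are invented.

```python
# Simplified illustration (not the thesis's Bayesian GLM): turn an expert's
# elicited plausible range for one CPT entry into a Beta prior, then update
# it with hypothetical computer-model runs. Numbers are invented.

def beta_from_range(lo, hi):
    """Moment-match a Beta(a, b) to a plausible range, reading the midpoint
    as the mean and the half-width as roughly two standard deviations."""
    mean = (lo + hi) / 2.0
    sd = (hi - lo) / 4.0
    k = mean * (1 - mean) / sd**2 - 1      # total pseudo-count a + b
    return mean * k, (1 - mean) * k

# Expert judges the chance of a "suitable habitat" outcome for this scenario
# to lie somewhere between 0.5 and 0.9 (an outside-in style plausible range).
a, b = beta_from_range(0.5, 0.9)

# Hypothetical empirical evidence: 9 of 20 model runs gave that outcome.
successes, trials = 9, 20
a_post, b_post = a + successes, b + (trials - successes)

print(f"prior mean     {a / (a + b):.3f}  (a={a:.1f}, b={b:.1f})")
print(f"posterior mean {a_post / (a_post + b_post):.3f}")
```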
Thesis (PhD Doctorate)
Doctor of Philosophy (PhD)
School of Info & Comm Tech
Science, Environment, Engineering and Technology
APA, Harvard, Vancouver, ISO, and other styles
6

Akram, Muhammad Farooq. "A methodology for uncertainty quantification in quantitative technology valuation based on expert elicitation." Diss., Georgia Institute of Technology, 2012. http://hdl.handle.net/1853/47717.

Full text
Abstract:
The management of technology portfolios is an important element of aerospace system design. New technologies are often applied to new product designs to ensure their competitiveness at the time they are introduced to market. The future performance of yet-to-be-designed components is inherently uncertain, necessitating subject matter expert knowledge, statistical methods and financial forecasting. Estimates of the appropriate parameter settings often come from disciplinary experts, who may disagree with each other because of varying experience and background. Due to the inherently uncertain nature of expert elicitation in the technology valuation process, appropriate uncertainty quantification and propagation is critical. The uncertainty in defining the impact of an input on the performance parameters of a system makes it difficult to use traditional probability theory. Often the available information is not enough to assign appropriate probability distributions to uncertain inputs. Another problem faced during technology elicitation pertains to technology interactions in a portfolio. When multiple technologies are applied simultaneously to a system, their cumulative impact is often non-linear. Current methods assume that technologies are either incompatible or linearly independent. It is observed that, in the case of a lack of knowledge about the problem, epistemic uncertainty is the most suitable representation of the process. It reduces the number of assumptions during the elicitation process, where experts are otherwise forced to assign probability distributions to their opinions without sufficient knowledge. Epistemic uncertainty can be quantified by many techniques. In the present research it is proposed that interval analysis and the Dempster-Shafer theory of evidence are better suited for quantification of epistemic uncertainty in the technology valuation process. The proposed technique seeks to offset some of the problems faced by using deterministic or traditional probabilistic approaches for uncertainty propagation. Non-linear behavior in technology interactions is captured through expert-elicitation-based technology synergy matrices (TSMs). The proposed TSMs increase the fidelity of current technology forecasting methods by including higher-order technology interactions. A test case for quantification of epistemic uncertainty on a large-scale problem, a combined cycle power generation system, was selected. A detailed multidisciplinary modeling and simulation environment was adopted for this problem. Results have shown that the evidence theory based technique provides more insight into the uncertainties arising from incomplete information or lack of knowledge than deterministic or probability theory methods. Margin analysis was also carried out for both techniques. A detailed description of TSMs and their usage in conjunction with technology impact matrices and technology compatibility matrices is discussed. Various combination methods are also proposed for higher-order interactions, which can be applied according to expert opinion or historical data. The introduction of the technology synergy matrix enabled capturing higher-order technology interactions and improved the predicted system performance.
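As a minimal sketch of the evidence-theory machinery referred to above, the following snippet combines two hypothetical expert mass functions over a small frame of discernment using Dempster's rule of combination; the frame, the focal elements and the masses are invented for illustration and are not taken from the dissertation.

```python
from itertools import product

# Frame of discernment: qualitative levels of a technology's impact.
# Mass functions from two hypothetical experts; focal elements are frozensets.
m1 = {frozenset({"high"}): 0.6,
      frozenset({"high", "medium"}): 0.3,
      frozenset({"high", "medium", "low"}): 0.1}
m2 = {frozenset({"medium"}): 0.5,
      frozenset({"high", "medium"}): 0.4,
      frozenset({"high", "medium", "low"}): 0.1}

def dempster_combine(ma, mb):
    """Dempster's rule: intersect focal elements, renormalise by 1 - conflict."""
    combined, conflict = {}, 0.0
    for (A, wa), (B, wb) in product(ma.items(), mb.items()):
        inter = A & B
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    return {A: w / (1.0 - conflict) for A, w in combined.items()}, conflict

m12, k = dempster_combine(m1, m2)
for focal, mass in sorted(m12.items(), key=lambda kv: -kv[1]):
    print(set(focal), round(mass, 3))
print("conflict mass K =", round(k, 3))
```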
APA, Harvard, Vancouver, ISO, and other styles
7

Iamsumang, Chonlagarn. "A framework for nuclear facility safeguard evaluation using probabilistic methods and expert elicitation." Thesis, Massachusetts Institute of Technology, 2010. http://hdl.handle.net/1721.1/76528.

Full text
Abstract:
Thesis (S.M.)--Massachusetts Institute of Technology, Dept. of Nuclear Science and Engineering, 2010.
Cataloged from PDF version of thesis.
Includes bibliographical references (p. 99-100).
With the advancement of the next generation of nuclear fuel cycle facilities, concerns about the effectiveness of nuclear facility safeguards have been increasing due to the inclusion of highly enriched material and reprocessing capability in fuel cycles. Therefore, an extensive and quantitative safeguard evaluation is required in order for decision makers to have a consistent measure to verify the level of protection provided by safeguards, and to effectively improve the current safeguard scheme. The framework presented in this study provides a systematic method for safeguard evaluation of any nuclear facility. Using a scenario analysis approach, a diversion scenario consists of target material, target location, diversion technique, a set of tactics to help elude the safeguards, and the amount of material diverted per attempt. The success tree methodology and expert elicitation are used to construct logical models and obtain the probabilities of basic events. Proliferator diversion success probabilities can then be derived from the model for all possible scenarios in a given facility. Using the Rokkasho reprocessing facility as an example, diversion pathway, uncertainty, sensitivity, and importance measure analyses are shown. Results from the analyses can be used by the safeguarder to gauge the level of protection provided by the current safeguard scheme, and to identify weak points for improvement. The safeguarder is able to further analyze the effectiveness of the safeguard scheme for different facility designs, and the cost effectiveness analysis will help the safeguarder allocate limited resources for the maximum possible protection against material diversion.
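The probability arithmetic behind a success-tree evaluation can be illustrated with a toy example: under an independence assumption, an AND gate multiplies the probabilities of its children and an OR gate complements the product of their complements. The safeguard layers, basic events and probabilities below are purely illustrative and are not taken from the thesis or from the Rokkasho analysis.

```python
# Toy success-tree evaluation under an independence assumption.
# AND gate: all child events must succeed; OR gate: at least one must succeed.
# Basic-event probabilities stand in for values elicited from experts.

def and_gate(ps):
    out = 1.0
    for p in ps:
        out *= p
    return out

def or_gate(ps):
    out = 1.0
    for p in ps:
        out *= (1.0 - p)
    return 1.0 - out

defeat_accounting   = or_gate([0.02, 0.05])  # e.g. falsify records OR exploit measurement error
defeat_surveillance = or_gate([0.01, 0.03])  # e.g. camera blind spot OR undetected tampering
scenario_success    = and_gate([defeat_accounting, defeat_surveillance])

print(f"P(defeat material accounting) = {defeat_accounting:.4f}")
print(f"P(defeat surveillance)        = {defeat_surveillance:.4f}")
print(f"P(diversion scenario success) = {scenario_success:.6f}")
```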
by Chonlagarn Iamsumang.
S.M.
APA, Harvard, Vancouver, ISO, and other styles
8

Okoli, Justin. "Expert knowledge elicitation in the firefighting domain and the implications for training novices." Thesis, Middlesex University, 2016. http://eprints.mdx.ac.uk/22940/.

Full text
Abstract:
Background/Purpose: Experienced fireground commanders are often required to make important decisions in time-pressured and dynamic environments that are characterized by a wide range of task constraints. The nature of these environments is such that firefighters are sometimes faced with novel situations that challenge their expertise and therefore necessitate making knowledge-based as opposed to rule-based decisions. The purpose of this study is to elicit the tacitly held knowledge which largely underpins expert competence when managing non-routine fire incidents. Design/Methodology/Approach: The study utilized a formal knowledge elicitation tool known as the critical decision method (CDM). The CDM method was preferred to other cognitive task analysis (CTA) methods as it is specifically designed to probe the cognitive strategies of domain experts with reference to a single incident that was both challenging and memorable. Thirty experienced firefighters and one staff development officer were interviewed in depth across different fire stations in the UK and Nigeria (UK=15, Nigeria=16). The interview transcripts were analyzed using the emergent themes analysis (ETA) approach. Findings: Findings from the study revealed 42 salient cues that were sought by experts at each decision point. A critical cue inventory (CCI) was developed and cues were categorized into five distinct types based on the type of information each cue provided to an incident commander. The study also developed a decision-making model, the information filtering and intuitive decision making model (IFID), which describes how the experienced firefighters were able to make difficult fireground decisions amidst multiple informational sources without having to deliberate on their courses of action. The study also compiled and indexed the elicited tacit knowledge into a competence assessment framework (CAF) with which the competence of future incident commanders could potentially be assessed. Practical Implications: Through the knowledge elicitation process, training needs were identified, and the practical implications for transferring the elicited experts’ knowledge to novice firefighters were also discussed. The four-component instructional design model aided the conceptualization of the CDM outputs for training purposes. Originality/Value: Although it is widely believed that experts perform exceptionally well in their domains of practice, the difficulty still lies in finding how best to unmask expert (tacit) knowledge, particularly when it is intended for training purposes. Since tacit knowledge operates in the unconscious realm, articulating and describing it has been shown to be challenging even for experts themselves. This study is therefore timely, since its outputs can facilitate the development of training curricula for novices, who then will not have to wait for real fires to occur before learning new skills. This holds true particularly in an era where the rate of real fires, and therefore the opportunity to gain experience, has been declining. The current study also presents and discusses insights based on the cultural differences that were observed between the UK and the Nigerian fire services.
APA, Harvard, Vancouver, ISO, and other styles
9

Burge, Janet E. "Knowledge Elicitation for Design Task Sequencing Knowledge." Digital WPI, 1999. https://digitalcommons.wpi.edu/etd-theses/1062.

Full text
Abstract:
"There are many types of knowledge involved in producing a design (the process of specifying a description of an artifact that satisfies a collection of constraints [Brown, 1992]). Of these, one of the most crucial is the design plan: the sequence of steps taken to create the design (or a portion of the design). A number of knowledge elicitation methods can be used to obtain this knowledge from the designer. The success of the elicitation depends on the match between the knowledge elicitation method used and the information being sought. The difficulty with obtaining design plan information is that this information may involve implicit knowledge, i.e. knowledge that can not be expressed explicitly. In this thesis, an approach is used that combines two knowledge elicitation techniques: one direct, to directly request the design steps and their sequence, and one indirect, to refine this knowledge by obtaining steps and sequences that may be implicit. The two techniques used in this thesis were Forward Scenario Simulation (FSS), a technique where the domain expert describes how the procedure followed to solve it, and Card Sort, a technique where the domain expert is asked to sort items (usually entities in the domain) along different attributes. The Design Ordering Elicitation System (DOES) was built to perform the knowledge elicitation. This system is a web-based system designed to support remote knowledge elicitation: KE performed without the presence of the knowledge engineer. This system was used to administer knowledge elicitation sessions to evaluate the effectiveness of these techniques at obtaining design steps and their sequencing. The results indicate that using an indirect technique together with a direct technique obtains more alternative sequences for the design steps than using the direct technique alone."
APA, Harvard, Vancouver, ISO, and other styles
10

Zampa, Nicholas Joseph. "Structured Expert Judgment Elicitation of Use Error Probabilities for Drug Delivery Device Risk Assessment." Thesis, The George Washington University, 2018. http://pqdtopen.proquest.com/#viewpdf?dispub=10841440.

Full text
Abstract:

In the pharmaceutical industry, estimating the probability of occurrence for use errors and use-error-causes (henceforth referred to as use error probabilities) when developing drug delivery devices is hindered by a lack of data, ultimately limiting the ability to conduct robust usability risk assessments. A lack of reliable data is the result of small sample sizes and challenges in simulating actual use environments in simulated use studies, compromising the applicability of observed use error rates. Further, post-market surveillance databases and internal complaint databases are limited in their ability to provide reliable data for product development. Inadequate usability risk assessment hinders drug delivery device manufacturers' understanding of safety and efficacy risks. The current industry and regulatory paradigm with respect to use error probabilities is to de-emphasize them, focusing instead on assessing the severity of harms. However, de-emphasis of use error probabilities is not rooted in a belief that probability estimates inherently lack value. Rather, the status quo is based on the absence of suitable methodologies for estimating use error probabilities.

In instances in which data is lacking, engineers and scientists may turn to structured expert judgment elicitation methodologies, in which subjective expert opinions are quantified and aggregated in a scientific manner. This research is a case study in adapting and applying one particular structured expert judgment methodology, Cooke’s Classical model, to human factors experts for estimating use error probabilities for a drug delivery device. Results indicate that a performance-weighted linear pooling of expert judgments significantly outperforms any one expert and an equal-weighted linear pooling. Additionally, this research demonstrates that a performance-weighted linear pooling of expert judgments is statistically accurate, robust to the choice of experts, and robust to the choice of elicitation questions. Lastly, this research validates the good statistical accuracy of a performance-weighted linear pooling of experts on a new set of use error probabilities, indicating that good expert performance translates to use error probability estimates for different devices. Through structured expert judgment elicitation according to Cooke’s Classical model, this research demonstrates that it is possible to reinstate use error probability estimates, with quantified uncertainty, into usability risk assessments for drug delivery devices.
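A minimal sketch of the pooling step described above: each expert states a probability for the same use error, and the aggregated estimate is a weighted average, either with equal weights or with performance-based weights of the kind Cooke's Classical model derives from calibration questions. The experts, probabilities and weights below are invented for illustration.

```python
# Linear opinion pooling of expert probabilities for one use error.
# Equal weights vs. hypothetical performance-based weights (e.g. from
# calibration on seed questions, as in Cooke's Classical model).

expert_probs = {"expert_A": 0.002, "expert_B": 0.010, "expert_C": 0.004}

equal_weights = {name: 1.0 / len(expert_probs) for name in expert_probs}
performance_weights = {"expert_A": 0.70, "expert_B": 0.05, "expert_C": 0.25}  # sum to 1

def linear_pool(probs, weights):
    return sum(weights[name] * p for name, p in probs.items())

print(f"equal-weight pool:       {linear_pool(expert_probs, equal_weights):.4f}")
print(f"performance-weight pool: {linear_pool(expert_probs, performance_weights):.4f}")
```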

APA, Harvard, Vancouver, ISO, and other styles
11

Aucher, Guillaume. "Perspectives on belief and change." University of Otago. Department of Computer Science, 2008. http://adt.otago.ac.nz./public/adt-NZDU20081003.115428.

Full text
Abstract:
This thesis is about logical models of belief (and knowledge) representation and belief change. This means that we propose logical systems which are intended to represent how agents perceive a situation and reason about it, and how they update their beliefs about this situation when events occur. These agents can be machines, robots, human beings... but they are assumed to be somehow autonomous. The way a fixed situation is perceived by agents can be represented by statements about the agents’ beliefs: for example “agent A believes that the door of the room is open” or “agent A believes that her colleague is busy this afternoon”. “Logical systems” means that agents can reason about the situation and their beliefs about it: if agent A believes that her colleague is busy this afternoon then agent A infers that he will not visit her this afternoon. We moreover often assume that our situations involve several agents which interact with each other. So these agents have beliefs about the situation (such as “the door is open”) but also about the other agents’ beliefs: for example agent A might believe that agent B believes that the door is open. These kinds of beliefs are called higher-order beliefs. Epistemic logic [Hintikka, 1962; Fagin et al., 1995; Meyer and van der Hoek, 1995], the logic of belief and knowledge, can capture all these phenomena and will be our main starting point to model such fixed (“static”) situations. Uncertainty can of course be expressed by beliefs and knowledge: for example agent A being uncertain whether her colleague is busy this afternoon can be expressed by “agent A does not know whether her colleague is busy this afternoon”. But we sometimes need to enrich and refine the representation of uncertainty: for example, even if agent A does not know whether her colleague is busy this afternoon, she might consider it more probable that he is actually busy. So other logics have been developed to deal more adequately with the representation of uncertainty, such as probabilistic logic, fuzzy logic or possibilistic logic, and we will refer to some of them in this thesis (see [Halpern, 2003] for a survey on reasoning about uncertainty). But things become more complex when we introduce events and change into the picture. Issues arise even if we assume that there is a single agent. Indeed, if the incoming information conveyed by the event is coherent with the agent’s beliefs then the agent can just add it to her beliefs. But if the incoming information contradicts the agent’s beliefs then the agent has somehow to revise her beliefs, and as it turns out there is no obvious way to decide what her resulting beliefs should be. Solving this problem was the goal of the logic-based belief revision theory developed by Alchourrón, Gärdenfors and Makinson (to which we will refer by the term AGM) [Alchourrón et al., 1985; Gärdenfors, 1988; Gärdenfors and Rott, 1995]. Their idea is to introduce “rationality postulates” that specify which belief revision operations can be considered as being “rational” or reasonable, and then to propose specific revision operations that fulfill these postulates. However, AGM does not consider situations where the agent might also have some uncertainty about the incoming information: for example agent A might be uncertain, due to some noise, whether her colleague told her that he would visit her on Tuesday or on Thursday. In this thesis we also investigate this kind of phenomenon.
Things are even more complex in a multi-agent setting because the way agents update their beliefs depends not only on their beliefs about the event itself but also on their beliefs about the way the other agents perceived the event (and so about the other agents’ beliefs about the event). For example, during a private announcement of a piece of information to agent A the beliefs of the other agents actually do not change because they believe nothing is actually happening; but during a public announcement all the agents’ beliefs might change because they all believe that an announcement has been made. Such kinds of subtleties have been dealt with in a field called dynamic epistemic logic [Gerbrandy and Groeneveld, 1997; Baltag et al., 1998; van Ditmarsch et al., 2007b]. The idea is to represent by an event model how the event is perceived by the agents and then to define a formal update mechanism that specifies how the agents update their beliefs according to this event model and their previous representation of the situation. Finally, the issues concerning belief revision that we raised in the single agent case are still present in the multi-agent case. So this thesis is more generally about information and information change. However, we will not deal with problems of how to store information in machines or how to actually communicate information. Such problems have been dealt with in information theory [Cover and Thomas, 1991] and Kolmogorov complexity theory [Li and Vitányi, 1993]. We will just assume that such mechanisms are already available and start our investigations from there. Studying and proposing logical models for belief change and belief representation has applications in several areas. First in artificial intelligence, where machines or robots need to have a formal representation of the surrounding world (which might involve other agents), and formal mechanisms to update this representation when they receive incoming information. Such formalisms are crucial if we want to design autonomous agents, able to act autonomously in the real world or in a virtual world (such as on the internet). Indeed, the representation of the surrounding world is essential for a robot in order to reason about the world, plan actions in order to achieve goals... and it must be able to update and revise its representation of the world itself in order to cope autonomously with unexpected events. Second in game theory (and consequently in economics), where we need to model games involving several agents (players) having beliefs about the game and about the other agents’ beliefs (such as agent A believes that agent B has the ace of spades, or agent A believes that agent B believes that agent A has the ace of hearts...), and how they update their representation of the game when events (such as showing a card privately or putting a card on the table) occur. Third in cognitive psychology, where we need to model as accurately as possible the epistemic states of human agents and the dynamics of belief and knowledge in order to explain and describe cognitive processes. The thesis is organized as follows. In Chapter 2, we first recall epistemic logic. Then we observe that representing an epistemic situation involving several agents depends very much on the modeling point of view one takes. For example, in a poker game the representation of the game will be different depending on whether the modeler is a poker player playing in the game or the card dealer who knows exactly what the players’ cards are.
In this thesis, we will carefully distinguish these different modeling approaches and the different kinds of formalisms they give rise to. In fact, the interpretation of a formalism relies quite a lot on the nature of these modeling points of view. Classically, in epistemic logic, the models built are supposed to be correct and represent the situation from an external and objective point of view. We call this modeling approach the perfect external approach. In Chapter 2, we study the modeling point of view of a particular modeler-agent involved in the situation with other agents (and so having a possibly erroneous perception of the situation). We call this modeling approach the internal approach. We propose a logical formalism based on epistemic logic that this agent uses to represent “for herself” the surrounding world. We then set out some formal connections between the internal approach and the (perfect) external approach. Finally we axiomatize our logical formalism and show that the resulting logic is decidable. In Chapter 3, we first recall dynamic epistemic logic as viewed by Baltag, Moss and Solecki (to which we will refer by the term BMS). Then we study in which cases seriality of the accessibility relations of epistemic models is preserved during an update, first for the full updated model and then for generated submodels of the full updated model. Finally, observing that the BMS formalism follows the (perfect) external approach, we propose an internal version of it, just as we proposed an internal version of epistemic logic in Chapter 2. In Chapter 4, we still follow the internal approach and study the particular case where the event is a private announcement. We first show, thanks to our study in Chapter 3, that in a multi-agent setting, expanding in the AGM style corresponds to performing a private announcement in the BMS style. This indicates that generalizing AGM belief revision theory to a multi-agent setting amounts to studying private announcement. We then generalize the AGM representation theorems to the multi-agent case. Afterwards, in the spirit of the AGM approach, we go beyond the AGM postulates and investigate multi-agent rationality postulates specific to our multi-agent setting, inspired by the fact that the kind of phenomenon we study is private announcement. Finally we provide an example of a revision operation, which we apply to a concrete case. In Chapter 5, we follow the (perfect) external approach and enrich the BMS formalism with probabilities. This enables us to provide a fine-grained account of how human agents interpret events involving uncertainty and how they revise their beliefs. Afterwards, we review different principles for the notion of knowledge that have been proposed in the literature and show how some principles that we argue to be reasonable ones can all be captured in our rich and expressive formalism. Finally, we extend our general formalism to a multi-agent setting. In Chapter 6, we still follow the (perfect) external approach and enrich our dynamic epistemic language with converse events. This language is interpreted on structures with accessibility relations for both beliefs and events, unlike the BMS formalism where events and beliefs are not on the same formal level. Then we propose principles relating events and beliefs and provide a complete characterization, which yields a new logic EDL.
Finally, we show that BMS can be translated into our new logic EDL thanks to the converse operator: this device enables us to translate the structure of the event model directly within a particular axiomatization of EDL, without having to refer to a particular event model in the language (as done in BMS). In Chapter 7 we summarize our results and give an overview of remaining technical issues and some desiderata for future directions of research. Parts of this thesis are based on publications, but we emphasize that they have been entirely rewritten in order to make this thesis an integrated whole. Sections 4.2.2 and 4.3 of Chapter 4 are based on [Aucher, 2008]. Sections 5.2, 5.3 and 5.5 of Chapter 5 are based on [Aucher, 2007]. Chapter 6 is based on [Aucher and Herzig, 2007].
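For orientation, the event-model update mechanism mentioned above (the BMS product update, in its basic form without factual change) is standardly written as follows; this is the textbook notation from the dynamic epistemic logic literature, not a reproduction of the thesis's own definitions.

```latex
% Product update of an epistemic model M = (W, R, V) with an event model
% E = (E, R_E, pre): keep the world-event pairs whose precondition holds,
% intersect the accessibility relations, and carry the valuation over.
\[
  M \otimes E = (W', R', V'), \qquad
  W' = \{\, (w,e) \mid M, w \models \mathit{pre}(e) \,\},
\]
\[
  (w,e)\, R'_a \,(v,f) \iff w\, R_a\, v \ \text{and}\ e\, (R_E)_a\, f,
  \qquad
  (w,e) \in V'(p) \iff w \in V(p).
\]
```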
APA, Harvard, Vancouver, ISO, and other styles
12

Henshall, Anthony Wilton. "An investigation into the possibility of using sociological research methodologies for the elicitation of tacit knowledge for building knowledge intensive systems." Thesis, University of Salford, 1995. http://usir.salford.ac.uk/14775/.

Full text
Abstract:
The research notes that deficiencies in knowledge acquisition are impeding the advancement of Knowledge Intensive Systems (KIS), such as Expert Systems (ES) and Decision Support Systems (DSS). Humphreys (1989) maintains the problem is not the quantity of knowledge collected but its quality. Humphreys (1989) contends that 'Knowledge' has too narrow a definition in knowledge acquisition dogma and that a wider definition of 'knowledge' capable of handling 'procedural uncertainty' is required. 'Tacit knowledge', by which Polanyi (1967) contends individuals interpret the world, appears a fruitful area in which to widen the definition of knowledge. The subjective nature of tacit knowledge makes its explication problematic; however, it is noted that tacit knowledge has a social aspect (interiorization) which appears amenable to sociological investigation. On the basis of the above, it seemed prudent to focus the investigation down to the following research question: 'On the basis of its nature, is there a method whereby at least some tacit knowledge can be explicated for a) building the knowledge base; b) more accurately predicting or planning for its usage and for setting expectations?' To test the thesis, a pilot investigation was undertaken at a local Housing Association in order to gain first-hand experience of knowledge acquisition. Examples of how experts tacitly classify their domain were identified and methods of explicating this knowledge were tentatively formulated. The above resulted in the formulation of a new perspective: traditionally KBS has concerned itself with eliciting knowledge to be embodied in the knowledge base, whereas IS has concerned itself with gaining the knowledge involved in the system's use and interpretation. Fieldwork was later conducted in the maternity units of two local hospitals in order to test the generalizability of these methods. Five methods for the explication of tacit knowledge were identified. 1) The analysis of the reification of existing systems, and the rationality internal to these systems, can be used to explicate tacit knowledge. 2) More than one set of tacit knowledge can be present in one domain. Points where two sets of tacit knowledge interact expose contradictions which can be used as a tool to explicate the tacit knowledge of both groups. 3) The analysis of anecdotes revealed how domains were tacitly delimited and the 'criticality' of tasks within a domain. 4) Action research using a 'mock-up' database revealed tacitly held domain knowledge with implications for micro-level criticality, of particular importance to interface design. 5) The thesis identified knowledge acquisition as a method of sociological investigation.
APA, Harvard, Vancouver, ISO, and other styles
13

Islam, Raihan Ul. "Wireless Sensor Network Based Flood Prediction Using Belief Rule Based Expert System." Licentiate thesis, Luleå tekniska universitet, Datavetenskap, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:ltu:diva-66415.

Full text
Abstract:
Flood is one of the most devastating natural disasters. It is estimated that flooding from sea level rise will cause one trillion USD in losses to major coastal cities of the world by the year 2050. Flood not only destroys the economy, but it also creates physical and psychological suffering for humans and destroys infrastructure. Disseminating flood warnings and evacuating people from flood-affected areas help to save human lives. Therefore, predicting floods will help government authorities to take necessary actions to evacuate people and arrange relief. This licentiate thesis focuses on four different aspects of flood prediction using wireless sensor networks (WSNs). Firstly, different WSNs, protocols related to WSNs, and backhaul connectivity in the context of predicting floods were investigated, and a heterogeneous WSN network for flood prediction was proposed. Secondly, data coming from sensors contain anomalies due to different types of uncertainty, which hamper the accuracy of flood prediction. Therefore, anomalous data need to be filtered out. A novel algorithm based on a belief rule base for detecting anomalies in sensor data has been proposed in this thesis. Thirdly, predicting floods is a challenging task as it involves multi-level factors, which cannot be measured with 100% certainty. Belief rule based expert systems (BRBESs) can be considered for handling complex problems of this nature as they address different types of uncertainty. A web-based BRBES was developed for predicting floods. This system provides better usability, more computational power to handle larger numbers of rule bases, and scalability by porting it into a web-based solution. To improve the accuracy of flood prediction, a learning mechanism for the multi-level BRBES was proposed. Furthermore, a comparison between the proposed multi-level belief rule based learning algorithm and other machine learning techniques, including Artificial Neural Networks (ANN), Support Vector Machine (SVM) based regression, and Linear Regression, has been performed. In light of the research findings of this thesis, it can be argued that flood prediction can be accomplished more accurately by integrating WSNs and BRBESs.
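To make the belief-rule idea concrete, the sketch below encodes two toy rules relating a water-level reading to flood-risk belief degrees and combines their consequents by activation-weighted averaging; this is a deliberately simplified stand-in for the full evidential reasoning aggregation used in BRBESs, and every referential value, belief degree and reading is invented for illustration.

```python
# Toy belief rule base: antecedent "water level" with referential values,
# consequent belief degrees over flood-risk levels. Activation weights come
# from how well the input matches each rule's referential value. The final
# combination here is a simple activation-weighted average, a simplification
# of the evidential reasoning algorithm used in full BRBESs.

rules = [
    {"ref_level": 2.0, "beliefs": {"low": 0.8, "medium": 0.2, "high": 0.0}},
    {"ref_level": 5.0, "beliefs": {"low": 0.0, "medium": 0.3, "high": 0.7}},
]

def matching_degrees(x, refs):
    """Triangular matching between the two nearest referential values."""
    lo, hi = refs
    if x <= lo: return [1.0, 0.0]
    if x >= hi: return [0.0, 1.0]
    w = (x - lo) / (hi - lo)
    return [1.0 - w, w]

water_level = 3.5                                  # observed sensor reading
degrees = matching_degrees(water_level, [r["ref_level"] for r in rules])
total = sum(degrees)
activation = [d / total for d in degrees]          # normalised activation weights

combined = {label: 0.0 for label in rules[0]["beliefs"]}
for w, rule in zip(activation, rules):
    for label, belief in rule["beliefs"].items():
        combined[label] += w * belief

print(combined)   # e.g. {'low': 0.4, 'medium': 0.25, 'high': 0.35}
```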
APA, Harvard, Vancouver, ISO, and other styles
14

Gilson, Robert. "Minimizing input acquisition costs in a Bayesian belief network-based expert system /." Thesis, Connect to this title online; UW restricted, 1997. http://hdl.handle.net/1773/8763.

Full text
APA, Harvard, Vancouver, ISO, and other styles
15

Werner, Christoph. "Structured expert judgement for dependence in probabilistic modelling of uncertainty : advances along the dependence elicitation process." Thesis, University of Strathclyde, 2018. http://digitool.lib.strath.ac.uk:80/R/?func=dbin-jump-full&object_id=30519.

Full text
Abstract:
In decision and risk analysis problems, modelling uncertainty probabilistically provides key insights and information for decision makers. A common challenge is that uncertainties are typically not isolated but interlinked, which introduces complex (and often unexpected) effects on the model output. Therefore, dependence needs to be taken into account and modelled appropriately if simplifying assumptions, such as independence, are not sensible. As in the case of univariate uncertainty, relevant historical data to quantify a (dependence) model are often lacking or too costly to obtain. This may be true even when data on a model's univariate quantities, such as marginal probabilities, are available. Then, specifying dependence between the uncertain variables through expert judgement is the only sensible option. A structured and formal process for the elicitation is essential for ensuring methodological robustness. This thesis consists of three published works and two papers which are to be published (one under review and one working paper). Two of these works provide comprehensive overviews, from different perspectives, of the research on dependence elicitation processes. Based on these reviews, novel risk assessment and expert judgement methods are proposed: (1) allowing experts to structure and share their knowledge and beliefs about dependence relationships prior to a quantitative assessment, and (2) ensuring experts' (detailed) quantitative assessments are feasible while their elicitation is intuitive. The original research presented in this thesis is applied in case studies with experts in real risk modelling contexts for the UK Higher Education sector, terrorism risk and the future risk of antibacterial multi-drug resistance.
APA, Harvard, Vancouver, ISO, and other styles
16

Taalab, Khaled Paul. "Modelling soil bulk density using data-mining and expert knowledge." Thesis, Cranfield University, 2013. http://dspace.lib.cranfield.ac.uk/handle/1826/8273.

Full text
Abstract:
Data about the spatial variation of soil attributes is required to address a great number of environmental issues, such as improving water quality, mitigating floods, and determining the effects of the terrestrial carbon cycle. The need for a continuum of soils data is problematic, as it is only possible to observe soil attributes at a limited number of locations, beyond which prediction is required. There is, however, a disparity between the way in which much of the existing information about soil is recorded and the format in which the data is required. There are two primary methods of representing the variation in soil properties: as a set of distinct classes or as a continuum. The former is how the variation in soils has been recorded historically by the soil survey, whereas the latter is how soils data is typically required. One solution to this issue is to use a soil-landscape modelling approach which relates the soil to the wider landscape (including topography, land-use, geology and climatic conditions) using a statistical model. In this study, the soil-landscape modelling approach has been applied to the prediction of soil bulk density (Db). The original contribution to knowledge of the study is demonstrating that producing a continuous surface of Db using a soil-landscape modelling approach is a viable alternative to the 'classification' approach which is most frequently used. The benefit of this method is shown in relation to the prediction of soil carbon stocks, which can be predicted more accurately and with less uncertainty. The second part of this study concerns the inclusion of expert knowledge within the soil-landscape modelling approach. The statistical modelling approaches used to predict Db are data driven, hence it is difficult to interpret the processes which the model represents. In this study, expert knowledge is used to predict Db within a Bayesian network modelling framework, which structures knowledge in terms of probability. This approach creates models which can be more easily interpreted and consequently facilitate knowledge discovery; it also provides a method for expert knowledge to be used as a proxy for empirical data. The contribution to knowledge of this section of the study is twofold: firstly, that Bayesian networks can be used as data-mining tools to predict a continuous soil attribute such as Db; and secondly, that in lieu of data, expert knowledge can be used to accurately predict landscape-scale trends in the variation of Db using a Bayesian modelling approach.
APA, Harvard, Vancouver, ISO, and other styles
17

Heltne, Mari Montri. "Knowledge-based support for management of end user computing resources: Issues in knowledge elicitation and flexible design." Diss., The University of Arizona, 1988. http://hdl.handle.net/10150/184429.

Full text
Abstract:
Effective resource management requires tools and decision aids to help determine users' needs and appropriate assignment. The goal of this research was to design, implement, and test technological tools that, even in a dynamic environment, effectively support the matching of users and resources. The context of the investigation is the Information Center (IC), the structure used to manage and control the computing resources demanded by end users. The major contributions of the research lie in two areas: (1) the development and use of a knowledge acquisition tool called Resource Attribute Charts (RAC), which allows for the structured definition of the resources managed by the IC, and (2) the design, implementation, validation, and verification of the transportability of Information Center Expert (ICE), a system that supports the activities of IC personnel. Prototyping, the system development methodology commonly used in software engineering, was used to design the general architecture of the knowledge acquisition tools, the knowledge maintenance tool, and the expert system itself. The knowledge acquisition tools, RAC, were used to build the knowledge base of ICE. ICE was installed at two corporate sites, its software recommendations were validated, and its transportability from one location to another was verified experimentally. The viability of a rule-based consultation system as a mechanism for bringing together knowledge about users, problems, and resources for the purpose of effective resource management was demonstrated.
APA, Harvard, Vancouver, ISO, and other styles
18

Pestana, Marco Aurélio. "Elicitação de especialistas em estudos de confiabilidade e análise de risco." Universidade de São Paulo, 2017. http://www.teses.usp.br/teses/disponiveis/3/3135/tde-12072017-080326/.

Full text
Abstract:
The purpose of this dissertation is to present the use of expert opinion, and other relevant issues on the subject, in the assessment of uncertainties in risk analysis and reliability studies, together with a practical case study. In reliability studies, a major concern is determining the frequencies of occurrence of events and their behavior over time. Often, the frequency data are obsolete, unavailable, or insufficient to evaluate the probability of occurrence of events. In these cases, the elicitation of expert opinion emerges as an alternative that supplements these data gaps and thus enables a better analysis of uncertainties. Based on the condition of subjectivity, the elicitation of experts aims to quantify uncertainty from previous experience and the current state of knowledge. Combined with mathematical methods, elicitation enables the management of conflicting information in order to reach consensus and makes possible a subjective analysis of problems.
APA, Harvard, Vancouver, ISO, and other styles
19

Fooladvandi, Farzad. "Signature-based activity detection based on Bayesian networks acquired from expert knowledge." Thesis, University of Skövde, School of Humanities and Informatics, 2008. http://urn.kb.se/resolve?urn=urn:nbn:se:his:diva-1123.

Full text
Abstract:

The maritime industry is experiencing one of its longest and fastest periods of growth. Hence, the global maritime surveillance capacity is in great need of growth as well. The detection of vessel activity is an important objective of the civil security domain. Detecting vessel activity may become problematic if audit data is uncertain. This thesis aims to investigate whether Bayesian networks acquired from expert knowledge can detect activities with a signature-based detection approach. For this, a maritime pilot-boat scenario has been identified with a domain expert. Each of the scenario’s activities has been divided into signatures, where each signature relates to a specific Bayesian network information node. The signatures were implemented to find evidence for the Bayesian network information nodes. AIS data with real-world observations have been used for testing, which has shown that it is possible to detect the maritime pilot-boat scenario with this approach.

APA, Harvard, Vancouver, ISO, and other styles
20

Suermondt, Henri Jacques. "Explanation in Bayesian belief networks." Full text available online (restricted access), 1992. http://images.lib.monash.edu.au/ts/theses/suermondt.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
21

Monrat, Ahmed Afif. "A BELIEF RULE BASED FLOOD RISK ASSESSMENT EXPERT SYSTEM USING REAL TIME SENSOR DATA STREAMING." Thesis, Luleå tekniska universitet, Datavetenskap, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:ltu:diva-71081.

Full text
Abstract:
Among the various natural calamities, flood is considered one of the most catastrophic natural hazards, with a significant impact on the socio-economic lifeline of a country. The assessment of flood risks facilitates taking appropriate measures to reduce the consequences of flooding. Flood risk assessment requires big data coming from different sources, such as sensors, social media, and organizations. However, these data sources contain various types of uncertainties because of the presence of incomplete and inaccurate information. This paper presents a belief rule-based expert system (BRBES) developed on a big data platform to assess flood risk in real time. The system processes extremely large datasets by integrating the BRBES with Apache Spark, while a web-based interface has been developed to allow the visualization of flood risk in real time. Since the integrated BRBES employs a knowledge-driven learning mechanism, it has been compared with data-driven learning mechanisms to determine its reliability in assessing flood risk. The integrated BRBES produces reliable results compared with the other data-driven approaches. Data for the expert system have been collected from different case study areas in Bangladesh to validate the integrated system.
APA, Harvard, Vancouver, ISO, and other styles
22

Hridoy, Md Rafiul Sabbir. "An Intelligent Flood Risk Assessment System using Belief Rule Base." Thesis, Luleå tekniska universitet, Institutionen för system- och rymdteknik, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:ltu:diva-65390.

Full text
Abstract:
Natural disasters disrupt our daily life and cause much suffering. Among the various natural disasters, flood is one of the most catastrophic. Assessing flood risk helps to take necessary precautions and can save human lives. The assessment of risk involves various factors which cannot be measured with one hundred percent certainty. Therefore, the present methods of flood risk assessment cannot assess the risk of flooding accurately. This research rigorously investigates the various types of uncertainty associated with flood risk factors. In addition, a comprehensive study of the present flood risk assessment approaches has been conducted. Belief Rule Base (BRB) expert systems are widely used to handle various types of uncertainty. Therefore, this research adopts the BRBES approach to develop an expert system to assess the risk of flooding. In addition, to facilitate the learning procedures of the BRBES, an optimal learning algorithm has been proposed. The developed BRBES has been applied to a real-world case study area located at Cox’s Bazar, Bangladesh. Training data have been collected from the case study area to obtain the trained BRB and to develop the optimal learning model. The BRBES can generate different "What-If" scenarios, which enables the analysis of the flood risk of an area from various perspectives and makes the system robust and sustainable. The system is said to be intelligent as it has a knowledge base, an inference engine, and a learning capability.
APA, Harvard, Vancouver, ISO, and other styles
23

Farr, Anna C. "Understanding wayfinding: A Bayesian network approach." Thesis, Queensland University of Technology, 2016. https://eprints.qut.edu.au/95789/1/Anna_Farr_Thesis.pdf.

Full text
Abstract:
This research used statistical modelling to investigate the factors that contribute to how we find our way in transportation hubs, in particular airports. Using Bayesian networks, the researcher built a model that incorporated both the human and environmental factors required for effective wayfinding. This research has advanced the literature on how expert opinions can be combined, as well as contributing to an improved understanding of wayfinding in transportation hubs.
APA, Harvard, Vancouver, ISO, and other styles
24

Zhou, Fan. "The impacts of car-sharing and shared autonomous vehicles on urban mobility: Towards a sustainable future." Thesis, Queensland University of Technology, 2018. https://eprints.qut.edu.au/121497/1/Fan_Zhou_Thesis.pdf.

Full text
Abstract:
This dissertation presents a big-picture view for policymakers and related stakeholders regarding the future development of car-sharing services. Car sharing has the potential to significantly disrupt the personal mobility market, particularly at the dawn of self-driving cars. Thus, understanding car-sharing services' market penetration and implications is urgently needed. The studies in this thesis aim to forecast the penetration of car-sharing, to investigate whether car-sharing influences consumers' vehicle ownership decisions, and to explore the impacts of car-sharing on households' mode choice decisions.
APA, Harvard, Vancouver, ISO, and other styles
25

Pirathiban, Ramethaa. "Improving species distribution modelling: Selecting absences and eliciting variable usefulness for input into standard algorithms or a Bayesian hierarchical meta-factor model." Thesis, Queensland University of Technology, 2019. https://eprints.qut.edu.au/134401/1/Ramethaa_Pirathiban_Thesis.pdf.

Full text
Abstract:
This thesis explores and proposes methods to improve species distribution models. Throughout this thesis, a rich class of statistical modelling techniques has been developed to address crucial and interesting issues related to the data input into these models. The overall contribution of this research is the advancement of knowledge on species distribution modelling through an increased understanding of extraneous zeros, quality of the ecological data, variable selection that incorporates ecological theory and evaluating performance of the fitted models. Though motivated by the challenge of species distribution modelling from ecology, this research is broadly relevant to many fields, including bio-security and medicine. Specifically, this research is of potential significance to researchers seeking to: identify and explain extraneous zeros; assess the quality of their data; or employ expert-informed variable selection.
APA, Harvard, Vancouver, ISO, and other styles
26

Hajj, Paméla El. "Méthodes d'aide à la décision thérapeutique dans les cas des maladies rares : intérêt des méthodes bayésiennes et application à la maladie de Horton." Thesis, Montpellier, 2017. http://www.theses.fr/2017MONTS037/document.

Full text
Abstract:
Rare diseases are those that affect a small number of people, and this rarity gives rise to specific problems. For this reason we systematically searched the literature for publications on the characteristics of the different mathematical methods that have been used to study rare diseases. The objective is to identify innovative research approaches that have been, or could be, used to overcome the methodological difficulties inherent in the study of rare diseases. Bayesian methods are recommended by several authors, and these methods require the introduction of an informative prior distribution on the unknown treatment effect. Determining the prior distribution in a Bayesian model is difficult. We worked on methods for determining the prior distribution, including the possibility of taking into account information from historical studies and/or data from other "neighbouring" studies. On the one hand, we describe a Bayesian model intended to test the non-inferiority hypothesis of a trial based on the hypothesis that methotrexate is more effective than corticosteroids alone. On the other hand, our thesis work relies on the epsilon-contamination method, which is based on the principle of contaminating a prior distribution that is not entirely satisfactory with a series of distributions drawn from the information of other studies involving the same disease, the same treatment or the same population. Finally, all the prior information can be summarised in a prior distribution determined from expert opinions, collected at a meeting where the experts answered a questionnaire eliciting their priors on the parameters of the Bayesian model.
In recent years, scientists have difficulties to study rare diseases by conventional methods, because the sample size needed in such studies to meet a conventional frequentist power is not adapted to the number of available patients. After systemically searching in literature and characterizing different methods used in the contest of rare diseases, we remarked that most of the proposed methods are deterministic and are globally unsatisfactory because it is difficult to correct the insufficient statistical power.More attention has been placed on Bayesian models which through a prior distribution combined with a current study enable to draw decisionsfrom a posterior distribution. Determination of the prior distribution in a Bayesian model is challenging, we will describe the process of determining the prior including the possibility of considering information from some historical controlled trials and/or data coming from other studies sufficiently close to the subject of interest.First, we describe a Bayesian model that aims to test the hypothesis of the non-inferiority trial based on the hypothesis that methotrexate is more effective than corticosteroids alone.On the other hand, our work rests on the use of the epsilon-contamination method, which is based on contaminating an a priori not entirely satisfactory by a series of distributions drawn from information on other studies sharing close conditions,treatments or even populations. Contamination is a way to include the proximity of information provided bythese studies
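The epsilon-contamination prior referred to in both versions of the abstract has a simple form: prior(theta) = (1 - eps) * baseline(theta) + eps * contamination(theta), a baseline prior mixed with a contaminating distribution built from neighbouring studies. A minimal grid-based sketch for a binomial response rate follows; the choice of Beta components, the contamination weight and the trial data are hypothetical.

```python
# Epsilon-contamination prior for a binomial success probability, evaluated on
# a grid: prior = (1 - eps) * baseline + eps * contamination, then updated
# with the likelihood of the observed trial data.
import numpy as np
from scipy.stats import beta, binom

theta = np.linspace(0.001, 0.999, 999)
eps = 0.2                                  # contamination weight (hypothetical)
baseline = beta.pdf(theta, 2, 2)           # not entirely satisfactory baseline prior
contamination = beta.pdf(theta, 8, 2)      # prior drawn from a "neighbouring" study
prior = (1 - eps) * baseline + eps * contamination

successes, trials = 12, 20                 # hypothetical current-trial data
posterior = prior * binom.pmf(successes, trials, theta)
posterior /= posterior.sum()               # discrete normalisation on the grid

print(f"posterior mean of theta: {(theta * posterior).sum():.3f}")
```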
APA, Harvard, Vancouver, ISO, and other styles
27

Larkin, Patricia Marguerite. "An Integrated Risk Management Framework for Carbon Capture and Storage in the Canadian Context." Thesis, Université d'Ottawa / University of Ottawa, 2017. http://hdl.handle.net/10393/35881.

Full text
Abstract:
Climate change is a risk issue of global proportions. Human health and environmental impacts are anticipated from hazards associated with changes in temperature and precipitation regimes, and climate extremes. Increased natural hazards include storms and flooding, extreme heat, drought, and wildfires. Reduced food and water quality and quantity, reduced air quality, new geographic range of infectious diseases, and increased exposure to ultra-violet radiation are also predicted. In order to make a measurable contribution to reducing carbon dioxide emissions at point source fossil fuel and industrial process sites that contribute to climate change, estimates suggest that up to 3,000 dedicated large scale carbon capture and geological sequestration (CCS) projects will be necessary by 2050. Integrated projects include carbon dioxide capture; compression into a supercritical stream; transport, most often by pipeline; deep injection at wellheads; and sequestration in suitable saline aquifer geological formations, usually 800 metres or more below the earth’s surface. In implementing CCS as part of an overall climate change mitigation strategy, it is important to note that population health and environmental risks are associated with each of these value chain components of integrated projects. Based on an assessment of existing regulatory and non-regulatory guidance for risk assessment/risk management (RA/RM), an analysis of the application, assessment, and approval process for four large scale Canadian projects, and findings from a structured expert elicitation focused on hazard and risk issues in injection and storage and risk management of low probability high impact events, this research developed an Integrated Risk Management Framework (IRMF) for CCS in the Canadian context. The IRMF is a step-wise systematic process for RA/RM during the life of a project, including engagement with wide ranging government and non-government partners that would contribute to a determination of acceptable risk and risk control options. The execution of the IRMF is an intervention that could reduce local hazards and associated risks in terms of likelihood and consequence, as well as identify and document risk management that could underpin broad acceptance of CCS as a climate change mitigation technology. This would thereby also have an important part in protecting global population health and wellbeing in the long term. Indeed, diverse stakeholders could be unforgiving if hazard assessment and risk management in CCS is considered insufficient, leading to ‘pushback’ that could affect future implementation scenarios. On the other hand, RA/RM done right could favourably impact public perception of CCS, in turn instilling confidence, public acceptance, and ongoing support for the benefit of populations worldwide. This thesis is composed of an introduction to the research problem, including a population health conceptual framework for the IRMF, followed by five manuscripts, and concluding with a discussion about other barriers to CCS project development, and a risk management policy scenario for both the present time and during the 2017-2030 implementation period.
APA, Harvard, Vancouver, ISO, and other styles
28

Ben, Abdallah Nadia. "Modeling sea-level rise uncertainties for coastal defence adaptation using belief functions." Thesis, Compiègne, 2014. http://www.theses.fr/2014COMP1616.

Full text
Abstract:
Coastal adaptation is an imperative for dealing with sea-level rise, a direct consequence of global warming. However, the implementation of actions and strategies is often hindered by the presence of numerous and substantial uncertainties in the estimation of future hazards and risks. These uncertainties may be due to limited knowledge (of future sea-level rise, for example) or to the natural variability of some variables (extreme sea conditions). Taking uncertainties into account throughout the risk assessment chain is essential for effective adaptation. The objective of this work is to propose a methodology for uncertainty quantification based on belief functions, an uncertainty formalism more flexible than probabilities. Belief functions allow us to describe more faithfully the incomplete information provided by experts (quantiles, intervals, etc.) and to combine different sources of information. Statistical information can in turn be described by belief functions defined from the likelihood function. For uncertainty propagation, we exploit the mathematical equivalence between belief functions and random intervals and proceed by Monte Carlo sampling. The methodology is applied to estimating projections of global sea-level rise by the end of the century derived from physical modelling, expert elicitation and a semi-empirical model. Then, in a case study, we assess the impact of climate change on extreme sea conditions and evaluate the reinforcement required for a structure to maintain its level of functional performance.
Coastal adaptation is an imperative to deal with the elevation of the global sea level caused by the ongoing global warming. However, when defining adaptation actions, coastal engineers encounter substantial uncertainties in the assessment of future hazards and risks. These uncertainties may stem from limited knowledge (e.g., about the magnitude of the future sea-level rise) or from the natural variability of some quantities (e.g., extreme sea conditions). A proper consideration of these uncertainties is of principal concern for efficient design and adaptation. The objective of this work is to propose a methodology for uncertainty analysis based on the theory of belief functions, an uncertainty formalism that offers greater features to handle both aleatory and epistemic uncertainties than probabilities. In particular, it allows representing experts' incomplete knowledge (quantiles, intervals, etc.) more faithfully and combining multi-source evidence while taking into account their dependences and reliabilities. Statistical evidence can be modeled by likelihood-based belief functions, which are simply the translation of some inference principles into evidential terms. By exploiting the mathematical equivalence between belief functions and random intervals, uncertainty can be propagated through models by Monte Carlo simulations. We use this method to quantify uncertainty in future projections of the elevation of the global sea level by 2100 and evaluate its impact on some coastal risk indicators used in coastal design. Sea-level rise projections are derived from physical modelling, expert elicitation, and historical sea-level measurements. Then, within a methodologically-oriented case study, we assess the impact of climate change on extreme sea conditions and evaluate the reinforcement of a typical coastal defence asset so that its functional performance is maintained.
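The equivalence between belief functions and random intervals mentioned in the abstract can be illustrated with a small Monte Carlo: sample each uncertain input as an interval, push the endpoints through a monotone model, and count how often the whole interval (belief) or any part of it (plausibility) exceeds a threshold. The toy exceedance model, distributions and numbers below are hypothetical and are not the thesis' actual models.

```python
# Propagating random intervals (a simple belief-function representation)
# through a toy coastal model by Monte Carlo, yielding belief and plausibility
# that a design crest level is exceeded.
import random

random.seed(1)
N = 100_000
THRESHOLD = 3.0                      # design crest level in metres (hypothetical)

def sample_sea_level_rise():
    """Interval around a sampled central estimate (all values hypothetical)."""
    centre = random.gauss(0.5, 0.15)          # metres by 2100
    return centre - 0.1, centre + 0.2         # asymmetric imprecision

def sample_extreme_surge():
    centre = random.weibullvariate(2.0, 1.5)  # metres
    return centre - 0.2, centre + 0.2

belief = plausibility = 0
for _ in range(N):
    slr_lo, slr_hi = sample_sea_level_rise()
    surge_lo, surge_hi = sample_extreme_surge()
    level_lo, level_hi = slr_lo + surge_lo, slr_hi + surge_hi   # monotone model
    if level_lo > THRESHOLD:          # whole interval exceeds -> counts for belief
        belief += 1
    if level_hi > THRESHOLD:          # interval possibly exceeds -> plausibility
        plausibility += 1

print(f"Bel(exceedance) = {belief / N:.3f}, Pl(exceedance) = {plausibility / N:.3f}")
```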
APA, Harvard, Vancouver, ISO, and other styles
29

Fucik, Markus. "Bayesian risk management : "Frequency does not make you smarter"." Phd thesis, Universität Potsdam, 2010. http://opus.kobv.de/ubp/volltexte/2011/5308/.

Full text
Abstract:
Within our research group Bayesian Risk Solutions we have coined the idea of a Bayesian Risk Management (BRM). It claims (1) a more transparent and diligent data analysis as well as (2) an open-minded incorporation of human expertise in risk management. In this dissertation we formulate a framework for BRM based on the two pillars Hardcore-Bayesianism (HCB) and Softcore-Bayesianism (SCB), providing solutions for the claims above. For data analysis we favor Bayesian statistics with its Markov Chain Monte Carlo (MCMC) simulation algorithm. It provides a full illustration of data-induced uncertainty beyond classical point estimates. We calibrate twelve different stochastic processes to four years of CO2 price data. Besides, we calculate derived risk measures (ex-ante/ex-post value-at-risks, capital charges, option prices) and compare them to their classical counterparts. When statistics fails because of a lack of reliable data we propose our integrated Bayesian Risk Analysis (iBRA) concept. It is a basic guideline for an expertise-driven quantification of critical risks. We additionally review elicitation techniques and tools supporting experts to express their uncertainty. Unfortunately, Bayesian thinking is often blamed for its arbitrariness. Therefore, we introduce the idea of a Bayesian due diligence judging expert assessments according to their information content and their inter-subjectivity.
This dissertation deals with the approaches of a Bayesian risk management for measuring risks. The work concentrates on the following central questions: (1) How can risks be quantified transparently when only a limited number of suitable historical observations is available for data analysis? (2) How can risks be quantified transparently when no data analysis is possible for lack of suitable historical observations? (3) To what extent is it possible to limit arbitrariness in risk quantification? To answer the first question, this work proposes the application of Bayesian statistics. In contrast to classical least-squares or maximum-likelihood point estimators, Bayesian posterior distributions can explicitly measure the data-induced parameter and model uncertainty. As an application example, twelve different stochastic processes are calibrated to CO2 price time series using the efficient Bayesian Markov Chain Monte Carlo (MCMC) simulation algorithm. Since Bayesian statistics allows the calculation of model probabilities for a cardinal measurement of model fit, log-variance processes could be identified as by far the best model class. For selected processes, the effect of parameter uncertainty on derived risk measures (ex-ante/ex-post value-at-risks, regulatory capital reserves, option prices) was additionally examined. In general, the differences between Bayesian and classical risk measures grow with the complexity of the model assumptions for the CO2 price. Moreover, Bayesian value-at-risks and capital reserves are more conservative than their classical counterparts (a risk premium for parameter uncertainty). Regarding the second question, the position taken in this work is that, without (sufficiently) reliable data, risk quantification can only be carried out by taking expert knowledge into account. This requires a structured procedure. The integrated Bayesian Risk Analysis (iBRA) concept is therefore presented, which unites concepts, techniques and tools for the expert-based identification and quantification of risk factors and their dependencies. It also offers approaches for dealing with competing expert opinions. Since resource-efficient tools for quantifying expert knowledge are of particular interest for practice, the online market PCXtrade and the online survey platform PCXquest were designed and successfully tested several times within the scope of this work. Two empirical studies also examined the extent to which people are able to quantify their uncertainties at all and how they evaluate experts' self-assessments. The results suggest that people tend to overestimate their forecasting abilities and tend to place high trust in those expert assessments for which the expert himself has expressed high confidence. Regarding the latter finding, however, it should be noted that a not inconsiderable share of respondents viewed very high self-assessments by an expert negatively. Since Bayesianism promotes probabilities as a measure of personal uncertainty, it offers no framework for verifying or falsifying assessments. This is sometimes equated with arbitrariness and could be one of the reasons why openly practised Bayesianism leads a shadowy existence in Germany. This dissertation therefore puts the concept of Bayesian due diligence up for discussion. It proposes a criteria-based evaluation of expert assessments that focuses in particular on the inter-subjectivity and the information content of the assessments.
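For a flavour of the MCMC calibration described in this entry, the sketch below fits the drift and volatility of a plain Gaussian random walk to synthetic increments with a random-walk Metropolis sampler; the model, priors and data are illustrative stand-ins, not the twelve processes or the CO2 price series used in the dissertation.

```python
# Random-walk Metropolis sampling of the drift and volatility of an
# arithmetic Brownian motion fitted to daily price increments.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "price" increments standing in for real price data.
true_mu, true_sigma = 0.02, 0.35
increments = rng.normal(true_mu, true_sigma, size=1_000)

def log_posterior(mu, sigma):
    if sigma <= 0:
        return -np.inf
    log_lik = np.sum(-0.5 * ((increments - mu) / sigma) ** 2 - np.log(sigma))
    log_prior = -0.5 * mu ** 2 - sigma        # N(0, 1) on mu, Exp(1) on sigma
    return log_lik + log_prior

samples, current = [], (0.0, 1.0)
current_lp = log_posterior(*current)
for _ in range(20_000):
    proposal = (current[0] + rng.normal(0, 0.02), current[1] + rng.normal(0, 0.02))
    proposal_lp = log_posterior(*proposal)
    if np.log(rng.random()) < proposal_lp - current_lp:   # Metropolis acceptance
        current, current_lp = proposal, proposal_lp
    samples.append(current)

mu_draws, sigma_draws = np.array(samples[5_000:]).T       # discard burn-in
print(f"posterior mean mu ~ {mu_draws.mean():.3f}, sigma ~ {sigma_draws.mean():.3f}")
```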
APA, Harvard, Vancouver, ISO, and other styles
30

Oteniya, Lloyd. "Bayesian belief networks for dementia diagnosis and other applications : a comparison of hand-crafting and construction using a novel data driven technique." Thesis, University of Stirling, 2008. http://hdl.handle.net/1893/497.

Full text
Abstract:
The Bayesian network (BN) formalism is a powerful representation for encoding domains characterised by uncertainty. However, before it can be used it must first be constructed, which is a major challenge for any real-life problem. There are two broad approaches, namely the hand-crafted approach, which relies on a human expert, and the data-driven approach, which relies on data. The former approach is useful, however issues such as human bias can introduce errors into the model. We have conducted a literature review of the expert-driven approach, and we have cherry-picked a number of common methods, and engineered a framework to assist non-BN experts with expert-driven construction of BNs. The latter construction approach uses algorithms to construct the model from a data set. However, construction from data is provably NP-hard. To solve this problem, approximate, heuristic algorithms have been proposed; in particular, algorithms that assume an order between the nodes, therefore reducing the search space. However, traditionally, this approach relies on an expert providing the order among the variables --- an expert may not always be available, or may be unable to provide the order. Nevertheless, if a good order is available, these order-based algorithms have demonstrated good performance. More recent approaches attempt to ''learn'' a good order then use the order-based algorithm to discover the structure. To eliminate the need for order information during construction, we propose a search in the entire space of Bayesian network structures --- we present a novel approach for carrying out this task, and we demonstrate its performance against existing algorithms that search in the entire space and the space of orders. Finally, we employ the hand-crafting framework to construct models for the task of diagnosis in a ''real-life'' medical domain, dementia diagnosis. We collect real dementia data from clinical practice, and we apply the data-driven algorithms developed to assess the concordance between the reference models developed by hand and the models derived from real clinical data.
APA, Harvard, Vancouver, ISO, and other styles
31

Barbour, Emily. "Quantitative modelling for assessing system trade-offs in environmental flow management." Phd thesis, Canberra, ACT : The Australian National University, 2015. http://hdl.handle.net/1885/109583.

Full text
Abstract:
This research aims to better enable the management of environmental flows through exploring the opportunities and challenges in using quantitative models for decision making. It examines the development and application of ecological response models, river system models, and multi-objective optimisation for improved ecological outcomes and the identification of trade-offs. In doing so, the thesis endeavours to capture a deeper and more holistic understanding of uncertainty in the application of quantitative models, to assist in making more informed decisions in water resource management. The thesis includes three main components. Firstly, an ecological response model is developed to advance previous methods by: (1) adopting a systems approach to representing water availability for floodplain vegetation, considering rainfall and groundwater in addition to riverine flooding; (2) including antecedent conditions in estimating current ecological condition; and (3) including uncertainty in modelling ecological response through the use of upper and lower prediction bounds and multiple conceptual models derived through expert elicitation. Secondly, the ecological response model is evaluated using sensitivity and uncertainty analysis. Global sensitivity analysis was used to identify model components that are both uncertain and have critical impact on results, and demonstrated that conceptualisation of ecological response had the greatest impact on predicted ecological condition. A novel application of Bayesian analysis was then used to evaluate different expert derived models against observed data, considering multiple sources of uncertainty. The analysis demonstrates a number of remaining challenges in modelling ecological systems, where model performance depends upon assumptions that are highly uncertain. The third and final component evaluates opportunities and challenges in using multi-objective optimisation, to assist in water resource management and the improvement of ecological outcomes. This component begins with a synthesis of previous studies drawing upon literature from hydrology, ecology, optimisation and decision science, and identifies a number of strategies for improvement. The synthesis is followed by a case study on the Lachlan catchment of the Murray-Darling Basin, Australia. The case study uses multi-objective optimisation to explore different environmental flow rules using a river system model combined with the expert-based ecological models. In doing so, it addresses the challenges of objective setting and problem framing in the context of significant uncertainty. The case study evaluates results generated using the optimisation framework in terms of likely actual decision outcomes. The research identifies a need to revisit fundamental questions regarding system understanding and objective framing in the light of rapidly improving computational capacity and sophistication. This is particularly relevant in the case of ecological management, where objectives form an interplay between ecological science and social values. Modelling tools provide valuable pathways to system learning and communication, yet a deeper understanding and evaluation of model behaviour in the context of actual decisions is needed. The methods presented in this thesis aim to provide a step toward addressing the challenges of working with uncertain information, incomplete knowledge, and integration across multiple disciplines within a decision-making environment. 
Through the methods developed here, the research seeks to advance the science of model development and application.
APA, Harvard, Vancouver, ISO, and other styles
32

Al-Ani, Ahmed Karim. "An improved pattern classification system using optimal feature selection, classifier combination, and subspace mapping techniques." Thesis, Queensland University of Technology, 2002.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
33

O'Leary, Rebecca A. "Informed statistical modelling of habitat suitability for rare and threatened species." Thesis, Queensland University of Technology, 2008. https://eprints.qut.edu.au/17779/1/Rebecca_O%27Leary_Thesis.pdf.

Full text
Abstract:
In this thesis a number of statistical methods have been developed and applied to habitat suitability modelling for rare and threatened species. Data available on these species are typically limited. Therefore, developing these models from these data can be problematic and may produce prediction biases. To address these problems there are three aims of this thesis. The first aim is to develop and implement frequentist and Bayesian statistical modelling approaches for these types of data. The second aim is to develop and implement expert elicitation methods. The third aim is to apply these novel approaches to Australian rare and threatened species case studies with the intention of habitat suitability modelling. The first aim is fulfilled by investigating two innovative approaches for habitat suitability modelling and sensitivity analysis of the second approach to priors. The first approach is a new multilevel framework developed to model the species distribution at multiple scales and identify excess zeros (absences outside the species range). Applying a statistical modelling approach to the identification of excess zeros has not previously been conducted. The second approach is an extension and application of Bayesian classification trees to modelling the habitat suitability of a threatened species. This is the first 'real' application of this approach in ecology. Lastly, sensitivity analysis of the priors in Bayesian classification trees is examined for a real case study. Previously, sensitivity analysis of this approach to priors has not been examined. To address the second aim, expert elicitation methods are developed, extended and compared in this thesis. In particular, one elicitation approach is extended from previous research, there is a comparison of three elicitation methods, and one new elicitation approach is proposed. These approaches are illustrated for habitat suitability modelling of a rare species and the opinions of one or two experts are elicited. The first approach utilises a simple questionnaire, in which expert opinion is elicited on whether increasing values of a covariate either increases, decreases or does not substantively impact on a response. This approach is extended to express this information as a mixture of three normally distributed prior distributions, which are then combined with available presence/absence data in a logistic regression. This is one of the first elicitation approaches within the habitat suitability modelling literature that is appropriate for experts with limited statistical knowledge and can be used to elicit information from single or multiple experts. Three relatively new approaches to eliciting expert knowledge in a form suitable for Bayesian logistic regression are compared, one of which is the questionnaire approach. Included in this comparison of three elicitation methods are a summary of the advantages and disadvantages of these three methods, the results from elicitations and comparison of the prior and posterior distributions. An expert elicitation approach is developed for classification trees, in which the size and structure of the tree is elicited. There have been numerous elicitation approaches proposed for logistic regression; however, no approaches have been suggested for classification trees. The last aim of this thesis is addressed in all chapters, since the statistical approaches proposed and extended in this thesis have been applied to real case studies. Two case studies have been examined in this thesis.
The first is the rare native Australian thistle (Stemmacantha australis), in which the dataset contains a large number of absences distributed over the majority of Queensland, and a small number of presence sites that are only within South-East Queensland. This case study motivated the multilevel modelling framework. The second case study is the threatened Australian brush-tailed rock-wallaby (Petrogale penicillata). The application and sensitivity analysis of Bayesian classification trees, and all expert elicitation approaches investigated in this thesis, are applied to this case study. This work has several implications for conservation and management of rare and threatened species. Novel statistical approaches addressing the first aim provide extensions to currently existing methods, or propose a new approach, for identification of current and potential habitat. We demonstrate that better model predictions can be achieved using each method, compared to standard techniques. Elicitation approaches addressing the second aim ensure expert knowledge in various forms can be harnessed for habitat modelling, a particular benefit for rare and threatened species which typically have limited data. Throughout, innovations in statistical methodology are both motivated and illustrated via habitat modelling for two rare and threatened species: the native thistle Stemmacantha australis and the brush-tailed rock wallaby Petrogale penicillata.
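The questionnaire-based elicitation described above maps an expert's qualitative answer (a covariate decreases, does not substantively affect, or increases suitability) onto a mixture of three normal priors for the corresponding logistic-regression coefficient. A minimal sketch of one way to encode and sample such a prior, with hypothetical component parameters and mixture weights:

```python
# Encode a qualitative expert answer as a mixture of three normal priors on a
# logistic-regression coefficient, and draw samples from it.
import numpy as np

rng = np.random.default_rng(42)

# Components: "decrease", "no substantive effect", "increase" (hypothetical).
COMPONENTS = {"decrease": (-1.5, 0.75), "none": (0.0, 0.25), "increase": (1.5, 0.75)}

def prior_weights(expert_answer, confidence=0.8):
    """Put most mass on the stated direction, spreading the rest evenly."""
    other = (1.0 - confidence) / 2.0
    return {k: (confidence if k == expert_answer else other) for k in COMPONENTS}

def sample_coefficient(expert_answer, size=10_000):
    weights = prior_weights(expert_answer)
    labels = list(COMPONENTS)
    choices = rng.choice(labels, p=[weights[k] for k in labels], size=size)
    return np.array([rng.normal(*COMPONENTS[c]) for c in choices])

draws = sample_coefficient("increase")
print(f"prior mean {draws.mean():.2f}, prior sd {draws.std():.2f}")
```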
APA, Harvard, Vancouver, ISO, and other styles
34

O'Leary, Rebecca A. "Informed statistical modelling of habitat suitability for rare and threatened species." Queensland University of Technology, 2008. http://eprints.qut.edu.au/17779/.

Full text
Abstract:
In this thesis a number of statistical methods have been developed and applied to habitat suitability modelling for rare and threatened species. Data available on these species are typically limited. Therefore, developing these models from these data can be problematic and may produce prediction biases. To address these problems there are three aims of this thesis. The first aim is to develop and implement frequentist and Bayesian statistical modelling approaches for these types of data. The second aim is to develop and implement expert elicitation methods. The third aim is to apply these novel approaches to Australian rare and threatened species case studies with the intention of habitat suitability modelling. The first aim is fulfilled by investigating two innovative approaches for habitat suitability modelling and sensitivity analysis of the second approach to priors. The first approach is a new multilevel framework developed to model the species distribution at multiple scales and identify excess zeros (absences outside the species range). Applying a statistical modelling approach to the identification of excess zeros has not previously been conducted. The second approach is an extension and application of Bayesian classification trees to modelling the habitat suitability of a threatened species. This is the first 'real' application of this approach in ecology. Lastly, sensitivity analysis of the priors in Bayesian classification trees is examined for a real case study. Previously, sensitivity analysis of this approach to priors has not been examined. To address the second aim, expert elicitation methods are developed, extended and compared in this thesis. In particular, one elicitation approach is extended from previous research, there is a comparison of three elicitation methods, and one new elicitation approach is proposed. These approaches are illustrated for habitat suitability modelling of a rare species and the opinions of one or two experts are elicited. The first approach utilises a simple questionnaire, in which expert opinion is elicited on whether increasing values of a covariate either increases, decreases or does not substantively impact on a response. This approach is extended to express this information as a mixture of three normally distributed prior distributions, which are then combined with available presence/absence data in a logistic regression. This is one of the first elicitation approaches within the habitat suitability modelling literature that is appropriate for experts with limited statistical knowledge and can be used to elicit information from single or multiple experts. Three relatively new approaches to eliciting expert knowledge in a form suitable for Bayesian logistic regression are compared, one of which is the questionnaire approach. Included in this comparison of three elicitation methods are a summary of the advantages and disadvantages of these three methods, the results from elicitations and comparison of the prior and posterior distributions. An expert elicitation approach is developed for classification trees, in which the size and structure of the tree is elicited. There have been numerous elicitation approaches proposed for logistic regression; however, no approaches have been suggested for classification trees. The last aim of this thesis is addressed in all chapters, since the statistical approaches proposed and extended in this thesis have been applied to real case studies. Two case studies have been examined in this thesis.
The first is the rare native Australian thistle (Stemmacantha australis), in which the dataset contains a large number of absences distributed over the majority of Queensland, and a small number of presence sites that are only within South-East Queensland. This case study motivated the multilevel modelling framework. The second case study is the threatened Australian brush-tailed rock-wallaby (Petrogale penicillata). The application and sensitivity analysis of Bayesian classification trees, and all expert elicitation approaches investigated in this thesis, are applied to this case study. This work has several implications for conservation and management of rare and threatened species. Novel statistical approaches addressing the first aim provide extensions to currently existing methods, or propose a new approach, for identification of current and potential habitat. We demonstrate that better model predictions can be achieved using each method, compared to standard techniques. Elicitation approaches addressing the second aim ensure expert knowledge in various forms can be harnessed for habitat modelling, a particular benefit for rare and threatened species which typically have limited data. Throughout, innovations in statistical methodology are both motivated and illustrated via habitat modelling for two rare and threatened species: the native thistle Stemmacantha australis and the brush-tailed rock wallaby Petrogale penicillata.
APA, Harvard, Vancouver, ISO, and other styles
35

MAGRINI, ALESSANDRO. "A Bayesian network for the diagnosis of cardiopulmonary diseases: Learning from medical experts and clinical data." Doctoral thesis, 2014. http://hdl.handle.net/2158/841701.

Full text
Abstract:
Bayesian networks offer an extremely flexible environment for knowledge representation, so that they are often claimed to be the best statistical tools to support medical diagnosis. An Acyclic Directed Graph encodes causal relationships and provides a factorization of the joint probability distribution according to conditional independence properties. Although knowledge for the specification of the Acyclic Directed Graph is easily retrievable, the information useful to develop the quantitative part of the network is typically scattered and varying in quality. For instance, medical literature seldom covers all the aspects of interest and clinical data are typically sparse. This is why the most relevant applications of Bayesian networks to medical diagnosis are entirely built from expert knowledge. However, when this is the case, the accuracy of the quantitative part remains arguable, since the elicited information is unlikely to be fully trustworthy. In this thesis, the quantitative part of a Bayesian network for the diagnosis of cardiopulmonary diseases is estimated by combining elicitation from medical experts and clinical data. An original elicitation framework is developed to accurately quantify expert uncertainty on parameters, then prior distributions are updated in the light of data by means of Markov Chain Monte Carlo methods. The framework includes several generalizations of the Noisy-Or model and a Generalized Beta regression, which are exploited to avoid polytomization of continuous variables. Parsimony in the number of parameters is dramatically improved with respect to the traditional framework, while a rescaling procedure based on a distinction between normal and pathological values makes parameters meaningful for medical experts. As such, the framework allows either incorporating the sample size of clinical studies into the prior distributions or, when no study provides sufficiently detailed information, computing expert uncertainty on assessments as if they were based on a virtual experiment. By taking advantage of two different sources of knowledge, the consistency between the model and data is readily checked by inspecting prior-to-posterior divergence. This enables a proper refinement of the Bayesian network in a cyclic-iterated fashion, possibly questioning the specification of the Acyclic Directed Graph, a feature which is beyond the capability of applications entirely built from either expert knowledge or data.
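The Noisy-Or family that the framework generalises has a compact closed form: with a leak probability lam and per-cause parameters p_i, P(effect | active causes) = 1 - (1 - lam) * product over active causes of (1 - p_i). A minimal sketch of the standard model follows; the generalisations developed in the thesis are not reproduced here, and the parameter values are hypothetical.

```python
# Standard Noisy-Or conditional probability: each active cause independently
# fails to produce the effect with probability (1 - p_i); a leak term captures
# causes not modelled explicitly.

def noisy_or(active_causes, cause_probs, leak=0.01):
    """P(effect present | set of active causes)."""
    p_all_fail = 1.0 - leak
    for cause in active_causes:
        p_all_fail *= 1.0 - cause_probs[cause]
    return 1.0 - p_all_fail

# Hypothetical causal strengths for a symptom given two diseases.
cause_probs = {"disease_A": 0.7, "disease_B": 0.4}

print(noisy_or([], cause_probs))                       # leak only
print(noisy_or(["disease_A"], cause_probs))            # 1 - 0.99 * 0.3
print(noisy_or(["disease_A", "disease_B"], cause_probs))
```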
APA, Harvard, Vancouver, ISO, and other styles
36

黃國禎. "New knowledge elicitation methods for constructing expert systems." Thesis, 1991. http://ndltd.ncl.edu.tw/handle/32921369062265519472.

Full text
APA, Harvard, Vancouver, ISO, and other styles
37

"Expert knowledge elicitation to improve mental and formal models." Sloan School of Management, Massachusetts Institute of Technology, 1997. http://hdl.handle.net/1721.1/2654.

Full text
Abstract:
David N. Ford and John D. Sterman.
Includes bibliographical references (p. 24-25).
Supported by the Python organization, the Organizational Learning Center and the System Dynamics Group at the MIT Sloan School of Management.
APA, Harvard, Vancouver, ISO, and other styles
38

Cruickshank, Claire. "Does the Elicitation Mode Matter? Comparing Different Methods for Eliciting Expert Judgement." 2018. https://scholarworks.umass.edu/masters_theses_2/634.

Full text
Abstract:
An expert elicitation is a method of eliciting subjective probability distributions over key parameters from experts. Traditionally an expert elicitation has taken the form of a face-to-face interview; however, interest in using online methods has been growing. This thesis compares two elicitation modes and examines the effectiveness of an interactive online survey compared to a face-to-face interview. Differences in central values, overconfidence, accuracy and satisficing were considered. The results of our analysis indicated that, in instances where the online and face-to-face elicitations were directly comparable, the differences between the modes were not significant. Consequently, a carefully designed online elicitation may be used successfully to obtain accurate forecasts.
APA, Harvard, Vancouver, ISO, and other styles
39

Dunn, Jessamine Corey. "Bayesian Networks with Expert Elicitation as Applicable to Student Retention in Institutional Research." 2016. http://scholarworks.gsu.edu/eps_diss/146.

Full text
Abstract:
The application of Bayesian networks within the field of institutional research is explored through the development of a Bayesian network used to predict first- to second-year retention of undergraduates. A hybrid approach to model development is employed, in which formal elicitation of subject-matter expertise is combined with machine learning in designing model structure and specification of model parameters. Subject-matter experts include two academic advisors at a small, private liberal arts college in the southeast, and the data used in machine learning include six years of historical student-related information (i.e., demographic, admissions, academic, and financial) on 1,438 first-year students. Netica 5.12, a software package designed for constructing Bayesian networks, is used for building and validating the model. Evaluation of the resulting model’s predictive capabilities is examined, as well as analyses of sensitivity, internal validity, and model complexity. Additionally, the utility of using Bayesian networks within institutional research and higher education is discussed. The importance of comprehensive evaluation is highlighted, due to the study’s inclusion of an unbalanced data set. Best practices and experiences with expert elicitation are also noted, including recommendations for use of formal elicitation frameworks and careful consideration of operating definitions. Academic preparation and financial need risk profile are identified as key variables related to retention, and the need for enhanced data collection surrounding such variables is also revealed. For example, the experts emphasize study skills as an important predictor of retention while noting the absence of collection of quantitative data related to measuring students’ study skills. Finally, the importance and value of the model development process is stressed, as stakeholders are required to articulate, define, discuss, and evaluate model components, assumptions, and results.
APA, Harvard, Vancouver, ISO, and other styles
40

Yang, Yung-Hsiang, and 楊詠翔. "Study on Expert Experience Elicitation Applied to Bayesian Network – An Example of Construction Safety." Thesis, 2009. http://ndltd.ncl.edu.tw/handle/98224660923602026560.

Full text
Abstract:
Master's thesis
National Taiwan University of Science and Technology
Department of Construction Engineering
97
Bayesian networks are an effective analysis method and have been used in many domains in recent years. In construction engineering, many risk factors introduce considerable uncertainty, and analysts cannot obtain correct probabilities because of the lack of historical data. In this situation, how to obtain the probabilities of all the factors in every state becomes an important topic. In order to build a methodical process for acquiring experts' subjective probabilities, this research investigates how to obtain accurate values for the conditional probability table (CPT) of a Bayesian network by eliciting experts' subjective judgements when historical data are lacking. The research also brings a useful software tool into the process so that the analyst can obtain the values efficiently, and thus use accurate and reliable values in the Bayesian network analysis to produce valid results.
APA, Harvard, Vancouver, ISO, and other styles
41

Kabir, Sohag, T. K. Goek, M. Kumar, M. Yazdi, and F. Hossain. "A method for temporal fault tree analysis using intuitionistic fuzzy set and expert elicitation." 2019. http://hdl.handle.net/10454/17992.

Full text
Abstract:
Yes
Temporal fault trees (TFTs), an extension of classical Boolean fault trees, can model time-dependent failure behaviour of dynamic systems. The methodologies used for quantitative analysis of TFTs include algebraic solutions, Petri nets (PN), and Bayesian networks (BN). In these approaches, precise failure data of components are usually used to calculate the probability of the top event of a TFT. However, it can be problematic to obtain these precise data due to the imprecise and incomplete information about the components of a system. In this paper, we propose a framework that combines intuitionistic fuzzy set theory and expert elicitation to enable quantitative analysis of TFTs of dynamic systems with uncertain data. Experts’ opinions are taken into account to compute the failure probability of the basic events of the TFT as intuitionistic fuzzy numbers. Subsequently, for the algebraic approach, the intuitionistic fuzzy operators for the logic gates of TFT are defined to quantify the TFT. On the other hand, for the quantification of TFTs via PN and BN-based approaches, the intuitionistic fuzzy numbers are defuzzified to be used in these approaches. As a result, the framework can be used with all the currently available TFT analysis approaches. The effectiveness of the proposed framework is illustrated via application to a practical system and through a comparison of the results of each approach.
This work was supported in part by the Mobile IOT: Location Aware project (grant no. MMUE/180025) and Indoor Internet of Things (IOT) Tracking Algorithm Development based on Radio Signal Characterisation project (grant no. FRGS/1/2018/TK08/MMU/02/1). This research also received partial support from DEIS H2020 project (grant no. 732242).
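To give a feel for the elicitation step, the sketch below aggregates experts' triangular estimates of a basic event's failure probability by a weighted average and defuzzifies with a centroid. It deliberately simplifies to ordinary triangular fuzzy numbers; it is not the intuitionistic formulation or the defuzzification used in the paper, and all estimates and weights are hypothetical.

```python
# Aggregate experts' triangular fuzzy estimates (low, mode, high) of a basic
# event's failure probability and defuzzify to a crisp value by the centroid.

def aggregate(expert_tfns, weights):
    """Weighted average of triangular fuzzy numbers, component-wise."""
    total = sum(weights)
    return tuple(
        sum(w * tfn[i] for w, tfn in zip(weights, expert_tfns)) / total
        for i in range(3)
    )

def centroid(tfn):
    """Centroid defuzzification of a triangular fuzzy number."""
    low, mode, high = tfn
    return (low + mode + high) / 3.0

# Hypothetical elicited estimates from three experts, weighted by seniority.
experts = [(1e-4, 5e-4, 1e-3), (2e-4, 4e-4, 8e-4), (5e-5, 3e-4, 9e-4)]
weights = [0.5, 0.3, 0.2]

fuzzy_failure = aggregate(experts, weights)
print(f"aggregated TFN: {fuzzy_failure}")
print(f"crisp failure probability: {centroid(fuzzy_failure):.2e}")
```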
APA, Harvard, Vancouver, ISO, and other styles
42

Alvarado-Valencia, J., L. H. Barrero, Dilek Onkal, and J. T. Dennerlein. "Expertise, credibility of system forecasts and integration methods in judgmental demand forecasting." 2015. http://hdl.handle.net/10454/13387.

Full text
Abstract:
Yes
Expert knowledge elicitation lies at the core of judgmental forecasting—a domain that relies fully on the power of such knowledge and its integration into forecasting. Using experts in a demand forecasting framework, this work aims to compare the accuracy improvements and forecasting performances of three judgmental integration methods. To do this, a field study was conducted with 31 experts from four companies. The methods compared were the judgmental adjustment, the 50–50 combination, and the divide-and-conquer. Forecaster expertise, the credibility of system forecasts and the need to rectify system forecasts were also assessed, and mechanisms for performing this assessment were considered. When (a) a forecaster’s relative expertise was high, (b) the relative credibility of the system forecasts was low, and (c) the system forecasts had a strong need of correction, judgmental adjustment improved the accuracy relative to both the other integration methods and the system forecasts. Experts with higher levels of expertise showed higher adjustment frequencies. Our results suggest that judgmental adjustment promises to be valuable in the long term if adequate conditions of forecaster expertise and the credibility of system forecasts are met.
APA, Harvard, Vancouver, ISO, and other styles
43

Czembor, Christina Anne. "Incorporating uncertainty into expert models for management of box-ironbark forests and woodlands in Victoria, Australia." 2009. http://repository.unimelb.edu.au/10187/5801.

Full text
Abstract:
Anthropogenic utilization of forest and woodland ecosystems can cause declines in flora and fauna species. It is imperative to restore these ecosystems to mitigate further declines. In this thesis, I focused on a highly degraded region, the Box-Ironbark forests and woodlands of Victoria, Australia. Rather than mature stands with large trees, stands are currently dominated by high densities of small stems. This change has resulted in reduced populations of many flora and fauna species dependent on older-growth forests and woodlands. Managers are interested in restoring mature Box-Ironbark forests and woodlands through three alternative management strategies: allocating land to National Parks and allowing stands to develop naturally without harvesting, modifying timber harvesting regimes to retain more medium and large trees, or a new ecological thinning technique that retains target habitat trees and removes competing trees to encourage growth of retained stems.
The effects of each management strategy are not easy to predict due to complex interactions between intervention and stochastic natural processes. Forest simulation models are often employed to overcome this problem. I constructed state-and-transition simulation models (STSMs) to predict the effects of alternative management actions and natural disturbances on vegetation structure. Due to a lack of empirical data, I relied on the knowledge of experts in Box-Ironbark ecology and management to construct STSMs. Models predicted that the development of mature woodlands under all strategies was minimal over the next 150 years, and neither current harvesting nor ecological thinning is likely to expedite the development of mature stands relative to growth and natural disturbances. However, differences in experts’ opinions led to widely diverging model predictions.
Uncertainty must be acknowledged in model construction because it can affect model predictions. I quantified uncertainty due to four sources – between-expert variation, imperfect expert knowledge, natural stochasticity, and model parameterization – to determine which source caused the most variance in model predictions. I found that models were very uncertain and between-expert uncertainty contributed the majority of variance in model predictions. This brings into question the use of consensus methods in forest management where differences between experts are ignored.
Using uncertain model predictions to make management decisions is problematic because any given action can have many plausible outcomes. I applied several decision criteria to uncertain STSM predictions using a formal decision-making framework to determine the optimal management action in Box-Ironbark forests and woodlands. I found that natural development is the most risk-averse option, while ecological thinning is the most risky option because there is a small likelihood that it will greatly expedite the development of mature woodlands. Rather than selecting one option, managers could rely on a risk-spreading approach where the majority of land is allocated to no-cutting National Parks and a small amount of land is allocated to the other two harvesting strategies. This would allow managers to collect monitoring data for all management strategies in order to learn about effects of harvesting and update model predictions through time using adaptive management.
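A state-and-transition simulation of the kind described here can be sketched as repeated draws from expert-specific annual transition matrices; the states, matrices and horizon below are invented for illustration, and the point is only to show how between-expert differences propagate into diverging predictions of mature woodland.

```python
# Toy state-and-transition simulation: track the fraction of stands reaching a
# "mature" state over 150 years under two hypothetical experts' annual
# transition matrices, illustrating between-expert divergence.
import numpy as np

STATES = ["regrowth", "medium", "mature"]

EXPERT_MATRICES = {
    # rows: current state, columns: next state (annual probabilities, hypothetical)
    "expert_1": np.array([[0.97, 0.03, 0.00],
                          [0.02, 0.96, 0.02],
                          [0.01, 0.01, 0.98]]),
    "expert_2": np.array([[0.99, 0.01, 0.00],
                          [0.05, 0.945, 0.005],
                          [0.02, 0.02, 0.96]]),
}

def simulate(matrix, years=150, n_stands=10_000, seed=0):
    rng = np.random.default_rng(seed)
    states = np.zeros(n_stands, dtype=int)        # all stands start as regrowth
    for _ in range(years):
        new_states = states.copy()
        for s in range(len(STATES)):
            idx = np.where(states == s)[0]
            new_states[idx] = rng.choice(len(STATES), size=idx.size, p=matrix[s])
        states = new_states
    return np.mean(states == STATES.index("mature"))

for expert, matrix in EXPERT_MATRICES.items():
    print(f"{expert}: fraction mature after 150 years = {simulate(matrix):.2f}")
```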
APA, Harvard, Vancouver, ISO, and other styles
44

Classen, Selwyn Ivor. "Using storytelling to elicit tacit knowledge from subject matter experts in an organization." Thesis, 2010. http://hdl.handle.net/11394/3486.

Full text
Abstract:
Magister Commercii (Information Management) - MCom(IM)
Knowledge Management has been the focus of mounting attention over the last several years. Research and literature on the subject have grown, and organizations have come to realize that success is often determined by one's ability to create, disseminate, and embody knowledge in products and services. This realization has led to increased interest in examining the ways in which knowledge can be effectively identified, elicited, codified, distributed and retained. When an employee leaves an organization, the knowledge they possess often goes with them. This loss can potentially have a negative impact on the productivity and quality of the organization. Knowledge Management seeks to find ways to minimize the loss of knowledge when an employee leaves an organization. One of the impediments that knowledge management seeks to overcome is the accepted tendency of people to hoard knowledge. People often withhold knowledge when they feel it provides them with a competitive advantage over others. This study was intended to provide the organization with an approach that it can utilize to facilitate tacit knowledge elicitation by means of the storytelling method. In keeping with Grounded Theory principles, and utilising an interpretive approach, stories from Subject Matter Experts were collected and re-coded into fitting knowledge management constructs. The coding of the stories into the various knowledge management constructs was then further refined by means of expert review. Pearson's cross-correlation analysis was also used as a supporting tool to determine and validate that the collected stories were classified correctly under the knowledge management constructs. The research findings eventually demonstrated that storytelling is an effective means of eliciting tacit knowledge from experts. In addition, the research has inadvertently resulted in the construction of a knowledge management framework for storytelling.
APA, Harvard, Vancouver, ISO, and other styles
45

Hsieh, Chien-Wen, and 謝千文. "The Study of Teaching Belief and Teaching Behavior of A Somatic Education Expert Teacher: Taking Pilates Course as An Example." Thesis, 2008. http://ndltd.ncl.edu.tw/handle/gd32f9.

Full text
Abstract:
Master's thesis
National Taitung University
Master's Program, Department of Physical Education
96
The purpose of the study is to explore the relationship between the teaching beliefs and teaching behaviours of a somatic education expert teacher in a Pilates course. I collected qualitative data on the expert teacher
APA, Harvard, Vancouver, ISO, and other styles
46

Kashuba, Roxolana Oresta. "Bayesian Methods to Characterize Uncertainty in Predictive Modeling of the Effect of Urbanization on Aquatic Ecosystems." Diss., 2010. http://hdl.handle.net/10161/2366.

Full text
Abstract:

Urbanization causes myriad changes in watershed processes, ultimately disrupting the structure and function of stream ecosystems. Urban development introduces contaminants (human waste, pesticides, industrial chemicals). Impervious surfaces and artificial drainage systems speed the delivery of contaminants to streams, while bypassing soil filtration and local riparian processes that can mitigate the impacts of these contaminants, and disrupting the timing and volume of hydrologic patterns. Aquatic habitats where biota live are degraded by sedimentation, channel incision, floodplain disconnection, substrate alteration and elimination of reach diversity. These compounding changes ultimately lead to alteration of invertebrate community structure and function. Because the effects of urbanization on stream ecosystems are complex, multilayered, and interacting, modeling these effects presents many unique challenges, including: addressing and quantifying processes at multiple scales, representing major interrelated simultaneously acting dynamics at the system level, incorporating uncertainty resulting from imperfect knowledge, imperfect data, and environmental variability, and integrating multiple sources of available information about the system into the modeling construct. These challenges can be addressed by using a Bayesian modeling approach. Specifically, the use of multilevel hierarchical models and Bayesian network models allows the modeler to harness the hierarchical nature of the U.S. Geological Survey (USGS) Effect of Urbanization on Stream Ecosystems (EUSE) dataset to predict invertebrate response at both basin and regional levels, concisely represent and parameterize this system of complicated cause and effect relationships and uncertainties, calculate the full probabilistic function of all variables efficiently as the product of more manageable conditional probabilities, and includes both expert knowledge and data. Utilizing this Bayesian framework, this dissertation develops a series of statistically rigorous and ecologically interpretable models predicting the effect of urbanization on invertebrates, as well as a unique, systematic methodology that creates an informed expert prior and then updates this prior with available data using conjugate Dirichlet-multinomial distribution forms. The resulting models elucidate differences between regional responses to urbanization (particularly due to background agriculture and precipitation) and address the influences of multiple urban induced stressors acting simultaneously from a new system-level perspective. These Bayesian modeling approaches quantify previously unexplained regional differences in biotic response to urbanization, capture multiple interacting environmental and ecological processes affected by urbanization, and ultimately link urbanization effects on stream biota to a management context such that these models describe and quantify how changes in drivers lead to changes in regulatory endpoint (the Biological Condition Gradient; BCG).
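The conjugate Dirichlet-multinomial updating mentioned at the end of the abstract reduces to adding observed category counts to the prior pseudo-counts. A small sketch with hypothetical categories and numbers:

```python
# Conjugate Dirichlet-multinomial update: an expert-informed prior over
# invertebrate condition categories is updated with observed site counts by
# adding the counts to the prior pseudo-counts.
import numpy as np

categories = ["degraded", "intermediate", "reference-like"]

# Expert-informed prior pseudo-counts (hypothetical; larger totals encode
# more confidence in the elicited proportions).
alpha_prior = np.array([6.0, 3.0, 1.0])

# Observed counts of sampled sites falling into each category (hypothetical).
counts = np.array([14, 9, 2])

alpha_post = alpha_prior + counts
posterior_mean = alpha_post / alpha_post.sum()

for cat, prior_a, post_m in zip(categories, alpha_prior, posterior_mean):
    print(f"{cat:15s} prior pseudo-count {prior_a:.1f} -> posterior mean {post_m:.2f}")
```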


Dissertation
APA, Harvard, Vancouver, ISO, and other styles
47

Pietrocatelli, Simon. "Analyse bayésienne et élicitation d’opinions d’experts en analyse de risques et particulièrement dans le cas de l’amiante chrysotile." Thèse, 2008. http://hdl.handle.net/1866/3345.

Full text
Abstract:
Characterizing the carcinogenic potency of chrysotile asbestos fibres relies a great deal on subjective and uncertain judgements by experts and analysts, given the heterogeneous and equivocal results of important epidemiological and toxicological studies. The probabilistic Bayesian approach to risk assessment can formalize these subjective judgements and their uncertainties, along with their impact on risk estimates, but it is still rarely used in public health. This work examines how the Bayesian approach could have been applied to a recent elicitation of experts' opinions to estimate the toxicity of chrysotile asbestos, the degree of convergence and divergence among the experts, and their levels of uncertainty. The experts' estimates of the relative toxicity of chrysotile and amphibole asbestos agreed fairly well in the case of mesothelioma. For lung cancer, however, the heterogeneity of the underlying studies led to diverging and incompatible probabilistic evaluations. The experts' judgements appeared to be influenced, to varying degrees, by heuristic biases, particularly the affect and anchoring heuristics associated with a controversial topic and heterogeneous data. A rigorous methodology for preparing the experts for the elicitation exercise could have reduced the impact of these heuristics and biases on the panel.
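As a rough illustration of how such an elicitation could be formalized, the sketch below fits a lognormal distribution to each expert's elicited median and 90th percentile for a relative-potency ratio and combines the experts with an equal-weight linear opinion pool. The elicited quantiles are invented for illustration and do not come from the panel discussed above; a Bayesian analysis would then update the pooled prior with the epidemiological data.

```python
import numpy as np

rng = np.random.default_rng(0)
Z90 = 1.2816  # standard-normal 90th percentile

# Hypothetical elicited (median, 90th percentile) pairs for a relative-potency
# ratio, one pair per expert; not the actual panel's judgements.
elicited = [(0.1, 0.5), (0.2, 2.0), (0.05, 0.3)]

def lognormal_from_quantiles(median, p90):
    """Solve for (mu, sigma) of a lognormal from its median and 90th percentile."""
    mu = np.log(median)
    sigma = (np.log(p90) - mu) / Z90
    return mu, sigma

# Equal-weight linear opinion pool: an even mixture of the per-expert distributions.
n_draws = 100_000
draws = np.concatenate([
    rng.lognormal(*lognormal_from_quantiles(m, p), size=n_draws)
    for m, p in elicited
])

print("Pooled 5th, 50th, 95th percentiles:",
      np.percentile(draws, [5, 50, 95]).round(3))
```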
APA, Harvard, Vancouver, ISO, and other styles
48

"Reliability Information and Testing Integration for New Product Design." Doctoral diss., 2014. http://hdl.handle.net/2286/R.I.25799.

Full text
Abstract:
In the three phases of the engineering design process (conceptual design, embodiment design, and detailed design), traditional reliability information is scarce. There are, however, several sources of information that can provide reliability inputs while a new product is being designed. This research considered the following sources for further analysis: reliability information from similar existing products (referred to as parents), elicited expert opinions, initial testing, and the customer voice used to create design requirements. These sources were integrated through three novel approaches to produce reliability insights across the engineering design process, all under the Design for Reliability (DFR) philosophy. First, an enhanced parenting process for assessing reliability was presented. Using reliability information from parents, a failure structure (the parent matrix) was created and compared against the new product. Expert opinions were then elicited to quantify the effects of the new design changes (the parent factor). Combining these two elements resulted in a reliability assessment early in the design process. Extending this approach into the conceptual design phase, a methodology was created to obtain a graphical reliability insight into a new product concept, summarized by three sequential steps: functional analysis, cognitive maps, and Bayesian networks. These tools integrated the available information, created a graphical representation of the concept, and provided quantitative reliability assessments. Lastly, to optimize resources when product testing is viable (e.g., in detailed design), a type of accelerated life testing was recommended: accelerated degradation testing. The potential of this type of test for robust design engineering was exploited; robust design was achieved by setting the design factors at levels such that the impact of stress-factor variation on the degradation rate is minimized. Finally, different case studies were presented to validate the proposed approaches and methods.
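A stripped-down reading of the parent matrix and parent factor idea is sketched below: per-failure-mode rates observed on an existing (parent) product are scaled by expert-elicited factors reflecting the new design changes, and the adjusted rates are combined into an early reliability estimate. The failure modes, rates, and factors are hypothetical, and constant (exponential) failure rates in a series system are assumed purely to keep the example small; this is not the dissertation's exact formulation.

```python
import numpy as np

# Hypothetical parent-product failure rates per failure mode (failures per hour).
parent_rates = {
    "seal leak":         2.0e-6,
    "bearing wear":      5.0e-6,
    "electronics fault": 1.0e-6,
}

# Expert-elicited multipliers ("parent factors") for the new design:
# >1 means the change is expected to worsen that mode, <1 to improve it.
parent_factors = {
    "seal leak":         0.5,  # redesigned seal
    "bearing wear":      1.2,  # higher operating speed
    "electronics fault": 1.0,  # unchanged
}

# Adjusted rates for the new product and total rate (series system assumed).
new_rates = {mode: rate * parent_factors[mode] for mode, rate in parent_rates.items()}
total_rate = sum(new_rates.values())

# Reliability over a mission time, assuming exponential lifetimes.
mission_hours = 10_000
reliability = np.exp(-total_rate * mission_hours)

print(f"Estimated new-product failure rate: {total_rate:.2e} per hour")
print(f"Reliability over {mission_hours} h: {reliability:.4f}")
```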
Dissertation/Thesis
Doctoral Dissertation, Industrial Engineering, 2014
APA, Harvard, Vancouver, ISO, and other styles
49

Chou, Shu-Chiung, and 周淑瓊. "A Comparative Study of the Belief, Cognition, Act Strategies and the Effects of Classroom Management between the Expert and Novice Teachers in the Junior High School." Thesis, 1999. http://ndltd.ncl.edu.tw/handle/39876699871523936936.

Full text
Abstract:
Master's
National Taiwan Normal University
Graduate Institute of Educational Psychology and Counseling
87
The purposes of this study were: (1) to investigate the beliefs, cognition, and act strategies of classroom management of expert and novice teachers in junior high school; and (2) to compare the effectiveness of classroom management between an expert and a novice teacher. The methods used were interview and observation. The subjects were two junior high school teachers, one expert and one novice, and 69 students from their classes. The researcher analyzed the interview transcripts and summarized the findings on the beliefs, cognition, act strategies, and other aspects of classroom management of the two teachers, and compared the effects of their classroom management by analyzing the 69 students' scores on the 'Students' Perception of School Life Scale'. The major findings were as follows: 1. The expert teacher's beliefs about classroom management leaned toward the humanistic approach, and she considered the goal of education to be cultivating good character in students; the novice teacher's beliefs combined the humanistic and behavioristic approaches. 2. The expert teacher's cognition of classroom management took a humanistic perspective: she attended to students' psychological needs and raised their sense of self-worth; the novice teacher's cognition mixed ideas from the humanistic and behavioristic approaches. 3. The expert teacher's act strategies were based on the humanistic approach; she applied strategies flexibly according to the situation and to individual differences among students, and used production rules to distinguish situations clearly and propose effective classroom management strategies. The novice teacher's strategies drew on both the humanistic and behavioristic approaches; although she had the declarative knowledge, she was not familiar with the procedural knowledge. 4. The expert teacher confronted work stress and problems with a positive attitude, whereas the novice teacher avoided difficulties and stress; she sometimes asked others for help and adopted their suggestions, but did not always receive appropriate assistance. 5. The expert teacher reacted to problems quickly and instinctively, using procedural knowledge to solve them, while the novice teacher focused on the surface of problems and did not solve them efficiently. 6. The expert teacher had her own evaluation criteria and was satisfied with her work; the novice teacher assessed herself by others' criteria, was disappointed in herself, and lost confidence in teaching. 7. The expert teacher actively contacted students' parents and cooperated closely with them, whereas the novice teacher, lacking an appropriate attitude and good communication skills, did not obtain effective help from parents. 8. Students' perceptions of the effectiveness of classroom management differed between the expert and novice teachers in 'arranging the classroom environment', 'establishing rules', 'having good communication skills', and 'managing problem behaviors', but not in 'organizing class procedures' or 'supervising students' activities'. 9. In summary, the expert teacher's beliefs, cognition, and act strategies of classroom management leaned toward the humanistic approach; she was satisfied with her performance and confirmed in her beliefs, cognition, and strategies, whereas the novice teacher's were inconsistent and she doubted her own teaching abilities. Based on these findings, suggestions were offered for teacher education, novice teachers, and future research.
APA, Harvard, Vancouver, ISO, and other styles
50

Lin, Yi-Hsuan, and 林易萱. "A Comparative Study of Teachers' Belief, Professional Commitment, and Classroom Management Effectiveness between Novice Teachers and Expert Teachers in Junior and Senior High Schools in Taiwan." Thesis, 2016. http://ndltd.ncl.edu.tw/handle/6d7w6v.

Full text
Abstract:
Master's
National Changhua University of Education
Graduate Institute of Education
104
The purpose of this study was to explore whether years of teaching experience have an impact on teachers' beliefs, professional commitment, and classroom management effectiveness. Novice teachers were defined as homeroom teachers with one to three years of teaching experience; expert teachers as those with at least six years of homeroom-teacher experience and at least ten years of teaching. The participants were 302 teachers from public junior and senior high schools in Taiwan, including 150 novices and 152 experts. A questionnaire was adopted as the instrument, and the data were analyzed with descriptive statistics, independent-samples t-tests, and multiple regression analysis. The findings were as follows. First, teachers' beliefs, professional commitment, and classroom management effectiveness were all at a medium-high level for both novices and experts. Second, there were no significant differences in teachers' beliefs in terms of years of teaching. Third, there was a significant difference in professional commitment: experts showed a stronger inclination toward advanced study than novices. Fourth, there were significant differences in classroom management effectiveness: experts had better classroom management efficacy in class rules, learning-environment arrangement, teaching quality, and students' academic achievement. Fifth, teachers' professional identity positively predicted classroom management effectiveness among novices. Finally, teachers' professional identity, job involvement, and teacher-student relationships positively predicted classroom management effectiveness among experts; in general, more predictors of classroom management effectiveness emerged for experts. Based on the findings, suggestions were proposed for educational administration institutions, schools, teachers, and future researchers. (Advisor: Hsin-Yi Kung, Ph.D.) Keywords: novice teacher, expert teacher, teachers' belief, professional commitment, classroom management effectiveness
APA, Harvard, Vancouver, ISO, and other styles
