Theses on the topic "Uncertainty (Information theory)"


Listed below are the top 50 dissertations for research on the topic "Uncertainty (Information theory)".


You can also download the full text of each academic publication in PDF format and read its abstract online whenever it is available in the metadata.

Explore theses on a wide variety of disciplines and organize your bibliography correctly.

1

De Aguinaga, José Guillermo. "Uncertainty Assessment of Hydrogeological Models Based on Information Theory". Doctoral thesis, Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2011. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-71814.

Abstract
There is a great deal of uncertainty in hydrogeological modeling. Overparametrized models increase uncertainty since the information of the observations is distributed through all of the parameters. The present study proposes a new option to reduce this uncertainty. A way to achieve this goal is to select a model which provides good performance with as few calibrated parameters as possible (parsimonious model) and to calibrate it using many sources of information. Akaike’s Information Criterion (AIC), proposed by Hirotugu Akaike in 1973, is a statistical-probabilistic criterion based on information theory, which allows us to select a parsimonious model. AIC formulates the problem of parsimonious model selection as an optimization problem across a set of proposed conceptual models. The AIC assessment is relatively new in groundwater modeling and it presents a challenge to apply it with different sources of observations. In this dissertation, important findings in the application of AIC in hydrogeological modeling using different sources of observations are discussed. AIC is tested on groundwater models using three sets of synthetic data: hydraulic pressure, horizontal hydraulic conductivity, and tracer concentration. In the present study, the impact of the following factors is analyzed: number of observations, types of observations and order of calibrated parameters. These analyses reveal not only that the number of observations determines how complex a model can be but also that their diversity allows for further complexity in the parsimonious model. However, a truly parsimonious model was only achieved when the order of calibrated parameters was properly considered. This means that parameters which provide bigger improvements in model fit should be considered first. The approach to obtain a parsimonious model applying AIC with different types of information was successfully applied to an unbiased lysimeter model using two different types of real data: evapotranspiration and seepage water. With this additional independent model assessment it was possible to underpin the general validity of this AIC approach.
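To make the selection criterion concrete, a minimal sketch of AIC-based ranking of candidate models is given below, assuming i.i.d. Gaussian residuals so that AIC can be written in terms of the residual sum of squares; the candidate model names, residual vectors and parameter counts are hypothetical and not taken from the thesis.

```python
import numpy as np

def aic_gaussian(residuals, n_params):
    """Akaike's Information Criterion assuming i.i.d. Gaussian residuals:
    AIC = n * ln(RSS / n) + 2 * k, with additive constants dropped."""
    residuals = np.asarray(residuals, dtype=float)
    n = residuals.size
    rss = np.sum(residuals ** 2)
    return n * np.log(rss / n) + 2 * n_params

# Hypothetical candidate groundwater models:
# (name, residuals of the calibrated model, number of calibrated parameters)
rng = np.random.default_rng(0)
candidates = [
    ("2-zone conductivity", rng.normal(0.0, 1.00, 100), 2),
    ("5-zone conductivity", rng.normal(0.0, 0.90, 100), 5),
    ("20-zone conductivity", rng.normal(0.0, 0.85, 100), 20),
]

scores = {name: aic_gaussian(res, k) for name, res, k in candidates}
best = min(scores, key=scores.get)  # lowest AIC = best fit/parsimony trade-off
print(scores)
print("Selected model:", best)
```

The model with the lowest AIC balances goodness of fit against the number of calibrated parameters, which is the parsimony trade-off described above.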
2

Dalvi, Nilesh. "Managing uncertainty using probabilistic databases /". Thesis, Connect to this title online; UW restricted, 2007. http://hdl.handle.net/1773/6920.

3

Aleem, I. "Information, uncertainty and rural credit markets in Pakistan". Thesis, University of Oxford, 1985. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.482927.

4

Zhang, Yanyang. "Second-order effects on uncertainty analysis calculations". Master's thesis, Mississippi State : Mississippi State University, 2002. http://library.msstate.edu/etd/show.asp?etd=etd-10292002-122359.

5

Noronha, Alston Marian. "Information theory approach to quantifying parameter uncertainty in groundwater modeling". Diss., UMK access, 2005.

Abstract
Thesis (M.S.)--School of Computing and Engineering. University of Missouri--Kansas City, 2005.
"A thesis in civil engineering." Typescript. Advisor: Jejung Lee. Vita. Title from "catalog record" of the print edition Description based on contents viewed March 12, 2007. Includes bibliographical references (leaves 96-100). Online version of the print edition.
6

Adams, Carl. "Dealing with uncertainty within information systems development : applying prospect theory". Thesis, University of Southampton, 2002. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.395995.

7

Khiripet, Noppadon. "An architecture for intelligent time series prediction with causal information". Diss., Georgia Institute of Technology, 2001. http://hdl.handle.net/1853/13896.

8

Lu, An. "Processing and management of uncertain information in vague databases /". View abstract or full-text, 2009. http://library.ust.hk/cgi/db/thesis.pl?CSED%202009%20LU.

9

Windholz, Thomas. "Strategies for Handling Spatial Uncertainty due to Discretization". Fogler Library, University of Maine, 2001. http://www.library.umaine.edu/theses/pdf/Windholz.pdf.

10

Cvijanovic, Zoran. "A computer laboratory for generalized information theory (COLGIT)". Diss., Online access via UMI:, 2007.

Abstract
Thesis (Ph. D.)--State University of New York at Binghamton, Department of Systems Science and Industrial Engineering, Thomas J. Watson School of Engineering and Applied Science, 2007.
Includes bibliographical references.
11

van Welbergen, Nikoleta. "Information uncertainty in auction theory / Nikoleta van Welbergen ; Christoph Kuzmics, Frank Riedel". Bielefeld: Universitätsbibliothek Bielefeld, 2016. http://d-nb.info/1122285787/34.

12

Welbergen, Nikoleta van. "Information uncertainty in auction theory / Nikoleta van Welbergen ; Christoph Kuzmics, Frank Riedel". Bielefeld: Universitätsbibliothek Bielefeld, 2016. http://d-nb.info/1122285787/34.

13

Gong, Jian and 龔劍. "Managing uncertainty in schema matchings". Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2011. http://hub.hku.hk/bib/B46076116.

14

Haglind, Carl. "Evaluation and Implementation of Traceable Uncertainty for Threat Evaluation". Thesis, Uppsala universitet, Avdelningen för systemteknik, 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-228106.

Abstract
Threat evaluation is used in various applications to find threatening objects or situations and neutralize them before they cause any damage. To make the threat evaluation as user-friendly as possible, it is important to know where the uncertainties are. The method Traceable Uncertainty can make the threat evaluation process more transparent and hopefully easier to rely on. Traceable Uncertainty is used when different sources of information are combined to find support for the decision making process. The uncertainty of the current information is measured before and after the combination. If the magnitude of uncertainty has changed more than a threshold, a new branch will be created which excludes the new information from the combination of evidence. Traceable Uncertainty has never been tested on any realistic scenario to investigate whether it is possible to implement the method on a large scale system. The hypothesis of this thesis is that Traceable Uncertainty can be used on large scale systems if its threshold parameter is tuned in the right way. Different threshold values were tested when recorded radar data were analyzed for threatening targets. Experiments combining randomly generated evidence were also analyzed for different threshold values. The results showed that a threshold value in the range [0.15, 0.25] generated a satisfactory number of interpretations that were not too similar to each other. The results could also be filtered to remove unnecessary interpretations. This shows that, in this respect and for this data set, Traceable Uncertainty can be used on large scale systems.
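A minimal sketch of the branch-on-uncertainty-change idea might look as follows; the Shannon-entropy uncertainty measure, the naive Bayesian combination step, the threshold value and the two-hypothesis belief vector are all illustrative assumptions, not the implementation evaluated in the thesis.

```python
import numpy as np

def entropy(p):
    """Shannon entropy (bits) of a discrete belief distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def combine(belief, likelihood):
    """Naive Bayesian combination of a belief vector with a new piece of evidence."""
    post = np.asarray(belief) * np.asarray(likelihood)
    return post / post.sum()

def traceable_update(belief, evidence, branches, threshold=0.2):
    """Combine new evidence; if the uncertainty changes by more than the
    threshold, keep a branch that excludes this piece of evidence."""
    combined = combine(belief, evidence)
    if abs(entropy(combined) - entropy(belief)) > threshold:
        branches.append(belief.copy())  # interpretation without the new evidence
    return combined, branches

belief = np.array([0.5, 0.5])           # hypothetical threat / no-threat belief
branches = []
for evidence in (np.array([0.9, 0.1]), np.array([0.4, 0.6])):
    belief, branches = traceable_update(belief, evidence, branches)
print(belief, len(branches))
```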
15

Pryor, Ronald L. "Principles of nonspecificity". Diss., Online access via UMI:, 2007.

Abstract
Thesis (Ph. D.)--State University of New York at Binghamton, Thomas J. Watson School of Engineering and Applied Science, Department of Systems Science and Industrial Engineering, 2007.
Includes bibliographical references.
16

Warren, Adam L. "Sequential decision-making under uncertainty /". *McMaster only, 2004.

17

Lando, Jody Brauner. "Incorporating uncertainty into freshwater habitat restoration /". Thesis, Connect to this title online; UW restricted, 2004. http://hdl.handle.net/1773/5376.

18

Lalmas, Mounia. "Theories of information and uncertainty for the modelling of information retrieval : an application of situation theory and Dempster-Shafer's theory of evidence". Thesis, University of Glasgow, 1996. http://theses.gla.ac.uk/8385/.

Abstract
Current information retrieval models only offer simplistic and specific representations of information. Therefore, there is a need for the development of a new formalism able to model information retrieval systems in a more generic manner. In 1986, Van Rijsbergen suggested that such formalisms can be both appropriately and powerfully defined within a logic. The resulting formalism should capture information as it appears in an information retrieval system, and also in any of its inherent forms. The aim of this thesis is to understand the nature of information in information retrieval, and to propose a logic-based model of an information retrieval system that reflects this nature. The first objective of this thesis is to identify essential features of information in an information retrieval system. These are: flow, intensionality, partiality, structure, significance, and uncertainty. It is shown that the first four features are qualitative, whereas the last two are quantitative, and that their modelling requires different frameworks: a theory of information, and a theory of uncertainty, respectively. The second objective of this thesis is to determine the appropriate framework for each type of feature, and to develop a method to combine them in a consistent fashion. The combination is based on the Transformation Principle. Many specific attempts have been made to derive an adequate definition of information. The one adopted in this thesis is based on that of Dretske, Barwise, and Devlin who claimed that there is a primitive notion of information in terms of which a logic can be defined, and subsequently developed a theory of information, namely Situation Theory. Their approach was in accordance with Van Rijsbergen's suggestion of a logic-based formalism for modelling an information retrieval system. This thesis shows that Situation Theory is best at representing all the qualitative features. Regarding the modelling of the quantitative features of information, this thesis shows that the framework that models them best is the Dempster-Shafer Theory of Evidence, together with the notion of refinement, later introduced by Shafer. The third objective of this thesis is to develop a model of an information retrieval system based on Situation Theory and the Dempster-Shafer Theory of Evidence. This is done in two steps. First, the unstructured model is defined in which the structure and the significance of information are not accounted for. Second, the unstructured model is extended into the structured model, which incorporates the structure and the significance of information. This strategy is adopted because it enables the careful representation of the flow of information to be performed first. The final objective of the thesis is to implement the model and to perform empirical evaluation to assess its validity. The unstructured and the structured models are implemented based on an existing on-line thesaurus, known as WordNet. The experiments performed to evaluate the two models use the National Physical Laboratory standard test collection. The experimental performance obtained was poor, because it was difficult to extract the flow of information from the document set. This was mainly due to the data used in the experimentation which was inappropriate for the test collection. However, this thesis shows that if more appropriate data, for example, indexing tools and thesauri, were available, better performances would be obtained.
The conclusion of this work was that Situation Theory, combined with the Dempster-Shafer Theory of Evidence, allows the appropriate and powerful representation of several essential features of information in an information retrieval system. Although its implementation presents some difficulties, the model is the first of its kind to capture, in a general manner, these features within a uniform framework. As a result, it can be easily generalized to many types of information retrieval systems (e.g., interactive, multimedia systems), or many aspects of the retrieval process (e.g., user modelling).
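As a self-contained illustration of the evidence-combination machinery the Dempster-Shafer Theory of Evidence provides, here is a minimal sketch of Dempster's rule over a small frame of discernment; the frame and the mass assignments are hypothetical and unrelated to the retrieval experiments described above.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule: combine two mass functions defined on frozensets drawn
    from a common frame of discernment, normalising away conflicting mass."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    if conflict >= 1.0:
        raise ValueError("total conflict: the sources cannot be combined")
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

# Hypothetical frame {relevant, irrelevant} with two sources of evidence
m1 = {frozenset({"relevant"}): 0.6,
      frozenset({"relevant", "irrelevant"}): 0.4}   # residual ignorance
m2 = {frozenset({"relevant"}): 0.5,
      frozenset({"irrelevant"}): 0.3,
      frozenset({"relevant", "irrelevant"}): 0.2}
print(dempster_combine(m1, m2))
```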
19

Kistenmacher, Martin. "Reservoir system management under uncertainty". Diss., Georgia Institute of Technology, 2012. http://hdl.handle.net/1853/49012.

Abstract
Reservoir systems are subject to several uncertainties that are the result of imperfect knowledge about system behavior and inputs. A major source of uncertainty arises from the inability to predict future inflows. Fortunately, it is often possible to generate probabilistic forecasts of inflow volumes in the form of probability density functions or ensembles. These inflow forecasts can be coupled with stochastic management models to determine reservoir release policies and provide stakeholders with meaningful information of upcoming system responses such as reservoir levels, releases, flood damage risks, hydropower production, water supply withdrawals, water quality conditions, navigation opportunities, and environmental flows, among others. This information on anticipated system responses is also expressed in the form of forecasts that must reliably represent the actual system behavior when it eventually occurs. The first part of this study presents an assessment methodology that can be used to determine the consistency of ensemble forecasts through the use of relative frequency histograms and minimum spanning trees (MST). This methodology is then used to assess a management model's ability to produce reliable ensemble forecasts. It was found that neglecting to account for hydrologic state variables and improperly modeling the finite management horizon decrease ensemble consistency. Several extensions to the existing management model are also developed and evaluated. The second portion of this study involves the management of the uncertainties in reservoir systems. Traditional management models only find management policies that optimize the expected values of system benefits or costs, thereby not allowing operators and stakeholders to explicitly explore issues related to uncertainty and risk management. A technique that can be used to derive management policies that produce desired probabilistic distributions of reservoir system outputs reflecting stakeholder preferences is developed. This technique can be embedded in a user-interactive framework that can be employed to evaluate the trade-offs and build consensus in multi-objective and multi-stakeholder systems. The methods developed in this dissertation are illustrated in case studies of real reservoir systems, including a seven-reservoir, multi-objective system in California's Central Valley.
20

Wang, Liang and 王亮. "Frequent itemsets mining on uncertain databases". Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2010. http://hub.hku.hk/bib/B4590215X.

21

Froese, Thomas Michael. "Implementing Dempster-Shafer theory for inexact reasoning in expert systems". Thesis, University of British Columbia, 1988. http://hdl.handle.net/2429/28383.

Abstract
The work described in this thesis stems from the idea that expert systems should be able to accurately and appropriately handle uncertain information. The traditional approaches to dealing with uncertainty are discussed and are shown to contain many inadequacies. The Dempster-Shafer, or D-S, theory of evidence is proposed as an appealing theoretical basis for representing uncertain knowledge and for performing inexact reasoning in expert systems. The D-S theory is reviewed in some detail; including its approaches to representing concepts, to representing belief, to combining belief and to performing inference. The D-S implementation approaches pursued by other researchers are described and critiqued. Attempts made early in the thesis research which failed to achieve the important goal of consistency with the D-S theory are also reviewed. Two approaches to implementing D-S theory in a completely consistent manner are discussed in detail. It is shown that the second of these systems, a frame network approach, has led to the development of a fully functional prototype expert system shell called FRO. In this system, concepts are represented using D-S frames of discernment, belief is represented using D-S belief functions, and inference is performed using stored relationships between frames of discernment (forming the frame network) and D-S belief combination rules. System control is accomplished using a discrete rule-based control component and uncertain input and output are performed through an interactive belief interface system called IBIS. Each of these features is reviewed. Finally, a simple but detailed example of an application of a frame network expert system is provided. The FRO system user's documentation is provided in the appendix.
Faculty of Applied Science, Department of Civil Engineering, Graduate.
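To illustrate the D-S quantities such an expert system shell manipulates, a short sketch of reading belief and plausibility off a mass function follows; the frame of discernment and the mass values are hypothetical and are not taken from the FRO system.

```python
def belief(mass, hypothesis):
    """Bel(A): total mass committed to subsets of A."""
    return sum(w for s, w in mass.items() if s <= hypothesis)

def plausibility(mass, hypothesis):
    """Pl(A): total mass that does not contradict A (focal elements meeting A)."""
    return sum(w for s, w in mass.items() if s & hypothesis)

# Hypothetical frame of discernment for a diagnosis task
mass = {
    frozenset({"crack"}): 0.4,
    frozenset({"crack", "corrosion"}): 0.3,
    frozenset({"crack", "corrosion", "none"}): 0.3,  # mass left on the whole frame
}
A = frozenset({"crack"})
print(belief(mass, A), plausibility(mass, A))  # 0.4 1.0
```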
22

Duncan, Scott Joseph. "Including severe uncertainty into environmentally benign life cycle design using information gap-decision theory". Diss., Atlanta, Ga. : Georgia Institute of Technology, 2008. http://hdl.handle.net/1853/22540.

Abstract
Thesis (Ph. D.)--Mechanical Engineering, Georgia Institute of Technology, 2008.
Committee Chair: Bras, Bert; Committee Member: Allen, Janet; Committee Member: Chameau, Jean-Lou; Committee Member: McGinnis, Leon; Committee Member: Paredis, Chris.
23

Salim, Farzad. "Approaches to access control under uncertainty". Thesis, Queensland University of Technology, 2012. https://eprints.qut.edu.au/58408/1/Farzad_Salim_Thesis.pdf.

Abstract
The ultimate goal of an access control system is to allocate each user the precise level of access they need to complete their job - no more and no less. This proves to be challenging in an organisational setting. On one hand employees need enough access to the organisation’s resources in order to perform their jobs and on the other hand more access will bring about an increasing risk of misuse - either intentionally, where an employee uses the access for personal benefit, or unintentionally, through carelessness or being socially engineered to give access to an adversary. This thesis investigates issues of existing approaches to access control in allocating the optimal level of access to users and proposes solutions in the form of new access control models. These issues are most evident when uncertainty surrounding users’ access needs, incentive to misuse and accountability are considered, hence the title of the thesis. We first analyse access control in environments where the administrator is unable to identify the users who may need access to resources. To resolve this uncertainty an administrative model with delegation support is proposed. Further, a detailed technical enforcement mechanism is introduced to ensure delegated resources cannot be misused. Then we explicitly consider that users are self-interested and capable of misusing resources if they choose to. We propose a novel game theoretic access control model to reason about and influence the factors that may affect users’ incentive to misuse. Next we study access control in environments where neither users’ access needs can be predicted nor they can be held accountable for misuse. It is shown that by allocating a budget to users, a virtual currency through which they can pay for the resources they deem necessary, the need for a precise pre-allocation of permissions can be relaxed. The budget also imposes an upper-bound on users’ ability to misuse. A generalised budget allocation function is proposed and it is shown that given the context information the optimal level of budget for users can always be numerically determined. Finally, the Role Based Access Control (RBAC) model is analysed under the explicit assumption of administrators’ uncertainty about self-interested users’ access needs and their incentives to misuse. A novel Budget-oriented Role Based Access Control (B-RBAC) model is proposed. The new model introduces the notion of users’ behaviour into RBAC and provides means to influence users’ incentives. It is shown how RBAC policy can be used to individualise the cost of access to resources and also to determine users’ budget. The implementation overheads of B-RBAC are examined and several low-cost sub-models are proposed.
24

Bhatt, Chinmay P. "Assessment of uncertainty in equivalent sand grain roughness methods". Birmingham, Ala. : University of Alabama at Birmingham, 2007. http://www.mhsl.uab.edu/dt/2007m/bhatt.pdf.

25

Cheong, Tae Su. "Value of information and supply uncertainty in supply chains". Diss., Georgia Institute of Technology, 2011. http://hdl.handle.net/1853/42725.

Abstract
This dissertation focuses on topics related to the value of real-time information and/or to supply uncertainties due to uncertain lead-times and yields in supply chains. The first two of these topics address issues associated with freight transportation, while the remaining two topics are concerned with inventory replenishment. We first assess the value of dynamic tour determination for the traveling salesman problem (TSP). Given a network with traffic dynamics that can be modeled as a Markov chain, we present a policy determination procedure that optimally builds a tour dynamically. We then explore the potential for expected total travel cost reduction due to dynamic tour determination, relative to two a priori tour determination procedures. Second, we consider the situation where the decision to continue or abort transporting perishable freight from an origin to a destination can be made at intermediate locations, based on real-time freight status monitoring. We model the problem as a partially observed Markov decision process (POMDP) and develop an efficient procedure for determining an optimal policy. We determine structural characteristics of an optimal policy and upper and lower bounds on the optimal reward function. Third, we analyze a periodic review inventory control problem with lost sales and random yields and present conditions that guarantee the existence of an optimal policy having a so-called staircase structure. We make use of this structure to accelerate both value iteration and policy evaluation. Lastly, we examine a model of inventory replenishment where both lead time and supply qualities are uncertain. We model this problem as an MDP and show that the weighted sum of inventory in transit and inventory at the destination is a sufficient statistic, assuming that random shrinkage can occur from the origin to the supply system or destination, shrinkage is deterministic within the supply system and from the supply system to the destination, and no shrinkage occurs once goods reach the destination.
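The inventory topics above rest on Markov decision process machinery; as a generic illustration (not the staircase-structured algorithms developed in the dissertation), here is a minimal value-iteration sketch for a small finite MDP, with a hypothetical two-state, two-action replenishment example.

```python
import numpy as np

def value_iteration(P, R, gamma=0.95, tol=1e-8):
    """Plain value iteration for a finite MDP.
    P[a][s, s'] is the transition probability under action a,
    R[a][s] the expected one-step reward for taking action a in state s."""
    n_actions, n_states = len(P), P[0].shape[0]
    V = np.zeros(n_states)
    while True:
        Q = np.array([R[a] + gamma * P[a] @ V for a in range(n_actions)])
        V_new = Q.max(axis=0)
        if np.max(np.abs(V_new - V)) < tol:
            return V_new, Q.argmax(axis=0)  # state values and a greedy policy
        V = V_new

# Hypothetical two-state, two-action replenishment example (order / do not order)
P = [np.array([[0.9, 0.1], [0.2, 0.8]]), np.array([[0.5, 0.5], [0.1, 0.9]])]
R = [np.array([1.0, -1.0]), np.array([0.5, 0.0])]
values, policy = value_iteration(P, R)
print(values, policy)
```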
26

Herner, Alan Eugene. "Measuring Uncertainty of Protein Secondary Structure". Wright State University / OhioLINK, 2011. http://rave.ohiolink.edu/etdc/view?acc_num=wright1302305875.

27

Shrestha, Govinda B. "Formulation and analysis of a probabilistic uncertainty evaluation technique". Diss., Virginia Tech, 1990. http://hdl.handle.net/10919/39846.

28

Hendricks, Michael D. "Structuring a Wayfinder's Dynamic and Uncertain Environment". Fogler Library, University of Maine, 2004. http://www.library.umaine.edu/theses/pdf/HendricksMD2004.pdf.

29

Chen, Xingyuan. "Investigating third-order polynomial normal transform and its applications to uncertainty and reliability analyses /". View Abstract or Full-Text, 2002. http://library.ust.hk/cgi/db/thesis.pl?CIVL%202002%20CHEN.

Abstract
Thesis (M. Phil.)--Hong Kong University of Science and Technology, 2002.
Includes bibliographical references (leaves 192-195). Also available in electronic version. Access restricted to campus users.
30

Lian, Xiang. "Efficient query processing over uncertain data /". View abstract or full-text, 2009. http://library.ust.hk/cgi/db/thesis.pl?CSED%202009%20LIAN.

31

Drougard, Nicolas. "Exploiting imprecise information sources in sequential decision making problems under uncertainty". Thesis, Toulouse, ISAE, 2015. http://www.theses.fr/2015ESAE0037/document.

Abstract
Partially Observable Markov Decision Processes (POMDPs) define a useful formalism to express probabilistic sequential decision problems under uncertainty. When this model is used for a robotic mission, the system is defined as the features of the robot and its environment needed to express the mission. The system state is not directly seen by the agent (the robot). Solving a POMDP thus consists in computing a strategy which, on average, achieves the mission best, i.e. a function mapping the information known by the agent to an action. Some practical issues of the POMDP model are first highlighted in the robotic context: they concern the modeling of the agent's ignorance, the imprecision of the observation model and the complexity of solving real-world problems. A counterpart of the POMDP model, called pi-POMDP, simplifies uncertainty representation with a qualitative evaluation of event plausibilities. It comes from Qualitative Possibility Theory, which provides the means to model imprecision and ignorance. After a formal presentation of the POMDP and pi-POMDP models, an update of the possibilistic model is proposed. Next, the study of factored pi-POMDPs allows an algorithm named PPUDD to be set up which uses Algebraic Decision Diagrams to solve large structured planning problems. Strategies computed by PPUDD, which have been tested in the context of the IPPC 2014 competition, can be more efficient than those produced by probabilistic solvers when the model is imprecise or for high-dimensional problems. This thesis proposes some ways of using Qualitative Possibility Theory to improve computation time and uncertainty modeling in practice.
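A minimal sketch of the belief-state update that standard probabilistic POMDP solvers maintain is shown below; this is a plain Bayes filter, not the possibilistic pi-POMDP update proposed in the thesis, and the two-state transition and observation matrices are hypothetical.

```python
import numpy as np

def belief_update(b, T, O, obs):
    """Bayes filter for a POMDP belief state.
    b: current belief over hidden states; T[s, s']: transition probabilities for
    the chosen action; O[s', o]: observation probabilities; obs: observed index."""
    predicted = T.T @ b                # predict the state after acting
    updated = O[:, obs] * predicted    # weight by the likelihood of the observation
    return updated / updated.sum()     # renormalise to a proper belief

T = np.array([[0.8, 0.2], [0.3, 0.7]])   # hypothetical 2-state dynamics
O = np.array([[0.9, 0.1], [0.4, 0.6]])   # hypothetical observation model
b = np.array([0.5, 0.5])
print(belief_update(b, T, O, obs=0))
```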
32

McCormick, David Jeremy. "Distributed uncertainty analysis techniques for conceptual launch vehicle design". Diss., Georgia Institute of Technology, 2001. http://hdl.handle.net/1853/12892.

33

Xie, Xike and 谢希科. "Evaluating nearest neighbor queries over uncertain databases". Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2012. http://hub.hku.hk/bib/B4784954X.

Abstract
Nearest Neighbor (NN for short) queries are important in emerging applications, such as wireless networks, location-based services, and data stream applications, where the data obtained are often imprecise. The imprecision or imperfection of the data sources is modeled by uncertain data in recent research works. Handling uncertainty is important because this issue affects the quality of query answers. Although queries on uncertain data are useful, evaluating the queries on them can be costly, in terms of I/O or computational efficiency. In this thesis, we study how to efficiently evaluate NN queries on uncertain data. Given a query point q and a set of uncertain objects O, the possible nearest neighbor query returns a set of candidates which have non-zero probabilities of being the query answer. It is also interesting to ask "which region has the same set of possible nearest neighbors" and "which region has one specific object as its possible nearest neighbor". To reveal the relationship between the query space and nearest neighbor answers, we propose the UV-diagram, where the query space is split into disjoint partitions, such that each partition is associated with a set of objects. If a query point is located inside the partition, its possible nearest neighbors can be directly retrieved. However, the number of such partitions is exponential and the construction effort can be expensive. To tackle this problem, we propose an alternative concept, called UV-cell, and efficient algorithms for constructing it. The UV-cell has an irregular shape, which incurs difficulties in storage, maintenance, and query evaluation. We design an index structure, called UV-index, which is an approximated version of the UV-diagram. Extensive experiments show that the UV-index can efficiently answer different variants of NN queries, such as Probabilistic Nearest Neighbor Queries and Continuous Probabilistic Nearest Neighbor Queries. Another problem studied in this thesis is the trajectory nearest neighbor query. Here the query point is restricted to a pre-known trajectory. In applications (e.g. monitoring potential threats along a flight/vessel's trajectory), it is useful to derive nearest neighbors for all points on the query trajectory. Simple solutions, such as sampling or approximating the locations of uncertain objects as points, fail to achieve good query quality. To handle this problem, we design efficient algorithms and optimization methods for this query. Experiments show that our solution can efficiently and accurately answer this query. Our solution is also scalable to large datasets and long trajectories.
Doctor of Philosophy, Computer Science.
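The UV-diagram construction is the thesis's own contribution; as a much simpler illustration of what a possible nearest neighbor query returns, here is a Monte Carlo sketch over hypothetical uncertain objects, each modelled (only for this example) as a Gaussian around a reported 2-D location.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical uncertain objects: reported 2-D location and positional std dev
objects = {"a": ((0.0, 0.0), 0.5), "b": ((1.0, 0.2), 0.5), "c": ((3.0, 3.0), 0.5)}

def possible_nn(query, objects, n_samples=5000):
    """Estimate, for each object, the probability that it is the nearest
    neighbor of the query point, by sampling plausible true locations."""
    query = np.asarray(query, dtype=float)
    wins = dict.fromkeys(objects, 0)
    for _ in range(n_samples):
        dists = {name: np.linalg.norm(rng.normal(loc, sd, size=2) - query)
                 for name, (loc, sd) in objects.items()}
        wins[min(dists, key=dists.get)] += 1
    return {name: count / n_samples for name, count in wins.items()}

# Objects with non-zero estimated probability are the possible nearest neighbors
print(possible_nn((0.5, 0.0), objects))
```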
34

Hu, Zhiji. "Statistical approach toward designing expert system". Virtual Press, 1988. http://liblink.bsu.edu/uhtbin/catkey/539812.

Abstract
Inference under uncertainty plays a crucial role in expert systems and receives growing attention from artificial intelligence experts, statisticians, and psychologists. In searching for new satisfactory ways to model inference under uncertainty, it will be necessary to combine the efforts of researchers from different areas. It is expected that deeper insight into this crucial problem will not only have an enormous impact on the development of AI and expert systems, but also bring classical areas like statistics into a new stage. This research paper gives a precise synopsis of present work in the field and explores the mechanics of statistical inference to a new depth by combining the efforts of computer scientists, statisticians, and psychologists. One important part of the paper is the comparison of different paradigms, including the difference between statistical and logical views. Special attention that needs to be paid when combining various methods is considered in the paper. Also, some examples and counterexamples will be given to illustrate the availability of individual models which describe human behavior. Finally, a new framework to deal with uncertainty is proposed, and future trends of uncertainty management are projected.
Department of Mathematical Sciences
35

Mantis, George C. "Quantification and propagation of disciplinary uncertainty via bayesian statistics". Diss., Georgia Institute of Technology, 2002. http://hdl.handle.net/1853/12136.

36

Smith, Barbara S. "Uncertainty reasoning and representation: a comparison of several alternative approaches /". Online version of thesis, 1990. http://hdl.handle.net/1850/10580.

37

Sui, Liqi. "Uncertainty management in parameter identification". Thesis, Compiègne, 2017. http://www.theses.fr/2017COMP2330/document.

Abstract
In order to obtain more predictive and accurate simulations of mechanical behaviour in the practical environment, more and more complex material models have been developed. Nowadays, the characterization of material properties remains a top-priority objective. It requires dedicated identification methods and tests in conditions as close as possible to the real ones. This thesis aims at developing an effective identification methodology to find the material property parameters, taking advantage of all available information. The information used for the identification is theoretical, experimental, and empirical: the theoretical information is linked to the mechanical models whose uncertainty is epistemic; the experimental information consists in the full-field measurement whose uncertainty is aleatory; the empirical information is related to the prior information, with epistemic uncertainty as well. The main difficulty is that the available information is not always reliable and its corresponding uncertainty is heterogeneous. This difficulty is overcome by the introduction of the theory of belief functions. By offering a general framework to represent and quantify the heterogeneous uncertainties, the performance of the identification is improved. A strategy based on belief functions is proposed to identify macro and micro elastic properties of multi-structure materials. In this strategy, model and measurement uncertainties are analysed and quantified. This strategy is subsequently developed to take prior information into consideration and quantify its corresponding uncertainty.
38

Moore, Alana L. "Managing populations in the face of uncertainty : adaptive management, partial observability and the dynamic value of information /". Connect to thesis, 2008. http://repository.unimelb.edu.au/10187/3676.

Abstract
The work presented in this thesis falls naturally into two parts. The first part (Chapter 2) is concerned with the benefit of perturbing a population into an immediately undesirable state, in order to improve estimates of a static probability which may improve long-term management. We consider finding the optimal harvest policy for a theoretical harvested population when a key parameter is unknown. We employ an adaptive management framework to study when it is worth sacrificing short-term rewards in order to increase long-term profits.
Active adaptive management has been increasingly advocated in natural resource management and conservation biology as a methodology for resolving key uncertainties about population dynamics and responses to management. However, when comparing management policies it is traditional to weigh future rewards geometrically (at a constant discount rate) which results in far-distant rewards making a negligible contribution to the total benefit. Under such a discounting scheme active adaptive management is rarely of much benefit, especially if learning is slow. In Chapter 2, we consider two proposed alternative forms of discounting for evaluating optimal policies for long term decisions which have a social component.
We demonstrate that discount functions which weigh future rewards more heavily result in more conservative harvesting strategies, but do not necessarily encourage active learning. Furthermore, the optimal management strategy is not equivalent to employing geometric discounting at a lower rate. If alternative discount functions are made mandatory in calculating optimal management policies for environmental management, then this will affect the structure of optimal management regimes and change when and how much we are willing to invest in learning.
The second part of this thesis is concerned with how to account for partial observability when calculating optimal management policies. We consider the problem of controlling an invasive pest species when only partial observations are available at each time step. In the model considered, the monitoring data available are binomial observations of a probability which is an index of the population size. We are again concerned with estimating a probability; however, in this model the probability is changing over time.
Before including partial observability explicitly, we consider a model in which perfect observations of the population are available at each time step (Chapter 3). It is intuitive that monitoring will be beneficial only if the management decision depends on the outcome. Hence, a necessary condition for monitoring to be worthwhile is that control polices which are specified in terms of the system state, out-perform simpler time-based control policies. Consequently, in addition to providing a benchmark against which we can compare the optimal management policy in the case of partial observations, analysing the perfect observation case also provides insight into when monitoring is likely to be most valuable.
In Chapters 4 and 5 we include partial observability by modelling the control problem as a partially observable Markov decision process (POMDP). We outline several tests which stem from a property of conservation of expected utility under monitoring, which aid in validating the model. We discuss the optimal management policy prescribed by the POMDP for a range of model scenarios, and use simulation to compare the POMDP management policy to several alternative policies, including controlling with perfect observations and no observations.
In Chapter 6 we propose an alternative model, developed in the spirit of a POMDP, that does not strictly satisfy the definition of a POMDP. We find that although the second model has some conceptually appealing attributes, it makes an undesirable implicit assumption about the underlying population dynamics.
39

Calanni, Fraccone Giorgio M. "Bayesian networks for uncertainty estimation in the response of dynamic structures". Diss., Atlanta, Ga. : Georgia Institute of Technology, 2008. http://hdl.handle.net/1853/24714.

Abstract
Thesis (Ph.D.)--Aerospace Engineering, Georgia Institute of Technology, 2009.
Committee Chair: Dr. Vitali Volovoi; Committee Co-Chair: Dr. Massimo Ruzzene; Committee Member: Dr. Andrew Makeev; Committee Member: Dr. Dewey Hodges; Committee Member: Dr. Peter Cento
40

Wu, Yuan. "The momentum premium under the influence of information uncertainty : evidence from the Chinese stock market". Thesis, University of Southampton, 2012. https://eprints.soton.ac.uk/341447/.

Abstract
From this study, we find that the momentum premia are universally positive and statistically significant across 16 different momentum trading strategies in the Chinese Class A share market. By defining the time periods following UP and DOWN market states according to prior 12 or 24-month average Chinese Class A share market returns, we show that the momentum premia of different momentum strategies over time periods following the UP market state eclipse those found over time periods following the DOWN market state in the Chinese Class A share market for the whole sample period from January 1996 to December 2008. Furthermore, by employing 7 different factors—firm size, firm age, analysts’ coverage, return volatility, dispersion in analysts’ earnings forecasts, trading volume, and the quality/strength of corporate governance (free float ratio)—to gauge the degree of firm-level information uncertainty, we show that information uncertainty has an amplifying effect on the momentum premium, and that the amplifying effect is more pronounced over time periods following the DOWN market state. The results from the sub-period analysis revolving around the inception of two Chinese financial market regulatory reforms—1) the implementation of the new P.R.C. security law on July 1st, 1999, and 2) the opening of the Chinese Class A share market to qualified foreign institutional investors (QFII) on July 3rd, 2003—dismiss the doubt that our findings could be specific to the sample period. Compared with the traditional FF3F model, the Wang & Xu (2004) version of the FF3F model, with the value effect factor of the traditional FF3F model supplanted by the residual free float ratio (a proxy for the quality/strength of firm-specific corporate governance), exhibits more explanatory power over the momentum premia yet still fails to fully rationalize the momentum premia found in this study. This research fills the gap in the literature and expands the understanding of the momentum premium by offering empirical evidence of the dynamics of the momentum premia amid market swings, the impact of information uncertainty on momentum premia, as well as the impact of information uncertainty on momentum premia amid market swings in the context of the Chinese stock market. The results from this study can potentially provide an important reference point for international and domestic investors in adjusting investment strategies and portfolio positions, or fishing for investment diversification opportunities in a financial market with volatile market conditions such as the Chinese stock market.
41

Sun, Liwen and 孙理文. "Mining uncertain data with probabilistic guarantees". Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2010. http://hub.hku.hk/bib/B45705392.

42

Murphy, David. "Predicting Effects of Artificial Recharge using Groundwater Flow and Transport Models with First Order Uncertainty Analysis". Thesis, The University of Arizona, 1997. http://etd.library.arizona.edu/etd/GetFileServlet?file=file:///data1/pdf/etd/azu_etd_hy0122_sip1_w.pdf&type=application/pdf.

43

Arafat, Samer M. "Uncertainty modeling for classification and analysis of medical signals /". free to MU campus, to others for purchase, 2003. http://wwwlib.umi.com/cr/mo/fullcit?p3115520.

44

Bicker, Marcelle M. "A toolkit for uncertainty reasoning and representation using fuzzy set theory in PROLOG expert systems /". Online version of thesis, 1987. http://hdl.handle.net/1850/10294.

45

Ioannou, Georgios. "The Markov multi-phase transferable belief model : a data fusion theory for enhancing cyber situational awareness". Thesis, Brunel University, 2015. http://bura.brunel.ac.uk/handle/2438/13742.

Abstract
eXfiltration Advanced Persistent Threats (XAPTs) increasingly account for incidents concerned with critical information exfiltration from High Valued Targets (HVT's) by terrorists, cyber criminals or enemy states. Existing Cyber Defence frameworks and data fusion models do not adequately address (i) the multi-stage nature of XAPTs and (ii) the uncertainty and conflicting information associated with XAPTs. A new data fusion theory, called the Markov Multi-phase Transferable Belief Model (MM-TBM) is developed, for tracking and predicting XAPTs. MM-TBM expands the attack kill-chain model to attack trees and introduces a novel approach for combining various sources of cyber evidence, which takes into account the multi-phased nature of XAPTs and the characteristics of the cyberspace. As a data fusion theory, MM-TBM constitutes a novel approach for performing hypothesis assessment and evidence combination across phases, by means of a new combination rule, called the Multi-phase Combination Rule with conflict Reset (MCR2). This is the first combination rule in the field of data fusion that formalises a new method for combining evidence from multiple, causally connected hypotheses spaces and eliminating the bias from preceding phases of the kill-chain. Moreover, this is the first time a data fusion theory utilises the conflict mass m(Ø) for identifying paradoxes. In addition, a diagnostic formula for managing missing pieces of evidence within attack trees is presented. MM-TBM is designed, developed and evaluated using a Design Science Research approach within two iterations. Evaluation is conducted in a relevant computer network environment using scenario-based testing. The experimental design has been reviewed and approved by Cyber Security Subject Matter Experts from MoD’s Defence Science Technology Laboratory and Airbus Group. The experimental results validate the novel capabilities introduced by the new MM-TBM theory to Cyber Defence in the presence of information clutter, conflict and congestion. Furthermore, the results underpin the importance of selecting an optimal sampling policy to effectively track and predict XAPTs. This PhD bridges the gaps in the body of knowledge concerned with multi-phase fusion under uncertainty and Cyber SA against XAPTs. MM-TBM is a novel mathematical fusion theory for managing applications that existing fusion models do not address. This research has demonstrated MM-TBM enables the successful Tracking and Prediction of XAPTs to deliver an enhanced Cyber SA capability.
46

Mangalpally, Sharat C. "Assessment of integrity of reasoning in large-scale decision systems application to public transit investment project evaluation /". Access to citation, abstract and download form provided by ProQuest Information and Learning Company; downloadable PDF file 1.07Mb, 127 p, 2005. http://wwwlib.umi.com/dissertations/fullcit/1428262.

47

Xiang, Gang. "Fast algorithms for computing statistics under interval uncertainty with applications to computer science and to electrical and computer engineering /". To access this resource online via ProQuest Dissertations and Theses @ UTEP, 2007. http://0-proquest.umi.com.lib.utep.edu/login?COPT=REJTPTU0YmImSU5UPTAmVkVSPTI=&clientId=2515.

48

Kennedy, Joseph L. "Force control of a hydraulic servo system". Diss., Columbia, Mo. : University of Missouri--Columbia, 2009. http://hdl.handle.net/10355/6582.

Abstract
The entire thesis text is included in the research.pdf file; the official abstract appears in the short.pdf file; a non-technical public abstract appears in the public.pdf file. Title from PDF of title page (University of Missouri--Columbia, viewed on November 18, 2009). Thesis advisor: Dr. Roger Fales. Includes bibliographical references.
49

Melin, Alexander M. "On direct adaptive control of a class of nonlinear scalar systems /". free to MU campus, to others for purchase, 2003. http://wwwlib.umi.com/cr/mo/fullcit?p1418051.

50

McInerney, Robert E. "Decision making under uncertainty". Thesis, University of Oxford, 2014. http://ora.ox.ac.uk/objects/uuid:a34e87ad-8330-42df-8ba6-d55f10529331.

Abstract
Operating and interacting in an environment requires the ability to manage uncertainty and to choose definite courses of action. In this thesis we look to Bayesian probability theory as the means to achieve the former, and find that through rigorous application of the rules it prescribes we can, in theory, solve problems of decision making under uncertainty. Unfortunately such methodology is intractable in real-world problems, and thus approximation of one form or another is inevitable. Many techniques make use of heuristic procedures for managing uncertainty. We note that such methods suffer unreliable performance and rely on the specification of ad-hoc variables. Performance is often judged according to long-term asymptotic performance measures, which we also believe ignore the most complex and relevant parts of the problem domain. We therefore look to develop principled approximate methods that preserve the meaning of Bayesian theory but operate with the scalability of heuristics. We start doing this by looking at function approximation in continuous state and action spaces using Gaussian Processes. We develop a novel family of covariance functions which allow tractable inference methods to accommodate some of the uncertainty lost by not following full Bayesian inference. We also investigate the exploration versus exploitation tradeoff in the context of the Multi-Armed Bandit, and demonstrate that principled approximations behave close to optimal behaviour and perform significantly better than heuristics on a range of experimental test beds.
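The exploration-versus-exploitation tradeoff in the Multi-Armed Bandit can be illustrated with a short Thompson-sampling sketch; this is a standard Bayesian heuristic offered as a generic example, not the specific methods developed in the thesis, and the Bernoulli arm probabilities are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)
true_p = [0.3, 0.5, 0.7]              # hypothetical Bernoulli arm probabilities
alpha = np.ones(len(true_p))          # Beta posterior: successes + 1
beta = np.ones(len(true_p))           # Beta posterior: failures + 1

total_reward = 0
for t in range(2000):
    samples = rng.beta(alpha, beta)   # sample a plausible payout rate per arm
    arm = int(np.argmax(samples))     # exploit what currently looks best...
    reward = rng.random() < true_p[arm]   # ...while posterior spread drives exploration
    alpha[arm] += reward
    beta[arm] += 1 - reward
    total_reward += reward

print(total_reward, alpha / (alpha + beta))   # posterior mean per arm
```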