Theses on the topic "Decision refinement"
Create an accurate reference in APA, MLA, Chicago, Harvard, and several other styles
Consult the 18 best theses for your research on the topic "Decision refinement".
Next to each source in the list of references there is an "Add to bibliography" button. Click this button, and we will automatically generate the bibliographic reference for the chosen source in your preferred citation style: APA, MLA, Harvard, Vancouver, Chicago, etc.
You can also download the full text of the scholarly publication as a PDF and read its abstract online whenever this information is included in the metadata.
Browse theses on a wide variety of disciplines and organize your bibliography correctly.
Aphale, Mukta S. « Intelligent agent support for policy authoring and refinement ». Thesis, University of Aberdeen, 2015. http://digitool.abdn.ac.uk:80/webclient/DeliveryManager?pid=225826.
Full text
Ramachandran, Sowmya. « Theory refinement of Bayesian networks with hidden variables / ». Digital version accessible at:, 1998. http://wwwlib.umi.com/cr/utexas/main.
Full text
Mazzarella, Fabio. « The Unlucky broker ». Doctoral thesis, Universita degli studi di Salerno, 2012. http://hdl.handle.net/10556/365.
Full text
This dissertation collects results of work on the interpretation, characterization and quantification of a novel topic in the field of detection theory, the Unlucky Broker problem, and its asymptotic extension. The same problem can also be applied to the context of Wireless Sensor Networks (WSNs). Suppose that a WSN is engaged in a binary detection task. Each node of the system collects measurements about the state of nature (H0 or H1) to be discovered. A common fusion center receives the observations from the sensors and implements an optimal test (for example, in the Bayesian sense), exploiting its knowledge of the a priori probabilities of the hypotheses. Later, the priors used in the test are revealed to be inaccurate and a refined pair is made available. Unfortunately, at that time, only a subset of the original data is still available, along with the original decision. In the thesis, we formulate the problem in statistical terms and consider a system of n sensors engaged in a binary detection task. A successive reduction of the data set's cardinality occurs and multiple refinements are required. The sensors are devices programmed to take the decision from the previous node in the chain and the available data, implement some simple test to decide between the hypotheses, and forward the resulting decision to the next node. The first part of the thesis shows that the optimal test is very difficult to implement even with only two nodes (the unlucky broker problem), because of the strong correlation between the available data and the decision coming from the previous node. Then, to make the designed detector implementable in practice and to ensure analytical tractability, we consider suboptimal local tests.
We choose a simple local decision strategy, following the rationale behind the optimal detector that solves the unlucky broker problem: a decision in favor of H0 is always retained by the current node, while when the decision of the previous node is in favor of H1, a local log-likelihood based test is implemented. The main result is that, asymptotically, if we set the false alarm probability of the first node (the one observing the full data set), the false alarm probability decreases along the chain and is non-zero at the last stage. Moreover, very surprisingly, the miss detection probability decays exponentially fast with the square root of the number of nodes, and we provide its closed-form exponent by exploiting tools from random processes and information theory. [edited by the author]
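The keep-H0, retest-H1 rule described in this abstract can be sketched as follows. This is a toy illustration under assumed Gaussian likelihoods; the means, variance and threshold are invented for the example, not taken from the thesis:

```python
def local_llr(samples, mu0=0.0, mu1=1.0, sigma=1.0):
    # Log-likelihood ratio log p(x|H1)/p(x|H0) for i.i.d. Gaussian samples
    # with common variance (an illustrative observation model).
    return sum(((x - mu0) ** 2 - (x - mu1) ** 2) / (2 * sigma ** 2)
               for x in samples)

def refine_decision(prev_decision, samples, threshold=0.0):
    # One node in the chain: a decision for H0 (0) is always retained;
    # a decision for H1 (1) triggers a local log-likelihood test on the
    # (reduced) data still available at this node.
    if prev_decision == 0:
        return 0
    return 1 if local_llr(samples) > threshold else 0
```

Chaining `refine_decision` over successively smaller subsets of the data mimics the successive refinements studied in the thesis.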
Sarigul, Erol. « Interactive Machine Learning for Refinement and Analysis of Segmented CT/MRI Images ». Diss., Virginia Tech, 2004. http://hdl.handle.net/10919/25954.
Full text
Ph. D.
Arrufat, Ondina. « The refinement and validation of the critical decision making and problem solving scale moral dilema (CDP-MD) ». FIU Digital Commons, 1995. http://digitalcommons.fiu.edu/etd/1426.
Full text
Wolf, Lisa Adams. « Testing and refinement of an integrated, ethically-driven environmental model of clinical decision-making in emergency settings ». Thesis, Boston College, 2011. http://hdl.handle.net/2345/2224.
Full text
Thesis advisor: Pamela J. Grace
The purpose of the study was to explore the relationships among multiple variables within a model of critical thinking and moral reasoning, and to support and refine the elements that significantly correlate with accuracy in clinical decision-making. Background: Research to date has identified multiple factors that are integral to clinical decision-making. The interplay among the suggested elements of the decision-making process particular to the nurse, the patient, and the environment remains unknown. Determining the clinical usefulness and predictive capacity of an integrated, ethically driven environmental model of decision making (IEDEM-CD) in emergency settings in facilitating accuracy in problem identification is critical to initial interventions and to safe, cost-effective, quality patient care outcomes. Extending the literature on accuracy and clinical decision making can inform utilization, the determination of staffing ratios, and the development of evidence-driven care models. Methodology: The study used a quantitative descriptive correlational design to examine the relationships between multiple variables within the IEDEM-CD model. A purposive sample of emergency nurses was recruited, resulting in a sample size of 200, calculated to yield a power of 0.80, a significance level of .05, and a moderate effect size. The dependent variable, accuracy in clinical decision-making, was measured by scores on clinical vignettes. The independent variables of moral reasoning, perceived environment of care, age, gender, certification in emergency nursing, educational level, and years of experience in emergency nursing were measured by the Defining Issues Test, version 2, the Revised Professional Practice Environment scale, and a demographic survey. These instruments were identified to test and refine the elements within the IEDEM-CD model. Data collection occurred via internet survey over a one-month period.
Rest's Defining Issues Test, version 2 (DIT-2), the Revised Professional Practice Environment tool (RPPE), the clinical vignettes, and a demographic survey were made available as an internet survey package using Qualtrics. Data from each participant were scored and entered into a PASW database. The analysis plan included bivariate correlation analysis using Pearson's product-moment correlation coefficients, followed by chi-square and multiple linear regression analysis. Findings: The elements identified in the IEDEM-CD model supported moral reasoning and environment of care as factors significantly affecting accuracy in decision-making. In complex clinical situations, higher levels of moral reasoning significantly affected accuracy in problem identification. Attributes of the environment of care, including teamwork, communication about patients, and control over practice, also significantly affected nurses' critical cue recognition and selection of appropriate interventions. The results supported the conceptualization of the IEDEM-CD model and its usefulness as a framework for predicting clinical decision-making accuracy for emergency nurses in practice, with further implications for education, research, and policy.
Thesis (PhD) — Boston College, 2011
Submitted to: Boston College. Connell School of Nursing
Discipline: Nursing
Raghavan, Venkatesh. « Supporting Multi-Criteria Decision Support Queries over Disparate Data Sources ». Digital WPI, 2012. https://digitalcommons.wpi.edu/etd-dissertations/120.
Full text
Darracott, Rosalyn M. « The development and refinement of the practice domain framework as a conceptual tool for understanding and guiding social care practice ». Thesis, Queensland University of Technology, 2015. https://eprints.qut.edu.au/86048/15/86048.pdf.
Full text
Molinari, David U. « A psychometric examination and refinement of the Canadian Forces Attrition Information Questionnaire, CFAIQ, comparing the reasons cited by anglophones and francophones in the Leave decision process ». Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1997. http://www.collectionscanada.ca/obj/s4/f2/dsk2/ftp04/mq20843.pdf.
Full text
El, Khalfi Zeineb. « Lexicographic refinements in possibilistic sequential decision-making models ». Thesis, Toulouse 3, 2017. http://www.theses.fr/2017TOU30269/document.
Full text
This work contributes to possibilistic decision theory, and more specifically to sequential decision-making under possibilistic uncertainty, at both the theoretical and practical levels. Even though it is appealing for its ability to handle qualitative decision problems, possibilistic decision theory suffers from an important drawback: qualitative possibilistic utility criteria compare acts through min and max operators, which leads to a drowning effect. To overcome this lack of decision power, several refinements have been proposed in the literature. Lexicographic refinements are particularly appealing since they allow one to benefit from the expected utility background while remaining "qualitative". However, these refinements are defined for non-sequential decision problems only. In this thesis, we present results on the extension of lexicographic preference relations to sequential decision problems, in particular to possibilistic decision trees and Markov decision processes. This leads to new planning algorithms that are more "decisive" than their original possibilistic counterparts. We first present optimistic and pessimistic lexicographic preference relations between policies, with and without intermediate utilities, that refine the optimistic and pessimistic qualitative utilities respectively. We prove that these new criteria satisfy the principle of Pareto efficiency as well as the property of strict monotonicity. The latter guarantees that a dynamic programming algorithm can be used to calculate lexicographic optimal policies. Considering the problem of policy optimization in possibilistic decision trees and finite-horizon Markov decision processes, we provide adaptations of the dynamic programming algorithm that calculate a lexicographic optimal policy in polynomial time. These algorithms are based on the lexicographic comparison of the matrices of trajectories associated with the sub-policies.
This algorithmic work is complemented by an experimental study that shows the feasibility and interest of the proposed approach. We then prove that the lexicographic criteria still benefit from an expected utility grounding and can be represented by infinitesimal expected utilities. The last part of our work is devoted to policy optimization in (possibly infinite) stationary Markov decision processes. We propose a value iteration algorithm for the computation of lexicographic optimal policies and extend these results to the infinite-horizon case. Since the size of the matrices increases exponentially (which is especially problematic in the infinite-horizon case), we propose an approximation algorithm that keeps the most interesting part of each matrix of trajectories, namely the first lines and columns. Finally, we report experimental results that show the effectiveness of the algorithms based on this cutting of the matrices.
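As a toy illustration of the drowning effect and of how a leximin-style lexicographic comparison refines the pessimistic min criterion, consider the following sketch; the utility vectors are invented examples, not the thesis's trajectory matrices:

```python
def pessimistic_utility(trajectory_utils):
    # Qualitative pessimistic criterion: compare policies by the min operator alone.
    return min(trajectory_utils)

def leximin_key(trajectory_utils):
    # Leximin refinement: sort utilities in increasing order; comparing these
    # tuples breaks ties on the worst outcome by the next-worst, and so on.
    return tuple(sorted(trajectory_utils))

# Drowning effect: the min criterion cannot separate these two policies,
# but the leximin key strictly prefers the first.
better, worse = [1, 9, 9], [1, 3, 3]
```

Here `pessimistic_utility(better) == pessimistic_utility(worse)` (the drowning effect) while `leximin_key(better) > leximin_key(worse)`; the thesis extends this idea to lexicographic comparisons of whole matrices of trajectories.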
Balbontin, Camila. « Integrating Decision Heuristics And Behavioural Refinements Into Travel Choice Models ». Thesis, The University of Sydney, 2017. http://hdl.handle.net/2123/17892.
Full text
Burlacu, Robert [Verfasser], Alexander [Akademischer Betreuer] Martin, Alexander [Gutachter] Martin et Rüdiger [Gutachter] Schultz. « Adaptive Mixed-Integer Refinements for Solving Nonlinear Problems with Discrete Decisions / Robert Burlacu ; Gutachter : Alexander Martin, Rüdiger Schultz ; Betreuer : Alexander Martin ». Erlangen : Friedrich-Alexander-Universität Erlangen-Nürnberg (FAU), 2020. http://d-nb.info/1205157530/34.
Full text
Khan, Omar Zia. « Policy Explanation and Model Refinement in Decision-Theoretic Planning ». Thesis, 2013. http://hdl.handle.net/10012/7808.
Full text
謝承凌. « A Study on the Decision Refinement after Losing Data based on Evidence Theory ». Thesis, 2013. http://ndltd.ncl.edu.tw/handle/87373693399516697098.
Full text
Lin, Wan-Ting, et 林婉婷. « The Refinement Mechanism of Preliminary Dispatch Fire-alarm Decision Support System for Fire Department of New Taipei City ». Thesis, 2012. http://ndltd.ncl.edu.tw/handle/75954298327326757317.
Full text
National Taipei University
Graduate Institute of Information Management
100
Taiwan is small and densely populated, and it has developed rapidly. New Taipei City in particular is a large city with a vast territory, a large population and diverse geography, and it accounts for much of Taiwan's fire caseload. When a fire event happens, the dispatcher of the Fire Department's 119 Dispatch Center must decide in a short time which resources are appropriate for the event. A right decision yields a safe, efficient and appropriate rescue; a wrong one may let the fire grow, requiring more cost, time and resources to bring under control and possibly causing more property damage, injuries and deaths. New Taipei City therefore launched a project in 2011 to build an assistant system for fire-alarm dispatch. The objective of the project is to use information technology to define appropriate fire-alarm dispatch modules that meet the needs of fire-alarm dispatch and relief. This research concerns the refinement mechanism of the preliminary dispatch fire-alarm decision support system for the Fire Department of New Taipei City. We designed a model based on the factors that influence rescue and used historical cases to calculate the total dispatched engine group, so that the system can keep up with current trends. We used a decision tree model and refinement mechanisms to design the system's algorithm, and implemented the decision support module for preliminary fire dispatch. The model can analyze and find the rules and trends of dispatched resources. It can also suggest the quantity of resources to dispatch, so that when the system is activated in the future it can assist 119 dispatchers in dispatching fire engine groups from the various fire units, making scheduling more precise, faster and more effective.
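A minimal sketch of the kind of decision-tree-style preliminary dispatch suggestion described in this abstract; the features, thresholds and engine-group counts are invented for illustration, while the real system learns such rules from historical dispatch cases:

```python
def suggest_engine_groups(building_type, reported_scale):
    # Toy hand-written decision tree for a preliminary dispatch suggestion.
    # Branch first on the reported scale of the event, then on building type.
    if reported_scale == "large":
        return 6 if building_type == "high_rise" else 4
    if building_type == "high_rise":
        return 3
    return 2
```

A learned tree would replace these hard-coded branches, and a refinement mechanism would adjust the leaf counts as new cases accumulate.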
Lien, Lee-Sheng, et 連李勝. « Fast Motion Estimation Based on Diamond Refinement Search and Mode Decision Algorithm for High Efficiency Video Coding Standard ». Thesis, 2018. http://ndltd.ncl.edu.tw/handle/uer9up.
Full text
National Chung Hsing University
Department of Electrical Engineering
106
With the advancement of technology, demand for high-definition video keeps increasing across professional, online and everyday consumer video, and HD quality has become fully popularized; as 4K technology matures, an era of even higher quality is arriving. HEVC supports a wide range of video formats, from CIF (352 × 288) to HD (1920 × 1080), 4K (3840 × 2160) and up to 8K UHD (7680 × 4320). The coding unit grows from the (4 × 4) and (16 × 16) sizes of H.264 to sizes from (8 × 8) up to (64 × 64) in HEVC, with different block sizes configured according to different requirements. HEVC introduces three important units, a larger Coding Unit (CU), the Prediction Unit (PU) and the Transform Unit (TU), and compared with H.264 it can save about 50% of the bit rate. In video compression, motion estimation, which searches for the matching block with the smallest RD-cost, often takes a comparatively large share of the computation. This paper therefore presents a new fast motion estimation algorithm for HEVC that reduces computational complexity and improves compression efficiency. The proposed fast-search algorithm modifies three parts: the Diamond Refinement Search, the Mode Decision, and the Inter Prediction minimum block size. In the diamond refinement search, the motion vector IMV predicted by AMVP is used as the starting point of the search; the zero vector at the origin is compared with the IMV, and the one with the smaller RD-cost is selected as the starting point of the first search. The first four rounds of diamond search find the minimum RD-cost position. When the distance between the origin and the best point is not 0 or 1, we check whether it is greater than 4: if so, the Diamond Refinement Search is combined with a Concentric Diamond Search to refine the search; if the distance is less than or equal to 4, a Small Diamond Search is used for a quick search.
In mode selection, statistics show that only the 2N × 2N mode and the most-used N × N mode are needed, so the rarely used modes can be skipped, which greatly accelerates the overall runtime. In addition, the Inter Prediction minimum block size is reduced from the original 8 × 8 to 4 × 4, so blocks can be split into smaller blocks to recover image quality after restricting the modes to 2N × 2N and N × N. Experimental results show that with the proposed diamond-refinement-search and mode-decision algorithm plus the minimum-block-size modification, overall coding time is reduced by 46.15%, with a 1.45% increase in bit rate and a 0.03 dB drop in PSNR.
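The small-diamond step described in this abstract can be sketched as a greedy pattern search. The quadratic `cost` in the usage example is a stand-in for the RD-cost or SAD computation of a real encoder:

```python
def small_diamond_search(cost, start, max_iter=64):
    # Greedy small-diamond pattern search: from the current best motion
    # vector, probe the four axis neighbours at distance 1 and move to the
    # cheapest one; stop when no neighbour improves the cost.
    best = start
    best_cost = cost(best)
    for _ in range(max_iter):
        x, y = best
        improved = False
        for mv in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            c = cost(mv)
            if c < best_cost:
                best, best_cost, improved = mv, c, True
        if not improved:
            break
    return best

# Toy usage: a convex surrogate cost with its minimum at motion vector (3, -2).
toy_cost = lambda mv: (mv[0] - 3) ** 2 + (mv[1] + 2) ** 2
```

In a real encoder, `cost` would evaluate the SAD or RD-cost between the current block and the reference block displaced by `mv`; the large-diamond and concentric-diamond stages of the proposed algorithm would run before this final refinement.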
Scime, Anthony. « Taxonomic information retrieval (TAXIR) from the World Wide Web knowledge-based query and results refinement with user profiles and decision models / ». 1997. http://catalog.hathitrust.org/api/volumes/oclc/39258102.html.
Texte intégralRathod, Harsh. « Surface and subsurface damage quantification using multi-device robotics-based sensor system and other non-destructive testing techniques ». Thesis, 2019. http://hdl.handle.net/1828/11168.
Full text
Graduate
2020-09-06