Theses on the topic "Models and model making, 1952"

To see the other types of publications on this topic, follow the link: Models and model making, 1952.

Create a proper reference in APA, MLA, Chicago, Harvard, and other styles

Consult the top 50 theses for your research on the topic "Models and model making, 1952."

Next to every source in the list of references there is an "Add to bibliography" button. Press it, and we will automatically generate the bibliographic reference for the chosen source in your preferred citation style: APA, MLA, Harvard, Vancouver, Chicago, etc.

You can also download the full text of the scholarly publication as a PDF and read its abstract online when this information is included in the metadata.

Browse theses on a wide variety of disciplines and organize your bibliography correctly.

1

Wångmar, Erik. « Från sockenkommun till storkommun : En analys av storkommunreformens genomförande 1939-1952 i en nationell och lokal kontext ». Doctoral thesis, Växjö universitet, Institutionen för humaniora, 2003. http://urn.kb.se/resolve?urn=urn:nbn:se:vxu:diva-384.

Abstract:
The primary aim of this study is to provide a deeper and more complete understanding of why the great municipal amalgamation (storkommunreformen) during the 1940s became the political solution to the problem that the Government believed many of Sweden’s municipalities had in satisfactorily providing for a local welfare society. The study also describes the results of this large-scale reorganization process. The events examined include the political decision-making process at the national level that took place during 1939-1949, as well as the regional/local realization of these decisions during 1946-1952. The parliamentary treatment of the municipal division issue should be viewed as a good example of what researchers have termed a Swedish decision-making model. One clear manifestation of this was the fact that the national commission that investigated the question primarily formulated the principles for the reform. The committee’s proposal received strong endorsements in the reports from the reviewers of the proposal. The government authorities and many of the municipalities felt that a new division of municipalities was justified. Opposition that did occur came mostly from rural municipalities with small populations. Many of these municipalities believed that the present municipal divisions functioned well as they were. Of those municipalities that were affected by amalgamation, 39 percent of them did not agree with the decision. The majority of these could agree to merge with other municipalities, but not with the municipalities stipulated by the authorities. Considering the fact that the then current divisions were based on a long tradition, demands for retaining independence could have been greater. At the same time, it should be borne in mind that 66 percent of all larger municipalities were formed using some level of force. This still indicated a relatively widely distributed opposition to the amalgamation decisions, however.
2

Ouederni, Bechir Nacer. « Development of a strategic capital-expenditure decision model incorporating the product abandonment option ». Diss., Virginia Tech, 1992. http://hdl.handle.net/10919/39036.

3

Olid, Pilar. « Making Models with Bayes ». CSUSB ScholarWorks, 2017. https://scholarworks.lib.csusb.edu/etd/593.

Abstract:
Bayesian statistics is an important approach to modern statistical analyses. It allows us to use our prior knowledge of the unknown parameters to construct a model for our data set. The foundation of Bayesian analysis is Bayes' Rule, which in its proportional form indicates that the posterior is proportional to the prior times the likelihood. We will demonstrate how we can apply Bayesian statistical techniques to fit a linear regression model and a hierarchical linear regression model to a data set. We will show how to apply different distributions to Bayesian analyses and how the use of a prior affects the model. We will also make a comparison between the Bayesian approach and the traditional frequentist approach to data analyses.
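The proportional form of Bayes' Rule stated in the abstract (posterior proportional to prior times likelihood) can be sketched with a simple grid approximation; the coin-flip data and uniform prior below are invented for illustration and are not taken from the thesis.

```python
# Grid-approximation sketch of Bayes' Rule in proportional form
# (posterior ∝ prior × likelihood). Invented example: infer the bias
# of a coin from 7 heads in 10 flips under a flat prior.

def posterior_grid(heads, flips, n_points=101):
    grid = [i / (n_points - 1) for i in range(n_points)]
    prior = [1.0] * n_points                          # uniform prior
    like = [t ** heads * (1 - t) ** (flips - heads) for t in grid]
    unnorm = [p * l for p, l in zip(prior, like)]     # prior × likelihood
    z = sum(unnorm)                                   # normalizing constant
    return grid, [u / z for u in unnorm]

grid, post = posterior_grid(7, 10)
mode = grid[post.index(max(post))]
print(mode)  # 0.7 (under a flat prior the posterior mode equals the MLE)
```

Swapping in a non-uniform `prior` list shows how the choice of prior shifts the posterior, which is the effect the thesis examines.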
4

Saboo, Pallabi. « A decision model to aid entry-mode strategy selection ». Thesis, This resource online, 1992. http://scholar.lib.vt.edu/theses/available/etd-09122009-040423/.

5

Cho, Young Jin. « Effects of decomposition level on the intrarater reliability of multiattribute alternative evaluation ». Diss., This resource online, 1992. http://scholar.lib.vt.edu/theses/available/etd-06062008-171537/.

6

Heller, Collin M. « A computational model of engineering decision making ». Thesis, Georgia Institute of Technology, 2013. http://hdl.handle.net/1853/50272.

Abstract:
The research objective of this thesis is to formulate and demonstrate a computational framework for modeling the design decisions of engineers. This framework is intended to be descriptive in nature as opposed to prescriptive or normative; the output of the model represents a plausible result of a designer's decision making process. The framework decomposes the decision into three elements: the problem statement, the designer's beliefs about the alternatives, and the designer's preferences. Multi-attribute utility theory is used to capture designer preferences for multiple objectives under uncertainty. Machine-learning techniques are used to store the designer's knowledge and to make Bayesian inferences regarding the attributes of alternatives. These models are integrated into the framework of a Markov decision process to simulate multiple sequential decisions. The overall framework enables the designer's decision problem to be transformed into an optimization problem statement; the simulated designer selects the alternative with the maximum expected utility. Although utility theory is typically viewed as a normative decision framework, the perspective in this research is that the approach can be used in a descriptive context for modeling rational and non-time critical decisions by engineering designers. This approach is intended to enable the formalisms of utility theory to be used to design human subjects experiments involving engineers in design organizations based on pairwise lotteries and other methods for preference elicitation. The results of these experiments would substantiate the selection of parameters in the model to enable it to be used to diagnose potential problems in engineering design projects. The purpose of the decision-making framework is to enable the development of a design process simulation of an organization involved in the development of a large-scale complex engineered system such as an aircraft or spacecraft. 
The decision model will allow researchers to determine the broader effects of individual engineering decisions on the aggregate dynamics of the design process and the resulting performance of the designed artifact itself. To illustrate the model's applicability in this context, the framework is demonstrated on three example problems: a one-dimensional decision problem, a multidimensional turbojet design problem, and a variable fidelity analysis problem. Individual utility functions are developed for designers in a requirements-driven design problem and then combined into a multi-attribute utility function. Gaussian process models are used to represent the designer's beliefs about the alternatives, and a custom covariance function is formulated to more accurately represent a designer's uncertainty in beliefs about the design attributes.
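The decision rule at the core of the framework, selecting the alternative with maximum expected utility, can be sketched in a few lines; the alternatives, outcome probabilities, and utility values below are invented placeholders, not the thesis's models.

```python
# Hedged sketch of maximum-expected-utility selection: each design
# alternative has probabilistic outcomes, and the simulated designer
# picks the alternative whose expected utility is largest.

def expected_utility(outcomes):
    # outcomes: list of (probability, utility) pairs for one alternative
    return sum(p * u for p, u in outcomes)

alternatives = {
    "design_A": [(0.8, 10.0), (0.2, -5.0)],  # likely good, mild downside
    "design_B": [(0.5, 20.0), (0.5, -8.0)],  # high payoff, high variance
}

best = max(alternatives, key=lambda a: expected_utility(alternatives[a]))
print(best)  # design_A (expected utility 7.0 beats 6.0)
```

In the thesis's framework the probabilities would come from the designer's (Gaussian-process) beliefs rather than fixed numbers, but the argmax step is the same.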
7

Duan, Chunming. « A unified decision analysis framework for robust system design evaluation in the face of uncertainty ». Diss., This resource online, 1992. http://scholar.lib.vt.edu/theses/available/etd-06062008-170155/.

8

Andrews, Rick L. « Temporal changes in marketing mix effectiveness ». Diss., This resource online, 1992. http://scholar.lib.vt.edu/theses/available/etd-07282008-134759/.

9

Abbas, Mustafa Sulaiman. « Consistency Analysis for Judgment Quantification in Hierarchical Decision Model ». PDXScholar, 2016. https://pdxscholar.library.pdx.edu/open_access_etds/2699.

Abstract:
The objective of this research is to establish consistency thresholds linked to alpha (α) levels for HDM’s (Hierarchical Decision Model) judgment quantification method. Measuring consistency in order to control it is a crucial and inseparable part of any AHP/HDM experiment. The researchers on the subject recommend establishing thresholds that are statistically based on hypothesis testing, and are linked to the number of decision variables and (α) level. Such thresholds provide the means with which to evaluate the soundness and validity of an AHP/HDM decision. The linkage of thresholds to (α) levels allows the decision makers to set an appropriate inconsistency tolerance compatible with the situation at hand. The measurements of judgments are unreliable in the absence of an inconsistency measure that includes acceptable limits. All of this is essential to the credibility of the entire decision making process and hence is extremely useful for practitioners and researchers alike. This research includes distribution fitting for the inconsistencies. It is a valuable and interesting part of the research results and adds usefulness, practicality and insight. The superb fits obtained give confidence that all the statistical inferences based on the fitted distributions accurately reflect the HDM’s inconsistency measure.
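HDM's own inconsistency measure differs in its details, but the flavor of such a threshold test can be seen in the classic AHP consistency ratio (Saaty); the pairwise comparison matrix below is an invented, textbook-style example, not data from this research.

```python
# Illustrative AHP consistency check (a close relative of the HDM
# inconsistency measure discussed in the abstract). A pairwise
# comparison matrix is accepted if its consistency ratio is below a
# threshold, conventionally 0.1 in AHP.

def principal_eigenvalue(A, iters=200):
    # Power iteration: for a positive matrix this converges to the
    # Perron (largest) eigenvalue.
    n = len(A)
    v = [1.0] * n
    lam = 1.0
    for _ in range(iters):
        w = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
        lam = max(w)
        v = [x / lam for x in w]
    return lam

def consistency_ratio(A):
    n = len(A)
    ci = (principal_eigenvalue(A) - n) / (n - 1)  # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]           # Saaty's random index
    return ci / ri

A = [[1, 3, 5],
     [1/3, 1, 2],
     [1/5, 1/2, 1]]
print(consistency_ratio(A) < 0.1)  # True: below the usual 0.1 threshold
```

The research summarized above goes further by linking such thresholds to alpha levels via fitted distributions of the inconsistency statistic, rather than using one fixed cutoff.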
10

Steffanny, Elaine. « Design communication through model making : a taxonomy of physical models in interior design education ». [Ames, Iowa : Iowa State University], 2009. http://gateway.proquest.com/openurl?url_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&res_dat=xri:pqdiss&rft_dat=xri:pqdiss:1468135.

11

Gilmore, Joan Maree. « Rational, nonrational and mixed models of policy making in a high school change process ». University of Canberra. Education, 1992. http://erl.canberra.edu.au./public/adt-AUC20060712.092715.

Abstract:
In many schools, hours of energy and effort are dedicated to making decisions and developing policy. At the school level, issues of curriculum, faculty groupings and structure, and strategies for staff allocation and the resourcing of faculties often result in debate before being decided upon. So often, valuable time and resources are wasted in argument, disagreement and political activity. This study was designed to determine what actually happens in the decision process, with a single committee as the subject of the study. The aim of the study is to determine the style of policy development that took place and what influences affected the decisions made. The study is in two parts. The first section develops a Conceptual Framework and research questions to categorise, summarise and organise data collected from policy development processes. The Conceptual Framework was designed to permit analysis of the major components of the stages of Problem Structuring, Generation of Alternatives and Recommending Policy Actions. The second section includes further Research Questions to determine whether the process applied to developing policy was Rational, Nonrational (Incremental/Political) or a Mixed Model type. The research method used was naturalistic and qualitative in nature, in the context of a case study. The main findings were that a Mixed Model of policy development was used by the Committee, with elements of both Rational and Nonrational processes evident from the research data.
12

Hilgenkamp, Heather. « Contrasting multiple models of brand equity’s role in consumer decision making ». Diss., Kansas State University, 2014. http://hdl.handle.net/2097/18711.

Abstract:
Doctor of Philosophy
Department of Psychological Sciences
Gary Brase
Brand Equity is a common phrase in consumer research, but there is still a lot of ambiguity surrounding the measurement of this concept (Keller, 2008). Several methods of measurement have been proposed over the years, but no one method has been adopted as the ideal way to predict purchase intent and measure brand equity. The current research tested three theories—Social Exchange Theory (SET), the Theory of Planned Behavior (TPB), and the Yoo and Donthu model—to see which is the best predictor of purchase intent and brand equity. SET assumes consumers weigh the costs and rewards of purchasing the product. TPB uses consumers' attitudes toward purchasing the product, subjective norms of what others would do, and the perceived behavioral control consumers have in actually purchasing the product. The Yoo and Donthu model has been used most often of the three in measuring brand equity and includes measures of brand loyalty, perceived quality, brand awareness/associations, and overall brand equity. Study 1 assessed consumer durable products (a TV and athletic shoes) and Study 2 assessed consumer non-durable products (soap and toothpaste). Consumers evaluated these products online based on a picture of the product, the brand name, price, customer reviews, quality ratings, and an advertisement, and then indicated their likelihood to purchase the product. The Theory of Planned Behavior was the best predictor of purchase intent across all four products assessed, indicating that consumers weigh external factors, such as what others would do and how much control they have over the purchase, as much as they consider their own attitudes.
13

Mangleburg, Tamara F. « A socialization model of children's perceived purchase influence : family type, hierarchy, and parenting practices / ». Diss., This resource online, 1992. http://scholar.lib.vt.edu/theses/available/etd-08222008-063056/.

14

Dosne, Anne-Gaëlle. « Improved Methods for Pharmacometric Model-Based Decision-Making in Clinical Drug Development ». Doctoral thesis, Uppsala universitet, Institutionen för farmaceutisk biovetenskap, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-305697.

Abstract:
Pharmacometric model-based analysis using nonlinear mixed-effects models (NLMEM) has to date mainly been applied to learning activities in drug development. However, such analyses can also serve as the primary analysis in confirmatory studies, which is expected to bring higher power than traditional analysis methods, among other advantages. Because of the high expertise in designing and interpreting confirmatory studies with other types of analyses and because of a number of unresolved uncertainties regarding the magnitude of potential gains and risks, pharmacometric analyses are traditionally not used as primary analysis in confirmatory trials. The aim of this thesis was to address current hurdles hampering the use of pharmacometric model-based analysis in confirmatory settings by developing strategies to increase model compliance to distributional assumptions regarding the residual error, to improve the quantification of parameter uncertainty and to enable model prespecification. A dynamic transform-both-sides approach capable of handling skewed and/or heteroscedastic residuals and a t-distribution approach allowing for symmetric heavy tails were developed and proved relevant tools to increase model compliance to distributional assumptions regarding the residual error. A diagnostic capable of assessing the appropriateness of parameter uncertainty distributions was developed, showing that currently used uncertainty methods such as bootstrap have limitations for NLMEM. A method based on sampling importance resampling (SIR) was thus proposed, which could provide parameter uncertainty in many situations where other methods fail such as with small datasets, highly nonlinear models or meta-analysis. SIR was successfully applied to predict the uncertainty in human plasma concentrations for the antibiotic colistin and its prodrug colistin methanesulfonate based on an interspecies whole-body physiologically based pharmacokinetic model. 
Lastly, strategies based on model-averaging were proposed to enable full model prespecification and proved to be valid alternatives to standard methodologies for studies assessing the QT prolongation potential of a drug and for phase III trials in rheumatoid arthritis. In conclusion, improved methods for handling residual error, parameter uncertainty and model uncertainty in NLMEM were successfully developed. As confirmatory trials are among the most demanding in terms of patient-participation, cost and time in drug development, allowing (some of) these trials to be analyzed with pharmacometric model-based methods will help improve the safety and efficiency of drug development.
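The sampling importance resampling (SIR) idea used above for parameter uncertainty can be sketched in its generic form; this is an invented, minimal illustration (a normal target recovered through a wide normal proposal), not the thesis's NLMEM implementation.

```python
# Minimal sampling importance resampling (SIR) sketch: draw from a
# proposal distribution, weight each draw by the target/proposal
# density ratio, then resample in proportion to the weights.
import math
import random

def sir(target_pdf, proposal_sample, proposal_pdf, n_draw, n_keep):
    xs = [proposal_sample() for _ in range(n_draw)]
    w = [target_pdf(x) / proposal_pdf(x) for x in xs]    # importance ratios
    total = sum(w)
    probs = [wi / total for wi in w]
    return random.choices(xs, weights=probs, k=n_keep)   # resampling step

# Invented example: recover a N(0, 1) target through a wide N(0, 3) proposal.
norm_pdf = lambda x, s: math.exp(-x * x / (2 * s * s)) / (s * math.sqrt(2 * math.pi))
random.seed(0)
draws = sir(lambda x: norm_pdf(x, 1.0), lambda: random.gauss(0, 3),
            lambda x: norm_pdf(x, 3.0), n_draw=20000, n_keep=5000)
mean = sum(draws) / len(draws)
var = sum((d - mean) ** 2 for d in draws) / len(draws)
print(abs(var - 1.0) < 0.25)  # resampled draws approximate the target variance
```

In the pharmacometric setting the "target" is the parameter uncertainty distribution and the proposal is typically built from the estimated covariance matrix, which is what lets SIR work where bootstrap fails.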
15

Burciaga, Aaron D. « A dynamic model for political stakeholders : forecasting the actions and relationships of Lebanese Hizbullah with Markov decision processes ». Thesis, Monterey, California : Naval Postgraduate School, 2010. http://edocs.nps.edu/npspubs/scholarly/theses/2010/Jun/10Jun%5FBurciaga.pdf.

Abstract:
Thesis (M.S. in Operations Research)--Naval Postgraduate School, June 2010.
Thesis Advisor(s): Kress, Moshe ; Szechtman, Roberto ; Second Reader: Atkinson, Michael. "June 2010." Description based on title screen as viewed on July 14, 2010. Author(s) subject terms: Lebanese Hizbullah; Lebanese Diaspora; Lebanon; Markov Decision Process; Dynamic Bayesian Network; Hidden Markov Models; Decision Analysis; Decision Theory; Decision Tree; State Tree; Influence Diagram; GeNIe; Stakeholder; State Space; Rational Actor; Action; Interest; Distribution; Forecast. Includes bibliographical references (p. 65). Also available in print.
16

AlMutairi, Bandar Saud. « Statistical Models for Characterizing and Reducing Uncertainty in Seasonal Rainfall Pattern Forecasts to Inform Decision Making ». Research Showcase @ CMU, 2017. http://repository.cmu.edu/dissertations/940.

Abstract:
Uncertainty in rainfall forecasts affects the level of quality and assurance for decisions made to manage water resource-based systems. However, since uncertainty is difficult to eliminate completely, decision-makers are challenged to make decisions in the light of uncertainty. This study provides statistical models as an approach to cope with uncertainty, including: a) a statistical method relying on a Gaussian mixture (GM) model to better characterize uncertainty in climate model projections and evaluate their performance in matching observations; b) a stochastic model that incorporates the El Niño–Southern Oscillation (ENSO) cycle to narrow uncertainty in seasonal rainfall forecasts; and c) a statistical approach to determine to what extent drought events forecasted using ENSO information could be utilized in the water resources decision-making process. This study also investigates the relationship between calibration and lead time on the ability to narrow the interannual uncertainty of forecasts and the associated usefulness for decision making. These objectives are demonstrated for the northwest region of Costa Rica as a case study of a developing country in Central America. This region of Costa Rica is under an increasing risk of future water shortages due to climate change, increased demand, and high variability in the bimodal cycle of seasonal rainfall. First, the GM model is shown to be a suitable approach to compare and characterize long-term projections of climate models. The GM representation of seasonal cycles is then employed to construct detailed comparison tests for climate models with respect to observed rainfall data. Three verification metrics demonstrate that an acceptable degree of predictability can be obtained by incorporating ENSO information in reducing error and interannual variability in the forecast of seasonal rainfall.
The predictability of multicategory rainfall forecasts in the late portion of the wet season surpasses that in the early portion of the wet season. Later, the value of drought forecast information for coping with uncertainty in making decisions on water management is determined by quantifying the reduction in expected losses relative to a perfect forecast. Both the discrimination ability and the relative economic value of drought-event forecasts are improved by the proposed forecast method, especially after calibration. Positive relative economic value is found only for a range of scenarios of the cost-loss ratio, which indicates that the proposed forecast could be used for specific cases. Otherwise, taking actions (no-actions) is preferred as the cost-loss ratio approaches zero (one). Overall, the approach of incorporating ENSO information into seasonal rainfall forecasts would provide useful value to the decision-making process - in particular at lead times of one year ahead.
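The Gaussian mixture idea behind point (a), representing a bimodal cycle as a weighted sum of Gaussian components, can be sketched with a minimal one-dimensional EM fit; the synthetic two-mode data, initialization, and two-component assumption below are invented for illustration and are not the study's rainfall model or data.

```python
# Minimal 1-D two-component Gaussian mixture fitted by EM.
import math
import random

def em_gmm_1d(data, iters=100):
    data = sorted(data)
    half = len(data) // 2
    # Crude initialization: split the sorted data at the median.
    mu = [sum(data[:half]) / half, sum(data[half:]) / (len(data) - half)]
    var = [1.0, 1.0]
    pi = [0.5, 0.5]
    for _ in range(iters):
        # E-step: responsibility of each component for each point.
        resp = []
        for x in data:
            d = [pi[k] / math.sqrt(2 * math.pi * var[k])
                 * math.exp(-(x - mu[k]) ** 2 / (2 * var[k])) for k in range(2)]
            s = sum(d)
            resp.append([dk / s for dk in d])
        # M-step: re-estimate weights, means, and variances.
        for k in range(2):
            nk = sum(r[k] for r in resp)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = sum(r[k] * (x - mu[k]) ** 2 for r, x in zip(resp, data)) / nk
            pi[k] = nk / len(data)
    return mu, var, pi

random.seed(0)
data = ([random.gauss(5, 1) for _ in range(300)]      # invented first mode
        + [random.gauss(25, 2) for _ in range(300)])  # invented second mode
mu, var, pi = em_gmm_1d(data)
print(sorted(round(m) for m in mu))  # [5, 25]
```

A bimodal seasonal cycle like the one described for Costa Rica would be summarized by the fitted component means, variances, and weights rather than the raw monthly values.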
17

Stuart, Julie Ann. « A strategic environmentally conscious production decision model ». Diss., Georgia Institute of Technology, 1996. http://hdl.handle.net/1853/24160.

18

Durak, Tolga. « A Framework for Simplified Residential Energy Consumption Assessment towards Developing Performance Prediction Models for Retrofit Decision-Making ». Diss., Virginia Tech, 2011. http://hdl.handle.net/10919/77255.

Abstract:
This research proposes to simplify the energy consumption assessment for residential homes while building the foundation towards the development of prediction tools that can achieve a credible level of accuracy for confident decision making. The energy consumption assessment is based on simplified energy consumption models. The energy consumption analysis uses a reduced number of energy model equations utilizing a critical, limited set of parameters. The results of the analysis are used to develop the minimum set of consumption influence parameters with predicted effects for each energy consumption domain. During this research study, multiple modeling approaches and occupancy scenarios were utilized according to climate conditions in Blacksburg, Virginia. As a part of the analysis process, a parameter study was conducted to: develop a comprehensive set of energy consumption influence parameters, identify the inter-relationships among parameters, determine the impact of energy consumption influence parameters in energy consumption models, and classify energy consumption influence parameters under identified energy consumption domains. Based on the results of the parameter study, a minimum set of parameters and energy consumption influence matrices were developed. This research suggests the minimum set of parameters with predicted effects to be used during the development of the simplified baseline energy consumption model.
Ph. D.
19

Fard, Pouyan R., Hame Park, Andrej Warkentin, Stefan J. Kiebel et Sebastian Bitzer. « A Bayesian Reformulation of the Extended Drift-Diffusion Model in Perceptual Decision Making ». Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2017. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-230313.

Abstract:
Perceptual decision making can be described as a process of accumulating evidence to a bound which has been formalized within drift-diffusion models (DDMs). Recently, an equivalent Bayesian model has been proposed. In contrast to standard DDMs, this Bayesian model directly links information in the stimulus to the decision process. Here, we extend this Bayesian model further and allow inter-trial variability of two parameters following the extended version of the DDM. We derive parameter distributions for the Bayesian model and show that they lead to predictions that are qualitatively equivalent to those made by the extended drift-diffusion model (eDDM). Further, we demonstrate the usefulness of the extended Bayesian model (eBM) for the analysis of concrete behavioral data. Specifically, using Bayesian model selection, we find evidence that including additional inter-trial parameter variability provides for a better model, when the model is constrained by trial-wise stimulus features. This result is remarkable because it was derived using just 200 trials per condition, which is typically thought to be insufficient for identifying variability parameters in DDMs. In sum, we present a Bayesian analysis, which provides for a novel and promising analysis of perceptual decision making experiments.
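The accumulate-to-bound process that DDMs formalize can be simulated in a few lines; the parameter values below are invented for illustration, and this simple version omits the inter-trial variability that distinguishes the extended models discussed above.

```python
# Toy drift-diffusion simulation: evidence accumulates with a constant
# drift plus Gaussian noise until it crosses an upper or lower bound;
# the crossing determines the choice and the decision time.
import math
import random

def ddm_trial(drift=0.3, bound=1.0, sigma=1.0, dt=0.001):
    x, t = 0.0, 0.0
    while abs(x) < bound:
        x += drift * dt + sigma * math.sqrt(dt) * random.gauss(0, 1)
        t += dt
    return x >= bound, t   # (chose upper bound?, decision time)

random.seed(1)
trials = [ddm_trial() for _ in range(500)]
p_upper = sum(choice for choice, _ in trials) / len(trials)
print(p_upper > 0.5)  # positive drift favors the upper bound
```

The extended DDM (and the Bayesian reformulation above) additionally lets `drift` and the starting point vary from trial to trial, which is exactly the variability the paper's model selection examines.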
20

Hassler, Ryan Scott. « Mathematical comprehension facilitated by situation models : Learning opportunities for inverse relations in elementary school ». Diss., Temple University Libraries, 2016. http://cdm16002.contentdm.oclc.org/cdm/ref/collection/p245801coll10/id/410935.

Abstract:
Math & Science Education
Ph.D.
The Common Core State Standards' call for more rigorous, focused, and coherent curriculum and instruction has resulted in students being faced with more cognitively demanding tasks that involve forming connections within and between fundamental mathematical concepts. Because mathematical comprehension generally relates back to one's ability to form connections to prior knowledge, this study sought to examine the extent to which current learning environments expose students to connection-making opportunities that may help facilitate mathematical understanding of elementary multiplicative inverses. As part of an embedded mixed-methods design, I analyzed curriculum materials, classroom instruction, and student assessments from four elementary mathematics teachers' classrooms. A situation model perspective of comprehension was used for analysis. The aim of this study was thus to determine how instructional tasks, representations, and deep questions are used for connection-making, which is the foundation of a situation model that can be used for inference-making. Results suggest that student comprehension depends more on connection-making opportunities afforded by classroom teachers than on learning opportunities found solely within a curriculum. This included instruction that focused on deeply unpacking side-by-side comparison-type examples, situated examples in personal concrete contexts, used semi-concrete representations to illustrate structural relationships, promoted efficiency through the sequence of presented representations, and posed deep questions that supported students' sense-making and emphasized the interconnectedness of mathematics. By analyzing these key aspects, this study contributes to research on mathematical understanding and provides a foundation for helping students transfer prior knowledge into novel mathematical situations.
Temple University--Theses
21

Khattar, Vanshaj. « Threat Assessment and Proactive Decision-Making for Crash Avoidance in Autonomous Vehicles ». Thesis, Virginia Tech, 2021. http://hdl.handle.net/10919/103470.

Abstract:
Threat assessment and reliable motion-prediction of surrounding vehicles are some of the major challenges encountered in autonomous vehicles' safe decision-making. Predicting a threat in advance can give an autonomous vehicle enough time to avoid crashes or near crash situations. Most vehicles on roads are human-driven, making it challenging to predict their intentions and movements due to inherent uncertainty in their behaviors. Moreover, different driver behaviors pose different kinds of threats. Various driver behavior predictive models have been proposed in the literature for motion prediction. However, these models cannot be trusted entirely due to the human drivers' highly uncertain nature. This thesis proposes a novel trust-based driver behavior prediction and stochastic reachable set threat assessment methodology for various dangerous situations on the road. This trust-based methodology allows autonomous vehicles to quantify the degree of trust in their predictions to generate the probabilistically safest trajectory. This approach can be instrumental in the near-crash scenarios where no collision-free trajectory exists. Three different driving behaviors are considered: Normal, Aggressive, and Drowsy. Hidden Markov Models are used for driver behavior prediction. A "trust" in the detected driver is established by combining four driving features: Longitudinal acceleration, lateral acceleration, lane deviation, and velocity. A stochastic reachable set-based approach is used to model these three different driving behaviors. Two measures of threat are proposed: Current Threat and Short Term Prediction Threat which quantify present and the future probability of a crash. The proposed threat assessment methodology resulted in a lower rate of false positives and negatives. This probabilistic threat assessment methodology is used to address the second challenge in autonomous vehicle safety: crash avoidance decision-making. 
This thesis presents a fast, proactive decision-making methodology based on Stochastic Model Predictive Control (SMPC). A proactive decision-making approach exploits the surrounding human-driven vehicles' intent to assess the future threat, which helps generate a safe trajectory in advance, unlike reactive decision-making approaches that do not account for the surrounding vehicles' future intent. The crash avoidance problem is formulated as a chance-constrained optimization problem to account for uncertainty in the surrounding vehicle's motion. These chance-constraints always ensure a minimum probabilistic safety of the autonomous vehicle by keeping the probability of crash below a predefined risk parameter. This thesis proposes a tractable and deterministic reformulation of these chance-constraints using convex hull formulation for a fast real-time implementation. The controller's performance is studied for different risk parameters used in the chance-constraint formulation. Simulation results show that the proposed control methodology can avoid crashes in most hazardous situations on the road.
Master of Science
Unexpected situations frequently arise on the road and lead to crashes. An NHTSA study reported that around 94% of car crashes could be attributed to driver errors and misjudgments, whether from drinking and driving, fatigue, or reckless driving. Fully self-driving cars can significantly reduce the frequency of such accidents. Testing of self-driving cars has recently begun on certain roads, and it is estimated that one in ten cars will be self-driving by the year 2030. This means that these self-driving cars will need to operate in human-driven environments and interact with human-driven vehicles. Therefore, it is crucial for autonomous vehicles to understand the way humans drive on the road to avoid collisions and interact safely with human-driven vehicles. Detecting a threat in advance and generating a safe trajectory for crash avoidance are some of the major challenges faced by autonomous vehicles. We have proposed a reliable decision-making algorithm for crash avoidance in autonomous vehicles. Our framework addresses two core challenges encountered in crash avoidance decision-making: 1. The outside challenge: reliable motion prediction of surrounding vehicles to continuously assess the threat to the autonomous vehicle. 2. The inside challenge: generating a safe trajectory for the autonomous vehicle in case of a future predicted threat. The outside challenge is to predict the motion of surrounding vehicles. This requires building a reliable model through which the future evolution of their position states can be predicted. Building these models is not trivial, as the surrounding vehicles' motion depends on human driver intentions and behaviors, which are highly uncertain. Various driver behavior predictive models have been proposed in the literature. However, most do not quantify trust in their predictions.
We have proposed a trust-based driver behavior prediction method which combines all sensor measurements to output the probability (trust value) of a certain driver being "drowsy", "aggressive", or "normal". This method allows the autonomous vehicle to choose how much to trust a particular prediction. Once a picture is painted of surrounding vehicles, we can generate safe trajectories in advance – the inside challenge. Most existing approaches use stochastic optimal control methods, which are computationally expensive and impractical for fast real-time decision-making in crash scenarios. We have proposed a fast, proactive decision-making algorithm to generate crash avoidance trajectories based on Stochastic Model Predictive Control (SMPC). We reformulate the SMPC probabilistic constraints as deterministic constraints using convex hull formulation, allowing for faster real-time implementation. This deterministic SMPC implementation ensures in real-time that the vehicle maintains a minimum probabilistic safety.
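The deterministic tightening of a probabilistic safety constraint can be pictured with a minimal sketch (a toy illustration with invented numbers, assuming a Gaussian inter-vehicle gap; the thesis's actual convex-hull reformulation is more involved). Requiring the crash probability to stay below a risk parameter amounts to demanding that the mean gap exceed a safety back-off proportional to the gap's standard deviation:

```python
from statistics import NormalDist

def tightened_margin(mu_gap, sigma_gap, risk):
    """Deterministic surrogate for P(gap <= 0) <= risk under a Gaussian gap.

    The chance constraint holds iff mu_gap - z_{1-risk} * sigma_gap >= 0,
    i.e. the mean gap must exceed a back-off that grows as the allowed
    risk shrinks.
    """
    z = NormalDist().inv_cdf(1.0 - risk)
    return mu_gap - z * sigma_gap

# a smaller risk parameter demands a larger safety back-off
loose = tightened_margin(10.0, 2.0, 0.5)    # z = 0: margin equals the mean gap
tight = tightened_margin(10.0, 2.0, 0.05)   # stricter risk, smaller margin
```

Because the back-off is a deterministic function of the predicted mean and variance, the resulting constraint can be handed to an ordinary real-time optimizer, which is the point of such reformulations.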
Styles APA, Harvard, Vancouver, ISO, etc.
22

Gil, Pascual Miriam. « Adapting Interaction Obtrusiveness : Making Ubiquitous Interactions Less Obnoxious. A Model Driven Engineering approach ». Doctoral thesis, Universitat Politècnica de València, 2013. http://hdl.handle.net/10251/31660.

Texte intégral
Résumé :
Ubiquitous Computing aims to make our environments intelligent by offering services that help users in their everyday lives. With the inclusion of ubiquitous devices in our lives (for example, mobile devices), users are now always connected to the environment and able to interact with it. However, unlike traditional desktop interactions, in which users requested information or entered data, ubiquitous interactions have to deal with a changing user environment, demanding one of users' most valuable resources: human attention. A challenge in the ubiquitous computing paradigm is therefore to regulate requests for the user's attention. This implies that service interactions should behave in a "considerate" manner, taking into account the degree to which each service intrudes on the user's mind (its obtrusiveness level). Building on the foundations of Model Driven Engineering (MDE) and the principles of Considerate Computing, this thesis focuses on designing and developing services capable of adapting their interactions to the user's attention at each moment. The main goal of this thesis is to introduce considerate adaptation capabilities into ubiquitous services in order to provide non-disturbing interactions. We achieve this through a development process that covers everything from the design of the services to their implementation, focusing on the interaction adaptation requirements particular to each user. To design interaction behaviour according to the obtrusiveness level, technology-independent obtrusiveness and interaction models are defined.
These models subsequently drive the dynamic adaptation of the interaction by means of an autonomous infrastructure that uses them at runtime. This infrastructure can detect changes in the user's situation (for example, changes in location, activity, etc.) and plan and execute modifications to the services' interaction. When a change in the user's context is detected, the services adapt themselves to use the interaction components most appropriate to the new situation without disturbing the user. Moreover, since users' needs and preferences can change over time, our approach uses a reinforcement learning strategy to adjust the initial design models so as to maximize the user experience. The initial obtrusiveness-based interaction design ensures behaviour consistent with users' needs at that moment; this design is then refined according to each user's behaviour and preferences, using feedback gathered through the usage experience. We also provide a mobile interface that allows end users to manually personalize the models according to their own preferences. The work presented in this thesis has been put into practice and evaluated from the point of view of both designers and end users. On the one hand, the design method was validated to verify that it helps designers specify this kind of service. Although the development process does not offer complete automation, the guidelines provided and the formalization of the concepts involved proved useful for developing services whose interaction is unobtrusive.
On the other hand, obtrusiveness-based interaction adaptation was put into practice with users to evaluate their satisfaction with the system and their user experience. This validation revealed the importance of considering obtrusiveness aspects in the interaction adaptation process in order to improve the user experience.
In Ubiquitous Computing environments, people are surrounded by many embedded services. Since ubiquitous devices, such as mobile phones, have become a key part of our everyday life, they enable users to be always connected to the environment and to interact with it. However, unlike traditional desktop interactions, where users request information or input data, ubiquitous interactions have to cope with the user's variable environment, making demands on one of users' most valuable resources: human attention. A challenge in the Ubiquitous Computing paradigm is regulating the requests for the user's attention. That is, service interactions should behave in a considerate manner by taking into account the degree to which each service intrudes on the user's mind (i.e., the obtrusiveness degree). In order to prevent service behavior from becoming overwhelming, this work, based on Model Driven Engineering foundations and Considerate Computing principles, is devoted to designing and developing services that adapt their interactions according to the user's attention. The main goal of the present thesis is to introduce considerate adaptation capabilities in ubiquitous services to provide non-disturbing interactions. We achieve this by means of a systematic method that covers from the services' design to their implementation and the later adaptation of interaction at runtime.
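The runtime refinement of the interaction design from user feedback can be pictured as a small reinforcement-learning loop (a hedged sketch: the modality choices and reward values below are invented for illustration, not taken from the thesis). The system keeps a running value estimate per interaction modality, usually exploits the best one, and occasionally explores alternatives:

```python
import random

def choose(q, eps, rng):
    # epsilon-greedy: explore a random modality with probability eps,
    # otherwise exploit the current best estimate
    if rng.random() < eps:
        return rng.randrange(len(q))
    return max(range(len(q)), key=q.__getitem__)

def update(q, counts, arm, reward):
    # incremental mean of the observed user feedback for this modality
    counts[arm] += 1
    q[arm] += (reward - q[arm]) / counts[arm]

# toy loop: modality 1 (say, a silent visual cue) suits this user better
rng = random.Random(0)
q, counts = [0.0, 0.0], [0, 0]
for _ in range(200):
    arm = choose(q, 0.1, rng)
    update(q, counts, arm, [0.2, 0.8][arm])
```

After a few interactions the value estimates separate and the less obtrusive, better-received modality dominates, which is the intuition behind refining an initial design model from usage feedback.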
Gil Pascual, M. (2013). Adapting Interaction Obtrusiveness: Making Ubiquitous Interactions Less Obnoxious. A Model Driven Engineering approach [Tesis doctoral no publicada]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/31660
TESIS
Styles APA, Harvard, Vancouver, ISO, etc.
23

Watkiss, Brendon Miles. « The SLEUTH urban growth model as forecasting and decision-making tool ». Thesis, Stellenbosch : Stellenbosch University, 2008. http://hdl.handle.net/10019.1/1654.

Texte intégral
Résumé :
Thesis (MSc (Geography and Environmental Studies))--Stellenbosch University, 2008.
Accelerating urban growth places increasing pressure not only on the efficiency of infrastructure and service provision, but also on the natural environment. City managers are delegated the task of identifying problem areas that arise from this phenomenon and planning the strategies with which to alleviate them. It is with this in mind that the research investigates the implementation of an urban growth model, SLEUTH, as a support tool in the planning and decision making process. These investigations are carried out on historical urban data for the region falling under the control of the Cape Metropolitan Authority. The primary aim of the research was to simulate future urban expansion of Cape Town based on past growth patterns by making use of cellular automata methodology in the SLEUTH modeling platform. The following objectives were explored, namely to: a) determine the impact of urbanization on the study area, b) identify strategies for managing urban growth from literature, c) apply cellular automata as a modeling tool and decision-making aid, d) formulate an urban growth policy based on strategies from literature, and e) justify SLEUTH as the desired modeling framework from literature. An extensive database for the study area was acquired from the product of a joint initiative between the private and public sector, called “Urban Monitoring”. The database included: a) five historical urban extent images (1977, 1988, 1993, 1996 and 1998); b) an official urban buffer zone or ‘urban edge’, c) a Metropolitan Open Space System (MOSS) database, d) two road networks, and e) a Digital Elevation Model (DEM). Each dataset was converted to raster format in ArcEdit and finally .gif images were created of each data layer for compliance with SLEUTH requirements. SLEUTH processed this historic data to calibrate the growth variables for best fit of observed historic growth. An urban growth forecast was run based on the calibration parameters.
Findings suggest SLEUTH can be applied successfully and produce realistic projections of urban expansion. A comparison between modelled and actual urban areas revealed 76% model accuracy. The research then attempts to mimic urban growth policy in the modeling environment, with mixed results.
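The cellular-automaton mechanism underlying SLEUTH's growth cycles can be sketched in a few lines (a toy illustration only, not the SLEUTH implementation: real SLEUTH combines spontaneous, spreading-centre, edge and road-influenced growth under several calibrated coefficients). Here a single edge-growth rule lets each urban cell urbanize its neighbours with some probability:

```python
import random

def edge_growth(grid, spread_prob, rng):
    # one growth cycle: each urbanized cell may urbanize its 4-neighbours
    rows, cols = len(grid), len(grid[0])
    new = [row[:] for row in grid]
    for r in range(rows):
        for c in range(cols):
            if grid[r][c]:
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    rr, cc = r + dr, c + dc
                    if 0 <= rr < rows and 0 <= cc < cols and rng.random() < spread_prob:
                        new[rr][cc] = 1
    return new

# a single urban seed with certain spread urbanizes its whole neighbourhood
seed_grid = [[0, 0, 0], [0, 1, 0], [0, 0, 0]]
grown = edge_growth(seed_grid, 1.0, random.Random(0))
```

Calibration in SLEUTH amounts to tuning such probabilities until simulated growth from the earliest urban-extent layer best reproduces the later observed layers.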
Styles APA, Harvard, Vancouver, ISO, etc.
24

Hutchinson, Craig Alan. « Multiscale Modelling as an Aid to Decision Making in the Dairy Industry ». Thesis, University of Canterbury. Chemical and Process Engineering, 2006. http://hdl.handle.net/10092/2146.

Texte intégral
Résumé :
This work presents the first known attempt to model the dairy business from a multiscale modelling perspective. The multiscale nature of the dairy industry is examined with emphasis on those key decision making and process scales involved in production. Decision making scales identified range from the investor level to the plant operator level, and encompass business, production, plant, and operational levels. The model considers scales from the production manager to the unit operation scale. The cheese making process is used to demonstrate scale identification in the context of the important phenomena and other natural levels of scrutiny of interest to decision makers. This work was a first step in the establishment of a multiscale system model capable of delivering information for process troubleshooting, scheduling, process and business optimization, and process control decision-making for the dairy industry. Here, only material transfer throughout a process, use of raw materials, and production of manufactured product are modelled. However, an implementation pathway for adding other models (such as the precipitation of milk protein which forms curd) to the system model is proposed. The software implementation of the dairy industry multiscale model presented here tests the validity of the proposed:
• object model (object and collection classes) used to model unit operations and integrate them into a process,
• mechanisms for modelling material and energy streams,
• method to create simulations over variable time horizons.
The model was implemented using object oriented programming (OOP) methods in conjunction with technologies such as Visual Basic .NET and CAPE-OPEN. An OOP object model is presented which successfully enabled the construction of a multiscale model of the cheese making process. Material content, unit operation, and raw milk supply models were integrated into the multiscale model.
The model is capable of performing simulations over variable time horizons, from 1 second to multiple years. Mechanisms for modelling material streams, connecting unit operations, and controlling unit operation behaviour were implemented. Simple unit operations such as pumps and storage silos, along with more complex unit operations such as a cheese vat batch, were modelled. Despite some simplifications to the model of the cheese making process, the simulations successfully reproduced the major features expected from the process and its constituent unit operations. Decision making information for process operators, plant managers, production managers, and the dairy business manager can be produced from the data generated. The multiscale model can be made more sophisticated by extending the functionality of existing objects and incorporating other scale partial models. However, increasing the number of reported variables by even a small number can quickly increase the data processing and storage demands of the model. A unit operation's operational state of existence at any point in time was proposed as a mechanism for integrating and recalculating lower scale partial models. This mechanism was successfully tested using a unit operation's material content model and is presented here as a new concept in multiscale modelling. The proposed modelling structure can be extended to include any number of partial models and any number of scales.
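The object model described, unit operations connected by material transfer and advanced over a variable time horizon, can be sketched minimally as follows (an illustration under stated assumptions: the class names, attributes and numbers are invented, and the thesis's Visual Basic .NET / CAPE-OPEN implementation is far richer):

```python
class Silo:
    """A storage unit operation holding a single bulk material mass (kg)."""
    def __init__(self, mass=0.0):
        self.mass = mass

    def step(self, dt):
        pass  # a silo is passive between transfers


class Pump:
    """A unit operation moving material between silos at a fixed rate (kg/s)."""
    def __init__(self, src, dst, rate):
        self.src, self.dst, self.rate = src, dst, rate

    def step(self, dt):
        moved = min(self.rate * dt, self.src.mass)  # cannot pump a silo below empty
        self.src.mass -= moved
        self.dst.mass += moved


def simulate(units, horizon, dt):
    # advance every unit operation over the horizon in fixed time steps
    t = 0.0
    while t < horizon:
        for u in units:
            u.step(dt)
        t += dt


raw, vat = Silo(100.0), Silo(0.0)
simulate([raw, vat, Pump(raw, vat, 10.0)], horizon=5.0, dt=1.0)
```

The design choice mirrored here is that each unit operation exposes the same `step` interface, so new partial models (a curd-precipitation model, say) can be slotted into the same simulation loop without changing it.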
Styles APA, Harvard, Vancouver, ISO, etc.
25

Shi, Zhenzhen. « A MARKOV DECISION PROCESS EMBEDDED WITH PREDICTIVE MODELING : A MODELING APPROACH FROM SYSTEM DYNAMICS MATHEMATICAL MODELS, AGENT-BASED MODELS TO A CLINICAL DECISION MAKING ». Diss., Kansas State University, 2015. http://hdl.handle.net/2097/20578.

Texte intégral
Résumé :
Doctor of Philosophy
Department of Industrial & Manufacturing Systems Engineering
David H. Ben-Arieh
Chih-Hang Wu
Patients who suffer from sepsis or septic shock are of great concern in the healthcare system. Recent data indicate that more than 900,000 severe sepsis or septic shock cases developed in the United States with mortality rates between 20% and 80%. In the United States alone, almost $17 billion is spent each year on the treatment of patients with sepsis. Clinical trials of treatments for sepsis have been extensively studied in the last 30 years, but there is no general agreement on the effectiveness of the proposed treatments. Therefore, it is necessary to find accurate and effective tools that can help physicians predict the progression of the disease in a patient-specific way and then provide physicians with recommendations on the treatment of sepsis to lower the risk of patients dying from it. The goal of this research is to develop a risk assessment tool and a risk management tool for sepsis. In order to achieve this goal, two system dynamic mathematical models (SDMMs) are initially developed to predict dynamic patterns of sepsis progression in innate immunity and adaptive immunity. The two SDMMs are able to identify key indicators and key processes of inflammatory responses to an infection and of sepsis progression. Second, an integrated-mathematical-multi-agent-based model (IMMABM) is developed to capture the stochastic nature embedded in the development of inflammatory responses to sepsis. Unlike existing agent-based models, this agent-based model is enhanced by incorporating the developed SDMMs and extensive experimental data. With these risk assessment tools, a Markov decision process (MDP) is proposed, as a risk management tool, to support clinical decision-making on sepsis.
Through extensive computational studies, the major contributions of this research are, first, to develop risk assessment tools that identify the risk of sepsis development while the immune system responds to an infection and, second, to propose a decision-making framework for managing the risk of infected individuals dying from sepsis. The methodology and modeling framework used in this dissertation can be extended to other diseases and treatment applications, and have a broad impact on research areas related to computational modeling, biology, medical decision-making, and industrial engineering.
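The MDP layer of such a risk-management tool can be illustrated with a generic value-iteration sketch (a toy model: the two states, two actions, transition probabilities and rewards below are invented for illustration and are not the clinical model of the dissertation):

```python
def value_iteration(P, R, gamma=0.9, iters=200):
    """Bellman backups for a finite MDP.

    P[a][s][t]: probability of moving from state s to state t under action a.
    R[a][s]:    immediate reward for taking action a in state s.
    Returns the optimal state values.
    """
    n_actions, n_states = len(R), len(R[0])
    V = [0.0] * n_states
    for _ in range(iters):
        V = [max(R[a][s] + gamma * sum(P[a][s][t] * V[t] for t in range(n_states))
                 for a in range(n_actions))
             for s in range(n_states)]
    return V

# hypothetical 2-state example: states 0 = stable, 1 = septic;
# action 0 = treat (tends to stabilise), action 1 = wait (risks deterioration)
P = [
    [[1.0, 0.0], [0.8, 0.2]],   # treat
    [[0.7, 0.3], [0.1, 0.9]],   # wait
]
R = [[1.0, -1.0], [1.0, -2.0]]  # treat, wait
V = value_iteration(P, R)
```

The optimal values rank states by long-run risk, and the maximizing action at each state gives the treatment recommendation, which is the role the MDP plays on top of the predictive models.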
Styles APA, Harvard, Vancouver, ISO, etc.
26

Barwich, Ann-Sophie. « Making sense of smell : classifications and model thinking in olfaction theory ». Thesis, University of Exeter, 2013. http://hdl.handle.net/10871/13869.

Texte intégral
Résumé :
This thesis addresses key issues of scientific realism in the philosophy of biology and chemistry through investigation of an underexplored research domain: olfaction theory, or the science of smell. It also provides the first systematic overview of the development of olfactory practices and research into the molecular basis of odours across the 19th and 20th century. Historical and contemporary explanations and modelling techniques for understanding the material basis of odours are analysed with a specific focus on the entrenchment of technological process, research tradition and the definitions of materiality for understanding scientific advancement. The thesis seeks to make sense of the explanatory and problem solving strategies, different ways of reasoning and the construction of facts by drawing attention to the role and application of scientific representations in olfactory practices. Scientific representations such as models, classifications, maps, diagrams, lists etc. serve a variety of purposes that range from the stipulation of relevant properties and correlations of the research materials and the systematic formation of research questions, to the design of experiments that explore or test particular hypotheses. By examining a variety of modelling strategies in olfactory research, I elaborate on how I understand the relation between representations and the world and why this relation requires a pluralist perspective on scientific models, methods and practices. Through this work I will show how a plurality of representations does not pose a problem for realism about scientific entities and their theoretical contexts but, on the contrary, that this plurality serves as the most reliable grounding for a realistic interpretation of scientific representations of the world and the entities it contains. 
The thesis concludes that scientific judgement has to be understood through its disciplinary trajectory, and that scientific pluralism is a direct consequence of the historicity of scientific development.
Styles APA, Harvard, Vancouver, ISO, etc.
27

MONROE, STUART ROBERT. « COMPUTER SIMULATION MODEL FOR STRATEGIC MANAGEMENT DECISIONS RELATED TO YUMA, ARIZONA CITRUS ORCHARDS (POLICY, OPTIMIZATION, OPERATIONS) ». Diss., The University of Arizona, 1985. http://hdl.handle.net/10150/187986.

Texte intégral
Résumé :
This research assisted the Yuma, Arizona citrus orchard manager in his strategic planning for achieving a low-cost position in a focused segment of the citrus industry. Citrus growers in the Yuma district are faced with major changes in their competitive environment and must adopt new strategic plans in order to continue to compete effectively in what has recently become a global industry. Since the planning horizon for new citrus orchards is in excess of 20 years, a long-range planning model was developed to aid in evaluating alternative operating strategies. This research established the interrelatedness of water, nitrogen, and phosphorus relative to the yields of Valencia Oranges, Lisbon Lemons, and Redblush Grapefruit on Rough Lemon, Sour Orange, and Troyer rootstocks. A computer simulation model was used to evaluate optimal operating policies for a variety of resource prices and market conditions. The methodology utilized in the development of the simulation model was unique in that it emulates individual tree performance from the time of planting until maturation. Four operating strategies were investigated, and the profit-maximizing and cost-minimizing strategies were found to be significant. Evaluation of market selling prices indicated that the profit-maximizing strategy was optimal except at very low market prices, where the cost-minimizing strategy was optimal. Price sensitivity for water and fertilizer resources was investigated. Operating strategies were not affected by water price increases over the foreseeable future; however, price changes in nitrogen and phosphorus were found to affect the optimal operating strategy, primarily through the substitution of manure in the system. Existing horticultural practices in the Yuma growing area were confirmed by the research. Additional optimal operating strategies were suggested relative to market prices. The long-run policy decision-making process for orchard managers was enhanced.
Styles APA, Harvard, Vancouver, ISO, etc.
28

Marinelli, Marco Antonio. « Modelling and communicating the effects of spatial data uncertainty on spatially based decision-making ». Thesis, Curtin University, 2011. http://hdl.handle.net/20.500.11937/1842.

Texte intégral
Résumé :
Important economic and environmental decisions are routinely based on spatial/temporal models. This thesis studies the uncertainty in the predictions of three such models caused by uncertainty propagation. This is considered important as it quantifies the sensitivity of a model's prediction to uncertainty in other components of the model, such as the model's inputs. Furthermore, many software packages that implement these models do not permit users to easily visualize either the uncertainty in the data inputs, the effects of the model on the magnitude of that uncertainty, or the sensitivity of the uncertainty to individual data layers. In this thesis, emphasis has been placed on demonstrating the methods used to quantify and then, to a lesser extent, visualize the sensitivity of the models. The key questions that must be resolved with regard to the source of the uncertainty and the structure of the model are also investigated. For all models investigated, the propagation paths that most influence the uncertainty in the prediction were determined. How the influence of these paths can be minimised, or removed, is also discussed. Two different methods commonly used to analyse uncertainty propagation were investigated. The first is the analytical Taylor series method, which can be applied to models with continuous functions. The second is the Monte Carlo simulation method, which can be used on most types of models. The latter can also be used to investigate how the uncertainty propagation changes when the distribution of model uncertainty is non-Gaussian, which is not possible with the Taylor method. The models tested were two continuous Precision Agriculture models and one ecological niche statistical model. The Precision Agriculture models studied were the nitrogen (N) availability component of the SPLAT model and the Mitscherlich precision agricultural model.
The third, called BIOCLIM, is a probabilistic model that can be used to investigate and predict species distributions for both native and agricultural species. It was generally expected that, for a specific model, the results from the Taylor method and the Monte Carlo method would agree. However, it was found that the structure of the model in fact influences this agreement, especially in the Mitscherlich model, which has more complex non-linear functions. Several non-normal input uncertainty distributions were investigated to see if they could improve the agreement between these methods. The uncertainty and skew of the Monte Carlo results relative to the prediction of the model were also useful in highlighting how the distribution of model inputs, and the model's structure itself, may bias the results. The version of BIOCLIM used in this study uses three basic spatial climatic input layers (monthly maximum and minimum temperature and precipitation layers) and a dataset describing the current spatial distribution of the species of interest. The thesis investigated how uncertainty in the input data propagates through to the estimated spatial distribution for field peas (Pisum sativum) in the agriculturally significant region of south-west Western Australia. The results clearly show the effect of uncertainty in the input layers on the predicted species distribution map. In places the uncertainty significantly influences the final validity of the result, and the spatial distribution of that validity also varies significantly.
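The two propagation methods compared in the thesis can be contrasted on a one-input toy model (a sketch only; the SPLAT, Mitscherlich and BIOCLIM models are far more complex). First-order Taylor propagation scales the input variance by the squared local derivative, while Monte Carlo samples the input distribution directly:

```python
import random

def taylor_var(dfdx, mu, sigma):
    # first-order Taylor series: Var[f(X)] ≈ f'(mu)^2 * Var[X]
    return dfdx(mu) ** 2 * sigma ** 2

def monte_carlo_var(f, mu, sigma, n=100_000, seed=1):
    # sample-based propagation: works for any f and any input distribution
    rng = random.Random(seed)
    ys = [f(rng.gauss(mu, sigma)) for _ in range(n)]
    mean = sum(ys) / n
    return sum((y - mean) ** 2 for y in ys) / (n - 1)

f = lambda x: 3.0 * x + 1.0                 # linear model: the methods agree
t = taylor_var(lambda x: 3.0, 0.0, 1.0)     # exactly 9.0
m = monte_carlo_var(f, 0.0, 1.0)            # close to 9.0 up to sampling noise
```

For a nonlinear response such as the Mitscherlich curve the two estimates diverge, and Monte Carlo can additionally use skewed input distributions, which is precisely the disagreement the thesis investigates.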
Styles APA, Harvard, Vancouver, ISO, etc.
29

Xu, Lu. « The Importance of Construct Definition and Specification in Operations Management Structured Model Research : The Case for Quality and Sustainability Constructs in a Decision-Making Model ». Thesis, University of North Texas, 2018. https://digital.library.unt.edu/ark:/67531/metadc1248479/.

Texte intégral
Résumé :
In operations management research, the inconsistent use of the same term for different concepts and the use of similar concepts for different constructs potentially causes theoretical and statistical problems. This research addresses the importance of construct definition and specification methodologically within the context of quality and sustainability management. It comprises three essays using multiple quantitative methods, such as partial least squares structural equation modeling and multiple regression, in different consumer decision-making models in the automobile industry. In the first two essays, a comprehensive literature review results in the definition and contextualization of the quality and sustainability constructs as applied to operations management and marketing research. The relationships of these constructs with consumer behavior are empirically tested. Building upon the first two essays, the third essay addresses the methodological issues of formative and reflective measurements by summarizing a procedure for validating formative measurements. The quality construct is used to illustrate the methodology. This research contributes to the literature, theory, and practice in the area of quality and sustainability management.
Styles APA, Harvard, Vancouver, ISO, etc.
30

Guisse, Amadou Wane. « Spatial model development for resource management decision making and strategy formulation : application of neural network (Mounds State Park, Anderson, Indiana) ». Virtual Press, 1993. http://liblink.bsu.edu/uhtbin/catkey/864949.

Texte intégral
Résumé :
An important requirement of a rational policy for the provision of outdoor recreation opportunities is some understanding of natural processes and public concerns and/or preferences. Computerized land use suitability mapping is a technique which can help find the best location for a variety of developmental actions given a set of goals and other criteria. Over the past two decades, the methods and techniques of land use planning have undergone a revolution on at least two fronts, shifting the basic theories and attitudes on which land use decisions are based. The first of these fronts is the inclusion of environmental concerns, and the second is the application of more systematic methods or models. While these automated capabilities have shed new light on environmental issues, they have unfortunately failed to develop sufficient intelligence and adaptation to accurately model the dynamics of ecosystems. The work reported here proceeds on the belief that neural network models can be used to assess and develop resource management strategies for Mounds State Park, Anderson, Indiana. The study combines a photographic survey technique with a geographic information system (GIS) and artificial neural networks (NN) to investigate the perceived impact of park management activities on recreation opportunities and experiences. It is unique in that it incorporates both survey data and spatial data with an optimizing technique to develop a model for predicting perceived management values for short- and long-term recreation management. According to Jeannette Stanley and Evan Bak (1988), a neural network is a massively parallel, dynamic system of highly interconnected, interacting parts based on neurobiological models. The behavior of the network depends heavily on the connection details, and the state of the network evolves continually with time. Networks are considered clever and intuitive because they learn by example rather than following simple programming rules.
They are defined by a set of rules or patterns based on expertise or perception for better decision making. With experience, networks become sensitive to subtle relationships in the environment which are not obvious to humans. The model was developed as a counter-propagation network with a four-layer learning network consisting of an input layer, a normalized layer, a Kohonen layer, and an output layer. The counter-propagation network is a feed-forward network which combines Kohonen and Widrow-Hoff learning rules for a new type of mapping neural network. The network was trained with patterns derived by mapping five variables (slope, aspect, vegetation, soil, site features) and survey responses from three groups. The responses included, for each viewshed, the preference and management values, and three recreational activities each group associated with a given landscape. Overall the model behaves properly in learning the different rules and generalizing in cases where inputs had not been shown to the network a priori. Maps are provided to illustrate the different responses obtained from each group and simulated by the model. The study is not conclusive as to the capabilities of the combination of GIS techniques and neural networks, but it gives a good flavor of what can be achieved when accurate mapping information is used by an intelligent system for decision making.
Department of Landscape Architecture
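A counter-propagation network of the kind described, a competitive Kohonen layer followed by a Grossberg outstar output layer, can be sketched minimally (hypothetical data and parameters; the thesis's four-layer network trained on GIS layers and survey responses is substantially larger):

```python
def train_counterprop(samples, k, lr=0.3, epochs=20):
    """samples: list of (input_vector, target_vector) pairs; k Kohonen units."""
    dim, out = len(samples[0][0]), len(samples[0][1])
    # initialise Kohonen weights from the first k training inputs
    W = [list(samples[j % len(samples)][0]) for j in range(k)]
    U = [[0.0] * out for _ in range(k)]  # Grossberg outstar weights
    for _ in range(epochs):
        for x, y in samples:
            # competitive step: the closest Kohonen unit wins
            w = min(range(k), key=lambda j: sum((W[j][i] - x[i]) ** 2 for i in range(dim)))
            # move the winner toward the input, and its outstar toward the target
            W[w] = [W[w][i] + lr * (x[i] - W[w][i]) for i in range(dim)]
            U[w] = [U[w][o] + lr * (y[o] - U[w][o]) for o in range(out)]
    return W, U

def predict(W, U, x):
    w = min(range(len(W)), key=lambda j: sum((W[j][i] - x[i]) ** 2 for i in range(len(x))))
    return U[w]

# two toy "viewsheds" mapped to low/high preference scores
samples = [((0.0, 0.0), (0.0,)), ((10.0, 10.0), (1.0,))]
W, U = train_counterprop(samples, k=2)
```

Prediction is a simple lookup through the winning unit, which is why counter-propagation trains much faster than back-propagation at the cost of a coarser, cluster-wise mapping.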
Styles APA, Harvard, Vancouver, ISO, etc.
31

Cope, Dayana. « AUTOMATIC GENERATION OF SUPPLY CHAIN SIMULATION MODELS FROM SCOR BASED ONTOLOGIES ». Doctoral diss., University of Central Florida, 2008. http://digital.library.ucf.edu/cdm/ref/collection/ETD/id/2640.

Texte intégral
Résumé :
In today's economy of global markets, supply chain networks, supplier/customer relationship management, and intense competition, decision makers are faced with a need to perform decision making using tools that do not accommodate the nature of the changing market. This research focuses on developing a methodology that addresses this need. The developed methodology provides supply chain decision makers with a tool to perform efficient decision making in stochastic, dynamic and distributed supply chain environments. The integrated methodology allows for informed decision making in a fast, sharable and easy-to-use format. The methodology was implemented by developing a stand-alone tool that allows users to define a supply chain simulation model using SCOR-based ontologies. The ontology includes the supply chain knowledge and the knowledge required to build a simulation model of the supply chain system. A simulation model is generated automatically from the ontology to provide the flexibility to model at various levels of detail, changing the model structure on the fly. The methodology implementation is demonstrated and evaluated through a retail-oriented case study. When comparing the implementation using the developed methodology with a "traditional" simulation methodology approach, a significant reduction in definition and execution time was observed.
Ph.D.
Department of Industrial Engineering and Management Systems
Engineering and Computer Science
Industrial Engineering PhD
Styles APA, Harvard, Vancouver, ISO, etc.
32

Martínez-García, Marina. « Statistical analysis of neural correlates in decision-making ». Doctoral thesis, Universitat Pompeu Fabra, 2014. http://hdl.handle.net/10803/283111.

Texte intégral
Résumé :
We investigated the neuronal processes which occur during a decision-making task based on a perceptual classification judgment. For this purpose we analysed three different experimental paradigms (somatosensory, visual, and auditory) in two different species (monkey and rat), with the common goal of shedding light on the information carried by neurons. In particular, we focused on how the information content is preserved in the underlying neuronal activity over time. Furthermore we considered how the decision, the stimuli, and the confidence are encoded in memory and, when the experimental paradigm allowed it, how attention modulates these features. Finally, we went one step further, and we investigated the interactions between brain areas that arise during the process of decision-making.
In this thesis we investigated the neuronal processes that occur during decision-making tasks based on a perceptual classification judgment. For this purpose we analysed three different experimental paradigms (somatosensory, visual, and auditory) in two different species (monkeys and rats), with the aim of illustrating how neurons encode task-related information. In particular, we focused on how certain information is encoded in neuronal activity over time: specifically, how information about the behavioural decision, the external factors, and the confidence in the response is encoded in memory. In addition, when the experimental paradigm allowed it, we examined how attention modulates these aspects. Finally, we went one step further and analysed the communication between different cortical areas while the subjects solved a decision-making task.
Styles APA, Harvard, Vancouver, ISO, etc.
33

Mathur, Kush. « Mathematical Models and Genetic Algorithm Approaches to Simultaneously Perform Workforce Overtime Capacity Planning and Schedule Cells ». Ohio University / OhioLINK, 2012. http://rave.ohiolink.edu/etdc/view?acc_num=ohiou1351306927.

Texte intégral
Styles APA, Harvard, Vancouver, ISO, etc.
34

Ghebretsadik, Amanuel Habte. « Farm planning for a typical crop-livestock integrated farm : an application of a mixed integer linear programming model ». Thesis, Stellenbosch : Stellenbosch University, 2004. http://hdl.handle.net/10019.1/49965.

Texte intégral
Résumé :
Assignment (MSc) -- University of Stellenbosch, 2004.
ENGLISH ABSTRACT: In an integrated crop-livestock production farm, the profitability and sustainability of farm production depend on the crop rotation strategy applied. Crop rotations have historically been applied to maintain the long-term profitability and sustainability of farming production by exploiting the mutually beneficial interrelationships among different crop types and the animal production activity. Monocrop (specifically wheat) growers in the Swartland area of the Western Cape are struggling to maintain the long-term profitability and sustainability of crop production, challenging them to rethink the introduction of crop rotation in production planning. By making appropriate assumptions, this paper develops a mixed integer linear programming model to support decision planning for the farm planning problem faced by an integrated crop-livestock production farmer. The mathematical model developed includes crop production, dairy production and wool sheep production activities, and permits the consideration of five crop types within a crop rotation system. By assuming that a farmer uses a cycle of at most three years, the crop rotation model was incorporated into the composite mixed integer linear farm planning model. In order to demonstrate the application of the mathematical farm planning model, a case study is presented, using relevant data from the Koeberg area of the Swartland region of the Western Cape. For each planning period, the model assumed that the farm has the option of selecting from any of 15 cropping strategies. Land that is not allocated to any of the 15 crop rotation strategies because of risky production conditions is left as grassland for the roughage needs of animal production. Results of the mathematical model indicated that farm profit depends on the cropping strategy selected. Additionally, the animal production level was also dependent on the crop strategy applied.
Furthermore, the study results suggest that the profit generated from integrated crop-livestock farm production adopting crop rotation was superior to the profit generated from farm activities based on a monocrop wheat strategy. Empirical results also indicated that the complex interrelationships involved in a mixed crop-livestock farm operation play a major role in determining optimal farm plans. These complex interrelationships favour the introduction of crop rotation in the crop production activities of the farm under investigation. Crop production risk is the major component of the risk the farmer faces in farm production. In this study, risk is incorporated in the mixed integer programming farm planning model as a deviation from the expected values of an activity's returns. The model solution with risk indicated that the crop rotation strategy and animal production level are sensitive to the risk levels considered. The results also showed that incorporating risk in the model greatly affects the acreage allocation, crop rotation and animal production level of the farm. Finally, to improve the profitability and sustainability of the farm activity, the study results suggest that the introduction of a crop rotation consisting of cereals, oil crops and leguminous forages is of paramount importance. Furthermore, the inclusion of forage crops such as medics in integrated crop-livestock production is beneficial for sustained profitability from year to year.
AFRIKAANS SUMMARY: Crop rotation is very important for ensuring sustainable profitability in an integrated livestock/crop farming operation in the Swartland area of the Western Cape. A monoculture, especially of wheat, has caused serious problems for producers. In this study a mixed integer linear programming model is used to support decision making on such farms. The mathematical model considers the production of cash and fodder crops (five different types) as well as dairy and wool/meat production (cattle and sheep). It is assumed that the farmer uses a cycle of at most three years in the crop rotation model. A case study is carried out using applicable data from a farm in the Koeberg area. The model assumes that the producer has a choice of 16 rotation strategies. Results show that profitability depends on the strategy chosen and that crop rotation yields better results than a monoculture. They also show that the interaction between animal production and crop production is very important in the choice of an optimal strategy. Risk in crop production is the most important risk factor for the producer. In this study risk is also included in the mixed integer model, namely as a deviation from the expected yield values. The model clearly shows that crop production and livestock production are very sensitive to the chosen risk level. The study also shows that a crop rotation programme that includes grain (especially wheat), oilseeds and fodder crops is important for sustainable profitability; the inclusion of clovers (e.g. medics) is especially important here.
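A miniature of the kind of mixed integer linear programme the abstract describes can be sketched with SciPy's `milp`. The margins, land area and the two activities below are invented for illustration; the thesis model has many more activities, rotation constraints and risk terms.

```python
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

# Invented gross margins per hectare: continuous wheat vs. a wheat-medic
# rotation, on a 10 ha farm, with whole-hectare (integer) allocations.
c = np.array([-3.0, -5.0])                      # milp minimises, so negate margins
land = LinearConstraint([[1, 1]], -np.inf, 10)  # at most 10 ha planted in total
res = milp(
    c=c,
    constraints=[land],
    integrality=np.ones(2),                     # both variables are integers
    bounds=Bounds(0, 10),                       # non-negative, at most 10 ha each
)
print(res.x, -res.fun)  # optimal hectares per activity and total margin
```

Risk could then be added in the same spirit as the thesis, e.g. as deviation terms in the objective, at the cost of additional variables and constraints.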
Styles APA, Harvard, Vancouver, ISO, etc.
35

Lavoie, João Ricardo. « A Scoring Model to Assess Organizations' Technology Transfer Capabilities : the Case of a Power Utility in the Northwest USA ». PDXScholar, 2019. https://pdxscholar.library.pdx.edu/open_access_etds/4995.

Texte intégral
Résumé :
This research intends to advance knowledge in the technology management field, most importantly in the study of organizations that develop technologies in-house and wish to enhance their technology transfer performance while maintaining adherence between R&D activities and overall business strategies. The objective was to build a multi-criteria decision-making model capable of producing a technology transfer score, which can be used by practitioners in order to assess and later improve their organizations' technology transfer capabilities -- ultimately aiming to improve technology development as a whole. The model was applied to a major power utility organization in the Pacific Northwest of the United States. The introduction brings initial and basic information on the topic, along with the problem statement -- this chapter is aimed at situating the reader on the boundaries of the topic while highlighting its importance within the technology management field of study. The second chapter is the literature review. It brings general and specific information on technology transfer, as well as its complexities, gaps, relationship with other fields and the characteristics of this topic within the energy realm. It also tries to shed a light on how the alignment between R&D and business strategy is perceived by the literature, discussing some of the methods used and its shortcomings. Additionally, the literature review brings an analysis that builds the argument in favor of a continuous technology transfer process, and tries to show how it would be helpful in aligning R&D and business strategy. The third chapter presents the methodological approach -- hierarchical decision modeling (HDM) aided by action research -- which constitutes a methodological novelty piloted and validated throughout the development of the study. 
The fourth chapter details the model development process step-by-step, and the fifth chapter details the model application process with the analysis of the aforementioned organization. Additionally, results are interpreted and analyzed, and insights for the specific case and for technology managers in general are discussed. Lastly, the contributions of the study towards the advancement of the body of knowledge are discussed, as well as the study limitations and future research opportunities.
Styles APA, Harvard, Vancouver, ISO, etc.
36

Graf, Brolund Alice. « Compartmental Models in Social Dynamics ». Thesis, Uppsala universitet, Avdelningen för systemteknik, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-448163.

Texte intégral
Résumé :
The dynamics of many aspects of social behaviour, such as spread of fads and fashion, collective action, group decision-making, homophily and disagreement, have been captured by mathematical models. The power of these models is that they can provide novel insight into the emergent dynamics of groups, e.g. 'epidemics' of memes, tipping points for collective action, wisdom of crowds and leadership by small numbers of individuals, segregation and polarisation. A current weakness in the scientific models is their sheer number. 'New' models are continually 'discovered' by physicists, engineers and mathematicians. The models are analysed mathematically, but very seldom provide predictions that can be tested empirically. In this work, we provide a framework of simple models, based on Lotka's original idea of using chemical reactions to describe social interactions. We show how to formulate models for social epidemics, social recovery, cycles, collective action, group decision-making, segregation and polarisation, which we argue encompass the majority of social dynamics models. We present an open-access tool, written in Python, for specifying social interactions, studying them in terms of mass action, and creating spatial simulations of model dynamics. We argue that the models in this article provide a baseline of empirically testable predictions arising from social dynamics, and that before creating new and more complicated versions of the same idea, researchers should explain how their model differs substantially from our baseline models.
Mathematical models can help us understand many kinds of social phenomena, such as the spread of rumours and memes, group decision-making, segregation and radicalisation. Today there are countless models of social behaviour in humans and animals, and more are presented continually. The large number of models makes it difficult to navigate the research field, and many of the models are moreover complicated and hard to verify experimentally. This work proposes a framework of basic models, each of which captures one aspect of social behaviour: social epidemics, cycles, collective action, group decision-making, segregation and polarisation. We argue that these models cover the majority of the verifiable aspects of social behaviour that are studied, and that they should be treated as a baseline when a new model is introduced. Which of the baseline mechanisms are represented in the new model? Does it even differ appreciably from the baseline? By understanding the basic models well, and by explaining how a new model differs from them, researchers can avoid presenting models that are in practice merely more complicated variants of existing ones. In this work we show how these basic models can be formulated and studied. The models build on simple rules for what happens when individuals in a population meet. For example, if a person who knows a rumour meets someone who does not, the rumour can spread onward. Assumptions about which people can meet each other therefore strongly affect the results the models produce. In this work each model is studied with two different methods: in one, everyone in the population is equally likely to meet everyone else; in the other, the population is represented by a grid in which each site corresponds to an individual, so each person has a limited number of neighbours to interact with.
Which of these two methods one chooses strongly affects the behaviours the models predict. As a complement to this work, a tool in the form of a Python program that performs the analysis of the models is presented. It can be used to examine the basic models presented in this work, but also to formulate and analyse new models in the same way, so that new models can easily be compared against the baseline. The tool is useful both as an introduction for newcomers to social dynamics and for researchers who want to develop new models and advance the field.
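In the spirit of the Lotka-style "chemical reaction" framework the abstract describes, a minimal mass-action social epidemic (S + I → 2I for spread of an idea, I → R for losing interest) can be integrated in a few lines. The rates below are illustrative and are not taken from the thesis or its open-access tool.

```python
# Mass-action sketch of a 'social epidemic': susceptible (s), 'infected'
# with the idea (i), recovered/no longer interested (r), as population
# fractions. Illustrative rates; forward-Euler integration.
def simulate(beta=0.3, gamma=0.1, s=0.99, i=0.01, r=0.0, dt=0.01, steps=10_000):
    for _ in range(steps):
        spread = beta * s * i          # mass-action encounter term: S + I -> 2I
        lose = gamma * i               # spontaneous loss of interest: I -> R
        s, i, r = s - spread * dt, i + (spread - lose) * dt, r + lose * dt
    return s, i, r

s, i, r = simulate()
# with beta/gamma = 3 the idea spreads widely before dying out,
# so most of the population ends up in the 'recovered' class
```

The well-mixed assumption here corresponds to the first of the two methods described above; the grid version replaces the mass-action term with interactions between neighbouring sites.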
Styles APA, Harvard, Vancouver, ISO, etc.
37

Ko, Hung-Tse. « Distribution system meta-models in an electronic commerce environment ». Ohio : Ohio University, 2001. http://www.ohiolink.edu/etd/view.cgi?ohiou1173977323.

Texte intégral
Styles APA, Harvard, Vancouver, ISO, etc.
38

Soh, Boon Kee. « Validation of the recognition-primed decision model and the roles of common-sense strategies in an adversarial environment ». Diss., Virginia Tech, 2007. http://hdl.handle.net/10919/26173.

Texte intégral
Résumé :
This dissertation set out to understand the decision processes used by decision makers in an adversarial environment by setting up an adversarial decision-making microworld, as an experimental platform, using a real-time strategy (RTS) game called Rise of Nations (RON). The specific objectives of this dissertation were: 1. Contribute to the validation of the recognition-primed decision (RPD) model in a simulated adversarial environment; 2. Explore the roles of common-sense strategies in decision making in the adversarial environment; and 3. Test the effectiveness of training recommendations based on the RPD model. Three related experimental studies were set up to investigate each of the objectives. Study 1 found that the RPD model was partly valid: RPD processes were prevalently used, but other decision processes were also important in an adversarial environment. A new decision model (the ConPAD model) was proposed to capture the nature of decision making in the adversarial environment. It was also found that cognitive abilities might have some effect on the types of decision processes used by decision makers. Study 2 found that common-sense strategies were prevalent in the adversarial environment: the participants were able to use all but one of the warfare-related strategies extracted from the literature without being taught them. The strategy familiarization training was not found to significantly improve decision making, showing that common-sense strategies were prevalent and that simple familiarization training was not sufficient to produce differences in strategy usage and performance relative to the novice participants. Study 3 also found that RPD-based training (cue-recognition and decision skill training) did not produce significantly better performance, although subjective feedback found such training to be useful.
However, participants in the RPD-based training conditions were able to perform at the same level as the expert participants, bridging the gap between novices and experts. Based on the findings, it was recommended that decision training should involve not just RPD-based training but comparisons of attributes as well. A more interactive training programme combining common-sense strategies, cue recognition and decision skill training might be more useful. Further experimentation would be required to validate the new decision model proposed in this dissertation.
Ph. D.
Styles APA, Harvard, Vancouver, ISO, etc.
39

Mutongo, Tongai. « Nascent technology ventures commercialization : A framework for capability development and business model transitions ». Thesis, Edith Cowan University, Research Online, Perth, Western Australia, 2023. https://ro.ecu.edu.au/theses/2678.

Texte intégral
Résumé :
This study examines the development of business models in early-stage technology startups and explores the impact of entrepreneurial decisions on the venture's structure and success. The research sheds light on the importance of adapting business models to address complexity and uncertainty and examines the use of strategic decision-making logics by technology ventures. The findings reveal that early-stage technology ventures employ a combination of bricolage and effectuation logics, transitioning to a hybrid approach, which leads to greater success for firms with strong adaptive capabilities. The study also highlights the role of learning capabilities in the success of platform-based technology startups and demonstrates the importance of hybrid decision-making, experimentation, and agile pivoting in the adaptability of the business model. The study provides new insights into the commercialization process, regulation, and leadership in the growth trajectory of early-stage technology ventures, emphasizing the importance of decision-making dynamics, business model adaptation, and dynamic capabilities for success. The author argues that the interplay between business model regulation and organizational learning is essential for the commercialization and scaling of platform-based technology ventures and that utilizing both local and distant searches is key to unlocking the bottlenecks in the commercialization process. Overall, this study provides valuable insights into the centrality of business models, capabilities, and leadership in the growth trajectory of early-stage technology ventures.
Styles APA, Harvard, Vancouver, ISO, etc.
40

Marković, Dimitrije, et Stefan J. Kiebel. « Comparative Analysis of Behavioral Models for Adaptive Learning in Changing Environments ». Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2017. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-214867.

Texte intégral
Résumé :
Probabilistic models of decision making under various forms of uncertainty have been applied in recent years to numerous behavioral and model-based fMRI studies. These studies were highly successful in enabling a better understanding of behavior and delineating the functional properties of brain areas involved in decision making under uncertainty. However, as different studies considered different models of decision making under uncertainty, it is unclear which of these computational models provides the best account of the observed behavioral and neuroimaging data. This is an important issue, as not performing model comparison may tempt researchers to over-interpret results based on a single model. Here we describe how in practice one can compare different behavioral models and test the accuracy of model comparison and parameter estimation of Bayesian and maximum-likelihood based methods. We focus our analysis on two well-established hierarchical probabilistic models that aim at capturing the evolution of beliefs in changing environments: Hierarchical Gaussian Filters and Change Point Models. To our knowledge, these two, well-established models have never been compared on the same data. We demonstrate, using simulated behavioral experiments, that one can accurately disambiguate between these two models, and accurately infer free model parameters and hidden belief trajectories (e.g., posterior expectations, posterior uncertainties, and prediction errors) even when using noisy and highly correlated behavioral measurements. Importantly, we found several advantages of Bayesian inference and Bayesian model comparison compared to often-used Maximum-Likelihood schemes combined with the Bayesian Information Criterion. These results stress the relevance of Bayesian data analysis for model-based neuroimaging studies that investigate human decision making under uncertainty.
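The model-comparison logic discussed in the abstract can be illustrated with a deliberately simple example, not the paper's Hierarchical Gaussian Filter or Change Point models: compare a fixed-rate choice model against one with a fitted rate. The Bayesian Information Criterion penalises each free parameter by ln n, so the richer model wins only if its likelihood gain outweighs that penalty. The response counts below are invented.

```python
import math

# Invented behavioural data: 70 'correct' responses out of 100 trials.
data = [1] * 70 + [0] * 30
n, successes = len(data), sum(data)

def log_lik(p):
    """Bernoulli log-likelihood of the data under response rate p."""
    return successes * math.log(p) + (n - successes) * math.log(1 - p)

# Model 1: fixed rate p = 0.5, no free parameters.
bic_fixed = 0 * math.log(n) - 2 * log_lik(0.5)

# Model 2: rate fitted by maximum likelihood, one free parameter.
p_hat = successes / n
bic_fitted = 1 * math.log(n) - 2 * log_lik(p_hat)

print(bic_fitted < bic_fixed)  # the fitted model wins despite its penalty
```

The paper's point is that such ML-plus-BIC schemes can be less reliable than full Bayesian model comparison; this sketch only shows the mechanics of the BIC baseline they are compared against.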
Styles APA, Harvard, Vancouver, ISO, etc.
41

Loire, Cédric. « L’art de (ne pas) fabriquer : Évolution des modes de conception et de production de la sculpture, a l’ère de l’objet produit en masse, entre le milieu des années 1950 et le début des annees 1970, aux États-Unis ». Thesis, Tours, 2012. http://www.theses.fr/2012TOUR2035/document.

Texte intégral
Résumé :
The analysis of the critical reception of the new forms of art that appeared at the end of the 1950s and developed during the 1960s, particularly in the field of sculpture and three-dimensional works, forms the basis of our study. It aims to bring to light the profound changes in the processes of conceiving and producing three-dimensional works among artists whom contemporary critical reception, and later art history, separated according to stylistic criteria: Neo-Dada, Pop, Minimal, and so on. Observing these shifts in practice, which integrated industrial materials and modes of production (or resisted them), offers another approach to the stakes of the art of this period, which saw the archetypal, heroic figure of the modernist sculptor embodied by David Smith recede and the new figure of the "post-studio" artist take shape. At the same time, new forms of support appeared for artists producing three-dimensional works and delegating all or part of their fabrication to industrial companies: institutional, financial and, above all, technical. A new type of firm emerged, specialising in the fabrication of three-dimensional works and monumental sculptures. By the early 1970s, the new modes of fabrication tried out during the previous decade were fully integrated into the general economy of art. Proposing a form of archaeology of these modes, in order to understand their initial motivations, aims at a better understanding of the current stakes of artistic practices that resort to delegated fabrication.
The analysis of the critical reception of the new forms of art appearing from the end of the 1950s and developing during the 1960s, especially in the field of sculpture and three-dimensional works, constitutes the foundation of our study. It aims to bring to light the profound shifts in the conception and production processes of three-dimensional works made by artists whom critical reception, and later art history, separated according to stylistic criteria: Neo-Dada, Pop, Minimal, and so on. Observing these displacements of artistic practice, integrating industrial materials and means of production (or resisting them), offers another approach to what was at stake in the art of this period, which saw the archetypal and heroic figure of the modernist sculptor (embodied by David Smith) fade and the new figure of the post-studio artist take shape. At the same time, new forms of support (institutional, financial and especially technical) appeared for artists producing works in three dimensions and delegating all or part of the manufacturing to industrial companies. A new type of company, specialised in the manufacturing of three-dimensional works and monumental sculptures, was born. By the early 1970s, the new means of manufacturing tried out during the previous decade were fully integrated into the general economy of art. Proposing a kind of archaeology of these means, in order to understand their initial motivations, aims at a better understanding of the current stakes of artistic practices that turn to delegated manufacturing.
Styles APA, Harvard, Vancouver, ISO, etc.
42

Turcanu, Catrinel. « Multi-criteria decision aiding model for the evaluation of agricultural countermeasures after an accidental release of radionuclides to the environment ». Doctoral thesis, Universite Libre de Bruxelles, 2007. http://hdl.handle.net/2013/ULB-DIPOT:oai:dipot.ulb.ac.be:2013/210642.

Texte intégral
Résumé :
Multi-criteria decision aid has emerged from the operational research field as an answer to several important questions encountered in complex decision problems. Firstly, as decision aiding tools, such methods do not replace the decision maker with a mathematical model, but support him in constructing his solution by describing and evaluating his options. Secondly, instead of using a unique criterion capturing all aspects of the problem, multi-criteria decision aid methods seek to build multiple criteria representing several points of view.

This work explores the application of multi-criteria decision aid methods for optimising food chain countermeasure strategies after a radioactive release to the environment.

The core of the thesis is dedicated to formulating general lines for the development of a multi-criteria decision aid model. This includes the definition of potential actions, construction of evaluation criteria and preference modelling and is essentially based on the results of a stakeholders’ process. The work is centred on the management of contaminated milk in order to provide a concrete focus and because of its importance as an ingestion pathway in short term after an accident.

Among other issues, the public acceptance of milk countermeasures as a key evaluation criterion is analysed in detail. A comparison of acceptance based on stochastic dominance is proposed and, on that basis, an acceptance ranking of the countermeasures is deduced.
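The acceptance comparison based on stochastic dominance can be sketched as follows. The ordinal scale and the acceptance shares below are invented for illustration; first-order stochastic dominance holds when one option's cumulative distribution never exceeds the other's.

```python
from itertools import accumulate

# Hypothetical acceptance distributions of two milk countermeasures over an
# ordinal scale (1 = "not acceptable" ... 4 = "fully acceptable").
disposal   = [0.40, 0.30, 0.20, 0.10]   # invented shares of respondents
processing = [0.10, 0.20, 0.30, 0.40]

def dominates(p, q, eps=1e-12):
    """First-order stochastic dominance: P's CDF never exceeds Q's,
    i.e. P puts no more mass than Q on every low-acceptance prefix."""
    return all(cp <= cq + eps for cp, cq in zip(accumulate(p), accumulate(q)))

print(dominates(processing, disposal))   # processing is better accepted
print(dominates(disposal, processing))
```

When neither option dominates the other, no ranking follows from this criterion alone, which is why the thesis combines it with further multi-criteria machinery.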

In order to assess "global preferences" taking into account all the evaluation criteria, an ordinal method is chosen. This method allows the relative importance of criteria to be expressed qualitatively instead of using, for instance, numerical weights. Some algorithms that can be used for robustness analysis are also proposed. This type of analysis is an alternative to sensitivity analysis with respect to data uncertainty and imprecision, and seeks to determine whether, and how, a model result or conclusion obtained for a specific instance of the model's parameters holds over the entire domain of acceptable values for these parameters.

The integrated multi-criteria decision aid approach proposed makes use of outranking and interactive methodologies and is implemented and tested through a number of case studies and prototype tools.
Doctorat en Sciences de l'ingénieur
info:eu-repo/semantics/nonPublished

Styles APA, Harvard, Vancouver, ISO, etc.
43

Marković, Dimitrije, et Stefan J. Kiebel. « Comparative Analysis of Behavioral Models for Adaptive Learning in Changing Environments ». Frontiers Research Foundation, 2016. https://tud.qucosa.de/id/qucosa%3A30009.

Texte intégral
Résumé :
Probabilistic models of decision making under various forms of uncertainty have been applied in recent years to numerous behavioral and model-based fMRI studies. These studies were highly successful in enabling a better understanding of behavior and delineating the functional properties of brain areas involved in decision making under uncertainty. However, as different studies considered different models of decision making under uncertainty, it is unclear which of these computational models provides the best account of the observed behavioral and neuroimaging data. This is an important issue, as not performing model comparison may tempt researchers to over-interpret results based on a single model. Here we describe how in practice one can compare different behavioral models and test the accuracy of model comparison and parameter estimation of Bayesian and maximum-likelihood based methods. We focus our analysis on two well-established hierarchical probabilistic models that aim at capturing the evolution of beliefs in changing environments: Hierarchical Gaussian Filters and Change Point Models. To our knowledge, these two, well-established models have never been compared on the same data. We demonstrate, using simulated behavioral experiments, that one can accurately disambiguate between these two models, and accurately infer free model parameters and hidden belief trajectories (e.g., posterior expectations, posterior uncertainties, and prediction errors) even when using noisy and highly correlated behavioral measurements. Importantly, we found several advantages of Bayesian inference and Bayesian model comparison compared to often-used Maximum-Likelihood schemes combined with the Bayesian Information Criterion. These results stress the relevance of Bayesian data analysis for model-based neuroimaging studies that investigate human decision making under uncertainty.
Styles APA, Harvard, Vancouver, ISO, etc.
44

Neilan, Lourdes T. « Design and Implementation of a Data Model for the Prototype Monitor Assignment Support System ». Thesis, Ft. Belvoir Defense Technical Information Center, 1994. http://handle.dtic.mil/100.2/ADA288467.

Texte intégral
Styles APA, Harvard, Vancouver, ISO, etc.
45

González, Ramírez Humberto. « Study of the choice behaviour of travellers in a transport network via a “simulation game” Travel time and bounded rationality in travellers’ route choice behaviour : a computer route choice experiment Unravelling travellers’ route choice behaviour at full-scale urban network by focusing on representative OD pairs in computer experiments ». Thesis, Lyon, 2020. http://www.theses.fr/2020LYSET008.

Texte intégral
Résumé :
The objective of this thesis is to find route choice models that scale up to the network level, that is, models that approximate travellers' choices over the diversity of situations encountered in a transport network. The approach of this thesis to studying traveller behaviour in transport networks is through large-scale computer experiments, for which a platform named the Mobility Decision Game (MDG) has been developed. The MDG makes it possible to observe participants' choices over a diverse set of scenarios (OD pairs and routes) with varying traffic conditions and travel time information. In this thesis, the experiments focus on the route choices of car trips, based on the map of the city of Lyon, France. To reach the objective of this thesis, a methodology for finding OD pairs that are representative of the network is first proposed. The representative OD pairs are used in route choice experiments to obtain choice models that generalise to the various OD configurations in the network. Secondly, the choices of the participants in the experiments are analysed from the perspectives of rational and boundedly rational behaviour, in order to establish the principle that best describes their choices. Finally, the choice models are assessed in terms of their predictive accuracy. This thesis is part of a European ERC project entitled MAGnUM: Multiscale and Multimodal Traffic Modeling Approach for Sustainable Management of Urban Mobility.
The objective of this thesis is to find route choice models that scale up to the network level, i.e., models that predict the choices of travellers over the diversity of situations found in a transport network. The approach taken in this thesis to investigate travellers' behaviour in transportation networks is through computer-based experiments at large scale, for which a platform named the Mobility Decision Game (MDG) has been developed. The MDG makes it possible to observe the choices of the participants on a diverse set of scenarios (OD pairs and routes) with varying traffic conditions and travel time information. In this thesis, the experiments focus on the route choices of uni-modal car trips, based on the map of the city of Lyon, France. To attain the objective of this thesis, firstly a methodology to find OD pairs that are representative of the network is proposed. The representative OD pairs are used in route choice experiments to obtain choice models that generalise to the various OD configurations in the network. Secondly, the choices of participants in the experiments are analysed from the rational and boundedly rational behaviour perspectives, in order to establish the principle that best describes their choices. Finally, the choice models are assessed in terms of their predictive accuracy. This thesis is part of a European ERC project entitled MAGnUM: Multiscale and Multimodal Traffic Modeling Approach for Sustainable Management of Urban Mobility.
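A standard way of turning route travel times into choice probabilities, of the kind such route choice models build on, is the multinomial logit. This is a generic sketch with invented travel times, not necessarily the specification estimated in the thesis.

```python
import math

def choice_probabilities(travel_times, theta=0.1):
    """Multinomial logit over routes: P(route) is proportional to
    exp(-theta * travel_time), so slower routes are chosen less often.
    theta controls how sensitive travellers are to travel time."""
    weights = [math.exp(-theta * t) for t in travel_times]
    total = sum(weights)
    return [w / total for w in weights]

probs = choice_probabilities([20, 25, 40])  # invented minutes on three routes
# the fastest route gets the largest share, but not all of it
```

Boundedly rational variants, as studied in the thesis, depart from this fully rational baseline, e.g. by treating all routes within an indifference band of the fastest as equally acceptable.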
46

Wilczkowski, Susanna. « The Pricing Decision Process in Software-as-a-Service Companies ». Thesis, Uppsala universitet, Företagsekonomiska institutionen, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-276464.

Full text
Abstract:
This study examines various approaches used by companies providing software-as-a-service (SaaS) in a business-to-business (B2B) environment to find a pricing strategy. A good pricing strategy is vital to meet competition in a global market. Pricing is an important part of marketing and must be congruent with the company's overall objectives. Strategic pricing is made up of different factors represented in the strategic pricing pyramid, which is based on a value-based approach. It is paramount to know your customers and their preferences when designing a pricing strategy and selecting pricing models, price metrics, market segmentation, bundling, and price levels. After estimating how much value a product or service creates for a customer, this value must be communicated to potential customers in order to convince them to purchase your offering. Choosing the right pricing strategy is not a one-time occurrence but an ongoing process. In this qualitative study, three case studies are performed to tie theory to real-world practice.
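One of the pricing-strategy levers the abstract names, bundling, can be illustrated with a toy revenue comparison. The two customer segments, module valuations, and price points below are invented for the sketch, not drawn from the study's cases.

```python
# Two hypothetical B2B segments value two SaaS modules differently.
valuations = {
    "segment_A": {"analytics": 80, "reporting": 20},
    "segment_B": {"analytics": 30, "reporting": 60},
}

def revenue_separate(p_analytics, p_reporting):
    """Each segment buys a module only if its valuation covers the price."""
    total = 0
    for v in valuations.values():
        if v["analytics"] >= p_analytics:
            total += p_analytics
        if v["reporting"] >= p_reporting:
            total += p_reporting
    return total

def revenue_bundle(p_bundle):
    """A segment buys the bundle if its combined valuation covers the price."""
    return sum(p_bundle for v in valuations.values()
               if v["analytics"] + v["reporting"] >= p_bundle)
```

With these numbers, separate prices of 80 and 60 capture only each segment's preferred module, while a bundle priced at 90 is bought by both segments and yields more revenue, which is the classic argument for bundling when valuations are negatively correlated across segments.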
47

Koehn, Amy R. « To report or not report : a qualitative study of nurses' decisions in error reporting ». Thesis, Indiana University, 2014. http://pqdtopen.proquest.com/#viewpdf?dispub=3665927.

Full text
Abstract:

This qualitative study used grounded theory methodology to ascertain nurses' decision-making processes following their awareness of having made a medical error, as well as how and/or whether they corrected and reported the error. A significant body of literature documents the existence of medical errors; this study, however, interviewed thirty nurses from adult intensive care units, seeking to discover through a detailed interview process their individual stories and experiences, which were then analyzed for common themes. These themes led to the development of a theoretical model of nurses' thought processes regarding error reporting. Within this theoretical model are multiple processes that outline a shared, time-oriented sequence of events nurses encounter before, during, and after an error. One common theme was that the error occurred on a busy day while the nurse was doing something unfamiliar. Each nurse expressed personal anguish at the realization that she had made an error, and each sought to understand why the error happened and what corrective action was needed. Whether the error was formally reported or informally disclosed depended on each unit's expectations and on what needed to be done to protect the patient. If there was no perceived patient harm, errors were not reported. Even for reported errors, no one followed up with the nurses in this study. Nurses were left on their own to reflect on what had happened and to consider what could be done to prevent the error from recurring. The overall impact of the error, and of recovering from it, led to learning that persisted throughout each nurse's career. Findings from this study illuminate the unique viewpoint of licensed nurses' experiences with errors and have the potential to influence how errors are prevented, reported, and resolved in the clinical setting.
Further research is needed to answer multiple questions that will contribute to nursing knowledge about error reporting activities and the means to continue to improve error-reporting rates.

48

Theodoni, Panagiota. « Fluctuations in perceptual decisions : cortical microcircuit dynamics mediating alternations in conscious visual perception ». Doctoral thesis, Universitat Pompeu Fabra, 2014. http://hdl.handle.net/10803/145642.

Full text
Abstract:
Fluctuations in perceptual decisions emerge when our brain confronts ambiguous sensory stimuli. For instance, our perception alternates between two conflicting images presented dichoptically to our eyes, allowing a dissociation of the sensory stimulation from conscious visual perception, and therefore providing a gateway to consciousness. How does the brain work when it deals with such ambiguous sensory stimuli? We addressed this question theoretically by employing a biophysically realistic attractor network, consistently reducing it to a four-variable rate-based model, and extracting analytical expressions for second-order statistics. We considered human behavioural and macaque neurophysiological data collected while subjects confronted such ambiguities. Our results show the relevance of neuronal adaptation in perceptual decision making, and that it contributes to the speed-accuracy trade-off. Furthermore, our findings affirm that noise and neural adaptation operate in balance during the fluctuating states of visual awareness, and suggest that while adaptation in inhibition is not relevant for the perceptual alternations, it contributes to the brain dynamics at rest. Finally, we explain the observed neuronal noise-decorrelation during visual consciousness and provide insights on the long-standing question of where in the brain rivalry is resolved.
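The reduced rate-based model the abstract mentions can be illustrated with a minimal two-population competition model with cross-inhibition, slow spike-frequency adaptation, and noise, the generic model class for perceptual bistability. All parameter values below are assumptions chosen for the illustration, not the thesis's fitted values.

```python
import math
import random

def f(x, theta=0.2, k=0.1):
    """Sigmoidal firing-rate function."""
    return 1.0 / (1.0 + math.exp(-(x - theta) / k))

def simulate(T=20000.0, dt=1.0, seed=1):
    """Euler integration of two mutually inhibiting populations with slow
    adaptation; returns the number of dominance switches (alternations)."""
    rng = random.Random(seed)
    tau, tau_a = 10.0, 1000.0          # fast rates, slow adaptation
    beta, g, I, sigma = 1.0, 0.5, 0.8, 0.02
    r1, r2, a1, a2 = 0.9, 0.1, 0.0, 0.0
    state, switches = 1, 0
    for _ in range(int(T / dt)):
        r1 += dt / tau * (-r1 + f(I - beta * r2 - g * a1 + sigma * rng.gauss(0, 1)))
        r2 += dt / tau * (-r2 + f(I - beta * r1 - g * a2 + sigma * rng.gauss(0, 1)))
        a1 += dt / tau_a * (-a1 + r1)   # adaptation tracks each rate slowly
        a2 += dt / tau_a * (-a2 + r2)
        d = r1 - r2
        # hysteresis on the dominance variable avoids counting noise chatter
        if state == 1 and d < -0.2:
            state, switches = -1, switches + 1
        elif state == -1 and d > 0.2:
            state, switches = 1, switches + 1
    return switches
```

As the dominant population's adaptation builds up, its effective input falls until the suppressed population escapes, producing the alternations in perception that the thesis analyses.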
49

Shakeri, Shakib. « A mathematical modeling framework for scheduling and managing multiple concurrent tasks ». Thesis, 2002. http://hdl.handle.net/1957/31165.

Full text
Abstract:
Occurrence of human error in highly complex systems, such as a cockpit, can be disastrous and/or overwhelmingly costly. Mismanagement of multiple concurrent tasks has been observed by researchers to be a recurring type of human error in previous studies of accidents and incidents. This error may occur in the form of wrong selection of a strategy to attend to tasks, and/or wrong assessment of a task's priority at each moment. The desire to prevent such errors raises two essential questions: 1) Is there any (near) optimal method of managing multiple concurrent tasks? 2) How optimally do human operators manage these tasks? To answer the first question, operations research, as applied to single-machine scheduling, was used. The operator was assumed to be a single resource that attended to different tasks, one at a time. To answer the second question, a software environment was developed to measure a human's multitasking performance, which was then compared with the answer to question one. In this research, the operator's quality of performance was maximized, as opposed to the number of tasks accomplished, the measure considered by previous researchers. A metaphor of a juggler and spinning plates, along with a graphic bar illustration, was used to represent an operator (a juggler) who manages several tasks (plates on vertical poles) concurrently. Several mixed (binary) integer linear programming models were developed in discrete time. One model was selected and solved by means of the tabu search heuristic. Within tabu search, the significance of different initial-solution-finding mechanisms and different applications of long-term memory was investigated. A conjecturing method, within the tabu search, was introduced for solving problems with very large planning horizons. In all cases, tabu search gave good-quality solutions in a much shorter time than branch-and-bound.
Under five different scenarios, ten subjects were studied while managing multiple concurrent tasks in the software environment. None of the subjects could gain a score better than tabu search in any of the scenarios. Subjects' patterns of attendance to tasks were analyzed and compared against the pattern suggested by tabu search, and similarities/differences were identified.
Graduation date: 2003
50

Yuan, Soe-Tsyr. « Knowledge-based decision model construction for hierarchical diagnosis and repair ». Thesis, 1994. http://hdl.handle.net/1957/35300.

Full text
Abstract:
Knowledge-Based Model Construction (KBMC) has generated a lot of attention due to its importance as a technique for generating probabilistic or decision-theoretic models, vastly increasing their range of applicability in AI. However, no one has tried to analyze the essential issues in KBMC, to determine whether there exists a general, efficient KBMC method for any problem domain, or to identify fruitful future research on KBMC. This research presents a unified framework for comparative analysis of KBMC systems, identifying the essential issues in KBMC, showing that there is no such general, efficient KBMC method, and listing fruitful future research on KBMC. This thesis then presents a new KBMC mechanism for hierarchical diagnosis and repair. Diagnosis is formulated as a stochastic process and modeled using influence diagrams. In the best case, using an abstraction hierarchy in problem solving can yield an exponential speedup in search efficiency. However, this speedup assumes backtracking never occurs across abstraction levels. When this assumption fails, search may have to consider different abstract solutions before finding one that can be refined to a base solution, and therefore search efficiency is not necessarily improved. In this thesis, we present a decision model construction method for hierarchical diagnosis and repair. We show analytically and experimentally that our method always yields a significant speedup in search efficiency, and that hierarchies with smaller branching factors yield more significant efficiency gains. This thesis employs two causal pathways (functional and bridge fault) of domain knowledge in device troubleshooting, so that neither whole class of faults is left undiagnosable. Each causal pathway models the knowledge of adjacency and behavior within the corresponding interaction layer. Careful search of causal pathways allows us to restrict the search space of fault hypotheses at each step.
We model this search among causal pathways decision-theoretically. Decision-theoretic control usually results in significant improvements over unaided human expert judgments. Furthermore, these improvements in performance are robust to substantial errors in the assessed costs and probabilities.
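The decision-theoretic control of search among causal pathways can be illustrated with a minimal expected-cost calculation of the kind an influence diagram encodes. The two fault classes match the abstract's pathways, but the probabilities and costs below are invented for the sketch.

```python
# Hypothetical diagnosis step: posterior belief over the two fault classes,
# per-class repair costs, and a penalty for repairing the wrong class.
p_fault = {"functional": 0.7, "bridge": 0.3}
repair_cost = {"functional": 5.0, "bridge": 8.0}
miss_penalty = 20.0  # cost of a failed repair attempt and retry

def expected_cost(action):
    """Pay the repair cost; with probability that the true fault lies in
    the other class, also pay the penalty for a failed repair."""
    return repair_cost[action] + (1 - p_fault[action]) * miss_penalty

# The decision-theoretic choice: pursue the pathway with minimum expected cost.
best_action = min(p_fault, key=expected_cost)
```

Here pursuing the functional pathway first (expected cost 11.0) beats pursuing the bridge-fault pathway (expected cost 22.0), even though bridge repairs are not vastly more expensive, because the posterior strongly favours a functional fault; this is the sense in which expected-cost control restricts the hypothesis search at each step.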
Graduation date: 1995