
Dissertations / Theses on the topic 'DES modeling'



Consult the top 50 dissertations / theses for your research on the topic 'DES modeling.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1

SOUSA, HENRIQUE PRADO. "INTEGRATING INTENTIONAL MODELING TO PROCESS MODELING." PONTIFÍCIA UNIVERSIDADE CATÓLICA DO RIO DE JANEIRO, 2012. http://www.maxwell.vrac.puc-rio.br/Busca_etds.php?strSecao=resultado&nrSeq=19928@1.

Full text
Abstract:
Business process modeling is used by companies that wish to document the execution flow of their processes in detail, resulting in a document rich in information about the business. This artifact is also used in software engineering for system requirements elicitation. Intentional modeling focuses on objectives, defined as goals and softgoals, and records the strategies an actor may follow to better meet its needs, mapping the required tasks and resources; it also addresses the dependencies between actors. It is important that business process models be aligned with the objectives of the organization, so as to provide a reliable source of information and, consequently, requirements aligned with the business. Several tools are available on the market to support the modeling of business processes and organizational objectives; however, the available solutions are still incomplete when it comes to integrating process models with goal models and to verifying the alignment between organizational goals and processes from the models. In the organizational architecture, business processes and goals are intrinsically interdependent, yet current modeling languages do not offer sufficient resources to treat processes and goals in an aligned way, since there are deficiencies in the integration between the goal modeling layer and the process modeling layer. Thus, the available tools that build on these languages and methods greatly complicate the task of identifying whether the processes used to generate products and services truly achieve the organizational goals, as well as the impact that changes in the goals would cause in the business processes. In this work we integrate a goal modeling language with a business process modeling language and provide the elements and methods needed to expand the capacity for analyzing the alignment between business processes and organizational strategies.
APA, Harvard, Vancouver, ISO, and other styles
2

Hansen, Daniel L. "Modeling." Thesis, Monterey, California. Naval Postgraduate School, 1989. http://hdl.handle.net/10945/27137.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Queiroz, Eurico Tiago Justino. "Modelling Benguela niños using the regional oceanic modeling system (ROMS)." Master's thesis, University of Cape Town, 2007. http://hdl.handle.net/11427/6499.

Full text
Abstract:
Includes bibliographical references (leaves 132-141).
Supervisor: Pierre Florenchie
This study is framed by three questions: firstly, could the Regional Oceanic Modelling System (ROMS) reproduce the seasonal cycle of the equatorial Atlantic? Secondly, what is the nature of the link between remote forcing in the western equatorial Atlantic and Benguela Niños/Niñas? Thirdly, what is the impact of these events on the equatorial Atlantic Ocean SST and circulation patterns? The results obtained suggest that the model is very sensitive to different wind stress forcings, particularly with respect to their impact on the mixed layer characteristics. As a result, the equatorial upwelling is overestimated on both temporal and spatial scales.
APA, Harvard, Vancouver, ISO, and other styles
4

Andersson, Conny. "Design of the Modelica Library VehProLib with Non-ideal Gas Models in Engines." Thesis, Linköpings universitet, Fordonssystem, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-121817.

Full text
Abstract:
This thesis covers the reconstruction and redesign of the modeling library VehProLib, which is constructed in the modeling language Modelica with the help of the modeling tool Wolfram SystemModeler. The design choices are discussed and implemented. This thesis also includes the implementation of a turbocharger package and an initial study of the justification of the ideal gas law in vehicle modeling. The study is made with the help of the Van der Waals equation of state as a reference non-ideal gas model. It is shown that for the mean-value engine model, the usage of the ideal gas law is justified.
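For orientation, the two equations of state compared in such a study are textbook material rather than anything specific to the thesis: the ideal gas law and the Van der Waals equation,

\[
pV = nRT,
\qquad
\left(p + \frac{an^2}{V^2}\right)\left(V - nb\right) = nRT,
\]

where a corrects for intermolecular attraction and b for finite molecular volume. Setting a = b = 0 recovers the ideal gas law, which is why comparing the two at engine operating conditions indicates whether the ideal-gas assumption is justified.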
APA, Harvard, Vancouver, ISO, and other styles
5

Milligan, Walter W. Jr. "Deformation modeling and constitutive modeling for anisotropic superalloys." Diss., Georgia Institute of Technology, 1988. http://hdl.handle.net/1853/19922.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Musunuri, Shravana Kumar. "Hybrid electric vehicle modeling in generic modeling environment." Master's thesis, Mississippi State : Mississippi State University, 2006. http://sun.library.msstate.edu/ETD-db/ETD-browse/browse.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Stollhoff, Rainer. "Modeling Prosopagnosia." Doctoral thesis, Universitätsbibliothek Leipzig, 2010. http://nbn-resolving.de/urn:nbn:de:bsz:15-qucosa-39600.

Full text
Abstract:
Prosopagnosia is defined as a profound deficit in facial identification which can be either acquired due to brain damage or is present from birth, i.e. congenital. Normally, faces and objects are processed in different parts of the inferotemporal cortex by distinct cortical systems for face vs. object recognition, an association of function and location. Accordingly, in acquired prosopagnosia locally restricted damage can lead to specific deficits in face recognition. However, in congenital prosopagnosia faces and objects are also processed in spatially separated areas. Accordingly, the face recognition deficit in congenital prosopagnosia can not be solely explained by the association of function and location. Rather, this observation raises the question why and how such an association evolves at all. So far, no quantitative or computational model of congenital prosopagnosia has been proposed and models of acquired prosopagnosia have focused on changes in the information processing taking place after inflicting some kind of "damage" to the system. To model congenital prosopagnosia, it is thus necessary to understand how face processing in congenital prosopagnosia differs from normal face processing, how differences in neuroanatomical development can give rise to differences in processing and, last but not least, why facial identification requires a specialized cortical processing system in the first place. In this work, a computational model of congenital prosopagnosia is derived from formal considerations, implemented in artificial neural network models of facial information encoding, and tested in experiments with prosopagnosic subjects. The main hypothesis is that the deficit in congenital prosopagnosia is caused by a failure to obtain adequate descriptions of individual faces: a predisposition towards a reduced structural connectivity in visual cortical areas enforces descriptions of visual stimuli that lack the amount of detail necessary to distinguish a specific exemplar from its population, i.e. achieve a successful identification. Formally, recognition tasks can be divided into identification tasks (separating a single individual from its sampling population) and classification tasks (partitioning the full object space into distinct classes). It is shown that a high dimensionality in the sensory representation facilitates individuation ("blessing of dimensionality"), but complicates estimation of object class representations ("curse of dimensionality"). The dimensionality of representations is then studied explicitly in a neural network model of facial encoding. Whereas optimal encoding entails a "holistic" (high-dimensional) representation, a constraint on the network connectivity induces a decomposition of faces into localized, "featural" (low-dimensional) parts. In an experimental validation, the perceptual deficit in congenital prosopagnosia was limited to holistic face manipulations and didn't extend to featural manipulations. Finally, an extensive and detailed investigation of face and object recognition in congenital prosopagnosia enabled a better behavioral characterization and the identification of subtypes of the deficit.
In contrast to previous models of prosopagnosia, here the developmental aspect of congenital prosopagnosia is incorporated explicitly into the model, quantitative arguments for a deficit that is task specific (identification) - and not necessarily domain specific (faces) - are provided for synthetic as well as real data (face images), and the model is validated empirically in experiments with prosopagnosic subjects.
APA, Harvard, Vancouver, ISO, and other styles
8

Дядечко, Алла Миколаївна, Алла Николаевна Дядечко, Alla Mykolaivna Diadechko, and V. O. Hlushchenko. "Computer modeling." Thesis, Видавництво СумДУ, 2011. http://essuir.sumdu.edu.ua/handle/123456789/13473.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Boström, Rikard, and Lars-Olof Moilanen. "Capacity profiling modeling for baseband applications." Thesis, Karlstad University, Faculty of Economic Sciences, Communication and IT, 2009. http://urn.kb.se/resolve?urn=urn:nbn:se:kau:diva-3352.

Full text
Abstract:

Real-time systems are systems which must produce a result within a given time frame. A result given outside of this time frame is as useless as not delivering any result at all. It is therefore essential to verify that real-time systems fulfill their timing requirements. A model of the system can facilitate the verification process. This thesis investigates two possible methods for modeling a real-time system with respect to CPU utilization and the latency of the different components in the system. The two methods are evaluated and one method is chosen for implementation. The studied system is the decoder of a WCDMA system which utilizes a real-time operating system called OSEck. The methodology of analyzing the system and different ways of obtaining measurements to base the model upon are described. The model was implemented using the simulation library VirtualTime, which contains a model of the previously mentioned operating system. Much work was spent acquiring input for the model, since the quality of the model depends largely on the quality of the analysis work. The model created contains two of the studied system's main components. This thesis identifies thorough system knowledge and efficient profiling methods as the key success factors when creating models of real-time systems.

APA, Harvard, Vancouver, ISO, and other styles
10

Akhlagi, Ali. "A Modelica-based framework for modeling and optimization of microgrids." Thesis, KTH, Energiteknik, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-263037.

Full text
Abstract:
Microgrids have lately drawn much attention due to their considerable financial benefits and the increasing concerns about environmental issues. A solution that can address different engineering problems - from design to operation - is desired for practical reasons and to ensure consistency of the analyses. In this thesis, the capabilities of a Modelica-based framework are investigated for various microgrid optimization problems. Various sizing and scheduling problems are successfully formulated and optimized using nonlinear and physical component models, covering both the electrical and thermal domains. Another focus of the thesis is to test the optimization platform when varying the problem formulation; performance and robustness tests have been performed with different boundary conditions and system setups. The results show that the technology can effectively handle complex scheduling strategies such as Model Predictive Control and Demand Charge Management. In sizing problems, although the platform can efficiently size the components while simultaneously solving for the economical load dispatch over short horizons (weekly or monthly), the implemented approach would require adaptations to become efficient on longer horizons (yearly).
APA, Harvard, Vancouver, ISO, and other styles
11

Shrestha, Shilu. "Software Modeling in Cyber-Physical Systems." Thesis, Linköpings universitet, Institutionen för datavetenskap, 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-111435.

Full text
Abstract:
A Cyber-Physical System (CPS) has a tight integration of computation, networking and physical process. It is a heterogeneous system that combines multiple domains consisting of both hardware and software systems. Cyber subsystems in the CPS implement the control strategy that affects the physical process. Therefore, software systems in the CPS are more complex. Visualization of a complex system provides a method of understanding complex systems by accumulating, grouping, and displaying components of systems in such a manner that they may be understood more efficiently just by viewing the model rather than understanding the code. Graphical representation of complex systems provides an intuitive and comprehensive way to understand the system. OpenModelica is the open source development environment based on the Modelica modeling and simulation language that consists of several interconnected subsystems. OMEdit is one of the subsystems integrated into OpenModelica. It is a graphical user interface for graphical modeling. It consists of tools that allow the user to create their own shapes and icons for the model. This thesis presents a methodology that provides an easy way of understanding the structure and execution of programs written in an imperative language like C through a graphical Modelica model.
APA, Harvard, Vancouver, ISO, and other styles
12

Daniel, Michael M. "Multiresolution statistical modeling with application to modeling groundwater flow." Thesis, Massachusetts Institute of Technology, 1997. http://hdl.handle.net/1721.1/10749.

Full text
Abstract:
Thesis (Ph. D.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 1997.
Includes bibliographical references (p. 205-211).
by Michael M. Daniel.
Ph.D.
APA, Harvard, Vancouver, ISO, and other styles
13

Xu, Yuanhan. "Modeling print time for a fused deposition modeling machine." Thesis, Massachusetts Institute of Technology, 2017. http://hdl.handle.net/1721.1/114325.

Full text
Abstract:
Thesis: M. Eng. in Advanced Manufacturing and Design, Massachusetts Institute of Technology, Department of Mechanical Engineering, 2017.
Cataloged from PDF version of thesis.
Includes bibliographical references (page 58).
A feature-based time estimation model is developed to predict the printing time of any given object on a fused deposition modeling machine. Preliminary experiments are conducted to detect the significant factors that cause the error between actual and estimated submission time. The proposed method models the movements and heating processes of the extruder nozzle and calculates print time from G-code. The result of the proposed model is also compared with the most downloaded print-time simulator on Google Play, demonstrating that the proposed model significantly improves the estimation of print time. Recommendations for future steps to improve the accuracy of the model are also presented.
by Yuanhan Xu.
M. Eng. in Advanced Manufacturing and Design
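As a rough illustration of the general idea (not the thesis's model, which also accounts for the extruder's heating and acceleration behavior), a naive G-code time estimator sums distance divided by feed rate over the linear moves; the default feed rate below is an arbitrary assumption:

import math
import re

def estimate_print_time_s(gcode_lines):
    # Naive estimate: time = sum of (move distance / feed rate) over G0/G1 moves.
    # Ignores acceleration, retraction, and nozzle heating.
    pos = {"X": 0.0, "Y": 0.0, "Z": 0.0}
    feed_mm_per_min = 1500.0  # assumed default feed rate
    total_min = 0.0
    for line in gcode_lines:
        fields = line.split()
        if not fields or fields[0] not in ("G0", "G1"):
            continue
        words = dict(re.findall(r"([XYZF])([-+]?[0-9]*\.?[0-9]+)", line))
        if "F" in words:
            feed_mm_per_min = float(words["F"])
        new = {ax: float(words.get(ax, pos[ax])) for ax in "XYZ"}
        dist = math.dist([pos[ax] for ax in "XYZ"], [new[ax] for ax in "XYZ"])
        total_min += dist / feed_mm_per_min
        pos = new
    return total_min * 60.0

print(estimate_print_time_s(["G1 X10 Y0 F1200", "G1 X10 Y10"]))  # 1.0 second

Comparing such a naive estimate against a feature-based model of the kind developed in the thesis shows how much of the error budget lies in the ignored heating and acceleration phases.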
APA, Harvard, Vancouver, ISO, and other styles
14

Shikur, H. (Henok). "Assessing modeling and visualization capabilities of modeling tools: limitations and gaps of the open source modeling tools." Master's thesis, University of Oulu, 2015. http://urn.fi/URN:NBN:fi:oulu-201502111072.

Full text
Abstract:
Due to the increasing number of Information and Communication Technology (ICT) environments, security is becoming a concern for many researchers and organisations. Organisations have implemented different security measures to protect their assets. Different industries, such as power plants and water, oil, and gas utilities, are adopting different network modelling tools for guarding their assets and are preparing for incidents that might occur in the future. Modelling tools are very important for the visualisation of computer networks. There are currently many modelling tools with different modelling and visualisation capabilities for computer networks. The aim of this research is to make a thorough assessment of the different modelling tools' capabilities for modelling computer networks and visualising computer network communication. Furthermore, it hopes to show areas for improvement in order to increase the quality of modelling tools based on industry requirements. The research methodology takes the form of a case study. First, the study analyses previous research in order to illustrate gaps in the literature, as well as identifying the strengths and weaknesses of existing network modelling tools. The empirical part of the research consists, first, of studying and evaluating seven open-source modelling tools based on different types of capabilities (which may limit the generalisability of the findings to some extent) and, second, of selecting four modelling tools for further study. Once the four modelling tools were evaluated based on literature reviews and the requirements set in this study, the top two open-source (OSS) modelling tool packages were selected, downloaded, installed, and evaluated further. The criteria used to evaluate the four modelling tools are based on the existing literature and on the requirements provided by the European company nSense, which provides vulnerability assessments, security consulting, and training. The evaluation of the tools resulted in screenshots that were copied and presented in this document for verification. Finally, the one tool which was most suitable for further studies, and which fulfilled most of the requirements set in this research, was recommended for further research. In total, four modelling tools were chosen for the evaluation, using different literature reviews based on the requirements of this research (see Appendix A). The results showed that the two top modelling tools were OMNeT++ and IMUNES. After practical analysis of these tools, OMNeT++ was found to be the best tool based on the aims and requirements of this research. Further, the study found that usability problems played a large part in evaluating different modelling tools, which might have changed the outcomes of the result. It can therefore be concluded that this type of evaluation is highly dependent on the evaluator's knowledge and skill, as well as the usability of the tool.
APA, Harvard, Vancouver, ISO, and other styles
15

Mamadapur, Monish Shivappa. "Constitutive modeling of fused deposition modeling acrylonitrile butadiene styrene (ABS)." College Station, Tex.: Texas A&M University, 2007. http://hdl.handle.net/1969.1/ETD-TAMU-2545.

Full text
APA, Harvard, Vancouver, ISO, and other styles
16

Perry, Stephen J. "Modeling operations other than war : non-combatants in combat modeling /." Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 1994. http://handle.dtic.mil/100.2/ADA285995.

Full text
APA, Harvard, Vancouver, ISO, and other styles
17

Coggin, David. "LIDAR IN COASTAL STORM SURGE MODELING: MODELING LINEAR RAISED FEATURES." Master's thesis, University of Central Florida, 2008. http://digital.library.ucf.edu/cdm/ref/collection/ETD/id/3362.

Full text
Abstract:
A method for extracting linear raised features from laser scanned altimetry (LiDAR) datasets is presented. The objective is to automate the method so that elements in a coastal storm surge simulation finite element mesh might have their edges aligned along vertical terrain features. Terrain features of interest are those that are high and long enough to form a hydrodynamic impediment while being narrow enough that the features might be straddled and not modeled if element edges are not purposely aligned. These features are commonly raised roadbeds but may occur due to other manmade alterations to the terrain or natural terrain. The implementation uses the TauDEM watershed delineation software included in the MapWindow open source Geographic Information System to initially extract watershed boundaries. The watershed boundaries are then examined computationally to determine which sections warrant inclusion in the storm surge mesh. Introductory work towards applying image analysis techniques as an alternate means of vertical feature extraction is presented as well. Vertical feature lines extracted from a LiDAR dataset for Manatee County, Florida are included in a limited storm surge finite element mesh for the county and Tampa Bay. Storm surge simulations using the ADCIRC-2DDI model with two meshes, one which includes linear raised features as element edges and one which does not, verify the usefulness of the method.
M.S.
Department of Civil and Environmental Engineering
Engineering and Computer Science
Civil Engineering MS
APA, Harvard, Vancouver, ISO, and other styles
18

Constantino, Carlos Augusto. "Hydraulic actuation system modeling: an analysis of high frequency modeling." Instituto Tecnológico de Aeronáutica, 2010. http://www.bd.bibl.ita.br/tde_busca/arquivo.php?codArquivo=1307.

Full text
Abstract:
The objective of this work was to develop a high-fidelity model, representative up to high frequencies, of a flight control system with hydraulic actuation in active-active mode. The usage of a fly-by-wire architecture and a hydraulic actuation system in active-active mode has brought new engineering challenges, such as the force-fight between actuators and the resulting consumption of structural fatigue life in normal and failure scenarios such as oscillatory malfunctions. Since these failure modes can exist up to high frequencies, it is necessary to develop a high-fidelity model of the flight control system that is representative up to high frequencies. The model developed here includes high-fidelity models of an EHSV, the hydraulic actuator, the position loop, and the control surface, as well as complementary models, so that the whole system can be analyzed up to high frequencies. The performance of the model was analyzed in step input response and frequency response, showing it to be close to the expected response of a real system. The frequency response of its components was also analyzed, showing the model to be representative up to high frequencies. Besides the performance analysis, the behavior in an oscillatory malfunction scenario was studied, showing the expected level of structural load as well as the fatigue life consumption, and demonstrating the need to monitor these types of failures.
APA, Harvard, Vancouver, ISO, and other styles
19

Singh, Raymond Charan. "Modeling Energy Harvesting From Membrane Vibrations using Multi-physics Modeling." Thesis, Virginia Tech, 2012. http://hdl.handle.net/10919/76793.

Full text
Abstract:
Given the ever-growing need for device autonomy and renewable sources of energy, energy harvesting has become an increasingly popular field of research. This research focuses on energy harvesting using the piezoelectric effect, from vibrating membrane structures by converting mechanical energy into electric energy. Specific applications of this research include powering components of bio-inspired micro air vehicles (MAVs), which require long range with as little regular maintenance as possible, and powering sensors for structural health monitoring on otherwise inaccessible locations (the roof of the Denver Int'l Airport is a good example). Coming up with an efficient, high-fidelity model of these systems allows for design optimization without the extensive use of experimental testing, as well as a deeper understanding of the physics involved. These are the twin goals of this research. This work describes a modeling algorithm using COMSOL, a multi-physics software, to predict the structural mechanics of and subsequent power harvested from a piezoelectric patch placed on a prestressed membrane structure. The model is verified by an FE comparison of the modeled system's dynamic response. For a 0.5 x 0.5 x 0.001 m nylon membrane with a 0.1 x 0.1 x 0.001 m piezoelectric patch placed on its corner, a maximum power output of ~10 microwatts was achieved, using a resistance of 100 Ohms and exciting the system around resonance. When the patch was placed on the side of the membrane, the power output was ~100 milliwatts. The ultimate goal is to estimate the energy harvested by a network of these piezoelectric patches and optimize the harvesting system based on the size, shape and location of the patches.
Master of Science
APA, Harvard, Vancouver, ISO, and other styles
20

DESHMUKH, PUSHKARAJ M. "MODELING ERROR ESTIMATION AND ADAPTIVE MODELING OF FUNCTIONALLY GRADED MATERIALS." University of Cincinnati / OhioLINK, 2004. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1096036755.

Full text
APA, Harvard, Vancouver, ISO, and other styles
21

Hafezi, Mohammad Hadi. "Peridynamic Modeling and Extending the Concept to Peri-Ultrasound Modeling." Diss., The University of Arizona, 2017. http://hdl.handle.net/10150/625456.

Full text
Abstract:
In this dissertation, a novel fast modeling technique called peri-ultrasound, which can model both linear and nonlinear ultrasonic behavior of materials, is developed and implemented. Nonlinear ultrasonic response can detect even very small material nonlinearity. Quantification of the material nonlinearity at the early stages of damage is important to avoid catastrophic failure and reduce repair costs. The developed model uses the nonlocal continuum-based peridynamic theory, which was found to be a good simulation tool for handling crack propagation modeling, in particular when multiple cracks grow simultaneously. The developed peri-ultrasound modeling tool has been used to model the ultrasonic response at the interface of two materials in the presence of an interface crack. Also, the stress wave propagation in a half-space (or half-plane for a two-dimensional problem) with boundary loading is investigated using peri-ultrasound modeling. In another simulation, the well-established two-dimensional Lamb's problem is investigated and the results are verified against the available analytical solution. Also, the interaction between the surface wave and a surface-breaking crack is studied.
APA, Harvard, Vancouver, ISO, and other styles
22

Polk, Brandon. "Modeling Naturally Occurring Wildfires Across the US Using Niche Modeling." OpenSIUC, 2017. https://opensiuc.lib.siu.edu/theses/2163.

Full text
Abstract:
Wildfires can cause significant damage to an area by destroying forested and agricultural areas, homes, and businesses, and can lead to the loss of life. Climate change may further increase the frequency of wildfires. Thus, developing a quick, simple, and accurate method for identifying the key drivers of wildfires and for modeling and predicting their occurrence is important and urgent. Various modeling methods have been developed and applied for this purpose. The objective of this study was to identify key drivers and search for an appropriate method for modeling and predicting natural wildfire occurrence in the United States. In this thesis, various vegetation, topographic and climate variables were examined, and key drivers were identified based on their spatial distributions and their correlations with natural wildfire occurrence. Five models, including Generalized Linear Models (GLM) with binomial and Poisson distributions, MaxEnt, Random Forests, Artificial Neural Networks, and Multivariate Adaptive Regression Splines, were compared for predicting natural wildfire occurrence across seven different climate regions of the United States. The comparisons were conducted using three datasets: LANDFIRE, consisting of thirteen variables describing vegetation, topography and disturbance; BIOCLIM, containing climate variables such as temperature and precipitation; and a composite dataset combining the most important variables from LANDFIRE and BIOCLIM after a multicollinearity test of the variables using the variance inflation factor (VIF). The results of this study showed that niche modeling techniques such as MaxEnt and GLM with logistic regression (binomial distribution) were an appropriate choice for modeling natural wildfire occurrence. MaxEnt provided highly accurate predictions of natural wildfire occurrence for most of the seven climate regions across the United States. This implies that MaxEnt offers a powerful solution for modeling natural wildfire occurrence in complex and highly specialized systems. This study also showed that although MaxEnt and GLM were quite similar, the two models produced very different spatial distributions of the probability of natural wildfire occurrence in some regions. Moreover, it was found that natural wildfire occurrence in the western regions was more influenced by precipitation and drought conditions, while in the eastern regions it was more affected by extreme temperatures.
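The VIF screening mentioned above is a standard procedure; a minimal sketch of how it is commonly implemented in Python (the threshold of 10 is a common rule of thumb, and the frame names are placeholders, not variables from the thesis):

import pandas as pd
from statsmodels.stats.outliers_influence import variance_inflation_factor

def vif_prune(X: pd.DataFrame, threshold: float = 10.0) -> pd.DataFrame:
    # Iteratively drop the predictor with the largest variance inflation
    # factor until every remaining VIF falls below the threshold.
    X = X.copy()
    while X.shape[1] > 1:
        vifs = pd.Series(
            [variance_inflation_factor(X.values, i) for i in range(X.shape[1])],
            index=X.columns,
        )
        if vifs.max() < threshold:
            break
        X = X.drop(columns=vifs.idxmax())
    return X

# e.g. predictors = vif_prune(pd.concat([landfire_vars, bioclim_vars], axis=1))

The final comment uses hypothetical frame names (landfire_vars, bioclim_vars) purely to show where the LANDFIRE and BIOCLIM variables would enter the composite dataset.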
APA, Harvard, Vancouver, ISO, and other styles
23

Depman, Albert J. III. "Stoker boiler CFD modeling improvements through alternative heat exchanger modeling." Thesis, University of Iowa, 2014. https://ir.uiowa.edu/etd/4609.

Full text
Abstract:
Accurate models and realistic simulations are essential in developing cleaner and more efficient coal- and biomass-fired boilers. Using the CFD simulation software Fluent, The University of Iowa created a model of an industrial boiler that adequately compares the practice of co-firing biomass and coal against firing only coal. The simulations used in this comparison show significant circulation zones and an unrealistic temperature profile inside the boiler heat exchanger region. This model is effective for comparing the relative decrease in emissions when co-firing with biomass versus exclusively coal combustion, but it does not present a realistic simulation of biomass or coal combustion. The purpose of the current work is to develop a more realistic baseline coal combustion model. Calculations for the proximate and ultimate analysis of coal, as well as properties necessary for energy and mass flux computations, have been updated in the current model. The fuel bed model - a simple two-dimensional distribution of energy and mass fluxes from the grate - was kept the same due to the complexities of fuel bed modeling. Simulation boundary conditions and flow models were tested and modified to determine the most realistic model settings. The geometry and mesh grid of the boiler model were also varied in an attempt to fix problematic areas. Several approaches were implemented in an effort to reduce the circulation zones and generate a realistic temperature profile. The negative energy source term in the boiler, representing the energy removed by the water pipes in the heat exchanger, was analyzed, and different configurations of this sink were tested. Finally, the heat exchanger models built into Fluent were studied and implemented. These models proved to be the most effective in reducing recirculation zones and decreasing high temperature gradients. While the current model of the coal-fired boiler has a higher overall temperature than the previous one, circulation zones are almost completely eliminated, the flow path has been improved, and the temperature profile in the boiler is more realistic.
APA, Harvard, Vancouver, ISO, and other styles
24

Peña, Reyes Carlos Andrés. "Coevolutionary fuzzy modeling /." [S.l.] : [s.n.], 2002. http://library.epfl.ch/theses/?display=detail&nr=2634.

Full text
APA, Harvard, Vancouver, ISO, and other styles
25

Nazarian, Bamshad. "Integrated Field Modeling." Doctoral thesis, Norwegian University of Science and Technology, Faculty of Engineering Science and Technology, 2003. http://urn.kb.se/resolve?urn=urn:nbn:no:ntnu:diva-84.

Full text
Abstract:

This research project studies the feasibility of developing and applying an integrated field simulator to simulate the production performance of an entire oil or gas field. It integrates the performance of the reservoir, the wells, the chokes, the gathering system, the surface processing facilities and, whenever applicable, gas and water injection systems.

The approach adopted for developing the integrated simulator is to couple existing commercial reservoir and process simulators using available linking technologies. The simulators are dynamically linked and customized into a single hybrid application that benefits from the concept of open software architecture. The integrated field simulator is linked to an optimization routine developed based on the genetic algorithm search strategies. This enables optimization of the system at field level, from the reservoir to the process. Modeling the wells and the gathering network is achieved by customizing the process simulator.

This study demonstrates that the integrated simulation improves current capabilities to simulate the performance of an entire field and optimize its design. This is achieved by evaluating design options including spread and layout of the wells and gathering system, processing alternatives, reservoir development schemes, and production strategies.

Effectiveness of the integrated simulator is demonstrated and tested through several field-level case studies that discuss and investigate technical problems relevant to offshore field development. The case studies cover topics such as process optimization, optimum tie-in of satellite wells into existing process facilities, optimal well location, and field layout assessment of a high pressure high temperature deepwater oil field.

Case study results confirm the viability of the total field simulator by demonstrating that the field performance simulation and optimal design were obtained in an automated process with reasonable computation time. No significant simplifying assumptions were required to solve the system and tedious manual data transfer between simulators, as conventionally practiced, was avoided.

APA, Harvard, Vancouver, ISO, and other styles
26

Ozkan, Sule. "Modeling Elementary Students." PhD thesis, METU, 2008. http://etd.lib.metu.edu.tr/upload/2/12610168/index.pdf.

Full text
Abstract:
This study aimed to explore the relationships between elementary students' epistemological beliefs, learning approaches, self-regulated learning strategies, and their science achievement. In this investigation, a model of the potential associations among these variables was proposed and tested by using structural equation modeling. It was hypothesized that (a) students' epistemological beliefs would directly influence their learning approaches, self-regulated learning strategies, and science achievement, (b) students' adopted learning approaches and their use of self-regulated learning strategies would be related with science achievement, and (c) students' learning approaches were expected to be related with their use of self-regulated strategies. A total of 1240 seventh graders from 21 public elementary schools throughout the Çankaya district of Ankara completed measures designed to assess students' (a) epistemological beliefs (beliefs about the Certainty of Knowledge, Development of Knowledge, Source of Knowing, and Justification for Knowing), (b) adopted learning approaches (either rote or meaningful), (c) use of self-regulated learning strategies, and (d) science achievement. Separate confirmatory factor analyses were conducted to determine the structure of students' epistemological beliefs and their adopted learning approaches. While the factor analyses of students' responses to the epistemological beliefs questionnaire supported the multidimensional nature of these beliefs, some features distinct from the findings of Western countries were identified. Socio-cultural influences were proposed to account for the observed differences in the factor structure obtained with the Turkish sample. The results of the structural equation modeling, while supporting some of the proposed hypotheses, contradicted others. Epistemological beliefs emerged as a major contributor to learning approaches and science achievement as expected, whereas those beliefs could not be used as a predictor of self-regulated learning strategies. In addition, students' adopted learning approaches were found to be a predictor of their self-regulated learning strategies, which in turn influence science achievement in the model. Contrary to expectations, students' learning approaches were not found to be directly related with their science achievement.
APA, Harvard, Vancouver, ISO, and other styles
27

Fullerton, A. W. "Discussion: Spectral modeling." Universität Potsdam, 2007. http://opus.kobv.de/ubp/volltexte/2008/1791/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
28

Vink, J. S. "Discussion: Hydrodynamic modeling." Universität Potsdam, 2007. http://opus.kobv.de/ubp/volltexte/2008/1804/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
29

Dingle, Brent Michael. "Volumetric particle modeling." Texas A&M University, 2003. http://hdl.handle.net/1969.1/5937.

Full text
Abstract:
This dissertation presents a robust method of modeling objects and forces for computer animation. Within this method, objects and forces are represented as particles. As in most modeling systems, the movement of objects is driven by physically based forces. The usage of particles, however, allows more artistically motivated behavior to be achieved and also allows the modeling of heterogeneous objects and objects in different state phases: solid, liquid or gas. By using invisible particles to propagate forces through the modeling environment, complex behavior is achieved through the interaction of relatively simple components. In sum, 'macroscopic' behavior emerges from 'microscopic' modeling. We present a newly developed modeling framework expanding on related work. This framework allows objects and forces to be modeled using particle representations and provides the details on how objects are created, how they interact, and how they may be displayed. We present examples to demonstrate the viability and robustness of the developed method of modeling. They illustrate the breaking and fracturing of solids, the interaction of objects in different phase states, and the achievement of a reasonable balance between artistic and physically based behaviors.
APA, Harvard, Vancouver, ISO, and other styles
30

Jansson, Johan. "Automated Computational Modeling." Doctoral thesis, Chalmers University of Technology, 2006. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-52828.

Full text
APA, Harvard, Vancouver, ISO, and other styles
31

Jöhnemark, Alexander. "Modeling Operational Risk." Thesis, KTH, Matematisk statistik, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-107435.

Full text
Abstract:
The Basel II accord requires banks to put aside a capital buffer against unexpected operational losses resulting from inadequate or failed internal processes, people and systems, or from external events. Under the sophisticated Advanced Measurement Approach, banks are given the opportunity to develop their own model to estimate operational risk. This report focuses on a loss distribution approach based on a set of real data. First, a comprehensive data analysis was made, which suggested that the observations belonged to a heavy-tailed distribution. An evaluation of commonly used distributions was performed. The evaluation resulted in the choice of a compound Poisson distribution to model frequency and a piecewise defined distribution, with an empirical body and a generalized Pareto tail, to model severity. The frequency distribution and the severity distribution define the loss distribution, from which Monte Carlo simulations were made in order to estimate the 99.9% quantile, also known as the regulatory capital. Conclusions made along the way were that including all operational risks in a model is hard, but possible, and that extreme observations have a huge impact on the outcome.
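As a rough illustration of the loss distribution approach described above, the sketch below simulates aggregate annual losses from a Poisson frequency and a generalized Pareto severity and reads off the 99.9% quantile. All parameter values are made up, and a pure GPD severity stands in for the thesis's piecewise empirical-body/GPD-tail distribution:

import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(42)

lam = 25.0           # mean number of losses per year (illustrative)
xi, beta = 0.6, 5e4  # GPD shape and scale for severities (illustrative)

n_years = 100_000
annual_loss = np.empty(n_years)
for i in range(n_years):
    n = rng.poisson(lam)  # number of losses this year
    severities = genpareto.rvs(xi, scale=beta, size=n, random_state=rng)
    annual_loss[i] = severities.sum()  # aggregate annual loss

# The 99.9% quantile of the aggregate annual loss is the regulatory capital figure.
print(f"99.9% quantile: {np.quantile(annual_loss, 0.999):,.0f}")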
APA, Harvard, Vancouver, ISO, and other styles
32

Chaqchaq, Othmane. "Fixed Income Modeling." Thesis, KTH, Matematisk statistik, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-192372.

Full text
Abstract:
Besides financial analysis, quantitative tools play a major role in asset management. By managing the aggregation of large amounts of historical and prospective data on different asset classes, they can give portfolio allocation solutions with respect to risk and regulatory constraints. Asset class modeling requires three main steps. The first is to assess the product features (risk premium and risks) by considering historical and prospective data, which in the case of fixed income depends on spread and default levels. The second is choosing the quantitative model; in this study we introduce a new credit model which, unlike equity-like models, models default as a main feature of fixed income performance. The final step consists of calibrating the model. We start this study with the modeling of bond classes and study their behavior in asset allocation; we then model the capital solution transaction as an example of a fixed income structured product.
APA, Harvard, Vancouver, ISO, and other styles
33

Forde, Cameron E. "Modeling biological iron." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1997. http://www.collectionscanada.ca/obj/s4/f2/dsk2/tape16/PQDD_0021/NQ27644.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
34

Aguenaou, Karim. "Modeling of solidification." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1997. http://www.collectionscanada.ca/obj/s4/f2/dsk2/ftp02/NQ36950.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
35

Sargent, Aitbala. "Modeling Ice Streams." Fogler Library, University of Maine, 2009. http://www.library.umaine.edu/theses/pdf/SargentA2009.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
36

Heckmann, Dominikus. "Ubiquitous user modeling." Berlin Aka, 2005. http://deposit.d-nb.de/cgi-bin/dokserv?id=2860787&prov=M&dok_var=1&dok_ext=htm.

Full text
APA, Harvard, Vancouver, ISO, and other styles
37

Heckmann, Dominikus. "Ubiquitous user modeling /." Berlin : Akad. Verl.-Ges. Aka, 2006. http://deposit.d-nb.de/cgi-bin/dokserv?id=2860787&prov=M&dok_var=1&dok_ext=htm.

Full text
APA, Harvard, Vancouver, ISO, and other styles
38

Walås, Gustav. "Modeling deposit prices." Thesis, KTH, Matematisk statistik, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-122306.

Full text
Abstract:
This report investigates whether there are sufficient differences between a bank's depositors to motivate price discrimination. This is done by looking at time series of individual depositors to try to find predictors by a regression analysis. To be able to conclude on the value of more stable deposits for the bank, and hence deduce a price, one also needs to look at regulatory aspects of deposits and different depositors. Once these qualities of a deposit have been assigned by both the bank and regulator, they need to be transformed into a price. This is done by replication with market funding instruments.
APA, Harvard, Vancouver, ISO, and other styles
39

Telang, Pankaj Ramesh. "Multiagent Business Modeling." Thesis, North Carolina State University, 2013. http://pqdtopen.proquest.com/#viewpdf?dispub=3575893.

Full text
Abstract:

Cross-organizational business processes are common in today’s economy. Of necessity, enterprises conduct their business in cooperation to create products and services for the marketplace. Thus business processes inherently involve autonomous partners with heterogeneous software designs and implementations. The existing business modeling approaches that employ high-level abstractions are difficult to operationalize, and the approaches that employ low-level abstractions lead to highly rigid processes that lack business semantics. We propose a novel business model based on multiagent abstractions. Unlike existing approaches, our model gives primacy to the contractual relationships among the business partners, thus providing a notion of business-level correctness, and offers flexibility to the participants. Our approach employs reusable patterns as building blocks to model recurring business scenarios. A step-by-step methodology guides a modeler in constructing a business model. Our approach employs temporal logic to formalize the correctness properties of a business model, and model checking to verify if a given operationalization satisfies those properties. Developer studies found that our approach yields improved model quality compared to the traditional approaches from the supply chain and healthcare domains.

Commitments capture how an agent relates with another agent, whereas goals describe states of the world that an agent is motivated to bring about. It makes intuitive sense that goals and commitments be understood as being complementary to each other. More importantly, an agent’s goals and commitments ought to be coherent, in the sense that an agent’s goals would lead it to adopt or modify relevant commitments and an agent’s commitments would lead it to adopt or modify relevant goals. However, despite the intuitive naturalness of the above connections, they have not yet been studied in a formal framework. This dissertation provides a combined operational semantics for goals and commitments. Our semantics yields important desirable properties, including convergence of the configurations of cooperating agents, thereby delineating some theoretically well-founded yet practical modes of cooperation in a multiagent system.

We formalize the combined operational semantics of achievement commitments and goals in terms of hierarchical task networks (HTNs) and show how HTN planning provides a natural representation and reasoning framework for them. Our approach combines a domain-independent theory capturing the lifecycles of goals and commitments, generic patterns of reasoning, and domain models. We go beyond existing approaches by proposing a first-order representation that accommodates settings where the commitments and goals are templatic and may be applied repeatedly with differing bindings for domain objects. Doing so not only leads to a more perspicuous modeling, it also enables us to support a variety of practical patterns.

APA, Harvard, Vancouver, ISO, and other styles
40

White, Scott Alan. "Modeling process redesign." Thesis, Monterey, California. Naval Postgraduate School, 1992. http://hdl.handle.net/10945/23967.

Full text
APA, Harvard, Vancouver, ISO, and other styles
41

Marinsalta, Daniel Alberto. "Optical excisor modeling." Thesis, Monterey, California. Naval Postgraduate School, 1989. http://hdl.handle.net/10945/26131.

Full text
APA, Harvard, Vancouver, ISO, and other styles
42

Hart, Justin Wildrick. "Robot Self-Modeling." Thesis, Yale University, 2015. http://pqdtopen.proquest.com/#viewpdf?dispub=3582284.

Full text
Abstract:

Traditionally, models of a robot's kinematics and sensors have been provided by designers through manual processes. Such models are used for sensorimotor tasks, such as manipulation and stereo vision. However, these techniques often yield static models based on one-time calibrations or ideal engineering drawings; models that often fail to represent the actual hardware, or in which individual unimodal models, such as those describing kinematics and vision, may disagree with each other.

Humans, on the other hand, are not so limited. One of the earliest forms of self-knowledge learned during infancy is knowledge of the body and senses. Infants learn about their bodies and senses through the experience of using them in conjunction with each other. Inspired by this early form of self-awareness, the research presented in this thesis attempts to enable robots to learn unified models of themselves through data sampled during operation. In the presented experiments, an upper torso humanoid robot, Nico, creates a highly-accurate self-representation through data sampled by its sensors while it operates. The power of this model is demonstrated through a novel robot vision task in which the robot infers the visual perspective representing reflections in a mirror by watching its own motion reflected therein.

In order to construct this self-model, the robot first infers the kinematic parameters describing its arm. This is first demonstrated using an external motion capture system, then implemented in the robot's stereo vision system. In a process inspired by infant development, the robot then mutually refines its kinematic and stereo vision calibrations, using its kinematic structure as the invariant against which the system is calibrated. The product of this procedure is a very precise mutual calibration between these two, traditionally separate, models, producing a single, unified self-model.

The robot then uses this self-model to perform a unique vision task. Knowledge of its body and senses enables the robot to infer the position of a mirror placed in its environment. From this, an estimate of the visual perspective describing reflections in the mirror is computed, which is subsequently refined over the expected position of images of the robot's end-effector as reflected in the mirror, and their real-world, imaged counterparts. The computed visual perspective enables the robot to use the mirror as an instrument for spatial reasoning, by viewing the world from its perspective. This test utilizes knowledge that the robot has inferred about itself through experience, and approximates tests of mirror use that are used as a benchmark of self-awareness in human infants and animals.

APA, Harvard, Vancouver, ISO, and other styles
43

Xiao, Hui. "MIMO channel modeling." Thesis, University of York, 2008. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.479187.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

Carr, Justin P. "Modeling Volatility Derivatives." Digital WPI, 2011. https://digitalcommons.wpi.edu/etd-theses/1117.

Full text
Abstract:
"The VIX was introduced in 1993 by the CBOE and has been commonly referred to as the fear gauge due to decreases in market sentiment leading market participants to purchase protection from declining asset prices. As market sentiment improves, declines in the VIX are generally observed. In reality the VIX measures the markets expectations about future volatility with asset prices either rising or falling in value. With the VIX gaining popularity in the marketplace a proliferation of derivative products has emerged allowing investors to trade volatility. In observance of the behavior of the VIX we attempt to model the derivative VXX as a mean reverting process via the Ornstein-Uhlenbeck stochastic differential equation. We extend this analysis by calibrating VIX options with observed market prices in order to extract the market density function. Using these parameters as the diffusion process in our Ornstein-Uhlenbeck model we derive futures prices on the VIX which serves to value our target derivative VXX."
APA, Harvard, Vancouver, ISO, and other styles
45

Szummer, Marcin Olof. "Temporal texture modeling." Thesis, Massachusetts Institute of Technology, 1995. http://hdl.handle.net/1721.1/11210.

Full text
Abstract:
Thesis (M. Eng.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 1995.
Includes bibliographical references (p. 63-67).
by Marcin Olof Szummer.
M.Eng.
APA, Harvard, Vancouver, ISO, and other styles
46

Mendel, Lucy (Lucy R.). "Modeling By Example." Thesis, Massachusetts Institute of Technology, 2007. http://hdl.handle.net/1721.1/45974.

Full text
Abstract:
Thesis (M. Eng.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2007.
Includes bibliographical references (p. 105-109).
Software developers use modeling to explore design alternatives before investing in the higher costs of building the full system. Unlike constructing specific examples, constructing general models is challenging and error-prone. Modeling By Example (MBE) is a new tool designed to help programmers construct general models faster and without errors. Given an object model and an acceptable, or included, example, MBE generates near-hit and near-miss examples for the user to mark as included or not by their mental goal model. The marked examples form a training data-set from which MBE constructs the user's general model. By generating examples dynamically to direct its own learning, MBE learns the concrete goal model with a significantly smaller training data set size than conventional instance-based learning techniques. Empirical experiments show that MBE is a practical solution for constructing simple structural models, but even with a number of optimizations to improve performance does not scale to learning complex models.
by Lucy Mendel.
M.Eng.
APA, Harvard, Vancouver, ISO, and other styles
47

Chou, Danielle 1981. "Dahl friction modeling." Thesis, Massachusetts Institute of Technology, 2004. http://hdl.handle.net/1721.1/32826.

Full text
Abstract:
Thesis (S.B.)--Massachusetts Institute of Technology, Dept. of Mechanical Engineering, 2004.
Includes bibliographical references (p. 22).
The drive behind improved friction models has been better prediction and control of dynamic systems. The earliest model was of classical Coulomb friction; however, the discontinuity during force reversal in the Coulomb model has long been a point of contention, since such a discontinuity does not accurately portray the behavior of real systems. Other models have been suggested, but variations of the Dahl solid friction model remain some of the simplest yet most useful. Dahl's original theory proposed that friction behaves as a stress acting upon the quantum mechanical bonds at the interface. Thus, the relationship between frictional force and position is analogous to a stress-strain curve, complete with hysteresis should there be permanent displacement akin to plastic deformation in materials. This project reviews the variations of the Dahl friction model popular in the literature and then demonstrates the model both in simulation, via Matlab and Simulink, and experimentally, by observing the behavior of a limited-angle torque motor.
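
For reference, the standard first-order Dahl model (the alpha = 1 case) writes the friction force as dF/dt = sigma0 * v * (1 - (F/Fc) * sgn(v)), which saturates at the Coulomb level Fc and produces the hysteresis the abstract describes. A minimal simulation sketch, with illustrative parameters rather than the thesis's motor data:
```python
import numpy as np

# First-order Dahl friction model,
#   dF/dt = sigma0 * v * (1 - (F / Fc) * sign(v)),
# driven by a sinusoidal displacement. Plotting F against x would trace the
# force-position hysteresis loop analogous to a stress-strain curve.
sigma0 = 1e4   # rest stiffness [N/m] (illustrative)
Fc = 1.0       # Coulomb friction level [N] (illustrative)

t = np.linspace(0.0, 2.0, 20001)
dt = t[1] - t[0]
x = 1e-3 * np.sin(2 * np.pi * t)   # position [m]
v = np.gradient(x, dt)             # velocity [m/s]

F = np.zeros_like(t)
for i in range(len(t) - 1):
    dF = sigma0 * v[i] * (1.0 - (F[i] / Fc) * np.sign(v[i]))
    F[i + 1] = F[i] + dF * dt

# The force should build up and saturate near the Coulomb level Fc.
print(f"max |F| = {np.abs(F).max():.3f} N (Fc = {Fc} N)")
```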
by Danielle Chou.
S.B.
APA, Harvard, Vancouver, ISO, and other styles
48

Chen, Guoqiang. "Fuel volatility modeling." Thesis, Massachusetts Institute of Technology, 1994. http://hdl.handle.net/1721.1/12282.

Full text
APA, Harvard, Vancouver, ISO, and other styles
49

Sim, Nicholas. "Modeling Quantile Dependence." Thesis, Boston College, 2009. http://hdl.handle.net/2345/2467.

Full text
Abstract:
Thesis advisor: Zhijie Xiao
In recent years, quantile regression has achieved increasing prominence as a quantitative method of choice in applied econometric research. The methodology focuses on how the quantile of the dependent variable is influenced by the regressors, thus providing the researcher with a rich picture of how the relationship with the covariates varies across the distribution. In this dissertation, I consider two quantile regression models where the information set may contain quantiles of the regressors. Such frameworks capture the dependence between quantiles - the quantile of the dependent variable and the quantile of the regressors - which I call models of quantile dependence. These models are very useful from the applied researcher's perspective, as they are able to uncover complex dependence behavior and can be easily implemented using statistical packages meant for standard quantile regressions. The first chapter considers an application of the quantile dependence model in empirical finance. One of the most important parameters of interest in risk management is the correlation coefficient between stock returns. Knowing how correlation behaves is especially important in bear markets, as correlations become unstable and increase quickly, so the benefits of diversification are diminished precisely when they are needed most. In this chapter, I argue that it remains a challenge to estimate variations in correlations. In the literature, either a regime-switching model is used, which can only estimate correlation in a finite number of states, or a model based on extreme-value theory is used, which can only estimate correlation between the tails of the returns series. Interpreting the quantile of the stock return as carrying information about the state of the financial market, this chapter proposes to model the correlation between quantiles of stock returns. For instance, the correlation between the 10th percentiles of stock returns, say the U.S. and the U.K. returns, reflects correlation when the U.S. and U.K. markets are in a bearish state. One can also model the correlation between the 60th percentile of one series and the 40th percentile of another, which is not possible using existing tools in the literature. For this purpose, I propose a nonlinear quantile regression where the regressor is a conditional quantile itself, so that the left-hand-side variable is a quantile of one stock return and the regressor is a quantile of the other return. The conditional quantile regressor is an unknown object, hence feasible estimation entails replacing it with its fitted counterpart, which then gives rise to problems in inference. In particular, inference in the presence of generated quantile regressors will be invalid when conventional standard errors are used. However, validity is restored when a correction term is introduced into the regression model. In the empirical section, I investigate the dependence between the quantile of U.S. MSCI returns and the quantile of MSCI returns for eight other countries, including Canada and major equity markets in Europe and Asia. Using regression models based on the Gaussian and Student-t copulas, I construct correlation surfaces that reflect how the correlations between quantiles of these market returns behave. Generally, the correlations tend to rise gradually as the markets become increasingly bearish, as reflected by the returns jointly declining.
In addition, correlations tend to rise when markets are increasingly bullish, although the magnitude is smaller than the increase associated with bear markets. The second chapter considers an application of the quantile dependence model in empirical macroeconomics, examining the money-output relationship. One area in this line of research focuses on the asymmetric effects of monetary policy on output growth. In particular, letting the negative residuals estimated from a money equation represent contractionary monetary policy shocks and the positive residuals represent expansionary shocks, it has been widely established that output growth declines more following a contractionary shock than it increases following an expansionary shock of the same magnitude. However, correctly identifying episodes of contraction and expansion in this manner presupposes that the true monetary innovation has a zero population mean, which is not verifiable. Therefore, I propose interpreting the quantiles of the monetary shocks as carrying information about the monetary policy stance. For instance, the 10th percentile shock reflects a restrictive stance relative to the 90th percentile shock, and the ranking of shocks is preserved regardless of shifts in the shock's distribution. This idea motivates modeling output growth as a function of the quantiles of monetary shocks. In addition, I consider modeling the quantile of output growth, which will enable policymakers to ascertain whether certain monetary policy objectives, as indexed by quantiles of monetary shocks, will be more effective in particular economic states, as indexed by quantiles of output growth. This calls for a unified framework that models the relationship between the quantile of output growth and the quantile of monetary shocks. The framework employs a power series method to estimate quantile dependence. Monte Carlo experiments demonstrate that regressions based on cubic or quartic expansions estimate the quantile dependence relationships well, with reasonable bias properties and root-mean-squared errors. Hence, using the cubic and quartic regression models with M1 or M2 money supply growth as monetary instruments, I show that the right tail of the output growth distribution is generally more sensitive to M1 money supply shocks, while both tails of the output growth distribution are more sensitive than the center to M2 money supply shocks, implying that monetary policy is more effective in periods of very low and very high growth rates. In addition, when non-neutral, the influence of monetary policy on output growth is stronger when policy is restrictive than when it is expansive, which is consistent with previous findings on the asymmetric effects of monetary policy on output.
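
The building block of these models, a standard quantile regression estimated at several quantiles, can be run with off-the-shelf tools. The sketch below uses synthetic data and statsmodels' QuantReg; it does not implement the dissertation's correction for generated quantile regressors or its copula-based correlation surfaces.
```python
import numpy as np
import statsmodels.api as sm

# Standard quantile regression at several quantiles: the basic machinery
# that quantile dependence models build on. Data are synthetic, with
# heteroskedastic errors so that the slope differs across quantiles.
rng = np.random.default_rng(0)
n = 2000
x = rng.normal(size=n)                              # e.g., a market return
y = 0.5 * x + (0.5 + 0.4 * np.abs(x)) * rng.normal(size=n)

X = sm.add_constant(x)
for q in (0.1, 0.5, 0.9):
    res = sm.QuantReg(y, X).fit(q=q)
    print(f"q = {q}: intercept = {res.params[0]:+.3f}, "
          f"slope = {res.params[1]:+.3f}")
```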
Thesis (PhD) — Boston College, 2009
Submitted to: Boston College. Graduate School of Arts and Sciences
Discipline: Economics
APA, Harvard, Vancouver, ISO, and other styles
50

Kriek, Andre. "RoboCup formation modeling." Thesis, Stellenbosch : University of Stellenbosch, 2009. http://hdl.handle.net/10019.1/2810.

Full text
Abstract:
Thesis (MSc (Mathematical Sciences. Computer Science))--University of Stellenbosch, 2009.
Since the late 1990s, the Robot Soccer World Cup has been used as a testing ground for new technology in the field of robotic design and artificial intelligence. This research initiative pits two teams of robots against each other in a game of soccer. It is hoped that the technology gained will enable the construction of a fully autonomous team of robot players to play a normal soccer game against a human team by the year 2050. In robot soccer matches, as in real soccer matches, inferring an opponent's strategy can give a team a major advantage. One important aspect of a team's strategy is the formation the team uses. Knowing the formations that an opposing team tends to take enables a team to prepare appropriate countermeasures. This thesis investigates methods to extract formation information from a completed soccer game. The results show that these methods can be used to infer a classical team formation, as well as other distinguishing characteristics of the players, such as which areas on the field the players tend to occupy, or the players' movement patterns - both valuable items of information for a future opposition team.
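
One simple way to recover formation-like structure from a finished game's position logs, in the spirit of this thesis though not its actual method, is to cluster players' time-averaged positions into lines and read off the line counts. A sketch with synthetic position data:
```python
import numpy as np
from sklearn.cluster import KMeans

# Cluster each player's time-averaged (x, y) position into three "lines"
# (defense, midfield, attack) and read the counts as a formation string.
# The position log below is synthetic; this is not the thesis's method.
rng = np.random.default_rng(1)

n_frames = 6000
# Nominal 4-3-3 layout: x increases toward the opponent's goal.
base = np.array([[-30.0, -20.0], [-30.0, -7.0], [-30.0, 7.0], [-30.0, 20.0],
                 [-5.0, -15.0], [-5.0, 0.0], [-5.0, 15.0],
                 [20.0, -15.0], [20.0, 0.0], [20.0, 15.0]])
positions = base[None, :, :] + rng.normal(scale=3.0,
                                          size=(n_frames,) + base.shape)

mean_pos = positions.mean(axis=0)  # one average (x, y) per player

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(mean_pos)
# Order the clusters from defense (small x) to attack (large x).
order = np.argsort([mean_pos[labels == k, 0].mean() for k in range(3)])
print("inferred formation:",
      "-".join(str(int((labels == k).sum())) for k in order))
```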
APA, Harvard, Vancouver, ISO, and other styles
