Theses on the topic "Modeling"

Consult the top 50 theses for your research on the topic "Modeling".

Next to every source in the list of references there is an "Add to bibliography" button. Click it, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Vancouver, Chicago, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse theses on a wide variety of disciplines and organize your bibliography correctly.

1

Sousa, Henrique Prado. "Integrating Intentional Modeling to Process Modeling". Pontifícia Universidade Católica do Rio de Janeiro, 2012. http://www.maxwell.vrac.puc-rio.br/Busca_etds.php?strSecao=resultado&nrSeq=19928@1.

Full text
Abstract
Business process modeling is used by companies that wish to document the execution flow of their processes in detail, resulting in a document rich in information about the business. This artifact is also used in software engineering for eliciting system requirements. Intentional modeling focuses on objectives, defined as goals and softgoals, and records the strategies an actor may follow to best meet its needs, mapping the required tasks and resources; it also addresses the dependencies between actors. It is important that business process models be aligned with the objectives of the organization so that they provide a reliable source of information and, consequently, generate requirements aligned with the business. Several tools are available on the market to support the modeling of business processes and organizational objectives; however, the available solutions are still incomplete when it comes to integrating process models with goal models and to verifying, from the models, the alignment between processes and organizational objectives. In the organizational architecture, business processes and goals are intrinsically interdependent, yet current modeling languages do not offer sufficient resources to treat processes and goals in an aligned way, since there are deficiencies in the integration between the goal modeling layer and the process modeling layer. As a result, the available tooling built on these languages and methods greatly complicates the task of identifying whether the processes used to generate products and services truly achieve the organization's objectives, as well as the impact that changes in those objectives would cause in the business processes. In this work we integrate a goal modeling language with a business process modeling language and provide the elements and methods needed to expand the capacity to analyze the alignment of business processes with organizational strategies.
2

Hansen, Daniel L. "Modeling". Thesis, Monterey, California. Naval Postgraduate School, 1989. http://hdl.handle.net/10945/27137.

Full text
3

Queiroz, Eurico Tiago Justino. "Modelling Benguela Niños using the regional oceanic modeling system (ROMS)". Master's thesis, University of Cape Town, 2007. http://hdl.handle.net/11427/6499.

Full text
Abstract
Includes bibliographical references (leaves 132-141).
Pierre Florenchie
This study is framed by three questions: firstly, could the Regional Oceanic Modelling System (ROMS) reproduce the seasonal cycle of the equatorial Atlantic? Secondly, what is the nature of the link between remote forcing in the western equatorial Atlantic and Benguela Niños/Niñas? Thirdly, what is the impact of these events on the equatorial Atlantic Ocean SST and circulation patterns? The results obtained suggest that the model is very sensitive to different wind stress forcing, particularly in respect of the impact on the mixed layer characteristics. As a result the equatorial upwelling is overestimated in both temporal and spatial scales.
4

Andersson, Conny. "Design of the Modelica Library VehProLib with Non-ideal Gas Models in Engines". Thesis, Linköpings universitet, Fordonssystem, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-121817.

Full text
Abstract
This thesis covers the reconstruction and redesign of the modeling library VehProLib, which is constructed in the modeling language Modelica with the help of the modeling tool Wolfram SystemModeler. The design choices are discussed and implemented. The thesis also includes the implementation of a turbocharger package and an initial study of the justification of the ideal gas law in vehicle modeling. The study is made with the help of the Van der Waals equation of state as a reference non-ideal gas model. It is shown that, for the mean-value engine model, the usage of the ideal gas law is justified.
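As an illustrative aside (not code from the thesis), the short Python sketch below contrasts the ideal gas law with the Van der Waals equation of state for rough, assumed engine-like conditions; the constants a and b are generic literature values for air, and the state point is invented for the example.

    # Illustrative sketch only: ideal-gas vs. Van der Waals pressure for an air-like gas.
    R = 8.314        # J/(mol K), universal gas constant
    a = 0.1358       # Pa m^6/mol^2, approximate Van der Waals 'a' for air (assumed)
    b = 3.64e-5      # m^3/mol, approximate Van der Waals 'b' for air (assumed)

    def p_ideal(n, T, V):
        """Ideal gas law: p = n R T / V."""
        return n * R * T / V

    def p_vdw(n, T, V):
        """Van der Waals equation of state: (p + a n^2/V^2)(V - n b) = n R T."""
        return n * R * T / (V - n * b) - a * n ** 2 / V ** 2

    n, T, V = 0.02, 900.0, 5.0e-5    # mol, K, m^3 (assumed in-cylinder state)
    pi, pv = p_ideal(n, T, V), p_vdw(n, T, V)
    print(f"ideal: {pi / 1e5:.2f} bar, Van der Waals: {pv / 1e5:.2f} bar, "
          f"relative difference: {abs(pv - pi) / pi:.2%}")

For moderate gas densities like this the two predictions differ only by a fraction of a percent, which is in line with the thesis's conclusion that the ideal gas law is justified for the mean-value engine model.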
5

Milligan, Walter W. Jr. "Deformation modeling and constitutive modeling for anisotropic superalloys". Diss., Georgia Institute of Technology, 1988. http://hdl.handle.net/1853/19922.

Full text
6

Musunuri, Shravana Kumar. "Hybrid electric vehicle modeling in generic modeling environment". Master's thesis, Mississippi State : Mississippi State University, 2006. http://sun.library.msstate.edu/ETD-db/ETD-browse/browse.

Full text
7

Stollhoff, Rainer. "Modeling Prosopagnosia". Doctoral thesis, Universitätsbibliothek Leipzig, 2010. http://nbn-resolving.de/urn:nbn:de:bsz:15-qucosa-39600.

Full text
Abstract
Prosopagnosia is defined as a profound deficit in facial identification which can be either acquired due to brain damage or is present from birth, i.e. congenital. Normally, faces and objects are processed in different parts of the inferotemporal cortex by distinct cortical systems for face vs. object recognition, an association of function and location. Accordingly, in acquired prosopagnosia locally restricted damage can lead to specific deficits in face recognition. However, in congenital prosopagnosia faces and objects are also processed in spatially separated areas. Accordingly, the face recognition deficit in congenital prosopagnosia can not be solely explained by the association of function and location. Rather, this observation raises the question why and how such an association evolves at all. So far, no quantitative or computational model of congenital prosopagnosia has been proposed and models of acquired prosopagnosia have focused on changes in the information processing taking place after inflicting some kind of "damage" to the system. To model congenital prosopagnosia, it is thus necessary to understand how face processing in congenital prosopagnosia differs from normal face processing, how differences in neuroanatomical development can give rise to differences in processing and last but not least why facial identification requires a specialized cortical processing system in the first place. In this work, a computational model of congenital prosopagnosia is derived from formal considerations, implemented in artificial neural network models of facial information encoding, and tested in experiments with prosopagnosic subjects. The main hypothesis is that the deficit in congenital prosopagnosia is caused by a failure to obtain adequate descriptions of individual faces: A predisposition towards a reduced structural connectivity in visual cortical areas enforces descriptions of visual stimuli that lack the amount of detail necessary to distinguish a specific exemplar from its population, i.e. achieve a successful identification. Formally recognition tasks can be divided into identification tasks (separating a single individual from its sampling population) and classification tasks (partitioning the full object space into distinct classes). It is shown that a high-dimensionality in the sensory representation facilitates individuation ("blessing of dimensionality"), but complicates estimation of object class representations ("curse of dimensionality"). The dimensionality of representations is then studied explicitly in a neural network model of facial encoding. Whereas optimal encoding entails a "holistic" (high-dimensional) representation, a constraint on the network connectivity induces a decomposition of faces into localized, "featural" (low-dimensional) parts. In an experimental validation, the perceptual deficit in congenital prosopagnosia was limited to holistic face manipulations and didn't extend to featural manipulations. Finally, an extensive and detailed investigation of face and object recognition in congenital prosopagnosia enabled a better behavioral characterization and the identification of subtypes of the deficit.
In contrast to previous models of prosopagnosia, here the developmental aspect of congenital prosopagnosia is incorporated explicitly into the model, quantitative arguments for a deficit that is task specific (identification) - and not necessarily domain specific (faces) - are provided for synthetic as well as real data (face images), and the model is validated empirically in experiments with prosopagnosic subjects.
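As an illustrative aside (not code from the dissertation), the toy Python sketch below hints at the "blessing of dimensionality" argument for identification: as the dimensionality of random feature vectors grows, even the nearest member of the sampling population is almost as far from a target exemplar as a typical pair of members, so the target becomes easy to separate. All sizes and numbers are assumptions chosen for the demonstration.

    # Toy demonstration: identification margins grow with representation dimensionality.
    import numpy as np

    rng = np.random.default_rng(0)
    n_population = 1000
    for d in (2, 10, 100, 1000):
        pop = rng.standard_normal((n_population, d))
        pop /= np.linalg.norm(pop, axis=1, keepdims=True)   # random unit "descriptions"
        target, rest = pop[0], pop[1:]
        nearest = np.min(np.linalg.norm(rest - target, axis=1))
        typical = np.mean(np.linalg.norm(rest[:-1] - rest[1:], axis=1))
        print(f"d = {d:4d}: nearest/typical distance = {nearest / typical:.2f}")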
8

Дядечко, Алла Миколаївна, Алла Николаевна Дядечко, Alla Mykolaivna Diadechko and V. O. Hlushchenko. "Computer modeling". Thesis, Видавництво СумДУ, 2011. http://essuir.sumdu.edu.ua/handle/123456789/13473.

Full text
9

Akhlagi, Ali. "A Modelica-based framework for modeling and optimization of microgrids". Thesis, KTH, Energiteknik, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-263037.

Full text
Abstract
Microgrids have lately drawn much attention due to their considerable financial benefits and the increasing concerns about environmental issues. A solution that can address different engineering problems - from design to operation - is desired for practical reasons and to ensure consistency of the analyses. In this thesis, the capabilities of a Modelica-based framework are investigated for various microgrid optimization problems. Various sizing and scheduling problems are successfully formulated and optimized using nonlinear and physical component models, covering both electrical and thermal domains. Another focus of the thesis is to test the optimization platform when varying the problem formulation; performance and robustness tests have been performed with different boundary conditions and system setups. The results show that the technology can effectively handle complex scheduling strategies such as Model Predictive Control and Demand Charge Management. In sizing problems, although the platform can efficiently size the components while simultaneously solving for the economical load dispatch for short horizons (weekly or monthly), the implemented approach would require adaptations to become efficient on longer horizons (yearly).
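As an illustrative aside (a toy sketch, not the Modelica-based platform described above), the following Python code solves a one-day battery scheduling problem of the same flavour as a linear program; the load profile, tariff and battery limits are invented for the example, and the physics is heavily simplified (no state-of-charge dynamics or losses).

    # Toy economic dispatch: choose grid import g_t and battery discharge d_t per hour
    # so that g_t + d_t = load_t, minimising the energy bill (illustrative only).
    import numpy as np
    from scipy.optimize import linprog

    T = 24
    hours = np.arange(T)
    load = 2.0 + 3.0 * np.sin(np.linspace(0.0, np.pi, T)) ** 2        # kW (assumed)
    price = np.where((hours >= 17) & (hours <= 21), 0.40, 0.15)       # $/kWh (assumed)
    E0, d_max, c_max = 5.0, 2.0, 2.0   # kWh available, kW discharge/charge limits (assumed)

    c = np.concatenate([price, np.zeros(T)])                    # only grid energy is paid for
    A_eq = np.hstack([np.eye(T), np.eye(T)])                    # g_t + d_t = load_t
    A_ub = np.concatenate([np.zeros(T), np.ones(T)])[None, :]   # total net discharge <= E0
    bounds = [(0, None)] * T + [(-c_max, d_max)] * T            # negative d_t means charging

    res = linprog(c, A_ub=A_ub, b_ub=[E0], A_eq=A_eq, b_eq=load, bounds=bounds)
    g, d = res.x[:T], res.x[T:]
    print(f"optimal daily cost: ${res.fun:.2f}")
    print("battery discharge over the assumed peak hours (kW):", np.round(d[17:22], 2))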
10

Boström, Rikard and Lars-Olof Moilanen. "Capacity profiling modeling for baseband applications". Thesis, Karlstad University, Faculty of Economic Sciences, Communication and IT, 2009. http://urn.kb.se/resolve?urn=urn:nbn:se:kau:diva-3352.

Full text
Abstract

Real-time systems are systems which must produce a result within a given time frame. A result given outside of this time frame is as useless as not delivering any result at all. It is therefore essential to verify that real-time systems fulfill their timing requirements. A model of the system can facilitate the verification process. This thesis investigates two possible methods for modeling a real-time system with respect to CPU utilization and the latency of the different components in the system. The two methods are evaluated and one method is chosen for implementation. The studied system is the decoder of a WCDMA system which utilizes a real-time operating system called OSEck. The methodology of analyzing the system and the different ways of obtaining measurements to base the model upon are described. The model was implemented using the simulation library VirtualTime, which contains a model of the previously mentioned operating system. Much work was spent acquiring input for the model, since the quality of the model depends largely on the quality of the analysis work. The model created contains two of the studied system's main components. This thesis identifies thorough system knowledge and efficient profiling methods as the key success factors when creating models of real-time systems.

11

Shrestha, Shilu. "Software Modeling in Cyber-Physical Systems". Thesis, Linköpings universitet, Institutionen för datavetenskap, 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-111435.

Full text
Abstract
A Cyber-Physical System (CPS) features a tight integration of computation, networking and physical processes. It is a heterogeneous, multi-domain system consisting of both hardware and software subsystems. Cyber subsystems in the CPS implement the control strategy that affects the physical process; therefore, the software systems in a CPS are complex. Visualization of a complex system provides a method of understanding it by accumulating, grouping, and displaying components of the system in such a manner that they may be understood more efficiently just by viewing the model rather than reading the code. Graphical representation of complex systems provides an intuitive and comprehensive way to understand the system. OpenModelica is an open-source development environment based on the Modelica modeling and simulation language that consists of several interconnected subsystems. OMEdit, one of the subsystems integrated into OpenModelica, is a graphical user interface for graphical modeling. It consists of tools that allow the user to create their own shapes and icons for the model. This thesis presents a methodology that provides an easy way of understanding the structure and execution of programs written in an imperative language like C through a graphical Modelica model.
12

Daniel, Michael M. "Multiresolution statistical modeling with application to modeling groundwater flow". Thesis, Massachusetts Institute of Technology, 1997. http://hdl.handle.net/1721.1/10749.

Full text
Abstract
Thesis (Ph. D.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 1997.
Includes bibliographical references (p. 205-211).
by Michael M. Daniel.
Ph.D.
13

Xu, Yuanhan. "Modeling print time for a fused deposition modeling machine". Thesis, Massachusetts Institute of Technology, 2017. http://hdl.handle.net/1721.1/114325.

Full text
Abstract
Thesis: M. Eng. in Advanced Manufacturing and Design, Massachusetts Institute of Technology, Department of Mechanical Engineering, 2017.
Cataloged from PDF version of thesis.
Includes bibliographical references (page 58).
A feature-based time estimation model is developed to predict the printing time of any given object on a fused deposition modeling machine. Preliminary experiments are conducted to detect the significant factors that cause the error between actual and estimated submission time. A proposed method models the movements and heating processes of the extruder nozzle and calculates print time using G-code. The result of the proposed model is also compared with the most downloaded print time simulator on Google Play, demonstrating that the proposed model significantly improves the estimation of print time. Recommendations for future steps to improve the accuracy of the model are also presented.
by Yuanhan Xu.
M. Eng. in Advanced Manufacturing and Design
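As an illustrative aside (a heavily simplified sketch, not the thesis's model), print time can be roughly estimated from G-code by summing the duration of linear moves, i.e. distance divided by the commanded feed rate; the thesis's contribution lies precisely in also modeling accelerations and the nozzle heating that this sketch ignores. The sample G-code lines are invented.

    # Minimal G-code time estimate: time = move length / feed rate over G0/G1 moves.
    import math
    import re

    def estimate_print_time(gcode_lines, default_feed=1500.0):
        x = y = z = 0.0
        feed = default_feed                      # mm/min (assumed default)
        total_s = 0.0
        for line in gcode_lines:
            line = line.split(";")[0].strip()    # drop comments
            parts = line.split()
            if not parts or parts[0] not in ("G0", "G1"):
                continue
            words = dict(re.findall(r"([XYZF])([-+]?\d*\.?\d+)", line))
            nx, ny, nz = (float(words.get(k, v)) for k, v in (("X", x), ("Y", y), ("Z", z)))
            feed = float(words.get("F", feed))
            total_s += math.dist((x, y, z), (nx, ny, nz)) / (feed / 60.0)
            x, y, z = nx, ny, nz
        return total_s

    sample = ["G1 X10 Y0 F1800", "G1 X10 Y10", "G1 X0 Y10 F900 ; slower perimeter"]
    print(f"estimated time: {estimate_print_time(sample):.1f} s")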
14

Shekhawat, Lalita Kanwar. "Modeling of liquid chromatography: empirical and mechanistic modeling approaches". Thesis, IIT Delhi, 2019. http://eprint.iitd.ac.in:80//handle/2074/8051.

Full text
15

Shikur, H. (Henok). "Assessing modeling and visualization capabilities of modeling tools: limitations and gaps of the open source modeling tools". Master's thesis, University of Oulu, 2015. http://urn.fi/URN:NBN:fi:oulu-201502111072.

Full text
Abstract
Due to the increasing number of Information Communication Technology (ICT) environments, security is becoming a concern for many researchers and organisations. Organisations have implemented different security measures to protect their assets. Different industries—such as power plants and water, oil, and gas utilities—are adapting different network modelling tools for guarding their assets and are preparing for incidents that might occur in the future. Modelling tools are very important for the visualisation of computer networks. There are currently many modelling tools with different modelling and visualisation capabilities for computer networks. The aim of this research is to make a thorough assessment of the different modelling tools' capabilities for modelling computer networks and visualising computer network communication. Furthermore, it hopes to show areas for improvement in order to increase the quality of modelling tools based on industry requirements. The research methodology takes the form of a case study. First, the study analyses previous research in order to illustrate gaps in the literature, as well as identifying the strengths and weaknesses of existing network modelling tools. The empirical part of the research includes, first, studying and evaluating seven open-source modelling tools based on different types of capabilities (which may limit the generalisability of the findings to some extent) and, second, selecting four modelling tools for further study. Once the four modelling tools were evaluated based on literature reviews and the requirements set in this study, the top two open-source (OSS) modelling tool packages were selected, downloaded, installed, and evaluated further. The criteria set to evaluate the four modelling tools in this research are based on the existing literature and on the requirements provided by the European company nSense, which offers vulnerability assessments, security consulting, and training. The evaluation of the tools resulted in screenshots that were copied into this document for verification. Finally, the one tool which was the most suitable for further studies, and which fulfilled most of the requirements set in this research, was recommended for further research. In total, four modelling tools were chosen for the evaluation, using different literature reviews based on the requirements (see Appendix A) of this research. The results showed that the two top modelling tools were OMNeT++ and IMUNES. After practical analysis of these tools, OMNeT++ was found to be the best tool based on the aims and requirements of this research. Further, the study found that usability problems played a large part in evaluating the different modelling tools, which might have affected the outcomes. It can therefore be concluded that this type of evaluation is highly dependent on the evaluator's knowledge and skill, as well as the usability of the tool.
16

Mamadapur, Monish Shivappa. "Constitutive modeling of fused deposition modeling acrylonitrile butadiene styrene (ABS)". College Station, Tex.: Texas A&M University, 2007. http://hdl.handle.net/1969.1/ETD-TAMU-2545.

Full text
17

Perry, Stephen J. "Modeling operations other than war : non-combatants in combat modeling /". Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 1994. http://handle.dtic.mil/100.2/ADA285995.

Full text
18

Coggin, David. "LiDAR in Coastal Storm Surge Modeling: Modeling Linear Raised Features". Master's thesis, University of Central Florida, 2008. http://digital.library.ucf.edu/cdm/ref/collection/ETD/id/3362.

Full text
Abstract
A method for extracting linear raised features from laser scanned altimetry (LiDAR) datasets is presented. The objective is to automate the method so that elements in a coastal storm surge simulation finite element mesh might have their edges aligned along vertical terrain features. Terrain features of interest are those that are high and long enough to form a hydrodynamic impediment while being narrow enough that the features might be straddled and not modeled if element edges are not purposely aligned. These features are commonly raised roadbeds but may occur due to other manmade alterations to the terrain or natural terrain. The implementation uses the TauDEM watershed delineation software included in the MapWindow open source Geographic Information System to initially extract watershed boundaries. The watershed boundaries are then examined computationally to determine which sections warrant inclusion in the storm surge mesh. Introductory work towards applying image analysis techniques as an alternate means of vertical feature extraction is presented as well. Vertical feature lines extracted from a LiDAR dataset for Manatee County, Florida are included in a limited storm surge finite element mesh for the county and Tampa Bay. Storm surge simulations using the ADCIRC-2DDI model with two meshes, one which includes linear raised features as element edges and one which does not, verify the usefulness of the method.
M.S.
Department of Civil and Environmental Engineering
Engineering and Computer Science
Civil Engineering MS
19

Constantino, Carlos Augusto. "Hydraulic actuation system modeling: an analysis of high frequency modeling". Instituto Tecnológico de Aeronáutica, 2010. http://www.bd.bibl.ita.br/tde_busca/arquivo.php?codArquivo=1307.

Full text
Abstract
The objective of this work was to develop a high-fidelity model, representative up to high frequencies, of a flight control system with hydraulic actuation in active-active mode. The use of a fly-by-wire architecture and a hydraulic actuation system in active-active mode has brought new engineering challenges, such as force-fight between actuators and the resulting consumption of structural fatigue life under normal and failure scenarios such as oscillatory malfunctions. Since these failure modes can exist up to high frequencies, a high-fidelity model of the flight control system that is representative up to high frequencies becomes necessary. The model developed here includes high-fidelity models of an EHSV, the hydraulic actuator, the position loop and the control surface, as well as complementary models, so that the whole system can be analyzed up to high frequencies. The performance of the model was analyzed for step input response and frequency response, showing it to be close to the expected response of a real system. The frequency response of its components was also analyzed, showing the model to be representative up to high frequencies. Besides the performance analysis, the behavior under an oscillatory malfunction scenario was studied, showing the expected level of structural load as well as the fatigue life consumption, and the need to monitor these types of failures.
20

Singh, Raymond Charan. "Modeling Energy Harvesting From Membrane Vibrations using Multi-physics Modeling". Thesis, Virginia Tech, 2012. http://hdl.handle.net/10919/76793.

Full text
Abstract
Given the ever-growing need for device autonomy and renewable sources of energy, energy harvesting has become an increasingly popular field of research. This research focuses on energy harvesting using the piezoelectric effect, from vibrating membrane structures by converting mechanical energy into electric energy. Specific applications of this research include powering components of bio-inspired micro air vehicles (MAVs), which require long range with as little regular maintenance as possible, and powering sensors for structural health monitoring on otherwise inaccessible locations (the roof of the Denver Int'l Airport is a good example). Coming up with an efficient, high-fidelity model of these systems allows for design optimization without the extensive use of experimental testing, as well as a deeper understanding of the physics involved. These are the twin goals of this research. This work describes a modeling algorithm using COMSOL, a multi-physics software, to predict the structural mechanics of and subsequent power harvested from a piezoelectric patch placed on a prestressed membrane structure. The model is verified by an FE comparison of the modeled system's dynamic response. For a 0.5 x 0.5 x 0.001 m nylon membrane with a 0.1 x 0.1 x 0.001 m piezoelectric patch placed on its corner, a maximum power output of ~10 microwatts was achieved, using a resistance of 100 Ohms and exciting the system around resonance. When the patch was placed on the side of the membrane, the power output was ~100 milliwatts. The ultimate goal is to estimate the energy harvested by a network of these piezoelectric patches and optimize the harvesting system based on the size, shape and location of the patches.
Master of Science
21

Deshmukh, Pushkaraj M. "Modeling Error Estimation and Adaptive Modeling of Functionally Graded Materials". University of Cincinnati / OhioLINK, 2004. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1096036755.

Full text
22

Hafezi, Mohammad Hadi. "Peridynamic Modeling and Extending the Concept to Peri-Ultrasound Modeling". Diss., The University of Arizona, 2017. http://hdl.handle.net/10150/625456.

Full text
Abstract
In this dissertation, a novel fast modeling technique called peri-ultrasound that can model both linear and nonlinear ultrasonic behavior of materials is developed and implemented. Nonlinear ultrasonic response can detect even very small material nonlinearity. Quantification of the material nonlinearity at the early stages of damage is important to avoid catastrophic failure and reduce repair costs. The developed model uses the nonlocal continuum-based peridynamic theory, which was found to be a good simulation tool for handling crack propagation modeling, in particular when multiple cracks grow simultaneously. The developed peri-ultrasound modeling tool has been used to model the ultrasonic response at the interface of two materials in the presence of an interface crack. Also, the stress wave propagation in a half-space (or half-plane for a 2-dimensional problem) with boundary loading is investigated using peri-ultrasound modeling. In another simulation, the well-established two-dimensional Lamb's problem is investigated and the results are verified against the available analytical solution. Also, the interaction between the surface wave and a surface-breaking crack is studied.
23

Polk, Brandon. "Modeling Naturally Occurring Wildfires Across the US Using Niche Modeling". OpenSIUC, 2017. https://opensiuc.lib.siu.edu/theses/2163.

Full text
Abstract
Wildfires can cause significant damage to an area by destroying forested and agricultural areas, homes, and businesses, and can lead to the loss of life. Climate change may further increase the frequency of wildfires. Thus, developing a quick, simple, and accurate method for identifying the key drivers that cause wildfires and for modeling and predicting their occurrence becomes very important and urgent. Various modeling methods have been developed and applied for this purpose. The objective of this study was to identify key drivers and search for an appropriate method for modeling and predicting natural wildfire occurrence for the United States. In this thesis, various vegetation, topographic and climate variables were examined and key drivers were identified based on their spatial distributions and their correlations with natural wildfire occurrence. Five models, including Generalized Linear Models (GLM) with Binomial and Poisson distributions, MaxEnt, Random Forests, Artificial Neural Networks, and Multivariate Adaptive Regression Splines, were compared for predicting natural wildfire occurrence in seven different climate regions across the United States. The comparisons were conducted using three datasets: LANDFIRE, consisting of thirteen variables describing vegetation, topography and disturbance; BIOCLIM, containing climate variables such as temperature and precipitation; and composite data that combine the most important variables from LANDFIRE and BIOCLIM after a multicollinearity test of the variables using the variance inflation factor (VIF). The results of this study showed that niche modeling techniques such as MaxEnt and GLM with logistic regression (LR) and a binomial distribution were an appropriate choice for modeling natural wildfire occurrence. MaxEnt provided highly accurate predictions of natural wildfire occurrence for most of the seven climate regions across the United States. This implied that MaxEnt offers a powerful solution for modeling natural wildfire occurrence for complex and highly specialized systems. This study also showed that although MaxEnt and GLM were quite similar, the two models produced very different spatial distributions of probability of natural wildfire occurrence in some regions. Moreover, it was found that natural wildfire occurrence in the western regions was more influenced by precipitation and drought conditions, while in the eastern regions natural wildfire occurrence was more affected by extreme temperature.
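As an illustrative aside (a generic sketch on synthetic data, not the study's models or datasets), the GLM-with-logistic-regression approach compared in the study amounts to relating a binary occurrence variable to climate drivers, for example:

    # Illustrative only: logistic-regression occurrence model on synthetic climate drivers.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(42)
    n = 2000
    precip = rng.uniform(200.0, 1500.0, n)      # mm/yr (assumed range)
    temp_max = rng.uniform(20.0, 45.0, n)       # deg C (assumed range)

    # Synthetic "truth": fires are more likely where it is hot and dry.
    logit = 0.15 * (temp_max - 32.0) - 0.004 * (precip - 800.0)
    occurrence = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

    X = np.column_stack([precip, temp_max])
    model = LogisticRegression(max_iter=1000).fit(X, occurrence)
    auc = roc_auc_score(occurrence, model.predict_proba(X)[:, 1])
    print(f"training AUC on synthetic data: {auc:.2f}")
    print("fitted coefficients (precip, temp_max):", np.round(model.coef_[0], 4))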
24

Depman, Albert J. III. "Stoker boiler CFD modeling improvements through alternative heat exchanger modeling". Thesis, University of Iowa, 2014. https://ir.uiowa.edu/etd/4609.

Full text
Abstract
Accurate models and realistic simulations are essential in developing cleaner and more efficient coal- and biomass-fired boilers. Using the CFD simulation software Fluent, the University of Iowa created a model of an industrial boiler that adequately compares the practice of co-firing biomass and coal against firing only coal. The simulations used in this comparison show significant circulation zones and an unrealistic temperature profile inside the boiler heat exchanger region. This model is effective for comparing the relative decrease in emissions when co-firing with biomass versus exclusively coal combustion, but it does not present a realistic simulation of biomass or coal combustion. The purpose of the current work is to develop a more realistic baseline coal combustion model. Calculations for the proximate and ultimate analysis of coal, as well as properties necessary for energy and mass flux computations, have been updated in the current model. The fuel bed model - a simple two-dimensional distribution of energy and mass fluxes from the grate - was kept the same due to the complexities of fuel bed modeling. Simulation boundary conditions and flow models were tested and modified to determine the most realistic model settings. The geometry and mesh grid of the boiler model were also varied in an attempt to fix problematic areas. Several approaches were implemented in an effort to reduce the circulation zones and generate a realistic temperature profile. The negative energy source term in the boiler, representing the energy removed by the water pipes in the heat exchanger, was analyzed, and different configurations of this sink were tested. Finally, the heat exchanger models built into Fluent were studied and implemented. These models proved to be the most effective in reducing recirculation zones and decreasing high temperature gradients. While the current model of the coal-fired boiler has a higher overall temperature than the previous one, circulation zones are almost completely eliminated, the flow path has been improved, and the temperature profile in the boiler is more realistic.
25

Serafin, Francesco. "Enabling modeling framework with surrogate modeling capabilities and complex networks". Doctoral thesis, Università degli studi di Trento, 2019. https://hdl.handle.net/11572/369029.

Full text
26

Serafin, Francesco. "Enabling modeling framework with surrogate modeling capabilities and complex networks". Doctoral thesis, University of Trento, 2019. http://eprints-phd.biblio.unitn.it/3650/1/FrancescoSerafin_183280_finale.pdf.

Full text
Abstract
Conceptual and physically based environmental simulation models as products of research environments efforts became complex software over time in order to allow describing the behaviour of natural phenomena more accurately. Results from these models are considered accurate but often require to operate an entire system of modeling resources with dedicated knowledge, an extensive set up, and sometimes significant computational time. Model complexity limits wide model adaptation among consultants because of lower available technical resources and capabilities. However, models should be ubiquitous to use in both research and consulting environments. This dissertation aims to address and alleviate two aspects of research model complexity: 1) for researchers, the model design complexity with respect to its internal software structure and 2) for consultants, the model application complexity with respect to data and parameter setup, runtime requirements, and proper model infrastructure setup. The first contribution provides modeling design and implementation support by managing interacting modeling solutions as “Directed Acyclic Graph”, while the second one helps to create surrogate models of complex physical models as a streamlined process. Both contributions are implemented within the OMS/CSIP modeling framework and infrastructure and were applied in various studies. First, a machine learning (ML)-based surrogate model approach is presented to respond to field application requirements to get quick but “accurate enough” model results with limited input and limited a-priori knowledge of the internal physical processes involved. The surrogate model aims to capture the behaviour of a physical model as an ensemble system of artificial neural networks (ANN). Here, the NeuroEvolution of Augmenting Topology (NEAT) technique has been leveraged because of its integration of a genetic approach to build and evolve its ANNs during supervised training. Throughout this phase, the thorough design of the services facilitate seamless monitoring of structural mutations of the artificial neural network and its performances with respect to behavioural emulation of the original model response. This results in a streamlined surrogate model generation. Furthermore, the stochasticity inherent to the evolutionary genetic algorithm combined with a specially designed cross-validation approach allows for straightforward use of the ensemble application. Several, slightly different artificial neural networks are concurrently trained. The ensemble system is built upon the selection of the utmost performant surrogate models and is used collectively to provide uncertainty quantified results when applied against new data. Secondly, a Directed Acyclic Graph (DAG) modeling structure NET3 was developed. NET3 provides appropriate data structures to represent modeling states interactions as relationships based on network topologies. The inherent structure of the DAG commands the execution of modeling tasks. NET3 implicitly manages the parallel computation depending on the network topology. A node of a NET3 modeling structure encapsulates any sort of modeling solution such as a system of ordinary differential equations, a set of statistical rules, or a system of partial differential equations. Each link connects these modeling solutions by handling their data flow. As a result, NET3 simplifies 1) the translation of physical mathematical concepts into model components, and 2) the management of complex interactions of modeling solutions. 
NET3 also pushes forward the idea of separating concerns between software architecture and scientific model codebase. It manages aspects that relate to the architectural design of the graph modeling structure and lets research scientist focus on their model’s domain. NET3 improves encapsulation and reusability of scientific/mathematical concepts. It avoids code duplication by allowing the same modeling solution to be adopted in different nodes and finely adapted to specific requirements. In summary, NET3 enables a new level of modeling flexibility by allowing to quickly change model representations to explore new modeling solutions. The two presented contributions were integrated into the Object Modeling System/Cloud Services Integrated Platform (OMS/CSIP) environmental modeling framework (EMF). EMFs are standard practice in environmental modeling because they represent a software solution of separating the burden of software architectural design management from scientific research. Here, OMS/CSIP has been identified “advanced” in terms of EMFs design. It offers high flexibility, low language invasiveness, fine and thorough architectural design, and innovative cloud computing deployment infrastructure. These aspects make OMS/CSIP infrastructure the suitable platform to host NEAT based surrogate modeling and NET3 extensions. Framework-enabled NEAT based Surrogate modeling (FeNS) results from the full integration of NEAT based surrogate modeling approach with OMS/CSIP platform. Here, the surrogate model approach was developed as CSIP services to help transitioning from research models to “field models” by enabling the modeling framework to interact with CSIP services, ML libraries, and a NoSQL database to emerge model surrogates for a(ny) modelling solution. OMS/CSIP was extended to harvest data from each model run and automatically derive the surrogate model at the modeling framework level. NET3 extends OMS modeling simulations to run as a graph network of interconnected modeling solutions. Furthermore, it enhances available OMS calibration algorithms to become multi-site calibration procedures. OMS already provided implicit parallel computation of independent components in a modeling solution. NET3 now adds a further layer of implicit parallelism by concurrently running independent modeling solutions. Two studies were carried out to develop and test FeSN while three applications supported the development and testing of NET3. Surrogate models of the Revised Universal Soil Loss Equation, Version 2 (R2) were generated to scale up from simple test cases with a constrained input space to more generic applications including a larger variety of input parameters. The main goal of the surrogate model was to streamline and simplify access to the R2 model behaviour. We performed sensitivity analysis of R2 to limit the input space to only relevant parameters (e.g. soil properties, climate parameter, field geometries, crop rotation description). The main study area was the State of Iowa starting from a single county (Clay county) ending up to four counties (Buena Vista, Cherokee, Clay, and Wright). Clustering methodologies were applied to improve surrogate model accuracy and to accelerate the training process by reducing the dataset size. 
The overall “goodness-of-fit” against the testing dataset estimated on the median of the uncertainty quantified result of the surrogate models ensemble was always above 0.95 Nash-Sutcliffe (NS), root mean squared error (RMSE) between 0.13 and 0.36, and bias between -0.07 and 0.02. In many cases, accuracy of the surrogate model with respect to testing dataset was above 0.98 NS. Surrogate models of the AgroEcoSystem (AgES) were generated to apply and test FeNS methodology to a semi-distributed hydrologic model. The main goal of the surrogate model was to streamline and simplify access to the AgES model behaviour. Only relevant lumped parameters on watershed centroid were used to train the surrogate models and limit the input space to only relevant parameters (e.g. precipitation, groundwater level, LAI, and potential evapotranspiration). The main study area was the South Fork Iowa River (SFIR) watershed in the State of Iowa across Wright, Franklin, Hamilton, and Hardin counties. The overall “goodness-of-fit” against the testing dataset estimated on the median of the uncertainty quantified result of the surrogate models ensemble was above 0.97 Nash-Sutcliffe (NS), root mean squared error (RMSE) of 2.24, and bias of -0.0794. With respect to NET3, the first application is the real-time modeling of flood forecasting through GEOframe system for the Civil Protection of Regione Basilicata implemented by PhD Bancheri. To scale the computation and finely tune calibration parameters, the Basilicata river basins were split into subcatchments where each was represented by a different NET3 node. The second application was part of Mr. Dalla Torre’s master thesis where the computational core of the rainfall-runoff model of Storm Water Management Model (SWMM by EPA) was componentized. NET3 now allows for reimplementing a concise and lightweight SWMM modeling core and highly parallel model runs. Software architectural design of rainfall-runoff, routing and sewer pipe design components targeted separation of concerns, single responsibility, and encapsulation principles. It resulted in clean and minimized code base. NET3 manages component connections and scalable computation by hosting rainfall-runoff modeling solution into separated nodes from routing and sewer pipe design modeling solution. It also enables each node of the modeling structure to 1) access a shared data structure to fetch input data from and push results to (SWMMobject), and 2) internally analyze the upstream subtree in order to adjust sewer pipe design parameters. The third test case is the application of a “system of systems” of urban models where each node of the graph modeling structure encapsulates a single responsibility system of models. Because of the stochasticity involved in each system of models, the entire graph modeling solution was required to run several times and generate independent realizations. Hence, NET3 was enabled to run a “graph of graphs” modeling structure.
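As an illustrative aside (a minimal sketch in Python, not the NET3/OMS implementation, which belongs to the Java-based OMS/CSIP framework), the core NET3 idea of executing modeling components as a directed acyclic graph can be pictured as running nodes in topological order, each consuming the outputs of its upstream nodes; the subcatchment names and numbers below are invented.

    # Minimal DAG-ordered execution of "modeling" nodes (illustrative only).
    from graphlib import TopologicalSorter

    def make_node(local_runoff):
        # Toy component: outflow = inflow from upstream nodes + local runoff.
        def run(upstream_inflow):
            return upstream_inflow + local_runoff
        return run

    nodes = {"headwater_A": make_node(3.0),
             "headwater_B": make_node(2.0),
             "middle_reach": make_node(1.5),
             "outlet": make_node(0.5)}

    # Map each node to the set of nodes it depends on (its upstream neighbours).
    depends_on = {"middle_reach": {"headwater_A", "headwater_B"},
                  "outlet": {"middle_reach"}}

    outputs = {}
    for name in TopologicalSorter(depends_on).static_order():
        inflow = sum(outputs[u] for u in depends_on.get(name, ()))
        outputs[name] = nodes[name](inflow)
        print(f"{name:>12}: outflow = {outputs[name]:.1f} m^3/s")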
27

Peña, Reyes Carlos Andrés. "Coevolutionary fuzzy modeling". [S.l.]: [s.n.], 2002. http://library.epfl.ch/theses/?display=detail&nr=2634.

Full text
28

Nazarian, Bamshad. "Integrated Field Modeling". Doctoral thesis, Norwegian University of Science and Technology, Faculty of Engineering Science and Technology, 2003. http://urn.kb.se/resolve?urn=urn:nbn:no:ntnu:diva-84.

Full text
Abstract

This research project studies the feasibility of developing and applying an integrated field simulator to simulate the production performance of an entire oil or gas field. It integrates the performance of the reservoir, the wells, the chokes, the gathering system, the surface processing facilities and, whenever applicable, gas and water injection systems.

The approach adopted for developing the integrated simulator is to couple existing commercial reservoir and process simulators using available linking technologies. The simulators are dynamically linked and customized into a single hybrid application that benefits from the concept of open software architecture. The integrated field simulator is linked to an optimization routine developed based on the genetic algorithm search strategies. This enables optimization of the system at field level, from the reservoir to the process. Modeling the wells and the gathering network is achieved by customizing the process simulator.

This study demonstrates that the integrated simulation improves current capabilities to simulate the performance of an entire field and optimize its design. This is achieved by evaluating design options including spread and layout of the wells and gathering system, processing alternatives, reservoir development schemes, and production strategies.

Effectiveness of the integrated simulator is demonstrated and tested through several field-level case studies that discuss and investigate technical problems relevant to offshore field development. The case studies cover topics such as process optimization, optimum tie-in of satellite wells into existing process facilities, optimal well location, and field layout assessment of a high pressure high temperature deepwater oil field.

Case study results confirm the viability of the total field simulator by demonstrating that the field performance simulation and optimal design were obtained in an automated process with reasonable computation time. No significant simplifying assumptions were required to solve the system, and the tedious manual data transfer between simulators, as conventionally practiced, was avoided.
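As an illustration of the kind of genetic-algorithm loop that can drive a coupled reservoir/process simulator, the sketch below optimizes a black-box objective. Here run_field_simulator is a hypothetical stand-in for the integrated simulator described above, not the thesis's actual interface, and all parameter values are assumptions.

import numpy as np

rng = np.random.default_rng(1)

def run_field_simulator(design):
    # Hypothetical placeholder for the coupled reservoir/process simulation;
    # returns a production-based objective value to maximize (toy optimum at (2, -1)).
    x, y = design
    return -(x - 2.0) ** 2 - (y + 1.0) ** 2

def genetic_search(pop_size=30, generations=50, bounds=(-5.0, 5.0)):
    lo, hi = bounds
    pop = rng.uniform(lo, hi, size=(pop_size, 2))
    for _ in range(generations):
        fitness = np.array([run_field_simulator(ind) for ind in pop])
        parents = pop[np.argsort(fitness)[-pop_size // 2:]]           # selection: keep the best half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            child = np.where(rng.random(2) < 0.5, a, b)               # uniform crossover
            child = np.clip(child + rng.normal(0.0, 0.1, 2), lo, hi)  # Gaussian mutation
            children.append(child)
        pop = np.vstack([parents, np.array(children)])
    return pop[np.argmax([run_field_simulator(ind) for ind in pop])]

print(genetic_search())   # approaches (2, -1) for the toy objective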

Los estilos APA, Harvard, Vancouver, ISO, etc.
29

Ozkan, Sule. "Modeling Elementary Students". PhD thesis, METU, 2008. http://etd.lib.metu.edu.tr/upload/2/12610168/index.pdf.

Texto completo
Resumen
This study aimed to explore the relationships between elementary students' epistemological beliefs, learning approaches, self-regulated learning strategies, and their science achievement. In this investigation, a model of the potential associations among these variables was proposed and tested by using structural equation modeling. It was hypothesized that (a) students' epistemological beliefs would directly influence their learning approaches, self-regulated learning strategies, and science achievement, (b) students' adopted learning approaches and their use of self-regulated learning strategies would be related with science achievement, and (c) students' learning approaches were expected to be related with their use of self-regulated strategies. A total of 1240 seventh graders from 21 public elementary schools throughout the Çankaya district of Ankara completed measures designed to assess students' (a) epistemological beliefs (beliefs about the Certainty of Knowledge, Development of Knowledge, Source of Knowing, and Justification for Knowing), (b) adopted learning approaches (either rote or meaningful), (c) use of self-regulated learning strategies, and (d) science achievement. Separate confirmatory factor analyses were conducted to determine the structure of students' epistemological beliefs and their adopted learning approaches. While the factor analyses of students' responses to the epistemological beliefs questionnaire supported the multidimensional nature of these beliefs, some features distinct from the findings of Western countries were identified. Socio-cultural influences were proposed to account for the observed differences in the factor structure obtained with the Turkish sample. The results of the structural equation modeling supported some of the proposed hypotheses and contradicted others. Epistemological beliefs emerged as a major contributor to learning approaches and science achievement, as expected, whereas those beliefs could not be used as a predictor of self-regulated learning strategies. In addition, students' adopted learning approaches were found to be a predictor of their self-regulated learning strategies, which in turn influenced science achievement in the model. Contrary to expectations, students' learning approaches were not found to be directly related with their science achievement.
Los estilos APA, Harvard, Vancouver, ISO, etc.
30

Fullerton, A. W. "Discussion: Spectral modeling". Universität Potsdam, 2007. http://opus.kobv.de/ubp/volltexte/2008/1791/.

Texto completo
Los estilos APA, Harvard, Vancouver, ISO, etc.
31

Vink, J. S. "Discussion: Hydrodynamic modeling". Universität Potsdam, 2007. http://opus.kobv.de/ubp/volltexte/2008/1804/.

Texto completo
Los estilos APA, Harvard, Vancouver, ISO, etc.
32

Dingle, Brent Michael. "Volumetric particle modeling". Texas A&M University, 2003. http://hdl.handle.net/1969.1/5937.

Texto completo
Resumen
This dissertation presents a robust method of modeling objects and forces for computer animation. Within this method, objects and forces are represented as particles. As in most modeling systems, the movement of objects is driven by physically based forces. The use of particles, however, allows more artistically motivated behavior to be achieved and also allows the modeling of heterogeneous objects and objects in different state phases: solid, liquid, or gas. By using invisible particles to propagate forces through the modeling environment, complex behavior is achieved through the interaction of relatively simple components. In sum, 'macroscopic' behavior emerges from 'microscopic' modeling. We present a newly developed modeling framework expanding on related work. This framework allows objects and forces to be modeled using particle representations and provides the details on how objects are created, how they interact, and how they may be displayed. We present examples to demonstrate the viability and robustness of the developed method of modeling. They illustrate the breaking and fracturing of solids, the interaction of objects in different phase states, and the achievement of a reasonable balance between artistic and physically based behaviors.
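The general idea of particles that carry and propagate forces can be illustrated with a minimal sketch. This is generic demonstration code under assumed parameters, not the dissertation's framework: particles interact through simple pairwise repulsive forces and are advanced with explicit Euler steps.

import numpy as np

rng = np.random.default_rng(2)
pos = rng.uniform(0.0, 1.0, size=(50, 2))   # particle positions
vel = np.zeros_like(pos)                    # particle velocities
dt, mass = 0.01, 1.0

def pairwise_forces(pos, strength=1e-3, eps=1e-6):
    # Simple inverse-distance repulsion between every pair of particles.
    diff = pos[:, None, :] - pos[None, :, :]
    dist2 = (diff ** 2).sum(axis=-1) + eps
    np.fill_diagonal(dist2, np.inf)                       # no self-interaction
    return (strength * diff / dist2[..., None]).sum(axis=1)

for _ in range(100):                        # explicit Euler time stepping
    vel += dt * pairwise_forces(pos) / mass
    pos += dt * vel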
Los estilos APA, Harvard, Vancouver, ISO, etc.
33

Jansson, Johan. "Automated Computational Modeling". Doctoral thesis, Chalmers University of Technology, 2006. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-52828.

Texto completo
Los estilos APA, Harvard, Vancouver, ISO, etc.
34

Jöhnemark, Alexander. "Modeling Operational Risk". Thesis, KTH, Matematisk statistik, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-107435.

Texto completo
Resumen
The Basel II accord requires banks to put aside a capital buffer against unexpected operational losses resulting from inadequate or failed internal processes, people and systems, or from external events. Under the sophisticated Advanced Measurement Approach, banks are given the opportunity to develop their own model to estimate operational risk. This report focuses on a loss distribution approach based on a set of real data. First, a comprehensive data analysis was made, which suggested that the observations belonged to a heavy-tailed distribution. An evaluation of commonly used distributions was performed. The evaluation resulted in the choice of a compound Poisson distribution to model frequency and a piecewise defined distribution with an empirical body and a generalized Pareto tail to model severity. The frequency distribution and the severity distribution define the loss distribution, from which Monte Carlo simulations were made in order to estimate the 99.9% quantile, also known as the regulatory capital. Conclusions drawn along the way were that including all operational risks in a model is hard, but possible, and that extreme observations have a huge impact on the outcome.
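A minimal sketch of the loss distribution approach described above: compound Poisson frequency, severities drawn from an empirical body with a generalized Pareto tail, and Monte Carlo estimation of the 99.9% quantile. All numbers (the Poisson intensity, the threshold, the tail probability, and the Pareto parameters) are illustrative assumptions, not the values estimated in the thesis.

import numpy as np

rng = np.random.default_rng(0)

lam = 25.0                             # assumed annual loss frequency (Poisson intensity)
body = rng.lognormal(8.0, 1.0, 5000)   # stand-in for the observed "body" losses
u = np.quantile(body, 0.9)             # threshold separating body and tail
xi, beta = 0.6, 2.0e4                  # assumed generalized Pareto tail parameters
p_tail = 0.1                           # assumed probability that a loss exceeds u

def draw_severity(n):
    # Piecewise severity: resample the empirical body below u, generalized Pareto above u.
    s = rng.choice(body[body <= u], size=n, replace=True)
    in_tail = rng.random(n) < p_tail
    k = int(in_tail.sum())
    s[in_tail] = u + beta / xi * ((1.0 - rng.random(k)) ** (-xi) - 1.0)   # GPD inverse CDF
    return s

def simulate_annual_losses(n_years=50_000):
    counts = rng.poisson(lam, n_years)
    return np.array([draw_severity(c).sum() if c else 0.0 for c in counts])

losses = simulate_annual_losses()
print(np.quantile(losses, 0.999))      # the 99.9% quantile, i.e. the regulatory capital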
Los estilos APA, Harvard, Vancouver, ISO, etc.
35

Chaqchaq, Othmane. "Fixed Income Modeling". Thesis, KTH, Matematisk statistik, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-192372.

Texto completo
Resumen
Besides financial analysis, quantitative tools play a major role in asset management. By managing the aggregation of large amounts of historical and prospective data on different asset classes, they can provide portfolio allocation solutions with respect to risk and regulatory constraints. Asset class modeling requires three main steps. The first is to assess the product features (risk premium and risks) by considering historical and prospective data, which in the case of fixed income depends on spread and default levels. The second is choosing the quantitative model; in this study we introduce a new credit model which, unlike equity-like models, models default as a main feature of fixed income performance. The final step consists of calibrating the model. In this study we start with the modeling of bond classes and study their behavior in asset allocation; we then model the capital solution transaction as an example of a fixed income structured product.
Besides financial analysis, quantitative tools also play an important role in asset management. By managing the aggregation of large amounts of historical and prospective data on different asset classes, these tools can provide allocation solutions with respect to risk and regulatory constraints. Asset class modeling requires three main steps. The first is to evaluate the product's features (risk premium and risks) by considering historical and prospective data, which in the case of fixed income depends on spread and default levels. The second is choosing the quantitative model; in this study we present a new credit model which, unlike equity-like models, treats default as the main feature of fixed income performance. The final step consists of calibrating the model. We begin this study with the modeling of bond classes and with studying their behavior in asset allocation. We then model the capital solution transaction as an example of a fixed income structured product.
Los estilos APA, Harvard, Vancouver, ISO, etc.
36

Forde, Cameron E. "Modeling biological iron". Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1997. http://www.collectionscanada.ca/obj/s4/f2/dsk2/tape16/PQDD_0021/NQ27644.pdf.

Texto completo
Los estilos APA, Harvard, Vancouver, ISO, etc.
37

Aguenaou, Karim. "Modeling of solidification". Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1997. http://www.collectionscanada.ca/obj/s4/f2/dsk2/ftp02/NQ36950.pdf.

Texto completo
Los estilos APA, Harvard, Vancouver, ISO, etc.
38

Sargent, Aitbala. "Modeling Ice Streams". Fogler Library, University of Maine, 2009. http://www.library.umaine.edu/theses/pdf/SargentA2009.pdf.

Texto completo
Los estilos APA, Harvard, Vancouver, ISO, etc.
39

Heckmann, Dominikus. "Ubiquitous user modeling". Berlin Aka, 2005. http://deposit.d-nb.de/cgi-bin/dokserv?id=2860787&prov=M&dok_var=1&dok_ext=htm.

Texto completo
Los estilos APA, Harvard, Vancouver, ISO, etc.
40

Heckmann, Dominikus. "Ubiquitous user modeling /". Berlin : Akad. Verl.-Ges. Aka, 2006. http://deposit.d-nb.de/cgi-bin/dokserv?id=2860787&prov=M&dok_var=1&dok_ext=htm.

Texto completo
Los estilos APA, Harvard, Vancouver, ISO, etc.
41

Walås, Gustav. "Modeling deposit prices". Thesis, KTH, Matematisk statistik, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-122306.

Texto completo
Resumen
This report investigates whether there are sufficient differences between a bank's depositors to motivate price discrimination. This is done by looking at time series of individual depositors to try to find predictors through a regression analysis. To be able to conclude on the value of more stable deposits for the bank, and hence deduce a price, one also needs to look at the regulatory aspects of deposits and different depositors. Once these qualities of a deposit have been assigned by both the bank and the regulator, they need to be transformed into a price. This is done by replication with market funding instruments.
This study aims to map any differences between depositors in a bank in order to determine whether these differences motivate different interest rates. Any differences are established by analyzing time series of deposited amounts and performing a regression analysis. Bank deposits are also strongly affected by various regulations, so the effects of these are included in the study as well. To obtain a value for the deposits, they are then replicated under given criteria using different debt instruments.
Los estilos APA, Harvard, Vancouver, ISO, etc.
42

Telang, Pankaj Ramesh. "Multiagent Business Modeling". Thesis, North Carolina State University, 2013. http://pqdtopen.proquest.com/#viewpdf?dispub=3575893.

Texto completo
Resumen

Cross-organizational business processes are common in today’s economy. Of necessity, enterprises conduct their business in cooperation to create products and services for the marketplace. Thus business processes inherently involve autonomous partners with heterogeneous software designs and implementations. The existing business modeling approaches that employ high-level abstractions are difficult to operationalize, and the approaches that employ low-level abstractions lead to highly rigid processes that lack business semantics. We propose a novel business model based on multiagent abstractions. Unlike existing approaches, our model gives primacy to the contractual relationships among the business partners, thus providing a notion of business-level correctness, and offers flexibility to the participants. Our approach employs reusable patterns as building blocks to model recurring business scenarios. A step-by-step methodology guides a modeler in constructing a business model. Our approach employs temporal logic to formalize the correctness properties of a business model, and model checking to verify if a given operationalization satisfies those properties. Developer studies found that our approach yields improved model quality compared to the traditional approaches from the supply chain and healthcare domains.

Commitments capture how an agent relates with another agent, whereas goals describe states of the world that an agent is motivated to bring about. It makes intuitive sense that goals and commitments be understood as being complementary to each other. More importantly, an agent’s goals and commitments ought to be coherent, in the sense that an agent’s goals would lead it to adopt or modify relevant commitments and an agent’s commitments would lead it to adopt or modify relevant goals. However, despite the intuitive naturalness of the above connections, they have not yet been studied in a formal framework. This dissertation provides a combined operational semantics for goals and commitments. Our semantics yields important desirable properties, including convergence of the configurations of cooperating agents, thereby delineating some theoretically well-founded yet practical modes of cooperation in a multiagent system.

We formalize the combined operational semantics of achievement commitments and goals in terms of hierarchical task networks (HTNs) and show how HTN planning provides a natural representation and reasoning framework for them. Our approach combines a domain-independent theory capturing the lifecycles of goals and commitments, generic patterns of reasoning, and domain models. We go beyond existing approaches by proposing a first-order representation that accommodates settings where the commitments and goals are templatic and may be applied repeatedly with differing bindings for domain objects. Doing so not only leads to more perspicuous modeling but also enables us to support a variety of practical patterns.

Los estilos APA, Harvard, Vancouver, ISO, etc.
43

White, Scott Alan. "Modeling process redesign". Thesis, Monterey, California. Naval Postgraduate School, 1992. http://hdl.handle.net/10945/23967.

Texto completo
Los estilos APA, Harvard, Vancouver, ISO, etc.
44

Marinsalta, Daniel Alberto. "Optical excisor modeling". Thesis, Monterey, California. Naval Postgraduate School, 1989. http://hdl.handle.net/10945/26131.

Texto completo
Los estilos APA, Harvard, Vancouver, ISO, etc.
45

Hart, Justin Wildrick. "Robot Self-Modeling". Thesis, Yale University, 2015. http://pqdtopen.proquest.com/#viewpdf?dispub=3582284.

Texto completo
Resumen

Traditionally, models of a robot's kinematics and sensors have been provided by designers through manual processes. Such models are used for sensorimotor tasks, such as manipulation and stereo vision. However, these techniques often yield static models based on one-time calibrations or ideal engineering drawings: models that often fail to represent the actual hardware, or in which individual unimodal models, such as those describing kinematics and vision, may disagree with each other.

Humans, on the other hand, are not so limited. One of the earliest forms of self-knowledge learned during infancy is knowledge of the body and senses. Infants learn about their bodies and senses through the experience of using them in conjunction with each other. Inspired by this early form of self-awareness, the research presented in this thesis attempts to enable robots to learn unified models of themselves through data sampled during operation. In the presented experiments, an upper torso humanoid robot, Nico, creates a highly-accurate self-representation through data sampled by its sensors while it operates. The power of this model is demonstrated through a novel robot vision task in which the robot infers the visual perspective representing reflections in a mirror by watching its own motion reflected therein.

In order to construct this self-model, the robot first infers the kinematic parameters describing its arm. This is first demonstrated using an external motion capture system, then implemented in the robot's stereo vision system. In a process inspired by infant development, the robot then mutually refines its kinematic and stereo vision calibrations, using its kinematic structure as the invariant against which the system is calibrated. The product of this procedure is a very precise mutual calibration between these two, traditionally separate, models, producing a single, unified self-model.

The robot then uses this self-model to perform a unique vision task. Knowledge of its body and senses enables the robot to infer the position of a mirror placed in its environment. From this, an estimate of the visual perspective describing reflections in the mirror is computed, which is subsequently refined by comparing the expected positions of images of the robot's end-effector as reflected in the mirror with their real-world, imaged counterparts. The computed visual perspective enables the robot to use the mirror as an instrument for spatial reasoning, by viewing the world from its perspective. This test utilizes knowledge that the robot has inferred about itself through experience, and approximates tests of mirror use that are used as a benchmark of self-awareness in human infants and animals.

Los estilos APA, Harvard, Vancouver, ISO, etc.
46

Xiao, Hui. "MIMO channel modeling". Thesis, University of York, 2008. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.479187.

Texto completo
Los estilos APA, Harvard, Vancouver, ISO, etc.
47

Carr, Justin P. "Modeling Volatility Derivatives". Digital WPI, 2011. https://digitalcommons.wpi.edu/etd-theses/1117.

Texto completo
Resumen
"The VIX was introduced in 1993 by the CBOE and has been commonly referred to as the fear gauge due to decreases in market sentiment leading market participants to purchase protection from declining asset prices. As market sentiment improves, declines in the VIX are generally observed. In reality the VIX measures the markets expectations about future volatility with asset prices either rising or falling in value. With the VIX gaining popularity in the marketplace a proliferation of derivative products has emerged allowing investors to trade volatility. In observance of the behavior of the VIX we attempt to model the derivative VXX as a mean reverting process via the Ornstein-Uhlenbeck stochastic differential equation. We extend this analysis by calibrating VIX options with observed market prices in order to extract the market density function. Using these parameters as the diffusion process in our Ornstein-Uhlenbeck model we derive futures prices on the VIX which serves to value our target derivative VXX."
Los estilos APA, Harvard, Vancouver, ISO, etc.
48

Szummer, Marcin Olof. "Temporal texture modeling". Thesis, Massachusetts Institute of Technology, 1995. http://hdl.handle.net/1721.1/11210.

Texto completo
Resumen
Thesis (M. Eng.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 1995.
Includes bibliographical references (p. 63-67).
by Marcin Olof Szummer.
M.Eng.
Los estilos APA, Harvard, Vancouver, ISO, etc.
49

Mendel, Lucy (Lucy R. ). "Modeling By Example". Thesis, Massachusetts Institute of Technology, 2007. http://hdl.handle.net/1721.1/45974.

Texto completo
Resumen
Thesis (M. Eng.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2007.
Includes bibliographical references (p. 105-109).
Software developers use modeling to explore design alternatives before investing in the higher costs of building the full system. Unlike constructing specific examples, constructing general models is challenging and error-prone. Modeling By Example (MBE) is a new tool designed to help programmers construct general models faster and without errors. Given an object model and an acceptable, or included, example, MBE generates near-hit and near-miss examples for the user to mark as included or not according to their mental goal model. The marked examples form a training data set from which MBE constructs the user's general model. By generating examples dynamically to direct its own learning, MBE learns the concrete goal model with a significantly smaller training data set than conventional instance-based learning techniques. Empirical experiments show that MBE is a practical solution for constructing simple structural models but, even with a number of optimizations to improve performance, it does not scale to learning complex models.
by Lucy Mendel.
M.Eng.
Los estilos APA, Harvard, Vancouver, ISO, etc.
50

Chou, Danielle 1981. "Dahl friction modeling". Thesis, Massachusetts Institute of Technology, 2004. http://hdl.handle.net/1721.1/32826.

Texto completo
Resumen
Thesis (S.B.)--Massachusetts Institute of Technology, Dept. of Mechanical Engineering, 2004.
Includes bibliographical references (p. 22).
The drive behind improved friction models has been better prediction and control of dynamic systems. The earliest model was of classical Coulomb friction; however, the discontinuity during force reversal in the Coulomb friction model has long been a point of contention, since such a discontinuity does not accurately portray the behavior of real systems. Other models have been suggested, but variations of the Dahl solid friction model remain some of the simplest yet most useful. Dahl's original theory proposed that friction behaved as a stress acting upon the quantum mechanical bonds at the interface. Thus, the relationship between frictional force and position would be analogous to a stress-strain curve, complete with hysteresis should there be permanent displacement akin to plastic deformation in materials. This project reviews the variations of the Dahl friction model popular in the literature and then demonstrates the model both analytically, via Matlab and Simulink simulations, and experimentally, by observing the behavior of a limited-angle torque motor. (An illustrative numerical sketch of the basic Dahl law is given after this record.)
by Danielle Chou.
S.B.
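For reference, the basic Dahl law discussed in the preceding record is commonly written as dF/dx = sigma * (1 - (F/Fc) * sgn(v)) ** alpha. Below is a minimal, illustrative Euler integration under assumed parameter values; it is not the thesis's Matlab/Simulink implementation.

import numpy as np

sigma, Fc, alpha = 1.0e4, 1.0, 1.0        # assumed stiffness, Coulomb force level, shape exponent
dt = 1.0e-4
t = np.arange(0.0, 2.0, dt)
x = 1.0e-3 * np.sin(2.0 * np.pi * t)      # prescribed sinusoidal displacement
v = np.gradient(x, dt)                    # velocity of the prescribed motion

F = np.zeros_like(t)
for i in range(1, len(t)):
    dFdx = sigma * (1.0 - (F[i - 1] / Fc) * np.sign(v[i - 1])) ** alpha
    F[i] = F[i - 1] + dFdx * (x[i] - x[i - 1])

# Plotting F against x traces the hysteresis loop characteristic of the Dahl model.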
Los estilos APA, Harvard, Vancouver, ISO, etc.