Dissertations / Theses on the topic 'Stochastic simulator'

Consult the top 50 dissertations / theses for your research on the topic 'Stochastic simulator.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses in a wide variety of disciplines and organise your bibliography correctly.

1

Chua, Cheong Wei 1975. "A stochastic pool-based electricity market simulator." Thesis, McGill University, 2000. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=31045.

Abstract:
In Part I, two pool-based electricity market models, the Lossless Economic Dispatch (LED) and the Optimal Power Flow (OPF), are compared in terms of their economic impact on the market participants. The OPF is shown to be economically more efficient, more accurate and more equitable to the participants.
In Part II, a stochastic electricity market simulator (SEMS) is designed using elements of Monte Carlo methods and game theory. Each generator is assumed to operate in a stochastic manner, according to a bid strategy composed of a set of pre-established bid instances and a corresponding set of bid probabilities. The Pool dispatches power and defines prices according to either the LED or the OPF model from Part I. Generators can update their bidding strategies according to a profit performance index reflecting their degree of risk tolerance: Chicken (risk averse), Average, and Cowboy (risk taker). SEMS can predict issues such as unintended collusion, as well as evaluate bidding strategies.
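As a rough illustration of the kind of stochastic bidding step described above, the following sketch (hypothetical names and numbers, not the SEMS code) samples a generator's bid from a set of pre-established bid instances according to its current bid probabilities, and nudges those probabilities toward the bid instance that has earned the highest profit:

    import random

    # Illustrative sketch only: one generator's stochastic bid and a crude strategy update.
    bid_instances = [20.0, 25.0, 30.0]   # hypothetical bid prices ($/MWh)
    bid_probs = [1/3, 1/3, 1/3]          # current bid probabilities (sum to 1)

    def sample_bid():
        # draw one bid instance according to the current probabilities
        return random.choices(bid_instances, weights=bid_probs, k=1)[0]

    def update_probs(avg_profits, learning_rate=0.1):
        # avg_profits[i]: average profit observed so far when bid_instances[i] was used;
        # shift probability mass toward the most profitable bid instance
        best = max(range(len(avg_profits)), key=lambda i: avg_profits[i])
        for i in range(len(bid_probs)):
            target = 1.0 if i == best else 0.0
            bid_probs[i] += learning_rate * (target - bid_probs[i])

    bid = sample_bid()   # one Monte Carlo round: every generator bids, the pool dispatches

One way to reflect the risk-tolerance categories would be to vary the learning rate between the Chicken, Average and Cowboy profiles; the thesis's actual update rule may differ.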
2

Kim, Daniel D. 1982. "A biological simulator using a stochastic approach for synthetic biology." Thesis, Massachusetts Institute of Technology, 2005. http://hdl.handle.net/1721.1/33307.

Abstract:
Thesis (M. Eng.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2005.
Includes bibliographical references (leaves 58-59).
Synthetic Biology is a new engineering discipline created by the development of genetic engineering technology. Part of establishing a new engineering discipline is creating new tools that form an integrated engineering environment. In this thesis, I designed and implemented a biological system simulator that will enable synthetic biologists to simulate their systems before they put time into building actual physical cells. Improvements over the simulators currently in use include a design that enables extensions in functionality, external input signals, and a GUI that allows user interaction. The significance of the simulation results was tested by comparing them to actual live cellular experiments. The results showed that the new simulator can successfully simulate the trends of a simple synthetic cell.
by Daniel D. Kim.
M.Eng.
3

Fan, Futing. "Improving GEMFsim: a stochastic simulator for the generalized epidemic modeling framework." Kansas State University, 2016. http://hdl.handle.net/2097/34564.

Abstract:
Master of Science
Department of Electrical and Computer Engineering
Caterina M. Scoglio
The generalized epidemic modeling framework simulator (GEMFsim) is a tool designed by Dr. Faryad Sahneh, a former PhD student in the NetSE group. GEMFsim simulates stochastic spreading processes over complex networks. It was first introduced in Dr. Sahneh's doctoral dissertation "Spreading processes over multilayer and interconnected networks" and implemented in Matlab. Because of the limitations of the Matlab language, this implementation typically handles only small networks; the slow simulation speed cannot generate enough results in reasonable time for large networks. As a generalized tool, this framework must be equipped to handle large networks and provide adequate performance. The C language, a low-level language that effectively maps a program to machine instructions for efficient execution, was selected for this study. After implementing GEMFsim in C, I wrapped it into Python and R libraries, allowing users to enjoy the flexibility of these interpreted languages without sacrificing performance. GEMFsim's limitations are not confined to the implementation language, however. With the original algorithm (Gillespie's Direct Method), the simulation speed is inversely proportional to network size, resulting in unacceptable speed for very large networks. This study therefore applied the Next Reaction Method, making the performance independent of network size. As long as the network fits into memory, the speed is proportional to the average node degree of the network, which is not very large for most real-world networks. This study also applied parallel computing in order to utilize multiple cores for repeated simulations. Although a single simulation cannot be parallelized, because it is a Markov process, multiple simulations with identical network structures were run simultaneously, sharing one network description in memory.
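For context, a minimal sketch of one Gillespie Direct Method step for an SIS-type spreading process on a network is shown below (an illustration of the algorithmic bottleneck, not GEMFsim's code): every event selection scans all node rates, which is why the cost per event grows with network size, whereas the Next Reaction Method only updates the nodes affected by the last event.

    import random

    # Illustrative sketch: one Gillespie Direct Method step for a network SIS process.
    # Susceptible nodes are infected at rate beta * (number of infected neighbours);
    # infected nodes recover at rate delta.
    def gillespie_step(adj, state, beta, delta, t):
        # adj: node -> list of neighbours; state: node -> 'S' or 'I'
        rates = {}
        for node, s in state.items():
            if s == 'I':
                rates[node] = delta
            else:
                k_inf = sum(1 for nb in adj[node] if state[nb] == 'I')
                if k_inf > 0:
                    rates[node] = beta * k_inf
        total = sum(rates.values())
        if total == 0.0:
            return None                           # absorbing state: nothing can happen
        t += random.expovariate(total)            # exponential waiting time to the next event
        r, acc = random.uniform(0.0, total), 0.0
        for node, rate in rates.items():          # linear scan over all node rates
            acc += rate
            if r <= acc:
                state[node] = 'S' if state[node] == 'I' else 'I'
                break
        return t

    adj = {0: [1], 1: [0, 2], 2: [1]}
    state = {0: 'I', 1: 'S', 2: 'S'}
    t = gillespie_step(adj, state, beta=0.5, delta=1.0, t=0.0)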
4

Boulianne, Laurier. "An algorithm and VLSI architecture for a stochastic particle based biological simulator." Thesis, McGill University, 2011. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=96690.

Abstract:
With the recent progress in both computer technology and systems biology, it is now possible to simulate and visualise biological systems virtually. It is expected that realistic in silico simulations will enhance our understanding of biological processes and will promote the development of effective therapeutic treatments. Realistic biochemical simulators aim to improve our understanding of biological processes that could not otherwise be properly understood through experimental studies. This situation calls for increasingly accurate simulators that take into account not only the stochastic nature of biological systems, but also their spatial heterogeneity and the effects of molecular crowding. This thesis presents a novel particle-based stochastic biological simulator named GridCell. It also presents a novel VLSI architecture accelerating GridCell by between one and two orders of magnitude. GridCell is a three-dimensional simulation environment for investigating the behaviour of biochemical networks under a variety of spatial influences including crowding, recruitment and localisation. GridCell enables the tracking and characterisation of individual particles, leading to insights on the behaviour of low-copy-number molecules participating in signalling networks. The simulation space is divided into a discrete 3D grid that provides ideal support for particle collisions without distance calculations and particle searches. SBML support enables existing networks to be simulated and visualised. The user interface provides intuitive navigation that facilitates insights into species behaviour across spatial and temporal dimensions. Crowding effects on a Michaelis-Menten system are simulated and the results show they can have a huge impact on the effective rate of product formation. Tracking millions of particles is extremely computationally expensive, and in order to run whole cells at molecular resolution in less than 24 hours, a commonly expressed goal in systems biology, accelerating GridCell with parallel hardware is required. An FPGA architecture combining pipelining, parallel processing units and streaming is presented. The architecture is scalable to multiple FPGAs, and the streaming approach ensures that it scales well to very large systems. An architecture containing 25 processing units on each stage of the pipeline is synthesised on a single Virtex-6 XC6VLX760 FPGA device and a speedup of 76x over the serial implementation is achieved. This speedup reduces the gap between the complexity of cell simulation and the processing power of advanced simulators. Future work on GridCell could include support for highly complex compartments and high-definition particles.
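The collision mechanism described above can be pictured with a small sketch (hypothetical data structures, not GridCell's implementation): because particles occupy voxels of a discrete 3D grid, a proposed move only needs one occupancy lookup, with no distance calculations or neighbour searches.

    import random

    # Illustrative sketch: lattice particles with collision detection by voxel occupancy.
    SIZE = (64, 64, 64)
    occupancy = {}                        # (x, y, z) -> particle id

    def try_move(pid, pos):
        dx, dy, dz = random.choice([(1, 0, 0), (-1, 0, 0), (0, 1, 0),
                                    (0, -1, 0), (0, 0, 1), (0, 0, -1)])
        target = ((pos[0] + dx) % SIZE[0], (pos[1] + dy) % SIZE[1], (pos[2] + dz) % SIZE[2])
        if target in occupancy:
            return pos                    # collision: stay put (a reaction rule could fire here)
        del occupancy[pos]
        occupancy[target] = pid
        return target

    occupancy[(1, 2, 3)] = "p1"
    new_pos = try_move("p1", (1, 2, 3))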
5

Soltani-Moghaddam, Alireza. "Network simulator design with extended object model and generalized stochastic petri-net." free to MU campus, to others for purchase, 2000. http://wwwlib.umi.com/cr/mo/fullcit?p9999317.

6

Taleb, B. "The theory and design of a stochastic reliability simulator for large scale systems." Thesis, Open University, 1988. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.383689.

7

PINTO, ROBERTO JOSE. "STOCHASTIC SIMULATOR TO CALCULATE THE AGENTS FINANCIAL FLOW AT BRAZILIAN WHOLESALE ENERGY MARKET." PONTIFÍCIA UNIVERSIDADE CATÓLICA DO RIO DE JANEIRO, 2002. http://www.maxwell.vrac.puc-rio.br/Busca_etds.php?strSecao=resultado&nrSeq=2876@1.

Abstract:
ELETROBRAS - CENTRAIS ELÉTRICAS BRASILEIRAS S. A.
In the new trading model for the Brazilian electricity sector, the Wholesale Energy Market (Mercado Atacadista de Energia, MAE) is the place where all buyers and sellers of electricity can trade and where the spot price of energy is determined. In this market the agents can sell excess generation or the positive net energy of bilateral contracts. Conversely, agents with a generation shortfall or a negative net position on bilateral contracts are exposed to the spot market price. The market price and the amount of energy that each agent can trade at the MAE depend on many factors, such as future hydrological conditions. This causes financial-flow uncertainties for all market agents. This dissertation therefore presents a model that settles the market accounts using the MAE rules together with future estimates of generation and consumption. The results of this model can help the agents forecast their payments and receipts at the MAE.
8

Olsén, Jörgen. "Stochastic Modeling and Simulation of the TCP protocol." Doctoral thesis, Uppsala University, Mathematical Statistics, 2003. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-3534.

Abstract:

The success of the current Internet relies to a large extent on a cooperation between the users and the network. The network signals its current state to the users by marking or dropping packets. The users then strive to maximize the sending rate without causing network congestion. To achieve this, the users implement a flow-control algorithm that controls the rate at which data packets are sent into the Internet. More specifically, the Transmission Control Protocol (TCP) is used by the users to adjust the sending rate in response to changing network conditions. TCP uses the observation of packet loss events and estimates of the round trip time (RTT) to adjust its sending rate.

In this thesis we investigate and propose stochastic models for TCP. The models are used to estimate network performance measures such as throughput, link utilization, and packet loss rate. The first part of the thesis introduces the TCP protocol and contains an extensive TCP modeling survey that summarizes the most important TCP modeling work. Reviewed models are categorized as renewal theory models, fixed-point methods, fluid models, processor sharing models or control theoretic models. The merits of each category are discussed and guidelines are given for which framework to use in future TCP modeling.
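As a concrete example of the closed-form estimates that renewal-theory TCP models produce, the widely cited square-root formula of Mathis et al. (given here only for orientation; it is not necessarily one of the models derived in this thesis) relates the steady-state throughput T to the segment size MSS, the round trip time RTT and the packet loss probability p:

\[ T \;\approx\; \frac{\mathrm{MSS}}{\mathrm{RTT}\,\sqrt{2p/3}} \]

The formula captures the basic trade-off discussed above: throughput falls with longer round trip times and with higher loss rates.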

The second part of the thesis contains six papers on TCP modeling. Within the renewal theory framework we propose single-source TCP-Tahoe and TCP-NewReno models. We investigate the performance of these protocols in both a DropTail and a RED queuing environment. The aspects of TCP performance that inherently depend on the actual implementation of the flow-control algorithm are singled out from those that depend on the queuing environment.

Using the fixed-point framework, we propose models that estimate packet loss rate and link utilization for a network with multiple TCP-Vegas, TCP-SACK and TCP-Reno on/off sources. The TCP-Vegas model is novel and is the first model capable of estimating the network's operating point for TCP-Vegas sources sending on/off traffic. All TCP and network models in the contributed research papers are validated via simulations with the network simulator ns-2.

This thesis serves both as an introduction to TCP and as an extensive orientation on state-of-the-art stochastic TCP models.

9

Erben, Vojtěch. "Návrh a testování stochastické navigace v TRASI." Master's thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2013. http://www.nusl.cz/ntk/nusl-219900.

Abstract:
The thesis deals with the design and implementation of routing algorithms in the traffic simulator TRASI. These algorithms are capable of planning a vehicle's route given a set of crossroads that the vehicle needs to pass through. Furthermore, this work deals with the design and implementation of stochastic navigation, including the implementation of communication between vehicles. Stochastic navigation suggests several alternative routes in response to a traffic event. One of these routes is then chosen randomly (stochastically), based on information about the throughput of the routes found. The introduction describes the traffic simulator TRASI, its user interface, and its basic control interface. The theory of traffic flow at the macroscopic and microscopic levels is then described, followed by a description of algorithms for traversing oriented graphs and their implementation in the simulator. The following parts of the thesis describe the communication layer that handles communication between vehicles and its implementation, as well as the design and implementation of the stochastic navigation. The final chapter verifies the functionality of the simulator and tests the individual routing algorithms.
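The stochastic choice among alternative routes can be illustrated with a tiny sketch (hypothetical function and values, not the TRASI implementation): each candidate route is weighted by its estimated throughput, and one route is drawn at random with probability proportional to that weight.

    import random

    # Illustrative sketch: throughput-weighted random route choice.
    def choose_route(routes, throughput):
        # routes: list of route ids; throughput: route id -> estimated throughput (vehicles/h)
        weights = [throughput[r] for r in routes]
        return random.choices(routes, weights=weights, k=1)[0]

    routes = ["via_A", "via_B", "via_C"]
    throughput = {"via_A": 1800.0, "via_B": 600.0, "via_C": 1200.0}
    picked = choose_route(routes, throughput)   # the congested route is picked less often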
10

Echavarria, Gregory Maria Angelica. "Predictive Data-Derived Bayesian Statistic-Transport Model and Simulator of Sunken Oil Mass." Scholarly Repository, 2010. http://scholarlyrepository.miami.edu/oa_dissertations/471.

Abstract:
Sunken oil is difficult to locate because remote sensing techniques cannot as yet provide views of sunken oil over large areas. Moreover, the oil may re-suspend and sink with changes in salinity, sediment load, and temperature, making deterministic fate models difficult to deploy and calibrate when even the presence of sunken oil is difficult to assess. For these reasons, together with the expense of field data collection, there is a need for a statistical technique integrating limited data collection with stochastic transport modeling. Predictive Bayesian modeling techniques have been developed and demonstrated for exploiting limited information for decision support in many other applications. These techniques are brought here to a multi-modal Lagrangian modeling framework, representing a near-real-time approach to locating and tracking sunken oil, driven by intrinsic physical properties of field data collected following a spill after oil has begun collecting on a relatively flat bay bottom. Methods include (1) development of the conceptual predictive Bayesian model and multi-modal Gaussian computational approach based on theory and literature review; (2) development of an object-oriented programming and combinatorial structure capable of managing data, integration and computation over an uncertain and highly dimensional parameter space; (3) creation of a new bi-dimensional approach of the method of images to account for curved shoreline boundaries; (4) confirmation of model capability for locating sunken oil patches using available (partial) real field data and capability for temporal projections near curved boundaries using simulated field data; and (5) development of a stand-alone open-source computer application with a graphical user interface capable of calibrating instantaneous oil spill scenarios, obtaining sets of maps of relative probability profiles at different prediction times and user-selected geographic areas and resolutions, and capable of performing post-processing tasks typical of basic GIS-like software. The result is a predictive Bayesian multi-modal Gaussian model, SOSim (Sunken Oil Simulator) Version 1.0rc1, operational for use with limited, randomly sampled, available subjective and numeric data on sunken oil concentrations and locations in relatively flat-bottomed bays. The SOSim model represents a new approach, coupling a Lagrangian modeling technique with predictive Bayesian capability for computing unconditional probabilities of mass as a function of space and time. The approach addresses the current need to rapidly deploy modeling capability without readily accessible information on ocean bottom currents.
Contributions include (1) the development of the apparently first pollutant transport model for computing unconditional relative probabilities of pollutant location as a function of time based on limited available field data alone; (2) the development of a numerical method for computing concentration profiles subject to curved, continuous or discontinuous boundary conditions; (3) the development of combinatorial algorithms to compute unconditional multimodal Gaussian probabilities not amenable to analytical or Markov-chain Monte Carlo integration due to high dimensionality; and (4) the development of software modules, including a core module containing the developed Bayesian functions, a wrapping graphical user interface, a processing and operating interface, and the necessary programming components that lead to an open-source, stand-alone, executable computer application (SOSim - Sunken Oil Simulator). Extensions and refinements are recommended, including the addition of capability for accepting available information on bathymetry and possibly bottom currents as Bayesian prior information, the creation of capability for modeling continuous oil releases, and the extension to tracking of suspended oil (3-D).
11

Montanari, Carlo Emilio. "Development of an event-based simulator for analysing excluded volume effects in a Brownian gas." Bachelor's thesis, Alma Mater Studiorum - Università di Bologna, 2017. http://amslaurea.unibo.it/14527/.

Abstract:
The aim of this work is the development of an event-based molecular dynamics simulator in C++, able to simulate the simple Newtonian dynamics of two-dimensional molecules of arbitrary shape. We used the NOCS simulator to set up a first attempt at investigating and analysing excluded-volume effects on the Brownian motion of molecules. In particular, we look for local violations of isotropy in the Brownian motion. The theoretical part of the work reviews the fundamental mathematical and statistical tools of the kinetic theory of gases and the main models of the depletion force, one of the phenomena caused by the excluded-volume potential.
12

Azzi, Soumaya. "Surrogate modeling of stochastic simulators." Electronic Thesis or Diss., Institut polytechnique de Paris, 2020. http://www.theses.fr/2020IPPAT009.

Abstract:
This thesis is a contribution to surrogate modeling and sensitivity analysis for stochastic simulators. Stochastic simulators are a particular type of computational model: they inherently contain some sources of randomness and are generally computationally prohibitive. To overcome this limitation, this manuscript proposes a method to build a surrogate model for stochastic simulators based on the Karhunen-Loève expansion. This thesis also aims to perform sensitivity analysis on such computational models. This analysis consists in quantifying the influence of the input variables on the output of the model. In this thesis, the stochastic simulator is represented by a stochastic process, and the sensitivity analysis is then performed on the differential entropy of this process. The proposed methods are applied to a stochastic simulator assessing the population's exposure to radio-frequency waves in a city. Randomness is an intrinsic characteristic of the stochastic city generator, meaning that a given set of city parameters (e.g. street width, building height and anisotropy) does not define a unique city. The context of the electromagnetic dosimetry case study is presented, a surrogate model is built, and the sensitivity analysis is then performed using the proposed method.
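A minimal sketch of a Karhunen-Loève surrogate of this kind is given below (simplifying assumptions: replicated output curves on a common grid and independent Gaussian coefficients; this is not the thesis implementation): the simulator is run several times, the empirical covariance of the outputs is diagonalised, and new output samples are then generated cheaply from a few random coefficients.

    import numpy as np

    # Illustrative sketch: empirical Karhunen-Loeve surrogate of a stochastic simulator.
    def fit_kl(outputs, n_modes=5):
        # outputs: (n_runs, n_points) array of replicated simulator output curves
        mean = outputs.mean(axis=0)
        cov = np.cov(outputs, rowvar=False)
        eigval, eigvec = np.linalg.eigh(cov)
        order = np.argsort(eigval)[::-1][:n_modes]      # keep the leading modes
        return mean, eigval[order], eigvec[:, order]

    def sample_surrogate(mean, eigval, eigvec, rng=np.random.default_rng()):
        xi = rng.standard_normal(len(eigval))           # independent KL coefficients
        return mean + eigvec @ (np.sqrt(np.maximum(eigval, 0.0)) * xi)

    runs = np.random.default_rng(1).normal(size=(200, 100))   # stand-in for simulator output
    mean, lam, phi = fit_kl(runs)
    new_curve = sample_surrogate(mean, lam, phi)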
13

Ghafari, Jouneghani Farzad. "Photonic quantum information science for stochastic simulation and non-locality." Thesis, Griffith University, 2019. http://hdl.handle.net/10072/386534.

Abstract:
Since its inception, quantum information processing (QIP) has branched into many new directions. As well as well-known applications such as teleportation and metrology, researchers are beginning to investigate interdisciplinary areas such as quantum machine learning. One of the interesting areas is where quantum information science meets stochastic modelling, from the field of complexity science. Recently, it has been shown theoretically that, for simulating classical systems, quantum-assisted models and simulators are more efficient, in terms of the memory storage they require, than classical computers. That is, for most systems, classical simulators demand an excessive amount of memory storage, while quantum simulators can do the same simulation with less memory. The main focus of this thesis is the experimental realisation of quantum simulators that are capable of simulating stochastic processes with a reduced amount of memory. To implement the (nearly) exact simulation of stochastic processes using quantum simulators, it is essential to have low-noise state preparation, robust unitary operations, and high-precision read-out. These requirements, and the flexibility and precision of photonic quantum optics, make photonics the ideal system for developing the science of this new quantum advantage and for making strides towards its technological realisation. In the context of stochastic simulation, I have experimentally studied three key problems that serve as stepping stones in advancing this new field. In the first experiment, an error-tolerant quantum simulator was designed and realised to simulate a 1D Ising spin chain, using internal states that store less information than the corresponding classical approach. Furthermore, an interesting and fundamental phenomenon named the ambiguity of simplicity is witnessed. This is the inconsistency that we observe in the order of relative complexity of two systems when we change the simulators from classical to quantum. In the second experiment, a quantum simulator was built that can simulate classical processes for more than one step of the simulation at a time, storing the information from multiple steps coherently. In the third experiment, a new type of quantum memory advantage, based on the dimensionality of the memory register rather than on an information entropy measure, is demonstrated. This advantage is realisable in a single simulator, in contrast to previous works: realising the dimensionality memory advantage in practical applications does not rely on running multiple simulators in parallel. In the field of optical QIP, realising an ideal single-photon source has been a long-standing challenge. In another part of my PhD, I worked on a source that aims to tackle some of the existing issues in this context. We built a source of high-quality entangled photons with a high heralding efficiency, which has the potential to be used in multi-photon experiments. This source was also essential to demonstrate a fundamental task in quantum non-locality, one-way steering, which was performed conclusively for the first time.
Thesis (PhD Doctorate)
Doctor of Philosophy (PhD)
School of Environment and Sc
Science, Environment, Engineering and Technology
14

Basak, Subhasish. "Multipathogen quantitative risk assessment in raw milk soft cheese : monotone integration and Bayesian optimization." Electronic Thesis or Diss., université Paris-Saclay, 2024. http://www.theses.fr/2024UPASG021.

Abstract:
This manuscript focuses on Bayesian optimization of a quantitative microbiological risk assessment (QMRA) model, in the context of the European project ArtiSaneFood, supported by the PRIMA program. The primary goal is to establish efficient bio-intervention strategies for cheese producers in France. This work is divided into three broad directions: 1) development and implementation of a multipathogen QMRA model for raw milk soft cheese, 2) studying monotone integration methods for estimating outputs of the QMRA model, and 3) designing a Bayesian optimization algorithm tailored for a stochastic and computationally expensive simulator. In the first part we propose a multipathogen QMRA model, built upon existing studies in the literature (see, e.g., Bonifait et al., 2021, Perrin et al., 2014, Sanaa et al., 2004, Strickland et al., 2023). This model estimates the impact on public health of foodborne illnesses caused by pathogenic STEC, Salmonella and Listeria monocytogenes, which can potentially be present in raw milk soft cheese. This farm-to-fork model also implements the intervention strategies related to milk and cheese testing, which allows the cost of intervention to be estimated. An implementation of the QMRA model for STEC is provided in R and in the FSKX framework (Basak et al., under review). The second part of this manuscript investigates the potential application of sequential integration methods, leveraging the monotonicity and boundedness properties of the simulator outputs. We conduct a comprehensive literature review of existing integration methods (see, e.g., Kiefer, 1957, Novak, 1992), and delve into the theoretical findings regarding their convergence. Our contribution includes proposing enhancements to these methods and a discussion of the challenges associated with their application in the QMRA domain. In the final part of this manuscript, we propose a Bayesian multiobjective optimization algorithm for estimating the Pareto-optimal inputs of a stochastic and computationally expensive simulator. The proposed approach is motivated by the principle of Stepwise Uncertainty Reduction (SUR) (see, e.g., Vazquez and Bect, 2009, Vazquez and Martinez, 2006, Villemonteix et al., 2007), with a weighted integrated mean squared error (w-IMSE) based sampling criterion focused on the estimation of the Pareto front. A numerical benchmark is presented, comparing the proposed algorithm with PALS (Pareto Active Learning for Stochastic simulators) (Barracosa et al., 2021), over a set of bi-objective test problems. We also propose an extension (Basak et al., 2022a) of the PALS algorithm, tailored to the QMRA application case.
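The bi-objective setting can be illustrated with a small sketch (a generic illustration under simplifying assumptions, not the SUR/w-IMSE algorithm proposed in the thesis): each design of a stochastic simulator is evaluated several times, the two objectives are estimated by sample means, and the empirical Pareto front is the set of non-dominated designs.

    import numpy as np

    # Illustrative sketch: empirical Pareto front of a noisy bi-objective simulator (minimisation).
    def empirical_pareto(mean_objs):
        # mean_objs: (n_designs, 2) array of estimated objective values
        front = []
        for i, fi in enumerate(mean_objs):
            dominated = any(np.all(fj <= fi) and np.any(fj < fi)
                            for j, fj in enumerate(mean_objs) if j != i)
            if not dominated:
                front.append(i)
        return front

    replicates = np.random.default_rng(0).normal(size=(20, 10, 2))  # (designs, runs, objectives)
    front = empirical_pareto(replicates.mean(axis=1))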
15

Hellander, Stefan. "Stochastic Simulation of Reaction-Diffusion Processes." Doctoral thesis, Uppsala universitet, Avdelningen för beräkningsvetenskap, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-198522.

Abstract:
Numerical simulation methods have become an important tool in the study of chemical reaction networks in living cells. Many systems can, with high accuracy, be modeled by deterministic ordinary differential equations, but other systems require a more detailed level of modeling. Stochastic models at either the mesoscopic level or the microscopic level can be used for cases when molecules are present in low copy numbers. In this thesis we develop efficient and flexible algorithms for simulating systems at the microscopic level. We propose an improvement to the Green's function reaction dynamics algorithm, an efficient microscale method. Furthermore, we describe how to simulate interactions with complex internal structures such as membranes and dynamic fibers. The mesoscopic level is related to the microscopic level through the reaction rates at the respective scale. We derive that relation in both two dimensions and three dimensions and show that the mesoscopic model breaks down if the discretization of space becomes too fine. For a simple model problem we can show exactly when this breakdown occurs. We show how to couple the microscopic scale with the mesoscopic scale in a hybrid method. Using the fact that some systems only display microscale behaviour in parts of the system, we can gain computational time by restricting the fine-grained microscopic simulations to only a part of the system. Finally, we have developed a mesoscopic method that couples simulations in three dimensions with simulations on general embedded lines. The accuracy of the method has been verified by comparing the results with purely microscopic simulations as well as with theoretical predictions.
eSSENCE
16

Drawert, Brian J. "Spatial Stochastic Simulation of Biochemical Systems." Thesis, University of California, Santa Barbara, 2013. http://pqdtopen.proquest.com/#viewpdf?dispub=3559784.

Abstract:

Recent advances in biology have shown that proteins and genes often interact probabilistically. The effects that arise from these stochastic dynamics differ significantly from those of traditional deterministic formulations, and have biologically significant ramifications. This has led to the development of computational models of the discrete stochastic biochemical pathways found in living organisms. These include spatial stochastic models, where the physical extent of the domain plays an important role, analogous to traditional partial differential equations.

Simulation of spatial stochastic models is a computationally intensive task. We have developed a new algorithm, the Diffusive Finite State Projection (DFSP) method, for the efficient and accurate simulation of stochastic spatially inhomogeneous biochemical systems. DFSP makes use of a novel formulation of Finite State Projection (FSP) to simulate diffusion, while reactions are handled by the Stochastic Simulation Algorithm (SSA). Further, we adapt DFSP to three-dimensional, unstructured, tetrahedral meshes for inclusion in the mature and widely used systems biology modeling software URDME, enabling simulation of the complex geometries found in biological systems. Additionally, we extend DFSP with adaptive error control and a highly efficient parallel implementation for graphics processing units (GPUs).

In an effort to understand biological processes that exhibit stochastic dynamics, we have developed a spatial stochastic model of cellular polarization. Specifically, we investigate the ability of yeast cells to sense a spatial gradient of mating pheromone and respond by forming a projection in the direction of the mating partner. Our results demonstrate that higher levels of stochastic noise result in increased robustness, giving support to a cellular model where noise and spatial heterogeneity combine to achieve robust biological function. This also highlights the importance of spatial stochastic modeling in reproducing experimental observations.

17

Homem, de Mello Tito. "Simulation-based methods for stochastic optimization." Diss., Georgia Institute of Technology, 1998. http://hdl.handle.net/1853/24846.

18

Morton-Firth, Carl Jason. "Stochastic simulation of cell signalling pathways." Thesis, University of Cambridge, 1998. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.625063.

19

Cheung, Ricky. "Stochastic based football simulation using data." Thesis, Uppsala universitet, Matematiska institutionen, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-359835.

Abstract:
This thesis is an extension of a football simulator made in a previous project, in which we also made different visualizations and simulators based on football data. The goal is to create a football simulator based on a modified Markov chain process, in which two teams can be chosen, to simulate entire football matches play-by-play. To validate our model, we compare simulated data with the data provided by Opta. Several adjustments are made to make the simulation as realistic as possible. After conducting a few experiments comparing simulated data with real data before and after these adjustments, we conclude that the model may not be sufficiently accurate to reflect real-life matches.
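A play-by-play Markov chain of this kind can be sketched as follows (the states and probabilities here are hypothetical, not the fitted Opta-based model): each play is drawn from the transition probabilities of the current state, and goals are tallied as the chain is run.

    import random

    # Illustrative sketch: a toy Markov-chain play-by-play football simulation.
    transitions = {
        "home_possession": [("home_shot", 0.10), ("away_possession", 0.30), ("home_possession", 0.60)],
        "away_possession": [("away_shot", 0.10), ("home_possession", 0.30), ("away_possession", 0.60)],
        "home_shot":       [("home_goal", 0.11), ("away_possession", 0.89)],
        "away_shot":       [("away_goal", 0.11), ("home_possession", 0.89)],
        "home_goal":       [("away_possession", 1.0)],
        "away_goal":       [("home_possession", 1.0)],
    }

    def simulate_match(n_plays=1000):
        state, score = "home_possession", {"home": 0, "away": 0}
        for _ in range(n_plays):
            next_states, probs = zip(*transitions[state])
            state = random.choices(next_states, weights=probs, k=1)[0]
            if state == "home_goal":
                score["home"] += 1
            elif state == "away_goal":
                score["away"] += 1
        return score

    result = simulate_match()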
20

Du, Manuel. "Stochastic simulation studies for honeybee breeding." Doctoral thesis, Humboldt-Universität zu Berlin, 2021. http://dx.doi.org/10.18452/22295.

Abstract:
The present work describes a stochastic simulation program for modelling honeybee populations under breeding conditions. The program was newly implemented to investigate and optimize different selection strategies. A first study evaluated to what extent the program's predictions depend on the underlying genetic model. It was found that the finite locus model rather than the infinitesimal model should be used for long-term investigations. A second study shed light on the importance of controlled mating for honeybee breeding. It was found that breeding schemes with controlled mating are far superior to free-mating alternatives. Ultimately, a final study examined how successful breeding strategies can be designed so that they are sustainable in the long term. For this, short-term genetic progress has to be weighed against the avoidance of inbreeding in the long run. Through extensive simulations, optimal selection intensities on the maternal and paternal paths could be determined for different sets of population parameters.
21

Vasan, Arunchandar. "Timestepped stochastic simulation of 802.11 WLANs." College Park, Md.: University of Maryland, 2008. http://hdl.handle.net/1903/8533.

Abstract:
Thesis (Ph. D.) -- University of Maryland, College Park, 2008.
Thesis research directed by: Dept. of Computer Science. Title from t.p. of PDF. Includes bibliographical references. Published by UMI Dissertation Services, Ann Arbor, Mich. Also available in paper.
22

Hashemi, Fatemeh Sadat. "Sampling Controlled Stochastic Recursions: Applications to Simulation Optimization and Stochastic Root Finding." Diss., Virginia Tech, 2015. http://hdl.handle.net/10919/76740.

Abstract:
We consider unconstrained Simulation Optimization (SO) problems, that is, optimization problems where the underlying objective function is unknown but can be estimated at any chosen point by repeatedly executing a Monte Carlo (stochastic) simulation. SO, introduced more than six decades ago through the seminal work of Robbins and Monro (and later by Kiefer and Wolfowitz), has recently generated much attention. Such interest is primarily because of SO's flexibility, allowing the implicit specification of functions within the optimization problem, thereby providing the ability to embed virtually any level of complexity. The result of such versatility has been evident in SO's ready adoption in fields as varied as finance, logistics, healthcare, and telecommunication systems. While SO has become popular over the years, Robbins and Monro's original stochastic approximation algorithm and its numerous modern incarnations have seen only mixed success in solving SO problems. The primary reason for this is stochastic approximation's explicit reliance on a sequence of algorithmic parameters to guarantee convergence. The theory for choosing such parameters is now well-established, but most such theory focuses on asymptotic performance. Automatically choosing parameters to ensure good finite-time performance has remained vexingly elusive, as evidenced by continuing efforts six decades after the introduction of stochastic approximation! The other popular paradigm to solve SO is what has been called sample-average approximation. Sample-average approximation, more a philosophy than an algorithm to solve SO, attempts to leverage advances in modern nonlinear programming by first constructing a deterministic approximation of the SO problem using a fixed sample size, and then applying an appropriate nonlinear programming method. Sample-average approximation is reasonable as a solution paradigm but again suffers from finite-time inefficiency because of the simplistic manner in which sample sizes are prescribed. It turns out that in many SO contexts, the effort expended to execute the Monte Carlo oracle is the single most computationally expensive operation. Sample-average approximation essentially ignores this issue since, irrespective of where in the search space an incumbent solution resides, prescriptions for sample sizes within sample-average approximation remain the same. Like stochastic approximation, notwithstanding beautiful asymptotic theory, sample-average approximation suffers from the lack of automatic implementations that guarantee good finite-time performance. In this dissertation, we ask: can advances in algorithmic nonlinear programming theory be combined with intelligent sampling to create solution paradigms for SO that perform well in finite time while exhibiting asymptotically optimal convergence rates? We propose and study a general solution paradigm called Sampling Controlled Stochastic Recursion (SCSR). Two simple ideas are central to SCSR: (i) use any recursion, particularly one that you would use (e.g., Newton and quasi-Newton, fixed-point, trust-region, and derivative-free recursions) if the functions involved in the problem were known through a deterministic oracle; and (ii) estimate objects appearing within the recursions (e.g., function derivatives) using Monte Carlo sampling to the extent required. The idea in (i) exploits advances in algorithmic nonlinear programming.
The idea in (ii), with the objective of ensuring good finite-time performance and optimal asymptotic rates, minimizes Monte Carlo sampling by attempting to balance the estimated proximity of an incumbent solution with the sampling error stemming from Monte Carlo. This dissertation studies the theoretical and practical underpinnings of SCSR, leading to implementable algorithms to solve SO. We first analyze SCSR in a general context, identifying various sufficient conditions that ensure convergence of SCSR's iterates to a solution. We then analyze the nature of such convergence. For instance, we demonstrate that in SCSRs which guarantee optimal convergence rates, the speed of the underlying (deterministic) recursion and the extent of Monte Carlo sampling are intimately linked, with faster recursions permitting a wider range of Monte Carlo effort. With the objective of translating such asymptotic results into usable algorithms, we formulate a family of SCSRs called Adaptive SCSR (A-SCSR) that adaptively determines how much to sample as a recursion evolves through the search space. A-SCSRs are dynamic algorithms that identify sample sizes to balance the estimated squared bias and variance of an incumbent solution. This makes the sample size (at every iteration of A-SCSR) a stopping time, thereby substantially complicating the analysis of the behavior of A-SCSR's iterates. That A-SCSR works well in practice is not surprising: the use of an appropriate recursion and the careful sample-size choice ensure this. Remarkably, however, we show that A-SCSRs are convergent to a solution and exhibit asymptotically optimal convergence rates under conditions that are no less general than what has been established for stochastic approximation algorithms. We end with the application of a certain A-SCSR to a parameter estimation problem arising in the context of brain-computer interfaces (BCI). Specifically, we formulate and reduce the problem of probabilistically deciphering the electroencephalograph (EEG) signals recorded from the brain of a paralyzed patient attempting to perform one of a specified set of tasks. Monte Carlo simulation in this context takes a more general view, as the act of drawing an observation from a large dataset accumulated from the recorded EEG signals. We apply A-SCSR to nine such datasets, showing that in most cases A-SCSR achieves correct prediction rates that are between 5 and 15 percent better than competing algorithms. More importantly, due to the incorporated adaptive sampling strategies, A-SCSR tends to exhibit dramatically better efficiency rates for comparable prediction accuracies.
Ph. D.
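The core idea of sampling-controlled recursions can be pictured with a toy sketch (an illustration under simplifying assumptions, not the A-SCSR algorithm): a deterministic-style recursion, here plain gradient descent, is driven by Monte Carlo estimates whose sample size grows as the iterates are refined.

    import random

    # Illustrative sketch: a gradient recursion with a geometrically growing Monte Carlo sample size.
    def noisy_gradient(x):
        # toy oracle: gradient of f(x) = x**2 observed with additive noise
        return 2.0 * x + random.gauss(0.0, 1.0)

    def sampling_controlled_descent(x0=5.0, iters=30, m0=4, growth=1.5, step=0.1):
        x, m = x0, float(m0)
        for _ in range(iters):
            n = int(m)
            g = sum(noisy_gradient(x) for _ in range(n)) / n   # averaged gradient estimate
            x -= step * g                                      # deterministic-style update
            m *= growth                                        # sample more near the solution
        return x

    x_star = sampling_controlled_descent()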
23

Stocks, Nigel Geoffrey. "Experiments in stochastic nonlinear dynamics." Thesis, Lancaster University, 1990. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.315224.

24

Albertyn, Martin. "Generic simulation modelling of stochastic continuous systems." Thesis, Pretoria : [s.n.], 2004. http://upetd.up.ac.za/thesis/available/etd-05242005-112442.

25

Xu, Zhouyi. "Stochastic Modeling and Simulation of Gene Networks." Scholarly Repository, 2010. http://scholarlyrepository.miami.edu/oa_dissertations/645.

Abstract:
Recent research in experimental and computational biology has revealed the necessity of using stochastic modeling and simulation to investigate the functionality and dynamics of gene networks. However, sophisticated stochastic modeling techniques and efficient stochastic simulation algorithms (SSAs) for analyzing and simulating gene networks are still lacking. Therefore, the objective of this research is to design highly efficient and accurate SSAs, to develop stochastic models for certain real gene networks, and to apply stochastic simulation to investigate such gene networks. To achieve this objective, we developed several novel efficient and accurate SSAs. We also proposed two stochastic models for the circadian system of Drosophila and simulated the dynamics of the system. The K-leap method constrains the total number of reactions in one leap to a properly chosen number, thereby improving simulation accuracy. Since the exact SSA is a special case of the K-leap method when K=1, the K-leap method can naturally change from the exact SSA to an approximate leap method during simulation if necessary. The hybrid tau/K-leap and the modified K-leap methods are particularly suitable for simulating gene networks where certain reactant molecular species have a small number of molecules. Although the existing tau-leap methods can significantly speed up stochastic simulation of certain gene networks, the mean of the number of firings of each reaction channel is not equal to the true mean. Therefore, all existing tau-leap methods produce biased results, which limits simulation accuracy and speed. Our unbiased tau-leap methods remove the bias in simulation results that exists in all current leap SSAs and therefore significantly improve simulation accuracy without sacrificing speed. In order to efficiently estimate the probability of rare events in gene networks, we applied the importance sampling technique to the next reaction method (NRM) of the SSA and developed a weighted NRM (wNRM). We further developed a systematic method for selecting the values of importance sampling parameters. Applying our parameter selection method to the wSSA and the wNRM, we obtain an improved wSSA (iwSSA) and an improved wNRM (iwNRM), which provide substantial improvement over the wSSA in terms of simulation efficiency and accuracy. We also develop a detailed and a reduced stochastic model for circadian rhythm in Drosophila and employ our SSA to simulate circadian oscillations. Our simulations showed that both models could produce sustained oscillations and that the oscillation is robust to noise in the sense that there is very little variability in the oscillation period although there are significant random fluctuations in oscillation peaks. Moreover, although average time delays are essential to the simulation of oscillation, random changes in time delays within a certain range around the fixed average time delay cause little variability in the oscillation period. Our simulation results also showed that both models are robust to parameter variations and that oscillation can be entrained by light/dark cycles.
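For orientation, a plain (unmodified) tau-leap step looks like the sketch below; the K-leap and unbiased variants developed in this dissertation refine this basic scheme, so the code is only a generic illustration. Each reaction channel fires a Poisson number of times during the leap, and the state is updated through the stoichiometry matrix.

    import numpy as np

    # Illustrative sketch: one basic tau-leap step for a chemical reaction network.
    def tau_leap_step(x, propensities, stoich, tau, rng=np.random.default_rng()):
        # x: state vector; propensities(x): array of reaction rates;
        # stoich: (n_species, n_reactions) stoichiometry matrix
        a = propensities(x)
        k = rng.poisson(a * tau)          # number of firings of each channel during tau
        return x + stoich @ k

    # Example: a birth-death process (birth at rate 10, death at rate 0.1 * X).
    stoich = np.array([[1, -1]])
    x = np.array([50])
    x = tau_leap_step(x, lambda s: np.array([10.0, 0.1 * s[0]]), stoich, tau=0.1)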
26

Park, Chuljin. "Discrete optimization via simulation with stochastic constraints." Diss., Georgia Institute of Technology, 2013. http://hdl.handle.net/1853/49088.

Abstract:
In this thesis, we first develop a new method called penalty function with memory (PFM). PFM consists of a penalty parameter and a measure of constraint violation and it converts a discrete optimization via simulation (DOvS) problem with stochastic constraints into a series of DOvS problems without stochastic constraints. PFM determines a penalty of a visited solution based on past results of feasibility checks on the solution. Specifically, assuming a minimization problem, a penalty parameter of PFM, namely the penalty sequence, diverges to infinity for an infeasible solution but converges to zero almost surely for any strictly feasible solution under certain conditions. For a feasible solution located on the boundary of feasible and infeasible regions, the sequence converges to zero either with high probability or almost surely. As a result, a DOvS algorithm combined with PFM performs well even when optimal solutions are tight or nearly tight. Second, we design an optimal water quality monitoring network for river systems. The problem is to find the optimal location of a finite number of monitoring devices, minimizing the expected detection time of a contaminant spill event while guaranteeing good detection reliability. When uncertainties in spill and rain events are considered, both the expected detection time and detection reliability need to be estimated by stochastic simulation. This problem is formulated as a stochastic DOvS problem with the objective of minimizing expected detection time and with a stochastic constraint on the detection reliability; and it is solved by a DOvS algorithm combined with PFM. Finally, we improve PFM by combining it with an approximate budget allocation procedure. We revise an existing optimal budget allocation procedure so that it can handle active constraints and satisfy necessary conditions for the convergence of PFM.
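The flavour of a memory-based penalty can be conveyed with a toy sketch (a simplified illustration, not the exact PFM penalty sequence of the thesis): each solution keeps a history of its feasibility checks, and the penalty added to its estimated objective grows for solutions that keep failing the check and stays at zero for solutions that always pass.

    # Illustrative sketch: penalising a solution based on its past feasibility-check results.
    feasibility_history = {}      # solution id -> list of True/False feasibility-check outcomes

    def penalized_objective(solution, estimated_objective, is_feasible_now):
        history = feasibility_history.setdefault(solution, [])
        history.append(is_feasible_now)
        failed_fraction = 1.0 - sum(history) / len(history)
        # penalty = (fraction of failed checks) * (number of checks): it grows without bound
        # for a solution that keeps failing and remains zero for one that always passes
        penalty = failed_fraction * len(history)
        return estimated_objective + penalty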
27

Chaleeraktrakoon, Chavalit. "Stochastic modelling and simulation of streamflow processes." Thesis, McGill University, 1995. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=28704.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
The main objectives of this research are to propose a general stochastic method for determining analytically the distribution of flood volume, and to develop a simulation procedure for generating synthetic multiseason streamflows at different sites simultaneously. The research study is divided into two parts: (a) First, a general stochastic model is proposed to derive analytically the probability distribution function of flood volume. The volume of a flood is defined as the sum of an unbroken sequence of consecutive daily flows above a given truncation level. Analytical expressions were then derived for the exact distribution of flood volume for various cases in which successive flow exceedances can be assumed to be either independent or correlated, and the cumulative flow exceedance can be considered to be either independent of or dependent on the corresponding flood duration. The proposed stochastic method is more general and more flexible than the empirical fitting approach because it can explicitly take into account different stochastic properties inherent in the underlying streamflow process. Results of a numerical application have shown that the proposed model can provide a very good fit between observed and theoretical results. Further, in the estimation of the flood volume distribution, it has been found that the effect of the serial correlation of the flow series is not significant compared to the impact of the dependence between flood volume and the corresponding duration. Finally, it has been demonstrated that the simplistic assumption of a triangular flood hydrograph shape, as commonly adopted in previous studies, is not necessary in the estimation of the flood volume distribution. (b) Second, a multivariate stochastic simulation approach is proposed for generating synthetic seasonal streamflow series at a single location or at many different locations concurrently. The suggested simulation scheme is based on the combination of the singular value decomposition
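Under the abstract's definition, a flood volume is accumulated over an unbroken run of daily flows above a truncation level. A minimal helper for extracting such event volumes and durations from a daily series is sketched below; whether the "volume" sums the raw flows or the exceedances above the level is a modelling choice, and the exceedance variant is assumed here. This is an empirical extraction sketch, not the analytical derivation developed in the thesis.

```python
import numpy as np

def flood_events(daily_flow, level):
    """Split a daily flow series into flood events (unbroken runs of days with
    flow above the truncation level) and return (volume, duration) per event.
    'Volume' is taken here as the cumulative exceedance above the level;
    summing the raw flows instead is an equally simple variant."""
    daily_flow = np.asarray(daily_flow, dtype=float)
    above = daily_flow > level
    events = []
    i, n = 0, len(daily_flow)
    while i < n:
        if above[i]:
            j = i
            while j < n and above[j]:
                j += 1
            exceedance = daily_flow[i:j] - level
            events.append((exceedance.sum(), j - i))   # (volume, duration)
            i = j
        else:
            i += 1
    return events

# Example: one two-day event (volume 6.0) and one one-day event (volume 1.0)
print(flood_events([1, 2, 8, 6, 3, 2, 5, 1], level=4))
```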
28

Minoukadeh, Kimiya. "Deterministic and stochastic methods for molecular simulation." Phd thesis, Université Paris-Est, 2010. http://tel.archives-ouvertes.fr/tel-00597694.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Molecular simulation is an essential tool in understanding complex chemical and biochemical processes, as real-life experiments prove increasingly costly or infeasible in practice. This thesis is devoted to methodological aspects of molecular simulation, with a particular focus on computing transition paths and their associated free energy profiles. The first part is dedicated to computational methods for reaction path and transition state searches on a potential energy surface. In Chapter 3 we propose an improvement to a widely-used transition state search method, the Activation Relaxation Technique (ART). We also present a local convergence study of a prototypical algorithm. The second part is dedicated to free energy computations. We focus in particular on an adaptive importance sampling technique, the Adaptive Biasing Force (ABF) method. The first contribution to this field, presented in Chapter 5, consists in showing the applicability of a new parallel implementation, named multiple-walker ABF (MW-ABF), to a large molecular system. Numerical experiments demonstrated the robustness of MW-ABF against artefacts arising from poorly chosen or oversimplified reaction coordinates. These numerical findings inspired a new study of the long-time convergence of the ABF method, as presented in Chapter 6. By studying a slightly modified model, we back our numerical results by showing a faster theoretical rate of convergence of ABF than was previously shown.
29

Wang, Eric Yiqing. "Comparison Between Deterministic and Stochastic Biological Simulation." Thesis, Uppsala universitet, Analys och sannolikhetsteori, 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-230732.

Full text
APA, Harvard, Vancouver, ISO, and other styles
30

McCloughan, Patrick. "A stochastic simulation model of industrial concentration." Thesis, University of East Anglia, 1993. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.333558.

Full text
APA, Harvard, Vancouver, ISO, and other styles
31

Hardy, Mary Rosalyn. "Stochastic simulation in life office solvency assessment." Thesis, Heriot-Watt University, 1994. http://hdl.handle.net/10399/1398.

Full text
APA, Harvard, Vancouver, ISO, and other styles
32

Liu, Kuo-Ching. "Stochastic simulation-based finite capacity scheduling systems /." The Ohio State University, 1997. http://rave.ohiolink.edu/etdc/view?acc_num=osu1487946776022111.

Full text
APA, Harvard, Vancouver, ISO, and other styles
33

Geurts, Kevin Richard. "Stochastic simulation of non-Newtonian flow fields /." Thesis, Connect to this title online; UW restricted, 1995. http://hdl.handle.net/1773/9821.

Full text
APA, Harvard, Vancouver, ISO, and other styles
34

Szekely, Tamas. "Stochastic modelling and simulation in cell biology." Thesis, University of Oxford, 2014. http://ora.ox.ac.uk/objects/uuid:f9b8dbe6-d96d-414c-ac06-909cff639f8c.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Modelling and simulation are essential to modern research in cell biology. This thesis follows a journey starting from the construction of new stochastic methods for discrete biochemical systems to using them to simulate a population of interacting haematopoietic stem cell lineages. The first part of this thesis is on discrete stochastic methods. We develop two new methods, the stochastic extrapolation framework and the Stochastic Bulirsch-Stoer methods. These are based on the Richardson extrapolation technique, which is widely used in ordinary differential equation solvers. We believed that it would also be useful in the stochastic regime, and this turned out to be true. The stochastic extrapolation framework is a scheme that admits any stochastic method with a fixed stepsize and known global error expansion. It can improve the weak order of the moments of these methods by cancelling the leading terms in the global error. Using numerical simulations, we demonstrate that this is the case up to second order, and postulate that this also follows for higher order. Our simulations show that extrapolation can greatly improve the accuracy of a numerical method. The Stochastic Bulirsch-Stoer method is another highly accurate stochastic solver. Furthermore, using numerical simulations we find that it is able to better retain its high accuracy for larger timesteps than competing methods, meaning it remains accurate even when simulation time is sped up. This is a useful property for simulating the complex systems that researchers are often interested in today. The second part of the thesis is concerned with modelling a haematopoietic stem cell system, which consists of many interacting niche lineages. We use a vectorised tau-leap method to examine the differences between a deterministic and a stochastic model of the system, and investigate how coupling niche lineages affects the dynamics of the system at the homeostatic state as well as after a perturbation. We find that larger coupling allows the system to find the optimal steady state blood cell levels. In addition, when the perturbation is applied randomly to the entire system, larger coupling also results in smaller post-perturbation cell fluctuations compared to non-coupled cells. In brief, this thesis contains four main sets of contributions: two new high-accuracy discrete stochastic methods that have been numerically tested; an improvement that can be used with any leaping method, which introduces vectorisation as well as a common stepsize-adaptation scheme; and an investigation of the effects of coupling lineages in a heterogeneous population of haematopoietic stem cell niche lineages.
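The stochastic extrapolation framework described above rests on Richardson extrapolation of a fixed-stepsize method with a known global error expansion. A minimal sketch of that generic idea (not the thesis's specific schemes) is shown below for Euler-Maruyama, whose weak order-one bias in the mean of geometric Brownian motion is cancelled by combining estimates at step sizes h and h/2. The model and parameter values are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def euler_maruyama_mean(x0, mu, sigma, T, h, n_paths, rng):
    """Weak (mean) estimate of X_T for dX = mu*X dt + sigma*X dW using
    Euler-Maruyama with fixed step h."""
    n_steps = int(round(T / h))
    x = np.full(n_paths, x0, dtype=float)
    for _ in range(n_steps):
        dW = rng.normal(0.0, np.sqrt(h), size=n_paths)
        x += mu * x * h + sigma * x * dW
    return x.mean()

x0, mu, sigma, T = 1.0, 0.5, 0.3, 1.0
m_h  = euler_maruyama_mean(x0, mu, sigma, T, h=0.1,  n_paths=200_000, rng=rng)
m_h2 = euler_maruyama_mean(x0, mu, sigma, T, h=0.05, n_paths=200_000, rng=rng)

# Richardson extrapolation for a weak order-1 method: the leading O(h) term
# of the global error cancels in 2*m(h/2) - m(h).
extrapolated = 2.0 * m_h2 - m_h
print(extrapolated, "vs exact mean", x0 * np.exp(mu * T))
```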
35

Pahle, Jürgen. "Stochastic simulation and analysis of biochemical networks." Doctoral thesis, Humboldt-Universität zu Berlin, Mathematisch-Naturwissenschaftliche Fakultät I, 2008. http://dx.doi.org/10.18452/15786.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Stochastic effects in biochemical networks can affect the functioning of these systems significantly. Signaling pathways, such as calcium signal transduction, are particularly prone to random fluctuations. Thus, an important question is how this influences the information transfer in these pathways. First, a comprehensive overview and systematic classification of stochastic simulation methods is given as the methodical basis for the thesis. Here, the focus is on approximate and hybrid approaches. The hybrid solver in the software system Copasi, whose implementation was part of this PhD work, is also described. In most cases, the dynamic behavior of biochemical systems shows a transition from stochastic to deterministic behavior with increasing particle numbers. This transition is studied in calcium signaling as well as in other test systems. It turns out that the onset of stochastic effects depends strongly on the sensitivity of the specific system, quantified by its divergence. Systems with high divergence show stochastic effects even at high particle numbers, and vice versa. Finally, the influence of noise on the performance of signaling pathways is investigated. Simulated and experimentally measured calcium time series are stochastically coupled to an intracellular target enzyme activation process. The information transfer under different cellular conditions is then estimated with the information-theoretic quantity transfer entropy. The amount of information that can be transferred increases with rising particle numbers; however, this increase depends strongly on the current dynamical mode of the system, such as spiking, bursting or irregular oscillations. The methods developed in this thesis, such as the use of the divergence as an indicator for the transition from stochastic to deterministic behavior, or the stochastic coupling and information-theoretic analysis using transfer entropy, are valuable tools for the analysis of biochemical systems.
36

Fang, Fang. "A simulation study for Bayesian hierarchical model selection methods." View electronic thesis (PDF), 2009. http://dl.uncw.edu/etd/2009-2/fangf/fangfang.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
37

Liss, Anders. "Optimizing stochastic simulation of a neuron with parallelization." Thesis, Uppsala universitet, Avdelningen för beräkningsvetenskap, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-324444.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
In order to optimize the solution of stochastic simulations of neuron channels, an attempt was made to parallelize the solver. The implementation was unsuccessful. However, parallelization is not impossible and remains a field of research with great potential for improving the performance of stochastic simulations.
38

Olsén, Jörgen. "Stochastic modeling and simulation of the TCP protocol /." Uppsala : Matematiska institutionen, Univ. [distributör], 2003. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-3534.

Full text
APA, Harvard, Vancouver, ISO, and other styles
39

Pérez, Godofredo. "Stochastic conditional simulation for description of reservoir properties /." Access abstract and link to full text, 1991. http://0-wwwlib.umi.com.library.utulsa.edu/dissertations/fullcit/9203796.

Full text
APA, Harvard, Vancouver, ISO, and other styles
40

Siu, Daniel. "Stochastic Hybrid Dynamic Systems: Modeling, Estimation and Simulation." Scholar Commons, 2012. http://scholarcommons.usf.edu/etd/4405.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Stochastic hybrid dynamic systems that incorporate both continuous and discrete dynamics have been an area of great interest over the recent years. In view of applications, stochastic hybrid dynamic systems have been employed in diverse fields of study, such as communication networks, air traffic management, and insurance risk models. The aim of the present study is to investigate properties of some classes of stochastic hybrid dynamic systems. The class of stochastic hybrid dynamic systems investigated has random jumps driven by a non-homogeneous Poisson process and deterministic jumps triggered by hitting the boundary. Its real-valued continuous dynamic between jumps is described by stochastic differential equations of the Itô-Doob type. Existing results of piecewise deterministic models are extended to obtain the infinitesimal generator of the stochastic hybrid dynamic systems through a martingale approach. Based on results of the infinitesimal generator, some stochastic stability results are derived. The infinitesimal generator and stochastic stability results can be used to compute the higher moments of the solution process and find a bound of the solution. Next, the study focuses on a class of multidimensional stochastic hybrid dynamic systems. The continuous dynamic of the systems under investigation is described by a linear non-homogeneous system of Itô-Doob type stochastic differential equations with switching coefficients. The switching takes place at random jump times which are governed by a non-homogeneous Poisson process. Closed-form solutions of the stochastic hybrid dynamic systems are obtained. Two important special cases of the above systems are the geometric Brownian motion process with jumps and the Ornstein-Uhlenbeck process with jumps. Based on the closed-form solutions, the probability distributions of the solution processes for these two special cases are derived. The derivation employs the use of the modal matrix and transformations. In addition, the parameter estimation problem for the one-dimensional cases of the geometric Brownian motion and Ornstein-Uhlenbeck processes with jumps is investigated. Through some existing and modified methods, the estimation procedure is presented by first estimating the parameters of the discrete dynamic and subsequently examining the continuous dynamic piecewise. Finally, some simulated stochastic hybrid dynamic processes are presented to illustrate the aforementioned parameter-estimation methods. One simulated insurance example is given to demonstrate the use of the estimation and simulation techniques to obtain some desired quantities.
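As a hedged illustration of the kind of process treated here, the sketch below simulates geometric Brownian motion that receives multiplicative jumps at the event times of a homogeneous Poisson process, using the exact GBM solution between jumps. The thesis itself works with non-homogeneous jump times and switching coefficients; the simplifications and all parameter values below are assumptions for the example.

```python
import numpy as np

def gbm_with_jumps(x0, mu, sigma, jump_rate, jump_scale, T, rng):
    """Simulate one path of geometric Brownian motion that receives a
    multiplicative lognormal jump at the event times of a homogeneous
    Poisson process. Between jumps the exact GBM solution is used."""
    t, x = 0.0, x0
    times, values = [t], [x]
    while True:
        dt = rng.exponential(1.0 / jump_rate)      # time to next jump
        step = min(dt, T - t)
        # exact GBM evolution over the inter-jump interval
        x *= np.exp((mu - 0.5 * sigma**2) * step
                    + sigma * np.sqrt(step) * rng.normal())
        t += step
        if t >= T:
            times.append(t); values.append(x)
            break
        x *= rng.lognormal(mean=0.0, sigma=jump_scale)   # multiplicative jump
        times.append(t); values.append(x)
    return np.array(times), np.array(values)

rng = np.random.default_rng(1)
t, x = gbm_with_jumps(1.0, mu=0.05, sigma=0.2, jump_rate=2.0,
                      jump_scale=0.1, T=1.0, rng=rng)
```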
41

Du, Manuel [Verfasser]. "Stochastic simulation studies for honeybee breeding / Manuel Du." Berlin : Humboldt-Universität zu Berlin, 2021. http://d-nb.info/1226153194/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
42

Brand, Samuel P. C. "Spatial and stochastic epidemics : theory, simulation and control." Thesis, University of Warwick, 2012. http://wrap.warwick.ac.uk/56738/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
It is now widely acknowledged that spatial structure, and hence the spatial position of host populations, plays a vital role in the spread of infection. In this work I investigate an ensemble of techniques for understanding the stochastic dynamics of spatial and discrete epidemic processes, with special consideration given to SIR disease dynamics for the Levins-type metapopulation. I present a toolbox of techniques for the modeller of spatial epidemics. The highlight results are a novel form of moment closure derived directly from a stochastic differential representation of the epidemic, a stochastic simulation algorithm that, asymptotically in system size, greatly outperforms existing simulation methods for the spatial epidemic, and finally a method for tackling optimal vaccination scheduling problems for controlling the spread of an invasive pathogen.
43

L'Ecuyer, Pierre, and Josef Leydold. "rstream: Streams of Random Numbers for Stochastic Simulation." Department of Statistics and Mathematics, Abt. f. Angewandte Statistik u. Datenverarbeitung, WU Vienna University of Economics and Business, 2005. http://epub.wu.ac.at/1544/1/document.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
The package rstream provides a unified interface to streams of random numbers for the R statistical computing language. Features include: independent streams of random numbers; substreams; easy handling of streams (initialize, reset); and antithetic random variates. The paper describes this package and demonstrates with a simple example the usefulness of this approach.
Series: Preprint Series / Department of Applied Statistics and Data Processing
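Since rstream is an R package, no R code is reproduced here; the Python sketch below only mirrors the same ideas of independent, resettable streams and antithetic draws using NumPy's seed-spawning mechanism, which is an analogue of the concept rather than the rstream API.

```python
import numpy as np

# Independent, reproducible streams for parallel or common-random-number
# experiments, analogous in spirit to rstream's streams for R
# (this is NumPy's mechanism, not the rstream API itself).
root = np.random.SeedSequence(20240101)
child_seeds = root.spawn(4)                          # four independent streams
streams = [np.random.default_rng(s) for s in child_seeds]

u = streams[0].random(5)                             # draw from stream 0
streams[0] = np.random.default_rng(child_seeds[0])   # "reset": rebuild from its seed
u_again = streams[0].random(5)                       # identical draws after reset
antithetic = 1.0 - u                                 # antithetic variates of U(0,1) draws
assert np.allclose(u, u_again)
```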
44

Chen, Minghan. "Stochastic Modeling and Simulation of Multiscale Biochemical Systems." Diss., Virginia Tech, 2019. http://hdl.handle.net/10919/90898.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Numerous challenges arise in modeling and simulation as biochemical networks are discovered with increasing complexities and unknown mechanisms. With the improvement in experimental techniques, biologists are able to quantify genes and proteins and their dynamics in a single cell, which calls for quantitative stochastic models for gene and protein networks at cellular levels that match well with the data and account for cellular noise. This dissertation studies a stochastic spatiotemporal model of the Caulobacter crescentus cell cycle. A two-dimensional model based on a Turing mechanism is investigated to illustrate the bipolar localization of the protein PopZ. However, stochastic simulations are often impeded by expensive computational cost for large and complex biochemical networks. The hybrid stochastic simulation algorithm is a combination of differential equations for traditional deterministic models and Gillespie's algorithm (SSA) for stochastic models. The hybrid method can significantly improve the efficiency of stochastic simulations for biochemical networks with multiscale features, which contain both species populations and reaction rates with widely varying magnitude. The populations of some reactant species might be driven negative if they are involved in both deterministic and stochastic systems. This dissertation investigates the negativity problem of the hybrid method, proposes several remedies, and tests them with several models including a realistic biological system. As a key factor that affects the quality of biological models, parameter estimation in stochastic models is challenging because the amount of empirical data must be large enough to obtain statistically valid parameter estimates. To optimize system parameters, a quasi-Newton algorithm for stochastic optimization (QNSTOP) was studied and applied to a stochastic budding yeast cell cycle model by matching multivariate probability distributions between simulated results and empirical data. Furthermore, to reduce model complexity, this dissertation simplifies the fundamental cooperative binding mechanism by a stochastic Hill equation model with optimized system parameters. Considering that many parameter vectors generate similar system dynamics and results, this dissertation proposes a general α-β-γ rule to return an acceptable parameter region of the stochastic Hill equation based on QNSTOP. Different objective functions are explored targeting different features of the empirical data.
Doctor of Philosophy
Modeling and simulation of biochemical networks faces numerous challenges as biochemical networks are discovered with increased complexity and unknown mechanisms. With improvement in experimental techniques, biologists are able to quantify genes and proteins and their dynamics in a single cell, which calls for quantitative stochastic models, or numerical models based on probability distributions, for gene and protein networks at cellular levels that match well with the data and account for randomness. This dissertation studies a stochastic model in space and time of a bacterium’s life cycle— Caulobacter. A two-dimensional model based on a natural pattern mechanism is investigated to illustrate the changes in space and time of a key protein population. However, stochastic simulations are often complicated by the expensive computational cost for large and sophisticated biochemical networks. The hybrid stochastic simulation algorithm is a combination of traditional deterministic models, or analytical models with a single output for a given input, and stochastic models. The hybrid method can significantly improve the efficiency of stochastic simulations for biochemical networks that contain both species populations and reaction rates with widely varying magnitude. The populations of some species may become negative in the simulation under some circumstances. This dissertation investigates negative population estimates from the hybrid method, proposes several remedies, and tests them with several cases including a realistic biological system. As a key factor that affects the quality of biological models, parameter estimation in stochastic models is challenging because the amount of observed data must be large enough to obtain valid results. To optimize system parameters, the quasi-Newton algorithm for stochastic optimization (QNSTOP) was studied and applied to a stochastic (budding) yeast life cycle model by matching different distributions between simulated results and observed data. Furthermore, to reduce model complexity, this dissertation simplifies the fundamental molecular binding mechanism by the stochastic Hill equation model with optimized system parameters. Considering that many parameter vectors generate similar system dynamics and results, this dissertation proposes a general α-β-γ rule to return an acceptable parameter region of the stochastic Hill equation based on QNSTOP. Different optimization strategies are explored targeting different features of the observed data.
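The stochastic Hill equation mentioned in the abstract above is, in its deterministic core, the standard Hill function used as a reduced description of cooperative binding. A minimal sketch of using it as a reaction propensity is given below; the parameter values are illustrative assumptions, not the optimised parameters reported in the dissertation.

```python
import numpy as np

def hill(x, K, n):
    """Hill function x^n / (K^n + x^n): the standard reduced description of
    cooperative binding (activating form; 1 - hill(x, K, n) gives repression)."""
    xn = np.power(x, n)
    return xn / (np.power(K, n) + xn)

# Used as a reaction propensity in a stochastic model, e.g. synthesis of a
# product driven cooperatively by a transcription factor TF (values are
# illustrative, not the optimised parameters from the dissertation):
v_max, K, n = 5.0, 40.0, 2.0
propensity = lambda tf_copies: v_max * hill(tf_copies, K, n)
print(propensity(40.0))   # half-maximal rate at TF = K
```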
45

Mesogitis, Tassos. "Stochastic simulation of the cure of advanced composites." Thesis, Cranfield University, 2015. http://dspace.lib.cranfield.ac.uk/handle/1826/9216.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
This study focuses on the development of a stochastic simulation methodology to study the effects of cure kinetics uncertainty, in-plane fibre misalignment and boundary condition uncertainty on the cure process of composite materials. Differential Scanning Calorimetry was used to characterise cure kinetics variability of a commercial epoxy resin used in aerospace applications. It was found that cure kinetics uncertainty is associated with variations in the initial degree of cure, activation energy and reaction order. Image analysis was employed to characterise in-plane fibre misalignment in a carbon fibre ±45° non-crimp fabric. The experimental results showed that variability in tow orientation was significant, with a standard deviation of about 1.2°. A set of experiments using an infusion set-up was carried out to quantify boundary condition uncertainty related to tool temperature, ambient temperature and surface heat transfer coefficient using thermocouples (tool/ambient temperature) and heat flux sensors (surface heat transfer coefficient). It was concluded that boundary condition uncertainty can show considerable short-term and long-term variability. Conventional Monte Carlo and the Probabilistic Collocation Method were integrated with a thermo-mechanical cure simulation model in order to investigate the effect of cure kinetics, fibre misalignment and boundary condition variability on process outcome. The cure model was developed and implemented using a finite element model incorporating appropriate material sub-models of cure kinetics, specific heat capacity, thermal conductivity, moduli, thermal expansion and cure shrinkage. The effect of cure kinetics uncertainty on the temperature overshoot of a thick carbon fibre epoxy flat panel was investigated using the two stochastic simulation schemes. The stochastic simulation results showed that variability in cure kinetics can introduce a significant scatter in temperature overshoot, presenting a coefficient of variation of about 30%. Furthermore, it was shown that the collocation method can offer an efficient solution with significantly lower computational cost compared to Monte Carlo at comparable accuracy. Stochastic simulation of the cure of an angle-shaped carbon fibre-epoxy component within the Monte Carlo scheme showed that fibre misalignment can cause considerable variability in the process outcome. The coefficient of variation of maximum residual stress can reach up to approximately 2% (standard deviation of 1 MPa), whilst qualitative and quantitative variations in final distortion of the cured part occur with the standard deviation in twist and corner angle reaching values of 0.4° and 0.05°, respectively. Simulation of the cure of a thin carbon fibre-epoxy panel within the Monte Carlo scheme indicated that surface heat transfer and tool temperature variability dominate variability in cure time, resulting in a coefficient of variation of about 22%. In addition to Monte Carlo, the effect of surface heat transfer coefficient and tool temperature variations on cure time was addressed using the collocation method. It was found that probabilistic collocation is capable of capturing variability propagation with good accuracy while offering tremendous benefits in terms of computational costs.
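The Monte Carlo side of such a study amounts to sampling the uncertain inputs, running the cure model for each sample, and summarising the scatter of the outcome, typically by a coefficient of variation. The sketch below illustrates only that generic propagation loop; the cure_model function is a hypothetical placeholder standing in for the finite-element thermo-mechanical simulation, and the input distributions are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

def cure_model(activation_energy, tool_temperature):
    """Hypothetical placeholder for the expensive thermo-mechanical cure
    simulation: returns a scalar process outcome for given inputs."""
    return 30.0 + 0.2 * (activation_energy - 70.0) + 0.8 * (tool_temperature - 180.0)

# Monte Carlo propagation: sample the uncertain inputs, run the model,
# and summarise the scatter of the outcome by its coefficient of variation.
n_samples = 10_000
E_a    = rng.normal(70.0, 2.0, n_samples)    # illustrative input distributions
T_tool = rng.normal(180.0, 1.5, n_samples)
outcome = cure_model(E_a, T_tool)
cov = outcome.std(ddof=1) / outcome.mean()
print(f"coefficient of variation: {cov:.2%}")
```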
46

Gu, Xiaoqing. "The behavior of simulated annealing in stochastic optimization." [Ames, Iowa : Iowa State University], 2008.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
47

Chapman, Jacob. "Multi-agent stochastic simulation of occupants in buildings." Thesis, University of Nottingham, 2017. http://eprints.nottingham.ac.uk/39868/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
One of the principal causes for deviations between predicted and simulated performance of buildings relates to the stochastic nature of their occupants: their presence, activities whilst present, activity-dependent behaviours and the consequent implications for their perceived comfort. A growing research community is active in the development and validation of stochastic models addressing these issues, and considerable progress has been made, specifically on models in the areas of presence, activities while present, shading devices, window openings and lighting usage. One key outstanding challenge relates to the integration of these prototype models with building simulation in a coherent and generalizable way, meaning that emerging models can be integrated with a range of building simulation software. This thesis describes our proof-of-concept platform that integrates stochastic occupancy models within a multi-agent simulation platform, which communicates directly with building simulation software. The tool is called Nottingham Multi-Agent Stochastic Simulation (No-MASS). No-MASS is tested with a building performance simulation solver to demonstrate the effectiveness of the integrated stochastic models on a residential building and a non-residential building. To account for diversity between occupants, No-MASS makes use of archetypical behaviours within the stochastic models of windows, shades and activities. This provides designers with the means to evaluate the performance of their designs in response to the range of expected behaviours and to evaluate the robustness of their design solutions, which is not possible using current simplistic deterministic representations. A methodology for including rule-based models is built into No-MASS; this allows for testing what-if scenarios with building performance simulation and provides a pragmatic basis for the modelling of the behaviours for which there is insufficient data to develop stochastic models. A Belief-Desire-Intention model is used to develop a set of goals and plans that an agent must follow to influence the environment based on its beliefs about current environmental conditions. Recommendations for the future development of stochastic models are presented based on the sensitivity analysis of the plans. A social interactions framework is developed within No-MASS to resolve conflicts between competing agents. This framework resolves situations where each agent may have different desires, for example, one may wish to have a window open and another closed, based on the outputs of the stochastic models. A votes-casting system determines the agents' choice: the action with the most votes is the one acted on. No-MASS employs agent machine learning techniques that allow agents to learn how to respond to the processes taking place within a building and to choose a strategy without the need for context-specific rules. Employing these complementary techniques to support the comprehensive simulation of occupants' presence and behaviour, integrated within a single platform that can readily interface with a range of building (and urban) energy simulation programs, is the key contribution to knowledge from this thesis. Nevertheless, there is significant scope to extend this work to further reduce the performance gap between simulated and real-world buildings.
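The votes-casting step can be illustrated very simply: each agent casts one vote for its desired action and the majority action is enacted. The sketch below is a generic illustration with an arbitrary tie-breaking rule, not the No-MASS implementation.

```python
from collections import Counter

def resolve_conflict(agent_desires):
    """Votes-casting conflict resolution between occupant agents: each agent
    casts one vote for its desired action (e.g. 'window_open' vs
    'window_closed') and the action with the most votes is enacted.
    Tie-breaking here is arbitrary (first encountered); the rule actually
    used in No-MASS may differ."""
    votes = Counter(agent_desires)
    action, _ = votes.most_common(1)[0]
    return action

# Three agents want the window open, one wants it closed:
print(resolve_conflict(["window_open", "window_closed", "window_open", "window_open"]))
```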
48

Xu, Lina. "Simulation methods for stochastic differential equations in finance." Thesis, Queensland University of Technology, 2019. https://eprints.qut.edu.au/134388/1/Lina_Xu_Thesis.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
This thesis resolves a number of econometric problems relating to the use of stochastic differential equations based on computer-intensive simulation methods. Stochastic differential equations play an important role in modern finance. They have been used to model the trajectories of key variables such as short-term interest rates and the volatility of financial assets. The central theme of the thesis is the use of Hermite polynomials to approximate the transitional probability distribution functions of stochastic differential equations. Based on these approximations, a new method is proposed for simulating solutions to these equations and new testing procedures are developed to examine the fit of the equations to observed data.
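The thesis's Hermite-polynomial approximation of transition densities is too involved to sketch here; as a baseline point of comparison, the snippet below shows the generic Euler-Maruyama discretisation of a CIR-type short-rate equation, which is a standard way of simulating such equations rather than the method proposed in the thesis. The model choice and parameter values are assumptions for illustration.

```python
import numpy as np

def simulate_cir(r0, kappa, theta, sigma, T, n_steps, n_paths, rng):
    """Euler-Maruyama discretisation of the CIR short-rate model
    dr = kappa*(theta - r) dt + sigma*sqrt(r) dW, with the square root
    truncated at zero to keep the diffusion term real."""
    h = T / n_steps
    r = np.full(n_paths, r0, dtype=float)
    for _ in range(n_steps):
        dW = rng.normal(0.0, np.sqrt(h), size=n_paths)
        r += kappa * (theta - r) * h + sigma * np.sqrt(np.maximum(r, 0.0)) * dW
    return r

rng = np.random.default_rng(7)
r_T = simulate_cir(r0=0.03, kappa=0.5, theta=0.04, sigma=0.1,
                   T=1.0, n_steps=250, n_paths=100_000, rng=rng)
print(r_T.mean())   # close to theta + (r0 - theta)*exp(-kappa*T) ≈ 0.0339
```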
49

Sato, Hiroyuki. "Stochastic and simulation models of maritime intercept operations capabilities." Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 2005. http://library.nps.navy.mil/uhtbin/hyperion/05Dec%5FSato.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Thesis (M.S. in Operations Research)--Naval Postgraduate School, December 2005.
Thesis Advisor(s): Patricia A. Jacobs, Donald P. Gaver. Includes bibliographical references (p. 117-119). Also available online.
50

Posadas, Sergio. "Stochastic simulation of a Commander's decision cycle (SSIM CODE)." Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 2001. http://handle.dtic.mil/100.2/ADA392113.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Thesis (M.S. in Operations Research)--Naval Postgraduate School, June 2001.
Thesis advisor(s): Paulo, Eugene P.; Olson, Allen S. "June 2001." Includes bibliographical references (p. 111-115). Also available in print.

To the bibliography