Dissertations / Theses on the topic 'Simulation à base de processus'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the top 50 dissertations / theses for your research on the topic 'Simulation à base de processus.'
Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.
Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.
Troncoso, Alan. "Conditional simulations of reservoir models using Sequential Monte-Carlo methods." Electronic Thesis or Diss., Université Paris sciences et lettres, 2022. http://www.theses.fr/2022UPSLM055.
A sequential Monte Carlo method, called particle filtering, has been used in a spatial context to produce simulations of two reservoir models that respect the observed facies at wells. The first one, the Boolean model, is an object-based model. It can be used to model two-facies reservoirs: one porous facies, and an impermeable facies that acts as a barrier to fluid circulation. The model is mathematically tractable: there exist statistical methods to infer its parameters, as well as an iterative conditional simulation algorithm. However, the convergence rate of this algorithm is difficult to establish. A sequential algorithm based on particle filtering is proposed as an alternative. This sequential algorithm turns out to outperform the iterative algorithm in terms of both quality of results and computational time. The second model, Flumy, is a model of sedimentary processes. It is used for representing the formation of meandering channelized systems. This model can reproduce the heterogeneity induced by the complex geometries of sedimentary deposits. The current algorithm implemented in Flumy dynamically modifies the processes to fit the data as well as possible and produce conditional simulations. Setting up this algorithm requires a deep knowledge of the processes in order to modify them while avoiding artifacts and biases. For this reason, another conditioning algorithm, called sequential, has been developed. It consists in building the reservoir by stacking horizontal layers using particle filtering, thus allowing the observed facies to be assimilated in each layer. These two algorithms have been compared on a synthetic case and on a real case (Loranca Basin, Spain). Both give comparable results, but they differ in terms of the resources required for their implementation: whereas the sequential algorithm needs high computing power, the dynamic algorithm requires a fine understanding of the processes to be modified.
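The conditioning strategy summarized above rests on a standard bootstrap particle filter (sequential importance resampling). The sketch below is a generic, minimal illustration of that building block, not the thesis's reservoir implementation; the random-walk state model and Gaussian likelihood are invented toy assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def particle_filter(n_particles, propagate, likelihood, observations):
    """Generic bootstrap particle filter (sequential importance
    resampling): propagate every particle, weight it by the likelihood
    of the new observation, then resample in proportion to the weights."""
    states = np.zeros(n_particles)               # all particles start at 0
    for y in observations:
        states = propagate(states)
        w = likelihood(states, y)
        w = w / w.sum()                          # normalised importance weights
        idx = rng.choice(n_particles, size=n_particles, p=w)
        states = states[idx]                     # multinomial resampling
    return states

# Invented toy model: random-walk state, Gaussian observation noise.
final = particle_filter(
    1000,
    propagate=lambda s: s + rng.normal(0.0, 0.3, size=s.shape),
    likelihood=lambda s, y: np.exp(-0.5 * ((s - y) / 0.2) ** 2),
    observations=[0.5, 1.0, 1.2],
)
print(final.mean())
```

After the last assimilation step the particle cloud concentrates near the final observation, which is the mechanism used, layer by layer, to assimilate facies observations.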
Beaujouan, David. "Simulation des matériaux magnétiques à base Cobalt par Dynamique Moléculaire Magnétique." Phd thesis, Université Paris Sud - Paris XI, 2012. http://tel.archives-ouvertes.fr/tel-00760645.
Blondet, Gaëtan. "Système à base de connaissances pour le processus de plan d'expériences numériques." Thesis, Compiègne, 2017. http://www.theses.fr/2017COMP2363/document.
In order to improve industrial competitiveness, product design relies more and more on numerical tools, such as numerical simulation, to develop better and cheaper products faster. Numerical Design of Experiments (NDoE) is more and more used to include variabilities during simulation processes, in order to design more robust, reliable and optimized products earlier in the product development process. Nevertheless, an NDoE process may be too expensive to be applied to a complex product, because of the high computational cost of the model and the high number of required experiments. Several methods exist to decrease this computational cost, but they require expert knowledge to be applied efficiently. In addition, an NDoE process produces a large amount of data which must be managed. The aim of this research is to propose a solution to define, as fast as possible, an efficient NDoE process, which produces as much useful information as possible with a minimal number of simulations, for complex products. The objective is to shorten both the process definition and execution steps. A knowledge-based system is proposed, based on a specific ontology and a Bayesian network, to capitalise, share and reuse knowledge and data in order to predict the best NDoE process definition for a new product. This system is validated on a product from the automotive industry.
Hoock, Jean-Baptiste. "Contributions to Simulation-based High-dimensional Sequential Decision Making." Phd thesis, Université Paris Sud - Paris XI, 2013. http://tel.archives-ouvertes.fr/tel-00912338.
Dinaharison, Jean Bienvenue. "Conception d’une approche spatialisée à base d’agent pour coupler les modèles mathématiques et informatiques : application à la modélisation du processus écosystémique du sol." Electronic Thesis or Diss., Sorbonne université, 2023. http://www.theses.fr/2023SORUS184.
Soil is a highly heterogeneous environment in which many processes interact to provide ecosystem services. Model coupling approaches propose to define such a system using a modular architecture in which various processes, represented by models, communicate to reproduce different aspects of a phenomenon such as soil functioning. In this thesis project, we develop such an approach for the purpose of modelling soil functioning. The challenges of such a scheme lie in solving the representation problems of soil processes. These representation problems originate from the fact that models from various disciplines are reused to describe the processes. By representation problems, we mean the model description (which can be individual-based or equation-based), the temporal execution settings and the data resolution. These coupling constraints are addressed by a number of approaches in the literature, all of which propose satisfactory solutions in their respective application fields. In our approach, we use the agent paradigm to encapsulate the various soil processes. Processes then communicate through space by using the resources inside it, and their behaviour depends on the availability of resources. A coordination problem can arise from this type of coupling, as processes may consume a resource simultaneously while the resource may not support this demand. To overcome that matter, we use an action-theoretic technique called Influence-Reaction to define strategies for managing this type of situation. We used algorithms suggested by the abundant literature to manage process temporality issues. This coupling approach was applied to a model of organic matter decomposition in which several processes (earthworms, microbes and roots) compete for soil resources. The results suggest that our approach is suitable for modelling soil functioning, and that it also gives more accurate indications of resource availability.
Li, Haizhou. "Modeling and verification of probabilistic data-aware business processes." Thesis, Clermont-Ferrand 2, 2015. http://www.theses.fr/2015CLF22563/document.
There is a wide range of new applications that stress the need for business process models able to handle imprecise data. This thesis studies the underlying modelling and analysis issues. As a formal model to describe process behaviours, it uses a labelled transition system in which transitions are guarded by conditions defined over a probabilistic database. To tackle verification problems, we decompose this model into a set of traditional automata associated with probabilities, called world-partition automata. Next, this thesis presents an approach for testing the probabilistic simulation preorder in this context. A complexity analysis reveals that the problem is in 2-EXPTIME, and is EXPTIME-hard, w.r.t. expression complexity, while it matches probabilistic query evaluation w.r.t. data complexity. Then P-LTL and P-CTL model checking methods are studied to verify this model; in this context, the complexity of P-LTL and P-CTL model checking is in EXPTIME. Finally, a prototype called "PRODUS", a modelling and verification tool, is introduced, and we model a realistic scenario in the domain of GIS (geographic information systems) using our approach.
Sirin, Göknur. "Supporting multidisciplinary vehicle modeling : towards an ontology-based knowledge sharing in collaborative model based systems engineering environment." Thesis, Châtenay-Malabry, Ecole centrale de Paris, 2015. http://www.theses.fr/2015ECAP0024/document.
Simulation models are widely used by industry as an aid for decision making, to explore and optimize a broad range of complex industrial systems' architectures. The increased complexity of industrial systems (cars, airplanes, etc.), together with ecological and economic concerns, implies a need for exploring and analysing innovative system architectures efficiently and effectively by using simulation models. However, simulation designers currently suffer from limitations which make simulation models difficult to design and develop in a collaborative, multidisciplinary design environment. The multidisciplinary nature of simulation models requires a specific understanding of each phenomenon to simulate and a thorough description of the system architecture, its components and the connections between components. To accomplish these objectives, Model-Based Systems Engineering (MBSE) and Information Systems (IS) methodologies were used to support the simulation designer's analysis capabilities in terms of methods, processes and design tool solutions. The objective of this thesis is twofold. The first part concerns the development of a methodology and tools to build accurate simulation models. The second focuses on the introduction of an innovative approach to design, produce and integrate simulation models in a "plug and play" manner while ensuring the expected model fidelity. Today, however, one of the major challenges in full-vehicle simulation model creation is to obtain domain-level simulation models from different domain experts while detecting any potential inconsistency problem before the IVVQ (Integration, Verification, Validation, and Qualification) phase. In the current simulation model development process, most defects, such as interface mismatches and interoperability problems, are discovered late, during the IVVQ phase.
This may create multiple wastes, including rework and, maybe the most harmful, incorrect simulation models, which are subsequently used as a basis for design decisions. In order to address this problem, this work aims to reduce late inconsistency detection by ensuring early-stage collaboration between the different suppliers and the OEM. Thus, this work first integrates a Detailed Model Design Phase into the current model development process; second, the roles have been re-organized and delegated between design actors. Finally, an alternative architecture design tool is supported by an ontology-based DSL (Domain Specific Language) called the Model Identity Card (MIC). The design tools and the mentioned activity perspectives (e.g. decisions, views and viewpoints) are structured by inspiration from Enterprise Architecture Frameworks. To demonstrate the applicability of our proposed solution, engine after-treatment, hybrid parallel propulsion and electric transmission models are tested across the automotive and aeronautic industries.
Prodel, Martin. "Modélisation automatique et simulation de parcours de soins à partir de bases de données de santé." Thesis, Lyon, 2017. http://www.theses.fr/2017LYSEM009/document.
During the last two decades, the amount of data collected in information systems has drastically increased. This large amount of data is highly valuable. This reality applies to health care, where computerization is still an ongoing process. Existing methods from the fields of process mining, data mining and mathematical modelling cannot handle large and variable event logs. Our goal is to develop an extensive methodology to turn health data from event logs into simulation models of clinical pathways. We first introduce a mathematical framework to discover optimal process models. Our approach shows the benefits of combining combinatorial optimization and process mining techniques. Then, we enrich the discovered model with additional data from the log. An innovative combination of a sequence alignment algorithm and classical data mining techniques is used to analyse path choices within long-term clinical pathways. The approach is suitable for noisy and large logs. Finally, we propose an automatic procedure to convert static models of clinical pathways into dynamic simulation models. The resulting models perform sensitivity analyses to quantify the impact of determinant factors on several key performance indicators related to care processes. They are also used to evaluate what-if scenarios. The presented methodology proved to be highly reusable across various medical fields and with any source of event logs. Using the national French database of all hospital events from 2006 to 2015, an extensive case study on cardiovascular diseases is presented to show the efficiency of the proposed framework.
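As a toy illustration of the kind of pipeline described here, the sketch below mines a directly-follows model from a small event log and replays it as a Markov-chain simulation. The activity names and the log are invented, and real process-discovery and simulation tooling is far richer than this.

```python
import random
from collections import Counter, defaultdict

def mine_dfg(traces):
    """Directly-follows frequencies from an event log (one trace per
    case): a minimal stand-in for process-model discovery."""
    counts = defaultdict(Counter)
    for trace in traces:
        path = ["START"] + list(trace) + ["END"]
        for a, b in zip(path, path[1:]):
            counts[a][b] += 1
    return counts

def simulate(counts, rng):
    """Random walk through the mined model: a crude simulation of a
    pathway as a Markov chain over activities."""
    state, path = "START", []
    while state != "END":
        nxt = list(counts[state].keys())
        weights = [counts[state][x] for x in nxt]
        state = rng.choices(nxt, weights=weights)[0]
        if state != "END":
            path.append(state)
    return path

# Invented three-trace log of hypothetical clinical activities.
log = [("admit", "scan", "surgery"), ("admit", "scan", "discharge"),
       ("admit", "surgery")]
dfg = mine_dfg(log)
print(simulate(dfg, random.Random(6)))
```

Every simulated path starts with "admit" and only follows transitions observed in the log, which is the basic property a discovered model replays.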
Michel, Thierry. "Test en ligne des systèmes à base de microprocesseur." Phd thesis, Grenoble INPG, 1993. http://tel.archives-ouvertes.fr/tel-00343488.
Full textEUGENE, REMI. "Etude architecturale, modelisation et realisation d'un processeur a base de gapp pour le traitement d'images temps reel et la simulation par automate cellulaire." Université Louis Pasteur (Strasbourg) (1971-2008), 1990. http://www.theses.fr/1990STR13043.
Full textMichau, Alexandre. "Dépôts chimiques en phase vapeur de revêtements à base de chrome sur surfaces complexes pour environnements extrêmes : expérimental et simulation." Thesis, Toulouse, INPT, 2016. http://www.theses.fr/2016INPT0088/document.
The resistance of nuclear fuel cladding tubes against high-temperature oxidation during accident conditions is crucial because it means protecting the first containment barrier. This can be achieved by coating the inner wall of the cladding tube using CVD processes, which are the most suitable for this purpose. More specifically, we used DLI-MOCVD to grow chromium-based (Cr(S), metallic crystalline chromium) and chromium-carbide-based (amorphous chromium carbides CrxCy, recycled CrxCy, silicon-doped CrxSizCy) coatings, known for their good oxidation resistance. The coating process was optimized using numerical modelling to improve coating performance. A reaction kinetics model of the deposition process of amorphous CrxCy coatings was adjusted and validated after identification of the chemical mechanism. It was also shown that the liquid solution containing the organometallic precursor (bis(arene)chromium) and the solvent (toluene) could be directly recycled, thereby increasing the industrialization potential of the process. The physical, chemical and structural properties of coatings deposited with this process were characterized. A study of the coatings' mechanical properties was also undertaken. It shows that, compared to related coatings grown with other processes, those deposited by DLI-MOCVD exhibit a particularly high hardness (up to 30 GPa), compressive residual stresses, good adhesion to the substrate and, finally, an abrasive wear resistance that depends on the temperature. The assessment of their oxidation resistance at 1200 °C revealed excellent performance of the amorphous chromium carbide coatings, which can delay catastrophic oxidation by up to two hours with only a 10 µm thickness. All the other coatings only increase the thermal resistance of zircaloy substrates.
Hamonier, Julien. "Analyse par ondelettes du mouvement multifractionnaire stable linéaire." Phd thesis, Université des Sciences et Technologie de Lille - Lille I, 2012. http://tel.archives-ouvertes.fr/tel-00753510.
Full textOlinda, Ricardo Alves de. "Modelagem estatística de extremos espaciais com base em processos max-stable aplicados a dados meteorológicos no estado do Paraná." Universidade de São Paulo, 2012. http://www.teses.usp.br/teses/disponiveis/11/11134/tde-17092012-103936/.
Most mathematical models developed for rare events are based on probabilistic models for extremes. Although the tools for statistical modelling of univariate and multivariate extremes are well developed, the extension of these tools to model spatial extremes data is currently a very active area of research. Modelling maximum values over a spatial domain, applied to meteorological data, is important for the proper management of risks and environmental disasters in countries where the agricultural sector has great influence on the economy. A natural approach for such modelling is the theory of spatial extremes and max-stable processes, characterized as the infinite-dimensional extension of multivariate extreme value theory; we can then incorporate the correlation functions current in geostatistics and thus check extremal dependence through the extremal coefficient and the madogram. This thesis describes the application of such procedures to modelling the spatial dependence of monthly maximum rainfall in Paraná State, based on historical series observed at meteorological stations. The proposed models consider the Euclidean space and a transformation called climatic space, which makes it possible to explain the presence of directional effects resulting from synoptic weather patterns. This methodology is based on the theorem proposed by De Haan (1984) and the models of Smith (1990) and Schlather (2002), checking the isotropic and anisotropic behaviour of these models through Monte Carlo simulation. Estimates are performed using maximum pairwise likelihood and the models are compared using the Takeuchi information criterion. The algorithm used in the fit is very fast and robust, allowing good statistical and computational efficiency in monthly maximum rainfall modelling, including the directional effects resulting from environmental phenomena.
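The extremal dependence check mentioned above can be illustrated with the empirical F-madogram, from which the pairwise extremal coefficient follows as θ = (1 + 2ν)/(1 − 2ν), with θ = 1 for complete dependence and θ = 2 for independence. A minimal sketch on synthetic maxima (not the Paraná data):

```python
import numpy as np

def f_madogram(z1, z2):
    """Empirical F-madogram nu and extremal coefficient theta for two
    stations, from their componentwise (e.g. monthly) maxima."""
    n = len(z1)
    # empirical CDF transforms: ranks scaled into (0, 1)
    f1 = np.argsort(np.argsort(z1)) / n + 1.0 / (2 * n)
    f2 = np.argsort(np.argsort(z2)) / n + 1.0 / (2 * n)
    nu = 0.5 * np.mean(np.abs(f1 - f2))
    theta = (1 + 2 * nu) / (1 - 2 * nu)
    return nu, theta

rng = np.random.default_rng(1)
# independent maxima -> theta close to 2
a, b = rng.gumbel(size=5000), rng.gumbel(size=5000)
print(f_madogram(a, b)[1])
# identical maxima -> theta equal to 1
print(f_madogram(a, a)[1])
```

For independent uniforms E|U − V| = 1/3, so ν = 1/6 and θ = 2 exactly in the limit; the finite-sample estimate lands close to that value.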
Girard-Desprolet, Romain. "Filtrage spectral plasmonique à base de nanostructures métalliques adaptées aux capteurs d'image CMOS." Thesis, Université Grenoble Alpes (ComUE), 2015. http://www.theses.fr/2015GREAT053/document.
Image sensors have experienced renewed interest with the prominent market growth of wireless communication, together with a diversification of functionalities. In particular, a recent application known as Ambient Light Sensing (ALS) has emerged for smarter screen-backlight management of display-based handheld devices. Technological progress has led to the fabrication of thinner handsets, which imposes a severe constraint on light sensors' heights. This thickness reduction can be achieved with the use of an innovative, very thin and entirely on-chip spectral filter. In this work, we present the investigation and demonstration of plasmonic filters aimed at commercial ALS products. The most efficient filtering structures are identified, with strong emphasis on stability with respect to the angle of incidence and polarization state of light. Integration schemes are proposed according to CMOS compatibility and wafer-scale fabrication concerns. Plasmon resonances are studied to reach optimal optical properties, and a dedicated methodology is used to propose optimized ALS performance based on actual customers' specifications. The robustness of plasmonic filters to process dispersions is addressed through the identification and simulation of typical 300 mm fabrication inaccuracies and defects. In the light of these studies, an experimental demonstration of ALS plasmonic filters is performed, with the development of a wafer-level integration and with the characterization and performance evaluation of the fabricated structures to validate the plasmonic solution.
Petitdemange, Eva. "SAMUFLUX : une démarche outillée de diagnostic et d'amélioration à base de doubles numériques : application aux centres d'appels d'urgence de trois SAMU." Thesis, Ecole nationale des Mines d'Albi-Carmaux, 2020. http://www.theses.fr/2020EMAC0012.
The demand for emergency medical services has been significant and increasing over the last decade. In a constrained medico-economic context, maintaining operational capacity is a strategic stake in the face of the risk of congestion and insufficient accessibility for the population. Recent events such as the COVID-19 pandemic show the limits of the current system in facing crisis situations. Reinforcing human resources cannot be the only answer to this observation, and it becomes unavoidable to build new organizational models while aiming at a quality of service that answers 99% of incoming calls in less than 60 seconds (90% in 15 s and 99% in 30 s, MARCUS report and HAS recommendation, October 2020). However, these models must take into account the great heterogeneity of EMS call centers and their operation. In the light of these findings, the research work presented in this manuscript aims to evaluate the organizational effectiveness and resilience of EMS call centers in managing the flow of emergency telephone calls, in order to deal with both daily life and crisis situations. This evaluation allows us to propose and test new organizational schemes in order to make recommendations adapted to the particularities of emergency call centers. In a first part, we propose a tool-supported methodology for the diagnosis and improvement of emergency call centers. It can be broken down into two main parts: the study of data from emergency call centers, and then the design and use of a digital twin. For each step of this methodology, we propose an associated tool. In a second part, we apply the first part of the methodology to our partner EMS data. The aim is to extract information and knowledge from the telephony data as well as from the business processes for handling emergency calls. The knowledge thus extracted makes it possible to design a digital twin that is close to the real behaviour of the EMS.
Finally, in a third part, we use the material produced previously to model and parameterize a digital twin deployed on a discrete-event simulation engine. It allows us to test several scenarios by playing on different call management organizations. On this basis, we make recommendations on the types of organizations to adopt in order to improve the performance of call centers.
Fromm, Bradley S. "Linking phase field and finite element modeling for process-structure-property relations of a Ni-base superalloy." Thesis, Georgia Institute of Technology, 2012. http://hdl.handle.net/1853/45789.
Full textBergier, Jean-Yves. "Analyse et modélisation du processus de propagation des effets des actions militaires d'influence au sein d'une population cible : approche par la culture et les réseaux sociaux." Thesis, Aix-Marseille, 2018. http://www.theses.fr/2018AIXM0752/document.
The study, analysis and understanding of armed violence phenomena in developing countries, and of the effects of the international military interventions aimed at resolving them, is a crucial contemporary issue. Given the abundance and complexity of the social processes implicated and intertwined in these situations, they also present a challenge for social computing, modelling and simulation. A challenge, but also an opportunity: the evolution of the forms of conflict, today centered on local populations, has prompted the armies tasked with implementing stabilization missions to develop influence actions. Such operations, broadly concerned with persuading locals of the legitimacy of the operations, allow a more comprehensive approach to conflict resolution, beyond the simple use of force. Modelling some of these specific actions (PSYOPS, CIMIC, and Key Leader Engagement) is a credible project and a contribution to the analysis of communication and persuasion processes in social networks, taking into account detailed and specific social and cultural factors. This research thus presents a conceptual model allowing simulation of the effects of these specific influence actions on a realistic civilian population. We chose an agent-based approach, as it lends itself particularly well to this type of research, allowing us to generate a group of up to 10,000 agents, composed solely of individuals for detailed cognitive treatment, and structured as a multilayer network to represent complex sociality. Given the nature of such actions and their context of application, such a model also highlights some social mechanisms typical of armed conflict situations.
Boulon, Julien. "Approche multi-échelle de la formation des particules secondaires." Phd thesis, Université Blaise Pascal - Clermont-Ferrand II, 2011. http://tel.archives-ouvertes.fr/tel-00697022.
Full textMarbach, Peter 1966. "Simulation-based optimization of Markov decision processes." Thesis, Massachusetts Institute of Technology, 1998. http://hdl.handle.net/1721.1/9660.
Full textIncludes bibliographical references (p. 127-129).
Markov decision processes have been a popular paradigm for sequential decision making under uncertainty. Dynamic programming provides a framework for studying such problems, as well as for devising algorithms to compute an optimal control policy. Dynamic programming methods rely on a suitably defined value function that has to be computed for every state in the state space. However, many interesting problems involve very large state spaces ("curse of dimensionality"), which prohibits the application of dynamic programming. In addition, dynamic programming assumes the availability of an exact model, in the form of transition probabilities ("curse of modeling"). In many situations, such a model is not available and one must resort to simulation or experimentation with an actual system. For all of these reasons, dynamic programming in its pure form may be inapplicable. In this thesis we study an approach for overcoming these difficulties where we use (a) compact (parametric) representations of the control policy, thus avoiding the curse of dimensionality, and (b) simulation to estimate quantities of interest, thus avoiding model-based computations and the curse of modeling. Furthermore, our approach is not limited to Markov decision processes, but applies to general Markov reward processes for which the transition probabilities and the one-stage rewards depend on a tunable parameter vector θ. We propose gradient-type algorithms for updating θ based on the simulation of a single sample path, so as to improve a given performance measure. As possible performance measures, we consider the weighted reward-to-go and the average reward. The corresponding algorithms (a) can be implemented online and update the parameter vector either at visits to a certain state or at every time step, and (b) have the property that the gradient (with respect to θ) of the performance measure converges to 0 with probability 1.
This is the strongest possible result for gradient-related stochastic approximation algorithms.
by Peter Marbach.
Ph.D.
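The single-sample-path, gradient-type idea summarized in this abstract is the ancestor of likelihood-ratio policy-gradient methods. Below is a deliberately tiny sketch on a stateless toy problem (two actions with invented mean rewards), not the thesis's algorithm: it updates a softmax parameter vector θ along the REINFORCE gradient with a running-average baseline.

```python
import numpy as np

rng = np.random.default_rng(2)

def softmax(theta):
    e = np.exp(theta - theta.max())
    return e / e.sum()

# Invented toy problem: two actions with mean rewards 0.2 and 0.8.
mean_reward = np.array([0.2, 0.8])
theta = np.zeros(2)                  # tunable policy parameter vector
baseline, alpha = 0.0, 0.1

for _ in range(2000):
    pi = softmax(theta)
    a = rng.choice(2, p=pi)
    r = mean_reward[a] + rng.normal(0.0, 0.1)
    grad_log = -pi.copy()            # gradient of log pi(a) w.r.t. theta
    grad_log[a] += 1.0
    theta = theta + alpha * (r - baseline) * grad_log
    baseline += 0.01 * (r - baseline)   # running-average baseline

print(softmax(theta))    # most probability mass on the better action
```

The baseline plays the role of the average-reward estimate updated along the same sample path; after a couple of thousand steps the policy concentrates on the action with the higher mean reward.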
Krutz, Nicholas J. "On the Path-Dependent Microstructure Evolution of an Advanced Powder Metallurgy Nickel-base Superalloy During Heat Treatment." The Ohio State University, 2020. http://rave.ohiolink.edu/etdc/view?acc_num=osu1606949447780975.
Full textSennersten, Charlotte. "Model-based Simulation Training Supporting Military Operational Processes." Doctoral thesis, Karlskrona : Blekinge Institute of Technology, 2010. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-00469.
Full textKritchanchai, Duangpun. "Responsiveness of order fulfillment processes." Thesis, University of Nottingham, 2000. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.326547.
Full textEid, Abdelrahman. "Stochastic simulations for graphs and machine learning." Thesis, Lille 1, 2020. http://www.theses.fr/2020LIL1I018.
While it is impractical to study the whole population in many domains and applications, sampling is a necessary method that allows information to be inferred. This thesis is dedicated to developing probability sampling algorithms to make inferences about a population when it is too large or impossible to be obtained. Markov chain Monte Carlo (MCMC) techniques are among the most important tools for sampling from probability distributions, especially when these distributions have intractable normalization constants. The work of this thesis is mainly interested in graph sampling techniques. Two methods are presented in chapter 2 to sample uniform subtrees from graphs using Metropolis-Hastings algorithms. The proposed methods aim to sample trees according to a distribution from a graph whose vertices are labelled. The efficiency of these methods is proved mathematically. Additionally, simulation studies were conducted and confirmed the theoretical convergence results to the equilibrium distribution. Continuing the work on graph sampling, a method is presented in chapter 3 to sample sets of similar vertices in an arbitrary undirected graph using the properties of permanental point processes (PPP). Our algorithm to sample sets of k vertices is designed to overcome the computational complexity of computing the permanent, by sampling a joint distribution whose marginal distribution is a k-PPP. Finally, in chapter 4, we use the definitions of MCMC methods and convergence speed to estimate the kernel bandwidth used for classification in supervised machine learning. A simple and fast method called KBER is presented to estimate the bandwidth of the radial basis function (RBF) kernel using the average Ricci curvature of graphs.
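One classical Metropolis-Hastings chain on spanning trees, in the spirit of the subtree sampling described for chapter 2, is the edge-exchange walk: add a uniformly chosen non-tree edge, then delete a uniform edge of the cycle it creates. Because the proposal is symmetric, every move is accepted when the target is uniform. A minimal sketch (the small test graph is an invented example, not one from the thesis):

```python
import random
from collections import defaultdict

def tree_path(tree_adj, u, v):
    """Edges on the unique path from u to v in a tree (iterative DFS)."""
    stack, parent = [u], {u: None}
    while stack:
        x = stack.pop()
        if x == v:
            break
        for y in tree_adj[x]:
            if y not in parent:
                parent[y] = x
                stack.append(y)
    path, x = [], v
    while parent[x] is not None:
        path.append(frozenset((x, parent[x])))
        x = parent[x]
    return path

def mh_spanning_tree_step(edges, tree, rng):
    """One step of the edge-exchange chain on spanning trees; the move
    is symmetric, so the MH acceptance probability is 1 for a uniform
    target distribution."""
    non_tree = [e for e in edges if e not in tree]
    if not non_tree:
        return tree          # the graph is itself a tree
    e = rng.choice(non_tree)
    adj = defaultdict(list)
    for a, b in (tuple(x) for x in tree):
        adj[a].append(b)
        adj[b].append(a)
    u, v = tuple(e)
    cycle = tree_path(adj, u, v)      # adding e closes this path into a cycle
    f = rng.choice(cycle)             # delete one cycle edge (never e itself)
    return (tree - {f}) | {e}

# Toy run on the 4-cycle plus one chord.
rng = random.Random(3)
edges = [frozenset(p) for p in [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]]
tree = {edges[0], edges[1], edges[2]}     # an initial spanning tree
for _ in range(1000):
    tree = mh_spanning_tree_step(edges, tree, rng)
print(len(tree))                          # still 3 edges: a spanning tree
```

Each step preserves the spanning-tree property, so the chain wanders over the set of spanning trees with the uniform law as its equilibrium distribution.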
Iyigunlu, Serter. "Agent-based modelling and simulation of airplane boarding processes." Thesis, Queensland University of Technology, 2015. https://eprints.qut.edu.au/83637/1/Serter_Iyigunlu_Thesis.pdf.
Full textClegg, David Richard. "A construction-specific simulation-based framework for earthworks." Thesis, Sheffield Hallam University, 1999. http://shura.shu.ac.uk/19480/.
Full textQasim, M. "Improving highways construction processes using computer-based simulation techniques." Thesis, University of Salford, 2018. http://usir.salford.ac.uk/49496/.
Full textChristeleit, Deborah. "Use of Simulation to Reinforce Evidence-based Collection Processes." UNF Digital Commons, 2011. http://digitalcommons.unf.edu/etd/209.
Full textRannestad, Bjarte. "Exact Statistical Inference in Nonhomogeneous Poisson Processes, based on Simulation." Thesis, Norwegian University of Science and Technology, Department of Mathematical Sciences, 2007. http://urn.kb.se/resolve?urn=urn:nbn:no:ntnu:diva-10775.
We present a general approach for Monte Carlo computation of conditional expectations of the form E[φ(T) | S = s] given a sufficient statistic S. The idea of the method was first introduced by Lillegård and Engen [4], and has been further developed by Lindqvist and Taraldsen [7, 8, 9]. If a certain pivotal structure is satisfied in our model, the simulation can be done by direct sampling from the conditional distribution, by a simple parameter adjustment of the original statistical model. In general, it is shown by Lindqvist and Taraldsen [7, 8] that a weighted sampling scheme needs to be used. The method is applied in particular to the nonhomogeneous Poisson process (NHPP), in order to develop exact goodness-of-fit tests for the null hypothesis that a set of observed failure times follows an NHPP of a specific parametric form. In addition, exact confidence intervals for unknown parameters in the NHPP model are considered [6]. Different test statistics W = W(T), designed to reveal departures from the null model, are presented [1, 10, 11]. By the method given in the following, the conditional expectation of these test statistics can be simulated in the absence of the pivotal structure mentioned above. This extends results given in [10, 11], and answers a question stated in [1]. We present a power comparison of five of the test statistics considered, under the null hypothesis that a set of observed failure times comes from an NHPP with log-linear intensity, against the alternative hypothesis of power-law intensity. Finally, a convergence comparison of the method presented here and an alternative Gibbs sampling approach is given.
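A building block behind such conditional tests: given N(T) = n events, the event times of an NHPP with log-linear intensity exp(a + bt) are i.i.d. with CDF F(t) = (e^(bt) − 1)/(e^(bT) − 1), so a Monte Carlo null distribution of a test statistic is easy to simulate. The sketch below assumes b known for simplicity, whereas the thesis conditions on the full sufficient statistic; the parameter values are invented.

```python
import numpy as np

rng = np.random.default_rng(4)

def sample_conditional_nhpp(n, b, T, size):
    """Given N(T) = n, event times of an NHPP with intensity exp(a + b*t)
    are i.i.d. with CDF F(t) = expm1(b*t) / expm1(b*T) (independent of a);
    sample them by inversion and sort each replicate."""
    u = rng.uniform(size=(size, n))
    return np.sort(np.log1p(u * np.expm1(b * T)) / b, axis=1)

def ks_statistic(times, b, T):
    """Kolmogorov-Smirnov distance between the empirical CDF of the
    sorted event times and the null conditional CDF."""
    n = times.shape[-1]
    f = np.expm1(b * times) / np.expm1(b * T)
    grid = np.arange(1, n + 1) / n
    return np.maximum(np.abs(grid - f), np.abs(grid - 1.0 / n - f)).max(axis=-1)

T, b, n = 10.0, 0.2, 25
null = ks_statistic(sample_conditional_nhpp(n, b, T, 5000), b, T)      # null law
observed = ks_statistic(sample_conditional_nhpp(n, b, T, 1), b, T)[0]  # "data"
print(np.mean(null >= observed))    # Monte Carlo p-value
```

Because the "observed" sample is drawn from the null model itself, the resulting p-value is approximately uniform over repeated runs; an off-model dataset would push it toward 0.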
Hong, Seng-Phil. "Data base security through simulation." Virtual Press, 1994. http://liblink.bsu.edu/uhtbin/catkey/902465.
Department of Computer Science
Yuktadatta, Panurit. "Simulation of a parallel processor based small tactical system." Thesis, Monterey, California. Naval Postgraduate School, 1991. http://hdl.handle.net/10945/28538.
Full textKadi, Imène Yamina. "Simulation et monotonie." Versailles-St Quentin en Yvelines, 2011. http://www.theses.fr/2011VERS0029.
The work of this thesis concerns the contribution of monotonicity to simulation methods. Initially, we focus on the different monotonicity notions used in stochastic modeling and try to establish the relationships between them. Three concepts have been defined in this field: stochastic monotonicity, based on stochastic comparison; realizable monotonicity; and finally event monotonicity, used in perfect simulation. This study allowed us to exploit stochastic monotonicity properties in monotone perfect simulation. On the other hand, we have proposed monotone invertible encodings for systems whose natural representation is not monotone. This encoding accelerates monotone simulations and found its application in the simulation of optical bursts. Further work was done in the field of parallel simulation: it uses the monotonicity properties of the simulated systems to better parallelize the simulation process, which should produce a substantial acceleration of the simulations.
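The monotone perfect simulation mentioned in this abstract refers to the Propp-Wilson coupling-from-the-past (CFTP) construction, where monotonicity lets one couple only the two extreme trajectories. As a hedged illustration — not code from the thesis — the sketch below implements CFTP for a generic monotone update rule, using a small capacitated queue as the example system; names and parameters are invented for the example.

```python
import random

def cftp_monotone(update, s_min, s_max, seed=0):
    """Propp-Wilson coupling from the past for a monotone update rule on an
    ordered state space [s_min, s_max]. Because the update is monotone, it
    suffices to run the two extreme trajectories: when they coalesce, every
    start state has coalesced, and the value at time 0 is an exact draw
    from the stationary distribution."""
    rng = random.Random(seed)
    us = []          # us[k] = random input used at time -(k+1); fixed once drawn
    horizon = 1
    while True:
        while len(us) < horizon:
            us.append(rng.random())
        lo, hi = s_min, s_max
        for k in range(horizon - 1, -1, -1):    # chronological order: -horizon .. -1
            lo = update(lo, us[k])
            hi = update(hi, us[k])
        if lo == hi:
            return lo                            # exact stationary sample
        horizon *= 2                             # restart further back in the past

def queue_update(x, u, K=5):
    """Example monotone update: a capacitated queue on {0, ..., K}
    (arrival with probability 0.4, departure otherwise)."""
    return min(x + 1, K) if u < 0.4 else max(x - 1, 0)
```

Here `cftp_monotone(queue_update, 0, 5)` returns an exact sample from the queue's stationary distribution; the reuse of the same random inputs `us` as the horizon doubles is what makes the algorithm exact rather than approximate.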
Fauvet, Marie-Christine. "ETIC : un SGBD pour la CAO dans un environnement partagé." Phd thesis, Grenoble 1, 1988. http://tel.archives-ouvertes.fr/tel-00328187.
Zhang, Zebin. "Intégration des méthodes de sensibilité d'ordre élevé dans un processus de conception optimale des turbomachines : développement de méta-modèles." Thesis, Ecully, Ecole centrale de Lyon, 2014. http://www.theses.fr/2014ECDL0047/document.
Optimal design of turbomachinery usually relies on iterative methods with either experimental or numerical evaluations, which can be very costly due to the numerous manipulations and intensive CPU usage involved. In order to limit the cost and shorten the development time, the present thesis proposes to integrate a parameterization method and a meta-modeling method into an optimal design cycle of an axial low-speed turbomachine. The parameterization, realized through a high-order sensitivity study of the Navier-Stokes equations, allows the construction of a parameterized database that contains not only the evaluation results, but also the simple and cross derivatives of the objectives as functions of the parameters. The enriched information brought by the derivatives is exploited during the meta-model construction, in particular by the Co-Kriging method, employed to couple several databases. Compared to classical derivative-free methods, the economic benefit of the proposed method lies in the use of fewer reference points. When the number of reference points is small, a dimension may be represented by a single point, which requires a hypothesis on the error distribution. For those dimensions, Co-Kriging works like a Taylor extrapolation from the reference point, making the most of its derivatives. This approach has been tested on the construction of a meta-model for a fan with a conic hub. The methodology relies on the coupling of databases based on two fan geometries and two operating points. The precision of the meta-model allows an optimization to be performed with the help of NSGA-II; one of the selected optima reaches the maximum efficiency, and another covers a large operating range. The optimization results are finally validated by further numerical simulations.
Bejo, Laszlo. "Simulation based modeling of the elastic properties of structural wood based composite lumber." Morgantown, W. Va. : [West Virginia University Libraries], 2001. http://etd.wvu.edu/templates/showETD.cfm?recnum=2058.
Title from document title page. Document formatted into pages; contains xiv, 224 p. : ill. (some col.). Vita. Includes abstract. Includes bibliographical references (p. 149-155).
Lau, C. C. "Development and assessment of a reactive simulation-based manufacturing planning and control system." Thesis, Queen's University Belfast, 2002. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.269057.
Puig, Bénédicte. "Modélisation et simulation de processus stochastiques non gaussiens." Phd thesis, Université Pierre et Marie Curie - Paris VI, 2003. http://tel.archives-ouvertes.fr/tel-00003526.
Canicas, Jean-François. "Formalisation et simulation du processus de reconnaissance intermoléculaire." Bordeaux 1, 1985. http://www.theses.fr/1985BOR10585.
Canicas, Jean-François. "Formalisation et simulation du processus de reconnaissance intermoléculaire." Grenoble : ANRT, 1985. http://catalogue.bnf.fr/ark:/12148/cb37594545c.
Lenôtre, Lionel. "Étude et simulation des processus de diffusion biaisés." Thesis, Rennes 1, 2015. http://www.theses.fr/2015REN1S079/document.
We consider skew diffusion processes and their simulation. This study is divided into four parts and concentrates on processes whose coefficients are piecewise constant, with discontinuities along a simple hyperplane. We start with a theoretical study of the one-dimensional case, where the coefficients belong to a broader class. In particular, we give a result on the structure of the resolvent densities of these processes and obtain a computational method. When possible, we perform a Laplace inversion of these densities and provide some transition functions. Then we concentrate on the simulation of skew diffusion processes. We build a numerical scheme using the resolvent density, valid for any Feller process. With this scheme and the resolvent densities computed in the previous part, we obtain a simulation method for skew diffusion processes in dimension one. After that, we consider the multidimensional case. We provide a theoretical study and compute some functionals of skew diffusion processes. This allows us to obtain, among others, the transition function of the marginal process orthogonal to the hyperplane of discontinuity. Finally, we consider the parallelization of Monte Carlo methods. We provide a strategy which allows the simulation of a large batch of skew diffusion sample paths on massively parallel architectures. An interesting feature is the possibility of replaying some of the sample paths of previous simulations.
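The canonical one-dimensional example of such a skew diffusion is skew Brownian motion. The sketch below is not a scheme from the thesis: it uses the classical Harrison-Shepp random-walk approximation, in which the walk is symmetric away from zero and steps up from zero with probability (1 + beta)/2.

```python
import random

def skew_random_walk(beta, n_steps, dt, rng):
    """Harrison-Shepp random-walk approximation of skew Brownian motion
    with skewness parameter beta in [-1, 1]: symmetric +/-1 steps away
    from 0, an upward step with probability (1 + beta)/2 when at 0.
    Positions are rescaled by sqrt(dt) so the path approximates the
    diffusion on a time grid of step dt."""
    h = dt ** 0.5
    x = 0                      # position on the integer lattice
    path = [0.0]
    for _ in range(n_steps):
        if x == 0:
            x += 1 if rng.random() < (1.0 + beta) / 2.0 else -1
        else:
            x += 1 if rng.random() < 0.5 else -1
        path.append(x * h)
    return path
```

Two sanity checks on the construction: with beta = 0 this reduces to a simple symmetric random walk, and with beta = 1 the walk is reflected at zero and never becomes negative.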
Delay, Etienne. "Réflexions géographiques sur l'usage des systèmes multi agents dans la compréhension des processus d'évolution des territoires viticoles de fortes pentes : le cas de la Côte Vermeille et du Val di Cembra." Thesis, Limoges, 2015. http://www.theses.fr/2015LIMO0037/document.
Wine and vineyards stand nowadays as a significant source of wealth for a number of countries. While retaining their role as production spaces, vine-growing regions are developing strategies to adapt to market globalisation and to ever more versatile consumer expectations. Yet, due to the specific orographic conditions of steep slope and mountain regions, actors' relative leeway there is shrinking: compared with plain vine-growing, a large part of their production costs often remains incompressible. On the other hand, these vine-growing landscapes take advantage of such harsh conditions in terms of international recognition, as images of social construction and environmental equilibrium. The work presented here emerged as a response to this sensitivity of steep slope vineyards. The investigation relies on two study areas: the Côte Vermeille vineyards in France, and the Val di Cembra in Italy. Our approach focuses on the opportunities granted by empirical agent-based modelling methods, in order to put forward a renewed look at the role of society-environment interactions in the sustainability and development of territories subject to constraints. Using an exploratory and incremental method, three significant issues of steep slope vine-growing regions have been addressed, thanks to a constellation of multi-agent models derived from questioning actors of the sector. The first considers the market's impact on small-scale plant cover dynamics. The second focuses on meso-scale plant cover dynamics and questions the definition of simple socio-economic rule sets, within the frame of land property dynamics, applied to the scale of a few municipalities. The last section of this work is dedicated to some descriptive phenomena at a large scale. The sum of these reflections leads us to exploit modelling co-designed with the stakeholders in order to propose a global prospective vision for mountain and steep slope regions.
This prospective approach is conducted in association with some of the players in the sector, allowing us to delineate, in agreement with their experience, the structural variables of steep slope vineyard systems. Based on these variables, four prospective scenarios are put forward for steep slope vine-growing activity.
Xiang, Haiou. "BLUETOOTH-BASE WORM MODELING AND SIMULATION." Master's thesis, University of Central Florida, 2007. http://digital.library.ucf.edu/cdm/ref/collection/ETD/id/2669.
M.S.
School of Electrical Engineering and Computer Science
Engineering and Computer Science
Computer Science MS
Dahlström, Christina. "Quantitative microscopy of coating uniformity." Doctoral thesis, Mittuniversitetet, Institutionen för tillämpad naturvetenskap och design, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:miun:diva-16454.
Liu, ke. "A Simulation Based Approach to Estimate Energy Consumption for Embedded Processors." Thesis, Högskolan i Halmstad, Centrum för forskning om inbyggda system (CERES), 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:hh:diva-29913.
Full textTang, Gula. "Research on distributed warning system of water quality in Mudan river based on EFDC and GIS." Thesis, Strasbourg, 2016. http://www.theses.fr/2016STRAD023/document.
A Simulation and Early Warning System (SEWS) is a powerful tool for river water quality monitoring. The Mudan River, an important river in the northeastern cold regions of China, flows out of China into Russia. Thus, the water quality of the Mudan River is of high concern not only locally and regionally but also internationally. The objective of this study is to establish a robust SEWS for water quality, so that the spatio-temporal distribution of water quality in both open-water and ice-covered periods can be accurately simulated and visualized, in order to understand the spatial variation of pollutants along the river course. The dissertation is structured into 7 chapters. Chapter I outlines the background of the study and reviews current progress. Chapter II compares the main available models for evaluating river water quality, so that a suitable model can be selected as the basis of a modeling system for the Mudan River. Chapter III establishes the model; the required boundary conditions and parameters were verified and calibrated. In Chapter IV, a distributed simulation procedure is designed to increase simulation efficiency. Chapter V discusses the programming and operational issues of the distributed simulation. Chapter VI presents the core techniques used to implement the system. Chapter VII concludes the study, summarizing its key points and innovations. The study makes three main innovations: a two-dimensional environmental fluid dynamics model for the Mudan River, an efficient distributed model computation method, and a prototype SEWS, which can greatly improve the capability of monitoring and managing water quality in the Mudan River and other similar rivers.
El, Harabi Rafika. "Supervision des processus chimiques à base de modèles Bond Graphs." Thesis, Lille 1, 2011. http://www.theses.fr/2011LIL10074/document.
The proposed Ph.D. thesis deals with the integrated design of bond graph model-based health monitoring for chemical processes. This research work is performed within the framework of the topic "Bond graphs for supervision of energetic processes" developed between the University of Gabès (Tunisia) and Polytech'Lille, the engineering school within the University of Science and Technology of Lille. Chemical processes are polluting and risk-prone; for staff safety and environmental protection, they need online surveillance for early detection and identification of failures. Chemical processes involve phenomena of various natures (chemical and/or biochemical, thermo-fluidic, and so on), whose modeling requires a unified approach. The bond graph, as a multidisciplinary tool, is well suited to this task. Furthermore, this tool can also be used for the design of diagnosis algorithms thanks to its behavioral, causal and structural properties, and it provides structural diagnosability conditions for the pertinent equipment without numerical calculation. The main scientific contributions of this research can be summarized as follows: (i) elaboration of a database of dynamic bond graph models of components and chemical phenomena; (ii) a methodology of bond graph model-based diagnosis for the systematic generation of fault indicators sensitive to the appearance of secondary reactions, a source of pollution and explosion; (iii) robust diagnosis based on a coupled uncertain bond graph model; (iv) computerization of the diagnosability analysis procedures applied to a class of chemical processes; (v) application to a real process (an esterification installation).
Bouchaala, Charfeddine Olfa. "Gestion sensible au métier des processus à base de services." Thesis, Université Paris-Saclay (ComUE), 2016. http://www.theses.fr/2016SACLL004/document.
Continuous changes in the business environment urge companies to adapt their processes accordingly. Indeed, companies struggle to find a balance between adapting their processes and remaining competitive. While the imperative nature of business processes is too rigid to adapt them at run-time, the declarative nature of purely rule-based business processes is very time consuming. Hybrid approaches in turn try to reconcile these two styles in order to meet market requirements; nevertheless, they also require an effort to align business logic and process logic. Therefore, in this thesis, we focus on business environment-aware management of service-based business processes (SBPs), aiming to conciliate imperative and declarative approaches. Our challenge is to develop a hybrid management approach that preserves industry standards for describing and managing SBPs while minimizing designers' efforts. Based on a semantic modeling of the business environment, of business processes, and of their relationships, together with a control dependency analysis of business processes, we are able to synthesize a controller, itself modeled as a process, connected to the business process to be monitored and configured at run-time. We also validated the feasibility of our management approach by implementing the framework Business Environment-Aware Management for Service-based Business processes (BEAM4SBP). Experiments show the efficiency of our approach with respect to other BEAM approaches.
Villefranche, Laurent, and Frédéric Serin. "Simulateur de gestion d'un terminal à conteneurs : simulation discrète par macro-processus et processus complementaires." Rouen, 1996. http://www.theses.fr/1996ROUES002.
Full textBen, Salah Fatma. "Modélisation et simulation à base de règles pour la simulation physique." Thesis, Poitiers, 2018. http://www.theses.fr/2018POIT2293.
The physical simulation of deformable objects is at the core of many computer graphics applications. In this context, we are interested in the creation of a framework that combines a topological model, namely generalized maps, with one or several mechanical models, for the physical animation of deformable meshed objects that can undergo topological modifications such as tearing or fracturing. To obtain a general framework, we chose to rely on the graph manipulation and transformation rules provided by the JERBOA software. This environment gave us fast prototyping facilities for different mechanical models. It allowed us to define precisely how to store mechanical properties in the topological description of a mesh and to simulate its deformation in a topology-based manner for interaction computation and force distribution. All mechanical properties are stored in the topological model, without any external structure. This framework is general: it allows the simulation of 2D or 3D objects with different types of meshes, including non-homogeneous ones. It also allowed the simulation of several mechanical models, continuous or discrete, with various properties of homogeneity and isotropy. Furthermore, different methods to simulate topological modifications have been implemented in the framework; they include both the selection of a criterion to trigger topological modifications and a transformation type. Our approach also managed to reduce the number of updates of the mechanical model after tearing or fracture.
Glardon, Carmen. "Simulation et optimisation du processus constructif de rénovation d'immeubles /." [S.l.] : [s.n.], 1995. http://library.epfl.ch/theses/?nr=1355.
Droux, Jean-Jacques. "Simulation numérique bidimensionnelle et tridimensionnelle de processus de solidification /." [S.l.] : [s.n.], 1991. http://library.epfl.ch/theses/?nr=901.