Academic literature on the topic 'Simulation à base de processus'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Simulation à base de processus.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Simulation à base de processus"

1

Curie, Florence, Annabelle Mas, Julien Perret, Anne Puissant, and Anne Ruas. "Simulation d’un processus de densification du tissu urbain à base d’agents." Revue internationale de géomatique 21, no. 4 (December 30, 2011): 489–511. http://dx.doi.org/10.3166/rig.15.489-511.

2

Charness, Neil. "Psychological Models of Aging: How, Who, and What? A Comment." Canadian Journal on Aging / La Revue canadienne du vieillissement 14, no. 1 (1995): 67–73. http://dx.doi.org/10.1017/s0714980800010503.

Abstract:
Schroots raises important questions about the aging process, notably 'How should we study the aging process?', 'Which people should we study?', and 'What is the best way to model change?'. I address these questions using analogies in their broad sense, such as 'gerodynamics' and entropy, and I argue for a strategy that identifies basic phenomena to provide explanations and uses computer simulation to model these phenomena.
3

Lebl, Aleksandar, Dragan Mitić, and Žarko Markov. "A role of Excel program in telecommunication processes simulation." Scientific Technical Review 73, no. 1 (2023): 13–17. http://dx.doi.org/10.5937/str2301013l.

Abstract:
This paper presents the simulation of mobile telephony systems realized in Excel. Although primarily intended for other applications, Excel has several advantages over specialized simulation programs. Its application to simulation is illustrated by several examples from already published papers, with the main goal of describing the most important parts of the realized simulations, which allow all important characteristics of the telecommunication traffic process to be determined. Besides the traffic process, Excel also allows a reliable simulation of base-station emission power starting from a random distance between the base station and the mobile station.
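The traffic-process simulations the paper builds out of spreadsheet formulas can also be sketched in a few lines of conventional code. The following Python sketch (my own illustration; the function names and parameters are not from the paper) estimates call blocking in an M/M/c/c loss system by Monte Carlo and checks it against the closed-form Erlang B formula:

```python
import random

def erlang_b(servers: int, offered_load: float) -> float:
    """Erlang B blocking probability, computed with the stable recurrence."""
    b = 1.0
    for n in range(1, servers + 1):
        b = offered_load * b / (n + offered_load * b)
    return b

def simulate_blocking(servers, arrival_rate, mean_hold, n_calls, seed=0):
    """Monte Carlo estimate of blocking in an M/M/c/c loss system:
    Poisson arrivals, exponential holding times, no waiting room."""
    rng = random.Random(seed)
    t = 0.0
    busy_until = []                     # departure times of calls in progress
    blocked = 0
    for _ in range(n_calls):
        t += rng.expovariate(arrival_rate)             # next arrival instant
        busy_until = [d for d in busy_until if d > t]  # release finished channels
        if len(busy_until) >= servers:
            blocked += 1                               # all channels busy
        else:
            busy_until.append(t + rng.expovariate(1.0 / mean_hold))
    return blocked / n_calls
```

With 5 channels and an offered load of 2 Erlang, both the simulation and the formula give a blocking probability of about 3.7%.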
4

Lemogne, C. "L’imagerie cérébrale fonctionnelle : un outil au service de la psychopathologie ?" European Psychiatry 30, S2 (November 2015): S3–S4. http://dx.doi.org/10.1016/j.eurpsy.2015.09.021.

Abstract:
Psychopathology is invaluable for guiding the application of functional neuroimaging to the study of mental disorders. By enabling an approach based on mental processes rather than diagnostic categories (e.g. rumination rather than depression), it offers the opportunity to identify biomarkers that could enrich psychiatric nosography and inform diagnostic and therapeutic strategies. Conversely, whether functional neuroimaging can serve psychopathology, that is, the understanding of the mental processes underlying psychiatric disorders, remains a controversial question. One potential benefit of functional neuroimaging could be the identification of mental processes that are non-conscious and inaccessible to behavioral measurement. Thus, the observation of shared neural bases between emotional and physical pain has given rise to fascinating speculations about the origin of their lexical kinship, and some envision distinguishing conversion disorder from malingering on the basis of brain activity. But interpreting this activity as evidence of a mental process, a line of reasoning known as reverse inference, raises several problems, whether or not the mental process is reportable. For example, the observed brain activity may not be specific to the mental process in question, or it may not be defined precisely enough. All of these problems can be formalized in a Bayesian perspective. Despite these limitations, reverse inference remains a powerful heuristic tool for generating subsequently falsifiable hypotheses about the nature of mental processes and their relations (e.g. evocation of the lost object and reinforcement in complicated grief).
Combined with well-designed experimental paradigms, functional neuroimaging is therefore likely to bring new knowledge to psychopathology.
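The Bayesian formalization of reverse inference mentioned in this abstract can be made concrete with a toy calculation (the numbers are invented for illustration): even a brain region that activates reliably during a given mental process provides only weak evidence for that process if it also activates during many others.

```python
def reverse_inference(p_act_given_proc, p_act_given_other, prior):
    """Posterior P(process | activation) by Bayes' rule."""
    evidence = (p_act_given_proc * prior
                + p_act_given_other * (1.0 - prior))
    return p_act_given_proc * prior / evidence

# Hypothetical figures: the region activates in 90% of trials engaging
# the process, in 40% of trials that do not, and the process has a 20%
# base rate -- the posterior is only 0.36.
posterior = reverse_inference(0.9, 0.4, 0.2)
```

The posterior rises only when the activation is selective (a low rate under other processes), which is exactly the specificity problem the abstract raises.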
5

Bursi, Fabio, Andrea Ferrara, Andrea Grassi, and Chiara Ronzoni. "Simulating Continuous Time Production Flows in Food Industry by Means of Discrete Event Simulation." International Journal of Food Engineering 11, no. 1 (February 1, 2015): 139–50. http://dx.doi.org/10.1515/ijfe-2014-0002.

Abstract:
The paper presents a new framework for carrying out simulations of continuous-time stochastic processes by exploiting a discrete event approach. The application scope of this work mainly covers industrial production processes executed on a continuous flow of material (e.g. the food and beverage industry) as well as production processes working on discrete units but characterized by a high-speed flow (e.g. automated packaging lines). The proposed model, developed using the Discrete EVent system Specification (DEVS) formalism, defines a single generalized base unit able to represent, by means of an event scheme generated by state changes, the base behaviors needed to model a generic manufacturing unit, that is, (i) breakdowns and repairs, (ii) speed and accumulation, and (iii) throughput time. Moreover, the possibility of keeping track of additional measured parameters related to the process and the flowing material (e.g. temperature, pollutant concentration, and so on) is also considered. Since these parameters can change continuously over time, a specific discretization approach has been introduced to avoid the need to integrate parameter variation functions over time.
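The event-driven treatment of a continuous flow can be illustrated with a much-simplified sketch of my own (not the paper's DEVS base unit): between two state-change events the flow rate is constant, so throughput is integrated analytically instead of being advanced with a fixed time step.

```python
import random

def simulate_flow(rate, mtbf, mttr, horizon, seed=1):
    """Continuous-flow machine with random breakdowns and repairs.
    State changes (failure, repair) are the only events; between them
    the processed quantity grows linearly at `rate`."""
    rng = random.Random(seed)
    t, throughput, up = 0.0, 0.0, True
    while t < horizon:
        dwell = rng.expovariate(1.0 / (mtbf if up else mttr))
        step = min(dwell, horizon - t)      # truncate at the horizon
        if up:
            throughput += rate * step       # integrate the flow exactly
        t += step
        up = not up                         # alternate breakdown / repair
    return throughput
```

Over a long horizon the throughput approaches rate × horizon × mtbf / (mtbf + mttr), the availability-limited flow, with no time-stepping error.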
6

Lin, Tzu-Shun, and Fang-Yi Cheng. "Impact of Soil Moisture Initialization and Soil Texture on Simulated Land–Atmosphere Interaction in Taiwan." Journal of Hydrometeorology 17, no. 5 (April 14, 2016): 1337–55. http://dx.doi.org/10.1175/jhm-d-15-0024.1.

Abstract:
This study investigates the effect of soil moisture initialization and soil texture on land surface hydrologic processes and their feedback on atmospheric fields in Taiwan. Simulations using the Weather Research and Forecasting (WRF) Model with the Noah land surface model were conducted for a 1-month period, from 10 August to 12 September 2013, that included two typhoon-induced precipitation episodes and a series of clear-sky days. Soil moisture from the Global Land Data Assimilation System (GLDAS) was used for the soil moisture initialization, and updated soil textures based on field surveys in Taiwan were adopted for the WRF Model. Three WRF sensitivity runs were performed: the first is the base case without any update (WRF-base), the second uses GLDAS products to initialize the soil moisture (WRF-GLDAS), and the third includes GLDAS products plus the updated soil textures and soil parameters (WRF-GSOIL). In WRF-base, the soil moisture initialization is provided by National Centers for Environmental Prediction (NCEP) Final (FNL) Operational Global Analysis data, whose soil moisture values are higher than those of the GLDAS products. WRF-GLDAS and WRF-GSOIL, which use GLDAS data, show lower soil moisture than WRF-base and agree better with observations, while WRF-base shows a systematic wet bias of soil moisture throughout the simulation period. In WRF-GSOIL, soil textures with large soil particles have higher soil conductivity; as a result, water drains through the soil column faster than in WRF-GLDAS, which leads to reduced soil moisture in western Taiwan. Among the three simulations, the variation of soil moisture is best simulated in WRF-GSOIL.
7

Morrow, Jarrett D., and Brandon W. Higgs. "CallSim: Evaluation of Base Calls Using Sequencing Simulation." ISRN Bioinformatics 2012 (December 12, 2012): 1–10. http://dx.doi.org/10.5402/2012/371718.

Abstract:
Accurate base calls generated from sequencing data are required for downstream biological interpretation, particularly in the case of rare variants. CallSim is a software application that provides evidence for the validity of base calls believed to be sequencing errors; it is applicable to Ion Torrent and 454 data. The algorithm processes a single read using a Monte Carlo approach to sequencing simulation, without depending on information from any other read in the data set. Three examples, from general read correction as well as error-or-variant classification, demonstrate its effectiveness as a robust base corrector for low-volume read processing. Specifically, correction of errors in Ion Torrent reads from a study involving mutations in multidrug-resistant Staphylococcus aureus illustrates its ability to classify an erroneous homopolymer call. In addition, support for a rare variant in 454 data for a mixed viral population demonstrates its "base rescue" capabilities. CallSim provides evidence regarding the validity of base calls in sequences produced by 454 or Ion Torrent systems and is intended for hands-on downstream processing analysis. These downstream efforts, although time-consuming, are necessary steps for the accurate identification of rare variants.
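As a toy illustration of supporting a base call with simulation (this is my own simplified sketch, not the CallSim algorithm), one can simulate noisy homopolymer flow signals for each candidate run length and score how often each candidate reproduces the observed signal:

```python
import random

def score_homopolymer_call(observed_signal, candidate_lengths,
                           sigma=0.35, n_sim=20000, seed=0):
    """Monte Carlo score for each candidate homopolymer length: the
    fraction of simulated noisy signals that round to the same base
    count as the observed signal."""
    rng = random.Random(seed)
    target = round(observed_signal)
    scores = {}
    for length in candidate_lengths:
        hits = sum(1 for _ in range(n_sim)
                   if round(rng.gauss(length, sigma)) == target)
        scores[length] = hits / n_sim
    return scores
```

For an observed signal of 3.1, the length-3 candidate scores far above lengths 2 and 4, lending simulation-based support to the 3-mer call.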
8

Barton, G., X. Li, and Gerhard Hirt. "Finite-Element Modeling of Multi-Pass Forging of Nickel-Base Alloys Using a Multi-Mesh Method." Materials Science Forum 539-543 (March 2007): 2503–8. http://dx.doi.org/10.4028/www.scientific.net/msf.539-543.2503.

Abstract:
Nickel-base alloys are mostly used for high-temperature applications, many of which are heavily loaded safety components. The material properties depend strongly on the microstructure, which in turn depends on the metal forming process and the heat treatment. FEM-integrated microstructure models can satisfactorily describe the grain size development due to dynamic and static recrystallization during metal forming processes and heat treatment. The simulation results obtained from modeled compression experiments are very promising, so simulations of more sophisticated processes, such as multi-pass open-die forging or radial forging, are the next reasonable goal. However, the computation times for simulating these processes are still unsatisfactorily long, which deters their application. To accelerate the simulations, a multi-mesh algorithm was implemented in the finite-element simulation package PEP & LARSTRAN/SHAPE. This method uses a finite-element mesh that is fine in the deformation zone and coarse in the remaining areas of the workpiece. Because the tools move during the simulation, the deformation zone travels across the workpiece, necessitating remeshing with a transition of the finely meshed area. A second mesh, which is fine over the entire volume of the workpiece, is used to store the nodal data and simulation results, which are transferred to the simulation mesh every time a remeshing operation becomes necessary. In combination with an adapted data transfer algorithm, this second mesh minimizes the loss of accuracy when a previously finely meshed area becomes coarsely meshed. This simulation model can be used to optimize forging process chains with respect to grain size distribution as well as cost effectiveness and energy consumption.
9

Thews, O. "Simulation Analysis of the Influence of Hemodialysis Control Parameters on Exchange Processes during Therapy." International Journal of Artificial Organs 15, no. 4 (April 1992): 213–21. http://dx.doi.org/10.1177/039139889201500405.

Abstract:
The effects of dialysis control parameters (dialysate composition, ultrafiltration rate, blood flow rate) on the patient's internal milieu were studied using a mathematical model describing the dynamic exchange processes during hemodialysis. The model simulates the electrolyte and water distribution, the acid-base and oxygenation state, and ventilation. The dialysate sodium concentration mainly affects the intra-/extracellular water and potassium distribution. The dialysate bicarbonate and acetate concentrations control the acid-base state and the electrolyte distribution (sodium and potassium); in addition, the dialysate acetate concentration has a strong effect on arterial oxygenation and ventilation. The ultrafiltration rate controls the water distribution between plasma and the interstitial space, but also the sodium distribution and the arterial acid-base state. The blood flow rate through the dialyser influences the acid-base state and thereby affects the potassium and sodium distribution. The acid-base state is affected in opposite directions depending on whether acetate or bicarbonate is used as the buffer.
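The kind of dynamic exchange model described here can be caricatured with a two-compartment solute balance (a sketch of my own with invented, roughly physiological parameters, not the author's model): the dialyser clears the extracellular pool while the cell membrane re-equilibrates the two pools.

```python
def dialysis_course(c_ic, c_ec, v_ic=25.0, v_ec=15.0,
                    k_cell=30.0, k_dial=12.0, hours=4.0, dt=0.1):
    """Forward-Euler integration of a toy two-compartment solute model:
    volumes in litres, clearances in L/h, concentrations in mmol/L."""
    for _ in range(int(hours / dt)):
        j_cell = k_cell * (c_ic - c_ec)   # intracellular -> extracellular flux
        j_dial = k_dial * c_ec            # removed by the dialyser
        c_ic += dt * (-j_cell) / v_ic
        c_ec += dt * (j_cell - j_dial) / v_ec
    return c_ic, c_ec
```

Starting from equal concentrations, the extracellular pool falls first and the intracellular pool lags behind, which is why a post-dialysis gradient (and rebound) is expected when the session ends.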
10

Zarin-Nejadan, Milad. "Fiscalité, q de Tobin et investissement privé en Suisse." Recherches économiques de Louvain 58, no. 2 (1992): 213–35. http://dx.doi.org/10.1017/s0770451800083688.

Abstract:
This paper develops a model relating Tobin's q, adjusted for taxation, to the rate of capital accumulation, in order to assess the effect of tax incentives on private investment in Switzerland. Estimating the model on post-war data reveals the minor role played by taxation in the Swiss investment process. Moreover, according to simulations of the model under various tax-reform scenarios, the fiscal instruments used in Switzerland, namely changes in the corporate profit tax rate and in allowable depreciation rates, appear relatively ineffective. By contrast, a direct instrument such as the investment tax credit, never yet used, proves quite effective.

Dissertations / Theses on the topic "Simulation à base de processus"

1

Troncoso, Alan. "Conditional simulations of reservoir models using Sequential Monte-Carlo methods." Electronic Thesis or Diss., Université Paris sciences et lettres, 2022. http://www.theses.fr/2022UPSLM055.

Abstract:
A sequential Monte Carlo method, called particle filtering, has been used in a spatial context to produce simulations of two reservoir models that respect the observed facies at wells. The first one, the Boolean model, is an object-based model. It can be used to model two-facies reservoirs: a porous facies, and an impermeable facies that acts as a barrier to fluid circulation. The model is mathematically tractable: statistical methods exist to infer its parameters, as well as an iterative conditional simulation algorithm. However, the convergence rate of this algorithm is difficult to establish. A sequential algorithm based on particle filtering is proposed as an alternative; it outperforms the iterative algorithm in terms of both quality of results and computational time. The second model, Flumy, is a model of sedimentary processes used to represent the formation of meandering channelized systems. It can reproduce the heterogeneity induced by the complex geometries of sedimentary deposits. The current algorithm implemented in Flumy dynamically modifies the processes to fit the data and thereby produce conditional simulations. Setting up this algorithm requires deep knowledge of the processes in order to modify them while avoiding artifacts and biases. For this reason, another conditioning algorithm, called sequential, has been developed. It builds the reservoir by stacking horizontal layers using particle filtering, allowing the observed facies to be assimilated in each layer. The two algorithms have been compared on a synthetic case and on a real case (Loranca Basin, Spain). Both give comparable results, but they differ in the resources required for their implementation: whereas the sequential algorithm needs high computing power, the dynamic algorithm requires a fine understanding of the processes to be modified.
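The particle filtering used throughout this thesis can be illustrated, far from reservoir scale, by a minimal bootstrap filter on a one-dimensional Gaussian random walk (a generic textbook sketch, not the author's algorithm): propagate the particles, weight them by the likelihood of the new observation, then resample.

```python
import math
import random

def particle_filter(observations, n_particles=2000,
                    step_sd=1.0, obs_sd=0.5, seed=0):
    """Bootstrap particle filter for a 1D random walk observed with
    Gaussian noise; returns the filtered mean after each observation."""
    rng = random.Random(seed)
    particles = [0.0] * n_particles
    means = []
    for y in observations:
        # 1. propagate each particle through the state model
        particles = [x + rng.gauss(0.0, step_sd) for x in particles]
        # 2. weight by the observation likelihood (Gaussian kernel)
        weights = [math.exp(-0.5 * ((y - x) / obs_sd) ** 2)
                   for x in particles]
        total = sum(weights)
        means.append(sum(x * w for x, w in zip(particles, weights)) / total)
        # 3. resample proportionally to the weights
        particles = rng.choices(particles, weights=weights, k=n_particles)
    return means
```

Conditioning a reservoir layer on well facies follows the same propagate/weight/resample loop, only with a far richer state and likelihood.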
2

Beaujouan, David. "Simulation des matériaux magnétiques à base Cobalt par Dynamique Moléculaire Magnétique." Phd thesis, Université Paris Sud - Paris XI, 2012. http://tel.archives-ouvertes.fr/tel-00760645.

Abstract:
The magnetic properties of materials are strongly connected to their crystallographic structure. We propose an atomic-scale model of magnetization dynamics capable of accounting for this magnetoelasticity. Although this work belongs to the general study of magnetic materials at finite temperature, we focus on a single element, cobalt. In this effective model, atoms are described by three classical vectors: position, momentum, and spin. They interact through an ad hoc magneto-mechanical potential. We first consider atomic spin dynamics, which provides a simple way to write the equations of motion of an atomic spin system in which the positions and momenta of the atoms are frozen. It is nevertheless possible to define a spin temperature, which naturally leads to a connection with a thermal bath. After showing the limits of a stochastic approach, we develop a new deterministic formulation of temperature control for spin systems. We then develop and analyze the geometric integrators needed to couple molecular dynamics in time with this atomic spin dynamics. The coupling of the spins to the lattice is ensured by a magnetic potential that depends on the atomic positions. The novelty of this potential lies in how it parameterizes the magnetic anisotropy, which is a manifestation of spin-orbit coupling. An extended pair model of the anisotropy reproduces the experimental magnetostriction constants of hcp-Co.
Considering a canonical system in which pressure and temperature are controlled, we highlight the spin-reorientation transition specific to cobalt near 695 K. We conclude with a study of superparamagnetic magnetization reversals in Co nanodots, which allows this spin-lattice coupling to be compared with recent measurements.
3

Blondet, Gaëtan. "Système à base de connaissances pour le processus de plan d'expériences numériques." Thesis, Compiègne, 2017. http://www.theses.fr/2017COMP2363/document.

Abstract:
In order to improve industrial competitiveness, product design relies more and more on numerical tools, such as numerical simulation, to develop better and cheaper products faster. Numerical Design of Experiments (NDoE) is increasingly used to include variabilities during simulation processes, to design more robust, reliable, and optimized products earlier in the product development process. Nevertheless, an NDoE process may be too expensive to apply to a complex product, because of the high computational cost of the model and the high number of required experiments. Several methods exist to decrease this computational cost, but they require expert knowledge to be applied efficiently. In addition, an NDoE process produces a large amount of data that must be managed. The aim of this research is to propose a solution to define, as quickly as possible, an efficient NDoE process that produces as much useful information as possible with a minimal number of simulations, for complex products. The objective is to shorten both the process definition and execution steps. A knowledge-based system is proposed, based on a specific ontology and a Bayesian network, to capitalize, share, and reuse knowledge and data in order to predict the best NDoE process definition for a new product. This system is validated on a product from the automotive industry.
4

Hoock, Jean-Baptiste. "Contributions to Simulation-based High-dimensional Sequential Decision Making." Phd thesis, Université Paris Sud - Paris XI, 2013. http://tel.archives-ouvertes.fr/tel-00912338.

Abstract:
My thesis is entitled "Contributions to Simulation-based High-dimensional Sequential Decision Making". The context of the thesis is games, planning, and Markov Decision Processes. An agent interacts with its environment by successively making decisions, starting from an initial state and continuing until a final state in which the agent can no longer act. At each timestep, the agent receives an observation of the state of the environment. From this observation and its knowledge, the agent makes a decision which modifies the state of the environment; it then receives a reward and a new observation. The goal is to maximize the sum of rewards obtained during a simulation from an initial state to a final state. The policy of the agent is the function which, from the history of observations, returns a decision. We work in a context where (i) the number of states is huge, (ii) the reward carries little information, (iii) the probability of quickly reaching a good final state is weak, and (iv) prior knowledge is either nonexistent or hard to exploit. Both applications described in this thesis present these constraints: the game of Go and a 3D simulator from the European project MASH (Massive Sets of Heuristics). To make satisfactory decisions in this context, several solutions are brought: 1. simulating with the exploration/exploitation compromise (MCTS); 2. reducing complexity by local solving (GoldenEye); 3. building a policy which improves itself (RBGP); 4. learning prior knowledge (CluVo+GMCTS). Monte-Carlo Tree Search (MCTS) is the state of the art for the game of Go. From a model of the environment, MCTS incrementally and asymmetrically builds a tree of possible futures by performing Monte-Carlo simulations. The tree starts from the current observation of the agent, which switches between exploring the model and exploiting decisions that statistically give a good cumulative reward.
We discuss two ways of improving MCTS: parallelization and the addition of prior knowledge. Parallelization does not solve some weaknesses of MCTS; in particular, some local problems remain challenging. We propose an algorithm (GoldenEye) composed of two parts: detection of a local problem, then its resolution. The resolution algorithm reuses some concepts of MCTS and solves difficult problems from a classical database. Adding prior knowledge by hand is laborious and tedious, so we propose a method called Racing-based Genetic Programming (RBGP) to add prior knowledge automatically. Its strong point is that RBGP rigorously validates each addition of prior knowledge, and RBGP can be used to build a policy (instead of only optimizing an algorithm). In some applications such as MASH, simulations are too expensive in time and there is neither prior knowledge nor a model of the environment, so Monte-Carlo Tree Search cannot be used. To make MCTS usable in this context, we propose a method for learning prior knowledge (CluVo). We then use pieces of prior knowledge both to speed up the agent's learning and to build a model, from which we use an adapted version of Monte-Carlo Tree Search (GMCTS). This method solves difficult problems of MASH and gives good results in an application to a word game.
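The exploration/exploitation compromise at the heart of MCTS node selection is the UCB1 rule. A toy bandit version (my own sketch with invented arm probabilities, not the thesis code) shows how play concentrates on the best option while every option keeps being tried:

```python
import math
import random

def ucb1_bandit(arm_means, n_rounds=5000, c=1.4, seed=0):
    """UCB1: play the arm maximizing empirical mean + c*sqrt(ln t / n),
    the same score MCTS uses to pick a child node during tree descent."""
    rng = random.Random(seed)
    k = len(arm_means)
    counts = [0] * k          # times each arm was played
    sums = [0.0] * k          # cumulative reward per arm
    for t in range(n_rounds):
        if t < k:
            arm = t           # initialise every arm once
        else:
            arm = max(range(k),
                      key=lambda a: sums[a] / counts[a]
                      + c * math.sqrt(math.log(t) / counts[a]))
        reward = 1.0 if rng.random() < arm_means[arm] else 0.0
        counts[arm] += 1
        sums[arm] += reward
    return counts
```

With arms of winning probability 0.2, 0.5, and 0.8, most of the 5000 plays end up on the third arm, yet the weaker arms are still sampled, which is what keeps the search from locking onto an early lucky branch.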
5

Dinaharison, Jean Bienvenue. "Conception d’une approche spatialisée à base d’agent pour coupler les modèles mathématiques et informatiques : application à la modélisation du processus écosystémique du sol." Electronic Thesis or Diss., Sorbonne université, 2023. http://www.theses.fr/2023SORUS184.

Abstract:
Soil is a highly heterogeneous environment in which many processes interact to provide ecosystem services. Model-coupling approaches propose to define such a system using a modular architecture in which various processes, represented by models, communicate to reproduce different aspects of a phenomenon such as soil functioning. In this thesis project, we develop such an approach for the purpose of modelling soil functioning. The challenges of such a scheme lie in solving the representation problems of soil processes, which originate from the fact that models from various disciplines are reused to describe the processes. By representation problems, we mean the model description (individual-based or equation-based), the temporal execution settings, and the data resolution. These coupling constraints are addressed by a number of approaches in the literature, all of which propose satisfactory solutions in their respective application fields. In our approach, we use the agent paradigm to encapsulate the various soil processes, which then communicate through space by using the resources inside it. The behaviour of the processes thus depends on the availability of resources. A coordination problem can arise from this type of coupling, as processes may consume a resource simultaneously while the resource cannot support the demand. To overcome that, we use an action-theoretic technique called Influence-Reaction to define strategies for managing this type of situation, and we use algorithms suggested by the abundant literature to manage process temporality issues. This coupling approach was applied to a model of organic matter decomposition in which several processes (earthworms, microbes, and roots) compete for soil resources. The results suggest that our approach is suitable for modelling soil functioning, and it also gives more accurate indications of resource availability.
APA, Harvard, Vancouver, ISO, and other styles
6

Li, Haizhou. "Modeling and verification of probabilistic data-aware business processes." Thesis, Clermont-Ferrand 2, 2015. http://www.theses.fr/2015CLF22563/document.

Full text
Abstract:
Un large éventail de nouvelles applications met l’accent sur la nécessité de disposer de modèles de processus métier capables de manipuler des données imprécises ou incertaines. Du fait de la présence de données probabilistes, les comportements externes de tels processus métier sont non markoviens. Peu de travaux dans la littérature se sont intéressés à la vérification de tels systèmes. Ce travail de thèse étudie les questions de modélisation et d’analyse de ce type de processus métier. Il utilise comme modèle formel pour décrire les comportements des processus métier un système de transitions étiquetées dans lequel les transitions sont gardées par des conditions définies sur une base de données probabiliste. Il propose ensuite une approche de décomposition de ces processus qui permet de tester la relation de simulation entre processus dans ce contexte. Une analyse de complexité révèle que le problème de test de simulation est dans 2-EXPTIME, et qu’il est EXPTIME-difficile en termes de complexité d’expression, alors que du point de vue de la complexité en termes des données, il n’engendre pas de surcoût supplémentaire par rapport au coût de l’évaluation de requêtes booléennes sur des bases de données probabilistes. L’approche proposée est ensuite étendue pour permettre la vérification de propriétés exprimées dans les logiques P-LTL et P-CTL. Finalement, un prototype, nommé ‘PRODUS’, a été implémenté et utilisé dans le cadre d’une application liée aux systèmes d’information géographiques pour montrer la faisabilité de l’approche proposée
There is a wide range of new applications that stress the need for business process models able to handle imprecise data. This thesis studies the underlying modelling and analysis issues. As a formal model to describe process behaviours, it uses a labelled transition system in which transitions are guarded by conditions defined over a probabilistic database. To tackle verification problems, we decompose this model into a set of traditional automata associated with probabilities, called world-partition automata. Next, this thesis presents an approach for testing the probabilistic simulation preorder in this context. A complexity analysis reveals that the problem is in 2-EXPTIME and is EXPTIME-hard w.r.t. expression complexity, while it matches probabilistic query evaluation w.r.t. data complexity. Then P-LTL and P-CTL model checking methods are studied to verify this model; in this context, the complexity of P-LTL and P-CTL model checking is in EXPTIME. Finally, a modeling and verification prototype called "PRODUS" is introduced, and we use our approach to model a realistic scenario in the domain of GIS (geographic information systems).
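The formal model described above — transitions guarded by conditions over a probabilistic database — can be illustrated in miniature. The sketch below is a hypothetical simplification (a tuple-independent database, guard = tuple presence, and plain reachability instead of the thesis's simulation preorder): it computes the probability of a behaviour by enumerating the possible worlds of the database.

```python
from itertools import product

# Tuple-independent probabilistic database: tuple id -> marginal probability.
db = {"t1": 0.7, "t2": 0.4}

# Transitions of the labelled transition system: (source, guard tuple, target).
transitions = [("s0", "t1", "s1"), ("s0", "t2", "s2"), ("s1", "t2", "s3")]


def worlds(db):
    """Enumerate every possible world (subset of tuples) with its probability."""
    keys = list(db)
    for bits in product([True, False], repeat=len(keys)):
        world = {k for k, present in zip(keys, bits) if present}
        p = 1.0
        for k, present in zip(keys, bits):
            p *= db[k] if present else 1 - db[k]
        yield world, p


def reaches(world, start, goal):
    """In a fixed world, only transitions whose guard tuple is present fire."""
    frontier, seen = [start], set()
    while frontier:
        state = frontier.pop()
        if state == goal:
            return True
        seen.add(state)
        frontier += [dst for (src, guard, dst) in transitions
                     if src == state and guard in world and dst not in seen]
    return False


# Probability that the process can reach s3: sum over worlds where it does.
prob = sum(p for world, p in worlds(db) if reaches(world, "s0", "s3"))
print(round(prob, 4))  # needs both t1 and t2 present: 0.7 * 0.4 = 0.28
```

Enumerating worlds is exponential, which is in keeping with the hardness results quoted in the abstract; the thesis's world-partition automata exist precisely to organize this blow-up.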
APA, Harvard, Vancouver, ISO, and other styles
7

Sirin, Göknur. "Supporting multidisciplinary vehicle modeling : towards an ontology-based knowledge sharing in collaborative model based systems engineering environment." Thesis, Châtenay-Malabry, Ecole centrale de Paris, 2015. http://www.theses.fr/2015ECAP0024/document.

Full text
Abstract:
Les systèmes industriels (automobile, aérospatial, etc.) sont de plus en plus complexes à cause des contraintes économiques et écologiques. Cette complexité croissante impose des nouvelles contraintes au niveau du développement. La question de la maitrise de la capacité d’analyse de leurs architectures est alors posée. Pour résoudre cette question, les outils de modélisation et de simulation sont devenus une pratique courante dans les milieux industriels afin de comparer les multiples architectures candidates. Ces outils de simulations sont devenus incontournables pour conforter les décisions. Pourtant, la mise en œuvre des modèles physiques est de plus en plus complexe et nécessite une compréhension spécifique de chaque phénomène simulé ainsi qu’une description approfondie de l’architecture du système, de ses composants et des liaisons entre composants. L’objectif de cette thèse est double. Le premier concerne le développement d’une méthodologie et des outils nécessaires pour construire avec précision les modèles de simulation des architectures de systèmes qu’on désire étudier. Le deuxième s’intéresse à l’introduction d’une approche innovante pour la conception, la production et l’intégration des modèles de simulations en mode « plug and play » afin de garantir la conformité des résultats aux attentes, notamment aux niveaux de la qualité et de la maturité. Pour accomplir ces objectifs, des méthodologies et des processus d’ingénierie des systèmes basés sur les modèles (MBSE) ainsi que les systèmes d’information ont été utilisés. Ce travail de thèse propose pour la première fois un processus détaillé et un outil pour la conception des modèles de simulation. Un référentiel commun nommé « Modèle de carte d'identité (MIC) » a été développé pour standardiser et renforcer les interfaces entre les métiers et les fournisseurs sur les plans organisationnels et techniques. 
MIC garantit l’évolution et la gestion de la cohérence de l’ensemble des règles et les spécifications des connaissances des domaines métiers dont la sémantique est multiple. MIC renforce également la cohérence du modèle et réduit les anomalies qui peuvent interférer pendant la phase dite IVVQ pour Intégration, Vérification, Validation, Qualification. Finalement, afin de structurer les processus de conception des modèles de simulation, le travail s’est inspiré des cadres de l’Architecture d’Entreprise en reflétant les exigences d’intégration et de standardisation du modèle opératoire de l’entreprise. Pour valider les concepts introduits dans le cadre de cette thèse, des études de cas tirés des domaines automobile et aérospatiale ont été réalisées. L'objectif de cette validation est d'observer l'amélioration significative du processus actuel en termes d'efficacité, de réduction de l'ambiguïté et des malentendus dans la modélisation et la simulation du système à concevoir
Simulation models are widely used by industries as an aid for decision making to explore and optimize a broad range of complex industrial systems' architectures. The increased complexity of industrial systems (cars, airplanes, etc.) and ecological and economic concerns imply a need to explore and analyse innovative system architectures efficiently and effectively by using simulation models. However, simulation designers currently suffer from limitations which make simulation models difficult to design and develop in a collaborative, multidisciplinary design environment. The multidisciplinary nature of simulation models requires a specific understanding of each phenomenon to simulate and a thorough description of the system architecture, its components and the connections between components. To accomplish these objectives, Model-Based Systems Engineering (MBSE) and Information Systems (IS) methodologies were used to support the simulation designer's analysis capabilities in terms of methods, processes and design tool solutions. The objective of this thesis is twofold. The first concerns the development of a methodology and tools to build accurate simulation models. The second focuses on the introduction of an innovative approach to design, produce and integrate simulation models in a "plug and play" manner while ensuring the expected model fidelity. Today, one of the major challenges in full-vehicle simulation model creation is to obtain domain-level simulation models from different domain experts while detecting any potential inconsistency before the IVVQ (Integration, Verification, Validation, and Qualification) phase. In the current simulation model development process, most defects, such as interface mismatches and interoperability problems, are discovered late, during the IVVQ phase. 
This may create multiple wastes, including rework and, maybe most harmful, incorrect simulation models, which are subsequently used as a basis for design decisions. To address this problem, this work aims to reduce late inconsistency detection by ensuring early-stage collaboration between the different suppliers and the OEM. Thus, this work first integrates a Detailed Model Design Phase into the current model development process and, second, re-organizes and delegates roles between design actors. Finally, an alternative architecture design tool is supported by an ontology-based DSL (Domain Specific Language) called the Model Identity Card (MIC). The design tools and the mentioned activity perspectives (e.g. decisions, views and viewpoints) are structured by inspiration from Enterprise Architecture Frameworks. To demonstrate the applicability of our proposed solution, engine after-treatment, hybrid parallel propulsion and electric transmission models are tested across the automotive and aeronautic industries.
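A Model Identity Card of the kind described above can be pictured as a small declared interface per supplier model, so that port mismatches are caught before the IVVQ phase rather than during it. The sketch below is purely illustrative; the field names, port names, and units are invented, not the thesis's actual MIC schema.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Port:
    """One declared connection point of a simulation model."""
    name: str
    unit: str   # physical unit the port expects or produces
    kind: str   # "input" or "output"


@dataclass(frozen=True)
class ModelIdentityCard:
    """Minimal identity card: a model name plus its declared ports."""
    model: str
    ports: tuple


def check_connection(out_mic, out_port, in_mic, in_port):
    """Early consistency check: an output port may feed an input port
    only if both sides declare the same unit."""
    src = next(p for p in out_mic.ports if p.name == out_port and p.kind == "output")
    dst = next(p for p in in_mic.ports if p.name == in_port and p.kind == "input")
    return src.unit == dst.unit


engine = ModelIdentityCard("engine", (Port("exhaust_temp", "K", "output"),))
after_treatment = ModelIdentityCard("after_treatment", (Port("inlet_temp", "degC", "input"),))

# Kelvin vs. Celsius: the mismatch is flagged at design time, not at integration.
print(check_connection(engine, "exhaust_temp", after_treatment, "inlet_temp"))
```

Checks like this one are cheap to run whenever a supplier delivers a model, which is the point of moving interface validation ahead of IVVQ.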
APA, Harvard, Vancouver, ISO, and other styles
8

Prodel, Martin. "Modélisation automatique et simulation de parcours de soins à partir de bases de données de santé." Thesis, Lyon, 2017. http://www.theses.fr/2017LYSEM009/document.

Full text
Abstract:
Les deux dernières décennies ont été marquées par une augmentation significative des données collectées dans les systèmes d'informations. Cette masse de données contient des informations riches et peu exploitées. Cette réalité s’applique au secteur de la santé où l'informatisation est un enjeu pour l’amélioration de la qualité des soins. Les méthodes existantes dans les domaines de l'extraction de processus, de l'exploration de données et de la modélisation mathématique ne parviennent pas à gérer des données aussi hétérogènes et volumineuses que celles de la santé. Notre objectif est de développer une méthodologie complète pour transformer des données de santé brutes en modèles de simulation des parcours de soins cliniques. Nous introduisons d'abord un cadre mathématique dédié à la découverte de modèles décrivant les parcours de soin, en combinant optimisation combinatoire et Process Mining. Ensuite, nous enrichissons ce modèle par l’utilisation conjointe d’un algorithme d’alignement de séquences et de techniques classiques de Data Mining. Notre approche est capable de gérer des données bruitées et de grande taille. Enfin, nous proposons une procédure pour la conversion automatique d'un modèle descriptif des parcours de soins en un modèle de simulation dynamique. Après validation, le modèle obtenu est exécuté pour effectuer des analyses de sensibilité et évaluer de nouveaux scénarios. Un cas d’étude sur les maladies cardiovasculaires est présenté, avec l’utilisation de la base nationale des hospitalisations entre 2006 et 2015. La méthodologie présentée dans cette thèse est réutilisable dans d'autres aires thérapeutiques et sur d'autres sources de données de santé
During the last two decades, the amount of data collected in Information Systems has drastically increased. This large amount of data is highly valuable. This reality applies to health-care where the computerization is still an ongoing process. Existing methods from the fields of process mining, data mining and mathematical modeling cannot handle large-sized and variable event logs. Our goal is to develop an extensive methodology to turn health data from event logs into simulation models of clinical pathways. We first introduce a mathematical framework to discover optimal process models. Our approach shows the benefits of combining combinatorial optimization and process mining techniques. Then, we enrich the discovered model with additional data from the log. An innovative combination of a sequence alignment algorithm and of classical data mining techniques is used to analyse path choices within long-term clinical pathways. The approach is suitable for noisy and large logs. Finally, we propose an automatic procedure to convert static models of clinical pathways into dynamic simulation models. The resulting models perform sensitivity analyses to quantify the impact of determinant factors on several key performance indicators related to care processes. They are also used to evaluate what-if scenarios. The presented methodology was proven to be highly reusable on various medical fields and on any source of event logs. Using the national French database of all the hospital events from 2006 to 2015, an extensive case study on cardiovascular diseases is presented to show the efficiency of the proposed framework
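The discover-then-simulate pipeline summarized above can be illustrated in miniature (the event log and activity names below are invented, and where the thesis combines combinatorial optimization with sequence alignment, this sketch only counts directly-follows transitions and replays them stochastically):

```python
import random
from collections import Counter, defaultdict

# A toy event log: each trace is one patient's ordered sequence of activities.
log = [
    ["admit", "exam", "surgery", "discharge"],
    ["admit", "exam", "discharge"],
    ["admit", "exam", "surgery", "icu", "discharge"],
]

# Discovery step: count directly-follows transitions across all traces.
counts = defaultdict(Counter)
for trace in log:
    for a, b in zip(trace, trace[1:]):
        counts[a][b] += 1


def simulate(start="admit", end="discharge", rng=random.Random(0)):
    """Simulation step: replay the discovered model as a Markov chain,
    choosing each next activity with probability proportional to its count."""
    path, state = [start], start
    while state != end:
        successors = counts[state]
        state = rng.choices(list(successors), weights=list(successors.values()))[0]
        path.append(state)
    return path


print(simulate())
```

Once the transition weights come from a real log rather than a toy one, the same replay loop supports the what-if analyses mentioned in the abstract: alter a weight, re-simulate, and compare the resulting pathway statistics.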
APA, Harvard, Vancouver, ISO, and other styles
9

Michel, Thierry. "Test en ligne des systèmes à base de microprocesseur." Phd thesis, Grenoble INPG, 1993. http://tel.archives-ouvertes.fr/tel-00343488.

Full text
Abstract:
Cette thèse traite de la vérification en ligne, par des moyens matériels, du flot de contrôle d'un système à base de microprocesseur. Une technique de compaction est utilisée pour faciliter cette vérification (analyse de signature). La plupart des méthodes proposées jusqu'ici imposent une modification du programme d'application, afin d'introduire dans celui-ci des propriétés invariantes (la signature en chaque point de l'organigramme est indépendante des chemins préalablement parcourus). Les méthodes proposées ici, au contraire, ont comme caractéristique principale de ne pas modifier le programme vérifié et utilisent un dispositif de type processeur, disposant d'une mémoire locale, pour assurer l'invariance de la signature. Deux méthodes sont ainsi décrites. La première est facilement adaptable à différents microprocesseurs et présente une efficacité qui la place parmi les meilleures méthodes proposées jusqu'ici. La seconde méthode a été dérivée de la première dans le but de diminuer la quantité d'informations nécessaire au test. Cette dernière méthode a été implantée sur un prototype d'unité centrale d'automate programmable (avec la société Télémécanique) et son efficacité a été évaluée par des expériences d'injection de fautes. Le coût d'implantation, particulièrement faible dans le cas du prototype réalisé, peut permettre d'envisager une évolution de celui-ci vers un produit industriel.
APA, Harvard, Vancouver, ISO, and other styles
10

EUGENE, REMI. "Etude architecturale, modelisation et realisation d'un processeur a base de gapp pour le traitement d'images temps reel et la simulation par automate cellulaire." Université Louis Pasteur (Strasbourg) (1971-2008), 1990. http://www.theses.fr/1990STR13043.

Full text
Abstract:
Le traitement bas niveau d'images requiert des structures particulières de processeurs dès lors que des contraintes temps réel interviennent. Nous étudions dans cette thèse l'adéquation des processeurs à base d'une unité de calcul composée de plusieurs milliers d'unités de calcul bit-série utilisées en mode de fonctionnement SIMD et interconnectées par un réseau à maille carrée. L'étude porte successivement sur une classification des structures d'intégration de telles unités de calcul, sur une modélisation des performances des solutions d'intégration et sur une comparaison des modèles pour diverses classes d'algorithmes. Cette étude est suivie de la description d'un processeur réalisé à base de GAPP et de la description d'une réalisation en cours à base d'ELSA.
APA, Harvard, Vancouver, ISO, and other styles

Books on the topic "Simulation à base de processus"

1

Chang, Hyeong Soo, Jiaqiao Hu, Michael C. Fu, and Steven I. Marcus. Simulation-based Algorithms for Markov Decision Processes. London: Springer London, 2007. http://dx.doi.org/10.1007/978-1-84628-690-2.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Chang, Hyeong Soo, Jiaqiao Hu, Michael C. Fu, and Steven I. Marcus. Simulation-Based Algorithms for Markov Decision Processes. London: Springer London, 2013. http://dx.doi.org/10.1007/978-1-4471-5022-0.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Soo, Chang Hyeong, ed. Simulation-based algorithms for Markov decision processes. London: Springer, 2007.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
4

Chang, Hyeong Soo. Simulation-Based Algorithms for Markov Decision Processes. 2nd ed. London: Springer London, 2013.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
5

Yuktadatta, Panurit. Simulation of a parallel processor based small tactical system. Monterey, Calif: Naval Postgraduate School, 1991.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
6

Autotech 1991 (1991 : Birmingham). Simultaneous engineering, simulation and data base models. [London]: Institution of Mechanical Engineers, 1991.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
7

Autotech 1991 (1991 : Birmingham). Simultaneous engineering, simulation and data base models. [London]: Institution of Mechanical Engineers, 1991.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
8

Shedler, G. S. Regenerative stochastic simulation. Boston: Academic Press, 1993.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
9

United States. General Accounting Office. National Security and International Affairs Division. Base closures. [Washington, D.C.]: The Office, 1992.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
10

Markov chain Monte Carlo simulations and their statistical analysis: With web-based Fortran code. Hackensack, NJ: World Scientific, 2004.

Find full text
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "Simulation à base de processus"

1

Chenevier, P., G. Kamarinos, and G. Pananakakis. "Semianalytical Universal Simulation of the Electrical Properties of the Permeable Base Transistor." In Simulation of Semiconductor Devices and Processes, 309–12. Vienna: Springer Vienna, 1993. http://dx.doi.org/10.1007/978-3-7091-6657-4_76.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Molzer, Wolfgang. "Ge Profile for Minimum Neutral Base Transit Time in Si/Si1-yGey Heterojunction Bipolar Transistors." In Simulation of Semiconductor Devices and Processes, 102–5. Vienna: Springer Vienna, 1995. http://dx.doi.org/10.1007/978-3-7091-6619-2_24.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Mouis, M., H. J. Gregory, S. Denorme, D. Mathiot, P. Ashburn, D. J. Robbins, and J. L. Glasper. "Physical Modeling of the Enhanced Diffusion of Boron Due to Ion Implantation in Thin Base npn Bipolar Transistors." In Simulation of Semiconductor Devices and Processes, 141–44. Vienna: Springer Vienna, 1993. http://dx.doi.org/10.1007/978-3-7091-6657-4_34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Marcon, J., S. Gautier, S. Koumetz, K. Ketata, and M. Ketata. "Simulation of Be diffusion in the base layer of InGaAs/InP Heterojunction Bipolar Transistors." In Simulation of Semiconductor Processes and Devices 1998, 243–46. Vienna: Springer Vienna, 1998. http://dx.doi.org/10.1007/978-3-7091-6827-1_61.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Krallics, György, Stan T. Mandziej, and György Ziaja. "Determination of Thermal-Mechanical Properties of Aluminium Base PM Material for Computer Simulation of Manufacturing Process." In Microstructures, Mechanical Properties and Processes - Computer Simulation and Modelling, 178–83. Weinheim, FRG: Wiley-VCH Verlag GmbH & Co. KGaA, 2005. http://dx.doi.org/10.1002/3527606157.ch28.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Sulis, Emilio, and Kuldar Taveter. "The Analysis of Business Processes." In Agent-Based Business Process Simulation, 13–35. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-98816-6_2.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Kümpel, Michaela, Christian A. Mueller, and Michael Beetz. "Semantic Digital Twins for Retail Logistics." In Dynamics in Logistics, 129–53. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-88662-2_7.

Full text
Abstract:
As digitization advances, stationary retail is increasingly enabled to develop novel retail services aimed at enhancing the efficiency of business processes, ranging from in-store logistics to customer shopping experiences. In contrast to online stores, stationary retail digitization demands an integration of various data, such as location information, product information, or semantic information, in order to offer services such as customer shopping assistance, product placement recommendations, or robotic store assistance. We introduce the semantic Digital Twin (semDT) as a semantically enhanced virtual representation of a retail store environment, connecting a symbolic knowledge base with a scene graph. The ontology-based symbolic knowledge base incorporates various interchangeable knowledge sources, allowing for complex reasoning tasks that enhance daily processes in retail business. The scene graph provides a realistic 3D model of the store, which is enhanced with semantic information about the store, its shelf layout, and the contained products. Thereby, the semDT knowledge base can be reasoned about, visualized, and simulated in applications ranging from the web to robot systems. The semDT is demonstrated in three use cases showcasing disparate platforms interacting with it: optimization of product replenishment; customer support using AR applications; and retail store visualization and simulation in a virtual environment.
APA, Harvard, Vancouver, ISO, and other styles
8

Beygi, Reza, Eduardo Marques, and Lucas F. M. da Silva. "Data Based Simulation." In Computational Concepts in Simulation of Welding Processes, 85–94. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-97910-2_5.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Sulis, Emilio, and Kuldar Taveter. "Beyond Process Simulation." In Agent-Based Business Process Simulation, 175–82. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-98816-6_9.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Sulis, Emilio, and Kuldar Taveter. "Introducing Agent-Based Simulation for the Business Processes." In Agent-Based Business Process Simulation, 3–12. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-98816-6_1.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Simulation à base de processus"

1

Zhou, ChengHan, and WeiDong Wang. "Highway Bridge Construction Process Simulation Base on 4D Visualization." In GeoHunan International Conference 2009. Reston, VA: American Society of Civil Engineers, 2009. http://dx.doi.org/10.1061/41042(349)18.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Kopp, R., M. Tschirnich, M. Wolske, and J. Klöwer. "Designing Hot Working Processes of Nickel Base Superalloys Using FEM Simulation." In ASME Turbo Expo 2001: Power for Land, Sea, and Air. American Society of Mechanical Engineers, 2001. http://dx.doi.org/10.1115/2001-gt-0429.

Full text
Abstract:
Knowledge of correct flow stress curves of Ni-based alloys at high temperatures is of essential importance for reliable plasto-mechanical simulations in materials processing and for effective planning and design of industrial hot forming schedules such as hot rolling or forging. The experiments are performed on a computer-controlled servo-hydraulic testing machine at IBF (Institute of Metal Forming). To avoid inhomogeneous deformation due to the influence of friction and initial microstructure, a suitable specimen geometry and lubricant are used, and a thermal treatment before testing provides a microstructure similar to that of the material in the real process. The compression tests are performed within a furnace, which keeps the sample, tools and surrounding atmosphere at the defined forming temperature. The uniaxial compressions were carried out in the range of strain rates between 0.001 and 50 s−1 and temperatures between 950 and 1280°C. Furthermore, two-stage step tests are carried out to derive the work hardening and softening behaviour as well as the recrystallisation kinetics of the selected Ni-based alloys. At the end of this work, a material model is fitted to the previously determined material data. This model is integrated into the Finite Element program LARSTRAN/SHAPE to calculate a forging process of the material Alloy 617.
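Flow stress curves of the kind determined in this paper are typically encoded for the FEM solver as a phenomenological constitutive law in strain, strain rate, and temperature. The sketch below uses a generic power-law form with invented coefficients for illustration only; it is not the Alloy 617 model fitted in the paper.

```python
import math


def flow_stress(strain, strain_rate, temp_kelvin,
                A=500.0, n=0.15, m=0.12, beta=3000.0):
    """Generic phenomenological flow-stress law (illustrative coefficients):
    sigma = A * strain^n * strain_rate^m * exp(beta / T).
    Stress rises with strain (hardening, exponent n) and strain rate
    (rate sensitivity, exponent m), and falls with temperature."""
    return A * strain**n * strain_rate**m * math.exp(beta / temp_kelvin)


# Evaluate within the tested range (950-1280 degC, 0.001-50 1/s):
sigma = flow_stress(strain=0.2, strain_rate=1.0, temp_kelvin=1373.0)
print(round(sigma, 1))
```

A law in this separable form is easy to fit to compression-test data by linear regression on its logarithm, which is one reason such forms are common inputs to hot-forming FEM codes.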
APA, Harvard, Vancouver, ISO, and other styles
3

Matsui, T., H. Takizawa, and H. Kikuchi. "Numerical Simulation of Ring Rolling Process for Ni-Base Articles." In Superalloys. TMS, 2004. http://dx.doi.org/10.7449/2004/superalloys_2004_907_915.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Park, Hong-Hyun, Chihak Ahn, Woosung Choi, Keun-Ho Lee, and Youngkwan Park. "Multiscale strain simulation for semiconductor devices base on the valence force field and the finite element methods." In 2015 International Conference on Simulation of Semiconductor Processes and Devices (SISPAD). IEEE, 2015. http://dx.doi.org/10.1109/sispad.2015.7292246.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Kostogryzov, A., P. Stepanov, A. Nistratov, G. Nistratov, O. Atakishchev, and V. Kiselev. "Risks Prediction and Processes Optimization for Complex Systems on the Base of Probabilistic Modeling." In 2016 International Conference on Applied Mathematics, Simulation and Modelling. Paris, France: Atlantis Press, 2016. http://dx.doi.org/10.2991/amsm-16.2016.43.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Piech, Henryk, Grzegorz Grodzki, and Aleksandra Ptak. "Parallel Simulation of Dynamic Communication Processes on the Base of Probability Time Automata." In 4th International Conference on Simulation and Modeling Methodologies, Technologies and Applications. SCITEPRESS - Science and Technology Publications, 2014. http://dx.doi.org/10.5220/0005102702370242.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Pei, J., G. Wang, B. Rong, and L. Yun. "Dynamics modeling and simulation on single-base propellant deterrent coating process." In 1st International Conference on Mechanical System Dynamics (ICMSD 2022). Institution of Engineering and Technology, 2022. http://dx.doi.org/10.1049/icp.2022.1811.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Kundas, S., and A. Ilyushenko. "Computer Simulation and Control of Plasma Spraying Processes." In ITSC2001, edited by Christopher C. Berndt, Khiam A. Khor, and Erich F. Lugscheider. ASM International, 2001. http://dx.doi.org/10.31399/asm.cp.itsc2001p0925.

Full text
Abstract:
Abstract An integrated technological complex for the computer simulation, experimental study and practical realization of plasma spraying processes was created. The main features of the development are: the possibility of simulating all plasma spraying stages (motion and heating of particles in the plasma jet, coating structure formation, heat transfer and the stress-strain state of the coating-substrate system) separately or in an end-to-end mode; a database with material and gas properties; the possibility of experimentally measuring the main plasma spraying parameters and entering them into the computer in real time; graphic visualization of simulation and measurement results; computer control of the plasma spraying equipment; and fabrication of initial lots of products according to the developed technology. The complex is designed on the base of VPS «Plasmatechnik» equipment. Experimental measurements of particle temperature, velocity and coating temperature were conducted with an original optoelectronic system. All technological and measurement equipment is connected to the computer with special interfaces and works under computer control.
APA, Harvard, Vancouver, ISO, and other styles
9

Siegele, Dieter, and Marcus Brand. "Numerical Simulation of Residual Stresses Due to Cladding Process." In ASME 2007 Pressure Vessels and Piping Conference. ASMEDC, 2007. http://dx.doi.org/10.1115/pvp2007-26586.

Full text
Abstract:
The inner surface of reactor pressure vessels is protected against corrosion by an austenitic cladding. Generally, the cladding is welded on the ferritic base metal with two layers to avoid sub-clad cracks and to improve the microstructure of the cladding material. On the other hand, due to the cladding process and the difference of the thermal expansion coefficient of the austenitic cladding and the ferritic base material residual stresses act in the component. This residual stress field is important for assessing crack postulates in the cladding or subclad flaws in the base metal. For the determination of the residual stress field, plates of RPV steel were cladded and heat treated representative to the RPV relevant conditions. During the cladding process the temperature and distortion were measured as basis for the validation of the finite element simulations. The numerical simulation was performed with the finite element code SYSWELD. The heat source of the model was calibrated on the measured temperature profile. In the analysis, the temperature dependent material properties as well as the transformation behavior of the ferritic base metal were taken into account. The calculated residual stresses show tensile stresses in the cladding followed by compressive stresses in the base metal that are in agreement with measurements with X-ray diffraction technique.
APA, Harvard, Vancouver, ISO, and other styles
10

Marqués Valderrama, Israel, Ricardo Chacartegui Ramirez, Jose Antonio Becerra Villanueva, Carlos Ortiz Dominguez, and Diego Antonio Rodríguez Pastor. "Efficiency Analysis in the Chemical Looping Process of Base Metals." In 36th International Conference on Efficiency, Cost, Optimization, Simulation and Environmental Impact of Energy Systems (ECOS 2023). Las Palmas De Gran Canaria, Spain: ECOS 2023, 2023. http://dx.doi.org/10.52202/069564-0212.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "Simulation à base de processus"

1

Boysen-Urban, Kirsten, Hans Grinsted Jensen, and Martina Brockmeier. Extending the GTAP Data Base and Model to Cover Domestic Support Issues using the EU as Example. GTAP Technical Paper, June 2014. http://dx.doi.org/10.21642/gtap.tp35.

Full text
Abstract:
The EU Single Farm Payment (SFP) is currently distributed in proportion to primary factor shares in version 8 of the GTAP database. In this paper, we investigate whether this way of modeling the EU SFP makes a difference in analyzing agricultural policy reforms. To do so, we create alternative versions of the GTAP database to compare the effects with the default setting in GTAP. Employing OECD data, along with the GTAP framework, we vary the assumptions about the allocation of the SFP. In the process, we demonstrate how to alter and update the GTAP database to implement domestic support of OECD PSE tables. We provide a detailed overview supplemented with assumptions of payment allocation, shock calculations and in particular, the Altertax procedure to update value flows and price equations extended in the GTAP model. Subsequently, we illustrate the impact of those assumptions by simulating a 100% removal of the SFP using the deviating versions of GTAP database. This sensitivity analysis reveals strong differences in results, but particularly in production responses of food and agricultural sectors that decrease with an increasing degree of decoupling. Furthermore, our analysis shows that the effect on welfare and the trade balance decreases with an increasing degree of decoupling. This experiment shows that the allocation of the SFP can have strong impacts on simulation results.
APA, Harvard, Vancouver, ISO, and other styles
2

Bäumler, Maximilian, Madlen Ringhand, Christian Siebke, Marcus Mai, Felix Elrod, and Günther Prokop. Report on validation of the stochastic traffic simulation (Part B). Technische Universität Dresden, 2021. http://dx.doi.org/10.26128/2021.243.

Abstract:
This document is intended to give an overview of the validation of the human subject study conducted in the driving simulator of the Chair of Traffic and Transportation Psychology (Verkehrspsychologie – VPSY) of the Technische Universität Dresden (TUD), as well as of the validation of the stochastic traffic simulation developed in the AutoDrive project by the Chair of Automotive Engineering (Lehrstuhl Kraftfahrzeugtechnik – LKT) of TUD. Furthermore, the evaluation process of a C-AEB (Cooperative Automatic Emergency Brake) system is demonstrated. The main purpose was to compare the driving behaviour of the study participants and of the agents in the traffic simulation with real-world data. Based on relevant literature, a validation concept was designed and real-world data was collected using drones and stationary cameras. By means of qualitative and quantitative analysis it could be shown that the driving simulator study exhibits realistic driving behaviour in terms of mean speed. Moreover, the stochastic traffic simulation already reflects reality in terms of the mean and maximum speed of the agents. Finally, the evaluation proved the suitability of the developed stochastic simulation for the assessment process. Furthermore, it could be shown that a C-AEB system improves traffic safety for the chosen test scenarios.
3

Kompaniets, Alla, Hanna Chemerys, and Iryna Krasheninnik. Using 3D modelling in design training simulator with augmented reality. [n.p.], February 2020. http://dx.doi.org/10.31812/123456789/3740.

Abstract:
The article is devoted to the theoretical consideration and use of innovative technologies in the educational process of secondary education establishments, in particular in the school computer science course. The main advantages of using educational simulators in the educational process are considered, based on the new state standard of basic and complete general secondary education. Based on an analysis of the scientific and methodological literature and network sources, the features of developing simulators for educational purposes are described. Innovative tools for simulator development are investigated, such as augmented reality combined with three-dimensional modelling. The peculiarities of using an augmented reality simulator when studying the topic of algorithmization in the school computer science course are considered. The article also describes the implementation of an augmented reality simulator for the formation of students' algorithmic thinking skills, presents the development results, and describes the functionality of the software product. As a further prospect of the study, an experimental investigation is planned to determine the effectiveness of using the developed software in the learning process.
4

Mantock, James M., and Michael T. Gately. Casualty Handling Simulation Using the Scenario-based Engineering Process. Fort Belvoir, VA: Defense Technical Information Center, February 2000. http://dx.doi.org/10.21236/ada375590.

5

Butler, Brett, Tony Valle, and Jim Watson. Advanced Distributed Simulation Technology. SIMWORLD Data Base. Fort Belvoir, VA: Defense Technical Information Center, April 1994. http://dx.doi.org/10.21236/ada280262.

6

Voloshynov, Serhii A., Felix M. Zhuravlev, Ivan M. Riabukha, Vitaliy V. Smolets, and Halyna V. Popova. Application of VR technologies in building future maritime specialists' professional competences. [n.p.], July 2021. http://dx.doi.org/10.31812/123456789/4623.

Abstract:
The progress of modern digital technologies has increased the amount of research on the implementation and use of VR technologies in the educational process of higher educational establishments. The article provides an analysis of best practices in the application of simulation technologies in maritime education. The absence of national research experience and of an evidence base for the efficiency of the new VR simulators leaves this issue open to investigation of their performance effectiveness. The article proposes an overview of the advantages of implementing VR technologies aimed at building and shaping future maritime specialists' professional competences. The authors investigate the potential of the interactive and representative capabilities of immersive digital technologies during the educational process at maritime educational establishments. The problem of integrating VR technologies into the education and training of future seafarers is highlighted, as well as the possibility of using virtual courses in the training of future maritime specialists. The article reveals the prognostic validity of VR simulators used for building professional competences.
7

May, J., R. Chen, D. Jefferson, J. Leek, I. Kaplan, and J. Tannahill. Petascale Simulation Initiative Tech Base: FY2007 Final Report. Office of Scientific and Technical Information (OSTI), October 2007. http://dx.doi.org/10.2172/923105.

8

Kurec, Aleksander M., and Harry J. Zywiol. Motion Base Simulation Test of the M101A2 Trailer. Fort Belvoir, VA: Defense Technical Information Center, March 1991. http://dx.doi.org/10.21236/ada252417.

9

Bauer, Andrew, James Forsythe, Jayanarayanan Sitaraman, Andrew Wissink, Buvana Jayaraman, and Robert Haehnel. In situ analysis and visualization to enable better workflows with CREATE-AV™ Helios. Engineer Research and Development Center (U.S.), June 2021. http://dx.doi.org/10.21079/11681/40846.

Abstract:
The CREATE-AV™ Helios CFD simulation code has been used to accurately predict rotorcraft performance under a variety of flight conditions. The Helios package contains a suite of tools that covers almost the entire set of functionality needed for a variety of workflows. These workflows include tools customized to properly specify many in situ analysis and visualization capabilities appropriate for rotorcraft analysis. In situ is the process of computing analysis and visualization information during a simulation run before data is saved to disk. In situ has been referred to by a variety of terms including co-processing, covisualization, coviz, etc. In this paper we describe the customization of the pre-processing GUI and the corresponding development of the Helios solver code base to effectively implement in situ analysis and visualization to reduce file I/O and speed up workflows for CFD analysts. We showcase how the workflow enables the wide variety of Helios users to effectively work in post-processing tools they are already familiar with, as opposed to forcing them to learn new tools in order to post-process the in situ data extracts produced by Helios. These data extracts include various sources of information customized to Helios, such as knowledge about the near- and off-body grids, internal surface extracts with patch information, and volumetric extracts meant for fast post-processing of data. Additionally, we demonstrate how in situ can be used by workflow automation tools to help convey information to the user that would be much more difficult to obtain when using full data dumps.
10

Straatsma, TP, J. A. McCammon, John H. Miller, Paul E. Smith, Erich R. Vorpagel, Chung F. Wong, and Martin W. Zacharias. Biomolecular Simulation of Base Excision Repair and Protein Signaling. Office of Scientific and Technical Information (OSTI), March 2006. http://dx.doi.org/10.2172/877558.
