Dissertations / Theses on the topic 'Inférence basée sur simulation'
Consult the top 50 dissertations / theses for your research on the topic 'Inférence basée sur simulation.'
Rouillard, Louis. "Bridging Simulation-based Inference and Hierarchical Modeling : Applications in Neuroscience." Electronic Thesis or Diss., université Paris-Saclay, 2024. http://www.theses.fr/2024UPASG024.
Neuroimaging investigates the brain's architecture and function using magnetic resonance imaging (MRI). To make sense of the complex observed signal, neuroscientists posit explanatory models governed by interpretable parameters. This thesis tackles statistical inference: determining which parameters could have yielded the signal through the model. Inference in neuroimaging is complicated by at least three hurdles: high dimensionality, large uncertainty, and the hierarchical structure of the data. We look into variational inference (VI) as an optimization-based method suited to this regime. Specifically, we combine structured stochastic VI and normalizing flows (NFs) to design expressive yet scalable variational families. We apply these techniques in diffusion and functional MRI, on tasks including individual parcellation, microstructure inference, and directional coupling estimation. Through these applications, we underline the interplay between the forward and reverse Kullback-Leibler (KL) divergences as complementary tools for inference. We also demonstrate the ability of automatic VI (AVI) to serve as a reliable and scalable inference method for the challenges of model-driven neuroscience.
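The core mechanism this abstract describes (fitting a variational family to a target by stochastic gradient descent on a KL divergence) can be sketched on a toy example. Everything below is an illustrative assumption, not the thesis's actual models: the target is a one-dimensional Gaussian and the variational family a single Gaussian rather than a normalizing flow.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical target "posterior": an unnormalized Gaussian N(3, 0.5^2).
a, b = 3.0, 0.5

# Variational family q = N(mu, sigma^2), fitted by stochastic gradient
# descent on the reverse KL(q || p) using the reparameterization trick.
mu, log_sigma = 0.0, 0.0
lr = 0.05
for _ in range(2000):
    sigma = np.exp(log_sigma)
    eps = rng.standard_normal(500)
    z = mu + sigma * eps                 # reparameterized samples from q
    # Monte Carlo gradients of E_q[log q(z) - log p(z)]; the entropy
    # term contributes only the analytic -1 in the log_sigma gradient.
    grad_mu = np.mean((z - a) / b**2)
    grad_log_sigma = -1.0 + np.mean((z - a) * sigma * eps / b**2)
    mu -= lr * grad_mu
    log_sigma -= lr * grad_log_sigma

# mu and exp(log_sigma) now approximate the target's mean and scale.
```

Minimizing the forward KL instead would require samples from the target (e.g., simulator outputs) and a tractable density for q, which is where expressive families such as normalizing flows become relevant.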
Amate, Laure. "Apprentissage de modèles de formes parcimonieux basés sur des représentations splines." Phd thesis, Université de Nice Sophia-Antipolis, 2009. http://tel.archives-ouvertes.fr/tel-00456612.
Randrianasolo, Toky. "Inférence basée sur le plan pour l'estimation de petits domaines." Thesis, Paris Est, 2013. http://www.theses.fr/2013PEST0099.
The strong demand for results at a detailed geographic level, particularly from national surveys, has raised the problem of the fragility of estimates for small areas. This thesis addresses the issue with methods based specifically on the sample design, all of which consist of building new weights for each statistical unit. The first method optimizes the re-weighting of a subsample survey included in an area. The second builds weights that depend on both the statistical units and the areas: it splits the sampling weights of the overall estimator while satisfying two constraints: 1) the sum of the estimates over every partition into areas equals the overall estimate; 2) the system of weights for a given area satisfies calibration properties on auxiliary variables known at the level of the area. The split estimator thus obtained behaves much like the well-known BLUP (best linear unbiased predictor) estimator. The third method rewrites the BLUP estimator, although model-based, as a homogeneous linear estimator from a design-based approach, yielding new modified BLUP estimators. Their precision, estimated by simulation with an application to real data, is quite close to that of the standard BLUP estimator. The methods developed in this thesis are then applied to the estimation of local mobility indicators from the 2007-2008 French National Travel Survey. When the size of an area in the sample is small, the estimates obtained with the first method are not precise enough, whereas the precision remains satisfactory for the other two methods.
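Constraint 1 of the weight-splitting idea (area estimates summing exactly to the overall estimate) can be illustrated with a minimal sketch. The data, weights, and the naive membership-based split below are invented for illustration; the thesis's actual split additionally enforces calibration on auxiliary variables.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical sample: n units with sampling weights w, a study
# variable y, each unit belonging to one of 3 areas.
n, n_areas = 12, 3
w = rng.uniform(50, 150, n)
y = rng.uniform(0, 10, n)
area = rng.integers(0, n_areas, n)

overall = np.sum(w * y)                 # overall design-based estimate

# Split each unit's weight across areas. The simplest admissible split
# assigns all of w_k to the unit's own area; any split whose rows sum
# to w_k satisfies constraint 1 below.
split = np.zeros((n, n_areas))
split[np.arange(n), area] = w

area_estimates = split.T @ y            # one estimate per area
# Constraint 1: the area estimates sum exactly to the overall estimate.
```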
Hernandez, Quintero Angelica. "Inférence statistique basée sur les processus empiriques dans des modèles semi-paramétriques de durées de vie." Toulouse 3, 2010. http://thesesups.ups-tlse.fr/1201/.
Survival data arise in disciplines such as medicine, criminology, finance, and engineering. In many circumstances the event of interest can be classified into several causes of death or failure, and in others the event can only be observed for a proportion of "susceptibles"; data for these two cases are known as competing risks and long-term survivors, respectively. Issues relevant to the analysis of these two types of data include basic properties such as parameter estimation and the existence, consistency, asymptotic normality, and efficiency of the estimators when they follow a semiparametric structure. The present thesis investigates these properties in well-established semiparametric formulations for the analysis of both competing risks and long-term survivors. It presents an overview of the mathematical tools that allow these basic properties to be studied and describes how the modern theory of empirical processes and the theory of semiparametric efficiency facilitate the relevant proofs. Consistent variance estimates for both the parametric and semiparametric components of the two models are also presented. The findings of this research provide the theoretical basis for obtaining inferences with large samples, calculating confidence bands, and testing hypotheses. The methods are illustrated with datasets generated through simulation.
Chermain, Xavier. "Simulation d'effets aérodynamiques et hydrodynamiques basée sur une méthode lagrangienne." Mémoire, Université de Sherbrooke, 2016. http://hdl.handle.net/11143/8391.
Georgelin, Philippe. "Vérification formelle de systèmes digitaux synchrones, basée sur la simulation symbolique." Université Joseph Fourier (Grenoble ; 1971-2015), 2001. http://www.theses.fr/2001GRE10126.
Bouchard, Jocelyn. "Commande prédictive basée sur la simulation. Application à la flottation en colonne." Thesis, Université Laval, 2007. http://www.theses.ulaval.ca/2007/24892/24892.pdf.
Applications of dynamic simulators for model predictive controller design are rather scarce in the literature. The complexity of solving the resulting optimization problems may explain this lack of popularity: nonlinear programming algorithms are not always well suited to efficiently reaching the optimum of a cost function built on first-principles models, and the situation is even worse when the equations used in the model are unknown to the control designers (black-box models). The simulation-based model predictive controller is an alternative formulation of model predictive control (MPC) that uses no explicit optimization solver, relying instead on easy-to-compute closed-loop simulation. The resulting scheme generally provides a sub-optimal solution and retains many attractive features of conventional MPC without being restricted by model complexity. Two algorithms are proposed: decentralized and decoupled. The decentralized simulation structure allows a flexible setting of the prediction horizon (Hp); this is not possible in the decoupled case, which is easier to tune but where Hp must generally be of the same order of magnitude as the system settling time. A second contribution of this thesis is a framework for the dynamic simulation of a mineral separation process: column flotation. Until now, most proposed models or simulators were restricted to steady-state behavior, and when dynamic mass-balance equations were considered, a constant pulp level was always assumed during the simulation. The presented framework simulates water, solids, and gas motion and their effect on the pulp level and output flow rates. As often happens in mineral processing, the column flotation process has not benefited from advanced control techniques; this is where the two previous subjects merge.
The proposed simulation framework is used to design a simulation-based model predictive controller for process variables having a strong influence on metallurgical results (grade and recovery). A case study is presented in which the pulp level, bias, and air hold-up in the pulp zone are kept within an acceptable operating region.
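The idea of replacing an explicit optimization solver with closed-loop simulation can be sketched on a toy plant. The first-order model, candidate grid, and horizon below are illustrative assumptions standing in for the flotation simulator; the sketch reproduces neither the decentralized nor the decoupled algorithm of the thesis.

```python
import numpy as np

# Hypothetical black-box plant x[t+1] = 0.9 x[t] + 0.5 u[t]; the
# controller only ever calls simulate(), never a programming solver.
A, B = 0.9, 0.5

def simulate(x0, u_seq):
    x, traj = x0, []
    for u in u_seq:
        x = A * x + B * u
        traj.append(x)
    return np.array(traj)

def sbmpc_step(x0, setpoint, hp=10, candidates=np.linspace(-2, 2, 41)):
    """Pick the constant input over the horizon hp whose simulated
    closed-loop trajectory has the smallest tracking cost."""
    costs = [np.sum((simulate(x0, [u] * hp) - setpoint) ** 2)
             for u in candidates]
    return float(candidates[int(np.argmin(costs))])

# Receding horizon: apply the first move, then re-evaluate.
x, setpoint = 0.0, 1.0
for _ in range(30):
    x = A * x + B * sbmpc_step(x, setpoint)
```

The state settles near the setpoint even though the "optimizer" is nothing more than an exhaustive pass over simulated candidate moves, which is what makes the scheme indifferent to model complexity.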
Soho, Komi Dodzi Badji. "Simulation multi-échelle des procédés de fabrication basée sur la plasticité cristalline." Thesis, Université de Lorraine, 2016. http://www.theses.fr/2016LORR0037.
In this thesis, two coupling methods are proposed for the multiscale simulation of forming processes. In the first part, a simplified procedure (indirect coupling) couples the finite element codes (Abaqus and LAM3) with a polycrystalline self-consistent model based on the large-strain elastoplastic behavior of single crystals. This procedure links the polycrystalline model with the FE analysis by extracting the history of the macroscopic strain and stress increments from a preliminary FE simulation with a phenomenological law, and then using it as the loading path prescribed to the polycrystalline model. The method is applied to the multiscale simulation of skin-pass processes. By following the loading path extracted at the half-thickness of the sheet, we can predict the evolution of physical parameters associated with the plasticity model, in particular the crystallographic texture, the morphological texture, and hardening. In the second part of this thesis, a small-strain version of the elastoplastic polycrystalline self-consistent model is coupled to the Abaqus FE code via the user material subroutine UMAT. This coupling (called direct coupling) uses crystal plasticity theory as the constitutive law at each integration point of the FE mesh. The polycrystal is represented by a set of N single crystals, and each time the FE code needs information on the mechanical behavior at an integration point, the full polycrystalline constitutive model is called. To validate this coupling, simulations of simple mechanical tests were conducted, and the results were validated by comparison with reference models. Unlike phenomenological models, this coupling provides not only the overall macroscopic response of the structure but also important information related to its microstructure.
Navarrete, Gutierrez Tomas. "Une architecture de contrôle de systèmes complexes basée sur la simulation multi-agent." Phd thesis, Université de Lorraine, 2012. http://tel.archives-ouvertes.fr/tel-00758118.
Mayet, Clément. "Simulation énergétique flexible d’un carrousel de métros basée sur la représentation énergétique macroscopique." Thesis, Lille 1, 2016. http://www.theses.fr/2016LIL10013/document.
Transportation systems have to be energy efficient in order to limit their environmental impact. Electric public transportation, such as subways and tramways, is thus in strong demand in urban areas, and various innovative solutions have emerged recently to increase its energy efficiency (energy storage systems, reversible traction power substations, etc.). Because developing and optimizing such systems is complex, numerical simulation tools are essential. Simulators of railway systems are nevertheless particularly delicate to develop because of the non-linearity (non-reversibility of traction power substations), non-stationarity (displacement of trains), and multiple energetic interactions present in this kind of system. This PhD thesis therefore proposes a new simulation tool for subway systems based on the EMR (Energetic Macroscopic Representation) formalism. This formalism structures the models according to the energetic properties of the system and leads to a forward simulation approach with exclusive use of integral causality. The proposed simulation tool thus stems from an innovative approach and allows a new vision of subway systems, notably increasing the simulator's flexibility and yielding physically meaningful simulation results. Moreover, a distinctive feature of this PhD thesis is that all the developed models are validated experimentally.
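"Exclusive use of integral causality" means every state variable is produced by integrating its own derivative, never by differentiating a measured signal. A minimal forward-simulation sketch of a single train illustrates the principle; the mass, forces, and resistance coefficients are invented values, not the validated EMR models of the thesis.

```python
# Toy longitudinal train model integrated forward in time (Euler).
mass, dt = 200_000.0, 0.1           # kg, s (assumed values)
v, x = 0.0, 0.0                     # speed (m/s), position (m)
for _ in range(int(60 / dt)):       # 60 s of traction
    f_traction = 100_000.0          # constant traction force, N (assumed)
    f_resist = 5_000.0 + 8.0 * v    # Davis-style running resistance (assumed)
    a = (f_traction - f_resist) / mass
    v += a * dt                     # integral causality: v = integral of a
    x += v * dt                     # and x = integral of v
```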
Navarrete, Gutiérrez Tomás. "Une architecture de contrôle de systèmes complexes basée sur la simulation multi-agent." Thesis, Université de Lorraine, 2012. http://www.theses.fr/2012LORR0165/document.
Complex systems are present everywhere in our environment: the internet, electricity distribution networks, transport networks. These systems are characterized by a large number of autonomous entities, dynamic structures, different time and space scales, and emergent phenomena. This thesis work is centered on the problem of controlling such systems. The problem is defined as the need to determine, based on a partial perception of the system state, which actions to execute in order to avoid or favor certain global states of the system. It comprises several difficult questions: how to evaluate the global impact of actions applied at the local level, how to model the dynamics of a heterogeneous system (different behaviors arising from different levels of interaction), and how to evaluate the quality of the estimates produced by modeling the system dynamics. We propose a control architecture based on an "equation-free" approach: a multi-agent model is used to evaluate the global impact of local control actions before applying the most pertinent set of actions. An experimental platform associated with the architecture was developed to confront its basic ideas with a simulated "free-riding" phenomenon in peer-to-peer file exchange networks. We demonstrate that our approach can drive the system to a state where most peers share files, despite initial conditions that are supposed to drive the system to a state where no peer shares. We also ran experiments with different configurations of the architecture to identify ways to improve its performance.
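The "equation-free" control loop (simulate each candidate local action with an agent model, then apply the best) can be sketched with a deliberately toy peer model. The imitation dynamics, incentive bonus, and candidate sets below are hypothetical stand-ins for the thesis's multi-agent model of free-riding.

```python
import random

# Toy imitation dynamics: each peer shares with probability equal to the
# current sharing fraction, plus a bonus if the controller incentivizes
# it. Entirely hypothetical, not the thesis's peer-to-peer model.
def run_model(share, incentive_ids, rng, steps=50):
    share = list(share)
    for _ in range(steps):
        frac = sum(share) / len(share)
        for i in range(len(share)):
            p = frac + (0.3 if i in incentive_ids else 0.0)
            share[i] = rng.random() < p
    return sum(share) / len(share)

# "Equation-free" control step: evaluate each candidate local action on
# a simulation copy (same seed for a fair comparison), then keep the
# action whose simulated global state is best.
peers = 40
state = [False] * peers                       # nobody shares initially
candidates = [set(), set(range(10)), set(range(20))]
scores = [run_model(state, c, random.Random(1)) for c in candidates]
best = candidates[scores.index(max(scores))]  # action to apply for real
```

Doing nothing leaves the system absorbed in the all-free-riding state, while the simulated incentive actions escape it, which is exactly the kind of global effect the architecture estimates before acting.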
Fraj, Amine. "Nouvelles approches en conception préliminaire basée sur les modèles des actionneurs embarqués." Thesis, Toulouse, INSA, 2014. http://www.theses.fr/2014ISAT0015/document.
The objective of this thesis is to propose an innovative approach to the preliminary design of embedded actuators, responding to a strong need in industry, particularly in aeronautics. As a first step, a hybrid method for generating and selecting architectures based on the specifications and the technological state of the art is proposed. In a second step, the effect of uncertainty in preliminary design models is studied. A third part demonstrates the value of combining 0D/1D and 3D modeling approaches to accelerate the design phases and gain better knowledge of the geometry. Finally, a method using meta-models based on scaling laws is developed to obtain the simple mathematical forms needed for sizing mechatronic components.
Bui, Thi Thu Cuc. "Simulation des écoulements bifluides, une stratégie de couplage basée sur l'adaptation de maillage anisotrope." Paris 6, 2009. http://www.theses.fr/2009PA066372.
Guay, Martin. "Animation de phénomènes gazeux basée sur la simulation d'un modèle de fluide à phase unique sur le GPU." Mémoire, Université de Sherbrooke, 2011. http://savoirs.usherbrooke.ca/handle/11143/4931.
Vieira, De Mello Aline. "Tlmdt : une stratégie de modélisation basée sur temps distribué pour la simulation parallèle de systèmes multi-processeurs intégrés sur puce." Paris 6, 2013. http://www.theses.fr/2013PA066201.
Innovative hardware architectures in the microelectronics industry are mainly characterized by their extremely high level of parallelism. Despite their relative novelty, Multi-Processor Systems on Chip (MPSoCs) containing a few cores tend to be replaced by Massively Parallel MPSoCs (MP2SoCs), which integrate dozens or hundreds of processor cores interconnected through a possibly hierarchical network on chip. Several industrial and academic frameworks have appeared to help model, simulate, and debug MP2SoC architectures. The SystemC hardware description language is the effective backbone of all these frameworks; it allows the hardware to be described at various levels of abstraction, ranging from synthesizable RTL (more accurate and very slow) to TLM (less accurate and very fast). However, when it comes to simulating an architecture containing hundreds of processors, even the simulation speed brought by TLM is not enough. At the same time, multi-core workstations are becoming mainstream; unfortunately, the standard SystemC simulation kernel is fully sequential and cannot exploit the processing power of these multi-core machines. In this context, the strategic goal of this thesis is to propose a general modeling approach for timed TLM virtual prototyping of shared-memory MP2SoCs, called Transaction Level Modeling with Distributed Time (TLM-DT). The main idea of TLM-DT is to stop relying on the SystemC global simulation time, making it possible to use a truly parallel simulation engine and obtain a significant reduction in simulation time with a limited loss of precision.
Palyart, Marc. "Une approche basée sur les modèles pour le développement d'applications de simulation numérique haute-performance." Phd thesis, Université Paul Sabatier - Toulouse III, 2012. http://tel.archives-ouvertes.fr/tel-00865535.
Palyart-Lamarche, Marc. "Une approche basée sur les modèles pour le développement d'applications de simulation numérique haute-performance." Toulouse 3, 2012. http://thesesups.ups-tlse.fr/1990/.
The development and maintenance of high-performance scientific computing software is a complex task. This complexity results from the fact that software and hardware are tightly coupled; furthermore, current parallel programming approaches lack accessibility and lead to a mixing of concerns within the source code. In this thesis we define an approach for the development of high-performance scientific computing software that relies on model-driven engineering. In order to reduce both the duration and cost of migration phases toward new hardware architectures, and to let developers focus on tasks with higher added value, this approach, called MDE4HPC, defines a domain-specific modeling language that enables applied mathematicians to describe their numerical models in a user-friendly, hardware-independent way. The different concerns are separated through the use of several models and several modeling viewpoints on these models. Depending on the targeted execution platform, these abstract models are translated into executable implementations by model transformations that can be shared among several software developments. To evaluate the effectiveness of this approach we developed a tool called ArchiMDE and used it to build several numerical simulation programs, validating the design choices made for the modeling language.
Tatibouët, Jérémie. "Approche systématique basée sur fUML pour formaliser la sémantique d’exécution des profils UML." Thesis, Paris 11, 2014. http://www.theses.fr/2014PA112247.
UML profiles enable UML to be tailored to a particular domain. To do so, a profile adds domain-specific concepts (i.e., stereotypes) and constraints to UML and disambiguates specific semantic variation points. The design process for the abstract syntax of profiles is well documented in the literature; specification of the semantics, however, is neglected, and to the best of our knowledge no proposal in the state of the art defines a systematic approach to this problem. Semantic formalization is a key point of language design: it formally defines the meaning of syntactic elements. To be effective, the formalization activity must proceed in parallel with a standardization activity. The interest of a language with formal, standardized semantics is that models defined with it can be interpreted (analysis, execution, simulation) in an equivalent way across tools implementing the semantics. This equivalence lets users observe similar interpretations of the same model in different tools and therefore share an understanding of the model and its semantics. In the context of UML profiles, the normalization activities led by the OMG (Object Management Group) focus on the syntax, while semantic formalization is neglected. We propose to address this problem by relying on the OMG's fUML and PSCS specifications, which formalize execution semantics for a subset of UML (classes, composite structures, and activities). First, we define two methodologies relying on these standards to formalize the execution semantics of UML profiles. The first defines the execution semantics of the domain model underlying the profile; the executable form of an applicative model defined with the profile is then obtained by model transformation. In the second methodology, the semantics are defined directly for the profile by extending the semantic model of fUML/PSCS using fUML.
Being conformant to fUML, the semantic model is executable by construction and can be used directly in any tool implementing the UML and fUML standards. Second, we compare the two approaches against the following criteria: the modeling effort required to build the semantic model, the capability of the methodology to preserve the UML semantics (i.e., those defined by fUML/PSCS), and the capability to identify clearly the relationships between the stereotypes and their semantics. This comparison demonstrates the capacity of the second methodology to define key extensions of the UML semantics in the context of a profile, namely control delegation, instantiation, and communications. The contributions have been implemented in our model execution framework Moka, which is integrated in the open-source UML/SysML modeling tool Papyrus.
Mykhalchuk, Vasyl. "Correspondance de maillages dynamiques basée sur les caractéristiques." Thesis, Strasbourg, 2015. http://www.theses.fr/2015STRAD010/document.
3D geometry modeling tools and 3D scanners are becoming more capable and more affordable. As a result, the development of new algorithms in geometry processing, shape analysis, and shape correspondence is gathering momentum in computer graphics; such algorithms steadily extend, and increasingly replace, prevailing methods based on images and videos. Non-rigid shape correspondence, or deformable shape matching, has long been studied in computer graphics and related research fields, and it is widely used in applications such as statistical shape analysis, motion cloning, texture transfer, and medical applications. However, robust and efficient non-rigid shape correspondence remains a challenging task due to fundamental variations between individual subjects, acquisition noise, and the number of degrees of freedom involved in the correspondence search. Although the dynamic 2D/3D intra-subject shape correspondence problem has been addressed by a rich set of previous methods, dynamic inter-subject shape correspondence has received much less attention. The primary purpose of our research is to develop a novel, efficient, robust framework for the analysis and correspondence of deforming, animated meshes based on their dynamic and motion properties. We elaborate our method by exploiting the rich motion data exhibited by deforming meshes with time-varying embedding. Our approach is based on the observation that a dynamic, deforming shape of a given subject contains much more information than a single static posture of it.
This differs from existing methods, which rely on static shape information for shape correspondence and analysis. Our framework for the analysis and correspondence of animated meshes comprises several major contributions: a new dynamic feature detection technique based on the multi-scale deformation characteristics of the animated mesh, a novel dynamic feature descriptor, and an adaptation of a robust graph-based feature correspondence approach followed by fine matching of the animated meshes. [...]
Mebarki, Nasser. "Une approche d'ordonnancement temps réel basée sur la sélection dynamique de règles de priorité." Lyon 1, 1995. http://www.theses.fr/1995LYO10043.
Shahzad, Atif. "Une Approche Hybride de Simulation-Optimisation Basée sur la fouille de Données pour les problèmes d'ordonnancement." Phd thesis, Université de Nantes, 2011. http://tel.archives-ouvertes.fr/tel-00647353.
Klaudel, Witold. "Contribution aux logiciels de simulation généralisée basée sur une approche séquentielle et l'emploi de flux d'information." Châtenay-Malabry, Ecole centrale de Paris, 1989. http://www.theses.fr/1989ECAP0085.
Le, Moigne Rocco. "Modélisation et simulation basée sur systemC des systèmes monopuces au niveau transactionnel pour l'évaluation de performances." Nantes, 2005. http://www.theses.fr/2005NANT2040.
The fast evolution of microelectronic technologies and their ever-improving integration capacities have made possible a new generation of components: the "System-on-Chip". The complexity of designing these components, and the permanent need to increase the productivity of the system design process in order to reduce time-to-market, leads designers to raise the level of abstraction of their simulation models. Our goal is thus to provide a set of high-level models and software tools enabling designers to conduct HW/SW co-simulation of systems very early in the design process. All models developed in this thesis are integrated into the SystemC simulation library of CoFluent Design's CoFluent Studio™ software environment. This work was done in the context of the MEDEA+ A502 MESA project.
Shahzad, Muhammad Atif. "Une approche hybride de simulation-optimisation basée sur la fouille de données pour les problèmes d'ordonnancement." Nantes, 2011. http://archive.bu.univ-nantes.fr/pollux/show.action?id=53c8638a-977a-4b85-8c12-6dc88d92f372.
A data-mining-based approach to discovering previously unknown priority dispatching rules for the job shop scheduling problem is presented. The approach seeks the knowledge assumed to be embedded in the efficient solutions provided by an optimization module built using tabu search. The objective is to discover scheduling concepts using data mining and hence to obtain a set of rules capable of approximating the efficient solutions for a job shop scheduling problem (JSSP). A data-mining-based scheduling framework is presented and implemented for a job shop problem with maximum lateness and mean tardiness as the scheduling objectives. The results obtained are very promising.
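The mining step (learning a priority rule from efficient schedules) can be sketched with pairwise job preferences. The oracle below, which simply prefers the job with the earlier due date, is a stand-in for orderings extracted from tabu-search solutions; the two features and the perceptron learner are illustrative choices, not the thesis's data-mining module.

```python
import random

rng = random.Random(0)

# Each job is a feature vector (processing time, due date), both drawn
# at random. The "efficient ordering" oracle stands in for schedules
# produced by the optimization module.
def oracle(j1, j2):
    return j1[1] < j2[1]          # prefer the earlier due date (EDD)

pairs = [((rng.uniform(1, 10), rng.uniform(1, 20)),
          (rng.uniform(1, 10), rng.uniform(1, 20))) for _ in range(500)]

# Mine a linear priority rule w.x from pairwise preferences with a
# perceptron: job j1 should rank first iff w.(j1 - j2) > 0.
w = [0.0, 0.0]
for _ in range(20):
    for j1, j2 in pairs:
        d = (j1[0] - j2[0], j1[1] - j2[1])
        label = 1 if oracle(j1, j2) else -1
        if label * (w[0] * d[0] + w[1] * d[1]) <= 0:   # misranked pair
            w = [w[0] + label * d[0], w[1] + label * d[1]]

correct = sum((w[0] * (j1[0] - j2[0]) + w[1] * (j1[1] - j2[1]) > 0)
              == oracle(j1, j2) for j1, j2 in pairs)
accuracy = correct / len(pairs)   # how well the mined rule matches
```

The learned weight on the due-date feature comes out negative, i.e. the perceptron rediscovers an earliest-due-date-style priority from the example orderings alone.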
Berranen, Mohamed Yacine. "Simulation 3D éléments-finis du muscle squelettique en temps-réel basée sur une approche multi-modèles." Thesis, Montpellier, 2015. http://www.theses.fr/2015MONTS043/document.
Corrective orthopedic surgeries results are difficult to be predicted and, unfortunately, sometimes unsuccessful. Other diseases resulting from a motor disability as bedsores are still poorly understood, despite a significant prevalence in the population. However, studies on these topics still insufficient especially for the analysis considering the muscle as a soft tissue volumetric organ. Muscle fascicule architectures and their correlation with movement efficiency is poorly documented, it lack of the detailed information regarding its volumetric deformations and stiffness changes along with muscle contractions.Muscle volumetric modeling, would provide a powerful tool for the personalized accurate simulation of body stresses of disabled or SCI patients during prolonged or friction contacts with standard medical devices non-adapted to particular morphologies, but also the planning of surgeries or functional electrical stimulation sequences.There is currently no software for automatic reconstruction of the architecture of fascicles, aponeurosis and tendons from MRI acquisitions of a specific subject. Actual volumetric muscle modeling is expensive in computational time, and not effective for real-time simulations of musculoskeletal system behavior with representation of physiological functions. The objective of this thesis is directed by the many contributions that have yet to make in the area. The current modeling methods based on the conventional finite element method are complex, inflexible or inaccurate in real-time. 
We propose a multi-model based on barycentric mapping approach that decouples the muscle strain density energy function into a set of independent less complex models, with the following objectives:- Improve complex muscle architecture reconstruction from the MRI acquisitions in term of complexity and flexibility.- Split muscle modeling into simple independent models, to offer more flexibility and reducing complexity of modeling which allows to have independent resolutions between deformable elements and muscle fiber elements..- By reducing the number of finite elements ensuring consistency of results of force and deformations, we reduce the computation time required for each simulation.Our methods are inspired by the previous work on the three-dimensional representation of the geometry and the complex architecture of muscles [Blemker and Delp, 2005]. In addition, the mathematical definition is studied [Chi et al., 2010] to characterize the energy density of deformations of skeletal muscle.Related with the above methods, we demand the following advances:- Improved three-dimensional representation of specific patients with muscle architecture and complex geometry from MRI measurement for personalized modeling. The method is more flexible and faster than previous.- A novel modeling method for muscle deformation via decoupled modeling of solid and muscle fiber mechanics is established. This new modeling allowed independent definitions between deformable elements and fiber force generation elements while keeping its muscle deformation accuracy. The performance is compared to conventional FEM method. - We reach high computational speed on standard machines for muscle complex simulations compared to FEM. Real-time simulation of specific person’s muscle strain and force is performed with an activation input updated in real-time from surface EMG measures.- Muscle modeling requires interdisciplinary knowledge from different research team members. 
The multi-model approach allows collaborative work, where each specialist focuses only on his or her area of expertise, thanks to the modular design of the modeling.
Lallier-Daniels, Dominic. "Méthodologie de conception numérique d'un ventilateur hélico-centrifuge basée sur l'emploi du calcul méridien." Mémoire, Université de Sherbrooke, 2012. http://hdl.handle.net/11143/6186.
Gaaloul, Sana. "Interopérabilité basée sur les standards Modelica et composant logiciel pour la simulation énergétique des systèmes de bâtiment." Phd thesis, Université de Grenoble, 2012. http://tel.archives-ouvertes.fr/tel-00782540.
Heritier-Pingeon, Christine. "Une aide à la conception de systèmes de production basée sur la simulation et l'analyse de données." Lyon, INSA, 1991. http://tel.archives-ouvertes.fr/docs/00/84/01/51/PDF/1991_Heritier-Pingeon_Christine.pdf.
New forms of competition are pushing manufacturing systems toward more and more flexibility. In the case of highly automated systems, decisions taken in the design phase have a great influence on the possibilities of the future system and on its ease of adaptation to change, and thus on its degree of flexibility. This work is a study of methods and tools for decision support in the design of manufacturing systems. The reader is first introduced to the scope, and then to the tools and methods employed. The workshop model used as a support for the approach is then presented, and the construction of a simulation plan considered. These considerations are then given concrete form by defining a module for the automated generation of simulation plans associated with the chosen workshop model. Data analysis, used here as a knowledge acquisition method, is then considered: a method of analysis is proposed and tested. This work was developed to explore the possibilities of data analysis in this field and to evaluate these possibilities on the basis of numerous experiments.
Marque, Isabelle. "Segmentation d'images médicales tridimensionnelles basée sur une modélisation continue du volume." Phd thesis, Grenoble 1, 1990. http://tel.archives-ouvertes.fr/tel-00338755.
Pageaud, Simon. "SmartGov : architecture générique pour la co-construction de politiques urbaines basée sur l'apprentissage par renforcement multi-agent." Thesis, Lyon, 2019. http://www.theses.fr/2019LYSE1128.
In this thesis, we propose the SmartGov model, coupling multi-agent simulation and multi-agent deep reinforcement learning, to help co-construct urban policies and integrate all stakeholders in the decision process. Smart Cities provide sensor data from urban areas to increase the realism of the simulation in SmartGov. Our first contribution is a generic architecture for multi-agent simulation of the city, used to study the emergence of global behavior with realistic agents reacting to political decisions. With multi-level modeling and a coupling of different dynamics, our tool learns environment specificities and suggests relevant policies. Our second contribution improves the autonomy and adaptation of the decision function with multi-agent, multi-level reinforcement learning. A set of clustered agents is distributed over the studied area to learn local specificities without any prior knowledge of the environment. Trust score assignment and individual rewards help reduce the impact of non-stationarity on experience replay in deep reinforcement learning. These contributions bring forth a complete system to co-construct urban policies in the Smart City. We compare our model with different approaches from the literature on a parking fee policy to show the benefits and limits of our contributions.
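The abstract above mentions trust scores and individual rewards as a way to damp the effect of non-stationarity on experience replay; the thesis itself gives no code, so the following is only a minimal sketch of one plausible reading of that idea (the class name, the trust-weighted sampling scheme, and all values are illustrative assumptions, not the author's implementation):

```python
import random

class TrustWeightedReplayBuffer:
    """Illustrative replay buffer: transitions coming from agents with a
    higher trust score are sampled more often, reducing the influence of
    agents whose local environment has drifted (non-stationarity)."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.buffer = []  # list of (transition, trust) pairs

    def push(self, transition, trust):
        if len(self.buffer) >= self.capacity:
            self.buffer.pop(0)  # drop the oldest transition
        self.buffer.append((transition, trust))

    def sample(self, batch_size):
        transitions, trusts = zip(*self.buffer)
        # sampling probability proportional to the trust score
        return random.choices(transitions, weights=trusts, k=batch_size)

buf = TrustWeightedReplayBuffer(capacity=100)
buf.push(("s0", "a0", 1.0, "s1"), trust=0.9)
buf.push(("s1", "a1", 0.0, "s2"), trust=0.1)
batch = buf.sample(4)
```

In this sketch, a low-trust agent's transitions are not discarded outright but merely become less likely to drive gradient updates.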
Bouffaron, Fabien. "Co-spécification système exécutable basée sur des modèles : application à la conduite interactive d’un procédé industriel critique." Thesis, Université de Lorraine, 2016. http://www.theses.fr/2016LORR0001/document.
Insofar as a system is a set of interacting elements, the difficulty for a system engineer is to guide the whole model architecture of a system as a set of interacting interdisciplinary engineering part models. The work presented in this thesis is specifically interested in the heuristic, specifying and executable nature of this whole-part coupling relationship for designing a virtual model of the system of interest. The holonic perspective allows us to consider this coupling relationship as descriptive of a whole and prescriptive of each of its parts, with regard both to the system situation to perceive and to the system elements to architect. In this sense, we revisit this relation as an iterative, recursive and collaborative process of system co-specification in the quest for knowledge, with each specialist engineering delivering constitutive models that satisfy the basic requirements. Our system co-modelling environment is itself composed of a set of system-component modelling environments, with the stated objective of preserving the tools, methods and work of each stakeholder in order to facilitate the expression of their skills. Modelling at the system level is based on the Systems Modeling Language (SysML) to architect the body of knowledge. Verification and validation are performed by co-execution of models around a co-simulation bus, including the CISPI platform of the SAFETECH project at CRAN, which constitutes our case study.
Zambrano, Rey Gabriel. "Réduction du Comportement Myope dans le contrôle des FMS : Une Approche Semi-Hétérarchique basée sur la Simulation-Optimisation." Phd thesis, Université de Valenciennes et du Hainaut-Cambresis, 2014. http://tel.archives-ouvertes.fr/tel-01064272.
Leroux-Beaudout, Renan. "Méthodologie de conception de systèmes de simulations en entreprise étendue, basée sur l'ingénierie système dirigée par les modèles." Thesis, Toulouse 3, 2020. http://www.theses.fr/2020TOU30089.
This manuscript presents a methodology for the design of "early" simulations in the extended enterprise, based on model-driven systems engineering. The goal is to allow the system architect to explore alternative solutions and to verify and/or validate the system architecture being designed, in relation to the user requirements. This methodology is divided into two complementary axes: the method part (new) and the means of execution, without which there can be no simulation. The new method is based on the following principle: start from the user requirements to create the system architecture model, then derive the simulation architecture, develop the executable models and run the simulation against verification and/or validation objectives. By doing this, potential differences of interpretation between the system architecture model and the simulation models are removed, or at least reduced compared to a traditional approach. The method is of matrix type. The columns represent the actors, while the rows correspond to the different steps of the MBSE method used by the system architect for the product, including the refinement steps. The actors are the system architect for the product (SyA); a first new actor introduced by this method, the system architect for the simulation (SiA); the developers of the executable simulation models (SMD); and a second new actor in charge of the execution of the simulation (SEM), of the analysis of its qualities, and of the production of results exploitable by the system architect for the product. As the method relies on a matrix structure, the SyA can request simulations either in depth, to specify a particular point of its model, or in extension, to check that the functions agree well with one another. With this new matrix approach, the system architect for the product can reuse functions already defined during the upstream or downstream stages of previous decompositions, saving time and costs and increasing confidence.
The second axis of this methodology is the realization of an extended enterprise (EE) cosimulation platform, which is a project in itself. Based on a proposed requirements specification, an MBSE approach defined a functional and physical architecture. The architecture of this platform can be modified according to the simulation needs expressed by the architect of the simulation; this is one of his prerogatives. The proposal introduces a third new actor: the Infrastructure Project Manager (IPM), who is in charge of coordinating the realization of the cosimulation platform within his company. For an EE of federated type, that is to say from contractor to subcontractor, two further actors are introduced: the supervisor of IPMs, whose role is to link IPMs in order to solve administrative and interconnection problems, and the person responsible for the execution of simulations, who coordinates, with the SEM of each partner, the implementation of simulations, ensures launches, and returns the results to all partners.
Six, Lancelot. "Vers un modèle de comportements de véhicules lourds en utilisant une méthode incrémentale basée sur la vérification et l'hystérésis : le modèle ArchiPL." Thesis, Paris 6, 2014. http://www.theses.fr/2014PA066497/document.
Congestion phenomena are a major issue modern societies have to face. Understanding them, their creation, their evolution and their real impact are major questions addressed by the scientific community since the middle of the twentieth century. A large number of simulation models have been developed to reproduce and study traffic dynamics. Among them, microscopic models are designed to reproduce macroscopic phenomena such as congestion by reproducing the behavior of individual vehicles. However, despite the negative influence of large vehicles on the flow, very few models take them into account: those vehicles are usually treated like any other vehicle, except for a few parameters. In this thesis, we reconsider this hypothesis and try to identify how the behavior of large vehicles differs from that of other vehicles. We propose the VIM4MAS development methodology to help in this process. This method is used to improve a generic vehicle behavior model and refine it until it can reproduce the most important aspects of large vehicles' behavior. To understand and identify key properties of the longitudinal behavior of vehicles, we have developed an analysis methodology based on the study of hysteresis phenomena. This analysis methodology highlights key properties such as the anticipation capabilities of drivers. The outcome of this work is the ArchiPL model of large vehicles' behavior. This model improves behavior quality at the microscopic level, while remaining consistent with the literature with respect to emergent phenomena.
Hamawy, Lara. "Développement d'une méthode de reconstruction d'image basée sur la détection de la fluorescence X pour l'analyse d'échantillon." Thesis, Grenoble, 2014. http://www.theses.fr/2014GRENI080/document.
A new technique that localizes and identifies fluorescing elements in a sample was developed. This practical imaging modality employs a polychromatic X-ray source to irradiate the sample and prompt the fluorescence emission. Many factors affecting the whole system, such as attenuation, sample self-absorption, probability of fluorescence and Compton scattering, were taken into account. Then, an effective detection system was established to acquire the optimum fluorescence data and discriminate between elements depending on their characteristic fluorescence. This set-up, coupled with an appropriate image reconstruction technique, leads to a detailed two-dimensional image. Compared to conventional image reconstruction techniques, the developed reconstruction method is a statistical technique with an appropriate convergence toward an image of acceptable resolution. Moreover, it is a simplified technique that allows the imaging of many different applications.
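The abstract describes a statistical reconstruction method converging toward an image but does not specify the algorithm. A classic statistical scheme in this family is MLEM (maximum-likelihood expectation-maximization); the sketch below is offered purely as an illustration of that general approach on a toy problem, under the assumption of a linear forward model, and is not the thesis's actual method:

```python
import numpy as np

def mlem(A, y, n_iter=200):
    """MLEM reconstruction for a linear model y = A @ x with Poisson-like
    data. A is the (m, n) system matrix; returns a non-negative estimate."""
    x = np.ones(A.shape[1])              # uniform initial image
    sens = A.sum(axis=0)                 # sensitivity term: A^T 1
    for _ in range(n_iter):
        proj = A @ x                     # forward projection
        ratio = y / np.maximum(proj, 1e-12)
        x *= (A.T @ ratio) / np.maximum(sens, 1e-12)
    return x

# toy 2-pixel problem with noiseless, exactly consistent measurements
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
x_true = np.array([2.0, 3.0])
y = A @ x_true
x_hat = mlem(A, y)
```

The multiplicative update keeps the estimate non-negative by construction, which is one reason such schemes are popular for emission-type imaging.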
Haddad, Ramzi. "Coordination des conflits aériens en présence d’incertitudes : Une étude basée sur l'ordonnancement à contraintes de ressources." Compiègne, 2006. http://www.theses.fr/2006COMP1634.
The volume of air traffic has increased by 80% in ten years, a growth that is expected to continue. Improvements to the present air-traffic management systems should make it possible to cope with this increase until the middle of the next decade. In response to this necessity, this thesis work is located at the junction of two domains: resource-constrained project scheduling and air-traffic coordination. We constructed a dynamic system that makes it possible to solve air-traffic management problems under uncertainty while integrating various techniques adapted to the hazards existing in this context. Our privileged application domain was the en-route coordination of air traffic. This type of problem presents features arising from the domains of scheduling and organization under uncertainty.
Tekobon, Jerry. "Système multi physique de simulation pour l'étude de la production de l'énergie basée sur le couplage éolien offshore-hydrolien." Thesis, Le Havre, 2016. http://www.theses.fr/2016LEHA0031/document.
This thesis work concerns the development of a real-time emulation platform for theoretical and experimental studies of hybrid offshore wind and tidal power systems. Various energy coupling architectures are treated on the basis of the functional similarities of the two systems, using both numerical and experimental emulation concepts. The notion of accelerated time for real-time simulation has been developed; the concept was validated on the experimental platform using the evolution of the mean power delivered by a small wind turbine. This approach can reduce the observation time of measurement campaigns and could accelerate studies of the wind potential of sites under development. We have also developed two types of coupling for the wind-tidal hybrid system. The first is an electrical coupling based on the parallel connection of the two turbines on a DC bus. The second is an innovative electromechanical coupling based on the use of a single asynchronous generator on which the wind turbine and the tidal turbine are simultaneously coupled. For this purpose, a vector-controlled servomotor was used to emulate the wind turbine, while a synchronous motor was used as a tidal turbine emulator; the generator shaft serves as the mechanical coupling between the two systems. The experiments we developed demonstrate the complementarity of the electrical production of the two systems, and we highlighted the need to add a storage system to compensate for simultaneous drops in the two energy productions. The real-time simulation results allow us to validate the feasibility of such a coupling.
Saffar, Imen. "Vers une agentification de comportements observés : une approche originale basée sur l’apprentissage automatique pour la simulation d’un environnement réel." Thesis, Lille 1, 2013. http://www.theses.fr/2013LIL10190/document.
The design of simulation tools able to reproduce the dynamics and evolution of complex real phenomena is hard. Modeling these phenomena by analytical approaches is often unsuitable, forcing the designer to turn to behavioral approaches. In this context, multi-agent simulations are now a credible alternative to classical simulations. However, they remain difficult to implement: the designer of the simulation must be able to transcribe the dynamics of the observed phenomenon into agent behaviors. This step usually requires the skills of a specialist with some expertise in the phenomenon to be simulated. In this thesis, we propose a novel way of processing observed real behaviors for simulation, without resorting to the help of an expert. It relies on unsupervised learning techniques to identify and extract behaviors and to facilitate agentification. Our approach is therefore a step towards the automatic design of multi-agent simulations reproducing observable phenomena. This approach is motivated by an application context aimed at simulating customer behavior within a retail space.
Izza, Saïd. "Intégration des systèmes d'information industriels : une approche flexible basée sur les services sémantiques." Phd thesis, Ecole Nationale Supérieure des Mines de Saint-Etienne, 2006. http://tel.archives-ouvertes.fr/tel-00780240.
Bénard, Vincent. "Evaluation de la sûreté de fonctionnement des systèmes complexes, basée sur un modèle fonctionnel dynamique : la méthode SAFE-SADT." Valenciennes, 2004. http://ged.univ-valenciennes.fr/nuxeo/site/esupversions/3fb536a1-1028-4880-b5f0-7ff718ea5b94.
This thesis deals with the design of dependable automated complex systems. A qualitative approach aims at modelling the functional subsets with the integration of various constraints as early as the design phase. The problem inherent in the design of automated complex systems can be summarized by the following question: how can the aggregation of these different functions be expressed in terms of the RAMS parameters of the global system? This work results from the absence of languages and tools for modelling abstract architectures obtained by composition of software and hardware entities. The proposed SAFE-SADT method is a first element of response. It allows the modelling, characterization, identification and representation of dependencies within the operational architecture, and the quantification of the dependability parameters in order to validate the operational architecture, taking dynamic aspects into account by means of Monte Carlo simulation.
Darbani, Mohsen. "Approche sans maillage basée sur la Méthode des Eléments Naturels (NEM), pour les écoulements bidimensionnels à surface libre." Compiègne, 2009. http://www.theses.fr/2009COMP1851.
Solving free-surface flow equations often runs into numerical difficulties related to excessive mesh distortion, as in the case of dam breaks or breaking waves. In this work we explore a meshfree technique based on the Natural Element Method (NEM) to simulate 2D fluid flow in the presence of strong gradients. The equations considered here are the Saint-Venant shallow water equations, in their full non-linear form, with a transient flow under the Coriolis effect. The nonlinear terms are computed using a Lagrangian technique based on the method of characteristics. This allows us to avoid setting up a numerical algorithm, such as Newton-Raphson's, which tends to extend the computing time. However, the management of boundary conditions remains a major difficulty in meshless methods. We have therefore defined a thin geometrical domain close to the boundaries and a computational domain that is subjected to nodal enrichment when particles leave it.
Cid, Yáñez Fabian. "Évaluation des stratégies à flux tiré et flux poussé dans la production de bois d'œuvre : une approche basée sur des agents." Master's thesis, Université Laval, 2008. http://hdl.handle.net/20.500.11794/21286.
The objective of this study is the evaluation of pull and push strategies in lumber production planning, using a Quebec sawmill as a case study. An Advanced Planning and Scheduling system (APS), based on a distributed software architecture, simulates the main operations planning and production processes of the sawmill (sourcing, sawing, drying, finishing, warehousing and delivery), representing them as autonomous software agents. Push and pull strategies are simulated using different penetration positions of the demand-information decoupling point along the value chain. To set up the experiments, configurations are defined by two controllable factors, namely the decoupling point position and the level of contracts for a product family. A set of scenarios is then generated by two uncontrollable factors: the quality of supply and the market price differential for products under contract. These configurations and scenarios lead to a mixed-level experimental design with fifty-four runs. Three performance indicators (order fill rate, work in process, and potential monetary throughput) are calculated for each of the 54 production plans generated by the APS. Results show a direct relation between the order fill rate and the position of the decoupling point (pull strategy) for the three levels of demand for products under contract. At every demand level, production plans under pull strategies generate improvements of 100% compared with equivalent plans under a push strategy. This service-level improvement has a financial cost of about 7% of the potential monetary throughput, which should be compensated externally by better contract conditions and internally by lower inventory management costs. This trade-off seems to be a direct consequence of the divergent nature of lumber production.
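The mixed-level design described above (two controllable factors crossed with two uncontrollable factors, yielding 54 runs, with order fill rate as one indicator) can be illustrated with a short sketch; the factor names and level values below are invented for illustration and chosen only so that the full cross-product matches the stated run count:

```python
from itertools import product

# controllable factors (illustrative levels, not the thesis's actual values)
decoupling_points = ["sawing", "drying", "finishing"]  # deeper penetration = more pull
contract_levels = ["low", "medium", "high"]
# uncontrollable factors defining the scenarios (also illustrative)
supply_quality = ["poor", "average", "good"]
price_differential = ["small", "large"]

# full mixed-level design: every factor combination is one simulation run
runs = list(product(decoupling_points, contract_levels,
                    supply_quality, price_differential))

def fill_rate(fulfilled, ordered):
    """Order fill rate: fraction of the ordered quantity actually delivered."""
    return fulfilled / ordered if ordered else 1.0
```

With three levels on three factors and two on the fourth, the cross-product gives 3 x 3 x 3 x 2 = 54 runs, matching the design size reported in the abstract.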
Consequently, in a business context that privileges service quality and where customers are willing to pay for it, the use of such demand-driven strategies in production planning represents a source of competitive advantage.
Quirion, Sébastien. "Animation basée sur la physique : extrapolation de mouvements humains plausibles et réalistes par optimisation incrémentale." Thesis, Université Laval, 2010. http://www.theses.ulaval.ca/2010/27675/27675.pdf.
Gangat, Yasine. "Architecture Agent pour la modélisation et simulation de systèmes complexes multidynamiques : une approche multi-comportementale basée sur le pattern "Agent MVC"." Phd thesis, Université de la Réunion, 2013. http://tel.archives-ouvertes.fr/tel-01022620.
Ramamonjisoa, David. "Architecture de copilotage et de contrôle d'exécution basée sur un système expert temps réel." Compiègne, 1993. http://www.theses.fr/1993COMP574S.
Kaakai, Fateh. "Modélisation et évaluation des pôles d'échanges multimodaux : une approche hybride multiéchelle basée sur les réseaux de Petri Lots." Besançon, 2007. http://www.theses.fr/2007BESA2038.
A multimodal hub is a complex transportation system whose role is to interconnect several public and private transportation modes in order to promote the practice of intermodality. Because of many observed problems (such as recurrent congestion phenomena inside stations, high transfer times, long queues in front of services, etc.) which contribute to deteriorating the image of public transport in general, it is becoming more and more important for transit authorities to be able to compute performance measures in order to identify the causes of these problems and try to find solutions. The main goal of this PhD thesis is to propose a simulation model for evaluating the main performance factors of multimodal transportation hubs. Among the most important quantitative factors, we can mention occupancy rates, queue lengths, mean service times, evacuation times, and measures related to the practice of intermodality such as connection times and waiting times. The suggested simulation model is based on Batches Petri nets, an extension of hybrid Petri nets. This paradigm is suitable for our study because it offers a multiscale, modular modeling approach which allows mastering the complexity of the studied system. Besides, it offers formal analysis techniques for verification and design (control) purposes. This simulation model can be successfully used for (i) evaluating existing multimodal hubs, (ii) validating design projects for new multimodal hubs, and (iii) assisting designers during sizing and planning procedures.
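Batches Petri nets extend hybrid Petri nets with batch dynamics; as background for readers unfamiliar with the formalism, the sketch below implements only the basic discrete token game that underlies all Petri net variants (the passenger-transfer example and all names are illustrative, not taken from the thesis):

```python
class PetriNet:
    """Minimal discrete Petri net (token game). The batch/hybrid
    extensions used in the thesis add continuous and batch dynamics on
    top of these basic firing semantics."""

    def __init__(self, marking):
        self.marking = dict(marking)   # place name -> token count
        self.transitions = {}          # name -> (input arcs, output arcs)

    def add_transition(self, name, inputs, outputs):
        # inputs/outputs: dict mapping place name -> arc weight
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= w for p, w in inputs.items())

    def fire(self, name):
        if not self.enabled(name):
            raise ValueError(f"transition {name!r} is not enabled")
        inputs, outputs = self.transitions[name]
        for p, w in inputs.items():
            self.marking[p] -= w       # consume input tokens
        for p, w in outputs.items():
            self.marking[p] = self.marking.get(p, 0) + w  # produce tokens

# toy transfer scenario: passengers leave a platform and join a bus queue
net = PetriNet({"platform": 3, "bus_queue": 0})
net.add_transition("transfer", {"platform": 1}, {"bus_queue": 1})
while net.enabled("transfer"):
    net.fire("transfer")
```

Queue lengths and occupancy rates of the kind measured in the thesis correspond to markings of such places observed over simulated time.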
Fiandino, Maxime. "Exploration d'architectures basée sur la génération automatique de plates-formes matérielles et le portage rapide du logiciel." Grenoble INPG, 2007. http://www.theses.fr/2007INPG0053.
The proposed approach is an iterative flow in three steps. The first is the fast development and modification of an executable model of the architecture. The second is the adaptation of the embedded software. The third is the exploration of hardware and software architectures. A tool has been developed to create and modify a hardware architecture model quickly; it uses flexible sub-systems. A method for adapting the embedded software is presented: it includes manually adding some parameterization to the software, automatically extracting the architecture characteristics, and generating the low-level source code. Finally, a method allows processors to be simulated at different levels together with their embedded software: high level for fast simulation, low level for performance measurements. Based on the results, hardware and software are modified and the flow can restart. This flow was tested on a real application, a parallelized H.264 encoder.
Haïat, Guillaume. "Étude d'une méthode d'inversion basée sur la simulation pour la caractérisation de fissures détectées par ultrasons dans un composant revêtu." Phd thesis, Université Paris-Diderot - Paris VII, 2004. http://tel.archives-ouvertes.fr/tel-00007345.