Theses on the topic "Modélisation basée sur les agents"
Cite a source in APA, MLA, Chicago, Harvard and many other citation styles
See the top 50 dissertations (master's or doctoral theses) for research on the topic "Modélisation basée sur les agents".
Next to every source in the list of references there is an "Add to bibliography" button. Press it, and we will automatically generate the bibliographic citation of the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the scholarly publication in .pdf format and read the abstract (summary) of the work online, if it is present in the metadata.
Browse theses from many scientific fields and compile a correct bibliography.
Bouzouba, Karim. "Modélisation des interactions basée sur le point de vue des agents". Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1999. http://www.collectionscanada.ca/obj/s4/f2/dsk2/ftp03/NQ36240.pdf.
Six, Lancelot. "Vers un modèle de comportements de véhicules lourds en utilisant une méthode incrémentale basée sur la vérification et l'hystérésis : le modèle ArchiPL". Thesis, Paris 6, 2014. http://www.theses.fr/2014PA066497/document.
Congestion phenomena are a major issue modern societies have to face. Understanding them, their creation, their evolution and their real impact, has been a major question addressed by the scientific community since the middle of the twentieth century. A large number of simulation models have been developed to reproduce and study traffic dynamics. Among them, microscopic models are designed to reproduce macroscopic phenomena such as congestion by reproducing individual vehicles' behavior. However, despite the negative influence of large vehicles on the flow, very few models take them into account. Those vehicles are usually treated like any other vehicle, except for a few parameters. In this thesis, we reconsider this hypothesis and try to identify how the behavior of large vehicles differs from that of other vehicles. We propose the VIM4MAS development methodology to help in this process. This method is used to improve a generic vehicle behavior model and refine it until it can reproduce the most important aspects of large vehicles' behavior. To understand and identify key properties of the longitudinal behavior of vehicles, we have developed an analysis methodology based on the study of hysteresis phenomena. This analysis methodology highlights key properties such as the anticipation capabilities of drivers. The outcome of this work is the ArchiPL model of large vehicles' behavior. This model shows an improvement in behavior quality at the microscopic level, while being consistent with the literature with respect to emergent phenomena.
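The hysteresis analysis mentioned in the abstract rests on an asymmetry between deceleration and recovery phases. As a rough, self-contained illustration only (the braking rule, parameters and gap profile below are invented for this sketch and are not the ArchiPL or VIM4MAS models), a follower that brakes harder than it accelerates ends a close-and-reopen gap cycle at a lower speed than it started with, the signature of a hysteresis loop:

```python
# Crude follower sketch: braking (2.5 m/s^2) is stronger than
# acceleration (1.0 m/s^2), so the follower's speed at a given spacing
# depends on whether the gap is closing or reopening, i.e. the
# speed-spacing curve forms a hysteresis loop. All numbers are invented
# for this illustration.

def follower_speed(spacing, v, v_desired=30.0, accel=1.0, decel=2.5,
                   safe_gap=20.0, dt=0.5):
    """One integration step of an asymmetric follower rule."""
    if spacing < safe_gap:                     # too close: brake hard
        return max(0.0, v - decel * dt)
    if v < v_desired:                          # room ahead: speed up gently
        return min(v_desired, v + accel * dt)
    return v

v = 30.0
slowdown = []
for gap in range(30, 9, -2):                   # leader gap shrinks 30 -> 10
    v = follower_speed(gap, v)
    slowdown.append(v)
recovery = []
for gap in range(10, 31, 2):                   # gap reopens 10 -> 30
    v = follower_speed(gap, v)
    recovery.append(v)
print(slowdown[-1], recovery[-1])              # 23.75 20.5
```

At the end of the cycle the gap is back to 30 m but the speed is not back to 30 m/s: the asymmetry leaves a residual speed loss, which is the kind of signature a hysteresis-based analysis extracts from real longitudinal behavior.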
Gangat, Yasine. "Architecture Agent pour la modélisation et simulation de systèmes complexes multidynamiques : une approche multi-comportementale basée sur le pattern "Agent MVC"". Phd thesis, Université de la Réunion, 2013. http://tel.archives-ouvertes.fr/tel-01022620.
Meng, Anbo. "Contribution à la modélisation et l'implémentation d'un système d'e-Education basé sur les multi-agents". Metz, 2006. http://docnum.univ-lorraine.fr/public/UPV-M/Theses/2006/Meng.Anbo.SMZ0636.pdf.
The goal of this PhD thesis is to develop an intelligent, flexible, personalized and open e-Education environment, so as to provide an efficient mechanism to personalize the learner's learning process and the teacher's pedagogic process, diversify the learning paradigms and facilitate the development of teaching and learning materials. To achieve this goal, the thesis explores and adopts a series of innovative methodologies, theories, algorithms and technologies drawn from multiple disciplines, such as multi-agent systems, learning objects, cognitive theory, genetic algorithms, eXtensible Markup Language (XML), J2EE and so on. In particular, the dissertation concentrates on the approach of using a multi-agent system (MAS) as a container and supporting environment to integrate and encapsulate the above-mentioned technologies and methodologies, and to model and implement several typical e-Education applications at different levels and in different contexts, in terms of content authoring, individual and collective learning, expert peer-help finding, and test generation, delivery and assessment in a distributed learning environment, deliberately taking into consideration the advantages of MAS in terms of autonomy, proactiveness, social ability and reactivity. To verify and validate the feasibility and efficiency of the models proposed in this thesis, part of the models have been implemented and simulated with the JADE framework. The final simulation results demonstrate the rationality and feasibility of applying multi-agent system technology to modeling and implementing large-scale, complex e-Education systems in a distributed environment.
Gallab, Maryam. "Développement d’une approche d’aide à la maitrise des risques dans les activités de maintenance d'une chaine logistique : Approche par modélisation et simulation basée sur les systèmes multi-agents". Thesis, Paris Sciences et Lettres (ComUE), 2017. http://www.theses.fr/2017PSLEM028/document.
The main objective of this thesis is to develop a multi-agent approach to designing a model for managing the risks of maintenance activities. The aim is to explore the complexity of maintenance and to describe the interactions between the maintenance function and risk assessment. First, we design a systemic model to identify and model the industrial system, to show the different interactions between its elements, and to analyze and evaluate the risks of maintenance activities. We propose the MOSAR method and the UML language to design a cognitive reference model. This model serves as a starting point for designing a database using the SQL language, which is operated by the multi-agent model to acquire the information necessary for its operation. We then develop a framework for a multi-agent system that aims to anticipate failure scenarios and support decision-making by simulating the behavior of the studied system. A comparison between existing platforms dedicated to multi-agent systems is performed to choose the appropriate platform for the simulation. Finally, the developed models are applied to the LPG (Liquefied Petroleum Gas) supply chain. A simulator was developed using the AnyLogic platform in order to study the system behavior, to simulate the failure scenarios chosen by the industry, to compute criticality from three parameters (frequency, severity, detectability), and to obtain a dashboard containing a set of maintenance performance indicators. The proposed simulation models help guide industries toward good decisions and away from risky situations that may trigger damaging disruptive events.
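The criticality computation mentioned in the abstract combines frequency, severity and detectability. A minimal sketch follows, with hypothetical 1-10 rating scales and an assumed alert threshold; the thesis performs this computation inside an AnyLogic simulation, not in standalone code like this:

```python
# Illustrative criticality index for a maintenance failure scenario.
# The 1-10 rating scales and the alert threshold are hypothetical
# assumptions for this sketch.

def criticality(frequency: int, severity: int, detectability: int) -> int:
    """Product of the three ratings, each assumed to lie on a 1-10 scale."""
    for rating in (frequency, severity, detectability):
        if not 1 <= rating <= 10:
            raise ValueError("ratings must lie in 1..10")
    return frequency * severity * detectability

def is_critical(score: int, threshold: int = 200) -> bool:
    """Flag scenarios whose score exceeds an (assumed) alert threshold."""
    return score > threshold

score = criticality(frequency=4, severity=9, detectability=6)
print(score, is_critical(score))   # 216 True
```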
Benhlima, Ouidad. "Assessing Transport Modes' Impact on Spatial Accessibility". Electronic Thesis or Diss., université Paris-Saclay, 2024. http://www.theses.fr/2024UPAST091.
This thesis evaluates the effects of transport modes on spatial accessibility, integrating location-based and utility-based accessibility approaches alongside the Floating Catchment Area (FCA) family of metrics and agent-based modeling. Through a case study centered on Casablanca and its surrounding areas, we analyze the relationships between urban infrastructure, transportation options, and accessibility metrics. The thesis assesses the current state of accessibility and the changes in access that would follow potential improvements, by simulating modifications to the existing public transportation network, including the introduction of new transport lines and modes. Key objectives include evaluating the differences in spatial accessibility across transport modes and applying agent-based modeling to simulate and analyze the impact of proposed transport infrastructure enhancements. This approach combines theoretical insight and practical application to inform policymaking and urban planning. Through accessibility analysis, this thesis contributes to the broader discussion on creating more accessible and equitable cities.
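For readers unfamiliar with the FCA family mentioned above, its simplest member, the two-step floating catchment area (2SFCA), can be sketched in a few lines; the facilities, populations, travel times and 30-minute catchment below are invented for illustration and are not data from the thesis:

```python
# Minimal two-step floating catchment area (2SFCA) computation, the
# simplest member of the FCA family of accessibility metrics.
# Facilities, populations, travel times and the 30-minute catchment
# are invented for illustration.

def two_step_fca(supply, demand, travel_time, threshold=30):
    """supply: {facility: capacity}, demand: {zone: population},
    travel_time: {(zone, facility): minutes}."""
    # Step 1: supply-to-demand ratio of each facility over its catchment.
    ratio = {}
    for facility, capacity in supply.items():
        served = sum(pop for zone, pop in demand.items()
                     if travel_time[(zone, facility)] <= threshold)
        ratio[facility] = capacity / served if served else 0.0
    # Step 2: each zone sums the ratios of the facilities it can reach.
    return {zone: sum(r for facility, r in ratio.items()
                      if travel_time[(zone, facility)] <= threshold)
            for zone in demand}

supply = {"hospital_A": 10, "hospital_B": 5}          # e.g. physicians
demand = {"zone_1": 1000, "zone_2": 500}              # residents
travel_time = {("zone_1", "hospital_A"): 10, ("zone_1", "hospital_B"): 40,
               ("zone_2", "hospital_A"): 20, ("zone_2", "hospital_B"): 15}

access = two_step_fca(supply, demand, travel_time)
print(access)   # zone_2 reaches both hospitals, so its score is higher
```

Changing the travel-time matrix is exactly how a simulated new transport line would feed back into such a metric: faster connections enlarge catchments and redistribute the accessibility scores.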
Sellami, Bachir. "Afgar : un système à base de connaissance basé sur les objets pour l'Aide à la Fabrication des Grilles de service des Agents Roulants de la SNCF". Compiègne, 1992. http://www.theses.fr/1992COMPD556.
Deboscker, Stéphanie. "Les entérocoques résistants aux glycopeptides : épidémiologie et modélisation de leur transmission hospitalière". Thesis, Strasbourg, 2019. http://www.theses.fr/2019STRAJ106.
The objective of our work was to study the factors associated with the acquisition of glycopeptide-resistant enterococci (GRE) during a single-strain outbreak, to describe their natural history and to model their transmission between three specialized wards. The Bayesian multivariable analysis of our first study showed that a history of hospitalization and the use of antibiotics and antacids during hospitalization were associated with an increased risk of GRE acquisition. The description of GRE carriers followed since 2007 then showed that half of the patients had negative screenings after 3 months. Finally, the literature review revealed that the most relevant model for simulating GRE hospital diffusion was an agent-based model. The simulations confirmed the importance of hand hygiene during patient care in comparison to other barrier measures. With 80% compliance, there were no secondary cases in 50% of the simulations.
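The role of hand-hygiene compliance in an agent-based transmission model can be conveyed with a deliberately minimal sketch; every parameter below (ward size, contact rate, transmission probability) is an invented placeholder, not a calibrated value from the thesis:

```python
import random

# Toy agent-based sketch of pathogen spread on a ward. Care contacts are
# pre-drawn so that runs with the same seed are coupled across
# compliance levels, which makes "more hygiene never causes more cases"
# hold run by run. All parameters are invented for illustration.

def simulate(n_patients=20, days=30, contacts_per_day=3,
             p_transmit=0.15, hygiene_compliance=0.8, seed=0):
    """Return the number of secondary cases from one index patient."""
    rng = random.Random(seed)
    events = [(src, rng.randrange(n_patients), rng.random(), rng.random())
              for _ in range(days)
              for src in range(n_patients)
              for _ in range(contacts_per_day)]
    colonized = [False] * n_patients
    colonized[0] = True                          # index case
    for src, dst, hygiene_draw, transmit_draw in events:
        if not colonized[src]:
            continue
        if hygiene_draw < hygiene_compliance:    # contact made safe
            continue
        if transmit_draw < p_transmit:
            colonized[dst] = True
    return sum(colonized) - 1

runs = range(30)
low = sum(simulate(hygiene_compliance=0.2, seed=s) for s in runs)
high = sum(simulate(hygiene_compliance=0.8, seed=s) for s in runs)
print(high, low)   # secondary cases: high compliance vs low compliance
```

With full compliance every contact is blocked and no secondary case can occur, which mirrors the qualitative conclusion of the thesis about hand hygiene dominating other barrier measures.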
Chamekh, Fatma. "L’évolution du web de données basée sur un système multi-agents". Thesis, Lyon, 2016. http://www.theses.fr/2016LYSE3083/document.
In this thesis, we investigate the evolution of RDF datasets built from documents and Linked Open Data. We identify the following issues: the integration of new triples, the proposal of changes that take data quality into account, and the management of different versions. To handle the complexity of the evolution of the web of data, we propose an agent-based argumentation framework. We assume that agent specifications can facilitate the process of RDF dataset evolution. Agent technology is one of the most useful approaches for coping with a complex problem. The agents work as a team and are autonomous, in the sense that they have the ability to decide for themselves which goals they should adopt and how these goals should be achieved. The agents use argumentation theory to reach a consensus about the best change alternative. Toward this goal, we propose an argumentation model based on metrics related to the intrinsic data-quality dimensions. To keep a record of all the modifications that have occurred, we focus on resource versioning. In a collaborative environment, several conflicts may arise; to manage those conflicts, we define rules. The application domain is general medicine.
Olmos, Marchant Luis Felipe. "Modélisation de performance des caches basée sur l'analyse de données". Thesis, Université Paris-Saclay (ComUE), 2016. http://www.theses.fr/2016SACLX008/document.
The need to distribute massive quantities of multimedia content to multiple users has increased tremendously in the last decade. The current solution to this ever-growing demand is Content Delivery Networks, an application-layer architecture that nowadays handles the majority of multimedia traffic. This distribution problem has also motivated the study of new solutions, such as the Information Centric Networking paradigm, whose aim is to add content delivery capabilities to the network layer by decoupling data from its location. In both architectures, cache servers play a key role, allowing efficient use of network resources for content delivery. As a consequence, the study of cache performance evaluation techniques has found new momentum in recent years. In this dissertation, we propose a framework for the performance modeling of a cache ruled by the Least Recently Used (LRU) discipline. Our framework is data-driven since, in addition to the usual mathematical analysis, we address two additional data-related problems: the first is to propose a model that is a priori both simple and representative of the essential features of the measured traffic; the second is the estimation of the model parameters from traffic traces. The contributions of this thesis concern each of the above tasks. In particular, as our first contribution, we propose a parsimonious traffic model featuring a document catalog that evolves in time. We achieve this by allowing each document to be available for a limited (random) period of time. To make a sensible proposal, we apply the "semi-experimental" method to real data. These "semi-experiments" consist of two phases: first, we randomize the traffic trace to break specific dependence structures in the request sequence; second, we simulate an LRU cache with the randomized request sequence as input. For a candidate model, we refute an independence hypothesis if the resulting hit probability curve differs significantly from the one obtained from the original trace. With the insights obtained, we propose a traffic model based on so-called Poisson cluster point processes. Our second contribution is a theoretical estimation of the cache hit probability for a generalization of the latter model. For this purpose, we use the Palm distribution of the model to set up a probability space in which a document can be singled out for analysis. In this setting, we obtain an integral formula for the average number of misses. Finally, by means of a scaling of the system parameters, we obtain for the latter expression an asymptotic expansion for large cache sizes. This expansion quantifies the error of a heuristic widely used in the literature, known as the "Che approximation", thus justifying and extending it. Our last contribution concerns the estimation of the model parameters. We tackle this problem for the simpler and widely used Independent Reference Model. By considering its parameter (a popularity distribution) to be a random sample, we implement a maximum likelihood method to estimate it. This method allows us to seamlessly handle the censoring phenomena occurring in traces. By measuring the cache performance obtained with the resulting model, we show that this method provides a more representative model of the data than typical ad hoc methodologies.
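The "semi-experimental" procedure described above (simulate an LRU cache on the original trace, then on a randomized copy, and compare hit probabilities) can be sketched directly; the Zipf-like synthetic trace below stands in for the real traffic traces used in the thesis:

```python
import random
from collections import OrderedDict

# Minimal version of the "semi-experiment": replay a request trace
# through an LRU cache, then replay a shuffled copy of the same trace
# (which destroys temporal correlations) and compare hit probabilities.
# The Zipf-like synthetic trace is an illustrative stand-in for real data.

def lru_hit_probability(trace, cache_size):
    cache = OrderedDict()
    hits = 0
    for doc in trace:
        if doc in cache:
            hits += 1
            cache.move_to_end(doc)            # mark as most recently used
        else:
            if len(cache) >= cache_size:
                cache.popitem(last=False)     # evict least recently used
            cache[doc] = True
    return hits / len(trace)

rng = random.Random(0)
catalog = list(range(1, 501))
weights = [1.0 / d for d in catalog]          # Zipf(1)-like popularity
trace = rng.choices(catalog, weights=weights, k=20_000)

original = lru_hit_probability(trace, cache_size=50)
shuffled = trace[:]
rng.shuffle(shuffled)                         # the randomization step
randomized = lru_hit_probability(shuffled, cache_size=50)
print(round(original, 3), round(randomized, 3))
```

Since this synthetic trace is already independent, the two estimates nearly coincide; on a real trace, a significant gap between them is evidence against the independence hypothesis being tested.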
Hmaidi, Riadh. "Nouvelle stratégie basée sur les polypeptides comme agents modulateurs des processus tumoraux". Electronic Thesis or Diss., Amiens, 2022. http://www.theses.fr/2022AMIE0073.
Natural molecules are increasingly being researched and tested for their therapeutic potential. Peptides derived from scorpion venoms that modulate ion channel activity are essential biological tools for a better understanding of the action mechanisms of these transmembrane channels, particularly sodium channels. They sometimes represent the major molecules of the venom, responsible for its highly toxic effect, which manifests as cardiogenic syndromes and pulmonary edema that are often fatal for humans. By modulating ion channel activity, these molecules are potentially able to act on the cellular processes associated with those channels. In the first part of this work, we studied the mechanism of interaction of the AahII toxin from the venom of the scorpion Androctonus australis hector (Aah) with the voltage-gated sodium channel NaV1.5 and its effect on the electrophysiological activity of this channel. In particular, we studied the ability of the anti-AahII nanobody (NbAahII 10, or Nb10) to neutralize AahII/NaV1.5 interactions. In the second part, we investigated the effect of different peptide fractions from Aah venom on the viability of human breast cancer cells. The AahII toxin is the most active α-toxin of the North African scorpion Androctonus australis hector; it slows the fast inactivation of NaV channels. To fight scorpion envenomation, an anti-AahII nanobody named NbAahII10 (Nb10) was developed. The efficiency of this nanobody has been evaluated in vivo in mice, but its mechanism of action at the cellular level remains unknown. Our work confirmed that the AahII toxin slows the fast inactivation of adult cardiac NaV1.5 channels expressed in HEK293 cells in a dose-dependent manner, while the current amplitude was not affected, and showed that Nb10 can fully reverse the effect of the AahII toxin on the channel's inactivation kinetics. Bioinformatic analysis of the NaV1.5/AahII interaction complex shows that AahII shares the same interaction surface with Nb10, which strongly suggests that Nb10 dynamically displaces the AahII toxin from its binding site on the NaV1.5 channel. At the pathophysiological level, we showed that treatment of the human breast cancer cells MDA-MB-231 with Nb10 prevents the increase in cell invasion induced by AahII. In the second part of this work, the three major fractions of Aah scorpion venom, previously isolated by low-pressure chromatography (M1, M2 and AahG50), were tested on the viability and migration of the human breast cancer cell lines MCF-7 and MDA-MB-231. These fractions did not affect cell migration but induced markedly biphasic effects on cell viability, interestingly demonstrating synergistic or antagonistic effects of their component peptides. To purify the peptides of interest, additional purification of the fractions was performed by high-pressure chromatography (HPLC). In total, three polypeptides were identified that decrease breast cancer cell viability; they will be investigated further. In conclusion, our results elucidate the physiological and molecular mechanisms of the neutralization of the α-toxin AahII by the Nb10 nanobody. In addition, we were able to identify three peptide molecules from Aah scorpion venom with anticancer properties. This work will continue with a structural study to elucidate the underlying molecular topologies at the three-dimensional scale.
Marque, Isabelle. "Segmentation d'images médicales tridimensionnelles basée sur une modélisation continue du volume". Phd thesis, Grenoble 1, 1990. http://tel.archives-ouvertes.fr/tel-00338755.
Labrecque, Patrick. "Modélisation vibratoire par composantes basée sur une technique expérimentale de mobilité". Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1997. http://www.collectionscanada.ca/obj/s4/f2/dsk2/ftp03/MQ26581.pdf.
Quinnez, Bruno. "Modélisation des phénomènes aéroélastiques basée sur une linéarisation des équations d'Euler". Châtenay-Malabry, Ecole centrale de Paris, 1994. http://www.theses.fr/1994ECAP0364.
Lamontagne, Philippe. "Modélisation spatio-temporelle orientée par patrons avec une approche basée sur individus". Mémoire, École de technologie supérieure, 2009. http://espace.etsmtl.ca/64/1/LAMONTAGNE_Philippe.pdf.
Bedou, Isabelle. "Modélisation des Systèmes d'Information Coopératifs : une approche cognitive basée sur la négociation". Montpellier 2, 1998. http://www.theses.fr/1998MON20233.
Libessart, Gwendal. "Modélisation prédictive des propriétés des sols urbains basée sur leur historique d'usages". Electronic Thesis or Diss., Université de Lorraine, 2022. http://www.theses.fr/2022LORR0120.
Land take and the associated environmental problems (heat islands, flooding, biodiversity degradation) have become a major issue in urban planning and development. The objective of this thesis is to propose an approach for characterising urban soils based on the analysis of their historical trajectory of land use, considering that land-use successions and the associated human practices determine the physico-chemical properties of soils. To do this, a scientific study based on three complementary approaches was carried out, leading to an operational methodological proposal for the predictive mapping of urban soil properties. The first approach is cartographic and documentary, and aims to better characterise the historical trajectories of urban soils and thus to better understand all of the human practices that the soils have undergone over time. It turns out that twenty historical trajectories are sufficient to describe the majority of the surface area of an urban unit. Moreover, the majority of historical trajectories represent land-use successions composed of at most one or two land uses, showing low dynamics of land-use change and practices over time. The second approach is a field study aimed at characterising the impact of three historical trajectories representative of urban soils on the physico-chemical properties of soils. It highlights the link between the historical trajectory of a soil and its physical and chemical properties. The impact of human practices on soil properties also appears to depend on the intensity of the practices carried out on the soil. These links, still underexplored, could be highlighted by the development of a new concept: the "anthroposequence". The last approach is experimental and helps to characterise the pedogenetic processes involved during these historical trajectories and to describe the evolution of physico-chemical properties over time. It shows that the individual effects of practices can run contrary to the effect resulting from those practices combined. Moreover, the pedogenetic processes brought forward during these experiments reflect the reality of the field, thus reinforcing the proposed experimental method. Finally, these three methodological approaches were discussed in order to transcribe the obtained results into "simple" rules allowing the cartographic prediction of soil types and certain agronomic parameters in urban environments from knowledge of the historical trajectories.
Belkaloul, Amine. "Modélisation des systèmes multi-corps rigides basée sur la méthode des réseaux virtuels". Thesis, Université Laval, 2010. http://www.theses.ulaval.ca/2010/27391/27391.pdf.
Webber, Carine. "Modélisation informatique de l'apprenant : une approche basée sur le modèle cK[c barré]". Université Joseph Fourier (Grenoble), 2003. http://www.theses.fr/2003GRE10048.
Herbegue, Hajer. "Approche ADL pour la modélisation d'architecture basée sur les contraintes (calcul de WCET)". Toulouse 3, 2014. http://thesesups.ups-tlse.fr/2448/.
The analysis of the worst-case execution time (WCET) is necessary in the design of critical real-time systems. To get sound and precise times, the WCET analysis of these systems must be performed on binary code and based on static analysis. Each execution path of the analyzed program is split into code snippets called basic blocks. A pipeline analysis consists in modeling the execution of basic blocks on the pipeline and evaluating the impact of the hardware features on the corresponding execution costs. This thesis focuses on pipeline analysis for WCET computation. The pipeline analysis considers the instruction set architecture and the hardware features of the processor, so a high-level specification of the software and hardware architecture is needed. We consider Architecture Description Languages (ADL) for processor description. ADLs such as Sim-nML, HARMLESS and LISA are used for the generation of retargetable tools (simulators, assemblers), in verification, etc. OTAWA is a framework dedicated to WCET computation that is able to integrate different computation methods. The instruction set architecture (the ISA level) is expressed in the Sim-nML language. Our work consists in extending the OTAWA framework with an ADL-based approach to pipeline analysis. The aim of our work has been to enhance the expressivity of OTAWA with regard to the processor description language. In order to do so, we first extend the Sim-nML language to support both the instruction set description and the hardware description. The extended Sim-nML supports the description of hardware component features and superposes the resource usage model of the instructions, which we call the execution model, on the initial description. This document also presents a new method to compute a basic block's execution time. The proposed method is based on constraint programming (Constraint Satisfaction Problem, CSP). We integrated this method into an automated approach, based on the Sim-nML specification of the target processor and on the instruction sequence to analyze (the basic block). We use constraints to express the structural and temporal properties of the architecture and the instructions, whose resolution provides the time cost of the basic blocks of a program. Our method uses well-known constraint specification languages and resolution tools, and the experiments provide more accurate times. During this thesis, we have also been concerned with the formalization of the architecture specification and the validation of results. We propose a logic-based description of the static and dynamic properties of the architecture and of the basic block instructions, presented as a set of high-level constraints. The goal is to provide a reusable library in which the architect can find a set of reusable quantitative properties that assist in the formalization of the architecture specification. A validation and animation tool was developed based on timed automata. We validate the time results provided by the constraint solvers, and we generate animated views that assist the architect in validating general dynamic properties and replaying the execution of instructions.
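The constraint-based timing idea can be conveyed with a heavily simplified stand-in: for an in-order pipeline, the precedence and structural constraints reduce to max() recurrences over stage start times, which is also the fixed point a constraint solver would reach for this restricted case. The 3-stage pipeline and latencies below are illustrative assumptions, not the thesis' Sim-nML-derived model:

```python
# Heavily simplified stand-in for the constraint-based timing model:
# instructions flow in order through FETCH -> EXEC -> WRITE. The
# precedence constraint (a stage starts after the previous stage of the
# same instruction finishes) and the structural constraint (a stage is
# still busy with the previous instruction) become max() recurrences
# over stage start times. Pipeline shape and latencies are invented.

STAGES = ("FETCH", "EXEC", "WRITE")

def stage_latency(stage, exec_latency):
    return exec_latency if stage == "EXEC" else 1

def basic_block_time(exec_latencies):
    """Cycle count until the last instruction leaves the pipeline."""
    n = len(exec_latencies)
    start = [[0] * len(STAGES) for _ in range(n)]
    for i in range(n):
        for s, stage in enumerate(STAGES):
            t = 0
            if s > 0:     # precedence: previous stage of i must finish
                t = start[i][s - 1] + stage_latency(STAGES[s - 1],
                                                    exec_latencies[i])
            if i > 0:     # structural: stage still busy with i - 1
                t = max(t, start[i - 1][s] + stage_latency(stage,
                                                           exec_latencies[i - 1]))
            start[i][s] = t
    return start[n - 1][-1] + 1       # final WRITE takes one cycle

print(basic_block_time([1, 1, 1]))    # fully pipelined: 5 cycles
print(basic_block_time([1, 3, 1]))    # a long EXEC stalls the block
```

A real CSP formulation, as in the thesis, would instead hand these inequalities to a solver, which also accommodates out-of-order resources and contention that simple recurrences cannot express.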
Cheaib, Nader. "Contribution à la malléabilité des collecticiels : une approche basée sur les services Web et les agents logiciels". Phd thesis, Université d'Evry-Val d'Essonne, 2010. http://tel.archives-ouvertes.fr/tel-00541084.
Lacouture, Jérôme. "Ingénierie logicielle orientée service : une contribution à l'adaptation dynamique basée sur une approche mixte composant/agent". Pau, 2008. http://www.theses.fr/2008PAUU3011.
The evolution of distributed systems is taking on a new dimension with the development of new technologies (service-oriented architectures, grid computing, nomadic and ubiquitous computing). Within such environments, the software architecture of the system evolves at runtime, that is, during the exploitation phase of the development cycle. Consequently, the persistence of services and dynamic aspects pose new challenges and lead us to reconsider the inherent problems of reusing existing services and adapting them. Adapting, integrating and coordinating the available services "on the fly", and reacting dynamically to the evolution of systems, appear as central research concerns today. The objectives of the work we present around the CompAA approach fall within this context and propose a path toward contextual adaptation, relative to environmental conditions (quality of service, availability on the network), that is as dynamic and autonomous as possible, through the discovery of available services. To that end, our contributions are organized around two main propositions: 1) a model of adaptable components, relying on the principles of abstraction and variability, as well as on a semantic definition in terms of functional and non-functional properties allowing automatic interpretation by software agents; 2) a process of dynamic adaptation implementing the proposed model. The specified process covers stages going from the analysis of needs to the adaptation of components, by way of stages of component discovery and selection. Various policies allow an increased level of adaptability within the process. A dominant aspect emphasized in this thesis lies in the originality of the approach, which aims at combining the known advantages of two paradigms: components and agents. For us, there is real interest in specifying entities that possess the structure and reuse qualities of software components while evolving in an autonomous and flexible way, in the manner of software agents. The field in which our propositions are put to the test is e-learning, in particular through our participation in the European project ELeGI (European Learning Grid Infrastructure). Through various learning situations, the participants evolve by sharing their knowledge in order to progress individually and collectively. In this context, the knowledge and needs of each participant are in perpetual flux. The CompAA model finds a natural place in this kind of activity and guarantees a certain transparency to the user while ensuring an optimal quality of service, by endowing the system with more autonomous and self-adaptable entities.
Testo completoMaître, Brigitte, e Hassan Laasri. "Coopération dans un univers multi-agents basée sur le modèle du blackboard : études et réalisations". Nancy 1, 1989. http://www.theses.fr/1989NAN10022.
Testo completoKitio, Teussop Rosine. "Gestion de l’ouverture au sein d’organisations multi-agents : une approche basée sur des artefacts organisationnels". Thesis, Saint-Etienne, EMSE, 2011. http://www.theses.fr/2011EMSE0630/document.
Multi-agent technology concerns the development of decentralized and open systems composed of different agents interacting in a shared environment. In recent years, organization has become an important topic in this research field, and many models have been, and are still being, proposed. While no consensual model emerges from these different works, it appears that they all lack the ability to build open and normative organizations, in the sense of managing the entry and exit of agents into and from the organization, as well as decentralized control and regulation of the agents' autonomy. In this thesis, our objective is the definition of a new model addressing these requirements. Our research led us to extend the MOISE+ organizational modeling language (OML) into a new version named MOISE. In it, we define an entry/exit specification allowing us to specify explicitly the ways in which agents can enter or exit an organization, by stating requirements bearing on the missions, goals and roles of the organization. The proposed organizational management infrastructure (OMI), ORA4MAS, takes advantage of the Agents and Artifacts (A&A) approach. We define the concept of organizational artifacts as the basic building blocks of our OMI for the management of organized and open MAS. To focus our study, the organizational artifacts are defined considering the OML specification of the MOISE model. We evaluated our proposal on the specification of an application aiming to manage the construction of a house. We then experimented with the management of candidate agents wishing to enter the organization and cooperate with the others to build the house according to a specified social scheme, the specified norms, and the contract clauses negotiated when they are admitted into the organization.
Li, Kai. "Modélisation du comportement hydromécanique des sols gonflants basée sur la théorie de l'état limite". Thesis, Strasbourg, 2015. http://www.theses.fr/2015STRAD004/document.
Testo completo
Clayey materials are often subjected to complex suction/stress paths, causing many problems in both surface structures and buried structures built on them. In this context, it is important to study the hydromechanical behavior of these materials in order to better control their use in civil engineering. The complex hydromechanical behavior of clayey materials is closely connected to their fabric, which has been the main subject of several studies on the micro- and macrostructure of soils. These studies have led to the development of elastoplastic models for expansive soils. The existing models are able to simulate the basic behavior of unsaturated expansive soils, but involve a large number of model parameters, leading to time-consuming calculations. We therefore propose a simplified method to model the hydromechanical behavior of expansive soils based on the shakedown concept. This model is first validated against the experimental results of cyclic suction-controlled oedometer tests. It is then implemented in a finite element code (CAST3M) to simulate the in-situ behavior of expansive soils. Finally, the application of shakedown theory to highly dense expansive soils is carried out by considering combined hardening plasticity
Krebs, Julian. "Le recalage robuste d’images médicales et la modélisation du mouvement basée sur l’apprentissage profond". Thesis, Université Côte d'Azur, 2020. http://www.theses.fr/2020COAZ4032.
Testo completo
This thesis presents new computational tools for quantifying deformations and motion of anatomical structures from medical images as required by a large variety of clinical applications. Generic deformable registration tools are presented that enable deformation analysis useful for improving diagnosis, prognosis and therapy guidance. These tools were built by combining state-of-the-art medical image analysis methods with cutting-edge machine learning methods. First, we focus on difficult inter-subject registration problems. By learning from given deformation examples, we propose a novel agent-based optimization scheme inspired by deep reinforcement learning where a statistical deformation model is explored in a trial-and-error fashion showing improved registration accuracy. Second, we develop a diffeomorphic deformation model that allows for accurate multiscale registration and deformation analysis by learning a low-dimensional representation of intra-subject deformations. The unsupervised method uses a latent variable model in form of a conditional variational autoencoder (CVAE) for learning a probabilistic deformation encoding that is useful for the simulation, classification and comparison of deformations. Third, we propose a probabilistic motion model derived from image sequences of moving organs. This generative model embeds motion in a structured latent space, the motion matrix, which enables the consistent tracking of structures and various analysis tasks. For instance, it leads to the simulation and interpolation of realistic motion patterns allowing for faster data acquisition and data augmentation. Finally, we demonstrate the importance of the developed tools in a clinical application where the motion model is used for disease prognosis and therapy planning. It is shown that the survival risk for heart failure patients can be predicted from the discriminative motion matrix with a higher accuracy compared to classical image-derived risk factors
Romero, Hernandez Ivan. "Modélisation, spécification formelle et vérification de protocoles d'interaction : une approche basée sur les rôles". Grenoble INPG, 2004. http://www.theses.fr/2004INPG0159.
Testo completo
In this thesis we present an approach for specifying, modelling and validating agent interaction protocols. This approach is based on the notion of role attribute, as used in the agent-oriented methodologies MADKIT and GAIA. Roles can be seen as social attributes of agent groups, taken on in the context of a given interaction. Seen this way, roles also suggest schemes to generate an actual specification with concrete agents from the generic role-based specification. To test this approach we propose a machine-assisted specification process that takes a role definition expressed in a visual notation (AUML) and generates a formal equivalent in the language Promela
Flipon, Baptiste. "Alliages à grains ultrafins et bimodaux : approche couplée expérience-modélisation basée sur la microstructure". Thesis, Normandie, 2018. http://www.theses.fr/2018NORMIR06/document.
Testo completo
This work is focused on the elaboration and the mechanical behaviour of 304L and 316L austenitic stainless steel alloys with bimodal grain size distributions. The complementary approach between experiments and modelling enables a better understanding of both macroscopic and local mechanical responses, and of the associated deformation mechanisms. The use of two elaboration routes and optimized process parameters results in a wide range of samples with different bimodal grain size distributions. Grain sizes and fractions of each population are varied in order to study the influence of these microstructural characteristics on mechanical behaviour. Uniaxial tensile tests are used to build a database of mechanical properties of bimodal alloys, and loading-unloading tests provide valuable information about deformation mechanisms in these materials. With coarse grains (CG) embedded in an ultrafine-grained (UFG) matrix, a relaxation of part of the internal stresses seems to take place and leads to a delayed embrittlement of bimodal alloys as compared to their unimodal counterparts. Full-field modelling, based on two crystal plasticity laws with an explicit account of an internal length, is proposed. It constitutes a valuable tool for predicting effective properties of bimodal alloys, in particular for studying the effect of several microstructural characteristics. Access to local fields is also possible and tends, so far, to show results similar to the experimental ones: stress relaxation is observed in the UFG matrix, as well as stress concentrations at the CG/UFG interfaces
Kallel, Asma. "Modélisation et conception d'une antenne plasma à balayage basée sur des ondes de fuite". Phd thesis, Toulouse 3, 2014. http://thesesups.ups-tlse.fr/2514/.
Testo completo
In this work, a beam-scanning leaky-wave antenna working at a fixed operating frequency and constructed from a grounded plasma layer is proposed. The radiation angle can be tuned through the plasma electron density, which is controlled by the power. A 2D theoretical model based on a canonical structure is proposed to study the leaky waves. The antenna parameters (plasma thickness, length and permittivity) are dimensioned using this theoretical model at 10 GHz, and a microwave source is chosen to excite the antenna. The scanning range of about 60° needs a plasma reaching an electron density of. In a second step, an inductively coupled plasma source is chosen since it meets the dimensioning requirements. Measurements of the plasma parameters confirm the requirements. Finally, the antenna prototype is designed
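The tuning mechanism described in this abstract (electron density controlling the permittivity seen by the leaky wave) can be sketched with the standard cold-plasma formulas. This is a generic physics illustration under textbook assumptions (cold, collisionless plasma), not the thesis's 2D canonical model, and the density values are illustrative:

```python
import math

E_CHARGE = 1.602176634e-19   # elementary charge (C)
E_MASS = 9.1093837015e-31    # electron mass (kg)
EPS0 = 8.8541878128e-12      # vacuum permittivity (F/m)

def plasma_frequency_hz(n_e):
    """Electron plasma frequency (Hz) for an electron density n_e in m^-3."""
    omega_p = math.sqrt(n_e * E_CHARGE**2 / (EPS0 * E_MASS))
    return omega_p / (2 * math.pi)

def plasma_permittivity(n_e, f_hz):
    """Relative permittivity of a cold, collisionless plasma at frequency f_hz."""
    return 1.0 - (plasma_frequency_hz(n_e) / f_hz) ** 2

# Raising the electron density lowers the permittivity seen by the wave at
# 10 GHz, which shifts the leaky-mode phase constant and hence the beam angle.
for n_e in (1e16, 1e17, 5e17):
    eps = plasma_permittivity(n_e, 10e9)
```

As a sanity check, a density of 1e18 m^-3 gives the textbook plasma frequency of about 8.98 GHz.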
Kallel, Asma. "Modélisation et conception d'une antenne plasma à balayage basée sur des ondes de fuite". Phd thesis, Toulouse 3, 2014. http://oatao.univ-toulouse.fr/14142/1/kallel.pdf.
Testo completo
Brousmiche, Kei-Léo. "Modélisation et simulation multi-agent de la formation et de la dynamique d’attitudes basées sur les croyances". Thesis, Paris 6, 2015. http://www.theses.fr/2015PA066623/document.
Testo completo
We study in this thesis the problem of social attitude formation and dynamics using multi-agent simulation. The concept of attitude can be defined as a global evaluation of a social object, based on cognitive or affective information. Our work belongs to the field of social simulation, which aims to reproduce in a virtual environment complex social phenomena at a macroscopic level based on microscopic representations of individuals and their interactions. While existing approaches in this field rarely consider the results of human-science studies on attitude, we propose to follow a psychomimetic approach by micro-founding the cognitive model of our agents on theories from the human and social sciences covering individual perception, inter-personal and media communication, belief revision, affective responses and the sentiment of unexpectedness. This model aims to reproduce, at a microscopic level, attitude dynamics toward actors who carry out actions witnessed by the individuals. We have proceeded to a functional analysis of the model's various components based on abstract scenarios in order to study the capabilities of our model, and more precisely the phenomena it can describe, such as information diffusion, resistance to disinformation or the conformity process. The model has been applied in the context of French military stabilisation operations in Afghanistan. The goal of this experiment is to reproduce opinion poll results of the locals toward the present Forces, collected during the intervention, based on a military scenario recreated in partnership with officers who were in charge of operations between 2011 and 2012. Simulation results following a model calibration process show an error below 3 points compared to the real data.
Finally, we propose a microscopic analysis of the results by applying automatic classification techniques to the simulated individuals in order to explain the multiple attitude tendencies in the population
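The kind of micro-founded attitude dynamics described in this abstract can be sketched with a toy agent model: direct perception of actions plus inter-personal diffusion. Everything here (update rules, weights, population size) is invented for illustration and is not the thesis's calibrated model:

```python
import random

random.seed(42)

class Agent:
    """Minimal agent holding an attitude as an accumulation of weighted beliefs."""
    def __init__(self):
        self.attitude = 0.0

    def witness(self, action_payoff, surprise=1.0):
        # Direct perception: unexpected actions weigh more (sentiment of
        # unexpectedness), so the payoff is scaled by a surprise factor.
        self.attitude += surprise * action_payoff

    def communicate(self, other):
        # Inter-personal communication: the receiver shifts part of the way
        # toward the sender's attitude (a convex-combination update).
        other.attitude += 0.1 * (self.attitude - other.attitude)

population = [Agent() for _ in range(100)]
# A minority directly witnesses a positive action by the observed actor...
for agent in population[:10]:
    agent.witness(action_payoff=1.0, surprise=2.0)
# ...then word-of-mouth diffuses the evaluation through random pairwise contacts.
for _ in range(2000):
    sender, receiver = random.sample(population, 2)
    sender.communicate(receiver)

mean_attitude = sum(a.attitude for a in population) / len(population)
```

Because every update is a convex combination of non-negative values, attitudes stay in the range set by the witnessed action, while diffusion spreads the evaluation beyond the direct witnesses.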
Abdelfattah, Nahed. "Modélisation 2D 1/2 hiérarchique basée sur les cartes planaires : réalisation et évaluation d'une interface graphique utilisant cette modélisation". Phd thesis, Ecole Nationale Supérieure des Mines de Saint-Etienne, 1994. http://tel.archives-ouvertes.fr/tel-00820994.
Testo completo
Nahed, Abdelfattah. "Modélisation 2D 1/2 hiérarchique basée sur les cartes planaires : réalisation et évaluation d'une interface graphique utilisant cette modélisation". Saint-Etienne, 1994. http://tel.archives-ouvertes.fr/docs/00/82/09/94/PDF/1994_Nahed_Abdelfattah.pdf.
Testo completo
This thesis proposes a model based on the planar map notion. Data of the same plane level are modeled by several planar maps, each corresponding to a class of semantics. These maps are independent and superposed. Modeling of data belonging to parallel planes is also allowed by superposing planar maps. Display and manipulation of data belonging to parallel planes are ensured thanks to a three-dimensional representation inspired by perspective techniques from the draughtsmanship domain. A hierarchical data structure offers the possibility to detail one zone of the plane (a face, in planar map terms) by several planar maps corresponding to the different semantics. Furthermore, this model handles the semantic links that can tie objects modeled in different planar maps, and hence preserves the whole meaning of the modeled scene. The proposed model has been used to design a graphical interface within the scope of the European Esprit project MMI2. An ergonomic evaluation of that interface, and the refinements made to it as a consequence of the evaluation, are also presented in this thesis
Hermellin, Emmanuel. "Modélisation et implémentation de simulations multi-agents sur architectures massivement parallèles". Thesis, Montpellier, 2016. http://www.theses.fr/2016MONTT334/document.
Testo completo
Multi-Agent Based Simulations (MABS) represent a relevant solution for the engineering and study of complex systems in numerous domains (artificial life, biology, economy, etc.). However, MABS sometimes require a lot of computational resources, which is a major constraint that restricts the possibilities of study for the considered models (scalability, real-time interaction, etc.). Among the available technologies for HPC (High Performance Computing), GPGPU (General-Purpose computing on Graphics Processing Units) proposes to use the massively parallel architectures of graphics cards as computing accelerators. However, while many areas benefit from GPGPU performance (meteorology, molecular dynamics, finance, etc.), Multi-Agent Systems (MAS) and especially MABS hardly enjoy the benefits of this technology: GPGPU is very little used and only a few works are interested in it. In fact, GPGPU comes with a very specific development context which requires a deep and non-trivial transformation process for multi-agent models. So, despite the existence of works that demonstrate the interest of GPGPU, this difficulty explains its low popularity in the MAS community. In this thesis, we show that among the works which aim to ease the use of GPGPU in an agent context, most do so through a transparent use of this technology. However, this approach requires abstracting some parts of the models, which greatly limits the scope of the proposed solutions. To handle this issue, and in contrast to existing solutions, we propose a hybrid approach (the execution of the simulation is shared between the processor and the graphics card) that focuses on accessibility and reusability through a modeling process that allows GPU programming to be used directly while simplifying its use.
More specifically, this approach is based on a design principle, called GPU delegation of agent perceptions, which consists in making a clear separation between the agent behaviors, managed by the processor, and the environmental dynamics, handled by the graphics card. One major idea underlying this principle is to identify agent computations which can be transformed into new structures (e.g. in the environment) in order to distribute the complexity of the code and modulate its implementation. The study of this principle and the various experiments conducted show the advantages of this approach from both a conceptual and a performance point of view. Therefore, we propose to generalize this approach and define a comprehensive methodology relying on GPU delegation, specifically adapted to the use of massively parallel architectures for MABS
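The delegation principle (environmental dynamics computed in bulk, agent behaviors merely reading the result) can be sketched in a CPU-only toy, with NumPy's vectorized operations standing in for a GPU kernel; the diffusion field and the agent structure are invented for illustration:

```python
import numpy as np

# Environmental dynamics (the part delegated to the graphics card): the quantity
# every agent would otherwise compute itself, here a diffused scalar field, is
# computed once, in bulk, over the whole environment.
def diffuse(field, rate=0.25):
    # 4-neighbour diffusion with periodic borders, vectorized over the grid.
    neighbours = (np.roll(field, 1, 0) + np.roll(field, -1, 0) +
                  np.roll(field, 1, 1) + np.roll(field, -1, 1))
    return (1 - rate) * field + rate * neighbours / 4.0

# Agent behaviours stay on the processor: each agent merely reads the
# precomputed field instead of recomputing its own perception.
def step(agents, field):
    field = diffuse(field)
    for agent in agents:
        x, y = agent["pos"]
        agent["perception"] = field[x, y]
    return field

rng = np.random.default_rng(0)
field = rng.random((64, 64))
agents = [{"pos": (int(i), int(j)), "perception": None}
          for i, j in rng.integers(0, 64, size=(10, 2))]
total_before = field.sum()
field = step(agents, field)   # this diffusion scheme conserves the field total
```

The design point is the split itself: the `diffuse` call is data-parallel and would map naturally onto a GPU kernel, while the per-agent loop stays simple sequential code.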
Mbarki, Mohamed. "Raisonnement stratégique et tactique : une approche pour la communication entre agents logiciels basée sur la pertinence". Thesis, Université Laval, 2010. http://www.theses.ulaval.ca/2010/27321/27321.pdf.
Testo completo
Brousmiche, Kei-Léo. "Modélisation et simulation multi-agent de la formation et de la dynamique d’attitudes basées sur les croyances". Electronic Thesis or Diss., Paris 6, 2015. https://accesdistant.sorbonne-universite.fr/login?url=https://theses-intra.sorbonne-universite.fr/2015PA066623.pdf.
Testo completo
We study in this thesis the problem of social attitude formation and dynamics using multi-agent simulation. The concept of attitude can be defined as a global evaluation of a social object, based on cognitive or affective information. Our work belongs to the field of social simulation, which aims to reproduce in a virtual environment complex social phenomena at a macroscopic level based on microscopic representations of individuals and their interactions. While existing approaches in this field rarely consider the results of human-science studies on attitude, we propose to follow a psychomimetic approach by micro-founding the cognitive model of our agents on theories from the human and social sciences covering individual perception, inter-personal and media communication, belief revision, affective responses and the sentiment of unexpectedness. This model aims to reproduce, at a microscopic level, attitude dynamics toward actors who carry out actions witnessed by the individuals. We have proceeded to a functional analysis of the model's various components based on abstract scenarios in order to study the capabilities of our model, and more precisely the phenomena it can describe, such as information diffusion, resistance to disinformation or the conformity process. The model has been applied in the context of French military stabilisation operations in Afghanistan. The goal of this experiment is to reproduce opinion poll results of the locals toward the present Forces, collected during the intervention, based on a military scenario recreated in partnership with officers who were in charge of operations between 2011 and 2012. Simulation results following a model calibration process show an error below 3 points compared to the real data.
Finally, we propose a microscopic analysis of the results by applying automatic classification techniques to the simulated individuals in order to explain the multiple attitude tendencies in the population
Sukiman, Muhamad Shafiq. "Etude des propriétés mécanique et thermique des biocomposites basée sur l'homogénéisation numérique". Thesis, Lille 1, 2017. http://www.theses.fr/2017LIL10218/document.
Testo completo
This thesis work concerns the determination of mechanical and thermal properties of HDPE-wood particles and PET-hemp fibers biocomposites using experimental, numerical and analytical approaches. The search for the effective properties of these biocomposites involves different parameters, such as the fiber morphology and orientation on the one hand, and the porosity and the interphase on the other. A study based on the numerical homogenization technique has been carried out in order to verify the influence of short and long fibers on the apparent properties of the different materials. Numerical calculations have also allowed the estimation of the elastic properties as well as the thermal conductivity of the biocomposites in relation to the fiber and particle volume fraction. The ultimate objective of this work is the modelling of the thermoforming process of the biocomposites. A study on the thermoformability of the HDPE-wood particles biocomposite into a formwork, in relation to the wood particle content, has been carried out using experimental data on the mechanical and thermal behavior for the numerical simulation of the thermoforming process
Girard, André. "Réduction de bruit de signaux de parole mono-capteur basée sur la modélisation par EMD". Mémoire, Université de Sherbrooke, 2010. http://savoirs.usherbrooke.ca/handle/11143/1553.
Testo completo
Robinet, Vivien. "Modélisation cognitive computationnelle de l'apprentissage inductif de chunks basée sur la théorie algorithmique de l'information". Grenoble INPG, 2009. http://www.theses.fr/2009INPG0108.
Testo completo
This thesis presents a computational cognitive model of inductive learning based on both the MDL principle and the chunking mechanism. The chunking process is used as a basis for many computational cognitive models. The MDL principle is a formalisation of the simplicity principle; it implements the notion of simplicity through the well-defined concept of codelength. The theoretical results justifying the simplicity principle are established through algorithmic information theory, and the MDL principle can be considered a computable approximation of the measures defined in that theory. Using these two mechanisms, the model automatically generates the shortest representation of discrete stimuli. Such a representation can be compared to those produced by human participants facing the same set of stimuli. Through the proposed model and experiments, the purpose of this thesis is to assess both the theoretical and the practical effectiveness of the simplicity principle for cognitive modeling
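The MDL-plus-chunking idea can be sketched with a toy two-part codelength: adding a chunk to the dictionary costs model bits but shortens the data encoding of a repetitive sequence. The encoding scheme below (greedy parse, Shannon code lengths, a flat 8 bits per dictionary symbol) is an invented simplification, not the thesis's model:

```python
import math
from collections import Counter

def codelength(sequence, chunks):
    """Two-part MDL-style code: bits for the model (the chunk dictionary) plus
    bits for the data encoded with that dictionary (Shannon code lengths).
    The dictionary must cover the alphabet so the parse always advances."""
    # Greedy left-to-right parse, longest chunk first.
    parse, i = [], 0
    while i < len(sequence):
        for c in sorted(chunks, key=len, reverse=True):
            if sequence.startswith(c, i):
                parse.append(c)
                i += len(c)
                break
    counts = Counter(parse)
    total = sum(counts.values())
    data_bits = sum(-n * math.log2(n / total) for n in counts.values())
    model_bits = sum(8 * len(c) for c in chunks)  # naive 8 bits per symbol
    return model_bits + data_bits

seq = "abab" * 20
primitive = codelength(seq, ["a", "b"])        # no chunking
chunked = codelength(seq, ["a", "b", "ab"])    # learn the chunk "ab"
```

For this repetitive sequence the chunked description is shorter despite the larger dictionary, which is the simplicity-principle argument for learning the chunk.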
El, Moumen Ahmed. "Prévision du comportement des matériaux hétérogènes basée sur l’homogénéisation numérique : modélisation, visualisation et étude morphologique". Thesis, Lille 1, 2014. http://www.theses.fr/2014LIL10077/document.
Testo completo
Homogenization is a micro-macro transition technique taking into account the influence of morphological, mechanical and statistical parameters of the representative microstructure of a heterogeneous material. Numerical modeling has contributed significantly to the development of this technique for determining the physical and mechanical properties of bi- and multi-phase heterogeneous materials. The main objective of this work is the prediction of the macroscopic elastic and thermal behaviors of heterogeneous materials. The mechanical and thermal behaviors were determined numerically and compared with experimental and analytical results. The variation of the representative volume element (RVE) size with volume fraction and contrast was analyzed. This study showed the importance of a rigorous determination of the optimal RVE size: it must take into account several parameters, such as volume fraction, contrast, the type of property and the morphology of the heterogeneities. A new concept of equivalent morphology was proposed. This concept establishes the equivalence between the elastic and thermal characteristics of a heterogeneous microstructure with complex morphology and those of a microstructure containing spherical particles. This work led us to the development of a comprehensive approach to microstructural design integrating the real morphology of the heterogeneous microstructure phases and combining image visualization, morphological study, and geometric and numerical modeling
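Full-field RVE computations like those described here are commonly checked against the classical Voigt and Reuss bounds, which bracket the effective property as a function of volume fraction and contrast. A minimal sketch with illustrative numbers (not taken from the thesis):

```python
def voigt(k_matrix, k_inclusion, f):
    """Upper bound (rule of mixtures) on the effective property
    for inclusion volume fraction f."""
    return (1 - f) * k_matrix + f * k_inclusion

def reuss(k_matrix, k_inclusion, f):
    """Lower bound (inverse rule of mixtures)."""
    return 1.0 / ((1 - f) / k_matrix + f / k_inclusion)

# Example: thermal conductivity with contrast k_i / k_m = 10 and 30% inclusions.
k_m, k_i, f = 1.0, 10.0, 0.3
upper = voigt(k_m, k_i, f)   # 0.7*1 + 0.3*10 = 3.7
lower = reuss(k_m, k_i, f)   # 1 / (0.7 + 0.03) ~ 1.37
```

The gap between the two bounds widens with contrast, which is one reason the optimal RVE size depends on contrast as well as on volume fraction.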
Cozette, Maryne. "Développement d'une méthode d'exploration de la balance musculaire basée sur la modélisation du signal isocinétique". Electronic Thesis or Diss., Amiens, 2019. http://www.theses.fr/2019AMIE0018.
Testo completo
Objective: The general objective was to propose a reliable alternative for assessing agonist-antagonist muscle balance from a dynamic perspective by using a sector approach: the angular range method. Material and method: Three studies were conducted, each involving a joint particularly implicated in human motion and in injury occurrence related to the muscle balance between its main stabilizers: respectively, the scapulohumeral joint (internal and external rotator muscles), the femoro-tibial joint (flexor and extensor muscles), and the lumbar spine (flexor and extensor muscles). Subjects were healthy volunteers, not specialists in any sport. The ratios computed over 10° angular ranges were compared between ranges and against the classical ratios associated with the peak torques. Reliability was evaluated for all ratios. Results: All studies showed that angular range ratios were significantly different between angular ranges and also significantly different from the classical peak torque ratios. In addition, the results highlighted a good absolute reliability of the ratios associated with 10° angular ranges. Conclusion: The sector method with 10° angular ranges allows a reliable and relevant assessment of muscle balance through a finer analysis of a major parameter of muscle function involved in pathophysiology and motor performance
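The sector approach can be sketched as follows: instead of a single ratio at the peak torques, an antagonist/agonist ratio is computed over each 10° angular window of the torque-angle curves. The curves below are synthetic stand-ins, not experimental isokinetic data:

```python
import numpy as np

def angular_range_ratios(angles, torque_agonist, torque_antagonist, width=10):
    """Mean-torque antagonist/agonist ratio over consecutive angular windows."""
    ratios = {}
    start = angles.min()
    while start < angles.max():
        mask = (angles >= start) & (angles < start + width)
        if mask.any():
            ratios[(start, start + width)] = (
                torque_antagonist[mask].mean() / torque_agonist[mask].mean())
        start += width
    return ratios

angles = np.arange(0, 90, 1.0)
# Illustrative torque-angle curves (synthetic shapes, arbitrary units).
quad = 200 * np.sin(np.radians(angles + 30))   # "agonist", e.g. extensors
ham = 120 * np.sin(np.radians(angles + 60))    # "antagonist", e.g. flexors

sector = angular_range_ratios(angles, quad, ham)
peak_ratio = ham.max() / quad.max()            # classical peak-torque ratio
```

With these curves the sector ratios vary markedly across the range of motion, which is exactly the information the single peak-torque ratio collapses away.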
Carabelea, Cosmin. "Raisonner sur l'autonomie d'un agent au sein de systèmes multi-agents ouverts : une approche basée sur les relations de pouvoir". Phd thesis, Ecole Nationale Supérieure des Mines de Saint-Etienne, 2007. http://tel.archives-ouvertes.fr/tel-00786141.
Testo completo
Couture, Adrien. "Automatisation de la modélisation numérique des microstructures de matériaux hétérogènes basée sur une intégration CAO-calcul". Thesis, Lorient, 2019. http://www.theses.fr/2019LORIS543.
Testo completo
Numerical simulation of heterogeneous materials is of great interest to the scientific community, since it is an attractive and economical solution to the problem of characterizing the thermomechanical behavior of heterogeneous materials. The behavior of heterogeneous materials is difficult to predict even when the constituents' properties are known and their shape and position well defined. This difficulty is related to the physical phenomena that occur at the constituents' scale, which originate from the constituents' interaction and the way they are assembled together. Numerical modeling of heterogeneous materials can simulate samples of the material to be studied. These samples are Statistical Volume Elements (SVE) and represent a portion of the material microstructure. To obtain a statistical representation of the thermomechanical behavior, it is usually necessary to generate many SVE. The objective of this thesis is to propose a new automated approach to the numerical modeling of microstructures, based on the integration of Computer Aided Design (CAD) methods, automatic mesh generation methods and the Finite Element Analysis (FEA) method. This approach is proposed in the context of heterogeneous particulate materials but can be applied to all types of microstructures. The potential of the CAD-FEA integrated approach is illustrated with a comparative study of the influence of the constituents' shape and the mesh degree on the apparent thermomechanical properties of a glass/epoxy composite. An innovative method for generating microstructures with high volume fractions of slender particles is introduced. This new method is used to simulate the behavior of a hemp and cement composite, and the numerical results are confronted with experimental results
Le, Moigne Rocco. "Modélisation et simulation basée sur systemC des systèmes monopuces au niveau transactionnel pour l'évaluation de performances". Nantes, 2005. http://www.theses.fr/2005NANT2040.
Testo completo
The fast evolution of microelectronic technologies and their ever-improving integration capacities made possible the appearance of a new generation of components on the market: the "System-on-Chip". The complexity involved in designing these new components and the permanent need to increase the productivity of the system design process in order to reduce time-to-market lead designers to raise the level of abstraction of their simulation models. Thus, our goal is to provide a set of high-level models and software tools enabling designers to conduct HW/SW co-simulation of systems very early in the design process. All models developed in this thesis are integrated into the SystemC simulation library of CoFluent Design's CoFluent Studio™ software environment. This work was done in the context of the MEDEA+ A502 MESA project
Alby, Emmanuel. "Elaboration d'une méthodologie de relevé d'objets architecturaux : contribution basée sur la combinaison de techniques d'acquisition". Nancy 1, 2006. https://tel.archives-ouvertes.fr/tel-00132784.
Testo completo
The external survey of an architectural work is a way to create a representation of the building in its state of conservation. Two remote acquisition techniques differ by their effectiveness and the quality of the produced data: photogrammetry and laser scanning. Both depend on optical principles: what cannot be seen cannot be measured. Combining these techniques can improve data quality, but unmeasured zones always remain and therefore cannot be represented. To solve this problem, we put forward the hypothesis that architectural knowledge may allow these zones to be rebuilt during the modeling process. This study suggests a modeling process based on the combination of the two techniques and on the integration of the available architectural knowledge, whether from paper documentation or from the construction rules of built works. An architectural work being complex and the data numerous, dividing the modeling process into several distinct stages appears necessary. We suggest dividing the modeling process according to the different levels of detail frequently used to represent architecture, and define a process that uses information progressively. Our approach thus consists in integrating dimensional data with architectural documentation, in order to develop a modeling process providing a model as complete as possible
Fernandez, Davila Jorge Luis. "Planification cognitive basée sur la logique : de la théorie à l'implémentation". Electronic Thesis or Diss., Toulouse 3, 2022. http://thesesups.ups-tlse.fr/5491/.
Testo completo
In this thesis, we introduce a cognitive planning framework that can be used to endow artificial agents with the skills necessary to represent and reason about other agents' mental states. Our cognitive planning framework is based on an NP-fragment of an epistemic logic with a semantics exploiting belief bases, whose satisfiability problem can be reduced to SAT. We detail the set of translations for the reduction of our fragment to SAT and provide complexity results for checking the satisfiability of formulas in the fragment. We define a general architecture for the cognitive planning problem, then define two types of planning problem, informative and interrogative, and establish the complexity of finding a solution to the cognitive planning problem in both cases. Furthermore, we illustrate the potential of our framework for applications in human-machine interaction with two examples in which an artificial agent interacts with a human agent through dialogue and tries to persuade the human to behave in a certain way. We also introduce a formalization of simple cognitive planning as a quantified Boolean formula (QBF) with an optimal number of quantifiers in the prefix. The model for cognitive planning was implemented: we describe how the belief base is represented and generated, and demonstrate how the machine performs the reasoning process to find a sequence of speech acts intended to induce a potential intention in the human agent. The implemented system has three main components: belief revision, cognitive planning, and the translator module. These modules work together to capture the human agent's beliefs during the human-machine interaction process and to generate a sequence of speech acts achieving a persuasive goal. Finally, we present an epistemic language to represent the beliefs and actions of an artificial player in the context of the board game Yokai.
The cooperative game Yokai requires a combination of theory of mind (ToM), temporal and spatial reasoning for an artificial agent to play effectively. We show that the language properly accounts for these three dimensions and that its satisfiability problem is NP-complete. We implement the game and perform experiments to compare the cooperation level between agents when they try to achieve a common goal by analyzing two scenarios: when the game is played between a human and the artificial agent versus when two humans play the game
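The reduction-to-SAT idea underlying such a framework can be illustrated with a brute-force propositional satisfiability check. The clauses encode a toy persuasion constraint and are invented for illustration; this is not the thesis's epistemic fragment or its actual translation:

```python
from itertools import product

def satisfiable(clauses):
    """Brute-force SAT over CNF clauses. Each clause is a set of literals;
    a literal is a (variable, polarity) pair. Returns a model or None."""
    variables = sorted({v for clause in clauses for v, _ in clause})
    for values in product([False, True], repeat=len(variables)):
        model = dict(zip(variables, values))
        if all(any(model[v] == pol for v, pol in clause) for clause in clauses):
            return model
    return None

# Toy encoding of a planning constraint: "announcing p makes the human believe
# p", "the goal is that the human believes p"; together these force the plan
# to include the announce action.
clauses = [
    {("announce_p", False), ("believes_p", True)},   # announce_p -> believes_p
    {("believes_p", True)},                          # persuasive goal
    {("announce_p", True), ("believes_p", False)},   # believes_p -> announce_p
]
model = satisfiable(clauses)
```

A real reduction would, of course, translate epistemic formulas into such clauses and hand them to an industrial SAT solver rather than enumerating assignments.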
Vieira, De Mello Aline. "Tlmdt : une stratégie de modélisation basée sur temps distribué pour la simulation parallèle de systèmes multi-processeurs intégrés sur puce". Paris 6, 2013. http://www.theses.fr/2013PA066201.
Testo completo
Innovative hardware architectures in the microelectronics industry are mainly characterized by their extremely high level of parallelism. Despite their relative novelty, Multi-Processor Systems-on-Chip (MPSoCs) containing a few cores tend to be replaced by Massively Parallel MPSoCs (MP2SoCs), which integrate dozens or hundreds of processor cores interconnected through a possibly hierarchical network-on-chip. Several industrial and academic frameworks have appeared to help model, simulate and debug MP2SoC architectures. The SystemC hardware description language is the effective backbone of all these frameworks; it allows the hardware to be described at various levels of abstraction, ranging from synthesizable RTL (more accurate and very slow) to TLM (less accurate and very fast). However, when it comes to simulating an architecture containing hundreds of processors, even the simulation speed brought by TLM is not enough. At the same time, multi-core workstations are becoming the mainstream. Unfortunately, the standard SystemC simulation kernel is fully sequential and cannot exploit the processing power provided by these multi-core machines. In this context, the strategic goal of this thesis is to propose a general modeling approach for timed TLM virtual prototyping of shared-memory MP2SoCs, called Transaction Level Modeling with Distributed Time (TLM-DT). The main idea of the TLM-DT approach is to abandon the SystemC global simulation time, making it possible to use a truly parallel simulation engine and providing a significant reduction in simulation time with a limited loss of precision
Noh, Mohd Hilmi. "Charge rapide de batteries lithium-ion basée sur la compensation de chute-ohmique". Thesis, Université Grenoble Alpes (ComUE), 2017. http://www.theses.fr/2017GREAI076/document.
Testo completo
The aim of this thesis is to study fast charging of lithium-ion batteries using the ohmic-drop compensation (ODC) method, which theoretically reduces the total charging time of the batteries considered. In this thesis, the ODC method was implemented on three different types of 18650 battery cells: C/LFP, C/NMC and LTO/LFP. The method shows good results for the C/LFP and LTO/LFP batteries, with a reduction of the total charging time of about 70% in comparison with the classical method. Nevertheless, there are some issues with this method: the temperature elevation of the battery is high during fast charging. Indeed, almost all fast-charging procedures experience this problem, and with the ODC method the high current rate and high voltage worsen the situation. These complications of ODC fast charging are key points for both the performance and the durability of the batteries. In particular, we have demonstrated that the C/LFP battery underwent internal degradation, namely mechanical deformation of the active materials and degradation of the electrolyte
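The compensation idea (raising the voltage limit by the I*R drop so the constant-current phase lasts longer) can be sketched with a crude coulomb-counting simulation. The linear OCV model, the resistance value and the current-taper rule are all invented for illustration and do not reproduce the thesis's cells or experimental protocol:

```python
def charge_time(capacity_ah, current_a, r_ohm, v_max, compensate, dt=1.0):
    """Toy CC(-CV) charge simulation with a linear OCV(soc) model.
    With ohmic-drop compensation, the measured I*R drop is added to the
    voltage limit, so the constant-current phase lasts longer and the
    charge finishes sooner. Returns the time (s) to reach 99% SOC."""
    soc, t, i = 0.0, 0.0, current_a
    limit = v_max + (current_a * r_ohm if compensate else 0.0)
    while soc < 0.99:
        ocv = 3.2 + 0.5 * soc             # invented open-circuit voltage (V)
        if ocv + i * r_ohm > limit:       # voltage limit hit: taper the current
            i = max(i * 0.999, 0.01)
        soc += i * dt / (capacity_ah * 3600.0)
        t += dt
    return t

# 2 Ah cell charged at 2C (4 A) with 50 mOhm internal resistance.
classical = charge_time(2.0, 4.0, 0.05, 3.7, compensate=False)
compensated = charge_time(2.0, 4.0, 0.05, 3.7, compensate=True)
```

In this toy setup the compensated charge stays in constant current to the end, while the classical charge spends a long tail in the tapered (CV-like) phase, illustrating where the reduction in total charging time comes from.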