Academic literature on the topic "Réutilisation et échange de protocoles"
Below are topical lists of articles, books, theses, conference proceedings, and other academic sources on the topic "Réutilisation et échange de protocoles".
Journal articles on the topic "Réutilisation et échange de protocoles"
Belley, Jean-Guy. "Contrat et citoyenneté. La politique d'achat régional d'une entreprise multinationale". Les Cahiers de droit 34, no. 3 (April 12, 2005): 1063–124. http://dx.doi.org/10.7202/043241ar.
Regazzetti, Loic. "LA GESTION DES DÉCHETS À L’HÔPITAL: DU DISCOURS AUX RÉSULTATS". Revista Baiana de Saúde Pública 40 (September 14, 2017). http://dx.doi.org/10.22278/2318-2660.2016.v40.n0.a2667.
Theses on the topic "Réutilisation et échange de protocoles"
Djaffardjy, Marine. "Pipelines d'Analyse Bioinformatiques : solutions offertes par les Systèmes de Workflows, Cadre de représentation et Étude de la Réutilisation". Electronic Thesis or Diss., université Paris-Saclay, 2023. http://www.theses.fr/2023UPASG059.
Bioinformatics is a multidisciplinary field that combines biology, computer science, and statistics, aiming to gain a better understanding of living mechanisms. It relies primarily on the analysis of biological data. Major technological improvements, especially sequencing technologies, gave rise to an exponential increase in data, raising new challenges in data analysis and management. In order to analyze this data, bioinformaticians use pipelines, which chain computational tools and processes. However, the reproducibility crisis in scientific research highlights the necessity of making analyses reproducible and reusable by others. Scientific workflow systems have emerged as a solution to make pipelines more structured, understandable, and reproducible. Workflows describe procedures with multiple coordinated steps involving tasks and their data dependencies. These systems assist bioinformaticians in designing and executing workflows, facilitating their sharing and reuse. In bioinformatics, the most popular workflow systems are Galaxy, Snakemake, and Nextflow. However, the reuse of workflows faces challenges, including the heterogeneity of workflow systems, limited accessibility to workflows, and the need for public workflow databases. Additionally, indexing and developing workflow search engines are necessary to facilitate workflow discovery and reuse. In this study, we developed an analysis method for workflow specifications to extract several representative characteristics from a dataset of workflows. The goal was to propose a standardized representation framework independent of the specification language. Additionally, we selected a set of workflow characteristics and indexed them into a relational database and a structured semantic format. Finally, we established an approach to detect similarity between workflows and between processors, enabling us to observe the reuse practices adopted by workflow developers.
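As a rough illustration of the similarity-based reuse detection described in this abstract, the sketch below compares workflows reduced to sets of processor identifiers with a Jaccard score. The workflow names, processor sets, and threshold are invented for the example and are not taken from the thesis.

```python
# Minimal sketch: detecting likely reuse between workflows by comparing their processors.
# Each workflow is reduced to a set of processor identifiers (e.g. tool names); two
# workflows whose processor sets overlap strongly are flagged as likely reuse.
# Data and threshold are illustrative assumptions.

from itertools import combinations

def jaccard(a: set, b: set) -> float:
    """Jaccard similarity between two sets of processor identifiers."""
    return len(a & b) / len(a | b) if (a | b) else 0.0

workflows = {
    "rnaseq_v1": {"fastqc", "trimmomatic", "star", "featurecounts"},
    "rnaseq_v2": {"fastqc", "trimmomatic", "star", "featurecounts", "multiqc"},
    "variant_calling": {"fastqc", "bwa", "samtools", "gatk"},
}

REUSE_THRESHOLD = 0.5  # assumed cutoff for reporting likely reuse

for (name_a, procs_a), (name_b, procs_b) in combinations(workflows.items(), 2):
    sim = jaccard(procs_a, procs_b)
    if sim >= REUSE_THRESHOLD:
        shared = sorted(procs_a & procs_b)
        print(f"{name_a} ~ {name_b}: Jaccard = {sim:.2f} (shared: {shared})")
```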
Jarraya, Tarek. "Réutilisation des protocoles d'interaction et démarche orientée modèles pour le développement multi-agents". Reims, 2006. http://theses.univ-reims.fr/exl-doc/GED00000553.pdf.
To improve the development of interactive agents, we propose a component representation of interaction protocols. Our approach consists first in studying the principal interaction protocols, specifically those standardized by FIPA. This study analyzes the process of each protocol, with the aim of identifying the data that the developer must specify to integrate it into an agent. Based on this study we built an ontology containing the interaction concepts common to all protocols, and on top of this ontology we developed the INAF (INteractive Agent Framework) framework. The aim of this framework is to promote the reuse and adaptation of agents in dynamic environments. INAF provides a library of interaction protocols and a basic architecture for interactive agents. To validate our solution, we used the framework to develop several applications, such as an auction system and a timetable management system. Experience with INAF showed that its use requires knowledge of several multi-agent concepts and techniques. To reduce the complexity inherent in the diversity of multi-agent concepts, we propose a new development method, named MDAD (Model Driven Agent Development), based on the MDA (Model Driven Architecture) approach proposed by the OMG. MDAD describes the multi-agent system at a conceptual level (PIM) and an implementation level (PSM), each described by one or more meta-models. The transition from one level to the other is an automatic transformation process driven by a set of rules, which represent the know-how of application-domain experts and multi-agent system designers.
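To illustrate the idea of packaging an interaction protocol as a reusable component, here is a minimal sketch in which a generic request/agree/inform exchange is captured once and the agent developer only plugs in domain-specific handlers. The class and method names are assumptions made for the example and do not reflect the actual INAF API.

```python
# Minimal sketch of a protocol captured as a reusable component: a simplified
# FIPA-request-style state machine (request -> agree/refuse -> inform) that is
# configured with application-specific handlers. Names are illustrative only.

from typing import Callable

class RequestProtocol:
    """Responder side of a simplified request/agree/inform protocol."""

    def __init__(self, on_request: Callable[[str], bool], on_result: Callable[[str], str]):
        self.on_request = on_request   # decides whether to agree to the request
        self.on_result = on_result     # computes the result once agreed
        self.state = "START"

    def handle(self, content: str) -> list[str]:
        """Process an incoming request and return the performatives sent back."""
        messages = []
        if self.state == "START":
            if self.on_request(content):
                messages.append("AGREE")
                messages.append(f"INFORM {self.on_result(content)}")
                self.state = "DONE"
            else:
                messages.append("REFUSE")
                self.state = "FAILED"
        return messages

# Reuse: the same component configured for a timetable-management agent.
protocol = RequestProtocol(
    on_request=lambda content: content.startswith("book-room"),
    on_result=lambda content: "room-42 reserved",
)
print(protocol.handle("book-room friday 10:00"))  # ['AGREE', 'INFORM room-42 reserved']
```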
Marti, Carole. "L'apport des méthodes narratives à la gestion des connaissances : le partage et la réutilisation entre artisans". Montpellier 2, 2005. http://www.theses.fr/2005MON20144.
Khirani, Sarah. "Procédés hybrides associant la filtration membranaire et l'adsorption/échange ionique pour le traitement des eaux usées en vue de leur réutilisation". Toulouse, INSA, 2007. http://www.theses.fr/2007ISAT0005.
This study deals with the development of a hybrid process combining membrane filtration with adsorption or ion exchange for the treatment of secondary effluents with a view to their reuse. Applying this process to wastewater treatment requires keeping costs low, by taking into account the specificity of the organic matter in the effluents and by developing a sustainable process with reduced energy consumption and regenerable adsorbents. This led us to favour combining, in the same reactor, an immersed membrane and a regenerable adsorbent. We focused mainly on the treatment of the organic compounds of secondary effluents, studying either the hydrophobic fraction ("humic substances", mainly fulvic acid in our study) or a synthetic secondary effluent. To this end, our study followed three steps: (1) the search for an alternative material to powdered activated carbon (PAC), through a comparative study of adsorption kinetics and isotherms; (2) the study of the response of the hybrid process at two scales, using PAC and an ion exchange resin; (3) the improvement of a protocol for determining the fouling potential of the treated water prior to further treatment by nanofiltration or reverse osmosis.
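As a worked illustration of the adsorption-isotherm comparison mentioned in this abstract, the sketch below evaluates the Langmuir and Freundlich models over a few equilibrium concentrations. All parameter values are illustrative assumptions, not measurements from the thesis.

```python
# Minimal sketch: comparing two classical adsorption isotherm models.
# q_e is the adsorbed amount (mg/g) at equilibrium concentration C_e (mg/L).
# Parameter values below are invented for the example.

def langmuir(c_e: float, q_max: float, k_l: float) -> float:
    """Langmuir isotherm: q_e = q_max * K_L * C_e / (1 + K_L * C_e)."""
    return q_max * k_l * c_e / (1.0 + k_l * c_e)

def freundlich(c_e: float, k_f: float, n: float) -> float:
    """Freundlich isotherm: q_e = K_F * C_e^(1/n)."""
    return k_f * c_e ** (1.0 / n)

for c_e in (1.0, 5.0, 10.0, 20.0):  # equilibrium concentrations in mg/L
    q_lang = langmuir(c_e, q_max=80.0, k_l=0.25)
    q_freund = freundlich(c_e, k_f=15.0, n=2.0)
    print(f"C_e = {c_e:5.1f} mg/L | Langmuir q_e = {q_lang:5.1f} mg/g | "
          f"Freundlich q_e = {q_freund:5.1f} mg/g")
```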
Champalle, Olivier. "Capitalisation et partage de connaissances d’analyse de traces numériques d’activités : assister le suivi de l'activité dans les environnements de formation à base de simulateur pleine échelle". Thesis, Lyon 1, 2014. http://www.theses.fr/2014LYO10131/document.
Our research takes place in the field of knowledge engineering. In particular, we focus on capitalizing and sharing knowledge about the observation and analysis of digital traces. In this context, we base our approach on the concept of modeled trace (M-Trace) developed by the SILEX team. Our approach makes it possible to exploit low-level digital traces and to extract higher-level knowledge through rule-based transformations. These rules model the observation and analysis knowledge of different users, and can be capitalized and shared between users. We complete our proposal with a synthetic visualization of the knowledge levels together with the observed elements of the activity. By means of a generic trace model that we have specified, users can explore the different abstraction levels for investigation purposes, in order to better understand and analyze the activity. Our proposals have been implemented in a prototype called D3KODE (« Define, Discover, and Disseminate Knowledge from Observation to Develop Expertise »), allowing the processing, representation, and visualization of traces. D3KODE was applied in the context of professional training on the full-scope nuclear power plant simulator of the EDF group, designed to maintain and enhance the knowledge and skills of control room staff. In this context, observing, analyzing, and debriefing the individual and collective interactions of trainee operators is a dense activity that requires constant attention and alertness from the trainers throughout the simulation, especially for young trainers who do not yet have the expertise of experienced ones. The data collected during a simulation are voluminous and very low level, and are difficult to analyze manually in order to extract high-level information reflecting trainee behaviour. Understanding and following the activity thus requires strong expertise that not all trainers have. To validate our approach, D3KODE was evaluated in a real context through a comparative protocol conducted with a team of trainers from the EDF group. The evaluation gave significant results that validate our approach and open up many research opportunities.
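The sketch below illustrates, in a much simplified form, the kind of rule-based transformation that lifts low-level trace elements to a higher abstraction level. The trace content and the single rule are invented for the example and are not taken from D3KODE.

```python
# Minimal sketch of a rule-based transformation from low-level trace elements
# to a higher-level observation, in the spirit of the M-Trace approach.
# Events and the rule itself are illustrative assumptions.

low_level_trace = [
    {"t": 10, "type": "alarm_raised", "source": "panel"},
    {"t": 12, "type": "button_pressed", "source": "operator_1"},
    {"t": 14, "type": "alarm_acknowledged", "source": "panel"},
    {"t": 30, "type": "alarm_raised", "source": "panel"},
]

def rule_alarm_handled(trace):
    """If an alarm is acknowledged within 10 time units of being raised,
    emit a higher-level 'alarm_handled' observation."""
    higher_level = []
    for raised in (o for o in trace if o["type"] == "alarm_raised"):
        for ack in (o for o in trace if o["type"] == "alarm_acknowledged"):
            if 0 <= ack["t"] - raised["t"] <= 10:
                higher_level.append({"t": ack["t"], "type": "alarm_handled"})
                break
    return higher_level

print(rule_alarm_handled(low_level_trace))  # [{'t': 14, 'type': 'alarm_handled'}]
```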
Assouroko, Ibrahim. "Gestion de données et dynamiques des connaissances en ingénierie numérique : contribution à l'intégration de l'ingénierie des exigences, de la conception mécanique et de la simulation numérique". Compiègne, 2012. http://www.theses.fr/2012COMP2030.
Over the last twenty years, deep changes in the field of product development have led to methodological change in design. These changes have benefited from the significant development of Information and Communication Technologies (ICT), such as PLM systems dedicated to product lifecycle management, and from collaborative engineering approaches, which play a key role in improving the product development process (PDP). In the current PLM market, solutions from different vendors still present strong heterogeneities and rely on proprietary technologies and formats for competitiveness and profitability reasons, which does not ease communication and sharing between the various ICTs contributing to the PDP. Our research focuses on the PDP and aims to contribute to improving the integrated management of mechanical design and numerical simulation data in a PLM context. The contribution proposes an engineering knowledge capitalization solution based on a product semantic relationship management approach, organized as follows: (1) a data structuring approach driven by so-called semi-structured entities, whose structure can evolve along the PDP; (2) a conceptual model describing the fundamental concepts of the proposed approach; (3) a methodology that facilitates and improves the management and reuse of engineering knowledge within design projects; and (4) a knowledge capitalization approach based on the management of semantic relationships that exist or may exist between engineering entities within the product development process.
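As an illustration of managing typed semantic relationships between engineering entities (requirements, design parts, simulations), here is a minimal sketch of such links and a traversal over them. The entity identifiers and relation types are assumptions made for the example, not the thesis' conceptual model.

```python
# Minimal sketch of typed semantic relationships between engineering entities,
# illustrating the kind of traceability described in the abstract above.
# Entity names and relation types are illustrative assumptions.

entities = {
    "REQ-12": {"kind": "requirement", "text": "max deflection < 2 mm"},
    "PART-7": {"kind": "design_part", "name": "bracket"},
    "SIM-3":  {"kind": "simulation", "result": "deflection = 1.6 mm"},
}

# Relationships stored as (source, relation, target) triples.
relations = [
    ("PART-7", "satisfies", "REQ-12"),
    ("SIM-3", "verifies", "REQ-12"),
    ("SIM-3", "analyzes", "PART-7"),
]

def related_to(entity_id: str):
    """Yield every entity linked to entity_id, with the relation direction."""
    for src, rel, dst in relations:
        if src == entity_id:
            yield rel, dst
        elif dst == entity_id:
            yield f"inverse({rel})", src

for rel, other in related_to("REQ-12"):
    print(f"REQ-12 --{rel}--> {other}: {entities[other]}")
```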
Zekri, Dorsaf. "Agrégation et extraction des connaissances dans les réseaux inter-véhicules". Thesis, Evry, Institut national des télécommunications, 2013. http://www.theses.fr/2013TELE0001/document.
The work in this thesis focuses on data management in inter-vehicular networks (VANETs). These networks consist of a set of moving objects that communicate through wireless technologies such as IEEE 802.11, Bluetooth, or Ultra Wide Band (UWB). With such communication mechanisms, a vehicle may receive information from its close neighbors or from more remote ones, thanks to multi-hop techniques that use intermediate objects as relays. A lot of information can be exchanged in VANETs, especially to alert drivers when an event occurs (an accident, emergency braking, a vehicle leaving a parking place and wanting to inform others, etc.). As they move, vehicles are thus « contaminated » by the information provided by others. In this work, we use the exchanged data in a way that differs substantially from existing work, which uses it only to produce driver alerts; once used, those data become obsolete and are destroyed. Here, we seek to dynamically generate, from the data collected by vehicles along their path, a summary (or aggregate) that provides information to drivers even when no communicating vehicle is nearby. To do this, we first propose a spatio-temporal aggregation structure enabling a vehicle to summarize all the events it observes. Next, we define a protocol for exchanging summaries between vehicles without the mediation of an infrastructure, allowing a vehicle to improve its local knowledge base through exchanges with its neighbors. Finally, we define strategies for exploiting the summary to assist the driver in decision making. We validated all of our proposals using the VESPA simulator, extending it to take the concept of summaries into account. Simulation results show that our approach can effectively help drivers make good decisions without the need for a centralized infrastructure.
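To make the spatio-temporal aggregation and summary exchange more concrete, the sketch below discretizes observed events into (grid cell, time slot, event type) counters and merges two vehicles' summaries. The discretization parameters and event types are illustrative assumptions, not the structure used in the thesis or in VESPA.

```python
# Minimal sketch of a spatio-temporal summary and of its merge when two
# vehicles exchange their knowledge. Grid/time discretization and event
# types are illustrative assumptions.

from collections import Counter

def cell_of(x: float, y: float, t: float, cell_size=100.0, slot=600.0):
    """Discretize a position (meters) and a time (seconds) into a summary key."""
    return (int(x // cell_size), int(y // cell_size), int(t // slot))

def summarize(events):
    """Aggregate raw observed events into counts per (cell, time slot, type)."""
    summary = Counter()
    for x, y, t, kind in events:
        summary[(cell_of(x, y, t), kind)] += 1
    return summary

vehicle_a = summarize([(120, 40, 305, "braking"), (130, 45, 320, "braking")])
vehicle_b = summarize([(125, 50, 330, "braking"), (900, 10, 100, "accident")])

# Exchanging summaries: each vehicle merges what it receives into its own knowledge.
merged = vehicle_a + vehicle_b
for (cell, kind), count in merged.items():
    print(f"cell {cell}: {count} x {kind}")
```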
Lassus, Saint-Geniès Géraud de. "La prise en compte des aspects économiques du défi climatique dans le régime juridique international du climat". Doctoral thesis, Université Laval, 2015. http://hdl.handle.net/20.500.11794/26033.
Book chapters on the topic "Réutilisation et échange de protocoles"
JARECKI, Stanislaw. "Échange de clé authentifié par mot de passe : protocoles et modèles de sécurité". In Cryptographie asymétrique, 241–89. ISTE Group, 2024. http://dx.doi.org/10.51926/iste.9096.ch10.