Dissertations / Theses on the topic 'Gestion des données basée sur des connaissances'
Consult the top 39 dissertations / theses for your research on the topic 'Gestion des données basée sur des connaissances.'
L'Héritier, Cécile. "Une approche de retour d’expérience basée sur l’analyse multicritère et l’extraction de connaissances : Application au domaine humanitaire." Thesis, Nîmes, 2020. http://www.theses.fr/2020NIME0001.
Because of its critical impact on performance and competitiveness, organizations' knowledge is today considered an invaluable asset. In this context, the development of methods and frameworks aiming at improving knowledge preservation and exploitation is of major interest. Within the Lessons Learned framework, which proposes relevant methods to tackle these challenges, we propose an approach mixing Knowledge Representation, Multiple-Criteria Decision Analysis and Inductive Reasoning for inferring general lessons by analyzing past experiences. The proposed approach, founded on a specific form of case-based reasoning, studies the similarities of past experiences (shared features, patterns) and their potential influence on the overall success of cases, through the identification of a set of criteria contributing most to this success. To highlight this potential causal link and infer general lessons, we rely on inductive reasoning techniques. The work is developed and validated within the scope of a humanitarian organization, Médecins Sans Frontières, with a focus on the logistical response in emergency situations.
You, Wei. "Un système à l'approche basée sur le contenu pour la recommandation personnalisée d'articles de recherche." Compiègne, 2011. http://www.theses.fr/2011COMP1922.
Personalized research paper recommendation filters publications according to the specific research interests of users, which can significantly alleviate the information overload problem. Content-based filtering is a promising solution for this task because it can effectively exploit the textual nature of research papers. A content-based recommender system usually involves three essential issues: item representation, user profiling, and a model that provides recommendations by comparing a candidate item's content representation with the target user's interest representation. In this dissertation, we first propose an automatic keyphrase extraction technique for scientific documents, which improves on existing approaches by using a more precise location for potential keyphrases and a new strategy for eliminating overlap in the output list. This technique supports both the representation of candidate papers and the analysis of users' history papers. Then, for modeling users' information needs, we present a new ontology-based approach. The basic idea is that each user is represented as an instance of a domain ontology in which concepts are assigned interest scores reflecting the user's degree of interest. We distinguish senior researchers from junior researchers by deriving their individual research interests from different history paper sets. We also take advantage of the structure of the ontology and apply a spreading activation model to reason about the interests of users. Finally, we explore a novel model that generates recommendations by resorting to Dempster-Shafer theory. Keyphrases extracted from the candidate paper are considered as sources of evidence. Each of them is linked to different levels of user interest, and the strength of each link is quantified.
Unlike other similarity measures between user profiles and candidate papers, our recommendation result, produced by combining all evidence, directly indicates the possibility that a user might be interested in the candidate paper. Experimental results show that the system we developed for personalized research paper recommendation outperforms other state-of-the-art approaches. The proposed method can also be used as a generic way of addressing different types of recommendation problems.
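As an illustration of the evidence-combination step described above, here is a minimal sketch of Dempster's rule of combination. The mass functions, focal elements and interest levels below are invented for the example; the thesis's actual model linking keyphrases to user interest is more elaborate.

```python
from itertools import product

def combine(m1, m2):
    """Dempster's rule of combination for two mass functions, each mapping
    focal elements (frozensets) to masses that sum to 1."""
    raw, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            raw[inter] = raw.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb  # mass falling on the empty set
    k = 1.0 - conflict  # normalisation constant
    if k <= 0.0:
        raise ValueError("total conflict: sources cannot be combined")
    return {s: w / k for s, w in raw.items()}

# Hypothetical example: two keyphrases act as evidence sources about
# whether the user is interested in a candidate paper.
yes, no = frozenset({"interested"}), frozenset({"not interested"})
theta = yes | no  # frame of discernment (total ignorance)
m1 = {yes: 0.6, theta: 0.4}
m2 = {yes: 0.5, no: 0.2, theta: 0.3}
combined = combine(m1, m2)  # combined[yes] = 0.68 / 0.88, about 0.77
```

Combining the two sources concentrates mass on "interested", which is what lets the combined belief be read directly as a degree of possible interest rather than a mere similarity score.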
Bruneau, Marina. "Une méthodologie de Reverse Engineering à partir de données hétérogènes pour les pièces et assemblages mécaniques." Thesis, Compiègne, 2016. http://www.theses.fr/2016COMP2267/document.
This thesis presents a methodology for Reverse Engineering (RE) of mechanical assemblies from heterogeneous data in a routine context. This activity consists in rebuilding the digital mock-up of a part or an assembly from its existing data. The data used as input to our RE process can be drawings, photos, point clouds or another existing version of the digital mock-up. The proposed approach, called Heterogeneous Data Integration for Reverse Engineering (HDI-RE), is divided into three steps: segmentation, signature, and comparison of the initial data with a knowledge database. The signatures of the studied object are compared with signatures of the same type in the database in order to extract components ordered by similarity (distance to the object). The parameterized digital mock-up most similar to the object is then extracted and its parameters identified from the initial data. Processing such "heterogeneous" data requires a solution able to manage, on the one hand, the heterogeneity of the data and the information they contain and, on the other hand, the incompleteness of some data, which is linked to noise (in point clouds) or to the technology used to digitize the assembly (e.g. scanner or photography).
Romero, Julien. "Harvesting commonsense and hidden knowledge from web services." Electronic Thesis or Diss., Institut polytechnique de Paris, 2020. http://www.theses.fr/2020IPPAT032.
In this thesis, we harvest two different types of knowledge from online resources. The first is commonsense knowledge, i.e. intuitive knowledge shared by most people, such as "the sky is blue". We extract salient statements from query logs and question-answering sites by carefully designing question patterns. Next, we validate our statements by querying other web sources such as Wikipedia, Google Books, or image tags from Flickr. We aggregate these signals to create a final score for each statement. We obtain a knowledge base, QUASIMODO, which, compared to its competitors, has better precision and captures more salient facts. The other kind of knowledge we investigate is hidden knowledge, i.e. knowledge not directly given by a data provider. More concretely, some Web services allow accessing the data only through predefined access functions. To answer a user query, we have to combine different such access functions, i.e., we have to rewrite the query in terms of the functions. We study two different scenarios. In the first, the access functions have the shape of a path, the knowledge base respects constraints called "Unary Inclusion Dependencies", and the query is atomic. We show that the problem is decidable in polynomial time, and we provide an algorithm with theoretical guarantees. In the second scenario, we remove the constraints and define a new class of relevant plans called "smart plans". We show that finding these plans is decidable, and we provide an algorithm.
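The path-shaped scenario can be illustrated with a small sketch: each access function stands for a predefined Web-service call, and an atomic query is answered by composing them. The functions and data below are invented for the example; the thesis's actual algorithm additionally reasons over Unary Inclusion Dependencies.

```python
def execute_path_plan(plan, seed):
    """Execute a path-shaped plan: the outputs of each access function
    are fed as inputs to the next one."""
    frontier = {seed}
    for fn in plan:
        frontier = {y for x in frontier for y in fn(x)}
    return frontier

# Hypothetical access functions of a music Web service: each one only
# answers a predefined call, as in the limited-access setting.
albums = {"Elvis": ["Elvis Presley", "Elvis Is Back!"]}
tracks = {"Elvis Presley": ["Blue Suede Shoes"], "Elvis Is Back!": ["Fever"]}

def get_albums(artist):
    return albums.get(artist, [])

def get_tracks(album):
    return tracks.get(album, [])

# The atomic query "which songs did this artist record?" is rewritten as
# the composition get_tracks(get_albums(artist)).
songs = execute_path_plan([get_albums, get_tracks], "Elvis")
```

The rewriting problem is precisely to find such a chain of calls whose composition answers the query, when no function answers it directly.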
Pérution-Kihli, Guillaume. "Data Management in the Existential Rule Framework : Translation of Queries and Constraints." Electronic Thesis or Diss., Université de Montpellier (2022-....), 2023. http://www.theses.fr/2023UMONS030.
The general context of this work is the design of high-quality systems that integrate multiple data sources via a semantic layer encoded in a knowledge representation and reasoning language. We consider knowledge-based data management (KBDM) systems, which are structured in three layers: the data layer, which comprises the data sources; the knowledge (or ontological) layer; and the mappings between the two. Mappings and knowledge are expressed within the existential rule framework. One of the intrinsic difficulties in designing a KBDM system is the need to understand the content of the data sources. Data sources are often provided with typical queries and constraints, from which valuable information about their semantics can be drawn, as long as this information is made intelligible to KBDM designers. This motivates our core question: is it possible to translate data queries and constraints to the knowledge level while preserving their semantics? The main contributions of this thesis are the following. We extend previous work on data-to-ontology query translation with new techniques for the computation of perfect, minimally complete, or maximally sound query translations. Concerning data-to-ontology constraint translation, we define a general framework and apply it to several classes of constraints. Finally, we provide a sound and complete query rewriting operator for disjunctive existential rules and disjunctive mappings, as well as undecidability results, which are of independent interest.
Antoine, Emilien. "Distributed data management with a declarative rule-based language webdamlog." Phd thesis, Université Paris Sud - Paris XI, 2013. http://tel.archives-ouvertes.fr/tel-00933808.
Yassir-Montet, Djida. "Une Approche hybride de gestion des connaissances basée sur les ontologies : application aux incidents informatiques." Lyon, INSA, 2006. http://theses.insa-lyon.fr/publication/2006ISAL0110/these.pdf.
The work developed in this thesis deals with the problem of knowledge management within organizations. Its aim is to propose methodological assistance and tools facilitating the capitalization and reuse of knowledge produced and/or used by the actors of a company. We suggest a hybrid approach based on a domain ontology. Setting up this approach requires the use of Semantic Web techniques such as XML and OWL. In this context, we have designed a corporate memory able to interface with the information system of the company. Focusing on the context of software applications produced and/or maintained by CIRTIL, an organization attached to the French social security on behalf of its customers, the URSSAF, we have developed a domain ontology of data-processing incidents. This ontology is regarded as a model that allows not only formal and consensual representations of domain knowledge, but also unified access to knowledge resources thanks to a shareable and unambiguous terminology, while offering mechanisms for reasoning on the modelled knowledge.
Chamekh, Fatma. "L’évolution du web de données basée sur un système multi-agents." Thesis, Lyon, 2016. http://www.theses.fr/2016LYSE3083/document.
In this thesis, we investigate the evolution of RDF datasets from documents and the LOD cloud. We identify the following issues: the integration of new triples, the proposition of changes taking data quality into account, and the management of different versions. To handle the complexity of web of data evolution, we propose an agent-based argumentation framework, assuming that agent specifications can facilitate the process of RDF dataset evolution; agent technology is one of the most useful solutions for coping with complex problems. The agents work as a team and are autonomous, in the sense that they have the ability to decide themselves which goals they should adopt and how these goals should be achieved. The agents use argumentation theory to reach a consensus about the best change alternative. To this end, we propose an argumentation model based on metrics related to the intrinsic quality dimensions. To keep a record of all the modifications that have occurred, we focus on resource versioning. In a collaborative environment, several conflicts can be generated; to manage those conflicts, we define rules. The application domain is general medicine.
Jaillet, Simon. "Catégorisation automatique de documents textuels : D'une représentation basée sur les concepts aux motifs séquentiels." Montpellier 2, 2005. http://www.theses.fr/2005MON20030.
Ouertani, Mohamed Zied. "DEPNET : une approche support au processus de gestion de conflits basée sur la gestion des dépendances de données de conception." Phd thesis, Université Henri Poincaré - Nancy I, 2007. http://tel.archives-ouvertes.fr/tel-00163113.
It is the management of this phenomenon, conflict, that the work presented in this dissertation addresses, and more particularly conflict management through negotiation. We propose the DEPNET approach (product Data dEPendencies NETwork identification and qualification) to support the conflict management process based on the management of dependencies between data. These data, exchanged and shared between the various stakeholders, are the very essence of the design activity and play a key role in the progress of the design process.
This approach provides methodological elements to: (1) identify the negotiation team that will be responsible for resolving the conflict, and (2) manage the impacts of the solution adopted after the conflict is resolved. The contributions of this research work are implemented in the DEPNET software prototype, which we validate on an industrial case study drawn from the design of a turbocharger.
Pierret, Jean-Dominique. "Méthodologie et structuration d'un outil de découverte de connaissances basé sur la littérature biomédicale : une application basée sur l'exploitation du MeSH." Toulon, 2006. http://tel.archives-ouvertes.fr/tel-00011704.
The information available in bibliographic databases is dated, validated by a long process, and therefore rarely novel. Bibliographic databases are usually consulted in a Boolean way, and the result of a query is a set of already known references that bring no additional novelty. In 1985, Don Swanson proposed an original method for drawing innovative information out of bibliographic databases. His reasoning is based on the systematic use of the biomedical literature to draw latent connections between different well-established pieces of knowledge, and he thus demonstrated the unsuspected potential of bibliographic databases for knowledge discovery. The value of his work lay not in the nature of the available information but in the methodology he used. This general methodology was mainly applied to validated and structured information, namely bibliographic information. We propose to test the robustness of Swanson's theory by implementing methods inspired by it; these methods led to the same conclusions as Swanson's. We then explain how we developed a knowledge discovery system based on the literature available from public biomedical information sources.
Chazara, Philippe. "Outils d'élaboration de stratégie de recyclage basée sur la gestion des connaissances : application au domaine du génie des procédés." Thesis, Toulouse, INPT, 2015. http://www.theses.fr/2015INPT0141/document.
In this work, we study the creation of a new methodology for generating and assessing new waste recovery processes, for which three elements are proposed. The first is a modelling framework permitting a structured and homogeneous representation of each recovery process and of the criteria used to assess them. The second is a system and tool that generate new recovery processes from known ones. The last is another tool to model, estimate and assess the generated processes. The modelling framework defines categories of elements that structure unit operations at different levels of description. Three levels have been identified. At the highest level, the Generic operation describes the global structure of operations. The second, Generic treatment, is an intermediate level between the two others; it also proposes categories of operations, but more detailed than at the higher level. The last is the Unit operation. A second, more conceptual framework has been created, with two components: blocks and systems. These frameworks are used with a set of selected indicators. To anchor our work in a sustainable development approach, an indicator has been chosen for each of its components: economic, environmental and social. In our study, the social impact is limited to the number of created jobs; to estimate this indicator, we propose a new method based on the economic values of a company. The tool for generating new waste recovery processes uses case-based reasoning (CBR), a methodology grounded in knowledge management, and several difficult points are treated here to adapt CBR to our problem. The structuring of knowledge, and more generally the generation of source cases, is handled by a system based on connections between data and the use of inference mechanisms.
A new similarity measure is designed by introducing the concept of common definition, which links states (simple descriptions of objects) to other states at different levels of conceptualization and abstraction, thereby permitting many levels of description. Finally, each recovery process is decomposed from a main problem into sub-problems; this decomposition is part of the adaptation mechanism applied to the selected source case. The system is implemented in logic programming with Prolog, whose rules support inference and whose backtracking mechanism allows exploring the different possible solutions. The modelling and assessment of recovery processes are done by a tool programmed in Python. It uses metaprogramming to dynamically create models of operations or systems. Constraint rules define the behaviour of these models, controlling the fluxes circulating through each one. In the evaluation step, a parser converts these rules into a homogeneous constraint-programming system, which can be solved by solvers through an interface developed for this purpose and added to the tool. The user can therefore add solvers, but also plug-ins; these plug-ins perform the assessment of the activity, allowing different kinds of evaluation for the same criterion. Three plug-ins are developed, one for each selected criterion. Both methods are tested to evaluate the proposed model and to check their behaviour and limits; for these tests, a case base on waste has been created. Finally, the modelling and assessment tool is applied to a case study on the recovery of used tyres into new raw material.
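The retrieval step of case-based reasoning mentioned above can be sketched minimally as weighted attribute matching over a toy case base. The attributes, weights and cases below are invented for illustration; the thesis's similarity measure based on common definitions is considerably richer.

```python
def similarity(case, target, weights):
    """Weighted share of attributes on which a stored case matches the
    target problem description."""
    total = sum(weights.values())
    matched = sum(w for attr, w in weights.items()
                  if case.get(attr) == target.get(attr))
    return matched / total

def retrieve(case_base, target, weights):
    """Return the source case most similar to the target problem."""
    return max(case_base, key=lambda c: similarity(c, target, weights))

# Hypothetical case base of known recovery processes.
case_base = [
    {"waste": "used tyres", "treatment": "pyrolysis", "output": "carbon black"},
    {"waste": "plastic film", "treatment": "pyrolysis", "output": "fuel oil"},
    {"waste": "glass", "treatment": "melting", "output": "cullet"},
]
target = {"waste": "used tyres", "treatment": "pyrolysis"}
weights = {"waste": 2.0, "treatment": 1.0}
best = retrieve(case_base, target, weights)  # the used-tyres case
```

The retrieved case's solution part (here, its output) is then what the adaptation mechanism reworks for the new problem.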
Kexel, Christoph. "La production et la gestion des connaissances dans le marketing des moteurs de recherche : une approche basée sur l'apprentissage par l'action." Nice, 2012. http://www.theses.fr/2012NICE0034.
Since the late 1980s, the development of the Internet has empowered customers to choose from an extensive range of products and suppliers and to make purchases more efficiently and flexibly (Chaffey et al., 2006). In online marketing, the potential customer opens a communication channel to the marketer by seeking information with search engines (Chaffey et al., 2006; Bogner, 2006), so that 'being found' represents the crucial success factor for today's organizations. However, modern firms are surprisingly sluggish in appreciating the critical importance of online marketing (Schubring, 2008). This dissertation aims to design a methodology for improving marketers' knowledge and online marketing skills. It initially develops an action-learning-based model for conveying the basic concepts necessary for marketing practitioners. This methodology is then applied and evaluated in a controlled higher education environment, with the same expertise also conveyed to a control group in a traditional form. An analysis of the efficacy of the different knowledge-transfer methods is provided in order to identify the optimal learning approach. The data collected on assumed knowledge improvements illustrate that optimal results have indeed been obtained in all subcategories of Internet marketing by the proposed framework. The dissertation concludes that the methodology is able to instill the know-how practitioners require to cope with, and make use of, the World Wide Web despite its vagaries. The findings advance business practice by providing a possibility of effective on-the-job training for the key organizational function of online marketing, which may facilitate corporate learning models in a more general context.
Darwich, Akoum Hind. "Approche organisationnelle basée sur le paradigme agent pour la synthèse et la réutilisation des connaissances en ingénierie collaborative." Thesis, Université de Lorraine, 2014. http://www.theses.fr/2014LORR0224/document.
It is well known in enterprises that each new project to be carried out is usually similar to certain previous projects, which can be structured according to a common reference process depending on their type. In this dissertation, we call this reference process of the enterprise's good practices the "expertise business process". The main difficulty lies in its formalization: traditional knowledge capitalization approaches based on expert debriefings have shown their limits, as experts often leave out relevant details because debriefings are habitually conducted outside the activities themselves. Our thesis relies on the idea that it is possible to construct the operational process implemented during the collaborative activities of a product development study from the traces recorded by the IT tools used. The constructed operational process allows business actors and experts to step back from their work and formalize the newly deduced experience, enhancing the expertise business processes of the firm. Our work took place in the ERPI (Equipe de Recherche sur les Processus Innovatifs) laboratory of the Université de Lorraine, in partnership with the TDC Software company through a CIFRE agreement. This dissertation offers five key contributions: • A double cycle to capitalize on instrumented activities. • A global approach for the management of expertise business processes. • An ontology, "OntoProcess", modelling the generic organizational aspects, distinctly separating the concepts related to traces from those related to business processes, and providing extensions depending on the tools used. • A multi-agent system based on the ontology "OntoProcess" to support the presented global approach to expertise business process management. • A trace-based system that constructs the operational process from the traces registered during the study.
Dellal, Ibrahim. "Gestion et exploitation de larges bases de connaissances en présence de données incomplètes et incertaines." Thesis, Chasseneuil-du-Poitou, Ecole nationale supérieure de mécanique et d'aérotechnique, 2019. http://www.theses.fr/2019ESMA0016/document.
In the era of digitalization, and with the emergence of several Semantic Web applications, many new knowledge bases (KBs) are available on the Web. These KBs contain (named) entities and facts about these entities, as well as the semantic classes of these entities and their mutual links. In addition, multiple KBs can be interconnected by their entities, forming the core of the linked data web. A distinctive feature of these KBs is that they contain millions to trillions of unreliable RDF triples. This uncertainty has multiple causes: it can result from the integration of data sources with various levels of intrinsic reliability, it can be introduced to preserve confidentiality, or it may be due to a lack of information, the limits of measuring equipment, or the evolution of information. The goal of this thesis is to improve the usability of modern systems aiming at exploiting uncertain KBs. In particular, this work proposes cooperative and intelligent techniques that help the user in his decision-making when his query returns unsatisfactory results in terms of quantity or reliability. First, we address the problem of failing RDF queries, i.e., queries that return an empty set of answers. This type of response is frustrating and does not meet the user's expectations. The approach proposed to handle this problem is query-driven and offers a twofold advantage: (i) it provides the user with a rich explanation of the failure of his query by identifying the MFS (Minimal Failing Sub-queries), and (ii) it allows the computation of alternative queries called XSS (maXimal Succeeding Sub-queries), semantically close to the initial query, with non-empty answers.
Moreover, from a user's point of view, this solution offers a high level of flexibility, given that several degrees of uncertainty can be considered simultaneously. In the second contribution, we study the dual problem, i.e., queries whose execution results in a very large set of answers. Our solution aims at reducing this answer set to enable its analysis by the user. Counterparts of MFS and XSS have been defined; they allow the identification, on the one hand, of the causes of the problem and, on the other hand, of alternative queries whose results are of reasonable size and can therefore be directly and easily used in the decision-making process. All our propositions have been validated with a set of experiments on different uncertain and large-scale knowledge bases (WatDiv and LUBM), using several triplestores to conduct the tests.
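The MFS/XSS notions can be illustrated with a deliberately naive, brute-force sketch over a tiny conjunctive query. The failure oracle and triple-pattern names below are invented for the example; practical approaches avoid enumerating all sub-queries.

```python
from itertools import combinations

def mfs_and_xss(patterns, fails):
    """Brute-force enumeration of the Minimal Failing Sub-queries (MFS)
    and maXimal Succeeding Sub-queries (XSS) of a conjunctive query.
    `patterns` is the set of triple patterns; `fails(q)` tells whether
    sub-query q returns an empty answer set."""
    subsets = [frozenset(c)
               for r in range(1, len(patterns) + 1)
               for c in combinations(sorted(patterns), r)]
    failing = {s for s in subsets if fails(s)}
    succeeding = set(subsets) - failing
    # MFS: failing sub-queries with no failing proper subset.
    mfs = {s for s in failing if not any(t < s for t in failing)}
    # XSS: succeeding sub-queries with no succeeding proper superset.
    xss = {s for s in succeeding if not any(s < t for t in succeeding)}
    return mfs, xss

# Hypothetical query with three triple patterns; t2 is the sole culprit:
# any sub-query containing it fails.
query = {"t1", "t2", "t3"}
fails = lambda q: "t2" in q
mfs, xss = mfs_and_xss(query, fails)
# mfs == {frozenset({"t2"})}, xss == {frozenset({"t1", "t3"})}
```

The MFS tells the user *why* the query failed (t2 alone is unsatisfiable), while the XSS is the closest relaxation that still returns answers.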
Jemaa, Adel. "Processus d’absorption, Innovation & Productivité : Analyse empirique sur données d’entreprises." Caen, 2014. http://www.theses.fr/2014CAEN0504.
The thesis deals with the conceptualization and assessment of the ability of firms to absorb external knowledge, and with the impact of this capability on innovation and productivity. The first contribution consists in modeling absorption capacity as a process integrated into the company's innovation processes. This absorption process is defined and modeled in an original way through a network of interactions between different activities or capacities: internal absorption capacity, the capacity to access external knowledge, and the capacity to cooperate. The second contribution is an analytical treatment of the issue, integrating absorption capacity and cognitive distance simultaneously through an innovation function. This model distinguishes between a theoretical absorption capacity and an effective absorption capacity that takes cognitive distance into account. The third contribution consists, first, in measuring the intensity of these different capabilities and estimating the causal relationships between them: whether internal absorption capacity determines the ability to access external knowledge, which in turn determines the ability to cooperate. Secondly, the thesis focuses on the influence of the intensity of cooperation on business performance (innovation output, labor productivity, TFP). Finally, it discusses the impact of the company's performance on its internal absorption capacity.
Alili, Hiba. "Intégration de données basée sur la qualité pour l'enrichissement des sources de données locales dans le Service Lake." Thesis, Paris Sciences et Lettres (ComUE), 2019. http://www.theses.fr/2019PSLED019.
In the Big Data era, companies are moving away from traditional data warehouse solutions, with their expensive and time-consuming ETL (Extract, Transform, Load) processes, towards data lakes in order to manage their ever-growing data. Yet the knowledge stored in companies' databases, even in the constructed data lakes, can never be complete and up-to-date, because of the continuous production of data. Local data sources often need to be augmented and enriched with information coming from external data sources. Unfortunately, data enrichment is largely a manual process undertaken by experts, who enrich data by adding information based on their expertise or select relevant data sources to complete missing information. Such work can be tedious, expensive and time-consuming, making it very promising for automation. We present in this work an active, user-centric data integration approach to automatically enrich local data sources, in which the missing information is leveraged on the fly from web sources using data services. Accordingly, our approach enables users to query for information about concepts that are not defined in the data source schema. In doing so, we take into consideration a set of user preferences, such as the cost threshold and the response time necessary to compute the desired answers, while ensuring good quality of the obtained results.
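The user-preference filtering described above (cost threshold, response time) can be sketched as a simple service-selection step. The service catalogue, attribute names and quality scores below are invented for illustration; the actual approach composes data services and ranks answers, not merely candidate services.

```python
def select_services(services, cost_threshold, max_response_time):
    """Filter candidate data services by the user's cost and response-time
    preferences, then rank the survivors by data quality."""
    eligible = [s for s in services
                if s["cost"] <= cost_threshold
                and s["response_time"] <= max_response_time]
    return sorted(eligible, key=lambda s: s["quality"], reverse=True)

# Hypothetical service catalogue (cost per call, latency in seconds,
# quality score in [0, 1]).
services = [
    {"name": "svcA", "cost": 0.0, "response_time": 1.2, "quality": 0.7},
    {"name": "svcB", "cost": 2.5, "response_time": 0.4, "quality": 0.9},
    {"name": "svcC", "cost": 9.0, "response_time": 0.3, "quality": 0.95},
]
ranked = select_services(services, cost_threshold=5.0, max_response_time=2.0)
# svcB (quality 0.9) ranks before svcA (quality 0.7); svcC exceeds the cost budget
```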
Tawbi, Chawki. "Adactif : extension d'un SGBD à l'activité par une approche procédurale basée sur les rendez-vous." Toulouse 3, 1996. http://www.theses.fr/1996TOU30262.
Eude, Thibaut. "Forage des données et formalisation des connaissances sur un accident : Le cas Deepwater Horizon." Thesis, Paris Sciences et Lettres (ComUE), 2018. http://www.theses.fr/2018PSLEM079/document.
Data drilling, the method and means developed in this thesis, redefines the process of data extraction and of knowledge formalization and enrichment, particularly for the elucidation of events that have been documented only poorly or not at all. The Deepwater Horizon disaster, in which the drilling platform operated for BP in the Gulf of Mexico suffered a blowout on April 20, 2010, is our case study for the implementation of our proof of concept for data drilling. This accident resulted from an unprecedented discrepancy between the state of the art of drilling engineers' heuristics and that of pollution response engineers. The loss of control of the MC 252-1 well was therefore an engineering failure, and it took the response party eighty-seven days to regain control of the wild well and halt the pollution. Deepwater Horizon is in this sense a case of engineering facing an extreme situation, as defined by Guarnieri and Travadel. First, we propose to revisit the overall concept of accident by means of an in-depth linguistic analysis presenting the semantic spaces in which the accident takes place. This makes it possible to enrich its "core meaning" and broaden the shared acceptance of its definition. Then, we argue that the literature review must be systematically supported by algorithmic assistance to process the data, taking into account the available volume, the heterogeneity of the sources, and the requirements of quality and relevance standards. In fact, more than eight hundred scientific articles mentioning this accident have been published to date, and some twenty investigation reports, constituting our research material, have been produced. Our method demonstrates the limitations of accident models when dealing with a case like Deepwater Horizon and the urgent need for an appropriate way to formalize knowledge. As a result, the use of upper-level ontologies should be encouraged.
The DOLCE ontology proved highly valuable for formalizing knowledge about this accident, and especially for elucidating, very accurately, a decision-making process at a critical moment of the intervention. Population, the creation of instances, is at the heart of exploiting an ontology and is its main interest, but the process is still largely manual and not free of mistakes. This thesis proposes a partial answer to this problem through an original NER (Named Entity Recognition) algorithm for the automatic population of an ontology. Finally, the study of accidents involves determining the causes and examining "socially constructed facts". This thesis presents the original design of a "semantic pipeline" built with a series of algorithms that extract the causality expressed in a document and produce a graph representing the "causal path" underlying the document. It is significant for scientific or industrial research to highlight the reasoning behind the findings of the investigation team. To do this, this work leverages developments in Machine Learning and Question Answering, and especially Natural Language Processing tools. In conclusion, this thesis is the work of a fitter, an architect: it offers both a first insight into the Deepwater Horizon case and proposes data drilling, an original method and means to address an event, in order to uncover, from the research material, answers to questions that had previously escaped understanding.
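The automatic population step described in this abstract can be illustrated with a minimal gazetteer-style sketch. The entity patterns, class names and the `populate` helper below are hypothetical simplifications for illustration, not the NER algorithm developed in the thesis:

```python
import re

# Toy gazetteer-style NER: map surface patterns to ontology classes.
# Patterns and class names are illustrative assumptions, not the thesis's model.
PATTERNS = {
    "Vessel": re.compile(r"\bDeepwater Horizon\b"),
    "Well": re.compile(r"\bMC 252-1\b"),
    "Organization": re.compile(r"\bBP\b"),
}

def populate(text):
    """Return candidate ontology instances as (class, mention) pairs found in the text."""
    instances = []
    for cls, pattern in PATTERNS.items():
        for match in pattern.finditer(text):
            instances.append((cls, match.group()))
    return instances

report = "BP operated the Deepwater Horizon over the MC 252-1 well."
print(sorted(populate(report)))
# [('Organization', 'BP'), ('Vessel', 'Deepwater Horizon'), ('Well', 'MC 252-1')]
```

A real pipeline would replace the regex dictionary with a trained NER model and write the resulting instances into the DOLCE-aligned ontology rather than a list.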
Zouikri, Messaoud. "Stratégies de R et D et innovation dans l’industrie pharmaceutique en France : une étude économétrique sur données individuelles." Paris 9, 2008. https://bu.dauphine.psl.eu/fileviewer/index.php?doc=2008PA090071.
Full textThe thesis puts forth an empirical methodology to study the link between modes of knowledge production and innovation, based on a functional (basic research, applied research and development) and organisational (internal research, external research) analysis of R&D activities. To do so, it uses survey data on R&D and innovation at the firm level in order to estimate, through appropriate econometric models, both the link between firm size and R&D modes and the relationship between R&D strategies and innovation in the pharmaceutical industry in France. The main results show that the R&D components increase non-linearly and less than proportionally with the size of the firm. Moreover, small and medium-sized firms are not significantly different from large firms in the production of original innovations. The different R&D modes contribute to innovation in complementary ways.
Razmerita, Liana. "Modèle utilisateur et modélisation utilisateur dans les systèmes de gestion des connaissances : une approche fondée sur les ontologies." Toulouse 3, 2003. http://www.theses.fr/2003TOU30179.
Full textJlassi, Aymen. "Optimisation de la gestion des ressources sur une plate-forme informatique du type Big Data basée sur le logiciel Hadoop." Thesis, Tours, 2017. http://www.theses.fr/2017TOUR4042.
Full text"Cyres-Group" is working to improve the response time of its Hadoop clusters and to optimize how resources are exploited in its data center. That is, the goals are to finish work as soon as possible and to reduce the latency experienced by each user of the system. Firstly, we decided to work on the scheduling problem in the Hadoop system. We consider it as the problem of scheduling a set of jobs on a homogeneous platform. Secondly, we decided to propose tools that provide more flexibility in resource management in the data center and ensure the integration of Hadoop into Cloud infrastructures without unacceptable loss of performance. Next, the second level focuses on the review of the literature. We conclude that existing works use simple mathematical models that do not reflect the real problem: they ignore the main characteristics of the Hadoop software. Hence, we propose a new model that takes into account the most important aspects, such as resource management, the precedence relations among tasks, and data management and transfer. Thus, we model the problem. We begin with a simplistic model and consider the minimisation of the makespan (Cmax) as the objective function. We solve the model with the mathematical solver CPLEX and compute a lower bound. We propose the heuristic "LocFirst", which aims to minimize the Cmax. In the third level, we consider a more realistic modelling of the scheduling problem. We aim to minimize a weighted sum of the following objectives: the weighted flow time (∑ wjCj) and the makespan (Cmax). We compute a lower bound and propose two heuristics to solve the problem.
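The bi-objective function mentioned in this abstract (weighted flow time plus makespan) can be sketched on a toy single-machine instance. The job data, weights and the `objective` helper are invented for illustration and are not the thesis's model or heuristics:

```python
def objective(completion_times, weights, alpha=1.0, beta=1.0):
    """Weighted sum of total weighted completion time (sum w_j * C_j) and makespan (Cmax)."""
    weighted_flow = sum(w * c for w, c in zip(weights, completion_times))
    cmax = max(completion_times)
    return alpha * weighted_flow + beta * cmax

# Three jobs processed back-to-back on one machine, processing times 3, 2, 4.
processing = [3, 2, 4]
completion = []
t = 0
for p in processing:
    t += p
    completion.append(t)  # C_j = finish time of job j

weights = [2, 1, 1]
print(completion)                      # [3, 5, 9]
print(objective(completion, weights))  # (2*3 + 1*5 + 1*9) + 9 = 29.0
```

A scheduling heuristic would search over job orderings (and, in Hadoop's case, resource assignments) to minimize this value; the sketch only evaluates one fixed sequence.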
Reeveerakul, Napaporn. "Un système d’aide à la décision pour la réorganisation des chaînes logistiques : une approche basée sur une analyse multicritère et un système de gestion des connaissances." Thesis, Lyon 2, 2011. http://www.theses.fr/2011LYO20007/document.
Full textInvestment by foreign manufacturers in developing countries contributes rapidly and increasingly to the economic growth of the host countries. Such contributions include the creation of new jobs by foreign companies, the increasing use of multinational distribution networks, and even spending on research and development in support of national projects. These have led to higher productivity through increased capital, which in turn has led to higher living standards. Consequently, several developing countries seek to attract foreign investors to set up manufacturing businesses from which they can gain benefits. However, they face many critical challenges linked to economic turbulence, from rising labor costs to ineffective supply chains and infrastructure. Crises raise many problems that cause Foreign Direct Investments (FDIs) to lose profits and incur higher operational costs. As a result, new and existing foreign investors are reluctant to invest in or expand their businesses, and several strategies can be considered: relocating to countries with cheaper labor, or shutting down operations. However, these situations impact not only the internal organization but also the local and global economic situation. Since crises can affect people's income and raise economic problems, they ultimately lead to social problems in the area. Thus, to sustain business operations and attract new FDIs, this research aims to help manufacturers understand the existing crises in their business situation and then make the right decision, by providing them with a tool to validate their future investment decisions. Furthermore, providing useful information on FDI investment also contributes to attracting new investors. We therefore focus on the following questions. 
1. What are the potential factors used in decision making when FDIs face crises? 2. How can the study help manufacturers make the right decision in manufacturing crises? 3. When deciding whether to relocate, transfer or divest plants, what specific factors should be considered? 4. How can the relevant organizations and the government help prevent the crises generated by offshoring or plant divestment? To answer these questions, our research proposes an integrated framework based on three main requirements: the supply chain and infrastructure, worker skills and performance, and the financial situation associated with the relevant stakeholders. The three stakeholders are foreign investors, local industrial estate stakeholders, and manufacturers. The strategies analyzed in our research framework can be categorized into static and dynamic analysis. In the static analysis, a risk matrix decision, represented as a knowledge-base system, is used to evaluate the occurrence of existing risks in businesses. This analysis also helps investors or manufacturers evaluate the risks related to their existing businesses. In the dynamic analysis, a supply chain simulation model is constructed according to the Supply Chain Operations Reference (SCOR) model. Supply chain modeling and the analysis of future investment costs are represented in this context. Our research also applies SCOR-based metrics and attributes to measure supply chain performance.
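A risk matrix of the kind used in the static analysis can be sketched as a simple likelihood/impact lookup. The scales, thresholds and labels below are generic assumptions for illustration, not the calibration used in the thesis:

```python
# Generic 3x3 likelihood/impact risk matrix, each axis scored 1..3.
LEVELS = {"low": 1, "medium": 2, "high": 3}

def risk_rating(likelihood, impact):
    """Classify a risk by the product of its likelihood and impact scores."""
    score = LEVELS[likelihood] * LEVELS[impact]
    if score >= 6:
        return "critical"
    if score >= 3:
        return "moderate"
    return "acceptable"

print(risk_rating("high", "high"))   # critical (score 9)
print(risk_rating("low", "medium"))  # acceptable (score 2)
```

In a knowledge-base system, each cell of the matrix would additionally be linked to recommended actions (e.g. relocate, transfer, divest) rather than just a label.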
Leroy, Yann. "Développement d'une méthodologie de fiabilisation des prises de décisions environnementales dans le cadre d'analyses de cycle de vie basée sur l'analyse et la gestion des incertitudes sur les données d'inventaires." Phd thesis, Paris, ENSAM, 2009. http://pastel.archives-ouvertes.fr/pastel-00005830.
Full textSantorineos, Manthos. "Une approche sur la recherche et l’art à l’époque du numérique à travers l’étude des systèmes artificiels d’organisation de la mémoire humaine : nécessité et proposition d’un outil d’enregistrement et de synthèse des données susceptible de contribuer à la mise en place d’une nouvelle forme de doctorat ("doctorat_machine")." Paris 8, 2006. http://octaviana.fr/document/119090325#?c=0&m=0&s=0&cv=0.
Full textSociety exists in a state of hybridity with respect to memory storage, between a "paper" culture and the culture of digital technology, while in many contexts the results of this situation are not yet apparent, as digital technology is used to enhance existing conditions rather than to develop new ones. An important role in clarifying this situation will be played by a change in research method that puts the capacities of the new memory-processing system into practice. In the realm of digital art both conditions coexist, yet the artist-researcher can bring to fruition the advantages of digital technology, characterised by the notion of "platform" rather than the notion of "media" that was valid up to the electronic revolution. In the course of the research, a system for optimising the digital capacities of knowledge processing was created to tackle the complexities and problems inherent in the object of study. In the context of the perspectives of digital culture, the hypothesis is examined that this system, defined as the "doctorate_machine", could constitute a new thesis form with respect to research documentation and presentation.
Ben, Arfi Wissal. "Partage des connaissances : articulation entre management de l'innovation et management des connaissances : cas des plateformes d'innovation d'un groupe leader du secteur agroalimentaire en Tunisie." Thesis, Grenoble, 2014. http://www.theses.fr/2014GRENG016/document.
Full textGlobalization, adaptation to consumer needs, and the creation of new products represent permanent challenges in meeting market demand and make innovation a strategic axis to pursue. These requirements have created an increasing need to rethink the innovation process, and more particularly the management of innovative projects, through the implementation of innovation platforms. Team work is dedicated to innovation, where the members involved have transverse knowledge and are capable of sharing it in order to innovate. This doctoral work examines and identifies the role of knowledge sharing between the members of innovation platforms in the elaboration of an innovation process. It deals with the following research question: how does knowledge sharing within innovation platforms favour the emergence of innovations? By examining the phenomenon of knowledge sharing through an articulation between innovation management and knowledge management, this research digs deep into the Knowledge-based View of the firm, where cognitive and social practices play an important role in innovation. Through qualitative research, three case studies were conducted within three innovation platforms of a leading group in the food industry sector in Tunisia. We were able to identify the organizational devices which influence the interactions between the members involved in the innovation platforms. The deep examination of the practices of each innovation platform allowed us to observe the following phenomenon: although the implementation of the innovation platforms is linked to a managerial strategy based on innovation and refers to a common basic model, the practices of each innovation platform are specific. 
Beyond the transverse quality of their structure, the innovation platforms appear as a relatively flexible formula that every enterprise of the group can appropriate and adapt to its context and its constraints. In the three case studies, knowledge sharing appears as an organizational and technological approach aimed at sharing and integrating knowledge between the members of an innovation platform in order to innovate. The interest of this approach lies not in the knowledge itself but in "who" holds it and "how" it is shared within the innovation platform. Finally, two elements allowed a better understanding of knowledge sharing: strategic management and corporate culture. Our study of the innovation platforms shows that when there is a deliberate action to establish an innovation initiative based on knowledge sharing, this action becomes crucial to the life of the organization, insofar as it can challenge its balance of power, arousing the enthusiasm of certain actors and the distrust of others. This thesis, with its three case studies of innovation platforms, considers not only the cultural level of knowledge-sharing practices but also the identity level. The praxis is the following: knowledge sharing between the members of the innovation platforms enhances the emergence of innovations within the companies under study. In terms of managerial contribution, we consider that knowledge sharing within innovation platforms represents an evolving strategic action for innovation initiatives.
Toure, Carine. "Capitalisation pérenne de connaissances industrielles : Vers des méthodes de conception incrémentales et itératives centrées sur l’activité." Thesis, Lyon, 2017. http://www.theses.fr/2017LYSEI095/document.
Full textIn this research, we are interested in the question of the sustained use of knowledge management systems (KMS) in companies. KMS are the IT environments set up in companies to share knowledge and build common expertise among collaborators. Findings show that, despite the rigor employed by companies in implementing these KMS, the risk of knowledge management initiatives being unsuccessful, particularly as regards the acceptance and continuous use of these environments by users, remains prevalent. The persistence of this fact in companies has motivated our interest in contributing to this general research question. As contributions to this problem, we have 1) identified, from the state of the art, four facets that are required to promote the perennial use of a knowledge management platform; 2) proposed a theoretical model of mixed regulation that unifies tools for self-regulation and tools to support change, and allows the continuous implementation of the various factors that stimulate the sustainable use of KMS; 3) proposed a design methodology, adapted to this model and based on Agile concepts, which incorporates a mixed evaluation methodology of satisfaction and effective use, as well as HCI tools for carrying out the different iterations of our methodology; 4) implemented the methodology in a real context at the Société du Canal de Provence, which allowed us to test its feasibility and propose generic adjustments/recommendations to designers for its application in context. The tool resulting from our implementation was positively received by the users in terms of satisfaction and usage.
Vosgien, Thomas. "Ingénierie systèmes basée sur les modèles appliquée à la gestion et l'intégration des données de conception et de simulation : application aux métiers d'intégration et de simulation de systèmes aéronautiques complexes." Thesis, Châtenay-Malabry, Ecole centrale de Paris, 2015. http://www.theses.fr/2015ECAP0011/document.
Full textThe aim of this doctoral thesis is to facilitate design, integration and simulation activities in the aeronautics industry and, more generally, in the context of collaborative complex product development. This objective is to be achieved through the use and improvement of digital engineering capabilities. During the last decade, the Digital Mock-Up (DMU) – supported by Product Data Management (PDM) systems – became a key federating environment for exchanging and sharing a common 3D CAD model-based product definition between co-designers. It enables designers and downstream users (analysts) to access the geometry of the product assembly. While enhancing 3D and 2D simulations in a collaborative and distributed design process, the DMU offers new perspectives for analysts to retrieve the appropriate CAD data inputs used for Finite Element Analysis (FEA), thus speeding up the simulation preparation process. However, current industrial DMUs suffer from several limitations, such as a lack of flexibility in terms of content and structure, a lack of digital interface objects describing the relationships between their components, and a lack of integration with simulation activities and data. This PhD underlines the DMU transformations required to provide adapted DMUs that can be used as direct input for large-assembly FEA. These transformations must be consistent with the simulation context and objectives, and they lead to the concept of "Product View" applied to DMUs and to the concept of "Behavioural Mock-Up" (BMU). A product view defines the link between a product representation and the activity or process (performed by at least one stakeholder) that uses or generates this representation as input or output respectively. The BMU is the equivalent of the DMU for simulation data and processes. 
Beyond the geometry, which is represented in the DMU, the so-called BMU should logically link all the data and models required to simulate the physical behaviour and properties of a single component or an assembly of components. The key enabler for extending the concept of the established CAD-based DMU to the behavioural CAE-based BMU is a bi-directional interfacing concept between the BMU and its associated DMU. This is the aim of the Design-Analysis System Integration Framework (DASIF) proposed in this PhD. This framework may be implemented within PLM/SLM environments and interoperate with both CAD-DMU and CAE-BMU environments. DASIF combines the configuration data management capabilities of PDM systems with MBSE system modelling concepts and Simulation Data Management capabilities. This PhD has been carried out within a European research project, the CRESCENDO project, which aims at delivering the Behavioural Digital Aircraft (BDA). The BDA concept consists of a collaborative data exchange/sharing platform for design-simulation processes and models throughout the development life cycle of aeronautics products. Within this project, the Product Integration Scenario and related methodology have been defined to handle digital integration chains and to provide a test case scenario for testing DASIF concepts. The latter have been used to specify and develop a prototype of an "Integrator Dedicated Environment" implemented in commercial PLM/SLM applications. Finally, the DASIF conceptual data model has also served as input for contributing to the definition of the Behavioural Digital Aircraft Business Object Model: the standardized data model of the BDA platform, which enables interoperability between heterogeneous PLM/SLM applications and into which existing local design environments and new services to be developed can be plugged.
Piton, Thomas. "Une Méthodologie de Recommandations Produits Fondée sur l'Actionnabilité et l'Intérêt Économique des Clients - Application à la Gestion de la Relation Client du groupe VM Matériaux." Phd thesis, Université de Nantes, 2011. http://tel.archives-ouvertes.fr/tel-00643243.
Full textGuillot, Bernard. "Réalisation d'un outil autonome pour l'écriture et l'interrogation de systèmes de gestion de bases de données et de connaissance sur une machine multiprocesseur évolution du concept de base de données vers la manipulation d'objets image et graphique." Grenoble 2 : ANRT, 1986. http://catalogue.bnf.fr/ark:/12148/cb375993889.
Full textGuillot, Bernard. "Réalisation d'un outil autonome pour l'écriture et l'interrogation de systèmes de gestion de bases de données et de connaissances sur une machine multiprocesseur : évolution du concept de bases de données vers la manipulation d'objets image et graphique." Compiègne, 1986. http://www.theses.fr/1986COMPI219.
Full textLame, Guiraude. "Construction d'ontologie à partir de textes : une ontologie du droit dédiée à la recherche d'informations sur le Web." Paris, ENMP, 2002. http://www.theses.fr/2002ENMP1157.
Full textPugeault, Florence. "Extraction dans les textes de connaissances structurées : une méthode fondée sur la sémantique lexicale linguistique." Toulouse 3, 1995. http://www.theses.fr/1995TOU30164.
Full textSokhn, Maria. "Plateforme de recherche basée d'information multimédia guidée par une ontologie dans une architecture paire à paire." Phd thesis, Télécom ParisTech, 2011. http://pastel.archives-ouvertes.fr/pastel-00678039.
Full textSantorineos, Manthos. "Une approche sur la recherche et l'art à l'époque du numérique à travers l'étude des systèmes artificiels d'organisation de la mémoire humaine : nécessité et proposition d'un outil d'enregistrement et de synthèse des données susceptible de contribuer à la mise en place d'une nouvelle forme de doctorat ("doctorat_machine")." Phd thesis, Université Paris VIII Vincennes-Saint Denis, 2006. http://tel.archives-ouvertes.fr/tel-00180014.
Full textBuitrago, Hurtado Alex Fernando. "Aide à la prise de décision stratégique : détection de données pertinentes de sources numériques sur Internet." Thesis, Grenoble, 2014. http://www.theses.fr/2014GRENG002/document.
Full textOur research area is strategic decision making within organizations; more precisely, supporting strategic decisions by detecting information useful for them. On the one hand, 'information from the field', obtained through contacts between individuals, business meetings, etc., remains essential for managers. On the other hand, national and international newspapers can provide a considerable volume of raw data. Beyond these classical sources, information gathering has changed dramatically with the advent of information technology, and particularly the internet, which is central to our research. We chose to acquire 'information from the field' from national daily newspapers: Colombian newspapers, which are the focus of our empirical study. In order to detect, from internet-based sources, weak signals of potential issues that help managers discover and understand their environment, we carried out research of the 'Action Design Research' type: designing, building and testing an artifact to obtain the required information. The artifact was designed and built in two phases, drawing on theoretical concepts about data overload, environmental scanning (particularly the 'anticipatory and collective environmental scanning model', VAS-IC®) and the desirable characteristics of strategic decision-making support systems. After its construction, the artifact was applied in a real experiment that allowed us to evaluate its effectiveness and to improve our knowledge about the relevance of digital data in the decision-making process. All the decision makers involved were able to integrate these new practices into their information needs.
Vieira, Jordão Jorge Manuel. "Le knowledge-based view à l’épreuve des faits : l’interaction entre le knowledge et le knowing et son impact sur la gestion de la connaissance : cas d’expérimentation dans le secteur de services informatiques." Thesis, Paris, CNAM, 2010. http://www.theses.fr/2010CNAM0724/document.
Full textBased on his already long professional career, the author asked himself what knowledge management would mean for knowledge-intensive business services firms, where there has traditionally been a tendency to view knowledge as an object rather than as a process of knowing. Through three successive intervention-research experiments – managing a software house, leading a supplier of packaged software and running a shared services centre – the author has shown that, by privileging knowing over knowledge and ensuring a correct articulation with strategic and organizational processes, the KBV (Knowledge-Based View) can make sense in the knowledge-intensive business services sector. During the first experiment, it was shown that by appropriately articulating the strategy and operational processes of Eurociber, the KBV made sense through knowledge sharing, while during the second experiment, at I2S, this was achieved through co-production in close interaction with customers. Finally, at CA Serviços, the importance of knowledge creation as a tool for strategic management was recognized, assuming the fundamental need to generate knowledge about the interfaces required by the development of a new shared vision.
David, Isabelle. "Perception des professionnels de la santé par rapport à l'introduction d'une plateforme Web 2.0 dans leur pratique." Thèse, 2012. http://hdl.handle.net/1866/7117.
Full textIntroduction: Health professionals are increasingly encouraged to adopt evidence-based practice. However, gaps continue to be observed between scientific evidence and practice. Through its interactive capabilities, Web 2.0 can contribute to evidence-based practice by improving the exchange of relevant clinical and scientific information. Objective: This study is part of a project that aims to design a Web 2.0 platform for health professionals working with stroke patients. The aim is to gain a better understanding of professionals' perceptions of Web 2.0 before platform development. Methods: A qualitative study following a phenomenological approach was chosen. We conducted individual semi-structured interviews with 24 clinicians and managers. Results: All interviewees were women, with a mean age of 45 years (± 18). Knowledge transfer was identified as the most useful outcome of a Web 2.0 platform. Respondents also expressed their need for a user-friendly platform. The results also highlight a time paradox: clinicians feel that Web 2.0 will help them save time, while arguing that they will not have time to use it. Conclusion: While Web 2.0 remains a knowledge transfer tool not yet integrated into clinical practice, professionals working with stroke patients generally view its implementation positively.
Lessard, Chantale. "Le rôle de l’évaluation économique dans la pratique des médecins de famille = The role of economic evaluation in the practice of family physicians." Thèse, 2011. http://hdl.handle.net/1866/7044.
Full textHealth economic evaluations are analytic techniques to assess the relative costs and consequences of health services. Their role is to inform the decision-making process. A vast number of resource allocation decisions are undertaken at the clinical-encounter level, especially in primary care. Since every decision has an opportunity cost, ignoring economic information in family physicians' practices may have a broad impact on health care efficiency. There is little evidence on the influence of economic evaluation on clinical practice. The objective of the thesis is to understand the role of economic evaluation in family physicians' practices. Its contributions are presented in four original articles (philosophical, theoretical, methodological, and empirical). The philosophical article suggests that complexity and reflexivity are two important issues for economic evaluation. Complexity thinking is the philosophical perspective (overarching epistemological approach) underpinning the thesis. This way of thinking focuses attention on explanation and understanding and gives particular emphasis to relations and interactions (interactive causality). This increased emphasis on the context and process of data production highlights the importance of reflexivity in the research process. The theoretical article develops a new and different conceptualization of the research problem. The originality of the thesis also lies in the research problem being approached from the perspective of Pierre Bourdieu's sociological theory. Bourdieu's approach embraces complexity. Moving away from individualist, rational models of action, it can contribute to a more complete and complex understanding of social phenomena by revealing the structuring effects of social fields on the individual's dispositions and practices. The methodological article presents the protocol of a qualitative embedded multiple-case study. 
There were two embedded units of analysis: the family physicians (micro-individual level) and the field of family medicine (macro-structural level). Eight case studies were performed with the family physician as the unit of analysis. The sources of data collection for the micro level were eight life-history interviews with family physicians, documents, and observational evidence. The sources of data collection for the macro level were documents and eight open-ended focused interviews with key informants from nine medical organizations. The analytic induction approach to data analysis was used. The empirical article presents all the empirical findings of the thesis. The findings show an increasing integration of economics concepts into the official discourse of family medicine organizations. However, at the level of practice, the economization of this discourse does not seem to be a true depiction of reality, as the very great majority of the study participants do not embody this discourse. The contributions include a deep understanding of the social processes that influence family physicians' schemes of perception, thought, appreciation and action with respect to the role of economic evaluation in their practices, and of family physicians' willingness to contribute to efficient, fair and legitimate resource allocation.