Theses / dissertations on the topic "Modélisation des données (informatique) – Informatique"
Listed below are the top 50 theses / dissertations for studies on the subject "Modélisation des données (informatique) – Informatique".
Bisgambiglia, Paul-Antoine. "Traitement numérique et informatique de la modélisation spectrale". Corte, 1989. http://www.theses.fr/1989CORT3002.
Belghiti, Moulay Tayeb. "Modélisation et techniques d'optimisation en bio-informatique et fouille de données". Thesis, Rouen, INSA, 2008. http://www.theses.fr/2008ISAM0002.
This Ph.D. thesis addresses two types of problems: clustering and multiple sequence alignment. Our objective is to solve these global problems efficiently and to test the DC programming approach and DCA on real datasets. The thesis is divided into three parts. The first part is devoted to new approaches in nonconvex and global optimization; it presents an in-depth study of the framework used throughout the thesis, namely DC programming and the DC algorithm (DCA). In the second part, we model the clustering problem as three nonconvex subproblems: the first two differ in the choice of norm (clustering via the 1-norm and the 2-norm), while the third uses the kernel method (kernel clustering). The third part is devoted to bioinformatics and focuses on the modeling and resolution of two subproblems: multiple sequence alignment and RNA sequence alignment. All chapters except the first end with numerical experiments.
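As an aside, the distinction this abstract draws between clustering formulations based on the 1-norm and the 2-norm can be illustrated with a minimal sketch: the loop below swaps the distance and the centre update (median vs. mean) in a plain Lloyd-style iteration. The data points are invented, and this is not the DC programming / DCA formulation studied in the thesis — only an illustration of how the choice of norm changes the subproblem.

```python
# Toy clustering with a configurable norm: 1-norm -> medians, 2-norm -> means.
# Illustrative sketch only; not the DCA-based algorithms of the thesis.
import random

def dist(p, q, norm):
    if norm == 1:
        return sum(abs(a - b) for a, b in zip(p, q))      # Manhattan distance
    return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5  # Euclidean distance

def centre(points, norm):
    # The centre minimising the within-group cost: median for the 1-norm, mean for the 2-norm.
    cols = list(zip(*points))
    if norm == 1:
        return tuple(sorted(c)[len(c) // 2] for c in cols)
    return tuple(sum(c) / len(c) for c in cols)

def cluster(points, k, norm, iters=20, seed=0):
    random.seed(seed)
    centres = random.sample(points, k)
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:  # assign each point to its nearest centre under the chosen norm
            groups[min(range(k), key=lambda i: dist(p, centres[i], norm))].append(p)
        centres = [centre(g, norm) if g else centres[i] for i, g in enumerate(groups)]
    return centres, groups

if __name__ == "__main__":
    pts = [(0.0, 0.1), (0.2, 0.0), (0.1, 0.3), (5.0, 5.2), (5.1, 4.9), (4.8, 5.0)]
    for norm in (1, 2):
        cs, _ = cluster(pts, k=2, norm=norm)
        print(f"norm-{norm} centres: {cs}")
```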
Rhin, Christophe. "Modélisation et gestion de données géographiques multi-sources". Versailles-St Quentin en Yvelines, 1997. http://www.theses.fr/1997VERS0010.
Primet, Pascale. "Une approche de modélisation des logiciels de robots et une méthodologie de configuration". Lyon, INSA, 1988. http://www.theses.fr/1988ISAL0006.
Berthelon, Franck. "Modélisation et détection des émotions à partir de données expressives et contextuelles". Phd thesis, Université Nice Sophia Antipolis, 2013. http://tel.archives-ouvertes.fr/tel-00917416.
Villame, Thérèse. "Modélisation des activités de recherche d'information dans les bases de données et conception d'une aide informatique". Paris 13, 1994. http://www.theses.fr/1994PA131009.
Even though information retrieval systems are becoming more numerous and sophisticated, they remain difficult to use for non-specialists. The aim of this thesis, developed in an industrial context, is to contribute to the design of more user-friendly systems providing actual support. We consider that such support must not be limited to the improvement of existing tools, but must be integrated into the whole information retrieval process. From this point of view, the focus of analysis is the user's work practice, which must be described as well as explained. We therefore believe that work practice should be captured within the real work environment, so as to reflect the context in which it is performed; to understand the actual meaning the user gives to his actions, the analysis has to be centred on the user's point of view. We addressed the information retrieval activity with the "course of action" theoretical and methodological framework, which aims at taking these dimensions of work practice into account. We then proposed a two-level model of the construction of this activity. The global construction level shows that the information retrieval process is organized by users in a coherent way around practical concerns, and outlines the essential characteristics of the activity. The local construction level outlines the explanatory elements, especially the know-how and knowledge exercised during the activity. Both analysis levels are complementary and provide a set of organized guidelines for the design of a system devoted to database information retrieval.
Rudloff, David. "Modélisation conceptuelle et optimisation des requêtes dans une interface en langue naturelle pour des bases de données". Université Louis Pasteur (Strasbourg) (1971-2008), 2000. http://www.theses.fr/2000STR13244.
Aouadhi, Mohamed Amine. "Introduction de raisonnement probabiliste dans la méthode B événementiel". Thesis, Nantes, 2017. http://www.theses.fr/2017NANT4118.
To the best of our knowledge, proof-based modeling and verification methods such as Event-B are not capable of handling the complete spectrum of quantitative aspects of real-life systems. In particular, modeling probabilistic aspects of systems within Event-B is a problem which has not been well studied in the state of the art; the main difficulties lie in the expression of probabilities and in the verification of probabilistic aspects within Event-B. This thesis presents a probabilistic extension of Event-B to support the modeling and verification of probabilistic aspects of systems, which we call probabilistic Event-B. We propose to replace all non-deterministic choices by probabilistic choices, which allows the description of fully probabilistic systems. Since the development process in Event-B is based on refinement, we propose refinement-based development approaches which permit the progressive integration of probabilities into Event-B models; in particular, we study the almost-certain convergence of a set of events within this extension. The Event-B method is supported by the Rodin platform, which we extend to take the new elements of our extension into account. The different aspects of this work are illustrated by several case studies: a peer-to-peer communication protocol, a landing gear system and an emergency brake system.
Piolle, Guillaume. "Agents utilisateurs pour la protection des données personnelles : modélisation logique et outils informatiques". Phd thesis, Grenoble 1, 2009. https://theses.hal.science/tel-00401295.
Usage in the domain of multi-agent systems has evolved so as to integrate human users more closely into applications. The manipulation of private information by autonomous agents has therefore called for an adapted protection of personal data. This work first examines the legal context of privacy protection and the various computing methods aiming at personal data protection. Surveys show a significant need for AI-based solutions, allowing both reasoning on the regulations themselves and automatically adapting an agent's behaviour to these regulations. The Privacy-Aware (PAw) agent model and the Deontic Logic for Privacy (DLP), designed to deal with regulations coming from multiple authorities, are proposed in this perspective. The agent's normative reasoning component analyses its heterogeneous context and provides a consistent policy for dealing with personal information; the PAw agent then automatically controls its own usage of the data with regard to the resulting policy. In order to enforce policies remotely, we study the different distributed application architectures oriented towards privacy protection, several of them based on the principles of Trusted Computing, and we propose a complementary one illustrating a different possible usage of this technology. The implementation of the PAw agent demonstrates its principles over three scenarios, showing the adaptability of the agent to its normative context and the influence of the regulations on the behaviour of the application.
Girard, Philippe. "Etude de la conduite de la conception des produits manufacturés : contribution à l'ingénierie des systèmes de conception". Bordeaux 1, 1999. http://www.theses.fr/1999BOR10525.
Biennier, Frédérique. "Modélisation d'une base d'hyperdocuments et méthode connexionniste d'aide a la navigation". Lyon, INSA, 1990. http://www.theses.fr/1990ISAL0104.
Hyperdocument bases must be at least as easy to use as paper documents. One of the readers' major problems is to select, from a myriad of browsing possibilities along the defined links, a path adapted to their own goals in order to reach the information they need. First, we propose a storage model for the hyperdocument base: by splitting the structure into three levels and making heavy use of persistent trees in each level, redundancy is avoided and several kinds of versions can be stored. Then, the documentary base is coupled to an associative epigenetic neural network. By running this network according to particular activation rules, a path adapted to the users' stated needs is dynamically built; in this way, the system proposes the answers, and their organization, which seem to best fit the users' needs. Through a few simple parameters, users fully control the system and can adjust the answers to their particular needs by successive refinements.
Delprat, J. Christophe. "Application des concepts et des outils d'HBDS à l'étude de Merise". Paris 6, 1991. http://www.theses.fr/1991PA066461.
Texto completo da fonteGoullioud, Renaud. "Modélisation des structures d'échanges de données dans un environnement parallèle : approche par simulation à événements discrets". Lyon, INSA, 1996. http://www.theses.fr/1996ISAL0093.
The capabilities of the thermal probe for determining thermal conductivity have been studied in phase-change materials. A numerical model in 1D cylindrical coordinates has been developed in order to predict the thermal behaviour of the probe-material system. It takes into account the large variation of specific heat, and an enthalpy formulation suppresses the need to calculate the position of the solid-liquid interface. The feasibility and limits of the method have been established from simulations, which require the evaluation of sensitivity coefficients for all parameters. Calculated and noised thermograms made it possible to develop a modified Gauss minimisation method which removes the instability and divergence caused by the linear dependence of the sensitivity coefficients. An experimental setup has been built and the conductivity of a water-agar gel has been determined.
Bonnail, Nicolas. "Analyse des données, modélisation et commande pour la microscopie en champ proche utilisant des actionneurs piézoélectriques". Aix-Marseille 2, 2001. http://www.theses.fr/2001AIX22059.
Texto completo da fonteLongueville, Véronique. "Modélisation, calcul et évaluation de liens pour la navigation dans les grands ensembles d'images fixes". Toulouse 3, 1993. http://www.theses.fr/1993TOU30149.
Texto completo da fonteBuard, Pierre-Yves. "Modélisation des sources anciennes et édition numérique". Caen, 2015. https://hal.archives-ouvertes.fr/tel-01279385.
Ancient sources exhibit a very complex text organization that calls for dedicated modelling patterns. Starting from a study of practices in the fields of preservation, analysis and edition of ancient texts, this thesis proposes a general working pattern to organize the circulation of information about objects and about the texts carried by these objects. Attention to the cultural dimension of the skills involved in handling ancient sources, facing both digital convergence and global networking, leads us to rethink the document within the broader fields of flows and fragments. We propose to organize all the information used in text flows marked up by scholars. Based on numerous experiments carried out in institutional publishing, we propose a general organisational pattern for managing highly annotated textual resources in order to easily build editorial or research applications.
Wakim, Bernadette. "La Conception des bases de données orientées objet : Propositions pour la construction d'un AGL". Lyon, INSA, 1991. http://www.theses.fr/1991ISAL0028.
The recent advent of object-oriented DBMSs requires an evolution of classical information system design. The growing complexity of information systems is accompanied by the development of more sophisticated support tools and by recourse to design methodologies. Traditional design methods are insufficient for the object-oriented approach: for example, methods based on the Entity-Association model are not well suited to the design of applications developed on object-oriented DBMSs, and new means must be explored to benefit as much as possible from such DBMSs. We propose some concepts for an object-oriented methodology. The proposed method, following an object-oriented approach, provides a static and dynamic representation of the applications. Our approach considers different aspects of the same object, depending on the viewpoint of each user, and then integrates all these views into a global conceptual schema. View integration, tackled in some classical conceptual methods, raises new problems and highlights the complexity of the phenomena involved, for example inheritance conflicts, data semantics, synonymy and polysemy. The target DBMS which guides us is O2. We have developed a CASE tool.
Carrère, Cécile. "Prise en compte de l'hétérogénéité tumorale dans l'optimisation d'une chimiothérapie : contrôle optimal, analyse théorique et numérique". Thesis, Aix-Marseille, 2017. http://www.theses.fr/2017AIXM0305.
To prevent the emergence of drug resistance during cancer chemotherapy, most medical protocols use the maximal tolerated dose (MTD) of drug. In a series of in vitro experiments, M. Carré showed that such protocols fail if resistant cells are present in the initial tumour, whereas smaller doses of treatment maintain a small, stable tumour that remains sensitive to the drug. To model and optimize such results, G. Chapuisat designed an ODE model of this experiment. We first study it in the framework of optimal control theory, in order to design protocols minimizing the tumour size and the resistant-cell charge. Then, we study a competition-diffusion PDE model of two species to understand the influence of motility on the emergence of resistance. Finally, with H. Zidani, we developed other treatment-optimization techniques based on dynamic programming.
Fernandez, Conception. "Modélisation des systèmes d'exploitation par HBDS". Paris 6, 1988. http://www.theses.fr/1988PA066235.
Texto completo da fonteAbouchi, Nacer. "Analyse et mesure de performance des reseaux de communication par simulation". Lyon, INSA, 1990. http://www.theses.fr/1990ISAL0057.
The behaviour of communication networks, whether local area or wide area, can only be partially modelled using existing mathematical methods. The analysis of wide area network performance and behaviour as a function of adaptive routing techniques is still poorly mastered; in the same way, for local area networks, it may be useful to study their access methods quantitatively. Discrete event simulation is a solution which can take all the specifications of a network into consideration without any simplification. In the first part, after recalling the roles of system modelling and simulation, particularly for communication systems, we introduce the principal criteria that should be studied in order to choose the modelling and simulation tools correctly; a comparative study of the most widely used tools is also included. The goal of the second part is to present the simulation models that we designed to represent communication networks (local or wide area). The third part is dedicated to OSIRES, the network simulation tool we developed. Our study is guided by the analysis of different deterministic or adaptive routing algorithms, either found in existing networks or proposed in the literature. In the last part, the local area network access techniques proposed by ISO are analysed and compared. Finally, we conclude this thesis with perspectives and possible extensions of this work.
Bouchikhi-Ouzzane, Samira. "La modélisation du comportement de l'interprète humain face à des données de télédétection des régions urbaines". Paris, EHESS, 1999. http://www.theses.fr/1999EHES0054.
Texto completo da fonteSall, Ousmane. "Contribution à la modélisation de données multi-sources de type DATAWEB basé sur XML". Littoral, 2010. http://www.theses.fr/2010DUNK0284.
Environmental data in the Senegal River Valley have been collected for many years through the activities of the various experts and organisations involved. These spatio-temporal data display specific semantic and structural features depending on their owners. Various systems have been used for collecting and storing the data, conferring on them a structural dimension of heterogeneity, to which is added a semantic dimension related to their description, each organisation or expert controlling its own vocabulary. In this context, we perform an integration in three phases. First, a structural integration phase, based on the use of XML document warehouses (called datawebs), allows us to create a warehouse for each agency involved in the project. A second step integrates these XML document warehouses by associating a knowledge base with each warehouse, thus constituting semantic datawebs; this is done by automatically building an OWL ontology from each XML dataweb and by reusing the Agricultural Ontology Service. A third, mediation phase makes it possible to query the different semantic datawebs in a uniform manner via a web application.
Tchienehom, Pascaline Laure. "Modélisation et exploitation de profils : accès sémantique à des ressources". Toulouse 1, 2006. http://www.theses.fr/2006TOU10026.
Resource access is a broader view of information access in which resources can be persons, things or actions of any kind. The heterogeneity of resources has led to the development of several access methods. These methods rely on a description of resources that we call a profile, and on the definition of usage rules for those profiles in order to achieve a specific task (retrieval, filtering, etc.). Profiles and their usage rules differ from one application to another. For applications to cooperate, there is a real need for a flexible and homogeneous framework for the modelling and use of profiles. Our research work aims at providing solutions for these two aspects, thanks to a generic profile model and to methods for the semantic analysis and matching of instances of this model. In order to validate our proposals, an assistant tool for profile construction, visualization and semantic analysis has been implemented. Furthermore, an evaluation of the methods for profile semantic analysis and matching has been carried out.
Shariat, Ghodous Parisa. "Modélisation intégrée de données de produit et de processus de conception". Lyon 1, 1996. http://www.theses.fr/1996LYO10208.
Politaki, Dimitra. "Vers la modélisation de clusters de centres de données vertes". Thesis, Université Côte d'Azur (ComUE), 2019. http://www.theses.fr/2019AZUR4116.
The energy consumption of data center clusters is rapidly increasing, making them the fastest-growing consumers of electricity worldwide. Renewable electricity sources, and especially solar energy as a clean and abundant resource, can be used in many locations to cover their electricity needs and make them "green", namely fed by photovoltaics. This potential can be explored by predicting solar irradiance and assessing the capacity provision for data center clusters. In this thesis we develop stochastic models for solar energy: one at the surface of the Earth and a second one which models the photovoltaic output current. We then compare them to the state-of-the-art on-off model and validate them against real data. We conclude that the solar irradiance model better captures the multiscale correlations and is suitable for small-scale cases. We then propose a new job life-cycle for a complex real cluster system and a model for data center clusters that supports batch job submissions and considers both impatient and persistent customer behavior. To understand the essential characteristics of computer clusters, we analyze in detail two traces of different workload types: the first is the published, complex Google trace, and the second, simpler one, which serves scientific purposes, comes from the Nef cluster located at the Inria Sophia Antipolis research center. We then implement marmoteCore-Q, a tool for the simulation of a family of queueing models based on our multi-server model for data center clusters with abandonments and resubmissions.
Chauveau, Estelle. "Optimisation des routes maritimes : un système de résolution multicritère et dépendant du temps". Electronic Thesis or Diss., Aix-Marseille, 2018. http://www.theses.fr/2018AIXM0139.
Maritime charter companies try to use weather forecasts in order to optimize the journeys of their fleet. Consider a ship transporting goods from one port to another: given the date and time of departure, and trying to minimize fuel consumption, determining the best route to take is a difficult problem in the sense of complexity theory. Moreover, the best route is likely to change during the journey, leading to an even more difficult problem. To tackle this type of issue, many routing software packages are available; however, to our knowledge, the state of the art still lacks algorithms capable of efficiently solving the problem while considering multiple and sometimes contradictory criteria. The aim of this PhD thesis is to build a relevant modelling framework for this problem and to develop algorithms to be used and validated in industrial conditions. The first task undertaken was the development of a methodology to turn raw data, mainly spatial and weather data, into usable input for the mathematical model. This first step was essential, as it conditioned which algorithms could be used and, consequently, their efficiency; we chose to model the problem as a graph that takes time into account. The second task was the development of a multi-objective, time-dependent algorithm that identifies Pareto-optimal paths within the graph. A third part of the work focused on post-processing the paths in order to optimize the speed over the whole journey and, as a consequence, fuel consumption.
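To illustrate the kind of search this abstract describes — Pareto-optimal paths in a graph whose edge costs depend on the departure time — here is a minimal, self-contained sketch. The toy graph, the two criteria (arrival time and fuel) and the cost functions are invented for illustration; this is not the thesis's actual model, data or algorithm.

```python
# Illustrative sketch: label-correcting search with Pareto dominance pruning
# on a small time-dependent graph. Criteria: (arrival time, fuel consumed).
import heapq

# Time-dependent edges: (u, v) -> function(departure_time) -> (duration, fuel)
EDGES = {
    ("A", "B"): lambda t: (1.0 + (2.0 if 6 <= t % 24 < 12 else 0.0), 2.0),
    ("A", "C"): lambda t: (4.0, 1.0),
    ("B", "D"): lambda t: (2.0, 2.0),
    ("C", "D"): lambda t: (3.0, 1.0),
}

def successors(node):
    return [(v, f) for (u, v), f in EDGES.items() if u == node]

def dominates(a, b):
    """Label a dominates b if it is no worse on every criterion and better on one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_routes(source, target, t0=0.0):
    """Return mutually non-dominated (arrival_time, fuel, path) labels at the target."""
    frontier = [(t0, 0.0, (source,))]       # labels waiting to be expanded
    best = {source: [(t0, 0.0)]}            # non-dominated (time, fuel) labels per node
    arrivals = []
    while frontier:
        time, fuel, path = heapq.heappop(frontier)
        node = path[-1]
        if node == target:
            arrivals.append((time, fuel, path))
            continue
        for nxt, cost_fn in successors(node):
            duration, edge_fuel = cost_fn(time)       # costs depend on departure time
            label = (time + duration, fuel + edge_fuel)
            known = best.setdefault(nxt, [])
            if any(dominates(k, label) or k == label for k in known):
                continue                               # pruned: dominated or duplicate
            known[:] = [k for k in known if not dominates(label, k)] + [label]
            heapq.heappush(frontier, (label[0], label[1], path + (nxt,)))
    return [a for a in arrivals
            if not any(dominates(o[:2], a[:2]) for o in arrivals)]

if __name__ == "__main__":
    for time, fuel, path in pareto_routes("A", "D", t0=7.0):
        print(f"arrival={time:.1f}h  fuel={fuel:.1f}t  route={'->'.join(path)}")
```

On this toy instance the search returns two incomparable routes: one faster but more fuel-hungry, one slower but cheaper, which is exactly the trade-off a multi-criteria router must expose.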
Allard, Pierre. "Modélisation logique de l'analyse multidimensionnelle des relations multivaluées : application à l'exploration de données géographiques". Rennes 1, 2011. http://www.theses.fr/2011REN1S117.
Since the beginning of data processing, companies have realized the importance of information management solutions: the gathered data are a powerful asset for studying trends and making choices for the future. Business Intelligence (information synthesis to assist decision-making) appeared in the mid-90s with OLAP (On-Line Analytical Processing, a set of tools for the exploration, analysis and display of multidimensional data) and S-OLAP (Spatial OLAP, OLAP with spatial support). An OLAP user who is not a computer science specialist does not need to know a query language to handle multidimensional data, create graphics, etc. However, we consider that the OLAP data model is too rigid, because of its a priori multidimensional structure and because each cell must hold a single aggregate value. This observation is the starting point of this thesis. We propose a new information system paradigm, able to analyze and explore multidimensional and multivalued data. To model this paradigm, we use Logical Information Systems (LIS), which share features with OLAP, especially regarding data mining. Our paradigm is defined by a flexible data model, easy navigation and a modular representation. We conclude this thesis with the application of this paradigm to several topics, including the exploration of geographic data.
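The contrast the abstract draws — one aggregate value per OLAP cell versus multivalued relations — can be made concrete with a small sketch. The toy records, dimensions and the multivalued attribute below are invented assumptions; this is not the LIS-based model proposed in the thesis, only an illustration of why a multivalued attribute does not reduce to a single aggregate per cell.

```python
# Illustrative sketch: a classic single-aggregate cube vs. a "multivalued" cube
# whose cells keep the full set of values of a multivalued attribute.
from collections import defaultdict

RECORDS = [
    {"region": "Bretagne", "year": 2009, "landcover": {"forest", "meadow"}, "area": 12.0},
    {"region": "Bretagne", "year": 2010, "landcover": {"meadow"},           "area": 8.5},
    {"region": "Alsace",   "year": 2009, "landcover": {"forest", "urban"},  "area": 5.0},
    {"region": "Alsace",   "year": 2010, "landcover": {"urban"},            "area": 7.2},
]

def olap_cube(records, dims, measure):
    """Classic cube: each cell holds a single aggregate (here, a sum)."""
    cube = defaultdict(float)
    for r in records:
        cube[tuple(r[d] for d in dims)] += r[measure]
    return dict(cube)

def multivalued_cube(records, dims, attribute):
    """Multivalued cube: each cell holds the set of values of a multivalued attribute."""
    cube = defaultdict(set)
    for r in records:
        cube[tuple(r[d] for d in dims)] |= r[attribute]
    return dict(cube)

if __name__ == "__main__":
    print(olap_cube(RECORDS, ("region",), "area"))
    # e.g. {('Bretagne',): 20.5, ('Alsace',): 12.2}
    print(multivalued_cube(RECORDS, ("region", "year"), "landcover"))
    # e.g. {('Bretagne', 2009): {'forest', 'meadow'}, ...}
```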
Aldanondo, Michel. "Modélisation des données pour la planification et l'ordonnancement de la production : mécanismes d'agrégation et de désagrégation". Toulouse, INSA, 1992. http://www.theses.fr/1992ISAT0012.
Texto completo da fonteLecoq, Jean-Christophe. "Modélisation logique de liens entre attributs hétérogènes, fondée sur une technique de fermeture vectorielle généralisée dans un environnement multimédia à couplage". Rouen, INSA, 2004. http://www.theses.fr/2004ISAM0008.
Texto completo da fonteZaki, Chamseddine. "Modélisation spatio-temporelle multi-échelle des données dans un SIG urbain". Ecole Centrale de Nantes, 2011. http://www.theses.fr/2011ECDN0024.
Bame, Ndiouma. "Gestion de données complexes pour la modélisation de niche écologique". Electronic Thesis or Diss., Paris 6, 2015. http://www.theses.fr/2015PA066125.
This thesis concerns large-scale biodiversity data management. Its objective is to optimize queries for researchers who have free access to worldwide biodiversity data. These data, shared by research laboratories around the world, are federated in the GBIF data warehouse, which makes them accessible to researchers, policy makers and the general public. With a significant amount of data, rapid data growth and users who express new needs, the GBIF portal faces a double problem of query expressiveness and efficiency. We therefore propose a decentralized solution for querying biodiversity data. Our solution combines the resources of several remote, resource-limited machines to provide the computing and storage power needed to ensure system responsiveness for users, and it offers a higher-level, more expressive query interface. We then propose an on-demand dynamic data distribution approach which, based on data properties and the characteristics of users' analysis queries, dynamically adapts machine capacities to user demands. Finally, we propose a query optimization approach that dynamically adapts data placement and machine loads according to observed performance, in order to process user queries within deadlines. We experimentally validated our solution with real GBIF data comprising 100 million observation records.
Thibault, Serge. "Modélisation morpho-fonctionnelle des réseaux d'assainissement urbain a l'aide du concept de dimension fractale". Lyon, INSA, 1987. https://theses.hal.science/tel-00277119.
Texto completo da fonteRoyan, Jérôme. "Visualisation interactive de scènes urbaines vastes et complexes à travers un réseau". Rennes 1, 2005. http://www.theses.fr/2005REN1S013.
Texto completo da fonteGuehis, Sonia. "Modélisation, production et optimisation des programmes SQL". Paris 9, 2009. https://bu.dauphine.psl.eu/fileviewer/index.php?doc=2009PA090076.
Texto completo da fonteCartier, Emmanuel. "Repérage automatique des expressions définitoires : modélisation de l'information définitoire, méthode d'exploration contextuelle, méthodologie de développement des ressources linguistiques, description des expressions du français contemporain, implémentation informatique". Paris 4, 2005. http://www.theses.fr/2004PA040228.
This work deals with the automatic extraction of definitory statements. It has three main goals: a formal description of French definitory statements and its implementation in the e-doc Labs software e-doc Finder; a contribution to a software specification for text mining; and a contribution to the methodology and automation of linguistic resource development. We describe a conceptual model of definitory statements, composed of a term, a definitory semantic relation, a definition, a domain field and a temporal assignment. The definitory semantic relations are identification, categorisation, specification and attribution. We describe the linguistic patterns for each of these elements as well as textual integration phenomena (syntactic transformations, negation, coordination, anaphora). Second, we describe a model for text mining, inspired by the Contextual Exploration Method, that has three main properties: externality, adaptability of linguistic resources, and high expressive power of the grammar rules. Last, we give methodological elements for setting up linguistic resources in such a system and outline the steps toward automatic learning of semantic classes and patterns.
Cao, Van Toan. "La mise en registre automatique des surfaces acquises à partir d'objets déformables". Doctoral thesis, Université Laval, 2016. http://hdl.handle.net/20.500.11794/26764.
Three-dimensional registration (sometimes referred to as alignment or matching) is the process of transforming several 3D data sets into the same coordinate system so as to align their overlapping components. Two data sets aligned together can be two partial scans from two different views of the same object; they can also be two complete models of an object generated at different times, or even models of two distinct objects. Depending on the data, registration methods are classified as rigid or non-rigid. In the rigid case, the data are usually acquired from rigid objects, and registration can be accomplished by finding a single global rigid transformation (rotation, translation) that aligns the source data set with the target data set. In the non-rigid case, in which data are acquired from deformable objects, the registration process is more challenging since both the global transformation and the local deformations must be solved for. In this thesis, three methods are proposed to solve the non-rigid registration problem between two data sets (represented as triangle meshes) acquired from deformable objects. The first method registers two partially overlapping surfaces; it overcomes some limitations of previous methods in handling large global deformations between two surfaces, but is restricted to small local deformations so that the descriptor used remains valid. The second method builds on the framework of the first and is applied to data in which the deformation between the two surfaces combines a large global deformation with small local deformations. The third method, which exploits both the first and the second, is proposed to handle more challenging data sets; although the quality of alignment achieved is not as good as with the second method, its computation time is approximately four times shorter since the number of optimized parameters is reduced by half. The efficiency of the three methods results from strategies that determine correspondences correctly and exploit the deformation model adequately. The proposed methods are implemented and compared with other methods on various types of data to evaluate their robustness in handling the non-rigid registration problem; they are promising solutions for applications such as non-rigid registration of multiple views, 3D dynamic reconstruction, 3D animation or 3D model retrieval.
Kellal, Abderrazak. "Contribution à l’étude des asservissements électropneumatiques". Lyon, INSA, 1987. http://www.theses.fr/1987ISAL0025.
Texto completo da fonteYatim, Houssein. "Etude théorique et expérimentale du procédé de distillation extractive discontinue". Lyon, INSA, 1993. http://www.theses.fr/1993ISAL0102.
The process of batch extractive distillation may combine the advantages of both batch and extractive distillation, yet it has not been applied in practice so far, probably because of its complexity. An algorithm and a computer program were developed to simulate experiments of a batch extractive distillation process (separation of acetone and methanol on a pilot-plant column containing 32 bubble-cap trays, using water as the solvent). For the integration of the set of non-linear differential equations (component material balances) the Runge-Kutta method was used, and the initial column profile was computed at total reflux. To save computation time, a two-dimensional linear interpolation method was applied for the ternary mixture when calculating vapour-liquid equilibria (VLE). The experimental and calculated results are compared, and the influence of the main operating parameters has been examined to evaluate the recovery achieved by the process.
Richard, Claude. "Etude expérimentale et théorique de composites piézoélectriques de connectivité 1. 3. 1. Pour hydrophone". Lyon, INSA, 1992. http://www.theses.fr/1992ISAL0039.
1.3.1 piezoelectric-polymer composites have been made on the basis of non-mechanical contact between the polymer matrix and the piezoelectric rods in the transverse directions. The piezoelectric rods (PZT) are placed in holes drilled in the matrix and are held between two metallic plates. The plates, which also serve as electrodes, transfer the mechanical stress from the resilient matrix toward the stiff rods and reinforce the structure against transverse loads. The hydrostatic figures of merit lie between 4 and 12 times that of lead titanate. The influence of the PZT volume fraction and of the plate thickness on the sensitivity and on the pressure stability is shown. Experimental and modelling results are given and discussed. The strong influence of the uniaxial stress dependence of the PZT longitudinal figure of merit on the stability of the composite is analysed. Measurements of the longitudinal figure of merit of various PZT compositions under high uniaxial stress are made and form the basis of an optimization of the piezo-composite. This optimization shows that hard PZT is best suited for this deep underwater hydrophone application.
Fénié, Patrick. "Graico : méthode de modélisation et de conception de systèmes d'exploitation de systèmes de production". Bordeaux 1, 1994. http://www.theses.fr/1994BOR10622.
Texto completo da fonteAbbar, Sofiane. "Modèle d'accès personnalisé pour les plateformes de livraison de contenu : une approche basée sur les services". Versailles-St Quentin en Yvelines, 2010. http://www.theses.fr/2010VERS0053.
Access to relevant information adapted to the user's preferences and context is a challenge in many applications. In this thesis, we address personalization in the context of content delivery platforms and propose a personalized access model (PAM) based on multidimensional models of the user's profile and context. The PAM provides a set of services that enable applications to take both the user's profile and context into account within personalization processes, hence delivering more accurate content. PAM services include, but are not limited to, an automatic approach for discovering contexts and contextual preferences, the projection of the user's profile onto his current context, and the matching of profiles and contents to provide user recommendations. We also show that PAM services allow a smooth integration of context within personalized applications without changing their inner processes. We instantiated the PAM to define context-aware recommender systems, which were used to evaluate our approach.
Behlouli, Hassan. "Apprentissages auto-améliorants et modélisation de la dynamique temporelle de données évolutives par réseaux de neurones : application au diagnostic et la prédiction en électrocardiologie quantitative". Lyon, INSA, 1998. http://www.theses.fr/1998ISAL0034.
We present various methodologies to improve decision making on follow-up patient data, and their validation in the field of quantitative electrocardiology. First, we propose an extension of the classical supervised pattern recognition learning model by introducing a self-improving concept based on mining information from undocumented datasets. We then apply this concept to the particular case of neural-network-based supervised learning and propose a self-improving learning methodology that iteratively integrates, into the initial learning set, undocumented data extracted from databases not validated by experts. This method involves different concepts such as neural network combination, rejection of ambiguous cases and control of the learning process by cross-validation. Using this approach for the categorisation of cardiac diseases, we could significantly improve the performance of the original classifiers. Secondly, we developed a neural-network-based methodology to model the dynamic behavior of the heart, particularly for predicting one of the main descriptors of ventricular repolarisation, i.e. the QT interval, as a function of the RR interval, which is the inverse of heart rate. An initial evaluation on a series of sequences of 30 electrocardiograms (3D ECG) continuously recorded over 24 hours demonstrated the pertinence of the models and allowed us to study the influence of some parameters (e.g. memory effect and noise level) on the prediction quality of this model. We conclude by presenting another outcome of our work, a series of generic analysis and processing tools that were integrated into the MATIS environment (Mathematical Tools Integration Software), a fundamental building block for the future workstation of the research cardiologist.
Letellier, Guillaume. "Modélisation du complexe récepteur muscarinique-toxine MT7 à partir de données thermodynamiques". Paris 7, 2008. https://tel.archives-ouvertes.fr/tel-00447060.
Muscarinic acetylcholine receptors are transmembrane proteins involved in various biological processes. The muscarinic toxin MT7 is a powerful modulator of these receptors; furthermore, this toxin is the only known ligand specific to subtype 1 of the muscarinic receptors. We have studied the molecular basis of the interaction between the MT7 toxin and the hM1 receptor with molecular modeling tools. First, a sampling of both partners' structures by molecular dynamics was performed, and large-scale motions of the e2 loop of the receptor were predicted by activated molecular dynamics. The toxin structure was then docked onto the receptor by molecular dynamics under ambiguous restraints derived from mutagenesis experiments, and this model was optimized by free molecular dynamics in an explicit membrane environment. Finally, binding free energies were back-calculated to validate the model. We predict that the toxin binds a dimer of the hM1 receptor. The core of the interaction is localized on a first monomer, in contact with loops II and III of the toxin; the toxin also establishes hydrophobic interactions between its loop I and the second monomer. The analysis of this model provides a structural basis for understanding the high affinity of this toxin and its selectivity for subtype 1 of the muscarinic receptors, the selectivity appearing to be mainly determined by the extracellular loop e2 of the receptor.
Zoghlami, Asma. "Modélisation et conception de systèmes d’information géographique gérant l’imprécision". Paris 8, 2013. http://octaviana.fr/document/170325245#?c=0&m=0&s=0&cv=0.
Our work focuses on the management of imprecise spatiotemporal knowledge in the construction of geographic information systems, and more particularly on its conceptualization, representation and structuring using fuzzy set theory. As information system design is usually done with the Unified Modeling Language (UML), we favored approaches that extend it. Since PERCEPTORY, with its PictograF language, extends UML for modeling GIS, and Fuzzy UML enriches UML for the management of imprecision, we proposed an approach called F-PERCEPTORY that exploits their respective advantages. The second part of our work focuses on the implementation (structure, constraints, rules) of GIS modeled with our approach; for this, we chose a data representation by connected, normalized fuzzy sets stored via their α-cuts. Finally, the last part of our work proposes a methodology for studying urban trajectories from the past to the future, based on the stored information, on descriptive and logical modeling of spatial dynamics taking imprecision into account, and on rule-mining processes. These contributions were introduced with the aim of managing archaeological data from Reims and studying the spatial dynamics of the city of Saint-Denis.
Blin-Lacroix, Jean-Luc. "Analyse et modélisation des fractures et des systèmes fracturaux en milieu rocheux : contribution à l'élaboration d'une chaîne de logiciels intégrant l'acquisition des données, le traitement analytique et statistique, la simulation". Vandoeuvre-les-Nancy, INPL, 1988. http://www.theses.fr/1988NAN10342.
Texto completo da fonteVerdie, Yannick. "Modélisation de scènes urbaines à partir de données aeriennes". Phd thesis, Université Nice Sophia Antipolis, 2013. http://tel.archives-ouvertes.fr/tel-00881242.
Texto completo da fonteBernardi, Fabrice. "Conception de bibliothèques hiérarchisées de modèles réutilisables selon une approche orientée objet". Corte, 2002. http://www.theses.fr/2002CORT3068.
Texto completo da fonteBaietto, Marie-Christine. "Le contact unilatéral avec frottement le long de fissures de fatigue dans les liaisons mécaniques". Lyon, INSA, 1989. http://www.theses.fr/1989ISAL0088.