Dissertations on the topic "Méthodes guidées par des données"
Sources formatted in APA, MLA, Chicago, Harvard, and other citation styles
Consult the top 50 dissertations for your research on the topic "Méthodes guidées par des données".
Browse dissertations across a wide range of disciplines and compile your bibliography correctly.
Espinoza, Molina Daniela. "Méthodes avancées pour l'extraction d'informations a partir des images à haute résolution SAR: Méthodes d'évaluation guidées par les modèles utilisateur et par la structure des données." Phd thesis, Paris, Télécom ParisTech, 2011. https://pastel.archives-ouvertes.fr/pastel-00676833.
We are concerned in this thesis with the problem of Image Information Mining for the exploitation and understanding of high resolution (HR) Synthetic Aperture Radar (SAR) data. Advances in this field of research contribute to the elaboration of tools for interactive exploration and extraction of image content. In this context, analyzing and evaluating adequate image models and image information extraction methods according to user conjectures constitute challenging issues. Our work contributes solutions to HR SAR modeling and content estimation with a data-driven evaluation approach, and to the design of image mining scenarios involving the user and his conjectures, achieved through a user-driven evaluation approach. We perform a data-driven evaluation and validation of automatic information extraction methods for high resolution SAR scenes based on Gibbs Random Field (GRF) models. Specifically, Gauss-Markov Random Field (GMRF) and Auto-binomial (ABM) models are implemented in information extraction methods following the two levels of Bayesian inference: model fitting and model selection. We perform detection tests on classes such as cities, vegetation, and water bodies, using specific metrics to quantify the quality of speckle removal. The accuracy of modelling and characterization of the image content is determined using both supervised and unsupervised classifications, and confusion matrices. We conclude that both methods enhance the image during the despeckling process: the GMRF model is more suitable for natural scenes and the ABM model for man-made structures. We design and generate two study cases: oil spill and flood detection.
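The model-fitting level of inference mentioned in this abstract can be illustrated with a least-squares fit of first-order Gauss-Markov random field interaction parameters. This is a generic textbook sketch, not the thesis's implementation, and the synthetic texture below stands in for real SAR imagery:

```python
import numpy as np

def fit_gmrf(img):
    """Least-squares fit of first-order GMRF interaction parameters.

    Model: x[s] ~ th_h * (left + right neighbors) + th_v * (up + down neighbors).
    """
    c = img[1:-1, 1:-1]                          # interior pixels
    h = img[1:-1, :-2] + img[1:-1, 2:]           # horizontal neighbor sums
    v = img[:-2, 1:-1] + img[2:, 1:-1]           # vertical neighbor sums
    F = np.column_stack([h.ravel(), v.ravel()])
    theta, *_ = np.linalg.lstsq(F, c.ravel(), rcond=None)
    return theta

# synthetic texture with purely horizontal correlation
rng = np.random.default_rng(0)
e = rng.standard_normal((64, 64))
img = e + 0.4 * (np.roll(e, 1, axis=1) + np.roll(e, -1, axis=1))
th = fit_gmrf(img)
```

On this texture the fitted horizontal parameter dominates the vertical one; it is this kind of contrast between estimated parameters that texture models exploit to separate scene classes such as cities and vegetation.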
Jose, Sagar. "Stratégies d'apprentissage multimodal pour le diagnostic et le pronostic de la santé des machines industrielles dans un contexte de manque de données." Electronic Thesis or Diss., Université de Toulouse (2023-....), 2024. http://www.theses.fr/2024TLSEP093.
Prognostics and Health Management (PHM) with data-driven techniques is heavily dependent upon the availability of extensive, high-quality datasets, a requirement often difficult to fulfill in industrial condition monitoring environments. This discrepancy creates a significant gap between state-of-the-art PHM methodologies and their practical application in real-world scenarios. The prevailing focus of data-driven PHM research on unimodal datasets highlights the potential of multimodal data to bridge this gap. This thesis explores the integration of multimodal data to advance PHM models for industrial machines. It systematically addresses pivotal challenges such as data missingness and noise, sparse and irregular datasets, class imbalance, and the scarcity of run-to-failure data. The research develops innovative methodologies that incorporate multiple data modalities and harness domain-specific expertise to create robust predictive models. The primary contributions of this research are: 1. Cross-modal attention-based learning: a new multimodal learning method designed to mitigate the limitations posed by missing and noisy data. It integrates information across multiple modalities, thereby enhancing the accuracy and robustness of predictive models. 2. Expert-knowledge-assisted multimodal diagnostics: a methodology that combines domain expertise with multimodal learning to enable comprehensive diagnostics, improving fault detection and classification in industrial machinery. 3. Graph-based prognostics: an approach that constructs run-to-failure trajectories from incomplete data using graph-based techniques, offering a significant advancement in failure prognostics. These methodologies were rigorously validated using both simulated data and an industrial dataset of hydrogenerators, demonstrating significant improvements in PHM and predictive maintenance capabilities. The results underscore the potential of multimodal data to significantly enhance the reliability and efficiency of PHM methods and algorithms. This thesis proposes a comprehensive framework for leveraging diverse data sources and domain expertise, promising to transform maintenance strategies and reduce operational costs across various industries. The findings pave the way for future research and practical implementations, positioning multimodal data integration as a pivotal advancement in the field of PHM.
Hey, Silke. "Thermothérapies guidées par IRM : développements méthodologiques en thermométrie par IRM et méthodes d’asservissement automatique." Thesis, Bordeaux 1, 2010. http://www.theses.fr/2010BOR14191/document.
MR-guided high-intensity focused ultrasound (HIFU) using proton resonance frequency (PRF) based thermometry is a promising technique for non-invasive ablation in tumor therapy as well as for targeted drug delivery and the activation of transgenes. This work presents further developments in the field of PRF thermometry in the presence of periodic physiological motion and the associated magnetic field variations. Using the examples of thermometry in the human breast and the human heart, new correction strategies are presented which extend the established multi-baseline phase correction to include a model of the phase variation and external sensor readings from a pencil-beam navigator. In addition, further factors influencing the performance of MR thermometry in these organs, namely the presence of fat in the breast and blood flow in the heart, are examined. In the second part of this work, the issue of precise temperature control is approached in two ways. First, an improved proportional-integral-derivative (PID) controller using adaptive control parameters is developed. Second, by expanding the concept of temperature control to 3D, an implementation of volumetric heating is presented. A novel slice-sweep technique provides volumetric anatomic and temperature information in near-real time. Its combination with 2D motion compensation and adaptation of the ultrasound beam position makes it possible to achieve volumetric heating according to a pre-defined target temperature or thermal dose value even in the presence of motion.
Ben, Fredj Feten. "Méthode et outil d’anonymisation des données sensibles." Thesis, Paris, CNAM, 2017. http://www.theses.fr/2017CNAM1128/document.
Personal data anonymization requires complex algorithms aiming at avoiding disclosure risk without losing data utility. In this thesis, we describe a model-driven approach guiding the data owner during the anonymization process. The guidance may be informative or suggestive. It helps the data owner in choosing the most relevant algorithm given the data characteristics and the future usage of anonymized data. The guidance process also helps in defining the best input values for the algorithms. In this thesis, we focus on generalization algorithms for micro-data. The knowledge about anonymization is composed of both theoretical aspects and experimental results. It is managed thanks to an ontology
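As an illustration of the generalization algorithms the abstract refers to, here is a minimal, hypothetical sketch: a single quasi-identifier (age) is coarsened into ever wider intervals until every equivalence class contains at least k records. Real anonymizers handle several quasi-identifiers and utility metrics jointly, which is exactly where the thesis's guidance process comes in:

```python
from collections import Counter

def generalize_ages(ages, k, widths=(1, 5, 10, 20, 50)):
    """Coarsen age into ever wider bins until k-anonymity holds."""
    for w in widths:
        bins = [f"[{a - a % w}-{a - a % w + w - 1}]" for a in ages]
        if min(Counter(bins).values()) >= k:
            return w, bins
    raise ValueError("no width achieves k-anonymity")

ages = [23, 24, 27, 31, 33, 38, 41, 44, 46, 52]
w, bins = generalize_ages(ages, k=2)
```

The trade-off the thesis formalizes is visible even here: wider bins guarantee anonymity but destroy utility, so choosing the input values (here, the candidate widths) matters.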
Moreau, Frédérique. "Méthodes de traitement de données géophysiques par transformée en ondelettes." Phd thesis, Université Rennes 1, 1995. http://tel.archives-ouvertes.fr/tel-00656040.
Trellet, Mikael. "Exploration et analyse immersives de données moléculaires guidées par la tâche et la modélisation sémantique des contenus." Thesis, Université Paris-Saclay (ComUE), 2015. http://www.theses.fr/2015SACLS262/document.
In structural biology, the theoretical study of molecular structures involves four main activities organized in the following scenario: collection of experimental and theoretical data, visualization of 3D structures, molecular simulation, and analysis and interpretation of results. This pipeline allows the expert to develop new hypotheses, verify them experimentally, and produce new data as a starting point for a new scenario. The explosion in the amount of data to handle in this loop raises two problems. Firstly, the resources and time dedicated to transferring and converting data between each of these four activities increase significantly. Secondly, the complexity of molecular data generated by new experimental methodologies greatly increases the difficulty of properly collecting, visualizing, and analyzing the data. Immersive environments are often proposed to address the quantity and increasing complexity of the modeled phenomena, especially during the visualization activity. Indeed, virtual reality offers high-quality stereoscopic perception, useful for a better understanding of inherently three-dimensional molecular data. It can also display a large amount of information thanks to large display surfaces, and complete the immersive experience with other sensorimotor channels (3D audio, haptic feedback, ...). However, two major factors hinder the use of virtual reality in the field of structural biology. On the one hand, although there is literature on navigation in realistic virtual scenes, navigation in abstract scientific environments is still very little studied. The understanding of complex 3D phenomena is, however, particularly conditioned by the subject's ability to locate themselves within the phenomenon. The first objective of this thesis is therefore to propose 3D navigation paradigms adapted to molecular structures of increasing complexity.
On the other hand, the interactive context of immersive environments encourages direct interaction with the objects of interest, whereas the activities of data collection, simulation, and analysis assume a working environment based on command-line inputs or specific scripts associated with the tools. Usually, the use of virtual reality is therefore restricted to the exploration and visualization of molecular structures. The second objective of this thesis is then to bring all these activities, previously carried out in independent application contexts, within a homogeneous and unique interactive context. In addition to minimizing the time spent on data management between different work contexts, the aim is also to present molecular structures and analyses jointly and simultaneously, and to allow their manipulation through direct interaction. Our contribution meets these objectives through an approach guided by both the content and the task. More precisely, navigation paradigms have been designed taking into account the molecular content, especially geometric properties, and the tasks of the expert, in order to facilitate spatial referencing in molecular complexes and make the exploration of these structures more efficient. In addition, formalizing the nature of molecular data, their analyses, and their visual representations makes it possible to interactively propose analyses adapted to the nature of the data and to create links between molecular components and associated analyses. These features rely on the construction of a unified and expressive semantic representation enabling the integration of these activities in a unique interactive context.
Park, Soo-Uk. "Les Méthodes d'analyse par enveloppement des données : généralisations et applications nouvelles." Paris, EHESS, 1994. http://www.theses.fr/1994EHES0064.
Our work, in its theoretical part, presents the concept of productive efficiency, which is linked to the problem of the performance of a production unit on a market. This concept has for some time been the object of important advances, in particular DEA (Data Envelopment Analysis) methods, which rest on linear programming. All these approaches are developed here in a renewed form, with some original results. The dissertation then applies some variants of the DEA method to the Korean industry of thermal electricity production. It proposes an attempt at interpreting the DEA efficiencies measured in our sample of power plants by introducing them as exogenous variables in econometric production function models.
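The DEA efficiency scores discussed above come from solving one small linear program per production unit. A minimal sketch of the input-oriented CCR model using `scipy.optimize.linprog` follows; the three "plants" and their input/output figures are invented, and the thesis's models are richer variants:

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y, j0):
    """Input-oriented CCR efficiency of DMU j0 (columns of X, Y are DMUs).

    min theta  s.t.  X @ lam <= theta * x_j0,   Y @ lam >= y_j0,   lam >= 0
    """
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]              # decision vars: [theta, lam_1..lam_n]
    A_in = np.c_[-X[:, [j0]], X]             # input constraints
    A_out = np.c_[np.zeros((s, 1)), -Y]      # output constraints (flipped to <=)
    A = np.r_[A_in, A_out]
    b = np.r_[np.zeros(m), -Y[:, j0]]
    res = linprog(c, A_ub=A, b_ub=b, bounds=[(0, None)] * (n + 1))
    return res.x[0]

X = np.array([[2.0, 4.0, 8.0]])   # one input (e.g. fuel), three plants
Y = np.array([[1.0, 4.0, 4.0]])   # one output (e.g. electricity)
effs = [dea_ccr_input(X, Y, j) for j in range(3)]
```

Plant B produces the best output-to-input ratio and scores 1.0 (efficient); A and C score 0.5, meaning they could in principle produce their output with half the input.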
Afonso, Filipe. "Méthodes prédictives par extraction de règles en présence de données symboliques." Paris 9, 2005. https://portail.bu.dauphine.fr/fileviewer/index.php?doc=2005PA090067.
We first extend linear regression methods to the case of interval, diagram, histogram, taxonomical, and hierarchical variables. Second, association rules are extended to the case of interval, diagram, taxonomical, and hierarchical variables. These methods make it possible to discover rules at the level of concepts. For example, instead of mining rules between different items of transactions recorded in a retail organization, as in the classical case, we discover rules at the level of customers in order to study their purchase behavior. The Apriori algorithm for the extraction of association rules is extended to these symbolic data. Symbolic linear regression is then used to study and select the symbolic association rules. Afterwards, we give a mathematical foundation to these association rules using Galois lattice theory.
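The Apriori algorithm that the abstract extends can be sketched in its classical, non-symbolic form: level-wise candidate generation with support-based pruning (any superset of an infrequent itemset is skipped). The retail items below are invented:

```python
from itertools import combinations

def apriori(transactions, min_support):
    """Frequent itemsets by level-wise candidate generation (Apriori)."""
    n = len(transactions)
    singles = {frozenset([i]) for t in transactions for i in t}
    freq = {}
    level = {s for s in singles
             if sum(s <= t for t in transactions) / n >= min_support}
    k = 1
    while level:
        for s in level:
            freq[s] = sum(s <= t for t in transactions) / n
        # join step: size-(k+1) candidates whose k-subsets are all frequent
        cands = {a | b for a in level for b in level if len(a | b) == k + 1}
        level = {c for c in cands
                 if all(frozenset(sub) in freq for sub in combinations(c, k))
                 and sum(c <= t for t in transactions) / n >= min_support}
        k += 1
    return freq

T = [frozenset(t) for t in (("bread", "milk"), ("bread", "butter"),
                            ("bread", "milk", "butter"), ("milk",))]
freq = apriori(T, min_support=0.5)
```

In the symbolic setting of the thesis, the "transactions" would instead be concepts (customers) described by interval or taxonomical variables, but the pruning principle is the same.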
Giraudel, Jean-Luc. "Exploration des données et prédiction en écologie par des méthodes d'intelligence artificielle." Toulouse 3, 2001. http://www.theses.fr/2001TOU30138.
Повний текст джерелаDruet, Tom. "Tomographie passive par ondes guidées pour des applications de contrôle santé intégré." Thesis, Valenciennes, 2017. http://www.theses.fr/2017VALE0032/document.
This manuscript presents a baseline-free quantitative method for imaging corrosion flaws in thin plates. The method only requires an embedded network of guided-wave sensors operating in a fully passive way. The target applications are Structural Health Monitoring (SHM) of critical structures with heavy constraints on both sensor intrusiveness and diagnostic reliability. A promising solution for increasing the number of measurement points without increasing the intrusiveness of the system is provided by Fiber Bragg Gratings (FBGs). However, unlike the piezoelectric transducers generally used in SHM, FBGs cannot emit elastic waves. The idea consists in using passive methods to retrieve the Green's function from elastic diffuse fields, naturally present in structures, measured simultaneously by two sensors. In this manuscript, two passive methods are studied: ambient noise correlation and the passive inverse filter. It is shown that the latter gives better results when coupled with tomography. Several tomography algorithms are assessed with numerical simulations and then applied to active and passive datasets measured by a PZT network. In order to make passive tomography robust, a time-of-flight identification method is proposed, based on a time-frequency representation. Finally, a novel experimental demonstration of passive measurements with FBGs only is presented, suggesting high potential for FBG-based passive tomography.
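The ambient-noise-correlation idea studied in this thesis — recovering travel-time information between two passive sensors that share a diffuse field — can be demonstrated in an idealized 1-D setting. The signals below are synthetic, not the manuscript's data:

```python
import numpy as np

rng = np.random.default_rng(1)
n, delay = 4096, 15                            # record length; propagation delay in samples
field = rng.standard_normal(n + delay)         # diffuse elastic noise
a = field[delay:]                              # sensor A
b = field[:n] + 0.5 * rng.standard_normal(n)   # sensor B: same field 'delay' samples later, plus local noise
xcorr = np.correlate(b, a, mode="full")        # lags run from -(n-1) to n-1
lag = int(np.argmax(xcorr)) - (n - 1)          # peak lag estimates the travel time
```

The correlation peak sits at the propagation delay even though neither sensor ever emitted anything; in the thesis, such passively estimated times of flight (refined by the passive inverse filter) feed the tomography algorithms.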
Nauleau, Pierre. "Vers la mesure d'ondes circonférentielles guidées par la coque corticale du col du fémur." Phd thesis, Université Pierre et Marie Curie - Paris VI, 2013. http://tel.archives-ouvertes.fr/tel-00931778.
Loslever, Pierre. "Étude ergonomique du poste bureautique : Approche par les méthodes multidimensionnelles d'analyse des données." Valenciennes, 1988. https://ged.uphf.fr/nuxeo/site/esupversions/0fb2137f-578e-43ec-9bea-1ae708b27a3a.
Barras, Jordan. "Prédiction modale du rayonnement d’ondes élastiques guidées par une source quelconque dans une structure fine - application au contrôle non destructif." Electronic Thesis or Diss., université Paris-Saclay, 2020. http://www.theses.fr/2020UPASG020.
This thesis focuses on the modelling of the propagation of elastic guided waves (GW) in thin plate-like structures for their non-destructive testing (NDT). These waves are generated by an arbitrary ultrasonic transducer positioned on the surface of the part, for example a PZT ceramic or an EMAT. A semi-analytical model, the so-called GW modal pencil model, has been developed. It is based on the geometrical acoustics approximation, which makes it particularly efficient from a numerical point of view and able to simulate GW propagation over long distances faster than conventional numerical models. The displacement field is then predicted only at the points of interest. The pencil model can generically handle multiple reflections of GW on plate edges. It also takes into account plates made of either metallic (isotropic) or composite (anisotropic) materials. The plate can be curved and have continuously varying mechanical properties. The waveforms are obtained in the form of their modal decomposition, which greatly eases their interpretation. Finally, comparisons with a finite element model allow the pencil approach to be validated.
Soulez, Ferréol. "Une approche problèmes inverses pour la reconstruction de données multi-dimensionnelles par méthodes d'optimisation." Phd thesis, Université Jean Monnet - Saint-Etienne, 2008. http://tel.archives-ouvertes.fr/tel-00379735.
The "inverse problems" approach consists in seeking causes from their effects, that is, estimating the parameters describing a system from observations of it. To do so, a physical model describing the cause-and-effect links between parameters and observations is used; the term "inverse" refers to the inversion of this direct model. However, while the same causes generally produce the same effects, a given effect may arise from different causes, and priors must often be introduced to reduce the ambiguity of the inversion. In this work, the problem is solved by using optimization methods to estimate the parameters minimizing a cost function that combines a term derived from the data-formation model and a prior term.
We use this approach to address the blind deconvolution of heterogeneous multidimensional data, that is, data whose different dimensions have different meanings and units. To this end, we established a general framework with a separable prior term, which we successfully adapted to several applications: deconvolution of multi-spectral data in astronomy, of color images in Bayer imaging, and blind deconvolution of biomedical video sequences (coronarography, classical and confocal microscopy).
The same approach was applied to digital holography for particle image velocimetry (DH-PIV). A hologram of spherical micro-particles is composed of diffraction patterns carrying information about the 3D position and radius of these particles. Using a physical model of hologram formation, the inverse-problems approach freed us from the problems associated with hologram reconstruction (edge effects, twin images, ...) and allowed the 3D positions and radii of the particles to be estimated with an accuracy improved by at least a factor of 5 compared with classical reconstruction-based methods. Moreover, with this method we were able to detect particles outside the sensor's field of view, enlarging the volume of interest by a factor of 16.
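The cost-function formulation described above — a data-fidelity term derived from the direct model plus a prior term, minimized by an optimization method — can be sketched on a toy 1-D deconvolution problem. For simplicity the prior here is a quadratic (ridge) penalty rather than the separable priors used in the thesis, and the data are synthetic:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 64
kernel = np.array([0.25, 0.5, 0.25])            # direct model: symmetric blur
H = lambda u: np.convolve(u, kernel, mode="same")

x_true = np.zeros(n)
x_true[20], x_true[40] = 1.0, -0.7              # two spikes to recover
y = H(x_true) + 0.01 * rng.standard_normal(n)   # noisy observation

mu = 0.05                                       # weight of the prior term
x = np.zeros(n)
step = 0.4
for _ in range(2000):                           # gradient descent on ||Hx - y||^2 + mu*||x||^2
    r = H(x) - y
    x -= step * (2 * H(r) + 2 * mu * x)         # the kernel is symmetric, so H^T = H here
```

The prior term keeps the inversion stable where the blur destroys information; the recovered signal peaks at the true spike locations instead of exploding into amplified noise, which is the ambiguity-reduction role of priors described in the abstract.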
Ollier, Edouard. "Sélection de modèles statistiques par méthodes de vraisemblance pénalisée pour l'étude de données complexes." Thesis, Lyon, 2017. http://www.theses.fr/2017LYSEN097.
This thesis is mainly devoted to the development of penalized maximum likelihood methods for the study of complex data. A first work deals with the selection of generalized linear models in the framework of stratified data, characterized by the measurement of observations as well as covariates within different groups (or strata). The purpose of the analysis is then to determine which covariates influence the observations in a global way (whatever the stratum), but also to evaluate the heterogeneity of this effect across the strata. Secondly, we are interested in the selection of nonlinear mixed-effects models used in the analysis of longitudinal data. In a first work, we describe a SAEM-type algorithm in which the penalty is taken into account during the M step by solving a penalized regression problem at each iteration. In a second work, inspired by proximal gradient algorithms, we simplify the M step of the penalized SAEM algorithm previously described by performing only one proximal gradient iteration at each iteration. This algorithm, called the Stochastic Approximation Proximal Gradient algorithm (SAPG), corresponds to a proximal gradient algorithm in which the gradient of the likelihood is approximated by a stochastic approximation technique. Finally, we present two statistical modeling works carried out during this thesis.
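The proximal gradient iteration at the heart of the SAPG idea can be illustrated, outside the mixed-effects setting, on the classic lasso problem: a gradient step on the smooth data term followed by the soft-thresholding proximal operator of the L1 penalty. This sketch uses a deterministic gradient and synthetic data, not the stochastic approximation of the thesis:

```python
import numpy as np

def ista(A, y, lam, steps=1000):
    """Proximal gradient (ISTA) for min 0.5*||A x - y||^2 + lam * ||x||_1."""
    L = np.linalg.norm(A, 2) ** 2               # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        g = A.T @ (A @ x - y)                   # gradient of the smooth data term
        z = x - g / L                           # gradient step
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-thresholding prox
    return x

rng = np.random.default_rng(3)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[[5, 37, 80]] = (2.0, -1.5, 1.0)          # sparse ground truth
y = A @ x_true + 0.01 * rng.standard_normal(40)
x_hat = ista(A, y, lam=0.2)
```

The penalty performs selection during optimization: the three true coefficients dominate the estimate, which is the mechanism penalized SAEM exploits for covariate selection in mixed-effects models.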
Hoang, Huu Tinh. "Contrôle santé intégré passif par ondes élastiques guidées de tuyauteries pour applications nucléaires et pétrolières." Thesis, Valenciennes, Université Polytechnique Hauts-de-France, 2020. http://www.theses.fr/2020UPHF0023.
Structural Health Monitoring (SHM) consists in embedding sensors into a structure in order to monitor its health in real time throughout its lifetime. The research carried out in this thesis aimed at developing a new approach to SHM for the detection of corrosion/erosion in pipes. This manuscript presents a new quantitative imaging method, called passive elastic guided wave tomography, based on the use of an embedded network of piezoelectric sensors (PZT) listening to and analyzing only the ambient elastic noise naturally generated by the fluid circulation in pipes. This passive method offers many advantages for an SHM system, such as reduced energy consumption, simplified electronics, and the ability to perform an inspection while the structure is in operation. In addition, this passive method makes it possible for SHM systems to use Fiber Bragg Grating sensors (FBG), which have several advantages over traditional PZT sensors (low intrusiveness, resistance to harsh environments, etc.) but which are not able to emit waves. A first demonstration of the feasibility of corrosion/erosion imaging by FBG is illustrated experimentally thanks to a result obtained by hybrid tomography, in which wave emission is performed by PZT and reception by FBG. All these works offer promising perspectives for the application of passive tomography on industrial structures using a pure FBG system. Among the various results presented in this thesis, we also show that corrosion/erosion defects can be characterized by tomography on a straight pipe without the need for a baseline measurement in a pristine state. This is feasible using a new method of auto-calibration of the data used for tomography. The absence of a baseline measurement makes the method very reliable and avoids false alarms. Finally, preliminary studies on tomography for more complex structures, such as a bent pipe, have been carried out and validated through simulations.
Barsoum, Baher Albert. "Classification automatique par amincicement de l'histogramme multidimensionnel." Lille 1, 1994. http://www.theses.fr/1994LIL10195.
Hamdan, Hani. "Développement de méthodes de classification pour le contrôle par émission acoustique d'appareils à pression." Compiègne, 2005. http://www.theses.fr/2005COMP1583.
This PhD thesis deals with real-time computer-aided decision making for acoustic emission-based control of pressure equipment. The problem addressed is the taking into account of the location uncertainty of acoustic emission signals in mixture model-based clustering. Two new algorithms (EM and CEM for uncertain data) are developed. These algorithms are based solely on uncertainty-zone data, and their development is carried out by optimizing new likelihood criteria adapted to this kind of data. In order to speed up processing when the data size becomes very large, we have also developed a new method for the discretization of uncertainty-zone data. This method is compared with the traditional one applied to imprecise data. An experimental study using simulated and real data shows the efficiency of the various approaches developed.
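The EM algorithm that the thesis adapts to uncertain data can be recalled in its standard form for a one-dimensional Gaussian mixture; this sketch alternates the E step (responsibilities) and M step (parameter re-estimation) and ignores the location-uncertainty extension that is the thesis's actual contribution:

```python
import numpy as np

def em_gmm(x, iters=100):
    """Standard EM for a two-component 1-D Gaussian mixture."""
    mu = np.quantile(x, [0.1, 0.9])          # spread-out initial means
    var = np.full(2, x.var())
    w = np.full(2, 0.5)
    for _ in range(iters):
        # E step: posterior responsibility of each component for each point
        d = np.exp(-(x[:, None] - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
        r = w * d
        r /= r.sum(axis=1, keepdims=True)
        # M step: weighted re-estimation of weights, means, variances
        nk = r.sum(axis=0)
        w = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return w, mu, var

rng = np.random.default_rng(4)
x = np.r_[rng.normal(0.0, 1.0, 300), rng.normal(6.0, 1.0, 200)]
w, mu, var = em_gmm(x)
```

In the thesis's variant, each point x would be replaced by an uncertainty zone, and the E step would integrate the component densities over that zone rather than evaluating them at a single location.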
Bayat, Makoei Sahar. "Étude de l’accès à la transplantation rénale en Lorraine par méthodes biostatistiques conventionnelles et par fouille de données." Thesis, Nancy 1, 2008. http://www.theses.fr/2008NAN10049/document.
Among renal replacement therapy techniques, transplantation is associated with longer survival and lower long-term cost. We studied access to renal transplantation in the NEPHROLOR network of care in the French administrative region of Lorraine. All incident patients beginning renal replacement therapy in NEPHROLOR between July 1, 1997 and June 30, 2003 (1725 patients) were included in the study. In Lorraine, access to the renal transplant waiting list, and to transplantation after registration on the list, was primarily associated with medical determinants. Nevertheless, patients followed up in the nephrology department performing transplantation were more likely to be placed on the waiting list. After taking comorbidities into account, transplantation was associated with longer survival even among elderly patients (> 60 years). Moreover, patients' survival was associated with the medical determinants of access to the renal transplant waiting list. These facts validate the current registration process in Lorraine. However, access to the waiting list in the NEPHROLOR network can be improved by encouraging nephrology facilities without transplantation activity to extend the selection criteria for transplant candidates. Data mining algorithms can represent the probability of being registered on the list based on the patient's characteristics. Using such an algorithm at a patient's first registration in the NEPHROLOR information system could optimize the renal transplant registration process and improve access to renal transplantation in NEPHROLOR.
Da, Rugna Jérôme. "De l'usage des méthodes bas niveau pour la recherche d'images par le contenu." Saint-Etienne, 2004. http://www.theses.fr/2004STET4015.
This work deals with content-based image retrieval and, more precisely, the contribution of low-level methods. After discussing the various existing approaches, we recall the semantic gap between user expectations and what retrieval systems actually offer. Most of these approaches rely on a preliminary segmentation step whose validity and robustness must be studied. We therefore propose an evaluation protocol and a practical example of benchmarks. Its originality consists in not comparing a segmentation with a theoretical reference but in objectively judging its stability. The third part of this document introduces three specific contributions likely to improve the retrieval chain. First, a blur detector extracts a piece of metadata carried by the image: the unblurred regions, which are a priori in focus. Second, we present a descriptor based on the extraction of emergent areas using only color criteria. This process, combined with adapted distances, may allow, for example, color pre-filtering before the similarity search step. Finally, we briefly introduce a histogram algebra able to exploit as well as possible the information contained in this type of descriptor, via a specific query language.
Bevilacqua, Elsa. "Etude chimique et minéralogique des peintures : analyse sur poudre par méthodes X, traitement de données." Lille 1, 1989. http://www.theses.fr/1989LIL10155.
Carme, Sylvain. "Méthodes d'assimilation de données par Filtrage de Kalman dans un modèle réaliste de l'Atlantique Nord." Université Joseph Fourier (Grenoble), 1999. http://www.theses.fr/1999GRE10161.
Milion, Chloé. "Méthodes et modèles pour l’étude de la mobilité des personnes par l’exploitation de données de radiotéléphonie." Thesis, Paris Est, 2015. http://www.theses.fr/2015PESC1199/document.
This work builds on the close relationship between two domains used every day, namely transportation and telecommunications. Owing to the daily and intensive usage of both networks, the actors of each domain express needs to guarantee the services delivered to their end-users and their quality. We therefore propose to transportation actors measurements of performed trips that can be combined with domain knowledge to ease decision making on matters ranging from land use to network operation. The methodologies presented here for trip measurement are based on the exploitation of the digital footprints found within a telecommunication network. These footprints reflect how the network is used and already exist for operating purposes. The methods proposed in this work result from our knowledge of telecommunication mechanisms and from the huge amount of data generated at every time and place where Orange operates. We show that mobile equipment carried by individuals, whose activity we capture, can lead to estimates of trip attributes, origin-destination trip tables, quality-of-service indicators, and quantification of the explanatory factors of trip choices. We also show how mining usage relationships through signaling data can lead to the characterization of land use.
Hammami, Imen. "Fusion d'images de télédétection hétérogènes par méthodes crédibilistes." Thesis, Ecole nationale supérieure Mines-Télécom Atlantique Bretagne Pays de la Loire, 2017. http://www.theses.fr/2017IMTA0034/document.
With the advent of new image acquisition techniques and the emergence of high-resolution satellite systems, the remote sensing data to be exploited have become increasingly rich and varied. Combining them has thus become essential to improve the extraction of useful information related to the physical nature of the observed surfaces. However, these data are generally heterogeneous and imperfect, which poses several problems in their joint treatment and requires the development of specific methods. This thesis falls within this context and aims at developing a new evidential fusion method dedicated to the processing of heterogeneous high-resolution remote sensing images. To achieve this objective, we first focus on the development of a new approach for belief function estimation based on Kohonen's map, in order to simplify the mass assignment operation for the large volumes of data occupied by these images. The proposed method makes it possible to model not only the ignorance and imprecision of our sources of information, but also their paradox. We then exploit this estimation approach to propose an original fusion technique that solves the problems due to the wide variety of knowledge provided by these heterogeneous sensors. Finally, we study the way in which the dependence between these sources can be considered in the fusion process using copula theory; a new technique for choosing the most appropriate copula is introduced. The experimental part of this work is devoted to land-use mapping of agricultural areas using SPOT-5 and RADARSAT-2 images. The experimental study carried out demonstrates the robustness and effectiveness of the approaches developed in the framework of this thesis.
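The evidential machinery underlying such fusion methods can be illustrated with Dempster's classical rule of combination on a toy two-class frame. The sensor mass values below are invented, and the thesis's actual contribution lies in estimating such masses with Kohonen maps and handling source dependence with copulas:

```python
from itertools import product

def dempster(m1, m2):
    """Combine two mass functions over frozenset focal elements (Dempster's rule)."""
    out, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            out[inter] = out.get(inter, 0.0) + x * y
        else:
            conflict += x * y                       # mass assigned to the empty set
    return {k: v / (1.0 - conflict) for k, v in out.items()}, conflict

# frame of discernment: water (W) vs vegetation (V); W|V encodes ignorance
W, V = frozenset("W"), frozenset("V")
WV = W | V
m_optical = {W: 0.6, WV: 0.4}                       # optical sensor, partly ignorant
m_radar = {W: 0.5, V: 0.3, WV: 0.2}                 # radar sensor
m, conflict = dempster(m_optical, m_radar)
```

The combined mass concentrates on "water" while the conflict term (0.18 here) measures the disagreement between the two sensors; note that classical Dempster combination assumes independent sources, which is precisely the assumption the copula-based part of the thesis relaxes.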
Pennerath, Frédéric. "Méthodes d'extraction de connaissances à partir de données modélisables par des graphes : Application à des problèmes de synthèse organique." Phd thesis, Université Henri Poincaré - Nancy I, 2009. http://tel.archives-ouvertes.fr/tel-00436568.
Full text available.
Saley, Issa. "Modélisation des données d'attractivité hospitalière par les modèles d'utilité." Thesis, Montpellier, 2017. http://www.theses.fr/2017MONTS044/document.
Full text available. Understanding how patients choose hospitals is of utmost importance for both hospital administrators and healthcare decision makers; the former for anticipating patient inflow, the latter for regulation. In this thesis, we present different methods for modelling patient admission data in order to forecast patient inflow and compare hospital attractiveness. The first two methods use count data models with possible spatial dependency; they are illustrated on patient admission data in Languedoc-Roussillon. The third method uses discrete choice models (RUMs). Because of some limitations of these models with respect to our goal, we introduce a new approach in which the assumption of utility maximisation is relaxed in favour of a utility threshold: an agent (patient) can choose an alternative (hospital) as soon as he believes that a certain level of satisfaction can be obtained by doing so, according to some aspects. The approach is illustrated on 2009 asthma admission data in Hérault.
Le, Duff Franck. "Enrichissement quantitatif et qualitatif de relations conceptuelles des bases de connaissances médicales par extraction et héritage automatique par des méthodes informatiques et probabilistes." Rennes 1, 2006. http://www.theses.fr/2006REN1B094.
Full text available.
Mougel, Pierre-Nicolas. "Découverte de collections homogènes de sous-graphes denses par des méthodes de fouille de données sous contraintes." Phd thesis, INSA de Lyon, 2012. http://tel.archives-ouvertes.fr/tel-00858751.
Full text available.
Selingue, Maxime. "amélioration de la précision de structures sérielles poly-articulées par des méthodes d'apprentissage automatique économes en données." Electronic Thesis or Diss., Paris, HESAM, 2023. http://www.theses.fr/2023HESAE085.
Full text available. The evolution of production methods in the context of Industry 4.0 has led to the use of collaborative and industrial robots for tasks such as drilling, machining, and assembly. These tasks require an accuracy of around a tenth of a millimetre, whereas the precision of these robots is in the range of one to two millimetres. Robotic integrators have therefore proposed calibration methods aimed at establishing a model of the robot's behaviour that is more reliable and representative of the real world. Analytical calibration methods model the defects affecting the accuracy of industrial robots: geometric defects, joint compliance, transmission errors, and thermal drift. Given the difficulty of experimentally identifying the parameters of some of these analytical models, hybrid calibration methods have been developed. These combine an analytical model with a machine learning model whose role is to accurately predict the residual positioning errors caused by the inaccuracies of the analytical model; these errors can then be compensated in advance through a compensation algorithm. However, such methods require a significant amount of time and data, and are no longer valid when the robot's payload changes. The objective of this thesis is to improve hybrid calibration methods so that they are applicable in industrial contexts. Several contributions are made. First, two neural-network-based methods adapt the hybrid model to a new payload within the robot's workspace using very little data, relying respectively on transfer learning and on prediction interpolation. Then, a hybrid calibration method using active learning with Gaussian process regression is presented: in an iterative process, the system autonomously selects the most relevant data to acquire, enabling calibration that is optimised in terms of both data and time.
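The active-learning loop described in this abstract — iteratively choosing the next measurement where the model is least certain — can be sketched with a minimal Gaussian process regression in one dimension. Everything below (the RBF kernel settings, the toy positioning-error profile) is an illustrative assumption, not the model actually used in the thesis.

```python
import numpy as np

def rbf_kernel(a, b, length=0.5, var=1.0):
    """Squared-exponential kernel between two 1-D sample sets."""
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

def gp_posterior(x_train, y_train, x_query, noise=1e-6):
    """GP posterior mean and pointwise variance at the query points."""
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf_kernel(x_train, x_query)
    Kss = rbf_kernel(x_query, x_query)
    mean = Ks.T @ np.linalg.solve(K, y_train)
    cov = Kss - Ks.T @ np.linalg.solve(K, Ks)
    return mean, np.clip(np.diag(cov), 0.0, None)

def active_learning(error_fn, candidates, n_rounds=5):
    """Iteratively measure the candidate pose where the GP is most uncertain."""
    x = np.array([candidates[0], candidates[-1]])   # two seed measurements
    y = error_fn(x)
    for _ in range(n_rounds):
        _, var = gp_posterior(x, y, candidates)
        pick = candidates[np.argmax(var)]           # most uncertain pose
        x = np.append(x, pick)
        y = np.append(y, error_fn(np.array([pick])))
    return x, y

# toy positioning-error profile along one joint coordinate (assumption)
err = lambda q: 0.1 * np.sin(3.0 * q) + 0.02 * q
grid = np.linspace(0.0, 2.0, 50)
xs, ys = active_learning(err, grid)
```

Because the posterior variance collapses at already-measured poses, each round spends its measurement budget on a new, informative region of the workspace, which is the data-economy argument of the abstract.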
Allain, Guillaume. "Prévision et analyse du trafic routier par des méthodes statistiques." Toulouse 3, 2008. http://thesesups.ups-tlse.fr/351/.
Full text available. The industrial partner of this work is Mediamobile/V-trafic, a company which processes and broadcasts live road-traffic information. The goal of our work is to enhance traffic information with forecasting and spatial extension. Our approach is sometimes inspired by physical modelling of traffic dynamics, but it mainly uses statistical methods in order to propose self-organising and modular models suitable for industrial constraints. In the first part of this work, we describe a method to forecast traffic speed within a time frame of a few minutes up to several hours. Our method is based on the assumption that traffic on a road network can be summarised by a few typical profiles, linked to the users' periodic behaviours. We therefore assume that the speed curves observed at each point of the network stem from a probabilistic mixture model. The following parts of our work present how the general method can be refined. Medium-term forecasting uses variables built from the calendar; the mixture model still stands, and we additionally use a functional regression model to forecast speed curves. We then introduce a local regression model in order to simulate short-term traffic dynamics; the kernel function is built from real speed observations and integrates some knowledge about traffic dynamics. The last part of our work focuses on the analysis of speed data from vehicles in traffic. These observations are gathered sporadically in time and along the road segments; the resulting data are completed and smoothed by local polynomial regression.
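The assumption that network speeds boil down to a few typical daily profiles can be sketched by clustering speed curves. The thesis uses a probabilistic mixture model; the plain k-means below (with farthest-point initialisation) and the two synthetic "commuter"/"weekend" profiles are only illustrative stand-ins.

```python
import numpy as np

def kmeans_profiles(curves, k, n_iter=20):
    """Cluster daily speed curves into k typical profiles (plain k-means,
    deterministic farthest-point initialisation)."""
    centers = [curves[0]]
    for _ in range(k - 1):
        # next seed: curve farthest from all current seeds
        d = np.min([((curves - c) ** 2).sum(axis=1) for c in centers], axis=0)
        centers.append(curves[np.argmax(d)])
    centers = np.array(centers, dtype=float)
    for _ in range(n_iter):
        # assign each curve to the closest profile, then update profiles
        d = ((curves[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = curves[labels == j].mean(axis=0)
    return centers, labels

# synthetic data (assumption): 24 hourly speeds, "commuter" days with a
# morning rush-hour dip versus flat "weekend" days
hours = np.arange(24)
rng = np.random.default_rng(1)
commuter = 90 - 30 * np.exp(-0.5 * ((hours - 8) / 1.5) ** 2)
weekend = np.full(24, 85.0)
curves = np.vstack([commuter + rng.normal(0, 2, 24) for _ in range(30)]
                   + [weekend + rng.normal(0, 2, 24) for _ in range(30)])
profiles, labels = kmeans_profiles(curves, k=2)
```

Each recovered profile is then a "typical day" in the sense of the abstract; the mixture-model version additionally yields soft membership probabilities per observed curve.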
Jonas, Sarah Flora. "Méthodes de comparaisons de deux ou plusieurs groupes de données censurées par intervalle. Avec application en immunologie clinique." Thesis, Université Paris-Saclay (ComUE), 2018. http://www.theses.fr/2018SACLS317/document.
Full text available. In the context of survival data analysis, the comparison of several groups of individuals, where the event of interest is interval censored, represents a methodological challenge. When the monitoring of patients during the study is not continuous, the event of interest may occur between two observation dates; it is then said to be "interval censored". Tests comparing survival time distributions across several groups, adapted to interval censoring, have been developed (score tests, weighted pseudo log-rank tests, rank tests). In this context, we developed two new group comparison tests adapted to particular situations of interval censoring. The first test applies to a situation where the alternative hypothesis considers that the hazard functions cross. The second test concerns a situation where the study population has a fraction not at risk for the event of interest. Both tests have been applied to a real clinical immunology dataset.
Debèse, Nathalie. "Recalage de la navigation par apprentissage sur les données bathymètriques." Compiègne, 1992. http://www.theses.fr/1992COMPD538.
Full text available.
Abel, Nicolas. "Outils et méthodes pour les architectures reconfigurables dynamiquement à grain fin : Synthèse et gestion automatique des flux de données." Cergy-Pontoise, 2006. http://biblioweb.u-cergy.fr/theses/06CERG0301.pdf.
Full text available. This thesis presents tools and methodologies dedicated to fine-grain dynamically reconfigurable architectures. In the first part, after studying this reconfiguration mode, we describe a tool set improving the implementation of dynamic reconfiguration. We first optimise configuration storage and reconfiguration duration by proposing software compression tools and a hardware reconfiguration module, and then study the system-level management of the reconfigurable area. The management system, developed in a high-level language, makes configuration scheduling flexible. In the second part, we focus on automatic data-flow management, based on separating processing modules from data-flow management modules, the latter being entirely driven by the development tools and the management system. In this way, the system has a processing library and all the tools necessary to interconnect and schedule processing tasks in real time. All of the concepts studied have been implemented on the ARDOISE architecture.
Le Floch, Edith. "Méthodes multivariées pour l'analyse jointe de données de neuroimagerie et de génétique." Thesis, Paris 11, 2012. http://www.theses.fr/2012PA112214/document.
Full text available. Brain imaging is increasingly recognised as an interesting intermediate phenotype for understanding the complex path between genetics and behavioural or clinical phenotypes. In this context, a first goal is to propose methods to identify the part of genetic variability that explains some of the neuroimaging variability. Classical univariate approaches often ignore the potential joint effects that may exist between genes or the potential covariations between brain regions. Our first contribution is to improve the sensitivity of the univariate approach by taking advantage of the multivariate nature of the genetic data in a local way. Indeed, we adapt cluster-inference techniques from neuroimaging to Single Nucleotide Polymorphism (SNP) data, by looking for 1D clusters of adjacent SNPs associated with the same imaging phenotype. We then push the concept of clusters further and combine voxel clusters and SNP clusters, using a simple 4D cluster test that jointly detects brain and genome regions with high associations. We obtain promising preliminary results on both simulated and real datasets. Our second contribution is to investigate exploratory multivariate methods to increase the detection power of imaging genetics studies, by accounting for the potentially multivariate nature of the associations, at a longer range, on both the imaging and the genetics sides. Recently, Partial Least Squares (PLS) regression and Canonical Correlation Analysis (CCA) have been proposed to analyse genetic and transcriptomic data. Here, we transpose this idea to the genetics-versus-imaging context. Moreover, we investigate different regularisation strategies and dimension reduction techniques combined with PLS or CCA, to face the overfitting issues caused by the very high dimensionality of the data. We present a comparison study of the different strategies on both a simulated dataset and a real fMRI and SNP dataset. Univariate selection appears to be necessary to reduce the dimensionality. However, the generalisable and significant association uncovered on the real dataset by the two-step approach combining univariate filtering and L1-regularised PLS suggests that discovering meaningful imaging genetics associations calls for a multivariate approach.
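The two-step strategy highlighted above — univariate filtering followed by PLS — can be sketched as follows. The first PLS weight vector is computed directly as the leading singular pair of the cross-covariance matrix (standard for one PLS component); the synthetic SNP-like data, the sample sizes, and the number of kept predictors are all assumptions for illustration.

```python
import numpy as np

def univariate_filter(X, y, n_keep):
    """Keep the n_keep predictors with largest absolute correlation to y."""
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    corr = Xc.T @ yc / (np.linalg.norm(Xc, axis=0) * np.linalg.norm(yc) + 1e-12)
    return np.argsort(-np.abs(corr))[:n_keep]

def first_pls_component(X, Y):
    """First PLS weight vectors = leading singular pair of X^T Y."""
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc.T @ Yc, full_matrices=False)
    return U[:, 0], Vt[0]

# synthetic imaging-genetics-like data (assumption): 400 subjects, 500
# SNP-like predictors of which the first 5 drive a single phenotype
rng = np.random.default_rng(0)
X = rng.normal(size=(400, 500))
y = X[:, :5].sum(axis=1) + 0.1 * rng.normal(size=400)

kept = univariate_filter(X, y, n_keep=20)          # step 1: filtering
w, _ = first_pls_component(X[:, kept], y[:, None])  # step 2: PLS
```

The filtering step plays the dimension-reduction role the abstract argues is necessary; the PLS weights then concentrate on the truly associated predictors among those kept.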
Marot, Julien. "Méthodes par sous-espaces et d'optimisation : application au traitement d'antenne, à l'analyse d'images, et au traitement de données tensorielles." Aix-Marseille 3, 2007. http://www.theses.fr/2007AIX30051.
Full text available. This thesis is devoted to subspace-based and optimisation methods applied to array processing, image analysis and tensor signal processing. Definitions concerning an array processing problem and high-resolution methods are presented. We propose an optimisation method applied to source detection in the presence of phase distortions for a high number of sensors. We propose fast subspace-based methods for the estimation of straight-line offset and orientation, and several optimisation methods to estimate distorted contours, either nearly straight or circular. We provide a state of the art of multiway signal processing: truncation of the HOSVD, lower-rank tensor approximation, multiway Wiener filtering. We propose a procedure for nonorthogonal tensor flattening, using the method presented in the first part.
Chakib, Soundouss. "Les essences de géranium : influence pédoclimatique et saisonnière sur les essences marocaines. Classification par les méthodes d'analyse de données." Aix-Marseille 3, 1998. http://www.theses.fr/1998AIX30076.
Full text available.
Sandt, Christophe. "Identification de micro-organisme pathogènes impliqués dans les infections nosocomiales par spectroscopie infrarouge à transformée de Fourier et méthodes statistiques." Reims, 2003. http://www.theses.fr/2003REIMP204.
Full text available. We have used Fourier transform infrared (FTIR) spectroscopy to identify pathogenic microorganisms isolated in a clinical setting. We demonstrated the usefulness of the technique in the typing of Candida albicans. Using this method, four clinical applications were achieved: the epidemiological follow-up of HIV patients and ICU patients, the demonstration of nosocomial transmission of a C. albicans strain among neonates in a maternity ward, and the follow-up of strains from a patient with recurrent systemic candidiasis. We built a database containing more than 245 strains of Gram-negative and 270 strains of Gram-positive pathogenic bacteria belonging to 11 genera and 18 species. Validation of the database yielded 84.2% correct identification for Gram-negative and 94.7% for Gram-positive bacteria. We used FTIR microspectroscopy to evaluate the early identification of pathogenic bacteria and yeasts; a total of 100 Gram-negative and 60 Gram-positive strains, belonging to 9 genera and 15 species, were included.
Carles, Olivier. "Système de génération automatique de bases de données pour la simulation de situations de conduite fondée sur l'interaction de ses différents acteurs." Toulouse 3, 2001. http://www.theses.fr/2001TOU30160.
Full text available.
Agnaou, Youssef Joseph. "Analyse statistique de données de croissance humaine : estimation et ajustement paramétriques, non paramétriques, et par réseaux de neurones." Bordeaux 1, 2001. http://www.theses.fr/2001BOR12404.
Full text available.
Godoy Campbell, Matias. "Sur le problème inverse de détection d'obstacles par des méthodes d'optimisation." Thesis, Toulouse 3, 2016. http://www.theses.fr/2016TOU30220/document.
Full text available. This PhD thesis is dedicated to the study of the inverse problem of obstacle/object detection using optimisation methods. This problem consists in localising an unknown object ω inside a known bounded domain Ω by means of boundary measurements, more precisely a given Cauchy pair on a part Γobs of ∂Ω. We cover the scalar and vector settings of this problem, considering both the Laplace and the Stokes equations. In both cases, we rely on an identifiability result which ensures that there is a unique obstacle/object corresponding to the considered boundary measurements. The strategy used in this work is to reduce the inverse problem to the minimisation of a cost functional: the Kohn-Vogelius functional. This kind of approach is widely used and makes it possible to apply optimisation tools in numerical implementations. However, in order for the functional to be well defined, this approach requires a measurement on the whole exterior boundary ∂Ω. This last point leads us to first study the data completion problem, which consists in recovering the boundary conditions on the inaccessible region, i.e. on ∂Ω∖Γobs, from the Cauchy data on the accessible region Γobs. This inverse problem is also studied through the minimisation of a Kohn-Vogelius type functional, and its ill-posedness forces us to regularise the functional via a Tikhonov regularisation. We obtain several theoretical properties, such as convergence properties, in particular when the data is corrupted by noise. Based on these theoretical results, we reconstruct the boundary data numerically by implementing a gradient algorithm to minimise the regularised functional. We then study the obstacle detection problem when only partial boundary measurements are available. We consider the inaccessible boundary conditions and the unknown object as the variables of the functional and, using geometrical shape optimisation tools, in particular the shape gradient of the Kohn-Vogelius functional, we perform the numerical reconstruction of the unknown inclusion. Finally, we consider, in the two-dimensional vector case, a new degree of freedom by studying the case where the number of objects is unknown. We use topological shape optimisation to minimise the Kohn-Vogelius functional: we obtain the topological asymptotic expansion of the solution of the 2D Stokes equations and characterise the topological gradient of this functional, then determine numerically the number and location of the obstacles. Additionally, we propose a blending algorithm which combines topological and geometrical shape optimisation methods in order to determine numerically the number, location and shape of the objects.
Platzer, Auriane. "Mécanique numérique en grandes transformations pilotée par les données : De la génération de données sur mesure à une stratégie adaptative de calcul multiéchelle." Thesis, Ecole centrale de Nantes, 2020. http://www.theses.fr/2020ECDN0041.
Full text available. Computational mechanics is a field in which a large amount of data is both consumed and produced. On the one hand, recent developments in experimental measurement techniques have provided rich data for the identification of constitutive models used in finite element simulations. On the other hand, multiscale analysis produces a huge amount of discrete values of displacements, strains and stresses, from which knowledge is extracted about the overall material behaviour. The constitutive model then acts as a bottleneck between upstream and downstream material data. In contrast, Kirchdoerfer and Ortiz (Computer Methods in Applied Mechanics and Engineering, 304, 81-101) proposed a model-free computing paradigm, called data-driven computational mechanics, in which the material response is represented only by a database of raw material data (strain-stress pairs). The boundary value problem is thus reformulated as a constrained minimisation of the distance between (i) the mechanical strain-stress state of the body and (ii) the material database. In this thesis, we investigate the question of material data coverage, especially in the finite strain framework. The data-driven approach is first extended to a geometrically nonlinear setting: two alternative formulations are considered and a finite element solver is proposed for both. Second, we explore the generation of tailored databases using a mechanically meaningful sampling method; the approach is assessed by means of finite element analyses of complex structures exhibiting large deformations. Finally, we propose a prototype multiscale data-driven solver, in which the material database is adaptively enriched.
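The core ingredient of the data-driven paradigm described above — replacing the constitutive model by a raw strain-stress database — can be sketched, for a single material point in small-strain 1-D for simplicity, as a nearest-neighbour search under an energy-like norm. The modulus-like weight C and the sampled database are illustrative assumptions, not the thesis's finite-strain formulation.

```python
import numpy as np

def nearest_material_state(eps, sig, database, C=1.0):
    """Return the database pair closest to the mechanical state (eps, sig)
    under the energy-like distance C*d_eps^2 + d_sig^2/C."""
    d_eps = database[:, 0] - eps
    d_sig = database[:, 1] - sig
    dist = C * d_eps ** 2 + d_sig ** 2 / C
    return database[np.argmin(dist)]

# raw material data sampled from a (hidden) linear law sigma = 2*eps;
# in the data-driven setting only this point cloud is known (assumption)
eps_samples = np.linspace(-1.0, 1.0, 201)
database = np.column_stack([eps_samples, 2.0 * eps_samples])

# a mechanical state that does not lie on the material data
state = nearest_material_state(0.31, 0.55, database, C=2.0)
```

A full data-driven solver alternates this local projection onto the database with a global projection onto the set of compatible and equilibrated states; only the local half is shown here.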
Neuvial, Pierre. "Contributions à l'analyse statistique des données de puces à ADN." Paris 7, 2008. http://www.theses.fr/2008PA077073.
Full text available. This thesis deals with statistical questions raised by the analysis of high-dimensional genomic data for cancer research. In the first part, we study asymptotic properties of multiple testing procedures that aim at controlling the False Discovery Rate (FDR), that is, the expected False Discovery Proportion (FDP) among rejected hypotheses. We develop a versatile formalism to calculate the asymptotic distribution of the FDP and the associated regularity conditions, for a wide range of multiple testing procedures, and compare their asymptotic power. We then study, in terms of FDR control, the connections between intrinsic bounds for three multiple testing problems: detection, estimation and selection. In particular, we connect convergence rates in the estimation problem to the regularity of the p-value distribution near 1. In the second part, we develop statistical methods for studying DNA microarrays in cancer research. We propose a microarray normalisation method that removes spatial biases while preserving the true biological signal; it combines robust regression with a mixture model under spatial constraints. We then develop a method to infer gene regulations from gene expression data, based on learning and multiple testing theories. Finally, we build a genomic score to predict, for a patient treated for a breast tumour, whether or not a second cancer is a true recurrence of the first.
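The FDR-controlling procedures studied in this abstract are variants of the classical Benjamini-Hochberg step-up procedure, which can be sketched in a few lines (this is the standard textbook procedure, not the thesis's asymptotic analysis; the p-values below are made up for illustration):

```python
import numpy as np

def benjamini_hochberg(pvals, alpha=0.05):
    """Return a boolean mask of rejected hypotheses (BH step-up)."""
    p = np.asarray(pvals, dtype=float)
    m = len(p)
    order = np.argsort(p)
    # compare sorted p-values to the step-up thresholds alpha * k / m
    thresholds = alpha * np.arange(1, m + 1) / m
    below = p[order] <= thresholds
    rejected = np.zeros(m, dtype=bool)
    if below.any():
        k = np.max(np.nonzero(below)[0])   # largest index still passing
        rejected[order[:k + 1]] = True     # reject all smaller p-values too
    return rejected

pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.074, 0.205, 0.5, 0.9]
mask = benjamini_hochberg(pvals, alpha=0.05)
```

With these ten p-values, only the two smallest fall below their step-up thresholds (0.005 and 0.01), so exactly two hypotheses are rejected at FDR level 0.05.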
Priam, Rodolphe. "Méthodes de carte auto-organisatrice par mélange de lois contraintes. Application à l'exploration dans les tableaux de contingence textuels." Rennes 1, 2003. http://www.theses.fr/2003REN10166.
Full text available.
Shahzad, Muhammad Kashif. "Exploitation dynamique des données de production pour améliorer les méthodes DFM dans l'industrie Microélectronique." Phd thesis, Université de Grenoble, 2012. http://tel.archives-ouvertes.fr/tel-00771672.
Full text available.
Itier, Vincent. "Nouvelles méthodes de synchronisation de nuages de points 3D pour l'insertion de données cachées." Thesis, Montpellier, 2015. http://www.theses.fr/2015MONTS017/document.
Full text available. This thesis addresses issues relating to the protection of 3D object meshes. For instance, such objects can be created using the CAD tools developed by the company STRATEGIES. In an industrial context, mesh creators need tools to verify mesh integrity or to check permission for 3D printing, for example. In this context we study data hiding on 3D meshes. This approach allows us to insert information in a mesh in a secure and imperceptible way: an identifier, meta-information, or third-party content, for instance in order to transmit a texture secretly. Data hiding can address these problems by adjusting the trade-off between capacity, imperceptibility and robustness. Generally, data hiding methods consist of two stages, synchronisation and embedding. The synchronisation stage consists in finding and ordering the components available for insertion. One of the main challenges is to propose an effective synchronisation method that defines an order on the mesh components. In our work, we use the mesh vertices, specifically their geometric representation in space, as basic components for synchronisation and embedding. We present three new synchronisation methods based on the construction of a Hamiltonian path in a vertex cloud. Two of these methods jointly perform the synchronisation and embedding stages, which is made possible by two new high-capacity embedding methods (from 3 to 24 bits per vertex) relying on coordinate quantisation. We also highlight the constraints of this kind of synchronisation and analyse the proposed approaches through several experimental studies, assessing our work on various criteria including the capacity and imperceptibility of the embedding method, as well as the security aspects of the proposed methods.
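A minimal version of the synchronisation idea above — ordering the vertices of a point cloud along a Hamiltonian path — can be sketched with a greedy nearest-neighbour traversal. The thesis's constructions are more elaborate; this only illustrates the basic idea, on made-up points.

```python
import numpy as np

def greedy_hamiltonian_path(points, start=0):
    """Visit every vertex exactly once, always moving to the nearest
    unvisited neighbour: a simple synchronisation order for embedding."""
    n = len(points)
    visited = np.zeros(n, dtype=bool)
    path = [start]
    visited[start] = True
    for _ in range(n - 1):
        last = points[path[-1]]
        d = np.linalg.norm(points - last, axis=1)
        d[visited] = np.inf        # never revisit a vertex
        nxt = int(np.argmin(d))
        path.append(nxt)
        visited[nxt] = True
    return path

rng = np.random.default_rng(0)
cloud = rng.uniform(size=(100, 3))   # toy 3D vertex cloud (assumption)
order = greedy_hamiltonian_path(cloud)
```

The resulting order is what embedding then consumes, e.g. by quantising each visited vertex's coordinates to carry a few payload bits; for blind extraction the decoder must be able to rebuild the same path, which is the constraint the thesis analyses.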
Van Den Berghe, François. "Assimilation de mesures satellitaires dans des modèles numériques par méthodes de contrôle optimal." Phd thesis, École Nationale Supérieure des Mines de Paris, 1992. http://pastel.archives-ouvertes.fr/pastel-00954478.
Full text available.
Ferraro, Pascal. "Méthodes algorithmiques de comparaison d'arborescences : applications à la comparaison de l'architecture des plantes." Toulouse, INPT, 2000. http://www.theses.fr/2000INPT039H.
Full text available.
Hihi, Jalil. "Évaluation de méthodes d'identification de systèmes non-linéaires en régime permanent : méthode de traitement des données par groupes, identification floue." Nancy 1, 1993. http://www.theses.fr/1993NAN10013.
Повний текст джерелаLe, Brun Alexia. "Etude d'un ensemble de paramètres liés à la sécheresse de la peau : traitement des données par des méthodes d'analyses multidimensionnelles." Bordeaux 1, 1986. http://www.theses.fr/1986BOR10880.
Full text available.