Dissertations / Theses on the topic 'Réduction et traitement de données'
Flitti, Farid. "Techniques de réduction de données et analyse d'images multispectrales astronomiques par arbres de Markov." Université Louis Pasteur (Strasbourg) (1971-2008), 2005. https://publication-theses.unistra.fr/public/theses_doctorat/2005/FLITTI_Farid_2005.pdf.
The development of astronomical multispectral sensors provides data of great richness. Nevertheless, the classification of multidimensional images is often limited by the Hughes phenomenon: as dimensionality increases, the number of model parameters grows and the precision of their estimates inevitably falls, so the quality of the segmentation decreases dramatically. It is thus imperative to discard redundant information in order to carry out robust segmentation or classification. In this thesis, we have proposed two methods for multispectral image dimensionality reduction: 1) band regrouping followed by local projections; 2) radio-cube reduction by a mixture of Gaussians model. We have also proposed a joint reduction/segmentation scheme based on the regularization of the mixture of probabilistic principal component analyzers (MPPCA). For the segmentation task, we have used a Bayesian approach based on hierarchical Markov models, namely the hidden Markov tree and the pairwise Markov tree. These models allow fast and exact computation of the a posteriori probabilities. For the data-driven term, we have used three formulations: 1) the classical multidimensional Gaussian distribution; 2) the multidimensional generalized Gaussian distribution formulated using copula theory; 3) the likelihood of the probabilistic PCA model (within the framework of the regularized MPPCA). The major contribution of this work consists in introducing various hierarchical Markov models for multidimensional and multiresolution data segmentation. Their exploitation for data issued from wavelet analysis, adapted to the astronomical context, enabled us to develop new denoising and fusion techniques for multispectral astronomical images. All our algorithms are unsupervised and were validated on synthetic and real images.
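The projection-based reduction this abstract describes can be illustrated in miniature. The sketch below is not the thesis's MPPCA; it is a plain SVD-based PCA applied to a synthetic multispectral cube, with all names and data illustrative.

```python
import numpy as np

def pca_reduce(cube, n_components):
    """Reduce a (rows, cols, bands) multispectral cube to n_components
    bands via PCA computed with an SVD (illustrative sketch only)."""
    rows, cols, bands = cube.shape
    X = cube.reshape(-1, bands).astype(float)
    X -= X.mean(axis=0)                      # center each band
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    scores = X @ Vt[:n_components].T         # project pixels onto PCs
    return scores.reshape(rows, cols, n_components)

rng = np.random.default_rng(0)
base = rng.normal(size=(32, 32, 3))
# 12 noisy bands generated from 3 latent sources: intrinsic dimension ~3
cube = np.concatenate([base + 0.05 * rng.normal(size=base.shape)
                       for _ in range(4)], axis=2)
reduced = pca_reduce(cube, 3)
print(reduced.shape)  # (32, 32, 3)
```

Keeping only the leading components is exactly the kind of redundancy removal the abstract motivates via the Hughes phenomenon.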
Jouvard, Jean-Marie. "Relations isotopiques et réduction des données spectroscopiques : application au traitement simultané des premières polyades des molécules 12CH4, 13CH4, 12CD4 et 13CD4." Dijon, 1991. http://www.theses.fr/1991DIJOS007.
Girard, Robin. "Réduction de dimension en statistique et application en imagerie hyper-spectrale." PhD thesis, Université Joseph Fourier (Grenoble), 2008. http://tel.archives-ouvertes.fr/tel-00379179.
Oger, Myriam. "Indexation automatique d'images numériques : application aux images histopathologiques du cancer du sein et hématologiques de leucémies lymphoïdes chroniques." Caen, 2008. http://www.theses.fr/2008CAEN2066.
In a context where health budgets are increasingly constrained and specialists are fewer and fewer, while, thanks to screening campaigns, the number of cases to analyze is constantly growing, the task of pathologists is more and more difficult. In addition, early lesions discovered during screening are often poorly known and/or of very small size, which makes histopathological diagnosis difficult. A similar problem is encountered in hematology with the increasingly widespread practice of systematic blood tests and the difficulty of identifying suspicious cells and rare events in blood smears. It is therefore very important to assess how digital microscopy and automatic processing techniques will be able to help specialists in their daily practice in the future. The present work is based on virtual slides of histological and cytological preparations, acquired at low or high resolution. It aims at developing and testing several tools for computer-assisted diagnosis based on automatic indexing of images. However, the use of virtual slides involves manipulating very large amounts of data, and it is difficult to process, analyze or visualize these images in a classical way. The first objective of the study was to assess the relevance of a global analysis of the images, then the contribution of their local analysis, with a dimensional reduction of the data by various methods including spectral analysis. These methods have been applied to virtual slides of breast tumors and chronic lymphoid leukemia, two tumor locations whose incidence is a public health problem.
Dyrek, Achrène. "L'atmosphère des exoplanètes avec le James Webb Space Telescope." Electronic Thesis or Diss., Université Paris Cité, 2023. http://www.theses.fr/2023UNIP7096.
My thesis is devoted to the characterisation of exoplanet atmospheres with the newly operating James Webb Space Telescope (JWST). Our understanding of exoplanet atmospheres is being revolutionised by the observational capabilities of such an observatory. The scientific outcomes will reach our scientific community and the general public, putting into perspective our knowledge of our own Solar System, the only system known to host life. The first part of this manuscript is an introduction that includes a state-of-the-art review of exoplanet atmosphere characterisation in terms of atomic and molecular composition, structure and dynamics. In this introduction, we focus on transiting exoplanets (where the planet passes in front of or behind its host star along the telescope's line of sight). We provide a description of this observational method and of the key results obtained over the past two decades. The second part of this manuscript focuses on molecular composition predictions with the JWST Mid-InfraRed Instrument (MIRI) and its Low-Resolution Spectrometer (LRS), which is meant to carry out atmospheric spectroscopy in an uncharted wavelength range. Here, we present realistic simulations of transiting exoplanets observed with the MIRI LRS instrument, which I developed during my thesis and which include various instrumental systematics likely to alter the atmospheric features we aim to detect in our data [Dyrek+, sub., 2023; Morello, Dyrek+, 2022]. Our main objective is to design a comprehensive simulation tool that enables the community to build robust data reduction methods and to predict molecular detections. The third part of this manuscript is dedicated to the characterisation of the in-flight post-commissioning performance of the MIRI LRS. This work is based on the first exoplanetary transit observed with MIRI, that of the super-Earth L168-9b, chosen as a calibration target.
My work focuses on identifying in-flight instrumental systematics that undermine the stability of observations and, more generally, the study of transiting exoplanets [Dyrek+, sub., 2023]. The final part of this manuscript is devoted to the scientific analysis of photometric and spectroscopic observations of both gas-giant and temperate rocky exoplanet atmospheres. Here, I present my contribution to the data reduction and analysis carried out in the collaborative work we conducted as part of the Guaranteed Time Observation (GTO) and Early Release Science (ERS) consortia. In particular, our work on the super-Neptune WASP-107b led to the first mid-infrared detection of sulphur dioxide (SO2) and silicate clouds [Dyrek+, sub., 2023b]. In addition, we conducted the first detection of the thermal emission of the rocky temperate exoplanet TRAPPIST-1b. In this work, we constrained its brightness temperature, revealing key insights into the presence or absence of an atmosphere [Greene+, 2023]. The final chapter of my thesis is dedicated to the prospects offered by JWST and the future Ariel mission, as these two telescopes will provide game-changing observations over the next decades.
Laforest, Valérie. "Technologies propres : méthodes de minimisation des rejets et de choix des procédés de valorisation des effluents : application aux ateliers de traitements de surface." Lyon, INSA, 1999. http://theses.insa-lyon.fr/publication/1999ISAL0118/these.pdf.
Currently, the essential part of the money invested by industry goes to water treatment. In France, most of the 20 billion francs per year devoted to water treatment is used by industrial activity. The global management of effluents favours the integration of clean technologies (optimisation, change and modification of the production process) in order to reduce the pollution problem at its source. Our study aims at introducing clean technologies in metal-finishing workshops (consumers and generators of water and chemicals) through the development of two data management methods, which lead to two decision support systems. The aim of the first one is to minimise both water consumption and wastewater disposal by optimising the production process (optimum yield and efficiency of the rinsing baths). The second one concerns the choice of valorisation techniques considering the valorisation objectives, the effluent characteristics and the parameters limiting the use of the techniques. Our approach fits into a global management method for metal-finishing industry wastewater. Its aim is to limit the quantity of wastewater generated, to valorise effluents and in this way to develop clean technologies.
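The rinsing-bath optimisation mentioned above can be illustrated with the classical rinsing equation used in surface finishing (a textbook relation, not taken from the thesis): for a countercurrent cascade of n rinse tanks, the rinse flow needed to reach a dilution ratio r = C0/Cn from a drag-out flow d is approximately Q = d * r^(1/n). Function names and figures below are illustrative.

```python
# Illustrative sketch: water demand of a countercurrent rinse cascade.
# Adding rinse stages cuts the required flow dramatically, which is the
# kind of at-source water saving the decision support system targets.
def rinse_flow(dragout_l_h, dilution_ratio, n_tanks):
    """Approximate rinse flow (L/h) for a countercurrent cascade."""
    return dragout_l_h * dilution_ratio ** (1.0 / n_tanks)

d, r = 1.0, 10_000        # 1 L/h drag-out, 10^4 dilution required
for n in (1, 2, 3):
    print(n, round(rinse_flow(d, r, n), 1))
# 1 tank: 10000.0 L/h, 2 tanks: 100.0 L/h, 3 tanks: ~21.5 L/h
```

The cubic-root drop from 10 000 L/h to about 21.5 L/h is why cascade rinsing is a standard "clean technology" lever.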
Teixeira, Ramachrisna. "Traitement global des observations méridiennes de l'Observatoire de Bordeaux." Bordeaux 1, 1990. http://www.theses.fr/1990BOR10618.
Ravazzola, Patrice. "Approximation et réduction de modèles en traitement d'antenne." Grenoble INPG, 1990. http://www.theses.fr/1990INPG0104.
Wright, Sophie. "Données obstétricales et néonatales précoces des grossesses gémellaires après réduction embryonnaire." Montpellier 1, 1994. http://www.theses.fr/1994MON11106.
Atigui, Faten. "Approche dirigée par les modèles pour l'implantation et la réduction d'entrepôts de données." Thesis, Toulouse 1, 2013. http://www.theses.fr/2013TOU10044/document.
Our work concerns decision support systems based on multidimensional Data Warehouses (DW). A Data Warehouse is a huge amount of data, often historical, used for complex and sophisticated analysis. It supports the business process within an organization. The relevant data for the decision-making process are collected from data sources by means of software processes commonly known as ETL (Extraction-Transformation-Loading) processes. The study of existing systems and methods shows two major limits. Actually, when building a DW, the designer deals with two major issues: the first concerns the DW's design, whereas the second addresses the design of the ETL processes. Current frameworks provide partial solutions that focus either on the multidimensional structure or on the ETL processes, yet both could benefit from each other. However, few studies have considered these issues in a unified framework and provided solutions to automate all of these tasks. From its creation, the DW contains a large amount of data, mainly due to historical data. Looking into the decision makers' analyses over time, we can see that they are usually less interested in old data. To overcome these shortcomings, this thesis aims to formalize the development of a time-varying (with a temporal dimension) DW from its design to its physical implementation. We use Model Driven Engineering (MDE), which automates the process and thus significantly reduces development costs and improves software quality. The contributions of this thesis are summarized as follows: 1. To formalize and to automate the development of a time-varying DW within a model-driven approach that provides: - A set of unified (conceptual, logical and physical) metamodels that describe data and transformation operations. - An OCL (Object Constraint Language) extension that aims to conceptually formalize the transformation operations.
- A set of transformation rules that map the conceptual model to logical and physical models. - A set of transformation rules that generate the code. 2. To formalize and to automate historical data reduction within a model-driven approach that provides: - A set of (conceptual, logical and physical) metamodels that describe the reduced data. - A set of reduction operations. - A set of transformation rules that implement these operations at the physical level. In order to validate our proposals, we have developed a prototype composed of three parts. The first part performs the transformation of models into lower-level models. The second part transforms the physical model into code. The last part performs the DW reduction.
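The historical-data reduction operations listed above can be pictured with a tiny sketch (purely illustrative, not the thesis's metamodels or rule language): old daily facts are aggregated to a coarser monthly granularity, shrinking the historical part of the fact table while recent detail is kept.

```python
from collections import defaultdict

# Hypothetical "reduce" operation on warehouse facts: daily sales older
# than a cutoff year are rolled up to (year, month) granularity.
facts = [  # (year, month, day, amount)
    (2001, 1, 5, 10.0), (2001, 1, 20, 5.0), (2001, 2, 3, 7.0),
    (2024, 6, 1, 42.0),
]
cutoff = 2020
reduced = defaultdict(float)
recent = []
for year, month, day, amount in facts:
    if year < cutoff:
        reduced[(year, month)] += amount     # coarser granularity
    else:
        recent.append((year, month, day, amount))
print(sorted(reduced.items()))  # [((2001, 1), 15.0), ((2001, 2), 7.0)]
print(recent)                   # [(2024, 6, 1, 42.0)]
```

In the thesis's approach, an operation of this kind would be specified once at the conceptual level and generated down to physical code by transformation rules.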
Fliti, Tamim. "Le problème SAT : traitement dynamique et données minimales." Aix-Marseille 2, 1997. http://www.theses.fr/1997AIX22015.
Tissot, Gilles. "Réduction de modèle et contrôle d'écoulements." Thesis, Poitiers, 2014. http://www.theses.fr/2014POIT2284/document.
Control of turbulent flows is still today a challenge in aerodynamics. Indeed, the presence of a high number of active degrees of freedom and of a complex dynamics leads to the need for strong modelling efforts for an efficient control design. During this PhD, various directions were followed in order to develop reduced-order models of flows in realistic situations and to use them for control. First, dynamic mode decomposition (DMD), and some of its variants, were exploited as reduced bases for extracting at best the dynamical behaviour of the flow. Thereafter, we were interested in 4D-variational data assimilation, which combines inhomogeneous information coming from a dynamical model, observations and an a priori knowledge of the system. POD and DMD reduced-order models of a turbulent cylinder wake flow were successfully derived using data assimilation of PIV measurements. Finally, we considered flow control in a fluid-structure interaction context. After showing that the immersed body motion can be represented as an additional constraint in the reduced-order model, we stabilized a cylinder wake flow by vertical oscillations.
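DMD, mentioned above as a reduced basis, has a compact standard formulation: given snapshot matrices X and Y with Y ≈ AX, project the best-fit operator A onto the leading POD modes of X and take its eigenvalues. The sketch below is the textbook "exact DMD" recipe on synthetic linear data, not the thesis's variants; names are illustrative.

```python
import numpy as np

def dmd_eigs(X, Y, r):
    """Exact DMD sketch: eigenvalues of the best-fit linear operator
    mapping snapshots X to Y, restricted to the r leading POD modes."""
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    Ur, Sr, Vr = U[:, :r], S[:r], Vt[:r].T
    Atilde = Ur.T @ Y @ Vr / Sr              # projected operator
    return np.linalg.eigvals(Atilde)

# Synthetic data generated by a known linear map: a rotation, whose
# eigenvalues exp(+/- i*0.3) exact DMD recovers on noise-free data.
theta = 0.3
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
snaps = [np.array([1.0, 0.0])]
for _ in range(20):
    snaps.append(A @ snaps[-1])
X = np.stack(snaps[:-1], axis=1)
Y = np.stack(snaps[1:], axis=1)
print(np.sort_complex(dmd_eigs(X, Y, 2)))
```

On real PIV data the same decomposition yields growth rates and frequencies of coherent structures rather than an exact recovery.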
Buchholz, Bert. "Abstraction et traitement de masses de données 3D animées." PhD thesis, Télécom ParisTech, 2012. http://pastel.archives-ouvertes.fr/pastel-00958339.
Flitti, Farid. "Techniques de réduction de données et analyse d'images multispectrales astronomiques par arbres de Markov." PhD thesis, Université Louis Pasteur - Strasbourg I, 2005. http://tel.archives-ouvertes.fr/tel-00156963.
Full textChen, Fati. "Réduction de l'encombrement visuel : Application à la visualisation et à l'exploration de données prosopographiques." Thesis, Université de Montpellier (2022-….), 2022. http://www.theses.fr/2022UMONS023.
Prosopography is used by historians to designate biographical records used to study the common characteristics of a group of historical actors through a collective analysis of their lives. Information visualization offers interesting perspectives for analyzing prosopographic data, and it is in this context that the work presented in this thesis is situated. First, we present the ProsoVis platform for analyzing and navigating through prosopographic data. We describe the different needs expressed and detail the design choices as well as the different views. We illustrate its use with the Siprojuris database, which contains data on the careers of law teachers from 1800 to 1950. Visualizing so much data induces visual-clutter problems. In this context, we address the problem of overlapping nodes in a graph. Even if approaches exist, it is difficult to compare them because their respective evaluations are not based on the same quality criteria. We therefore propose a study of the state-of-the-art algorithms, comparing their results on the same criteria. Finally, we address a similar visual-clutter problem within a map and propose an agglomerative spatial clustering approach, F-SAC, which is much faster than the state-of-the-art proposals while guaranteeing the same quality of results.
Noyel, Guillaume. "Filtrage, réduction de dimension, classification et segmentation morphologique hyperspectrale." PhD thesis, École Nationale Supérieure des Mines de Paris, 2008. http://pastel.archives-ouvertes.fr/pastel-00004473.
Mouginot, Jérémie. "Traitement et Analyse des Données du Radar MARSIS/Mars Express." PhD thesis, Université Joseph Fourier (Grenoble), 2008. http://tel.archives-ouvertes.fr/tel-00364323.
Full textLe traitement de ces données a principalement consisté à compenser la distorsion ionosphérique. Cette correction a permis de réaliser une mesure indirecte du contenu électronique de l'ionosphère. Grâce à ces mesures, nous avons pu étudier en détail l'ionosphère martienne. Nous avons ainsi montré que le champ magnétique rémanent modifiait la distribution des électrons de l'ionosphère en précipitant les particules du vent solaire le long des lignes de champ radiales.
Les radargrammes corrigés nous ont permis d'étudier en détail les calottes martiennes. Nous faisons le bilan du volume des calottes polaires de Mars en utilisant des outils numériques développés pour le pointage des interfaces. Nous montrons ainsi que le volume des calottes correspondrait à une couche d'eau d'environ 20 m d'épaisseur répartie sur toute la planète.
Nous étudions enfin la réflectivité de la surface martienne. Pour cela, nous avons extrait l'amplitude de l'écho de surface de chaque pulse MARSIS, puis, après avoir calibré ces mesures, nous avons créé une carte globale de l'albédo radar. Nous nous sommes attachés à décrire cette carte de réflectivité, d'abord de manière globale, puis plus localement autour de Medusae Fossae et de la calotte résiduelle sud. Nous montrons que la réflectivité décroît avec la latitude, cette constatation est surement liée à la présence d'un pergélisol lorsqu'on remonte vers les hautes latitudes. Près de l'équateur, nous observons que les formations de Medusae Fossae possèdent une constante diélectrique de 2,4+/-0.5 ce qui est caractéristique d'un terrain poreux et/ou riche en glace. Dans cette même région, nous montrons que, dans les plaines d'Elysium et Amazonis, la constante diélectrique est égale à 7\pm1 et nous observons une interface dans la plaine d'Amazonis à environ 140+/-20 m, notre conclusion est que ces résultats sont caractéristiques d'écoulements de lave. L'étude de la calotte résiduelle sud de Mars, à l'aide d'un modèle de réflectivité multi-couches, nous permet d'estimer l'épaisseur de CO2 qui couvre cette région à 11+/-1.5 m.
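The link between radar albedo and dielectric constant used in this kind of analysis can be illustrated with the normal-incidence Fresnel power reflectivity of a lossless half-space, R = ((1 - sqrt(eps)) / (1 + sqrt(eps)))^2. This is a standard single-interface formula, not the thesis's calibrated multi-layer model.

```python
import math

def fresnel_reflectivity(eps):
    """Normal-incidence power reflectivity of a half-space with real
    dielectric constant eps (lossless, single interface, sketch only)."""
    n = math.sqrt(eps)
    return ((1 - n) / (1 + n)) ** 2

# Dielectric constants reported in the abstract above
for eps in (2.4, 7.0):   # Medusae Fossae / Elysium-Amazonis plains
    print(eps, round(fresnel_reflectivity(eps), 3))
```

A porous, ice-rich terrain (eps ~ 2.4) reflects roughly a quarter as much power as dense lava flows (eps ~ 7), which is why such materials separate cleanly on a calibrated albedo map.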
Gu, Co Weila Vila. "Méthodes statistiques et informatiques pour le traitement des données manquantes." PhD thesis, Conservatoire national des arts et metiers - CNAM, 1997. http://tel.archives-ouvertes.fr/tel-00808585.
Royer, Jean-Jacques. "Analyse multivariable et filtrage des données régionalisées." Vandoeuvre-les-Nancy, INPL, 1988. http://www.theses.fr/1988NAN10312.
Full textFraisse, Bernard. "Automatisation, traitement du signal et recueil de données en diffraction x et analyse thermique : Exploitation, analyse et représentation des données." Montpellier 2, 1995. http://www.theses.fr/1995MON20152.
Pasquier, Nicolas. "Data Mining : algorithmes d'extraction et de réduction des règles d'association dans les bases de données." PhD thesis, Université Blaise Pascal - Clermont-Ferrand II, 2000. http://tel.archives-ouvertes.fr/tel-00467764.
Full textSoler, Maxime. "Réduction et comparaison de structures d'intérêt dans des jeux de données massifs par analyse topologique." Electronic Thesis or Diss., Sorbonne université, 2019. http://www.theses.fr/2019SORUS364.
In this thesis, we propose different methods, based on topological data analysis, to address modern problems concerning the increasing difficulty of analyzing scientific data. In the case of scalar data defined on geometrical domains, extracting meaningful knowledge from static data, then time-varying data, then ensembles of time-varying data proves increasingly challenging. Our approaches for the reduction and analysis of such data are based on the idea of defining structures of interest in scalar fields as topological features. In a first effort to address data volume growth, we propose a new lossy compression scheme which offers strong topological guarantees, allowing topological features to be preserved throughout compression. The approach is shown to yield high compression factors in practice, and extensions are proposed to offer additional control over the geometrical error. We then target time-varying data by designing a new method for tracking topological features over time, based on topological metrics. We extend the metrics in order to overcome robustness and performance limitations, and we propose a new efficient way to compute them, gaining orders-of-magnitude speedups over state-of-the-art approaches. Finally, we apply and adapt our methods to ensemble data related to reservoir simulation, for modeling viscous fingering in porous media. We show how to capture viscous fingers with topological features, adapt topological metrics to capture discrepancies between simulation runs and a ground truth, evaluate the proposed metrics with feedback from experts, and implement an in-situ ranking framework for rating the fidelity of simulation runs.
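The "topological features" this abstract preserves can be made concrete with a tiny example: 0-dimensional sublevel-set persistence of a scalar field sampled on a line, computed with union-find and the elder rule. This is a minimal textbook sketch, not the thesis's compression or tracking algorithms; names are illustrative.

```python
def persistence_pairs(f):
    """0-dimensional sublevel-set persistence pairs of a scalar field
    sampled on a line (union-find, elder rule). The global minimum
    never dies and is omitted, as are zero-persistence pairs."""
    n = len(f)
    order = sorted(range(n), key=lambda i: f[i])
    parent = [None] * n            # None: vertex not yet introduced
    birth = {}                     # component root -> birth value

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    pairs = []
    for i in order:                # sweep values from low to high
        parent[i] = i
        birth[i] = f[i]
        for j in (i - 1, i + 1):
            if 0 <= j < n and parent[j] is not None:
                ri, rj = find(i), find(j)
                if ri == rj:
                    continue
                # elder rule: the younger component dies at f[i]
                young, old = (ri, rj) if birth[ri] > birth[rj] else (rj, ri)
                if birth[young] < f[i]:
                    pairs.append((birth[young], f[i]))
                parent[young] = old
    return sorted(pairs)

# Two local minima (values 0 and 1): the younger one, born at 1,
# dies when the saddle at value 3 merges it into the older component.
print(persistence_pairs([0, 3, 1, 2]))  # [(1, 3)]
```

A topology-preserving compressor may quantize the field aggressively as long as pairs like (1, 3) survive; low-persistence pairs are exactly the "noise" it is allowed to discard.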
Alkadee, Dareen. "Techniques de réduction et de traitement des émissions polluantes dans une machine thermique." PhD thesis, Conservatoire national des arts et metiers - CNAM, 2011. http://tel.archives-ouvertes.fr/tel-01005123.
Woiselle, Arnaud. "Parcimonie et ses applications à des données de flot optique." Paris 7, 2010. http://www.theses.fr/2010PA077201.
This manuscript addresses inverse problems using 3D sparsifying transforms in the context of infrared video. We first survey the infrared imaging system and its associated problems, such as noise and missing data, and give a short state of the art of industrial image processing techniques. Then we introduce the concept of sparsity and present common algorithms and representations used in this domain. The core of this research is the development of new 3D representations, which are well adapted to representing surfaces or filaments in a 3D volume. Two innovative transforms are proposed, and an existing one, the fast curvelet transform, is modified to better suit the applications by reducing its redundancy. These tools are used in many inverse problems such as denoising, inpainting and video deinterlacing, and prove to be very useful and competitive. Finally, we propose a few concrete applications of sparse transforms to infrared video processing, such as small-target detection or motion-blur deconvolution.
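Sparsity-based denoising, the first inverse problem cited above, reduces in its simplest form to thresholding transform coefficients. The sketch below uses a one-level Haar transform on a 1D piecewise-constant signal rather than the 3D transforms of the thesis; everything here is an illustrative assumption.

```python
import numpy as np

def haar_1d(x):
    """One level of the orthonormal Haar transform (even length)."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2)   # approximation band
    d = (x[0::2] - x[1::2]) / np.sqrt(2)   # detail band
    return a, d

def ihaar_1d(a, d):
    x = np.empty(2 * len(a))
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

def denoise(x, thresh):
    """Hard-threshold the detail coefficients, then invert."""
    a, d = haar_1d(x)
    d[np.abs(d) < thresh] = 0.0
    return ihaar_1d(a, d)

rng = np.random.default_rng(1)
clean = np.repeat([0.0, 1.0, 0.0], 16)    # piecewise-constant signal
noisy = clean + 0.1 * rng.normal(size=clean.size)
out = denoise(noisy, 0.3)
print(np.abs(out - clean).mean() < np.abs(noisy - clean).mean())  # True
```

The signal is sparse in the Haar domain (few large details), while Gaussian noise spreads evenly over all coefficients, so thresholding suppresses noise more than signal; 3D curvelet-like transforms apply the same principle to surfaces and filaments.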
Deleris, Stéphane. "Réduction de la production de boue lors du traitement des eaux résiduaires urbaines : analyse du traitement combiné : ozonation et traitement biologique." Toulouse, INSA, 2001. http://www.theses.fr/2001ISAT0038.
Elimination of the excess sludge produced by biological wastewater treatment is an important issue for the next decade. Due to the constraints inherent in current practices for sludge disposal, research efforts are focused on processes that reduce sludge production while treating the wastewater. The association of a chemical oxidation step (with ozone) with an activated sludge process can lead to a significant reduction of sludge production. Experimental and simulation approaches were employed to analyse the main mechanisms of sludge production. Results showed the major influence of the accumulation of particulate inert organic compounds coming from the influent and from the decay of biomass. If ozonation improves the biodegradability of those compounds, simulation confirms that total reduction of excess sludge production could be reached. Batch experiments of sludge ozonation show the solubilising effect of ozone and also demonstrate the improved biodegradability of the solubilised matter and a significant inactivation of biomass. The results obtained allow the construction of a simulation model of the effect of ozone on sludge. Using this model combined with the existing biological model confirms the potential of the combined process. Continuous experiments at pilot scale confirm that the association of an ozonation treatment with an activated sludge process allows total reduction of excess organic sludge production and also leads to a reduction of mineral sludge production. The optimization of the way ozone is applied (dosage, treatment frequency) appears to be a key point in guaranteeing the efficiency of the ozonation treatment. The data obtained show that ozone treatment leads to the production of non-settleable refractory COD in the effluent. Significant improvements in sludge settling characteristics and conservation of nitrifying activity are also observed. The results of this work confirm the applicability of the combined process.
Revenu, Benoît. "Anisotropies et polarisation du rayonnement fossile: méthode de détection et traitement de données." PhD thesis, Université Paris-Diderot - Paris VII, 2000. http://tel.archives-ouvertes.fr/tel-00001686.
The first part of this thesis is devoted to the standard model of cosmology. In particular, I present the polarization of the cosmic microwave background.
The polarized signal, whose intensity does not exceed 10% of the temperature fluctuations in most scenarios, is expected at the level of a few microkelvins. It is often measured with cooled bolometers coupled to polarizers. I show that there exist optimal arrangements of the detectors in the focal plane of the instrument that minimize the volume of the error box and yield decorrelated errors on the Stokes parameters characterizing the polarization.
The major source of noise in these measurements comes from fluctuations of the thermal bath in which the bolometers are immersed, from the readout electronics, from gain instabilities and from the optics. These processes generate low-frequency drifts that translate into relative noise levels between detectors that are too large compared with the signal being sought. I apply to polarized data a simple method for subtracting these low-frequency drifts; it uses the redundancies inherent in the instrument's sky scanning strategy. The results show that these drifts can be subtracted down to the white-noise level.
Finally, I describe the COSMOSOMAS experiment and present a preliminary analysis. It will provide maps of the polarized emission of our Galaxy at frequencies of the order of 10 GHz.
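The Stokes-parameter recovery underlying the detector-layout discussion above can be sketched with the standard polarimetry relation: a bolometer behind a polarizer at angle theta measures m(theta) = (I + Q cos 2theta + U sin 2theta) / 2, so three well-chosen angles determine (I, Q, U). This is a generic textbook setup, not the thesis's instrument model; all values are synthetic.

```python
import numpy as np

def stokes_from_measurements(thetas, m):
    """Least-squares recovery of (I, Q, U) from polarizer measurements
    m(theta) = (I + Q*cos(2*theta) + U*sin(2*theta)) / 2."""
    A = 0.5 * np.stack([np.ones_like(thetas),
                        np.cos(2 * thetas),
                        np.sin(2 * thetas)], axis=1)
    sol, *_ = np.linalg.lstsq(A, m, rcond=None)
    return sol

I, Q, U = 4.0, 1.0, 0.5
thetas = np.deg2rad(np.array([0.0, 60.0, 120.0]))  # evenly spaced angles
m = 0.5 * (I + Q * np.cos(2 * thetas) + U * np.sin(2 * thetas))
print(np.round(stokes_from_measurements(thetas, m), 6))  # ~ [4, 1, 0.5]
```

Evenly spaced polarizer angles such as 0, 60 and 120 degrees are a classic choice precisely because they tend to decorrelate the errors on I, Q and U, which is the property the optimal focal-plane layouts aim for.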
Marcel, Patrick. "Manipulations de données multidimensionnelles et langages de règles." Lyon, INSA, 1998. http://www.theses.fr/1998ISAL0093.
This work is a contribution to the study of manipulations in data warehouses. In the first part, we present a state of the art of multidimensional data manipulation languages in systems dedicated to On-Line Analytical Processing (OLAP systems). We point out interesting combinations that have not been studied. These conclusions are used in the second part to propose a simple rule-based language for specifying typical treatments arising in OLAP systems. In the third part, we illustrate the use of the language to describe OLAP treatments in spreadsheets, and to generate spreadsheet programs semi-automatically.
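A typical OLAP treatment of the kind such a language specifies is the roll-up: aggregating a fact table along a dimension hierarchy. The sketch below is an illustrative hand-written roll-up in plain Python, not the thesis's rule language; the data and hierarchy are invented.

```python
from collections import defaultdict

# Illustrative roll-up: aggregate (city, product, sales) facts up the
# dimension hierarchy city -> country.
facts = [("Lyon", "pen", 10), ("Paris", "pen", 5),
         ("Lyon", "ink", 2), ("Berlin", "pen", 7)]
country_of = {"Lyon": "France", "Paris": "France", "Berlin": "Germany"}

rollup = defaultdict(int)
for city, product, sales in facts:
    rollup[(country_of[city], product)] += sales
print(sorted(rollup.items()))
# [(('France', 'ink'), 2), (('France', 'pen'), 15), (('Germany', 'pen'), 7)]
```

A rule-based language expresses the same computation declaratively (roughly "sales at country level is the sum of sales of its cities"), leaving the iteration to the system.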
Sablayrolles, Jean-François. "Les néologismes du français contemporain : traitement théorique et analyses de données." Paris 8, 1996. http://www.theses.fr/1996PA081066.
The aim of the present work is to examine the place of neologisms in interlocution and the treatment of neology in linguistic theories. This examination leads us to compare the definitions given by dictionaries and encyclopedias of the end of the last century and of today. A comparison of about a hundred typologies produced in these same periods then leads us to consider the concepts used in several contemporary linguistic models, whether structuralist or generativist. The diversity of the proposed solutions and the difficulties they encounter prompt an inquiry into the nature of the relevant linguistic unit. The notion of word, debated and insufficient, is abandoned in favour of the lexie, a functional unit memorized in competence; it is then the notion of newness that is examined, from two points of view: new for whom, and new compared with what? The constitution and examination of six corpora of different origins (miscellaneous items, weekly papers, a novel by R. Jorif, Le Burelain, the chronicles of Ph. Meyer, Le Monde, and neologisms collected in a lycée) allow us to test the adopted definitions and to confirm, on the main points, several hypotheses about the inequalities between members of the linguistic community facing the neological phenomenon. Not everyone creates the same number of neologisms, nor the same ones. Complementary analyses, prompted by the examination of the facts, consider the existence of circumstances propitious to neology, then of causes that drive the speaker to create a new lexie and about which the interpreter makes hypotheses. Finally, considerations connected with the circulation of discourse refine the concept of newness and show the uncertainties that bear on the future of lexies created by a given speaker in given circumstances.
Lerat, Nadine. "Représentation et traitement des valeurs nulles dans les bases de données." Paris 11, 1986. http://www.theses.fr/1986PA112383.
This thesis deals with the representation and treatment of two kinds of incomplete information in databases: non-applicable null values and null values representing unknown objects. In the first part, queries on a single table containing non-applicable nulls are translated into a set of queries on conventional multitables. In the second part, unknown null values are represented by Skolem constants, and a method adapting a "chase" algorithm to this context allows queries to be evaluated when functional or inclusion dependencies are satisfied. Finally, it is shown that these two types of null values can be taken into account simultaneously.
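One chase step over labelled nulls can be sketched concretely. Below is a hypothetical miniature (not the thesis's algorithm): enforcing a functional dependency A -> B on tuples whose unknown values are labelled nulls, in the spirit of Skolem constants; when two tuples agree on A, their B values are equated and a null is replaced by a known constant.

```python
# Labelled nulls are strings starting with "_" (e.g. "_n1").
def is_null(v):
    return isinstance(v, str) and v.startswith("_")

def chase_fd(tuples, lhs, rhs):
    """Chase the FD lhs -> rhs to a fixpoint, substituting nulls."""
    tuples = [dict(t) for t in tuples]
    changed = True
    while changed:
        changed = False
        for t1 in tuples:
            for t2 in tuples:
                if t1 is t2 or t1[lhs] != t2[lhs]:
                    continue
                a, b = t1[rhs], t2[rhs]
                if a != b and is_null(b):
                    for t in tuples:          # substitute b by a everywhere
                        for k, v in t.items():
                            if v == b:
                                t[k] = a
                    changed = True
    return tuples

r = [{"emp": "ana", "dept": "_n1"},
     {"emp": "ana", "dept": "sales"}]
print(chase_fd(r, "emp", "dept"))
# both tuples end up with dept = 'sales'
```

When both values are nulls, the same step unifies them, which is exactly how Skolem constants accumulate equalities during query evaluation; a conflict between two distinct constants would signal a dependency violation, which this sketch simply leaves untouched.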
Gassot, Oriane. "Imagerie SAR du régolithe d'un astéroïde : simulation et traitement des données." Thesis, Université Grenoble Alpes, 2020. http://www.theses.fr/2020GRALY016.
In recent years, surface-penetrating radars have been part of several space missions dedicated to studying bodies of the solar system. These radars, associated with repeat-pass observations and processing, enable 3D radar tomography, which yields 3D imagery of the first tens of meters of the subsurface of planetary bodies. This technique can be used to better understand and model the formation processes and post-accretion evolution of asteroids. However, even though spaceborne SAR is a classical technique for the detection and reconstruction of planetary structures, the small-body observation geometry calls for reconsidering the hypotheses usually formulated for Earth observation. Furthermore, in order to achieve the metric resolution necessary to study kilometre-sized asteroids with sufficient precision, the radar has to be ultra-wideband in range and in Doppler, which also questions the SAR synthesis models established for narrowband signals. As the radar geometry of study and configuration drive the instrument performance, and thus the mission science return, simulation of the radar signal and of the SAR synthesis must be developed while taking into account the specificities of the small-body geometry. Thus, my thesis aims at assessing the performance of the UWB SAR HFR, dedicated to the study of small bodies, with frequencies ranging from 300 to 800 MHz, by simulating the radar's return. After first creating realistic asteroid digital terrain models (DTM), several surface scattering models were studied in order to select the one best suited to simulating the field scattered by the surface of an asteroid. The Kirchhoff Approximation (KA) was selected and applied to the generated DTMs, and was used to build SAR images which correctly locate the DTM studied and differentiate the rough areas of the terrain from the smooth ones.
Then, the Born Approximation (BA) was selected to model the field reflected by the asteroid subsurface and was found to correctly locate an inclusion below the surface of an asteroid. With a multipass geometry, tomography algorithms were applied to the BA results in order to improve the resolution of the results in the third dimension of space, as well as the precision of the localisation of the inclusion. Finally, the performances of UWB scattering were studied, and, unlike what was foreseen, UWB scattering generates only a small degradation of the resolution in range and in azimuth
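The SAR imaging principle underlying these simulations can be illustrated with a deliberately simplified delay-and-sum backprojection sketch. This is not the thesis's actual UWB synthesis: the narrowband (single-carrier) echo model, the 2D geometry and all function names below are illustrative assumptions.

```python
import math

C = 299792458.0  # speed of light, m/s

def backproject(echoes, positions, grid, f0):
    """Delay-and-sum backprojection: for each grid cell, coherently sum the
    echoes recorded along the trajectory, phase-corrected for the two-way
    path to that cell. `echoes[k]` maps a two-way delay (s) to a complex
    sample recorded at sensor position `positions[k]` (hypothetical API)."""
    image = []
    for cell in grid:
        acc = 0.0 + 0.0j
        for pos, echo in zip(positions, echoes):
            r = math.dist(pos, cell)
            tau = 2.0 * r / C                     # two-way travel time
            phase = 2.0 * math.pi * f0 * tau      # carrier phase correction
            acc += echo(tau) * complex(math.cos(phase), math.sin(phase))
        image.append(abs(acc))                    # focused amplitude
    return image
```

With a synthetic point target, the cell containing the target focuses coherently (amplitude equal to the number of sensor positions) while other cells sum with scrambled phases.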
Pucheu, Chantal. "Etude de la goutte : traitement par l'allopurinol : données pharmacologiques et toxicologiques." Bordeaux 2, 1992. http://www.theses.fr/1992BOR2P003.
Full textBisgambiglia, Paul-Antoine. "Traitement numérique et informatique de la modélisation spectrale." Corte, 1989. http://www.theses.fr/1989CORT3002.
Full textDucournau, Aurélien. "Hypergraphes : clustering, réduction et marches aléatoires orientées pour la segmentation d'images et de vidéo." Ecole nationale d'ingénieurs (Saint-Etienne), 2012. http://www.theses.fr/2012ENISE011.
Full textGraphs have been successfully used in Pattern Recognition and Computer Vision domains for a number of years, due to their ability to represent relational patterns. However, their main limitation comes from the fact that they are only able to model pairwise affinities between elements of interest. In this work, we focus on exploiting the advantages of hypergraphs, which generalize the graph concept by providing the possibility to model relationships of higher dimension, in the domain of image analysis and more particularly image processing tasks such as image or video segmentation. This thesis mainly focused on how we can represent the image data in a hypergraph-based model, and how we can handle such a representation in a particular application. In many of these applications the problem can be formulated as a partitioning of the underlying hypergraph structure, and consequently we developed a novel hypergraph partitioning and clustering approach (RHC), based on the reduction of the hypergraph structure. Two novel hypergraph reduction algorithms (HR-IH and HR-MST) have been designed for this purpose. The RHC framework has been exploited in different computer vision problems (image segmentation, object-based video segmentation and superpixels generation), for which appropriate hypergraph-based models have been designed. Finally, the notion of directed hypergraph has been introduced in the computer vision domain, by proposing both a directed hypergraph-based representation of the image content (DINH), and an interactive image segmentation algorithm by semi-supervised learning on such a DINH. The latter is formulated in terms of Markov random walks on directed hypergraphs, for which an appropriate transition matrix and important mathematical properties have been developed.
Extensive experiments have shown that hypergraphs can in general provide a richer image representation model than graphs, and that the proposed algorithms are efficient both in terms of segmentation quality and complexity. It has also been shown that using directed hypergraphs can provide an improvement over undirected hypergraph-based structures
Spill, Yannick. "Développement de méthodes d'échantillonnage et traitement bayésien de données continues : nouvelle méthode d'échange de répliques et modélisation de données SAXS." Paris 7, 2013. http://www.theses.fr/2013PA077237.
Full textThe determination of protein structures and other macromolecular complexes is becoming more and more difficult. The simplest cases have already been determined, and today's research in structural bioinformatics focuses on ever more challenging targets. To successfully determine the structure of these complexes, it has become necessary to combine several kinds of experiments and to relax the quality standards during acquisition. In other words, structure determination makes an increasing use of sparse, noisy and inconsistent data. It is therefore becoming essential to quantify the accuracy of a determined structure. This quantification is superbly achieved by statistical inference. In this thesis, I develop a new sampling algorithm, Convective Replica-Exchange, designed to find probable structures more robustly. I also propose a proper statistical treatment for continuous data, such as Small-Angle X-Ray Scattering data
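The replica-exchange idea that Convective Replica-Exchange builds on can be sketched as standard parallel tempering on a toy one-dimensional energy. This is a generic illustration, not the thesis's algorithm; all names and the scalar state space are assumptions.

```python
import math
import random

def metropolis_step(x, beta, energy, step=0.5, rng=random):
    """One Metropolis update of a scalar state x at inverse temperature beta."""
    proposal = x + rng.uniform(-step, step)
    delta_e = energy(proposal) - energy(x)
    if delta_e <= 0 or rng.random() < math.exp(-beta * delta_e):
        return proposal
    return x

def replica_exchange(states, betas, energy, n_sweeps=1000, rng=random):
    """Parallel tempering: local Metropolis moves on each replica, plus
    neighbour swaps accepted with probability
    min(1, exp((beta_i - beta_j) * (E_i - E_j)))."""
    states = list(states)
    for _ in range(n_sweeps):
        states = [metropolis_step(x, b, energy, rng=rng)
                  for x, b in zip(states, betas)]
        i = rng.randrange(len(states) - 1)
        delta = (betas[i] - betas[i + 1]) * (energy(states[i]) - energy(states[i + 1]))
        if delta >= 0 or rng.random() < math.exp(delta):
            states[i], states[i + 1] = states[i + 1], states[i]
    return states
```

Hot (low-beta) replicas explore broadly while cold ones refine; swaps let good configurations migrate toward the cold chain.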
Héraud, Bousquet Vanina. "Traitement des données manquantes en épidémiologie : Application de l'imputation multiple à des données de surveillance et d'enquêtes." Phd thesis, Université Paris Sud - Paris XI, 2012. http://tel.archives-ouvertes.fr/tel-00713926.
Full textHéraud, Bousquet Vanina. "Traitement des données manquantes en épidémiologie : application de l’imputation multiple à des données de surveillance et d’enquêtes." Thesis, Paris 11, 2012. http://www.theses.fr/2012PA11T017/document.
Full textThe management of missing values is a common and widespread problem in epidemiology. The most common technique used restricts the data analysis to subjects with complete information on the variables of interest, which can substantially reduce statistical power and precision and may also result in biased estimates. This thesis investigates the application of multiple imputation methods to manage missing values in epidemiological studies and surveillance systems for infectious diseases. The study designs to which multiple imputation was applied were diverse: a risk analysis of HIV transmission through blood transfusion, a case-control study on risk factors for Campylobacter infection, and a capture-recapture study to estimate the number of new HIV diagnoses among children. We then performed a multiple imputation analysis on data from a surveillance system for chronic hepatitis C (HCV) to assess risk factors for severe liver disease among HCV-infected patients who reported drug use. Within this study on HCV, we proposed guidelines for applying a sensitivity analysis in order to test the hypotheses underlying multiple imputation. Finally, we describe how we elaborated and applied an ongoing multiple imputation process for the French national HIV surveillance database, and evaluated and attempted to validate the multiple imputation procedures. Based on these practical applications, we worked out a strategy to handle missing data in a surveillance database, including the thorough examination of the incomplete database, the building of the imputation model, and the procedure to validate imputation models and examine the underlying multiple imputation hypotheses
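The multiple imputation workflow described above can be sketched in a few lines: impute the dataset several times, analyse each completed dataset, then pool the estimates (Rubin's rules). The hot-deck draw and all names here are hypothetical simplifications; a real analysis would use a proper imputation model conditioned on covariates.

```python
import random
import statistics

def impute_once(values, rng):
    """Fill missing entries (None) by sampling from observed values (hot deck)."""
    observed = [v for v in values if v is not None]
    return [v if v is not None else rng.choice(observed) for v in values]

def multiple_imputation_mean(values, m=20, seed=0):
    """Rubin's rules for the mean: pool point estimates over m completed
    datasets and report the between-imputation variance."""
    rng = random.Random(seed)
    estimates = [statistics.mean(impute_once(values, rng)) for _ in range(m)]
    pooled = statistics.mean(estimates)       # pooled point estimate
    between = statistics.variance(estimates)  # between-imputation variance
    return pooled, between
```

The between-imputation variance is what a complete-case analysis silently discards: it quantifies the extra uncertainty due to the missing data.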
Benzine, Mehdi. "Combinaison sécurisée des données publiques et sensibles dans les bases de données." Versailles-St Quentin en Yvelines, 2010. http://www.theses.fr/2010VERS0024.
Full textProtection of sensitive data is a major issue in the databases field. Many software and hardware solutions have been designed to protect data when stored and during query processing. Moreover, it is also necessary to provide a secure manner to combine sensitive data with public data. To achieve this goal, we designed a new storage and processing architecture. Our solution combines a main server that stores public data and a secure server dedicated to the storage and processing of sensitive data. The secure server is a hardware token which is basically a combination of (i) a secured microcontroller and (ii) a large external NAND Flash memory. Queries that combine public and sensitive data are split into two sub-queries: the first one deals with the public data, the second one with the sensitive data. Each sub-query is processed on the server storing the corresponding data. Finally, the data obtained by the computation of the sub-query on public data is sent to the secure server to be combined with the result of the computation on sensitive data. For security reasons, the final result is built on the secure server. This architecture resolves the security problems, because all the computations dealing with sensitive data are done by the secure server, but brings performance problems (little RAM, asymmetric cost of read/write operations...). These problems are addressed by different query optimization strategies
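The query-splitting principle can be sketched with a toy in-memory model: the public server evaluates its predicate first, and only the secure side ever touches sensitive rows or builds the final result. This is an illustrative sketch of the splitting idea, not the actual token architecture; the function and parameter names are assumptions.

```python
def split_query(public_rows, sensitive_rows, join_key, public_pred):
    """Two-server evaluation sketch: the (untrusted) public server applies
    its predicate, ships the partial result to the secure server, and the
    join with sensitive data is computed there only."""
    # Sub-query 1: runs on the public server
    public_part = [r for r in public_rows if public_pred(r)]
    # Sub-query 2 + final combination: run on the secure server
    index = {r[join_key]: r for r in sensitive_rows}
    return [{**r, **index[r[join_key]]}
            for r in public_part if r[join_key] in index]
```

Note that the public server learns nothing about sensitive attributes; the cost is that the join must fit the secure server's constrained memory, which is exactly the optimization problem the thesis addresses.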
Martin, Thomas. "ORBS : élaboration d'un logiciel de réduction de données pour SpIOMM et SITELLE et application à l'étude de M1-67." Doctoral thesis, Université Laval, 2015. http://hdl.handle.net/20.500.11794/25894.
Full textSpIOMM (spectromètre-imageur de l’observatoire du Mont-Mégantic), attached to the telescope of the Observatoire du Mont-Mégantic, is an imaging Fourier transform spectrometer which is still the only instrument in the world capable of sampling a 12 arc-minute field of view into 1.4 million spectra in the visible band. The installation in 2010 of a second camera, which made it possible to use the data from the second port of the interferometer, together with the development of SITELLE (spectromètre-imageur pour l’étude en long et en large des raies d’émission), an upgraded version of SpIOMM for the Canada-France-Hawaii Telescope, made the design of data reduction software capable of combining the data from both ports a necessity. The main part of this thesis concerns ORBS, a fully automated data reduction software package for SpIOMM and SITELLE based on an open and upgradable architecture. An application to the study of the Wolf-Rayet nebula M1-67, which, for the first time, clearly demonstrates the existence of two regions made of material strongly enriched in nitrogen, is also presented.
Marot, Julien. "Méthodes par sous-espaces et d'optimisation : application au traitement d'antenne, à l'analyse d'images, et au traitement de données tensorielles." Aix-Marseille 3, 2007. http://www.theses.fr/2007AIX30051.
Full textThis thesis is devoted to subspace-based and optimization methods, developed for array processing, image processing, and tensor signal processing. Definitions concerning an array processing problem and high-resolution methods are presented. We propose an optimization method applied to source detection in the presence of phase distortions for a high number of sensors. We propose fast subspace-based methods for the estimation of straight-line offset and orientation, and several optimization methods to estimate distorted contours, nearly straight or circular. We provide a state of the art of multiway signal processing: truncation of the HOSVD, lower-rank tensor approximation, and multiway Wiener filtering. We propose a procedure for nonorthogonal tensor flattening, using the method presented in the first part
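The truncated HOSVD mentioned in this state of the art can be sketched as a generic Tucker-style truncation: compute a per-mode SVD basis, project the tensor onto the leading left singular vectors of each unfolding, and expand back. The function names are chosen for illustration.

```python
import numpy as np

def unfold(tensor, mode):
    """Mode-n unfolding: move axis `mode` first, then flatten the rest."""
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

def truncated_hosvd(tensor, ranks):
    """Truncated HOSVD: keep ranks[n] leading singular vectors of each
    mode-n unfolding, form the Tucker core, then reconstruct."""
    factors = []
    for mode, r in enumerate(ranks):
        u, _, _ = np.linalg.svd(unfold(tensor, mode), full_matrices=False)
        factors.append(u[:, :r])
    core = tensor
    for mode, u in enumerate(factors):   # core = tensor x_n U_n^T
        core = np.moveaxis(np.tensordot(u.T, np.moveaxis(core, mode, 0), axes=1), 0, mode)
    approx = core
    for mode, u in enumerate(factors):   # approx = core x_n U_n
        approx = np.moveaxis(np.tensordot(u, np.moveaxis(approx, mode, 0), axes=1), 0, mode)
    return approx
```

For a rank-1 tensor, truncation to multilinear ranks (1, 1, 1) reconstructs the tensor exactly; for noisy data it acts as a multiway low-rank denoiser.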
Michel, François. "Validation de systèmes répartis : symétries d'architecture et de données." Toulouse, INPT, 1996. http://www.theses.fr/1996INPT099H.
Full textBilodeau, Antoine, and Antoine Bilodeau. "Traitement de données observationnelles et développement d'un code de simulation pour SpIOMM et SITELLE." Master's thesis, Université Laval, 2016. http://hdl.handle.net/20.500.11794/26907.
Full textThe arrival of the Fourier-transform imaging spectrometer SITELLE at the Canada-France-Hawaii telescope highlights the importance of having an exposure time calculator allowing the instrument’s users to plan ahead their observing proposals. A large part of my project is the development of a simulator able to reproduce the results obtained with SITELLE and its predecessor SpIOMM. The accuracy of the simulations is confirmed through comparison with SpIOMM data and SITELLE’s first observations. The second part of my project is the spectral analysis of observational data. SpIOMM’s large field of view allows us to study the properties (radial velocities and line intensities) of the ionized gas through the entirety of Arp 72, a pair of interacting galaxies. For the first time, the rotation curve in the optical and the metallicity gradient are obtained for NGC 5996, the main galaxy in Arp 72. Both SpIOMM and SITELLE results are also shown for the dwarf spiral galaxy NGC 7320.
Tencé, Marcel. "Un système informatique pour l'acquisition et le traitement des données en microscopie électronique : réalisation, mise au point et application." Paris 11, 1987. http://www.theses.fr/1987PA112449.
Full textAs a consequence of its general layout, the new generation of scanning transmission electron microscopes (STEM) is particularly well suited to being interfaced with a computer, which has several functions: the control of the electron microscope parameters used for image acquisition, the storage of the recorded data, and their a posteriori processing. STEM instruments offer several specific characteristics which determine the technical choices for the elaboration of the digital system hardware and for building the required interfaces. Among these is the capability of simultaneously recording the data delivered by the different detection channels for one probe position on the specimen. The system also has to handle the sequences of energy-filtered images necessary for achieving elemental mapping. Finally, the replication of images in a given set of working conditions is the key for applying cross-correlation techniques and estimating image parameters such as point resolution or signal-to-noise ratio. This work describes the hardware which has been built and the software which has been elaborated to fulfill these goals. As a conclusion, we present three different applications made with this system: i) on-line measurement of the fractal dimension of aggregates, ii) estimation of the spatial resolution in EELS chemical mapping, iii) quantitative development of these methods with a reasonable extrapolation of the detection limits to the identification of a single atom
Malla, Noor. "Méthode de Partitionnement pour le traitement distribué et parallèle de données XML." Phd thesis, Université Paris Sud - Paris XI, 2012. http://tel.archives-ouvertes.fr/tel-00759173.
Full textToutain, Matthieu. "EdP géometriques pour le traitement et la classification de données sur graphes." Caen, 2015. https://hal.archives-ouvertes.fr/tel-01258738.
Full textPartial differential equations (PDEs) play a key role in the mathematical modelization of phenomena in applied sciences. In particular, in image processing and computer vision, geometric PDEs have been successfully used to solve many problems, such as image restoration, segmentation, inpainting, etc. Nowadays, more and more data are collected as graphs or networks, or as functions defined on these networks. Knowing this, there is an interest in extending PDEs to process irregular data or graphs of arbitrary topology. The presented work follows this idea. More precisely, this work is about geometric partial difference equations (PdEs) for data processing and classification on graphs. In the first part, we propose a transcription of the normalized p-Laplacian on weighted graphs of arbitrary topology by using the framework of PdEs. This adaptation allows us to introduce a new class of p-Laplacian on graphs in non-divergence form. In this part, we also introduce a formulation of the p-Laplacian on graphs defined as a convex combination of gradient terms. We show that this formulation unifies and generalizes many existing difference operators defined on graphs. Then, we use this operator with the Poisson equation to compute generalized distances on graphs. In the second part, we propose to apply the operators on graphs we defined to the tasks of semi-supervised classification and clustering. We compare them to existing graph operators and to some state-of-the-art methods, such as Multiclass Total Variation clustering (MTV), clustering by non-negative matrix factorization (NMFR), and the INCRES method
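The semi-supervised classification setting can be illustrated with the simplest graph diffusion: clamp a few labeled seeds and iterate each unlabeled node toward the weighted mean of its neighbours, i.e. a discrete harmonic interpolation of the labels. This is a much simpler special case than the operators developed in the thesis; the adjacency representation and names are illustrative.

```python
def label_propagation(weights, labels, n_iter=200):
    """Harmonic label propagation on a weighted graph.
    `weights[i]` is a dict {neighbour: edge weight}; `labels` maps seed
    nodes to their (real-valued) labels, which stay clamped."""
    n = len(weights)
    f = [labels.get(i, 0.0) for i in range(n)]
    for _ in range(n_iter):
        new_f = list(f)
        for i in range(n):
            if i in labels:            # seeds stay fixed
                continue
            den = sum(weights[i].values())
            if den > 0:                # weighted mean of neighbours
                new_f[i] = sum(w * f[j] for j, w in weights[i].items()) / den
        f = new_f
    return f
```

Thresholding the diffused values then yields a binary classification; the thesis's operators generalize the averaging rule applied at each node.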
Hardy, Emilie. "Procédures expérimentales et traitement des données associées pour la mission spatiale MICROSCOPE." Observatoire de Paris (1667-....), 2013. https://hal.science/tel-02095140.
Full textThe MICROSCOPE space mission aims at testing the Equivalence Principle with an accuracy of 1E-15. The instrument will be embarked on board a drag-free microsatellite and consists of a differential electrostatic accelerometer composed of two test masses submitted to the same gravitational field but made of different materials. The accelerations applied to the masses to maintain them relatively motionless are measured, and will demonstrate a violation of the universality of free fall if found unequal. The accuracy of the measurement exploited for the test is limited by our a priori knowledge of the instrument's physical parameters. An in-orbit calibration is needed to finely characterize them in order to correct the measurement. Appropriate calibration procedures have been determined. In order to validate their performances, they have been implemented in a software tool dedicated to the calibration sessions which simulates the instrument, the satellite and its environment. Other perturbations must be considered during the data analysis: numerical effects arise from the finite time span of the measurement. These effects have been evaluated, and a procedure has been determined to process the data with minimal numerical perturbations, in a nominal situation as well as in the case of missing data, which may amplify these effects. Finally, a simulator has been developed for the entire mission scenario in order to study the combination of instrumental and numerical simulations. This tool makes it possible to simulate both the calibration sessions and the sessions for the test of the Equivalence Principle, and is therefore used to validate the protocol of measurement correction and analysis
Thirion, Bertrand. "Analyse de données d' IRM fonctionnelle : statistiques, information et dynamique." Phd thesis, Télécom ParisTech, 2003. http://tel.archives-ouvertes.fr/tel-00457460.
Full textBouveyron, Charles. "Modélisation et classification des données de grande dimension : application à l'analyse d'images." Phd thesis, Université Joseph Fourier (Grenoble), 2006. http://tel.archives-ouvertes.fr/tel-00109047.
Full textdimension. Partant du postulat que les données de grande dimension vivent dans des sous-espaces de
dimensions intrinsèques inférieures à la dimension de l'espace original et que les données de classes
différentes vivent dans des sous-espaces différents dont les dimensions intrinsèques peuvent être aussi
différentes, nous proposons une re-paramétrisation du modèle de mélange gaussien. En forçant certains
paramètres à être communs dans une même classe ou entre les classes, nous exhibons une famille de 28 modèles gaussiens adaptés aux données de grande dimension, allant du modèle le plus général au modèle le plus parcimonieux. Ces modèles gaussiens sont ensuite utilisés pour la discrimination et la classification
automatique de données de grande dimension. Les classifieurs associés à ces modèles sont baptisés respectivement High Dimensional Discriminant Analysis (HDDA) et High Dimensional Data Clustering (HDDC) et
leur construction se base sur l'estimation par la méthode du maximum de vraisemblance des paramètres du
modèle. La nature de notre re-paramétrisation permet aux méthodes HDDA et HDDC de ne pas être perturbées par le mauvais conditionnement ou la singularité des matrices de covariance empiriques des classes et d'être
efficaces en terme de temps de calcul. Les méthodes HDDA et HDDC sont ensuite mises en dans le cadre d'une
approche probabiliste de la reconnaissance d'objets dans des images. Cette approche, qui peut être
supervisée ou faiblement supervisée, permet de localiser de manière probabiliste un objet dans une
nouvelle image. Notre approche est validée sur des bases d'images récentes et comparée aux meilleures
méthodes actuelles de reconnaissance d'objets.
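The class-specific subspace idea behind this family of models can be illustrated with a much-simplified nearest-subspace classifier: fit a per-class mean and a few principal axes, then assign each point to the class whose affine subspace is closest. This is not the thesis's maximum-likelihood parameterization of the Gaussian mixture; all names are illustrative.

```python
import numpy as np

def fit_class_subspaces(X, y, d=1):
    """Per class: mean + d leading principal axes of the centred data."""
    models = {}
    for c in np.unique(y):
        Xc = X[y == c]
        mu = Xc.mean(axis=0)
        # right singular vectors of centred data = principal axes
        _, _, vt = np.linalg.svd(Xc - mu, full_matrices=False)
        models[c] = (mu, vt[:d])
    return models

def predict_by_subspace(models, X):
    """Assign each point to the class whose affine subspace is closest."""
    preds = []
    for x in X:
        dists = {}
        for c, (mu, basis) in models.items():
            centred = x - mu
            residual = centred - basis.T @ (basis @ centred)
            dists[c] = np.linalg.norm(residual)
        preds.append(min(dists, key=dists.get))
    return np.array(preds)
```

Because only d axes per class are estimated, the method stays usable when the empirical covariance matrices would be singular, which is the regime the thesis targets.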
Brunet, Camille. "Classification parcimonieuse et discriminante de données complexes. Une application à la cytologie." Phd thesis, Université d'Evry-Val d'Essonne, 2011. http://tel.archives-ouvertes.fr/tel-00671333.
Full textLoughraïeb, Mounira. "Valence et rôles thématiques comme outils de réduction d’ambiguïtés en traitement automatique de textes écrits." Nancy 2, 1990. http://www.theses.fr/1990NAN21005.
Full textDupuy, Marie-Pierre. "Essais de prospection électromagnétique radar en mines et au sol : interprétation et traitement des données." Bordeaux 1, 1987. http://www.theses.fr/1987BOR10504.
Full text