Theses on the topic "Traitement des données et Chimiométrie"
Cite a source in APA, MLA, Chicago, Harvard and many other citation styles
Consult the top 50 dissertations (graduate or doctoral theses) on the research topic "Traitement des données et Chimiométrie".
Next to every source in the list of references there is an "Add to bibliography" button. Press it, and we will automatically generate the bibliographic citation of the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the scientific publication in .pdf format and read the abstract online, if it is available in the metadata.
Browse theses from many scientific disciplines and compile an accurate bibliography.
Abbas, Ouissam. "Vieillissement simulé ou naturel de la matière organique. Apport du traitement chimiométrique des données spectroscopiques. Conséquences environnementales et aide à l'exploration pétrolière". Aix-Marseille 3, 2007. http://www.theses.fr/2007AIX30006.
The characterization of organic matter before and during an ageing process is a very significant parameter for professionals in various industrial sectors. Knowledge of its evolution processes is important for a better understanding of its origins and the mechanisms of its genesis, and for determining its various environmental roles in natural or artificial ecosystems, in particular when the aim is to assist oil exploration or to improve depollution processes in aquatic environments. An effective analytical methodology is therefore necessary. To this end, a new approach to the use of spectroscopic techniques (infrared and fluorescence) is to treat the spectral data with chemometric tools (PCA, SIMPLISMA and PLS) in order to establish a structure-reactivity correlation for the organic matter. The first objective was to authenticate the geographical origins of oils and their geological data, and to evaluate their degree of maturity and biodegradation. In addition, this methodology made it possible to follow the photodegradation of phenol in aqueous solutions and to determine the concentration profiles of intermediary compounds. The effectiveness of the process was thus estimated in terms of the elimination of the pollutant and its photoproducts. The conclusions of this analytical procedure, which is innovative in the environmental field and the petroleum industry, are in agreement with those of classical techniques. Reliable and relevant information was obtained rapidly and at low cost.
Abou, Fadel Maya. "Apports de la chimiométrie à la spectroscopie de Résonance Paramagnétique Electronique : nouvelles perspectives de traitement de données spectrales à dimensions spatiales (imagerie) et/ou temporelles". Thesis, Lille 1, 2015. http://www.theses.fr/2015LIL10130/document.
Electron Paramagnetic Resonance (EPR) spectroscopy has undoubtedly become the first-choice technique for the characterization of complex materials containing unpaired electrons (transition metal ions, rare earth ions, defects, organic radicals, ...). Similarly to nuclear magnetic resonance spectroscopy, EPR generates multidimensional (2D, 3D, ...) spectral data, and recently also spatial (imaging) data as well as combined spectral/spatial data. It is thus surprising that, despite the large amount of spectral data to be explored and the complexity of EPR signals, the multivariate data processing methods that are widely available in chemometrics are hardly exploited at the international level. The objective of this thesis is therefore to develop new tools for the treatment of these EPR spectral data, to establish new analytical methodologies and to evaluate their performance. The two main areas studied are spectroscopic imaging and time-resolved spectroscopy. In this work, we show that the implementation of the methods known as "multivariate curve resolution" can extract, simultaneously and without a priori knowledge, all the chemical maps and the corresponding spectra of the pure compounds present in the studied sample. This methodology is also exploited to extract the EPR spectra of intermediate species during a kinetic monitoring.
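The multivariate curve resolution approach mentioned in this abstract can be illustrated with a minimal alternating-least-squares sketch on synthetic two-component spectra. All data, shapes and names below are invented for illustration; this is not the thesis's actual implementation, only the generic MCR-ALS idea of factoring a mixture matrix into non-negative concentrations and pure spectra.

```python
import numpy as np

def mcr_als(D, C0, n_iter=100):
    """Alternating least squares: factor D ~ C @ S.T with non-negativity
    enforced by clipping after each unconstrained least-squares step."""
    C = C0.copy()
    for _ in range(n_iter):
        # Update pure spectra S for fixed concentrations C
        S = np.linalg.lstsq(C, D, rcond=None)[0].T
        S = np.clip(S, 0, None)
        # Update concentrations C for fixed spectra S
        C = np.linalg.lstsq(S, D.T, rcond=None)[0].T
        C = np.clip(C, 0, None)
    return C, S

# Synthetic two-component mixture data (Gaussian bands, random concentrations)
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 200)
s1 = np.exp(-((x - 0.3) ** 2) / 0.005)   # pure spectrum 1
s2 = np.exp(-((x - 0.7) ** 2) / 0.005)   # pure spectrum 2
conc = rng.random((50, 2))               # 50 mixtures, 2 components
D = conc @ np.vstack([s1, s2])           # mixture data matrix

C, S = mcr_als(D, C0=rng.random((50, 2)))
err = np.linalg.norm(D - C @ S.T) / np.linalg.norm(D)  # relative fit error
```

The clipping step is the simplest possible non-negativity constraint; real MCR-ALS implementations typically use proper non-negative least squares and closure constraints.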
Elmi, Rayaleh Waïss. "Extraction de connaissances en imagerie microspectrométrique par analyse chimiométrique : application à la caractérisation des constituants d'un calcul urinaire". Phd thesis, Université des Sciences et Technologie de Lille - Lille I, 2006. http://tel.archives-ouvertes.fr/tel-00270116.
Jacq, Kévin. "Traitement d'images multispectrales et spatialisation des données pour la caractérisation de la matière organique des phases solides naturelles. High-resolution prediction of organic matter concentration with hyperspectral imaging on a sediment core High-resolution grain size distribution of sediment core with 2 hyperspectral imaging Study of pansharpening methods applied to hyperspectral images of sediment cores". Thesis, Université Grenoble Alpes (ComUE), 2019. http://www.theses.fr/2019GREAA024.
The evolution of the environment and the climate is currently the focus of attention. The impacts of the activities of present and past societies on the environment are in particular questioned in order to better anticipate the implications of our current activities for the future. Better describing past environments and their evolution is possible thanks to the study of many natural recorders (sediments, speleothems, tree rings, corals). Thanks to them, it is possible to characterize biological, physical and chemical evolutions at different temporal resolutions and for different periods. High resolution, understood here as the resolution sufficient for studying the environment in connection with the evolution of societies, constitutes the main obstacle in the study of these natural archives, in particular because of the analytical capacity of the devices, which can only rarely resolve fine inframillimetre structures. This work is built on the assumption that the use of hyperspectral sensors (VNIR, SWIR, LIF) coupled with relevant statistical methods should give access to the spectral, and therefore biological, physical and chemical, information contained in these natural archives at a spatial resolution of a few tens of micrometers and, therefore, allow methods to be proposed to reach high temporal resolution (the season). Besides, to obtain reliable estimates, several imaging sensors and linear spectroscopies (XRF, TRES) are used, each with its own characteristics (resolutions, spectral ranges, atomic/molecular interactions). These analytical methods are used for the surface characterization of sediment cores. These micrometric spectral analyses are mapped to the usual millimetric geochemical analyses. Optimizing the complementarity of all these data involves developing methods to overcome the difficulty inherent in coupling data considered essentially dissimilar (resolutions, spatial shifts, spectral non-overlap). Thus, four methods were developed.
The first consists in combining hyperspectral and usual methods for the creation of quantitative predictive models. The second allows the spatial registration of different hyperspectral images at the lowest resolution. The third focuses on merging them at the highest of the resolutions. Finally, the last one focuses on deposits in sediments (laminae, floods, tephras) to add a temporal dimension to our studies. Through all this information and these methods, multivariate predictive models were estimated for the study of organic matter, textural parameters and particle size distribution. The laminated and instantaneous deposits within the samples were characterized. These made it possible to estimate flood chronicles, as well as biological, physical and chemical variations at the seasonal scale. Hyperspectral imaging coupled with data analysis methods is therefore a powerful tool for the study of natural archives at fine temporal resolutions. Further development of the approaches proposed in this work will make it possible to study multiple archives to characterize evolutions at the scale of one or more watersheds.
Cooney, Gary Sean. "Spectroscopie Raman exaltée de pointe pour la caractérisation de systèmes biologiques : de l'imagerie chimique et structurale nanométrique à l’air à son développement en milieu liquide". Electronic Thesis or Diss., Bordeaux, 2024. http://www.theses.fr/2024BORD0267.
The aims of this thesis are the development of tip-enhanced Raman spectroscopy (TERS) for applications in liquid media, specifically for the study of lipid membranes and amyloid proteins, which are implicated in neurodegenerative diseases like Alzheimer's. TERS overcomes the diffraction limit of conventional Raman spectroscopy by combining the high spatial resolution of scanning probe microscopy with the chemical specificity of surface-enhanced Raman spectroscopy (SERS). By employing a metal-coated, nano-tapered scanning probe tip, TERS generates a localised enhancement of the Raman signal at the tip apex. This enables the study of optically non-resonant biomolecules at the nanoscale in a label-free and non-destructive manner. The key challenges addressed in this work include the fabrication of TERS-active tips, the optimisation of our novel total-internal reflection (TIR)-TERS system for use in liquid environments, and the handling of the complex data obtained from hyperspectral TERS imaging. Amyloid proteins in the form of Tau fibrils were studied using this TIR-TERS setup, with heparin-induced Tau fibrils serving as a benchmark for evaluating the performance of the system. TERS studies of RNA-induced Tau fibrils provided insight into the underlying formation mechanisms of amyloid fibrils. In addition, these data were used to explore the use of chemometric methods, such as Principal Component Analysis (PCA) and Hierarchical Cluster Analysis (HCA), for their fine analysis. These methods were evaluated against more traditional peak-picking methods. This thesis also details the development of a liquid-compatible TIR-TERS system and its application to the study of supported lipid bilayers in aqueous media. This advancement enables the nanoscale investigation of lipid membranes in biologically relevant media, which is more representative than TERS in air.
With the prospect of future work investigating protein-lipid interactions, these innovations are crucial for understanding amyloid fibril formation and its deleterious effects on neuronal cells. To conclude, this thesis enhances TERS as a tool for studying biomolecular structures in the context of neurodegenerative diseases at the nanoscale, and the optimised TIR-TERS system provides a platform for future research in biological and biomedical applications.
Gendrin, Christelle. "Imagerie chimique et chimiométrie pour l'analyse des formes pharmaceutiques solides". Phd thesis, Université Louis Pasteur - Strasbourg I, 2008. http://tel.archives-ouvertes.fr/tel-00341106.
The first problem is the extraction of the distribution maps of the chemical compounds. In the first case, the composition of the drug is known, and the compounds' specific wavelengths or reference spectra are used to extract the distribution maps. The study of histograms and a segmentation scheme using the watershed algorithm are then proposed to characterize them. In a second step, the case where no a priori knowledge is available is studied. The simultaneous extraction of the concentrations and the pure spectra from the mixture matrix must then be performed. Several algorithms are compared: NMF, BPSS, MCR-ALS and PMF; rotation tools were also introduced to explore the solution domain, and PMF proves to be the most precise for the extraction.
The second problem is the quantification of the active pharmaceutical ingredient in tablets. A first study on binary mixtures reveals the difficulties of quantification without a priori knowledge. It is then shown that quantification is most precise when a set of samples of known concentration is used to build the mathematical model with the PLS algorithm; however, if a spectral library made of reference spectra is built, the PLS-DA algorithm gives semi-quantitative information.
Fliti, Tamim. "Le problème SAT : traitement dynamique et données minimales". Aix-Marseille 2, 1997. http://www.theses.fr/1997AIX22015.
Bakiri, Ali. "Développements informatiques de déréplication et de classification de données spectroscopiques pour le profilage métabolique d’extraits d'algues". Thesis, Reims, 2018. http://www.theses.fr/2018REIMS013.
The emergence of dereplication strategies as a new tool for the rapid identification of natural products from complex natural extracts has unveiled a great need for cheminformatic tools for the treatment and analysis of spectral data. The present thesis deals with the development of in silico dereplication methods based on Nuclear Magnetic Resonance (NMR). The first method, DerepCrud, is based on 13C NMR spectroscopy. It identifies the major compounds contained in a crude natural extract without any need for fractionation. The principle of the method is to compare the 13C NMR spectrum of the analyzed mixture to a series of 13C NMR chemical shifts of natural compounds stored in a local database. The second method, BCNet, is designed to exploit the richness of 2D NMR data (HMBC and HSQC) for the dereplication of natural products. BCNet traces back the network formed by the HMBC correlations of the molecules present in a natural extract, then isolates the groups of correlations belonging to individual molecules using a community detection algorithm. The molecules are identified by searching for these correlations within a locally constructed database that associates natural product structures with 2D NMR peak positions. Finally, the HSQC correlations of the molecules identified during the previous step are compared to the experimental HSQC correlations of the studied extract in order to improve the identification accuracy.
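The DerepCrud principle of matching observed 13C shifts against a local database can be sketched as follows. The mini-database, the shift values and the tolerance are illustrative assumptions (not the thesis's actual data or scoring function); the sketch only shows the matching-within-tolerance idea.

```python
# Hypothetical mini-database of 13C chemical shifts (ppm); values illustrative
DATABASE = {
    "caffeine": [27.9, 29.7, 33.6, 107.5, 141.6, 148.7, 151.7, 155.4],
    "vanillin": [55.9, 109.3, 114.4, 126.6, 129.9, 147.1, 151.7, 190.8],
}

def match_score(observed, reference, tol=0.5):
    """Fraction of reference peaks found in the observed spectrum within tol ppm."""
    hits = sum(any(abs(o - r) <= tol for o in observed) for r in reference)
    return hits / len(reference)

def dereplicate(observed, tol=0.5):
    """Rank database compounds by how well their shifts match the mixture."""
    return sorted(((match_score(observed, ref, tol), name)
                   for name, ref in DATABASE.items()), reverse=True)

# A mixture spectrum containing caffeine-like peaks plus unrelated signals
mixture = [27.8, 29.8, 33.5, 107.6, 141.5, 148.8, 151.6, 155.3, 62.0, 75.1]
ranking = dereplicate(mixture)
```

A real implementation would also weight peak intensities and handle overlapping signals; the score here is deliberately the simplest possible.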
Wojciechowski, Christine. "Apports de la chimiométrie à l'interprétation des données de la spectroscopie infrarouge : caractérisation des matières premières et matériaux d'emballage dans l'agro-alimentaire". Lille 1, 1998. https://pepite-depot.univ-lille.fr/LIBRE/Th_Num/1998/50376-1998-63.pdf.
Buchholz, Bert. "Abstraction et traitement de masses de données 3D animées". Phd thesis, Télécom ParisTech, 2012. http://pastel.archives-ouvertes.fr/pastel-00958339.
Buchholz, Bert. "Abstraction et traitement de masses de données 3D animées". Electronic Thesis or Diss., Paris, ENST, 2012. http://www.theses.fr/2012ENST0080.
In this thesis, we explore intermediary structures and their relationship to the employed algorithms in the context of photorealistic (PR) and non-photorealistic (NPR) rendering. We present new structures for rendering as well as new uses for existing structures. We present three original contributions in the NPR and PR domain. First, we present binary shading, a method to generate stylized black and white images, inspired by comic artists, using appearance and geometry in a graph-based energy formulation. The user can control the algorithm to generate images of different styles and representations. The second work allows the temporally coherent parameterization of line animations for texturing purposes. We introduce a spatio-temporal structure over the input data and an energy formulation for a globally optimal parameterization. Similar to the work on binary shading, the energy formulation provides important and simple control over the output. Finally, we present an extension to Point-Based Global Illumination, a method used extensively in movie production in recent years. Our work allows compressing the data generated by the original algorithm using quantization. It is memory-efficient and has only a negligible time overhead while enabling the rendering of larger scenes. The user can easily control the strength and quality of the compression. We also propose a number of possible extensions and improvements to the methods presented in the thesis.
Laloum, Eric. "Classification de calculs biliaires et urinaires à partir de spectres vibrationnels et de critères morphologiques : relation avec des données cliniques". Châtenay-Malabry, Ecole centrale de Paris, 1998. http://www.theses.fr/1998ECAP0591.
Gu, Co Weila Vila. "Méthodes statistiques et informatiques pour le traitement des données manquantes". Phd thesis, Conservatoire national des arts et metiers - CNAM, 1997. http://tel.archives-ouvertes.fr/tel-00808585.
Mouginot, Jérémie. "Traitement et analyse des données du radar MARSIS/Mars Express". Phd thesis, Grenoble 1, 2008. http://www.theses.fr/2008GRE10291.
This manuscript describes the processing and analysis of data from the Mars Advanced Radar for Subsurface and Ionosphere Sounding (MARSIS). The data processing mainly consists of compensating for the ionospheric distortion. As a beneficial by-product, this correction provides an estimate of the total electron content of the Martian ionosphere. Using these results, we study the Martian ionosphere in detail. We show that the remnant magnetic field changes the spatial distribution of electrons in the ionosphere by accelerating solar wind particles along the magnetic field lines. The corrected radargrams provide the opportunity to study the Martian polar deposits in detail. After developing numerical tools to pick the subsurface interfaces, we establish a volume balance of the Martian polar deposits. We show that the volume of the Martian ice sheets is equivalent to a global water layer about 20 m thick. In the last part, we study the reflectivity of the Martian surface. To do so, we extract the surface echo amplitude from each MARSIS pulse and then, after calibrating it, construct a global map of radar reflectivity (radar albedo). We describe the reflectivity map, first from a global point of view and then more regionally around Medusae Fossae and the south residual cap of Mars. We show that the reflectivity decreases with latitude; this observation is probably linked to the presence of permafrost in the shallow subsurface. Near the equator, we observe that the Medusae Fossae Formation corresponds to a dielectric constant of 2.4±0.5, which is characteristic of a porous and/or ice-rich terrain. In the same region, we show that the dielectric constant of Elysium and Amazonis Planitia is equal to 7±1, and we observe an interface at about 140±20 m below the surface; our main conclusion is that these results are characteristic of lava flows. Finally, the study of the south residual cap of Mars with a multi-layer reflectivity model shows that the CO2 layer covering this region is about 11±1.5 m thick.
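The link between the dielectric constants reported in this abstract and radar reflectivity can be illustrated with the textbook normal-incidence Fresnel coefficient. This is only an order-of-magnitude sketch for a lossless half-space seen from vacuum; the thesis's multi-layer model is more elaborate.

```python
import numpy as np

def fresnel_power_reflectivity(eps_r):
    """Normal-incidence power reflection coefficient of a lossless half-space
    of relative dielectric constant eps_r, seen from vacuum:
    R = |(1 - sqrt(eps_r)) / (1 + sqrt(eps_r))|^2."""
    n = np.sqrt(eps_r)                # refractive index of the half-space
    return abs((1 - n) / (1 + n)) ** 2

# Dielectric constants quoted in the abstract above
r_medusae = fresnel_power_reflectivity(2.4)  # porous and/or ice-rich terrain
r_elysium = fresnel_power_reflectivity(7.0)  # lava-flow-like terrain
```

The denser, lava-like terrain (eps = 7) reflects roughly four times more power at normal incidence than the porous, ice-rich one (eps = 2.4), which is the contrast a radar albedo map exploits.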
Mouginot, Jérémie. "Traitement et Analyse des Données du Radar MARSIS/Mars Express". Phd thesis, Université Joseph Fourier (Grenoble), 2008. http://tel.archives-ouvertes.fr/tel-00364323.
The processing of these data mainly consisted of compensating for the ionospheric distortion. This correction allowed an indirect measurement of the electron content of the ionosphere. Thanks to these measurements, we could study the Martian ionosphere in detail. We thus showed that the remnant magnetic field modifies the distribution of the electrons in the ionosphere by precipitating solar wind particles along the radial field lines.
The corrected radargrams allowed us to study the Martian polar caps in detail. We establish a volume balance of the polar caps of Mars using numerical tools developed for picking the interfaces. We thus show that the volume of the caps would correspond to a water layer about 20 m thick spread over the whole planet.
Finally, we study the reflectivity of the Martian surface. For this purpose, we extracted the surface echo amplitude of each MARSIS pulse and then, after calibrating these measurements, created a global map of the radar albedo. We describe this reflectivity map, first globally and then more locally around Medusae Fossae and the south residual cap. We show that the reflectivity decreases with latitude; this observation is most likely linked to the presence of permafrost towards the higher latitudes. Near the equator, we observe that the Medusae Fossae formations have a dielectric constant of 2.4±0.5, which is characteristic of a porous and/or ice-rich terrain. In the same region, we show that in the Elysium and Amazonis plains the dielectric constant is equal to 7±1, and we observe an interface in Amazonis Planitia at about 140±20 m; our conclusion is that these results are characteristic of lava flows. The study of the south residual cap of Mars, using a multi-layer reflectivity model, allows us to estimate the thickness of the CO2 covering this region at 11±1.5 m.
Royer, Jean-Jacques. "Analyse multivariable et filtrage des données régionalisées". Vandoeuvre-les-Nancy, INPL, 1988. http://www.theses.fr/1988NAN10312.
Fraisse, Bernard. "Automatisation, traitement du signal et recueil de données en diffraction x et analyse thermique : Exploitation, analyse et représentation des données". Montpellier 2, 1995. http://www.theses.fr/1995MON20152.
Woiselle, Arnaud. "Parcimonie et ses applications à des données de flot optique". Paris 7, 2010. http://www.theses.fr/2010PA077201.
This manuscript addresses inverse problems using 3D sparsifying transforms in the context of infrared video. We first survey the infrared imaging system and its associated problems, such as noise and missing data, and give a short state of the art of industrial image processing techniques. We then introduce the concept of sparsity and present common algorithms and representations used in this domain. The core of this research is the development of new 3D representations which are well adapted to representing surfaces or filaments in a 3D volume. Two innovative transforms are proposed, and an existing one (the fast curvelet transform) is modified to better suit the applications by reducing its redundancy. These tools are used in many inverse problems such as denoising, inpainting and video deinterlacing, and prove to be very useful and competitive. Finally, we propose a few concrete applications of sparse transforms to infrared video processing, such as small target detection or motion blur deconvolution.
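The sparsity-based restoration idea surveyed in this abstract can be illustrated with a minimal transform-domain thresholding sketch. This is a generic 1D Fourier-domain example on an invented signal, not one of the thesis's 3D transforms: the signal is assumed sparse in the transform domain, so keeping only the largest coefficients suppresses the noise.

```python
import numpy as np

def sparse_denoise(signal, keep=0.05):
    """Denoise by keeping only the largest-magnitude Fourier coefficients,
    i.e. enforcing sparsity in the transform domain (hard thresholding)."""
    coeffs = np.fft.rfft(signal)
    thresh = np.quantile(np.abs(coeffs), 1 - keep)
    coeffs[np.abs(coeffs) < thresh] = 0   # zero out the small coefficients
    return np.fft.irfft(coeffs, n=len(signal))

# Invented test signal: two sinusoids (sparse in Fourier) plus white noise
rng = np.random.default_rng(2)
t = np.linspace(0, 1, 512, endpoint=False)
clean = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 12 * t)
noisy = clean + rng.normal(0, 0.3, t.size)
denoised = sparse_denoise(noisy, keep=0.05)

mse_noisy = float(np.mean((noisy - clean) ** 2))
mse_denoised = float(np.mean((denoised - clean) ** 2))
```

The same keep-the-large-coefficients principle underlies curvelet or wavelet denoising; only the transform changes.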
Revenu, Benoît. "Anisotropies et polarisation du rayonnement fossile: méthode de détection et traitement de données". Phd thesis, Université Paris-Diderot - Paris VII, 2000. http://tel.archives-ouvertes.fr/tel-00001686.
The first part of this thesis is devoted to the standard model of cosmology. In particular, I present the polarization of the cosmic microwave background.
The polarized signal, whose intensity in most scenarios does not exceed 10% of the temperature fluctuations, is expected at the level of a few microkelvins. To measure it, cooled bolometers coupled to polarizers are often used. I show that there exist optimal arrangements of the detectors in the focal plane of the instrument that minimize the volume of the error box and yield decorrelated errors on the Stokes parameters characterizing the polarization.
The major source of noise in these measurements comes from the fluctuations of the thermal bath in which the bolometers are immersed, from the readout electronics, from gain instabilities and from the optics. These processes generate low-frequency drifts that translate into relative noise levels between detectors that are too large compared with the signal sought. I apply to the polarized data a simple method for subtracting the low-frequency drifts; it uses the redundancies inherent in the instrument's sky scanning strategy. The results show that these drifts can be subtracted down to the white noise level.
Finally, I describe the COSMOSOMAS experiment and present a preliminary analysis. It will provide maps of the polarized emission of our galaxy at frequencies of around 10 GHz.
Marcel, Patrick. "Manipulations de données multidimensionnelles et langages de règles". Lyon, INSA, 1998. http://www.theses.fr/1998ISAL0093.
This work is a contribution to the study of manipulations in data warehouses. In the first part, we present a state of the art of multidimensional data manipulation languages in systems dedicated to On-Line Analytical Processing (OLAP systems). We point out interesting combinations that have not been studied. These conclusions are used in the second part to propose a simple rule-based language for specifying typical treatments arising in OLAP systems. In a third part, we illustrate the use of the language to describe OLAP treatments in spreadsheets and to generate spreadsheet programs semi-automatically.
Bisgambiglia, Paul-Antoine. "Traitement numérique et informatique de la modélisation spectrale". Corte, 1989. http://www.theses.fr/1989CORT3002.
Sablayrolles, Jean-François. "Les néologismes du français contemporain : traitement théorique et analyses de données". Paris 8, 1996. http://www.theses.fr/1996PA081066.
The aim of the present work is to examine the place of neologisms in interlocution and the treatment of neology in linguistic theories. This examination leads to a comparison of the definitions given by some dictionaries and encyclopedias of the end of the last century and of today. Then the comparison of about a hundred typologies produced in these same periods leads to a consideration of the concepts set out in several contemporary linguistic models, whether structuralist or generativist. The diversity of the proposed solutions and the difficulties they encounter prompt an inquiry into the nature of the relevant linguistic unit. The notion of word, debated and insufficient, is abandoned in favor of the lexie, a functional unit memorized in competence; then it is the notion of newness that is examined from two points of view: new for whom, and new compared with what? The formation and examination of six corpora from different origins (miscellaneous items, weekly papers, a novel by R. Jorif, Le Burelain, the chronicles of Ph. Meyer, Le Monde, and neologisms collected in a lycée) allow the adopted definitions to be tested and confirm, on the main points, some hypotheses about the inequalities between members of the linguistic community facing the neological phenomenon. Not everybody creates as many neologisms, nor the same ones. Complementary analyses, prompted by the examination of the facts, consider the existence of circumstances propitious to neology, then of causes that lead the speaker to create a new lexie and about which the interpreter makes hypotheses. Finally, considerations connected with the circulation of discourse ("circulation du dire") specify the concept of newness and show the uncertainties that bear on the future of lexies created by a given speaker in given circumstances.
Lerat, Nadine. "Représentation et traitement des valeurs nulles dans les bases de données". Paris 11, 1986. http://www.theses.fr/1986PA112383.
This thesis deals with the representation and treatment of two cases of incomplete information in the field of databases: non-applicable null values and null values representing unknown objects. In the first part, queries on a single table containing non-applicable nulls are translated into a set of queries on conventional multitables. In the second part, unknown null values are represented by Skolem constants, and a method adapting a "chase" algorithm to this context allows queries to be evaluated when functional or inclusion dependencies are satisfied. Finally, it is shown that these two types of null values can be taken into account simultaneously.
Gassot, Oriane. "Imagerie SAR du régolithe d'un astéroïde : simulation et traitement des données". Thesis, Université Grenoble Alpes, 2020. http://www.theses.fr/2020GRALY016.
In recent years, surface-penetrating radars have been part of several space missions dedicated to studying bodies of the solar system. These radars, associated with repeat-pass observations and processing, can perform 3D radar tomography, which results in 3D imagery of the first tens of meters of the subsurface of planetary bodies. This technique can be used to better understand and model the formation processes and post-accretion evolution of asteroids. However, even though spaceborne SAR is a classical technique for the detection and reconstruction of planetary structures, the small-body geometry of observation calls for a reconsideration of the hypotheses usually formulated for Earth observation. Furthermore, in order to achieve the metric resolution necessary to study kilometre-sized asteroids with sufficient precision, the radar has to be ultra-wideband in range and in Doppler, which also questions the SAR synthesis models established for narrowband signals. As the radar geometry of study and configuration drives the instrument performance, and thus the science return of the mission, simulation of the radar signal and of the SAR synthesis needs to be developed while taking into account the specificity of the small-body geometry. Thus, my thesis aims at assessing the performance of the UWB SAR HFR, which is dedicated to the study of small bodies, with frequencies ranging from 300 to 800 MHz, by simulating the radar's return. After first creating realistic asteroid digital terrain models (DTMs), several surface scattering models were studied in order to select the model best suited to simulating the field scattered by the surface of an asteroid. The Kirchhoff Approximation (KA) was selected and applied to the generated DTMs, and was used to build SAR images which correctly locate the studied DTM and differentiate the rough areas of the terrain from the smooth ones.
Then, the Born Approximation (BA) was selected to model the field reflected by the asteroid subsurface and was found to correctly locate an inclusion below the surface of an asteroid. With a multipass geometry, tomography algorithms were applied to the BA results in order to improve the resolution in the third dimension of space, as well as the precision of the localisation of the inclusion. Finally, the performance of UWB scattering was studied and, unlike what was foreseen, UWB scattering generates only a small degradation of the resolution in range and in azimuth.
Pucheu, Chantal. "Etude de la goutte : traitement par l'allopurinol : données pharmacologiques et toxicologiques". Bordeaux 2, 1992. http://www.theses.fr/1992BOR2P003.
Spill, Yannick. "Développement de méthodes d'échantillonnage et traitement bayésien de données continues : nouvelle méthode d'échange de répliques et modélisation de données SAXS". Paris 7, 2013. http://www.theses.fr/2013PA077237.
The determination of the structures of proteins and other macromolecular complexes is becoming more and more difficult. The simplest cases have already been determined, and today's research in structural bioinformatics focuses on ever more challenging targets. To successfully determine the structure of these complexes, it has become necessary to combine several kinds of experiments and to relax the quality standards during acquisition. In other words, structure determination makes increasing use of sparse, noisy and inconsistent data. It is therefore becoming essential to quantify the accuracy of a determined structure. This quantification is superbly achieved by statistical inference. In this thesis, I develop a new sampling algorithm, Convective Replica-Exchange, designed to find probable structures more robustly. I also propose a proper statistical treatment for continuous data, such as Small-Angle X-ray Scattering (SAXS) data.
Héraud, Bousquet Vanina. "Traitement des données manquantes en épidémiologie : Application de l'imputation multiple à des données de surveillance et d'enquêtes". Phd thesis, Université Paris Sud - Paris XI, 2012. http://tel.archives-ouvertes.fr/tel-00713926.
Testo completoHéraud, Bousquet Vanina. "Traitement des données manquantes en épidémiologie : application de l’imputation multiple à des données de surveillance et d’enquêtes". Thesis, Paris 11, 2012. http://www.theses.fr/2012PA11T017/document.
Testo completoThe management of missing values is a common and widespread problem in epidemiology. The most common technique restricts the data analysis to subjects with complete information on the variables of interest, which can substantially reduce statistical power and precision and may also result in biased estimates. This thesis investigates the application of multiple imputation methods to manage missing values in epidemiological studies and surveillance systems for infectious diseases. The study designs to which multiple imputation was applied were diverse: a risk analysis of HIV transmission through blood transfusion, a case-control study on risk factors for Campylobacter infection, and a capture-recapture study to estimate the number of new HIV diagnoses among children. We then performed a multiple imputation analysis on data from a surveillance system for chronic hepatitis C (HCV) to assess risk factors for severe liver disease among HCV-infected patients who reported drug use. Within this study on HCV, we proposed guidelines for applying a sensitivity analysis in order to test the hypotheses underlying multiple imputation. Finally, we describe how we elaborated and applied an ongoing multiple imputation process for the French national HIV surveillance database, and evaluated and attempted to validate the multiple imputation procedures. Based on these practical applications, we worked out a strategy to handle missing data in surveillance databases, including the thorough examination of the incomplete database, the building of the imputation model, and the procedures to validate imputation models and examine the underlying multiple imputation hypotheses
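For context, once the m imputed data sets have each been analysed, multiple imputation pools the results with Rubin's rules; a minimal sketch (the estimates and variances below are purely illustrative numbers, not study results):

```python
import math

def pool_rubin(estimates, variances):
    """Pool the analyses of m imputed data sets with Rubin's rules:
    total variance T = W + (1 + 1/m) * B, where W is the mean
    within-imputation variance and B the between-imputation variance."""
    m = len(estimates)
    q_bar = sum(estimates) / m                              # pooled point estimate
    w = sum(variances) / m                                  # within-imputation
    b = sum((q - q_bar) ** 2 for q in estimates) / (m - 1)  # between-imputation
    return q_bar, w + (1 + 1 / m) * b

# illustrative: a log odds ratio estimated on m = 5 imputed data sets
q, t = pool_rubin([0.42, 0.45, 0.39, 0.47, 0.41],
                  [0.010, 0.011, 0.009, 0.012, 0.010])
se = math.sqrt(t)                                           # pooled standard error
```

The between-imputation term B is what restores the uncertainty that a single-imputation analysis would ignore.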
Marot, Julien. "Méthodes par sous-espaces et d'optimisation : application au traitement d'antenne, à l'analyse d'images, et au traitement de données tensorielles". Aix-Marseille 3, 2007. http://www.theses.fr/2007AIX30051.
Testo completoThis thesis is devoted to subspace-based and optimization methods, developed for array processing, image processing, and tensor signal processing. Definitions concerning an array processing problem and high-resolution methods are presented. We propose an optimization method applied to source detection in the presence of phase distortions for a high number of sensors. We propose fast subspace-based methods for the estimation of straight-line offset and orientation, and several optimization methods to estimate distorted contours, nearly straight or circular. We provide a state of the art of multiway signal processing: truncation of the HOSVD, lower-rank tensor approximation, and multiway Wiener filtering. We propose a procedure for nonorthogonal tensor flattening, using the method presented in the first part
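As background on subspace-based high-resolution array processing, a minimal MUSIC sketch for a uniform linear array — a generic textbook method, not the thesis's specific algorithms, with a purely illustrative scenario:

```python
import numpy as np

def music_spectrum(snapshots, n_sources, angles_deg, spacing=0.5):
    """MUSIC pseudo-spectrum for a uniform linear array.

    snapshots : (n_sensors, n_snapshots) complex sensor outputs
    spacing   : inter-sensor distance in wavelengths
    """
    n_sensors = snapshots.shape[0]
    cov = snapshots @ snapshots.conj().T / snapshots.shape[1]  # sample covariance
    _, vecs = np.linalg.eigh(cov)                 # eigenvalues in ascending order
    noise_sub = vecs[:, : n_sensors - n_sources]  # noise subspace
    sensor = np.arange(n_sensors)
    spec = []
    for a in np.deg2rad(np.asarray(angles_deg)):
        steer = np.exp(2j * np.pi * spacing * sensor * np.sin(a))
        proj = noise_sub.conj().T @ steer         # projection on the noise subspace
        spec.append(1.0 / np.real(proj.conj() @ proj))
    return np.array(spec)

# one narrowband source at +20 degrees on an 8-sensor array, light noise
rng = np.random.default_rng(0)
idx = np.arange(8)
steer = np.exp(2j * np.pi * 0.5 * idx * np.sin(np.deg2rad(20.0)))
sig = rng.standard_normal(200) + 1j * rng.standard_normal(200)
noise = 0.01 * (rng.standard_normal((8, 200)) + 1j * rng.standard_normal((8, 200)))
x = np.outer(steer, sig) + noise
grid = np.arange(-90.0, 90.5, 0.5)
doa = grid[int(np.argmax(music_spectrum(x, 1, grid)))]
```

The peak of the pseudo-spectrum occurs where a steering vector is orthogonal to the noise subspace, which is what gives subspace methods their "high resolution".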
Benzine, Mehdi. "Combinaison sécurisée des données publiques et sensibles dans les bases de données". Versailles-St Quentin en Yvelines, 2010. http://www.theses.fr/2010VERS0024.
Testo completoProtection of sensitive data is a major issue in the database field. Many software and hardware solutions have been designed to protect data both at rest and during query processing. Moreover, it is also necessary to provide a secure way to combine sensitive data with public data. To achieve this goal, we designed a new storage and processing architecture. Our solution combines a main server that stores public data and a secure server dedicated to the storage and processing of sensitive data. The secure server is a hardware token, basically a combination of (i) a secured microcontroller and (ii) a large external NAND Flash memory. Queries that combine public and sensitive data are split into two sub-queries, the first dealing with the public data, the second with the sensitive data. Each sub-query is processed on the server storing the corresponding data. Finally, the data obtained by computing the sub-query on public data is sent to the secure server to be combined with the result of the computation on sensitive data. For security reasons, the final result is built on the secure server. This architecture resolves the security problems, because all the computations dealing with sensitive data are done by the secure server, but it brings performance problems (little RAM, asymmetric cost of read/write operations, etc.). These problems are addressed by different query optimization strategies
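The split-execution principle can be sketched in a few lines. This is only a toy model: the table contents, names and in-memory "servers" below are hypothetical, and in the real architecture the sensitive part runs inside the hardware token:

```python
# Public server: non-sensitive reference data (hypothetical example table)
public_db = {"paris": "75", "lyon": "69"}

# Secure server: sensitive records, never shipped to the public side
sensitive_db = [
    {"patient": "a", "city": "paris", "diagnosis": "x"},
    {"patient": "b", "city": "lyon", "diagnosis": "y"},
]

def run_query(cities):
    # sub-query 1, evaluated on the public server
    public_part = {c: code for c, code in public_db.items() if c in cities}
    # sub-query 2 and the final combination, evaluated on the secure server:
    # public results are shipped in, never the other way around
    return [dict(row, dept=public_part[row["city"]])
            for row in sensitive_db if row["city"] in public_part]

result = run_query({"paris"})
```

The one-way data flow (public results in, nothing sensitive out) is the security invariant the architecture preserves.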
Michel, François. "Validation de systèmes répartis : symétries d'architecture et de données". Toulouse, INPT, 1996. http://www.theses.fr/1996INPT099H.
Testo completoBilodeau, Antoine. "Traitement de données observationnelles et développement d'un code de simulation pour SpIOMM et SITELLE". Master's thesis, Université Laval, 2016. http://hdl.handle.net/20.500.11794/26907.
Testo completoThe arrival of the Fourier-transform imaging spectrometer SITELLE at the Canada-France-Hawaii Telescope highlights the importance of an exposure time calculator allowing the instrument's users to plan their observations and observing proposals. A large part of my project is the development of a simulator able to reproduce the results obtained with SITELLE and its predecessor SpIOMM, installed at the Observatoire du Mont-Mégantic. The accuracy of the simulations is confirmed through comparison with SpIOMM data and SITELLE's first observations. The second part of my project is the spectral analysis of observational data. SpIOMM's large field of view allows us to study the properties (radial velocities and line intensities) of the ionized gas throughout Arp 72, a pair of interacting galaxies. For the first time, the rotation curve in the optical and the metallicity gradient are obtained for NGC 5996, the main galaxy in Arp 72. Both SpIOMM and SITELLE results are also shown for the spiral galaxy NGC 7320.
Tencé, Marcel. "Un système informatique pour l'acquisition et le traitement des données en microscopie électronique : réalisation, mise au point et application". Paris 11, 1987. http://www.theses.fr/1987PA112449.
Testo completoAs a consequence of its general layout, the new generation of scanning transmission electron microscopes (STEM) is particularly well suited to being interfaced with a computer, which serves several functions: control of the electron microscope parameters used for image acquisition, storage of the recorded data, and their a posteriori processing. STEM instruments present several specific characteristics which determine the technical choices for the digital system hardware and the required interfaces: the capability of simultaneously recording the data delivered by the different detection channels for one probe position on the specimen; the handling of the sequences of energy-filtered images necessary for elemental mapping; and the replication of images under a given set of working conditions, which is the key to applying cross-correlation techniques and estimating image parameters such as point resolution or signal-to-noise ratio. This work describes the hardware and the software which have been built to fulfill these goals. As a conclusion, we present three different applications of this system: i) on-line measurement of the fractal dimension of aggregates, ii) estimation of the spatial resolution in EELS chemical mapping, iii) quantitative development of these methods with a reasonable extrapolation of the detection limits toward the identification of a single atom
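As an illustration of the cross-correlation technique mentioned for replicated images, a generic FFT-based drift estimate between two noisy acquisitions of the same field (synthetic data, not microscope output):

```python
import numpy as np

# Estimate the relative drift between two noisy frames of the same scene by
# circular cross-correlation computed through the FFT: the correlation peak
# sits at the displacement that best aligns the two frames.
rng = np.random.default_rng(2)
base = rng.standard_normal((64, 64))
img1 = base + 0.3 * rng.standard_normal((64, 64))
img2 = np.roll(base, (3, -5), axis=(0, 1)) + 0.3 * rng.standard_normal((64, 64))
xcorr = np.fft.ifft2(np.fft.fft2(img1).conj() * np.fft.fft2(img2)).real
dy, dx = np.unravel_index(np.argmax(xcorr), xcorr.shape)
drift = ((dy + 32) % 64 - 32, (dx + 32) % 64 - 32)   # wrap to [-32, 32)
```

The same correlation peak, compared between two nominally identical frames, also gives a practical handle on signal-to-noise ratio.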
Malla, Noor. "Méthode de Partitionnement pour le traitement distribué et parallèle de données XML". Phd thesis, Université Paris Sud - Paris XI, 2012. http://tel.archives-ouvertes.fr/tel-00759173.
Testo completoToutain, Matthieu. "EdP géometriques pour le traitement et la classification de données sur graphes". Caen, 2015. https://hal.archives-ouvertes.fr/tel-01258738.
Testo completoPartial differential equations (PDEs) play a key role in the mathematical modeling of phenomena in applied sciences. In particular, in image processing and computer vision, geometric PDEs have been successfully used to solve many problems, such as image restoration, segmentation, inpainting, etc. Nowadays, more and more data are collected as graphs or networks, or as functions defined on these networks. Consequently, there is an interest in extending PDEs to process irregular data or graphs of arbitrary topology. The presented work follows this idea. More precisely, this work is about geometric partial difference equations (PdEs) for data processing and classification on graphs. In the first part, we propose a transcription of the normalized p-Laplacian on weighted graphs of arbitrary topology by using the framework of PdEs. This adaptation allows us to introduce a new class of p-Laplacian on graphs in non-divergence form. In this part, we also introduce a formulation of the p-Laplacian on graphs defined as a convex combination of gradient terms. We show that this formulation unifies and generalizes many existing difference operators defined on graphs. Then, we use this operator with the Poisson equation to compute generalized distances on graphs. In the second part, we propose to apply the operators on graphs we defined to the tasks of semi-supervised classification and clustering. We compare them to existing graph operators and to some state-of-the-art methods, such as Multiclass Total Variation clustering (MTV), clustering by non-negative matrix factorization (NMFR) and the INCRES method
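As a simple illustration of diffusion-style processing on graphs, a basic label-propagation scheme for semi-supervised classification — a generic method (a discrete heat-equation step per iteration), not the thesis's p-Laplacian operators, on an illustrative toy graph:

```python
import numpy as np

def label_propagation(w, seeds, n_iter=200):
    """Semi-supervised node classification by diffusion on a weighted graph:
    each unlabelled node repeatedly takes the degree-normalised weighted
    average of its neighbours' class scores, while labelled nodes stay
    clamped to their class."""
    n = w.shape[0]
    n_classes = max(seeds.values()) + 1
    f = np.zeros((n, n_classes))
    for node, c in seeds.items():
        f[node, c] = 1.0
    degree = w.sum(axis=1, keepdims=True)
    for _ in range(n_iter):
        f = w @ f / degree                  # average over neighbours
        for node, c in seeds.items():       # re-clamp the labelled nodes
            f[node] = 0.0
            f[node, c] = 1.0
    return f.argmax(axis=1)

# two triangles joined by one weak edge; one labelled node in each cluster
w = np.array([[0.0, 1.0, 1.0, 0.0, 0.0, 0.0],
              [1.0, 0.0, 1.0, 0.0, 0.0, 0.0],
              [1.0, 1.0, 0.0, 0.1, 0.0, 0.0],
              [0.0, 0.0, 0.1, 0.0, 1.0, 1.0],
              [0.0, 0.0, 0.0, 1.0, 0.0, 1.0],
              [0.0, 0.0, 0.0, 1.0, 1.0, 0.0]])
pred = label_propagation(w, {0: 0, 5: 1})
```

The fixed point of this iteration is a harmonic function on the unlabelled nodes, the p = 2 special case of the family of graph operators the thesis generalizes.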
Hardy, Emilie. "Procédures expérimentales et traitement des données associées pour la mission spatiale MICROSCOPE". Observatoire de Paris (1667-....), 2013. https://hal.science/tel-02095140.
Testo completoThe MICROSCOPE space mission aims at testing the Equivalence Principle with an accuracy of 1E-15. The instrument will be embarked on board a drag-free microsatellite, and consists of a differential electrostatic accelerometer composed of two test masses submitted to the same gravitational field but made of different materials. The accelerations applied to the masses to keep them relatively motionless are measured, and a violation of the universality of free fall will be demonstrated if they are found unequal. The accuracy of the measurement exploited for the test is limited by our a priori knowledge of the instrument's physical parameters. An in-orbit calibration is needed to characterize them finely in order to correct the measurement, and appropriate calibration procedures have been determined. In order to validate their performance, they have been implemented in a software tool dedicated to the calibration sessions, which simulates the instrument, the satellite and its environment. Other perturbations must be considered during the data analysis: numerical effects arise from the finite time span of the measurement. These effects have been evaluated, and a procedure has been determined to process the data with minimal numerical perturbation, in the nominal situation as well as in the case of missing data, which may amplify these effects. Finally, a simulator has been developed for the entire mission scenario in order to study the combination of instrumental and numerical simulations. This tool can simulate both the calibration sessions and the sessions for the test of the Equivalence Principle, and is therefore used to validate the protocol of measurement correction and analysis
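A generic example of the numerical effects a finite measurement span introduces is spectral leakage; a small illustration (not the MICROSCOPE pipeline — the tone and window choices are illustrative):

```python
import numpy as np

# A pure tone observed over a finite time span generally falls between FFT
# bins and leaks power across the whole spectrum; tapering the record (here
# with a Hann window) trades main-lobe width for far smaller side lobes.
n = 1024
t = np.arange(n)
x = np.sin(2 * np.pi * 10.37 * t / n)        # 10.37 cycles: an off-bin tone
rect = np.abs(np.fft.rfft(x))                # no taper (rectangular window)
hann = np.abs(np.fft.rfft(x * np.hanning(n)))
# leakage far from the peak, relative to each spectrum's own maximum
leak_rect = rect[100:].max() / rect.max()
leak_hann = hann[100:].max() / hann.max()
```

In a mission measuring a signal at a known orbital frequency, such leakage from other spectral lines directly pollutes the bin of interest, which is why the finite-span effects have to be evaluated and minimized.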
Dupuy, Marie-Pierre. "Essais de prospection électromagnétique radar en mines et au sol : interprétation et traitement des données". Bordeaux 1, 1987. http://www.theses.fr/1987BOR10504.
Testo completoRoumy, Aline. "Egalisation et décodage conjoints : méthodes turbo". Cergy-Pontoise, 2000. http://biblioweb.u-cergy.fr/theses/00CERG0100.pdf.
Testo completoDeveau, Matthieu. "Utilisation conjointe de données image et laser pour la segmentation et la modélisation 3D". Paris 5, 2006. http://www.theses.fr/2006PA05S001.
Testo completoThis thesis deals with combining a digital image with laser data to automate the 3D modeling of complex scenes. Image and laser data are acquired from the same point of view, with an image resolution greater than that of the laser. The work is structured around three main topics: pose estimation, segmentation and modeling. Pose estimation is based both on feature-point matching between the digital image and the laser intensity image, and on linear feature extraction. Data segmentation into geometric features is done through a hierarchical segmentation scheme in which image and laser data are combined. 3D modeling automation is studied through this hierarchical scheme, and a tool for semi-automated modeling is also derived from the hierarchical segmentation architecture. In the modeling step, we have focused on automatic modeling of cylinders with free-form profiles. The description is thus very general, covering planes, free-form profile cylinders, revolution objects, and meshes on complex parts
Khelil, Amar. "Elaboration d'un système de stockage et exploitation de données pluviométriques". Lyon, INSA, 1985. http://www.theses.fr/1985ISAL0034.
Testo completoFrom a hydrological point of view, the Lyon District Urban Area (CO.UR.LY.) is a 600 km2 area equipped with a sewerage system of roughly 2,000 km of pipes. Given the complexity of the area's sewerage network, it must be controlled by an accurate and reliable calculation system to avoid any negative consequences of its operation. The present computing system, SERAIL, allows an overall simulation of the functioning of the drainage/sewerage system. This model requires accurate rainfall information which was not previously available; a network of 30 rain gauges (with in situ cassette recording) was therefore set up within the Urban District Area in 1983. The research reported here involved three steps: 1) installing the network; 2) building a data checking and storage system; 3) analysing the data. The distinctive part of this work is the data analysis system, which makes it easy to extract and analyse any rainfall event of interest to the hydrologist. Two aims were defined: 1) to get a better understanding of the phenomena (point representations); 2) to build models. In order to achieve the second aim, it was necessary to consider the fitting of the proposed models and their limits, which led to the development of several other programmes for checking and comparison. For example, a complete analysis of a rainfall event is given with comments and conclusions
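The event-extraction step described above can be sketched generically — the thresholds and the toy hyetograph below are illustrative, not the actual rules of the CO.UR.LY. system:

```python
def extract_events(series, dry_gap=3, min_depth=2.0):
    """Split a rain-gauge series (one value per time step, in mm) into events:
    an event ends after `dry_gap` consecutive dry steps and is kept only if
    its total depth reaches `min_depth` mm."""
    events, current, dry = [], [], 0
    for v in list(series) + [0.0] * dry_gap:     # trailing zeros flush the last event
        if v > 0:
            current.append(float(v))
            dry = 0
        elif current:
            current.append(0.0)
            dry += 1
            if dry >= dry_gap:
                core = current[:-dry]            # drop the trailing dry steps
                if sum(core) >= min_depth:
                    events.append(core)
                current, dry = [], 0
    return events

# toy hyetograph: two real events separated by a trace of rain too small to keep
hyetograph = [0, 0, 1.2, 3.0, 0, 0.5, 0, 0, 0, 0, 0.4, 0, 0, 0, 4.1, 2.2, 0]
events = extract_events(hyetograph)
```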
Chmiel, Malgorzata. "Traitement de données géophysiques en réseaux denses en configuration sismique passive et active". Thesis, Université Grenoble Alpes (ComUE), 2017. http://www.theses.fr/2017GREAU009/document.
Testo completoIn geophysics, spatially dense arrays enhance the spatial and frequential characterization of the various waves propagating in the medium. A surface array is, of course, subject to strong surface waves, which heavily affect the processing of geophysical data acquired at ground level. They can be considered as noise to be suppressed, since they mask sub-surface information. However, they can be useful for near-surface imaging if they are well retrieved. In any case, their characterization is crucial in active and passive exploration geophysics. In passive microseismic monitoring, ambient surface noise consists of surface waves. The main goal of passive monitoring is to minimize the impact of surface waves on the actual microseismic data, since strong ambient surface noise lowers the sensitivity and efficiency of detection and location methods. Moreover, current location and detection methods usually require strong a priori information (e.g., a velocity model or a template). Active sources generate strong surface waves. In active seismics, current processing strategies often consist in manually picking surface wave arrivals in order to use or remove them. This is often a complex, time-consuming and ambiguous task, yet it is needed for near- and sub-surface imaging. Surface waves can be particularly difficult to retrieve with sparse arrays. We propose to apply the techniques of interferometry and beamforming (Matched Field Processing in particular) in the context of dense arrays. High trace density opens new possibilities in geophysical processing for both passive and active surveys. We show that the ambient noise can be exploited in microseismic monitoring to extract important information about the medium properties. Moreover, we develop a denoising approach to remove the noise sources at the surface and detect the microseismic events.
Furthermore, we propose an automatic detection and location method requiring minimal a priori information to retrieve the distribution of heterogeneities in the reservoir, in the vicinity of the well. In the active survey, we propose an interferometric, automatic approach to characterize the surface waves. We retrieve phase-sensitivity kernels of surface waves between any two points of the acquisition. These kernels are then used to obtain multi-mode dispersion curves, which make it possible to separate different modes of surface waves and, once inverted, provide near-surface information. The methodologies presented above all benefit from spatially dense arrays: dense arrays of sources or receivers enable alternative, innovative applications in geophysical processing
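As background on Matched Field Processing, a minimal Bartlett processor that locates a source by correlating the measured field with modeled phase replicas — the homogeneous velocity model, geometry and frequency below are all illustrative:

```python
import numpy as np

def mfp_locate(receivers, field, freq, candidates, c=2000.0):
    """Bartlett matched field processor: score each candidate source position
    by correlating the measured field with the phase delays predicted for it
    under a (here homogeneous) velocity model."""
    scores = []
    for cand in candidates:
        dist = np.linalg.norm(receivers - cand, axis=1)
        replica = np.exp(-2j * np.pi * freq * dist / c)   # modeled phases
        replica /= np.linalg.norm(replica)
        scores.append(np.abs(replica.conj() @ field) ** 2)
    return np.array(scores)

# 12 surface receivers, one buried source, monochromatic 100 Hz field
recv = np.column_stack([np.linspace(0.0, 550.0, 12), np.zeros(12)])
src = np.array([260.0, -80.0])
d = np.linalg.norm(recv - src, axis=1)
field = np.exp(-2j * np.pi * 100.0 * d / 2000.0)          # noise-free measurement
candidates = [np.array([x, -z]) for x in range(0, 560, 10)
              for z in range(10, 160, 10)]
best = candidates[int(np.argmax(mfp_locate(recv, field, 100.0, candidates)))]
```

Dense arrays help here precisely because more receivers sharpen the ambiguity surface and suppress sidelobe locations.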
Servière, Thierry. "Acquisition et traitement de données : application à la caractérisation des structures implantées F.E.T". Montpellier 2, 1986. http://www.theses.fr/1986MON20047.
Testo completoWurbel, Nathalie. "Dictionnaires et bases de connaissances : traitement automatique de données dictionnairiques de langue française". Aix-Marseille 3, 1995. http://www.theses.fr/1995AIX30035.
Testo completoDelanchy, Patrice. "Mesure et traitement de données thermiques : contribution à la connaissance des sols viticoles". Angers, 1990. http://www.theses.fr/1990ANGE0003.
Testo completoFernandez, Pernas Jesus. "Optimisation et automatisation du traitement informatique des données spectroscopiques cérébrales (proton simple volume)". Caen, 2002. http://www.theses.fr/2002CAEN3079.
Testo completoKaytoue, Mehdi. "Traitement de données numériques par analyse formelle de concepts et structures de patrons". Phd thesis, Université Henri Poincaré - Nancy I, 2011. http://tel.archives-ouvertes.fr/tel-00599168.
Testo completoKaytoue, Mehdi. "Traitement de données numériques par analyse formelle de concepts et structures de patrons". Electronic Thesis or Diss., Nancy 1, 2011. http://www.theses.fr/2011NAN10015.
Testo completoThe main topic of this thesis is the important problem of mining numerical data, and especially gene expression data. These data characterize the behaviour of thousands of genes in various biological situations (time, cell, etc.). A difficult task consists in clustering genes to obtain classes of genes with similar behaviour, assumed to be involved together in a biological process. Accordingly, we are interested in designing and comparing methods for knowledge discovery from biological data. We propose to study how the conceptual classification method called Formal Concept Analysis (FCA) can handle the problem of extracting interesting classes of genes. For this purpose, we have designed and experimented with several original methods based on an extension of FCA called pattern structures. Furthermore, we show that these methods can also enhance decision making in agronomy and crop health within the domain of information fusion
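In FCA, a formal concept is a maximal set of objects sharing a maximal set of attributes. A brute-force enumeration sketch on a toy discretized gene-expression context — illustrative only, and exponential in the number of objects, unlike the scalable pattern-structure algorithms the thesis develops:

```python
from itertools import combinations

def formal_concepts(context):
    """Enumerate all formal concepts (extent, intent) of a binary context
    given as {object: set_of_attributes}. A subset of objects is an extent
    iff the attributes its members share select exactly that subset back."""
    objects = list(context)
    all_attrs = set().union(*context.values())
    concepts = []
    for r in range(len(objects) + 1):
        for combo in combinations(objects, r):
            objs = set(combo)
            intent = (set.intersection(*(context[g] for g in objs))
                      if objs else set(all_attrs))
            extent = {g for g, attrs in context.items() if intent <= attrs}
            if extent == objs:
                concepts.append((frozenset(objs), frozenset(intent)))
    return concepts

# toy context: genes described by discretized expression properties
ctx = {"g1": {"high_t1", "low_t2"},
       "g2": {"high_t1", "low_t2"},
       "g3": {"high_t1", "high_t2"}}
concepts = formal_concepts(ctx)
```

Here the concept ({g1, g2}, {high_t1, low_t2}) is exactly the kind of gene class with shared behaviour that the thesis seeks, with pattern structures replacing the binary discretization by interval-valued descriptions.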
Grenier, Jean-Christophe. "Le glaucome primitif à angle ouvert : données actuelles et nouvelles perspectives de traitement". Bordeaux 2, 1995. http://www.theses.fr/1995BOR2P039.
Testo completoFleureau, Julien. "Intégration de données anatomiques issues d'images MSCT et de modèles électrophysiologique et mécanique du coeur". Rennes 1, 2008. http://www.theses.fr/2008REN1S049.
Testo completoThis work contributes to the systemic interpretation of clinical data for the analysis of regional cardiac function. A patient-specific approach, combining a realistic geometry and an electromechanical model of the ventricle, is proposed. The work is divided into two parts. 1) Two original methods for 3D segmentation are proposed to extract cardiac structures from MSCT imaging: the first, generic and multi-object, is based on a multi-agent framework; the second leads to a less detailed approximation of the heart geometry, combining region- and boundary-based approaches in a hybrid method. Both methods are evaluated on real data. 2) A mesoscopic model of the left ventricle is presented, coupling a discrete electrical model, a mechanical model integrating a visco-elastic law and solved by a finite element method, and a hydraulic model. Verification is carried out by a set of simulations, and a first parameter identification approach based on real patient data is presented
Damak, Mohamed. "Un logiciel de stockage, de traitement et de visualisation graphique et cartographique des données géologiques et géotechniques". Phd thesis, Grenoble 1, 1990. http://tel.archives-ouvertes.fr/tel-00785637.
Testo completo