Theses on the topic "Séparation en composantes"
Create an accurate citation in the APA, MLA, Chicago, Harvard and other styles
Consult the top 50 theses for your research on the topic "Séparation en composantes".
Next to each source in the reference list there is an "Add to bibliography" button. Press it, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Vancouver, Chicago, etc.
You can also download the full text of the academic publication in PDF format and read its abstract online whenever it is available in the metadata.
Explore theses on a wide variety of disciplines and organize your bibliography correctly.
Betoule, Marc. "Analyse des données du fond diffus cosmologique : simulation et séparation de composantes". Phd thesis, Observatoire de Paris, 2009. http://tel.archives-ouvertes.fr/tel-00462157.
The next generation of experiments dedicated to measuring the temperature and polarization anisotropies of the cosmic microwave background (CMB), inaugurated with the launch of the Planck satellite, will enable the detection and study of increasingly subtle effects. However, the superposition of astrophysical foreground emissions hinders the analysis of the cosmological signal and will be the main source of uncertainty in the forthcoming measurements. Improved modeling of foreground emissions and the development of statistical methods to extract the cosmological information from this contamination are thus crucial steps in the scientific analysis of incoming datasets. In this work we describe the development of the Planck Sky Model, a tool for modeling and simulating the sky emission. We then make use of these simulations to develop and evaluate statistical treatments of foreground emission. We explore the efficiency of wavelet analysis on the sphere (needlets) for spectral estimation on incomplete data with inhomogeneous contamination, and design a method for treating the small-scale contamination induced by point sources in the Planck and WMAP data. We also study the impact of foregrounds on our ability to detect the primordial gravitational waves predicted by inflation, and offer forecasts of the performance of future dedicated experiments.
Anthoine, Sandrine. "Plusieurs approches en ondelettes pour la séparation et déconvolution de composantes. Application à des données astrophysiques". Phd thesis, Ecole Polytechnique X, 2005. http://pastel.archives-ouvertes.fr/pastel-00001556.
Jarboui, Lina. "Méthodes avancées de séparation de sources applicables aux mélanges linéaires-quadratiques". Thesis, Toulouse 3, 2017. http://www.theses.fr/2017TOU30295/document.
In this thesis, we propose new Blind Source Separation (BSS) methods adapted to nonlinear mixing models. BSS consists in estimating the unknown source signals from their observed mixtures when little information is available on the mixing model. The methodological contribution of this thesis consists in considering the nonlinear interactions that can occur between sources by using the linear-quadratic (LQ) model. To this end, we developed three new BSS methods. The first method aims at solving the hyperspectral unmixing problem using a linear-quadratic model. It is based on the Sparse Component Analysis (SCA) method and requires the existence of pure pixels in the observed scene. For the same purpose, we propose a second hyperspectral unmixing method adapted to the linear-quadratic model. It corresponds to a Non-negative Matrix Factorization (NMF) method based on the Maximum A Posteriori (MAP) estimate, which takes the available prior information about the unknown parameters into account for a better estimation. Finally, we propose a third BSS method based on Independent Component Analysis (ICA), using Second Order Statistics (SOS) to process a particular case of the linear-quadratic mixture, the bilinear one.
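To make the linear-quadratic mixing model discussed in this abstract concrete, here is a minimal NumPy sketch of a bilinear two-source mixture. It is a generic illustration, not the thesis's method; the coefficient values and variable names are our own assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two non-negative sources (e.g. abundance-like signals), 1000 samples each
s = rng.uniform(0.0, 1.0, size=(2, 1000))

# Linear mixing matrix A and bilinear coefficients b (illustrative values)
A = np.array([[1.0, 0.6],
              [0.4, 1.0]])
b = np.array([0.3, 0.2])

# Linear-quadratic (here bilinear) observations: x = A s + b (s1 * s2)
x = A @ s + np.outer(b, s[0] * s[1])

print(x.shape)  # (2, 1000)
```

A purely linear BSS method applied to `x` would misattribute the cross-term `b (s1 * s2)`, which is why the methods summarized above model it explicitly.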
Anthoine, Sandrine. "Approches en ondelettes pour la séparation et la déconvolution simultanées de composantes : application à des données astrophysiques". Palaiseau, Ecole polytechnique, 2005. http://www.theses.fr/2005EPXX0019.
Khan, Amir Ali. "Séparation de sources thermométriques". Phd thesis, Grenoble INPG, 2009. http://www.theses.fr/2009INPG0160.
The aim of this research is to address the problem of leakage detection in embankment dikes. A significant flow of water through the dike, an indicator of internal erosion, results in a thermal anomaly. The measurement of temperature is therefore capable of revealing information linked to leakage. Fiber-optic distributed temperature sensors present an economically viable solution for recording spatio-temporal temperature data. The data are influenced by several factors, among them the leakages, the response of the near surface, the seasonal variations, the geomechanical environment, etc. In order to remove phenomena like precipitation, we propose a criterion based on skewness and kurtosis. Formulating leakage detection as a source separation problem, we first present a system based on SVD and ICA for separating the useful information relevant to the leakages from other "non-relevant" factors. Secondly, for the case of limited acquisitions, we propose a singularity detector exploiting the dissimilarity of daily temperature variations at different distances.
Karoui, Moussa Sofiane. "Méthodes de séparation aveugle de sources et application à la télédétection spatiale". Phd thesis, Université Paul Sabatier - Toulouse III, 2012. http://tel.archives-ouvertes.fr/tel-00790655.
Massouti, Bayan. "Contribution à la réalisation en temps réel de la séparation des composantes acoustique et turbulente d'un signal microphonique". Aix-Marseille 2, 1989. http://www.theses.fr/1989AIX22061.
Chaouchi, Chahinez. "Méthodes de séparation aveugle de sources non linéaires, étude du modèle quadratique 2X2". Toulouse 3, 2011. http://thesesups.ups-tlse.fr/1375/.
This thesis presents blind source separation (BSS) methods for a particular mixture model, the quadratic one. The first part presents the separating structure (basic and extended versions). The equilibrium points of the structure and their local stability are then studied. We propose two BSS methods: the first uses cumulants and the second is based on a maximum likelihood approach. We validate our results by numerical tests.
Benachir, Djaouad. "Méthodes de séparation aveugle de sources pour le démélange d'images de télédétection". Toulouse 3, 2014. http://thesesups.ups-tlse.fr/2809/.
In this thesis, we propose new blind source separation (BSS) methods intended for instantaneous linear mixtures, aimed at remote sensing applications. The first contribution is based on the combination of two broad classes of BSS methods: Independent Component Analysis (ICA) and Non-negative Matrix Factorization (NMF). We show how the physical constraints of our problem can be used to eliminate some of the indeterminacies related to ICA and to provide a first approximation of the endmember spectra and associated sources. These approximations are then used to initialize an NMF algorithm with the goal of improving them. The results we reached are satisfactory compared with the classical methods used in our tests. The second proposed method is based on sparsity as well as on geometrical properties. We begin by highlighting some properties facilitating the presentation of the hypotheses considered in the method. We then provide the broad lines of this approach, which is based on the determination of the two-source zones contained in a remote sensing image with the help of a correlation criterion. From the intersections of the lines generated by these two-source zones, we detail how to obtain the columns of the mixing matrix and the sought sources. The obtained results are quite attractive compared with those reached by several methods from the literature.
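The refine-by-NMF step mentioned in this abstract can be sketched with the standard Lee-Seung multiplicative updates. This is a generic textbook illustration on synthetic data, not the thesis algorithm; in particular, the ICA-based initialization is replaced here by a random one, and all dimensions and names are our own.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic non-negative data: X ~ W H
# (X: pixels x bands, W: abundances, H: endmember spectra; names are ours)
n_pix, n_bands, r = 200, 50, 3
W_true = rng.uniform(size=(n_pix, r))
H_true = rng.uniform(size=(r, n_bands))
X = W_true @ H_true

# Initial factor approximations (in the thesis, supplied by a first
# separation stage; here simply random)
W = rng.uniform(size=(n_pix, r))
H = rng.uniform(size=(r, n_bands))

eps = 1e-12  # guards against division by zero
err0 = np.linalg.norm(X - W @ H)
for _ in range(200):
    # Lee-Seung multiplicative updates for the Frobenius cost;
    # they keep W and H non-negative by construction
    H *= (W.T @ X) / (W.T @ W @ H + eps)
    W *= (X @ H.T) / (W @ H @ H.T + eps)
err1 = np.linalg.norm(X - W @ H)
print(err0, err1)  # the reconstruction error decreases
```

Because the updates are multiplicative, a good (e.g. ICA-derived) initialization directly shapes which local minimum the factorization refines, which is the point of the hybridization described above.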
Patanchon, Guillaume. "Analyse multi-composantes d'observations du fond diffus cosmologique". Phd thesis, Université Pierre et Marie Curie - Paris VI, 2003. http://tel.archives-ouvertes.fr/tel-00004512.
Akam, Bita Isidore Paul. "Sur une approche de l'analyse en composantes indépendantes à la compression des images multi composantes". Phd thesis, Université Joseph Fourier (Grenoble), 2007. http://tel.archives-ouvertes.fr/tel-00151987.
Texto completoBatany, Yves-Marie. "Séparation de signaux en mélanges convolutifs : contributions à la séparation aveugle de sources parcimonieuses et à la soustraction adaptative des réflexions multiples en sismique". Thesis, Paris Sciences et Lettres (ComUE), 2016. http://www.theses.fr/2016PSLEM093/document.
The recovery of correlated signals from their linear combinations is a challenging task and has many applications in signal processing. We focus on two problems: the blind separation of sparse sources and the adaptive subtraction of multiple events in seismic processing. A special focus is put on convolutive mixtures: for both problems, finite impulse response filters can indeed be estimated for the recovery of the desired signals. For instantaneous and convolutive mixing models, we address the necessary and sufficient conditions for the exact extraction and separation of sparse sources by using the L0 pseudo-norm as a contrast function. Equivalences between sparse component analysis and disjoint component analysis are investigated. For adaptive multiple subtraction, we discuss the limits of methods based on independent component analysis and we highlight their equivalence with Lp-norm-based methods. We investigate how other regularization parameters may have more influence on the estimation of the desired primaries. Finally, we propose to improve the robustness of adaptive subtraction by estimating the extracting convolutive filters directly in the curvelet domain. Computation and memory costs are limited by using the uniform discrete curvelet transform.
Ehsandoust, Bahram. "Séparation de Sources Dans des Mélanges non-Lineaires". Thesis, Université Grenoble Alpes (ComUE), 2018. http://www.theses.fr/2018GREAT033/document.
Blind Source Separation (BSS) is a technique for estimating individual source components from their mixtures at multiple sensors, where the mixing model is unknown. Although it has been mathematically shown that for linear mixtures, under mild conditions, mutually independent sources can be reconstructed up to accepted ambiguities, there is no such theoretical basis for general nonlinear models. This is why there are relatively few results in the literature in this regard in recent decades, and they are focused on specific structured nonlinearities. In the present study, the problem is tackled using a novel approach utilizing temporal information of the signals. The original idea followed for this purpose is to study a linear time-varying source separation problem deduced from the initial nonlinear problem by derivation. It is shown that already-proposed counter-examples showing the inefficiency of Independent Component Analysis (ICA) for nonlinear mixtures lose their validity when independence is considered in the sense of stochastic processes instead of simple random variables. Based on this approach, both theoretical results and algorithmic developments are provided. Even though these achievements are not claimed to be a mathematical proof of the separability of nonlinear mixtures, it is shown that, given a few assumptions satisfied in most practical applications, they are separable. Moreover, nonlinear BSS for two useful classes of source signals is also addressed: (1) spatially sparse sources and (2) Gaussian processes. Distinct BSS methods are proposed for these two cases, each of which has been widely studied in the literature and shown to be quite beneficial in modeling many practical applications. Concerning Gaussian processes, it is demonstrated that not all nonlinear mappings can preserve the Gaussianity of the input; for example, among polynomial functions, the only Gaussianity-preserving function is linear. This idea is utilized to propose a linearizing algorithm which, cascaded with a conventional linear BSS method, separates polynomial mixtures of Gaussian processes. Concerning spatially sparse sources, it is shown that such sources form manifolds in the observation space and can be separated once the manifolds are clustered and learned. For this purpose, the multiple-manifold learning problem has been studied in general; its results are not limited to the proposed BSS framework and can be employed in other topics requiring a similar treatment.
Assoul, Mohamed. "Analyse de la topographie des surfaces quelconques : étude et réalisation d'un appareillage de mesure 3D ; séparation des différentes composantes et quantification". Besançon, 1991. http://www.theses.fr/1991BESA2049.
Texto completoRafi, Selwa. "Chaînes de Markov cachées et séparation non supervisée de sources". Thesis, Evry, Institut national des télécommunications, 2012. http://www.theses.fr/2012TELE0020/document.
The restoration problem is commonly encountered in various domains, in particular in signal and image processing. It consists in retrieving original data from a set of observed ones. For multidimensional data, the problem can be solved using different approaches depending on the data structure, the transformation system and the noise. In this work, we first tackled the problem in the case of discrete data and a noisy model. In this context, the problem is similar to a segmentation problem. We exploited Pairwise and Triplet Markov chain models, which generalize Hidden Markov chain models. The interest of these models lies in the possibility of generalizing the computation procedure of the posterior probability, allowing one to perform Bayesian segmentation. We considered these methods for two-dimensional signals and applied the algorithms to restore old hand-written documents which have been scanned and suffer from show-through effects. In the second part of this work, we considered the restoration problem as a blind source separation problem. The well-known Independent Component Analysis (ICA) method requires the assumption that the sources be statistically independent. In practice, this condition is not always verified. Consequently, we studied an extension of the ICA model to the case where the sources are not necessarily independent. We introduced a latent process which controls the dependence and/or independence of the sources. The model we propose combines a linear instantaneous mixing model similar to that of ICA and a probabilistic model on the sources with hidden variables. In this context, we show how the usual independence assumption can be weakened to a conditional independence assumption using the technique of Iterative Conditional Estimation.
Fourt, Olivier. "Traitement des signaux à phase polynomiale dans des environnements fortement bruités : séparation et estimation des paramètres". Paris 11, 2008. http://www.theses.fr/2008PA112064.
The research work of this thesis deals with the processing of polynomial phase signals in heavily corrupted environments, whether noise with high levels or impulse noise, the latter modeled by alpha-stable laws. Noise robustness is a common requirement in signal processing, and while several algorithms are able to work with high Gaussian noise levels, the presence of impulse noise often leads to a great loss in performance or makes the algorithms unusable. Recently, some algorithms have been designed to support impulse-noise environments, but with one limit: their achievable results degrade in Gaussian noise situations, so one first needs to select the right method according to the kind of noise. One of the key points of this thesis was therefore building algorithms that are robust to the kind of noise, meaning that they have similar performance with Gaussian noise or alpha-stable noise. The second key point was building fast algorithms, something difficult to combine with robustness.
Thomas, Johan. "Algorithmes temporels rapides à point fixe pour la séparation aveugle de mélanges convolutifs et/ou sous-déterminés". Phd thesis, Université Paul Sabatier - Toulouse III, 2007. http://tel.archives-ouvertes.fr/tel-00268239.
In the second part, devoted to underdetermined instantaneous and convolutive mixtures, we also seek to extend the FastICA algorithm, based on the concept of differential separation. We propose differential whitening procedures for the observations that allow fixed-point iterations to be used to separate the so-called useful sources from one another. In the convolutive case, the contributions are estimated by means of a non-causal differential Wiener filter.
The third part, devoted to instantaneous mixtures of sources containing a large number of samples, first highlights a limitation of the FastICA algorithm, which computes statistics over all available samples at each optimization iteration. Since the kurtosis criterion is polynomial with respect to the extraction coefficients, we propose a procedure for identifying this polynomial that allows the optimization to be performed in a less computationally demanding space. This principle is then applied to underdetermined instantaneous mixtures.
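For reference, the classical kurtosis-based one-unit fixed-point iteration that FastICA builds on can be sketched as follows. This is a textbook illustration on a toy two-source mixture, not the differential or polynomial-identification variants proposed in the thesis; the mixing matrix and source distributions are our own choices.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy setting: two independent non-Gaussian sources, linear mixture
n = 5000
s = np.vstack([rng.uniform(-1.0, 1.0, n),   # sub-Gaussian source
               rng.laplace(0.0, 1.0, n)])   # super-Gaussian source
A = np.array([[1.0, 0.5],
              [0.3, 1.0]])
x = A @ s

# Center and whiten the observations
x = x - x.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(x))
z = E @ np.diag(1.0 / np.sqrt(d)) @ E.T @ x

# Kurtosis-based fixed-point iteration: w <- E[z (w^T z)^3] - 3 w,
# followed by renormalization of w to unit length
w = rng.normal(size=2)
w /= np.linalg.norm(w)
for _ in range(100):
    y = w @ z
    w = (z * y**3).mean(axis=1) - 3.0 * w
    w /= np.linalg.norm(w)

y = w @ z  # one source, recovered up to sign and scale
```

Each iteration averages over all n samples, which is exactly the per-iteration cost that the polynomial identification described above seeks to avoid.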
Carlos, Mickaël. "Dissociation induite par collisions d'hydrocarbures aromatiques polycycliques dans un piège à ions quadripolaire : Séparation des structures isomériques des composantes de la poussière cosmique". Thesis, Toulouse 3, 2020. http://www.theses.fr/2020TOU30232.
Polycyclic aromatic hydrocarbons (PAHs) are large carbonaceous molecules that are the subject of investigation in various fields, from astrochemistry to environmental science. Being a major constituent of cosmic dust, they play a key role in regions of star formation, where their infrared emission is excited by ultraviolet photons. However, the mechanisms involved in their formation remain poorly understood. In laboratory simulation experiments, as in meteorites, the mass m/z = 202.08, corresponding to the species C16H10, has been identified as a key species in the growth of these PAHs. This chemical formula includes several isomeric structures that need to be differentiated. We used the AROMA molecular analyzer to study the differentiation of C16H10 isomers by collision-induced dissociation (CID), in the case of fluoranthene and pyrene (two compact structures) and that of 9-ethynylphenanthrene (a structure with an alkyl group). The various experimental parameters controlling the CID were explored in order to determine optimized settings for our experimental conditions. At low collision energy, we have shown that the criterion of 50% dissociation of parent ions can be used to differentiate condensed from non-condensed structures, but it is more difficult to differentiate between condensed structures. The procedure has been applied to individual species, their mixtures, and more complex samples from a meteorite (Allende) and laboratory analogues of cosmic dust (dusty plasma). We have modeled the dynamics of the ions in the trap and extracted frequency and energy distributions of collisions. The competition with the dissociation rate was treated by a Monte Carlo method. The model fit of the pyrene dissociation curve quantified the parameter eta governing the transfer of kinetic energy into internal energy. The fluoranthene dissociation curve could be modeled using the same values for eta. Finally, we were able to determine the dissociation rate of 9-ethynylphenanthrene, which was not previously known.
Fuentes, Benoît. "L'analyse probabiliste en composantes latentes et ses adaptations aux signaux musicaux : application à la transcription automatique de musique et à la séparation de sources". Thesis, Paris, ENST, 2013. http://www.theses.fr/2013ENST0011/document.
Automatic music transcription consists in automatically estimating the notes in a recording through three attributes: onset time, duration and pitch. To address this problem, there is a class of methods based on the modeling of a signal as a sum of basic elements carrying symbolic information. Among these analysis techniques, one can find probabilistic latent component analysis (PLCA). The purpose of this thesis is to propose variants and improvements of PLCA so that it can better adapt to musical signals and thus better address the problem of transcription. To this aim, a first approach is to put forward new models of signals, instead of the inherent model of PLCA, expressive enough so they can adapt to musical notes having variations of both pitch and spectral envelope over time. A second aspect of this work is to provide tools to help the parameter estimation algorithm converge towards meaningful solutions through the incorporation of prior knowledge about the signals to be analyzed, as well as a new dynamic model. All the devised algorithms are applied to the task of automatic transcription. They can also be directly used for source separation, which consists in separating several sources from a mixture, and two applications are put forward in this direction.
Boulais, Axel. "Méthodes de séparation aveugle de sources et application à l'imagerie hyperspectrale en astrophysique". Thesis, Toulouse 3, 2017. http://www.theses.fr/2017TOU30318/document.
This thesis deals with the development of new blind separation methods for linear instantaneous mixtures applicable to astrophysical hyperspectral data sets. We propose three approaches to perform data separation. A first contribution is based on the hybridization of two existing blind source separation (BSS) methods: the SpaceCORR method, requiring a sparsity assumption, and a non-negative matrix factorization (NMF) method. We show that using SpaceCORR results to initialize the NMF improves the performance of the methods used alone. We then propose a first original method to relax the sparsity constraint of SpaceCORR. The method, called MASS (Maximum Angle Source Separation), is a geometric method based on the extraction of single-source pixels to achieve the separation of the data. We also studied the hybridization of MASS with the NMF. Finally, we propose another approach to relax the sparsity constraint of SpaceCORR. This original method, called SIBIS (Subspace-Intersection Blind Identification and Separation), is a geometric method based on the identification of intersections of subspaces generated by regions of the hyperspectral image. Under a sparsity assumption, these intersections allow one to achieve the separation of the data. The approaches proposed in this manuscript have been validated by experiments on simulated data and then applied to real data. The results obtained on our data are very encouraging and are compared with those obtained by methods from the literature.
Guidara, Rima. "Méthodes markoviennes pour la séparation aveugle de signaux et images". Toulouse 3, 2009. http://thesesups.ups-tlse.fr/705/.
This thesis presents new Markovian methods for the blind separation of instantaneous linear mixtures of one-dimensional signals and images. In the first part, we propose several improvements to an existing method for separating temporal signals. The new method simultaneously exploits the non-Gaussianity, autocorrelation and non-stationarity of the sources. Excellent performance is obtained for the separation of artificial mixtures of speech signals, and we succeed in separating real mixtures of astrophysical spectra. An extension to image separation is then proposed. The dependence between image pixels is modeled by non-symmetrical half-plane Markov random fields. Very good performance is obtained for the separation of artificial mixtures of natural images and noiseless observations from the Planck satellite. The results obtained with low-level noise are acceptable.
Becker, Hanna. "Débruitage, séparation et localisation de sources EEG dans le contexte de l'épilepsie". Thesis, Nice, 2014. http://www.theses.fr/2014NICE4075/document.
Electroencephalography (EEG) is a routinely used technique for the diagnosis and management of epilepsy. In this context, the objective of this thesis consists in providing algorithms for the extraction, separation, and localization of epileptic sources from the EEG recordings. In the first part of the thesis, we consider two preprocessing steps applied to raw EEG data. The first step aims at removing muscle artifacts by means of Independent Component Analysis (ICA). In this context, we propose a new semi-algebraic deflation algorithm that extracts the epileptic sources more efficiently than conventional methods as we demonstrate on simulated and real EEG data. The second step consists in separating correlated sources that can be involved in the propagation of epileptic phenomena. To this end, we explore deterministic tensor decomposition methods exploiting space-time-frequency or space-time-wave-vector data. We compare the two preprocessing methods using computer simulations to determine in which cases ICA, tensor decomposition, or a combination of both should be used. The second part of the thesis is devoted to distributed source localization techniques. After providing a survey and a classification of current state-of-the-art methods, we present an algorithm for distributed source localization that builds on the results of the tensor-based preprocessing methods. The algorithm is evaluated on simulated and real EEG data. Furthermore, we propose several improvements of a source imaging method based on structured sparsity. Finally, a comprehensive performance study of various brain source imaging methods is conducted on physiologically plausible, simulated EEG data.
Texto completoMoussaoui, Saïd. "Séparation de sources non-négatives : Application au traitement des signaux de spectroscopie". Phd thesis, Université Henri Poincaré - Nancy I, 2005. http://tel.archives-ouvertes.fr/tel-00012096.
Source separation is a fundamental problem in signal processing, one of whose strong assumptions is the statistical independence of the source signals. Given the overlap between pure spectra, their mutual correlation can become significant. In such a situation, applying a method based on the orthogonality assumption proves unsuccessful. Moreover, an analysis of the admissible solutions under the non-negativity constraint shows that this constraint alone yields a unique solution only in certain particular cases. These two observations motivate the development of two methods that jointly consider the independence assumption and the non-negativity information. The first method derives from the maximum-likelihood separation approach and the second is based on Bayesian inference. A numerical performance evaluation of the developed methods on synthetic data, together with the processing of experimental signals, highlights the advantages of these methods over the usual approaches and also reveals their limitations. Applications to real signals from three types of spectroscopy (infrared, Raman and ultraviolet-visible) illustrate the contribution of non-negative source separation to physico-chemical analysis.
Jurczynski, Pauline. "Étude de relations entre activités aux échelles cellulaires et macroscopiques. Application au système visuel de reconnaissance des visages". Electronic Thesis or Diss., Université de Lorraine, 2023. http://www.theses.fr/2023LORR0084.
Current techniques allow the exploration of brain electrical activity at different spatial scales, such as surface EEG, SEEG, or microelectrode recordings. EEG/SEEG offer a view of activity at the macroscopic scale, largely dominated by the synaptic activity of synchronised neuron populations more or less distant from the electrodes. At the micro level, the activity is a mixture of action potentials (APs) from neurons very close to the electrode and LFPs (mainly of synaptic origin) from surrounding populations as well as from distant populations through volume conduction. Identifying the different contributions that add up in the signal of each microelectrode is a prerequisite to studying the relationships between these components. The main objective of this work is to separate the different contributions in the signals recorded by the microelectrodes. The first step is to extract and classify the visible APs (despiking and spike sorting), and then to differentiate the LFP activities coming from distant regions from those coming from the surrounding population. This second part, which can only be approached at the microelectrode scale, can also take into account the spatial extent of the sources affecting these micro measurements, relying in particular on simultaneous SEEG recordings. The second objective is to study the components (spikes, LFP) previously identified using the methods introduced in this thesis, and their intra- and inter-scale relationships. Based on the recording protocols used in the team, the multi-scale activities of the structures involved in face recognition are described and their relationships are analysed, especially in the context of the fast periodic visual protocols used to study the mechanisms of face recognition in humans.
Wang, Wei-Min. "Estimation of component temperatures of vegetative canopy with Vis/NIR and TIR multiple-angular data through inversion of vegetative canopy radiative transfer model". Strasbourg, 2009. http://www.theses.fr/2009STRA6027.
The separation of component temperatures is the basic step in the application of two-source algorithms. Multi-angular thermal infrared measurements provide an opportunity to estimate component temperatures (namely, soil and vegetation temperatures) from remotely sensed data. The objective of this study is to explore the factors that affect the estimation of component temperatures and to propose new algorithms for inverting canopy radiative transfer models to compute them. The objectives of this dissertation include: (1) finding an appropriate candidate leaf angle distribution function for modeling and inversion, (2) evaluating the scaling behavior of Beer's law and its effect on the estimation of component temperatures, (3) proposing an analytical model of directional brightness temperature at the top of the canopy, (4) retrieving component temperatures with neural network and simplex algorithms. The effect of the leaf angle distribution function on the extinction coefficient, a key parameter in simulating radiative transfer through a vegetative canopy, is explored to improve the radiative transfer modeling. These contributions will enhance our understanding of the basic problems in thermal IR remote sensing and improve the simulation of the land surface energy balance. Further work can be conducted to continue the enhancement and application of the proposed algorithms to remote sensing images.
Vergès, Clara. "Searching for cosmological B-modes in the presence of astrophysical contaminants and instrumental effects". Thesis, Université de Paris (2019-....), 2020. http://www.theses.fr/2020UNIP7092.
Cosmology has undergone a considerable surge in the past twenty years and is now well established as a precision science. Accurate measurements of the temperature anisotropies of the Cosmic Microwave Background (CMB) opened an unprecedented window on the primordial Universe. Furthermore, existing measurements of CMB polarisation anisotropies have confirmed our understanding of early-universe physics and provided important consistency tests of the currently popular models. The next frontier in CMB science is the precise mapping of polarisation anisotropies, and the detection of a specific signature in the polarisation signal: the primordial B-modes. These observations will allow us to push further our knowledge of the Universe and its best-kept secrets, such as dark energy, dark matter and inflation. This detection is particularly challenging, as the signal we look for is very faint and overshadowed by various sources of galactic and extra-galactic contamination: polarised dust, synchrotron radiation and weak gravitational lensing, to name a few. The next generation of CMB polarisation experiments therefore needs not only to reach an unprecedented raw instrumental sensitivity, but also to be able to distinguish the signal from these contaminations. This calls for enhanced detection capabilities and new technologies, as well as novel data analysis methods. This work focuses on understanding whether this increased complexity of CMB experiments will effectively lead to better performance, and at what cost regarding the control of systematic effects. I develop new, detailed data models taking into account various instrumental parameters to ensure a better modelling of instrumental systematic effects, in the context of new-generation CMB polarisation experiments: POLARBEAR/Simons Array and the Simons Observatory.
To prepare for future experiments, I propose an extension of the component separation forecasting framework that takes instrumental systematic effects into account. I demonstrate its capabilities as a key tool to forecast the performance of future experiments and to inform calibration strategies to achieve the required performance.
Vansyngel, Flavien. "The blind Bayesian approach to Cosmic Microwave Background data analysis". Thesis, Paris 6, 2014. http://www.theses.fr/2014PA066609/document.
The main topic of this thesis is the analysis of Cosmic Microwave Background (CMB) data. In particular, I present a method, Bayesian Independent Component Analysis (BICA), that performs both CMB component separation and CMB power spectrum inference. I begin by presenting the basics of our understanding of the CMB emission and highlight the need for a robust error modelling at the map level. Then I present the main source of errors in the CMB products, namely the foregrounds. Component separation is a crucial and delicate step in CMB data analysis, and I review several methods aiming at cleaning the CMB from foregrounds. Then I present BICA. The method is formulated in a blind Bayesian framework. The posterior distribution provides an inference of the CMB map and power spectrum from the observation maps. Thus, the errors on the reconstruction include the uncertainties due to the presence of foregrounds in the data. By considering particular choices of prior and sampling scheme, I show how the Bayesian formulation of component separation provides a unifying framework of which previous methods are special cases. I present the results of BICA when applied to both simulated data and 2013 Planck data. This method is able to reconstruct the CMB map and power spectrum on a large fraction of the sky. The main contributions of this thesis are to provide: 1) a CMB power spectrum on a large multipole range whose errors take the presence of foregrounds into account without assuming physical models, 2) a CMB map inference together with an error model including both noise and foreground residuals.
Puigt, Matthieu. "Méthodes de séparation aveugle de sources fondées sur des transformées temps-fréquence : application à des signaux de parole". Toulouse 3, 2007. http://thesesups.ups-tlse.fr/217/.
Several time-frequency (TF) blind source separation (BSS) methods have been proposed in this thesis. At the output of the proposed systems, a contribution of each source is estimated using only the mixed signals. All the methods proposed in this manuscript find tiny TF zones where only one source is active and estimate the mixing parameters in these zones. These approaches are particularly well suited to non-stationary sources (speech, music). We first studied and improved linear instantaneous methods based on variance or correlation criteria that had been previously proposed by our team. They yield excellent performance for speech signals and can also separate spectra from astrophysical data. However, the nature of the mixtures that they can process limits their application fields, so we have extended these approaches to more realistic mixtures. The first extensions consider attenuated and delayed mixtures of sources, which corresponds to mixtures in an anechoic chamber. They require less restrictive sparsity assumptions than some approaches previously proposed in the literature, while addressing the same type of mixtures. We have studied the contribution of clustering techniques to our approaches and have achieved good performance for mixtures of speech signals. Lastly, a theoretical extension of these methods to general convolutive mixtures is described. It needs strong sparsity hypotheses, and we have to solve the classical indeterminacies of frequency-domain BSS methods.
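The single-source-zone idea behind such methods can be sketched in a few lines: in TF zones where only one source is active, the ratio of the two mixture spectra is constant and equals a ratio of mixing coefficients. The sources, mixing matrix and detection threshold below are invented for the illustration (linear instantaneous case only, nothing like the thesis's anechoic or convolutive extensions):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4096
s1 = np.zeros(n); s2 = np.zeros(n)
s1[:2048] = rng.standard_normal(2048)      # source 1 active early only
s2[2048:] = rng.standard_normal(2048)      # source 2 active late only
A = np.array([[1.0, 1.0], [0.5, -2.0]])    # instantaneous mixing matrix
x1, x2 = A @ np.vstack([s1, s2])

frame = 256
ratios = []
for start in range(0, n - frame, frame):
    X1 = np.fft.rfft(x1[start:start + frame])
    X2 = np.fft.rfft(x2[start:start + frame])
    mask = np.abs(X1) > 1e-6
    r = X2[mask] / X1[mask]
    # single-source zone: the ratio is (almost) constant across frequencies
    if r.size and np.std(r) < 0.05 * np.abs(np.mean(r)) + 1e-9:
        ratios.append(np.real(np.mean(r)))

cols = sorted(set(np.round(ratios, 2)))
print(cols)   # estimates of a21/a11 and a22/a12, here 0.5 and -2.0
```

Real speech sources are of course only approximately disjoint in the TF plane, which is why the actual methods rely on carefully chosen variance or correlation criteria to detect such zones.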
Umiltà, Caterina. "Development and assessment of a blind component separation method for cosmological parameter estimation". Thesis, Paris 6, 2017. http://www.theses.fr/2017PA066453/document.
The Planck satellite observed the whole sky at various frequencies in the microwave range. These data are of high value to cosmology, since they help in understanding the primordial universe through the observation of the cosmic microwave background (CMB) signal. To extract the CMB information, astrophysical foreground emissions need to be removed via component separation techniques. In this work I use the blind component separation method SMICA to estimate the CMB angular power spectrum, with the aim of using it for the estimation of cosmological parameters. In order to do so, small-scale limitations such as the residual contamination by unresolved point sources and the noise need to be addressed. In particular, the point sources are modelled as two independent populations with flat angular power spectra: by adding this information, the SMICA method is able to recover the joint emission law of the point sources. Auto-spectra derived from a single sky map have a noise bias at small scales, while cross-spectra show no such bias. This is particularly true in the case of cross-spectra between data splits, corresponding to sky maps with the same astrophysical content but different noise properties. I thus adapt SMICA to use data-split cross-spectra only. The CMB spectra obtained from simulations and Planck 2015 data are used to estimate cosmological parameters. Results show that this estimation can be biased if the shape of the (weak) foreground residuals in the angular power spectrum is not well known. Finally, I also present results of the study of a modified gravity model called Induced Gravity.
Maurandi, Victor. "Algorithmes pour la diagonalisation conjointe de tenseurs sans contrainte unitaire. Application à la séparation MIMO de sources de télécommunications numériques". Thesis, Toulon, 2015. http://www.theses.fr/2015TOUL0009/document.
This thesis develops joint diagonalization methods for matrices and third-order tensors for MIMO source separation in the field of digital telecommunications. After a state of the art, the motivations and the objectives are presented. Then the joint diagonalization and blind source separation problems are defined and a link between the two fields is established. Thereafter, five Jacobi-like iterative algorithms based on an LU parameterization are developed. For each of them, we propose to derive the diagonalization matrix by optimizing an inverse criterion. Two ways are investigated: minimizing the criterion directly, or assuming that the elements of the considered set are almost diagonal. Regarding the derivation of the parameters, two strategies are implemented: one consists in estimating each parameter independently, the other in the independent derivation of couples of well-chosen parameters. Hence, we propose three algorithms for the joint diagonalization of complex symmetric or Hermitian matrices. The first relies on searching for the roots of the derivative of the criterion, the second on the search for a minor eigenvector, and the last on a gradient descent enhanced by computation of the optimal adaptation step. In the framework of joint diagonalization of symmetric, INDSCAL or non-symmetric third-order tensors, we have developed two algorithms. For each of them, the parameters are derived by computing the roots of the derivative of the considered criterion. We also show the link between the joint diagonalization of a set of third-order tensors and the canonical polyadic decomposition of a fourth-order tensor, and confront both methods through numerical simulations. The good behavior of the proposed algorithms is illustrated by means of computer simulations. Finally, they are applied to the source separation of digital telecommunication signals.
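The link between joint diagonalization and source separation can be illustrated in the classical exact two-matrix special case, which is solved in closed form by a generalized eigendecomposition (this is not the thesis's Jacobi-like LU algorithms, which handle whole sets of matrices or tensors; the matrices below are made up):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 3))            # unknown non-unitary mixing matrix
D1 = np.diag([1.0, 2.0, 3.0])              # diagonal "source statistics"
D2 = np.diag([3.0, 1.0, 2.0])
C1, C2 = A @ D1 @ A.T, A @ D2 @ A.T        # two covariance-like matrices

# Columns of A are the eigenvectors of C1 C2^{-1}, up to scale and order,
# so the inverse of the eigenvector matrix jointly diagonalizes C1 and C2.
_, V = np.linalg.eig(C1 @ np.linalg.inv(C2))
B = np.linalg.inv(np.real(V))              # candidate joint diagonalizer

def off_diag_ratio(M):
    """Energy of off-diagonal entries relative to the whole matrix."""
    off = M - np.diag(np.diag(M))
    return np.linalg.norm(off) / np.linalg.norm(M)

print(off_diag_ratio(B @ C1 @ B.T), off_diag_ratio(B @ C2 @ B.T))
```

With more than two matrices (or with estimation noise) no exact solution exists in general, which is precisely where iterative Jacobi-like algorithms such as those of the thesis come in.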
Boudet, Samuel. "Filtrage d'artefacts par analyse multicomposante de l'électroencéphalogramme de patients épileptiques". Thesis, Lille 1, 2008. http://www.theses.fr/2008LIL10156/document.
Electroencephalography (EEG) consists in measuring brain electrical activity thanks to electrodes located on the scalp surface. This technique is mainly used for the diagnosis of epilepsy. Some grapho-elements, like slow waves and spike waves, can appear on the EEG, enabling the neurologist to detect signs of epilepsy. Unfortunately, this activity can be highly contaminated by parasitic signals called artifacts. These artifacts have as their main origins the ocular activity, the muscular activity, the cardiac rhythm and slight electrode displacements. The frequencies of pathological grapho-elements overlap those of artifacts, and it is then required to use spatial filters based on source separation. The principle is to determine a set of cerebral sources and a set of artifact sources. The artifact sources are then cancelled and the cerebral ones are used to rebuild the signal. This thesis presents several methods using both spatial and frequential filters, making the EEG filtering automated. A quantitative approach to filtering validation is defined, which enables the author to choose the most efficient one, called Adaptive Filtering by Optimal Projection (AFOP). According to the neurologist, tests on clinical recordings of epileptic patients prove AFOP's efficiency at cancelling most artifact types as well as at respecting cerebral rhythms.
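The spatial-filtering principle described here (decompose the multichannel recording into components, cancel those identified as artifacts, rebuild the channels from the rest) can be sketched with a plain SVD standing in for the thesis's AFOP method; the signals, mixing matrix and artifact-detection rule below are all synthetic assumptions:

```python
import numpy as np

n = 1000
t = np.arange(n) / 250.0
brain = np.sin(2 * np.pi * 10 * t)            # 10 Hz cerebral rhythm
blink = np.zeros(n); blink[400:450] = 5.0     # ocular artifact burst
mixing = np.array([[1.0, 1.0], [1.0, -1.0], [2.0, 0.0]])  # 3 "electrodes"
eeg = mixing @ np.vstack([brain, blink])

U, s, Vt = np.linalg.svd(eeg, full_matrices=False)
# Flag as artifact the component whose energy concentrates in the burst window
frac = (Vt[:, 400:450] ** 2).sum(axis=1) / (Vt ** 2).sum(axis=1)
bad = int(np.argmax(frac))
keep = [i for i in range(len(s)) if i != bad]
clean = (U[:, keep] * s[keep]) @ Vt[keep]     # rebuild without the artifact

corr = np.corrcoef(clean[0], brain)[0, 1]     # cleaned channel vs. rhythm
print(round(abs(corr), 3))
```

Real EEG offers no such clean orthogonality between cerebral and artifact topographies, which is why AFOP combines the spatial projection with frequential filtering and a quantitative validation procedure.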
Dupont, M. "Tomographie spectrale à comptage de photons : développement du prototype PIXSCAN et preuve de concept". Phd thesis, Aix-Marseille Université, 2014. http://tel.archives-ouvertes.fr/tel-01019735.
Sorel, Maud. "Les fluctuations du fond diffus extragalactique et ses avant-plans, de l'infrarouge au domaine millimétrique". Phd thesis, Université Paris Sud - Paris XI, 2005. http://tel.archives-ouvertes.fr/tel-00009750.
Hattay, Jamel. "Wavelet-based lifting structures and blind source separation : applications to digital in-line holography". Rouen, 2016. http://www.theses.fr/2016ROUES016.
The present thesis develops specific processes, in the wavelet domain, for certain digital holography applications. We mainly use so-called blind source separation (BSS) techniques to solve numerous digital holography problems, namely twin-image suppression and the real-time coding and transmission of holograms. First, we give a brief introduction to the in-line configuration of digital holography in flow measurements: an explanation of the recording step and a study of the two reconstruction approaches used during this thesis. We then emphasize the two well-known obstacles of digital hologram reconstruction, namely the determination of the best focus plane and the twin-image removal. Secondly, we propose a meticulous scrutiny of a tool based on blind source separation, enhanced by a multiscale decomposition algorithm, which enables the blind separation of convolutively mixed images. The suggested algorithm uses a wavelet-based transform, called the Adaptive Quincunx Lifting Scheme (AQLS), coupled with an appropriate unmixing algorithm. The resulting deconvolution process is made up of three steps. In the first step, the convolutively mixed images are decomposed by AQLS. Then, a separation algorithm is applied to the most relevant component to unmix the transformed images. The unmixed images are thereafter reconstructed using the inverse of the AQLS transform. In a subsequent part, we adopt the blind source separation technique in the wavelet domain to solve several problems related to digital holography. In this context, we present two main contributions for digital in-line hologram processing. The first contribution consists in an entropy-based method to retrieve the best focus plane, a crucial issue in digital hologram reconstruction. The second contribution consists in a new approach to remove a common unwanted artifact in holography called the twin image.
The latter contribution is based on the blind source separation technique, and the resulting algorithm is made up of two steps: an Adaptive Quincunx Lifting Scheme (AQLS) based on the wavelet packet transform and a statistical unmixing algorithm based on the Independent Component Analysis (ICA) tool. The role of the AQLS is to maximize the sparseness of the input holograms. Since the convolutive formalism is retained in digital in-line holography, the BSS-based tool is extended and coupled with the wavelet-based AQLS to fulfill the deconvolution task. Experimental results confirm that convolutive blind source separation is able to discard the unwanted twin image from digital in-line holograms. The last contribution of this part consists in measuring the thickness of a ring. This ring is obtained from an improved reconstructed image of a hologram containing a vapor bubble created by thermal coupling between a laser pulse and nanoparticles in a droplet of liquid. The last part introduces the tele-holography concept. Once the image of the object is perfectly reconstructed, the next objective is to code and transmit the reconstructed image for an interactive flow of exchange between a given laboratory, where the holograms are recorded, and a distant research partner. We propose a tele-holography process that involves the wavelet transform for the lossless compression and transmission of digital holograms. The concept of tele-holography is motivated by the fact that digital holograms can be considered as 2D images carrying the depth information of 3D objects. Besides, we propose a quincunx embedded zero-tree wavelet coder (QEZW) for scalable transmission. Given the transmission channel capacity, it drastically reduces the bit rate of the holography transmission flow. Numerous experimental results carried out on real digital holograms show that the proposed lossless compression process yields a significant improvement in compression ratio and total compressed size.
These experiments reveal the capabilities of the proposed coder in terms of achievable bitrate for progressive transmission.
Baque, Mathieu. "Analyse de scène sonore multi-capteurs : un front-end temps-réel pour la manipulation de scène". Thesis, Le Mans, 2017. http://www.theses.fr/2017LEMA1013/document.
The context of this thesis is the development of spatialized audio (5.1 contents, Dolby Atmos...) and particularly of 3D audio. Among the existing 3D audio formats, Ambisonics and Higher Order Ambisonics (HOA) allow a homogeneous spatial representation of a sound field and support basic manipulations, like rotations or distortions. The aim of the thesis is to provide efficient tools for Ambisonics and HOA sound scene analysis and manipulation. A real-time implementation and robustness to reverberation are the main constraints to deal with. The implemented algorithm is based on a frame-by-frame Independent Component Analysis (ICA), which decomposes the sound field into a set of acoustic contributions. Then a Bayesian classification step is applied to the extracted components to identify the real sources and the residual reverberation. Directions of arrival of the sources are extracted from the mixing matrix estimated by ICA, according to the ambisonic formalism, and a real-time cartography of the sound scene is obtained. Performance has been evaluated in different acoustic environments to assess the influence of several parameters such as the ambisonic order, the frame length or the number of sources. Accurate results in terms of source localization and source counting have been obtained for frame lengths of a few hundred milliseconds. The algorithm is exploited as a pre-processing step for a speech recognition prototype and allows a significant increase of the recognition results in far-field conditions and in the presence of noise and interfering sources.
Wei, Tianwen. "Analyse de la convergence de l'algorithme FastICA : échantillon de taille finie et infinie". Thesis, Lille 1, 2013. http://www.theses.fr/2013LIL10030/document.
The FastICA algorithm is one of the most popular algorithms in the domain of Independent Component Analysis (ICA). There exist two versions of FastICA: one that corresponds to the ideal case where the sample size is infinite, and one that deals with the practical situation where a sample of finite size is available. In this thesis, we make a detailed study of the rate of convergence of both versions of the FastICA algorithm, and we establish five criteria for the choice of the non-linearity function. In the first three chapters, we introduce the problem of ICA and revisit the classical results. In Chapter 4, we study the convergence of empirical FastICA and the link between the limit of empirical FastICA and the critical points of the empirical contrast function. In Chapter 5, we use the technique of M-estimators to obtain the asymptotic normality and the asymptotic covariance matrix of the FastICA estimator. This allows us to derive four criteria for choosing the non-linearity function. A fifth criterion for this choice, based on the rate of convergence of the empirical FastICA algorithm, is studied in Chapter 6. At the end of each chapter, we provide numerical simulations that validate our theoretical results.
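The empirical (finite-sample) algorithm studied here can be sketched as a one-unit fixed-point iteration with the tanh non-linearity on whitened data; the synthetic sources, the initial vector and the stopping rule below are assumptions made for the illustration, not the thesis's analysis:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 20000
S = np.vstack([np.sign(rng.standard_normal(n)),           # binary source
               rng.uniform(-np.sqrt(3), np.sqrt(3), n)])  # uniform source
X = np.array([[2.0, 1.0], [1.0, 1.0]]) @ S                # observed mixtures

# Whiten: center, then decorrelate and rescale to unit variance
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(X))
Z = (E * d ** -0.5).T @ X

# Fixed point: w <- E{z g(w'z)} - E{g'(w'z)} w with g = tanh, then renormalize
w = np.array([1.0, 0.5]); w /= np.linalg.norm(w)
for _ in range(200):
    wz = w @ Z
    w_new = (Z * np.tanh(wz)).mean(axis=1) - (1 - np.tanh(wz) ** 2).mean() * w
    w_new /= np.linalg.norm(w_new)
    converged = abs(w_new @ w) > 1 - 1e-12   # direction stopped moving
    w = w_new
    if converged:
        break
y = w @ Z   # estimated component: one source up to sign and scale
```

The choice of non-linearity g (here tanh) is exactly the design question the five criteria of the thesis address, since it governs both the asymptotic variance and the convergence rate of the estimator.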
Dupont, Mathieu. "Tomographie spectrale à comptage de photons : développement du prototype PIXSCAN et preuve de concept". Thesis, Aix-Marseille, 2014. http://www.theses.fr/2014AIXM4011/document.
In the field of preclinical X-ray tomography, spectral tomography is actively explored. The aims of spectral tomography are the characterisation of tissues and contrast agents, together with the quantification of the latter and the enhancement of contrast between soft tissues. This is achieved by exploiting spectral information (i.e. energy) and not only the detected quantities of X-ray photons. The interest in spectral tomography is reinforced by the arrival of hybrid pixel cameras like XPAD, because of their ability to select photons according to their energy. The XPAD3 camera, the third version of XPAD, is built to be used in the micro-CT demonstrator PIXSCAN fully developed at CPPM. In this context, this thesis has two goals: a contribution to the development of the PIXSCAN and the realisation of a proof of concept of spectral tomography in the PIXSCAN. The first goal is achieved by developing the data acquisition system of the PIXSCAN. To accomplish the second, we perform spectral tomography by implementing component separation in order to isolate the photoelectric, Compton and contrast agent contributions. This work begins with the characterisation of this method and ends with a proof of concept on real data acquired by the PIXSCAN.
Hermosilla-Lara, Sébastien. "Amélioration d'une caméra photothermique par traitements d'images adaptés à la détection de fissures débouchantes". Cachan, Ecole normale supérieure, 2002. http://www.theses.fr/2002DENS0042.
Peña Manchón, Francisco Javier de la. "Advanced methods for Electron Energy Loss Spectroscopy core-loss analysis". Paris 11, 2010. http://www.theses.fr/2010PA112379.
Modern analytical transmission electron microscopes are able to gather a large amount of information from the sample in the form of multi-dimensional datasets. Although the analytical procedures developed for single spectra can be extended to the analysis of multi-dimensional datasets, for an optimal use of this highly redundant information, more advanced techniques must be deployed. In this context, we investigate alternatives to the standard quantification methods and seek to optimise the experimental acquisition for accurate analysis. This addresses the current challenges facing the electron energy-loss spectroscopy (EELS) community, for whom beam damage and contamination are often the limiting factors. EELS elemental quantification by the standard integration method is limited to well-behaved cases. As an alternative we use curve fitting which, as we show, can overcome most of the limitations of the standard method. Furthermore, we extend the method to obtain, in addition to elemental maps, the first bonding maps at the nanoscale. A major difficulty when analysing multi-dimensional datasets of samples of unknown composition is that the quantitative methods require as an input the composition of the sample. We show that blind source separation methods enable fast and accurate analysis of multi-dimensional datasets without defining a model. In optimal conditions these methods are capable of extracting signals from the dataset corresponding to the different chemical compounds in the sample and their distribution.
Yavuz, Esin. "Source separation analysis of visual cortical dynamics revealed by voltage sensitive dye imaging". Phd thesis, Université Pierre et Marie Curie - Paris VI, 2012. http://tel.archives-ouvertes.fr/tel-00836931.
Achvar, Didier. "Séparation de sources : généralisation à un modèle convolutif". Montpellier 2, 1993. http://www.theses.fr/1993MON20222.
Fabresse, Luc. "Du découpage à l'assemblage non anticipé de composants : Conception et mise en oeuvre du langage à composants Scl". Phd thesis, Université Montpellier II - Sciences et Techniques du Languedoc, 2007. http://tel.archives-ouvertes.fr/tel-00207499.
This thesis therefore proposes Scl, a component-based programming language that makes component-based programming (CBP) truly practicable. By design, Scl aims to be a language that is: (i) minimal, in the sense that every abstraction and mechanism integrated into it answers an identified need; (ii) simple, because its abstractions and mechanisms are high-level; (iii) thorough, because we address a set of questions often overlooked in other proposals, such as self-reference, argument passing, or the status of base components (collections, integers, etc.) in a unified world; (iv) dedicated to CBP, because it takes into account the two concerns we consider major in CBP and which underlie the whole study: decoupling and non-anticipation.
The core of Scl rests mainly on the concepts of component, port, service, connector and glue code, as well as on the mechanisms of port binding and service invocation. Separation of concerns at the code level occupies an important part of the study, which establishes a link between component-based programming and aspect-oriented programming (AOP). In this context, Scl proposes an improvement of the so-called "symmetric" approaches to AOP, via a unification of the aspect and component concepts and via a set of different kinds of port bindings that allow the same component to be used in a standard or a crosscutting way. Scl also integrates a general mechanism for establishing connections between components based on the state changes of their properties, without their programmers having to write a single line of code dedicated to this purpose. Two prototypes of Scl are also presented; the first and most complete is written in Smalltalk, the second in Ruby.
Mauviel, Guillain. "Transport multi-composants dans les polymères : séparation hydrocarbures / hydrogène par membrane à sélectivité inverse". Vandoeuvre-les-Nancy, INPL, 2003. http://docnum.univ-lorraine.fr/public/INPL_T_2003_MAUVIEL_G.pdf.
Hydrocarbon/hydrogen separation by reverse-selectivity membranes is investigated. The first goal is to develop materials showing an increased selectivity. Silicone membranes loaded with inorganic fillers have been prepared, but the expected enhancement is not observed. The second goal is to model multi-component transport through rubbers. Indeed, the permeability model is not able to predict permeation correctly when a vapour is present, so many phenomena have to be considered: diffusional interdependency, sorption synergy, membrane swelling and the drag effect. The dependence of the diffusivities on the local composition is modelled according to free-volume theory. The resolution of the model makes it possible to predict the permeation flow rates of mixed species from their pure-component sorption and diffusion data. For the systems under consideration, the diffusional interdependency is shown to be preponderant. Besides, the importance of sorption synergy is pointed out, whereas it is most often neglected.
Marvie, Raphaël. "Séparation des préoccupations et méta-modélisation pour environnements de manipulation d'architectures logicielles à base de composants". Phd thesis, Université des Sciences et Technologie de Lille - Lille I, 2002. http://tel.archives-ouvertes.fr/tel-00007381.