Dissertations / Theses on the topic 'Analyse basée sur les fixels'
Consult the top 50 dissertations / theses for your research on the topic 'Analyse basée sur les fixels.'
Lebrun, Aurélie. "Imagerie de la connectivité et de la microstructure cérébrales dans la maladie d'Alzheimer et les maladies apparentées." Electronic Thesis or Diss., université Paris-Saclay, 2024. http://www.theses.fr/2024UPASQ056.
Typical Alzheimer's disease (AD), the leading cause of dementia worldwide, is clinically described by an amnestic syndrome and is pathophysiologically characterized by the aggregation and accumulation of abnormal amyloid-β and tau proteins. However, AD is not the only cause of progressive amnesia, which can be caused by different related pathologies like limbic-predominant age-related TDP-43 encephalopathy (LATE). In this context, in vivo imaging is an invaluable tool to study alterations in brain structures and their evolution, and to identify biomarkers to characterize and differentiate each disease. The aim of this thesis is to use advanced diffusion MRI models to investigate tissue alterations, especially of white matter, in amnestic Alzheimer's disease and non-Alzheimer's amnesia. In order to implement the most efficient pre-processing pipeline, we first compared several methods for correcting magnetic susceptibility artifacts in diffusion MRI. We then implemented fixel-based analyses to study white matter alterations in typical AD and in presumed LATE. We performed two studies. First, we compared white matter fiber bundle integrity between the two groups of patients and controls in a cross-sectional study. Participants were also followed clinically on an annual basis and underwent a second MRI at 2 years, enabling us to investigate the evolution of white matter alterations using a fixel-based analysis that was adapted to the longitudinal design. Our cross-sectional study revealed a common alteration of white matter fiber bundles of the temporal lobe and limbic regions in AD patients and in presumed LATE patients compared to controls. Presumed LATE patients also showed alterations in the cerebello-thalamo-cortical tract. Additionally, both groups of patients showed cortical atrophy in regions connected to the bundles identified as altered in each group. Our longitudinal study showed a worsening over 2 years of white matter fiber bundle alterations in each group of patients compared to controls in tracts that connect the temporal lobe to the parietal and frontal lobes. It showed that the evolution of these alterations follows partly distinct patterns between the two groups of patients. The results of these studies are consistent with the staging system of propagation of tau and TDP-43 proteinopathies. This thesis shows that diffusion MRI, and fixel-based analysis in particular, are high-performance imaging methods that can accurately identify different white matter alterations in patients suffering from similar cognitive impairments. The results then show that these diseases, initially considered as cortical diseases, also result in white matter alterations with different topography and progression between groups of patients with similar initial clinical symptoms. This finding supports a possible role for white matter in the propagation of these diseases, and highlights the importance of studying these processes in the hope of a better understanding of their mechanisms.
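To make the core statistical step of such a fixel-based analysis concrete, here is a minimal sketch of a fixel-wise group comparison on a fibre density metric. It is only an illustration: the subject arrays are simulated placeholders, and the plain t-test with FDR correction stands in for the connectivity-based fixel enhancement statistics used by dedicated fixel-based analysis pipelines.

```python
import numpy as np
from scipy import stats

# Simulated fibre density (FD) values sampled on a common fixel template:
# rows are subjects, columns are fixels (placeholder sizes and values).
rng = np.random.default_rng(0)
fd_controls = rng.normal(0.50, 0.05, size=(30, 1000))
fd_patients = rng.normal(0.47, 0.05, size=(25, 1000))

# Mass-univariate two-sample t-test at every fixel.
t, p = stats.ttest_ind(fd_patients, fd_controls, axis=0, equal_var=False)

# Benjamini-Hochberg correction as a simple stand-in for the
# connectivity-based fixel enhancement used in real pipelines.
order = np.argsort(p)
adjusted = p[order] * p.size / (np.arange(p.size) + 1)
adjusted = np.minimum.accumulate(adjusted[::-1])[::-1]
significant = np.zeros(p.size, dtype=bool)
significant[order] = adjusted < 0.05

print(f"{significant.sum()} fixels differ between groups (FDR < 0.05)")
```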
Wu, Zeqin. "SSTA basée sur la propagation des moments." Montpellier 2, 2009. http://www.theses.fr/2009MON20133.
Wu, Zeqin. "SSTA Basée sur la Propagation des Moments." Phd thesis, Université Montpellier II - Sciences et Techniques du Languedoc, 2009. http://tel.archives-ouvertes.fr/tel-00471241.
Clarot, Pierre. "Analyse de séquences vidéo de surveillance basée sur la détection d'activités." Mémoire, Université de Sherbrooke, 2010. http://savoirs.usherbrooke.ca/handle/11143/4882.
Caron, Stéphane. "Détection d'anomalies basée sur les représentations latentes d'un autoencodeur variationnel." Master's thesis, Université Laval, 2021. http://hdl.handle.net/20.500.11794/69185.
Full textIn this master's thesis, we propose a methodology that aims to detect anomalies among complex data, such as images. In order to do that, we use a specific type of neural network called the varitionnal autoencoder (VAE). This non-supervised deep learning approach allows us to obtain a simple representation of our data on which we then use the Kullback-Leibler distance to discriminate between anomalies and "normal" observations. To determine if an image should be considered "abnormal", our approach is based on a proportion of observations to be filtered, which is easier and more intuitive to establish than applying a threshold based on the value of a distance metric. By using our methodology on real complex images, we can obtain superior anomaly detection performances in terms of area under the ROC curve (AUC),precision and recall compared to other non-supervised methods. Moreover, we demonstrate that the simplicity of our filtration level allows us to easily adapt the method to datasets having different levels of anomaly contamination.
Mastandrea, Vicenzo. "Analyse de synchronisation dans les objets actifs basée sur les types comportementaux." Thesis, Université Côte d'Azur (ComUE), 2017. http://www.theses.fr/2017AZUR4113/document.
The active object concept is a powerful computational model for defining distributed and concurrent systems. This model has recently gained prominence, largely thanks to its simplicity and its abstraction level. In this work we study an active object model with no explicit future type and wait-by-necessity synchronisations, a lightweight technique that synchronises invocations when the corresponding values are strictly needed. Although high concurrency combined with a high level of transparency leads to good performances, they also make the system more prone to problems such as deadlocks. This is the reason that led us to study deadlock analysis in this active object model. The development of our deadlock analysis is divided in two main works. In the first work we focus on the implicit synchronisation on the availability of some value. This way we are able to analyse the data-flow synchronisation inherent to languages that feature wait-by-necessity. In the second work we present a static analysis technique based on effects and behavioural types for deriving synchronisation patterns of stateful active objects and verifying the absence of deadlocks in this context. Our effect system traces the access to object fields, thus allowing us to compute behavioural types that express synchronisation patterns in a precise way. As a consequence we can automatically verify the absence of deadlocks in active object based programs with wait-by-necessity synchronisations and stateful active objects.
Ould, Aboubecrine Mohamed Mahmoud. "Sur l'estimation basée sur les records et la caractérisation des populations." Le Havre, 2011. http://www.theses.fr/2011LEHA0004.
In the first part of this work, we consider a number of k-record values from independent and identically distributed random variables with a continuous distribution function F; our aim is to predict future k-record values under suitable assumptions on the tail of F. In the second part, we consider finite populations and investigate their characterization by regressions of order statistics under sampling without replacement. We also give some asymptotic results when the size of the population goes to infinity.
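For intuition, upper record values and their k-record generalisation can be extracted from an i.i.d. sample in a few lines. The sketch below uses simulated exponential data, and the convention that a new k-record value is the k-th largest observation seen each time the running top-k changes is an assumption of this illustration.

```python
import numpy as np

def k_record_values(sample, k=1):
    """Upper k-record values: the k-th largest observation each time the
    running top-k changes (k=1 gives the ordinary upper records)."""
    records, top_k = [], []
    for x in sample:
        if len(top_k) < k:
            top_k.append(x)
            if len(top_k) == k:
                records.append(min(top_k))
        elif x > min(top_k):
            top_k.remove(min(top_k))
            top_k.append(x)
            records.append(min(top_k))
    return records

rng = np.random.default_rng(5)
sample = rng.exponential(1.0, size=200)
print("records  :", np.round(k_record_values(sample, k=1), 2))
print("2-records:", np.round(k_record_values(sample, k=2), 2))
```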
Belkaroui, Rami. "Vers un contextualisation des tweets basée sur une analyse des graphes des conversation." Thesis, Nantes, 2018. http://www.theses.fr/2018NANT4013/document.
Even with the recent switch to 280 characters, Twitter messages considered in their singularity, without any additional exogenous information, can confront their readers with difficulties of interpretation. The integration of contextualization on these messages is therefore a promising avenue of research to facilitate access to their information content. In the last decade, most works have focused on building summaries from complementary sources of information such as Wikipedia. In this thesis, we choose a different complementary path that relies on the analysis of conversations on Twitter in order to extract useful information for the contextualization of a tweet. This information was integrated in a prototype which, for a given tweet, offers a visualization of a subgraph of the conversation graph associated with the tweet. This subgraph, automatically extracted from the analysis of structural indicator distributions, highlights particular individuals who play a major role in the conversation and the tweets that have contributed to the dynamics of the exchanges. This prototype was tested on a panel of users to validate its efficiency and open up prospects for improvement.
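The kind of conversation-graph processing described here can be sketched with networkx: build the reply graph, compute structural indicators, and keep a subgraph around the most central tweets. The indicators and the cut-off used below are illustrative assumptions, not the exact criteria of the thesis.

```python
import networkx as nx

# Hypothetical reply relation: an edge (a, b) means tweet a replies to tweet b.
replies = [("t2", "t1"), ("t3", "t1"), ("t4", "t2"), ("t5", "t2"),
           ("t6", "t5"), ("t7", "t1"), ("t8", "t7")]
conversation = nx.DiGraph(replies)

# Structural indicators over the conversation graph.
centrality = nx.betweenness_centrality(conversation)
in_degree = dict(conversation.in_degree())

# Keep the tweets that concentrate the exchanges (assumed cut-off: top third
# by combined score), plus the tweet to contextualize.
score = {n: centrality[n] + in_degree[n] for n in conversation}
keep = sorted(score, key=score.get, reverse=True)[:len(score) // 3]
target = "t4"
context = conversation.subgraph(set(keep) | {target})

print("contextualization subgraph:", sorted(context.edges()))
```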
Arregle, Jean-Luc. "L'industrialisation des sociétés de conseil en management : une analyse basée sur les ressources." Aix-Marseille 3, 1992. http://www.theses.fr/1992AIX32030.
This research deals with the industrialization (size, standardization, information technology) of management consulting firms in France. The strategic perspective is supported by the "resource-based" approach, which defines strategic management as "firms attempt to identify, protect, and exploit their unique skills and assets in order to gain competitive advantage in the market place". A cross-sectional analysis of 31 enterprises makes it possible to identify 3 "core resources": network, size, know-how. In addition, competitive behaviours are identified according to the type of resources (skills versus assets). Linkages between different industrialization levels and "core resources" are observed, showing the strategic dimension of this evolution. By these means, the resource-based approach is validated. Subjects for further research are defined, using time-section analysis to identify the building of asset stocks and the level of sustainability of competitive advantages.
Davoine, Franck. "Compression d'images par fractales basée sur la triangulation de Delaunay." Phd thesis, Grenoble INPG, 1995. http://tel.archives-ouvertes.fr/tel-00005042.
Full textMarteau, Hubert. "Une méthode d'analyse de données textuelles pour les sciences sociales basée sur l'évolution des textes." Tours, 2005. http://www.theses.fr/2005TOUR4028.
This PhD thesis aims at bringing to sociologists a data-processing tool which allows them to analyse semi-structured open interviews. The proposed tool performs in two steps: an indexation of the interviews followed by a classification. Usually, indexing methods rely on a general statistical analysis. Such methods are suited for texts having content and structure (literary texts, scientific texts, ...). These texts have more vocabulary and structure than interviews (which are limited to about 1,000 words). On the basis of the assumption that sociological membership strongly induces the form of the speech, we propose various methods to evaluate the structure and the evolution of the texts. The methods attempt to find new representations of texts (image, signal) and to extract values from these new representations. The selected classification is a classification by trees (NJ). It has a low complexity and respects distances, which makes it a good solution to provide help with classification.
Gagné, Alexandre. "Métrologie des actinides basée sur l’analyse des matières fécales pour des applications dosimétriques." Thesis, Université Laval, 2014. http://www.theses.ulaval.ca/2014/30527/30527.pdf.
External dosimetric techniques, such as portable dosimeters and Geiger-Müller counters, are largely used in the detection and interpretation of the dose received by employees of the nuclear sector. However, those techniques are inefficient for characterising internal contamination, such as alpha emitters. At the moment, the techniques used in Canada for bioassay are oriented towards urine and nasal swab sampling and, on rare occasions, blood and tissue, and exceptionally fecal samples. Fecal samples offer a different and complementary approach to other bioassays due to the interaction between the respiratory tract and the gastrointestinal tract. However, there is no official methodology for fecal analysis in Canada. For this thesis, a new methodology based on borate fusion coupled to column chromatography was developed to remedy this problem.
Iksal, Sébastien. "Ingénierie de l'observation basée sur la prescription en EIAH." Habilitation à diriger des recherches, Université du Maine, 2012. http://tel.archives-ouvertes.fr/tel-00991970.
Full textEl, Jabri Mohammed. "Etude de l'organisation spatiale du tissu conjonctif par analyse d'images basée sur une approche multiéchelles." Phd thesis, Clermont-Ferrand 2, 2008. http://www.theses.fr/2008CLF21831.
Bouzefrane, Samia. "Etude temporelle des applications temps réel distribuées à contraintes strictes basée sur une analyse d'ordonnançabilité." Poitiers, 1998. http://www.theses.fr/1998POIT2254.
Couchot, Alain. "Analyse statique de la terminaison des règles actives basée sur la notion de chemin maximal." Paris 12, 2001. http://www.theses.fr/2001PA120042.
The active rules are intended to enrich databases with a reactive behaviour. An active rule is composed of three main components: the event, the condition, the action. It is desired to guarantee a priori the termination of a set of active rules. The aim of this thesis is to increase the number of termination situations detected by static analysis. We first determine some restrictions of the previous static analysis methods. We then develop an algorithm for static analysis of termination based on the notion of the maximal path of a node. The notion of maximal path is intended to replace the notion of cycle, used by the previous termination algorithms. We present some applications and extensions of our termination algorithm. These extensions and applications concern the active rules not included in a cycle, composite conditions, composite events, priorities between rules, and the modular design of rules.
Bentahar, Amine. "Identification des préoccupations transverses : une approche statique basée sur une analyse du flot de contrôle." Thèse, Université du Québec à Trois-Rivières, 2012. http://depot-e.uqtr.ca/5169/1/030328231.pdf.
Full textMejri, Issam. "Internationalisation des PME technologiques issues des économies émergentes : une analyse basée sur les opportunités d’affaires." Thesis, Université Côte d'Azur (ComUE), 2017. http://www.theses.fr/2017AZUR0029/document.
The last two decades have been marked by the rise of emerging economies and the emergence of start-ups and hi-tech SMEs with high international growth. This trend has spawned a new area of research, international entrepreneurship in emerging economies. The purpose of this qualitative research is to study the factors that influence the process of identifying international opportunities in technological SMEs from an emerging economy, Tunisia. To this end, we study the process of internationalization of seven Tunisian technological SMEs in the information and communications technology sector. The results of the intra- and inter-case analysis identify entrepreneurial personality traits, international entrepreneurial capabilities and relational networks as the three main categories of factors that influence the identification of international opportunities. Our research results in the formulation and discussion of eight proposals that make it possible to schematize an explanatory model of the internationalization of technological SMEs from emerging economies.
Tetley, Romain. "Analyse mixte de protéines basée sur la séquence et la structure - applications à l'annotation fonctionnelle." Thesis, Université Côte d'Azur (ComUE), 2018. http://www.theses.fr/2018AZUR4111/document.
In this thesis, the focus is set on reconciling the realms of structure and sequence for protein analysis. Sequence analysis tools shine when faced with proteins presenting high sequence identity (above 30%), but are lackluster when it comes to remote homolog detection. Structural analysis tools present an interesting alternative, but solving structures, when at all possible, is a tedious and expensive process. These observations make the need for hybrid methods, which inject information obtained from available structures into a sequence model, quite clear. This thesis makes four main contributions toward this goal. First, we present a novel structural measure, the RMSDcomb, based on local structural conservation patterns, the so-called structural motifs. Second, we developed a method to identify structural motifs between two structures using a bootstrap method which relies on filtrations. Our approach is not a direct competitor to flexible aligners but can prove useful for performing a multiscale analysis of structural similarities. Third, we build upon the previous methods to design hybrid hidden Markov models which are biased towards regions of increased structural conservation between sets of proteins. We test this tool on the class II viral fusion proteins, which are particularly challenging because of their low sequence identity and mild structural homology. We find that we are able to recover known remote homologs of the viral proteins in Drosophila and other organisms. Finally, formalizing a sub-problem encountered when comparing filtrations, we present a new theoretical problem, the D-family matching, on which we present various algorithmic results. We show, in a manner that is analogous to comparing parts of two protein conformations, how it is possible to compare two clusterings of the same data set using such a theoretical model.
Dolques, Xavier. "Génération de Transformations de Modèles : une approche basée sur les treillis de Galois." Phd thesis, Université Montpellier II - Sciences et Techniques du Languedoc, 2010. http://tel.archives-ouvertes.fr/tel-00916856.
Full textMykhalchuk, Vasyl. "Correspondance de maillages dynamiques basée sur les caractéristiques." Thesis, Strasbourg, 2015. http://www.theses.fr/2015STRAD010/document.
3D geometry modelling tools and 3D scanners are becoming more capable and more affordable today. Thus, the development of new algorithms in geometry processing, shape analysis and shape correspondence gathers momentum in computer graphics. Those algorithms steadily extend and increasingly replace prevailing methods based on images and videos. Non-rigid shape correspondence, or deformable shape matching, has been a long-studied subject in computer graphics and related research fields. Moreover, shape correspondence is of wide use in many applications such as statistical shape analysis, motion cloning, texture transfer, medical applications and many more. However, robust and efficient non-rigid shape correspondence still remains a challenging task due to fundamental variations between individual subjects, acquisition noise and the number of degrees of freedom involved in correspondence search. Although the dynamic 2D/3D intra-subject shape correspondence problem has been addressed by a rich set of previous methods, dynamic inter-subject shape correspondence has received much less attention. The primary purpose of our research is to develop a novel, efficient, robust deforming shape analysis and correspondence framework for animated meshes based on their dynamic and motion properties. We elaborate our method by exploiting a profitable set of motion data exhibited by deforming meshes with time-varying embedding. Our approach is based on the observation that a dynamic, deforming shape of a given subject contains much more information than a single static posture of it, which is different from the existing methods that rely on static shape information for shape correspondence and analysis. Our framework of deforming shape analysis and correspondence of animated meshes comprises several major contributions: a new dynamic feature detection technique based on multi-scale animated mesh deformation characteristics, a novel dynamic feature descriptor, and an adaptation of a robust graph-based feature correspondence approach followed by the fine matching of the animated meshes. [...]
Tampango, Yendoubouam. "Développement d'une méthode sans maillage basée sur les approximations de Taylor." Thesis, Université de Lorraine, 2012. http://www.theses.fr/2012LORR0322/document.
In these last decades, new numerical methods known as « meshless methods » have been developed. Contrary to the FEM, these methods use only a set of nodes in the domain, without needing any mesh. Until now, none of these methods has convinced users of the FEM. In this thesis, we present a new meshless method using Taylor series expansions. In this method, the PDE is solved quasi-exactly in the domain and the boundary conditions are applied by using a least squares method. Only a boundary discretisation is then needed, so the proposed method is a « true boundary meshless method ». This technique was proposed for the first time by S. Zeze in his PhD thesis. The study of some linear problems has shown that this technique leads to a very good accuracy and that the convergence can be improved by increasing the approximation degree. Our work is a continuation of S. Zeze's work, and it consists in making the proposed method more robust and extending its range of application. For that, we first make an analysis of the series computed by the method. The aim of this analysis is to evaluate the domain of validity of these series. This analysis showed that, for some problems, a good accuracy cannot be obtained without splitting the domain into subdomains and solving subdomain by subdomain. Therefore, the second part of our work was to define a technique which ensures continuity at the interface between subdomains, in the case of a resolution by subdomains. The last part of our work was dedicated to non-linear problems. We establish an algorithm to show how the proposed method can deal with non-linear problems.
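The principle of the method, solving the PDE exactly inside the domain and enforcing the boundary condition by least squares on boundary points only, can be illustrated on the Laplace equation, whose polynomial solutions are the harmonic polynomials Re(z^k) and Im(z^k). The sketch below is an illustration of that principle under these assumptions, not the implementation developed in the thesis.

```python
import numpy as np

def harmonic_basis(x, y, degree):
    """Re(z^k), Im(z^k): each column solves the Laplace equation exactly."""
    z = x + 1j * y
    cols = [np.ones_like(x)]
    for k in range(1, degree + 1):
        cols += [np.real(z**k), np.imag(z**k)]
    return np.column_stack(cols)

# Boundary of the unit disk with a boundary condition whose harmonic
# extension is known exactly: u = Re(z^3) = x^3 - 3*x*y^2.
theta = np.linspace(0.0, 2.0 * np.pi, 200, endpoint=False)
xb, yb = np.cos(theta), np.sin(theta)
g = xb**3 - 3.0 * xb * yb**2

# Only the boundary is discretised: least-squares fit of the coefficients.
A = harmonic_basis(xb, yb, degree=6)
coeffs, *_ = np.linalg.lstsq(A, g, rcond=None)

# The series then gives the solution anywhere inside the domain.
xi, yi = np.array([0.3]), np.array([0.2])
u = harmonic_basis(xi, yi, degree=6) @ coeffs
print(u[0], 0.3**3 - 3.0 * 0.3 * 0.2**2)   # computed vs exact value
```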
Selmane, Sid Ali. "Détection et analyse des communautés dans les réseaux sociaux : approche basée sur l'analyse formelle de concepts." Thesis, Lyon 2, 2015. http://www.theses.fr/2015LYO22004.
The study of community structure in networks has become an increasingly important issue. The knowledge of the core modules (communities) of networks helps us to understand how they work and behave, and to understand the performance of these systems. A community in a graph (network) is defined as a set of nodes that are strongly linked among themselves, but weakly linked with the rest of the graph. Members of the same community share the same interests. The originality of our research is to show that it is relevant to use formal concept analysis for community detection, unlike conventional approaches using graphs. We studied several problems related to community detection in social networks: (1) the evaluation of community detection methods in the literature, (2) the detection of disjoint and overlapping communities, and (3) the modelling and analysis of heterogeneous social networks from three-dimensional data. To assess the community detection methods proposed in the literature, we first studied the state of the art, which allowed us to present a classification of community detection methods by evaluating each of the best-known methods presented in the literature. For the second part, we were interested in developing an approach for detecting disjoint and overlapping communities in homogeneous social networks from adjacency matrices (one-mode or one-dimensional data) by exploiting techniques from formal concept analysis. We also paid special attention to methods for modelling heterogeneous social networks. We focused in particular on three-dimensional data and proposed, in this framework, an approach for modelling and analysing social networks from three-dimensional data. This is based on a methodological framework to better understand the three-dimensional aspect of these data. In addition, the analysis concerns the discovery of communities and hidden relationships between the different types of individuals in these networks. The main idea lies in mining communities and triadic association rules from these heterogeneous networks in order to simplify and reduce the computational complexity of this process. The results are then used for an application recommending links and content to individuals in a social network.
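To see why formal concept analysis is relevant for community detection, the sketch below enumerates, by brute force on a toy adjacency matrix, the formal concepts of the relation "is linked to": each concept pairs a set of nodes with the set of their common neighbours and corresponds to a maximal biclique, which can serve as an (overlapping) community seed. The toy matrix and the exhaustive enumeration are assumptions made for illustration; they do not scale to real networks.

```python
from itertools import combinations

import numpy as np

# Toy undirected network as a binary adjacency matrix (context: node x node).
A = np.array([[0, 1, 1, 0, 0],
              [1, 0, 1, 0, 0],
              [1, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [0, 0, 0, 1, 0]])
nodes = range(len(A))

def common_neighbours(subset):
    """Derivation operator: nodes linked to every node of the subset."""
    return {j for j in nodes if all(A[i, j] for i in subset)}

# A formal concept is a pair (extent, intent) closed under derivation.
concepts = set()
for r in range(1, len(A) + 1):
    for subset in combinations(nodes, r):
        intent = common_neighbours(subset)
        if intent and common_neighbours(intent) == set(subset):
            concepts.add((frozenset(subset), frozenset(intent)))

for extent, intent in sorted(concepts, key=lambda c: sorted(c[0])):
    print(sorted(extent), "<->", sorted(intent))
```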
Pham, Quoc Hoan. "Analyse des réponses balistiques des fibres d'un matériau tissé à l'échelle microscopique basée sur l'homogénéisation numérique." Thesis, Lille, 2021. http://www.theses.fr/2021LILUI008.
This thesis is dedicated to the study of the ballistic behaviour of a woven material using a microscopic approach based on the numerical homogenization technique. Indeed, during a ballistic loading, the projectile acts locally on the impact zone of the fabric, thus generating a transverse compression of the fibres. This phenomenon has been modelled, at the fibre scale, taking into account the effects of fibre-fibre interactions and the evolution of the volume fraction of the yarn during loading. From this modelling, a non-linear mechanical behaviour law describing the evolution of apparent stress and strain in a yarn subjected to transverse compression was obtained. Then, based on a periodic Representative Volume Element (RVE), a numerical homogenization technique taking into account the void between fibres was used for modelling. A power law between the apparent deformation and the volume fraction for a yarn subjected to transverse compression was deduced from this approach. These data were implemented into a model in order to analyse the microscopic behaviour of fibres subjected to ballistic impact based on a multi-scale approach. The analysis has provided a better prediction of the physical phenomena occurring during the ballistic impact of a single yarn at the fibre scale. Then, this model was applied to the case of a fabric subjected to ballistic impact at a mesoscopic scale (yarn scale) combined with a microscopic scale (fibre scale) in the impact zone. This model was validated by comparison with experimental data, in terms of the evolution of the projectile velocity. The evolution of energy, impact force, fibre-fibre interactions and local fibre failure is also analysed.
Kerbiriou, Corinne. "Développement d'une méthode d'étalonnage d'un radar transhorizon basée sur une analyse fine du fouillis de mer." Rennes 1, 2002. http://www.theses.fr/2002REN1A003.
Full textBabau, Jean-Philippe. "Etude du comportement temporel des applications temps réel à contraintes strictes basée sur une analyse d'ordonnançabilité." Poitiers, 1996. http://www.theses.fr/1996POIT2305.
Full textThuriot, Fanny. "Analyse du séquençage de l’exome basée sur le phénotype pour le diagnostic moléculaire des syndromes polymalformatifs." Mémoire, Université de Sherbrooke, 2017. http://hdl.handle.net/11143/11244.
Although the heterogeneity of genetic disorders limits our capacity to identify the causal gene with conventional approaches, exome sequencing has increased the diagnostic yield. However, the large number of variants identified by this method poses a significant challenge for their clinical interpretation. Thus, we developed PhenoVar: a software tool that integrates phenotypic and genotypic data and produces a short list of potential diagnoses. The objective of this study is to validate this phenotype-based approach on a clinical level and show that it can be efficient in diagnosing patients with rare genetic disorders. Exome sequencing was performed on a cohort of 51 patients. These presented with dysmorphic features, with or without neurodevelopmental disorders, of undetermined etiology following conventional analysis. Following exome sequencing, a bioinformatics pipeline allowed us to filter variations, keeping only rare coding variations of high quality. Then, we analysed these filtered variations with both a manual analysis and PhenoVar. In the manual analysis each variant was manually examined to determine its impact and to identify the diagnosis without taking the patient's phenotype into consideration. Then, Exomiser, another phenotype-based tool, was used to compare PhenoVar's performance. In comparison to the manual analysis, PhenoVar allowed us to reduce the analysis time by six-fold and to reduce by half the number of potential diagnoses. With both methods, we found the molecular diagnosis in 18 patients, a rate of 35%. Moreover, among these diagnoses, 16 (89%) are found in the top 10 ranks of PhenoVar, compared to only 10 (56%) for Exomiser. In conclusion, PhenoVar proved superior to Exomiser in prioritizing the correct diagnosis in the top 10 ranks. Finally, the diagnostic yield of PhenoVar is comparable to that of the manual analysis, while reducing the analysis time and the number of variants.
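The general idea of phenotype-aware prioritization can be caricatured as combining, for each candidate gene, a variant-level score with the overlap between the patient's phenotypic terms and those associated with the gene. Everything in the sketch below (genes, scores, terms, equal weighting) is invented for illustration and is not PhenoVar's actual scoring scheme.

```python
# Hypothetical candidates: gene -> (variant deleteriousness score in [0, 1],
# phenotypic terms associated with the gene, e.g. HPO-like labels).
candidates = {
    "GENE_A": (0.9, {"intellectual disability", "seizures"}),
    "GENE_B": (0.6, {"dysmorphic features", "short stature", "seizures"}),
    "GENE_C": (0.8, {"cardiomyopathy"}),
}
patient_phenotype = {"dysmorphic features", "seizures", "short stature"}

def rank_diagnoses(candidates, phenotype):
    """Rank candidate diagnoses by an equal-weight combination of variant
    score and Jaccard overlap between phenotype term sets."""
    scored = []
    for gene, (variant_score, gene_terms) in candidates.items():
        overlap = len(phenotype & gene_terms) / len(phenotype | gene_terms)
        scored.append((0.5 * variant_score + 0.5 * overlap, gene))
    return sorted(scored, reverse=True)

for score, gene in rank_diagnoses(candidates, patient_phenotype):
    print(f"{gene}: {score:.2f}")
```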
Sidibe, Ibrahima dit Bouran. "Analyse non-paramétrique des politiques de maintenance basée sur des données des durées de vie hétérogènes." Thesis, Université de Lorraine, 2014. http://www.theses.fr/2014LORR0081/document.
In the reliability literature, several research works have been developed to deal with the modeling, analysis and implementation of maintenance policies for equipment subject to random failures. The majority of these works are based on common assumptions, among which the distribution function of the equipment lifetimes is assumed to be known. Furthermore, the equipment is assumed to experience only one operating environment. Such assumptions are indeed restrictive and may introduce a bias in the statistical analysis of the distribution function of the equipment lifetimes, which in turn impacts the optimization of maintenance policies. In the present research work, these two particular assumptions are relaxed. This relaxation makes it possible to take into account information related to the conditions in which the equipment operates and to focus on the statistical analysis of maintenance policies without using an intermediate parametric lifetime distribution. The objective of this thesis then consists in the development of efficient statistical models and tools for managing the maintenance of equipment whose lifetime distribution is unknown and defined through heterogeneous lifetime data. Indeed, this thesis proposes a framework for maintenance strategy determination, from lifetime data acquisition to the computation of optimal maintenance policies. The maintenance policies considered are assumed to be performed on used equipment, which carries out its missions within different environments, each of which is characterized by a degree of severity. In this context, a first mathematical model is proposed to evaluate the costs induced by maintenance strategies. The analysis of these costs helps to establish the necessary and sufficient conditions that ensure the existence of an optimal age at which to perform the preventive maintenance. The maintenance costs are fully estimated by using the kernel method. This estimation method is non-parametric and defined by two parameters, namely the kernel function and the smoothing parameter. The variability of the maintenance cost estimator is analyzed in depth as a function of the smoothing parameter of the kernel method. From these analyses, it is shown that the kernel estimation method ensures a weak propagation of the errors due to the computation of the smoothing parameter. In addition, several simulations are made to estimate the optimal replacement age. These simulations show that the numerical results from the kernel method are close to the theoretical values, with a low coefficient of variation. Two probabilistic extensions of the first mathematical model are proposed and theoretically discussed. To deal with the problem of delayed preventive maintenance, an approach is proposed and discussed. The proposed approach allows evaluating the risk induced by the delay taken in performing a preventive maintenance at the required optimal date. This approach is based on a risk analysis conducted on the basis of a proposed risk function.
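The cost analysis can be illustrated with the classical age-replacement cost rate, in which the unknown lifetime distribution is replaced by a Gaussian-kernel estimate built directly from heterogeneous lifetime data. The data, bandwidth rule and cost values below are assumptions chosen for the sketch; they are not the models developed in the thesis.

```python
import numpy as np
from scipy.stats import norm

# Heterogeneous lifetime data: units operated in two environments of
# different severity (illustrative values, arbitrary time units).
rng = np.random.default_rng(2)
lifetimes = np.concatenate([rng.weibull(2.0, 80) * 10.0,
                            rng.weibull(2.0, 40) * 6.0])

h = 1.06 * lifetimes.std() * lifetimes.size ** (-1 / 5)   # rule-of-thumb bandwidth

def cdf_hat(t):
    """Gaussian-kernel estimate of the lifetime distribution function."""
    return norm.cdf((t - lifetimes[:, None]) / h).mean(axis=0)

def cost_rate(T, cp=1.0, cf=10.0, n_grid=400):
    """Long-run cost per unit time when replacing preventively at age T."""
    u = np.linspace(1e-6, T, n_grid)
    F = cdf_hat(u)
    mean_cycle_length = np.trapz(1.0 - F, u)
    return (cp * (1.0 - F[-1]) + cf * F[-1]) / mean_cycle_length

ages = np.linspace(1.0, 15.0, 200)
costs = [cost_rate(T) for T in ages]
print("estimated optimal preventive replacement age:", ages[np.argmin(costs)])
```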
Sellami, Akrem. "Interprétation sémantique d'images hyperspectrales basée sur la réduction adaptative de dimensionnalité." Thesis, Ecole nationale supérieure Mines-Télécom Atlantique Bretagne Pays de la Loire, 2017. http://www.theses.fr/2017IMTA0037/document.
Hyperspectral imagery allows the acquisition of rich spectral information about a scene in several hundred or even thousands of narrow and contiguous spectral bands. However, with the high number of spectral bands, the strong inter-band spectral correlation and the redundancy of spectro-spatial information, the interpretation of these massive hyperspectral data is one of the major challenges for the remote sensing scientific community. In this context, the major challenge is to reduce the number of unnecessary spectral bands, that is, to reduce the redundancy and high correlation of spectral bands while preserving the relevant information. Therefore, projection approaches aim to transform the hyperspectral data into a reduced subspace by combining all the original spectral bands. In addition, band selection approaches attempt to find a subset of relevant spectral bands. In this thesis, we first focus on hyperspectral image classification, attempting to integrate the spectro-spatial information into dimension reduction in order to improve the classification performance and to overcome the loss of spatial information in projection approaches. Therefore, we propose a hybrid model to preserve the spectro-spatial information, exploiting the tensor model in the locality preserving projection approach (TLPP) and using constraint band selection (CBS) as an unsupervised approach to select the discriminant spectral bands. To model the uncertainty and imperfection of these reduction approaches and classifiers, we propose an evidential approach based on the Dempster-Shafer Theory (DST). In a second step, we try to extend the hybrid model by exploiting the semantic knowledge extracted through the features obtained by the previously proposed TLPP approach to enrich the CBS technique. Indeed, the proposed approach makes it possible to select relevant spectral bands which are at the same time informative, discriminant, distinctive and not very redundant. In fact, this approach selects the discriminant and distinctive spectral bands using the CBS technique, injecting the extracted rules obtained with knowledge extraction techniques to automatically and adaptively select the optimal subset of relevant spectral bands. The performance of our approach is evaluated on several real hyperspectral datasets.
Datcu, Octaviana. "Méthodes de chiffrement/déchiffrement utilisant des systèmes chaotiques : Analyse basée sur des méthodes statistiques et sur la théorie du contrôle des systèmes." Phd thesis, Université de Cergy Pontoise, 2012. http://tel.archives-ouvertes.fr/tel-00802659.
Full textGacem, Amina. "Méthodologie d’évaluation de performances basée sur l’identification de modèles de comportements : applications à différentes situations de handicap." Versailles-St Quentin en Yvelines, 2013. http://www.theses.fr/2013VERS0053.
Performance assessment is an important process to identify the abilities and the limits of a person. Currently, the assessment requires the mediation of a specialist (doctor, therapist, etc.) who must perform analyses and tests to reach a decision, which remains subjective. In the literature, several works propose assessment methods based on performance criteria: this is a quantitative evaluation, which is objective. This type of evaluation is usually based on statistical analysis. In this work, a new methodology of performance assessment is proposed. It is based on the identification of reference behaviours. Those behaviours are then used as references for the evaluation of other people. The identification of reference behaviours is an essential element of our work. It is based on classification methods. In our work, we have tested two different methods. The first one is "Fuzzy C-means", which allows a thorough search of reference behaviours; however, behaviours are represented by proxy criteria. The second method is the "Hidden Markov Model", which offers a time-series analysis based on the temporal variation of the behaviour; however, it is not easy to determine the training phase of this method. This assessment methodology has been applied in the context of different applications designed for disabled people: driving an electric wheelchair, driving an automobile and the use of pointing devices (mouse, trackball, joystick, etc.). In each application, a protocol and an ecological situation are defined in order to evaluate participants on different platforms involving functional control interfaces (joystick, mouse, steering wheel, etc.). Then, statistical tools are used to analyze the data and provide a first interpretation of the behaviours. In each of the studied applications, our methodology automatically identifies different reference behaviours. Then, the assessment of people, carried out by comparison with the reference behaviours, identifies different levels of expertise and illustrates the evolution of learning during the assessment. The proposed evaluation methodology is an iterative process, so the population of experienced people can be enriched by adding people who become stable after assessment, which in turn allows the search for new reference behaviours.
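Since the Fuzzy C-means step is central to the identification of reference behaviours, here is a compact NumPy implementation of the standard algorithm on simulated two-dimensional behaviour indicators; the feature values, the number of clusters and the use of a simple distance for the final comparison are assumptions of this sketch.

```python
import numpy as np

def fuzzy_c_means(X, n_clusters, m=2.0, n_iter=100, seed=0):
    """Plain Fuzzy C-means: returns cluster centres and membership degrees."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), n_clusters))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(n_iter):
        W = U ** m
        centres = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2) + 1e-12
        U = 1.0 / np.sum((d[:, :, None] / d[:, None, :]) ** (2.0 / (m - 1.0)), axis=2)
    return centres, U

# Simulated behaviour indicators (e.g. mean speed, trajectory smoothness).
rng = np.random.default_rng(3)
X = np.vstack([rng.normal([1.0, 0.2], 0.1, (40, 2)),     # expert-like behaviour
               rng.normal([0.5, 0.8], 0.1, (40, 2))])    # novice-like behaviour
centres, U = fuzzy_c_means(X, n_clusters=2)

# A new participant is then assessed by comparison with the reference behaviours.
new_subject = np.array([0.9, 0.3])
print("distance to each reference behaviour:",
      np.linalg.norm(centres - new_subject, axis=1))
```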
Ben, hamida Amal. "Vers une nouvelle architecture de videosurveillance basée sur la scalabilité orientée vers l'application." Thesis, Bordeaux, 2016. http://www.theses.fr/2016BORD0144/document.
The work presented in this thesis aims to develop a new architecture for video surveillance systems. Firstly, a literature review led us to classify the existing systems based on their application level, which depends directly on the analytical functions performed. We also noticed that the usual systems process all the captured data while, actually, only a small part of the scenes is useful for analysis. Hence, we extended the common architecture of surveillance systems with a pre-analysis phase that extracts and simplifies the regions of interest while keeping the important characteristics. Two different methods for pre-analysis were proposed: a spatio-temporal filtering and a modeling technique for moving objects. We also contributed by introducing the concept of application-oriented scalability through a multi-level application architecture for surveillance systems. The different application levels can be reached incrementally to meet the progressive needs of the end-user. An example of a video surveillance system respecting this architecture and using the pre-analysis methods was proposed.
Jouini, Mohamed Soufiane. "Caractérisation des réservoirs basée sur des textures des images scanners de carottes." Thesis, Bordeaux 1, 2009. http://www.theses.fr/2009BOR13769/document.
Cores extracted during well drilling are essential data for reservoir characterization. A medical scanner is used for their acquisition; this provides high-resolution images improving the capacity of interpretation. The main goal of the thesis is to establish links between these images and petrophysical data. Parametric texture modelling can be used to achieve this goal and should provide a reliable set of descriptors. A possible solution is to focus on parametric methods allowing synthesis. Even though this method is not mathematically proven, it provides high confidence in the set of descriptors and allows interpretation through synthetic textures. In this thesis, methods and algorithms were developed to achieve the following goals: 1. Segment the main representative texture zones on cores; this is achieved automatically through learning and classifying textures based on a parametric model. 2. Find links between scanner images and petrophysical parameters; this is achieved through calibrating and predicting petrophysical data with images (supervised learning process).
Kaoud, Hebatalla. "Une analyse de la gouvernance et l'innovation des clusters basée sur le modèle de la Triple-Hélice." Thesis, Nantes, 2018. http://www.theses.fr/2018NANT2021/document.
The development of innovation clusters has opened several research perspectives due to their importance in the academic world and in managerial practice. The purpose of this research is to mobilize the Triple Helix model "State, Industry, Academy" in order to identify forms of governance favouring the innovation of clusters. We have attached particular importance to the role played by the state and the policies it conducts in the development of these structures. The methodology relies on a qualitative approach based on "case studies". Three case studies were conducted on clusters, mainly in the field of Information and Communication Technology in Egypt: the Smart Village of Cairo, Maadi Technology Park and the Greek Campus. Data were collected from different sources (semi-structured interviews, non-participant observations, and secondary data). A complementary study was carried out on the Electronics, Mechanics and Mechatronics Cluster in Morocco and on the Morocco Microelectronics Cluster in order to reinforce the external validity of the research. Last but not least, we explored the case of CAPACITES, a knowledge valorization company created by the University of Nantes in France. The results of the research confirm that the diversity of governance forms of the three Egyptian clusters (public, private, public-private partnership) impacted their innovation and entrepreneurship strategies. This thesis proposes the O2 Innovation model for the innovation of clusters. The model highlights the importance of spreading the culture of entrepreneurship while emphasizing the primary role of the academy in an open innovation process.
Turcotte, Nicolas. "Mise à jour numérique d'un modèle de pont par éléments finis basée sur une analyse modale expérimentale." Mémoire, Université de Sherbrooke, 2016. http://hdl.handle.net/11143/8752.
Full textFavela, Contreras Antonio. "Modélisation et analyse du comportement dynamique des systèmes hybrides : une approche basée sur le modèle d'automate hybride." Grenoble INPG, 1999. http://www.theses.fr/1999INPG0121.
Full textNguyen, Minh Duc. "Méthodologie de test de systèmes mobiles : Une approche basée sur les sénarios." Phd thesis, Université Paul Sabatier - Toulouse III, 2009. http://tel.archives-ouvertes.fr/tel-00459848.
Full textNoel, David. "Une approche basée sur le web sémantique pour l'étude de trajectoires de vie." Thesis, Université Grenoble Alpes (ComUE), 2019. http://www.theses.fr/2019GREAM022/document.
The notion of trajectory is the subject of many works in computer science. The life trajectory has several peculiarities which distinguish it from the trajectories usually considered in these works. The first is its temporal extent, which is the life, the existence, of the observed subject. The second is its thematic extent, which potentially concerns multiple aspects of the life of an object or an individual. The last is the metaphorical use of the term trajectory, which refers more to the meaning of the trajectory than to the description of a simple evolution in time and space. The life trajectory is used by the expert (sociologist, urban planner, ...) who wishes to put into perspective the information on individuals to better understand their choices. The motivations for studying the life trajectory depend on the application and the themes considered: the relation to work and employment, family life, social life, health, residential trajectory, etc. We propose a Semantic Web based approach to study life trajectories, which allows their modeling, collection and analysis. This approach is embodied by a software architecture whose components are configurable for each application case. This architecture is based on a life trajectory ontology design pattern, as well as a model of explanatory factors for life events. To operationalize the proposed modeling, we designed algorithms that allow the creation of a life trajectory ontology by exploiting the previous pattern and model. For data collection, we developed APIs to facilitate i) the construction of a model-compliant data collection interface and ii) the insertion of the collected data into a triple store. Our approach allows the representation, and hence the collection and exploitation, of multi-granular information, whether spatial, temporal or thematic. Finally, to allow the analysis of the trajectories, we propose generic functions, which are implemented by extending the SPARQL language. The methodological approach and the proposed tools are validated on a case study on the residential choices of individuals in the Grenoble metropolitan area, by highlighting the characteristics of their residential trajectory and the explanatory elements of it, including from their personal and professional trajectories.
Amestoy, Patrick. "Factorisation de grandes matrices creuses non symétriques basée sur une méthode multifrontale dans un environnement multiprocesseur." Toulouse, INPT, 1990. http://www.theses.fr/1990INPT050H.
Full textZhang, Yiqun. "Contribution à l'étude de la vision dynamique : une approche basée sur la géométrie projective." Compiègne, 1993. http://www.theses.fr/1993COMPD650.
Full textNinin, Jordan. "Optimisation Globale basée sur l'Analyse d'Intervalles : Relaxation Affine et Limitation de la Mémoire." Phd thesis, Institut National Polytechnique de Toulouse - INPT, 2010. http://tel.archives-ouvertes.fr/tel-00580651.
Full textPirayesh, Neghab Amir. "Évaluation basée sur l’interopérabilité de la performance des collaborationsdans le processus de conception." Thesis, Paris, ENSAM, 2014. http://www.theses.fr/2014ENAM0033/document.
A design process, whether for a product or for a service, is composed of a large number of activities connected by many data and information exchanges. The quality of these exchanges, called collaborations, requires being able to send and receive data and information which are useful, understandable and unambiguous to the different designers involved. The research question is thus focused on the definition and evaluation of the performance of collaborations and, by extension, of the design process in its entirety. This performance evaluation requires the definition of several key elements such as the object(s) to be evaluated, the performance indicator(s) and the action variable(s). In order to define the object of evaluation, this research relies on a study of the literature resulting in a meta-model of collaborative processes. The measurement of the performance of these collaborations is for its part based on the concept of interoperability. Furthermore, this measure estimates the technical and conceptual interoperability of the different elementary collaborations. This work is concluded by proposing a tooled methodological framework for the evaluation of collaboration performance. Through a two-step approach (modeling and evaluation), this framework facilitates the identification of inefficient collaborations and their causes. This framework is illustrated and partially validated using an academic example and a case study in the design domain.
Nguyen, Minh Duc. "Méthodologie de test de systèmes mobiles : une approche basée sur les scénarios." Toulouse 3, 2009. http://thesesups.ups-tlse.fr/775/.
Advances in wireless networking have yielded the development of mobile computing applications. Their unique characteristics provide new challenges for verification. This dissertation elaborates on the testing technology for mobile systems. As a first step, a review of the state of the art is performed together with a case study, a group membership protocol (GMP) in mobile ad hoc settings, that allowed us to gain insights into testing problems. We then present a testing approach based on the description of scenarios. We note that the scenario interactions must take into account the spatial configurations of nodes as first-class concepts. To cover the new specificities of mobile systems in description languages, we introduce some extensions that focus on spatio-temporal relations between nodes and on broadcast communication. The processing of the spatial aspect leads to the development of the tool GraphSeq. GraphSeq aims to analyze test traces in order to identify occurrences of successive spatial patterns described in an abstract scenario. The application of GraphSeq (support for the implementation of test cases, verification of trace coverage) is shown with the analysis of the outputs of a simulator and the execution traces of the GMP case study.
Touili, Tayssir. "Analyse symbolique de systèmes infinis basée sur les automates : application à la vérification de systèmes paramétrés et dynamiques." Phd thesis, Université Paris-Diderot - Paris VII, 2003. http://tel.archives-ouvertes.fr/tel-00161124.
Full textles systèmes paramétrés et les programmes récursifs parallèles. Nous présen\-tons un cadre
uniforme pour la vérification algorithmique de ces systèmes. Ce cadre est basé sur la
représentation des ensembles de configurations par des automates de mots ou d'arbres, et la
représentation des relations de transition des systèmes par des règles de réécritures de mots
ou de termes. Le problème de la vérification est ensuite réduit au calcul des ensembles des
accessibles dans ce cadre. Les contributions de cette thèse sont les suivantes:
1- Définition d'une technique d'accélération générale. Nous proposons une méthode basée sur
des techniques d'extrapolation sur les automates, et nous étudions la puissance de cette approche.
2- Techniques de model-checking régulier pour la vérification des réseaux paramétrés avec des
topologies linéaires et arborescentes. En particulier, nous considérons les réseaux modélisés
par des systèmes de réécriture comprenant des semi-commutations, c-à-d. des règles de la forme ab -> ba,
et nous exhibons une classe de langages qui est effectivement fermée par ces systèmes.
3- Modélisation et vérification des programmes récursifs parallèles. Dans un premier temps,
nous étudions les modèles PRS qui sont plus généraux que les systèmes à pile, les réseaux de Petri,
et les systèmes PA; et nous proposons des algorithmes qui calculent les ensembles des accessibles
de (sous-classes de) PRS en considérant différentes sémantiques.
Dans une autre approche, nous considérons des modèles basés sur des automates à pile communicants
et des systèmes de réécritures à-la CCS, et nous proposons des méthodes de vérification de ces modèles
basées sur le calcul d'abstractions des langages des chemins d'exécutions. Nous proposons un cadre
algébrique générique permettant le calcul de ces abstractions.
Le, Kim Hung. "Analyse cyclique de sécurité des grands réseaux de transport et d'interconnexion basée sur les concepts de la localisation." Grenoble INPG, 1995. http://www.theses.fr/1995INPG0125.
Pettersson, Fredrik. "Le fantastique dans Le Spectre large de Gérard Prévot : Une analyse basée sur la théorie de Tzvetan Todorov." Thesis, Linnéuniversitetet, Institutionen för språk (SPR), 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-79278.
This thesis is centred on the fantastic features in Le Spectre Large by Gérard Prévot (1975) and subsequently on a comparison with the features of another collection by Prévot, La Nuit du Nord (1974), studied by Ricci in 1993. The analysis is based on Tzvetan Todorov's theory of the fantastic, which puts its emphasis on the hesitation over whether the enigmatic phenomena have their origins in the natural (uncanny) or the supernatural (marvellous) world. The research on Prévot's literary work is interesting given that, compared to other Belgian fantastic writers such as Jean Ray or Thomas Owen, Gérard Prévot is still rather unknown and not much studied. With regard to the results, it is found that the stories are of mixed quality in relation to fantastic theory. In the better tales, the fantastic is born out of a narration with a limited point of view and a vague and sometimes unknown narrator. It is also born out of the plot, which introduces the disruptive elements on the very last page. When comparing the corpus with Ricci's study (1993), several common features are found, but also a few differences. Prévot has written the two books using a similar vocabulary, and the stories often take place in related environments: the North, a place with mysterious connotations. A fantastic which Ricci calls "social" can also be found in both collections. The main point of difference between the books is the narration. While the narrator in La Nuit du Nord has an omniscient status, the one in Le Spectre Large is clearly limited and anonymous.
Zhang, Xiaoqun. "Reconstruction et régularisation en tomographie par une méthode de Fourier basée sur la variation totale." Lorient, 2006. http://www.theses.fr/2006LORIS075.
X-ray tomography consists in inverting a Radon transform from partial data in Fourier space, which makes this imaging process belong to the family of ill-posed inverse problems. Solving this problem needs (explicitly or implicitly) frequency-data interpolation and extrapolation from a polar grid to a Cartesian one. Our approach introduces the total variation seminorm in order to minimize the artifacts and to reconstruct a piecewise smooth image, visually close to the synthetic data sets used by radiologists. We propose an optimisation scheme under a constraint given in the Fourier space, whose advantages are a precise data modelling and a low computational cost. The main difficulty remains the formulation of the frequency constraint, which must strike a balance between a good data fit and enough freedom to reduce the total variation so that the image can be denoised. We study three constraints, the first one being based on local extrema over the polar grid, the second on an estimator of the local Lipschitz regularity and the third on local regression with nonlinear thresholding. The proposed model and the associated algorithms are validated by numerous experiments on realistic data, simulated from the Shepp-Logan phantom using different types of noise. Numerical results show a significant qualitative enhancement of the reconstructed images using our model compared to the standard algorithm (filtered backprojection), with the suppression of almost all noise and artefacts. The signal-to-noise ratio increases by 1 to 6 dB depending on the type of data and the constraint. Finally, the proposed algorithms are faster than the standard one by roughly a factor of three.
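The interplay between the total-variation term and a frequency-domain constraint can be sketched as a projected gradient scheme: a smoothed-TV descent step on the image, followed by re-imposing the known Fourier samples. The random Cartesian sampling mask, step size and smoothing parameter are assumptions of this illustration; the thesis works with polar-grid data and more refined local constraints.

```python
import numpy as np

def tv_gradient(u, eps=1e-3):
    """Gradient of the smoothed total variation sum(sqrt(|grad u|^2 + eps^2))."""
    ux = np.roll(u, -1, axis=1) - u
    uy = np.roll(u, -1, axis=0) - u
    norm = np.sqrt(ux**2 + uy**2 + eps**2)
    px, py = ux / norm, uy / norm
    return (np.roll(px, 1, axis=1) - px) + (np.roll(py, 1, axis=0) - py)

rng = np.random.default_rng(4)
image = np.zeros((64, 64))
image[16:48, 16:48] = 1.0                      # piecewise-constant phantom
mask = rng.random(image.shape) < 0.3           # 30% of frequencies observed
data = np.fft.fft2(image) * mask               # noiseless partial Fourier data

u = np.real(np.fft.ifft2(data))                # zero-filled starting point
for _ in range(200):
    u = u - 0.2 * tv_gradient(u)               # decrease the total variation
    U = np.fft.fft2(u)
    U[mask] = data[mask]                       # re-impose the known frequencies
    u = np.real(np.fft.ifft2(U))

print("mean absolute reconstruction error:", np.abs(u - image).mean())
```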
Noce, Aurélien. "Vision et reconstruction 3D basée sur la texture : application à la robotique médicale." Phd thesis, Montpellier 2, 2008. http://www.theses.fr/2008MON20244.
Full textChang, Jing. "Evaluation des risques de troubles musculo-squelettiques liés au travail basée sur OpenSim." Thesis, Ecole centrale de Nantes, 2018. http://www.theses.fr/2018ECDN0042/document.
Work-related musculoskeletal disorders cause physical and mental illnesses in workers, reduce their productivity and cause great losses to industries and society. This thesis focuses on the assessment of the physical risk of work-related musculoskeletal disorders in industry, for which four key points are identified: measuring workloads, assessing the effect of workload accumulation, quantifying individual characteristics and integrating the risk assessment into digital human modeling tools. In the state of the art, the epidemiologic studies of musculoskeletal disorders and the current methods used for their physical risk assessment are reviewed, as well as the studies concerning the four key points. The second part presents an experimental study involving 17 subjects to explore a new indicator of muscle fatigue based on surface EMG. In the next part, efforts are made to integrate a muscle fatigue model into OpenSim, a digital human modeling software, with which the capacity decrease of each muscle is predictable for a given task. The predicted values could be applicable to the physical risk assessment. The fourth part introduces the work of building a full-chain musculoskeletal model in OpenSim, given that no current model covers the muscles of the torso and all the limbs. Special attention is paid to the method used by OpenSim to adapt the model's inertial properties to individuals. The errors of the method are evaluated with reference data coming from whole-body 3D scans. In the last part, the newly built full-chain model is applied to the posture analysis of an overhead drilling task. Muscle activation varies as a function of posture, which is suggested as an indicator of posture load.