Dissertations / Theses on the topic 'Rupture detection'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the top 41 dissertations / theses for your research on the topic 'Rupture detection.'
Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.
Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.
Fu, Jiyuan. "Using Thermographic Image Analysis in Detection of Canine Anterior Cruciate Ligament Rupture Disease." Thesis, Southern Illinois University at Edwardsville, 2015. http://pqdtopen.proquest.com/#viewpdf?dispub=1582920.
Full text
Anterior cruciate ligament (ACL) rupture is a common trauma that frequently occurs in overweight dogs. Veterinarians use magnetic resonance imaging (MRI) as the standard method to diagnose this disease. However, MRI is expensive and time-consuming, so it is necessary to find an alternative diagnostic method. In this research, thermographic images are utilized as a prescreening tool for the detection of ACL rupture disease. In addition, a quantitative comparison is made of new feature vectors based on Gabor filters with different frequencies and orientations.
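As an illustration of the features this thesis compares, a Gabor response can be sketched in a few lines: the kernel below is the real part of a Gabor filter, and the parameter values (size, wavelength, orientation, envelope width) are illustrative assumptions, not the settings used in the thesis.

```python
import math

def gabor_kernel(size, wavelength, theta, sigma):
    """Real part of a Gabor filter: a plane wave at orientation
    `theta` modulated by an isotropic Gaussian envelope."""
    half = size // 2
    kernel = []
    for y in range(-half, half + 1):
        row = []
        for x in range(-half, half + 1):
            # Rotate the coordinate onto the filter orientation.
            x_theta = x * math.cos(theta) + y * math.sin(theta)
            envelope = math.exp(-(x * x + y * y) / (2.0 * sigma ** 2))
            carrier = math.cos(2.0 * math.pi * x_theta / wavelength)
            row.append(envelope * carrier)
        kernel.append(row)
    return kernel

def apply_at(image, kernel, cy, cx):
    """Filter response at one pixel (valid positions only)."""
    half = len(kernel) // 2
    return sum(kernel[j + half][i + half] * image[cy + j][cx + i]
               for j in range(-half, half + 1)
               for i in range(-half, half + 1))
```

A feature vector for a thermogram region is then the collection of such responses over several (wavelength, theta) pairs, which is the kind of vector the thesis compares quantitatively.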
Sim, Alisia Mara. "Detection of calcification in atherosclerotic plaques using optical imaging." Thesis, University of Edinburgh, 2018. http://hdl.handle.net/1842/33151.
Full text
Taillade, Thibault. "A new strategy for change detection in SAR time-series : application to target detection." Thesis, université Paris-Saclay, 2020. http://www.theses.fr/2020UPAST050.
Full text
The detection of targets such as ships or vehicles in SAR (Synthetic Aperture Radar) images is an essential challenge for surveillance and security purposes. In some environments, such as urban areas, harbor areas or forests observed at low radar frequencies, detecting these objects becomes difficult due to the high backscattering of the surrounding background. To overcome this issue, change detection (CD) between SAR images makes it possible to cancel the background and successfully highlight targets present within the scene. However, in several environments, a temporal overlap of targets may occur and generate misinterpretations, because the outcome relies on the relative change between objects of different sizes or properties. This is a critical issue when the purpose is to visualize and count the targets present on a specific day in high-attendance areas such as harbors or urban environments. Ideally, this change detection should occur between a target-free image and one with possible objects of interest. With the current accessibility of SAR time series, we propose to compute a frozen background reference (FBR) image that consists only of the temporally static background. Performing change detection between this FBR image and any SAR image then highlights the presence of ephemeral targets. This strategy has been implemented for ship detection in harbor environments and for vehicles hidden under foliage.
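The frozen-background idea can be sketched in the intensity domain: a per-pixel temporal median over a co-registered image stack retains the static background, and differencing any acquisition against it highlights ephemeral targets. This is a simplified stand-in for illustration only; the thesis works with coherent SAR statistics, not this plain thresholding.

```python
import statistics

def frozen_background(stack):
    """Per-pixel temporal median over a stack of co-registered
    intensity images: ephemeral targets are filtered out, the
    static background remains."""
    rows, cols = len(stack[0]), len(stack[0][0])
    return [[statistics.median(img[r][c] for img in stack)
             for c in range(cols)] for r in range(rows)]

def change_map(image, background, threshold):
    """Flag pixels whose intensity departs from the frozen background."""
    return [[abs(v - b) > threshold
             for v, b in zip(img_row, bg_row)]
            for img_row, bg_row in zip(image, background)]
```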
Lagrange, Dominique. "Estimation de la date et de l'amplitude d'une rupture conditionnellement a sa detection par une carte CUSUM." Paris, Institut national d'agronomie de Paris Grignon, 1997. http://www.theses.fr/1997INAP0045.
Full text
Dahal, Rohini. "Bilateral Thermographic Image Comparison Software Tool for Pathology Detection in Canines with Application to Anterior Cruciate Ligament (ACL) Rupture." Thesis, Southern Illinois University at Edwardsville, 2017. http://pqdtopen.proquest.com/#viewpdf?dispub=10276314.
Full text
Introduction: The bilateral symmetry property of animals can be used to detect pathologies by comparing body parts on both sides. For any pathological disorder, thermal patterns differ from those of normal body parts. A software application for veterinary clinics is under development which takes as input two thermograms of body parts on both sides, one normal and the other unknown, compares them on the basis of extracted features and appropriate similarity and difference measures, and outputs the likelihood of pathology. Previous research has been used to determine the appropriate image processing, feature extraction and comparison metrics. The comparison metrics used are the vector inner product, Tanimoto, Euclidean, city block, Minkowski and maximum value metrics. Results from comparison experiments are also used to derive potential threshold values which separate normal from abnormal images for a specific pathology.
Objectives: The main objective of this research is to build a comparison software tool, combining the concepts of the bilateral symmetry property of animals and IR thermography, that can be used for prescreening in veterinary clinics.
Comparison Software Tool Development: The comparison software tool was developed for veterinary clinics as a prescreening tool for pathology detection, using the concepts of thermography and the bilateral symmetry property of animals. The software tool has a graphical user interface (GUI) that allows ease of use for the clinical technician. The technician inputs images or raw temperature data CSV files and compares thermographic images of bilateral body parts. The software extracts features from the images and calculates the difference between the feature vectors with distance and/or similarity metrics. Based upon these metrics, the percentage deviation is calculated, which provides the deviation of the unknown (test) image from the known image. The percentage deviation between the thermograms of the same body parts on either side provides an indication regarding the extent and impact of the disease [Poudel, 2015]. Previous research in veterinary thermography [Liu, 2012; Subedi, 2014; Fu, 2014; Poudel, 2015] has been combined with the real-world veterinary clinical scenario to develop a software tool that can be helpful for researchers as well as for clinical technicians in the prescreening of pathologies.
Experimental Results and Discussion: Experiments were performed on ACL thermograms to determine a threshold that can separate normal and abnormal ACL images. The 18-color Meditherm images gave poor results and could not suggest any threshold value. However, results were positive for temperature-remapped 256-gray-level Meditherm images, which suggested that a percentage deviation of 40% could produce a separation. The total number of Normal-Normal pairs was greater than the total number of Normal-Abnormal pairs below 40% deviation. Similarly, the total number of Normal-Abnormal pairs was greater than the total number of Normal-Normal pairs above 40%. This trend was consistent for the Euclidean, maximum value and Minkowski distances, for texture distances of both 6 and 10. The performance in terms of sensitivity and specificity was poor: the best sensitivity achieved was 55% and the best specificity 67%. This indicates better results for predicting the absence of ACL rupture than for actually finding the disease. In this case the software could be used by the clinician in conjunction with other diagnostic methods.
Conclusion: The experiments, results and analysis show that the comparison software tool can be used in veterinary clinics for the prescreening of diseases in canines and felines, estimating the extent and impact of the disease based upon the percentage deviation. However, more research is necessary to examine its efficacy for specific pathologies. Note that the software can be used by researchers to compare any two images of any format. For the ACL experiments, there are indications that a threshold value separating normal from abnormal is possible, but the texture and spectral features suggested by previous research [Subedi, 2014; Liu, 2012; Fu, 2014; Poudel, 2015] are not sufficient to determine that threshold with the given image database.
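The distance metrics and percentage deviation described in this abstract can be sketched as follows. The normalization of the deviation by the known vector's magnitude is an assumption made for illustration; the abstract does not give the exact formula used by the tool.

```python
def minkowski(u, v, p):
    """Minkowski distance between feature vectors;
    p=1 is the city block metric, p=2 the Euclidean metric."""
    return sum(abs(a - b) ** p for a, b in zip(u, v)) ** (1.0 / p)

def max_value_distance(u, v):
    """Maximum value (Chebyshev) metric: largest componentwise gap."""
    return max(abs(a - b) for a, b in zip(u, v))

def percentage_deviation(known, test, p=2):
    """Deviation of the test feature vector from the known one,
    normalized by the known vector's magnitude (an assumed choice)."""
    norm = minkowski(known, [0.0] * len(known), p)
    return 100.0 * minkowski(known, test, p) / norm if norm else float("inf")
```

Against a calibrated cut-off (the abstract reports 40% for the gray-level images), deviations above the threshold would flag the test thermogram as suspicious.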
Le Bars, Batiste. "Event detection and structure inference for graph vectors." Thesis, université Paris-Saclay, 2021. http://www.theses.fr/2021UPASM003.
Full text
This thesis addresses different problems around the analysis and modeling of graph signals, i.e. vector data that are observed over graphs. In particular, we are interested in two tasks. The first is the problem of event detection, i.e. anomaly or change-point detection, in a set of graph vectors. The second task concerns the inference of the graph structure underlying the observed graph vectors contained in a data set. At first, our work takes an application-oriented aspect, in which we propose a method for detecting antenna failures or breakdowns in a telecommunication network. The proposed approach is designed to be effective for communication networks in a broad sense and implicitly takes into account the underlying graph structure of the data. In a second step, a new method for graph structure inference within the framework of Graph Signal Processing is investigated. In this problem, notions of both local and global smoothness, with respect to the underlying graph, are imposed on the vectors. Finally, we propose to combine the graph learning task with the change-point detection problem. This time, a probabilistic framework is considered to model the vectors, assumed to be distributed according to a specific Markov Random Field. In the considered modeling, the graph underlying the data is allowed to evolve in time, and a change point is detected whenever this graph changes significantly.
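The smoothness of a graph signal with respect to an underlying graph is classically measured by the Laplacian quadratic form; a minimal sketch of that quantity (the thesis's exact smoothness criteria may differ):

```python
def laplacian_quadratic(edges, weights, x):
    """Graph smoothness x^T L x = sum over edges (i, j) of
    w_ij * (x_i - x_j)^2. Small values mean the signal varies
    little across connected nodes."""
    return sum(w * (x[i] - x[j]) ** 2 for (i, j), w in zip(edges, weights))
```

Graph structure inference then amounts to choosing edge weights under which the observed vectors are smooth, subject to regularization that prevents the trivial empty graph.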
Do, Van Long. "Sequential detection and isolation of cyber-physical attacks on SCADA systems." Thesis, Troyes, 2015. http://www.theses.fr/2015TROY0032/document.
Full text
This PhD thesis was carried out in the framework of the project “SCALA”, which received financial support through the program ANR-11-SECU-0005. Its ultimate objective is the on-line monitoring of Supervisory Control And Data Acquisition (SCADA) systems against cyber-physical attacks. The problem is formulated as the sequential detection and isolation of transient signals in stochastic-dynamical systems in the presence of unknown system states and random noises. It is solved using the analytical redundancy approach, which consists of two steps: residual generation and residual evaluation. The residuals are first generated by both Kalman filter and parity space approaches. They are then evaluated using sequential analysis techniques taking into account certain criteria of optimality. However, these classical criteria are not adequate for the surveillance of safety-critical infrastructures. For such applications, it is suggested to minimize the worst-case probability of missed detection subject to acceptable levels of the worst-case probabilities of false alarm and false isolation. For the detection task, the optimization problem is formulated and solved in two scenarios: exactly known and partially known parameters. Sub-optimal tests are obtained and their statistical properties are investigated. Preliminary results for the isolation task are also obtained. The proposed algorithms are applied to the detection and isolation of malicious attacks on a simple SCADA water network.
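The residual-evaluation step can be illustrated with a one-sided CUSUM recursion, a standard sequential test. The drift and threshold values here are illustrative; the thesis's tests, optimized for transient signals under worst-case criteria, are more elaborate.

```python
def cusum_alarm(residuals, drift, threshold):
    """One-sided CUSUM on a residual sequence: returns the index of
    the first alarm, or None if no alarm is raised.
    Recursion: g_k = max(0, g_{k-1} + r_k - drift)."""
    g = 0.0
    for k, r in enumerate(residuals):
        g = max(0.0, g + r - drift)
        if g > threshold:
            return k
    return None
```

Under normal operation the residuals hover around zero and the statistic stays near zero; an attack biasing the residuals accumulates into the statistic until the threshold is crossed.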
Davis, Elizabeth H. "Detection of rupture-repair sequences in patterns of alliance development the effects of client vs. therapist raters and therapist training status /." Ohio : Ohio University, 2005. http://www.ohiolink.edu/etd/view.cgi?ohiou1133405084.
Full text
Davis, Elizabeth Helen. "Detection of Rupture-Repair Sequences in Patterns of Alliance Development: The Effects of Client vs. Therapist Raters and Therapist Training Status." Ohio University / OhioLINK, 2005. http://rave.ohiolink.edu/etdc/view?acc_num=ohiou1133405084.
Full textAllab, Nedjmeddine. "Détection d'anomalies et de ruptures dans les séries temporelles. Applications à la gestion de production de l'électricité." Thesis, Paris 6, 2016. http://www.theses.fr/2016PA066658.
Full text
Continental is the main tool that EDF uses for the long-term management of electricity. It elaborates the operating strategy of the electricity generation fleet, made up of power plants distributed all over Europe. The tool simulates, for each zone and each scenario, several variables, such as the electricity demand, the generated quantity and the related costs. Our work aims to provide methods for analyzing electricity production data in order to ease its exploration and synthesis. We collected a set of problems from the users of Continental, which we address through techniques of outlier and change-point detection in time series.
Fuchs, Robin. "Méthodes neuronales et données mixtes : vers une meilleure résolution spatio-temporelle des écosystèmes marins et du phytoplancton." Electronic Thesis or Diss., Aix-Marseille, 2022. http://www.theses.fr/2022AIXM0295.
Full text
Phytoplankton are one of the first links in the food web and generate up to 50% of the world's primary production. The study of phytoplankton and their physical environment requires observations with a resolution finer than a day and a kilometer, as well as consideration of the heterogeneous types of data involved and of the spatio-temporal dependency structures of marine ecosystems. This thesis aims to develop statistical methods in this context, using technologies such as automated flow cytometry. Theoretical developments focused on the Deep Gaussian Mixture Models (DGMM) introduced by Viroli and McLachlan (2019). To better characterize phytoplankton ecological niches, we extended these models to the mixed data (exhibiting continuous and non-continuous variables) often found in oceanography. A clustering method is proposed, as well as an algorithm for generating synthetic mixed data. Regarding the high-frequency study itself, convolutional neural networks were introduced to process flow cytometry outputs and to study six functional groups of phytoplankton in the littoral zone and the open ocean. Differentiated and reproducible responses of these groups were identified following wind-induced pulse events, highlighting the importance of the coupling between physics and biology. In this regard, a change-point detection method is proposed to delineate the epipelagic and mesopelagic zones, providing a new basis for the calculation of mesopelagic carbon budgets.
Ibrahim, Dalia. "Étude théorique d'indicateurs d'analyse technique." Thesis, Nice, 2013. http://www.theses.fr/2013NICE4008.
Full text
The aim of my thesis is to study mathematically an indicator widely used by practitioners in the trading market and designed to detect changes in the volatility term. The Bollinger Bands indicator belongs to the family of methods known as technical analysis, which consists in looking at past price movements in order to predict future price movements, independently of any mathematical model. We study the performance of this indicator in a universe governed by a stochastic differential equation (Black-Scholes) in which the volatility changes at an unknown and unobservable random time, for a practitioner seeking to maximize an objective function (for instance, the expected utility of wealth at a certain maturity). Within the framework of the model, the Bollinger indicator can be interpreted as an estimator of the time at which the volatility changes its value. We show that in the case of small volatilities, the density behavior of the indicator depends on the value of the volatility, which allows, for a large volatility ratio, detection via the estimated distribution of which volatility regime we are in. For the case of large volatilities, we show by an approach via the Laplace transform that the asymptotic tail behavior of the indicator depends on the volatility value, which allows the detection of a change for large volatilities. Next, we compare two indicators designed to detect a volatility change: the Bollinger bands and the quadratic variation indicators. Finally, we study the optimal portfolio allocation, which is described by a non-standard stochastic control problem, because the admissible controls need to be adapted to the filtration generated by the prices. We solve this control problem with an approach used by Pham and Jiao, separating the initial allocation problem into an allocation problem after the rupture and a problem before the rupture, each of which is solved by the dynamic programming method. A verification theorem is also proved for this stochastic control problem.
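The Bollinger Bands themselves are simply a rolling mean bracketed by k rolling standard deviations of the price; a minimal sketch (the window length and k below are conventional choices, not the thesis's calibration):

```python
import statistics

def bollinger_bands(prices, window=20, k=2.0):
    """For each position with a full window, return the triple
    (middle, lower, upper) = (mean, mean - k*sd, mean + k*sd),
    where mean and sd are computed over the trailing window."""
    bands = []
    for t in range(window - 1, len(prices)):
        chunk = prices[t - window + 1 : t + 1]
        m = statistics.mean(chunk)
        sd = statistics.pstdev(chunk)
        bands.append((m, m - k * sd, m + k * sd))
    return bands
```

A sustained widening of the gap between the upper and lower bands is the kind of event that, in the thesis's model, signals a change in the volatility regime.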
Germain, Florence. "Estimation du mouvement dans une sequence d'images, en contexte lineaire optimal." Grenoble INPG, 1995. http://www.theses.fr/1995INPG0030.
Full textMiallaret, Sophie. "Dynamic Monitoring Measures." Thesis, Université Clermont Auvergne (2017-2020), 2019. http://www.theses.fr/2019CLFAC091.
Full text
Measurements are everyday actions: they give us a great deal of information and allow us to make decisions. The analysis of measurements can teach us about our environment, but a measurement error can have important consequences in certain areas. In the first part, based on a study of blood test measurements carried out at the CHU of Clermont-Ferrand, we propose a procedure for detecting deviations of medical biology laboratory analyzers from patient analysis measurements. After a descriptive analysis of the data, the method, which uses time-series change-point detection techniques, is tested on simulated breaks representing machine offsets, imprecision or drift, for different measured biological parameters. The method is adapted to two scenarios: when the patient's hospital department is known, and when it is not. The study is supplemented by an analysis of the impact of measurement uncertainty on patient analyses. In the second part, we study measurements of volcanic ash shapes made at the Laboratoire Magmas et Volcans of Clermont Auvergne University, in order to determine a link between the collection locations and the shapes of the particles. After showing the dependence between these parameters, we propose, using a classification method, a grouping of particles representing different populations depending on the distance between the collection locations and the volcano crater.
Mercklé, Jean. "Stratégies de détection de rupture de modèle appliquées à la recherche et à la localisation des défauts sur des produits sidérurgiques." Nancy 1, 1988. http://www.theses.fr/1988NAN10047.
Full text
Laredo, Jeanette A. "Reading the Ruptured Word: Detecting Trauma in Gothic Fiction from 1764-1853." Thesis, University of North Texas, 2016. https://digital.library.unt.edu/ark:/67531/metadc862792/.
Full textVan, der Werff Matthew John. "Development of digital instrumentation for bond rupture detection : a thesis presented in partial fulfilment of the requirements for the degree of Doctor of Philosophy in Engineering at Massey University, Palmerston North, New Zealand." Massey University, 2009. http://hdl.handle.net/10179/857.
Full textDavy, Axel. "Modélisation de fonds complexes statiques et en mouvement : application à la détection d'événements rares dans les séries d'images." Thesis, Université Paris-Saclay (ComUE), 2019. http://www.theses.fr/2019SACLN048/document.
Full text
The first part of this thesis is dedicated to the modeling of image and video backgrounds, applied to anomaly detection. In the case of anomaly detection on a single image, our analysis leads us to identify five different families of structural assumptions on the background. We propose new algorithms for single-image anomaly detection, small-target detection on a moving background, change detection on satellite SAR (Synthetic Aperture Radar) images, and cloud detection on sequences of satellite optical images. In the second part, we study two further applications of background modeling. To perform video denoising, we search, for every video patch, similar patches in the video sequence, and feed their central pixels to a convolutional neural network (CNN). The background model in this case is hidden in the CNN weights. In our experiments, the proposed method is the best performing of the compared CNN-based methods. We also study exemplar-based texture synthesis, in which texture samples have to be generated from a single reference sample. Our survey classifies the families of algorithms for this task according to their model assumptions. In addition, we propose improvements to fix the border-behavior issues that we pointed out in several deep-learning-based methods. In the third part, we propose real-time GPU implementations of B-spline interpolation and of several image and video denoising algorithms: NL-means, BM3D and VBM3D. The speed of the proposed implementations enables their use in real-time scenarios, and they are currently being transitioned to industry.
Debbabi, Nehla. "Approche algébrique et théorie des valeurs extrêmes pour la détection de ruptures : Application aux signaux biomédicaux." Thesis, Reims, 2015. http://www.theses.fr/2015REIMS025/document.
Full text
This work develops unsupervised techniques for the on-line detection and location of change points in noisy recorded signals. These techniques are based on the combination of an algebraic approach with Extreme Value Theory (EVT). The algebraic approach offers an easy identification of the change points, characterizing them in terms of delayed Dirac distributions and their derivatives, which are easily handled via operational calculus. This algebraic characterization, giving rise to an explicit expression of the change-point locations, is completed with a probabilistic interpretation in terms of extremes: a change point is seen as a rare and extreme event. Based on EVT, these events are modeled by a Generalized Pareto Distribution. Several hybrid multi-component models are proposed in this work, modeling at the same time the mean behavior (noise) and the extreme behavior (change points) of the signal after algebraic processing. Unsupervised algorithms are proposed to estimate these hybrid models, avoiding the problems encountered with classical estimation methods, which are ad hoc and graphical. The change-point detection algorithms developed in this thesis are validated on generated data and then applied to real data stemming from different phenomena, where the change points represent the information to be extracted.
Vozel, Benoit. "Etude comparative d'algorithmes recursifs de detection de ruptures spectrales." Nantes, 1994. http://www.theses.fr/1994NANT2021.
Full text
Laurent, Hélène. "Detection de ruptures spectrales dans le plan temps-frequence." Nantes, 1998. http://www.theses.fr/1998NANT2081.
Full textHarlé, Flore. "Détection de ruptures multiples dans des séries temporelles multivariées : application à l'inférence de réseaux de dépendance." Thesis, Université Grenoble Alpes (ComUE), 2016. http://www.theses.fr/2016GREAT043/document.
Full text
This thesis presents a method for multiple change-point detection in multivariate time series, and exploits the results to estimate the relationships between the components of the system. The originality of the model, called the Bernoulli Detector, relies on the combination of a local statistic from a robust test, based on the computation of ranks, with a global Bayesian framework. This nonparametric model does not require strong hypotheses on the distribution of the observations; it is applicable without modification to Gaussian data as well as data corrupted by outliers. The detection of a single change point is controlled even for small samples. In a multivariate context, a term is introduced to model the dependencies between the changes, assuming that if two components are connected, the events occurring in the first tend to affect the second instantaneously. Thanks to this flexible model, the segmentation is sensitive to common changes shared by several signals but also to isolated changes occurring in a single signal. The method is compared with other solutions from the literature, especially on real datasets of household electrical consumption and genomic measurements. These experiments demonstrate the interest of the model for the detection of change points in independent, conditionally independent or fully connected signals. The synchronization of the change points within the time series is finally exploited in order to estimate the relationships between the variables, using the Bayesian network formalism. By adapting the score function of a structure learning method, we verify that the independence model describing the system can be partly retrieved through the information given by the change points estimated by the Bernoulli Detector.
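The "local statistic from a robust test, based on the computation of ranks" is in the spirit of a Mann-Whitney comparison of the samples before and after a candidate change point. The sketch below shows that ingredient alone; the Bernoulli Detector's actual statistic and its Bayesian combination across components are more involved.

```python
def rank_change_statistic(x, t):
    """Mann-Whitney-style statistic for a candidate change at t:
    the count of pairs (i < t <= j) with x[i] < x[j], centered by
    its no-change expectation and normalized to [-0.5, 0.5]."""
    left, right = x[:t], x[t:]
    u = sum(1 for a in left for b in right if a < b)
    expected = len(left) * len(right) / 2.0
    return (u - expected) / (len(left) * len(right))

def best_change_point(x):
    """Candidate change point maximizing the absolute rank statistic."""
    return max(range(1, len(x)),
               key=lambda t: abs(rank_change_statistic(x, t)))
```

Because only ranks are used, a single gross outlier shifts the statistic far less than it would shift a mean-based statistic, which is the robustness property the abstract emphasizes.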
Myers, Vincent. "Le traitement, l'interprétation et l'exploitation d'images sonar à antenne synthétique obtenues à partir de trajectoires répétitives." Thesis, Brest, École nationale supérieure de techniques avancées Bretagne, 2019. http://www.theses.fr/2019ENTA0002.
Full text
There are many scenarios which call for the surveillance of an underwater scene by means of repeated surveys with high-frequency imaging sonar, in order to detect changes which may have occurred during the intervening time interval. With the growing availability of commercial synthetic aperture sonar (SAS) systems, it becomes possible to exploit the phase coherence between two complex SAS images in order to detect scene changes which are subtle or even invisible to approaches using only the amplitude of the images. This thesis examines the concept of coherent change detection (CCD) using SAS imagery obtained from separate, repeated passes over the same area. As the images must be processed interferometrically, the challenging problem of co-registration is addressed, with approaches based on image warping as well as re-navigation and re-imaging. False-alarm reduction techniques are also examined in order to mitigate detections caused by coherence losses which are not attributable to the insertion or removal of targets of interest. The proposed methods are tested on several repeat-pass SAS images collected during experiments at sea, spanning multiple frequency bands and environmental conditions, and show that SAS CCD is not only possible, but also able to detect very subtle scene changes that are not observable using standard approaches.
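The phase coherence exploited by CCD is classically estimated over a local window as |Σ x ȳ| / √(Σ|x|² Σ|y|²); a sketch on flattened pixel windows (co-registration, which the thesis identifies as the hard part, is assumed already done):

```python
def coherence(x, y):
    """Sample coherence magnitude between two complex pixel windows.
    Close to 1.0 where the two passes are coherent; it drops where
    the scene changed between passes."""
    num = abs(sum(a * b.conjugate() for a, b in zip(x, y)))
    den = (sum(abs(a) ** 2 for a in x) *
           sum(abs(b) ** 2 for b in y)) ** 0.5
    return num / den if den else 0.0
```

Note that a constant phase shift between the passes leaves the coherence magnitude unchanged, which is why this statistic sees changes that amplitude-only comparison misses.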
Kouamé, Denis. "Modelisation parametrique et detection de ruptures en traitement du signal ultrasonore." Tours, 1996. http://www.theses.fr/1996TOUR4015.
Full textRAMDANI, LOUNI RABEA. "Detection de ruptures de modeles et identification parametrique hybride de systemes non stationnaires." Paris 6, 1993. http://www.theses.fr/1993PA066460.
Full textBaysse, Camille. "Analyse et optimisation de la fiabilité d'un équipement opto-électrique équipé de HUMS." Phd thesis, Université Sciences et Technologies - Bordeaux I, 2013. http://tel.archives-ouvertes.fr/tel-00986112.
Full textTruong, Charles. "Détection de ruptures multiples – application aux signaux physiologiques." Thesis, Université Paris-Saclay (ComUE), 2018. http://www.theses.fr/2018SACLN030/document.
Full text
This work addresses the problem of detecting multiple change points in (univariate or multivariate) physiological signals. Well-known examples of such signals include the electrocardiogram (ECG), the electroencephalogram (EEG), and inertial measurements (accelerations, angular velocities, etc.). The objective of this thesis is to provide change-point detection algorithms that (i) can handle long signals, (ii) can be applied to a wide range of real-world scenarios, and (iii) can incorporate the knowledge of medical experts. In particular, a greater emphasis is placed on fully automatic procedures which can be used in daily clinical practice. To that end, robust detection methods as well as supervised calibration strategies are described, and a documented open-source Python package is released. The first contribution of this thesis is a sub-optimal change-point detection algorithm that can accommodate time-complexity constraints while retaining most of the robustness of optimal procedures. This algorithm is sequential and alternates between the two following steps: a change point is estimated, then its contribution to the signal is projected out. In the context of mean shifts, asymptotic consistency of the estimated change points is obtained. We prove that this greedy strategy can easily be extended to other types of changes by using reproducing kernel Hilbert spaces. Thanks to this novel approach, physiological signals can be handled without assumptions about the generative model of the data. Experiments on real-world signals show that these approaches are more accurate than standard sub-optimal algorithms and faster than optimal algorithms. The second contribution of this thesis consists of two supervised algorithms for automatic calibration. Both rely on labeled examples, which in our context consist of segmented signals. The first approach learns the smoothing parameter for the penalized detection of an unknown number of changes.
The second procedure learns a non-parametric transformation of the representation space that improves detection performance. Both supervised procedures yield finely tuned detection algorithms that are able to replicate the segmentation strategy of an expert. Results show that these supervised algorithms outperform unsupervised ones, especially in the case of physiological signals, where the notion of change heavily depends on the physiological phenomenon of interest. All algorithmic contributions of this thesis can be found in "ruptures", an open-source Python library, available online. Thoroughly documented, "ruptures" also comes with a consistent interface for all methods.
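The penalized detection of an unknown number of changes, of the kind solved by the dynamic-programming and PELT-style methods in `ruptures`, can be sketched with an optimal-partitioning recursion under an L2 cost. This is a simplified stand-in for illustration, not the library's actual code.

```python
def l2_cost(x, a, b):
    """Cost of segment x[a:b]: sum of squared deviations from its mean."""
    seg = x[a:b]
    m = sum(seg) / len(seg)
    return sum((v - m) ** 2 for v in seg)

def penalized_segmentation(x, penalty, min_size=1):
    """Optimal-partitioning dynamic program: F[t] is the best cost of
    segmenting x[:t], where each additional segment costs `penalty`."""
    n = len(x)
    F = [0.0] + [float("inf")] * n
    last = [0] * (n + 1)
    for t in range(min_size, n + 1):
        for s in range(0, t - min_size + 1):
            c = F[s] + l2_cost(x, s, t) + penalty
            if c < F[t]:
                F[t], last[t] = c, s
    # Backtrack the optimal segment boundaries.
    bkps, t = [], n
    while t > 0:
        bkps.append(t)
        t = last[t]
    return sorted(bkps)
```

The returned list of segment ends includes the series length itself, mirroring the convention of the library's `predict` methods; the penalty plays the role of the smoothing parameter that the thesis's first supervised procedure learns from labeled segmentations.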
Gautrelet, Christophe. "Développement et exploitation d'un banc vibratoire en flexion pour les essais de fatigue Linearity investigation from a vibratory fatigue bench Fatigue curves of a low carbon steel obtained from vibration experiments with an electrodynamic shaker." Thesis, Normandie, 2019. http://www.theses.fr/2019NORMIR21.
Full text
The aim of the Laboratory of Mechanics of Normandy is to provide experimental data to enhance the numerical models developed for structural design. We have therefore set up a vibration-based uniaxial bending bench for testing structures under high-cycle fatigue. After establishing specifications for the characteristics of the excitations and measurements, a qualification investigation of this test bench is presented in order to obtain its performance ranges under the assumption of system linearity. Then, to verify the feasibility of fatigue testing, we conducted a study to establish a characterization curve of a low-carbon steel. Nevertheless, this test bench does not distinguish the phases of crack initiation, propagation and fracture. We have therefore proposed ways to assess the number of cycles to crack initiation: the first is based on the slope variation of the strain amplitude measured from strain gauges, and the second uses the resonant frequency drop. We have also proposed a simple nonlinear damage model based on the variation of the resonant frequency, which also makes it possible to evaluate the number of cycles to fracture of the specimens. The experimental fatigue life obtained by this model is compared with the fatigue life obtained by a damage model from the literature.
Sorba, Olivier. "Pénalités minimales pour la sélection de modèle." Thesis, Université Paris-Saclay (ComUE), 2017. http://www.theses.fr/2017SACLS043/document.
Full text
L. Birgé and P. Massart proved that the minimum penalty phenomenon occurs in Gaussian model selection when the model family arises from complete variable selection among independent variables. We extend some of their results to discrete Gaussian signal segmentation when the model family corresponds to a sufficiently rich family of partitions of the signal's support. This is the case for regression trees. We show that the same phenomenon occurs in the context of density estimation. The richness of the model family can be related to a certain form of isotropy; in this respect, the minimum penalty phenomenon is intrinsic. To corroborate this point of view, we show that the minimum penalty phenomenon occurs when the models are chosen randomly under an isotropic law.
Ternynck, Camille. "Contributions à la modélisation de données spatiales et fonctionnelles : applications." Thesis, Lille 3, 2014. http://www.theses.fr/2014LIL30062/document.
Full textIn this dissertation, we are interested in the nonparametric modeling of spatial and/or functional data, based mainly on kernel methods. The samples considered for establishing the asymptotic properties of the proposed estimators consist of dependent variables. The specificity of the studied methods lies in the fact that the estimators take into account the dependence structure of the data. In a first part, we study real-valued, spatially dependent variables. We propose a new kernel approach to estimating the spatial probability density, the mode and the regression function. The distinctive feature of this approach is that it takes into account both the proximity between observations and the proximity between sites. We study the asymptotic behavior of the proposed estimators as well as their application to simulated and real data. In a second part, we are interested in modeling data valued in a space of infinite dimension, the so-called "functional data". As a first step, we adapt the nonparametric regression model introduced in the first part to the framework of spatially dependent functional data, obtaining both convergence results and numerical results. We then study a time-series regression model in which the explanatory variables are functional and the innovation process is autoregressive, and we propose a procedure that takes into account the information contained in the error process. After establishing the asymptotic behavior of the proposed kernel estimator, we study its performance on simulated and real data. The third part is devoted to applications. First of all, we present unsupervised classification results for simulated and real (multivariate) spatial data. The considered classification method is based on the estimation of the spatial mode, obtained from the spatial density function introduced in the first part of this thesis.
We then apply this mode-based classification method, as well as other unsupervised classification methods from the literature, to hydrological data of a functional nature. Lastly, this classification of hydrological data has led us to apply change-point detection tools to these functional data.
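The double weighting described in this abstract, proximity between observations and proximity between sites, can be sketched as a Nadaraya-Watson estimator with a product of two Gaussian kernels. This is an illustrative reconstruction, not the estimator studied in the thesis; the function name, kernel choice and bandwidths are assumptions.

```python
import numpy as np

def spatial_nw_regression(x0, s0, X, S, Y, h_x=1.0, h_s=1.0):
    """Kernel regression estimate at covariate x0 and site s0.
    The weight of each observation is the product of a Gaussian
    kernel on the covariates X (proximity between observations)
    and a Gaussian kernel on the coordinates S (proximity between
    sites), so both kinds of closeness are taken into account."""
    k_obs = np.exp(-0.5 * ((np.asarray(X) - x0) / h_x) ** 2)
    k_site = np.exp(-0.5 * (np.linalg.norm(np.asarray(S) - s0, axis=1) / h_s) ** 2)
    w = k_obs * k_site
    return float(np.sum(w * Y) / np.sum(w))
```

With all responses equal, the estimate reproduces that constant, a quick sanity check on the weighting.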
Fhima, Mehdi. "Détection de ruptures et mouvement Brownien multifractionnaire." Thesis, Clermont-Ferrand 2, 2011. http://www.theses.fr/2011CLF22197.
Full textThis Ph.D. dissertation deals with the off-line detection of change points in the parameters of time series of independent random variables, and in the Hurst parameter of multifractional Brownian motion. It consists of three articles. In the first paper, published in Sequential Analysis, we lay the cornerstones of the Filtered Derivative with p-Value (FDpV) method for the detection of change points in the parameters of independent random variables. This method has linear time and memory complexity with respect to the size of the series. It consists of two steps. The first step is based on the Filtered Derivative method, which detects the true change points as well as false ones. We improve the Filtered Derivative method by adding a second step, in which we compute the p-value associated with each potential change point and then eliminate the false alarms, i.e. the change points whose p-value is larger than a given critical level. We establish the asymptotic properties needed for the calibration of the algorithm. The effectiveness of the method is demonstrated both on simulated and on real data. We then apply the method to the detection of change points in the Hurst parameter of multifractional Brownian motion. This was done in two phases. In the first phase, in a paper to be published in ESAIM P&S, we investigate the Central Limit Theorem for the Increment Ratio Statistic (IRS) of a multifractional Brownian motion, leading to a CLT for the time-varying Hurst index. The proofs are quite simple, relying on Breuer-Major theorems and an original freezing-of-time strategy. The second phase relies on a new paper submitted for publication, in which we adapt the FDpV method to detect change points in the Hurst parameter of piecewise fractional Brownian motion. The underlying statistic of the FDpV method is a new estimator of the Hurst index, the so-called Increment Zero-Crossing Statistic (IZCS), a variation of the IRS.
Both FDpV and IZCS have linear time and memory complexity with respect to the size of the series.
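The two-step scheme sketched in this abstract, a filtered derivative followed by p-value screening, can be illustrated with a toy detector. This is not the published FDpV algorithm: the candidate-selection rule, the Gaussian approximation of the two-sample p-value, and all parameter defaults are assumptions made for the example.

```python
import math
import numpy as np

def fdpv_changepoints(x, window=50, alpha=0.01):
    """Toy Filtered-Derivative-with-p-Value detector.
    Step 1: compute the filtered derivative D(t), the difference of
    means over two adjacent windows, and keep local maxima of |D|
    above its standard deviation as candidate change points.
    Step 2: keep a candidate only if a two-sample test between the
    windows is significant (Gaussian approximation of the p-value)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    D = np.array([x[t:t + window].mean() - x[t - window:t].mean()
                  for t in range(window, n - window)])
    a = np.abs(D)
    candidates = [i for i in range(1, len(D) - 1)
                  if a[i] > D.std() and a[i] >= a[i - 1] and a[i] >= a[i + 1]]
    kept = []
    for i in candidates:
        tau = i + window                       # index in the original series
        left, right = x[tau - window:tau], x[tau:tau + window]
        se = math.sqrt(left.var(ddof=1) / window + right.var(ddof=1) / window)
        t_stat = (right.mean() - left.mean()) / se
        p_value = math.erfc(abs(t_stat) / math.sqrt(2.0))  # two-sided
        if p_value < alpha:
            kept.append(tau)
    return kept
```

Both steps scan the series once with fixed-size windows, which is where the linear time and memory complexity claimed by the abstract comes from.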
Garreau, Damien. "Change-point detection and kernel methods." Thesis, Paris Sciences et Lettres (ComUE), 2017. http://www.theses.fr/2017PSLEE061/document.
Full textIn this thesis, we focus on a method for detecting abrupt changes in a sequence of independent observations belonging to an arbitrary set on which a positive semidefinite kernel is defined. That method, kernel change-point detection, is a kernelized version of a penalized least-squares procedure. Our main contribution is to show that, for any kernel satisfying some reasonably mild hypotheses, this procedure outputs a segmentation close to the true segmentation with high probability. This result is obtained under a boundedness assumption on the kernel, for a linear penalty and for another penalty function coming from model selection. The proofs rely on a concentration result for bounded random variables in Hilbert spaces, and we prove a less powerful result under relaxed hypotheses (a finite-variance assumption). In the asymptotic setting, we show that we recover the minimax rate for the change-point locations without additional hypotheses on the segment sizes, and we provide empirical evidence supporting these claims. Another contribution of this thesis is a detailed presentation of the different notions of distance between segmentations; additionally, we prove that these different notions coincide for sufficiently close segmentations. From a practical point of view, we demonstrate how the so-called dimension jump heuristic can be a reasonable choice of penalty constant when using kernel change-point detection with a linear penalty. We also show how a key quantity depending on the kernel, which appears in our theoretical results, influences the performance of kernel change-point detection in the case of a single change point. When the kernel is translation-invariant and parametric assumptions are made, it is possible to compute this quantity in closed form. Thanks to these computations, some of them novel, we are able to study precisely the behavior of the maximal penalty constant.
Finally, we study the median heuristic, a popular tool for setting the bandwidth of radial basis function kernels. For a large sample size, we show that it behaves approximately as the median of a distribution that we describe completely in the settings of the kernel two-sample test and kernel change-point detection. More precisely, we show that the median heuristic is asymptotically normal around this value.
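The median heuristic studied in the last contribution is simple to state: set the RBF bandwidth to the median of the pairwise distances between observations. A minimal sketch follows (the function name and the choice of Euclidean distances on raw observations are assumptions, not taken from the thesis):

```python
import numpy as np

def median_heuristic_bandwidth(X):
    """Median heuristic: the RBF bandwidth is set to the median of
    the pairwise Euclidean distances between the observations."""
    X = np.asarray(X, dtype=float)
    if X.ndim == 1:
        X = X[:, None]                       # treat scalars as 1-D points
    diffs = X[:, None, :] - X[None, :, :]    # all pairwise differences
    d2 = np.sum(diffs ** 2, axis=-1)         # squared distances
    iu = np.triu_indices(len(X), k=1)        # each pair counted once
    return float(np.sqrt(np.median(d2[iu])))
```

For the points 0, 1, 2 the pairwise distances are 1, 1 and 2, so the heuristic returns 1. It is around this kind of median that the thesis establishes asymptotic normality.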
Srinivas, Sushma. "DETECTING VULNERABLE PLAQUES WITH MULTIRESOLUTION ANALYSIS." Cleveland State University / OhioLINK, 2011. http://rave.ohiolink.edu/etdc/view?acc_num=csu1326932229.
Full textGUIHENNEUC, CHANTAL. "Test du rapport des maximums de vraisemblance : - emploi en genetique; - detections de rupture dans les modeles de regression lineaire; application au sida." Paris 11, 1991. http://www.theses.fr/1991PA112350.
Full textPotin, Delphine. "Traitement des signaux pour la detection de mines antipersonnel." Ecole Centrale de Lille, 2007. http://www.theses.fr/2007ECLI0003.
Full textThe millions of landmines spread over the planet are not only a humanitarian disaster; they also hinder the social and economic development of the affected countries. In this thesis, new signal processing methods are proposed for the detection and localization of landmines in data recorded by a GPR (Ground Penetrating Radar). First, two digital filters are designed to remove clutter from the Bscan and Cscan data delivered by GPRs; these two kinds of data are interpreted as vertical and horizontal slices of the ground, respectively. To design the digital filters, a frequency analysis of a geometrical model of the clutter and of a geometrical model of the signal coming from a landmine is carried out for each type of data. Second, a new detection method, based on a nonparametric abrupt-change detection technique, is proposed to detect and localize landmines in Bscan data. The method consists in searching for spatial abrupt changes to detect the possible horizontal positions of landmines, and for temporal abrupt changes to detect the response times of buried objects. A detection method based on contour extraction is also proposed to automatically localize landmines in Cscan data. The performances of these two landmine detection methods are studied in terms of detection probability and false-alarm probability.
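A common baseline for the clutter-removal step described in this abstract is mean-trace subtraction: the clutter (ground reflection) is nearly identical at every antenna position, so subtracting the average trace suppresses it while localized target responses survive. This sketch is an illustrative baseline, not the filters designed in the thesis.

```python
import numpy as np

def remove_clutter_mean_trace(bscan):
    """Simple clutter filter for a Bscan image (rows = time samples,
    columns = antenna positions): subtract from each row its mean
    across positions, cancelling the position-invariant clutter."""
    bscan = np.asarray(bscan, dtype=float)
    return bscan - bscan.mean(axis=1, keepdims=True)
```

On a toy Bscan where every column carries the same clutter profile plus one bright target sample, the filtered image is zero everywhere except around the target.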
Maroney, Roy Thomas. "Missed opportunities for the detection of abdominal aortic aneurysms : a retrospective study of eighteen patients presenting with a ruptured or acute symptomatic abdominal aortic aneurysm." Master's thesis, University of Cape Town, 1997. http://hdl.handle.net/11427/25566.
Full textBossavy, Arthur. "Caractérisation et prédiction probabiliste des variations brusques et importantes de la production éolienne." Phd thesis, Ecole Nationale Supérieure des Mines de Paris, 2012. http://pastel.archives-ouvertes.fr/pastel-00803234.
Full textTu, Yi-shan, and 杜怡珊. "Detection of Diaphragmatic Rupture using Computed Tomography Images." Thesis, 2008. http://ndltd.ncl.edu.tw/handle/60630666095198382830.
Full text國立中正大學
資訊工程所
96
There is a high possibility that injuries to the diaphragm occur after penetrating or blunt trauma, and with the passing of time the damage is progressive. Complex diaphragmatic rupture leads to higher morbidity and mortality, so early diagnosis and treatment are extremely important. In most cases, diaphragmatic injuries are not seen in isolation but together with other concurrent injuries; thus doctors often pay more attention to the coronal view of the computed tomography images and miss the examination of the diaphragm. In this paper, a detection method is developed to examine diaphragmatic rupture using computed tomography images. In this diaphragm detection system, the image resolution is first reduced to decrease the computation time. Then, two kinds of detection are applied to different regions of the computed tomography images. One is lung air boundary detection: the fact that the intensities of liquid and organs differ from the intensity of air is used to detect the border between air and liquid, and the principle that the boundary should be continuous is used to improve the correctness of the detected result. The other is diaphragm detection: the diaphragm dome has a different intensity from other tissues in the coronal view of the computed tomography images, and the Sobel operator is used to enhance the diaphragm dome. The characteristic shape of the diaphragm is also used as a guiding rule for the diaphragm detection. Finally, the results of the lung air boundary detection and the diaphragm detection are compared to determine whether the diaphragm is injured. In our experiments, a total of 18 cases were examined, and the detection accuracy rate is up to 95%. The results show that the proposed system can detect diaphragmatic rupture in the images with good accuracy.
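The Sobel enhancement step mentioned in the abstract can be sketched as follows. This is an illustrative implementation of the standard Sobel gradient magnitude, not the system's code.

```python
import numpy as np

def sobel_magnitude(img):
    """Sobel gradient magnitude, used here the way the abstract
    describes: to enhance the bright diaphragm dome against the
    surrounding tissue on a coronal CT slice."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T                                # vertical-gradient kernel
    h, w = img.shape
    out = np.zeros((h, w), dtype=float)
    for i in range(1, h - 1):                # borders left at zero
        for j in range(1, w - 1):
            patch = img[i - 1:i + 2, j - 1:j + 2]
            gx = (patch * kx).sum()
            gy = (patch * ky).sum()
            out[i, j] = np.hypot(gx, gy)
    return out
```

On a slice with a sharp vertical intensity step, the response peaks along the step and vanishes in uniform regions, which is the behavior exploited to outline the dome.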
Hsu, Cheng-I., and 許正宜. "Thermographic Detection of Creep-rupture Tubes Using a CCD Camera." Thesis, 2008. http://ndltd.ncl.edu.tw/handle/58148628057755008207.
Full text國立彰化師範大學
電機工程學系
96
This study aims to use a low-cost camera to detect abnormally hot areas on tubes before creep rupture occurs. Based on the image gray level, the infrared energy radiated by a hot surface can be detected, so differences in heat distribution caused by tube flaws can be revealed and the temperature can also be calculated. Gray levels influenced by external illumination or surface color can be corrected through image processing. Thus, through contactless full-field temperature detection, the creep rupture of tubes under high temperature can be identified and the surface temperature can also be measured. Keywords: creep rupture, charge-coupled device, infrared, thermographic.
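The gray-level correction and hot-area detection described above can be illustrated with a minimal sketch. The ratio test against a reference frame is an assumed stand-in for the study's radiometric calibration; the function name and threshold are illustrative.

```python
import numpy as np

def detect_hot_spots(gray, reference, ratio=1.2):
    """Flag abnormally hot pixels on a tube surface (illustrative).
    Dividing each gray level by a reference image of the same scene
    at normal temperature compensates for illumination and surface
    color; pixels whose corrected level exceeds `ratio` are flagged."""
    gray = np.asarray(gray, dtype=float)
    reference = np.maximum(np.asarray(reference, dtype=float), 1.0)
    return (gray / reference) > ratio
```

A true temperature readout would additionally need the camera's gray-level-to-temperature calibration curve mentioned in the abstract.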
Pham, Tuan A. "Early detection and treatment strategies for vulnerable atherosclerotic plaques." Thesis, 2015. https://hdl.handle.net/2144/15207.
Full textHuang, Bin Feng, and 黃斌峰. ""Say the Unsaid, Repair the Fractures:" On the Narrative Ruptures in Edgar Allan Poe's Detective Stories." Thesis, 2016. http://ndltd.ncl.edu.tw/handle/07856966362641133013.
Full text國立政治大學
英國語文學系
104
In this dissertation, Pierre Macherey's theorizations are employed for an in-depth analysis of the narrative ruptures in Edgar Allan Poe's detective stories. First, in Chapter Two the definition of the detective novel is clarified: a detective novel refers to a fictional story that deals thematically with a crime as well as with how it is solved by a detective or someone like a detective. By this definition, we can conclude that Poe didn't write the first detective story but is the progenitor of the detective fiction genre. Chapter Three deals with Macherey's theorizations. Macherey thinks that the author must have left something unsaid in his text. The unsaid is responsible for the multiplicity of voices in the text, enabling the text to exist. When Macherey's said/unsaid model is examined along with Althusser's, Eagleton's and Jameson's theorizations about ideology, the nature of this interrelationship can be characterized: the text can't access history directly; it has to go through ideology. The text only reflects the ideology inaccurately; if the latter is put into the former, the former's unsaid will emerge. In addition, in A Theory of Literary Production, Macherey mentions that the unsaid in the text is what a text could have been, or a potentiality. Framed with Deleuze's concepts of the virtual and the actual, this leads to the following conclusion: Macherey's so-called unsaid, or narrative rupture, is Deleuze's virtual(ity), a repository of potentialities. Once the unsaid is said, or the narrative rupture repaired, a potentiality will be tapped, or a possible case scenario of the text will be enacted. Based on the above, I postulate this hypothesis: Poe's detective stories have three narrative ruptures, pertaining to the story settings, the characterization of the detective, and the logical reasoning. In classical, hard-boiled, and postmodern detective fiction, these three narrative ruptures have been repaired in three different ways.
Thus, three possible case scenarios of Poe's detective stories have been enacted. Chapter Four deals with the relationship between the settings and the themes in Poe's detective stories. Poe's detective stories were composed in the 1840s, when Americans were convinced of the U.S.'s prosperity (the dominant ideology). However, industrialization and urbanization also brought about various social problems. In Poe's detective stories, the city is portrayed as a dark place (the internal contradiction of the dominant ideology), but it has no bearing on the action of the story; this is clearly relevant to Poe's upbringing (the authorial position). So this is what Poe has left unsaid: in a detective story, the setting should be related to the theme. Several significant classical detective novelists, such as Arthur Conan Doyle and Agatha Christie, tend to base the settings of their stories on the general ideology of their time, and view the crimes in their stories as menaces to a stable society. Hard-boiled detective fiction foregrounds social realism, so the setting is usually a decaying city, which is also a seedbed of criminal activities. As for postmodern detective fiction, the setting is often a labyrinthine world, where a trapped detective's investigation ends up stranded. Chapter Five deals with the characterization of the detective. The dominant ideology in Poe's era was Enlightenment thinking, which emphasizes reason. However, there was also an undercurrent of unreason (the internal contradiction of the dominant ideology), and Poe takes an ambivalent attitude towards reason (the author's position). In "The Murders in the Rue Morgue" and "The Mystery of Marie Rogêt," the detective, Dupin, is a flat character, an embodiment of reason. However, in "The Purloined Letter," the characterization of Dupin is not so differentiated from that of Minister D, the villain. Here, this is what Poe has left unsaid: the detective/villain dichotomy has been dismantled.
However, throughout the three developmental stages of the detective fiction genre, this narrative rupture has never been repaired. Classical detectives seem to represent law or reason, but they often break the law; hard-boiled detectives are usually complex characters, or come from minority groups in society, and they often overstep the legal boundary. As for postmodern detectives, there is no telling them from the villains. Chapter Six deals with the correlation between the ratiocinative pattern and the truth. First, ratiocination is a product of the Enlightenment; it is deeply rooted in the dominant ideology. In Poe's three detective stories, as well as in "The Gold-Bug," the ratiocinative pattern is a means to the truth. However, a close look at Dupin's reasoning processes reveals that they are imperfect (the internal contradiction of the dominant ideology). Here, what Poe has left unsaid is that the ratiocinative pattern doesn't necessarily lead to the truth. This narrative rupture has never been repaired in classical detective fiction. The top priority in this subgenre is the narrative structure; it focuses on how the detective has reasoned out the truth. There were even rules dictating how classical detective stories should be created. Eventually, the detective fiction genre ended up bottlenecked. (In A Theory of Literary Production, Macherey speaks about his distrust of structure, as if he had foreseen the quagmire of the detective fiction genre.) The narrative rupture has finally been repaired in hard-boiled detective fiction: hard-boiled detectives often solve their cases through active investigation, pushing ratiocination to a secondary position. In postmodern detective fiction, the unsaid has been said even more: a postmodern detective's ratiocination is often fruitless, and the truth is not found. Chapter Six also touches on the "truth issue," namely, the issue of whether the case should be solved in a detective story.
First, in both classical and hard-boiled detective fiction, the truth always comes out in the end. In addition, solving a mystery is an inherent attribute of the detective fiction genre: it is precisely what makes reading detective fiction fun, and it is also what places detective fiction within popular literature. What's more, the three subgenres can sometimes be combined, and not all postmodern detective stories have done away with the discovery of the truth. Considering all of the above, we can draw this conclusion: finding the truth is the final line of defense keeping the detective fiction genre within popular literature. Those postmodern detective stories where the truth is lost should be few and far between, and they will end up in serious literature. Chapter Seven sums up the findings of this dissertation, listing how the narrative ruptures have been (un)repaired, and retraces the developmental route of the detective fiction genre: it moves from GI, to social realism, and to AI.