Dissertations on the topic "Anomalies temporelles"
Cite a source in APA, MLA, Chicago, Harvard, and other citation styles
Browse the top 17 dissertations for research on the topic "Anomalies temporelles".
Next to every work in the bibliography there is an "Add to bibliography" option. Use it, and your bibliographic reference for the chosen work will be formatted automatically in the required citation style (APA, MLA, Harvard, Chicago, Vancouver, etc.).
You can also download the full text of the scholarly publication as a PDF and read an online annotation of the work, if the relevant parameters are available in the metadata.
Browse dissertations across a wide range of fields and compile your bibliography correctly.
Ravilly, Morgane. „Etude de l'anomalie magnétique axiale le long de la ride médio-atlantique : implications sur les processus de l'accrétion et les variations temporelles du champ géomagnétique“. Brest, 1999. http://www.theses.fr/1999BRES2042.
Wilmet, Audrey. „Détection d'anomalies dans les flots de liens : combiner les caractéristiques structurelles et temporelles“. Electronic Thesis or Diss., Sorbonne université, 2019. http://www.theses.fr/2019SORUS402.
A link stream is a set of links {(t, u, v)} in which a triplet (t, u, v) models the interaction between two entities u and v at time t. In many situations, data result from the measurement of interactions between several million entities over time and can thus be studied through the link stream formalism. This is the case, for instance, of phone calls, email exchanges, money transfers, contacts between individuals, IP traffic, online shopping, and many more. The goal of this thesis is the detection of sets of abnormal links in a link stream. In the first part, we design a method that constructs different contexts, a context being a set of characteristics describing the circumstances of an anomaly. These contexts allow us to find unexpected behaviors that are relevant according to several dimensions and perspectives. In the second part, we design a method to detect anomalies in heterogeneous distributions whose behavior is constant over time, by comparing a sequence of similar heterogeneous distributions. We apply our methodological tools to temporal interactions coming from retweets on Twitter and IP traffic from the MAWI group.
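The link stream formalism above lends itself to a compact illustration. The sketch below (toy data and the threshold are hypothetical, not from the thesis) represents a stream as (t, u, v) triplets and flags unusually active pairs, a crude stand-in for the context-based detection described in the abstract:

```python
from collections import defaultdict

# A link stream is a set of triplets (t, u, v): entities u and v interact at
# time t. Toy data; in practice triplets come from measurements such as
# phone calls, emails, or IP traffic.
links = [
    (0, "a", "b"), (1, "a", "b"), (2, "a", "c"),
    (3, "a", "b"), (4, "a", "b"), (4, "a", "b"),
    (5, "b", "c"),
]

def pair_activity(links):
    """Count interactions per unordered pair {u, v}."""
    counts = defaultdict(int)
    for t, u, v in links:
        counts[frozenset((u, v))] += 1
    return counts

def bursty_pairs(links, threshold):
    """Flag pairs whose activity exceeds `threshold` as candidate anomalies."""
    return {pair for pair, c in pair_activity(links).items() if c > threshold}
```

A real context-based method would, of course, condition such counts on time windows, node degrees, and other dimensions rather than a single global threshold.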
Benkabou, Seif-Eddine. „Détection d’anomalies dans les séries temporelles : application aux masses de données sur les pneumatiques“. Thesis, Lyon, 2018. http://www.theses.fr/2018LYSE1046/document.
Anomaly detection is a crucial task that has attracted the interest of several research studies in the machine learning and data mining communities. The complexity of this task depends on the nature of the data, the availability of their labeling, and the application framework on which they depend. In this thesis, we address this problem for complex data, particularly univariate and multivariate time series. The term "anomaly" can refer to an observation that deviates from other observations so much as to arouse suspicion that it was generated by a different process. More generally, the underlying problem (also called novelty detection or outlier detection) aims to identify, in a set of data, those which differ significantly from the others, which do not conform to an "expected behavior" (which could be defined or learned), and which indicate a different mechanism. The "abnormal" patterns thus detected often carry critical information. We focus specifically on two particular aspects of unsupervised anomaly detection in time series. The first is global and consists in detecting time series that are abnormal with respect to an entire database, whereas the second is called contextual and aims to detect, locally, points that are abnormal with respect to the global structure of the relevant time series. To this end, we propose optimization approaches based on weighted clustering and time warping for global detection, and matrix-based modeling for contextual detection. Finally, we present several empirical studies on public data to validate the proposed approaches and compare them with other known approaches from the literature. In addition, an experimental validation is provided on a real problem, concerning the detection of outlier price time series in tyre data, to meet the needs expressed by LIZEO, the industrial partner of this thesis.
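The global detection idea, scoring whole series against a clustered database, can be sketched as follows. This is a simplified stand-in using plain Euclidean k-means rather than the weighted clustering and time warping the thesis actually proposes, and all data are illustrative:

```python
import numpy as np

def kmeans(series, k, iters=20):
    """Minimal k-means over equal-length series (deterministic init: first k)."""
    centroids = series[:k].astype(float).copy()
    for _ in range(iters):
        d = np.linalg.norm(series[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = series[labels == j].mean(axis=0)
    return centroids

def global_anomaly_scores(series, reference, k=2):
    """Score each series by its distance to the nearest centroid of a
    clustering fitted on a reference database of normal series."""
    centroids = kmeans(reference, k)
    d = np.linalg.norm(series[:, None, :] - centroids[None, :, :], axis=2)
    return d.min(axis=1)
```

Series far from every cluster of the reference database receive high scores; replacing the Euclidean distance with a time-warping distance would make the sketch closer to the thesis's actual approach.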
Binder, Benjamin. „Definitions and Detection Procedures of Timing Anomalies for the Formal Verification of Predictability in Real-Time Systems“. Electronic Thesis or Diss., université Paris-Saclay, 2022. http://www.theses.fr/2022UPASG086.
The timing behavior of real-time systems is often validated through timing analyses, which are jeopardized by execution phenomena called timing anomalies (TAs). A counter-intuitive TA manifests when a local speedup eventually leads to a global slowdown, and an amplification TA, when a local slowdown leads to an even larger global slowdown. While counter-intuitive TAs threaten the soundness and scalability of timing analyses, tools to systematically detect them do not exist. We set up a unified formal framework for systematically assessing the definitions of TAs, concluding that a practical definition is lacking, mainly due to the absence of relations between local and global timing effects. We address these relations through causality, which we further use to revise the formalization of these TAs. We also propose a specialized instance of these notions for out-of-order pipelines. We evaluate our subsequent detection procedure on illustrative examples and standard benchmarks, showing that it accurately captures TAs. The complexity of modern systems demands that their timing analyses cope with the resulting large state space. A solution is to perform compositional analyses, which are specifically threatened by amplification TAs. We advance their study by showing how a specialized abstraction can be adapted to an industrial processor, by modeling the timing-relevant features of such hardware with appropriate reductions. We also illustrate, for this class of TAs, how verification strategies can be used toward obtaining TA patterns.
Boniol, Paul. „Detection of anomalies and identification of their precursors in large data series collections“. Electronic Thesis or Diss., Université Paris Cité, 2021. http://www.theses.fr/2021UNIP5206.
Extensive collections of data series are becoming a reality in a large number of scientific and social domains. There is, therefore, a growing interest and need to elaborate efficient techniques to analyze and process these data, such as in finance, environmental sciences, astrophysics, neurosciences, and engineering. Informally, a data series is an ordered sequence of points or values. Once these series are collected and available, users often need to query them. These queries can be simple, such as the selection of a time interval, but also complex, such as similarity search or the detection of anomalies, often a sign of malfunction in the system under study or of a sudden, unusual, and likely undesired evolution. This last type of analysis represents a crucial problem for applications in a wide range of domains, all sharing the same objective: to detect anomalies as soon as possible to avoid critical events. Therefore, in this thesis, we address the following three objectives: (i) retrospective unsupervised subsequence anomaly detection in data series; (ii) unsupervised detection of anomalies in data streams; (iii) classification-based explanation of known anomalies in data series in order to identify possible precursors. This manuscript first presents the industrial context that motivated this thesis, fundamental definitions, a taxonomy of data series, and state-of-the-art anomaly detection methods. We then present our contributions along the three axes mentioned above. First, we describe two original solutions, NormA (which builds a weighted set of subsequences representing the different behaviors of the data series) and Series2Graph (which transforms the data series into a directed graph), for the task of unsupervised detection of anomalous subsequences in static data series. Secondly, we present the SAND method (inspired by NormA) for unsupervised detection of anomalous subsequences in data streams.
Thirdly, we address the problem of the supervised identification of precursors. We subdivide this task into two generic problems: the supervised classification of time series and the explanation of this classification's results through the identification of discriminative subsequences. Finally, we illustrate the applicability and interest of our developments through an application concerning the identification of undesirable vibration precursors occurring in water supply pumps in EDF's French nuclear power plants.
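The subsequence-based detection underlying NormA can be illustrated schematically: score each sliding window by its distance to a small set of "normal" representative subsequences. The sketch below omits NormA's weighting and its construction of the normal model, and the data are made up:

```python
import numpy as np

def subsequence_scores(x, normal_model, w):
    """Score each length-w subsequence of x by its Euclidean distance to the
    closest representative of the normal model (a list of length-w arrays).
    High scores mark subsequences unlike any normal behavior."""
    wins = np.lib.stride_tricks.sliding_window_view(np.asarray(x, float), w)
    model = np.asarray(normal_model, float)
    d = np.linalg.norm(wins[:, None, :] - model[None, :, :], axis=2)
    return d.min(axis=1)
```

On a series that alternates 0 and 1 with one spike, only the windows covering the spike score high, which is the intuition behind flagging anomalous subsequences against a set of normal representatives.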
Dentzer, Jacques. „Forçages environnementaux et contrôles structuraux sur le régime thermique actuel du bassin de Paris : enjeux pour la compréhension du potentiel géothermique en Ile-de-France“. Thesis, Paris 6, 2016. http://www.theses.fr/2016PA066187/document.
The acquisition of measurements of temperature and of thermal conductivity has enriched the understanding of the thermal regime of the Paris sedimentary basin and brought to light spatial and temporal thermal heterogeneities. In order to understand them better, these variations need to be integrated into a multidisciplinary vision of the basin by comparing data against models. The bibliographic review made it possible to integrate data of diverse sorts, to compare them using GIS, and to investigate the knowledge base. This study has highlighted and reinterpreted the vertical variations of geothermal flux. Simulations carried out based on diffusive palaeoclimatic scenarios show that the system has retained a memory of the effects of palaeoclimates. Furthermore, for the first time, a systematic decline of the geothermal flux has been identified at the level of the main aquifer formations. Transient thermo-hydraulic simulations of palaeoclimatic phenomena show the development in the sedimentary basin of cold and hot zones according to the areas of flow. An explanation of the temperature anomaly of over 20°C between the geothermal installations located to the north and south of Paris in the Bathonian is put forward. The models produced clearly show the potential contribution of fractured zones, as well as that of the faults, to the heterogeneity observed in the temperature field of the basin by allowing flow constrained by the regional hydraulic head gradient and unstable densities. This work has shown the link between the formations in the basin which are exploited for their resources or used as a storage medium.
Hadjem, Medina. „Contribution à l'analyse et à la détection automatique d'anomalies ECG dans le cas de l'ischémie myocardique“. Thesis, Sorbonne Paris Cité, 2016. http://www.theses.fr/2016USPCB011.
Recent advances in sensing and in the miniaturization of ultra-low-power devices allow for more intelligent and wearable health-monitoring sensor-based systems. The sensors are capable of collecting vital signs, such as heart rate, temperature, oxygen saturation, blood pressure, ECG, EMG, etc., and of communicating the collected data wirelessly to a remote device and/or smartphone. These advances have led a large research community to take an interest in the design and development of new biomedical data analysis systems, particularly electrocardiogram (ECG) analysis systems. As a contribution to this broad research area, this thesis focuses mainly on the automatic analysis and detection of coronary heart diseases, such as ischemia and myocardial infarction (MI), which are well known to be leading causes of death worldwide. Toward this end, and because ECG signals are very noisy and nonstationary, our challenge was first to extract the relevant parameters without losing their main features. This particular issue has been widely addressed in the literature and does not represent the main purpose of this thesis; however, as it is a prerequisite, it required us to understand the state-of-the-art methods and select the most suitable one for our work. Based on the extracted ECG parameters, particularly the ST segment and the T wave parameters, we have contributed two different approaches to analyzing the ECG records: (1) the first analysis is performed at the time-series level, in order to detect abnormal elevations of the ST segment and the T wave, known to be accurate predictors of ischemia or MI; (2) the second analysis is performed at the ECG beat level, to automatically classify ST segment and T wave anomalies into different categories. This latter approach is the most commonly used in the literature.
However, as existing works lack a standard for performance comparison, we have carried out our own comparison of current classification methods, taking into account diverse ST and T anomaly classes, several performance evaluation parameters, and several ECG signal leads. To obtain more realistic performance figures, we have also performed the same study in the presence of other frequent cardiac anomalies, such as arrhythmia. Based on this substantial comparative study, we have proposed a new classification approach for seven ST-T anomaly classes, using a hybrid of boosting and random undersampling; our ultimate goal was to reach the best tradeoff between true positives and false positives.
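The first, time-series-level analysis (flagging abnormal ST-segment elevations) can be caricatured with a simple run-length threshold; the threshold and minimum duration below are arbitrary illustrative values, not clinical criteria from the thesis:

```python
def st_elevation_episodes(st_dev_mv, threshold=0.1, min_beats=3):
    """Return (start, end) index ranges where the ST deviation (in mV) stays
    above `threshold` for at least `min_beats` consecutive beats.
    `end` is exclusive, as in Python slicing."""
    episodes, start = [], None
    for i, v in enumerate(st_dev_mv):
        if v > threshold:
            if start is None:
                start = i
        else:
            if start is not None and i - start >= min_beats:
                episodes.append((start, i))
            start = None
    if start is not None and len(st_dev_mv) - start >= min_beats:
        episodes.append((start, len(st_dev_mv)))
    return episodes
```

Requiring a minimum run length is one simple way to avoid flagging isolated noisy beats, which matters for signals as noisy as the ECG.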
Guigou, Fabio. „The artificial immune ecosystem : a scalable immune-inspired active classifier, an application to streaming time series analysis for network monitoring“. Thesis, Strasbourg, 2019. http://www.theses.fr/2019STRAD007/document.
Since the early 1990s, immune-inspired algorithms have tried to adapt the properties of the biological immune system to various computer science problems, not only in computer security but also in optimization and classification. This work explores a different direction for artificial immune systems, focusing on the interaction between subsystems rather than on the biological processes involved in each one. These patterns of interaction in turn create the properties expected from immune systems, namely their ability to detect anomalies, memorize their signature to react quickly upon secondary exposure, and remain tolerant to symbiotic foreign organisms such as the intestinal fauna. We refer to a set of interacting systems as an ecosystem; hence this new approach is called the Artificial Immune Ecosystem. We demonstrate this model in the context of a real-world problem where scalability and performance are essential: network monitoring. This entails time series analysis in real time with an expert in the loop, i.e. active learning instead of supervised learning.
Ait, Bensaid Samira. „Formal Semantics of Hardware Compilation Framework“. Electronic Thesis or Diss., université Paris-Saclay, 2023. http://www.theses.fr/2023UPASG085.
Static worst-case timing analyses are used to ensure that the timing deadlines required for safety-critical systems are met. In order to derive accurate bounds, these timing analyses require precise (micro-)architecture models. Usually, such micro-architecture models are constructed by hand from processor manuals. However, with open-source hardware initiatives and high-level hardware construction languages (HCLs), the automatic generation of these micro-architecture models, and more specifically of pipeline models, becomes practical. We propose a workflow that automatically constructs pipeline datapath models from processor designs described in HCLs. Our workflow is based on the Chisel/FIRRTL Hardware Compiler Framework: we build the pipeline datapath models at the intermediate-representation level. Our work aims to prove timing properties, such as properties related to timing predictability, relying on formal verification as our method. The generated models are translated into formal models and integrated into an existing model-checking-based procedure for detecting timing anomalies. We use the TLA+ modeling and verification language and evaluate our analysis on several open-source RISC-V processors. Finally, we extend the study by evaluating the impact of automatic generation through a series of synthetic benchmarks.
Foulon, Lucas. „Détection d'anomalies dans les flux de données par structure d'indexation et approximation : Application à l'analyse en continu des flux de messages du système d'information de la SNCF“. Thesis, Lyon, 2020. http://www.theses.fr/2020LYSEI082.
In this thesis, we propose methods to approximate an anomaly score in order to detect abnormal parts of data streams. Two main problems are considered in this context: firstly, handling the high dimensionality of the objects describing the time series extracted from the raw streams, and secondly, the low computational cost required to perform the analysis on the fly. To tackle the curse of dimensionality, we selected the CFOF anomaly score, which was proposed recently and has been shown to be robust to increasing dimensionality. Our main contribution is the proposition of two methods to quickly approximate the CFOF score of new objects in a stream. The first is based on safe pruning and approximation during the exploration of the object neighbourhood. The second is an approximation obtained by aggregating scores computed in several subspaces. The two contributions complement each other and can be combined. We show on a reference benchmark that our proposals yield substantial reductions in execution time while providing approximations that preserve the quality of anomaly detection. We then present our application of these approaches within the SNCF information system. In this context, we have extended the existing monitoring modules with a new tool that helps detect abnormal behaviours in the real stream of messages within the SNCF communication system.
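The second contribution, aggregating scores computed in several subspaces, can be sketched with a kNN-based outlier score as a simplified stand-in for CFOF; the subspaces, data, and parameters below are illustrative:

```python
import numpy as np

def knn_score(points, x, k=2):
    """Outlier score of x: mean distance to its k nearest reference points."""
    d = np.sort(np.linalg.norm(points - x, axis=1))
    return d[:k].mean()

def aggregated_score(points, x, subspaces, k=2):
    """Average the kNN outlier score of x over several feature subspaces,
    a simplified stand-in for the subspace-aggregation idea: an object that
    is an outlier in many projections gets a high aggregated score."""
    return float(np.mean([knn_score(points[:, s], x[s], k) for s in subspaces]))
```

Scoring in low-dimensional projections and aggregating is one common way to keep neighbourhood-based scores meaningful when the full dimensionality is high.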
Audibert, Julien. „Unsupervised anomaly detection in time-series“. Electronic Thesis or Diss., Sorbonne université, 2021. http://www.theses.fr/2021SORUS358.
Anomaly detection in multivariate time series is a major issue in many fields. The increasing complexity of systems and the explosion of the amount of data have made its automation indispensable. This thesis proposes an unsupervised method for anomaly detection in multivariate time series called USAD. However, deep neural network methods suffer from a limitation in their ability to extract features from the data since they only rely on local information. To improve the performance of these methods, this thesis presents a feature engineering strategy that introduces non-local information. Finally, this thesis proposes a comparison of sixteen time series anomaly detection methods to understand whether the explosion in complexity of neural network methods proposed in the current literature is really necessary.
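USAD itself relies on adversarially trained autoencoders; the underlying principle, scoring windows by reconstruction error under a model fitted on normal data, can be sketched with a linear (PCA) "autoencoder" stand-in. All data below are synthetic:

```python
import numpy as np

def fit_linear_ae(train_windows, n_components=1):
    """Fit a linear 'autoencoder' (PCA) on normal windows: returns the mean
    and the principal directions used to encode/decode."""
    mu = train_windows.mean(axis=0)
    _, _, vt = np.linalg.svd(train_windows - mu, full_matrices=False)
    return mu, vt[:n_components]

def reconstruction_scores(windows, mu, comps):
    """Anomaly score = reconstruction error after projecting onto the learned
    components; a large error means the window is unlike the training data."""
    z = (windows - mu) @ comps.T
    recon = z @ comps + mu
    return np.linalg.norm(windows - recon, axis=1)
```

A window lying on the structure learned from normal data reconstructs almost perfectly; a window off that structure does not, which is exactly the signal an autoencoder-based detector thresholds.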
Daouayry, Nassia. „Détection d’évènements anormaux dans les gros volumes de données d’utilisation issues des hélicoptères“. Thesis, Lyon, 2020. http://www.theses.fr/2020LYSEI084.
This thesis addresses the normal functioning of helicopter component systems through the exploitation of usage data coming from the HUMS (Health and Usage Monitoring System) for maintenance purposes. Helicopters are complex systems and are subject to strict regulatory requirements imposed by the authorities in charge of flight safety. The analysis of monitoring data is therefore a preferred means of improving helicopter maintenance. In addition, the data produced by the HUMS system are an indispensable resource for assessing the health of the systems after each flight. The data collected are numerous, and the complexity of the different systems makes it difficult to analyze them on a case-by-case basis. The work of this thesis deals mainly with the use of multivariate time series for visualization and for the implementation of anomaly detection tools within Airbus Helicopters. We developed different approaches to capture, in the flight data, a relative normality for a given system. Work on time-series visualization was carried out to identify patterns representing normal system operation. Based on this approach, we developed a "virtual sensor" that estimates the values of a real sensor from a set of flight parameters, so that abnormal events can be detected when the values of the two sensors tend to diverge.
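The "virtual sensor" idea can be sketched with ordinary least squares: predict the real sensor from flight parameters and flag samples where the two diverge. The linear model form and the tolerance are illustrative assumptions, not the thesis's actual estimator:

```python
import numpy as np

def fit_virtual_sensor(X, y):
    """Least-squares model predicting sensor y from flight parameters X."""
    A = np.column_stack([X, np.ones(len(X))])  # add an intercept column
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def divergence_flags(X, y, coef, tol):
    """Flag samples where the real sensor diverges from its virtual estimate
    by more than `tol`."""
    A = np.column_stack([X, np.ones(len(X))])
    residual = np.abs(y - A @ coef)
    return residual > tol
```

In practice the model would be trained only on flights known to be healthy, so that divergence on later flights points to abnormal events rather than model error.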
Gernez, Pierre. „Analyse de la variabilité temporelle des propriétés optiques en mer Ligure depuis un mouillage instrumenté (site Boussole) : fluctuations à haute fréquence, cyclicité diurne, changements saisonniers et variabilité interannuelle“. Paris 6, 2009. http://www.theses.fr/2009PA066644.
Salaün, Achille. „Prédiction d'alarmes dans les réseaux via la recherche de motifs spatio-temporels et l'apprentissage automatique“. Electronic Thesis or Diss., Institut polytechnique de Paris, 2021. http://www.theses.fr/2021IPPAS010.
Nowadays, telecommunication networks occupy a central position in our world: they allow huge amounts of information to be shared worldwide. Networks, however, are complex systems, both in size and in technological diversity, which makes their management and repair more difficult. In order to limit the negative impact of failures, tools have to be developed to detect a failure whenever it occurs, to analyse its root causes in order to solve it efficiently, or even to predict failures, as prevention is better than cure. In this thesis, we mainly focus on the last two problems. To do so, we use files, called alarm logs, that store all the alarms emitted by the system. These files are, however, generally noisy and verbose: an operator managing a network needs tools able to extract the causal relationships inside a log and handle them in an interpretable manner. In this thesis, we followed two directions. First, we drew inspiration from pattern-matching techniques: similarly to Ukkonen's algorithm, we build, online, a structure called DIG-DAG that stores all the potential causal relationships between the events of a log. Moreover, we introduce a query system to exploit the DIG-DAG structure. Finally, we show how our solution can be used for root cause analysis. The second direction is a generative approach to the prediction of time series. In particular, we compare two well-known models for this task: recurrent neural nets on the one hand and hidden Markov models on the other. We analytically compare the expressivity of these models by encompassing them in a probabilistic model called GUM.
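The DIG-DAG idea, storing potential causal relationships between log events, can be caricatured as follows: add a weighted edge a → b whenever b follows a within a time window, then query frequent predecessors of an alarm. This greatly simplifies the actual structure; the event names and window are hypothetical:

```python
from collections import defaultdict

def build_causal_graph(log, window):
    """From a time-sorted log of (timestamp, event) pairs, record a weighted
    edge a -> b each time b occurs within `window` time units after a.
    A greatly simplified sketch of storing potential causal relationships."""
    edges = defaultdict(int)
    for i, (ti, a) in enumerate(log):
        for tj, b in log[i + 1:]:
            if tj - ti > window:
                break  # log is sorted, so no later event can be in the window
            if a != b:
                edges[(a, b)] += 1
    return edges

def likely_causes(edges, alarm, min_count=2):
    """Query: events that frequently precede `alarm`."""
    return {a for (a, b), c in edges.items() if b == alarm and c >= min_count}
```

Counting repetitions is what lets an operator separate systematic precursors from coincidental ones in a noisy, verbose log.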
Shao, Wenhao. „Enhancing Video Anomaly Detection by Leveraging Advanced Deep Learning Techniques“. Electronic Thesis or Diss., Institut polytechnique de Paris, 2023. http://www.theses.fr/2023IPPAS012.
Security in public spaces is a primary concern across different domains, and the deployment of real-time monitoring systems addresses this challenge. Video surveillance systems employing deep learning techniques allow for the effective recognition of anomalous events. However, even with the current advances in anomaly detection methods, distinguishing abnormal events from normal events in real-world scenarios remains a challenge, because they often involve rare, visually diverse, and unrecognizable abnormal events. This is particularly true when relying on supervised methods, where the lack of sufficient labeled anomaly data poses a significant challenge for distinguishing between normal and abnormal videos. As a result, state-of-the-art anomaly detection approaches utilize existing datasets to design or learn a model that captures normal patterns, which then helps identify unknown abnormal patterns. During the model design stage, it is crucial to label videos with attributes such as abnormal appearance, behavior, or target categories that deviate significantly from normal data, marking them as anomalies. In addition to the lack of labeled data, we identified three challenges from the literature: 1) insufficient representation of temporal features, 2) lack of precise localization of abnormal events, and 3) lack of research on the consistency between temporal and appearance features. The objective of this thesis is to propose and investigate advanced video anomaly detection methods that address the aforementioned challenges using novel concepts, relying on weakly supervised and unsupervised models rather than supervised ones. We actively explored the application of new video processing technologies, including action recognition, target detection, optical-flow feature extraction, representation learning, and contrastive learning, in order to utilize them in video anomaly detection models.
Our proposed models are comparatively analysed against baseline models. This comparative analysis is conducted on prevalent public datasets, including UCSD (Ped2), Avenue, UCF-Crime, and ShanghaiTech. The first contribution addresses the first challenge outlined above by introducing an enhanced Temporal Convolutional Network (TCN). This novel TCN model learns dynamic video features and optimizes them to mitigate errors due to contrastively learned initial weights. This method enhances the overall capability of weakly supervised models by reducing the loss caused by initial parameters in contrastive learning. Nevertheless, weakly supervised learning only reduces the reliance on labeled data; it does not eliminate it. Hence, our subsequent two contributions rely on unsupervised learning to address the other two challenges mentioned above. The second contribution uses a self-attention mechanism to prioritize the weights of frame regions with pronounced dynamic fluctuations; during testing, abnormal regions are localized by combining object detection with the loss functions. This combination of self-attention and object detection significantly improves detection accuracy and extends functionality. The third contribution explores the integration of collaborative teaching network models, which enforce consistency between optical-flow and appearance information. This integration aims to enhance the spatio-temporal capture capabilities of unsupervised models, whose overall performance is significantly enhanced compared to the other baseline models.
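As a minimal illustration of the temporal-feature theme, the sketch below scores frames by the mean absolute intensity change between consecutive frames, a crude proxy for the learned temporal features (TCNs, optical flow) the thesis actually uses; the data are synthetic:

```python
import numpy as np

def temporal_scores(frames):
    """Per-transition anomaly proxy: mean absolute intensity change between
    consecutive frames. Sudden, large motion scores high. A crude stand-in
    for learned temporal features, not the thesis's models."""
    frames = np.asarray(frames, float)
    diffs = np.abs(frames[1:] - frames[:-1])
    return diffs.mean(axis=(1, 2))
```

Thresholding such a score over time gives the simplest possible temporal anomaly detector; the thesis's contributions can be read as increasingly sophisticated replacements for this statistic.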