Academic literature on the topic "Change point and trend detection"

Consult the topical lists of articles, books, theses, conference proceedings, and other academic sources on the topic "Change point and trend detection". Where the metadata make them available, the full text of each publication can be downloaded as a PDF and its abstract read online.

Journal articles on the topic "Change point and trend detection"

1

Militino, Ana, Mehdi Moradi, and M. Ugarte. "On the Performances of Trend and Change-Point Detection Methods for Remote Sensing Data". Remote Sensing 12, no. 6 (March 21, 2020): 1008. http://dx.doi.org/10.3390/rs12061008.

Abstract
Detecting change-points and trends are common tasks in the analysis of remote sensing data. Over the years, many different methods have been proposed for those purposes, including (modified) Mann–Kendall and Cox–Stuart tests for detecting trends; and Pettitt, Buishand range, Buishand U, standard normal homogeneity (Snh), Meanvar, structure change (Strucchange), breaks for additive season and trend (BFAST), and hierarchical divisive (E.divisive) for detecting change-points. In this paper, we describe a simulation study based on including different artificial, abrupt changes at different time-periods of image time series to assess the performances of such methods. The power of the test, type I error probability, and mean absolute error (MAE) were used as performance criteria, although MAE was only calculated for change-point detection methods. The study reveals that if the magnitude of change (or trend slope) is high, and/or the change does not occur in the first or last time-periods, the methods generally have a high power and a low MAE. However, in the presence of temporal autocorrelation, MAE raises, and the probability of introducing false positives increases noticeably. The modified versions of the Mann–Kendall method for autocorrelated data reduce/moderate its type I error probability, but this reduction comes with an important power diminution. In conclusion, taking a trade-off between the power of the test and type I error probability, we conclude that the original Mann–Kendall test is generally the preferable choice. Although Mann–Kendall is not able to identify the time-period of abrupt changes, it is more reliable than other methods when detecting the existence of such changes. Finally, we look for trend/change-points in land surface temperature (LST), day and night, via monthly MODIS images in Navarre, Spain, from January 2001 to December 2018.
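
A minimal illustration of the Mann–Kendall test discussed above (and in several of the entries below) is sketched here in Python. It is the classical form without tie or autocorrelation corrections, it is not code from the paper, and the toy series and variable names are assumptions.

import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    """Classical Mann-Kendall trend test (no tie or autocorrelation correction)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    # S = sum of signs of all pairwise forward differences
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0  # variance of S under H0, no ties
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    p = 2 * norm.sf(abs(z))  # two-sided p-value
    return s, z, p

# toy example: a weak linear trend plus noise
rng = np.random.default_rng(0)
y = 0.05 * np.arange(120) + rng.normal(size=120)
print(mann_kendall(y))
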
2

Ishak, Elias, and Ataur Rahman. "Examination of Changes in Flood Data in Australia". Water 11, no. 8 (August 20, 2019): 1734. http://dx.doi.org/10.3390/w11081734.

Abstract
This study performs a simultaneous evaluation of gradual and abrupt changes in Australian annual maximum (AM) flood data using a modified Mann–Kendall and Pettitt change-point detection test. The results show that AM flood data in eastern Australia is dominated by downward trends. Depending on the significance level and study period under consideration, about 8% to 33% of stations are characterised by significant trends, where over 85% of detected significant trends are downward. Furthermore, the change-point analysis shows that the percentages of stations experiencing one abrupt change in the mean or in the direction of the trend are in the range of 8% to 33%, of which over 50% occurred in 1991, with a mode in 1995. Prominent resemblance between the monotonic trend and change-point analysis results is also noticed, in which a negative shift in the mean is observed at catchments that exhibited downward trends, and a positive shift in the mean is observed in the case of upward trends. Trend analysis of the segmented AM flood series based on their corresponding date indicates an absence of a significant trend, which may be attributed to the false detection of trends when the AM flood data are characterised by a shift in its mean.
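
The Pettitt test applied here alongside the modified Mann–Kendall test can be sketched as follows. This is a minimal illustrative Python implementation with the standard approximate p-value, not the authors' code, and the toy data are invented.

import numpy as np

def pettitt(x):
    """Pettitt (1979) test for a single shift in location."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    # U_t = sum_{i<=t} sum_{j>t} sign(x_j - x_i), evaluated at every split t
    u = np.array([np.sign(x[t + 1:, None] - x[None, :t + 1]).sum()
                  for t in range(n - 1)])
    k = np.abs(u).max()
    tau = int(np.abs(u).argmax()) + 1  # last index of the first segment
    p_approx = 2.0 * np.exp(-6.0 * k ** 2 / (n ** 3 + n ** 2))
    return tau, k, p_approx

# toy example: a mean shift halfway through the record
rng = np.random.default_rng(1)
y = np.concatenate([rng.normal(0.0, 1.0, 60), rng.normal(1.5, 1.0, 60)])
print(pettitt(y))
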
3

Alashan, Sadık. "Can innovative trend analysis identify trend change points?" Brilliant Engineering 1, no. 3 (February 21, 2020): 6–15. http://dx.doi.org/10.36937/ben.2020.003.002.

Abstract
Trends in temperature series are the main cause of climate change. Because solar energy directs hydro-meteorological events and increasing variations in this resource change the balance between events such as evaporation, wind, and rainfall. There are many methods for calculating trends in a time series such as Mann-Kendall, Sen's slope estimator, Spearman's rho, linear regression and the new Sen innovative trend analysis (ITA). In addition, Mann-Kendall's variant, the sequential Mann Kendall, has been developed to identify trend change points; however, it is sensitive to related data as specified by some researchers. Şen_ITA is a new trend detection method and does not require independent and normally distributed time series, but has never been used to detect trend change points. In the literature, multiple, half-time and multi-durations ITA methods are used to calculate partial trends in a time series without identifying trend change points. In this study, trend change points are detected using the Şen_ITA method and named ITA_TCP. This approach may allow researchers to identify trend change points in a time series. Diyarbakır (Turkey) is selected as a study area, and ITA_TCP has detected trends and trends change points in monthly average temperatures. Although ITA detects only a significant upward trend in August, given the 95% statistical significance level, ITA_TCP shows three upward trends in June, July and August, and a decreasing trend in September. Critical trend slope values are obtained using the bootstrap method, which does not require the normal distribution assumption.
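
The basic step of Şen's innovative trend analysis — comparing the sorted first half of a series with the sorted second half — can be written in a few lines. The sketch below covers only plain ITA, not the ITA_TCP extension proposed in the paper; the slope convention and the toy data are assumptions.

import numpy as np

def innovative_trend_analysis(x):
    """Basic ITA: compare the sorted first half of a series with the sorted second half."""
    x = np.asarray(x, dtype=float)
    n = len(x) // 2
    first, second = np.sort(x[:n]), np.sort(x[n:2 * n])
    # slope indicator in the spirit of Sen's 2*(mean2 - mean1)/n; the sign is what matters here
    slope = 2.0 * (second.mean() - first.mean()) / n
    above = np.mean(second > first)  # fraction of points above the 1:1 line
    return slope, above

# toy example: monthly-like series with a mild upward trend
rng = np.random.default_rng(2)
y = 0.02 * np.arange(240) + rng.normal(size=240)
print(innovative_trend_analysis(y))
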
4

Wehbe, Youssef, and Marouane Temimi. "A Remote Sensing-Based Assessment of Water Resources in the Arabian Peninsula". Remote Sensing 13, no. 2 (January 13, 2021): 247. http://dx.doi.org/10.3390/rs13020247.

Abstract
A better understanding of the spatiotemporal distribution of water resources is crucial for the sustainable development of hyper-arid regions. Here, we focus on the Arabian Peninsula (AP) and use remotely sensed data to (i) analyze the local climatology of total water storage (TWS), precipitation, and soil moisture; (ii) characterize their temporal variability and spatial distribution; and (iii) infer recent trends and change points within their time series. Remote sensing data for TWS, precipitation, and soil moisture are obtained from the Gravity Recovery and Climate Experiment (GRACE), the Tropical Rainfall Measuring Mission (TRMM), and the Advanced Microwave Scanning Radiometer for Earth Observing System (AMSR-E), respectively. The study relies on trend analysis, the modified Mann–Kendall test, and change point detection statistics. We first derive 10-year (2002–2011) seasonal averages from each of the datasets and intercompare their spatial organization. In the absence of large-scale in situ data, we then compare trends from GRACE TWS retrievals to in situ groundwater observations locally over the subdomain of the United Arab Emirates (UAE). TWS anomalies vary between −6.2 to 3.2 cm/month and −6.8 to −0.3 cm/month during the winter and summer periods, respectively. Trend analysis shows decreasing precipitation trends (−2.3 × 10−4 mm/day) spatially aligned with decreasing soil moisture trends (−1.5 × 10−4 g/cm3/month) over the southern part of the AP, whereas the highest decreasing TWS trends (−8.6 × 10−2 cm/month) are recorded over areas of excessive groundwater extraction in the northern AP. Interestingly, change point detection reveals increasing precipitation trends pre- and post-change point breaks over the entire AP region. Significant spatial dependencies are observed between TRMM and GRACE change points, particularly over Yemen during 2010, revealing the dominant impact of climatic changes on TWS depletion.
5

Vaman, H. J., and K. Suresh Chandra. "Optimal Change-Point Detection in Trend Models with Integrated Moving Average Errors". Sequential Analysis 21, no. 1-2 (May 20, 2002): 99–107. http://dx.doi.org/10.1081/sqa-120004175.
6

Ray, Litan Kumar, Narendra Kumar Goel, and Manohar Arora. "Trend analysis and change point detection of temperature over parts of India". Theoretical and Applied Climatology 138, no. 1-2 (February 23, 2019): 153–67. http://dx.doi.org/10.1007/s00704-019-02819-7.
7

Sherwood, Steven C. "Simultaneous Detection of Climate Change and Observing Biases in a Network with Incomplete Sampling". Journal of Climate 20, no. 15 (August 1, 2007): 4047–62. http://dx.doi.org/10.1175/jcli4215.1.

Abstract
Abstract All instrumental climate records are affected by instrumentation changes and variations in sampling over time. While much attention has been paid to the problem of detecting “change points” in time series, little has been paid to the statistical properties of climate signals that result after adjusting (“homogenizing”) the data—or to the effects of the irregular sampling and serial correlation exhibited by real climate records. These issues were examined here by simulating multistation datasets. Simple homogenization methods, which remove apparent artifacts and then calculate trends, tended to remove some of the real signal. That problem became severe when change-point times were not known a priori, leading to significant underestimation of real and/or artificial trends. A key cause is false detection of change points, even with nominally strict significance testing, due to serial correlation in the data. One conclusion is that trends in previously homogenized radiosonde datasets should be viewed with caution. Two-phase regression reduced but did not resolve this problem. A new approach is proposed in which trends, change points, and natural variability are estimated simultaneously. This is accomplished here for the case of incomplete data from a fixed station network by an adaptation of the “iterative universal Kriging” method, which converges to maximum-likelihood parameters by iterative imputation of missing values. With careful implementation this method’s trend estimates had low random errors and were nearly unbiased in these tests. It is argued that error-free detection of change points is neither realistic nor necessary, and that success should be measured instead by the integrity of climate signals.
8

Alhathloul, Saleh H., Abdul A. Khan, and Ashok K. Mishra. "Trend analysis and change point detection of annual and seasonal horizontal visibility trends in Saudi Arabia". Theoretical and Applied Climatology 144, no. 1-2 (January 24, 2021): 127–46. http://dx.doi.org/10.1007/s00704-021-03533-z.
9

Nguyen, Khanh Ninh, Annarosa Quarello, Olivier Bock, and Emilie Lebarbier. "Sensitivity of Change-Point Detection and Trend Estimates to GNSS IWV Time Series Properties". Atmosphere 12, no. 9 (August 26, 2021): 1102. http://dx.doi.org/10.3390/atmos12091102.

Abstract
This study investigates the sensitivity of the GNSSseg segmentation method to change in: GNSS data processing method, length of time series (17 to 25 years), auxiliary data used in the integrated water vapor (IWV) conversion, and reference time series used in the segmentation (ERA-Interim versus ERA5). Two GNSS data sets (IGS repro1 and CODE REPRO2015), representative of the first and second IGS reprocessing, were compared. Significant differences were found in the number and positions of detected change-points due to different a priori ZHD models, antenna/radome calibrations, and mapping functions. The more recent models used in the CODE solution have reduced noise and allow the segmentation to detect smaller offsets. Similarly, the more recent reanalysis ERA5 has reduced representativeness errors, improved quality compared to ERA-Interim, and achieves higher sensitivity of the segmentation. Only 45–50% of the detected change-points are similar between the two GNSS data sets or between the two reanalyses, compared to 70–80% when the length of the time series or the auxiliary data are changed. About 35% of the change-points are validated with respect to metadata. The uncertainty in the homogenized trends is estimated to be around 0.01–0.02 kg m−2 year−1.

Theses on the topic "Change point and trend detection"

1

Petersson, David, and Emil Backman. "Change Point Detection and Kernel Ridge Regression for Trend Analysis on Financial Data". Thesis, KTH, Skolan för teknikvetenskap (SCI), 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-230729.

Abstract
The investing market can be a cold ruthless place for the layman. In order to get the chance of making money in this business one must place countless hours on research, with many different parameters to handle in order to reach success. To reduce the risk, one must look to many different companies operating in multiple fields and industries. In other words, it can be a hard task to manage this feat. With modern technology, there is now lots of potential to handle this tedious analysis autonomously using machine learning and clever algorithms. With this approach, the amount of analyzes is only limited by the capacity of the computer. Resulting in a number far greater than if done by hand. This study aims at exploring the possibilities to modify and implement efficient algorithms in the field of finance. The study utilizes the power of kernel methods in order to algorithmically analyze the patterns found in financial data efficiently. By combining the powerful tools of change point detection and nonlinear regression the computer can classify the different trends and moods in the market. The study culminates to a tool for analyzing data from the stock market in a way that minimizes the influence from short spikes and drops, and instead is influenced by the underlying pattern. But also, an additional tool for predicting future movements in the price.
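
As a rough illustration of the kind of nonlinear trend estimate the thesis pairs with change-point detection, the sketch below smooths a synthetic price series with kernel ridge regression and flags turns in the smoothed slope. The use of scikit-learn, the hyperparameters, and the data are assumptions; the thesis does not prescribe this implementation.

import numpy as np
from sklearn.kernel_ridge import KernelRidge

# synthetic "price" series: two regimes with different local trends
rng = np.random.default_rng(3)
t = np.arange(300, dtype=float)
price = np.concatenate([100 + 0.10 * t[:150],
                        115 - 0.05 * (t[150:] - 150)]) + rng.normal(0, 0.8, 300)

# kernel ridge regression on the time index gives a smooth trend estimate
krr = KernelRidge(kernel="rbf", alpha=1.0, gamma=1e-3)
krr.fit(t[:, None], price)
trend = krr.predict(t[:, None])

# crude regime indicator: sign changes of the smoothed slope suggest trend turns
slope = np.gradient(trend)
turns = np.where(np.diff(np.sign(slope)) != 0)[0]
print(turns)
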
2

Gao, Zhenguo. "Variance Change Point Detection under A Smoothly-changing Mean Trend with Application to Liver Procurement". Diss., Virginia Tech, 2018. http://hdl.handle.net/10919/82351.

Abstract
Literature on change point analysis mostly requires a sudden change in the data distribution, either in a few parameters or the distribution as a whole. We are interested in the scenario that the variance of data may make a significant jump while the mean of data changes in a smooth fashion. It is motivated by a liver procurement experiment with organ surface temperature monitoring. Blindly applying the existing change point analysis methods to the example can yield erratic change point estimates since the smoothly-changing mean violates the sudden-change assumption. In my dissertation, we propose a penalized weighted least squares approach with an iterative estimation procedure that naturally integrates variance change point detection and smooth mean function estimation. Given the variance components, the mean function is estimated by smoothing splines as the minimizer of the penalized weighted least squares. Given the mean function, we propose a likelihood ratio test statistic for identifying the variance change point. The null distribution of the test statistic is derived together with the rates of convergence of all the parameter estimates. Simulations show excellent performance of the proposed method. Application analysis offers numerical support to the non-invasive organ viability assessment by surface temperature monitoring. The method above can only yield the variance change point of temperature at a single point on the surface of the organ at a time. In practice, an organ is often transplanted as a whole or in part. Therefore, it is generally of more interest to study the variance change point for a chunk of organ. With this motivation, we extend our method to study variance change point for a chunk of the organ surface. Now the variances become functions on a 2D space of locations (longitude and latitude) and the mean is a function on a 3D space of location and time. We model the variance functions by thin-plate splines and the mean function by the tensor product of thin-plate splines and cubic splines. However, the additional dimensions in these functions incur serious computational problems since the sample size, as a product of the number of locations and the number of sampling time points, becomes too large to run the standard multi-dimensional spline models. To overcome the computational hurdle, we introduce a multi-stages subsampling strategy into our modified iterative algorithm. The strategy involves several down-sampling or subsampling steps educated by preliminary statistical measures. We carry out extensive simulations to show that the new method can efficiently cut down the computational cost and make a practically unsolvable problem solvable with reasonable time and satisfactory parameter estimates. Application of the new method to the liver surface temperature monitoring data shows its effectiveness in providing accurate status change information for a portion of or the whole organ.
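
A minimal sketch of the idea summarized above — remove a smoothly changing mean, then scan for a single variance change point by profile likelihood — could look as follows. The spline smoother, the margin, and the toy data are assumptions, not the dissertation's penalized iterative procedure.

import numpy as np
from scipy.interpolate import UnivariateSpline

def variance_change_point(t, y, smooth=None, margin=10):
    """Scan for one variance change point after removing a smooth mean trend."""
    spline = UnivariateSpline(t, y, s=smooth)  # smooth mean estimate
    r = y - spline(t)                          # residuals carry the variance signal
    n = len(r)
    best_tau, best_ll = None, -np.inf
    for tau in range(margin, n - margin):      # keep both segments non-trivial
        v1, v2 = r[:tau].var(), r[tau:].var()
        ll = -0.5 * (tau * np.log(v1) + (n - tau) * np.log(v2))
        if ll > best_ll:
            best_tau, best_ll = tau, ll
    return best_tau

# toy example: smooth cosine mean, noise variance jumps at t = 150
rng = np.random.default_rng(4)
t = np.arange(300, dtype=float)
mean = 30 + 3 * np.cos(2 * np.pi * t / 300)
noise = np.concatenate([rng.normal(0, 0.3, 150), rng.normal(0, 1.2, 150)])
print(variance_change_point(t, mean + noise))
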
3

Hedberg, Sofia. "Regional Quantification of Climatic and Anthropogenic Impacts on Streamflows in Sweden". Thesis, Uppsala universitet, Institutionen för geovetenskaper, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-269824.

Abstract
The anthropogenic impact on earth's systems has rapidly increased since the middle of the last century, and today it is hard to find a stream that is not influenced by human activities. Understanding the causes of change is important for future water management and planning, and for that reason the climatic and anthropogenic impacts on streamflow changes in Sweden were explored and quantified. In the first step, trends and abrupt changes in annual streamflow were detected and verified with the non-parametric Mann-Kendall and Pettitt tests, all performed as moving-window tests. In the second step, HBV, a climate-driven rainfall-runoff model, was used to attribute the causes of the detected changes. Detection and attribution of changes were performed on several catchments in order to investigate regional patterns. With smaller window sizes, periods with alternating positive and negative trends were found, whereas bigger window sizes resulted in positive trends in more than half of the catchments and almost no negative trends. The detected changes were highly dependent on the investigated time frame, due to periodicity, i.e. natural variability in streamflow. In general, the anthropogenic impact on streamflow changes was smaller than the impact of precipitation and temperature; in median, anthropogenic impact could explain 7% of the total change. No regional differences were found, which indicates that anthropogenic impact varies more between individual catchments than following a regional pattern.
4

Jawa, Taghreed Mohammed. "Statistical methods of detecting change points for the trend of count data". Thesis, University of Strathclyde, 2017. http://digitool.lib.strath.ac.uk:80/R/?func=dbin-jump-full&object_id=28854.

Abstract
In epidemiology, controlling infection is a crucial element. Since healthcare associated infections (HAIs) are correlated with increasing costs and mortality rates, effective healthcare interventions are required. Several healthcare interventions have been implemented in Scotland and subsequently Health Protection Scotland (HPS) reported a reduction in HAIs [HPS (2015b, 2016a)]. The aim of this thesis is to use statistical methods and change points analysis to detect the time when the rate of HAIs changed and determine which associated interventions may have impacted such rates. Change points are estimated from polynomial generalized linear models (GLM) and confidence intervals are constructed using bootstrap and delta methods and the two techniques are compared. Segmented regression is also used to look for change points at times when specific interventions took place. A generalization of segmented regression is known as joinpoint analysis which looks for potential change points at each time point in the data, which allows the change to have occurred at any point over time. The joinpoint model is adjusted by adding a seasonal effect to account for additional variability in the rates. Confidence intervals for joinpoints are constructed using bootstrap and profile likelihood methods and the two approaches are compared. Change points from the smoother trend of the generalized additive model (GAM) are also estimated and bootstrapping is used to construct confidence intervals. All methods were found to have similar change points. Segmented regression detects the actual point when an intervention took place. Polynomial GLM, spline GAM and joinpoint analysis models are useful when the impact of an intervention occurs after a period of time. Simulation studies are used to compare polynomial GLM, segmented regression and joinpoint analysis models for detecting change points along with their confidence intervals.
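
A stripped-down version of the joinpoint/segmented-regression idea for count data is sketched below: a Poisson GLM with one hinge term, with the joinpoint chosen by a likelihood grid search. The statsmodels-based code and the toy data are assumptions, not the thesis's implementation.

import numpy as np
import statsmodels.api as sm

def joinpoint_poisson(t, counts, candidates):
    """Grid-search one joinpoint k in the model log(mu) = b0 + b1*t + b2*max(t - k, 0)."""
    best_k, best_llf = None, -np.inf
    for k in candidates:
        X = sm.add_constant(np.column_stack([t, np.clip(t - k, 0, None)]))
        fit = sm.GLM(counts, X, family=sm.families.Poisson()).fit()
        if fit.llf > best_llf:
            best_k, best_llf = k, fit.llf
    return best_k

# toy example: monthly infection counts whose downward trend steepens after month 36
rng = np.random.default_rng(5)
t = np.arange(72, dtype=float)
mu = np.exp(3.5 - 0.01 * t - 0.04 * np.clip(t - 36, 0, None))
counts = rng.poisson(mu)
print(joinpoint_poisson(t, counts, candidates=range(12, 60)))
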
5

Garreau, Damien. "Change-point detection and kernel methods". Thesis, Paris Sciences et Lettres (ComUE), 2017. http://www.theses.fr/2017PSLEE061/document.

Abstract
In this thesis, we focus on a method for detecting abrupt changes in a sequence of independent observations belonging to an arbitrary set on which a positive semidefinite kernel is defined. That method, kernel changepoint detection, is a kernelized version of a penalized least-squares procedure. Our main contribution is to show that, for any kernel satisfying some reasonably mild hypotheses, this procedure outputs a segmentation close to the true segmentation with high probability. This result is obtained under a bounded assumption on the kernel for a linear penalty and for another penalty function, coming from model selection. The proofs rely on a concentration result for bounded random variables in Hilbert spaces and we prove a less powerful result under relaxed hypotheses—a finite variance assumption. In the asymptotic setting, we show that we recover the minimax rate for the change-point locations without additional hypothesis on the segment sizes. We provide empirical evidence supporting these claims. Another contribution of this thesis is the detailed presentation of the different notions of distances between segmentations. Additionally, we prove a result showing these different notions coincide for sufficiently close segmentations. From a practical point of view, we demonstrate how the so-called dimension jump heuristic can be a reasonable choice of penalty constant when using kernel changepoint detection with a linear penalty. We also show how a key quantity depending on the kernel that appears in our theoretical results influences the performance of kernel change-point detection in the case of a single change-point. When the kernel is translation invariant and parametric assumptions are made, it is possible to compute this quantity in closed-form. Thanks to these computations, some of them novel, we are able to study precisely the behavior of the maximal penalty constant. Finally, we study the median heuristic, a popular tool to set the bandwidth of radial basis function kernels. For a large sample size, we show that it behaves approximately as the median of a distribution that we describe completely in the setting of kernel two-sample test and kernel change-point detection. More precisely, we show that the median heuristic is asymptotically normal around this value.
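
The median heuristic examined in the last part of the thesis is simple to state in code: set the radial-basis-function bandwidth to the median pairwise distance of the observations, then build the Gram matrix that kernel change-point procedures work with. The sketch below is illustrative only; the function names and data are assumptions.

import numpy as np
from scipy.spatial.distance import pdist

def median_heuristic(X):
    """Median heuristic: use the median pairwise distance as the RBF bandwidth."""
    X = np.atleast_2d(X).reshape(len(X), -1)
    return np.median(pdist(X))

def rbf_gram(X, bandwidth):
    """Gram matrix k(x, x') = exp(-||x - x'||^2 / (2 h^2))."""
    X = np.atleast_2d(X).reshape(len(X), -1)
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (2.0 * bandwidth ** 2))

rng = np.random.default_rng(6)
x = np.concatenate([rng.normal(0, 1, 100), rng.normal(3, 1, 100)])
h = median_heuristic(x)
print(h, rbf_gram(x, h).shape)
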
6

Niu, Yue S., Ning Hao, and Heping Zhang. "Multiple Change-Point Detection: A Selective Overview". Institute of Mathematical Statistics, 2016. http://hdl.handle.net/10150/622820.

Abstract
Very long and noisy sequence data arise from biological sciences to social science including high throughput data in genomics and stock prices in econometrics. Often such data are collected in order to identify and understand shifts in trends, for example, from a bull market to a bear market in finance or from a normal number of chromosome copies to an excessive number of chromosome copies in genetics. Thus, identifying multiple change points in a long, possibly very long, sequence is an important problem. In this article, we review both classical and new multiple change-point detection strategies. Considering the long history and the extensive literature on the change-point detection, we provide an in-depth discussion on a normal mean change-point model from aspects of regression analysis, hypothesis testing, consistency and inference. In particular, we present a strategy to gather and aggregate local information for change-point detection that has become the cornerstone of several emerging methods because of its attractiveness in both computational and theoretical properties.
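
One common way to "gather and aggregate local information", as discussed in this overview, is a moving-window contrast between left and right means whose local maxima are screened against a threshold. The sketch below is a generic illustration of that strategy (in the spirit of screening-and-ranking procedures), not code from the article; the window length and threshold are arbitrary.

import numpy as np

def local_scan(y, h, threshold):
    """Screen for multiple mean change points with a two-sided moving-window contrast."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    d = np.zeros(n)
    for t in range(h, n - h):
        d[t] = abs(y[t:t + h].mean() - y[t - h:t].mean())
    # keep local maxima of the contrast that exceed the threshold
    return [t for t in range(h, n - h)
            if d[t] >= threshold and d[t] == d[max(0, t - h):t + h].max()]

# toy example: three mean levels
rng = np.random.default_rng(7)
y = np.concatenate([rng.normal(0, 1, 100), rng.normal(2, 1, 80), rng.normal(0.5, 1, 120)])
print(local_scan(y, h=20, threshold=1.2))
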
7

Yang, Ping. "Adaptive trend change detection and pattern recognition in physiological monitoring". Thesis, University of British Columbia, 2009. http://hdl.handle.net/2429/8932.

Abstract
Advances in monitoring technology have resulted in the collection of a vast amount of data that exceeds the simultaneous surveillance capabilities of expert clinicians in the clinical environment. To facilitate the clinical decision-making process, this thesis solves two fundamental problems in physiological monitoring: signal estimation and trend-pattern recognition. The general approach is to transform changes in different trend features to nonzero level-shifts by calculating the model-based forecast residuals and then to apply a statistical test or Bayesian approach on the residuals to detect changes. The EWMA-Cusum method describes a signal as the exponentially weighted moving average (EWMA) of historical data. This method is simple, robust, and applicable to most variables. The method based on the Dynamic Linear Model (referred to as the Adaptive-DLM method) describes a signal using the linear growth model combined with an EWMA model. An adaptive Kalman filter is used to estimate the second-order characteristics and adjust the change-detection process online. The Adaptive-DLM method is designed for monitoring variables measured at a high sampling rate. To address the intraoperative variability in variables measured at a low sampling rate, a generalized hidden Markov model is used to classify trend changes into different patterns and to describe the transition between these patterns as a first-order Markov-chain process. Trend patterns are recognized online with a quantitative evaluation of the occurrence probability. In addition to the univariate methods, a test statistic based on Factor Analysis is also proposed to investigate the inter-variable relationship and to reveal subtle clinical events. A novel hybrid median filter is also proposed to fuse heart-rate measurements from the ECG monitor, pulse oximeter, and arterial BP monitor to obtain accurate estimates of HR in the presence of artifacts. These methods have been tested using simulated and clinical data. The EWMA-Cusum and Adaptive-DLM methods have been implemented in a software system iAssist and evaluated by clinicians in the operating room. The results demonstrate that the proposed methods can effectively detect trend changes and assist clinicians in tracking the physiological state of a patient during surgery.
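
The EWMA-Cusum idea described above — forecast each sample with an exponentially weighted moving average and test the standardized forecast residuals — can be sketched as follows. The parameter values, the restart-after-alarm rule, and the toy signal are assumptions for illustration, not the thesis's implementation.

import numpy as np

def ewma_cusum(y, lam=0.2, k=0.5, h=5.0):
    """Monitor one-step EWMA forecast residuals with a two-sided CUSUM."""
    y = np.asarray(y, dtype=float)
    level = y[0]
    gp = gm = 0.0                  # upward / downward CUSUM statistics
    scale = np.std(y[:30])         # crude residual scale from a start-up window
    alarms = []
    for t in range(1, len(y)):
        resid = (y[t] - level) / scale  # standardized forecast residual
        gp = max(0.0, gp + resid - k)
        gm = max(0.0, gm - resid - k)
        if gp > h or gm > h:
            alarms.append(t)
            gp = gm = 0.0          # restart after an alarm
        level = lam * y[t] + (1 - lam) * level  # EWMA update
    return alarms

# toy example: heart-rate-like signal with an upward level shift at t = 300
rng = np.random.default_rng(8)
y = np.concatenate([rng.normal(75, 2, 300), rng.normal(82, 2, 200)])
print(ewma_cusum(y))
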
8

Mei, Yajun. "Asymptotically optimal methods for sequential change-point detection". Diss., Pasadena, Calif.: California Institute of Technology, 2003. http://resolver.caltech.edu/CaltechETD:etd-05292003-133431.
9

Geng, Jun. "Quickest Change-Point Detection with Sampling Right Constraints". Digital WPI, 2015. https://digitalcommons.wpi.edu/etd-dissertations/440.

Abstract
The quickest change-point detection problems with sampling right constraints are considered. Specially, an observer sequentially takes observations from a random sequence, whose distribution will change at an unknown time. Based on the observation sequence, the observer wants to identify the change-point as quickly as possible. Unlike the classic quickest detection problem in which the observer can take an observation at each time slot, we impose a causal sampling right constraint to the observer. In particular, sampling rights are consumed when the observer takes an observation and are replenished randomly by a stochastic process. The observer cannot take observations if there is no sampling right left. The causal sampling right constraint is motivated by several practical applications. For example, in the application of sensor network for monitoring the abrupt change of its ambient environment, the sensor can only take observations if it has energy left in its battery. With this additional constraint, we design and analyze the optimal detection and sampling right allocation strategies to minimize the detection delay under various problem setups. As one of our main contributions, a greedy sampling right allocation strategy, by which the observer spends sampling rights in taking observations as long as there are sampling rights left, is proposed. This strategy possesses a low complexity structure, and leads to simple but (asymptotically) optimal detection algorithms for the problems under consideration. Specially, our main results include: 1) Non-Bayesian quickest change-point detection: we consider non-Bayesian quickest detection problem with stochastic sampling right constraint. Two criteria, namely the algorithm level average run length (ARL) and the system level ARL, are proposed to control the false alarm rate. We show that the greedy sampling right allocation strategy combined with the cumulative sum (CUSUM) algorithm is optimal for Lorden's setup with the algorithm level ARL constraint and is asymptotically optimal for both Lorden's and Pollak's setups with the system level ARL constraint. 2) Bayesian quickest change-point detection: both limited sampling right constraint and stochastic sampling right constraint are considered in the Bayesian quickest detection problem. The limited sampling right constraint can be viewed as a special case of the stochastic sampling right constraint with a zero sampling right replenishing rate. The optimal solutions are derived for both sampling right constraints. However, the structure of the optimal solutions are rather complex. For the problem with the limited sampling right constraint, we provide asymptotic upper and lower bounds for the detection delay. For the problem with the stochastic sampling right constraint, we show that the greedy sampling right allocation strategy combined with Shiryaev's detection rule is asymptotically optimal. 3) Quickest change-point detection with unknown post-change parameters: we extend previous results to the quickest detection problem with unknown post-change parameters. Both non-Bayesian and Bayesian setups with stochastic sampling right constraints are considered. For the non-Bayesian problem, we show that the greedy sampling right allocation strategy combined with the M-CUSUM algorithm is asymptotically optimal. For the Bayesian setups, we show that the greedy sampling right allocation strategy combined with the proposed M-Shiryaev algorithm is asymptotically optimal.
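
A toy version of the greedy sampling-right strategy is sketched below: sampling rights accumulate at random, an observation is taken whenever a whole right is available, and a CUSUM of Gaussian log-likelihood ratios raises the alarm. The thresholds, distributions, and replenishment process are invented for illustration and do not reproduce the dissertation's analysis.

import numpy as np

def greedy_cusum(obs, replenish, mu0=0.0, mu1=1.0, sigma=1.0, h=8.0):
    """Greedy allocation: spend a sampling right as soon as one is available, update a CUSUM."""
    credit, w = 0.0, 0.0
    for t, x in enumerate(obs):
        credit += replenish[t]        # stochastic replenishment of sampling rights
        if credit >= 1.0:             # greedy: observe whenever a right is in stock
            credit -= 1.0
            # log-likelihood ratio of N(mu1, sigma^2) against N(mu0, sigma^2)
            llr = (mu1 - mu0) * (x - 0.5 * (mu0 + mu1)) / sigma ** 2
            w = max(0.0, w + llr)     # CUSUM recursion
            if w > h:
                return t              # alarm time
    return None

# toy example: change from N(0,1) to N(1,1) at t = 400, about 0.7 sampling rights per slot
rng = np.random.default_rng(9)
obs = np.concatenate([rng.normal(0, 1, 400), rng.normal(1, 1, 300)])
rights = rng.binomial(1, 0.7, size=len(obs)).astype(float)
print(greedy_cusum(obs, rights))
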
10

Schröder, Anna Louise. "Methods for change-point detection with additional interpretability". Thesis, London School of Economics and Political Science (University of London), 2016. http://etheses.lse.ac.uk/3421/.

Abstract
The main purpose of this dissertation is to introduce and critically assess some novel statistical methods for change-point detection that help better understand the nature of processes underlying observable time series. First, we advocate the use of change-point detection for local trend estimation in financial return data and propose a new approach developed to capture the oscillatory behaviour of financial returns around piecewise-constant trend functions. Core of the method is a data-adaptive hierarchically-ordered basis of Unbalanced Haar vectors which decomposes the piecewise-constant trend underlying observed daily returns into a binary-tree structure of one-step constant functions. We illustrate how this framework can provide a new perspective for the interpretation of change points in financial returns. Moreover, the approach yields a family of forecasting operators for financial return series which can be adjusted flexibly depending on the forecast horizon or the loss function. Second, we discuss change-point detection under model misspecification, focusing in particular on normally distributed data with changing mean and variance. We argue that ignoring the presence of changes in mean or variance when testing for changes in, respectively, variance or mean, can affect the application of statistical methods negatively. After illustrating the difficulties arising from this kind of model misspecification we propose a new method to address these using sequential testing on intervals with varying length and show in a simulation study how this approach compares to competitors in mixed-change situations. The third contribution of this thesis is a data-adaptive procedure to evaluate EEG data, which can improve the understanding of an epileptic seizure recording. This change-point detection method characterizes the evolution of frequency-specific energy as measured on the human scalp. It provides new insights to this high dimensional high frequency data and has attractive computational and scalability features. In addition to contrasting our method with existing approaches, we analyse and interpret the method’s output in the application to a seizure data set.

Books on the topic "Change point and trend detection"

1

Poor, H. Vincent, and Olympia Hadjiliadis. Quickest Detection. Cambridge: Cambridge University Press, 2009.
2

Rahat, Gideon, and Ofer Kenig. A Cross-National Comparison of Party Change. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780198808008.003.0005.

Abstract
The chapter presents an integrative cross-national comparative analysis of party change in all twenty-six countries under study. It finds that, apart from often sharing the same decline trend, the various indicators of party change appear to be independent of one another. The indicators that refer to the various mediators and those that refer to voters point clearly to decline, while those that look at the party background of representatives stand out as prominent survivors of a major decline trend. Party decline is evident in almost all countries, but its levels vary. Many explanations for variance are ruled out, leaving room for the possibility that human agency rather than environmental factors may prove to be the cause. Parties are not on their way out, but in some countries they have already experienced sharp decline, while in others their experience may be better described as adaptation. In many others still, parties lie somewhere in-between these poles.
3

Kennett, Douglas J., and David A. Hodell. AD 750–1100 Climate Change and Critical Transitions in Classic Maya Sociopolitical Networks. Oxford University Press, 2017. http://dx.doi.org/10.1093/oso/9780199329199.003.0007.

Abstract
Multiple palaeoclimatic reconstructions point to a succession of major droughts in the Maya Lowlands between AD 750 and 1100 superimposed on a regional drying trend that itself was marked by considerable spatial and temporal variability. The longest and most severe regional droughts occurred between AD 800 and 900 and again between AD 1000 and 1100. Well-dated historical records carved on stone monuments from forty Classic Period civic-ceremonial centers reflect a dynamic sociopolitical landscape between AD 250 and 800 marked by a complex of antagonistic, diplomatic, lineage-based, and subordinate networks. Warfare between Maya polities increased between AD 600 and 800 within the context of population expansion and long-term environmental degradation exacerbated by increasing drought. Nevertheless, in spite of the clear effects of drought on network collapse during the Classic Period, one lingering question is why polities in the northern lowlands persisted and even flourished between AD 800 and 1000 (Puuc Maya and Chichén Itzá) before they too fragmented during an extended and severe regional drought between AD 1000 and 1100. Here we review available regional climate records during this critical transition and consider the different sociopolitical trajectories in the South/Central versus Northern Maya lowlands.
4

Rahat, Gideon, and Ofer Kenig. From Party Politics to Personalized Politics? Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780198808008.001.0001.

Abstract
The book examines two of the most prominent developments in contemporary democratic politics, party change and political personalization, and the relationship between them. It presents a broad-brush, cross-national comparison of these phenomena that covers around fifty years in twenty-six countries through the use of more than twenty indicators. It demonstrates that, behind a general trend of decline of political parties, there is much variance among countries. In some, party decline is moderate or even small, which may point to adaptation to the changing environments these parties operate in. In others, parties sharply decline. Most cases fall between these two poles. A clear general trend of personalization in politics is identified, but there are large differences among countries in its magnitude and manifestations. Surprisingly, the online world seems to supply parties with an opportunity to revive. When parties decline, personalization increases. Yet these are far from being perfect zero-sum relationships, which leaves room for the possibility that other political actors may step in when parties decline and that, in some cases, personalization may not hurt parties; it may even strengthen them. Personalization is a big challenge to parties. But parties were, are, and will remain a solution to the problem of collective action, of channeling personal energies to the benefit of the group. Thus they can cope with personalization and even use it to their advantage.
5

Fensholt, Rasmus, Cheikh Mbow, Martin Brandt, and Kjeld Rasmussen. Desertification and Re-Greening of the Sahel. Oxford University Press, 2017. http://dx.doi.org/10.1093/acrefore/9780190228620.013.553.

Abstract
In the past 50 years, human activities and climatic variability have caused major environmental changes in the semi-arid Sahelian zone and desertification/degradation of arable lands is of major concern for livelihoods and food security. In the wake of the Sahel droughts in the early 1970s and 1980s, the UN focused on the problem of desertification by organizing the UN Conference on Desertification (UNCOD) in Nairobi in 1976. This fuelled a significant increase in the often alarmist popular accounts of desertification as well as scientific efforts in providing an understanding of the mechanisms involved. The global interest in the subject led to the nomination of desertification as focal point for one of three international environmental conventions: the UN Convention to Combat Desertification (UNCCD), emerging from the Rio conference in 1992. This implied that substantial efforts were made to quantify the extent of desertification and to understand its causes. Desertification is a complex and multi-faceted phenomenon aggravating poverty that can be seen as both a cause and a consequence of land resource depletion. As reflected in its definition adopted by the UNCCD, desertification is “land degradation in arid, semi-arid[,] and dry sub-humid areas resulting from various factors, including climate variation and human activities” (UN, 1992). While desertification was seen as a phenomenon of relevance to drylands globally, the Sahel-Sudan region remained a region of specific interest and a significant amount of scientific efforts have been invested to provide an empirically supported understanding of both climatic and anthropogenic factors involved. Despite decades of intensive research on human–environmental systems in the Sahel, there is no overall consensus about the severity of desertification and the scientific literature is characterized by a range of conflicting observations and interpretations of the environmental conditions in the region. Earth Observation (EO) studies generally show a positive trend in rainfall and vegetation greenness over the last decades for the majority of the Sahel and this has been interpreted as an increase in biomass and contradicts narratives of a vicious cycle of widespread degradation caused by human overuse and climate change. Even though an increase in vegetation greenness, as observed from EO data, can be confirmed by ground observations, long-term assessments of biodiversity at finer spatial scales highlight a negative trend in species diversity in several studies and overall it remains unclear if the observed positive trends provide an environmental improvement with positive effects on people’s livelihood.
6

Kenyon, Ian R. Quantum 20/20. Oxford University Press, 2019. http://dx.doi.org/10.1093/oso/9780198808350.001.0001.

Abstract
This text reviews fundamentals and incorporates key themes of quantum physics. One theme contrasts boson condensation and fermion exclusivity. Bose–Einstein condensation is basic to superconductivity, superfluidity and gaseous BEC. Fermion exclusivity leads to compact stars and to atomic structure, and thence to the band structure of metals and semiconductors with applications in material science, modern optics and electronics. A second theme is that a wavefunction at a point, and in particular its phase is unique (ignoring a global phase change). If there are symmetries, conservation laws follow and quantum states which are eigenfunctions of the conserved quantities. By contrast with no particular symmetry topological effects occur such as the Bohm–Aharonov effect: also stable vortex formation in superfluids, superconductors and BEC, all these having quantized circulation of some sort. The quantum Hall effect and quantum spin Hall effect are ab initio topological. A third theme is entanglement: a feature that distinguishes the quantum world from the classical world. This property led Einstein, Podolsky and Rosen to the view that quantum mechanics is an incomplete physical theory. Bell proposed the way that any underlying local hidden variable theory could be tested, and it was experimentally rejected. Powerful tools in quantum optics, including near-term secure communications, rely on entanglement. It was exploited in the measurement of CP violation in the decay of beauty mesons. A fourth theme is the limitations on measurement precision set by quantum mechanics. These can be circumvented by quantum non-demolition techniques and by squeezing phase space so that the uncertainty is moved to a variable conjugate to that being measured. The boundaries of precision are explored in the measurement of g-2 for the electron, and in the detection of gravitational waves by LIGO; the latter achievement has opened a new window on the Universe. The fifth and last theme is quantum field theory. This is based on local conservation of charges. It reaches its most impressive form in the quantum gauge theories of the strong, electromagnetic and weak interactions, culminating in the discovery of the Higgs. Where particle physics has particles, condensed matter has a galaxy of pseudoparticles that exist only in matter and are always in some sense special to particular states of matter. Emergent phenomena in matter are successfully modelled and analysed using quasiparticles and quantum theory. Lessons learned in that way on spontaneous symmetry breaking in superconductivity were the key to constructing a consistent quantum gauge theory of electroweak processes in particle physics.

Book chapters on the topic "Change point and trend detection"

1

Ballová, Dominika. "Trend Analysis and Detection of Change-Points of Selected Financial and Market Indices". In Advances in Intelligent Systems and Computing, 372–81. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-18058-4_30.
2

Park, Chiwoo, and Yu Ding. "Change Point Detection". In Data Science for Nano Image Analysis, 241–75. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-72822-9_9.
3

Sy, Bon K., and Arjun K. Gupta. "Change Point Detection Techniques". In The Kluwer International Series in Engineering and Computer Science, 93–98. Boston, MA: Springer US, 2004. http://dx.doi.org/10.1007/978-1-4419-9001-3_7.
4

Chang, Seo-Won, Yong-Ik Byun, and Jaegyoon Hahm. "Variability Detection by Change-Point Analysis". In Lecture Notes in Statistics, 491–93. New York, NY: Springer New York, 2012. http://dx.doi.org/10.1007/978-1-4614-3520-4_48.
5

Isupova, Olga. "Change Point Detection with Gaussian Processes". In Machine Learning Methods for Behaviour Analysis and Anomaly Detection in Video, 83–104. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-75508-3_5.
6

Horváth, Lajos, Zsuzsanna Horváth, and Marie Hušková. "Ratio tests for change point detection". In Beyond Parametrics in Interdisciplinary Research: Festschrift in Honor of Professor Pranab K. Sen, 293–304. Beachwood, Ohio, USA: Institute of Mathematical Statistics, 2008. http://dx.doi.org/10.1214/193940307000000220.
7

Tatti, Nikolaj. "Fast Likelihood-Based Change Point Detection". In Machine Learning and Knowledge Discovery in Databases, 662–77. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-46150-8_39.
8

Staude, Gerhard, and Werner Wolf. "Change-Point Detection in Kinetic Signals". In Medical Data Analysis, 43–48. Berlin, Heidelberg: Springer Berlin Heidelberg, 2000. http://dx.doi.org/10.1007/3-540-39949-6_7.
9

Brodsky, B. E., and B. S. Darkhovsky. "Disorder Detection of Random Fields". In Nonparametric Methods in Change-Point Problems, 151–68. Dordrecht: Springer Netherlands, 1993. http://dx.doi.org/10.1007/978-94-015-8163-9_5.
10

Mayer, Brandon A., and Joseph L. Mundy. "Change Point Geometry for Change Detection in Surveillance Video". In Image Analysis, 377–87. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-19665-7_31.

Conference proceedings on the topic "Change point and trend detection"

1

Klyushin, Dmitriy, and Kateryna Golubeva. "Nonparametric Multiple Comparison Test for Change-Point Detection in Big Data". In 2020 IEEE 2nd International Conference on Advanced Trends in Information Theory (ATIT). IEEE, 2020. http://dx.doi.org/10.1109/atit50783.2020.9349323.
2

Srivastava, Abhishek, P. K. Kapur y Deepti Mehrotra. "Modelling fault detection with change-point in agile software development environment". En 2017 International Conference on Infocom Technologies and Unmanned Systems (Trends and Future Directions) (ICTUS). IEEE, 2017. http://dx.doi.org/10.1109/ictus.2017.8286023.

3

DePold, Hans, Jason Seigel, Allan Volponi and Jonthan Hull. "Validation of Diagnostic Data With Statistical Analysis and Embedded Knowledge". In ASME Turbo Expo 2003, collocated with the 2003 International Joint Power Generation Conference. ASMEDC, 2003. http://dx.doi.org/10.1115/gt2003-38764.

Abstract
The method described in this paper is intended to improve data stability and reduce the dispersion of a given parameter signal by replacing any point deemed to be noise (e.g. an outlier) with an estimate of the parameter based on its recent history. The method determines whether a point is an outlier by utilizing learning and analytical elements that automatically adjust to the dispersion of the input data and automatically model the underlying process. The analytical elements of this method contain two types of embedded knowledge. The first is physics-based engineering knowledge of known interrelationships between the parameters, which provides evidence when a parameter data point is physically unexplainable. The second is rule-based knowledge of real gas turbine trend characteristics such as persistency, polarity, and monotonic direction. Once a parameter data point is suspected to be noise, the system validates that assessment with a persistency check. If no parameter trend shift is occurring, as determined by a lack of persistency, the point is deemed noise and is replaced with the parameter’s last good state. Persistency checks enable removing noise without obscuring shifts in the data that could be occurring due to real system changes. The process is designed to decide within two points whether a data point should be considered noise. The goal of the process presented here for improving the quality of gas generator data is to automatically replace all probable noise without distorting the underlying parameter signals. The metrics for success are a 25% reduction in the dispersion (e.g. scatter) of the data, no bias in the parameter central tendency (e.g. mean value), no reduction in the granularity of the parameter signal (visibility of anomalies), and no delay in the detection of a real trend change.
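The noise-replacement and persistency logic summarized in this abstract can be illustrated with a short sketch. The Python fragment below is a minimal, hypothetical reconstruction that assumes a simple dispersion-based outlier threshold and a two-point persistency window; the published method additionally embeds physics-based and rule-based gas turbine knowledge that is not reproduced here, and the function name and parameters are illustrative only.

```python
import numpy as np

def replace_noise(signal, k=3.0, window=20):
    """Replace isolated outliers with the last accepted value, but keep
    persistent deviations (two consecutive flagged points) as real shifts.
    Hypothetical sketch of the idea only, not the published algorithm."""
    signal = np.asarray(signal, dtype=float)
    cleaned = list(signal[:window])      # seed the recent-history buffer
    pending = None                       # first flagged point, awaiting confirmation
    for x in signal[window:]:
        recent = np.asarray(cleaned[-window:])
        mu, sigma = recent.mean(), recent.std(ddof=1)
        is_outlier = sigma > 0 and abs(x - mu) > k * sigma
        if not is_outlier:
            cleaned.append(x)
            pending = None               # lack of persistency: earlier point stays replaced
        elif pending is None:
            pending = x                  # hold the suspect point for one step
            cleaned.append(cleaned[-1])  # provisionally repeat the last good state
        else:
            cleaned[-1] = pending        # persistency confirmed: restore both points
            cleaned.append(x)            # so a real shift is not obscured
            pending = None
    return np.asarray(cleaned)
```

In this sketch an isolated spike is overwritten within one step, while two consecutive flagged points are kept, mirroring the paper's requirement of deciding within two points whether a value is noise.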
4

Domaradzki, Andrzej. "AIS for Trend Change Detection". In 2007 6th International Conference on Computer Information Systems and Industrial Management Applications. IEEE, 2007. http://dx.doi.org/10.1109/cisim.2007.10.

5

Chen, Wenhua and C. C. Jay Kuo. "Change-point detection using wavelets". In Aerospace/Defense Sensing and Controls, edited by Joseph Picone. SPIE, 1996. http://dx.doi.org/10.1117/12.241984.

6

Xie, Yao, Meng Wang and Andrew Thompson. "Sketching for sequential change-point detection". In 2015 IEEE Global Conference on Signal and Information Processing (GlobalSIP). IEEE, 2015. http://dx.doi.org/10.1109/globalsip.2015.7418160.

7

Solomentsev, Olexander, Maksym Zaliskyi and Tetyana Gerasymenko. "Change-point detection during radar operation". In 2016 IEEE First International Conference on Data Stream Mining & Processing (DSMP). IEEE, 2016. http://dx.doi.org/10.1109/dsmp.2016.7583562.

8

Xie, Yao and D. Siegmund. "Sequential multi-sensor change-point detection". In 2013 Information Theory and Applications Workshop (ITA 2013). IEEE, 2013. http://dx.doi.org/10.1109/ita.2013.6502987.

9

Canzanese, Raymond, Moshe Kam and Spiros Mancoridis. "Multi-channel Change-Point Malware Detection". In 2013 7th IEEE International Conference on Software Security and Reliability (SERE). IEEE, 2013. http://dx.doi.org/10.1109/sere.2013.20.

10

Bouchikhi, Ikram, Andre Ferrari, Cedric Richard, Anthony Bourrier and Marc Bernot. "Kernel Based Online Change Point Detection". In 2019 27th European Signal Processing Conference (EUSIPCO). IEEE, 2019. http://dx.doi.org/10.23919/eusipco.2019.8902582.


Reports on the topic "Change point and trend detection"

1

Siegmund, David. Change-Point Detection and Adaptive Control of Time-Varying Systems. Fort Belvoir, VA: Defense Technical Information Center, September 1993. http://dx.doi.org/10.21236/ada273509.

2

Tsunokai, Manabu. Quantification of Forecasting and Change-Point Detection Methods for Predictive Maintenance. Fort Belvoir, VA: Defense Technical Information Center, August 2015. http://dx.doi.org/10.21236/ada627305.

3

Mei, Yajun. Robust Rapid Change-Point Detection in Multi-Sensor Data Fusion and Behavior Research. Fort Belvoir, VA: Defense Technical Information Center, February 2011. http://dx.doi.org/10.21236/ada557750.

4

Rahmani, Mehran and Manan Naik. Structural Identification and Damage Detection in Bridges using Wave Method and Uniform Shear Beam Models: A Feasibility Study. Mineta Transportation Institute, February 2021. http://dx.doi.org/10.31979/mti.2021.1934.

Abstract
This report presents a wave method for the structural identification and damage detection of structural components in bridges, e.g., bridge piers. The method has proven promising when applied to real structures and large-amplitude responses in buildings (e.g., mid-rise and high-rise buildings). This study is the first application of the method to damaged bridge structures. The bridge identification was performed using wave propagation in a simple uniform shear beam model. The method identifies a wave velocity for the structure by fitting an equivalent uniform shear beam model to the impulse response functions of the recorded earthquake response. Structural damage is detected by measuring changes in the identified velocities from one damaging event to another. The method uses the acceleration response recorded in the structure to detect damage. In this study, the acceleration response from a four-span bridge tested to failure on a shake table was used. Pairs of sensors were identified to represent specific wave passages in the bridge. Wave velocities identified for several sensor pairs and various shaking intensities are reported; further, the actual observed damage in the bridge was compared with the detected reductions in the identified velocities. The results show that the identified shear wave velocities exhibited a decreasing trend as the shaking intensity increased, and the average percentage reduction in the velocities was consistent with the overall observed damage in the bridge. However, there was no clear correlation between a specific wave passage and the observed reduction in the velocities. This indicates that the uniform shear beam model was too simple to localize the damage in the bridge. Instead, it provides a proxy for the overall extent of change in the response due to damage.
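The damage indicator implied by this abstract, namely the percentage reduction in identified shear-wave velocities from one shaking event to the next, can be sketched in a few lines. The Python fragment below is a hypothetical illustration: the velocity values and the velocity_reduction helper are assumptions made for demonstration, and the actual identification of velocities from impulse response functions is not shown.

```python
import numpy as np

def velocity_reduction(v_reference, v_event):
    """Percentage reduction in identified shear-wave velocity between a
    reference (undamaged) event and a later, possibly damaging event."""
    return 100.0 * (v_reference - v_event) / v_reference

# Hypothetical identified velocities (m/s) for three sensor-pair wave passages,
# one row per shaking event of increasing intensity.
velocities = np.array([
    [420.0, 410.0, 435.0],   # low-intensity reference event
    [395.0, 388.0, 412.0],   # moderate shaking
    [340.0, 331.0, 356.0],   # severe shaking
])

reference = velocities[0]
for i, event in enumerate(velocities[1:], start=1):
    reductions = velocity_reduction(reference, event)
    # The mean reduction across passages tracks the overall extent of damage;
    # per-passage values would be needed to attempt damage localization.
    print(f"Event {i}: per-passage reduction {np.round(reductions, 1)} %, "
          f"mean {reductions.mean():.1f} %")
```

Averaging the reductions across passages mirrors the report's finding that the mean velocity reduction tracks the overall extent of damage even though individual passages did not localize it.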
5

McKenna, Patrick and Mark Evans. Emergency Relief and complex service delivery: Towards better outcomes. Queensland University of Technology, June 2021. http://dx.doi.org/10.5204/rep.eprints.211133.

Abstract
Emergency Relief (ER) is a Department of Social Services (DSS) funded program, delivered by 197 community organisations (ER Providers) across Australia, to assist people facing a financial crisis with financial/material aid and referrals to other support programs. ER has been playing this important role in Australian communities since 1979. Without ER, more people living in Australia who experience a financial crisis might face further harm such as crippling debt or homelessness. The Emergency Relief National Coordination Group (NCG) was established in April 2020 at the start of the COVID-19 pandemic to advise the Minister for Families and Social Services on the implementation of ER. To inform its advice to the Minister, the NCG partnered with the Institute for Governance at the University of Canberra to conduct research to understand the issues and challenges faced by ER Providers and Service Users in local contexts across Australia. The research involved a desktop review of the existing literature on ER service provision, a large survey which all Commonwealth ER Providers were invited to participate in (and 122 responses were received), interviews with a purposive sample of 18 ER Providers, and the development of a program logic and theory of change for the Commonwealth ER program to assess progress. The surveys and interviews focussed on ER Provider perceptions of the strengths, weaknesses, future challenges, and areas of improvement for current ER provision. The trend of increasing case complexity, the effectiveness of ER service delivery models in achieving outcomes for Service Users, and the significance of volunteering in the sector were investigated. Separately, an evaluation of the performance of the NCG was conducted and a summary of the evaluation is provided as an appendix to this report. Several themes emerged from the review of the existing literature such as service delivery shortcomings in dealing with case complexity, the effectiveness of case management, and repeat requests for service. Interviews with ER workers and Service Users found that an uplift in workforce capability was required to deal with increasing case complexity, leading to recommendations for more training and service standards. Several service evaluations found that ER delivered with case management led to high Service User satisfaction, played an integral role in transforming the lives of people with complex needs, and lowered repeat requests for service. A large longitudinal quantitative study revealed that more time spent with participants substantially decreased the number of repeat requests for service; and, given that repeat requests for service can be an indicator of entrenched poverty, not accessing further services is likely to suggest improvement. The interviews identified the main strengths of ER to be the rapid response and flexible use of funds to stabilise crisis situations and connect people to other supports through strong local networks. Service Users trusted the system because of these strengths, and ER was often an access point to holistic support. There were three main weaknesses identified. First, funding contracts were too short and did not cover the full costs of the program—in particular, case management for complex cases. Second, many Service Users were dependent on ER which was inconsistent with the definition and intent of the program. Third, there was inconsistency in the level of service received by Service Users in different geographic locations. 
These weaknesses can be improved upon with a joined-up approach featuring co-design and collaborative governance, leading to the successful commissioning of social services. The survey confirmed that volunteers were significant for ER, making up 92% of all workers and 51% of all hours worked in respondent ER programs. Of the 122 respondents, volunteers amounted to 554 full-time equivalents, a contribution valued at $39.4 million. In total there were 8,316 volunteers working in the 122 respondent ER programs. The sector can support and upskill these volunteers (and employees in addition) by developing scalable training solutions such as online training modules, updating ER service standards, and engaging in collaborative learning arrangements where large and small ER Providers share resources. More engagement with peak bodies such as Volunteering Australia might also assist the sector to improve the focus on volunteer engagement. Integrated services achieve better outcomes for complex ER cases—97% of survey respondents either agreed or strongly agreed this was the case. The research identified the dimensions of service integration most relevant to ER Providers to be case management, referrals, the breadth of services offered internally, co-location with interrelated service providers, an established network of support, workforce capability, and Service User engagement. Providers can individually focus on increasing the level of service integration for their ER program to improve their ability to deal with complex cases, which are clearly on the rise. At the system level, a more joined-up approach can also improve service integration across Australia. The key dimensions of this finding are discussed next in more detail. Case management is key for achieving Service User outcomes for complex cases—89% of survey respondents either agreed or strongly agreed this was the case. Interviewees most frequently said they would provide more case management if they could change their service model. Case management allows for more time spent with the Service User, follow up with referral partners, and a higher level of expertise in service delivery to support complex cases. Of course, it is a costly model and not currently funded for all Service Users through ER. Where case management is not available as part of ER, it might be available through a related service that is part of a network of support. Where possible, ER Providers should facilitate access to case management for Service Users who would benefit. At a system level, ER models with a greater component of case management could be implemented as test cases. Referral systems are also key for achieving Service User outcomes, which is reflected in the ER Program Logic presented on page 31. The survey and interview data show that referrals within an integrated service (internal) or in a service hub (co-located) are most effective. Where this is not possible, warm referrals within a trusted network of support are more effective than cold referrals leading to higher take-up and beneficial Service User outcomes. However, cold referrals are most common, pointing to a weakness in ER referral systems. This is because ER Providers do not operate or co-locate with interrelated services in many cases, nor do they have the case management capacity to provide warm referrals in many other cases. 
For mental illness support, which interviewees identified as one of the most difficult issues to deal with, ER Providers offer an integrated service only 23% of the time, warm referrals 34% of the time, and cold referrals 43% of the time. A focus on referral systems at the individual ER Provider level, and system level through a joined-up approach, might lead to better outcomes for Service Users. The program logic and theory of change for ER have been documented with input from the research findings and included in Section 4.3 on page 31. These show that ER helps people facing a financial crisis to meet their immediate needs, avoid further harm, and access a path to recovery. The research demonstrates that ER is fundamental to supporting vulnerable people in Australia and should therefore continue to be funded by government.
