Dissertations / Theses on the topic 'Long-range dependence'


Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 dissertations / theses for your research on the topic 'Long-range dependence.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse dissertations / theses from a wide variety of disciplines and organise your bibliography correctly.

1

Vivero, Oskar. "Estimation of long-range dependence." Thesis, University of Manchester, 2010. https://www.research.manchester.ac.uk/portal/en/theses/estimation-of-longrange-dependence(65565876-4ec6-44b3-8181-51b13dca309c).html.

Full text
Abstract:
A set of observations from a random process is regarded as long-range dependent if its correlations decay more slowly than exponentially. This phenomenon has stimulated great interest in the scientific community, as it appears in a wide range of areas of knowledge. For example, this property has been observed in data pertaining to electronics, econometrics, hydrology and biomedical signals. There exist several estimation methods for finding model parameters that help explain a set of observations exhibiting long-range dependence. Among these methods, maximum likelihood is attractive, given its desirable statistical properties such as asymptotic consistency and efficiency. However, its computational complexity makes the implementation of maximum likelihood prohibitive. This thesis presents a group of computationally efficient estimators based on the maximum likelihood framework. The thesis consists of two main parts. The first part is devoted to developing a computationally efficient alternative to the maximum likelihood estimate. This alternative is based on the circulant embedding concept, and it is shown to maintain the desirable statistical properties of maximum likelihood. Interesting results are obtained by analysing the circulant embedding estimate. In particular, this thesis shows that the maximum likelihood based methods are ill-conditioned: the estimators' performance deteriorates significantly when the set of observations is corrupted by errors. The second part of this thesis focuses on developing computationally efficient estimators with improved performance in the presence of errors in the observations.
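The defining property, correlations decaying more slowly than exponentially, can be made concrete with a small numerical sketch of our own (not from the thesis): the autocovariance of fractional Gaussian noise decays as a power law, while an AR(1) autocorrelation decays geometrically.

```python
def fgn_acv(k, H):
    """Autocovariance of unit-variance fractional Gaussian noise at lag k >= 1."""
    k = abs(k)
    return 0.5 * ((k + 1) ** (2 * H) - 2 * k ** (2 * H) + (k - 1) ** (2 * H))

def ar1_acf(k, phi):
    """Autocorrelation of a stationary AR(1) process at lag k."""
    return phi ** abs(k)

H, phi = 0.9, 0.9
for k in (1, 10, 100):
    print(k, fgn_acv(k, H), ar1_acf(k, phi))
# For large k, fgn_acv(k, H) behaves like H*(2H - 1)*k**(2H - 2): a power law,
# so the correlations are not summable (long-range dependence), whereas the
# AR(1) correlations decay geometrically and are summable (short-range).
```

At lag 100 the fractional Gaussian noise correlation is still of order 0.3 for H = 0.9, while the AR(1) correlation is already negligible.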
APA, Harvard, Vancouver, ISO, and other styles
2

Carpio, Kristine Joy Espiritu. "Long-Range Dependence of Markov Processes." The Australian National University. School of Mathematical Sciences, 2006. http://thesis.anu.edu.au./public/adt-ANU20061024.131933.

Full text
Abstract:
Long-range dependence in discrete- and continuous-time Markov chains over a countable state space is defined via embedded renewal processes brought about by visits to a fixed state. In the discrete-time chain, solidarity properties are obtained and long-range dependence of functionals is examined. For continuous-time chains, LRD is defined via the number of visits in a given time interval. Long-range dependence of Markov chains over a non-countable state space is also carried out through positive Harris chains. Embedded renewal processes in these chains exist via visits to sets of states called proper atoms. Examples of these chains are presented, with particular attention given to long-range dependent Markov chains in single-server queues, namely, the waiting times of GI/G/1 queues and queue lengths at departure epochs in M/G/1 queues. The presence of long-range dependence in these processes depends on the moment index of the lifetime distribution of the service times. The Hurst indexes are obtained under certain conditions on the distribution function of the service times and the structure of the correlations. These processes of waiting times and queue sizes are also examined in a range of M/P/2 queues via simulation (here, P denotes a Pareto distribution).
3

Carpio, Kristine Joy Espiritu. "Long-range dependence of Markov processes /." View thesis entry in Australian Digital Theses Program, 2006. http://thesis.anu.edu.au/public/adt-ANU20061024.131933/index.html.

Full text
4

Gaigalas, Raimundas. "A Non-Gaussian Limit Process with Long-Range Dependence." Doctoral thesis, Uppsala : Matematiska institutionen, Univ. [distributör], 2004. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-3993.

Full text
5

Trovero, Michele A. Smith Richard L. "Effects of aggregation on estimators of long-range dependence." Chapel Hill, N.C. : University of North Carolina at Chapel Hill, 2007. http://dc.lib.unc.edu/u?/etd,1170.

Full text
Abstract:
Thesis (Ph. D.)--University of North Carolina at Chapel Hill, 2007.
Title from electronic title page (viewed Mar. 27, 2008). "... in partial fulfillment of the requirements for the degree of Doctor of Philosophy in the Department of Statistics and Operations Research." Discipline: Statistics and Operations Research; Department/School: Statistics and Operations Research.
6

Finlay, Richard. "The Variance Gamma (VG) Model with Long Range Dependence." Thesis, The University of Sydney, 2009. http://hdl.handle.net/2123/5434.

Full text
Abstract:
This thesis mainly builds on the Variance Gamma (VG) model for financial assets over time of Madan & Seneta (1990) and Madan, Carr & Chang (1998), although the model based on the t distribution championed in Heyde & Leonenko (2005) is also given attention. The primary contribution of the thesis is the development of VG models, and the extension of t models, which accommodate a dependence structure in asset price returns. In particular, it has become increasingly clear that while returns (log price increments) of historical financial asset time series appear reasonably well approximated by independent and identically distributed data, squared and absolute returns do not. In fact, squared and absolute returns show evidence of being long-range dependent through time, with autocorrelation functions that are still significant after 50 to 100 lags. Given this evidence against the assumption of independent returns, it is important that models for financial assets be able to accommodate a dependence structure.
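The stylized fact described here, roughly uncorrelated returns whose squares are strongly autocorrelated, is easy to reproduce with a toy stochastic-volatility simulation. This is an illustrative stand-in of our own, not the VG or t model of the thesis:

```python
import numpy as np

def sample_acf(x, lag):
    """Sample autocorrelation of x at the given lag."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    return np.dot(x[:-lag], x[lag:]) / np.dot(x, x)

rng = np.random.default_rng(0)
n = 50_000
# Log-volatility follows a persistent AR(1); returns are volatility times iid noise.
logvol = np.zeros(n)
eta = 0.1 * rng.standard_normal(n)
for t in range(1, n):
    logvol[t] = 0.98 * logvol[t - 1] + eta[t]
returns = np.exp(logvol) * rng.standard_normal(n)

print(sample_acf(returns, 10))       # near zero: returns look independent
print(sample_acf(returns ** 2, 10))  # clearly positive: squares are dependent
```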
7

Finlay, Richard. "The Variance Gamma (VG) Model with Long Range Dependence." University of Sydney, 2009. http://hdl.handle.net/2123/5434.

Full text
Abstract:
Doctor of Philosophy (PhD)
This thesis mainly builds on the Variance Gamma (VG) model for financial assets over time of Madan & Seneta (1990) and Madan, Carr & Chang (1998), although the model based on the t distribution championed in Heyde & Leonenko (2005) is also given attention. The primary contribution of the thesis is the development of VG models, and the extension of t models, which accommodate a dependence structure in asset price returns. In particular, it has become increasingly clear that while returns (log price increments) of historical financial asset time series appear reasonably well approximated by independent and identically distributed data, squared and absolute returns do not. In fact, squared and absolute returns show evidence of being long-range dependent through time, with autocorrelation functions that are still significant after 50 to 100 lags. Given this evidence against the assumption of independent returns, it is important that models for financial assets be able to accommodate a dependence structure.
8

Rust, Henning. "Detection of long-range dependence : applications in climatology and hydrology." Phd thesis, Universität Potsdam, 2007. http://opus.kobv.de/ubp/volltexte/2007/1334/.

Full text
Abstract:
It is desirable to reduce the potential threats that result from the variability of nature, such as droughts or heat waves that lead to food shortage, or, at the other extreme, floods that lead to severe damage. To prevent such catastrophic events, it is necessary to understand, and to be capable of characterising, nature's variability. Typically one aims to describe the underlying dynamics of geophysical records with differential equations. There are, however, situations where this does not support the objectives, or is not feasible, e.g., when little is known about the system, or when it is too complex for the model parameters to be identified. In such situations it is beneficial to regard certain influences as random and to describe them with stochastic processes. In this thesis I focus on such a description with linear stochastic processes of the FARIMA type and concentrate on the detection of long-range dependence. Long-range dependent processes show an algebraic (i.e. slow) decay of the autocorrelation function. Detecting this decay is important with respect to, e.g., trend tests and uncertainty analysis. Aiming to provide a reliable and powerful strategy for the detection of long-range dependence, I suggest a way of addressing the problem which differs somewhat from standard approaches. Commonly used methods are based either on investigating the asymptotic behaviour (e.g., log-periodogram regression), or on finding a suitable, potentially long-range dependent model (e.g., FARIMA[p,d,q]) and testing the fractional difference parameter d for compatibility with zero. Here, I suggest rephrasing the problem as a model selection task, i.e. comparing the most suitable long-range dependent and the most suitable short-range dependent model.
Approaching the task this way requires a) a suitable class of long-range and short-range dependent models, along with suitable means of parameter estimation, and b) a reliable model selection strategy, capable of discriminating also between non-nested models. The flexible FARIMA model class together with the Whittle estimator fulfils the first requirement. Standard model selection strategies, e.g., the likelihood-ratio test, are frequently not powerful enough for a comparison of non-nested models. Thus, I suggest extending this strategy with a simulation-based model selection approach suitable for such a direct comparison. The approach follows the procedure of a statistical test, with the likelihood ratio as the test statistic. Its distribution is obtained via simulations using the two models under consideration. For two simple models and different parameter values, I investigate the reliability of p-value and power estimates obtained from the simulated distributions. The result turns out to depend on the model parameters; in many cases, however, the estimates allow an adequate model selection to be established. An important feature of this approach is that it immediately reveals the ability or inability to discriminate between the two models under consideration. Two applications, a trend detection problem in temperature records and an uncertainty analysis for flood return level estimation, accentuate the importance of having reliable methods at hand for the detection of long-range dependence. In the case of trend detection, falsely concluding long-range dependence implies an underestimation of a trend and possibly delays the measures needed to counteract it. Ignoring long-range dependence, although present, leads to an underestimation of confidence intervals and thus to an unjustified belief in safety, as is the case for the return level uncertainty analysis.
A reliable detection of long-range dependence is thus highly relevant in practical applications. Examples related to extreme value analysis are not limited to hydrological applications. The increased uncertainty of return level estimates is a potential problem for all records from autocorrelated processes; an interesting example in this respect is the assessment of the maximum strength of wind gusts, which is important for designing wind turbines. The detection of long-range dependence is also a relevant problem in the exploration of financial market volatility. By rephrasing the detection problem as a model selection task and suggesting refined methods for model comparison, this thesis contributes to the discussion on, and development of, methods for the detection of long-range dependence.
Reducing the potential hazards and impacts of natural climate variability is a desirable goal. Such hazards include droughts and heat waves, which lead to water shortages, or, at the other extreme, floods, which can cause considerable damage to infrastructure. To avoid such catastrophic events, it is necessary to understand, and to be able to describe, the dynamics of nature. Typically, one attempts to describe the dynamics of geophysical data series with systems of differential equations. There are, however, situations in which this approach is not expedient or technically not feasible: situations in which little is known about the system, or in which it is too complex for the model parameters to be identified. Here it makes sense to regard some influences as random and to model them by means of stochastic processes. In this work, such a description with linear stochastic processes of the FARIMA class is pursued, with particular focus on the detection of long-range dependence. Long-range dependent processes are those with an algebraically, i.e. slowly, decaying autocorrelation function. A reliable detection of these processes is relevant for trend detection and uncertainty analyses. In order to provide a reliable strategy for detecting long-range dependent processes, this work proposes a route different from the standard one. Commonly, methods are used that investigate the asymptotic behaviour, e.g. regression in the periodogram; alternatively, one attempts to find a suitable, potentially long-range dependent model, e.g. from the FARIMA class, and to test the estimated fractional differencing parameter d for compatibility with the trivial value zero.
This work proposes to reformulate the problem of detecting long-range dependence as a model selection problem, i.e. to compare the best short-range and the best long-range dependent model. This approach requires a) a suitable class of long-range and short-range dependent processes and b) a reliable model selection strategy, also for non-nested models. With the flexible FARIMA class and the Whittle approach to parameter estimation, the first requirement is met. Standard approaches to model selection, such as the likelihood-ratio test, are, however, often not discriminating enough for non-nested models. It is therefore proposed to complement this strategy with a simulation-based approach that is particularly suited to the direct discrimination of non-nested models. The approach follows a statistical test with the likelihood ratio as the test statistic. Its distribution is determined via simulations with the two models to be distinguished. For two simple models and various parameter values, the reliability of the p-value and power estimates is investigated. The result depends on the model parameters; in many cases, however, an adequate model selection could be established. An important property of this strategy is that it immediately reveals how well the models under consideration can be distinguished. Two applications, trend detection in temperature time series and the uncertainty analysis of design floods, underline the need for reliable methods for the detection of long-range dependence. In the case of trend detection, falsely concluding long-range dependence leads to an underestimation of a trend, which in turn may delay the initiation of countermeasures.
In the case of run-off time series, ignoring long-range dependence that is present leads to an underestimation of the uncertainty of design values. A reliable detection of long-range dependent processes is thus of great importance in practical time series analysis. Examples related to extreme events are not limited to flood analysis. Increased uncertainty in the estimation of extreme events is a potential problem for all autocorrelated processes; another interesting example here is the estimation of maximum wind speeds in gusts, which plays a role in the construction of wind turbines. By reformulating the detection problem as a model selection question and by providing suitable model selection strategies, this work contributes to the discussion and development of methods in the area of detecting long-range dependence.
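The simulation-based likelihood-ratio comparison described in the abstract can be sketched for a deliberately simple pair of candidates, white noise versus AR(1). This toy version is our own reading of the procedure, not the thesis code, and uses the conditional Gaussian likelihood:

```python
import numpy as np

def fit_white_noise(x):
    """ML fit of iid zero-mean Gaussian noise; returns the maximized log-likelihood."""
    s2 = np.mean(x ** 2)
    return -0.5 * len(x) * (np.log(2 * np.pi * s2) + 1)

def fit_ar1(x):
    """Conditional ML fit of a zero-mean AR(1); returns the maximized log-likelihood."""
    phi = np.dot(x[:-1], x[1:]) / np.dot(x[:-1], x[:-1])
    resid = x[1:] - phi * x[:-1]
    s2 = np.mean(resid ** 2)
    return -0.5 * len(resid) * (np.log(2 * np.pi * s2) + 1)

def lr_stat(x):
    """Likelihood-ratio statistic of AR(1) against white noise."""
    return 2 * (fit_ar1(x) - fit_white_noise(x))

rng = np.random.default_rng(1)
# "Observed" series: an AR(1) with phi = 0.5, so white noise should be rejected.
n = 400
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.5 * x[t - 1] + rng.standard_normal()

observed = lr_stat(x)
# Null distribution of the statistic under the white-noise model, by simulation.
null = [lr_stat(rng.standard_normal(n)) for _ in range(500)]
p_value = np.mean([s >= observed for s in null])
print(observed, p_value)  # a small p-value favours the AR(1) model
```

The same recipe applies to non-nested pairs, e.g. a short-range ARMA against a long-range FARIMA candidate, where the asymptotic chi-squared calibration of the classical likelihood-ratio test is unavailable.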
9

Pilipauskaité, Vytauté. "Limit theorems for spatio-temporal models with long-range dependence." Thesis, Nantes, 2017. http://www.theses.fr/2017NANT4057/document.

Full text
Abstract:
The thesis is concerned with limit theorems for stochastic models with strong dependence. In the first part, we consider AR(1) models with random coefficient. We identify three different asymptotic regimes for the joint temporal-contemporaneous aggregation scheme when the AR processes are independent and when the ARs share common innovations. We then discuss nonparametric estimation of the distribution function of the autoregressive coefficient from a panel of random-coefficient AR(1) series. We prove the weak convergence of the empirical process based on estimates of the unobservable autoregressive coefficients to a generalized Brownian bridge. This result is then applied to validate various statistical inference tools for panel AR(1) data. In the second part of the thesis, we focus on spatial models in dimension 2. We consider random fields built from Appell polynomials of linear random fields. For this nonlinear model, we study the limit of its normalized partial sums taken over rectangles and prove the existence of a scaling transition. Finally, we address the same question for the random grain model. We highlight the existence of two transition points in the limits of these models.
The thesis is devoted to limit theorems for stochastic models with long-range dependence. We first consider a random-coefficient AR(1) process, which can have long memory provided the distribution of the autoregressive coefficient concentrates near the unit root. We identify three different limit regimes in the scheme of joint temporal-contemporaneous aggregation for independent copies of the random-coefficient AR(1) process and for its copies driven by common innovations. Next, we discuss nonparametric estimation of the distribution of the autoregressive coefficient given multiple random-coefficient AR(1) series. We prove the weak convergence of the empirical process based on estimates of unobservable autoregressive coefficients to a generalized Brownian bridge and apply this result to draw statistical inference from panel AR(1) data. In the second part of the thesis we focus on spatial models in dimension 2. We define a nonlinear random field as the Appell polynomial of a linear random field with long-range dependence. For the nonlinear random field, we investigate the limit of its normalized partial sums over rectangles and prove the existence of a scaling transition. Finally, we study a similar scaling of the random grain model and obtain two change points in its limits.
10

Casas, Villalba Isabel. "Statistical inference in continuous-time models with short-range and/or long-range dependence." University of Western Australia. School of Mathematics and Statistics, 2006. http://theses.library.uwa.edu.au/adt-WU2006.0133.

Full text
Abstract:
The aim of this thesis is to estimate the volatility function of continuous-time stochastic models. The estimation of the volatility of the following well-known international stock market indexes is presented as an application: Dow Jones Industrial Average, Standard and Poor's 500, NIKKEI 225, CAC 40, DAX 30, FTSE 100 and IBEX 35. This estimation is studied from two different perspectives: a) assuming that the volatility of the stock market indexes displays short-range dependence (SRD), and b) extending the previous model to processes with long-range dependence (LRD), intermediate-range dependence (IRD) or SRD. Under the efficient market hypothesis (EMH), the compatibility of the Vasicek, the CIR, the Anh and Gao, and the CKLS models with the stock market indexes is tested. Nonparametric techniques are presented to test the affinity of these parametric volatility functions with the volatility observed from the data. Under the assumption of possible statistical patterns in the volatility process, a new estimation procedure based on Whittle estimation is proposed. This procedure is justified theoretically and empirically. In addition, its application to the stock market indexes provides interesting results.
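Whittle estimation, the basis of the proposed procedure, minimises a contrast between a parametric spectral density and the periodogram. Below is a minimal sketch of our own for the AR(1) case (the thesis treats far more general continuous-time volatility models); with the innovation variance profiled out, the contrast reduces, up to a term that is approximately zero, to a weighted periodogram sum:

```python
import numpy as np

def periodogram(x):
    """Periodogram at the positive Fourier frequencies 2*pi*j/n."""
    n = len(x)
    dft = np.fft.fft(x)
    j = np.arange(1, (n - 1) // 2 + 1)
    return 2 * np.pi * j / n, np.abs(dft[j]) ** 2 / (2 * np.pi * n)

def whittle_ar1(x):
    """Whittle estimate of the AR(1) coefficient by grid search."""
    lam, I = periodogram(x)
    phis = np.linspace(-0.95, 0.95, 381)
    # Up to the profiled-out innovation variance, the AR(1) spectral density is
    # proportional to 1/|1 - phi*exp(-i*lam)|^2. Because the sum of
    # log|1 - phi*exp(-i*lam)|^2 over the Fourier frequencies is approximately
    # zero, the profiled Whittle contrast reduces to minimising
    # sum(I * |1 - phi*exp(-i*lam)|^2).
    costs = [np.sum(I * np.abs(1 - p * np.exp(-1j * lam)) ** 2) for p in phis]
    return phis[int(np.argmin(costs))]

rng = np.random.default_rng(2)
n = 4000
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.6 * x[t - 1] + rng.standard_normal()
print(whittle_ar1(x))  # close to the true value 0.6
```

The same frequency-domain contrast extends directly to LRD spectral densities, which is what makes Whittle-type estimators attractive for long-memory models.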
11

Liu, Jian. "Fractal Network Traffic Analysis with Applications." Diss., Georgia Institute of Technology, 2006. http://hdl.handle.net/1853/11477.

Full text
Abstract:
Today, the Internet is growing exponentially, with traffic statistics that mathematically exhibit fractal characteristics: self-similarity and long-range dependence. With these properties, data traffic shows high peak-to-average bandwidth ratios and makes networks inefficient. These problems make it difficult to predict, quantify, and control data traffic. In this thesis, two analytical methods are used to study fractal network traffic: second-order self-similarity analysis and multifractal analysis. First, self-similarity is an adaptive property of traffic in networks, and many factors are involved in creating this characteristic. A new view of this self-similar traffic structure related to multi-layer network protocols is provided. This view is an improvement over the theory used in most current literature. Second, the scaling region for traffic self-similarity is divided into two timescale regimes: short-range dependence (SRD) and long-range dependence (LRD). Experimental results show that the network transmission delay separates the two scaling regions. This gives us a physical source of the periodicity in the observed traffic. Also, bandwidth, TCP window size, and packet size have impacts on SRD. The statistical heavy-tailedness (Pareto shape parameter) affects the structure of LRD. In addition, a formula to estimate traffic burstiness is derived from the self-similarity property. Furthermore, studies with multifractal analysis have shown the following results. At large timescales, increasing bandwidth does not improve throughput; the two factors affecting traffic throughput are network delay and TCP window size. On the other hand, more simultaneous connections smooth traffic, which could result in an improvement of network efficiency. At small timescales, in order to improve network efficiency, we need to control bandwidth, TCP window size, and network delay to reduce traffic burstiness.
In general, network traffic processes have a Hölder exponent α ranging between 0.7 and 1.3. Their statistics differ from Poisson processes. From the traffic analysis, a notion of the efficient bandwidth, EB, is derived. Above that bandwidth, traffic appears bursty and cannot be reduced by multiplexing; below it, traffic is congested. An important finding is that the relationship between the bandwidth and the transfer delay is nonlinear.
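The Hurst parameter that summarises the self-similarity analysed above is often estimated with the aggregated-variance method: for a self-similar process, the variance of m-block means scales like m^(2H-2). The sketch below is our own illustration, not the thesis code; white noise, with H = 0.5, serves as the reference case, while long-range dependent traffic would give H well above 0.5:

```python
import numpy as np

def hurst_aggvar(x, block_sizes=(1, 2, 4, 8, 16, 32, 64)):
    """Estimate H from the scaling Var(block mean) ~ m**(2H - 2)."""
    log_m, log_v = [], []
    for m in block_sizes:
        nblocks = len(x) // m
        means = np.mean(x[: nblocks * m].reshape(nblocks, m), axis=1)
        log_m.append(np.log(m))
        log_v.append(np.log(np.var(means)))
    slope = np.polyfit(log_m, log_v, 1)[0]
    return 1 + slope / 2

rng = np.random.default_rng(3)
# Independent increments have H = 0.5: the variance of block means drops like 1/m.
print(hurst_aggvar(rng.standard_normal(100_000)))  # close to 0.5
```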
12

Lima, Alexandre Barbosa de. "Contribuições à modelagem de teletráfego fractal." Universidade de São Paulo, 2008. http://www.teses.usp.br/teses/disponiveis/3/3142/tde-30052008-152514/.

Full text
Abstract:
Empirical studies [1],[2] demonstrated that Internet Protocol (IP) network traffic has fractal properties such as impulsiveness, self-similarity and long-range dependence over several time scales of aggregation, in the range from milliseconds to minutes. These characteristics have motivated the development of new fractal teletraffic models and of new traffic control algorithms for converged networks. This work proposes a new state-space traffic model based on a finite-dimensional approximation of the AutoRegressive Fractionally Integrated Moving Average (ARFIMA) process. Modelling by means of autoregressive (AR) processes is also investigated. The statistical analysis of simulated series and of real traffic traces shows that the application of high-order AR models in teletraffic prediction schemes is strongly impaired by the problem of identifying the model order. It is also shown that modelling of the long memory can be obtained at the cost of placing one or more poles near the unit circle. Therefore, the implementation of the fitted AR model may be unstable due to quantization effects on the digital filter coefficients. The proposed long-memory model offers the following advantages: a) possibility of practical implementation, as it does not require infinite memory; b) explicit modelling of the low-frequency region of the spectrum; and c) feasibility of using the Kalman filter. The presented case study demonstrates that it is possible to apply the proposed long-memory model to stationary stretches of fractal teletraffic signals. The results obtained show that the dynamics of the Hurst parameter of teletraffic signals can be quite slow in practice.
Accordingly, the newly proposed model is suitable for traffic prediction schemes, such as Connection Admission Control (CAC) and dynamic bandwidth allocation, given that the Hurst parameter can be estimated in real time by applying the Discrete Wavelet Transform (DWT).
Empirical studies [1],[2] demonstrated that heterogeneous IP traffic has fractal properties such as impulsiveness, self-similarity, and long-range dependence over several time scales, from milliseconds to minutes. These features have motivated the development of new traffic models and traffic control algorithms. This work presents a new state-space model for teletraffic which is based on a finite-dimensional representation of the ARFIMA random process. Modelling via AutoRegressive (AR) processes is also investigated. The statistical analysis of simulated time series and real traffic traces shows that the application of high-order AR models in teletraffic prediction schemes can be highly impaired by the model identification problem. It is also demonstrated that modelling of the long memory can be obtained at the cost of positioning one or more poles near the unit circle. Therefore, the implementation of the adjusted AR model can be unstable due to the quantization of the digital filter coefficients. The proposed long-memory model has the following advantages: a) possibility of practical implementation, inasmuch as it does not require infinite memory; b) explicit modelling of the low-frequency region of the power spectrum; and c) forecasts can be performed via the Kalman predictor. The presented case study suggests one can apply the proposed model in periods where stationarity can be safely assumed. The results indicate that the dynamics of the Hurst parameter can be very slow in practice. Hence, the newly proposed model is suitable for teletraffic prediction schemes, such as CAC and dynamic bandwidth allocation, given that the Hurst parameter can be estimated on-line via the DWT.
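The DWT-based Hurst estimation mentioned in the abstract can be sketched with a logscale diagram: for fractional Gaussian noise, the variance of level-j Haar detail coefficients scales like 2^(j(2H-1)), so a regression of log2 energies on the level recovers H. This is a minimal illustration of our own, not the thesis implementation:

```python
import numpy as np

def haar_details(x, levels):
    """Detail coefficients of an orthonormal Haar DWT, level by level."""
    a, out = np.asarray(x, dtype=float), []
    for _ in range(levels):
        a = a[: len(a) - len(a) % 2]
        d = (a[0::2] - a[1::2]) / np.sqrt(2.0)  # detail at this level
        a = (a[0::2] + a[1::2]) / np.sqrt(2.0)  # approximation carried down
        out.append(d)
    return out

def hurst_wavelet(x, levels=6):
    """Estimate H from the scaling log2(mean d_j^2) ~ j*(2H - 1) + const."""
    energies = [np.mean(d ** 2) for d in haar_details(x, levels)]
    j = np.arange(1, levels + 1)
    slope = np.polyfit(j, np.log2(energies), 1)[0]
    return (slope + 1) / 2

rng = np.random.default_rng(4)
print(hurst_wavelet(rng.standard_normal(2 ** 16)))  # close to 0.5 for iid noise
```

Because each level only requires pairwise sums and differences of the previous approximation, this estimate can be updated cheaply as new samples arrive, which is what makes on-line estimation plausible.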
13

Edwards, Samuel Zachary. "Forecasting Highly-Aggregate Internet Time Series Using Wavelet Techniques." Thesis, Virginia Tech, 2006. http://hdl.handle.net/10919/33223.

Full text
Abstract:
The U.S. Coast Guard maintains a network structure to connect its nation-wide assets. This paper analyzes and models four highly aggregate traces of the traffic to/from the Coast Guard Data Network ship-shore nodes, so that the models may be used to predict future system demand. These internet traces (polled at 5′–40′ intervals) are shown to adhere to a Gaussian distribution upon detrending, which imposes limits on the exponential distribution of higher time-resolution traces. Wavelet estimation of the Hurst parameter is shown to outperform estimation by another common method (sample variances). The first-differences method of detrending proved problematic to this analysis and is shown to decorrelate AR(1) processes where 0.65 < φ₁ < 1.35 and to correlate AR(1) processes with φ₁ < −0.25. The Hannan-Rissanen method for estimating (φ, θ) is employed to analyze this series, and a one-step-ahead forecast is generated.
Master of Science
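The reported (de)correlation effect of first differencing can be checked in closed form: for a stationary AR(1) with coefficient φ₁, the differenced series has lag-1 autocorrelation −(1 − φ₁)/2, which is small in magnitude for φ₁ near 1 and large for negative φ₁, consistent with the ranges quoted in the abstract. A quick numerical check of our own:

```python
import numpy as np

def lag1_acf(x):
    """Sample lag-1 autocorrelation."""
    x = x - np.mean(x)
    return np.dot(x[:-1], x[1:]) / np.dot(x, x)

def simulate_ar1(phi, n, rng):
    """Simulate a zero-mean AR(1) with unit-variance Gaussian innovations."""
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.standard_normal()
    return x

rng = np.random.default_rng(5)
for phi in (0.65, 0.9, -0.25):
    d = np.diff(simulate_ar1(phi, 100_000, rng))
    # Theory: the differenced AR(1) has lag-1 autocorrelation -(1 - phi)/2,
    # since Cov(dx_t, dx_{t-1}) = -gamma_0*(1 - phi)^2 and Var(dx) = 2*gamma_0*(1 - phi).
    print(phi, lag1_acf(d), -(1 - phi) / 2)
```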
14

Gerstenberger, Carina [Verfasser], Herold [Gutachter] Dehling, and Liudas [Gutachter] Giraitis. "Robust tests for discriminating between long-range dependence and short-range dependence with a change in mean / Carina Gerstenberger ; Gutachter: Herold Dehling, Liudas Giraitis ; Fakultät für Mathematik." Bochum : Ruhr-Universität Bochum, 2018. http://d-nb.info/1161942378/34.

Full text
15

Aparicio, Acosta Felipe Miguel. "Nonlinear modelling and analysis under long-range dependence with an application to positive time series /." [S.l.] : [s.n.], 1995. http://library.epfl.ch/theses/?nr=1381.

Full text
16

Buchsteiner, Jannis [Verfasser], Herold [Gutachter] Dehling, and Holger [Gutachter] Dette. "On the empirical process under long-range dependence / Jannis Buchsteiner ; Gutachter: Herold Dehling, Holger Dette." Bochum : Ruhr-Universität Bochum, 2016. http://d-nb.info/1114497371/34.

Full text
17

Valdivieso, Serrano Luis Hilmar. "Fractionally integrated processes of Ornstein-Uhlenbeck type." Pontificia Universidad Católica del Perú, 2014. http://repositorio.pucp.edu.pe/index/handle/123456789/97091.

Full text
Abstract:
An estimation methodology to deal with fractionally integrated processes of Ornstein-Uhlenbeck type is proposed. The methodology is based on the continuous Whittle contrast. A simulation study is performed by driving this process with a symmetric CGMY background Lévy process.
18

Jackson, Brian Scott Carney Laurel H. "Consequences of long-range temporal dependence in neural spiking activity for theories of processing and coding." Related Electronic Resource: Current Research at SU : database of SU dissertations, recent titles available full text, 2003. http://wwwlib.umi.com/cr/syr/main.

Full text
19

Tarr, Garth. "Quantile Based Estimation of Scale and Dependence." Thesis, The University of Sydney, 2014. http://hdl.handle.net/2123/10590.

Full text
Abstract:
The sample quantile has a long history in statistics. The aim of this thesis is to explore some further applications of quantiles as simple, convenient and robust alternatives to classical procedures. Chapter 1 addresses the need for reliable confidence intervals for quantile regression coefficients particularly in small samples. We demonstrate the competitive performance of the xy-pair quantile bootstrap approach in a broad range of model designs with a focus on small and moderate sample sizes. Chapter 2 forms the core of this thesis with its investigation into robust estimation of scale. Common robust estimators of scale such as the interquartile range and the median absolute deviation from the median are inefficient when the observations come from a Gaussian distribution. We present a new robust scale estimator, Pn, which is proportional to the interquartile range of the pairwise means. When the underlying distribution is Gaussian, Pn trades some robustness for high Gaussian efficiency. Chapter 3 extends our robust scale estimator to the bivariate setting. We show that the resulting covariance estimator inherits the robustness and efficiency properties of the underlying scale estimator. We also consider the problem of estimating scale and autocovariance in dependent processes. We establish the asymptotic normality of Pn under short and mildly long range dependent Gaussian processes. In the case of extreme long range dependence, we prove a non-normal limit result for the interquartile range. Simulation suggests that an equivalent result holds for Pn. Chapter 4 looks at the problem of estimating covariance and precision matrices under cellwise contamination. A pairwise approach is shown to perform well under much higher levels of contamination than standard robust techniques would allow.
Our approach works well with high levels of scattered contamination and has the advantage of being able to impose sparsity on the resulting precision matrix.
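The pairwise-mean construction of Pn described in this abstract can be sketched in a few lines. The code below is an illustrative sketch, not Tarr's implementation: the Gaussian consistency constant c ≈ 1.048 is an assumption, derived from the fact that the mean of two independent N(0, σ²) variables is N(0, σ²/2), whose interquartile range is about 0.954 σ.

```python
import numpy as np

def pn_scale(x, c=1.048):
    """Sketch of a Pn-style scale estimate: the interquartile range of
    the pairwise means, rescaled for consistency at the Gaussian.
    The constant c is an assumption based on the N(0, sigma^2/2)
    distribution of a pairwise mean of Gaussian observations."""
    x = np.asarray(x, dtype=float)
    i, j = np.triu_indices(len(x), k=1)      # all index pairs i < j
    pair_means = 0.5 * (x[i] + x[j])
    q75, q25 = np.percentile(pair_means, [75, 25])
    return c * (q75 - q25)

rng = np.random.default_rng(0)
sigma_hat = pn_scale(rng.normal(0.0, 2.0, size=2000))  # should be near 2
```

Because the estimate depends only on the central quartiles of the pairwise means, a moderate fraction of outlying observations moves it far less than the sample standard deviation.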
APA, Harvard, Vancouver, ISO, and other styles
20

Gandikota, Vijai. "Modeling operating system crash behavior through multifractal analysis, long range dependence and mining of memory usage patterns." Morgantown, W. Va. : [West Virginia University Libraries], 2006. https://eidr.wvu.edu/etd/documentdata.eTD?documentid=4566.

Full text
Abstract:
Thesis (M.S.)--West Virginia University, 2006.
Title from document title page. Document formatted into pages; contains xii, 102 p. : ill. (some col.). Vita. Includes abstract. Includes bibliographical references (p. 96-99).
APA, Harvard, Vancouver, ISO, and other styles
21

Nüßgen, Ines [Verfasser], and Alexander [Gutachter] Schnurr. "Ordinal pattern analysis: limit theorems for multivariate long-range dependent Gaussian time series and a comparison to multivariate dependence measures / Ines Nüßgen ; Gutachter: Alexander Schnurr." Siegen : Universitätsbibliothek der Universität Siegen, 2021. http://nbn-resolving.de/urn:nbn:de:hbz:467-19650.

Full text
APA, Harvard, Vancouver, ISO, and other styles
22

Stancescu, Daniel O. "Bootstrap Methods for the Estimation of the Variance of Partial Sums." University of Cincinnati / OhioLINK, 2001. http://rave.ohiolink.edu/etdc/view?acc_num=ucin998055058.

Full text
APA, Harvard, Vancouver, ISO, and other styles
23

Wishart, Justin Rory. "Nonparametric estimation of change-points in derivatives." Thesis, The University of Sydney, 2011. http://hdl.handle.net/2123/8754.

Full text
Abstract:
In this thesis, the main concern is to analyse change-points in a non-parametric regression model. More specifically, the analysis is focussed on the estimation of the location of jumps in the first derivative of the regression function. These change-points will be referred to as kinks. The estimation method is closely based on the zero-crossing technique (ZCT) introduced by Goldenshluger, Tsybakov and Zeevi (2006). The work of Goldenshluger et al. (2006) was aimed at estimating jumps in the regression function in the indirect non-parametric regression model and shown to be optimal in the minimax sense. Their analysis was applied in practice by Cheng and Raimondo (2008) whereby a class of kernel functions is constructed to use ZCT with a kernel smoothing implementation. Moreover, Cheng and Raimondo (2008) adapted the technique to estimating kinks from a fixed design model with i.i.d. errors. The thesis extends the aforementioned kink estimation technique in two ways. The first extension is to include a long-range dependent (LRD) error structure in the fixed design scenario. The rate of convergence of the resultant LRD method is shown to be reliant on the level of dependence and the smoothness of the underlying regression function. This rate of convergence is shown to be optimal in the sense of the minimax rate. The second extension is to include a regression model with random design and LRD structures. The random design regression models considered include an i.i.d. random design with LRD errors and a separate model with a LRD design with i.i.d. errors. For the case of LRD design variables, the rate of convergence for the estimator is again reliant on the level of dependence and the smoothness of the regression function. However, interestingly for the case of i.i.d. 
design and LRD errors, the rate of convergence is shown not to rely on the level of dependence but only on the smoothness of the regression function, and in fact agrees with the minimax rate for fixed design with i.i.d. errors. To conclude, it is summarised where original work occurs in this thesis. Firstly, the extension of the ZCT to the fixed design framework with LRD noise arose from discussions with my initial Ph.D. supervisor, Dr Marc Raimondo, before his passing. The method is based on the technique proposed by Cheng and Raimondo (2008), but the mathematical analysis and development of the extension to the LRD framework and its minimax optimality are my own work. For the second extension, which covers the random design regression framework, the main idea and premise arose through discussions with Assistant Professor Rafal Kulik. I wish it to be known that although the published versions of the work are in joint names with Assistant Professor Kulik, the great bulk of the mathematical analysis and development presented in this thesis is my own. Finally, the contribution of my current supervisor, Professor N. C. Weber, has been to provide direction in terms of checking the accuracy, clarity and style of the work.
APA, Harvard, Vancouver, ISO, and other styles
24

Tewes, Johannes [Verfasser], Herold [Gutachter] Dehling, and Holger [Gutachter] Dette. "Change-point tests and the bootstrap under long- and short-range dependence / Johannes Tewes ; Gutachter: Herold Dehling, Holger Dette ; Fakultät für Mathematik." Bochum : Ruhr-Universität Bochum, 2018. http://d-nb.info/1155588185/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
25

Zhu, Beijia. "Analysis of non-stationary (seasonal/cyclical) long memory processes." Thesis, Paris 1, 2013. http://www.theses.fr/2013PA010013/document.

Full text
Abstract:
Long memory, also called long-range dependence (LRD), is commonly detected in the analysis of real-life time series data in many areas, for example in finance, econometrics and hydrology, so the study of long-memory time series is of great value. The introduction of the ARFIMA (fractionally autoregressive integrated moving average) process established a relationship between fractional integration and long memory, and the model has proved powerful for long-term forecasting, making it one of the most popular long-memory models in the statistical literature. Specifically, an ARFIMA(p, d, q) process {Xt} is defined by Φ(B)(I − B)^d (Xt − µ) = Θ(B)εt, t ∈ Z, where Φ(z) = 1 − ϕ1 z − · · · − ϕp z^p and Θ(z) = 1 + θ1 z + · · · + θq z^q are polynomials of order p and q, respectively, with roots outside the unit circle, and εt is Gaussian white noise with constant variance σ²ε. When d ∈ (−1/2, 1/2), {Xt} is stationary and invertible. However, an a priori assumption of stationarity for real-life data is not reasonable, so many authors have proposed estimators applicable to the non-stationary case. Two questions then arise: which estimator should be chosen in applications, and what one should pay attention to when using these estimators. We therefore carry out a comprehensive finite-sample Monte Carlo comparison of semi-parametric estimators, including Fourier and wavelet estimators, in the non-stationary ARFIMA setting.
This comparison study shows that (i) without proper trimming of scales, the wavelet estimators are heavily biased and generally perform worse than the Fourier ones; (ii) all the estimators under investigation are robust to the presence of a linear time trend in the level of {Xt} and of GARCH effects in its variance; (iii) when the transition probability is low, the consistency of the estimators still holds under regime switches in the level of {Xt}, although the switches contaminate the estimation results, and the log-regression wavelet estimator performs badly in this situation at small and medium sample sizes; and (iv) in general, the fully-extended local polynomial Whittle Fourier (fextLPWF) estimator is preferred for practical use, although it requires a wider bandwidth (i.e. a larger number of frequencies used in the estimation) than the other Fourier estimators considered in this work.
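As one concrete example of the semi-parametric Fourier estimators compared in this study, below is a minimal sketch of a local Whittle estimate of the memory parameter d (Robinson's Gaussian semiparametric estimator), which minimizes R(d) = log((1/m) Σ_j λ_j^{2d} I(λ_j)) − 2d (1/m) Σ_j log λ_j over the first m Fourier frequencies. The bandwidth m = n^0.65 and the grid search are illustrative assumptions, not choices from the thesis.

```python
import numpy as np

def local_whittle_d(x, m=None):
    """Sketch of the local Whittle estimate of the memory parameter d,
    minimizing Robinson's objective R(d) over a grid of candidates.
    The bandwidth m = n**0.65 is an illustrative assumption."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    if m is None:
        m = int(n ** 0.65)
    lam = 2.0 * np.pi * np.arange(1, m + 1) / n            # Fourier frequencies
    I = np.abs(np.fft.fft(x - x.mean())[1:m + 1]) ** 2 / (2.0 * np.pi * n)

    def objective(d):
        return np.log(np.mean(lam ** (2.0 * d) * I)) - 2.0 * d * np.mean(np.log(lam))

    grid = np.linspace(-0.49, 0.99, 297)
    return grid[np.argmin([objective(d) for d in grid])]

rng = np.random.default_rng(1)
d_hat = local_whittle_d(rng.normal(size=4096))   # white noise has d = 0
```

The objective depends on the periodogram only near the origin, which is why the estimator is semi-parametric: no short-memory model for the rest of the spectrum is required.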
APA, Harvard, Vancouver, ISO, and other styles
26

Düker, Marie-Christine [Verfasser], Herold [Gutachter] Dehling, Vladas [Gutachter] Pipiras, and Jeannette [Gutachter] Woerner. "High-dimensional time series under long-range dependence and nonstationarity / Marie-Christine Düker ; Gutachter: Herold Dehling, Vladas Pipiras, Jeannette Woerner ; Fakultät für Mathematik." Bochum : Ruhr-Universität Bochum, 2020. http://d-nb.info/1217858253/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
27

Lund, Isabelle Reis. "Contribuições à geração de tráfego fractal por meio da transformada wavelet." Universidade de São Paulo, 2008. http://www.teses.usp.br/teses/disponiveis/3/3142/tde-29112016-082547/.

Full text
Abstract:
Studies have shown that traffic in both local area networks (LANs) and wide area networks (WANs) has fractal properties such as long-range dependence (LRD) and self-similarity. Because of the heterogeneity of applications in these networks, traffic traces may exhibit long-range dependence, short-range dependence (SRD), or a mixture of the two. The aim of this work is therefore to synthesize Gaussian time series with flexible processing in the time-frequency plane, to be fed into a traffic generator with the statistical characteristics observed in real packet-switched networks, namely self-similarity, LRD and SRD. To this end, two methods were developed for synthesizing Gaussian time series with LRD and simultaneously introduced SRD in different frequency bands: the Discrete Wavelet Transform (DWT) with a variance map, and the Discrete Wavelet Packet Transform (DWPT). Both methods use the variance map, a concept developed in this work. The methods were validated through statistical analysis and comparison with series generated by the DWT method of Bäckar used in [1]. The results also support the idea that the DWPT is more attractive than the DWT because it provides greater processing flexibility in the time-frequency plane.
APA, Harvard, Vancouver, ISO, and other styles
28

Inkaya, Alper. "Option Pricing With Fractional Brownian Motion." Master's thesis, METU, 2011. http://etd.lib.metu.edu.tr/upload/12613736/index.pdf.

Full text
Abstract:
Traditional financial modeling is based on semimartingale processes with stationary and independent increments. However, empirical investigations of financial data do not always support these assumptions. This contradiction showed that there is a need for new stochastic models. Fractional Brownian motion (fBm) was proposed as one such model by Benoit Mandelbrot. FBm is the only continuous Gaussian process with dependent increments; the correlation between increments of an fBm changes according to its self-similarity parameter H. This property of fBm helps to capture the correlation dynamics of the data and consequently obtain better forecast results. But for values of H different from 1/2, fBm is not a semimartingale and the classical Ito formula does not apply. This gives rise to the need for white noise theory to construct integrals with respect to fBm and obtain fractional Ito formulas. In this thesis, the representation of fBm and its fundamental properties are examined. The construction of Wick-Ito-Skorohod (WIS) and fractional WIS integrals is investigated. An Ito-type formula and Girsanov-type theorems are stated. The financial applications of fBm are mentioned and the Black-Scholes price of a European call option on an asset assumed to follow a geometric fBm is derived. The statistical aspects of fBm are investigated: estimators for the self-similarity parameter H and simulation methods for fBm are summarized. Using the R/S methodology of Hurst, estimates of the parameter H are obtained, and these values are used to evaluate the fractional Black-Scholes prices of a European call option with different maturities. Afterwards, these values are compared to the Black-Scholes price of the same option to demonstrate the effect of long-range dependence on option prices. Estimates of H at different time scales are also obtained to investigate multiscaling in financial data. An outlook on future work is given.
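The R/S methodology of Hurst used in this thesis can be sketched as follows. This is a generic illustration; the block sizes and the unweighted log-log regression are assumptions, not the estimator settings used in the thesis.

```python
import numpy as np

def rs_hurst(x, block_sizes=(16, 32, 64, 128, 256)):
    """Sketch of a rescaled-range (R/S) estimate of the Hurst parameter:
    average the rescaled range R/S over non-overlapping blocks of each
    size, then regress log(R/S) on log(block size); the slope estimates H."""
    x = np.asarray(x, dtype=float)
    log_n, log_rs = [], []
    for n in block_sizes:
        rs_values = []
        for start in range(0, len(x) - n + 1, n):
            block = x[start:start + n]
            dev = np.cumsum(block - block.mean())   # cumulative deviations
            r = dev.max() - dev.min()               # range of deviations
            s = block.std()                         # block standard deviation
            if s > 0:
                rs_values.append(r / s)
        log_n.append(np.log(n))
        log_rs.append(np.log(np.mean(rs_values)))
    slope, _ = np.polyfit(log_n, log_rs, 1)
    return slope

rng = np.random.default_rng(2)
h_hat = rs_hurst(rng.normal(size=8192))   # i.i.d. noise: H should be near 1/2
```

Note that the plain R/S slope is known to be biased upward in small samples, which is one reason estimates at different time scales, as in the thesis, are informative.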
APA, Harvard, Vancouver, ISO, and other styles
29

Nguyen, Cu Ngoc. "Stochastic differential equations with long-memory input." Thesis, Queensland University of Technology, 2001.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
30

Bružaitė, Kristina. "Some linear models of time series with nonstationary long memory." Doctoral thesis, Lithuanian Academic Libraries Network (LABT), 2009. http://vddb.library.lt/obj/LT-eLABa-0001:E.02~2009~D_20090312_091000-74094.

Full text
Abstract:
The thesis studies the limit distribution of partial sums of certain linear time series models with nonstationary long memory, together with certain statistics which involve partial sums processes. Philippe, Surgailis and Viano (2006, 2008) introduced time-varying fractionally integrated filters and studied the limit distribution of their partial sums processes in the finite-variance setting. The thesis studies the limit distribution of partial sums processes of infinite-variance time-varying fractionally integrated filters: assuming that the innovations belong to the domain of attraction of an α-stable law (1 < α < 2), it is shown that the partial sums process converges to an α-stable self-similar process with nonstationary increments. The thesis also studies the limit of the Increment Ratio (IR) statistic for Gaussian observations superimposed on a slowly varying deterministic trend. The IR statistic was introduced by Surgailis, Teyssière and Vaičiulis (2008), who studied its limit distribution under the assumption of stationary observations; it can be used for testing nonparametric hypotheses about d-integrated (−1/2 < d < 3/2) behaviour of a time series, which can be confused with deterministic trends and change-points. The statistic is written in terms of the partial sums process and its limit is closely related to the limit of partial sums; in particular, the consistency of the IR statistic relies on the asymptotic independence of distant partial sums. The thesis generalizes these results: a central limit theorem for the IR statistic is proved and bias estimates are obtained when the observations follow a linear time series model with a trend, and the class of time series for which the IR statistic is consistent (i.e. converges to its mean) is extended.
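For orientation, a common form of an increment-ratio statistic averages |d_k + d_{k+1}| / (|d_k| + |d_{k+1}|) over consecutive increments d_k = y_{k+1} − y_k. The sketch below uses this simple variant; the exact definition and normalization in Surgailis, Teyssière and Vaičiulis (2008) may differ, so treat it as a hypothetical illustration.

```python
import numpy as np

def increment_ratio(y):
    """Sketch of an increment-ratio (IR) type statistic: the average of
    |d_k + d_{k+1}| / (|d_k| + |d_{k+1}|) over consecutive increments
    d_k = y_{k+1} - y_k, with the convention 0/0 = 1."""
    d = np.diff(np.asarray(y, dtype=float))
    num = np.abs(d[:-1] + d[1:])
    den = np.abs(d[:-1]) + np.abs(d[1:])
    # avoid dividing by zero; when both increments vanish use ratio 1
    ratios = np.where(den > 0, num / np.where(den > 0, den, 1.0), 1.0)
    return ratios.mean()

# for a random walk the increments are i.i.d. Gaussian; the expected
# ratio then works out to 1/2 + ln(2)/pi, roughly 0.72
rng = np.random.default_rng(3)
ir = increment_ratio(np.cumsum(rng.normal(size=5000)))
```

The statistic is sensitive to how often consecutive increments agree in sign, which is what ties its limit to the memory parameter d of the observations.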
APA, Harvard, Vancouver, ISO, and other styles
31

Augusto, Marcelo Lipas. "Contribuição para a análise de teletráfego com dependência de longa duração." Universidade de São Paulo, 2009. http://www.teses.usp.br/teses/disponiveis/3/3142/tde-01072009-143641/.

Full text
Abstract:
The use of network traffic models that capture self-similarity and long-range dependence has proved to be a key element in the correct characterization of Local Area Network (LAN) and Wide Area Network (WAN) traffic [1, 2]. Such characterization is necessary for monitoring and controlling traffic in converged networks [3]. In this context, accurate estimation of the self-similarity parameter, named the Hurst parameter, is a major issue. However, studies show that, besides long-range dependence, WAN traffic may, not uncommonly, present mixed long- and short-range dependence characteristics [4, 5]. While a large part of the scientific literature, both theoretical and practical, has focused on the accuracy of Hurst parameter estimators [6, 7, 8, 9], little attention has been given to estimating this parameter in the presence of short-range dependence. This research work focused on Hurst parameter estimation methods based on the wavelet spectrum, in particular the Abry-Veitch method [10], which is based on the Discrete Wavelet Transform (DWT), and the spectrum obtained through the Discrete Wavelet Packet Transform (DWPT). The results based on the Abry-Veitch method show that, with a suitable adjustment of the estimation parameters, the method yields a robust estimate in the presence of short-range dependent components, even under regime changes of such a component, a desirable characteristic for real-time estimation of the Hurst parameter. However, the considerable dispersion occasionally presented by the Abry-Veitch estimates motivated the study of the wavelet spectrum obtained via the DWPT transform for estimating the Hurst parameter. The results indicate that this transform generates a wavelet spectrum from which it is possible to detect whether short-range dependent components are present in the analysed series. Finally, the research results are summarized and used in a proposal for a real-time Hurst parameter estimation mechanism in the simultaneous presence of long- and short-range dependent components.
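The wavelet-spectrum estimation discussed in this abstract rests on the scaling of detail-coefficient energy across octaves. Below is a heavily simplified sketch in the spirit of the Abry-Veitch method: it uses the Haar filter and an unweighted least-squares fit, whereas the actual method uses general Daubechies wavelets and a weighted regression, so all settings here are illustrative assumptions.

```python
import numpy as np

def haar_abry_veitch(x, min_level=2, max_level=7):
    """Sketch of an Abry-Veitch style wavelet estimate of the Hurst
    parameter: regress log2 of the mean squared Haar detail coefficients
    on the octave j; for LRD data the slope is approximately 2H - 1."""
    x = np.asarray(x, dtype=float)
    levels, energies, approx = [], [], x
    for j in range(1, max_level + 1):
        approx = approx[:2 * (len(approx) // 2)]               # even length
        detail = (approx[0::2] - approx[1::2]) / np.sqrt(2.0)  # Haar detail
        approx = (approx[0::2] + approx[1::2]) / np.sqrt(2.0)  # Haar approx
        if j >= min_level:
            levels.append(j)
            energies.append(np.log2(np.mean(detail ** 2)))
    slope, _ = np.polyfit(levels, energies, 1)
    return (slope + 1.0) / 2.0                                 # H = (slope + 1) / 2

rng = np.random.default_rng(4)
h_hat = haar_abry_veitch(rng.normal(size=2 ** 14))   # white noise: H near 1/2
```

Because short-range dependence distorts only the fine scales, dropping the first octaves (min_level above) is what gives the method its robustness to SRD components.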
APA, Harvard, Vancouver, ISO, and other styles
32

Kaklauskas, Liudvikas. "Study and application of methods of fractal processes monitoring in computer networks." Doctoral thesis, Lithuanian Academic Libraries Network (LABT), 2012. http://vddb.laba.lt/obj/LT-eLABa-0001:E.02~2012~D_20120809_105043-75157.

Full text
Abstract:
The field of the dissertation research is the features of computer network packet traffic, the impact of network node features on traffic service, methods for real-time analysis of network traffic features, and their application to the dynamic prognostication of packet traffic variation. The object of the research is the features of computer network packet traffic, the impact of network node features on packet traffic service, real-time methods of network traffic feature analysis, and their application to the dynamic prognostication of traffic variation. The aim of the work is to investigate fractal processes in computer networks, to select, on the basis of the results obtained, methods suitable for real-time analysis of network traffic, to develop a method for real-time measurement of self-similarity, and to apply it to improving the quality of service of computer networks. Possibilities for the mathematical modelling of network components, computer network packet traffic models, and models using the instruments of service theory have been analysed. A package for the analysis of network traffic features was developed and used to analyse, assess and compare methods for studying the fractality and self-similarity of computer networks. Self-similarity analysis methods based on time-series analysis of packet traffic, frequency/wavelet feature estimation, estimators of time-series stability parameters, and the tools of chaos theory were investigated for assessing the self-similarity of network traffic. A package for real-time analysis of network traffic self-similarity was constructed, with which a robust... [to full text]
APA, Harvard, Vancouver, ISO, and other styles
33

Bružaitė, Kristina. "Kai kurie tiesiniai laiko eilučių modeliai su nestacionaria ilgąja atmintimi." Doctoral thesis, Lithuanian Academic Libraries Network (LABT), 2009. http://vddb.library.lt/obj/LT-eLABa-0001:E.02~2009~D_20090312_090927-47730.

Full text
Abstract:
The thesis studies the limit distribution of partial sums of certain linear time series models with nonstationary long memory, together with certain statistics which involve partial sums processes. Philippe, Surgailis and Viano (2006, 2008) introduced time-varying fractionally integrated filters and studied the limit distribution of their partial sums processes in the finite-variance setting. The thesis studies the limit distribution of partial sums processes of infinite-variance time-varying fractionally integrated filters: assuming that the innovations belong to the domain of attraction of an α-stable law (1 < α < 2), it is shown that the partial sums process converges to an α-stable self-similar process with nonstationary increments. The thesis also studies the limit of the Increment Ratio (IR) statistic for Gaussian observations superimposed on a slowly varying deterministic trend. The IR statistic was introduced by Surgailis, Teyssière and Vaičiulis (2008), who studied its limit distribution under the assumption of stationary observations; it can be used for testing nonparametric hypotheses about d-integrated (−1/2 < d < 3/2) behaviour of a time series, which can be confused with deterministic trends and change-points. The thesis generalizes these results: a central limit theorem for the IR statistic is proved and bias estimates are obtained when the observations follow a linear time series model with a trend, and the class of time series for which the IR statistic is consistent (i.e. converges to its mean) is extended.
APA, Harvard, Vancouver, ISO, and other styles
34

Snguanyat, Ongorn. "Stochastic modelling of financial time series with memory and multifractal scaling." Thesis, Queensland University of Technology, 2009. https://eprints.qut.edu.au/30240/1/Ongorn_Snguanyat_Thesis.pdf.

Full text
Abstract:
Financial processes may possess long memory and their probability densities may display heavy tails. Many models have been developed to deal with this tail behaviour, which reflects the jumps in the sample paths. On the other hand, the presence of long memory, which contradicts the efficient market hypothesis, is still a subject of debate. These difficulties present challenges for memory detection and for modelling the co-presence of long memory and heavy tails. This PhD project aims to respond to these challenges. The first part aims to detect memory in a large number of financial time series on stock prices and exchange rates using their scaling properties. Since financial time series often exhibit stochastic trends, a common form of nonstationarity, strong trends in the data can lead to false detection of memory. We take advantage of a technique known as multifractal detrended fluctuation analysis (MF-DFA) that can systematically eliminate trends of different orders. This method is based on the identification of scaling of the q-th-order moments and is a generalisation of the standard detrended fluctuation analysis (DFA), which uses only the second moment, that is, q = 2. We also consider rescaled range (R/S) analysis and the periodogram method to detect memory in financial time series and compare their results with the MF-DFA. An interesting finding is that short memory is detected for stock prices of the American Stock Exchange (AMEX), while long memory is found in the time series of two exchange rates, namely the French franc and the Deutsche mark. Electricity price series of the five states of Australia are also found to possess long memory. For these electricity price series, heavy tails are also pronounced in their probability densities. The second part of the thesis develops models to represent short-memory and long-memory financial processes as detected in Part I.
These models take the form of continuous-time AR(∞)-type equations whose kernel is the Laplace transform of a finite Borel measure. By imposing appropriate conditions on this measure, short memory or long memory in the dynamics of the solution will result. A specific form of the models, which has a good MA(∞)-type representation, is presented for the short-memory case. Parameter estimation for this type of model is performed via least squares, and the models are applied to the stock prices in the AMEX, which were established in Part I to possess short memory. By selecting the kernel in the continuous-time AR(∞)-type equations to have the form of a Riemann-Liouville fractional derivative, we obtain a fractional stochastic differential equation driven by Brownian motion. This type of equation is used to represent financial processes with long memory, whose dynamics are described by the fractional derivative in the equation. These models are estimated via quasi-likelihood, namely via a continuous-time version of the Gauss-Whittle method. The models are applied to the exchange rates and the electricity prices of Part I with the aim of confirming their possible long-range dependence established by MF-DFA. The third part of the thesis provides an application of the results established in Parts I and II to characterise and classify financial markets. We pay attention to the New York Stock Exchange (NYSE), the American Stock Exchange (AMEX), the NASDAQ Stock Exchange (NASDAQ) and the Toronto Stock Exchange (TSX). The parameters from MF-DFA and those of the short-memory AR(∞)-type models are employed in this classification. We propose the Fisher discriminant algorithm to find a classifier in the two- and three-dimensional spaces of data sets and then provide cross-validation to verify discriminant accuracies. This classification is useful for understanding and predicting the behaviour of different processes within the same market.
The fourth part of the thesis investigates the heavy-tailed behaviour of financial processes which may also possess long memory. We consider fractional stochastic differential equations driven by stable noise to model financial processes such as electricity prices. The long memory of electricity prices is represented by a fractional derivative, while the stable noise input models their non-Gaussianity via the tails of their probability density. A method using the empirical densities and MF-DFA is provided to estimate all the parameters of the model and simulate sample paths of the equation. The method is then applied to analyse daily spot prices for five states of Australia. Comparisons with the results obtained from R/S analysis, the periodogram method and MF-DFA are provided. The results from fractional SDEs agree with those from MF-DFA, which are based on multifractal scaling, while those from the periodograms, which are based on second-order properties, seem to underestimate the long-memory dynamics of the process. This highlights the need for and usefulness of fractal methods in modelling non-Gaussian financial processes with long memory.
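The MF-DFA procedure outlined in the abstract (build the profile, split it into segments, remove a polynomial trend in each, then form a q-th-order fluctuation function) can be sketched in a few lines. This is a generic illustration, not the author's code; the scale grid and detrending order are arbitrary choices:

```python
import numpy as np

def mfdfa_fluctuation(x, scales, q=2, order=1):
    """q-th-order detrended fluctuation F_q(s); q = 2 recovers standard DFA."""
    profile = np.cumsum(x - np.mean(x))          # integrated "profile" of the series
    fq = []
    for s in scales:
        n_seg = len(profile) // s
        sq_res = []
        for v in range(n_seg):
            seg = profile[v * s:(v + 1) * s]
            t = np.arange(s)
            coeffs = np.polyfit(t, seg, order)   # local polynomial trend of given order
            sq_res.append(np.mean((seg - np.polyval(coeffs, t)) ** 2))
        sq_res = np.asarray(sq_res)
        fq.append(np.mean(sq_res ** (q / 2)) ** (1 / q))
    return np.asarray(fq)
```

The generalised Hurst exponent h(q) is then the slope of log F_q(s) against log s; for uncorrelated noise h(2) is close to 0.5, while long memory gives h(2) > 0.5.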
APA, Harvard, Vancouver, ISO, and other styles
35

Snguanyat, Ongorn. "Stochastic modelling of financial time series with memory and multifractal scaling." Queensland University of Technology, 2009. http://eprints.qut.edu.au/30240/.

Full text
Abstract:
Financial processes may possess long memory and their probability densities may display heavy tails. Many models have been developed to deal with this tail behaviour, which reflects the jumps in the sample paths. On the other hand, the presence of long memory, which contradicts the efficient market hypothesis, is still a subject of debate. These difficulties present challenges for memory detection and for modelling the co-presence of long memory and heavy tails. This PhD project aims to respond to these challenges. The first part aims to detect memory in a large number of financial time series on stock prices and exchange rates using their scaling properties. Since financial time series often exhibit stochastic trends, a common form of nonstationarity, strong trends in the data can lead to false detection of memory. We take advantage of a technique known as multifractal detrended fluctuation analysis (MF-DFA) that can systematically eliminate trends of different orders. This method is based on the identification of scaling of the q-th-order moments and is a generalisation of the standard detrended fluctuation analysis (DFA), which uses only the second moment, that is, q = 2. We also consider rescaled range (R/S) analysis and the periodogram method to detect memory in financial time series and compare their results with the MF-DFA. An interesting finding is that short memory is detected for stock prices of the American Stock Exchange (AMEX), while long memory is found in the time series of two exchange rates, namely the French franc and the Deutsche mark. Electricity price series of the five states of Australia are also found to possess long memory. For these electricity price series, heavy tails are also pronounced in their probability densities. The second part of the thesis develops models to represent short-memory and long-memory financial processes as detected in Part I.
These models take the form of continuous-time AR(∞)-type equations whose kernel is the Laplace transform of a finite Borel measure. By imposing appropriate conditions on this measure, short memory or long memory in the dynamics of the solution will result. A specific form of the models, which has a good MA(∞)-type representation, is presented for the short-memory case. Parameter estimation for this type of model is performed via least squares, and the models are applied to the stock prices in the AMEX, which were established in Part I to possess short memory. By selecting the kernel in the continuous-time AR(∞)-type equations to have the form of a Riemann-Liouville fractional derivative, we obtain a fractional stochastic differential equation driven by Brownian motion. This type of equation is used to represent financial processes with long memory, whose dynamics are described by the fractional derivative in the equation. These models are estimated via quasi-likelihood, namely via a continuous-time version of the Gauss-Whittle method. The models are applied to the exchange rates and the electricity prices of Part I with the aim of confirming their possible long-range dependence established by MF-DFA. The third part of the thesis provides an application of the results established in Parts I and II to characterise and classify financial markets. We pay attention to the New York Stock Exchange (NYSE), the American Stock Exchange (AMEX), the NASDAQ Stock Exchange (NASDAQ) and the Toronto Stock Exchange (TSX). The parameters from MF-DFA and those of the short-memory AR(∞)-type models are employed in this classification. We propose the Fisher discriminant algorithm to find a classifier in the two- and three-dimensional spaces of data sets and then provide cross-validation to verify discriminant accuracies. This classification is useful for understanding and predicting the behaviour of different processes within the same market.
The fourth part of the thesis investigates the heavy-tailed behaviour of financial processes which may also possess long memory. We consider fractional stochastic differential equations driven by stable noise to model financial processes such as electricity prices. The long memory of electricity prices is represented by a fractional derivative, while the stable noise input models their non-Gaussianity via the tails of their probability density. A method using the empirical densities and MF-DFA is provided to estimate all the parameters of the model and simulate sample paths of the equation. The method is then applied to analyse daily spot prices for five states of Australia. Comparisons with the results obtained from R/S analysis, the periodogram method and MF-DFA are provided. The results from fractional SDEs agree with those from MF-DFA, which are based on multifractal scaling, while those from the periodograms, which are based on second-order properties, seem to underestimate the long-memory dynamics of the process. This highlights the need for and usefulness of fractal methods in modelling non-Gaussian financial processes with long memory.
APA, Harvard, Vancouver, ISO, and other styles
36

Reaño, Jorge Luis González. "Ajuste de tráfego intrachip obtido por simulação no nível de transação a modelos de séries autossimilares." Universidade de São Paulo, 2013. http://www.teses.usp.br/teses/disponiveis/3/3140/tde-22092014-123244/.

Full text
Abstract:
This work aims to contribute to increasing the efficiency of the integrated-system design flow, specifically in the evaluation of the communication performance between its component blocks. The use of high-level hardware modeling and simulation, at the transaction level (TLM), is proposed in order to exploit the reduction in effort and time it can offer to integrated-system design, unlike conventional approaches at lower description levels such as the register transfer level (RTL). A way of analysing the intra-chip traffic produced by the communication between system elements is proposed, with the results used for the description of traffic generators. The main contribution of this work is the proposed analysis of traffic series obtained during simulation of hardware platforms described at the TLM level, using statistical methods well known from the field of time series analysis. The analysis allows the designer to gain a better understanding of the statistical nature of the intra-chip traffic, namely short- or long-range dependence (SRD and LRD), for the subsequent fitting of models used to generate synthetic series that represent that nature. The analysis results showed that traffic obtained by TLM simulation is similar in nature to traffic obtained by simulation at a lower abstraction level, of the cycle-accurate type, indicating that TLM traffic can be used to represent intra-chip traffic. Another contribution of this work is the proposed fitting of self-similar parametric models using a decomposition of the original traffic series, with a comparison of its results against conventional fitting to models without decomposition. These contributions were grouped into a detailed methodology, presented in this document, for which experiments were carried out.
The self-similar synthetic series generated by the estimated models showed SRD and LRD indicators similar to those of the original TLM series, which favors the future use of these synthetic series in the implementation of traffic generators.
The objective of this work is to contribute to improving the efficiency of the integrated-system design flow, specifically in the evaluation of communication performance between component blocks. The use of high-level hardware modeling and simulation at the transaction level, known as TLM, is proposed in order to take advantage of the reduction in effort and time for integrated-system design, in contrast to traditional approaches, which use lower hardware description levels such as the register transfer level (RTL). A methodology to evaluate the intra-chip traffic produced by the communication between system elements is proposed. The main contribution of this work is the analysis of traffic time series obtained by simulation of hardware platforms modeled in TLM, using well-known statistical methods for time series analysis. The analysis allows the system developer to understand the statistical nature of the intra-chip traffic, also known as short- and long-range dependence (SRD and LRD), for later adjustment and accurate representation of the traffic nature in synthetic series. The analysis results have shown that traffic traces obtained by TLM simulation have a statistical nature similar to that of traffic traces obtained at a lower abstraction level, of the cycle-accurate type, which indicates that TLM traffic can be used to represent intra-chip traffic. Another contribution of this work is a fitting procedure for self-similar parametric models through the decomposition of the original traffic, and its comparison to the results of conventional fitting applied to models without decomposition. These contributions were grouped and included in the detailed methodology presented in this document, and a series of experiments was carried out.
The self-similar synthetic series obtained from the fitted models showed SRD and LRD indicators similar to those of the original TLM series, which favors the future use of these synthetic series in the implementation of traffic generators.
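A standard way to check a traffic trace for SRD versus LRD, as in analyses like the one above, is rescaled-range (R/S) estimation of the Hurst exponent (H near 0.5 indicates SRD, H clearly above 0.5 indicates LRD). A generic sketch, not tied to this thesis's methodology:

```python
import numpy as np

def rs_hurst(x, min_chunk=16):
    """Estimate the Hurst exponent as the slope of log(R/S) vs log(window size)."""
    n = len(x)
    sizes = [s for s in (2 ** k for k in range(4, 20)) if min_chunk <= s <= n // 4]
    log_s, log_rs = [], []
    for s in sizes:
        rs_vals = []
        for start in range(0, n - s + 1, s):      # non-overlapping windows
            seg = x[start:start + s]
            dev = np.cumsum(seg - seg.mean())     # cumulative deviations from the mean
            r = dev.max() - dev.min()             # range of the cumulative deviations
            sd = seg.std()
            if sd > 0:
                rs_vals.append(r / sd)            # rescaled range for this window
        log_s.append(np.log(s))
        log_rs.append(np.log(np.mean(rs_vals)))
    return np.polyfit(log_s, log_rs, 1)[0]        # slope approximates H
```

R/S estimates are known to be biased upward for short records, which is one reason such studies cross-check against other estimators.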
APA, Harvard, Vancouver, ISO, and other styles
37

McVinish, Ross Stewart. "Stochastic analysis and approximation of fractional diffusion." Thesis, Queensland University of Technology, 2002.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
38

Huang, Changcheng. "Long range dependent traffic: modeling, simulation and congestion control." Carleton University Dissertation (Engineering, Systems and Computer), Ottawa, 1996.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
39

Kotopoulos, Constantinos A. "Asymptotics of multi-buffered queueing systems with generalised processor sharing." Thesis, University of Essex, 2000. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.327066.

Full text
APA, Harvard, Vancouver, ISO, and other styles
40

Willert, Juliane [Verfasser]. "Contributions to change-point analysis under long-range dependencies / Juliane Willert." Hannover : Technische Informationsbibliothek und Universitätsbibliothek Hannover (TIB), 2012. http://d-nb.info/1024680010/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
41

Lopez, Guerrero Miguel. "On network resource allocation using alpha-stable long-range dependent traffic models." Thesis, University of Ottawa (Canada), 2004. http://hdl.handle.net/10393/29136.

Full text
Abstract:
Recent studies suggest that networks should be designed taking into account the long-range dependence and high-variability properties of the traffic they carry. It has been proven in the past that these two statistical properties can be properly represented using traffic models based on alpha-stable self-similar stochastic processes. Assuming this traffic modeling approach, in this dissertation we propose and evaluate techniques for resource allocation. We propose suitable envelope processes for the levels of bandwidth demand, which allow us to develop static resource allocation schemes. The proposal is based on a generalization, to the alpha-stable case, of the concept of probabilistic envelope processes, which had previously been defined for simpler models. It is shown that, with this approach, we can simply and effectively deal with much of the argued complexity encountered in alpha-stable models and develop techniques for proper dimensioning of network elements. From our analysis it is concluded that the presence of heavy tails in the distribution of the traffic process has a severe impact on network resource requirements. For instance, the multiplexing gain is negatively affected, which directly impacts the economies of scale expected by service providers. In order to cope with these issues, dynamic resource allocation is also considered. A dynamic prediction-based resource allocation method is introduced and evaluated. It is shown that it significantly improves network utilization over static resource allocation schemes, in exchange for some signaling and processing overhead. Although other schemes based on prediction have been proposed, we use a novel linear prediction algorithm for symmetric fractional stable noise. This approach is intended for some traffic classes whose marginal distribution exhibits a heavy tail. The linear prediction algorithm we use was recently introduced by other researchers, but has not been studied in detail.
Therefore, its performance evaluation is also carried out. In addition to this study on the prediction-based approach, a dynamic resource allocation scheme based on envelope processes is also introduced and evaluated. We conclude that when alpha-stable models are properly used and interpreted, they let us accurately represent network traffic and therefore design and analyze reliable resource allocation mechanisms.
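The alpha-stable traffic models discussed above require sampling from symmetric α-stable laws; the standard Chambers-Mallows-Stuck transformation provides this. A generic sketch, not the dissertation's code:

```python
import numpy as np

def sym_alpha_stable(alpha, size, rng=None):
    """Draw symmetric alpha-stable samples (beta = 0, unit scale) via the
    Chambers-Mallows-Stuck transformation of a uniform and an exponential."""
    rng = rng or np.random.default_rng(0)
    u = rng.uniform(-np.pi / 2, np.pi / 2, size)   # uniform angle
    w = rng.exponential(1.0, size)                 # unit exponential
    return (np.sin(alpha * u) / np.cos(u) ** (1 / alpha)
            * (np.cos(u - alpha * u) / w) ** ((1 - alpha) / alpha))
```

For alpha = 2 the formula reduces to a Gaussian with variance 2; for 1 < alpha < 2, the regime assumed in the dissertation, the tails are heavy and the variance is infinite.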
APA, Harvard, Vancouver, ISO, and other styles
42

Schreiber, John Michael. "Performance analysis of disk scheduling mechanisms with long-range dependent access requests." Diss., UMK access, 2006.

Find full text
Abstract:
Thesis (M.S.)--School of Computing and Engineering. University of Missouri--Kansas City, 2006.
"A thesis in computer science." Typescript. Advisors: Deep Medhi and Jerry Place. Vita. Title from "catalog record" of the print edition Description based on contents viewed Nov. 9, 2007. Includes bibliographical references (leaves 59-[61]). Online version of the print edition.
APA, Harvard, Vancouver, ISO, and other styles
43

Pesee, Chatchai. "Stochastic modelling of financial processes with memory and semi-heavy tails." Thesis, Queensland University of Technology, 2005. https://eprints.qut.edu.au/16057/2/Chatchai%20Pesee%20Thesis.pdf.

Full text
Abstract:
This PhD thesis aims to study financial processes which have semi-heavy-tailed marginal distributions and may exhibit memory. The traditional Black-Scholes model is expanded to incorporate memory via an integral operator, resulting in a class of market models which still preserve the completeness and arbitrage-free conditions needed for replication of contingent claims. This approach is used to estimate the implied volatility of the resulting model. The first part of the thesis investigates the semi-heavy-tailed behaviour of financial processes. We treat these processes as continuous-time random walks characterised by a transition probability density governed by a fractional Riesz-Bessel equation. This equation extends the Feller fractional heat equation which generates α-stable processes. These latter processes have heavy tails, while those processes generated by the fractional Riesz-Bessel equation have semi-heavy tails, which are more suitable for modelling financial data. We propose a quasi-likelihood method to estimate the parameters of the fractional Riesz-Bessel equation based on the empirical characteristic function. The second part considers a dynamic model of complete financial markets in which the prices of European calls and puts are given by the Black-Scholes formula. The model has memory and can distinguish between historical volatility and implied volatility. A new method is then provided to estimate the implied volatility from the model. The third part of the thesis considers the problem of classification of financial markets using high-frequency data. The classification is based on the measure representation of high-frequency data, which is then modelled as a recurrent iterated function system. The new methodology developed is applied to some stock prices, stock indices, foreign exchange rates and other financial time series of some major markets.
In particular, the models and techniques are used to analyse the SET index, the SET50 index and the MAI index of the Stock Exchange of Thailand.
APA, Harvard, Vancouver, ISO, and other styles
44

Pesee, Chatchai. "Stochastic Modelling of Financial Processes with Memory and Semi-Heavy Tails." Queensland University of Technology, 2005. http://eprints.qut.edu.au/16057/.

Full text
Abstract:
This PhD thesis aims to study financial processes which have semi-heavy-tailed marginal distributions and may exhibit memory. The traditional Black-Scholes model is expanded to incorporate memory via an integral operator, resulting in a class of market models which still preserve the completeness and arbitrage-free conditions needed for replication of contingent claims. This approach is used to estimate the implied volatility of the resulting model. The first part of the thesis investigates the semi-heavy-tailed behaviour of financial processes. We treat these processes as continuous-time random walks characterised by a transition probability density governed by a fractional Riesz-Bessel equation. This equation extends the Feller fractional heat equation which generates α-stable processes. These latter processes have heavy tails, while those processes generated by the fractional Riesz-Bessel equation have semi-heavy tails, which are more suitable for modelling financial data. We propose a quasi-likelihood method to estimate the parameters of the fractional Riesz-Bessel equation based on the empirical characteristic function. The second part considers a dynamic model of complete financial markets in which the prices of European calls and puts are given by the Black-Scholes formula. The model has memory and can distinguish between historical volatility and implied volatility. A new method is then provided to estimate the implied volatility from the model. The third part of the thesis considers the problem of classification of financial markets using high-frequency data. The classification is based on the measure representation of high-frequency data, which is then modelled as a recurrent iterated function system. The new methodology developed is applied to some stock prices, stock indices, foreign exchange rates and other financial time series of some major markets.
In particular, the models and techniques are used to analyse the SET index, the SET50 index and the MAI index of the Stock Exchange of Thailand.
APA, Harvard, Vancouver, ISO, and other styles
45

Wingert, Simon [Verfasser]. "Essays on long memory estimation and testing for structural breaks under long-range dependent errors / Simon Wingert." Hannover : Gottfried Wilhelm Leibniz Universität Hannover, 2020. http://d-nb.info/1214367062/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Schwefel, Hans-Peter. "Performance analysis of intermediate systems serving aggregated on-off traffic with long-range dependent properties." [S.l. : s.n.], 2000. http://deposit.ddb.de/cgi-bin/dokserv?idn=96206937X.

Full text
APA, Harvard, Vancouver, ISO, and other styles
47

Tan, Phaik-Hoon. "Statistical Analysis of Long-Range Dependent Processes via a Stochastic Intensity Approach, with Applications in Networking." NCSU, 1999. http://www.lib.ncsu.edu/theses/available/etd-19990706-145610.

Full text
Abstract:

The objective of this research is to develop a flexible stochastic intensity-function model for traffic arrivals arising in an Ethernet local area network. To test some well-known Bellcore datasets for long-range dependence or nonstationarity, a battery of statistical tests was applied, including a new extension of the classical Priestley-Rao test for nonstationarity; the results of this analysis revealed pronounced nonstationarity in all of the Bellcore datasets. To model such teletraffic arrival processes accurately, a stochastic intensity function was formulated as a nonlinear extension of the Cox regression model that incorporates a general time trend together with cyclic effects and packet-size effects. The proposed intensity-function model has an exponential-polynomial-trigonometric form that includes a covariate representing the latest packet size. Maximum likelihood estimates of the unknown continuous parameters of the stochastic intensity function are obtained numerically, and the degrees of the polynomial time and packet-size components are determined by a likelihood ratio test. Although this approach yielded excellent fits to the Bellcore datasets, it also yielded the surprising conclusion that packet size has a negligible effect on the packet arrival rate. A follow-up analysis of the packet-size process confirmed this conclusion and shed additional light on the packet-generation mechanism in Ethernet local area networks. This research also includes the development of procedures for simulating traffic processes having a stochastic intensity function of the proposed form. An extensive Monte Carlo performance evaluation demonstrates the effectiveness of the proposed procedure for modeling and simulation of teletraffic arrival processes.
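Simulating arrivals from an intensity function of the exponential-polynomial-trigonometric kind described above is typically done by Lewis-Shedler thinning. A minimal sketch with an illustrative, made-up intensity (the coefficients below are not from the thesis):

```python
import numpy as np

def thinning_nhpp(intensity, t_max, lam_max, rng=None):
    """Simulate arrival times of a nonhomogeneous Poisson process on [0, t_max]
    by Lewis-Shedler thinning, given intensity(t) <= lam_max on the interval."""
    rng = rng or np.random.default_rng(1)
    t, arrivals = 0.0, []
    while True:
        t += rng.exponential(1.0 / lam_max)          # candidate from rate-lam_max HPP
        if t > t_max:
            return np.asarray(arrivals)
        if rng.uniform() <= intensity(t) / lam_max:  # accept with prob intensity(t)/lam_max
            arrivals.append(t)

# Illustrative exponential-polynomial-trigonometric intensity (trend + cycle):
rate = lambda t: np.exp(0.5 + 0.1 * t - 0.01 * t ** 2 + 0.3 * np.sin(2 * np.pi * t / 10))
```

A usable bound lam_max can be obtained by maximising the intensity over a fine grid before simulating.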

APA, Harvard, Vancouver, ISO, and other styles
48

Ozdem, Mehmet. "Video Distribution Over Ip Networks." Master's thesis, METU, 2007. http://etd.lib.metu.edu.tr/upload/12608187/index.pdf.

Full text
Abstract:
As applications like IPTV and VoD (video on demand) gain popularity, it is becoming more important to study the behaviour of video signals in Internet access infrastructures such as ADSL and cable networks. Average delay, average jitter and packet loss in these networks affect the quality of service, hence transmission and access speeds need to be determined such that these parameters are minimized. In this study the behaviour of the above-mentioned IP networks under variable bit rate (VBR) video traffic is investigated. The ns-2 simulator is used for this purpose, and actual as well as artificially generated signals are applied to the networks under test. VBR traffic is generated synthetically using ON/OFF sources with ON/OFF times taken from exponential or Pareto distributions. As VBR video shows long-range dependence with a Hurst parameter between 0.5 and 1, this parameter was used as a metric to measure the accuracy of the synthetic sources. Two different topologies were simulated in this study: one similar to ADSL access networks and the other behaving like a cable distribution network. The performance of the networks (delay, jitter and packet loss) under VBR video traffic and different access speeds was measured. Based on the obtained results, minimum access speeds required to achieve acceptable-quality video delivery to customers are suggested.
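The synthetic ON/OFF sources described above, with heavy-tailed sojourn times, can be sketched as follows. Aggregating many Pareto ON/OFF sources is the classical route to asymptotically self-similar traffic with H = (3 − α)/2 for 1 < α < 2; this is a generic illustration, not the thesis's ns-2 setup:

```python
import numpy as np

def onoff_pareto_trace(n_sources, n_slots, alpha=1.4, rng=None):
    """Aggregate load of ON/OFF sources whose ON and OFF periods are
    Pareto(alpha, x_m = 1); each source contributes 1 unit per ON slot."""
    rng = rng or np.random.default_rng(2)
    load = np.zeros(n_slots)
    # inverse-CDF Pareto sampler with minimum 1 (durations are >= 1 slot)
    pareto = lambda size: (1.0 / rng.uniform(size=size)) ** (1.0 / alpha)
    for _ in range(n_sources):
        t, on = 0, rng.uniform() < 0.5           # random initial phase
        while t < n_slots:
            dur = int(np.ceil(pareto(1)[0]))     # heavy-tailed sojourn time
            if on:
                load[t:t + dur] += 1.0
            t += dur
            on = not on
    return load
```

With alpha = 1.4 the aggregate trace should exhibit a Hurst parameter near 0.8, which can then be checked with an estimator such as R/S or DFA.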
APA, Harvard, Vancouver, ISO, and other styles
49

Betken, Annika [Verfasser], Herold [Gutachter] Dehling, and Holger [Gutachter] Dette. "Change-point analysis for long-range dependent time series / Annika Betken ; Gutachter: Herold Dehling, Holger Dette ; Fakultät für Mathematik." Bochum : Ruhr-Universität Bochum, 2018. http://d-nb.info/116194205X/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
50

Fares, Rasha Hamed Abdel Moaty. "Performance modelling and analysis of congestion control mechanisms for communication networks with quality of service constraints : an investigation into new methods of controlling congestion and mean delay in communication networks with both short range dependent and long range dependent traffic." Thesis, University of Bradford, 2010. http://hdl.handle.net/10454/5435.

Full text
Abstract:
Active Queue Management (AQM) schemes are used for ensuring the Quality of Service (QoS) in telecommunication networks. However, they are sensitive to parameter settings and have weaknesses in detecting and controlling congestion under dynamically changing network situations. Another drawback of AQM algorithms is that they have been applied only to Markovian models, which are considered Short Range Dependent (SRD) traffic models. However, traffic measurements from communication networks have shown that network traffic can exhibit self-similar as well as Long Range Dependent (LRD) properties. Therefore, it is important to design new algorithms not only to control congestion but also to predict the onset of congestion within a network. An aim of this research is to devise new congestion control methods for communication networks that make use of various traffic characteristics, such as LRD, which has not previously been employed in congestion control methods currently used in the Internet. A queueing model with a number of ON/OFF sources has been used, and this incorporates a novel congestion prediction algorithm for AQM. The simulation results have shown that applying the algorithm can provide better performance than an equivalent system without the prediction. Modifying the algorithm by the inclusion of a sliding window mechanism has been shown to further improve the performance in terms of controlling the total number of packets within the system and improving the throughput. Also considered is the important problem of maintaining QoS constraints, such as mean delay, which is crucially important in providing satisfactory transmission of real-time services over multi-service networks like the Internet, which were not originally designed for this purpose. An algorithm has been developed to provide a control strategy that operates on a buffer which incorporates a moveable threshold.
The algorithm has been developed to control the mean delay by dynamically adjusting the threshold, which, in turn, controls the effective arrival rate by randomly dropping packets. This work has been carried out using a mixture of computer simulation and analytical modelling. The performance of the new methods that have.
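A toy version of a buffer with a moveable threshold of the kind described, random dropping above the threshold plus a delay-driven threshold adjustment, might look like this. The adjustment rule below is a hypothetical illustration, not the algorithm developed in the thesis:

```python
import random

class ThresholdAQM:
    """FIFO buffer with a moveable drop threshold: arrivals beyond the threshold
    are dropped with probability growing in the excess, and the threshold is
    nudged down (up) when measured delay is above (below) a target."""
    def __init__(self, capacity=200, threshold=100, target_delay=50.0):
        self.capacity, self.threshold, self.target = capacity, threshold, target_delay
        self.queue = []                       # stores arrival timestamps

    def arrive(self, now):
        q = len(self.queue)
        if q >= self.capacity:                # hard tail-drop at full buffer
            return False
        if q > self.threshold:
            # drop probability rises linearly from threshold to capacity
            p = (q - self.threshold) / (self.capacity - self.threshold)
            if random.random() < p:
                return False
        self.queue.append(now)
        return True

    def depart(self, now):
        if not self.queue:
            return None
        delay = now - self.queue.pop(0)
        # moveable threshold: tighten when delay exceeds target, relax otherwise
        if delay > self.target and self.threshold > 1:
            self.threshold -= 1
        elif delay < self.target and self.threshold < self.capacity - 1:
            self.threshold += 1
        return delay
```

Lowering the threshold raises the drop probability at a given queue length, which reduces the effective arrival rate and hence the mean delay, mirroring the control loop sketched in the abstract.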
APA, Harvard, Vancouver, ISO, and other styles
