Dissertations / Theses on the topic 'Time-series analysis – Mathematical models'

Consult the top 50 dissertations / theses for your research on the topic 'Time-series analysis – Mathematical models.'


1

黃鎮山 and Chun-shan Wong. "Statistical inference for some nonlinear time series models." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1998. http://hub.hku.hk/bib/B31239444.

2

Wong, Chun-shan. "Statistical inference for some nonlinear time series models /." Hong Kong : University of Hong Kong, 1998. http://sunzi.lib.hku.hk/hkuto/record.jsp?B20715316.

3

Cheung, King Chau. "Modelling multiple time series with missing observations." Thesis, Canberra, ACT : The Australian National University, 1993. http://hdl.handle.net/1885/133887.

Abstract:
This thesis introduces an approach to the state space modelling of time series that may possess missing observations. The procedure starts by estimating the autocovariance sequence using an idea proposed by Parzen (1963) and Stoffer (1986). Successive Hankel matrices are obtained via autoregressive approximations. The rank of the Hankel matrix is determined by a singular value decomposition in conjunction with an appropriate model selection criterion. An internally balanced state space realisation of the selected Hankel matrix provides initial estimates for maximum likelihood estimation. Finally, theoretical evaluation of the Fisher information matrix with missing observations is considered. The methodology is illustrated by applying the implied algorithm to real data: we model the white blood cell counts of a patient with leukaemia, with the objective of describing their dynamic behaviour.
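The first two stages described in this abstract, autocovariance estimation under missing data and rank determination of the autocovariance Hankel matrix by singular value decomposition, can be sketched as below. This is an illustrative reconstruction, not the thesis's code; the function names and the simple threshold rule for the numerical rank are my own.

```python
import numpy as np

def autocov_missing(x, max_lag):
    """Autocovariances of a series containing NaNs: average lagged
    products only over pairs where both observations are present
    (the idea attributed above to Parzen (1963) and Stoffer (1986))."""
    x = np.asarray(x, dtype=float)
    present = ~np.isnan(x)
    centred = np.where(present, x - np.nanmean(x), 0.0)
    n = len(x)
    gamma = np.empty(max_lag + 1)
    for k in range(max_lag + 1):
        both = present[k:] & present[:n - k]
        gamma[k] = (centred[k:] * centred[:n - k])[both].sum() / both.sum()
    return gamma

def hankel_rank(gamma, m, tol=1e-8):
    """Numerical rank of the m-by-m Hankel matrix of autocovariances;
    this suggests the state dimension of a state space realisation."""
    H = np.array([[gamma[i + j + 1] for j in range(m)] for i in range(m)])
    s = np.linalg.svd(H, compute_uv=False)
    return int(np.sum(s > tol * s[0]))

# The exact autocovariances of an AR(1) process give a rank-one Hankel matrix
gamma = 0.8 ** np.arange(7)
print(hankel_rank(gamma, 3))  # → 1
```

In practice the estimated autocovariances are noisy, which is why the thesis pairs the SVD with a model selection criterion rather than a fixed tolerance.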
4

Jin, Shusong, and 金曙松. "Nonlinear time series modeling with application to finance and other fields." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2005. http://hub.hku.hk/bib/B3199605X.

5

Chan, Yin-ting, and 陳燕婷. "Topics on actuarial applications of non-linear time series models." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2005. http://hub.hku.hk/bib/B32002099.

6

Lin, Zhongli, and 林中立. "On the statistical inference of some nonlinear time series models." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2009. http://hub.hku.hk/bib/B43757625.

7

Yiu, Fu-keung, and 饒富強. "Time series analysis of financial index." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1996. http://hub.hku.hk/bib/B31267804.

8

Kilminster, Devin. "Modelling dynamical systems via behaviour criteria." University of Western Australia. Dept. of Mathematics and Statistics, 2002. http://theses.library.uwa.edu.au/adt-WU2003.0029.

Abstract:
An important part of the study of dynamical systems is the fitting of models to time-series data. That is, given the data, a series of observations taken from a (not fully understood) system of interest, we would like to specify a model, a mathematical system which generates a sequence of “simulated” observations. Our aim is to obtain a “good” model — one that is in agreement with the data. We would like this agreement to be quantitative — not merely qualitative. The major subject of this thesis is the question of what good quantitative agreement means. Most approaches to this question could be described as “predictionist”. In the predictionist approach one builds models by attempting to answer the question, “given that the system is now here, where will it be next?” The quality of the model is judged by the degree to which the states of the model and the original system agree in the near future, conditioned on the present state of the model agreeing with that of the original system. Equivalently, the model is judged on its ability to make good short-term predictions on the original system. The main claim of this thesis is that prediction is often not the most appropriate criterion to apply when fitting models. We show, for example, that one can have models that, while able to make good predictions, have long-term (or free-running) behaviour bearing little resemblance to that exhibited in the original time-series. We would hope to be able to use our models for a wide range of purposes other than just prediction — certainly we would like our models to exhibit good free-running behaviour. This thesis advocates a “behaviourist” approach, in which the criterion for a good model is that its long-term behaviour matches that exhibited by the data. We suggest that the behaviourist approach enjoys a certain robustness over the predictionist approaches.
We show that good predictors can often be very poorly behaved, and suggest that well-behaved models cannot perform too badly at the task of prediction. The thesis begins by comparing the predictionist and behaviourist approaches in the context of a number of simplified model-building problems. It then presents a simple theory for the understanding of the differences between the two approaches. Effective methods for the construction of well-behaved models are presented. Finally, these methods are applied to two real-world problems — modelling of the response of a voltage-clamped squid “giant” axon, and modelling of the “yearly sunspot number”.
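The contrast between the two criteria is easy to reproduce numerically. In the toy sketch below (my illustration, not one of the thesis's examples), a least-squares AR(2) model is an excellent one-step predictor of an oscillatory series, yet its noise-free free run collapses to the mean, losing the sustained oscillation seen in the data:

```python
import numpy as np

rng = np.random.default_rng(0)

# "True" system: an oscillatory AR(2) process driven by noise
n = 2000
x = np.zeros(n)
for t in range(2, n):
    x[t] = 1.6 * x[t - 1] - 0.9 * x[t - 2] + rng.normal()

# Fit an AR(2) predictor by least squares
X = np.column_stack([x[1:-1], x[:-2]])
y = x[2:]
coef = np.linalg.lstsq(X, y, rcond=None)[0]

# Predictionist criterion: one-step error sits near the noise floor
one_step_rmse = np.sqrt(np.mean((y - X @ coef) ** 2))

# Behaviourist criterion: iterate the fitted model with no noise
z_prev, z = x[-2], x[-1]
free_run = []
for _ in range(500):
    z_prev, z = z, coef[0] * z + coef[1] * z_prev
    free_run.append(z)
free_run = np.asarray(free_run)
# The free run decays to the mean, unlike the persistently oscillating data
```

The fitted roots lie inside the unit circle, so the deterministic free run dies out even though one-step predictions are nearly optimal — the kind of mismatch the thesis argues a behaviourist criterion should penalise.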
9

Thyer, Mark Andrew. "Modelling long-term persistence in hydrological time series." Diss., 2000. http://www.newcastle.edu.au/services/library/adt/public/adt-NNCU20020531.035349/index.html.

10

Rivera, Pablo Marshall. "Analysis of a cross-section of time series using structural time series models." Thesis, London School of Economics and Political Science (University of London), 1990. http://etheses.lse.ac.uk/13/.

Abstract:
This study deals with multivariate structural time series models, and in particular, with the analysis and modelling of cross-sections of time series. In this context, no cause and effect relationships are assumed between the time series, although they are subject to the same overall environment. The main motivations in the analysis of cross-sections of time series are (i) the gains in efficiency in the estimation of the irregular, trend and seasonal components; and (ii) the analysis of models with common effects. The study contains essentially two parts. The first one considers models with a general specification for the correlation of the irregular, trend and seasonal components across the time series. Four structural time series models are presented, and the estimation of the components of the time series, as well as the estimation of the parameters which define these components, is discussed. The second part of the study deals with dynamic error components models where the irregular, trend and seasonal components are generated by common, as well as individual, effects. The extension to models for multivariate observations of cross-sections is also considered. Several applications of the methods studied are presented. Particularly relevant is an econometric study of the demand for energy in the U.K.
11

Kwan, Chun-kit, and 關進傑. "Statistical inference for some financial time series models with conditional heteroscedasticity." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2008. http://hub.hku.hk/bib/B39794027.

12

Shah, Nauman. "Statistical dynamical models of multivariate financial time series." Thesis, University of Oxford, 2013. http://ora.ox.ac.uk/objects/uuid:428015e6-8a52-404e-9934-0545c80da4e1.

Abstract:
The last few years have witnessed an exponential increase in the availability and use of financial market data, which is sampled at increasingly high frequencies. Extracting useful information about the dependency structure of a system from these multivariate data streams has numerous practical applications and can aid in improving our understanding of the driving forces in the global financial markets. These large and noisy data sets are highly non-Gaussian in nature and require the use of efficient and accurate interaction measurement approaches for their analysis in a real-time environment. However, most frequently used measures of interaction have certain limitations to their practical use, such as the assumption of normality or computational complexity. This thesis has two major aims; firstly, to address this lack of availability of suitable methods by presenting a set of approaches to dynamically measure symmetric and asymmetric interactions, i.e. causality, in multivariate non-Gaussian signals in a computationally efficient (online) framework, and secondly, to make use of these approaches to analyse multivariate financial time series in order to extract interesting and practically useful information from financial data. Most of our proposed approaches are primarily based on independent component analysis, a blind source separation method which makes use of higher-order statistics to capture information about the mixing process which gives rise to a set of observed signals. Knowledge about this information allows us to investigate the information coupling dynamics, as well as to study the asymmetric flow of information, in multivariate non-Gaussian data streams. We extend our multivariate interaction models, using a variety of statistical techniques, to study the scale-dependent nature of interactions and to analyse dependencies in high-dimensional systems using complex coupling networks. 
We carry out a detailed theoretical, analytical and empirical comparison of our proposed approaches with some other frequently used measures of interaction, and demonstrate their comparative utility, efficiency and accuracy using a set of practical financial case studies, focusing primarily on the foreign exchange spot market.
13

Gurung, Ai Bahadur. "Analysis and prediction of hydrometeorological time series by dynamical system approach." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2000. http://hub.hku.hk/bib/B31240203.

14

Lu, Zhen Cang. "Price forecasting models in online flower shop implementation." Thesis, University of Macau, 2017. http://umaclib3.umac.mo/record=b3691395.

15

Cheng, Xixin, and 程細辛. "Mixture time series models and their applications in volatility estimation and statistical arbitrage trading." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2008. http://hub.hku.hk/bib/B40988053.

16

Fok, Carlotta Ching Ting 1973. "Approximating periodic and non-periodic trends in time-series data." Thesis, McGill University, 2002. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=79765.

Abstract:
Time-series data that reflect a periodic pattern are often used in psychology. In personality psychology, Brown and Moskowitz (1998) used spectral analysis to study whether fluctuations in the expression of four interpersonal behaviors show a cyclical pattern. Spline smoothing had also been used in the past to track the non-periodic trend, but no research has yet been done that combines spectral analysis and spline smoothing. The present thesis describes a new model which combines these two techniques to capture both periodic and non-periodic trends in the data.
The new model is then applied to Brown and Moskowitz's time-series data to investigate the long-term evolution of the four interpersonal behaviors, and to GDP data to examine the periodic and non-periodic patterns in the GDP values of 16 countries. Finally, the extent to which the model is accurate is tested using simulated data.
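A minimal version of a model that captures periodic and non-periodic trends jointly can be written as a single least-squares fit, with a low-order polynomial standing in for the spline part and one harmonic pair for the spectral part. The data, period, and coefficients below are invented for illustration and are not from the thesis:

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(240.0)        # e.g. 20 "years" of monthly observations
period = 12.0

# Synthetic series: slow quadratic trend + seasonal cycle + noise
y = 0.001 * (t - 120.0) ** 2 + 2.0 * np.sin(2 * np.pi * t / period) \
    + rng.normal(0.0, 0.5, t.size)

# Design matrix: polynomial trend columns plus one harmonic pair
X = np.column_stack([
    np.ones_like(t), t, t ** 2,                                      # non-periodic part
    np.sin(2 * np.pi * t / period), np.cos(2 * np.pi * t / period),  # periodic part
])
beta = np.linalg.lstsq(X, y, rcond=None)[0]

# Amplitude of the recovered seasonal cycle (true value is 2.0)
amp = np.hypot(beta[3], beta[4])
```

A genuine spline basis would replace the polynomial columns; the point of the sketch is that both components can be estimated in one regression.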
17

Barbosa, Emanuel Pimentel. "Dynamic Bayesian models for vector time series analysis & forecasting." Thesis, University of Warwick, 1989. http://wrap.warwick.ac.uk/34817/.

Abstract:
This thesis considers the Bayesian analysis of general multivariate DLM's (Dynamic Linear Models) for vector time series forecasting where the observational variance matrices are unknown. This extends considerably some previous work based on conjugate analysis for a special sub-class of vector DLM's where all marginal univariate models follow the same structure. The new methods developed in this thesis are shown to have a better performance than other competing approaches to vector DLM analysis, as for instance, the one based on the Student t filter. Practical aspects of implementation of the new methods, as well as some theoretical properties, are discussed; further model extensions are considered, including non-linear models, and some applications with real and simulated data are provided.
18

Yan, Ka-lok, and 忻嘉樂. "Time series regression modelling of air quality data in Hong Kong." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1994. http://hub.hku.hk/bib/B31252990.

19

Nakamura, Tomomichi. "Modelling nonlinear time series using selection methods and information criteria." University of Western Australia. School of Mathematics and Statistics, 2004. http://theses.library.uwa.edu.au/adt-WU2004.0085.

Abstract:
[Truncated abstract] Time series of natural phenomena usually show irregular fluctuations. Often we want to know the underlying system and to predict future phenomena. An effective way of tackling this task is by time series modelling. Originally, linear time series models were used. As it became apparent that nonlinear systems abound in nature, modelling techniques that take into account nonlinearity in time series were developed. A particularly convenient and general class of nonlinear models is the pseudolinear models, which are linear combinations of nonlinear functions. These models can be obtained by starting with a large dictionary of basis functions which one hopes will be able to describe any likely nonlinearity, selecting a small subset of it, and taking a linear combination of these to form the model. The major component of this thesis concerns how to build good models for nonlinear time series. In building such models, there are three important problems, broadly speaking. The first is how to select basis functions which reflect the peculiarities of the time series as much as possible. The second is how to fix the model size so that the models can reflect the underlying system of the data and the influences of noise included in the data are removed as much as possible. The third is how to provide good estimates for the parameters in the basis functions, considering that they may have significant bias when the noise included in the time series is significant relative to the nonlinearity. Although these problems are mentioned separately, they are strongly interconnected
20

Maharesi, Retno. "Modelling time series using time varying coefficient autoregressive models : with application to several data sets." Thesis, Edith Cowan University, Research Online, Perth, Western Australia, 1994. https://ro.ecu.edu.au/theses/1099.

Abstract:
In this thesis the state space approach and the Kalman recursions are used for modelling univariate time series data. The models examined in this thesis are time-varying coefficient autoregressive models, which can be represented in state space form. The coefficients are assumed to change according to a stationary process, a non-stationary process or a random process. In order to estimate these changing unknown coefficients, they are treated as state variables, and the equation describing the changes of the state variables is given by the state equation. The model can then be expressed in the form of a measurement equation. The parameters of the model, which include the transition matrix T and the covariance matrices of the random terms in the state equation and the measurement equation, denoted respectively by Q and R, are obtained using the EM algorithm developed by Shumway and Stoffer (1982). Other models considered for comparison in this thesis are the Box-Jenkins and Harvey's structural models. The results of model fitting are illustrated by applying these three models to three special data sets. These results are compared to investigate whether the time-varying coefficients model can provide a better fit and, where appropriate, a suitable data transformation is applied to the data sets in order to improve the fit of the time-varying coefficient autoregressive model.
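The core of such a model can be sketched as a scalar Kalman filter in which the AR coefficient itself is the state and follows a random walk. The EM estimation of Q and R described in the abstract is omitted here; the variances below are simply fixed by hand, so this is my illustration rather than the thesis's implementation:

```python
import numpy as np

def tv_ar1_filter(y, q=1e-4, r=1.0, phi0=0.0, p0=1.0):
    """Kalman filter for the time-varying AR(1) model
         y[t]   = phi[t] * y[t-1] + v[t],   v ~ N(0, r)   (measurement)
         phi[t] = phi[t-1] + w[t],          w ~ N(0, q)   (state)
    The lagged observation y[t-1] plays the role of the measurement matrix."""
    phi, p = phi0, p0
    phis = np.empty(len(y))
    phis[0] = phi0
    for t in range(1, len(y)):
        p = p + q                        # predict: random-walk state
        h = y[t - 1]                     # time-varying "measurement matrix"
        s = h * p * h + r                # innovation variance
        k = p * h / s                    # Kalman gain
        phi = phi + k * (y[t] - h * phi)
        p = (1.0 - k * h) * p
        phis[t] = phi
    return phis

# Track a coefficient that drifts slowly from 0.9 down to 0.2
rng = np.random.default_rng(42)
n = 3000
true_phi = np.linspace(0.9, 0.2, n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = true_phi[t] * y[t - 1] + rng.normal()
phis = tv_ar1_filter(y)
```

In the full treatment, q and r (and an initial state) would be estimated by the EM algorithm of Shumway and Stoffer rather than fixed.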
21

McCloud, Nadine. "Model misspecification: theory and applications." Diss., online access via UMI, 2008.

22

Hay, John Leslie. "Statistical modelling for non-Gaussian time series data with explanatory variables." Thesis, Queensland University of Technology, 1999.

23

Aboagye-Sarfo, Patrick. "Time series analysis of HIV incidence cases in Ghana : trends, predictions and impact of interventions." Thesis, Edith Cowan University, Research Online, Perth, Western Australia, 2009. https://ro.ecu.edu.au/theses/1889.

Abstract:
The HIV/AIDS epidemic is one of the world’s leading causes of death, particularly in sub-Saharan African nations like Ghana, and threatens socio-economic development in many developing countries. In this thesis Ghanaian HIV data, comprising the monthly number of serologically confirmed reported new HIV cases since 1996, was subdivided into northern and southern sectors based on the geographical location of the ten administrative regions. Potential bias in the collection is considered given the strategic location of the two specialist teaching hospitals, one in each sector, which receive referrals from the regions. Time series modelling was applied to the monthly number of new HIV cases in each sector. An equally weighted moving average was applied to determine the trend in incident HIV infections in the northern and southern sectors, while Box-Jenkins model identification and Holt’s (double) exponential smoothing methods were employed to predict new HIV incidence cases for both sectors with respect to sex and age groups. The effectiveness of three existing major interventions was examined using intervention modelling, whereas cointegration modelling was used to determine the long-run impact of condom utilisation on incident cases of HIV infection. The trend analysis and predictions for the next three years reveal a slow increase in the number of new HIV cases. Although various interventions have influenced the number of cases of HIV infection, the magnitude of impact fluctuated and declined with time. The analysis of the long-run impact of condom utilisation on the reduction of new HIV incidence cases indicates that new cases of infection will actually increase monthly by a factor of 0.4–0.6 for every 1000 condoms issued. These perplexing results may be because issuing of condoms does not ensure usage.
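Holt's (double) exponential smoothing, one of the prediction methods named in this abstract, maintains a level and a trend term that are updated recursively. A minimal version (smoothing constants chosen arbitrarily; not the thesis's implementation) is:

```python
import numpy as np

def holt_forecast(y, alpha=0.3, beta=0.1, horizon=12):
    """Holt's linear (double) exponential smoothing.
    Returns forecasts 1..horizon steps beyond the end of y."""
    level, trend = y[0], y[1] - y[0]       # simple initialisation
    for obs in y[1:]:
        prev_level = level
        level = alpha * obs + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
    return level + trend * np.arange(1, horizon + 1)

# On exactly linear data the method reproduces the line
y = 2.0 * np.arange(50) + 5.0
print(holt_forecast(y, horizon=3))  # → [105. 107. 109.]
```

The two constants trade responsiveness against smoothness; in applied work they are usually chosen by minimising in-sample forecast error.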
24

Button, Peter. "Models for ocean waves." Master's thesis, University of Cape Town, 1988. http://hdl.handle.net/11427/14299.

Abstract:
Includes bibliography.
Ocean waves represent an important design factor in many coastal engineering applications. Although extreme wave height is usually considered the single most important of these factors there are other important aspects that require consideration. These include the probability distribution of wave heights, the seasonal variation and the persistence, or duration, of calm and storm periods. If one is primarily interested in extreme wave height then it is possible to restrict one's attention to events which are sufficiently separated in time to be effectively independently (and possibly even identically) distributed. However the independence assumption is not tenable for the description of many other aspects of wave height behaviour, such as the persistence of calm periods. For this one has to take account of the serial correlation structure of observed wave heights, the seasonal behaviour of the important statistics, such as mean and standard deviation, and in fact the entire seasonal probability distribution of wave heights. In other words the observations have to be regarded as a time series.
25

Coroneo, Laura. "Essays on modelling and forecasting financial time series." Doctoral thesis, Universite Libre de Bruxelles, 2009. http://hdl.handle.net/2013/ULB-DIPOT:oai:dipot.ulb.ac.be:2013/210284.

Abstract:
This thesis is composed of three chapters which propose some novel approaches to model and forecast financial time series. The first chapter focuses on high frequency financial returns and proposes a quantile regression approach to model their intraday seasonality and dynamics. The second chapter deals with the problem of forecasting the yield curve including large datasets of macroeconomic information. The last chapter addresses the issue of modelling the term structure of interest rates.

The first chapter investigates the distribution of high frequency financial returns, with special emphasis on the intraday seasonality. Using quantile regression, I show the expansions and shrinks of the probability law through the day for three years of 15 minutes sampled stock returns. Returns are more dispersed and less concentrated around the median at the hours near the opening and closing. I provide intraday value at risk assessments and I show how it adapts to changes of dispersion over the day. The tests performed on the out-of-sample forecasts of the value at risk show that the model is able to provide good risk assessments and to outperform standard Gaussian and Student’s t GARCH models.

The second chapter shows that macroeconomic indicators are helpful in forecasting the yield curve. I incorporate a large number of macroeconomic predictors within the Nelson and Siegel (1987) model for the yield curve, which can be cast in a common factor model representation. Rather than including macroeconomic variables as additional factors, I use them to extract the Nelson and Siegel factors. Estimation is performed by EM algorithm and Kalman filter using a data set composed by 17 yields and 118 macro variables. Results show that incorporating large macroeconomic information improves the accuracy of out-of-sample yield forecasts at medium and long horizons.
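The Nelson and Siegel (1987) factor structure underlying this chapter can be sketched in its common two-step form: fixed exponential loadings, then cross-sectional least squares to extract the level, slope and curvature factors (the convention popularised by Diebold and Li). The EM/Kalman estimation with 118 macro variables is well beyond this toy example, and all numbers below are invented:

```python
import numpy as np

def ns_loadings(maturities, lam=0.0609):
    """Nelson-Siegel loadings for the level, slope and curvature factors
    (lambda fixed at the value popularised by Diebold and Li)."""
    m = np.asarray(maturities, dtype=float)
    x = lam * m
    slope = (1.0 - np.exp(-x)) / x
    curvature = slope - np.exp(-x)
    return np.column_stack([np.ones_like(m), slope, curvature])

# Cross-sectional least squares recovers the factors from a yield curve
maturities = np.array([3, 6, 12, 24, 36, 60, 84, 120])  # months
L = ns_loadings(maturities)
true_factors = np.array([5.0, -1.5, 0.8])               # level, slope, curvature
yields = L @ true_factors                               # noise-free toy curve
est_factors = np.linalg.lstsq(L, yields, rcond=None)[0]
```

Repeating the cross-sectional regression date by date yields factor time series, which the chapter instead extracts jointly with the macroeconomic information in a state space framework.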

The third chapter statistically tests whether the Nelson and Siegel (1987) yield curve model is arbitrage-free. Theoretically, the Nelson-Siegel model does not ensure the absence of arbitrage opportunities. Still, central banks and public wealth managers rely heavily on it. Using a non-parametric resampling technique and zero-coupon yield curve data from the US market, I find that the no-arbitrage parameters are not statistically different from those obtained from the Nelson and Siegel model, at a 95 percent confidence level. I therefore conclude that the Nelson and Siegel yield curve model is compatible with arbitrage-freeness.


Doctorate in Economic and Management Sciences
26

Van, Zyl Verena Helen. "Searching for histogram patterns due to macroscopic fluctuations in financial time series." Thesis, Stellenbosch : University of Stellenbosch, 2007. http://hdl.handle.net/10019.1/3078.

Abstract:
Thesis (MComm (Business Management))--University of Stellenbosch, 2007.
ENGLISH ABSTRACT: This study aims to investigate whether the phenomena found by Shnoll et al. when applying histogram pattern analysis techniques to stochastic processes from chemistry and physics are also present in financial time series, particularly exchange rate and index data. The phenomena are related to the fine structure of non-smoothed frequency distributions drawn from statistically insufficient samples of changes, and their patterns in time. Shnoll et al. use the notion of macroscopic fluctuations to explain the behaviour of sequences of histograms. Histogram patterns in time adhere to several laws that could not be detected using time series analysis methods. In this study general approaches are reviewed that may be used to model financial markets and the volatility of price processes in particular. Special emphasis is placed on the modelling of high-frequency data sets and exchange rate data. Following previous studies of the Shnoll phenomena from other fields, different steps of the histogram sequence analysis are carried out to determine whether the findings of Shnoll et al. could also be applied to financial market data. The findings of this thesis widen the understanding of time-varying volatility and can aid in financial risk measurement and management. Outcomes of the study include an investigation of time series characteristics in terms of the formation of discrete states, the detection of the near zone effect as proclaimed by Shnoll et al., the periodic recurrence of histogram shapes, and the synchronous variation in data sets measured in the same time intervals.
27

Farag, Saarah A. "A comparison of advanced time series models for environmental dependent stock recruitment of the western rock lobster." Thesis, Edith Cowan University, Research Online, Perth, Western Australia, 1998. https://ro.ecu.edu.au/theses/997.

Abstract:
Time series models have been applied in many areas including economics, stock recruitment and the environment. Most environmental time series involve highly correlated dependent variables, which makes it difficult to apply conventional regression analysis. Traditionally, regression analysis has been applied to the environmentally dependent stock and recruitment relationships for crustacean species in Western Australian fisheries. Alternative models, such as transfer function models and state space models, have the potential to provide improved forecasts for these types of data sets. This dissertation will explore the application of regression models, transfer function models, and state space models to modelling the puerulus stage of the western rock lobster (Panulirus cygnus) in the fisheries of Western Australia. The transfer function models are suited to examining the influences of the environment on crustacean species and can be used where correlated variables are involved. These models aim at producing short-term forecasts that may help in the management of the fisheries. In comparison with regression models, transfer function models gave better forecast values, with state space models giving better forecast values in the first two years. Overall, it was shown that environmental effects, westerly winds and the Leeuwin Current, have a significant effect on the puerulus settlement for Dongara and Alkimos. It was also shown that westerly winds and spawning stock have a significant effect on the puerulus settlement at the Abrolhos Islands.
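A transfer function model of the kind described here relates the output to its own past and to lagged values of an environmental input. A toy single-input version (all coefficients and lags invented for illustration) can be fitted by least squares:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 1000
wind = rng.normal(size=n)                 # environmental input series
catch = np.zeros(n)
for t in range(3, n):
    # Output depends on its own past and on the input three steps back
    catch[t] = 0.5 * catch[t - 1] + 0.8 * wind[t - 3] + 0.3 * rng.normal()

# Least-squares fit of the assumed lag structure
X = np.column_stack([catch[2:-1], wind[:-3]])
y = catch[3:]
coef = np.linalg.lstsq(X, y, rcond=None)[0]   # ≈ [0.5, 0.8]
```

In the Box-Jenkins transfer function methodology the appropriate lags would be identified from cross-correlations after prewhitening, rather than assumed known as here.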
28

Zhu, Jia Jun. "A language for financial chart patterns and template-based pattern classification." Thesis, University of Macau, 2018. http://umaclib3.umac.mo/record=b3950603.

29

Alzghool, Raed Ahmad Hasan. "Estimation for state space models: quasi-likelihood and asymptotic quasi-likelihood approaches." Access electronically, 2008. http://ro.uow.edu.au/theses/91.

30

Chong, Siu-yung. "Comparison of estimates of autoregressive models with superimposed errors." Hong Kong : University of Hong Kong, 2001. http://sunzi.lib.hku.hk/hkuto/record.jsp?B22752997.

31

莊少容 and Siu-yung Chong. "Comparison of estimates of autoregressive models with superimposed errors." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2001. http://hub.hku.hk/bib/B31224246.

32

Yu, Chung-Chyi. "Finite-element analysis of time-dependent convection diffusion equations (Petrov-Galerkin)." Diss., The University of Arizona, 1986. http://hdl.handle.net/10150/183930.

Abstract:
Petrov-Galerkin finite element methods based on time-space elements are developed for the time-dependent multi-dimensional linear convection-diffusion equation. The methods introduce two parameters in conjunction with perturbed weighting functions. These parameters are determined locally using truncation error analysis techniques. In the one-dimensional case, the new algorithms are thoroughly analyzed for convergence and stability properties. Numerical schemes that are second order in time, third order in space and stable when the Courant number is less than or equal to one are produced. Extensions of the algorithm to nonlinear Navier-Stokes equations are investigated. In this case, it is found more efficient to use a Petrov-Galerkin method based on a one parameter perturbation and a semi-discrete Petrov-Galerkin formulation with a generalized Newmark algorithm in time. The algorithm is applied to the two-dimensional simulation of natural convection in a horizontal circular cylinder when the Boussinesq approximation is valid. New results are obtained for this problem which show the development of three flow regimes as the Rayleigh number increases. Detailed calculations for the fluid flow and heat transfer in the cylinder for the different regimes as the Rayleigh number increases are presented.
33

Kohers, Gerald. "The use of neural networks in the combining of time series forecasts with differential penalty costs." Diss., Virginia Tech, 1993. http://hdl.handle.net/10919/40086.

Abstract:
The need for accurate forecasting and its potential benefits are well established in the literature. Virtually all individuals and organizations have at one time or another made decisions based on forecasts of future events. This widespread need for accurate predictions has resulted in considerable growth in the science of forecasting. To a large degree, practitioners are heavily dependent on academicians for generating new and improved forecasting techniques. In response to an increasingly dynamic environment, diverse and complex forecasting methods have been proposed to more accurately predict future events. These methods, which focus on the different characteristics of historical data, have ranged in complexity from simplistic to very sophisticated mathematical computations requiring a high level of expertise. By combining individual techniques to form composite forecasts in order to improve on the forecasting accuracy, researchers have taken advantage of the various strengths of these techniques. A number of combining methods have proven to yield better forecasts than individual methods, with the complexity of the various combining methods ranging from a simple average to quite complex weighting schemes. The focus of this study is to examine the usefulness of neural networks in composite forecasting. Emphasis is placed on the effectiveness of two neural networks (i.e., a backpropagation neural network and a modular neural network) relative to three traditional composite models (i.e., a simple average, a constrained mathematical programming model, and an unconstrained mathematical programming model) in the presence of four penalty cost functions for forecasting errors. Specifically, the overall objective of this study is to compare the short-term predictive ability of each of the five composite forecasting techniques on various first-order autoregressive models, taking into account penalty cost functions representing four different situations.
The results of this research suggest that in the vast majority of scenarios examined in this study, the neural network model clearly outperformed the other composite models.
Ph. D.
APA, Harvard, Vancouver, ISO, and other styles
34

Guo, Zigang, and 郭自剛. "Optimization of stochastic vehicle routing with soft time windows." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2006. http://hub.hku.hk/bib/B36758255.

Full text
APA, Harvard, Vancouver, ISO, and other styles
35

Wang, Xiang, and 王翔. "Model order reduction of time-delay systems with variational analysis." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2011. http://hub.hku.hk/bib/B46604236.

Full text
APA, Harvard, Vancouver, ISO, and other styles
36

Srisurichan, Sukanlaya. "Time series modelling of the environmental factors affecting the daily catch rate of western rock lobster." Thesis, Edith Cowan University, Research Online, Perth, Western Australia, 2001. https://ro.ecu.edu.au/theses/1511.

Full text
Abstract:
The western rock lobster fishery is one of the most significant and valuable single-species fisheries in Australia and in the world. It generates a gross commercial value of $200-300 million per year for the economy of Western Australia. The impact of environmental factors on the daily catch rate of the western rock lobster is of particular interest to the W.A. Marine Research Laboratories at the Ministry of Fisheries, Western Australia. Considerable time and effort has been invested in building and developing suitable models to measure such impact on this fishery. While past research has focussed on monthly or seasonal data, this study investigated appropriate time series analyses to model the effect of major environmental factors such as lunar cycle, swell, and sea water temperature on the daily catch rate data of the western rock lobster at different depths. The variation in western rock lobster daily catch rate data for two periods ("whites" and "reds") and four categories (undersize, legal size, spawner, and setose) was examined for three management zones, A, B, and C. Regression and transfer function models for relationships between catch rates and environmental data were considered and compared. Results show that the lunar cycle, especially the presence of the full moon, and the swell have a significant impact on the daily catch rates of the western rock lobster. The results of this research assist in the development of improved models to support the management of this very valuable resource.
APA, Harvard, Vancouver, ISO, and other styles
37

Lowry, Matthew C. "A new approach to the train algorithm for distributed garbage collection." Title page, table of contents and abstract only, 2004. http://hdl.handle.net/2440/37710.

Full text
Abstract:
This thesis describes a new approach to achieving high quality distributed garbage collection using the Train Algorithm. This algorithm has been investigated for its ability to provide high quality collection in a variety of contexts, including persistent object systems and distributed object systems. Prior literature on the distributed Train Algorithm suggests that safe, complete, asynchronous, and scalable collection can be attained, however an approach that achieves this combination of behaviour has yet to emerge. The mechanisms and policies described in this thesis are unique in their ability to exploit the distributed Train Algorithm in a manner that displays all four desirable qualities. Further, the mechanisms allow any number of mutator and collector threads to operate concurrently within a site; this is also a unique property amongst train-based mechanisms (distributed or otherwise). Confidence in the quality of the approach promoted in this thesis is obtained via a top-down approach. Firstly a concise behavioural model is introduced to capture fundamental requirements for safe and complete behaviour from train-based collection mechanisms. The model abstracts over the techniques previously introduced under the banner of the Train Algorithm. It serves as a self-contained template for correct train-based collection that is independent of a target object system for deployment of the algorithm. Secondly a means to instantiate the model in a distributed object system is described. The instantiation includes well-established techniques from prior literature, and via the model these are correctly refined and reorganised with new techniques to achieve asynchrony, scalability, and support for concurrency. The result is a flexible approach that allows a distributed system to exhibit a variety of local collection mechanisms and policies, while ensuring their interaction is safe, complete, asynchronous, and scalable regardless of the local choices made by each site.
Additional confidence in the properties of the new approach is obtained from implementation within a distributed object system simulation. The implementation provides some insight into the practical issues that arise through the combination of distribution, concurrent execution within sites, and train-based collection. Executions of the simulation system are used to verify that safe collection is observed at all times, and obtain evidence that asynchrony, scalability, and concurrency can be observed in practice.
Thesis (Ph.D.)--School of Computer Science, 2004.
APA, Harvard, Vancouver, ISO, and other styles
38

洪觀宇 and Roy Hung. "Time domain analysis and synthesis of cello tones based on perceptual quality and playing gestures." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1998. http://hub.hku.hk/bib/B31215348.

Full text
APA, Harvard, Vancouver, ISO, and other styles
39

Torku, Thomas K. "Takens Theorem with Singular Spectrum Analysis Applied to Noisy Time Series." Digital Commons @ East Tennessee State University, 2016. https://dc.etsu.edu/etd/3013.

Full text
Abstract:
The evolution of big data has led to financial time series becoming increasingly complex, noisy, non-stationary and nonlinear. Takens theorem can be used to analyze and forecast nonlinear time series, but even small amounts of noise can hopelessly corrupt a Takens approach. In contrast, Singular Spectrum Analysis is an excellent tool for both forecasting and noise reduction. Fortunately, it is possible to combine the Takens approach with Singular Spectrum Analysis (SSA), and in fact, estimation of key parameters in Takens theorem is performed with Singular Spectrum Analysis. In this thesis, we combine the denoising abilities of SSA with the Takens theorem approach to make the manifold reconstruction outcomes of Takens theorem less sensitive to noise. In particular, in the course of performing the SSA on a noisy time series, we branch off into a Takens theorem approach. We apply this approach to a variety of noisy time series.
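The SSA-plus-delay-embedding pipeline this abstract describes can be sketched in a few lines of numpy. This is a hypothetical illustration, not the thesis's code; the window length, rank, delay, and embedding dimension below are illustrative assumptions.

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Takens delay-coordinate embedding of a 1-D series x,
    with embedding dimension `dim` and delay `tau`."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

def ssa_denoise(x, window, rank):
    """Basic SSA: form the trajectory (Hankel) matrix, keep the leading
    `rank` singular components, and diagonal-average back to a series."""
    n = len(x)
    k = n - window + 1
    traj = np.column_stack([x[i : i + window] for i in range(k)])  # window x k
    u, s, vt = np.linalg.svd(traj, full_matrices=False)
    approx = (u[:, :rank] * s[:rank]) @ vt[:rank]  # rank-truncated matrix
    # diagonal averaging (Hankelisation) recovers a single series
    rec = np.zeros(n)
    counts = np.zeros(n)
    for j in range(k):
        rec[j : j + window] += approx[:, j]
        counts[j : j + window] += 1
    return rec / counts

rng = np.random.default_rng(0)
t = np.linspace(0, 8 * np.pi, 400)
noisy = np.sin(t) + 0.3 * rng.standard_normal(400)
smooth = ssa_denoise(noisy, window=40, rank=2)   # denoise first
emb = delay_embed(smooth, dim=3, tau=10)          # then embed
```

Embedding the denoised series rather than the raw one is the point of the combination: the reconstructed manifold is far less distorted by the noise.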
APA, Harvard, Vancouver, ISO, and other styles
40

Zhang, You-Kuan. "A quasilinear theory of time-dependent nonlocal dispersion in geologic media." Diss., The University of Arizona, 1990. http://hdl.handle.net/10150/185039.

Full text
Abstract:
A theory is presented which accounts for a particular aspect of nonlinearity caused by the deviation of plume "particles" from their mean trajectory in three-dimensional, statistically homogeneous but anisotropic porous media under an exponential covariance of log hydraulic conductivities. Quasilinear expressions for the time-dependent nonlocal dispersivity and spatial covariance tensors of ensemble mean concentration are derived, as a function of time, variance σᵧ² of log hydraulic conductivity, degree of anisotropy, and flow direction. One important difference between existing linear theories and the new quasilinear theory is that in the former transverse nonlocal dispersivities tend asymptotically to zero whereas in the latter they tend to nonzero Fickian asymptotes. Another important difference is that while all existing theories are nominally limited to situations where σᵧ² is less than 1, the quasilinear theory is expected to be less prone to error when this restriction is violated because it deals with the above nonlinearity without formally limiting σᵧ². The theory predicts a significant drop in dimensionless longitudinal dispersivity when σᵧ² is large as compared to the case where σᵧ² is small. As a consequence of this drop the real asymptotic longitudinal dispersivity, which varies in proportion to σᵧ² when σᵧ² is small, is predicted to vary as σᵧ when σᵧ² is large. The dimensionless transverse dispersivity also drops significantly at early dimensionless time when σᵧ² is large. At late time this dispersivity attains a maximum near σᵧ² = 1, varies asymptotically at a rate proportional to σᵧ² when σᵧ² is small, and appears inversely proportional to σᵧ when σᵧ² is large. The actual asymptotic transverse dispersivity varies in proportion to σᵧ⁴ when σᵧ² is small and appears proportional to σᵧ when σᵧ² is large. 
One of the most interesting findings is that when the mean seepage velocity vector μ is at an angle to the principal axes of statistical anisotropy, the orientation of longitudinal spread is generally offset from μ toward the direction of largest log hydraulic conductivity correlation scale. When local dispersion is active, a plume starts elongating parallel to μ. With time the long axis of the plume rotates toward the direction of largest correlation scale, then rotates back toward μ, and finally stabilizes asymptotically at a relatively small angle of deflection. Application of the theory to depth-averaged concentration data from the recent tracer experiment at Borden, Ontario, yields a consistent and improved fit without any need for parameter adjustment.
APA, Harvard, Vancouver, ISO, and other styles
41

Lu, Jin 1959. "Degradation processes and related reliability models." Thesis, McGill University, 1995. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=39952.

Full text
Abstract:
Reliability characteristics of new devices are usually demonstrated by life testing. When lifetime data are sparse, as is often the case with highly reliable devices, expensive devices, and devices for which accelerated life testing is not feasible, reliability models that are based on a combination of degradation and lifetime data represent an important practical approach. This thesis presents reliability models based on the combination of degradation and lifetime data or degradation data alone, with and without the presence of covariates. Statistical inference methods associated with the models are also developed.
The degradation process is assumed to follow a Wiener process. Failure is defined as the first passage of this process to a fixed barrier. The degradation data of a surviving item are described by a truncated Wiener process and lifetimes follow an inverse Gaussian distribution. Models are developed for three types of data structures that are often encountered in reliability studies, terminal point data (a combination of degradation and lifetime data) and mixed data (an extended case of terminal point data); conditional degradation data; and covariate data.
Maximum likelihood estimators (MLEs) are derived for the parameters of each model. Inferences about the parameters are based on asymptotic properties of the MLEs and on the likelihood ratio method. An analysis of deviance is presented and approximate pivotal quantities are derived for the drift and variance parameters. Predictive density functions for the lifetime and the future degradation level of either a surviving item or a new item are obtained using empirical Bayes methods. Case examples are given to illustrate the applications of the models.
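The Wiener-process degradation setup described in this abstract can be illustrated with a small numpy simulation. This is a hedged sketch, not the thesis's code: the drift, variance, barrier, and sampling interval below are made-up values, and the closed-form MLEs shown are the standard ones for equally spaced observations of a Wiener process with drift.

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma, dt, n_steps = 0.5, 0.2, 0.1, 2000

# one simulated degradation path: X_t = mu*t + sigma*W_t, X_0 = 0
increments = mu * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_steps)
path = np.concatenate([[0.0], np.cumsum(increments)])

# closed-form MLEs from equally spaced observations
dx = np.diff(path)
mu_hat = path[-1] / (n_steps * dt)                  # drift estimate
sigma2_hat = np.mean((dx - mu_hat * dt) ** 2) / dt  # variance estimate

# with a fixed barrier a, the first-passage time is inverse Gaussian
# with mean a/mu, so a plug-in estimate of mean lifetime is:
a = 10.0
mean_lifetime_hat = a / mu_hat
```

The thesis's models extend this basic picture to truncated processes for surviving items, mixed data, and covariates; the sketch only shows the underlying first-passage structure.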
APA, Harvard, Vancouver, ISO, and other styles
42

Rasoul, Ryan. "Comparison of Forecasting Models Used by The Swedish Social Insurance Agency." Thesis, Mälardalens högskola, Akademin för utbildning, kultur och kommunikation, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-49107.

Full text
Abstract:
In this degree project we compare two forecasting models with the forecasting model that was used in March 2014 by The Swedish Social Insurance Agency ("Försäkringskassan" in Swedish, or "FK"). The models are used for forecasting the number of cases. The two models compared with the model used by FK are the Seasonal Exponential Smoothing (SES) model and the Auto-Regressive Integrated Moving Average (ARIMA) model. The models are used to predict case volumes for two types of benefits: General Child Allowance, "Barnbidrag" (BB_ABB), and Pregnancy Benefit, "Graviditetspenning" (GP_ANS). The results compare the forecast errors of the different models at a short horizon (22 months) and a long horizon (70 months). Forecast error is the difference between the actual and the forecast value of case numbers received every month. The ARIMA model used in this degree project for GP_ANS had forecast errors on short and long horizons that were lower than those of the forecasting model used by FK in March 2014. However, the absolute forecast error was lower in the actually used model than in the ARIMA and SES models for pregnancy benefit cases. The results also show that for BB_ABB the forecast errors were large in all models, but lowest in the actually used model (even the absolute forecast error). This shows that random error due to laws, rules, and community changes is almost impossible to predict. Therefore, it is not feasible to predict the time series with the tested models in the long term. However, that mainly depends on what FK considers acceptable forecast errors and how those forecasts will be used. It is important to mention that the implementation of ARIMA differs across software. The best model in the software used in this degree project, SAS (Statistical Analysis System), is not necessarily the best in other software.
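The kind of out-of-sample forecast-error comparison this abstract performs can be sketched with plain numpy. This is an illustrative toy, not the project's SAS code: the AR(1)-like case-count series, the smoothing constant, and the naive benchmark are all assumptions, and mean absolute error stands in for the project's error measures.

```python
import numpy as np

def ses_forecast(y, alpha):
    """One-step-ahead simple exponential smoothing.
    f[t] is the prediction for y[t] made from y[:t]."""
    f = np.empty(len(y))
    f[0] = y[0]  # initialise with the first observation
    for t in range(1, len(y)):
        f[t] = alpha * y[t - 1] + (1 - alpha) * f[t - 1]
    return f

def mean_abs_error(y, f):
    return np.mean(np.abs(y - f))

rng = np.random.default_rng(2)
# AR(1)-like monthly case counts fluctuating around a level of 100
y = np.empty(120)
y[0] = 100.0
for t in range(1, 120):
    y[t] = 100 + 0.7 * (y[t - 1] - 100) + rng.standard_normal()

naive = np.concatenate([[y[0]], y[:-1]])  # last-value benchmark
mae_ses = mean_abs_error(y[1:], ses_forecast(y, alpha=0.3)[1:])
mae_naive = mean_abs_error(y[1:], naive[1:])
```

Comparing such error measures across candidate models, horizon by horizon, is exactly the exercise the project carries out at the 22- and 70-month horizons.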
APA, Harvard, Vancouver, ISO, and other styles
43

Imam, Bisher 1960. "Evaluation of disaggregation model in arid land stream flow generation." Thesis, The University of Arizona, 1989. http://hdl.handle.net/10150/277033.

Full text
Abstract:
A disaggregation model was tested for arid-land stream flow generation. The test was performed on data from the Black River, near Fort Apache, Arizona. The model was tested in terms of preserving the relevant historical statistics at both monthly and daily levels: the monthly time series were disaggregated into a random observation of their daily components, and the daily components were then reaggregated to yield monthly values. A computer model (DSGN) was developed to perform the model implementation. The model was written and executed on the Macintosh Plus personal computer. Data from two months were studied; the October data represented the low-flow season, while the April data represented the high-flow season. Twenty-five years of data for each month were used. The generated data for the two months were compared with the historical data.
APA, Harvard, Vancouver, ISO, and other styles
44

Alj, Abdelkamel. "Contribution to the estimation of VARMA models with time-dependent coefficients." Doctoral thesis, Universite Libre de Bruxelles, 2012. http://hdl.handle.net/2013/ULB-DIPOT:oai:dipot.ulb.ac.be:2013/209651.

Full text
Abstract:
In this thesis, we study the estimation of vector autoregressive moving-average (VARMA) models with time-dependent coefficients and a time-dependent innovation covariance matrix. These models are called tdVARMA. The elements of the coefficient matrices and of the covariance matrix are deterministic functions of time depending on a small number of parameters. The first part of the thesis is devoted to the asymptotic properties of the Gaussian quasi-maximum likelihood estimator. Almost-sure convergence and asymptotic normality of this estimator are proved under certain verifiable assumptions, in the case where the coefficients depend on time t but not on the series length n. Before that, we consider the asymptotic properties of estimators for fairly general non-stationary models, under a general penalty function, and then apply these theorems by taking the penalty function to be the Gaussian likelihood (Chapter 2). Chapter 3 studies the asymptotic behaviour of the estimator when the model coefficients depend on both t and n. In that case, we use a weak law of large numbers and a central limit theorem for martingale difference arrays, and present conditions that ensure weak consistency and asymptotic normality. The main asymptotic results are illustrated by simulation experiments and by examples from the literature. The second part of the thesis is devoted to an algorithm for evaluating the exact likelihood function of a Gaussian tdVARMA(p, q) process. Our algorithm is based on the Cholesky factorization of a partitioned band matrix. The starting point is a multivariate generalization of Mélard (1982) for evaluating the exact likelihood of a univariate ARMA(p, q) model. We also use some results of Jonasson and Ferrando (2008), as well as the Matlab programs of Jonasson (2008), developed for the Gaussian likelihood of constant-coefficient VARMA models. We further show that the number of operations required to evaluate the likelihood, as a function of p, q and n, is approximately double that of a constant-coefficient VARMA model. The implementation of this algorithm was tested by comparing its results with those of other well-known programs and software packages. VARMA models with time-dependent coefficients appear particularly well suited to the dynamics of some financial series, highlighting the time dependence of the parameters.

Doctorat en Sciences
info:eu-repo/semantics/nonPublished

APA, Harvard, Vancouver, ISO, and other styles
45

Shi, Zhenwu. "Non-worst-case response time analysis for real-time systems design." Diss., Georgia Institute of Technology, 2014. http://hdl.handle.net/1853/51827.

Full text
Abstract:
A real-time system is a system in which the correctness of operations depends not only on the logical results, but also on the time at which these results are available. A fundamental problem in designing real-time systems is to analyze the response time of operations, defined as the time elapsed from the moment when the operation is requested to the moment when it is completed. Response time analysis is challenging due to the complex dynamics among operations. A common technique is to study response time under a worst-case scenario. However, using worst-case response time may lead to conservative real-time system designs. To improve real-time system design, we analyze the non-worst-case response time of operations and apply these results in the design process. The main contribution of this thesis includes mathematical modeling of real-time systems, calculation of non-worst-case response time, and improved real-time system design. We perform analysis and design on three common types of real-time systems: real-time computing systems, real-time communication networks, and real-time energy management. For real-time computing systems, our non-worst-case response time analysis leads to a necessary and sufficient online schedulability test and a measure of robustness of real-time systems. For real-time communication networks, our non-worst-case response time analysis improves the performance of model predictive control designs based on the network. For real-time energy management, we use the non-worst-case response time to check whether a micro-grid can operate independently from the main grid.
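For context, the classical worst-case baseline that this thesis's non-worst-case analysis refines is the fixed-priority response-time recurrence R = C_i + Σ_j ⌈R/T_j⌉·C_j over higher-priority tasks j, iterated to a fixed point. Below is a minimal sketch of that standard recurrence (not the thesis's algorithm); the task set is a made-up example with deadlines equal to periods.

```python
import math

def worst_case_response_time(C, T, i):
    """Fixed-priority response-time analysis for task i, with tasks
    indexed in descending priority order. Returns the worst-case
    response time, or None if the deadline (= period) is exceeded."""
    R = C[i]
    while True:
        # interference from all higher-priority tasks j < i
        interference = sum(math.ceil(R / T[j]) * C[j] for j in range(i))
        R_next = C[i] + interference
        if R_next == R:          # fixed point reached
            return R
        if R_next > T[i]:        # unschedulable under worst case
            return None
        R = R_next

C = [1, 2, 3]    # worst-case execution times, highest priority first
T = [4, 6, 12]   # periods (deadlines assumed equal to periods)
responses = [worst_case_response_time(C, T, i) for i in range(3)]
# responses == [1, 3, 10]
```

Because the recurrence assumes every higher-priority task releases simultaneously and interferes maximally, the resulting bounds are exactly the kind of conservative figures that non-worst-case analysis aims to tighten.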
APA, Harvard, Vancouver, ISO, and other styles
46

Casas, Villalba Isabel. "Statistical inference in continuous-time models with short-range and/or long-range dependence." University of Western Australia. School of Mathematics and Statistics, 2006. http://theses.library.uwa.edu.au/adt-WU2006.0133.

Full text
Abstract:
The aim of this thesis is to estimate the volatility function of continuous-time stochastic models. The estimation of the volatility of the following well-known international stock market indexes is presented as an application: Dow Jones Industrial Average, Standard and Poor's 500, NIKKEI 225, CAC 40, DAX 30, FTSE 100 and IBEX 35. This estimation is studied from two different perspectives: a) assuming that the volatility of the stock market indexes displays short-range dependence (SRD), and b) extending the previous model to processes with long-range dependence (LRD), intermediate-range dependence (IRD) or SRD. Under the efficient market hypothesis (EMH), the compatibility of the Vasicek, the CIR, the Anh and Gao, and the CKLS models with the stock market indexes is tested. Nonparametric techniques are presented to test the affinity of these parametric volatility functions with the volatility observed from the data. Under the assumption of possible statistical patterns in the volatility process, a new estimation procedure based on Whittle estimation is proposed. This procedure is proven theoretically and empirically. In addition, its application to the stock market indexes provides interesting results.
APA, Harvard, Vancouver, ISO, and other styles
47

張立茜 and Liqian Zhang. "Optimal H2 model reduction for dynamic systems." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2000. http://hub.hku.hk/bib/B31241372.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Li, Lok-man Jennifer, and 李諾文. "Schedule delay of work trips in Hong Kong: anempirical analysis." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2008. http://hub.hku.hk/bib/B40988041.

Full text
APA, Harvard, Vancouver, ISO, and other styles
49

Britton, Matthew Scott. "Stochastic task scheduling in time-critical information delivery systems." Title page, contents and abstract only, 2003. http://web4.library.adelaide.edu.au/theses/09PH/09phb8629.pdf.

Full text
Abstract:
"January 2003" Includes bibliographical references (leaves 120-129) Presents performance analyses of dynamic, stochastic task scheduling policies for a real- time-communications system where tasks lose value as they are delayed in the system.
APA, Harvard, Vancouver, ISO, and other styles
50

Gao, Wenzhong. "New methodology for power system modeling and its application in machine modeling and simulation." Diss., Georgia Institute of Technology, 2002. http://hdl.handle.net/1853/14732.

Full text
APA, Harvard, Vancouver, ISO, and other styles