Dissertations / Theses on the topic 'Mortality – Forecasting – Mathematical models'

Consult the top 50 dissertations / theses for your research on the topic 'Mortality – Forecasting – Mathematical models.'

1

Sulemana, Hisham. "Comparison of mortality rate forecasting using the Second Order Lee–Carter method with different mortality models." Thesis, Mälardalens högskola, Akademin för utbildning, kultur och kommunikation, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-43563.

Full text
Abstract:
Mortality information is very important for national planning and for the health of a country. Mortality rate forecasting is a basic input for projecting the financial development of pension plans and for well-being and social strategy planning. In the first part of the thesis, we fit the selected mortality rate models, namely the Power-exponential function based model, the Modified Perks model and the Heligman and Pollard (HP4) model, to data obtained from the Human Mortality Database [22] for the male population ages 1–70 of the USA, Japan and Australia. We observe that the Heligman and Pollard (HP4) model performs well and fits the data better than the Power-exponential function based model and the Modified Perks model. The second part systematically compares the quality of mortality rate forecasting using the second order Lee–Carter method with the selected mortality rate models. The results indicate that the Power-exponential function based model and the Heligman and Pollard (HP4) model give more reliable forecasts, depending on the individual country.
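As context for the method named here, the second-order Lee–Carter structure adds a second bilinear term to the classical model for the log central death rate. The following is a sketch of the standard formulation (the thesis's exact notation may differ):

```latex
% Second-order Lee-Carter model for the log central death rate m_{x,t}
\ln m_{x,t} = a_x + b_x^{(1)} k_t^{(1)} + b_x^{(2)} k_t^{(2)} + \varepsilon_{x,t}
% a_x: average age profile of log mortality
% b_x^{(i)}: age-specific loadings; k_t^{(i)}: period indices,
% which are forecast with time series models to project mortality.
```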
2

Ramos, Anthony Kojo. "Forecasting Mortality Rates using the Weighted Hyndman-Ullah Method." Thesis, Mälardalens högskola, Akademin för utbildning, kultur och kommunikation, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-54711.

Full text
Abstract:
The performance of three methods of mortality modelling and forecasting is compared. These comprise the basic Lee–Carter model and two functional demographic models: the basic Hyndman–Ullah and the weighted Hyndman–Ullah. Using age-specific data from the Human Mortality Database for two developed countries, France and the UK (England & Wales), these methods are compared through within-sample forecasting for the years 1999–2018. The weighted Hyndman–Ullah method is adjudged superior among the three through a comparison of mean forecast errors and qualitative inspection of the datasets of the selected countries. The weighted HU method is then used to conduct a 32-year-ahead forecast to the year 2050.
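For reference, the Hyndman–Ullah approach decomposes smoothed log mortality curves with functional principal components; a sketch under standard assumptions (K and the weighting scheme are modelling choices, not taken from the thesis):

```latex
% Functional demographic model for the mortality curve in year t
\ln m_t(x) = \mu(x) + \sum_{k=1}^{K} \beta_{t,k} \phi_k(x) + e_t(x)
% \mu(x): mean log mortality curve; \phi_k(x): principal component
% basis functions; \beta_{t,k}: scores forecast with univariate models.
% The weighted variant downweights older years, e.g. with geometrically
% decaying weights w_t = \kappa (1 - \kappa)^{n - t} when estimating
% \mu(x) and \phi_k(x).
```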
3

Boulougari, Andromachi. "Application of a power-exponential function based model to mortality rates forecasting." Thesis, Mälardalens högskola, Akademin för utbildning, kultur och kommunikation, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-39921.

Full text
Abstract:
The modelling of a law of mortality has attracted consistent interest from a large majority of researchers, and many models have been proposed over the years. The first aim of this thesis is to systematically evaluate a selection of models (Modified Perks, Heligman-Pollard and Power-exponential) to determine their relative strengths and weaknesses with regard to forecasting the mortality rate using the Lee-Carter model. The second aim is to fit mortality data from the USA, Sweden and Greece with the selected models, using numerical curve-fitting techniques based on the non-linear least squares method. The results indicate that the Heligman-Pollard model performs better, especially when the phenomenon of the "accident hump" occurs during adulthood.
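For orientation, the Heligman-Pollard family models the odds of death at age x as the sum of a childhood term, an adult "accident hump" term and a senescent term; the first law is sketched below (HP4 is a variant of this family, so its exact terms may differ):

```latex
% Heligman-Pollard (first law): odds of death at age x
\frac{q_x}{1 - q_x} = A^{(x + B)^C} + D e^{-E (\ln x - \ln F)^2} + G H^x
% First term: early-childhood mortality; second term: the "accident
% hump" in early adulthood; third term: Gompertz-like old-age mortality.
```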
4

Putnam, Douglas Alan. "Forecasting for local water management." PDXScholar, 1985. https://pdxscholar.library.pdx.edu/open_access_etds/3540.

Full text
Abstract:
Forecast models are investigated and developed for use in local water management to aid in determining short-term water requirements and availability. The forecast models include precipitation occurrence and depth using a Markov chain model, temperature and solar radiation with a multivariate autoregressive model, and streamflow with autoregressive moving average models. The precipitation, temperature and solar radiation forecasts are used with a soil moisture model to determine water demands. A state-space approach to the Muskingum-Cunge streamflow routing technique is developed. The forecast water demands and streamflow forecasts are used as inputs to this routing model. Forecast model errors, and the propagation of these errors from one model into the next, are investigated.
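As an illustration of the first component, a first-order two-state Markov chain for daily precipitation occurrence can be simulated as below; the transition probabilities and the gamma depth distribution are assumptions for the sketch, not values from the thesis:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical transition probabilities:
# P(wet tomorrow | dry today) and P(wet tomorrow | wet today).
p_wet_given_dry, p_wet_given_wet = 0.25, 0.65

def simulate_occurrence(n_days, wet=False):
    """Simulate a wet/dry sequence from a first-order Markov chain."""
    seq = []
    for _ in range(n_days):
        p = p_wet_given_wet if wet else p_wet_given_dry
        wet = rng.random() < p
        seq.append(wet)
    return np.array(seq)

# Depths on wet days drawn from a skewed (gamma) distribution,
# a common choice; the thesis's depth model may differ.
occ = simulate_occurrence(30)
depths = np.where(occ, rng.gamma(shape=0.8, scale=6.0, size=occ.size), 0.0)
print(depths.round(1))
```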
5

Zbib, Imad J. (Imad Jamil). "Sales Forecasting Accuracy Over Time: An Empirical Investigation." Thesis, University of North Texas, 1991. https://digital.library.unt.edu/ark:/67531/metadc332526/.

Full text
Abstract:
This study investigated forecasting accuracy over time. Several quantitative and qualitative forecasting models were tested and a number of combinational methods were investigated. Six time series methods, one causal model, and one subjective technique were compared in this study. Six combinational forecasts were generated and compared to the individual forecasts, and a combining technique was developed. Thirty data sets, obtained from a market leader in the cosmetics industry, were used to forecast sales. All series represent monthly sales from January 1985 to December 1989. Gross sales forecasts from January 1988 to June 1989 were generated by the company using econometric models. All data sets exhibited seasonality and trend.
6

Nyulu, Thandekile. "Weather neutral models for short-term electricity demand forecasting." Thesis, Nelson Mandela Metropolitan University, 2013. http://hdl.handle.net/10948/d1018751.

Full text
Abstract:
Energy demand forecasting, and specifically electricity demand forecasting, is a fundamental feature in both industry and research. Forecasting techniques assist all electricity market participants in accurate planning and in selling and purchasing decisions and strategies. Generation and distribution of electricity require appropriate, precise and accurate forecasting methods, and accurate forecasting models also assist producers, researchers and economists in making proper and beneficial future decisions. Several research papers investigate this fundamental aspect and attempt various statistical techniques. Although weather and economic effects have significant influences on electricity demand, in this study they are purposely excluded from the investigation. This research considers calendar-related effects such as months of the year, weekdays and holidays (that is, public holidays, the day before a public holiday, the day after a public holiday, school holidays, university holidays, Easter holidays and major religious holidays) and includes university exams, general election days, the day after elections, and municipal elections in the analysis. Regression analysis, categorical regression and auto-regression are used to illustrate the relationships between the response variable and the explanatory variables. The main objective of the investigation was to build forecasting models based on this calendar data only and to observe how accurate the models can be without taking weather and economic effects into account, hence weather-neutral models. Weather and economic factors themselves have to be forecast, and those forecasts are not very accurate, whereas calendar events are known with certainty (error-free). Collecting data for weather and economic factors is costly and time consuming, while obtaining calendar data is relatively easy.
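To make the weather-neutral idea concrete, a minimal sketch of a calendar-only regression follows; the variable names, dummy set and effect sizes are invented for illustration and are not from the thesis:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)

# Hypothetical daily demand driven only by calendar effects.
idx = pd.date_range("2010-01-01", periods=730, freq="D")
demand = 1000 + rng.normal(0, 30, idx.size)
demand = demand + np.where(idx.dayofweek >= 5, -120.0, 0.0)  # weekend dip
demand = demand + np.where(idx.month == 12, 60.0, 0.0)       # holiday bump

# Dummy-encode months and weekdays, then fit ordinary least squares.
cal = pd.DataFrame({"month": idx.month.astype(str),
                    "dow": idx.dayofweek.astype(str)})
X = pd.get_dummies(cal, drop_first=True).astype(float)
X.insert(0, "const", 1.0)
beta, *_ = np.linalg.lstsq(X.values, demand, rcond=None)
fitted = X.values @ beta
rmse = float(np.sqrt(np.mean((demand - fitted) ** 2)))
print("in-sample RMSE:", round(rmse, 1))
```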
7

Lu, Zhen Cang. "Price forecasting models in online flower shop implementation." Thesis, University of Macau, 2017. http://umaclib3.umac.mo/record=b3691395.

Full text
8

Venter, Daniel Jacobus Lodewyk. "The consolidation of forecasts with regression models." Thesis, Nelson Mandela Metropolitan University, 2014. http://hdl.handle.net/10948/d1020964.

Full text
Abstract:
The primary objective of this study was to develop a dashboard for the consolidation of multiple forecasts utilising a range of multiple linear regression models. The term dashboard is used to describe with a single word the characteristics of the forecasts consolidation application that was developed to provide the required functionalities via a graphical user interface structured as a series of interlinked screens. Microsoft Excel© was used as the platform to develop the dashboard named ConFoRM (acronym for Consolidate Forecasts with Regression Models). The major steps of the consolidation process incorporated in ConFoRM are: 1. Input historical data. 2. Select appropriate analysis and holdout samples. 3. Specify regression models to be considered as candidates for the final model to be used for the consolidation of forecasts. 4. Perform regression analysis and holdout analysis for each of the models specified in step 3. 5. Perform post-holdout testing to assess the performance of the model with the best holdout validation results on out-of-sample data. 6. Consolidate forecasts. Two data transformations are available: the removal of growth and time-period effects from the time series, and a translation of the time series by subtracting the mean of all the forecasts for data record i from the variable being predicted and its related forecasts for each data record i. The pre-defined ordinary least squares linear regression models (LRMs) are: a. a set of k simple LRMs, one for each of the k forecasts; b. a multiple LRM that includes all the forecasts; c. a multiple LRM that includes all the forecasts and as many of the first-order interactions between the input forecasts as allowed by the sample size and the maximum number of predictors provided by the dashboard, with the interactions included in the model being those with the highest individual correlation with the variable being predicted; d. a multiple LRM that includes as many of the forecasts and first-order interactions between the input forecasts as allowed by the sample size and the maximum number of predictors provided by the dashboard, with the forecasts and interactions included in the model being those with the highest individual correlation with the variable being predicted; e. a simple LRM with the predictor variable being the mean of the forecasts; f. a set of simple LRMs with the predictor variable in each case being the weighted mean of the forecasts, with different formulas for the weights. Also available is an ad hoc user-specified model in terms of the forecasts and the predictor variables generated by the dashboard for the pre-defined models. Provision is made in the regression analysis for both forward entry and backward removal regression. Weighted least squares (WLS) regression can optionally be performed, based on the age of the forecasts, with smaller weights for older forecasts.
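A minimal sketch of the core idea, consolidating k competing forecasts with a multiple LRM and checking it on a holdout sample (model "b" above); the data are synthetic and the split is arbitrary:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy data: k = 3 forecasts of an outcome y, with different error levels.
n, k = 60, 3
y = rng.normal(100, 10, n)
F = y[:, None] + rng.normal(0.0, [4.0, 6.0, 8.0], (n, k))

# Steps 1-2: analysis (training) and holdout samples.
train, hold = slice(0, 45), slice(45, None)

# Model "b": a multiple LRM including all forecasts, y ~ 1 + F.
X = np.column_stack([np.ones(n), F])
beta, *_ = np.linalg.lstsq(X[train], y[train], rcond=None)

# Step 4: holdout validation of the consolidated forecast.
rmse = lambda e: float(np.sqrt(np.mean(e ** 2)))
print("consolidated RMSE:", rmse(y[hold] - X[hold] @ beta))
print("best single RMSE:", min(rmse(y[hold] - F[hold, j]) for j in range(k)))
```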
9

Chan, Johnson Lap-Kay. "Numerical procedure for potential flow problems with a free surface." Thesis, University of British Columbia, 1987. http://hdl.handle.net/2429/28637.

Full text
Abstract:
A numerical procedure based upon a boundary integral method for gravity wave making problems is studied in the time domain. The free-surface boundary conditions are combined and expressed in a Lagrangian notation to follow the free-surface particle's motion in time. The corresponding material derivative term is approximated by a finite difference expression, and the velocity terms are extrapolated in time for the completion of the formulations. The fluid-body intersection position at the free surface is predicted by an interpolation function that requires information from both the free surface and the submerged surface conditions. Solutions corresponding to a linear free-surface condition and to a non-linear free-surface condition are obtained at small time increment values. Numerical modelling of surface wave problems is studied in two dimensions and in three dimensions. Comparisons are made to linear analytical solutions as well as to published experimental results. Good agreement between the numerical solutions and measured values is found. For the modelling of a three dimensional wave diffraction problem, results at high wave amplitude are restricted because of the use of quadrilateral elements. The near cylinder region of the free surface is not considered to be well represented because of the coarse element size. Wave forces calculated on the vertical cylinder are found to be affected by the modelled tank length. When the simulated wave length is comparable to the wave tank's dimension, numerical results are found to be less than the experimental measurements. However, when the wave length is shorter than the tank's length, solutions are obtained with very good precision.
Faculty of Applied Science, Department of Mechanical Engineering, Graduate
10

Yan, Tsz-leung, and 甄子良. "Spatio-temporal modeling and forecasting of air quality data." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2014. http://hdl.handle.net/10722/197498.

Full text
Abstract:
Respirable Suspended Particulate (RSP) time series data sampled in an air quality monitoring network are found to be strongly correlated, varying in highly similar patterns. This study provides a methodology for spatio-temporal modeling and forecasting of multiple RSP time series, in which the dynamic spatial correlations amongst the series can be effectively utilized. The efficacy of the Spatio-Temporal Dynamic Harmonic Regression (STDHR) model is demonstrated. Based on the decomposition of the observed time series into trend and periodic components, the model is capable of forecasting RSP data series that exhibit varying patterns during air pollution episodes and typhoons with dynamic weather conditions. It is also capable of producing spatial predictions of RSP time series at up to three unobserved sites. The Noise-variance-ratio (NVR) form of the multivariate recursive algorithm ((M2) algorithm) derived by the author can greatly facilitate its practical application in both multivariate and univariate time series analysis. The (M2) algorithm allows the spatial correlations to be specified at parametric levels. The state-space (SS) model formulation can flexibly accommodate the existing inter- or intra- (auto)correlations amongst the parameters of the data series. Applications of the variance intervention (VI) are exploited and illustrated with a real-life case study that involves forecasting of RSP data series during an air pollution episode. This illustrates that time series with abrupt changes can be predicted by automatic implementation of the VI approach. The present study also extended the anisotropic Matern model to estimate the dynamic spatial correlation structure of the air quality data by using mean wind speed and prevailing wind direction to define the spatial anisotropy. The Anisotropic Matern model by Mean Wind Speed and Prevailing Wind Direction (AMMP) model devised by the author avoids a huge computational burden in estimating the variogram at every variation of the underlying spatial structure. Finally, the findings of this dissertation lay the foundation for further research on multiple time series analysis and estimation of dynamic spatial structure.
Doctor of Philosophy, Geography
11

Tsakou, Katina. "Essays on financial volatility forecasting." Thesis, University of Stirling, 2016. http://hdl.handle.net/1893/25403.

Full text
Abstract:
The accurate estimation and forecasting of volatility is of utmost importance for anyone who participates in the financial market, as it affects the whole financial system and, consequently, the whole economy. It has been a popular subject of research with no general conclusion as to which model provides the most accurate forecasts. This thesis enters the ongoing debate by assessing and comparing the forecasting performance of popular volatility models. Moreover, the role of key parameters of volatility is evaluated in improving the forecast accuracy of the models. For these purposes a number of US and European stock indices are used. The main contributions are four. First, I find that implied volatility can itself be forecast, and that combining the information of implied volatility and GARCH models better predicts future volatility. Second, the GARCH class of models is superior to the stochastic volatility models in forecasting the one-, five- and twenty-two-day-ahead volatility. Third, when the realised volatility is modelled and forecast directly using time series, I find that the HAR model performs better than the ARFIMA. Finally, I find that the leverage effect and implied volatility significantly improve the fit and forecasting performance of all the models.
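For reference, the HAR model mentioned here regresses future realised volatility on daily, weekly and monthly averages of past realised volatility; a sketch of the standard Corsi-type formulation (notation illustrative):

```latex
% Heterogeneous autoregressive (HAR) model of realised volatility
RV_{t+h} = \beta_0 + \beta_d RV_t + \beta_w \overline{RV}_t^{(5)}
         + \beta_m \overline{RV}_t^{(22)} + \varepsilon_{t+h}
% \overline{RV}_t^{(5)}, \overline{RV}_t^{(22)}: averages of daily RV over
% the past 5 and 22 trading days, aligning with the one-, five- and
% twenty-two-day horizons compared in the thesis.
```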
12

Cheng, Xin. "Three essays on volatility forecasting." HKBU Institutional Repository, 2010. http://repository.hkbu.edu.hk/etd_ra/1183.

Full text
13

Yue, Yang, and 樂陽. "Spatial-temporal dependency of traffic flow and its implications for short-term traffic forecasting." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2006. http://hub.hku.hk/bib/B35507366.

Full text
14

Ankrah, Samuel K. O. "A case study of short-run forecasting of commodity prices : an application of autoregressive integrated moving average models." Thesis, McGill University, 1991. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=61112.

Full text
Abstract:
That Ghana derives her foreign exchange earnings mainly from cocoa and gold exports cannot be overemphasised. There is therefore the need to forecast the prices of these commodities as accurately as possible for proper planning and execution of major policies, since the prices have been notoriously volatile during the past two decades and attempts to stabilize the price of the beans in particular (which contribute about 60% of the country's foreign exchange) through the system of buffer stock and export restrictions have not been successful. In this regard, autoregressive integrated moving average models are built and used to generate short-run forecasts for the beans and the precious metal price series. These models are simple to build and appear not only to describe the behaviour of the series but also to provide good forecasts of the prices.
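As a sketch of the modelling approach described, an ARIMA model can be fitted and used for short-run forecasts as below; the series is synthetic and the (1,1,1) order is an assumption, since the thesis identifies orders from the actual cocoa and gold price data:

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(3)

# Synthetic monthly price series standing in for a commodity price.
prices = 1200 + np.cumsum(rng.normal(0, 25, 240))

# ARIMA(1,1,1): difference once, then fit AR(1) + MA(1) terms.
fit = ARIMA(prices, order=(1, 1, 1)).fit()
print(fit.forecast(steps=6))  # six-step-ahead point forecasts
```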
15

Schwann, Gregory Michael. "Housing demand : an empirical intertemporal model." Thesis, University of British Columbia, 1987. http://hdl.handle.net/2429/27526.

Full text
Abstract:
I develop an empirical model of housing demand which is based as closely as possible on a theoretical intertemporal model of consumer demand. In the empirical model, intertemporal behavior by households is incorporated in two ways. First, a household's expected length of occupancy in a dwelling is a parameter in the model; thus, households are able to choose when to move. Second, a household's decision to move and its choice of dwelling are based on the same intertemporal utility function. The parameters of the utility function are estimated using a switching regression model in which the decision to move and the choice of housing quantity are jointly determined. The model has four other features: (1) a characteristics approach to housing demand is taken, (2) the transaction costs of changing dwellings are incorporated in the model, (3) sample data on household mortgages are employed in computing the user cost of owned dwellings, and (4) demographic variables are incorporated systematically into the household utility function. Rosen's two-step procedure is used to estimate the model. Cragg's technique for estimating regressions in the presence of heteroscedasticity of unknown form is used to estimate the hedonic regressions in step one of the procedure. In the second step, the switching regression model is estimated by maximum likelihood. A micro data set of 2,513 Canadian households is used in the estimations. The stage one hedonic regressions indicate that urban housing markets are not in long run equilibrium, that the errors of the hedonic regressions are heteroscedastic, and that simple functional forms for hedonic regressions may perform as well as more complex forms. The stage two estimates establish that a tight link between the theoretical and empirical models of housing demand produces a better model. My results show that conventional static models of housing demand are misspecified. They indicate that households have vastly different planned lengths of dwelling occupancy. They also indicate that housing demand is determined to a great extent by demographic factors.
Faculty of Arts, Vancouver School of Economics, Graduate
16

Mazviona, Batsirai Winmore. "Volatility forecasting using Double-Markov switching GARCH models under skewed Student-t distribution." Master's thesis, University of Cape Town, 2012. http://hdl.handle.net/11427/12344.

Full text
Abstract:
Includes bibliographical references.
This thesis focuses on forecasting the volatility of daily returns using a double Markov switching GARCH model with a skewed Student-t error distribution. The model was applied to individual shares obtained from the Johannesburg Stock Exchange (JSE). The Bayesian approach, which uses Markov chain Monte Carlo, was used to estimate the unknown parameters in the model. The double Markov switching GARCH model was compared to a GARCH(1,1) model. Value-at-risk thresholds and violation ratios were computed, leading to a ranking of the GARCH and double Markov switching GARCH models. The results showed that the double Markov switching GARCH model performs similarly to the GARCH model based on the ranking technique employed in this thesis.
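A sketch of one common regime-switching GARCH parameterisation with skewed Student-t innovations; the thesis's double Markov switching specification may differ in which equations switch:

```latex
% Returns: r_t = \mu_{S_t} + \varepsilon_t, \quad \varepsilon_t = \sigma_t z_t,
% \quad z_t \sim \text{skewed Student-}t(\nu, \lambda)
\sigma_t^2 = \omega_{S_t} + \alpha_{S_t} \varepsilon_{t-1}^2 + \beta_{S_t} \sigma_{t-1}^2
% S_t \in \{1, 2\}: unobserved Markov chain governing the regime, with
% transition probabilities estimated jointly with the GARCH parameters
% via the Bayesian MCMC approach described above.
```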
17

De, Antonio Liedo David. "Structural models for macroeconomics and forecasting." Doctoral thesis, Universite Libre de Bruxelles, 2010. http://hdl.handle.net/2013/ULB-DIPOT:oai:dipot.ulb.ac.be:2013/210142.

Full text
Abstract:
This thesis is composed of three independent papers that investigate central debates in empirical macroeconomic modeling.

Chapter 1, entitled “A Model for Real-Time Data Assessment with an Application to GDP Growth Rates”, provides a model for the data revisions of macroeconomic variables that distinguishes between rational expectation updates and noise corrections. Thus, the model encompasses the two polar views regarding the publication process of statistical agencies: noise versus news. Most of the previous studies that analyze data revisions are based on the classical noise and news regression approach introduced by Mankiw, Runkle and Shapiro (1984). The problem is that the statistical tests available do not formulate both extreme hypotheses as collectively exhaustive, as recognized by Aruoba (2008); that is, it would be possible to reject or accept both of them simultaneously. In turn, the model for the DPP presented here allows for the simultaneous presence of both noise and news. While the “regression approach” followed by Faust et al. (2005), along the lines of Mankiw et al. (1984), identifies noise in the preliminary figures, it is not possible for them to quantify it, as done by our model.

The second and third chapters acknowledge the possibility that macroeconomic data are measured with errors, but the approach followed to model the mismeasurement is extremely stylized and does not capture the complexity of the revision process described in the first chapter.

Chapter 2, entitled “Revisiting the Success of the RBC model”, proposes the use of dynamic factor models as an alternative to the VAR-based tools for the empirical validation of dynamic stochastic general equilibrium (DSGE) theories. Along the lines of Giannone et al. (2006), we use the state-space parameterisation of the factor models proposed by Forni et al. (2007) as a competitive benchmark that is able to capture weak statistical restrictions that DSGE models impose on the data. Our empirical illustration compares the out-of-sample forecasting performance of a simple RBC model augmented with a serially correlated noise component against several specifications belonging to classes of dynamic factor and VAR models. Although the performance of the RBC model is comparable to that of the reduced form models, a formal test of predictive accuracy reveals that the weak restrictions are more useful at forecasting than the strong behavioral assumptions imposed by the microfoundations in the model economy.

The last chapter, “What are Shocks Capturing in DSGE modeling”, contributes to current debates on the use and interpretation of larger DSGE models. A recent tendency in academic work and at central banks is to develop and estimate large DSGE models for policy analysis and forecasting. These models typically have many shocks (e.g. Smets and Wouters, 2003, and Adolfson, Laseen, Linde and Villani, 2005). On the other hand, empirical studies point out that few large shocks are sufficient to capture the covariance structure of macro data (Giannone, Reichlin and Sala, 2005; Uhlig, 2004). In this chapter, we propose to reconcile both views by considering an alternative DSGE estimation approach which models the statistical agency explicitly, along the lines of Sargent (1989). This enables us to distinguish whether the exogenous shocks in DSGE modeling are structural or instead serve the purpose of fitting the data in the presence of misspecification and measurement problems. When applied to the original Smets and Wouters (2007) model, we find that the explanatory power of the structural shocks decreases at high frequencies. This allows us to back out a smoother measure of the natural output gap than that resulting from the original specification.

Doctorate in Economics and Management Sciences

18

Betton, Sandra Ann. "Bankruptcy : a proportional hazard approach." Thesis, University of British Columbia, 1987. http://hdl.handle.net/2429/26056.

Full text
Abstract:
The recent dramatic increase in the corporate bankruptcy rate, coupled with a similar rate of increase in the bank failure rate, has re-awakened investor, lender and government interest in the area of bankruptcy prediction. Bankruptcy prediction models are of particular value to a firm's current and future creditors, who often do not have the benefit of an actively traded market in the firm's securities from which to make inferences about the debtor's viability. The models commonly used by many experts in an endeavour to predict the possibility of disaster are outlined in this paper. The proportional hazard model, pioneered by Cox [1972], assumes that the hazard function, the risk of failure given that failure has not already occurred, is a function of various explanatory variables and estimated coefficients, multiplied by an arbitrary and unknown function of time. The Cox proportional hazard model is usually applied in medical studies but has recently been applied to the bank failure question [Lane, Looney & Wansley, 1986]. The model performed well in the narrowly defined, highly regulated banking industry. The principal advantage of this approach is that the model incorporates both the survival times observed and any censoring of data, thereby using more of the available information in the analysis. Unlike many bankruptcy prediction models, such as logit and probit based regression models, the Cox model estimates the probability distribution of survival times. The proportional hazard model would, therefore, appear to offer a useful addition to the more traditional bankruptcy prediction models mentioned above. This paper evaluates the applicability of the Cox proportional hazard model in the more diverse industrial environment. In order to test this model, a sample of 109 firms was selected from the Compustat Industrial and Research Industrial data tapes. Forty-one of these firms filed petitions under the various bankruptcy acts applicable between 1972 and 1985 and were matched to 67 firms which had not filed petitions for bankruptcy during the same period. In view of the dramatic changes in the bankruptcy regulatory environment caused by the Bankruptcy Reform Act of 1978, the legal framework of the bankruptcy process was also examined. The performance of the estimated Cox model was then evaluated by comparing its classification and descriptive capabilities to those of an estimated discriminant analysis based model. The results of this study indicate that while the classification capability of the Cox model was less than that of discriminant analysis, the model provides additional information beyond that available from the discriminant analysis.
Sauder School of Business, Graduate
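The abstract's verbal description of the Cox model corresponds to the standard formulation below, where the hazard is an arbitrary baseline function of time scaled by covariate effects:

```latex
% Cox proportional hazard model
h(t \mid \mathbf{x}) = h_0(t) \exp(\mathbf{x}^{\top} \boldsymbol{\beta})
% h_0(t): arbitrary, unknown baseline hazard (the "arbitrary and
% unknown function of time"); \mathbf{x}: explanatory variables;
% \boldsymbol{\beta}: estimated coefficients.
```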
19

Zhu, Jiasong, and 朱家松. "A self-learning short-term traffic forecasting system through dynamic hybrid approach." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2007. http://hub.hku.hk/bib/B39634516.

Full text
20

Ye, Qing, and 叶青. "Short-term traffic speed forecasting based on data recorded at irregular intervals." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2011. http://hub.hku.hk/bib/B47250732.

Full text
Abstract:
Efficient and comprehensive forecasting of information is of great importance to traffic management. Three types of forecasting methods based on irregularly spaced data—for situations when traffic detectors cannot be installed to generate regularly spaced data on all roads—are studied in this thesis, namely, the single segment forecasting method, multi-segment forecasting method and model-based forecasting method. The proposed models were tested using Global Positioning System (GPS) data from 400 Hong Kong taxis collected within a 2-kilometer section on Princess Margaret Road and Hong Chong Road, approaching the Cross Harbour Tunnel. The speed limit for the road is 70 km/h. It has flyovers and ramps, with a small number of merges and diverges. There is no signalized intersection along this road section. A total of 14 weeks of data were collected, in which the first 12 weeks of data were used to calibrate the models and the last two weeks of data were used for validation. The single-segment forecasting method for irregularly spaced data uses a neural network to aggregate the predicted speeds from the naive method, simple exponential smoothing method and Holt’s method, with explicit consideration of acceleration information. The proposed method shows a great improvement in accuracy compared with using the individual forecasting method separately. The acceleration information, which is viewed as an indicator of the phase-transition effect, is considered to be the main contribution to the improvement. The multi-segment forecasting method aggregates not only the information from the current forecasting segment, but also from adjacent segments. It adopts the same sub-methods as the single-segment forecasting method. The forecasting results from adjacent segments help to describe the phase-transition effect, so that the forecasting results from the multi-segment forecasting method are more accurate than those that are obtained from the single segment forecasting method. For one-second forecasting length, the correlation coefficient between the forecasts from the multi-segment forecasting method and observations is 0.9435, which implies a good consistency between the forecasts and observations. While the first two methods are based on pure data fitting techniques, the third method is based on traffic models and is called the model-based forecasting method. Although the accuracy of the one-second forecasting length of the model-based method lies between those of the single-segment and multi-segment forecasting methods, its accuracy outperforms the other two for longer forecasting steps, which offers a higher potential for practical applications.
Master of Philosophy, Civil Engineering
21

Lu, Guanhua. "Asymptotic theory for multiple-sample semiparametric density ratio models and its application to mortality forecasting." College Park, Md.: University of Maryland, 2007. http://hdl.handle.net/1903/7615.

Full text
Abstract:
Thesis (Ph. D.) -- University of Maryland, College Park, 2007.
Thesis research directed by: Dept. of Mathematics. Title from t.p. of PDF. Includes bibliographical references. Published by UMI Dissertation Services, Ann Arbor, Mich. Also available in paper.
22

Hatzopoulos, Peter. "Statistical and mathematical modelling for mortality trends and the comparison of mortality experiences, through generalised linear models and GLIM." Thesis, City University London, 1997. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.364032.

Full text
23

Conatser, Dean G. "Forecasting U.S. Marine Corps reenlistments by military occupational specialty and grade." Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 2006. http://library.nps.navy.mil/uhtbin/hyperion/06Sep%5FConatser.pdf.

Full text
Abstract:
Thesis (M.S. in Operations Research)--Naval Postgraduate School, September 2006.
Thesis Advisor(s): Ronald D. Fricker. "September 2006." Includes bibliographical references (p. 49-50). Also available in print.
24

Hildebrand, Paul. "The use of absorbing boundaries in the analysis of bankruptcy." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1998. http://www.collectionscanada.ca/obj/s4/f2/dsk2/ftp02/NQ34550.pdf.

Full text
25

Zhai, Yusheng. "Time series forecasting competition among three sophisticated paradigms." Electronic version (Microsoft Word), 2005. http://dl.uncw.edu/etd/2005/zhaiy/yushengzhai.html.

Full text
26

Ma, Chin-wan Raymond, and 馬展雲. "A study on the beta coefficients of securities in Hong Kong." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1989. http://hub.hku.hk/bib/B31976050.

Full text
27

Modugno, Michèle. "Essays on real-time econometrics and forecasting." Doctoral thesis, Universite Libre de Bruxelles, 2011. http://hdl.handle.net/2013/ULB-DIPOT:oai:dipot.ulb.ac.be:2013/209841.

Full text
Abstract:
The thesis contains four essays covering topics in the field of real time econometrics and forecasting.

The first Chapter, entitled “An area wide real time data base for the euro area” and coauthored with Domenico Giannone, Jerome Henry and Magda Lalik, describes how we constructed a real time database for the euro area covering more than 200 series regularly published in the European Central Bank Monthly Bulletin, as made available ahead of publication to the Governing Council members before their first meeting of the month.

Recent research has emphasised that the data revisions can be large for certain indicators and can have a bearing on the decisions made, as well as affect the assessment of their relevance. It is therefore key to be in a position to reconstruct the historical environment of economic decisions at the time they were made by private agents and policy-makers rather than using the data as they become available some years later. For this purpose, it is necessary to have the information in the form of all the different vintages of data as they were published in real time, the so-called "real-time data" that reflect the economic situation at a given point in time when models are estimated or policy decisions made.

We describe the database in details and study the properties of the euro area real-time data flow and data revisions, also providing comparisons with the United States and Japan. We finally illustrate how such revisions can contribute to the uncertainty surrounding key macroeconomic ratios and the NAIRU.

The second Chapter entitled “Maximum likelihood estimation of large factor model on datasets with arbitrary pattern of missing data” is based on a joint work with Marta Banbura. It proposes a methodology for the estimation of factor models on large cross-sections with a general pattern of missing data. In contrast to Giannone et al (2008), we can handle datasets that are not only characterised by a 'ragged edge', but can include e.g. mixed frequency or short history indicators. The latter is particularly relevant for the euro area or other young economies, for which many series have been compiled only since recently. We adopt the maximum likelihood approach, which, apart from the flexibility with regard to the pattern of missing data, is also more efficient and allows imposing restrictions on the parameters. It has been shown by Doz et al (2006) to be consistent, robust and computationally feasible also in the case of large cross-sections. To circumvent the computational complexity of a direct likelihood maximisation in the case of large cross-section, Doz et al (2006) propose to use the iterative Expectation-Maximisation (EM) algorithm. Our contribution is to modify the EM steps to the case of missing data and to show how to augment the model in order to account for the serial correlation of the idiosyncratic component. In addition, we derive the link between the unexpected part of a data release and the forecast revision and illustrate how this can be used to understand the sources of the latter in the case of simultaneous releases.

We use this methodology for short-term forecasting and backdating of the euro area GDP on the basis of a large panel of monthly and quarterly data.

The third Chapter is entitled “Nowcasting Inflation Using High Frequency Data” and it proposes a methodology for nowcasting and forecasting inflation using data with sampling frequency higher than monthly. In particular, this Chapter focuses on the energy component of inflation given the availability of data like the Weekly Oil Bulletin Price Statistics for the euro area, the Weekly Retail Gasoline and Diesel Prices for the US and the daily spot and future prices of crude oil.

Although nowcasting inflation is a novel idea, there is a rather long literature focusing on nowcasting GDP. The use of higher frequency indicators in order to Nowcast/Forecast lower frequency indicators had started with monthly data for GDP. GDP is a quarterly variable released with a substantial time delay (e.g. two months after the end of the reference quarter for the euro area GDP).

The estimation adopts the methodology described in Chapter 2, modeling the data as a trading day frequency factor model with missing observations in a state space representation. In contrast to other procedures, the methodology proposed models all the data within a unified single framework that allows one to produce forecasts of all the involved variables from a factor model, which, by definition, does not suffer from overparametrisation. Moreover, this offers the possibility to disentangle model-based "news" from each release and then to assess their impact on the forecast revision. The Chapter provides an illustrative example of this procedure, focusing on a specific month.

In order to assess the importance of using high frequency data for forecasting inflation, this Chapter compares the forecast performance of univariate models, i.e. the random walk and an autoregressive process, with the forecast performance of the model that uses weekly and daily data. The empirical evidence provided shows that exploiting high frequency data on oil not only lets us nowcast and forecast the energy component of inflation with a precision twice as good as the proposed benchmarks, but also yields a similar improvement for total inflation.

The fourth Chapter entitled “The forecasting power of international yield curve linkages”, coauthored with Kleopatra Nikolaou, investigates dependency patterns between the yield curves of Germany and the US, by using an out-of-sample forecast exercise.

The motivation for this Chapter stems from the fact that our up to date knowledge on dependency patterns among yields curves of different countries is limited. Looking at the yield curve literature, the empirical evidence to-date informs us of strong contemporaneous interdependencies of yield curves across countries, in line with increased globalization and financial integration. Nevertheless, this yield curve literature does not investigate non-contemporaneous correlations. And yet, clear indication in favour of such dependency patterns is recorded in studies focusing on specific interest rates, which look at the role of certain countries as global players (see Frankel et al. (2004), Chinn and Frankel (2005) and Wang et al. (2007)). Evidence from these studies suggests a leading role for the US. Moreover, dependency patterns recorded in the real business cycles between the US and the euro area (Giannone and Reichlin, 2007) can also rationalize such linkages, to the extent that output affects nominal interest rates.

We propose, estimate and forecast (out-of-sample) a novel dynamic factor model for the yield curve, where dynamic information from foreign yield curves is introduced into domestic yield curve forecasts. This is the International Dependency Model (IDM). We want to compare the yield curve forecast under the IDM versus a purely domestic model and a model that allows for contemporaneous common global factors. These models serve as useful comparisons. The domestic model bears direct modeling links with IDM, as it can be seen as a nested model of IDM. The global model bears less direct links in terms of modeling, but, in line with IDM, it is also an international model that serves to highlight the advantages of introducing international information in yield curve forecasts. However, the global model aims to identify contemporaneous linkages in the yield curve of the two countries, whereas the IDM also allows for detecting dependency patterns.

Our results show that shocks appear to be diffused in a rather asymmetric manner across the two countries. Namely, we find a unidirectional causality effect that runs from the US to Germany. This effect is stronger in the last ten years, where out-of-sample forecasts of Germany using the US information are even more accurate than the random walk forecasts. Our statistical results demonstrate a more independent role for the US.
Doctorate in Economics and Management Sciences

28

Yan, Hanjun. "Numerical methods for data assimilation in weather forecasting." HKBU Institutional Repository, 2018. https://repository.hkbu.edu.hk/etd_oa/555.

Full text
Abstract:
Data assimilation plays an important role in weather forecasting. Its purpose is to provide a more accurate atmospheric state for the future forecast. The existing methods currently used in this field fall into two categories: statistical data assimilation and variational data assimilation. This thesis focuses mainly on variational data assimilation. The original objective function of three-dimensional variational data assimilation (3D-VAR) consists of two terms: the difference between the previous forecast and the analysis, and the difference between the observations and the analysis in observation space. Considering the inaccuracy of previous forecasting results, we replace the first term by the difference between the previous forecast gradients and the analysis gradients. The associated data fitting term can be interpreted using the second-order finite difference matrix as the inverse of the background error covariance matrix in the 3D-VAR setting. In our approach, it is not necessary to estimate the background error covariance matrix or to deal with its inverse in the 3D-VAR algorithm. Indeed, the existence and uniqueness of the analysis solution of the proposed objective function are established, and the solution can be calculated iteratively using the conjugate gradient method. We present experimental results based on WRF simulations and show that the performance of this forecast gradient based DA model is better than that of 3D-VAR. Next, we propose another optimization method for variational data assimilation: using tensor completion in the cost function for the analysis, we replace the second term in the 3D-VAR cost function. This model is motivated by the small number of observations compared with the large portion of the grids. Applying the alternating direction method of multipliers to solve this optimization problem, we conduct numerical experiments on real data. The results show that this tensor completion based DA model is competitive in terms of prediction accuracy with 3D-VAR and the forecast gradient based DA model. Since 3D-VAR and the two models proposed above lack temporal information, we then construct a third model in four-dimensional space. To include temporal information, this model builds on the second proposed model and introduces total variation to describe the change of the atmospheric state. To this end, we again use the alternating direction method of multipliers. One set of experimental results shows a positive performance: the prediction accuracy of our third model is better than that of 3D-VAR, the forecast gradient based DA model, and the tensor completion based DA model. Although the other sets of experimental results show that this model performs better than 3D-VAR and the forecast gradient based DA model, its prediction accuracy there is slightly lower than that of the tensor completion based model.
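For reference, the two terms of the original 3D-VAR objective described above are, in a textbook sketch (often written with a factor of 1/2; the thesis replaces the first, background term with a forecast-gradient difference):

```latex
% Standard 3D-VAR cost function
J(\mathbf{x}) = (\mathbf{x} - \mathbf{x}_b)^{\top} \mathbf{B}^{-1} (\mathbf{x} - \mathbf{x}_b)
              + (\mathbf{y} - H\mathbf{x})^{\top} \mathbf{R}^{-1} (\mathbf{y} - H\mathbf{x})
% x_b: previous forecast (background); B: background error covariance;
% y: observations; H: observation operator; R: observation error covariance.
```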
29

關惠貞 and Wai-ching Josephine Kwan. "Trend models for price movements in financial markets." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1994. http://hub.hku.hk/bib/B31211513.

Full text
30

Kışınbay, Turgut. "Predictive ability or data snooping? : essays on forecasting with large data sets." Thesis, McGill University, 2004. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=85018.

Full text
Abstract:
This thesis examines the predictive ability of models for forecasting inflation and financial market volatility. Emphasis is placed on the evaluation of forecasts and the use of large data sets. A variety of models is used to forecast inflation, including diffusion indices, artificial neural networks, and traditional linear regressions. Financial market volatility is forecast using various GARCH-type and high-frequency based models. High-frequency data are also used to obtain ex-post estimates of volatility, which are then used to evaluate the forecasts. All forecasts are evaluated using recently proposed techniques that can account for data snooping bias and for nested and nonlinear models.
31

Keefer, Timothy Orrin, and Timothy Orrin Keefer. "Likelihood development for a probabilistic flash flood forecasting model." Thesis, The University of Arizona, 1993. http://hdl.handle.net/10150/192077.

Full text
Abstract:
An empirical method is developed for constructing likelihood functions required in a Bayesian probabilistic flash flood forecasting model using data on objective quantitative precipitation forecasts and their verification. Likelihoods based on categorical and probabilistic forecast information for several forecast periods, seasons, and locations are shown and compared. Data record length, forecast information type and magnitude, grid area, and discretized interval size are shown to affect probabilistic differentiation of amounts of potential rainfall. Use of these likelihoods in Bayes' Theorem to update prior probability distributions of potential rainfall, based on preliminary data, to posterior probability distributions, reflecting the latest forecast information, demonstrates that an abbreviated version of the flash flood forecasting methodology is currently practicable. For this application, likelihoods based on the categorical forecast are indicated. Apart from flash flood forecasting, it is shown that likelihoods can provide detailed insight into the value of information contained in particular forecast products.
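The updating step described here is Bayes' theorem with an empirically constructed likelihood; a sketch with illustrative notation:

```latex
% Posterior on potential rainfall \theta given forecast information z
f(\theta \mid z) = \frac{L(z \mid \theta) \, \pi(\theta)}
                        {\int L(z \mid u) \, \pi(u) \, du}
% \pi: prior distribution based on preliminary data;
% L: likelihood built from QPF verification records;
% f(\cdot \mid z): posterior reflecting the latest forecast information.
```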
32

Feiring, Douglas I. "Forecasting Marine Corps enlisted manpower inventory levels with univariate time series models." Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 2006. http://library.nps.navy.mil/uhtbin/hyperion/06Mar%5FFeiring.pdf.

Full text
Abstract:
Thesis (M.S. in Management)--Naval Postgraduate School, March 2006.
Thesis Advisor(s): Samuel Buttrey, William Hatch. "March 2006." Includes bibliographical references (p. 87-88). Also available online.
33

Haidar, Imad. "Short-term forecasting model for crude oil price based on artificial neural networks." Access document online, 2008. http://archimedes.ballarat.edu.au:8080/vital/access/HandleResolver/1959.17/5946.

Full text
Abstract:
Thesis (Masters) -- University of Ballarat, 2008.
Submitted in total fulfillment of the requirements for Masters of Computing, School of Information Technology and Mathematical Sciences. Bibliography: leaves cxxii-cxxvii.
34

Du, Jun 1962. "Short-range ensemble forecasting of an explosive cyclogenesis with a limited area model." Diss., The University of Arizona, 1996. http://hdl.handle.net/10150/191197.

Full text
Abstract:
Since the atmosphere is a chaotic system, small errors in the initial condition of any numerical weather prediction (NWP) model amplify as the forecast evolves. To estimate and possibly reduce the uncertainty of NWP associated with initial-condition uncertainty (ICU), ensemble forecasting has been proposed: in contrast to traditional deterministic forecasting, several model forecasts are run starting from slightly different initial states. In this dissertation, the impact of ICU and short-range ensemble forecasting (SREF) on quantitative precipitation forecasts (QPFs), as well as on sea-level cyclone position and central pressure, is examined for a case of explosive cyclogenesis that occurred over the contiguous United States. A limited-area model (the PSU/NCAR MM4) is run at 80-km horizontal resolution and 15 layers to produce a 25-member, 36-h forecast ensemble. Lateral boundary conditions for the MM4 model are provided by ensemble forecasts from a global spectral model (the NCAR CCM1). The initial perturbations of the ensemble members possess a magnitude and spatial decomposition which closely match estimates of global analysis error, but they were not dynamically conditioned. Results for the 80-km ensemble forecast are compared to forecasts from the then-operational Nested Grid Model (NGM), a single 40-km MM4 forecast, and a second 25-member MM4 ensemble based on a different cumulus parameterization and slightly different initial conditions. Acute sensitivity to ICU marks ensemble QPF and the forecasts of cyclone position and central pressure. Ensemble averaging always reduces the rms error for QPF; nearly 90% of the improvement is obtainable using ensemble sizes as small as 8-10. However, ensemble averaging can adversely affect forecasts related to precipitation areal coverage because of its smoothing nature. Probabilistic forecasts for five mutually exclusive, completely exhaustive categories are found to be skillful relative to a climatological forecast, and ensemble sizes of about 10 can account for 90% of the improvement in the probability density function. Our results indicate that SREF techniques can now provide useful QPF guidance and increase the accuracy of precipitation, cyclone position, and cyclone central pressure forecasts. With current analysis/forecast systems, the benefit from simple ensemble averaging is comparable to or exceeds that obtainable from improvement in the analysis/forecast system.
35

Tsang, Yick-tat, and 曾億達. "Modelling and forecasting the general financial performance of listed construction firms in Hong Kong." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2014. http://hdl.handle.net/10722/198814.

Full text
Abstract:
It is well recognised that construction firms encounter risk and are sensitive to trends and volatility in the business environment. Measuring the financial performance of a firm serves as the basis of monitoring and evaluating its management competence, resource allocation and corporate strategy in response to environmental change. Forecasting is paramount in responding to potential problems and perpetuating positive developments that result in sustainable competitiveness. Thus, an enriched understanding and prediction of the financial performance of construction firms are desirable for decision makers and other industry stakeholders. Notwithstanding that, little research attention has been paid to this premise conceptually and empirically. Thus, the overall aim of this study was to model and forecast the general financial performance of Hong Kong construction firms under the dynamic influence of the business environment. This study involved the application of quantitative modelling using various statistical and econometric techniques. Multidimensional firm financial performance was first approximated using factor analysis based on the financial data of local publicly listed construction firms from 1992 to 2010. The factor model uncovers five common financial factors: liquidity, asset, leverage, profitability and activity. The time trends of these factors display diverse and cyclical patterns with irregular cycle periods. Autoregressive integrated moving average (ARIMA) models were then constructed based on the Box-Jenkins approach, which provided univariate forecasts of the financial factors. The results reaffirmed that ARIMA models were highly effective in forecasting. In conjunction with cross-correlation analysis, multiple linear regression (MLR) models were next used to explore the influence of environmental determinants on firm financial performance. The findings identified different sets of significant leading determinants for different financial factors. They further justified the dominance of sectoral factors in the determination of firm performance. Supported by empirical verification, a theoretical framework depicting the relationships between business environment and firm performance was proposed. This study is among the first to apply advanced econometric techniques to develop reliable performance measurement and forecasting models. The results improve the theoretical framework by explaining the dynamic relationships between the financial performance and business environment of construction firms. The empirical findings of the quantitative analysis offer new implications for firms' financial performance and the significant leading determinants in a local context. The outcomes of this study make seminal contributions to current knowledge and practice.
Civil Engineering
Doctoral
Doctor of Philosophy
APA, Harvard, Vancouver, ISO, and other styles
36

Chen, Qiming, and 陈启明. "Statistical inference for the APGARCH and threshold APGARCH models." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2011. http://hub.hku.hk/bib/B4598511X.

Full text
APA, Harvard, Vancouver, ISO, and other styles
37

Smith, Michael C. "Diameter and height increment and mortality functions for loblolly pine trees in thinned and unthinned plantations." Thesis, This resource online, 1994. http://scholar.lib.vt.edu/theses/available/etd-03242009-040942/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
38

Santos, Jorge Ruben. "Numerical study of a tornado-like vortex in a supercell storm." Thesis, McGill University, 2008. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=115876.

Full text
Abstract:
Recent observations and numerical simulations have significantly improved our understanding of tornadic storms. However, our knowledge of tornadogenesis remains rudimentary. Necessary atmospheric conditions favoring the formation of tornadoes in supercell storms are known, but sufficient conditions remain elusive. The underlying reason is that the processes involved in environment-storm and storm-tornado interactions are not fully understood, as numerical models in the past lacked sufficient resolution to resolve these interactions satisfactorily. In this thesis, an attempt is made to fill this gap by performing a multi-grid, high-resolution simulation of a supercell storm spawning a tornado-like vortex. Four grids, with grid sizes of 600 m, 200 m, 70 m and 30 m, are used to allow explicit simulation of storm-tornado interactions. Diagnostic analysis of the modeling results allows an investigation of the origin of rotation at both the storm scale and the tornado scale.
The simulation results showed that vertical rotation at the storm scale during the early stage of storm development originates from the tilting of horizontal environmental vorticity. This so-called mesocyclone then strengthens further through vortex stretching and the Dynamic Pipe Effect, and descends toward the surface. During mesocyclone intensification, incipient surface vertical vortices form through horizontal shear instability along the outflow boundary created by the rear-flank downdraft.
One of the surface vortices experiences an initial exponential growth in its vorticity by interacting with the descending mesocyclone and merging with multiple smaller satellite vortices. The resulting tornado-like vortex (TLV) has a maximum horizontal wind of 103 m s⁻¹ and a minimum central pressure of 927 hPa. Vorticity budgets of the mesocyclone and the TLV are computed to assess quantitatively the importance of the various processes for rotation.
Sensitivity experiments were also performed to determine the effect of varying the environmental conditions on the mesocyclone and surface vorticity. As the low-level vertical shear of the environmental wind increases, the mesocyclone intensifies, favoring the intensification of near-surface vorticity. The presence of drier layers in the upper and middle troposphere produces a weaker mesocyclone and weaker outflow boundaries. Conversely, inclusion of ice-phase processes produces a stronger mesocyclone and more intense outflow boundaries, enhancing the intensification of near-surface vorticity.
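As background for the tilting and stretching mechanisms invoked in this abstract, the inviscid vertical vorticity equation can be written in the following standard textbook form (a general reference, not an equation reproduced from the thesis):

    \frac{D\zeta}{Dt} = \underbrace{\xi\,\frac{\partial w}{\partial x} + \eta\,\frac{\partial w}{\partial y}}_{\text{tilting}} \;+\; \underbrace{\zeta\,\frac{\partial w}{\partial z}}_{\text{stretching}}

where \zeta is the vertical vorticity, \xi and \eta are the horizontal vorticity components, and w is the vertical velocity; baroclinic generation and friction are omitted for brevity. The vorticity budgets mentioned above quantify terms of this kind for the mesocyclone and the TLV.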
APA, Harvard, Vancouver, ISO, and other styles
39

Rumantir, Grace Widjaja. "Minimum message length criterion for second-order polynomial model selection applied to tropical cyclone intensity forecasting." Monash University, School of Computer Science and Software Engineering, 2003. http://arrow.monash.edu.au/hdl/1959.1/5813.

Full text
APA, Harvard, Vancouver, ISO, and other styles
40

Kamwi, Innocent Silibelo. "Fitting extreme value distributions to the Zambezi river flood water levels recorded at Katima Mulilo in Namibia." Thesis, University of the Western Cape, 2005. http://etd.uwc.ac.za/index.php?module=etd&.

Full text
Abstract:
The aim of this research project was to estimate parameters for the distribution of annual maximum flood levels of the Zambezi River at Katima Mulilo. Parameter estimation was carried out using the maximum likelihood method. The study explored the Zambezi's annual maximum flood heights at Katima Mulilo by fitting the Gumbel, Weibull and generalized extreme value distributions and evaluating their goodness of fit.
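As an illustration of the estimation and model-comparison procedure described here, the sketch below fits the three candidate distributions to a series of annual maxima by maximum likelihood (SciPy's fit method) and screens goodness of fit with a Kolmogorov-Smirnov test; the data values are placeholders, not the Katima Mulilo record.

    # Minimal sketch: maximum likelihood fits of Gumbel, Weibull and GEV
    # distributions to annual maxima. `annual_maxima` is illustrative data.
    import numpy as np
    from scipy import stats

    annual_maxima = np.array([5.1, 6.3, 4.8, 7.2, 5.9, 6.8, 5.4, 7.5, 6.1, 5.7])

    candidates = {
        "Gumbel": stats.gumbel_r,
        "Weibull": stats.weibull_min,
        "GEV": stats.genextreme,
    }

    for name, dist in candidates.items():
        params = dist.fit(annual_maxima)   # maximum likelihood estimates
        ks_stat, p_value = stats.kstest(annual_maxima, dist.cdf, args=params)
        print(f"{name}: params={np.round(params, 3)}, KS={ks_stat:.3f}, p={p_value:.3f}")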
APA, Harvard, Vancouver, ISO, and other styles
41

Venter, Rudolf Gerrit. "Pricing options under stochastic volatility." Diss., Pretoria : [s.n.], 2003. http://upetd.up.ac.za/thesis/available/etd09052005-120952.

Full text
APA, Harvard, Vancouver, ISO, and other styles
42

Abdelghany, Ahmed F. "Dynamic micro-assignment of travel demand with activity/trip chains." Full text (PDF) from UMI/Dissertation Abstracts International; access restricted to users with UT Austin EID, 2001. http://wwwlib.umi.com/cr/utexas/fullcit?p3023538.

Full text
APA, Harvard, Vancouver, ISO, and other styles
43

Michaud, Jene Diane. "Distributed rainfall-runoff modeling of thunderstorm-generated floods: a case study in a mid-sized, semi-arid watershed in Arizona." Diss., The University of Arizona, 1992. http://etd.library.arizona.edu/etd/GetFileServlet?file=file:///data1/pdf/etd/azu_e9791_1992_49_sip1_w.pdf&type=application/pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

Khajehei, Sepideh. "A Multivariate Modeling Approach for Generating Ensemble Climatology Forcing for Hydrologic Applications." PDXScholar, 2015. https://pdxscholar.library.pdx.edu/open_access_etds/2403.

Full text
Abstract:
The reliability and accuracy of forcing data play a vital role in hydrological streamflow prediction: reliable forcing data lead to accurate predictions and, ultimately, to a reduction in uncertainty. Numerical Weather Prediction (NWP) models currently produce ensemble forecasts at various temporal and spatial scales. However, the raw products of NWP models may be biased at the basin scale, unlike at the model grid scale, depending on the size of the catchment. Owing to the large space-time variability of precipitation, bias-correcting ensemble forecasts has proven to be a challenging task. In recent years, Ensemble Pre-Processing (EPP), a statistical approach, has proven helpful in reducing bias and generating reliable forecasts. The procedure is based on the bivariate probability distribution between observations and single-value precipitation forecasts. In the current work, we have applied and evaluated a Bayesian approach, based on copula density functions, to develop ensemble precipitation forecasts from the conditional distribution of the single-value precipitation forecast. Copula functions combine univariate marginal distributions into a multivariate joint distribution and can model the joint distribution of two variables with any level of correlation and dependency. The advantages of using copulas include, among others, the ability to model the joint distribution independently of the types of the marginal distributions. In the present study, we have evaluated the capability of copula-based functions in EPP and compared them against an existing and commonly used procedure, the meta-Gaussian distribution. Monthly precipitation forecasts from the Climate Forecast System (CFS) and gridded observations from the Parameter-elevation Relationships on Independent Slopes Model (PRISM) were used to create ensemble pre-processed precipitation over three sub-basins in the western USA at 0.5-degree spatial resolution. The comparison was made using both deterministic and probabilistic evaluation frameworks. Across all sub-basins and evaluation techniques, the copula-based technique shows more reliability and robustness than the meta-Gaussian approach.
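To make the conditioning step concrete, the following is a minimal sketch of the meta-Gaussian (Gaussian-copula) baseline named in this abstract: both margins are mapped to normal scores, a correlation is estimated, and an ensemble is drawn from the conditional normal and mapped back through the observed margin. All names and data are illustrative assumptions, and the thesis additionally evaluates other copula families.

    # Minimal sketch of meta-Gaussian ensemble pre-processing (EPP baseline).
    # `obs` and `fcst` are synthetic stand-ins for PRISM observations and
    # CFS single-value forecasts; gamma margins are an assumption.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    obs = rng.gamma(2.0, 20.0, size=200)
    fcst = 0.7 * obs + rng.gamma(2.0, 8.0, size=200)

    # 1. Transform both margins to standard-normal scores via fitted gamma CDFs.
    obs_params = stats.gamma.fit(obs, floc=0)
    fcst_params = stats.gamma.fit(fcst, floc=0)
    z_obs = stats.norm.ppf(stats.gamma.cdf(obs, *obs_params))
    z_fcst = stats.norm.ppf(stats.gamma.cdf(fcst, *fcst_params))

    # 2. Estimate the Gaussian-copula correlation in normal-score space.
    rho = np.corrcoef(z_obs, z_fcst)[0, 1]

    # 3. Given a new single-value forecast, sample the conditional normal
    #    N(rho * z, 1 - rho**2) and map back through the observed margin.
    new_fcst = 55.0                                   # hypothetical forecast
    z = stats.norm.ppf(stats.gamma.cdf(new_fcst, *fcst_params))
    z_ens = rng.normal(rho * z, np.sqrt(1.0 - rho ** 2), size=50)
    ensemble = stats.gamma.ppf(stats.norm.cdf(z_ens), *obs_params)
    print(np.round(ensemble, 1))                      # 50-member ensemble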
APA, Harvard, Vancouver, ISO, and other styles
45

Wong, Chun-mei May, and 王春美. "The statistical tests on mean reversion properties in financial markets." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1994. http://hub.hku.hk/bib/B31211975.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Perrone, Jim T. "Hydrologic modeling of an agricultural watershed in Quebec using AGNPS." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1997. http://www.collectionscanada.ca/obj/s4/f2/dsk2/ftp01/MQ29763.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
47

Wahl, Douglas Timothy. "Increasing range and lethality of Extended-Range Munitions (ERMS) using Numerical Weather Prediction (NWP) and the AUV workbench to compute a Ballistic Correction (BALCOR)." Thesis, Monterey, Calif. : Naval Postgraduate School, 2006. http://bosun.nps.edu/uhtbin/hyperion.exe/06Dec%5FWahl.pdf.

Full text
Abstract:
Thesis (M.S. in Meteorology and Physical Oceanography)--Naval Postgraduate School, December 2006. Thesis advisors: Wendell Nuss, Don Brutzmann. Includes bibliographical references (p. 107-116). Also available in print.
APA, Harvard, Vancouver, ISO, and other styles
48

Cheng, Xixin, and 程細辛. "Mixture time series models and their applications in volatility estimation and statistical arbitrage trading." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2008. http://hub.hku.hk/bib/B40988053.

Full text
APA, Harvard, Vancouver, ISO, and other styles
49

Monti, Francesca. "Combining structural and reduced-form models for macroeconomic forecasting and policy analysis." Doctoral thesis, Université Libre de Bruxelles, 2011. http://hdl.handle.net/2013/ULB-DIPOT:oai:dipot.ulb.ac.be:2013/209970.

Full text
Abstract:
Can we fruitfully use the same macroeconomic model to forecast and to perform policy analysis? There is a tension between a model's ability to forecast accurately and its ability to tell a theoretically consistent story. The aim of this dissertation is to propose ways to ease this tension by combining structural and reduced-form models, so as to obtain models that can effectively do both.
Doctorate in Economics and Management Sciences
APA, Harvard, Vancouver, ISO, and other styles
50

Law, Ka-chung, and 羅家聰. "A comparison of volatility predictions in the HK stock market." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1999. http://hub.hku.hk/bib/B30163535.

Full text
APA, Harvard, Vancouver, ISO, and other styles