
Dissertations / Theses on the topic 'Generalized Linear Modelling'


Listed below are the top 18 dissertations and theses on the topic 'Generalized Linear Modelling.'


1

Schnatter, Sylvia. "Integration-based Kalman-filtering for a Dynamic Generalized Linear Trend Model." Department of Statistics and Mathematics, WU Vienna University of Economics and Business, 1991. http://epub.wu.ac.at/424/1/document.pdf.

Abstract:
The topic of the paper is filtering for non-Gaussian dynamic (state space) models by approximate computation of posterior moments using numerical integration. A Gauss-Hermite procedure is implemented based on the approximate posterior mode estimator and curvature recently proposed in [2]. This integration-based filtering method will be illustrated by a dynamic trend model for non-Gaussian time series. Comparison of the proposed method with other approximations ([15], [2]) is carried out by simulation experiments for time series from Poisson, exponential and Gamma distributions. (author's abstract)
Series: Forschungsberichte / Institut für Statistik
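The core numerical step of such integration-based filters can be sketched in a few lines. The snippet below is a minimal illustration, not the paper's exact algorithm: it assumes a Gaussian prior on a scalar state observed through a Poisson count and approximates the posterior mean and variance by Gauss-Hermite quadrature.

```python
import numpy as np
from numpy.polynomial.hermite import hermgauss
from scipy.stats import poisson

def gh_posterior_moments(m, s, y, n_nodes=20):
    """Posterior mean/variance of a state x ~ N(m, s^2) observed
    through y ~ Poisson(exp(x)), via Gauss-Hermite quadrature."""
    nodes, weights = hermgauss(n_nodes)
    x = m + np.sqrt(2.0) * s * nodes      # change of variables for N(m, s^2)
    w = weights * poisson.pmf(y, np.exp(x))   # weight nodes by the likelihood
    z = w.sum()                               # normalising constant
    mean = (w * x).sum() / z
    var = (w * (x - mean) ** 2).sum() / z
    return mean, var

print(gh_posterior_moments(m=0.0, s=1.0, y=3))
```

Because only ratios of quadrature sums are needed, the Gaussian normalising factors cancel, which keeps the update numerically cheap.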
2

Nargis, Suraiya. "Robust methods in logistic regression." University of Canberra. Information Sciences & Engineering, 2005. http://erl.canberra.edu.au./public/adt-AUC20051111.141200.

Abstract:
My Master's research aims to deepen our understanding of the behaviour of robust methods in logistic regression. Logistic regression is a special case of Generalized Linear Modelling (GLM), which is a powerful and popular technique for modelling a large variety of data. Robust methods are useful in reducing the effect of outlying values in the response variable on parameter estimates. A literature survey shows that we are still at the beginning of being able to detect extreme observations in logistic regression analyses, to apply robust methods in logistic regression and to present informatively the results of logistic regression analyses. In Chapter 1 I have made a basic introduction to logistic regression, with an example, and to robust methods in general. In Chapters 2 through 4 of the thesis I have described traditional methods and some relatively new methods for presenting results of logistic regression using powerful visualization techniques, as well as the concepts of outliers in binomial data. I have used different published data sets for illustration, such as the Prostate Cancer data set, the Damaged Carrots data set and the Recumbent Cow data set. In Chapter 4 I summarize and report on the modern concepts of graphical methods, such as central dimension reduction, and the use of graphics as pioneered by Cook and Weisberg (1999). In Section 4.6 I have then extended the work of Cook and Weisberg to robust logistic regression. In Chapter 5 I have described simulation studies to investigate the effects of outlying observations on logistic regression (robust and non-robust). In Section 5.2 I have come to the conclusion that, in the case of classical or robust multiple logistic regression with no outliers, robust methods do not necessarily provide more reasonable estimates of the parameters for data that contain no strong outliers. In Section 5.4 I have looked into the cases where outliers are present and have come to the conclusion that either the breakdown method or a sensitivity analysis provides reasonable parameter estimates in that situation. Finally, I have identified areas for further study.
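A first step toward the outlier detection discussed above can be illustrated with standard tools. The sketch below, on simulated data rather than the thesis's data sets, fits an ordinary logistic GLM with statsmodels and flags observations with large Pearson residuals; it is a basic diagnostic pass, not the breakdown or sensitivity methods the thesis studies.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
x = rng.normal(size=200)
p = 1 / (1 + np.exp(-(0.5 + 1.5 * x)))
y = rng.binomial(1, p)
y[:3] = 1 - y[:3]                        # plant a few response outliers

X = sm.add_constant(x)
fit = sm.GLM(y, X, family=sm.families.Binomial()).fit()
resid = fit.resid_pearson                # large values flag suspect cases
flagged = np.where(np.abs(resid) > 2)[0]
print(fit.params, flagged)
```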
3

Mallya, Shruti. "Modelling Human Risk of West Nile Virus Using Surveillance and Environmental Data." Thesis, Université d'Ottawa / University of Ottawa, 2017. http://hdl.handle.net/10393/35734.

Abstract:
Limited research has been performed in Ontario to ascertain risk factors for West Nile Virus (WNV) and to develop a unified risk prediction strategy. The aim of the current body of work was to use spatio-temporal modelling in conjunction with surveillance and environmental data to determine which pre-WNV season factors could forecast a high-risk season and to explore how well mosquito surveillance data could predict human cases in space and time during the WNV season. Generalized linear mixed modelling found that mean minimum monthly temperature variables and annual WNV-positive mosquito pools were the most significant predictors of the number of human WNV cases (p<0.001). Spatio-temporal cluster analysis found that positive mosquito pool clusters could predict human case clusters up to one month in advance. These results demonstrate the usefulness of mosquito surveillance data as well as publicly available climate data for assessing risk and informing public health practice.
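The thesis uses a generalized linear mixed model; as a simpler stand-in, the sketch below fits a fixed-effects Poisson regression of case counts on the two predictors named above. The data frame and its column names ("cases", "min_temp", "pos_pools") are invented for illustration.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical records: monthly human cases, mean minimum temperature,
# and the count of WNV-positive mosquito pools.
df = pd.DataFrame({
    "cases":     [0, 2, 5, 1, 9, 0, 3, 7],
    "min_temp":  [4.0, 6.5, 9.1, 5.2, 11.3, 3.8, 7.7, 10.2],
    "pos_pools": [0, 1, 4, 0, 6, 0, 2, 5],
})
model = smf.glm("cases ~ min_temp + pos_pools", data=df,
                family=sm.families.Poisson()).fit()
print(model.summary())
```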
4

Charalambous, Christiana. "Variable selection in joint modelling of mean and variance for multilevel data." Thesis, University of Manchester, 2011. https://www.research.manchester.ac.uk/portal/en/theses/variable-selection-in-joint-modelling-of-mean-and-variance-for-multilevel-data(cbe5eb08-1e77-4b44-b7df-17bd4bf4937f).html.

Abstract:
We propose to extend the use of penalized likelihood based variable selection methods to hierarchical generalized linear models (HGLMs) for jointly modelling both the mean and variance structures. We are interested in applying these new methods to multilevel structured data, hence we assume a two-level hierarchical structure, with subjects nested within groups. We consider a generalized linear mixed model (GLMM) for the mean, with a structured dispersion in the form of a generalized linear model (GLM). In the first instance, we model the variance of the random effects which are present in the mean model, or in other words the variation between groups (between-level variation). In the second scenario, we model the dispersion parameter associated with the conditional variance of the response, which could also be thought of as the variation between subjects (within-level variation). To do variable selection, we use the smoothly clipped absolute deviation (SCAD) penalty, a penalized likelihood variable selection method, which shrinks the coefficients of redundant variables to 0 and at the same time estimates the coefficients of the remaining important covariates. Our methods are likelihood based, and so in order to estimate the fixed effects in our models we apply iterative procedures such as the Newton-Raphson method, in the form of the LQA algorithm proposed by Fan and Li (2001). We carry out simulation studies for both the joint models for the mean and variance of the random effects, as well as the joint models for the mean and dispersion of the response, to assess the performance of our new procedures against a similar process which excludes variable selection. The results show that our method increases both the accuracy and efficiency of the resulting penalized MLEs and has a 100% success rate in identifying the zero and non-zero components over 100 simulations. For the main real data analysis, we use the Health Survey for England (HSE) 2004 dataset. We investigate how obesity is linked to several factors such as smoking, drinking, exercise and long-standing illness, to name a few. We also discover whether there is variation in obesity between individuals and between households of individuals, as well as test whether that variation depends on some of the factors affecting obesity itself.
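The SCAD penalty at the heart of this selection procedure has a simple closed form. Below is a sketch of the penalty and its first derivative as given by Fan & Li (2001), with the customary choice a = 3.7; the derivative is the ingredient the LQA algorithm uses to build its local quadratic approximation.

```python
import numpy as np

def scad_penalty(theta, lam, a=3.7):
    """SCAD penalty of Fan & Li (2001), applied elementwise to |theta|."""
    t = np.abs(theta)
    small = t <= lam
    mid = (t > lam) & (t <= a * lam)
    return np.where(small, lam * t,
           np.where(mid, -(t**2 - 2*a*lam*t + lam**2) / (2 * (a - 1)),
                    (a + 1) * lam**2 / 2))

def scad_deriv(theta, lam, a=3.7):
    """First derivative p'(|theta|); in LQA the penalty is locally replaced
    by the quadratic p'(|t0|) / |t0| * theta^2 / 2 around the current t0."""
    t = np.abs(theta)
    return lam * ((t <= lam) +
                  np.maximum(a * lam - t, 0) / ((a - 1) * lam) * (t > lam))

print(scad_penalty(np.array([0.1, 1.0, 5.0]), lam=0.5))
```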
5

Carreira, Inês Duarte. "Modelling dependence between frequency and severity of insurance claims." Master's thesis, Instituto Superior de Economia e Gestão, 2017. http://hdl.handle.net/10400.5/14631.

Abstract:
Master's in Actuarial Science
The estimation of the individual loss is an important task in pricing insurance policies. The standard approach assumes independence between claim frequency and severity, which may not be a realistic assumption. In this text, the dependence between claim counts and claim sizes is explored in a Generalized Linear Model framework. A conditional severity model and a copula model are presented as alternatives for modelling this dependence and are then applied to a data set provided by a Portuguese insurance company. Finally, a comparison with the independence scenario is carried out.
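The conditional severity idea can be sketched directly in the GLM framework: the claim count enters the severity regression as a covariate, so average claim size is modelled conditionally on frequency. The snippet below is a toy illustration with invented data and column names, not the thesis's model or data.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical policy-level data: average claim size per policy, the
# number of claims, and one rating factor.
df = pd.DataFrame({
    "avg_claim": [820.0, 1450.0, 640.0, 2100.0, 990.0, 1730.0],
    "n_claims":  [1, 3, 1, 4, 2, 3],
    "age":       [34, 52, 28, 61, 45, 39],
})
# Gamma GLM with log link; a significant n_claims coefficient indicates
# dependence between frequency and severity.
sev = smf.glm("avg_claim ~ n_claims + age", data=df,
              family=sm.families.Gamma(link=sm.families.links.Log())).fit()
print(sev.params)
```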
6

Creegan, Helen P. "Modelling the effects of changing habitat characteristics and spatial pattern on woodland songbird distributions in West and Central Scotland." Thesis, University of Stirling, 2005. http://hdl.handle.net/1893/48.

Abstract:
This study investigated bird distributions in relation to local habitat and landscape pattern and the implications which habitat fragmentation may have for woodland birds. There were two sections to the research: an experimental study investigating bird gap-crossing behaviour across distances of 5 to 120 m; and an observational study modelling woodland bird distributions in relation to local habitat and landscape-scale variables in two study areas (East Loch Lomond and the Central Scotland Forest). In the experimental study it was hypothesised that bird willingness to cross gaps would decrease with increasing gap distance even at home-range scales and that the rate of decline would vary interspecifically in relation to bird morphology. Song thrush mobbing calls played at woodland edges in the West of Scotland were used to attract birds across gaps, and results were compared with the response along woodland edges. Data were obtained for four species: chaffinch, coal tit, robin and goldcrest. The decline in response with distance across gaps and along woodland edge was modelled for each species using generalized linear modelling. Maximum gap-crossing distances ranged from 46 m (goldcrest) to 150 m (extrapolated value for the chaffinch). Goldcrests responded more readily through woodlands. There was no difference between woodland edge and gap response for the coal tit. Robins and chaffinches, however, responded more readily across gaps than through woodland. When different response indices were plotted against bird mass and wing area, results suggested that larger birds with bigger wings responded more readily across gaps than through woodland. It is suggested that this relates to differences in bird manoeuvrability within woodlands and ability to evade a predator in gaps. Fragmentation indices were calculated for an area of the Central Scotland Forest to show how willingness to cross different gap distances influences perception of how fragmented the woodlands are in a region. Results are discussed in the context of the creation of Forest Habitat Networks. The data for the observational section of the work were from bird point counts for 200 sample points at East Loch Lomond in 1998 and 2000 and 267 sample points in the Central Scotland Forest in 1999. In addition, a time series of point count data was available for 30 sample points at East Loch Lomond. Additional data were gathered for ten sample points (1998) and two sample points (2000) at East Loch Lomond to investigate effects of observer, time and weather on count data. Generalized linear and generalized additive modelling was carried out on these additional data. Results indicated that biases due to the variation in time and weather conditions between counts existed in the pure count data but that these were eliminated by reducing data to presence and absence form for analysis. Species accumulation curves indicated that two counts per sample point were insufficient to determine species richness. However, a sufficiently large proportion of the species was being detected consistently in two counts of ten minutes' duration for it to be valid to model them in relation to habitat and landscape variables. Point count data for East Loch Lomond in 1998 (ELL98) and the Central Scotland Forest in 1999 (CSF99) for the wren, treecreeper, garden warbler, robin, blue tit, blackbird, willow warbler, coal tit, goldcrest, great tit, and song thrush were analysed using generalized additive modelling.
In addition, models were built for the blackcap (CSF99) and the siskin, redstart and wood warbler (ELL98). Where all relationships were identified as linear, models were rebuilt as GLMs. Models were evaluated using the Area Under the Curve (AUC) of Receiver Operating Characteristic (ROC) plots. AUC values ranged from 0.84-0.99 for ELL98 and from 0.76-0.93 for CSF99, indicating high predictive accuracy. Habitat variables accounted for the largest proportion of explained variation in all models and could be interpreted in terms of bird nesting and feeding behaviour. However, additional variation was explained by landscape-scale and fragmentation-related (especially edge) variables. ELL98 models were used to predict bird distributions for Loch Lomond in 2000 (ELL00) and for the CSF99. Likewise, the CSF99 models were used to predict distributions for ELL98 and ELL00. Predicted distributions had useful application in many cases within the ELL site between years. Fewer cases of useful application arose for predicting distributions between sites. Results are discussed in the context of the generality of bird-environment relationships and reasons for low predictive accuracy when models are applied between sites and years. Models which had useful application for ELL00 were used to predict bird distributions for 2025 and 2050 at East Loch Lomond. Habitat and landscape changes were projected based on the proposed management for the site. Since woodland regeneration rates are difficult to predict, two scenarios were modelled, one assuming a modest amount of regeneration and one assuming no regeneration. Predictions derived from the ELL98 models showed broad-leaved species increasing in distribution while coniferous species declined. This was in keeping with the expected changes in the relative extent of broad-leaved and coniferous habitat. However, predictions from the CSF99 models were often less readily explicable. The value of the modelling approach is discussed and suggestions are made for further study to improve confidence in the predictions.
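The gap-crossing analysis reduces to a binomial GLM of response against distance, from which a maximum crossing distance can be read off the fitted curve. The sketch below uses invented trial data and an arbitrary 10% probability cut-off to show the mechanics; the thesis's actual design and thresholds may differ.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical trials: 1 = bird crossed the gap, 0 = no response.
dist = np.array([5, 10, 20, 30, 45, 60, 80, 100, 120] * 5)
crossed = (np.random.default_rng(7).random(dist.size)
           < 1 / (1 + np.exp(0.05 * (dist - 60)))).astype(int)

X = sm.add_constant(dist.astype(float))
fit = sm.GLM(crossed, X, family=sm.families.Binomial()).fit()

# Distance at which predicted crossing probability drops to 10%,
# one way to define a 'maximum crossing distance' from the fitted curve.
b0, b1 = fit.params
d_max = (np.log(0.1 / 0.9) - b0) / b1
print(round(d_max, 1))
```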
7

Hardin, Patrik, and Sam Tabari. "Modelling Non-life Insurance Policyholder Price Sensitivity : A Statistical Analysis Performed with Logistic Regression." Thesis, KTH, Matematisk statistik, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-209773.

Abstract:
This bachelor thesis within mathematical statistics studies the possibility of modelling the renewal probability for commercial non-life insurance policyholders. The project was carried out in collaboration with the non-life insurance company If P&C Insurance Ltd. at their headquarters in Stockholm, Sweden. The paper includes an introduction to underlying concepts within insurance and mathematics and a detailed review of the analytical process, followed by a discussion and conclusions. The first stages of the project were the initial collection and processing of explanatory insurance data and the development of a logistic regression model for policy renewal. An initial model was built and modern methods of mathematics and statistics were applied in order to obtain a final model consisting of 9 significant characteristics. The regression model had a predictive power of 61%. This suggests that it is, to a certain degree, possible to predict the renewal probability of non-life insurance policyholders based on their characteristics. The results from the final model were ultimately translated into a measure of price sensitivity which can be implemented in both pricing models and CRM systems. We believe that price sensitivity analysis, if done correctly, is a natural step in improving the current pricing models in the insurance industry, and this project provides a foundation for further research in this area.
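Renewal modelling of this kind follows a familiar pattern: fit a logistic regression on policyholder characteristics, then assess predictive power on held-out policies. The sketch below mimics that workflow on simulated data; the three features and the 0.5 cut-off are placeholders, not the thesis's nine characteristics.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 1000
X = rng.normal(size=(n, 3))                  # stand-ins for customer traits
logit = 0.4 + X @ np.array([0.8, -0.5, 0.3])
renewed = rng.binomial(1, 1 / (1 + np.exp(-logit)))

train, test = slice(0, 700), slice(700, None)
Xc = sm.add_constant(X)
fit = sm.GLM(renewed[train], Xc[train], family=sm.families.Binomial()).fit()
pred = fit.predict(Xc[test]) > 0.5           # classify held-out policies
print("holdout accuracy:", (pred == renewed[test].astype(bool)).mean())
```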
8

Frühwirth-Schnatter, Sylvia. "Applied State Space Modelling of Non-Gaussian Time Series using Integration-based Kalman-filtering." Department of Statistics and Mathematics, WU Vienna University of Economics and Business, 1993. http://epub.wu.ac.at/1558/1/document.pdf.

Abstract:
The main topic of the paper is on-line filtering for non-Gaussian dynamic (state space) models by approximate computation of the first two posterior moments using efficient numerical integration. Based on approximating the prior of the state vector by a normal density, we prove that the posterior moments of the state vector are related to the posterior moments of the linear predictor in a simple way. For the linear predictor, Gauss-Hermite integration is carried out with automatic reparametrization based on an approximate posterior mode filter. We illustrate how further topics in applied state space modelling, such as estimating hyperparameters and computing model likelihoods and predictive residuals, are managed by integration-based Kalman-filtering. The methodology derived in the paper is applied to on-line monitoring of ecological time series and filtering for small count data. (author's abstract)
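Strung together over time, the quadrature update yields a complete filter: propagate a normal approximation of the state forward, update its first two moments against each new count, and accumulate the one-step predictive densities into a model likelihood. The following is a minimal sketch for a local-level Poisson model, under assumptions (random-walk state, known variance q) chosen for brevity rather than taken from the paper.

```python
import numpy as np
from numpy.polynomial.hermite import hermgauss
from scipy.stats import poisson

def filter_poisson_counts(y, q=0.1, n_nodes=20):
    """Approximate filtering for a local-level model with Poisson counts:
    x_t = x_{t-1} + w_t, w_t ~ N(0, q);  y_t ~ Poisson(exp(x_t)).
    Returns filtered means and the accumulated log-likelihood."""
    nodes, wts = hermgauss(n_nodes)
    m, s2, loglik, means = 0.0, 1.0, 0.0, []
    for yt in y:
        s2_pred = s2 + q                       # propagate the normal approx.
        x = m + np.sqrt(2 * s2_pred) * nodes
        w = wts * poisson.pmf(yt, np.exp(x))
        z = w.sum()
        loglik += np.log(z / np.sqrt(np.pi))   # one-step predictive density
        m = (w * x).sum() / z                  # filtered mean
        s2 = (w * (x - m) ** 2).sum() / z      # filtered variance
        means.append(m)
    return np.array(means), loglik

print(filter_poisson_counts([1, 0, 2, 3, 5, 4]))
```

The accumulated log-likelihood is what one would maximise over q to estimate hyperparameters, as the abstract describes.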
Series: Forschungsberichte / Institut für Statistik
9

Aarts, Geert. "Modelling space-use and habitat preference from wildlife telemetry data." Thesis, St Andrews, 2007. http://hdl.handle.net/10023/327.

10

Stephenson, John. "Multilevel generalised linear modelling and competing risks multistate survival analysis modelling of childhood caries." Thesis, Cardiff University, 2009. http://eprints.hud.ac.uk/7910/.

Abstract:
There has been an ongoing debate regarding appropriate strategies for the management of carious primary teeth. Studies appear to provide evidence that both selective, symptom-based interventions and traditional restorative strategies are advantageous. However, the analysis and quantification of childhood caries may be affected by clustering of data and the concurrent risk of exfoliation of primary teeth. Multilevel generalised linear models for the occurrence of primary caries were derived utilising data from a cohort study, undertaken from 1999 to 2003, of 2,654 children aged 4-5 years at baseline. These models, which assumed underlying hierarchies with clustering at child, tooth and surface levels, identified higher rates of caries occurrence in primary molar teeth to be associated with boys, poor socio-economic background, lack of water fluoridation, 2nd mandibular molars and occlusal surfaces. Significant risk factors identified were carried forward for inclusion in parametric competing risks multivariate multilevel survival models, utilising cohort study data augmented with Dental Practice Board treatment data. Analysis of sound teeth and surfaces found the concurrent risk of exfoliation did not alter inferences of parameter significance, but restricted the extent of caries occurrence and reduced distinction in survival experience between different types of teeth and surfaces in children from different demographic backgrounds. Further competing risks survival models were derived to analyse the same teeth and surfaces in the untreated carious and filled states, to assess the effect of restorative treatment on subsequent exfoliation and extraction. Survival rates extrapolated to 14 years without further treatment for filled molar teeth were approximately double those of untreated teeth. Time of caries occurrence and treatment also affected survival, with later occurrence or treatment of caries associated with higher survival rates. However, early filling of carious teeth resulted in the greatest reductions in the expected time that decay is present in the mouth.
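The effect of a competing risk on observed event rates can be made concrete with a toy simulation. The snippet below uses invented cause-specific hazards for caries and exfoliation (they are not estimates from this thesis) and shows how a naive incidence calculation, which ignores the competing risk, overstates the cumulative incidence of caries.

```python
import numpy as np

# Illustrative cause-specific hazards (per tooth-year), not fitted values:
h_caries, h_exfol = 0.08, 0.05
rng = np.random.default_rng(11)
n = 100_000

t_caries = rng.exponential(1 / h_caries, n)   # latent time to caries
t_exfol = rng.exponential(1 / h_exfol, n)     # latent time to exfoliation

# Cumulative incidence of caries within 8 years: caries must occur first.
t = 8.0
cif_caries = np.mean((t_caries <= t) & (t_caries < t_exfol))
naive = 1 - np.exp(-h_caries * t)             # ignores the competing risk
print(round(cif_caries, 3), round(naive, 3))  # the naive figure is larger
```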
11

Hatzopoulos, Peter. "Statistical and mathematical modelling for mortality trends and the comparison of mortality experiences, through generalised linear models and GLIM." Thesis, City University London, 1997. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.364032.

12

Dixon, William J. "Uncertainty in Aquatic Toxicological Exposure-Effect Models: the Toxicity of 2,4-Dichlorophenoxyacetic Acid and 4-Chlorophenol to Daphnia carinata." RMIT University. Biotechnology and Environmental Biology, 2005. http://adt.lib.rmit.edu.au/adt/public/adt-VIT20070119.163720.

Abstract:
Uncertainty is pervasive in risk assessment. In ecotoxicological risk assessments, it arises from such sources as a lack of data, the simplification and abstraction of complex situations, and ambiguities in assessment endpoints (Burgman 2005; Suter 1993). When evaluating and managing risks, uncertainty needs to be explicitly considered in order to avoid erroneous decisions and to be able to make statements about the confidence that we can place in risk estimates. Although informative, previous approaches to dealing with uncertainty in ecotoxicological modelling have been found to be limited, inconsistent and often based on assumptions that may be false (Ferson & Ginzburg 1996; Suter 1998; Suter et al. 2002; van der Hoeven 2004; van Straalen 2002a; Verdonck et al. 2003a). In this thesis a Generalised Linear Modelling approach is proposed as an alternative, congruous framework for the analysis and prediction of a wide range of ecotoxicological effects. This approach was used to investigate the results of toxicity experiments on the effect of 2,4-Dichlorophenoxyacetic Acid (2,4-D) formulations and 4-Chlorophenol (4-CP, an associated breakdown product) on Daphnia carinata. Differences between frequentist Maximum Likelihood (ML) and Bayesian Markov-Chain Monte-Carlo (MCMC) approaches to statistical reasoning and model estimation were also investigated. These approaches are inferentially disparate and place different emphasis on aleatory and epistemic uncertainty (O'Hagan 2004). Bayesian MCMC and Probability Bounds Analysis methods for propagating uncertainty in risk models are also compared for the first time. For simple models, Bayesian and frequentist approaches to Generalised Linear Model (GLM) estimation were found to produce very similar results when non-informative prior distributions were used for the Bayesian models. Potency estimates and regression parameters were found to be similar for identical models, signifying that Bayesian MCMC techniques are at least a suitable and objective replacement for frequentist ML for the analysis of exposure-response data. Applications of these techniques demonstrated that Amicide formulations of 2,4-D are more toxic to Daphnia than their unformulated, Technical Acid parent. Different results were obtained from Bayesian MCMC and ML methods when more complex models and data structures were considered. In the analysis of 4-CP toxicity, the treatment of 2 different factors as fixed or random in standard and Mixed-Effect models was found to affect variance estimates to the degree that different conclusions would be drawn from the same model, fit to the same data. Associated discrepancies in the treatment of overdispersion between ML and Bayesian MCMC analyses were also found to affect results. Bayesian MCMC techniques were found to be superior to the ML ones employed for the analysis of complex models because they enabled the correct formulation of hierarchical (nested) data structures within a binomial logistic GLM. Application of these techniques to the analysis of results from 4-CP toxicity testing on two strains of Daphnia carinata found that between-experiment variability was greater than that within experiments or between strains. Perhaps surprisingly, this indicated that long-term laboratory culture had not significantly affected the sensitivity of one strain when compared to cultures of another strain that had recently been established from field populations.
The results from this analysis highlighted the need for repetition of experiments, proper model formulation in complex analyses and careful consideration of the effects of pooling data on characterising variability and uncertainty. The GLM framework was used to develop three-dimensional surface models of the effects of different-length pulse exposures, and subsequent delayed toxicity, of 4-CP on Daphnia. These models described the relationship between exposure duration and intensity (concentration) on toxicity, and were constructed for both pulse and delayed effects. Statistical analysis of these models found that significant delayed effects occurred following the full range of pulse exposure durations, and that both exposure duration and intensity interacted significantly and concurrently with the delayed effect. These results indicated that failure to consider delayed toxicity could lead to significant underestimation of the effects of pulse exposure, and therefore increase uncertainty in risk assessments. A number of new approaches to modelling ecotoxicological risk and to propagating uncertainty were also developed and applied in this thesis. In the first of these, a method for describing and propagating uncertainty in conventional Species Sensitivity Distribution (SSD) models was described. This utilised Probability Bounds Analysis to construct a nonparametric 'probability box' on an SSD based on EC05 estimates and their confidence intervals. Predictions from this uncertain SSD and the confidence interval extrapolation methods described by Aldenberg and colleagues (2000; 2002a) were compared. It was found that the extrapolation techniques underestimated the width of uncertainty (confidence) intervals by 63% and the upper bound by 65%, when compared to the Probability Bounds (P-Bounds) approach, which was based on actual confidence estimates derived from the original data. An alternative approach to formulating ecotoxicological risk modelling was also proposed and was based on a Binomial GLM. In this formulation, the model is first fit to the available data in order to derive mean and uncertainty estimates for the parameters. This 'uncertain' GLM model is then used to predict the risk of effect from possible or observed exposure distributions. This risk is described as a whole distribution, with a central tendency and uncertainty bounds derived from the original data and the exposure distribution (if this is also 'uncertain'). Bayesian and P-Bounds approaches to propagating uncertainty in this model were compared using an example of the risk of exposure to a hypothetical (uncertain) distribution of 4-CP for the two Daphnia strains studied. This comparison found that the Bayesian and P-Bounds approaches produced very similar mean and uncertainty estimates, with the P-Bounds intervals always being wider than the Bayesian ones. This difference is due to the different methods for dealing with dependencies between model parameters by the two approaches, and is confirmation that the P-Bounds approach is better suited to situations where data and knowledge are scarce. The advantages of the Bayesian risk assessment and uncertainty propagation method developed are that it allows calculation of the likelihood of any effect occurring, not just the (probability) bounds, and that the same software (WinBugs) and model construction may be used to fit regression models and predict risks simultaneously.
The GLM risk modelling approaches developed here are able to explain a wide range of response shapes (including hormesis) and underlying (non-normal) distributions, and do not involve expression of the exposure-response as a probability distribution, hence solving a number of problems found with previous formulations of ecotoxicological risk. The approaches developed can also be easily extended to describe communities, include modifying factors, mixed-effects, population growth, carrying capacity and a range of other variables of interest in ecotoxicological risk assessments. While the lack of data on the toxicological effects of chemicals is the most significant source of uncertainty in ecotoxicological risk assessments today, methods such as those described here can assist by quantifying that uncertainty so that it can be communicated to stakeholders and decision makers. As new information becomes available, these techniques can be used to develop more complex models that will help to bridge the gap between the bioassay and the ecosystem.
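For the simple-model comparison described above, the agreement between Bayesian MCMC and ML estimates is easy to reproduce in miniature. The sketch below fits a binomial logistic GLM to invented dose-response data by ML and then runs a short random-walk Metropolis sampler with flat priors; with non-informative priors the posterior mean should land near the ML estimate. This is a generic illustration, not the thesis's WinBugs models.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical dose-response data: groups of 20 at rising concentrations.
dose = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
n = np.array([20, 20, 20, 20, 20])
dead = np.array([2, 5, 9, 15, 19])

X = sm.add_constant(np.log(dose))
ml = sm.GLM(np.column_stack([dead, n - dead]), X,
            family=sm.families.Binomial()).fit()

def loglik(b):
    p = 1 / (1 + np.exp(-(X @ b)))
    return np.sum(dead * np.log(p) + (n - dead) * np.log(1 - p))

# Random-walk Metropolis with flat priors on (intercept, slope).
rng = np.random.default_rng(0)
b, ll, samples = ml.params.copy(), loglik(ml.params), []
for _ in range(20000):
    prop = b + rng.normal(scale=0.1, size=2)
    ll_prop = loglik(prop)
    if np.log(rng.random()) < ll_prop - ll:   # accept/reject step
        b, ll = prop, ll_prop
    samples.append(b)
print(ml.params, np.mean(samples[5000:], axis=0))  # should be close
```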
13

Mphekgwana, Modupi Peter. "Analysis of road traffic accidents in Limpopo Province using generalized linear modelling." Thesis, 2020. http://hdl.handle.net/10386/3483.

Abstract:
Thesis (M.Sc. (Statistics)) -- University of Limpopo, 2020
Background: Death and economic losses due to road traffic accidents (RTAs) are huge global public health and developmental problems and need urgent attention. Each year nearly 1.24 million people die and millions suffer various forms of disability as a result of road accidents. This makes road traffic injuries (RTIs) the eighth leading cause of death globally, and RTIs are set to become the fifth leading cause of death worldwide by the year 2030 unless urgent actions are taken. Aim: In this paper, we investigate factors that contribute to road traffic deaths (RTDs) in the Limpopo province of South Africa using models such as generalized linear models (GLMs) and zero-inflated models. Methods: The study was based on retrospective data comprising reports of 18,029 road traffic accidents and 4,944 road traffic deaths over the years 2009-2015. Generalized linear modelling and zero-inflated models were used to identify factors and determine their relationships to RTDs. Results: The data were split into two categories: deaths that occurred during holidays and those that occurred during non-holiday periods. It was found that the variables Monday, human actions, vehicle conditions and vehicle makes were significant predictors of RTDs during holidays. On the other hand, during non-holiday periods, weekend, Tuesday, Wednesday, national road, provincial road, sedan, LDV, combi and bus were found to be significant predictors of road traffic deaths. Conclusion: GLM techniques, such as the standard Poisson regression model and the negative binomial (NB) model, did little to explain the excess zeros; therefore, zero-inflated models, such as the zero-inflated negative binomial (ZINB), were found to be useful in explaining excess zeros. Recommendation: The study recommends that the government make more personnel available during festive seasons, such as the December holidays, and over weekends.
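Zero-inflated counts of this kind can be fitted directly in statsmodels. The sketch below simulates death counts with excess zeros and fits a ZINB model; the single 'weekend' covariate and all parameter values are invented for illustration, and the fit options may need adjusting by statsmodels version.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.discrete.count_model import ZeroInflatedNegativeBinomialP

rng = np.random.default_rng(5)
n = 500
weekend = rng.binomial(1, 0.3, n)
mu = np.exp(-1.0 + 0.6 * weekend)
deaths = rng.poisson(mu) * rng.binomial(1, 0.6, n)   # inject excess zeros

X = sm.add_constant(weekend.astype(float))
# p=2 gives the NB2 variance function; the same design matrix is used
# for both the count part and the zero-inflation part here.
zinb = ZeroInflatedNegativeBinomialP(deaths, X, exog_infl=X, p=2).fit(
    method="bfgs", maxiter=500, disp=False)
print(zinb.summary())
```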
14

Parra, Hugo Alexandre Esteves. "Habitat predictive modelling of demersal fish species in the Azores." Master's thesis, 2013. http://hdl.handle.net/10400.3/3092.

Abstract:
Master's dissertation, Estudos Integrados dos Oceanos, 25 March 2013, Universidade dos Açores.
Species distribution modelling of the marine environment has been extensively used to assess species-environment relationships and to predict fish spatial distributions accurately. In this study we explored the application of two distinct modelling techniques, the maximum entropy model (MaxEnt) and generalized linear models (GLMs), for predicting the potential distribution in the Azores exclusive economic zone (EEZ) of four economically important demersal fish species: blackbelly rosefish, Helicolenus dactylopterus dactylopterus; forkbeard, Phycis phycis; wreckfish, Polyprion americanus; and offshore rockfish, Pontinus kuhlii. Models were constructed based on 13 years of fish presence/absence data derived from bottom longline surveys performed in the study area, combined with high-resolution (300 m) topographic and biogeochemical habitat seafloor variables. The most important predictors were depth and slope, followed by sediment type, oxygen saturation and salinity, with relative contributions being similar among species. GLMs provided 'outstanding' model predictions (AUC>0.9) for two of the four fish species, while MaxEnt provided 'excellent' model predictions (AUC=0.8-0.9) for three of four species. The level of agreement between observed and predicted presence/absence sites for both modelling techniques was 'moderate' (K=0.4-0.6) for three of the four species, with P. americanus models presenting the lowest level of agreement (K<0.1). For the scope of this study, both modelling approaches were determined to produce viable presence/absence maps which represent a snapshot of the potential distributions of the investigated species. This information provides a better description of demersal fish spatial ecology and can be of great interest for future fisheries management and conservation planning.
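The GLM half of such a comparison, together with the AUC and kappa scores quoted above, can be sketched compactly. The example below uses simulated presence/absence data with two stand-in covariates (depth, slope); it is a schematic of the evaluation workflow, not the study's models.

```python
import numpy as np
import statsmodels.api as sm
from sklearn.metrics import roc_auc_score, cohen_kappa_score

rng = np.random.default_rng(2)
n = 400
depth = rng.uniform(100, 1200, n)        # stand-ins for seafloor covariates
slope = rng.uniform(0, 30, n)
eta = 2.0 - 0.004 * depth + 0.05 * slope
present = rng.binomial(1, 1 / (1 + np.exp(-eta)))

X = sm.add_constant(np.column_stack([depth, slope]))
fit = sm.GLM(present, X, family=sm.families.Binomial()).fit()
prob = fit.predict(X)
# AUC scores the ranked probabilities; kappa scores the thresholded map.
print("AUC:", round(roc_auc_score(present, prob), 2),
      "kappa:", round(cohen_kappa_score(present, prob > 0.5), 2))
```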
15

Burombo, Emmanuel Chamunorwa. "Statistical modelling of return on capital employed of individual units." Diss., 2014. http://hdl.handle.net/10500/19627.

Abstract:
Return on Capital Employed (ROCE) is a popular financial instrument and communication tool for the appraisal of companies. Often, companies' management and other practitioners use untested rules and behavioural approaches when investigating the key determinants of ROCE, instead of the scientific statistical paradigm. The aim of this dissertation was to identify and quantify key determinants of ROCE of individual companies listed on the Johannesburg Stock Exchange (JSE), by comparing classical multiple linear regression, principal components regression, generalized least squares regression, and robust maximum likelihood regression approaches in order to improve companies' decision making. Performance indicators used to arrive at the best approach were the coefficient of determination (R²), adjusted R², and the Mean Square Residual (MSE). Since the ROCE variable had positive and negative values, two separate analyses were done. The classical multiple linear regression models were constructed using stepwise directed search for the dependent variable log ROCE for the two data sets. Assumptions were satisfied and the problem of multicollinearity was addressed. For the positive ROCE data set, the classical multiple linear regression model had an R² of 0.928, an adjusted R² of 0.927 and an MSE of 0.013, and the lead key determinant was Return on Equity (ROE), with positive elasticity, followed by Debt to Equity (D/E) and Capital Employed (CE), both with negative elasticities. The model showed good validation performance. For the negative ROCE data set, the classical multiple linear regression model had an R² of 0.666, an adjusted R² of 0.652 and an MSE of 0.149, and the lead key determinant was Assets per Capital Employed (APCE) with positive effect, followed by Return on Assets (ROA) and Market Capitalization (MC), both with negative effects. The model showed poor validation performance. The results indicated more and less precision than those found by previous studies. This suggested that the key determinants are also important sources of variability in ROCE of individual companies that management need to work with. To handle the problem of multicollinearity in the data, principal components were selected using the Kaiser-Guttman criterion. The principal components regression model was constructed using the dependent variable log ROCE for the two data sets. Assumptions were satisfied. For the positive ROCE data set, the principal components regression model had an R² of 0.929, an adjusted R² of 0.929 and an MSE of 0.069, and the lead key determinant was PC4 (log ROA, log ROE, log Operating Profit Margin (OPM)), followed by PC2 (log Earnings Yield (EY), log Price to Earnings (P/E)), both with positive effects. The model resulted in a satisfactory validation performance. For the negative ROCE data set, the principal components regression model had an R² of 0.544, an adjusted R² of 0.532 and an MSE of 0.167, and the lead key determinant was PC3 (ROA, EY, APCE), followed by PC1 (MC, CE), both with negative effects. The model indicated an accurate validation performance. The results showed that the use of principal components as independent variables did not improve classical multiple linear regression model prediction in our data. This implied that the key determinants are less important sources of variability in ROCE of individual companies that management need to work with. Generalized least squares regression was used to assess heteroscedasticity and dependences in the data. It was constructed using stepwise directed search for the dependent variable ROCE for the two data sets.
For the positive ROCE data set, the weighted generalized least squares regression model had an R² of 0.920, an adjusted R² of 0.919 and an MSE of 0.044, and the lead key determinant was ROE with positive effect, followed by D/E with negative effect, Dividend Yield (DY) with positive effect and lastly CE with negative effect. The model indicated an accurate validation performance. For the negative ROCE data set, the weighted generalized least squares regression model had an R² of 0.559, an adjusted R² of 0.548 and an MSE of 57.125, and the lead key determinant was APCE, followed by ROA, both with positive effects. The model showed a weak validation performance. The results suggested that the key determinants are less important sources of variability in ROCE of individual companies that management need to work with. Robust maximum likelihood regression was employed to handle the problem of contamination in the data. It was constructed using stepwise directed search for the dependent variable ROCE for the two data sets. For the positive ROCE data set, the robust maximum likelihood regression model had an R² of 0.998, an adjusted R² of 0.997 and an MSE of 6.739, and the lead key determinant was ROE with positive effect, followed by DY and lastly D/E, both with negative effects. The model showed a strong validation performance. For the negative ROCE data set, the robust maximum likelihood regression model had an R² of 0.990, an adjusted R² of 0.984 and an MSE of 98.883, and the lead key determinant was APCE with positive effect, followed by ROA with negative effect. The model also showed a strong validation performance. The results reflected that the key determinants are major sources of variability in ROCE of individual companies that management need to work with. Overall, the findings showed that the use of robust maximum likelihood regression provided more precise results compared to those obtained using the three competing approaches, because it is more consistent, sufficient and efficient, has a higher breakdown point and imposes no conditions. Companies' management can establish and control proper marketing strategies using the key determinants, and results of these strategies can see an improvement in ROCE.
Mathematical Sciences
M. Sc. (Statistics)
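Principal components regression, one of the four approaches compared above, can be sketched in a few lines with scikit-learn: standardize the correlated ratios, extract components, and regress on them. The data below are simulated stand-ins, not JSE figures.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(4)
n = 300
# Two latent factors drive four correlated ratios (stand-ins for
# ROE, D/E, CE, APCE and similar determinants):
base = rng.normal(size=(n, 2))
X = np.column_stack([base[:, 0], base[:, 0] + 0.1 * rng.normal(size=n),
                     base[:, 1], base[:, 1] + 0.1 * rng.normal(size=n)])
y = 1.5 * base[:, 0] - 0.8 * base[:, 1] + rng.normal(scale=0.3, size=n)

pcr = make_pipeline(StandardScaler(), PCA(n_components=2), LinearRegression())
pcr.fit(X, y)
print("R^2:", round(pcr.score(X, y), 3))   # compare with a full OLS fit
```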
16

Popovic, Dejan. "Location analysis of city sections: socio-demographic segmentation and restaurant potentiality estimation: a case study of Lisbon." Master's thesis, 2016. http://hdl.handle.net/10362/17657.

Abstract:
Dissertation submitted in partial fulfilment of the requirements for the degree of Master of Science in Geospatial Technologies
One of the objectives of this study is to perform a classification of socio-demographic components at the level of the city section for the City of Lisbon. To provide a suitable platform for the restaurant potentiality map, the socio-demographic components were selected to produce a map of spatial clusters in accordance with restaurant suitability. Consequently, the second objective is to obtain a potentiality map in terms of underestimation and overestimation in the number of restaurants. To the best of our knowledge, no identical methodology for the estimation of restaurant potentiality has been found. The results were achieved with a combination of a SOM (Self-Organizing Map), which provides a segmentation map, and a GAM (Generalized Additive Model) with a spatial component for restaurant potentiality. Final results indicate that the highest influence on restaurant potentiality is given to tourist sites, spatial autocorrelation in terms of neighboring restaurants (the spatial component), and tax value, with lower importance given to households with 1 or 2 members and to the employed population, respectively. In addition, an important conclusion is that the most attractive market sites have shown no change or moderate underestimation in terms of restaurant potentiality.
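The under/overestimation map rests on comparing observed restaurant counts with a GAM's predictions. The sketch below uses statsmodels' GLMGam with B-spline smoothers on simulated data; the covariates ('tourism', 'tax') are invented stand-ins, and the GLMGam API details may vary across statsmodels versions.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.gam.api import GLMGam, BSplines

rng = np.random.default_rng(6)
n = 300
df = pd.DataFrame({
    "tourism": rng.uniform(0, 1, n),     # hypothetical section-level scores
    "tax":     rng.uniform(0, 1, n),
})
df["restaurants"] = rng.poisson(np.exp(0.3 + 2.0 * df.tourism + 0.8 * df.tax))

# Smooth terms for both covariates; Poisson family for the counts.
bs = BSplines(df[["tourism", "tax"]], df=[6, 6], degree=[3, 3])
gam = GLMGam.from_formula("restaurants ~ 1", data=df, smoother=bs,
                          family=sm.families.Poisson()).fit()

# Positive residuals mark sections with more restaurants than predicted
# (overestimation of potential); negative residuals mark underestimation.
df["gap"] = df["restaurants"] - gam.fittedvalues
print(df["gap"].describe())
```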
17

Tifaoui, Said. "The Influence of Scale on the Measurement of the Vertical Price Transmission." Doctoral thesis, 2016. http://hdl.handle.net/11858/00-1735-0000-002B-7C01-1.

18

Mugisha, Stella. "Applied mathematical modelling with new parameters and applications to some real life problems." Thesis, 2018. http://hdl.handle.net/10500/24973.

Abstract:
Some epidemic models with fractional derivatives were proved to be well-defined, well-posed and more accurate [34, 51, 116], compared to models with the conventional derivative. An Ebola epidemic model with non-linear transmission is fully analyzed. The model is expressed with the conventional time derivative with a new parameter included, which happens to be fractional (that derivative is called the β-derivative). We proved that the model is well-defined and well-posed. Moreover, conditions for boundedness and dissipativity of the trajectories are established. Exploiting the generalized Routh-Hurwitz criteria, existence and stability analysis of equilibrium points for the Ebola model are performed to show that they are strongly dependent on the non-linear transmission. In particular, conditions for existence and stability of a unique endemic equilibrium of the Ebola system are given. Numerical simulations are provided for particular expressions of the non-linear transmission, with the model's parameters taking different values. The resulting simulations are in concordance with the usual threshold behavior. The results obtained here may be significant for the fight against and prevention of Ebola haemorrhagic fever, which has so far exterminated hundreds of families and is still affecting many people in West Africa and other parts of the world. The full comprehension and handling of the phenomenon of shattering, sometimes happening during the process of polymer chain degradation [129, 142], remains unsolved when using the traditional evolution equations describing the degradation. This traditional model has been proved to be very hard to handle, as it involves the evolution of two intertwined quantities. Moreover, the explicit form of its solution is, in general, impossible to obtain. We explore the possibility of generalizing the evolution equation modeling polymer chain degradation and analyze the model with the conventional time derivative with a new parameter. We consider the general case where the breakup rate depends on the size of the chain breaking up. In the process, the alternative version of the Sumudu integral transform is used to provide an explicit form of the general solution representing the evolution of polymer size distribution. In particular, we show that this evolution exhibits complex periodic properties due to the presence of cosine and sine functions governing the solutions. Numerical simulations are performed for some particular cases and prove that the system describing the polymer chain degradation contains complex and simple harmonic poles whose effects are given by these functions or a combination of them. This result may be crucial in the ongoing research to better handle and explain the phenomenon of shattering. Lastly, it has become a conjecture that power series like Mittag-Leffler functions and their variants naturally govern solutions to most generalized fractional evolution models such as kinetic, diffusion or relaxation equations. The question is to say whether or not this is always true! Whence, three generalized evolution equations with an additional fractional parameter are solved analytically with conventional techniques. These are processes related to a stationary state system, relaxation and diffusion. In the analysis, we exploit the Sumudu transform to show that investigation of the stationary state system leads to results of invariability.
However, unlike other models, the generalized diffusion and relaxation models are proven not to be governed by Mittag-Leffler functions or any of their variants, but rather by a parameterized exponential function, new in the literature, more accurate and easier to handle. Graphical representations are performed and also show how that parameter, called β, can be used to control the stationarity of such generalized models.
Mathematical Sciences
Ph. D. (Applied Mathematics)
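The Ebola model's qualitative behaviour under non-linear transmission can be explored numerically with a conventional-derivative stand-in. The sketch below integrates an SIR-type system with a saturating incidence term beta*S*I/(1 + alpha*I), one common choice of non-linear transmission; all parameter values are illustrative, and the fractional (β-derivative) refinement from the thesis is not reproduced here.

```python
import numpy as np
from scipy.integrate import solve_ivp

beta, alpha, gamma = 0.4, 0.1, 0.15   # illustrative rates only

def rhs(t, y):
    S, I, R = y
    new_inf = beta * S * I / (1 + alpha * I)   # saturating incidence
    return [-new_inf, new_inf - gamma * I, gamma * I]

# Fractions of the population: 1% initially infected.
sol = solve_ivp(rhs, (0, 120), [0.99, 0.01, 0.0], dense_output=True)
print("peak infected fraction:", round(sol.y[1].max(), 3))
```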