Doctoral dissertations on the topic "Empirical distribution function test"
Create accurate references in APA, MLA, Chicago, Harvard and many other styles
Consult the 48 best doctoral dissertations for your research on the topic "Empirical distribution function test".
An "Add to bibliography" button appears next to every work in the list. Use it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a ".pdf" file and read the work's abstract online, whenever these are available in the metadata.
Browse doctoral dissertations from many disciplines and compile an accurate bibliography.
Steele, Michael C. "The Power of Categorical Goodness-Of-Fit Statistics". Griffith University. Australian School of Environmental Studies, 2003. http://www4.gu.edu.au:8080/adt-root/public/adt-QGU20031006.143823.
Steele, Michael C. "The Power of Categorical Goodness-Of-Fit Statistics". Thesis, Griffith University, 2003. http://hdl.handle.net/10072/366717.
Thesis (PhD Doctorate)
Doctor of Philosophy (PhD)
Australian School of Environmental Studies
Haluzová, Dana. "Uplatnění statistických metod pro zkoumání vlastností nejprodávanějších přípravků na ochranu rostlin a vztahů mezi nimi". Master's thesis, Vysoké učení technické v Brně. Fakulta podnikatelská, 2018. http://www.nusl.cz/ntk/nusl-377387.
TSUYUGUCHI, Aline Barbosa. "Testes de bondade de ajuste para a distribuição Birnbaum-Saunders". Universidade Federal de Campina Grande, 2012. http://dspace.sti.ufcg.edu.br:8080/jspui/handle/riufcg/1333.
CNPq
In this work we study goodness-of-fit tests for the Birnbaum-Saunders distribution. We consider classical tests based on the empirical distribution function (Anderson-Darling, Cramér-von Mises and Kolmogorov-Smirnov) as well as tests based on the empirical characteristic function. We restrict attention to the case in which the parameter vector is unknown and must therefore be estimated. We present simulation studies to assess the performance of the test statistics under study. In addition, we propose Monte Carlo simulation studies of goodness-of-fit tests for the Birnbaum-Saunders distribution with Type-II censored data.
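The setting of this abstract — an EDF statistic for a Birnbaum-Saunders fit whose parameters are estimated from the data — can be sketched in Python. SciPy's `fatiguelife` is its name for the Birnbaum-Saunders distribution; the parametric-bootstrap calibration below is a standard device for EDF tests with estimated parameters and is only an illustration, not necessarily the exact procedure of the thesis.

```python
import numpy as np
from scipy import stats

def ks_gof_birnbaum_saunders(x, n_boot=200, seed=0):
    """KS-type EDF goodness-of-fit test for the Birnbaum-Saunders
    (fatigue-life) distribution with estimated parameters, calibrated by
    a parametric bootstrap (standard KS tables are invalid once the
    parameters are estimated from the data)."""
    rng = np.random.default_rng(seed)
    c, loc, scale = stats.fatiguelife.fit(x, floc=0)  # shape, location, scale
    d_obs = stats.kstest(x, stats.fatiguelife(c, loc, scale).cdf).statistic
    d_boot = np.empty(n_boot)
    for b in range(n_boot):
        # resample from the fitted null, refit, recompute the statistic
        xb = stats.fatiguelife(c, loc, scale).rvs(size=len(x), random_state=rng)
        cb, locb, scaleb = stats.fatiguelife.fit(xb, floc=0)
        d_boot[b] = stats.kstest(xb, stats.fatiguelife(cb, locb, scaleb).cdf).statistic
    return d_obs, float(np.mean(d_boot >= d_obs))
```

Refitting the parameters inside each bootstrap replication is what makes the Monte Carlo null distribution account for the estimation step.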
Huang, Yen-Chin. "Empirical distribution function statistics, speed of convergence, and p-variation". Thesis, Massachusetts Institute of Technology, 1994. http://hdl.handle.net/1721.1/12017.
Yu, Jun. "Empirical characteristics function in time series estimation and a test statistic in financial modelling". Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1998. http://www.collectionscanada.ca/obj/s4/f2/dsk2/ftp02/NQ31169.pdf.
Pełny tekst źródłaKWON, YEIL. "NONPARAMETRIC EMPIRICAL BAYES SIMULTANEOUS ESTIMATION FOR MULTIPLE VARIANCES". Diss., Temple University Libraries, 2018. http://cdm16002.contentdm.oclc.org/cdm/ref/collection/p245801coll10/id/495491.
Ph.D.
Shrinkage estimation has proven to be very useful when dealing with a large number of mean parameters. In this dissertation, we consider the problem of simultaneous estimation of multiple variances and construct a shrinkage-type, non-parametric estimator. We take the non-parametric empirical Bayes approach, starting with an arbitrary prior on the variances. Under an invariant loss function, the resulting Bayes estimator relies on the marginal cumulative distribution function of the sample variances. Replacing the marginal cdf by the empirical distribution function, we obtain a Non-parametric Empirical Bayes estimator for multiple Variances (NEBV). The proposed estimator converges to the corresponding Bayes version uniformly over a large set. Consequently, the NEBV works well in a post-selection setting. We then apply the NEBV to construct confidence intervals for mean parameters in a post-selection setting. It is shown that the intervals based on the NEBV are the shortest among all intervals which guarantee a desired coverage probability. Through real data analysis, we further show that the NEBV-based intervals lead to the smallest number of discordances, a desirable property when we are faced with the current "replication crisis".
Temple University--Theses
Chen, Yuanyuan. "Continuity and compositions of operators with kernels in ultra-test function and ultra-distribution spaces". Doctoral thesis, Linnéuniversitetet, Institutionen för matematik (MA), 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-58076.
Ngunkeng, Grace. "Statistical Analysis of Skew Normal Distribution and its Applications". Bowling Green State University / OhioLINK, 2013. http://rave.ohiolink.edu/etdc/view?acc_num=bgsu1370958073.
Pełny tekst źródłaChopping, M. J. "Linear semi-empirical kernel-driven bidirectional reflectance distribution function models in monitoring semi-arid grasslands from space". Thesis, University of Nottingham, 1998. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.262949.
Yang, Guangyuan. "The Energy Goodness-of-fit Test for Univariate Stable Distributions". Bowling Green State University / OhioLINK, 2012. http://rave.ohiolink.edu/etdc/view?acc_num=bgsu1339476355.
Pełny tekst źródłaOnaran, Özlem, i Engelbert Stockhammer. "Do profits affect investment and employment? An empirical test based on the Bhaduri-Marglin model". Inst. für Volkswirtschaftstheorie und -politik, WU Vienna University of Economics and Business, 2005. http://epub.wu.ac.at/1534/1/document.pdf.
Series: Working Papers Series "Growth and Employment in Europe: Sustainability and Competitiveness"
Gottfridsson, Anneli. "Likelihood ratio tests of separable or double separable covariance structure, and the empirical null distribution". Thesis, Linköpings universitet, Matematiska institutionen, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-69738.
Stewart, Michael Ian. "Asymptotic methods for tests of homogeneity for finite mixture models". University of Sydney. Mathematics and Statistics, 2002. http://hdl.handle.net/2123/855.
Schwartz, Amy K. "Divergent natural selection and the parallel evolution of mating preferences : a model and empirical test for the origins of reproductive isolation". Thesis, McGill University, 2005. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=82432.
Eger, Karl-Heinz, and Evgeni Borisovich Tsoy. "Robustness of Sequential Probability Ratio Tests in Case of Nuisance Parameters". Universitätsbibliothek Chemnitz, 2010. http://nbn-resolving.de/urn:nbn:de:bsz:ch1-201000949.
Loukrati, Hicham. "Tail Empirical Processes: Limit Theorems and Bootstrap Techniques, with Applications to Risk Measures". Thesis, Université d'Ottawa / University of Ottawa, 2018. http://hdl.handle.net/10393/37594.
Mžourek, Zdeněk. "Statistická charakteristická funkce a její využití pro zpracování signálu". Master's thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2014. http://www.nusl.cz/ntk/nusl-220664.
He, Bin. "APPLICATION OF THE EMPIRICAL LIKELIHOOD METHOD IN PROPORTIONAL HAZARDS MODEL". Doctoral diss., University of Central Florida, 2006. http://digital.library.ucf.edu/cdm/ref/collection/ETD/id/4384.
Ph.D.
Department of Mathematics
Sciences
Mathematics
Trönnberg, Filip. "Empirical evaluation of a Markovian model in a limit order market". Thesis, Uppsala universitet, Matematiska institutionen, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-176726.
Pesee, Chatchai. "Stochastic modelling of financial processes with memory and semi-heavy tails". Thesis, Queensland University of Technology, 2005. https://eprints.qut.edu.au/16057/2/Chatchai%20Pesee%20Thesis.pdf.
Pesee, Chatchai. "Stochastic Modelling of Financial Processes with Memory and Semi-Heavy Tails". Queensland University of Technology, 2005. http://eprints.qut.edu.au/16057/.
Bjellerup, Mårten. "Essays on Consumption : - Aggregation, Asymmetry and Asset Distributions". Doctoral thesis, Växjö universitet, Ekonomihögskolan, EHV, 2005. http://urn.kb.se/resolve?urn=urn:nbn:se:vxu:diva-406.
Erguven, Sait. "Path Extraction Of Low Snr Dim Targets From Grayscale 2-d Image Sequences". Master's thesis, METU, 2006. http://etd.lib.metu.edu.tr/upload/12607723/index.pdf.
Pełny tekst źródłachange detection of super pixels, a group of pixels that has sufficient statistics for likelihood ratio testing, is proposed. Super pixels that are determined as transition points are signed on a binary difference matrix and grouped by 4-Connected Labeling method. Each label is processed to find its vector movement in the next frame by Label Destruction and Centroids Mapping techniques. Candidate centroids are put into Distribution Density Function Maximization and Maximum Histogram Size Filtering methods to find the target related motion vectors. Noise related mappings are eliminated by Range and Maneuver Filtering. Geometrical centroids obtained on each frame are used as the observed target path which is put into Optimum Decoding Based Smoothing Algorithm to smooth and estimate the real target path. Optimum Decoding Based Smoothing Algorithm is based on quantization of possible states, i.e. observed target path centroids, and Viterbi Algorithm. According to the system and observation models, metric values of all possible target paths are computed using observation and transition probabilities. The path which results in maximum metric value at the last frame is decided as the estimated target path.
Povalač, Karel. "Sledování spektra a optimalizace systémů s více nosnými pro kognitivní rádio". Doctoral thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2012. http://www.nusl.cz/ntk/nusl-233577.
Madani, Soffana. "Contributions à l’estimation à noyau de fonctionnelles de la fonction de répartition avec applications en sciences économiques et de gestion". Thesis, Lyon, 2017. http://www.theses.fr/2017LYSE1183/document.
The income distribution of a population, the distribution of failure times of a system and the evolution of the surplus in with-profit policies - studied in economics and management - are related to continuous functions belonging to the class of functionals of the distribution function. Our thesis covers the kernel estimation of some functionals of the distribution function with applications in economics and management. In the first chapter, we offer local polynomial estimators in the i.i.d. case of two functionals of the distribution function, written LF and TF, which are useful to produce the smooth estimators of the Lorenz curve and the scaled total time on test transform. The estimation method is described in Abdous, Berlinet and Hengartner (2003) and we prove the good asymptotic behavior of the local polynomial estimators. Until now, Gastwirth (1972) and Barlow and Campo (1975) have defined continuous piecewise estimators of the Lorenz curve and the scaled total time on test transform, which do not respect the continuity of the original curves. Illustrations on simulated and real data are given. The second chapter is intended to provide smooth estimators in the i.i.d. case of the derivatives of the two functionals of the distribution function presented in the last chapter. Apart from the estimation of the first derivative of the function TF with a smooth estimation of the distribution function, the estimation method is the local polynomial approximation of functionals of the distribution function detailed in Berlinet and Thomas-Agnan (2004). Various types of convergence and asymptotic normality are obtained, including the probability density function and its derivatives. Simulations appear and are discussed. The starting point of the third chapter is the Parzen-Rosenblatt estimator (Rosenblatt (1956), Parzen (1964)) of the probability density function.
We first improve the bias of this estimator and its derivatives by using higher order kernels (Berlinet (1993)). Then we find the modified conditions for the asymptotic normality of these estimators. Finally, we build a method to remove boundary effects of the estimators of the probability density function and its derivatives, thanks to higher order derivatives. We are interested, in this final chapter, in the hazard rate function which, unlike the two functionals of the distribution function explored in the first chapter, is not a fraction of two linear functionals of the distribution function. In the i.i.d. case, kernel estimators of the hazard rate and its derivatives are produced from the kernel estimators of the probability density function and its derivatives. The asymptotic normality of the first estimators is logically obtained from the second ones. Then, we are placed in the multiplicative intensity model, a more general framework including censored and dependent data. We complete the described method in Ramlau-Hansen (1983) to obtain good asymptotic properties of the estimators of the hazard rate and its derivatives and we try to adopt the local polynomial approximation in this context. The surplus rate in with-profit policies will be nonparametrically estimated as its mathematical expression depends on transition rates (hazard rates from one state to another) in a Markov chain (Ramlau-Hansen (1991), Norberg (1999))
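The plug-in construction described above — a hazard-rate estimate built from a kernel density estimate and a survival-function estimate — can be sketched minimally in Python. The Gaussian kernel, the fixed bandwidth and the use of the raw empirical survival function are our simplifying assumptions; the thesis uses local polynomial machinery and higher-order kernels.

```python
import numpy as np

def gaussian_kde_1d(x, grid, h):
    """Parzen-Rosenblatt kernel density estimator with a Gaussian kernel."""
    u = (grid[:, None] - x[None, :]) / h
    return np.exp(-0.5 * u ** 2).sum(axis=1) / (len(x) * h * np.sqrt(2 * np.pi))

def kernel_hazard(x, grid, h):
    """Plug-in hazard estimate f_hat(t) / (1 - F_hat(t)), with the empirical
    survival function in the denominator, clipped away from zero."""
    f_hat = gaussian_kde_1d(x, grid, h)
    surv = 1.0 - np.searchsorted(np.sort(x), grid, side="right") / len(x)
    return f_hat / np.clip(surv, 1.0 / len(x), None)
```

For an exponential(1) sample, the estimated hazard should hover around the constant true hazard 1 away from the boundary at zero, where the kernel estimator is biased.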
Santos, Mariana Faria dos. "Modelling claim counts of homogeneous risk groups using copulas". Master's thesis, Instituto Superior de Economia e Gestão, 2010. http://hdl.handle.net/10400.5/2932.
Over the years, modelling the dependence between random variables has been a challenge in many areas, such as insurance and finance. Recently, with the new capital requirement regime for the European insurance business, this subject has gained importance since, according to Solvency II, the insurer's risks should be modelled separately and then aggregated following some dependence structure. The challenge of this framework is to find an accurate way of joining dependent risks so as not to over- or underestimate the capital requirements. The aim of this thesis is to give a practical application of a multivariate model based on copulas, as well as the main theoretical concepts related to the theory of copulas. Although the definition of copula dates from 1959, only recently have authors such as Clemen and Reilly (1999), Daul et al. (2003), Dias (2004), Frees and Valdez (1998) and Embrechts et al. (2003) applied the copula framework to finance and insurance data. In the meantime, estimation procedures and goodness-of-fit tests have been developed in the literature. In this thesis we introduce the theory of copulas together with the study of copula models for insurance data. Beforehand, the definition and properties of copulas, as well as some types of copulas discussed in the literature, are introduced. We present methods to estimate the parameters and a goodness-of-fit test for copulas. Afterwards, we present a summary of Solvency II and a multivariate model to fit claim counts of three homogeneous risk groups that are the core of the automobile business: third-party liability property damage, third-party liability bodily injury and material own damage. The methodology is based on copula models and the procedures are carried out in two steps. First, we model the marginal distributions of each risk group and test the goodness-of-fit of each distribution.
We propose two models to fit the marginal distributions: a discrete model and an approximating continuous model. In the first model we test Poisson and Negative Binomial distributions, whereas in the second we test Gamma and Normal approximations. In spite of being the natural approach to fitting claim counts, the discrete model has some limitations, since copulas have serious restrictions when the marginals are discrete. Thus, a continuous model is proposed as an alternative that avoids these limitations. Finally, we fit different copula families, estimating their parameters through the procedures presented along this thesis. We evaluate the goodness-of-fit using a statistical test based on the empirical copula concept.
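The two-step construction described in this abstract — marginal distributions first, then a copula for the dependence — can be sketched for simulation purposes with a Gaussian copula joining Poisson margins. The Gaussian copula and the parameter values here are illustrative choices, not the families or estimates selected in the thesis.

```python
import numpy as np
from scipy import stats

def gaussian_copula_poisson(n, lambdas, corr, seed=0):
    """Simulate dependent claim counts for several homogeneous risk groups:
    a Gaussian copula with correlation matrix `corr` joins Poisson margins
    with means `lambdas` (margins first, then the dependence structure)."""
    rng = np.random.default_rng(seed)
    z = rng.multivariate_normal(np.zeros(len(lambdas)), np.asarray(corr), size=n)
    u = stats.norm.cdf(z)  # uniform margins carrying the copula dependence
    return np.column_stack([stats.poisson(lam).ppf(u[:, j])
                            for j, lam in enumerate(lambdas)])
```

Pushing the copula's uniforms through the Poisson quantile function preserves each margin exactly, which is why this construction also underlies the discrete-margin caveats mentioned in the abstract.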
Tejkal, Martin. "Vybrané transformace náhodných veličin užívané v klasické lineární regresi". Master's thesis, Vysoké učení technické v Brně. Fakulta strojního inženýrství, 2017. http://www.nusl.cz/ntk/nusl-318798.
Berglund, Filip. "Asymptotics of beta-Hermite Ensembles". Thesis, Linköpings universitet, Matematisk statistik, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-171096.
In this bachelor's thesis we present results on some eigenvalue statistics of the beta-Hermite ensembles, first in the classical cases beta = 1, 2, 4, that is, the Gaussian orthogonal ensemble (consisting of real symmetric matrices), the Gaussian unitary ensemble (consisting of complex Hermitian matrices) and the Gaussian symplectic ensemble (consisting of quaternionic self-dual matrices). We also look at the less studied general beta-Hermite ensembles (consisting of real symmetric tridiagonal matrices). Specifically, we study the empirical distribution function and two different normalizations of the largest eigenvalue. The results we present for these statistics are the convergence of the empirical distribution function to the semicircle distribution, the convergence of the normalized largest eigenvalue to the Tracy-Widom distribution and, under a different normalization, the convergence of the largest eigenvalue to 1. We also illustrate these results with simulations. For the Gaussian unitary ensemble we present an expression for its level density. To aid understanding of the Gaussian symplectic ensemble we present properties of the eigenvalues of quaternionic matrices. Finally, we prove a theorem on the symmetry of the order statistics of the eigenvalues of the beta-Hermite ensembles.
ALENCASTRO, JUNIOR José Vianney Mendonça de. "Conformidade à lei de Newcomb-Benford de grandezas astronômicas segundo a medida de Kolnogorov-Smirnov". Universidade Federal de Pernambuco, 2016. https://repositorio.ufpe.br/handle/123456789/18354.
The Newcomb-Benford law, also known as the significant-digit law, was first described by the astronomer and mathematician Simon Newcomb and was given a statistical grounding only 57 years later, by the physicist Frank Benford. The law governs naturally random quantities and has been used in many fields to select and validate various kinds of data. Our first goal in this work is to propose a substitute for the chi-square method, currently the method commonly used in the literature to verify conformity with the Newcomb-Benford law. This is necessary because, on a data set with a large number of samples, the chi-square method tends to suffer from a statistical problem known as excess power, producing false-negative results. We therefore propose replacing it with the Kolmogorov-Smirnov method, based on the empirical distribution function (EDF), for the analysis of global conformity: this method is more robust, does not suffer from excess power, and is more faithful to the formal definition of Benford's law, since it works with the mantissas rather than with isolated digits. We also propose to investigate a confidence interval for the Kolmogorov-Smirnov statistic based on a bootstrapped chi-square that does not suffer from excess power. In two recently published papers, exoplanet data were analyzed and some quantities were declared to conform to Benford's law; the authors suggest that knowledge of this conformity could be used to analyze the list of candidate objects and thus help identify new exoplanets in that list. Accordingly, another goal of this work is to explore several astronomical databases and catalogues in search of quantities whose conformity to the significant-digit law is not yet known, in order to propose practical applications in the astronomical sciences.
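The mantissa-based measure advocated above has a compact form: under Benford's law the fractional part of log10 of the data is uniform on [0, 1), so global conformity reduces to a Kolmogorov-Smirnov distance between the empirical mantissa distribution and the uniform. This is a sketch of that general idea, not of the thesis's bootstrapped confidence-interval procedure.

```python
import numpy as np

def benford_ks_distance(data):
    """Kolmogorov-Smirnov distance between the empirical distribution of
    log10-mantissas and the Benford null (uniform fractional part of
    log10(x)), i.e. a mantissa-based global conformity measure rather
    than a digit-by-digit chi-square test."""
    m = np.sort(np.log10(np.abs(data)) % 1.0)  # fractional part of log10
    n = len(m)
    ecdf_hi = np.arange(1, n + 1) / n          # ECDF just after each point
    ecdf_lo = np.arange(0, n) / n              # ECDF just before each point
    return max(np.max(ecdf_hi - m), np.max(m - ecdf_lo))
```

Data spanning several orders of magnitude with uniform log10 yields a small distance; a degenerate sample yields a large one.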
ARAÚJO, Raphaela Lima Belchior de. "Família composta Poisson-Truncada: propriedades e aplicações". Universidade Federal de Pernambuco, 2015. https://repositorio.ufpe.br/handle/123456789/16315.
CAPES
This work analyzes properties of the Compound N family of probability distributions and proposes the Compound Poisson-Truncated sub-family as a means of composing probability distributions. Its properties were studied and a new distribution was investigated: the Compound Poisson-Truncated Normal distribution. This distribution has three parameters and the flexibility to model multimodal data. We demonstrate that its density is given by an infinite mixture of normal densities in which the weights are given by the Poisson-Truncated probability mass function. Among the explored properties of this distribution are the characteristic function and expressions for the calculation of moments. Three estimation methods were analyzed for the parameters of the Compound Poisson-Truncated Normal distribution, namely the method of moments, the empirical characteristic function (ECF) method and maximum likelihood (ML) via the EM algorithm. Simulations comparing these three methods were performed and, finally, to illustrate the potential of the proposed distribution, numerical results with real data modeling are presented.
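The density structure described in this abstract — an infinite mixture of normal densities weighted by a zero-truncated Poisson pmf — can be sketched by truncating the series. The component parametrization N(n·mu, n·sigma²) below is purely illustrative; the dissertation's exact component form may differ.

```python
import numpy as np
from scipy import stats

def compound_poisson_truncated_normal_pdf(x, lam, mu, sigma, n_terms=60):
    """Density of a mixture of normal components whose weights come from a
    zero-truncated Poisson(lam) pmf, with the infinite series truncated at
    n_terms. Component parameters N(n*mu, n*sigma^2) are an assumption."""
    n = np.arange(1, n_terms + 1)
    # zero-truncated Poisson weights: pmf(n) / P(N >= 1)
    w = stats.poisson(lam).pmf(n) / (1.0 - np.exp(-lam))
    comp = stats.norm.pdf(np.asarray(x, dtype=float)[..., None],
                          loc=n * mu, scale=np.sqrt(n) * sigma)
    return comp @ w  # weighted sum of component densities
```

With well-separated component means the resulting density is multimodal, which is the flexibility the abstract emphasizes.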
Saleem, Rashid. "Towards an end-to-end multiband OFDM system analysis". Thesis, University of Manchester, 2012. https://www.research.manchester.ac.uk/portal/en/theses/towards-an-endtoend-multiband-ofdm-system-analysis(e711f32f-1ac6-4b48-8f4e-58309c0482d3).html.
McGarry, Gregory John. "Model-based mammographic image analysis". Thesis, Queensland University of Technology, 2002.
Chou, Cheng-Chieh, and 周正杰. "Empirical-Distribution-Function Tests for the Beta-Binomial Model". Thesis, 2005. http://ndltd.ncl.edu.tw/handle/63290669580577465856.
Tamkang University
Master's Program, Department of Mathematics
93
Empirical-distribution-function (EDF) goodness-of-fit tests are considered for the beta-binomial model. Testing procedures based on EDF statistics are given. A Monte Carlo study is conducted to investigate the accuracy and power of the tests against various alternative distributions. Our method is found to produce considerably greater power than that of Garren et al. (2001) in most cases. The tests are applied to data sets from environmental toxicity studies.
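A minimal version of such an EDF testing procedure for the beta-binomial model can be sketched as follows. The method-of-moments fit and the Kolmogorov-Smirnov-type statistic are illustrative choices (the thesis considers several EDF statistics and its own estimation details); `scipy.stats.betabinom` supplies the model.

```python
import numpy as np
from scipy import stats

def betabinom_moment_fit(x, n):
    """Method-of-moments estimates (a, b) for a beta-binomial with known n
    (illustrative estimator, clipped to stay admissible)."""
    m, v = x.mean(), x.var()
    p = m / n
    rho = (v / (n * p * (1.0 - p)) - 1.0) / (n - 1.0)  # intra-class correlation
    rho = min(max(rho, 1e-6), 1.0 - 1e-6)
    s = 1.0 / rho - 1.0                                 # s = a + b
    return p * s, (1.0 - p) * s

def edf_ks_betabinom(x, n, n_mc=200, seed=0):
    """KS-type EDF goodness-of-fit test for the beta-binomial model with
    estimated parameters, calibrated by parametric Monte Carlo."""
    rng = np.random.default_rng(seed)
    k = np.arange(n + 1)
    def ks_stat(y):
        a, b = betabinom_moment_fit(y, n)
        emp = np.searchsorted(np.sort(y), k, side="right") / len(y)
        return np.max(np.abs(emp - stats.betabinom(n, a, b).cdf(k)))
    d_obs = ks_stat(x)
    a, b = betabinom_moment_fit(x, n)
    null = stats.betabinom(n, a, b)
    d_mc = np.array([ks_stat(null.rvs(size=len(x), random_state=rng))
                     for _ in range(n_mc)])
    return d_obs, float(np.mean(d_mc >= d_obs))
```

As in the abstract, the null distribution of the statistic is obtained by Monte Carlo under the fitted model, with parameters re-estimated in each replication.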
Freitas, Luísa Maria Ribeiro de. "Testes de ajustamento baseados na função característica ponderada". Master's thesis, 2016. http://hdl.handle.net/10316/48129.
In 2014, Meintanis, Swanepoel and Allison proposed a goodness-of-fit test based on a new function they introduced, called the probability weighted characteristic function because it generalizes the usual characteristic function. The study of this test is the main goal of this dissertation. We start by establishing the main properties of the new function and then examine in detail the goodness-of-fit test that depends on it. The starting point is the simple goodness-of-fit test against a fixed distribution, which we then generalize to a family of distributions. For both cases, the corresponding test statistics, limiting null distributions and consistency results are presented. Finally, we show the results of a simulation study carried out to analyze the power of the composite goodness-of-fit test, taking a test recommended in the literature as a reference.
Shao, Su-chun, i 邵淑君. "Distribution Systems and Asymmetric Information: An Empirical Test on the Individual Health Insurance". Thesis, 2008. http://ndltd.ncl.edu.tw/handle/86699194427951835465.
Pełny tekst źródła
逢甲大學
風險管理與保險研究所
96
As the insurance market develops into multiple channels, different distribution systems differ in their marketing and access patterns, which gives rise to many types of agency problems. While much research concerns the marketing strategies and development trends of these channels, this study adopts an empirical analysis from several angles to examine how policy quality and losses differ across distribution systems. Five dimensions are discussed: (1) the commission system, (2) product complexity, (3) insurance professionalism, (4) the marketing contracting mode, and (5) after-sales service. Using sample data from a life insurance company, with the insured's age, gender, insured amount, and family policy as independent variables, a Tobit model is used to analyze the correlation between direct writing, bancassurance, and telemarketing on the one hand and policy losses on the other. The study has two empirical findings. First, in terms of loss frequency, bancassurance and telemarketing are significantly negatively correlated relative to direct writing. Second, in terms of loss ratio, bancassurance and telemarketing are again significantly negatively correlated relative to direct writing. The main reason is that, for the sake of sales volume and commissions, the gathering of important information and the maintenance of policy quality are neglected; in other words, the negative effects of the agency problem in direct writing are conspicuous. Each distribution system has different channel attributes, so insurers expanding their business should support product differentiation with correct underwriting mechanisms to reduce information asymmetry and agency costs, while also attending to cost efficiency, profit efficiency, and the maintenance of policy quality.
Hsu, Pi-chun, i 徐碧君. "A Limiting Distribution Function Approximation to K-S Statistics for Normality Test". Thesis, 2009. http://ndltd.ncl.edu.tw/handle/78424314561879835833.
Pełny tekst źródła
靜宜大學
應用數學研究所
98
The main purpose of this study is to explore the limiting distribution of the Kolmogorov-Smirnov statistic in the normality test. Based on the Gumbel extreme value theorem, an Empirical Limiting Distribution (ELD) is developed by the Monte Carlo method and compared with the Dallal & Wilkinson (DW, 1986) approximation. The two curves are found to be close to each other, but the ELD curve does not produce unreasonable probability values (p > 1) as the D-W curve does. When they are applied to the normality test, the estimated type I error probability is close to the significance level. For most alternative distributions, the power of the test reaches 1 when n > 100. However, if the alternative distribution is close to normal, such as Beta(2,2), it is not easily detected: at the 0.05 significance level, even with n = 300, both methods attain a power of only about 0.6. Finally, the two methods are applied to fitting the size distribution of raindrops with normal and lognormal distributions.
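A sketch of the Lilliefors-type procedure behind this comparison (our own illustration, not the thesis' ELD code): the KS normality statistic with estimated mean and standard deviation, whose null distribution must be obtained by Monte Carlo precisely because the parameters are estimated.

```python
# KS statistic for normality with estimated parameters, plus its simulated
# null distribution (the estimation step is what shifts the limiting law).
import math
import random
import statistics

def normal_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def ks_normal(sample):
    """KS distance of the standardized sample from the standard normal CDF."""
    m = statistics.fmean(sample)
    s = statistics.stdev(sample)
    z = sorted((x - m) / s for x in sample)
    n = len(z)
    return max(
        max((i + 1) / n - normal_cdf(v), normal_cdf(v) - i / n)
        for i, v in enumerate(z)
    )

rng = random.Random(1)
# Null distribution for n = 30, re-estimating the parameters each replication.
null = sorted(
    ks_normal([rng.gauss(0.0, 1.0) for _ in range(30)]) for _ in range(400)
)
crit_05 = null[int(0.95 * len(null))]  # approximate 5% critical value
```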
Wu, Yu-Tien, i 吳昱瑱. "Evaluation of Semi-empirical Bidirectional Reflectance Distribution Function Models for Terrestrial Laser Scanning Data". Thesis, 2016. http://ndltd.ncl.edu.tw/handle/58899631706766623159.
Pełny tekst źródła
國立成功大學
測量及空間資訊學系
104
Terrestrial Laser Scanning (TLS) intensity data can be regarded as the relative reflectance of the surface material. TLS intensity is a function of the angle of incidence of the laser beam, the range between the laser and the surface, and the surface type. The Bidirectional Reflectance Distribution Function (BRDF) was used to describe the relation between the angle of incidence and TLS intensity. Five semi-empirical remote-sensing BRDF models, including the Rahman–Pinty–Verstraete (RPV) model and four kernel-driven models derived from Ross (1981), were evaluated using twelve flat sample materials. The goodness of fit of the BRDF models to the TLS intensity is evaluated by visual examination of the fitted trend and the corresponding coefficients of determination. The results show that the Rossthin-Roujean model and the RPV model are the most suitable for TLS data.
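As a simplified illustration of this fitting-and-scoring workflow (a plain Lambertian cosine model of our own, not one of the five evaluated BRDF models): fit intensity = a * cos(incidence angle) by least squares and score the fit with the coefficient of determination R^2, the same criterion used above.

```python
# Least-squares cosine fit of intensity vs. incidence angle, scored by R^2.
import math

def fit_cosine(angles_deg, intensities):
    """Least-squares slope for I = a*cos(theta), plus R^2 of the fit."""
    c = [math.cos(math.radians(t)) for t in angles_deg]
    a = sum(ci * yi for ci, yi in zip(c, intensities)) / sum(ci * ci for ci in c)
    fitted = [a * ci for ci in c]
    mean_y = sum(intensities) / len(intensities)
    ss_res = sum((y - f) ** 2 for y, f in zip(intensities, fitted))
    ss_tot = sum((y - mean_y) ** 2 for y in intensities)
    return a, 1.0 - ss_res / ss_tot

# Perfectly Lambertian synthetic data recover the scale factor a exactly.
a, r2 = fit_cosine([0.0, 30.0, 60.0],
                   [2.0, 2.0 * math.cos(math.radians(30.0)), 1.0])
```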
Su, Lo-Shin, i 蘇樂生. "An Empirical Test of Hedging Model of Futures Markets- In the Condition of Different Basis Distribution". Thesis, 1997. http://ndltd.ncl.edu.tw/handle/36367920550202306413.
Pełny tekst źródła
銘傳大學
金融研究所
85
This study's purpose is to find time-varying optimal hedge ratios that can improve hedging performance and reduce transaction costs. The functions of a futures market are risk transference, price discovery, and speculation, of which risk transference is the most important, and the hedge ratio is its critical factor. An optimal hedge ratio is usually defined as the proportion of the cash position that should be covered with an opposite position in the futures market. In effect, hedging replaces price risk with basis risk, where the basis is the spot price minus the futures price, so the distribution of the basis has a material effect on estimating hedge ratios. This study attempts to bring the basis distribution close to a normal distribution and thereby obtain time-varying optimal hedge ratios. Traditional hedge ratios are derived under different assumptions about the basis distribution. The first model assumes the change in the basis is constant and uses a linear regression to obtain a single hedge ratio for the whole hedging period; it ignores time-varying information in the market. The second model assumes the basis follows a random walk and uses a generalized autoregressive conditional heteroskedasticity (GARCH) model to obtain time-varying hedge ratios. In an empirical comparison of the two models for estimating hedge ratios on futures contracts, Myers (1991) found the hedging performance of the GARCH model to be only marginally better than that of the linear regression model; moreover, because GARCH is too sensitive in adjusting hedge ratios, the ratios change violently and induce heavy transaction costs. We argue that if the basis distribution can be made to approach a normal distribution, a time-varying optimal hedge ratio can be obtained.
The time-varying optimal hedge model is superior to the traditional linear regression and GARCH hedge models: it resolves both the linear regression model's neglect of market information and the heavy transaction costs induced by the GARCH model. It is worth applying to improve hedging performance and reduce transaction costs in futures markets.
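The constant hedge ratio from the linear regression model described above is simply the OLS slope of spot-price changes on futures-price changes, h* = Cov(dS, dF) / Var(dF). A minimal sketch with toy numbers of our own:

```python
# OLS (minimum-variance) hedge ratio: slope of spot changes on futures changes.
def ols_hedge_ratio(spot_changes, futures_changes):
    n = len(futures_changes)
    mean_s = sum(spot_changes) / n
    mean_f = sum(futures_changes) / n
    cov = sum((s - mean_s) * (f - mean_f)
              for s, f in zip(spot_changes, futures_changes))
    var = sum((f - mean_f) ** 2 for f in futures_changes)
    return cov / var

# If the spot always moves 0.8x the futures move, the estimate is exactly 0.8.
h = ols_hedge_ratio([0.8, -1.6, 2.4, 0.0], [1.0, -2.0, 3.0, 0.0])
```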
Wu, Yu-Hsin, i 吳裕新. "The Relationship between Local Economic Growth and Income Distribution in China-An Empirical Test of Kuznets Hypothesis". Thesis, 2009. http://ndltd.ncl.edu.tw/handle/30946106689987302876.
Pełny tekst źródła
Lam, Wai Yan Wendy. "An empirical examination of the impact of item parameters on IRT information functions in mixed format tests". 2012. https://scholarworks.umass.edu/dissertations/AAI3498358.
Pełny tekst źródła
Koné, Fangahagnian. "Test d'adéquation à la loi de Poisson bivariée au moyen de la fonction caractéristique". Thèse, 2016. http://hdl.handle.net/1866/18776.
Pełny tekst źródła
Our aim in this thesis is to apply the goodness-of-fit test based on empirical characteristic functions proposed by Jiménez-Gamero et al. (2009) to the bivariate Poisson distribution. We first evaluate the test's behaviour for the univariate Poisson distribution and find that the estimated type I error probabilities are close to the nominal values. Next, we extend it to the bivariate case and compare its power with the dispersion index test for the bivariate Poisson, Crockett's quick test for the bivariate Poisson, and the two test families proposed by Novoa-Muñoz and Jiménez-Gamero (2014). Simulation results show that the probability of type I error is close to the claimed level but that the test is generally less powerful than the others. We also find that the dispersion index test should be two-sided, whereas it rejects for large values only. Finally, the p-values of all these tests are computed on a real soccer dataset. The p-value of our test is 0.009, so we reject the hypothesis that the data come from a bivariate Poisson distribution, while the tests proposed by Novoa-Muñoz and Jiménez-Gamero (2014) lead to a different conclusion.
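A sketch of the univariate dispersion index statistic mentioned above (our own illustration, not the thesis code): under a Poisson null, D = (n-1) * s^2 / xbar is approximately chi-square with n-1 degrees of freedom, so both unusually small (underdispersed) and unusually large (overdispersed) values are evidence against the Poisson hypothesis, which is why a two-sided rejection region is argued for.

```python
# Dispersion index for count data: compares sample variance to sample mean,
# which are equal under a Poisson model.
import statistics

def dispersion_index(counts):
    xbar = statistics.fmean(counts)
    s2 = statistics.variance(counts)  # sample variance, ddof = 1
    return (len(counts) - 1) * s2 / xbar
```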
Liu, Yu-Lun, i 劉宇倫. "The Interaction among Taiwan, American, China,South Korea , and Hong Kong Stock Market: An Empirical Analysis on Cointegration Approach Rank Test and Impulse Response Function". Thesis, 2011. http://ndltd.ncl.edu.tw/handle/36765127780391741802.
Pełny tekst źródła
國立高雄第一科技大學
金融研究所
99
This paper uses the Breitung (2001) nonlinear cointegration rank test to analyze the cointegration relations and long-run equilibrium among the Taiwan, American, China, South Korea, and Hong Kong stock markets during the subprime mortgage crisis, from January 4, 2007 through December 31, 2009. In addition, impulse response functions are used to detect changes in the co-movement between the Taiwan and American stock markets as exogenous variables change. Finally, Taiwan's weighted stock index serves as the benchmark for the proposed portfolio investment standard. The empirical findings are as follows. 1. During the research period, the subprime crisis caused shifts in the Taiwan and other stock indexes, and there are long-run cointegration trends between Taiwan and the other national stock markets. 2. The T+15 trading day of the American stock market significantly influences the T trading day of the Taiwan stock market, and the T+30 trading day of the China stock market significantly influences the T trading day of the Taiwan stock market. 3. Taiwan's response to South Korea is more significant than its response to Hong Kong: the T+30 trading day of the South Korean stock market significantly influences the T trading day of the Taiwan stock market, and within Taiwan itself the impact was more obvious before the subprime mortgage crisis. 4. Taiwan's largest impulse response is to the American market, after which it gradually converges, while the Hong Kong and South Korean stock markets also affect the Taiwan stock market; before the subprime mortgage crisis the Taiwan stock market was affected to a greater degree by the US, and afterwards to a greater degree by China.
Hu, Keng-Hao, i 胡耿豪. "New weighted mean test and interval estimation for the shape parameter of the lifetime distribution with bathtub-shaped or increasing failure rate function under the censored sample". Thesis, 2006. http://ndltd.ncl.edu.tw/handle/50022998230919898829.
Pełny tekst źródła
淡江大學
統計學系碩士班
94
In real life, we often encounter censored samples: because of restrictions of time and cost, or human error, we cannot obtain all observations. Such data arise frequently in reliability and survival studies, and traditional statistical inference cannot be applied to them directly. This thesis therefore studies statistical inference under multiply type II censored samples. We consider the lifetime distribution whose shape parameter governs a bathtub-shaped or increasing failure rate function. First, we provide 12 unweighted and weighted pivotal quantities for testing the shape parameter and for constructing its confidence interval under multiply type II censoring. Second, we identify the best test statistic, namely the one with the highest power among all the test statistics, and we obtain the best pivotal quantities, those with the shortest tolerance length. Finally, we give two examples and a Monte Carlo simulation to assess the behavior of these pivotal quantities (higher power and shorter confidence intervals) when testing the null hypothesis at a given significance level and constructing a confidence interval of the shape parameter at a given confidence coefficient.
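A hedged sketch under an assumption of ours (the thesis does not name the distribution here; we use Chen's (2000) bathtub-shaped lifetime model, F(t) = 1 - exp(lam*(1 - exp(t**beta)))): if T follows this law, then exp(T**beta) - 1 is exponential with rate lam, so Q = 2*lam*sum(exp(t_i**beta) - 1) is a chi-square(2n) pivotal quantity of the general kind used to test the shape parameter beta.

```python
# Pivotal quantity for the shape parameter of Chen's bathtub distribution
# (complete-sample case, for illustration; the thesis treats censored data).
import math
import random

def pivot(sample, beta, lam):
    """Q = 2*lam*sum(exp(t**beta) - 1), chi-square(2n) under the true parameters."""
    return 2.0 * lam * sum(math.exp(t ** beta) - 1.0 for t in sample)

def chen_sample(n, beta, lam, rng):
    """Inverse-CDF simulation from Chen's distribution."""
    return [
        math.log(1.0 - math.log(1.0 - rng.random()) / lam) ** (1.0 / beta)
        for _ in range(n)
    ]

rng = random.Random(2)
n, beta, lam = 200, 1.5, 0.8
q = pivot(chen_sample(n, beta, lam, rng), beta, lam)
# Under the true parameters, q fluctuates around the chi-square mean 2n = 400;
# chi-square quantiles then give tests and confidence intervals for beta.
```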
Loureiro, Inês Alexandra Gonçalves. "O teste de uma hipótese univariada de normalidade por combinação de testes baseados na função característica empírica". Master's thesis, 2014. http://hdl.handle.net/10316/33687.
Pełny tekst źródła
In 1983, Epps and Pulley proposed a family of tests for assessing univariate normality based on a weighted L2 distance between the empirical characteristic function of the scaled residuals and the characteristic function of the standard normal distribution, whose weight function depends on a real parameter β. We consider a new multiple test procedure which combines a finite set of Epps and Pulley test statistics, including extreme (β ∈ {0, 1}) and non-extreme (β ∈ ]0, 1[) choices of the tuning parameter β. For each of the statistics involved in the combination, and for the multiple test, we study their main properties under the null hypothesis of normality and we establish the convergence of the associated tests. Finally, we present the results of a simulation study carried out to analyze the power of the proposed multiple test and compare it with other highly recommended normality tests.
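Our sketch of the classical Epps-Pulley statistic with standard-normal weight (one member of the family in its usual closed form; the dissertation's combined multiple test is not reproduced here):

```python
# Epps-Pulley normality statistic: weighted L2 distance between the empirical
# characteristic function of standardized data and exp(-t^2/2), in closed form.
import math

def epps_pulley(sample):
    n = len(sample)
    m = sum(sample) / n
    s = math.sqrt(sum((x - m) ** 2 for x in sample) / n)  # ML scale estimate
    z = [(x - m) / s for x in sample]
    pair_term = sum(
        math.exp(-0.5 * (z[j] - z[k]) ** 2)
        for j in range(n) for k in range(j + 1, n)
    )
    single_term = sum(math.exp(-0.25 * v * v) for v in z)
    return (1.0 + n / math.sqrt(3.0)
            + (2.0 / n) * pair_term
            - math.sqrt(2.0) * single_term)
```

Large values of the statistic indicate departure from normality; critical values are obtained from the (simulated or tabulated) null distribution.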
Augustyniak, Maciej. "Une famille de distributions symétriques et leptocurtiques représentée par la différence de deux variables aléatoires gamma". Thèse, 2008. http://hdl.handle.net/1866/8192.
Pełny tekst źródła
Odintsov, Kirill. "Rozhodovací úlohy a empirická data; aplikace na nové typy úloh". Master's thesis, 2013. http://www.nusl.cz/ntk/nusl-328310.
Pełny tekst źródła
Σπυρόπουλος, Πέτρος. "Κλιματικοί δείκτες και επεξεργασία χρονοσειρών βροχόπτωσης στην Δυτική Ελλάδα". Thesis, 2009. http://nemertes.lis.upatras.gr/jspui/handle/10889/2409.
Pełny tekst źródła
This work deals with the processing of annual and seasonal precipitation series from 12 stations in West Greece over a 30-year study period (1975-2004); for 8 of the 12 stations, where possible, the processing also covers a 50-year study period (1956-2005). Using the annual precipitation height as a climatic index, the twelve stations as a whole are generally characterized by a combination of semi-wet and wet climatic types. Applying the nonparametric Mann-Kendall test for the existence of trend, no significant trend appears in the 30-year period (1975-2004), with the exception of the annual precipitation heights of Pirgos, which show a significant negative trend. During the 50-year period (1956-2005), significant negative trends occur on a seasonal (mainly spring) and annual basis for half of the eight stations examined. The gamma distribution is the type of statistical distribution that most effectively describes the precipitation height, and in our case its parameters per station are computed for the period 1975-2004 using the maximum likelihood method. In the framework of a spectral analysis of the precipitation series (to verify periodicity in the variance of precipitation rates), 7 stations are used over a 50-year study period (1955-2004) with the Blackman-Tukey method. The precipitation series show no periodicity during the autumn and spring seasons, in contrast with the winter season and the annual rainfall values, where three similar periodicity components appear in the stations' spectra. This reflects the fact that the West Greece stations as a whole are influenced by almost the same barometric pressure systems, which leads the variance of precipitation rates to show common periodicity components.
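The Mann-Kendall trend test used above rests on a simple statistic; here is our minimal sketch of it: S sums the signs of all pairwise differences, so S near zero indicates no monotone trend, while strongly positive or negative S indicates an increasing or decreasing trend (significance is then judged via the known null variance of S).

```python
# Mann-Kendall S statistic for monotone trend in a time series.
def mann_kendall_s(series):
    s = 0
    for i in range(len(series)):
        for j in range(i + 1, len(series)):
            diff = series[j] - series[i]
            s += (diff > 0) - (diff < 0)  # sign of the pairwise difference
    return s
```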