Academic literature on the topic 'Mixture models'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Mixture models.'
Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online, whenever these are available in the metadata.
Journal articles on the topic "Mixture models"
Razzaghi, Mehdi, Geoffrey J. McLachlan, and Kaye E. Basford. "Mixture Models." Technometrics 33, no. 3 (August 1991): 365. http://dx.doi.org/10.2307/1268796.
Razzaghi, Mehdi. "Mixture Models." Technometrics 33, no. 3 (August 1991): 365–66. http://dx.doi.org/10.1080/00401706.1991.10484850.
Ueda, Naonori, Ryohei Nakano, Zoubin Ghahramani, and Geoffrey E. Hinton. "SMEM Algorithm for Mixture Models." Neural Computation 12, no. 9 (September 1, 2000): 2109–28. http://dx.doi.org/10.1162/089976600300015088.
Achcar, Jorge A., Emílio A. Coelho-Barros, and Josmar Mazucheli. "Cure fraction models using mixture and non-mixture models." Tatra Mountains Mathematical Publications 51, no. 1 (November 1, 2012): 1–9. http://dx.doi.org/10.2478/v10127-012-0001-4.
Le, Si Quang, Nicolas Lartillot, and Olivier Gascuel. "Phylogenetic mixture models for proteins." Philosophical Transactions of the Royal Society B: Biological Sciences 363, no. 1512 (October 7, 2008): 3965–76. http://dx.doi.org/10.1098/rstb.2008.0180.
McLachlan, Geoffrey J., Sharon X. Lee, and Suren I. Rathnayake. "Finite Mixture Models." Annual Review of Statistics and Its Application 6, no. 1 (March 7, 2019): 355–78. http://dx.doi.org/10.1146/annurev-statistics-031017-100325.
Shanmugam, Ramalingam. "Finite Mixture Models." Technometrics 44, no. 1 (February 2002): 82. http://dx.doi.org/10.1198/tech.2002.s651.
Nemec, James M., and Amanda F. L. Nemec. "Mixture models for studying stellar populations. II - Multivariate finite mixture models." Astronomical Journal 105 (April 1993): 1455. http://dx.doi.org/10.1086/116523.
Verbeek, J. J., N. Vlassis, and B. Kröse. "Efficient Greedy Learning of Gaussian Mixture Models." Neural Computation 15, no. 2 (February 1, 2003): 469–85. http://dx.doi.org/10.1162/089976603762553004.
Focke, Walter W. "Mixture Models Based on Neural Network Averaging." Neural Computation 18, no. 1 (January 1, 2006): 1–9. http://dx.doi.org/10.1162/089976606774841576.
Dissertations / Theses on the topic "Mixture models"
Xiang, Sijia. "Semiparametric mixture models." Diss., Kansas State University, 2014. http://hdl.handle.net/2097/17338.
Department of Statistics
Weixin Yao
This dissertation consists of three parts related to semiparametric mixture models. In Part I, we construct the minimum profile Hellinger distance (MPHD) estimator for a class of semiparametric mixture models in which one component has a known distribution with possibly unknown parameters, while the other component density and the mixing proportion are unknown. Such semiparametric mixture models have often been used in biology and in sequential clustering algorithms. In Part II, we propose a new class of semiparametric mixtures of regression models, where the mixing proportions and variances are constants but the component regression functions are smooth functions of a covariate. A one-step backfitting estimate and two EM-type algorithms are proposed to achieve the optimal convergence rate for both the global parameters and the nonparametric regression functions. We derive the asymptotic properties of the proposed estimates and show that both proposed EM-type algorithms preserve the asymptotic ascent property. In Part III, we apply the idea of the single-index model to mixtures of regression models and propose three new classes of models: the mixture of single-index models (MSIM), the mixture of regression models with varying single-index proportions (MRSIP), and the mixture of regression models with varying single-index proportions and variances (MRSIPV). Backfitting estimates and the corresponding algorithms are proposed for the new models to achieve the optimal convergence rate for both the parameters and the nonparametric functions. We show that the nonparametric functions can be estimated as if the parameters were known, and that the parameters can be estimated with the same rate of convergence, n^(-1/2), that is achieved in a parametric model.
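The EM-type estimation the abstract refers to can be illustrated, in a much simpler fully parametric setting, by a standard EM algorithm for a two-component univariate Gaussian mixture. This is a generic sketch, not the dissertation's semiparametric backfitting procedure; all names and initialization choices here are illustrative.

```python
import numpy as np

def em_two_component(x, n_iter=100):
    """Standard EM for a two-component univariate Gaussian mixture (illustrative)."""
    # Crude initialization from the sample quantiles and overall spread.
    mu = np.quantile(x, [0.25, 0.75])
    sigma = np.full(2, x.std())
    w = np.array([0.5, 0.5])  # mixing proportions
    for _ in range(n_iter):
        # E-step: posterior responsibilities of each component for each point.
        dens = np.stack([
            w[k] / (sigma[k] * np.sqrt(2 * np.pi))
            * np.exp(-0.5 * ((x - mu[k]) / sigma[k]) ** 2)
            for k in range(2)
        ])
        resp = dens / dens.sum(axis=0)
        # M-step: weighted updates; each iteration cannot decrease the
        # likelihood (the "ascent property" mentioned in the abstract).
        nk = resp.sum(axis=1)
        w = nk / len(x)
        mu = (resp * x).sum(axis=1) / nk
        sigma = np.sqrt((resp * (x - mu[:, None]) ** 2).sum(axis=1) / nk)
    return w, mu, sigma

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0, 1, 800), rng.normal(5, 1, 1200)])
w, mu, sigma = em_two_component(x)
```

On this well-separated synthetic sample, the estimated means land near the true values 0 and 5, and the proportions near 0.4 and 0.6.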
Haider, Peter. "Prediction with Mixture Models." Phd thesis, Universität Potsdam, 2013. http://opus.kobv.de/ubp/volltexte/2014/6961/.
Learning a model for the relationship between the input attributes and annotated target attributes of data instances serves two purposes. On the one hand, it enables the prediction of the target attribute for instances without annotation. On the other hand, the parameters of the model can provide useful insights into the structure of the data. If the data have an inherent partition structure, it is natural to reflect this structure in the model. Such mixture models generate predictions by combining the individual predictions of the mixture components, which correspond to the partitions of the data. Often the partition structure is latent and must be inferred while learning the mixture model. A direct evaluation of the accuracy of the inferred partition structure is impossible in many cases, because no ground-truth reference data are available for comparison. However, it can be assessed indirectly by measuring the prediction accuracy of the mixture model based on it. This thesis deals with the interplay between improving prediction accuracy by uncovering latent partitionings in data, and evaluating the estimated structure by measuring the accuracy of the resulting prediction model. In the application of filtering unwanted e-mails, the e-mails in the training set are latently partitioned into advertising campaigns. Uncovering this latent structure allows future e-mails to be filtered with very low false-positive rates. In this thesis, a Bayesian partitioning model is developed to model this partition structure. Knowledge of the partitioning of e-mails into campaigns also helps determine which e-mails were sent at the instigation of the same network of infiltrated computers, so-called botnets. This is a further layer of latent partitioning.
Uncovering this latent structure makes it possible to increase the accuracy of e-mail filters and to defend effectively against distributed denial-of-service attacks. To this end, a discriminative partitioning model based on the graph of observed e-mails is derived in this thesis. The partitionings inferred with this model are evaluated via their performance in predicting the campaigns of new e-mails. Furthermore, when classifying the content of an e-mail, statistical information about the sending server can be valuable. Learning a model that can use this information requires training data that contain server statistics. In order to additionally use training data in which the server statistics are missing, a model is developed that is a mixture over potentially all imputations of them. A further application is the prediction of the navigation behavior of visitors to a website. Here there is no a priori partitioning of the users. However, it is necessary to create a partitioning in order to understand different usage scenarios and to design different layouts for them. The presented approach simultaneously optimizes the model's ability both to determine the best partition and to generate predictions about behavior by means of this partition. Each model is evaluated on real data and compared with reference methods. The results show that explicitly modeling the assumptions about the latent partition structure leads to improved predictions. In cases where the prediction accuracy cannot be optimized directly, adding a small number of higher-level, directly adjustable parameters proves useful.
Qi, Meng. "Development in Normal Mixture and Mixture of Experts Modeling." UKnowledge, 2016. http://uknowledge.uky.edu/statistics_etds/15.
Polsen, Orathai. "Nonparametric regression and mixture models." Thesis, University of Leeds, 2011. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.578651.
James, S. D. "Mixture models for time series." Thesis, Swansea University, 2001. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.637395.
Sandhu, Manjinder Kaur. "Optimal designs for mixture models." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1995. http://hub.hku.hk/bib/B31213583.
Sánchez, Luis Enrique Benites. "Finite mixture of regression models." Universidade de São Paulo, 2018. http://www.teses.usp.br/teses/disponiveis/45/45133/tde-10052018-131627/.
This thesis, composed of three articles, aims to propose extensions of finite mixtures in regression models. Here we consider a flexible class of both univariate and multivariate distributions, which allow adequate modeling of asymmetric data that exhibit multimodality, heavy tails, and atypical observations. This class includes as special cases the skew-normal, skew-t, skew-slash, and contaminated skew-normal distributions, as well as their symmetric counterparts. Initially, a model is proposed based on the assumption that the errors follow a finite mixture of scale mixtures of skew-normal (SMSN) distributions instead of the conventional normal distribution. Next, we present a censored regression model in which the error follows a finite mixture of scale mixtures of normal (SMN) distributions. Finally, a finite mixture of multivariate regressions is considered, in which the error has a multivariate SMSN distribution. For all the proposed models, two R software packages were developed, which are illustrated in the appendix.
Li, Xiongya. "Robust multivariate mixture regression models." Diss., Kansas State University, 2017. http://hdl.handle.net/2097/38427.
Department of Statistics
Weixing Song
In this dissertation, we proposed a new robust estimation procedure for two multivariate mixture regression models and applied this novel method to functional mapping of dynamic traits. In the first part, a robust estimation procedure for the mixture of classical multivariate linear regression models is discussed, assuming that the error terms follow a multivariate Laplace distribution. An EM algorithm is developed based on the fact that the multivariate Laplace distribution is a scale mixture of the multivariate standard normal distribution. The performance of the proposed algorithm is thoroughly evaluated in simulation and comparison studies. In the second part, a similar idea is extended to the mixture of linear mixed regression models, assuming that the random effect and the regression error jointly follow a multivariate Laplace distribution. Compared with the existing robust t procedure in the literature, simulation studies indicate that the finite-sample performance of the proposed estimation procedure outperforms, or is at least comparable to, the robust t procedure. Unlike the t procedure, there is no need to determine the degrees of freedom, so the new robust estimation procedure is computationally more efficient. The ascent property of both EM algorithms is also proved. In the third part, the proposed robust method is applied to identify quantitative trait loci (QTL) within a functional mapping framework for dynamic traits of agricultural or biomedical interest. A robust multivariate Laplace mapping framework was proposed to replace the normality assumption. Simulation studies show that the proposed method is comparable to the robust multivariate t-distribution approach developed in the literature and outperforms the normal procedure. As an illustration, the proposed method is also applied to a real data set.
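The scale-mixture representation that the EM construction above relies on can be checked with a short simulation: a multivariate Laplace vector can be generated as a normal vector rescaled by the square root of an independent exponential variable. This is a minimal sketch under one common parameterization (unit-mean exponential mixing), not the dissertation's code.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
sigma = np.array([[1.0, 0.5],
                  [0.5, 2.0]])  # target scale matrix
L = np.linalg.cholesky(sigma)

# Scale-mixture representation: X = sqrt(W) * Z, with W ~ Exp(1) and
# Z ~ N(0, Sigma). Under this parameterization X is multivariate Laplace
# with Cov(X) = E[W] * Sigma = Sigma, but with heavier tails than the normal.
w = rng.exponential(1.0, size=n)
z = rng.standard_normal((n, 2)) @ L.T
x = np.sqrt(w)[:, None] * z

emp_cov = np.cov(x, rowvar=False)
# Each marginal has excess kurtosis 3 (versus 0 for the normal).
x0 = x[:, 0]
emp_kurt = ((x0 - x0.mean()) ** 4).mean() / x0.var() ** 2 - 3
```

The same conditional-normal structure is what makes an EM algorithm tractable: given W, the model is an ordinary Gaussian regression, and W plays the role of the latent variable in the E-step.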
Kunkel, Deborah Elizabeth. "Anchored Bayesian Gaussian Mixture Models." The Ohio State University, 2018. http://rave.ohiolink.edu/etdc/view?acc_num=osu1524134234501475.
Evers, Ludger. "Model fitting and model selection for 'mixture of experts' models." Thesis, University of Oxford, 2007. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.445776.
Full textBooks on the topic "Mixture models"
Lindsay, Bruce G. Mixture Models. Hayward, CA and Alexandria, VA: Institute of Mathematical Statistics and American Statistical Association, 1995. http://dx.doi.org/10.1214/cbms/1462106013.
McLachlan, Geoffrey, and David Peel. Finite Mixture Models. Hoboken, NJ, USA: John Wiley & Sons, Inc., 2000. http://dx.doi.org/10.1002/0471721182.
Basford, Kaye E., ed. Mixture models: Inference and applications to clustering. New York, NY: M. Dekker, 1988.
Bouguila, Nizar, and Wentao Fan, eds. Mixture Models and Applications. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-23876-6.
Chen, Jiahua. Statistical Inference Under Mixture Models. Singapore: Springer Nature Singapore, 2023. http://dx.doi.org/10.1007/978-981-99-6141-2.
von Davier, Matthias. Multivariate and Mixture Distribution Rasch Models. Edited by Claus H. Carstensen. New York, NY: Springer New York, 2007. http://dx.doi.org/10.1007/978-0-387-49839-3.
Hancock, Gregory R., and Karen M. Samuelsen, eds. Advances in latent variable mixture models. Charlotte, NC: Information Age Pub., 2008.
Mixture models: Theory, geometry, and applications. Hayward, Calif.: Institute of Mathematical Statistics, 1995.
SpringerLink (Online service), ed. Medical Applications of Finite Mixture Models. Berlin, Heidelberg: Springer-Verlag Berlin Heidelberg, 2009.
Jepson, Allan D. Mixture models for optical flow computation. Toronto: University of Toronto, Dept. of Computer Science, 1993.
Book chapters on the topic "Mixture models"
Yao, Weixin, and Sijia Xiang. "Hypothesis testing and model selection for mixture models." In Mixture Models, 188–209. Boca Raton: Chapman and Hall/CRC, 2024. http://dx.doi.org/10.1201/9781003038511-6.
Yao, Weixin, and Sijia Xiang. "Mixture models for discrete data." In Mixture Models, 77–120. Boca Raton: Chapman and Hall/CRC, 2024. http://dx.doi.org/10.1201/9781003038511-2.
Yao, Weixin, and Sijia Xiang. "Semiparametric mixture regression models." In Mixture Models, 300–338. Boca Raton: Chapman and Hall/CRC, 2024. http://dx.doi.org/10.1201/9781003038511-10.
Yao, Weixin, and Sijia Xiang. "Label switching for mixture models." In Mixture Models, 157–87. Boca Raton: Chapman and Hall/CRC, 2024. http://dx.doi.org/10.1201/9781003038511-5.
Yao, Weixin, and Sijia Xiang. "Robust mixture regression models." In Mixture Models, 210–46. Boca Raton: Chapman and Hall/CRC, 2024. http://dx.doi.org/10.1201/9781003038511-7.
Yao, Weixin, and Sijia Xiang. "Semiparametric mixture models." In Mixture Models, 274–99. Boca Raton: Chapman and Hall/CRC, 2024. http://dx.doi.org/10.1201/9781003038511-9.
Yao, Weixin, and Sijia Xiang. "Introduction to mixture models." In Mixture Models, 1–76. Boca Raton: Chapman and Hall/CRC, 2024. http://dx.doi.org/10.1201/9781003038511-1.
Yao, Weixin, and Sijia Xiang. "Mixture models for high-dimensional data." In Mixture Models, 247–73. Boca Raton: Chapman and Hall/CRC, 2024. http://dx.doi.org/10.1201/9781003038511-8.
Yao, Weixin, and Sijia Xiang. "Mixture regression models." In Mixture Models, 121–44. Boca Raton: Chapman and Hall/CRC, 2024. http://dx.doi.org/10.1201/9781003038511-3.
Yao, Weixin, and Sijia Xiang. "Bayesian mixture models." In Mixture Models, 145–56. Boca Raton: Chapman and Hall/CRC, 2024. http://dx.doi.org/10.1201/9781003038511-4.
Conference papers on the topic "Mixture models"
Sandler, Mark. "Hierarchical mixture models." In the 13th ACM SIGKDD international conference. New York, New York, USA: ACM Press, 2007. http://dx.doi.org/10.1145/1281192.1281255.
Sak, Hasim, Cyril Allauzen, Kaisuke Nakajima, and Francoise Beaufays. "Mixture of mixture n-gram language models." In 2013 IEEE Workshop on Automatic Speech Recognition & Understanding (ASRU). IEEE, 2013. http://dx.doi.org/10.1109/asru.2013.6707701.
Simo-Serra, Edgar, Carme Torras, and Francesc Moreno-Noguer. "Geodesic Finite Mixture Models." In British Machine Vision Conference 2014. British Machine Vision Association, 2014. http://dx.doi.org/10.5244/c.28.91.
Beaufays, F., M. Weintraub, and Yochai Konig. "Discriminative mixture weight estimation for large Gaussian mixture models." In 1999 IEEE International Conference on Acoustics, Speech, and Signal Processing. Proceedings. ICASSP99 (Cat. No.99CH36258). IEEE, 1999. http://dx.doi.org/10.1109/icassp.1999.758131.
Bar-Yosef, Yossi, and Yuval Bistritz. "Discriminative simplification of mixture models." In ICASSP 2011 - 2011 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2011. http://dx.doi.org/10.1109/icassp.2011.5946927.
Maas, Ryan, Jeremy Hyrkas, Olivia Grace Telford, Magdalena Balazinska, Andrew Connolly, and Bill Howe. "Gaussian Mixture Models Use-Case." In the 3rd VLDB Workshop. New York, New York, USA: ACM Press, 2015. http://dx.doi.org/10.1145/2803140.2803143.
Evangelio, Ruben Heras, Michael Patzold, and Thomas Sikora. "Splitting Gaussians in Mixture Models." In 2012 9th IEEE International Conference on Advanced Video and Signal Based Surveillance (AVSS). IEEE, 2012. http://dx.doi.org/10.1109/avss.2012.69.
Yang, Zhixian, and Xiaojun Wan. "Dependency-based Mixture Language Models." In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers). Stroudsburg, PA, USA: Association for Computational Linguistics, 2022. http://dx.doi.org/10.18653/v1/2022.acl-long.535.
Kim, Kion, Damla Şentürk, and Runze Li. "Recent History Functional Linear Models." In Nonparametric Statistics and Mixture Models - A Festschrift in Honor of Thomas P Hettmansperger. WORLD SCIENTIFIC, 2011. http://dx.doi.org/10.1142/9789814340564_0011.
Marden, John I. "QQ Plots for Assessing Symmetry Models." In Nonparametric Statistics and Mixture Models - A Festschrift in Honor of Thomas P Hettmansperger. WORLD SCIENTIFIC, 2011. http://dx.doi.org/10.1142/9789814340564_0013.
Reports on the topic "Mixture models"
Lavrenko, Victor. Optimal Mixture Models in IR. Fort Belvoir, VA: Defense Technical Information Center, January 2005. http://dx.doi.org/10.21236/ada440363.
Liu, Songqi. Mixture Models: From Latent Classes/Profiles to Latent Growth, Transitions, and Multilevel Mixture Models. Instats Inc., 2022. http://dx.doi.org/10.61700/ky72m8g8cc8x2469.
Mueller, Shane, Andrew Boettcher, and Michael Young. Delineating Cultural Models: Extending the Cultural Mixture Model. Fort Belvoir, VA: Defense Technical Information Center, December 2011. http://dx.doi.org/10.21236/ada572740.
Koenker, Roger, Jiaying Gu, and Stanislav Volgushev. Testing for homogeneity in mixture models. Cemmap, March 2013. http://dx.doi.org/10.1920/wp.cem.2013.0913.
Gu, Jiaying, Stanislav Volgushev, and Roger Koenker. Testing for homogeneity in mixture models. The IFS, August 2017. http://dx.doi.org/10.1920/wp.cem.2017.3917.
Yu, Guoshen, and Guillermo Sapiro. Statistical Compressive Sensing of Gaussian Mixture Models. Fort Belvoir, VA: Defense Technical Information Center, October 2010. http://dx.doi.org/10.21236/ada540728.
Chen, Xiaohong, Elie Tamer, and Maria Ponomareva. Likelihood inference in some finite mixture models. Cemmap, May 2013. http://dx.doi.org/10.1920/wp.cem.2013.1913.
Steele, Russell J., Adrian E. Raftery, and Mary J. Emond. Computing Normalizing Constants for Finite Mixture Models via Incremental Mixture Importance Sampling (IMIS). Fort Belvoir, VA: Defense Technical Information Center, July 2003. http://dx.doi.org/10.21236/ada459853.
Kam, Chester. Mixture Modeling for Measurement Scale Assessment. Instats Inc., 2023. http://dx.doi.org/10.61700/8ll0tq1hym0nq469.
Heckman, James, and Christopher Taber. Econometric Mixture Models and More General Models for Unobservables in Duration Analysis. Cambridge, MA: National Bureau of Economic Research, June 1994. http://dx.doi.org/10.3386/t0157.