
Journal articles on the topic 'Gaussian mixture models'

Consult the top 50 journal articles for your research on the topic 'Gaussian mixture models.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online, whenever these are available in the metadata.

Browse journal articles from a wide variety of disciplines and organise your bibliography correctly.

1

Ju, Zhaojie, and Honghai Liu. "Fuzzy Gaussian Mixture Models." Pattern Recognition 45, no. 3 (March 2012): 1146–58. http://dx.doi.org/10.1016/j.patcog.2011.08.028.

2

McNicholas, Paul David, and Thomas Brendan Murphy. "Parsimonious Gaussian mixture models." Statistics and Computing 18, no. 3 (April 19, 2008): 285–96. http://dx.doi.org/10.1007/s11222-008-9056-0.

3

Viroli, Cinzia, and Geoffrey J. McLachlan. "Deep Gaussian mixture models." Statistics and Computing 29, no. 1 (December 1, 2017): 43–51. http://dx.doi.org/10.1007/s11222-017-9793-z.

4

Verbeek, J. J., N. Vlassis, and B. Kröse. "Efficient Greedy Learning of Gaussian Mixture Models." Neural Computation 15, no. 2 (February 1, 2003): 469–85. http://dx.doi.org/10.1162/089976603762553004.

Abstract:
This article concerns the greedy learning of gaussian mixtures. In the greedy approach, mixture components are inserted into the mixture one after the other. We propose a heuristic for searching for the optimal component to insert. In a randomized manner, a set of candidate new components is generated. For each of these candidates, we find the locally optimal new component and insert it into the existing mixture. The resulting algorithm resolves the sensitivity to initialization of state-of-the-art methods, like expectation maximization, and has running time linear in the number of data points and quadratic in the (final) number of mixture components. Due to its greedy nature, the algorithm can be particularly useful when the optimal number of mixture components is unknown. Experimental results comparing the proposed algorithm to other methods on density estimation and texture segmentation are provided.
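To make the greedy scheme concrete, here is a minimal sketch of component-by-component insertion. It is not the authors' candidate-search heuristic: the random seeding rule, the use of scikit-learn's EM for refitting, and all names are illustrative assumptions.

```python
# Hypothetical sketch of greedy GMM growth (not the paper's exact search):
# insert one component at a time, trying several randomly seeded candidates
# and keeping the one with the best data log-likelihood.
import numpy as np
from sklearn.mixture import GaussianMixture

def greedy_gmm(X, k_max, n_candidates=10, seed=0):
    rng = np.random.default_rng(seed)
    best = GaussianMixture(n_components=1).fit(X)
    for k in range(2, k_max + 1):
        best_ll, best_model = -np.inf, None
        for _ in range(n_candidates):
            # Seed the new component at a random data point; reuse the
            # previously fitted means as warm starts for EM.
            means = np.vstack([best.means_, X[rng.integers(len(X))]])
            cand = GaussianMixture(n_components=k, means_init=means).fit(X)
            ll = cand.score(X)  # mean log-likelihood of the data
            if ll > best_ll:
                best_ll, best_model = ll, cand
        best = best_model
    return best
```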
5

Kunkel, Deborah, and Mario Peruggia. "Anchored Bayesian Gaussian mixture models." Electronic Journal of Statistics 14, no. 2 (2020): 3869–913. http://dx.doi.org/10.1214/20-ejs1756.

6

Chassagnol, Bastien, Antoine Bichat, Cheïma Boudjeniba, Pierre-Henri Wuillemin, Mickaël Guedj, David Gohel, Gregory Nuel, and Etienne Becht. "Gaussian Mixture Models in R." R Journal 15, no. 2 (November 1, 2023): 56–76. http://dx.doi.org/10.32614/rj-2023-043.

7

Ruzgas, Tomas, and Indrė Drulytė. "Kernel Density Estimators for Gaussian Mixture Models." Lietuvos statistikos darbai 52, no. 1 (December 20, 2013): 14–21. http://dx.doi.org/10.15388/ljs.2013.13919.

Abstract:
The problem of nonparametric estimation of a probability density function is considered. The performance of kernel estimators based on various common kernels and a new kernel K (see (14)) with both fixed and adaptive smoothing bandwidth is compared in terms of the symmetric mean absolute percentage error using the Monte Carlo method. The kernel K is everywhere positive but has lighter tails than the Gaussian density. Gaussian mixture models from a collection introduced by Marron and Wand (1992) are taken for Monte Carlo simulations. The adaptive kernel method outperforms the smoothing with a fixed bandwidth in the majority of models. The kernel K shows better performance for Gaussian mixtures with considerably overlapping components and multiple peaks (double claw distribution).
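The fixed-versus-adaptive comparison can be sketched in a few lines. This uses a plain Gaussian kernel with Abramson-style local bandwidths; the paper's kernel K from its Eq. (14) is not reproduced here, and function names and the pilot scheme are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm

def kde_fixed(x, data, h):
    # Average of Gaussian kernels with one global bandwidth h.
    return norm.pdf((x[:, None] - data[None, :]) / h).mean(axis=1) / h

def kde_adaptive(x, data, h, alpha=0.5):
    # Abramson-style: shrink the bandwidth where a pilot estimate is high.
    pilot = kde_fixed(data, data, h)
    lam = (pilot / np.exp(np.mean(np.log(pilot)))) ** (-alpha)
    h_i = h * lam                                  # one bandwidth per point
    k = norm.pdf((x[:, None] - data[None, :]) / h_i) / h_i
    return k.mean(axis=1)
```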
8

Chen, Yongxin, Tryphon T. Georgiou, and Allen Tannenbaum. "Optimal Transport for Gaussian Mixture Models." IEEE Access 7 (2019): 6269–78. http://dx.doi.org/10.1109/access.2018.2889838.

9

Nasios, N., and A. G. Bors. "Variational learning for Gaussian mixture models." IEEE Transactions on Systems, Man and Cybernetics, Part B (Cybernetics) 36, no. 4 (August 2006): 849–62. http://dx.doi.org/10.1109/tsmcb.2006.872273.

10

Zhang, Baibo, Changshui Zhang, and Xing Yi. "Active curve axis Gaussian mixture models." Pattern Recognition 38, no. 12 (December 2005): 2351–62. http://dx.doi.org/10.1016/j.patcog.2005.01.017.

11

Zeng, Jia, Lei Xie, and Zhi-Qiang Liu. "Type-2 fuzzy Gaussian mixture models." Pattern Recognition 41, no. 12 (December 2008): 3636–43. http://dx.doi.org/10.1016/j.patcog.2008.06.006.

12

Bolin, David, Jonas Wallin, and Finn Lindgren. "Latent Gaussian random field mixture models." Computational Statistics & Data Analysis 130 (February 2019): 80–93. http://dx.doi.org/10.1016/j.csda.2018.08.007.

13

Maebashi, K., N. Suematsu, and A. Hayashi. "Component Reduction for Gaussian Mixture Models." IEICE Transactions on Information and Systems E91-D, no. 12 (December 1, 2008): 2846–53. http://dx.doi.org/10.1093/ietisy/e91-d.12.2846.

14

Nowakowska, Ewa, Jacek Koronacki, and Stan Lipovetsky. "Clusterability assessment for Gaussian mixture models." Applied Mathematics and Computation 256 (April 2015): 591–601. http://dx.doi.org/10.1016/j.amc.2014.12.038.

15

Di Zio, Marco, Ugo Guarnera, and Orietta Luzi. "Imputation through finite Gaussian mixture models." Computational Statistics & Data Analysis 51, no. 11 (July 2007): 5305–16. http://dx.doi.org/10.1016/j.csda.2006.10.002.

16

Yin, Jian Jun, and Jian Qiu Zhang. "Convolution PHD Filtering for Nonlinear Non-Gaussian Models." Advanced Materials Research 213 (February 2011): 344–48. http://dx.doi.org/10.4028/www.scientific.net/amr.213.344.

Abstract:
A novel probability hypothesis density (PHD) filter, called the Gaussian mixture convolution PHD (GMCPHD) filter, is proposed. The PHD within the filter is approximated by a Gaussian sum, as in the Gaussian mixture PHD (GMPHD) filter, but the model may be non-Gaussian and nonlinear. This is implemented by a bank of convolution filters with Gaussian approximations to the predicted and posterior densities. The analysis shows that the GMCPHD filter has lower complexity and is more amenable to parallel implementation than the convolution PHD (CPHD) filter, and that it outperforms the existing Gaussian mixture particle PHD (GMPPHD) filter in dealing with complex observation models, small observation noise, and non-Gaussian noise. Multi-target tracking simulations verify the effectiveness of the proposed method.
17

Maleki, Mohsen, and A. R. Nematollahi. "Autoregressive Models with Mixture of Scale Mixtures of Gaussian Innovations." Iranian Journal of Science and Technology, Transactions A: Science 41, no. 4 (April 21, 2017): 1099–107. http://dx.doi.org/10.1007/s40995-017-0237-6.

18

Meinicke, Peter, and Helge Ritter. "Resolution-Based Complexity Control for Gaussian Mixture Models." Neural Computation 13, no. 2 (February 2001): 453–75. http://dx.doi.org/10.1162/089976601300014600.

Abstract:
In the domain of unsupervised learning, mixtures of gaussians have become a popular tool for statistical modeling. For this class of generative models, we present a complexity control scheme, which provides an effective means for avoiding the problem of overfitting usually encountered with unconstrained (mixtures of) gaussians in high dimensions. According to some prespecified level of resolution as implied by a fixed variance noise model, the scheme provides an automatic selection of the dimensionalities of some local signal subspaces by maximum likelihood estimation. Together with a resolution-based control scheme for adjusting the number of mixture components, we arrive at an incremental model refinement procedure within a common deterministic annealing framework, which enables an efficient exploration of the model space. The advantages of the resolution-based framework are illustrated by experimental results on synthetic and high-dimensional real-world data.
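Read this way, the resolution idea reduces, at its core, to comparing local covariance eigenvalues against a fixed noise floor: directions whose variance falls below the chosen noise level are not resolvable. A bare-bones illustration under that reading (the function name and interface are ours, not the paper's):

```python
import numpy as np

def local_subspace_dim(X_local, noise_var):
    """Number of local principal directions resolvable above the noise."""
    evals = np.linalg.eigvalsh(np.cov(X_local.T))[::-1]  # largest first
    return int(np.sum(evals > noise_var))
```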
19

Ueda, Naonori, Ryohei Nakano, Zoubin Ghahramani, and Geoffrey E. Hinton. "SMEM Algorithm for Mixture Models." Neural Computation 12, no. 9 (September 1, 2000): 2109–28. http://dx.doi.org/10.1162/089976600300015088.

Abstract:
We present a split-and-merge expectation-maximization (SMEM) algorithm to overcome the local maxima problem in parameter estimation of finite mixture models. In the case of mixture models, local maxima often involve having too many components of a mixture model in one part of the space and too few in another, widely separated part of the space. To escape from such configurations, we repeatedly perform simultaneous split-and-merge operations using a new criterion for efficiently selecting the split-and-merge candidates. We apply the proposed algorithm to the training of gaussian mixtures and mixtures of factor analyzers using synthetic and real data and show the effectiveness of using the split-and-merge operations to improve the likelihood of both the training data and held-out test data. We also show the practical usefulness of the proposed algorithm by applying it to image compression and pattern recognition problems.
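One commonly used merge criterion in this line of work ranks component pairs by the inner product of their posterior-responsibility vectors: pairs that explain the same data points are the prime candidates for merging. A sketch of that single step (assuming a fitted scikit-learn mixture; the split step and the full SMEM loop are omitted):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def merge_candidates(gmm, X):
    R = gmm.predict_proba(X)        # responsibilities, shape (n, K)
    J = R.T @ R                     # J[i, j] large => components overlap
    np.fill_diagonal(J, -np.inf)    # ignore self-pairs
    i, j = np.unravel_index(np.argmax(J), J.shape)
    return i, j                     # best pair to try merging
```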
20

Masmoudi, Khalil, and Afif Masmoudi. "An EM algorithm for singular Gaussian mixture models." Filomat 33, no. 15 (2019): 4753–67. http://dx.doi.org/10.2298/fil1915753m.

Abstract:
In this paper, we introduce finite mixture models with singular multivariate normal components. These models are useful when the observed data involves collinearities, that is when the covariance matrices are singular. They are also useful when the covariance matrices are ill-conditioned. In the latter case, the classical approaches may lead to numerical instabilities and give inaccurate estimations. Hence, an extension of the Expectation Maximization algorithm, with complete proof, is proposed to derive the maximum likelihood estimators and cluster the data instances for mixtures of singular multivariate normal distributions. The accuracy of the proposed algorithm is then demonstrated on the grounds of several numerical experiments. Finally, we discuss the application of the proposed distribution to financial asset returns modeling and portfolio selection.
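The numerical heart of such an extension is a Gaussian density that remains well defined when the covariance is rank deficient. A common construction, used here as an assumption rather than the paper's exact formulation, replaces the inverse and determinant with the Moore-Penrose pseudo-inverse and the pseudo-determinant:

```python
import numpy as np

def singular_gaussian_logpdf(x, mu, sigma, tol=1e-10):
    d = x - mu
    w, V = np.linalg.eigh(sigma)          # eigendecomposition of covariance
    keep = w > tol                        # directions with nonzero variance
    log_pdet = np.sum(np.log(w[keep]))    # pseudo-determinant
    maha = np.sum((V[:, keep].T @ d) ** 2 / w[keep])  # pseudo-inverse form
    return -0.5 * (keep.sum() * np.log(2 * np.pi) + log_pdet + maha)
```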
21

Yamada, Makoto, and Masashi Sugiyama. "Direct Importance Estimation with Gaussian Mixture Models." IEICE Transactions on Information and Systems E92-D, no. 10 (2009): 2159–62. http://dx.doi.org/10.1587/transinf.e92.d.2159.

22

Ye, Peng, Fang Liu, and Zhiyong Zhao. "Multiple Gaussian Mixture Models for Image Registration." IEICE Transactions on Information and Systems E97.D, no. 7 (2014): 1927–29. http://dx.doi.org/10.1587/transinf.e97.d.1927.

23

Yang, Jianbo, Xin Yuan, Xuejun Liao, Patrick Llull, David J. Brady, Guillermo Sapiro, and Lawrence Carin. "Video Compressive Sensing Using Gaussian Mixture Models." IEEE Transactions on Image Processing 23, no. 11 (November 2014): 4863–78. http://dx.doi.org/10.1109/tip.2014.2344294.

24

Lu, Zhiwu, and H. H. S. Ip. "Generalized Competitive Learning of Gaussian Mixture Models." IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics) 39, no. 4 (August 2009): 901–9. http://dx.doi.org/10.1109/tsmcb.2008.2012119.

25

Akaho, Shotaro, and Hilbert J. Kappen. "Nonmonotonic Generalization Bias of Gaussian Mixture Models." Neural Computation 12, no. 6 (June 1, 2000): 1411–27. http://dx.doi.org/10.1162/089976600300015439.

Abstract:
Theories of learning and generalization hold that the generalization bias, defined as the difference between the training error and the generalization error, increases on average with the number of adaptive parameters. This article, however, shows that this general tendency is violated for a gaussian mixture model. For temperatures just below the first symmetry breaking point, the effective number of adaptive parameters increases and the generalization bias decreases. We compute the dependence of the neural information criterion on temperature around the symmetry breaking. Our results are confirmed by numerical cross-validation experiments.
26

Yu, Guoshen, and Guillermo Sapiro. "Statistical Compressed Sensing of Gaussian Mixture Models." IEEE Transactions on Signal Processing 59, no. 12 (December 2011): 5842–58. http://dx.doi.org/10.1109/tsp.2011.2168521.

27

Matza, Avi, and Yuval Bistritz. "Skew Gaussian mixture models for speaker recognition." IET Signal Processing 8, no. 8 (October 2014): 860–67. http://dx.doi.org/10.1049/iet-spr.2013.0270.

28

Arellano, Claudia, and Rozenn Dahyot. "Robust ellipse detection with Gaussian mixture models." Pattern Recognition 58 (October 2016): 12–26. http://dx.doi.org/10.1016/j.patcog.2016.01.017.

29

Lee, Kevin H., and Lingzhou Xue. "Nonparametric Finite Mixture of Gaussian Graphical Models." Technometrics 60, no. 4 (October 2, 2018): 511–21. http://dx.doi.org/10.1080/00401706.2017.1408497.

30

Jones, Daniel M., and Alan F. Heavens. "Gaussian mixture models for blended photometric redshifts." Monthly Notices of the Royal Astronomical Society 490, no. 3 (September 27, 2019): 3966–86. http://dx.doi.org/10.1093/mnras/stz2687.

Abstract:
Future cosmological galaxy surveys such as the Large Synoptic Survey Telescope (LSST) will photometrically observe very large numbers of galaxies. Without spectroscopy, the redshifts required for the analysis of these data will need to be inferred using photometric redshift techniques that are scalable to large sample sizes. The high number density of sources will also mean that around half are blended. We present a Bayesian photometric redshift method for blended sources that uses Gaussian mixture models to learn the joint flux–redshift distribution from a set of unblended training galaxies, and Bayesian model comparison to infer the number of galaxies comprising a blended source. The use of Gaussian mixture models renders both of these applications computationally efficient and therefore suitable for upcoming galaxy surveys.
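The computational convenience the abstract mentions comes from a standard fact: conditioning a joint Gaussian mixture on the observed fluxes yields another Gaussian mixture in closed form. A generic sketch of that step (variable names are ours; the blended-source model comparison is not shown):

```python
import numpy as np

def conditional_gmm(weights, means, covs, f):
    """p(z | f) for a GMM over (f_1..f_m, z), with z the last coordinate."""
    logw, mu_z, var_z = [], [], []
    for w, mu, S in zip(weights, means, covs):
        Sff, Sfz, Szz = S[:-1, :-1], S[:-1, -1], S[-1, -1]
        d = f - mu[:-1]
        sol = np.linalg.solve(Sff, d)
        mu_z.append(mu[-1] + Sfz @ sol)                      # conditional mean
        var_z.append(Szz - Sfz @ np.linalg.solve(Sff, Sfz))  # conditional var
        # Reweight each component by how well it explains the flux vector.
        logw.append(np.log(w) - 0.5 * (np.linalg.slogdet(2 * np.pi * Sff)[1]
                                       + d @ sol))
    logw = np.array(logw)
    w_post = np.exp(logw - np.logaddexp.reduce(logw))        # normalize
    return w_post, np.array(mu_z), np.array(var_z)
```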
31

Wallet, Bradley C., and Robert Hardisty. "Unsupervised seismic facies using Gaussian mixture models." Interpretation 7, no. 3 (August 1, 2019): SE93–SE111. http://dx.doi.org/10.1190/int-2018-0119.1.

Abstract:
As the use of seismic attributes becomes more widespread, multivariate seismic analysis has become more commonplace for seismic facies analysis. Unsupervised machine-learning techniques provide methods of automatically finding patterns in data with minimal user interaction. When using unsupervised machine-learning techniques, such as k-means or Kohonen self-organizing maps (SOMs), the number of clusters can often be ambiguously defined and there is no measure of how confident the algorithm is in the classification of data vectors. The model-based probabilistic formulation of Gaussian mixture models (GMMs) allows for the number and shape of clusters to be determined in a more objective manner using a Bayesian framework that considers a model’s likelihood and complexity. Furthermore, the development of alternative expectation-maximization (EM) algorithms has allowed GMMs to be more tailored to unsupervised seismic facies analysis. The classification EM algorithm classifies data vectors according to their posterior probabilities that provide a measurement of uncertainty and ambiguity (often called a soft classification). The neighborhood EM (NEM) algorithm allows for spatial correlations to be considered to make classification volumes more realistic by enforcing spatial continuity. Corendering the classification with the uncertainty and ambiguity measurements produces an intuitive map of unsupervised seismic facies. We apply a model-based classification approach using GMMs to a turbidite system in Canterbury Basin, New Zealand, to clarify results from an initial SOM and highlight areas of uncertainty and ambiguity. Special focus on a channel feature in the turbidite system using an NEM algorithm shows it to be more realistic by considering spatial correlations within the data.
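The model-selection step the abstract describes is typically driven by an information criterion such as BIC, and the soft classification falls out of the posterior probabilities. A generic scikit-learn sketch on stand-in data (the neighborhood EM variant is not part of scikit-learn and is omitted):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

X = np.random.default_rng(0).normal(size=(500, 4))         # stand-in attributes
models = [GaussianMixture(n_components=k, random_state=0).fit(X)
          for k in range(1, 11)]
best = models[int(np.argmin([m.bic(X) for m in models]))]  # lowest BIC wins
labels = best.predict(X)                                   # hard classification
ambiguity = 1.0 - best.predict_proba(X).max(axis=1)        # soft-assignment uncertainty
```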
32

Silva, Diogo S. F., and Clayton V. Deutsch. "Multivariate data imputation using Gaussian mixture models." Spatial Statistics 27 (October 2018): 74–90. http://dx.doi.org/10.1016/j.spasta.2016.11.002.

33

Hedelin, P., and J. Skoglund. "Vector quantization based on Gaussian mixture models." IEEE Transactions on Speech and Audio Processing 8, no. 4 (July 2000): 385–401. http://dx.doi.org/10.1109/89.848220.

34

Zhang, J., and D. Ma. "Nonlinear Prediction for Gaussian Mixture Image Models." IEEE Transactions on Image Processing 13, no. 6 (June 2004): 836–47. http://dx.doi.org/10.1109/tip.2004.828197.

35

Hao, Jiucang, Te-Won Lee, and Terrence J. Sejnowski. "Speech Enhancement Using Gaussian Scale Mixture Models." IEEE Transactions on Audio, Speech, and Language Processing 18, no. 6 (August 2010): 1127–36. http://dx.doi.org/10.1109/tasl.2009.2030012.

36

Fiez, Tanner, and Lillian J. Ratliff. "Gaussian Mixture Models for Parking Demand Data." IEEE Transactions on Intelligent Transportation Systems 21, no. 8 (August 2020): 3571–80. http://dx.doi.org/10.1109/tits.2019.2939499.

37

Burges, Christopher John. "Discriminative Gaussian mixture models for speaker verification." Journal of the Acoustical Society of America 113, no. 5 (2003): 2393. http://dx.doi.org/10.1121/1.1584172.

38

Morgan, Grant B. "Generating Nonnormal Distributions via Gaussian Mixture Models." Structural Equation Modeling: A Multidisciplinary Journal 27, no. 6 (February 5, 2020): 964–74. http://dx.doi.org/10.1080/10705511.2020.1718502.

39

Reynolds, Douglas A., Thomas F. Quatieri, and Robert B. Dunn. "Speaker Verification Using Adapted Gaussian Mixture Models." Digital Signal Processing 10, no. 1-3 (January 2000): 19–41. http://dx.doi.org/10.1006/dspr.1999.0361.

40

Wei, Hui, and Wei Zheng. "Image Denoising Based on Improved Gaussian Mixture Model." Scientific Programming 2021 (September 22, 2021): 1–8. http://dx.doi.org/10.1155/2021/7982645.

Abstract:
An image denoising method is proposed based on an improved Gaussian mixture model to reduce noise and enhance image quality. Unlike traditional image denoising methods, the proposed method models the pixel information in the neighborhood around each pixel in the image. Similarity between two pixels is measured by the L2 norm between the Gaussian mixture models fitted around them; each mixture captures statistics such as the mean and variance of the pixel information in its image area, so the L2 norm reflects differences in local grayscale intensity and in the richness of detail around the two pixels, and therefore measures pixel similarity more accurately. Experimental results show that the proposed method improves denoising performance while retaining the detailed information of the image.
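The L2 norm between two Gaussian mixtures is available in closed form, since the integral of a product of two Gaussian densities is itself a Gaussian density evaluated at the difference of means. A 1-D sketch of that identity (the paper applies the idea to local pixel-neighborhood models; details there differ):

```python
import numpy as np
from scipy.stats import norm

def gmm_l2_sq(w1, mu1, s1, w2, mu2, s2):
    """Squared L2 distance between two 1-D GMMs (weights, means, std devs)."""
    def cross(wa, ma, sa, wb, mb, sb):
        # Integral of f*g: sum over pairs of N(ma; mb, sa^2 + sb^2).
        scale = np.sqrt(sa[:, None] ** 2 + sb[None, :] ** 2)
        return np.sum(wa[:, None] * wb[None, :]
                      * norm.pdf(ma[:, None], loc=mb[None, :], scale=scale))
    return (cross(w1, mu1, s1, w1, mu1, s1)
            - 2 * cross(w1, mu1, s1, w2, mu2, s2)
            + cross(w2, mu2, s2, w2, mu2, s2))
```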
41

Krasilnikov, A. I. "Classification of Models of Two-component Mixtures of Symmetrical Distributions with Zero Kurtosis Coefficient." Èlektronnoe modelirovanie 45, no. 5 (October 10, 2023): 20–38. http://dx.doi.org/10.15407/emodel.45.05.020.

Abstract:
On the basis of a family of two-component mixtures of distributions, a class K of symmetric non-Gaussian distributions with a zero kurtosis coefficient is defined and divided into two groups and five types. The dependence of the fourth-order cumulant on the weight coefficient of the mixture is studied, yielding the conditions under which the kurtosis coefficient of the mixture equals zero. The use of a two-component mixture of Subbotin distributions for modeling unimodal symmetric distributions with a zero kurtosis coefficient is justified, and examples of symmetric non-Gaussian distributions with zero kurtosis coefficient are given. Class K models make it possible, at the design stage, to compare the effectiveness of methods and systems developed for processing non-Gaussian signals with zero skewness and kurtosis coefficients.
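The weight condition can be reproduced numerically. A sketch using SciPy's generalized-normal (Subbotin) distribution, with illustrative shape and scale values that are our assumptions, solves for the mixture weight at which the excess kurtosis vanishes:

```python
import numpy as np
from scipy.optimize import brentq
from scipy.stats import gennorm

def mixture_excess_kurtosis(w, beta1, s1, beta2, s2):
    # Zero-mean components: mixture central moments are weighted sums.
    m2 = w * gennorm.moment(2, beta1, scale=s1) + (1 - w) * gennorm.moment(2, beta2, scale=s2)
    m4 = w * gennorm.moment(4, beta1, scale=s1) + (1 - w) * gennorm.moment(4, beta2, scale=s2)
    return m4 / m2**2 - 3.0

# One heavier-tailed component (beta < 2) and one lighter-tailed (beta > 2),
# so the excess kurtosis changes sign as the weight w sweeps from 0 to 1:
f = lambda w: mixture_excess_kurtosis(w, beta1=1.0, s1=1.0, beta2=4.0, s2=1.0)
w_star = brentq(f, 1e-6, 1 - 1e-6)   # mixture weight giving zero kurtosis
```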
42

Røge, Rasmus E., Kristoffer H. Madsen, Mikkel N. Schmidt, and Morten Mørup. "Infinite von Mises–Fisher Mixture Modeling of Whole Brain fMRI Data." Neural Computation 29, no. 10 (October 2017): 2712–41. http://dx.doi.org/10.1162/neco_a_01000.

Abstract:
Cluster analysis of functional magnetic resonance imaging (fMRI) data is often performed using gaussian mixture models, but when the time series are standardized such that the data reside on a hypersphere, this modeling assumption is questionable. The consequences of ignoring the underlying spherical manifold are rarely analyzed, in part due to the computational challenges imposed by directional statistics. In this letter, we discuss a Bayesian von Mises–Fisher (vMF) mixture model for data on the unit hypersphere and present an efficient inference procedure based on collapsed Markov chain Monte Carlo sampling. Comparing the vMF and gaussian mixture models on synthetic data, we demonstrate that the vMF model has a slight advantage inferring the true underlying clustering when compared to gaussian-based models on data generated from both a mixture of vMFs and a mixture of gaussians subsequently normalized. Thus, when performing model selection, the two models are not in agreement. Analyzing multisubject whole brain resting-state fMRI data from healthy adult subjects, we find that the vMF mixture model is considerably more reliable than the gaussian mixture model when comparing solutions across models trained on different groups of subjects, and again we find that the two models disagree on the optimal number of components. The analysis indicates that the fMRI data support more than a thousand clusters, and we confirm this is not a result of overfitting by demonstrating better prediction on data from held-out subjects. Our results highlight the utility of using directional statistics to model standardized fMRI data and demonstrate that whole brain segmentation of fMRI data requires a very large number of functional units in order to adequately account for the discernible statistical patterns in the data.
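The modeling contrast in the abstract can be previewed on toy data: standardize vectors to unit norm and fit a single von Mises-Fisher distribution next to a single Gaussian. This requires SciPy >= 1.11 for vonmises_fisher; the two likelihoods live on different dominating measures, so the comparison is only suggestive, and the paper's collapsed MCMC mixture inference is well beyond this sketch.

```python
import numpy as np
from scipy.stats import vonmises_fisher, multivariate_normal

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
X /= np.linalg.norm(X, axis=1, keepdims=True)   # standardize onto the sphere

mu, kappa = vonmises_fisher.fit(X)              # ML fit on the unit sphere
ll_vmf = vonmises_fisher(mu, kappa).logpdf(X).sum()
ll_gauss = multivariate_normal(X.mean(0), np.cov(X.T)).logpdf(X).sum()
```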
43

Chuan, Ching-Hua. "Audio Classification and Retrieval Using Wavelets and Gaussian Mixture Models." International Journal of Multimedia Data Engineering and Management 4, no. 1 (January 2013): 1–20. http://dx.doi.org/10.4018/jmdem.2013010101.

Abstract:
This paper presents an audio classification and retrieval system using wavelets for extracting low-level acoustic features. The author performs multiple-level decomposition using the discrete wavelet transform to extract acoustic features from audio recordings at different scales and times, and translates the extracted features into a compact vector representation. Gaussian mixture models fitted with the expectation-maximization algorithm are used to build models for audio classes and individual audio examples. The system is evaluated on three audio classification tasks: speech/music, male/female speech, and music genre. The paper also shows how wavelets and Gaussian mixture models are used for class-based audio retrieval in two approaches: indexing using only wavelets versus indexing by Gaussian components. Through 10-fold cross-validation, the author demonstrates the promising capability of wavelets and Gaussian mixture models for audio classification and retrieval, and compares how parameters including frame size, wavelet level, number of Gaussian components, and sampling size affect performance.
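A rough sketch of this pipeline, with assumed feature and model choices (PyWavelets for the multi-level DWT, one scikit-learn GMM per class, classification by maximum average log-likelihood); the wavelet, level, and component counts here are illustrative, not the paper's settings:

```python
import numpy as np
import pywt
from sklearn.mixture import GaussianMixture

def wavelet_features(frame, wavelet="db4", level=5):
    # Log-energy of each decomposition band as a compact feature vector.
    coeffs = pywt.wavedec(frame, wavelet, level=level)
    return np.log([np.mean(c ** 2) + 1e-12 for c in coeffs])

def train_models(class_frames, n_components=8):
    # class_frames: {label: list of 1-D audio frames}; one GMM per class.
    return {label: GaussianMixture(n_components, random_state=0).fit(
                np.array([wavelet_features(f) for f in frames]))
            for label, frames in class_frames.items()}

def classify(models, frame):
    feats = wavelet_features(frame).reshape(1, -1)
    return max(models, key=lambda lbl: models[lbl].score(feats))
```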
44

Kiruthika, V., and R. Karpagam. "Super Pixel Segmentation Using Fuzzy Genetic Filter (FGF) with Gaussian Mixture Models (GMMs)." Journal of Advanced Research in Dynamical and Control Systems 11, no. 9 (September 30, 2019): 89–100. http://dx.doi.org/10.5373/jardcs/v11i9/20192918.

45

Zhao, Mingyang, Xiaohong Jia, Lubin Fan, Yuan Liang, and Dong-Ming Yan. "Robust Ellipse Fitting Using Hierarchical Gaussian Mixture Models." IEEE Transactions on Image Processing 30 (2021): 3828–43. http://dx.doi.org/10.1109/tip.2021.3065799.

46

Belton, D., S. Moncrieff, and J. Chapman. "Processing tree point clouds using Gaussian Mixture Models." ISPRS Annals of Photogrammetry, Remote Sensing and Spatial Information Sciences II-5/W2 (October 16, 2013): 43–48. http://dx.doi.org/10.5194/isprsannals-ii-5-w2-43-2013.

47

Nishimoto, Hiroki, Renyuan Zhang, and Yasuhiko Nakashima. "GPGPU Implementation of Variational Bayesian Gaussian Mixture Models." IEICE Transactions on Information and Systems E105.D, no. 3 (March 1, 2022): 611–22. http://dx.doi.org/10.1587/transinf.2021edp7121.

48

Liang, Xi-Long, Yu-Qin Chen, Jing-Kun Zhao, and Gang Zhao. "Partitioning the Galactic halo with Gaussian Mixture Models." Research in Astronomy and Astrophysics 21, no. 5 (June 1, 2021): 128. http://dx.doi.org/10.1088/1674-4527/21/5/128.

49

Chen, Xiaohui, and Yun Yang. "Cutoff for Exact Recovery of Gaussian Mixture Models." IEEE Transactions on Information Theory 67, no. 6 (June 2021): 4223–38. http://dx.doi.org/10.1109/tit.2021.3063155.

50

Ohata, Masashi, Tsuyoshi Tokunari, and Kiyotoshi Matsuoka. "Online Blind Signals Separation with Gaussian Mixture Models." Transactions of the Society of Instrument and Control Engineers 36, no. 11 (2000): 985–94. http://dx.doi.org/10.9746/sicetr1965.36.985.
