
Journal articles on the topic "Gaussian mixture models"

Consult the 50 best journal articles on the topic "Gaussian mixture models".

Next to every entry in the bibliography there is an "Add to bibliography" button. Press it, and we will automatically create a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the scholarly publication as a .pdf file and read its abstract online whenever these details are available in the metadata.

Browse journal articles from a wide range of disciplines and organise your bibliography correctly.

1. Ju, Zhaojie, and Honghai Liu. "Fuzzy Gaussian Mixture Models". Pattern Recognition 45, no. 3 (March 2012): 1146–58. http://dx.doi.org/10.1016/j.patcog.2011.08.028.

2. McNicholas, Paul David, and Thomas Brendan Murphy. "Parsimonious Gaussian mixture models". Statistics and Computing 18, no. 3 (April 19, 2008): 285–96. http://dx.doi.org/10.1007/s11222-008-9056-0.

3. Viroli, Cinzia, and Geoffrey J. McLachlan. "Deep Gaussian mixture models". Statistics and Computing 29, no. 1 (December 1, 2017): 43–51. http://dx.doi.org/10.1007/s11222-017-9793-z.

4. Verbeek, J. J., N. Vlassis, and B. Kröse. "Efficient Greedy Learning of Gaussian Mixture Models". Neural Computation 15, no. 2 (February 1, 2003): 469–85. http://dx.doi.org/10.1162/089976603762553004.

Abstract:
This article concerns the greedy learning of gaussian mixtures. In the greedy approach, mixture components are inserted into the mixture one after the other. We propose a heuristic for searching for the optimal component to insert. In a randomized manner, a set of candidate new components is generated. For each of these candidates, we find the locally optimal new component and insert it into the existing mixture. The resulting algorithm resolves the sensitivity to initialization of state-of-the-art methods, like expectation maximization, and has running time linear in the number of data points and quadratic in the (final) number of mixture components. Due to its greedy nature, the algorithm can be particularly useful when the optimal number of mixture components is unknown. Experimental results comparing the proposed algorithm to other methods on density estimation and texture segmentation are provided.
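
For readers who want to experiment, here is a minimal numpy/scipy sketch of the greedy insertion idea, not the authors' exact method: candidates are simply centred on randomly chosen data points and scored by the resulting log-likelihood, while the paper's partial-EM search for the locally optimal candidate and the interleaved EM updates are omitted. The names (`greedy_gmm`, `n_cand`) are illustrative.

```python
import numpy as np
from scipy.stats import multivariate_normal

def mixture_loglik(X, weights, means, covs):
    # total log-likelihood of the data under the current mixture
    dens = sum(w * multivariate_normal.pdf(X, m, c)
               for w, m, c in zip(weights, means, covs))
    return np.log(dens).sum()

def greedy_gmm(X, k_max, n_cand=10, seed=0):
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # start from a single component fitted to all of the data
    weights, means = [1.0], [X.mean(axis=0)]
    covs = [np.cov(X.T) + 1e-6 * np.eye(d)]
    while len(weights) < k_max:
        best = None
        for _ in range(n_cand):
            # candidate component centred on a randomly chosen data point
            mu = X[rng.integers(len(X))]
            cov = np.cov(X.T) / len(weights) + 1e-6 * np.eye(d)
            alpha = 1.0 / (len(weights) + 1)  # weight given to the newcomer
            cand = ([w * (1 - alpha) for w in weights] + [alpha],
                    means + [mu], covs + [cov])
            ll = mixture_loglik(X, *cand)
            if best is None or ll > best[0]:
                best = (ll, cand)
        weights, means, covs = best[1]
        # a full implementation would now run (partial) EM to convergence
    return weights, means, covs
```
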
5. Kunkel, Deborah, and Mario Peruggia. "Anchored Bayesian Gaussian mixture models". Electronic Journal of Statistics 14, no. 2 (2020): 3869–913. http://dx.doi.org/10.1214/20-ejs1756.

6. Chassagnol, Bastien, Antoine Bichat, Cheïma Boudjeniba, Pierre-Henri Wuillemin, Mickaël Guedj, David Gohel, Gregory Nuel, and Etienne Becht. "Gaussian Mixture Models in R". R Journal 15, no. 2 (November 1, 2023): 56–76. http://dx.doi.org/10.32614/rj-2023-043.

7. Ruzgas, Tomas, and Indrė Drulytė. "Kernel Density Estimators for Gaussian Mixture Models". Lietuvos statistikos darbai 52, no. 1 (December 20, 2013): 14–21. http://dx.doi.org/10.15388/ljs.2013.13919.

Abstract:
The problem of nonparametric estimation of a probability density function is considered. The performance of kernel estimators based on various common kernels and a new kernel K (see (14)), with both fixed and adaptive smoothing bandwidth, is compared in terms of the symmetric mean absolute percentage error using the Monte Carlo method. The kernel K is everywhere positive but has lighter tails than the Gaussian density. Gaussian mixture models from a collection introduced by Marron and Wand (1992) are taken for the Monte Carlo simulations. The adaptive kernel method outperforms smoothing with a fixed bandwidth in the majority of models. The kernel K shows better performance for Gaussian mixtures with considerably overlapping components and multiple peaks (double claw distribution).
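
A small sketch of the two estimator families being compared, assuming a Gaussian kernel: a fixed-bandwidth estimator and an Abramson-style adaptive one whose local bandwidths shrink where a pilot density estimate is high. The target below is a generic two-component Gaussian mixture, not one of the Marron–Wand models, and the paper's kernel K and SMAPE evaluation are not reproduced.

```python
import numpy as np
from scipy.stats import norm

def kde_fixed(grid, x, h):
    # fixed-bandwidth Gaussian kernel density estimate evaluated on `grid`
    return norm.pdf((grid[:, None] - x[None, :]) / h).mean(axis=1) / h

def kde_adaptive(grid, x, h0):
    # Abramson-style bandwidths: h_i = h0 * (pilot(x_i) / geometric mean)^(-1/2)
    pilot = kde_fixed(x, x, h0)
    h = h0 * (pilot / np.exp(np.mean(np.log(pilot)))) ** -0.5
    return (norm.pdf((grid[:, None] - x[None, :]) / h) / h).mean(axis=1)

rng = np.random.default_rng(1)
x = np.where(rng.random(500) < 0.5,
             rng.normal(-1.5, 0.5, 500), rng.normal(1.5, 0.5, 500))
grid = np.linspace(-4.0, 4.0, 201)
f_fix, f_ada = kde_fixed(grid, x, 0.3), kde_adaptive(grid, x, 0.3)
```
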
8. Chen, Yongxin, Tryphon T. Georgiou, and Allen Tannenbaum. "Optimal Transport for Gaussian Mixture Models". IEEE Access 7 (2019): 6269–78. http://dx.doi.org/10.1109/access.2018.2889838.

9. Nasios, N., and A. G. Bors. "Variational learning for Gaussian mixture models". IEEE Transactions on Systems, Man and Cybernetics, Part B (Cybernetics) 36, no. 4 (August 2006): 849–62. http://dx.doi.org/10.1109/tsmcb.2006.872273.

10. Zhang, Baibo, Changshui Zhang, and Xing Yi. "Active curve axis Gaussian mixture models". Pattern Recognition 38, no. 12 (December 2005): 2351–62. http://dx.doi.org/10.1016/j.patcog.2005.01.017.

11. Zeng, Jia, Lei Xie, and Zhi-Qiang Liu. "Type-2 fuzzy Gaussian mixture models". Pattern Recognition 41, no. 12 (December 2008): 3636–43. http://dx.doi.org/10.1016/j.patcog.2008.06.006.

12. Bolin, David, Jonas Wallin, and Finn Lindgren. "Latent Gaussian random field mixture models". Computational Statistics & Data Analysis 130 (February 2019): 80–93. http://dx.doi.org/10.1016/j.csda.2018.08.007.

13

MAEBASHI, K., N. SUEMATSU i A. HAYASHI. "Component Reduction for Gaussian Mixture Models". IEICE Transactions on Information and Systems E91-D, nr 12 (1.12.2008): 2846–53. http://dx.doi.org/10.1093/ietisy/e91-d.12.2846.

14. Nowakowska, Ewa, Jacek Koronacki, and Stan Lipovetsky. "Clusterability assessment for Gaussian mixture models". Applied Mathematics and Computation 256 (April 2015): 591–601. http://dx.doi.org/10.1016/j.amc.2014.12.038.

15. Di Zio, Marco, Ugo Guarnera, and Orietta Luzi. "Imputation through finite Gaussian mixture models". Computational Statistics & Data Analysis 51, no. 11 (July 2007): 5305–16. http://dx.doi.org/10.1016/j.csda.2006.10.002.

16. Yin, Jian Jun, and Jian Qiu Zhang. "Convolution PHD Filtering for Nonlinear Non-Gaussian Models". Advanced Materials Research 213 (February 2011): 344–48. http://dx.doi.org/10.4028/www.scientific.net/amr.213.344.

Abstract:
A novel probability hypothesis density (PHD) filter, called the Gaussian mixture convolution PHD (GMCPHD) filter, is proposed. The PHD within the filter is approximated by a Gaussian sum, as in the Gaussian mixture PHD (GMPHD) filter, but the model may be non-Gaussian and nonlinear. This is implemented by a bank of convolution filters with Gaussian approximations to the predicted and posterior densities. The analysis shows that the GMCPHD filter has lower complexity than the convolution PHD (CPHD) filter and is more amenable to parallel implementation, and that, unlike the existing Gaussian mixture particle PHD (GMPPHD) filter, it can deal with complex observation models, small observation noise, and non-Gaussian noise. Multi-target tracking simulations verify the effectiveness of the proposed method.
17. Maleki, Mohsen, and A. R. Nematollahi. "Autoregressive Models with Mixture of Scale Mixtures of Gaussian Innovations". Iranian Journal of Science and Technology, Transactions A: Science 41, no. 4 (April 21, 2017): 1099–107. http://dx.doi.org/10.1007/s40995-017-0237-6.

18. Meinicke, Peter, and Helge Ritter. "Resolution-Based Complexity Control for Gaussian Mixture Models". Neural Computation 13, no. 2 (February 2001): 453–75. http://dx.doi.org/10.1162/089976601300014600.

Abstract:
In the domain of unsupervised learning, mixtures of gaussians have become a popular tool for statistical modeling. For this class of generative models, we present a complexity control scheme, which provides an effective means for avoiding the problem of overfitting usually encountered with unconstrained (mixtures of) gaussians in high dimensions. According to some prespecified level of resolution as implied by a fixed variance noise model, the scheme provides an automatic selection of the dimensionalities of some local signal subspaces by maximum likelihood estimation. Together with a resolution-based control scheme for adjusting the number of mixture components, we arrive at an incremental model refinement procedure within a common deterministic annealing framework, which enables an efficient exploration of the model space. The advantages of the resolution-based framework are illustrated by experimental results on synthetic and high-dimensional real-world data.
19. Ueda, Naonori, Ryohei Nakano, Zoubin Ghahramani, and Geoffrey E. Hinton. "SMEM Algorithm for Mixture Models". Neural Computation 12, no. 9 (September 1, 2000): 2109–28. http://dx.doi.org/10.1162/089976600300015088.

Abstract:
We present a split-and-merge expectation-maximization (SMEM) algorithm to overcome the local maxima problem in parameter estimation of finite mixture models. In the case of mixture models, local maxima often involve having too many components of a mixture model in one part of the space and too few in another, widely separated part of the space. To escape from such configurations, we repeatedly perform simultaneous split-and-merge operations using a new criterion for efficiently selecting the split-and-merge candidates. We apply the proposed algorithm to the training of gaussian mixtures and mixtures of factor analyzers using synthetic and real data and show the effectiveness of using the split-and-merge operations to improve the likelihood of both the training data and of held-out test data. We also show the practical usefulness of the proposed algorithm by applying it to image compression and pattern recognition problems.
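
The merge half of the candidate-selection idea is compact enough to sketch: components whose vectors of posterior responsibilities over the data have a large dot product are likely covering the same region and are natural merge candidates. Below is a sketch of that ranking on top of a fitted scikit-learn model; the split criterion and the partial EM re-estimation that SMEM runs after each operation are not shown.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def merge_candidates(gmm: GaussianMixture, X: np.ndarray):
    # J_merge(i, j): dot product of the posterior-responsibility vectors of
    # components i and j over the data; large values flag redundant pairs
    R = gmm.predict_proba(X)            # shape (n_samples, n_components)
    J = R.T @ R
    i, j = np.triu_indices_from(J, k=1)
    order = np.argsort(J[i, j])[::-1]   # most overlapping pairs first
    return [(int(i[k]), int(j[k]), float(J[i[k], j[k]])) for k in order]
```
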
20. Masmoudi, Khalil, and Afif Masmoudi. "An EM algorithm for singular Gaussian mixture models". Filomat 33, no. 15 (2019): 4753–67. http://dx.doi.org/10.2298/fil1915753m.

Abstract:
In this paper, we introduce finite mixture models with singular multivariate normal components. These models are useful when the observed data involve collinearities, that is, when the covariance matrices are singular. They are also useful when the covariance matrices are ill-conditioned, in which case the classical approaches may lead to numerical instabilities and inaccurate estimates. Hence, an extension of the Expectation Maximization algorithm, with a complete proof, is proposed to derive the maximum likelihood estimators and cluster the data instances for mixtures of singular multivariate normal distributions. The accuracy of the proposed algorithm is then demonstrated through several numerical experiments. Finally, we discuss the application of the proposed distribution to financial asset returns modeling and portfolio selection.
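
The numerical core of such an extension is a Gaussian density that stays well defined when the covariance matrix is singular. A common construction, sketched below (whether it matches the authors' exact formulation is an assumption), restricts the quadratic form to the support of the covariance via its eigendecomposition, replacing the inverse with a pseudo-inverse and the determinant with a pseudo-determinant; EM responsibilities then follow from these log-densities plus the log-weights.

```python
import numpy as np

def singular_gaussian_logpdf(X, mu, cov, tol=1e-10):
    # log-density of a possibly singular multivariate normal, defined on the
    # affine support of `cov` via pseudo-determinant and pseudo-inverse
    vals, vecs = np.linalg.eigh(cov)
    keep = vals > tol
    rank = int(keep.sum())
    log_pdet = np.log(vals[keep]).sum()
    U = vecs[:, keep] / np.sqrt(vals[keep])   # whitens along the support
    dev = (np.atleast_2d(X) - mu) @ U
    maha = np.einsum('ij,ij->i', dev, dev)
    return -0.5 * (rank * np.log(2 * np.pi) + log_pdet + maha)
```
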
21. Yamada, Makoto, and Masashi Sugiyama. "Direct Importance Estimation with Gaussian Mixture Models". IEICE Transactions on Information and Systems E92-D, no. 10 (2009): 2159–62. http://dx.doi.org/10.1587/transinf.e92.d.2159.

22. Ye, Peng, Fang Liu, and Zhiyong Zhao. "Multiple Gaussian Mixture Models for Image Registration". IEICE Transactions on Information and Systems E97.D, no. 7 (2014): 1927–29. http://dx.doi.org/10.1587/transinf.e97.d.1927.

23. Yang, Jianbo, Xin Yuan, Xuejun Liao, Patrick Llull, David J. Brady, Guillermo Sapiro, and Lawrence Carin. "Video Compressive Sensing Using Gaussian Mixture Models". IEEE Transactions on Image Processing 23, no. 11 (November 2014): 4863–78. http://dx.doi.org/10.1109/tip.2014.2344294.

24

Zhiwu Lu i H. H. S. Ip. "Generalized Competitive Learning of Gaussian Mixture Models". IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics) 39, nr 4 (sierpień 2009): 901–9. http://dx.doi.org/10.1109/tsmcb.2008.2012119.

25. Akaho, Shotaro, and Hilbert J. Kappen. "Nonmonotonic Generalization Bias of Gaussian Mixture Models". Neural Computation 12, no. 6 (June 1, 2000): 1411–27. http://dx.doi.org/10.1162/089976600300015439.

Abstract:
Theories of learning and generalization hold that the generalization bias, defined as the difference between the training error and the generalization error, increases on average with the number of adaptive parameters. This article, however, shows that this general tendency is violated for a gaussian mixture model. For temperatures just below the first symmetry breaking point, the effective number of adaptive parameters increases and the generalization bias decreases. We compute the dependence of the neural information criterion on temperature around the symmetry breaking. Our results are confirmed by numerical cross-validation experiments.
26. Yu, Guoshen, and Guillermo Sapiro. "Statistical Compressed Sensing of Gaussian Mixture Models". IEEE Transactions on Signal Processing 59, no. 12 (December 2011): 5842–58. http://dx.doi.org/10.1109/tsp.2011.2168521.

27. Matza, Avi, and Yuval Bistritz. "Skew Gaussian mixture models for speaker recognition". IET Signal Processing 8, no. 8 (October 2014): 860–67. http://dx.doi.org/10.1049/iet-spr.2013.0270.

28. Arellano, Claudia, and Rozenn Dahyot. "Robust ellipse detection with Gaussian mixture models". Pattern Recognition 58 (October 2016): 12–26. http://dx.doi.org/10.1016/j.patcog.2016.01.017.

29. Lee, Kevin H., and Lingzhou Xue. "Nonparametric Finite Mixture of Gaussian Graphical Models". Technometrics 60, no. 4 (October 2, 2018): 511–21. http://dx.doi.org/10.1080/00401706.2017.1408497.

30. Jones, Daniel M., and Alan F. Heavens. "Gaussian mixture models for blended photometric redshifts". Monthly Notices of the Royal Astronomical Society 490, no. 3 (September 27, 2019): 3966–86. http://dx.doi.org/10.1093/mnras/stz2687.

Abstract:
Future cosmological galaxy surveys such as the Large Synoptic Survey Telescope (LSST) will photometrically observe very large numbers of galaxies. Without spectroscopy, the redshifts required for the analysis of these data will need to be inferred using photometric redshift techniques that are scalable to large sample sizes. The high number density of sources will also mean that around half are blended. We present a Bayesian photometric redshift method for blended sources that uses Gaussian mixture models to learn the joint flux–redshift distribution from a set of unblended training galaxies, and Bayesian model comparison to infer the number of galaxies comprising a blended source. The use of Gaussian mixture models renders both of these applications computationally efficient and therefore suitable for upcoming galaxy surveys.
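
Conditioning a fitted joint GMM on observed fluxes is available in closed form, which is what makes this approach computationally efficient. Below is a sketch for a single unblended source using the standard Gaussian conditioning formulas, assuming a scikit-learn model with covariance_type='full' trained on columns [flux_1, ..., flux_d, z]; the paper's Bayesian treatment of blends is not reproduced, and `redshift_posterior` is an illustrative name.

```python
import numpy as np
from scipy.stats import multivariate_normal, norm

def redshift_posterior(gmm, flux, z_grid):
    # p(z | flux) for a GMM fitted jointly on (flux_1, ..., flux_d, z)
    d = flux.size
    log_r = np.array([np.log(w) + multivariate_normal.logpdf(flux, m[:d], C[:d, :d])
                      for w, m, C in zip(gmm.weights_, gmm.means_, gmm.covariances_)])
    r = np.exp(log_r - log_r.max())
    r /= r.sum()                                  # per-component posteriors
    post = np.zeros_like(z_grid, dtype=float)
    for rk, m, C in zip(r, gmm.means_, gmm.covariances_):
        gain = C[d, :d] @ np.linalg.inv(C[:d, :d])
        mu = m[d] + gain @ (flux - m[:d])         # conditional mean
        var = C[d, d] - gain @ C[:d, d]           # conditional variance
        post += rk * norm.pdf(z_grid, mu, np.sqrt(var))
    return post
```
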
31. Wallet, Bradley C., and Robert Hardisty. "Unsupervised seismic facies using Gaussian mixture models". Interpretation 7, no. 3 (August 1, 2019): SE93–SE111. http://dx.doi.org/10.1190/int-2018-0119.1.

Abstract:
As the use of seismic attributes becomes more widespread, multivariate seismic analysis has become more commonplace for seismic facies analysis. Unsupervised machine-learning techniques provide methods of automatically finding patterns in data with minimal user interaction. When using unsupervised techniques such as k-means or Kohonen self-organizing maps (SOMs), the number of clusters is often ambiguously defined, and there is no measure of how confident the algorithm is in its classification of data vectors. The model-based probabilistic formulation of Gaussian mixture models (GMMs) allows the number and shape of clusters to be determined in a more objective manner using a Bayesian framework that considers a model's likelihood and complexity. Furthermore, the development of alternative expectation-maximization (EM) algorithms has allowed GMMs to be more tailored to unsupervised seismic facies analysis. The classification EM algorithm classifies data vectors according to their posterior probabilities, which provide a measurement of uncertainty and ambiguity (often called a soft classification). The neighborhood EM (NEM) algorithm allows spatial correlations to be considered, making classification volumes more realistic by enforcing spatial continuity. Corendering the classification with the uncertainty and ambiguity measurements produces an intuitive map of unsupervised seismic facies. We apply a model-based classification approach using GMMs to a turbidite system in Canterbury Basin, New Zealand, to clarify results from an initial SOM and highlight areas of uncertainty and ambiguity. Special focus on a channel feature in the turbidite system using an NEM algorithm shows it to be more realistic by considering spatial correlations within the data.
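
The model-selection and soft-classification parts of this workflow map directly onto a standard GMM library. A minimal scikit-learn sketch follows (one attribute vector per voxel in X; the NEM spatial-continuity term is not included, and `facies_gmm` is an illustrative name):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def facies_gmm(X, k_range=range(2, 12), seed=0):
    # choose the number of facies by BIC, trading likelihood vs. complexity
    models = [GaussianMixture(k, covariance_type='full',
                              random_state=seed).fit(X) for k in k_range]
    best = models[int(np.argmin([m.bic(X) for m in models]))]
    proba = best.predict_proba(X)                 # soft classification
    # entropy of the posteriors gives a per-sample ambiguity map
    ambiguity = -(proba * np.log(proba + 1e-12)).sum(axis=1)
    return best.predict(X), proba, ambiguity
```
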
32. Silva, Diogo S. F., and Clayton V. Deutsch. "Multivariate data imputation using Gaussian mixture models". Spatial Statistics 27 (October 2018): 74–90. http://dx.doi.org/10.1016/j.spasta.2016.11.002.

33. Hedelin, P., and J. Skoglund. "Vector quantization based on Gaussian mixture models". IEEE Transactions on Speech and Audio Processing 8, no. 4 (July 2000): 385–401. http://dx.doi.org/10.1109/89.848220.

34. Zhang, J., and D. Ma. "Nonlinear Prediction for Gaussian Mixture Image Models". IEEE Transactions on Image Processing 13, no. 6 (June 2004): 836–47. http://dx.doi.org/10.1109/tip.2004.828197.

35

Jiucang Hao, Te-Won Lee i Terrence J. Sejnowski. "Speech Enhancement Using Gaussian Scale Mixture Models". IEEE Transactions on Audio, Speech, and Language Processing 18, nr 6 (sierpień 2010): 1127–36. http://dx.doi.org/10.1109/tasl.2009.2030012.

36. Fiez, Tanner, and Lillian J. Ratliff. "Gaussian Mixture Models for Parking Demand Data". IEEE Transactions on Intelligent Transportation Systems 21, no. 8 (August 2020): 3571–80. http://dx.doi.org/10.1109/tits.2019.2939499.

37. Burges, Christopher John. "Discriminative Gaussian mixture models for speaker verification". Journal of the Acoustical Society of America 113, no. 5 (2003): 2393. http://dx.doi.org/10.1121/1.1584172.

38. Morgan, Grant B. "Generating Nonnormal Distributions via Gaussian Mixture Models". Structural Equation Modeling: A Multidisciplinary Journal 27, no. 6 (February 5, 2020): 964–74. http://dx.doi.org/10.1080/10705511.2020.1718502.

39. Reynolds, Douglas A., Thomas F. Quatieri, and Robert B. Dunn. "Speaker Verification Using Adapted Gaussian Mixture Models". Digital Signal Processing 10, no. 1-3 (January 2000): 19–41. http://dx.doi.org/10.1006/dspr.1999.0361.

40. Wei, Hui, and Wei Zheng. "Image Denoising Based on Improved Gaussian Mixture Model". Scientific Programming 2021 (September 22, 2021): 1–8. http://dx.doi.org/10.1155/2021/7982645.

Abstract:
An image denoising method based on an improved Gaussian mixture model is proposed to reduce noise and enhance image quality. Unlike traditional denoising methods, the proposed method models the pixel information in the neighborhood around each pixel: a Gaussian mixture model captures local statistics such as the mean and variance of the surrounding pixels, and the similarity between two pixels is measured by the L2 norm between their corresponding Gaussian mixture models. This norm reflects differences in local grayscale intensity and in the richness of detail around the two pixels, and therefore measures pixel similarity more accurately. Experimental results show that the proposed method improves denoising performance while retaining the detailed information of the image.
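
The L2 distance between two Gaussian mixtures needs no sampling: by the identity that the integral of N(x; m1, C1) * N(x; m2, C2) over x equals N(m1; m2, C1 + C2), the squared distance expands into pairwise component terms. A sketch of that computation follows (how the paper builds the per-pixel neighborhood mixtures is not shown):

```python
import numpy as np
from scipy.stats import multivariate_normal as mvn

def gmm_cross(p, q):
    # integral of the product of two mixtures: sum_ij a_i b_j N(m_i; n_j, C_i + D_j)
    (wa, ma, ca), (wb, mb, cb) = p, q
    return sum(a * b * mvn.pdf(m1, mean=m2, cov=C1 + C2)
               for a, m1, C1 in zip(wa, ma, ca)
               for b, m2, C2 in zip(wb, mb, cb))

def gmm_l2_sq(p, q):
    # squared L2 distance ||f - g||^2 = <f,f> + <g,g> - 2<f,g>
    return gmm_cross(p, p) + gmm_cross(q, q) - 2.0 * gmm_cross(p, q)
```
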
41. Krasilnikov, A. I. "Classification of Models of Two-component Mixtures of Symmetrical Distributions with Zero Kurtosis Coefficient". Èlektronnoe modelirovanie 45, no. 5 (October 10, 2023): 20–38. http://dx.doi.org/10.15407/emodel.45.05.020.

Abstract:
On the basis of a family of two-component mixtures of distributions, a class K of symmetric non-Gaussian distributions with a zero kurtosis coefficient is defined and divided into two groups and five types. The dependence of the fourth-order cumulant on the weight coefficient of the mixture is studied, yielding the conditions under which the kurtosis coefficient of the mixture equals zero. The use of a two-component mixture of Subbotin distributions for modeling unimodal symmetric distributions with a zero kurtosis coefficient is justified, and examples of symmetric non-Gaussian distributions with a zero kurtosis coefficient are given. Class K models make it possible, at the design stage, to compare the effectiveness of methods and systems developed for processing non-Gaussian signals with zero skewness and kurtosis coefficients.
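
A quick numerical check of the zero-kurtosis condition: for a zero-mean two-component mixture, the kurtosis coefficient vanishes when mu4(w) = 3 * mu2(w)^2. With Gaussian components of different variances the excess kurtosis is always positive, which is one motivation for Subbotin (generalized normal) components, one of which can be platykurtic. The sketch below mixes shape parameters beta = 4 and beta = 1 and solves for the weight; these specific shapes are illustrative, not taken from the paper.

```python
from math import gamma
from scipy.optimize import brentq

def subbotin_moments(beta, scale=1.0):
    # central moments of a Subbotin (generalized normal) density:
    # mu2 = s^2 * G(3/b)/G(1/b),  mu4 = s^4 * G(5/b)/G(1/b)
    m2 = scale**2 * gamma(3.0 / beta) / gamma(1.0 / beta)
    m4 = scale**4 * gamma(5.0 / beta) / gamma(1.0 / beta)
    return m2, m4

def mixture_kurtosis(w, a, b):
    # kurtosis coefficient of w*f_a + (1-w)*f_b, both zero-mean and symmetric
    m2 = w * a[0] + (1.0 - w) * b[0]
    m4 = w * a[1] + (1.0 - w) * b[1]
    return m4 / m2**2 - 3.0

flat = subbotin_moments(4.0)     # platykurtic component (negative excess)
peaky = subbotin_moments(1.0)    # leptokurtic component (a Laplace density)
w0 = brentq(lambda w: mixture_kurtosis(w, flat, peaky), 1e-6, 1.0 - 1e-6)
print(w0)                        # the mixture weight giving zero kurtosis
```
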
42. Røge, Rasmus E., Kristoffer H. Madsen, Mikkel N. Schmidt, and Morten Mørup. "Infinite von Mises–Fisher Mixture Modeling of Whole Brain fMRI Data". Neural Computation 29, no. 10 (October 2017): 2712–41. http://dx.doi.org/10.1162/neco_a_01000.

Abstract:
Cluster analysis of functional magnetic resonance imaging (fMRI) data is often performed using gaussian mixture models, but when the time series are standardized such that the data reside on a hypersphere, this modeling assumption is questionable. The consequences of ignoring the underlying spherical manifold are rarely analyzed, in part due to the computational challenges imposed by directional statistics. In this letter, we discuss a Bayesian von Mises–Fisher (vMF) mixture model for data on the unit hypersphere and present an efficient inference procedure based on collapsed Markov chain Monte Carlo sampling. Comparing the vMF and gaussian mixture models on synthetic data, we demonstrate that the vMF model has a slight advantage inferring the true underlying clustering when compared to gaussian-based models on data generated from both a mixture of vMFs and a mixture of gaussians subsequently normalized. Thus, when performing model selection, the two models are not in agreement. Analyzing multisubject whole brain resting-state fMRI data from healthy adult subjects, we find that the vMF mixture model is considerably more reliable than the gaussian mixture model when comparing solutions across models trained on different groups of subjects, and again we find that the two models disagree on the optimal number of components. The analysis indicates that the fMRI data support more than a thousand clusters, and we confirm this is not a result of overfitting by demonstrating better prediction on data from held-out subjects. Our results highlight the utility of using directional statistics to model standardized fMRI data and demonstrate that whole brain segmentation of fMRI data requires a very large number of functional units in order to adequately account for the discernible statistical patterns in the data.
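
The modeling point is easy to state in code: standardizing each time series to zero mean and unit norm places it on a unit hypersphere, where the von Mises–Fisher density C(kappa) * exp(kappa * mu . x) plays the role an isotropic Gaussian plays in Euclidean space. A sketch of the log-density on the 2-sphere, where the normalizer reduces to kappa / (4 * pi * sinh(kappa)); general dimensions need a modified Bessel function, and the paper's collapsed MCMC inference is well beyond this snippet.

```python
import numpy as np

def vmf_logpdf_s2(X, mu, kappa):
    # log-density of the von Mises-Fisher distribution on the 2-sphere;
    # log(sinh k) is computed as k + log(1 - exp(-2k)) - log 2 for stability
    log_sinh = kappa + np.log1p(-np.exp(-2.0 * kappa)) - np.log(2.0)
    log_c = np.log(kappa) - np.log(4.0 * np.pi) - log_sinh
    return log_c + kappa * (X @ mu)   # rows of X and mu are unit vectors
```
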
43. Chuan, Ching-Hua. "Audio Classification and Retrieval Using Wavelets and Gaussian Mixture Models". International Journal of Multimedia Data Engineering and Management 4, no. 1 (January 2013): 1–20. http://dx.doi.org/10.4018/jmdem.2013010101.

Abstract:
This paper presents an audio classification and retrieval system that uses wavelets to extract low-level acoustic features. The author performs multiple-level decomposition using the discrete wavelet transform to extract acoustic features from audio recordings at different scales and times, and translates the extracted features into a compact vector representation. Gaussian mixture models with the expectation-maximization algorithm are used to build models for audio classes and individual audio examples. The system is evaluated on three audio classification tasks: speech/music, male/female speech, and music genre. The paper also shows how wavelets and Gaussian mixture models are used for class-based audio retrieval in two approaches: indexing using only wavelets versus indexing by Gaussian components. Evaluation through 10-fold cross-validation shows the promising capability of wavelets and Gaussian mixture models for audio classification and retrieval, and the paper compares how parameters including frame size, wavelet level, Gaussian components, and sampling size affect performance.
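
A generic sketch of such a pipeline with PyWavelets and scikit-learn: per-frame log-energies of the coefficients at each decomposition level form the feature vector, one GMM is trained per class, and a recording is assigned to the class whose model gives the highest average log-likelihood. The paper's exact features, retrieval indexing, and parameter choices differ; all names here are illustrative.

```python
import numpy as np
import pywt
from sklearn.mixture import GaussianMixture

def wavelet_features(frame, wavelet='db4', level=5):
    # log-energy of the approximation and detail coefficients at each level
    coeffs = pywt.wavedec(frame, wavelet, level=level)
    return np.array([np.log(np.mean(c * c) + 1e-12) for c in coeffs])

def train_class_gmm(frames, n_components=8, seed=0):
    feats = np.vstack([wavelet_features(f) for f in frames])
    return GaussianMixture(n_components, random_state=seed).fit(feats)

def classify(frames, class_models):
    feats = np.vstack([wavelet_features(f) for f in frames])
    # score() is the mean per-frame log-likelihood under each class model
    return int(np.argmax([m.score(feats) for m in class_models]))
```
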
44. Kiruthika, V., and R. Karpagam. "Super Pixel Segmentation Using Fuzzy Genetic Filter (FGF) with Gaussian Mixture Models (GMMs)". Journal of Advanced Research in Dynamical and Control Systems 11, no. 9 (September 30, 2019): 89–100. http://dx.doi.org/10.5373/jardcs/v11i9/20192918.

45. Zhao, Mingyang, Xiaohong Jia, Lubin Fan, Yuan Liang, and Dong-Ming Yan. "Robust Ellipse Fitting Using Hierarchical Gaussian Mixture Models". IEEE Transactions on Image Processing 30 (2021): 3828–43. http://dx.doi.org/10.1109/tip.2021.3065799.

46. Belton, D., S. Moncrieff, and J. Chapman. "Processing tree point clouds using Gaussian Mixture Models". ISPRS Annals of Photogrammetry, Remote Sensing and Spatial Information Sciences II-5/W2 (October 16, 2013): 43–48. http://dx.doi.org/10.5194/isprsannals-ii-5-w2-43-2013.

47

NISHIMOTO, Hiroki, Renyuan ZHANG i Yasuhiko NAKASHIMA. "GPGPU Implementation of Variational Bayesian Gaussian Mixture Models". IEICE Transactions on Information and Systems E105.D, nr 3 (1.03.2022): 611–22. http://dx.doi.org/10.1587/transinf.2021edp7121.

48. Liang, Xi-Long, Yu-Qin Chen, Jing-Kun Zhao, and Gang Zhao. "Partitioning the Galactic halo with Gaussian Mixture Models". Research in Astronomy and Astrophysics 21, no. 5 (June 1, 2021): 128. http://dx.doi.org/10.1088/1674-4527/21/5/128.

49. Chen, Xiaohui, and Yun Yang. "Cutoff for Exact Recovery of Gaussian Mixture Models". IEEE Transactions on Information Theory 67, no. 6 (June 2021): 4223–38. http://dx.doi.org/10.1109/tit.2021.3063155.

50. Ohata, Masashi, Tsuyoshi Tokunari, and Kiyotoshi Matsuoka. "Online Blind Signals Separation with Gaussian Mixture Models". Transactions of the Society of Instrument and Control Engineers 36, no. 11 (2000): 985–94. http://dx.doi.org/10.9746/sicetr1965.36.985.
