Follow this link to see other types of publications on the topic: Spectral clustering.

Journal articles on the topic "Spectral clustering"

Create an accurate citation in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 journal articles for your research on the topic "Spectral clustering".

Next to each source in the list of references there is an "Add to bibliography" button. Press this button, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Vancouver, Chicago, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Explore journal articles on a wide variety of disciplines and organize your bibliography correctly.

1

Hess, Sibylle, Wouter Duivesteijn, Philipp Honysz, and Katharina Morik. "The SpectACl of Nonconvex Clustering: A Spectral Approach to Density-Based Clustering". Proceedings of the AAAI Conference on Artificial Intelligence 33 (July 17, 2019): 3788–95. http://dx.doi.org/10.1609/aaai.v33i01.33013788.

Abstract
When it comes to clustering nonconvex shapes, two paradigms are used to find the most suitable clustering: minimum cut and maximum density. The most popular algorithms incorporating these paradigms are Spectral Clustering and DBSCAN. Both paradigms have their pros and cons. While minimum cut clusterings are sensitive to noise, density-based clusterings have trouble handling clusters with varying densities. In this paper, we propose SpectACl: a method combining the advantages of both approaches while solving the two mentioned drawbacks. Our method is easy to implement, like Spectral Clustering, and is theoretically founded to optimize a proposed density criterion of clusterings. Through experiments on synthetic and real-world data, we demonstrate that our approach provides robust and reliable clusterings.
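The minimum-cut vs. maximum-density contrast described in this abstract is easy to reproduce with off-the-shelf tools. The sketch below uses scikit-learn's SpectralClustering and DBSCAN (not the authors' SpectACl implementation; the parameter values are illustrative) on the classic nonconvex two-moons dataset:

```python
from sklearn.cluster import DBSCAN, SpectralClustering
from sklearn.datasets import make_moons
from sklearn.metrics import adjusted_rand_score

# Two interleaving half-circles: a classic nonconvex clustering task.
X, y = make_moons(n_samples=300, noise=0.05, random_state=0)

# Minimum-cut paradigm: partition a k-nearest-neighbor similarity graph.
sc_labels = SpectralClustering(n_clusters=2, affinity="nearest_neighbors",
                               n_neighbors=10, random_state=0).fit_predict(X)

# Maximum-density paradigm: grow clusters outward from dense neighborhoods.
db_labels = DBSCAN(eps=0.2, min_samples=5).fit_predict(X)

# Both paradigms cope with the nonconvex shapes at these settings.
assert adjusted_rand_score(y, sc_labels) > 0.8
assert adjusted_rand_score(y, db_labels) > 0.8
```

The sensitivity the abstract mentions shows up as soon as the settings change: raising `eps` bridges the moons for DBSCAN, and adding noise distorts the cut for spectral clustering.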
2

Li, Hongmin, Xiucai Ye, Akira Imakura, and Tetsuya Sakurai. "LSEC: Large-scale spectral ensemble clustering". Intelligent Data Analysis 27, no. 1 (January 30, 2023): 59–77. http://dx.doi.org/10.3233/ida-216240.

Abstract
A fundamental problem in machine learning is ensemble clustering, that is, combining multiple base clusterings to obtain an improved clustering result. However, most of the existing methods are unsuitable for large-scale ensemble clustering tasks owing to efficiency bottlenecks. In this paper, we propose a large-scale spectral ensemble clustering (LSEC) method to balance efficiency and effectiveness. In LSEC, a large-scale spectral clustering-based efficient ensemble generation framework is designed to generate various base clusterings with low computational complexity. Thereafter, all the base clusterings are combined using a bipartite graph partition-based consensus function to obtain improved consensus clustering results. The LSEC method achieves a lower computational complexity than most existing ensemble clustering methods. Experiments conducted on ten large-scale datasets demonstrate the efficiency and effectiveness of the LSEC method. The MATLAB code of the proposed method and experimental datasets are available at https://github.com/Li-Hongmin/MyPaperWithCode.
3

Zhuang, Xinwei, and Sean Hanna. "Space Frame Optimisation with Spectral Clustering". International Journal of Machine Learning and Computing 10, no. 4 (July 2020): 507–12. http://dx.doi.org/10.18178/ijmlc.2020.10.4.965.

4

Sun, Gan, Yang Cong, Qianqian Wang, Jun Li, and Yun Fu. "Lifelong Spectral Clustering". Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 04 (April 3, 2020): 5867–74. http://dx.doi.org/10.1609/aaai.v34i04.6045.

Abstract
In the past decades, spectral clustering (SC) has become one of the most effective clustering algorithms. However, most previous studies focus on spectral clustering tasks with a fixed task set, and cannot incorporate a new spectral clustering task without access to previously learned tasks. In this paper, we aim to explore the problem of spectral clustering in a lifelong machine learning framework, i.e., Lifelong Spectral Clustering (L2SC). Its goal is to efficiently learn a model for a new spectral clustering task by selectively transferring previously accumulated experience from a knowledge library. Specifically, the knowledge library of L2SC contains two components: 1) an orthogonal basis library, capturing latent cluster centers among the clusters in each pair of tasks; 2) a feature embedding library, embedding the feature manifold information shared among multiple related tasks. As a new spectral clustering task arrives, L2SC first transfers knowledge from both the basis library and the feature library to obtain the encoding matrix, and further redefines the library bases over time to maximize performance across all the clustering tasks. Meanwhile, a general online update formulation is derived to alternately update the basis library and feature library. Finally, empirical experiments on several real-world benchmark datasets demonstrate that our L2SC model can effectively improve the clustering performance compared with other state-of-the-art spectral clustering algorithms.
5

Ling Ping, Rong Xiangsheng, and Dong Yongquan. "Incremental Spectral Clustering". Journal of Convergence Information Technology 7, no. 15 (August 31, 2012): 286–93. http://dx.doi.org/10.4156/jcit.vol7.issue15.34.

6

Kim, Jaehwan, and Seungjin Choi. "Semidefinite spectral clustering". Pattern Recognition 39, no. 11 (November 2006): 2025–35. http://dx.doi.org/10.1016/j.patcog.2006.05.021.

7

Challa, Aditya, Sravan Danda, B. S. Daya Sagar, and Laurent Najman. "Power Spectral Clustering". Journal of Mathematical Imaging and Vision 62, no. 9 (July 11, 2020): 1195–213. http://dx.doi.org/10.1007/s10851-020-00980-7.

8

Huang, Jin, Feiping Nie, and Heng Huang. "Spectral Rotation versus K-Means in Spectral Clustering". Proceedings of the AAAI Conference on Artificial Intelligence 27, no. 1 (June 30, 2013): 431–37. http://dx.doi.org/10.1609/aaai.v27i1.8683.

Abstract
Spectral clustering has been a popular data clustering algorithm. This category of approaches often resorts to other clustering methods, such as K-Means, to get the final clusters. The potential flaw of this common practice is that the obtained relaxed continuous spectral solution can severely deviate from the true discrete solution. In this paper, we propose to impose an additional orthonormal constraint to better approximate the optimal continuous solution to the graph cut objective functions. Such a method, called spectral rotation in the literature, optimizes the spectral clustering objective functions better than K-Means and improves the clustering accuracy. We provide an efficient algorithm to solve the new problem rigorously, which is not significantly more costly than K-Means. We also establish the connection between our method and K-Means to provide theoretical motivation for our method. Experimental results show that our algorithm consistently reaches better cuts and outperforms classic spectral clustering methods on clustering metrics.
9

JIN, Hui-zhen. "Multilevel spectral clustering with ascertainable clustering number". Journal of Computer Applications 28, no. 5 (October 17, 2008): 1229–31. http://dx.doi.org/10.3724/sp.j.1087.2008.01229.

10

Huang, Dong, Chang-Dong Wang, Jian-Sheng Wu, Jian-Huang Lai, and Chee-Keong Kwoh. "Ultra-Scalable Spectral Clustering and Ensemble Clustering". IEEE Transactions on Knowledge and Data Engineering 32, no. 6 (June 1, 2020): 1212–26. http://dx.doi.org/10.1109/tkde.2019.2903410.

11

Yousefi, Bardia, Clemente Ibarra-Castanedo, Martin Chamberland, Xavier P. V. Maldague, and Georges Beaudoin. "Unsupervised Identification of Targeted Spectra Applying Rank1-NMF and FCC Algorithms in Long-Wave Hyperspectral Infrared Imagery". Remote Sensing 13, no. 11 (May 28, 2021): 2125. http://dx.doi.org/10.3390/rs13112125.

Abstract
Clustering methods unequivocally show considerable influence on many recent algorithms and play an important role in hyperspectral data analysis. Here, we approach clustering for mineral identification using two different strategies in hyperspectral long-wave infrared imagery (LWIR, 7.7–11.8 μm), comparing two algorithms on a unique dataset. The first algorithm uses spectral comparison techniques for all the pixel spectra and creates RGB false color composites (FCC); a color-based clustering is then used to group the regions (called FCC-clustering). The second algorithm clusters all the pixel spectra to group the spectra directly; the first rank of non-negative matrix factorization (NMF) then extracts the representative of each cluster and compares the results with the spectral library of JPL/NASA. These techniques yield the comparison values as features, which are converted into RGB-FCC results (called clustering-rank1-NMF). We applied K-means as the clustering approach, which can be replaced by any other similar clustering approach. The results of the clustering-rank1-NMF algorithm indicate significant computational efficiency (more than 20 times faster than the previous approach) and promising performance for mineral identification, with average accuracies of up to 75.8% and 84.8% for the FCC-clustering and clustering-rank1-NMF algorithms (using the spectral angle mapper (SAM)), respectively. Furthermore, several other spectral comparison techniques are used for both algorithms, such as the adaptive matched subspace detector (AMSD), the orthogonal subspace projection (OSP) algorithm, principal component analysis (PCA), the local matched filter (PLMF), SAM, and normalized cross correlation (NCC), and most of them show a similar range of accuracy. However, SAM and NCC are preferred due to their computational simplicity. Our algorithms strive to identify eleven different mineral grains (biotite, diopside, epidote, goethite, kyanite, scheelite, smithsonite, tourmaline, pyrope, olivine, and quartz).
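Of the spectral comparison techniques listed above, the spectral angle mapper (SAM) is the simplest to sketch: it measures the angle between two spectra, making it invariant to overall illumination scaling. The spectra below are made-up four-band examples, not data from the paper:

```python
import numpy as np

def spectral_angle(a, b):
    """Angle in radians between two spectra; 0 means identical shape."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

reference = np.array([0.2, 0.5, 0.9, 0.4])   # hypothetical 4-band spectrum
same_shape = 3.0 * reference                 # brighter copy, same shape
different = np.array([0.9, 0.1, 0.3, 0.8])

assert spectral_angle(reference, same_shape) < 1e-6   # scaling-invariant
assert spectral_angle(reference, different) > 0.5     # clearly distinct
```

The scaling invariance is what makes SAM (and the closely related NCC) attractive for hyperspectral matching: pixel brightness varies with illumination, but the spectral shape carries the mineral signature.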
12

Fu, Li Li, Yong Li Liu, and Li Jing Hao. "Research on Spectral Clustering". Applied Mechanics and Materials 687-691 (November 2014): 1350–53. http://dx.doi.org/10.4028/www.scientific.net/amm.687-691.1350.

Abstract
Spectral clustering is a kind of clustering algorithm based on spectral graph theory. As spectral clustering has a deep theoretical foundation as well as an advantage in dealing with non-convex distributions, it has received much attention in the machine learning and data mining areas. The algorithm is easy to implement and outperforms traditional clustering algorithms such as the K-means algorithm. This paper aims to give some intuitions about spectral clustering. We describe different graph partition criteria, the definition of spectral clustering, the clustering steps, etc. Finally, in order to address the disadvantages of spectral clustering, some improvements are briefly introduced.
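The clustering steps this abstract refers to can be sketched end to end. The following is a minimal NumPy illustration of the generic spectral clustering pipeline (Gaussian affinity, symmetric normalized Laplacian, embedding from the smallest eigenvectors, final k-means), not code from the paper:

```python
import numpy as np
from sklearn.cluster import KMeans

def spectral_clustering(X, k, sigma=1.0):
    # 1) Gaussian affinity matrix from pairwise squared distances.
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-sq / (2.0 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    # 2) Symmetric normalized Laplacian L = I - D^{-1/2} W D^{-1/2}.
    d_inv_sqrt = 1.0 / np.sqrt(W.sum(axis=1))
    L = np.eye(len(X)) - d_inv_sqrt[:, None] * W * d_inv_sqrt[None, :]
    # 3) Embed each point as a row of the k smallest eigenvectors.
    _, vecs = np.linalg.eigh(L)          # eigenvalues in ascending order
    U = vecs[:, :k]
    U /= np.linalg.norm(U, axis=1, keepdims=True)  # row-normalize the embedding
    # 4) Discretize the relaxed solution with k-means.
    return KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(U)

# Two well-separated blobs are recovered exactly.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.3, (30, 2)), rng.normal(5.0, 0.3, (30, 2))])
labels = spectral_clustering(X, 2)
assert len(set(labels[:30])) == 1 and len(set(labels[30:])) == 1
```

Steps 1 and 4 are the usual targets of the improvements the survey mentions: the affinity construction determines what the graph "sees", and the k-means discretization is where the relaxed solution can drift from the true partition.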
13

Wada, Yuichiro, Shugo Miyamoto, Takumi Nakagama, Léo Andéol, Wataru Kumagai, and Takafumi Kanamori. "Spectral Embedded Deep Clustering". Entropy 21, no. 8 (August 15, 2019): 795. http://dx.doi.org/10.3390/e21080795.

Abstract
We propose a new clustering method based on a deep neural network. Given an unlabeled dataset and the number of clusters, our method directly groups the dataset into the given number of clusters in the original space. We use a conditional discrete probability distribution defined by a deep neural network as a statistical model. Our strategy is first to estimate the cluster labels of unlabeled data points selected from a high-density region, and then to conduct semi-supervised learning to train the model by using the estimated cluster labels and the remaining unlabeled data points. Lastly, by using the trained model, we obtain the estimated cluster labels of all given unlabeled data points. The advantage of our method is that it does not require such restrictive conditions: existing clustering methods with deep neural networks assume that the cluster balance of a given dataset is uniform, whereas ours does not. Moreover, it can be applied to various data domains as long as the data are expressed by feature vectors. In addition, we observe that our method is robust against outliers. Therefore, the proposed method is expected to perform, on average, better than previous methods. We conducted numerical experiments on five commonly used datasets to confirm the effectiveness of the proposed method.
14

Chi, Yun, Xiaodan Song, Dengyong Zhou, Koji Hino, and Belle L. Tseng. "On evolutionary spectral clustering". ACM Transactions on Knowledge Discovery from Data 3, no. 4 (November 2009): 1–30. http://dx.doi.org/10.1145/1631162.1631165.

15

Chen, Jiansheng, Zhengqin Li, and Bo Huang. "Linear Spectral Clustering Superpixel". IEEE Transactions on Image Processing 26, no. 7 (July 2017): 3317–30. http://dx.doi.org/10.1109/tip.2017.2651389.

16

Li, Jianyuan, Yingjie Xia, Zhenyu Shan, and Yuncai Liu. "Scalable Constrained Spectral Clustering". IEEE Transactions on Knowledge and Data Engineering 27, no. 2 (February 1, 2015): 589–93. http://dx.doi.org/10.1109/tkde.2014.2356471.

17

Yang, Yang, Fumin Shen, Zi Huang, Heng Tao Shen, and Xuelong Li. "Discrete Nonnegative Spectral Clustering". IEEE Transactions on Knowledge and Data Engineering 29, no. 9 (September 1, 2017): 1834–45. http://dx.doi.org/10.1109/tkde.2017.2701825.

18

Huang, Shudong, Hongjun Wang, Dingcheng Li, Yan Yang, and Tianrui Li. "Spectral co-clustering ensemble". Knowledge-Based Systems 84 (August 2015): 46–55. http://dx.doi.org/10.1016/j.knosys.2015.03.027.

19

Langone, Rocco, Marc Van Barel, and Johan A. K. Suykens. "Efficient evolutionary spectral clustering". Pattern Recognition Letters 84 (December 2016): 78–84. http://dx.doi.org/10.1016/j.patrec.2016.08.012.

20

Alzate, Carlos, and Johan A. K. Suykens. "Hierarchical kernel spectral clustering". Neural Networks 35 (November 2012): 21–30. http://dx.doi.org/10.1016/j.neunet.2012.06.007.

21

Ozertem, Umut, Deniz Erdogmus, and Robert Jenssen. "Mean shift spectral clustering". Pattern Recognition 41, no. 6 (June 2008): 1924–38. http://dx.doi.org/10.1016/j.patcog.2007.09.009.

22

Binkiewicz, N., J. T. Vogelstein, and K. Rohe. "Covariate-assisted spectral clustering". Biometrika 104, no. 2 (March 19, 2017): 361–77. http://dx.doi.org/10.1093/biomet/asx008.

23

von Luxburg, Ulrike, Mikhail Belkin, and Olivier Bousquet. "Consistency of spectral clustering". Annals of Statistics 36, no. 2 (April 2008): 555–86. http://dx.doi.org/10.1214/009053607000000640.

24

Langone, Rocco, and Johan A. K. Suykens. "Fast kernel spectral clustering". Neurocomputing 268 (December 2017): 27–33. http://dx.doi.org/10.1016/j.neucom.2016.12.085.

25

Chen, Guangliang, and Gilad Lerman. "Spectral Curvature Clustering (SCC)". International Journal of Computer Vision 81, no. 3 (December 10, 2008): 317–30. http://dx.doi.org/10.1007/s11263-008-0178-9.

26

Bolla, Marianna, and Ahmed Elbanna. "Discrepancy minimizing spectral clustering". Discrete Applied Mathematics 243 (July 2018): 286–89. http://dx.doi.org/10.1016/j.dam.2018.02.016.

27

Pang, Yanwei, Jin Xie, Feiping Nie, and Xuelong Li. "Spectral Clustering by Joint Spectral Embedding and Spectral Rotation". IEEE Transactions on Cybernetics 50, no. 1 (January 2020): 247–58. http://dx.doi.org/10.1109/tcyb.2018.2868742.

28

Yan, Yuguang, Zhihao Xu, Canlin Yang, Jie Zhang, Ruichu Cai, and Michael Kwok-Po Ng. "An Optimal Transport View for Subspace Clustering and Spectral Clustering". Proceedings of the AAAI Conference on Artificial Intelligence 38, no. 15 (March 24, 2024): 16281–89. http://dx.doi.org/10.1609/aaai.v38i15.29563.

Abstract
Clustering is one of the most fundamental problems in machine learning and data mining, and many algorithms have been proposed in the past decades. Among them, subspace clustering and spectral clustering are the most famous approaches. In this paper, we provide an explanation for subspace clustering and spectral clustering from the perspective of optimal transport. Optimal transport studies how to move samples from one distribution to another distribution with minimal transport cost, and has shown a powerful ability to extract geometric information. By considering a self optimal transport model with only one group of samples, we observe that both subspace clustering and spectral clustering can be explained in the framework of optimal transport, and the optimal transport matrix bridges the spaces of features and spectral embeddings. Inspired by this connection, we propose a spectral optimal transport barycenter model, which learns spectral embeddings by solving a barycenter problem equipped with an optimal transport discrepancy and guidance of data. Based on our proposed model, we take advantage of optimal transport to exploit both feature and metric information involved in data for learning coupled spectral embeddings and affinity matrix in a unified model. We develop an alternating optimization algorithm to solve the resultant problems, and conduct experiments in different settings to evaluate the performance of our proposed methods.
29

Chen, Ji, Kaiping Zhan, Qingzhou Li, Zhiyang Tang, Chenwei Zhu, Ke Liu, and Xiangyou Li. "Spectral clustering based on histogram of oriented gradient (HOG) of coal using laser-induced breakdown spectroscopy". Journal of Analytical Atomic Spectrometry 36, no. 6 (2021): 1297–305. http://dx.doi.org/10.1039/d1ja00104c.

Abstract
Histogram of oriented gradients (HOG) was introduced in the unsupervised spectral clustering in LIBS. After clustering, the spectra of different matrices were clearly distinguished, and the accuracy of quantitative analysis of coal was improved.
30

Blanza, Jojo. "Wireless Propagation Multipaths using Spectral Clustering and Three-Constraint Affinity Matrix Spectral Clustering". Baghdad Science Journal 18, no. 2 (Suppl.) (June 20, 2021): 1001. http://dx.doi.org/10.21123/bsj.2021.18.2(suppl.).1001.

Abstract
This study focused on spectral clustering (SC) and three-constraint affinity matrix spectral clustering (3CAM-SC) to determine the number of clusters and the membership of the clusters of the COST 2100 channel model (C2CM) multipath dataset simultaneously. Various multipath clustering approaches solve only the number of clusters without taking into consideration the membership of clusters. The problem of giving only the number of clusters is that there is no assurance that the membership of the multipath clusters is accurate even though the number of clusters is correct. SC and 3CAM-SC aimed to solve this problem by determining the membership of the clusters. The cluster and the cluster count were then computed through the cluster-wise Jaccard index of the membership of the multipaths to their clusters. The multipaths generated by C2CM were transformed using the directional cosine transform (DCT) and the whitening transform (WT). The transformed dataset was clustered using SC and 3CAM-SC. The clustering performance was validated using the Jaccard index by comparing the reference multipath dataset with the calculated multipath clusters. The results show that the effectiveness of SC is similar to the state-of-the-art clustering approaches. However, 3CAM-SC outperforms SC in all channel scenarios. SC can be used in indoor scenarios based on accuracy, while 3CAM-SC is applicable in indoor and semi-urban scenarios. Thus, the clustering approaches can be applied as alternative clustering techniques in the field of channel modeling.
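The cluster-wise Jaccard index used above for validating multipath membership can be sketched as a best-match intersection-over-union between reference clusters and computed clusters (a simplified illustration; the paper's exact matching procedure may differ):

```python
def jaccard(a, b):
    """Intersection-over-union of two clusters given as member sets."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

def clusterwise_jaccard(reference, computed):
    """Average best-match Jaccard index of reference clusters against
    the computed clusters; 1.0 means every cluster is recovered exactly."""
    return sum(max(jaccard(r, c) for c in computed)
               for r in reference) / len(reference)

reference = [{0, 1, 2}, {3, 4, 5}]
assert clusterwise_jaccard(reference, [{0, 1, 2}, {3, 4, 5}]) == 1.0
assert 0 < clusterwise_jaccard(reference, [{0, 1}, {2, 3, 4, 5}]) < 1.0
```

Because the score is computed over memberships rather than cluster counts, it captures exactly the failure mode the abstract describes: a method can report the right number of clusters while still assigning multipaths to the wrong ones.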
31

Chen, Guangchun, Juan Hu, Hong Peng, Jun Wang, and Xiangnian Huang. "A Spectral Clustering Algorithm Improved by P Systems". International Journal of Computers Communications & Control 13, no. 5 (September 29, 2018): 759–71. http://dx.doi.org/10.15837/ijccc.2018.5.3238.

Abstract
It is difficult for spectral clustering algorithms to find the clusters when the dataset has large differences in density, and their clustering effect depends on the selection of initial centers. To overcome these shortcomings, we propose a novel spectral clustering algorithm based on a membrane computing framework, called the MSC algorithm, whose idea is to use a membrane clustering algorithm to realize the clustering component in spectral clustering. A tissue-like P system is used as its computing framework, where each object in the cells denotes a set of cluster centers and a velocity-location model is used as the evolution rules. Under the control of the evolution-communication mechanism, the tissue-like P system can obtain a good clustering partition for each dataset. The proposed spectral clustering algorithm is evaluated on three artificial datasets and ten UCI datasets, and it is further compared with classical spectral clustering algorithms. The comparison results demonstrate the advantage of the proposed spectral clustering algorithm.
32

Li, Qiang. "A Spectrum Clustering Algorithm Based on Weighted Fuzzy Similar Matrix". Advanced Materials Research 482-484 (February 2012): 2109–13. http://dx.doi.org/10.4028/www.scientific.net/amr.482-484.2109.

Abstract
Unlike traditional clustering algorithms, the spectral clustering algorithm can be applied to non-convex sample spaces and converges to a global optimum. Taking the similarity measure of spectral clustering as an entry point, this paper introduces an improved weighted fuzzy similarity matrix, which avoids the influence that parameter changes of the fuzzy similarity matrix have on the clustering effect in traditional spectral clustering and improves the effectiveness of clustering. The approach is more practical and scientific, and is tested on UCI data sets.
33

Li, Ziyue, Emma L. D'Ambro, Siegfried Schobesberger, Cassandra J. Gaston, Felipe D. Lopez-Hilfiker, Jiumeng Liu, John E. Shilling, Joel A. Thornton, and Christopher D. Cappa. "A robust clustering algorithm for analysis of composition-dependent organic aerosol thermal desorption measurements". Atmospheric Chemistry and Physics 20, no. 4 (March 2, 2020): 2489–512. http://dx.doi.org/10.5194/acp-20-2489-2020.

Abstract
One of the challenges of understanding atmospheric organic aerosol (OA) particles stems from their complex composition. Mass spectrometry is commonly used to characterize the compositional variability of OA. Clustering of a mass spectral dataset helps identify components that exhibit similar behavior or have similar properties, facilitating understanding of the sources and processes that govern compositional variability. Here, we developed an algorithm for clustering mass spectra, noise-sorted scanning clustering (NSSC), appropriate for application to thermal desorption measurements of collected OA particles from the Filter Inlet for Gases and AEROsols coupled to a chemical ionization mass spectrometer (FIGAERO-CIMS). NSSC, which extends the common density-based spatial clustering of applications with noise (DBSCAN) algorithm, provides a robust, reproducible analysis of the FIGAERO temperature-dependent mass spectral data. NSSC allows for the determination of thermal profiles for compositionally distinct clusters of mass spectra, increasing the accessibility and enhancing the interpretation of FIGAERO data. Applications of NSSC to several laboratory biogenic secondary organic aerosol (BSOA) systems demonstrate the ability of NSSC to distinguish different types of thermal behavior for the components comprising the particles, along with the relative mass contributions and chemical properties (e.g., average molecular formula) of each mass spectral cluster. For each of the systems examined, more than 80% of the total mass is clustered into 9–13 mass spectral clusters. Comparison of the average thermograms of the mass spectral clusters between systems indicates some commonality in the thermal properties of different BSOA, although with some system-specific behavior. Application of NSSC to sets of experiments in which one experimental parameter, such as the concentration of NO, is varied demonstrates the potential for mass spectral clustering to elucidate the chemical factors that drive changes in the thermal properties of OA particles. Further quantitative interpretation of the thermograms of the mass spectral clusters will allow for a more comprehensive understanding of the thermochemical properties of OA particles.
34

Mandal, Jyotsna Kumar, and Parthajit Roy. "A Novel Spectral Clustering based on Local Distribution". International Journal of Electrical and Computer Engineering (IJECE) 5, no. 2 (April 1, 2015): 361. http://dx.doi.org/10.11591/ijece.v5i2.pp361-370.

Abstract
This paper proposed a novel variation of the spectral clustering model based on a novel affinity metric that considers the distribution of the neighboring points to learn the underlying structures in the data set. The proposed affinity metric is calculated using the Mahalanobis distance, which exploits the concept of outlier detection for identifying the neighborhoods of the data points. The random walk Laplacian of the representative graph and its spectrum are considered for the clustering purpose, and the first k eigenvectors are considered in the second phase of clustering. The model has been tested with benchmark data, and the quality of the output of the proposed model has been evaluated on various clustering index scales.
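A simplified sketch of the affinity idea above, using the Mahalanobis distance under the global dataset covariance rather than the paper's neighborhood-wise construction with outlier detection:

```python
import numpy as np

def mahalanobis_affinity(X):
    """Affinity W[i, j] = exp(-d_M(x_i, x_j)^2 / 2), where d_M is the
    Mahalanobis distance under the covariance of the whole dataset."""
    cov_inv = np.linalg.inv(np.cov(X.T))
    diff = X[:, None, :] - X[None, :, :]
    d2 = np.einsum('ijk,kl,ijl->ij', diff, cov_inv, diff)
    return np.exp(-d2 / 2.0)

# An anisotropic cloud (variance ~100 along x, ~1 along y) plus three probes.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(size=(200, 2)) * [10.0, 1.0],
               [[0, 0], [5, 0], [0, 5]]])
W = mahalanobis_affinity(X)

# Equal Euclidean offsets are weighted by the data's spread: a step of 5
# along the high-variance x-axis keeps far higher affinity than along y.
assert W[-3, -2] > W[-3, -1]
```

Whitening distances by the covariance is what lets the affinity adapt to elongated, non-spherical structures that a plain Euclidean Gaussian kernel would split.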
35

Mizutani, Tomohiko. "Convex programming based spectral clustering". Machine Learning 110, no. 5 (April 14, 2021): 933–64. http://dx.doi.org/10.1007/s10994-020-05940-1.

36

Zhao, Qianli, Linlin Zong, Xianchao Zhang, Xinyue Liu, and Hong Yu. "Incomplete multi-view spectral clustering". Journal of Intelligent & Fuzzy Systems 38, no. 3 (March 4, 2020): 2991–3001. http://dx.doi.org/10.3233/jifs-190380.

37

Hong Yu, He Jiang, Xianchao Zhang, and Yuansheng Yang. "K_Neighbors Path Based Spectral Clustering". International Journal of Advancements in Computing Technology 4, no. 1 (January 31, 2012): 50–58. http://dx.doi.org/10.4156/ijact.vol4.issue1.6.

38

Wang, Hongtao, Ang Li, Bolin Shen, Yuyan Sun, and Hongmei Wang. "Federated Multi-View Spectral Clustering". IEEE Access 8 (2020): 202249–59. http://dx.doi.org/10.1109/access.2020.3036747.

39

Mohamed, Samar S., and Magdy MA Salama. "Spectral clustering for TRUS images". BioMedical Engineering OnLine 6, no. 1 (2007): 10. http://dx.doi.org/10.1186/1475-925x-6-10.

40

Liu, Mingming, Bing Liu, Chen Zhang, and Wei Sun. "Spectral Nonlinearly Embedded Clustering Algorithm". Mathematical Problems in Engineering 2016 (2016): 1–9. http://dx.doi.org/10.1155/2016/9264561.

Abstract
As is well known, traditional spectral clustering (SC) methods are developed based on the manifold assumption, namely, that two nearby data points in the high-density region of a low-dimensional data manifold have the same cluster label. But, for some high-dimensional and sparse data, such an assumption might be invalid. Consequently, the clustering performance of SC will be degraded sharply in this case. To solve this problem, in this paper, we propose a general spectral embedded framework, which embeds the true cluster assignment matrix for high-dimensional data into a nonlinear space by a predefined embedding function. Based on this framework, several algorithms are presented by using different embedding functions, which aim at learning the final cluster assignment matrix and a transformation into a low dimensionality space simultaneously. More importantly, the proposed method can naturally handle the out-of-sample extension problem. The experimental results on benchmark datasets demonstrate that the proposed method significantly outperforms existing clustering methods.
41

Zhou, Peng, Yi-Dong Shen, Liang Du, Fan Ye, and Xuejun Li. "Incremental multi-view spectral clustering". Knowledge-Based Systems 174 (June 2019): 73–86. http://dx.doi.org/10.1016/j.knosys.2019.02.036.

42

Jiang, Wenhao, Wei Liu, and Fu-lai Chung. "Knowledge transfer for spectral clustering". Pattern Recognition 81 (September 2018): 484–96. http://dx.doi.org/10.1016/j.patcog.2018.04.018.

43

Chen, Weifu, and Guocan Feng. "Spectral clustering with discriminant cuts". Knowledge-Based Systems 28 (April 2012): 27–37. http://dx.doi.org/10.1016/j.knosys.2011.11.010.

44

Cai, Yang, Yuanyuan Jiao, Wenzhang Zhuge, Hong Tao, and Chenping Hou. "Partial multi-view spectral clustering". Neurocomputing 311 (October 2018): 316–24. http://dx.doi.org/10.1016/j.neucom.2018.05.053.

45

Wen, Guoqiu. "Robust self-tuning spectral clustering". Neurocomputing 391 (May 2020): 243–48. http://dx.doi.org/10.1016/j.neucom.2018.11.105.

46

Chang, Hong, and Dit-Yan Yeung. "Robust path-based spectral clustering". Pattern Recognition 41, no. 1 (January 2008): 191–203. http://dx.doi.org/10.1016/j.patcog.2007.04.010.

47

Xiang, Tao, and Shaogang Gong. "Spectral clustering with eigenvector selection". Pattern Recognition 41, no. 3 (March 2008): 1012–29. http://dx.doi.org/10.1016/j.patcog.2007.07.023.

48

von Luxburg, Ulrike. "A tutorial on spectral clustering". Statistics and Computing 17, no. 4 (August 22, 2007): 395–416. http://dx.doi.org/10.1007/s11222-007-9033-z.

49

Yong Wang, Yuan Jiang, Yi Wu, and Zhi-Hua Zhou. "Spectral Clustering on Multiple Manifolds". IEEE Transactions on Neural Networks 22, no. 7 (July 2011): 1149–61. http://dx.doi.org/10.1109/tnn.2011.2147798.

50

Wang, Qi, Zequn Qin, Feiping Nie, and Xuelong Li. "Spectral Embedded Adaptive Neighbors Clustering". IEEE Transactions on Neural Networks and Learning Systems 30, no. 4 (April 2019): 1265–71. http://dx.doi.org/10.1109/tnnls.2018.2861209.

