
Journal articles on the topic 'Clustering spectral'


Consult the top 50 journal articles for your research on the topic 'Clustering spectral.'


1

Hess, Sibylle, Wouter Duivesteijn, Philipp Honysz, and Katharina Morik. "The SpectACl of Nonconvex Clustering: A Spectral Approach to Density-Based Clustering." Proceedings of the AAAI Conference on Artificial Intelligence 33 (July 17, 2019): 3788–95. http://dx.doi.org/10.1609/aaai.v33i01.33013788.

Abstract:
When it comes to clustering nonconvex shapes, two paradigms are used to find the most suitable clustering: minimum cut and maximum density. The most popular algorithms incorporating these paradigms are Spectral Clustering and DBSCAN. Both paradigms have their pros and cons: while minimum cut clusterings are sensitive to noise, density-based clusterings have trouble handling clusters with varying densities. In this paper, we propose SpectACl, a method combining the advantages of both approaches while solving the two mentioned drawbacks. Like Spectral Clustering, our method is easy to implement, and it is theoretically founded to optimize a proposed density criterion of clusterings. Through experiments on synthetic and real-world data, we demonstrate that our approach provides robust and reliable clusterings.
2

Li, Hongmin, Xiucai Ye, Akira Imakura, and Tetsuya Sakurai. "LSEC: Large-scale spectral ensemble clustering." Intelligent Data Analysis 27, no. 1 (January 30, 2023): 59–77. http://dx.doi.org/10.3233/ida-216240.

Abstract:
A fundamental problem in machine learning is ensemble clustering, that is, combining multiple base clusterings to obtain an improved clustering result. However, most of the existing methods are unsuitable for large-scale ensemble clustering tasks owing to efficiency bottlenecks. In this paper, we propose a large-scale spectral ensemble clustering (LSEC) method to balance efficiency and effectiveness. In LSEC, a large-scale spectral clustering-based efficient ensemble generation framework is designed to generate various base clusterings with low computational complexity. Thereafter, all the base clusterings are combined using a bipartite graph partition-based consensus function to obtain improved consensus clustering results. The LSEC method achieves a lower computational complexity than most existing ensemble clustering methods. Experiments conducted on ten large-scale datasets demonstrate the efficiency and effectiveness of the LSEC method. The MATLAB code of the proposed method and the experimental datasets are available at https://github.com/Li-Hongmin/MyPaperWithCode.
3

Zhuang, Xinwei, and Sean Hanna. "Space Frame Optimisation with Spectral Clustering." International Journal of Machine Learning and Computing 10, no. 4 (July 2020): 507–12. http://dx.doi.org/10.18178/ijmlc.2020.10.4.965.

4

Sun, Gan, Yang Cong, Qianqian Wang, Jun Li, and Yun Fu. "Lifelong Spectral Clustering." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 04 (April 3, 2020): 5867–74. http://dx.doi.org/10.1609/aaai.v34i04.6045.

Abstract:
In the past decades, spectral clustering (SC) has become one of the most effective clustering algorithms. However, most previous studies focus on spectral clustering tasks with a fixed task set and cannot incorporate a new spectral clustering task without access to previously learned tasks. In this paper, we aim to explore the problem of spectral clustering in a lifelong machine learning framework, i.e., Lifelong Spectral Clustering (L2SC). Its goal is to efficiently learn a model for a new spectral clustering task by selectively transferring previously accumulated experience from a knowledge library. Specifically, the knowledge library of L2SC contains two components: 1) an orthogonal basis library, capturing latent cluster centers among the clusters in each pair of tasks; 2) a feature embedding library, embedding the feature manifold information shared among multiple related tasks. As a new spectral clustering task arrives, L2SC first transfers knowledge from both the basis library and the feature library to obtain an encoding matrix, and further redefines the library base over time to maximize performance across all the clustering tasks. Meanwhile, a general online update formulation is derived to alternately update the basis library and feature library. Finally, empirical experiments on several real-world benchmark datasets demonstrate that our L2SC model can effectively improve clustering performance compared with other state-of-the-art spectral clustering algorithms.
5

Ling Ping, Rong Xiangsheng, and Dong Yongquan. "Incremental Spectral Clustering." Journal of Convergence Information Technology 7, no. 15 (August 31, 2012): 286–93. http://dx.doi.org/10.4156/jcit.vol7.issue15.34.

6

Kim, Jaehwan, and Seungjin Choi. "Semidefinite spectral clustering." Pattern Recognition 39, no. 11 (November 2006): 2025–35. http://dx.doi.org/10.1016/j.patcog.2006.05.021.

7

Challa, Aditya, Sravan Danda, B. S. Daya Sagar, and Laurent Najman. "Power Spectral Clustering." Journal of Mathematical Imaging and Vision 62, no. 9 (July 11, 2020): 1195–213. http://dx.doi.org/10.1007/s10851-020-00980-7.

8

Huang, Jin, Feiping Nie, and Heng Huang. "Spectral Rotation versus K-Means in Spectral Clustering." Proceedings of the AAAI Conference on Artificial Intelligence 27, no. 1 (June 30, 2013): 431–37. http://dx.doi.org/10.1609/aaai.v27i1.8683.

Abstract:
Spectral clustering has been a popular data clustering algorithm. This category of approaches often resorts to other clustering methods, such as K-Means, to obtain the final clusters. The potential flaw of this common practice is that the obtained relaxed continuous spectral solution can severely deviate from the true discrete solution. In this paper, we propose to impose an additional orthonormal constraint to better approximate the optimal continuous solution to the graph cut objective functions. Such a method, called spectral rotation in the literature, optimizes the spectral clustering objective functions better than K-Means and improves clustering accuracy. We provide an efficient algorithm to solve the new problem rigorously, which is not significantly more costly than K-Means. We also establish the connection between our method and K-Means to provide theoretical motivation for our method. Experimental results show that our algorithm consistently reaches better cuts and outperforms classic spectral clustering methods on clustering metrics.
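The spectral rotation idea described above can be sketched as alternating between a discrete cluster indicator matrix and an orthonormal rotation, the latter obtained from an SVD (orthogonal Procrustes). This is a minimal illustrative sketch, not the authors' exact algorithm:

```python
import numpy as np

def spectral_rotation(F, n_iter=50):
    """Discretize a relaxed spectral embedding F (n x k, orthonormal
    columns) by alternating between an indicator matrix Y and an
    orthonormal rotation R minimizing ||F R - Y||_F."""
    n, k = F.shape
    R = np.eye(k)
    for _ in range(n_iter):
        # Assign each point to the largest entry of its rotated embedding.
        labels = np.argmax(F @ R, axis=1)
        Y = np.zeros((n, k))
        Y[np.arange(n), labels] = 1.0
        # Orthogonal Procrustes: best rotation aligning F with Y.
        U, _, Vt = np.linalg.svd(F.T @ Y)
        R = U @ Vt
    return labels, R
```

With a clean block-structured embedding, the recovered labels coincide with the blocks of F, while R stays orthonormal throughout.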
9

Jin, Hui-zhen. "Multilevel spectral clustering with ascertainable clustering number." Journal of Computer Applications 28, no. 5 (October 17, 2008): 1229–31. http://dx.doi.org/10.3724/sp.j.1087.2008.01229.

10

Huang, Dong, Chang-Dong Wang, Jian-Sheng Wu, Jian-Huang Lai, and Chee-Keong Kwoh. "Ultra-Scalable Spectral Clustering and Ensemble Clustering." IEEE Transactions on Knowledge and Data Engineering 32, no. 6 (June 1, 2020): 1212–26. http://dx.doi.org/10.1109/tkde.2019.2903410.

11

Yousefi, Bardia, Clemente Ibarra-Castanedo, Martin Chamberland, Xavier P. V. Maldague, and Georges Beaudoin. "Unsupervised Identification of Targeted Spectra Applying Rank1-NMF and FCC Algorithms in Long-Wave Hyperspectral Infrared Imagery." Remote Sensing 13, no. 11 (May 28, 2021): 2125. http://dx.doi.org/10.3390/rs13112125.

Abstract:
Clustering methods unequivocally show considerable influence on many recent algorithms and play an important role in hyperspectral data analysis. Here, we approach clustering for mineral identification using two different strategies in long-wave hyperspectral infrared imagery (LWIR, 7.7–11.8 μm), comparing two algorithms on a unique dataset. The first algorithm uses spectral comparison techniques for all the pixel spectra and creates RGB false color composites (FCC); a color-based clustering is then used to group the regions (called FCC-clustering). The second algorithm clusters all the pixel spectra to group the spectra directly; the first rank of non-negative matrix factorization (NMF) then extracts the representative of each cluster and compares the results with the spectral library of JPL/NASA. These techniques give the comparison values as features, which are converted into RGB-FCC results (called clustering rank1-NMF). We applied K-means as the clustering approach, which can be replaced by any similar clustering approach. The results of the clustering-rank1-NMF algorithm indicate significant computational efficiency (more than 20 times faster than the previous approach) and promising performance for mineral identification, with average accuracies of up to 75.8% and 84.8% for the FCC-clustering and clustering-rank1-NMF algorithms (using the spectral angle mapper (SAM)), respectively. Furthermore, several other spectral comparison techniques are used for both algorithms, such as the adaptive matched subspace detector (AMSD), the orthogonal subspace projection (OSP) algorithm, principal component analysis (PCA), the local matched filter (PLMF), SAM, and normalized cross correlation (NCC); most of them show a similar range in accuracy, but SAM and NCC are preferred due to their computational simplicity. Our algorithms strive to identify eleven different mineral grains (biotite, diopside, epidote, goethite, kyanite, scheelite, smithsonite, tourmaline, pyrope, olivine, and quartz).
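The spectral angle mapper (SAM) mentioned in the abstract measures the angle between two spectra, so it is invariant to overall brightness scaling. A generic sketch, not the authors' implementation:

```python
import numpy as np

def spectral_angle(a, b):
    """Spectral angle mapper (SAM): the angle in radians between two
    spectra, invariant to overall illumination scaling."""
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    # Clip to guard against floating-point overshoot outside [-1, 1].
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))
```

Scaling a spectrum by any positive constant leaves the angle unchanged, which is why SAM is robust to illumination differences between pixels.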
12

Fu, Li Li, Yong Li Liu, and Li Jing Hao. "Research on Spectral Clustering." Applied Mechanics and Materials 687-691 (November 2014): 1350–53. http://dx.doi.org/10.4028/www.scientific.net/amm.687-691.1350.

Abstract:
Spectral clustering is a clustering algorithm based on spectral graph theory. As spectral clustering has a deep theoretical foundation as well as an advantage in dealing with non-convex distributions, it has received much attention in the machine learning and data mining areas. The algorithm is easy to implement and outperforms traditional clustering algorithms such as the K-means algorithm. This paper aims to give some intuition about spectral clustering. We describe different graph partition criteria, the definition of spectral clustering, the clustering steps, etc. Finally, to address the disadvantages of spectral clustering, some improvements are introduced briefly.
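The clustering steps this abstract refers to (affinity graph, graph Laplacian, eigenvectors, final k-means) can be sketched as follows. The RBF bandwidth and the tiny deterministic k-means routine are illustrative choices, not prescribed by the paper:

```python
import numpy as np

def spectral_clustering(X, k, sigma=1.0, n_iter=20):
    """Minimal normalized spectral clustering: RBF affinity ->
    symmetric normalized Laplacian -> bottom-k eigenvectors -> k-means."""
    # Pairwise RBF (Gaussian) affinities.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    # Symmetric normalized Laplacian L = I - D^{-1/2} W D^{-1/2}.
    d = W.sum(1)
    Dinv = np.diag(1.0 / np.sqrt(d))
    L = np.eye(len(X)) - Dinv @ W @ Dinv
    # Rows of the bottom-k eigenvectors form the spectral embedding.
    _, vecs = np.linalg.eigh(L)
    F = vecs[:, :k]
    F /= np.linalg.norm(F, axis=1, keepdims=True)
    # Tiny deterministic k-means with farthest-point initialization.
    centers = [F[0]]
    for _ in range(k - 1):
        dist = np.min([((F - c) ** 2).sum(1) for c in centers], axis=0)
        centers.append(F[np.argmax(dist)])
    C = np.array(centers)
    for _ in range(n_iter):
        labels = np.argmin(((F[:, None] - C[None]) ** 2).sum(-1), axis=1)
        C = np.array([F[labels == j].mean(0) for j in range(k)])
    return labels
```

On two well-separated groups of points, the embedding collapses each group to (nearly) a single point, so the final k-means step recovers the groups.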
13

Wada, Yuichiro, Shugo Miyamoto, Takumi Nakagama, Léo Andéol, Wataru Kumagai, and Takafumi Kanamori. "Spectral Embedded Deep Clustering." Entropy 21, no. 8 (August 15, 2019): 795. http://dx.doi.org/10.3390/e21080795.

Abstract:
We propose a new clustering method based on a deep neural network. Given an unlabeled dataset and the number of clusters, our method directly groups the dataset into the given number of clusters in the original space. We use a conditional discrete probability distribution defined by a deep neural network as a statistical model. Our strategy is first to estimate the cluster labels of unlabeled data points selected from a high-density region, and then to conduct semi-supervised learning to train the model by using the estimated cluster labels and the remaining unlabeled data points. Lastly, by using the trained model, we obtain the estimated cluster labels of all given unlabeled data points. The advantage of our method is that it does not require the key conditions assumed by existing approaches: existing clustering methods with deep neural networks assume that the cluster balance of a given dataset is uniform. Moreover, our method can be applied to various data domains as long as the data are expressed as feature vectors. In addition, we observe that our method is robust against outliers. Therefore, the proposed method is expected to perform, on average, better than previous methods. We conducted numerical experiments on five commonly used datasets to confirm the effectiveness of the proposed method.
14

Chi, Yun, Xiaodan Song, Dengyong Zhou, Koji Hino, and Belle L. Tseng. "On evolutionary spectral clustering." ACM Transactions on Knowledge Discovery from Data 3, no. 4 (November 2009): 1–30. http://dx.doi.org/10.1145/1631162.1631165.

15

Chen, Jiansheng, Zhengqin Li, and Bo Huang. "Linear Spectral Clustering Superpixel." IEEE Transactions on Image Processing 26, no. 7 (July 2017): 3317–30. http://dx.doi.org/10.1109/tip.2017.2651389.

16

Li, Jianyuan, Yingjie Xia, Zhenyu Shan, and Yuncai Liu. "Scalable Constrained Spectral Clustering." IEEE Transactions on Knowledge and Data Engineering 27, no. 2 (February 1, 2015): 589–93. http://dx.doi.org/10.1109/tkde.2014.2356471.

17

Yang, Yang, Fumin Shen, Zi Huang, Heng Tao Shen, and Xuelong Li. "Discrete Nonnegative Spectral Clustering." IEEE Transactions on Knowledge and Data Engineering 29, no. 9 (September 1, 2017): 1834–45. http://dx.doi.org/10.1109/tkde.2017.2701825.

18

Huang, Shudong, Hongjun Wang, Dingcheng Li, Yan Yang, and Tianrui Li. "Spectral co-clustering ensemble." Knowledge-Based Systems 84 (August 2015): 46–55. http://dx.doi.org/10.1016/j.knosys.2015.03.027.

19

Langone, Rocco, Marc Van Barel, and Johan A. K. Suykens. "Efficient evolutionary spectral clustering." Pattern Recognition Letters 84 (December 2016): 78–84. http://dx.doi.org/10.1016/j.patrec.2016.08.012.

20

Alzate, Carlos, and Johan A. K. Suykens. "Hierarchical kernel spectral clustering." Neural Networks 35 (November 2012): 21–30. http://dx.doi.org/10.1016/j.neunet.2012.06.007.

21

Ozertem, Umut, Deniz Erdogmus, and Robert Jenssen. "Mean shift spectral clustering." Pattern Recognition 41, no. 6 (June 2008): 1924–38. http://dx.doi.org/10.1016/j.patcog.2007.09.009.

22

Binkiewicz, N., J. T. Vogelstein, and K. Rohe. "Covariate-assisted spectral clustering." Biometrika 104, no. 2 (March 19, 2017): 361–77. http://dx.doi.org/10.1093/biomet/asx008.

23

von Luxburg, Ulrike, Mikhail Belkin, and Olivier Bousquet. "Consistency of spectral clustering." Annals of Statistics 36, no. 2 (April 2008): 555–86. http://dx.doi.org/10.1214/009053607000000640.

24

Langone, Rocco, and Johan A. K. Suykens. "Fast kernel spectral clustering." Neurocomputing 268 (December 2017): 27–33. http://dx.doi.org/10.1016/j.neucom.2016.12.085.

25

Chen, Guangliang, and Gilad Lerman. "Spectral Curvature Clustering (SCC)." International Journal of Computer Vision 81, no. 3 (December 10, 2008): 317–30. http://dx.doi.org/10.1007/s11263-008-0178-9.

26

Bolla, Marianna, and Ahmed Elbanna. "Discrepancy minimizing spectral clustering." Discrete Applied Mathematics 243 (July 2018): 286–89. http://dx.doi.org/10.1016/j.dam.2018.02.016.

27

Pang, Yanwei, Jin Xie, Feiping Nie, and Xuelong Li. "Spectral Clustering by Joint Spectral Embedding and Spectral Rotation." IEEE Transactions on Cybernetics 50, no. 1 (January 2020): 247–58. http://dx.doi.org/10.1109/tcyb.2018.2868742.

28

Yan, Yuguang, Zhihao Xu, Canlin Yang, Jie Zhang, Ruichu Cai, and Michael Kwok-Po Ng. "An Optimal Transport View for Subspace Clustering and Spectral Clustering." Proceedings of the AAAI Conference on Artificial Intelligence 38, no. 15 (March 24, 2024): 16281–89. http://dx.doi.org/10.1609/aaai.v38i15.29563.

Abstract:
Clustering is one of the most fundamental problems in machine learning and data mining, and many algorithms have been proposed in the past decades. Among them, subspace clustering and spectral clustering are the most famous approaches. In this paper, we provide an explanation for subspace clustering and spectral clustering from the perspective of optimal transport. Optimal transport studies how to move samples from one distribution to another distribution with minimal transport cost, and has shown a powerful ability to extract geometric information. By considering a self optimal transport model with only one group of samples, we observe that both subspace clustering and spectral clustering can be explained in the framework of optimal transport, and the optimal transport matrix bridges the spaces of features and spectral embeddings. Inspired by this connection, we propose a spectral optimal transport barycenter model, which learns spectral embeddings by solving a barycenter problem equipped with an optimal transport discrepancy and guidance of data. Based on our proposed model, we take advantage of optimal transport to exploit both feature and metric information involved in data for learning coupled spectral embeddings and affinity matrix in a unified model. We develop an alternating optimization algorithm to solve the resultant problems, and conduct experiments in different settings to evaluate the performance of our proposed methods.
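The optimal transport ingredient the abstract builds on can be illustrated with the standard entropy-regularized Sinkhorn iterations. This is a hedged sketch of a generic OT solver, not the authors' barycenter model:

```python
import numpy as np

def sinkhorn(a, b, C, reg=0.1, n_iter=200):
    """Entropy-regularized optimal transport between histograms a and b
    with cost matrix C; returns the transport plan via Sinkhorn scaling."""
    K = np.exp(-C / reg)          # Gibbs kernel of the cost matrix
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)         # match column marginals
        u = a / (K @ v)           # match row marginals
    return u[:, None] * K * v[None, :]
```

The resulting plan P has (approximately) row sums a and column sums b, and the entries of P describe how mass moves between the two distributions at minimal regularized cost.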
29

Chen, Ji, Kaiping Zhan, Qingzhou Li, Zhiyang Tang, Chenwei Zhu, Ke Liu, and Xiangyou Li. "Spectral clustering based on histogram of oriented gradient (HOG) of coal using laser-induced breakdown spectroscopy." Journal of Analytical Atomic Spectrometry 36, no. 6 (2021): 1297–305. http://dx.doi.org/10.1039/d1ja00104c.

Abstract:
Histogram of oriented gradients (HOG) was introduced in the unsupervised spectral clustering in LIBS. After clustering, the spectra of different matrices were clearly distinguished, and the accuracy of quantitative analysis of coal was improved.
30

Blanza, Jojo. "Wireless Propagation Multipaths using Spectral Clustering and Three-Constraint Affinity Matrix Spectral Clustering." Baghdad Science Journal 18, no. 2(Suppl.) (June 20, 2021): 1001. http://dx.doi.org/10.21123/bsj.2021.18.2(suppl.).1001.

Abstract:
This study focused on spectral clustering (SC) and three-constraint affinity matrix spectral clustering (3CAM-SC) to determine the number of clusters and the membership of the clusters of the COST 2100 channel model (C2CM) multipath dataset simultaneously. Various multipath clustering approaches solve only the number of clusters without taking into consideration the membership of clusters. The problem of giving only the number of clusters is that there is no assurance that the membership of the multipath clusters is accurate even though the number of clusters is correct. SC and 3CAM-SC aimed to solve this problem by determining the membership of the clusters. The cluster and the cluster count were then computed through the cluster-wise Jaccard index of the membership of the multipaths to their clusters. The multipaths generated by C2CM were transformed using the directional cosine transform (DCT) and the whitening transform (WT). The transformed dataset was clustered using SC and 3CAM-SC. The clustering performance was validated using the Jaccard index by comparing the reference multipath dataset with the calculated multipath clusters. The results show that the effectiveness of SC is similar to the state-of-the-art clustering approaches. However, 3CAM-SC outperforms SC in all channel scenarios. SC can be used in indoor scenarios based on accuracy, while 3CAM-SC is applicable in indoor and semi-urban scenarios. Thus, the clustering approaches can be applied as alternative clustering techniques in the field of channel modeling.
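The Jaccard index used above to validate cluster membership is the ratio of the intersection to the union of two member sets; a minimal sketch:

```python
def jaccard_index(cluster_a, cluster_b):
    """Jaccard index |A intersect B| / |A union B| between two sets of
    member ids; 1.0 means identical membership, 0.0 means disjoint."""
    a, b = set(cluster_a), set(cluster_b)
    if not a and not b:
        return 1.0  # convention: two empty clusters are identical
    return len(a & b) / len(a | b)
```

Comparing a reference cluster's multipath members against a computed cluster's members this way checks membership, not just the cluster count.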
31

Chen, Guangchun, Juan Hu, Hong Peng, Jun Wang, and Xiangnian Huang. "A Spectral Clustering Algorithm Improved by P Systems." International Journal of Computers Communications & Control 13, no. 5 (September 29, 2018): 759–71. http://dx.doi.org/10.15837/ijccc.2018.5.3238.

Abstract:
Using the spectral clustering algorithm, it is difficult to find clusters when the dataset has large differences in density, and its clustering effect depends on the selection of initial centers. To overcome these shortcomings, we propose a novel spectral clustering algorithm based on a membrane computing framework, called the MSC algorithm, whose idea is to use a membrane clustering algorithm to realize the clustering component in spectral clustering. A tissue-like P system is used as its computing framework, where each object in the cells denotes a set of cluster centers and a velocity-location model is used as the evolution rules. Under the control of the evolution-communication mechanism, the tissue-like P system can obtain a good clustering partition for each dataset. The proposed spectral clustering algorithm is evaluated on three artificial datasets and ten UCI datasets, and it is further compared with classical spectral clustering algorithms. The comparison results demonstrate the advantage of the proposed spectral clustering algorithm.
32

Li, Qiang. "A Spectrum Clustering Algorithm Based on Weighted Fuzzy Similar Matrix." Advanced Materials Research 482-484 (February 2012): 2109–13. http://dx.doi.org/10.4028/www.scientific.net/amr.482-484.2109.

Abstract:
Unlike traditional clustering algorithms, the spectral clustering algorithm can be applied to non-convex sample spaces and converges to the global optimum. Taking the similarity measure of spectral clustering as the entry point, this paper introduces an improved weighted fuzzy similarity matrix into spectral clustering, which avoids the influence that parameter changes of the fuzzy similarity matrix in traditional spectral clustering have on the clustering effect, and improves the effectiveness of clustering. The method is tested on UCI datasets.
33

Li, Ziyue, Emma L. D'Ambro, Siegfried Schobesberger, Cassandra J. Gaston, Felipe D. Lopez-Hilfiker, Jiumeng Liu, John E. Shilling, Joel A. Thornton, and Christopher D. Cappa. "A robust clustering algorithm for analysis of composition-dependent organic aerosol thermal desorption measurements." Atmospheric Chemistry and Physics 20, no. 4 (March 2, 2020): 2489–512. http://dx.doi.org/10.5194/acp-20-2489-2020.

Abstract:
One of the challenges of understanding atmospheric organic aerosol (OA) particles stems from their complex composition. Mass spectrometry is commonly used to characterize the compositional variability of OA. Clustering of a mass spectral dataset helps identify components that exhibit similar behavior or have similar properties, facilitating understanding of the sources and processes that govern compositional variability. Here, we developed an algorithm for clustering mass spectra, the noise-sorted scanning clustering (NSSC), appropriate for application to thermal desorption measurements of collected OA particles from the Filter Inlet for Gases and AEROsols coupled to a chemical ionization mass spectrometer (FIGAERO-CIMS). NSSC, which extends the common density-based spatial clustering of applications with noise (DBSCAN) algorithm, provides a robust, reproducible analysis of the FIGAERO temperature-dependent mass spectral data. NSSC allows for the determination of thermal profiles for compositionally distinct clusters of mass spectra, increasing the accessibility and enhancing the interpretation of FIGAERO data. Applications of NSSC to several laboratory biogenic secondary organic aerosol (BSOA) systems demonstrate the ability of NSSC to distinguish different types of thermal behaviors for the components comprising the particles, along with the relative mass contributions and chemical properties (e.g., average molecular formula) of each mass spectral cluster. For each of the systems examined, more than 80% of the total mass is clustered into 9–13 mass spectral clusters. Comparison of the average thermograms of the mass spectral clusters between systems indicates some commonality in the thermal properties of different BSOA, although with some system-specific behavior.
Application of NSSC to sets of experiments in which one experimental parameter, such as the concentration of NO, is varied demonstrates the potential for mass spectral clustering to elucidate the chemical factors that drive changes in the thermal properties of OA particles. Further quantitative interpretation of the thermograms of the mass spectral clusters will allow for a more comprehensive understanding of the thermochemical properties of OA particles.
34

Mandal, Jyotsna Kumar, and Parthajit Roy. "A Novel Spectral Clustering based on Local Distribution." International Journal of Electrical and Computer Engineering (IJECE) 5, no. 2 (April 1, 2015): 361. http://dx.doi.org/10.11591/ijece.v5i2.pp361-370.

Abstract:
This paper proposes a novel variation of the spectral clustering model based on a novel affinity metric that considers the distribution of the neighboring points to learn the underlying structures in the data set. The proposed affinity metric is calculated using the Mahalanobis distance, which exploits the concept of outlier detection for identifying the neighborhoods of the data points. The random-walk Laplacian of the representative graph and its spectrum are considered for clustering, and the first k eigenvectors are used in the second phase of clustering. The model has been tested with benchmark data, and the quality of its output has been assessed on various clustering indices.
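The Mahalanobis distance underlying the proposed affinity can be sketched generically (the paper's outlier-detection-based neighborhood construction is not reproduced here):

```python
import numpy as np

def mahalanobis(x, mu, cov):
    """Mahalanobis distance of point x from a distribution with mean mu
    and covariance cov; reduces to Euclidean distance when cov = I."""
    diff = x - mu
    return float(np.sqrt(diff @ np.linalg.inv(cov) @ diff))
```

Because the covariance rescales each direction, points far along a high-variance direction are treated as closer than equally distant points along a low-variance direction, which is what makes the metric useful for distribution-aware neighborhoods.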
35

Mizutani, Tomohiko. "Convex programming based spectral clustering." Machine Learning 110, no. 5 (April 14, 2021): 933–64. http://dx.doi.org/10.1007/s10994-020-05940-1.

36

Zhao, Qianli, Linlin Zong, Xianchao Zhang, Xinyue Liu, and Hong Yu. "Incomplete multi-view spectral clustering." Journal of Intelligent & Fuzzy Systems 38, no. 3 (March 4, 2020): 2991–3001. http://dx.doi.org/10.3233/jifs-190380.

37

Hong Yu, He Jiang, Xianchao Zhang, and Yuansheng Yang. "K_Neighbors Path Based Spectral Clustering." International Journal of Advancements in Computing Technology 4, no. 1 (January 31, 2012): 50–58. http://dx.doi.org/10.4156/ijact.vol4.issue1.6.

38

Wang, Hongtao, Ang Li, Bolin Shen, Yuyan Sun, and Hongmei Wang. "Federated Multi-View Spectral Clustering." IEEE Access 8 (2020): 202249–59. http://dx.doi.org/10.1109/access.2020.3036747.

39

Mohamed, Samar S., and Magdy MA Salama. "Spectral clustering for TRUS images." BioMedical Engineering OnLine 6, no. 1 (2007): 10. http://dx.doi.org/10.1186/1475-925x-6-10.

40

Liu, Mingming, Bing Liu, Chen Zhang, and Wei Sun. "Spectral Nonlinearly Embedded Clustering Algorithm." Mathematical Problems in Engineering 2016 (2016): 1–9. http://dx.doi.org/10.1155/2016/9264561.

Abstract:
As is well known, traditional spectral clustering (SC) methods are developed based on the manifold assumption, namely, that two nearby data points in the high-density region of a low-dimensional data manifold have the same cluster label. But for some high-dimensional and sparse data, such an assumption might be invalid. Consequently, the clustering performance of SC will be degraded sharply in this case. To solve this problem, in this paper, we propose a general spectral embedded framework, which embeds the true cluster assignment matrix for high-dimensional data into a nonlinear space by a predefined embedding function. Based on this framework, several algorithms are presented by using different embedding functions, which aim at learning the final cluster assignment matrix and a transformation into a low-dimensional space simultaneously. More importantly, the proposed method can naturally handle the out-of-sample extension problem. The experimental results on benchmark datasets demonstrate that the proposed method significantly outperforms existing clustering methods.
41

Zhou, Peng, Yi-Dong Shen, Liang Du, Fan Ye, and Xuejun Li. "Incremental multi-view spectral clustering." Knowledge-Based Systems 174 (June 2019): 73–86. http://dx.doi.org/10.1016/j.knosys.2019.02.036.

42

Jiang, Wenhao, Wei Liu, and Fu-lai Chung. "Knowledge transfer for spectral clustering." Pattern Recognition 81 (September 2018): 484–96. http://dx.doi.org/10.1016/j.patcog.2018.04.018.

43

Chen, Weifu, and Guocan Feng. "Spectral clustering with discriminant cuts." Knowledge-Based Systems 28 (April 2012): 27–37. http://dx.doi.org/10.1016/j.knosys.2011.11.010.

44

Cai, Yang, Yuanyuan Jiao, Wenzhang Zhuge, Hong Tao, and Chenping Hou. "Partial multi-view spectral clustering." Neurocomputing 311 (October 2018): 316–24. http://dx.doi.org/10.1016/j.neucom.2018.05.053.

45

Wen, Guoqiu. "Robust self-tuning spectral clustering." Neurocomputing 391 (May 2020): 243–48. http://dx.doi.org/10.1016/j.neucom.2018.11.105.

46

Chang, Hong, and Dit-Yan Yeung. "Robust path-based spectral clustering." Pattern Recognition 41, no. 1 (January 2008): 191–203. http://dx.doi.org/10.1016/j.patcog.2007.04.010.

47

Xiang, Tao, and Shaogang Gong. "Spectral clustering with eigenvector selection." Pattern Recognition 41, no. 3 (March 2008): 1012–29. http://dx.doi.org/10.1016/j.patcog.2007.07.023.

48

von Luxburg, Ulrike. "A tutorial on spectral clustering." Statistics and Computing 17, no. 4 (August 22, 2007): 395–416. http://dx.doi.org/10.1007/s11222-007-9033-z.

49

Yong Wang, Yuan Jiang, Yi Wu, and Zhi-Hua Zhou. "Spectral Clustering on Multiple Manifolds." IEEE Transactions on Neural Networks 22, no. 7 (July 2011): 1149–61. http://dx.doi.org/10.1109/tnn.2011.2147798.

50

Wang, Qi, Zequn Qin, Feiping Nie, and Xuelong Li. "Spectral Embedded Adaptive Neighbors Clustering." IEEE Transactions on Neural Networks and Learning Systems 30, no. 4 (April 2019): 1265–71. http://dx.doi.org/10.1109/tnnls.2018.2861209.
