A selection of scientific literature on the topic „Approximation de Nyström“
Cite a source in APA, MLA, Chicago, Harvard, and other citation styles
Consult the lists of current articles, books, dissertations, reports, and other scientific sources on the topic "Approximation de Nyström".
Next to every work in the bibliography there is an "Add to bibliography" option. Use it, and the bibliographic reference for the chosen work will be formatted automatically in the required citation style (APA, MLA, Harvard, Chicago, Vancouver, etc.).
You can also download the full text of the scientific publication as a PDF and read its online annotation, provided the relevant parameters are available in its metadata.
Journal articles on the topic "Approximation de Nyström"
Ding, Lizhong, Yong Liu, Shizhong Liao, Yu Li, Peng Yang, Yijie Pan, Chao Huang, Ling Shao and Xin Gao. „Approximate Kernel Selection with Strong Approximate Consistency“. Proceedings of the AAAI Conference on Artificial Intelligence 33 (July 17, 2019): 3462–69. http://dx.doi.org/10.1609/aaai.v33i01.33013462.
Wang, Ling, Hongqiao Wang and Guangyuan Fu. „Multi-Nyström Method Based on Multiple Kernel Learning for Large Scale Imbalanced Classification“. Computational Intelligence and Neuroscience 2021 (June 13, 2021): 1–11. http://dx.doi.org/10.1155/2021/9911871.
Zhang, Kai, and James T. Kwok. „Density-Weighted Nyström Method for Computing Large Kernel Eigensystems“. Neural Computation 21, no. 1 (January 2009): 121–46. http://dx.doi.org/10.1162/neco.2009.11-07-651.
Díaz de Alba, Patricia, Luisa Fermo and Giuseppe Rodriguez. „Solution of second kind Fredholm integral equations by means of Gauss and anti-Gauss quadrature rules“. Numerische Mathematik 146, no. 4 (November 18, 2020): 699–728. http://dx.doi.org/10.1007/s00211-020-01163-7.
Rudi, Alessandro, Leonard Wossnig, Carlo Ciliberto, Andrea Rocchetto, Massimiliano Pontil and Simone Severini. „Approximating Hamiltonian dynamics with the Nyström method“. Quantum 4 (February 20, 2020): 234. http://dx.doi.org/10.22331/q-2020-02-20-234.
Trokicić, Aleksandar, and Branimir Todorović. „Constrained spectral clustering via multi-layer graph embeddings on a Grassmann manifold“. International Journal of Applied Mathematics and Computer Science 29, no. 1 (March 1, 2019): 125–37. http://dx.doi.org/10.2478/amcs-2019-0010.
Cai, Difeng, and Panayot S. Vassilevski. „Eigenvalue Problems for Exponential-Type Kernels“. Computational Methods in Applied Mathematics 20, no. 1 (January 1, 2020): 61–78. http://dx.doi.org/10.1515/cmam-2018-0186.
He, Li, and Hong Zhang. „Kernel K-Means Sampling for Nyström Approximation“. IEEE Transactions on Image Processing 27, no. 5 (May 2018): 2108–20. http://dx.doi.org/10.1109/tip.2018.2796860.
Wang, Shiyuan, Lujuan Dang, Guobing Qian and Yunxiang Jiang. „Kernel recursive maximum correntropy with Nyström approximation“. Neurocomputing 329 (February 2019): 424–32. http://dx.doi.org/10.1016/j.neucom.2018.10.064.
Laguardia, Anna Lucia, and Maria Grazia Russo. „A Nyström Method for 2D Linear Fredholm Integral Equations on Curvilinear Domains“. Mathematics 11, no. 23 (December 3, 2023): 4859. http://dx.doi.org/10.3390/math11234859.
Der volle Inhalt der QuelleDissertationen zum Thema "Approximation de Nyström"
Cherfaoui, Farah. „Echantillonnage pour l'accélération des méthodes à noyaux et sélection gloutonne pour les représentations parcimonieuses“. Electronic Thesis or Diss., Aix-Marseille, 2022. http://www.theses.fr/2022AIXM0256.
The contributions of this thesis are divided into two parts. The first part is dedicated to the acceleration of kernel methods, the second to optimization under sparsity constraints. Kernel methods are widely known and used in machine learning, but their computational cost is high and they become impractical when the number of data points is large. We first propose an approximation of the ridge leverage scores, and then use these scores to define a probability distribution for the sampling step of the Nyström method, in order to speed up kernel methods. We also propose a new kernel-based framework for representing and comparing discrete probability distributions, and exploit the link between this framework and the maximum mean discrepancy to obtain an accurate and fast approximation of the latter. The second part of this thesis is devoted to optimization under sparsity constraints, applied to sparse approximation of signals and to random forest pruning. First, we prove, under certain conditions on the coherence of the dictionary, reconstruction and convergence properties of the Frank-Wolfe algorithm. Then, we use the OMP algorithm to reduce the number of trees in a random forest and thus the storage it requires: the pruned forest consists of a subset of trees from the initial forest, selected and weighted by OMP so as to minimize the empirical prediction error.
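To make the Nyström step concrete, here is a minimal NumPy sketch of landmark sampling proportional to ridge leverage scores, followed by the standard Nyström reconstruction K ≈ C W^+ C^T. It is an illustration under assumed choices (an RBF kernel, exact rather than approximated leverage scores, and arbitrary values for gamma, lam and m), not the algorithm studied in the thesis.

import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of X and Y.
    sq = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * sq)

def ridge_leverage_scores(K, lam):
    # Exact ridge leverage scores l_i = [K (K + lam*n*I)^{-1}]_{ii}.
    # Computing them exactly costs O(n^3); approximating them cheaply is
    # the point of the thesis and is not reproduced here.
    n = K.shape[0]
    return np.diag(K @ np.linalg.inv(K + lam * n * np.eye(n)))

def nystrom_approximation(K, m, scores, rng):
    # Sample m landmark columns with probability proportional to the scores,
    # then form the rank-m Nystrom approximation C W^+ C^T.
    p = scores / scores.sum()
    idx = rng.choice(K.shape[0], size=m, replace=False, p=p)
    C = K[:, idx]
    W = K[np.ix_(idx, idx)]
    return C @ np.linalg.pinv(W) @ C.T

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 5))            # toy data set
K = rbf_kernel(X, X, gamma=0.5)              # full kernel matrix
scores = ridge_leverage_scores(K, lam=1e-3)  # sampling distribution
K_hat = nystrom_approximation(K, m=50, scores=scores, rng=rng)
print("relative Frobenius error:", np.linalg.norm(K - K_hat) / np.linalg.norm(K))

In practice the full kernel matrix is never formed; only the m landmark columns are computed, which is what makes the method scale to large data sets.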
Li, Jun 1977. „A computational model for the diffusion coefficients of DNA with applications“. Thesis, 2010. http://hdl.handle.net/2152/ETD-UT-2010-05-1098.
Book chapters on the topic "Approximation de Nyström"
Hutchings, Matthew, and Bertrand Gauthier. „Local Optimisation of Nyström Samples Through Stochastic Gradient Descent“. In Machine Learning, Optimization, and Data Science, 123–40. Cham: Springer Nature Switzerland, 2023. http://dx.doi.org/10.1007/978-3-031-25599-1_10.
Fu, Zhouyu. „Optimal Landmark Selection for Nyström Approximation“. In Neural Information Processing, 311–18. Cham: Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-12640-1_38.
Li, Hongyu, and Lin Zhang. „Dynamic Subspace Update with Incremental Nyström Approximation“. In Computer Vision – ACCV 2010 Workshops, 384–93. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-22819-3_39.
Zhang, Huaxiang, Zhichao Wang and Linlin Cao. „Fast Nyström for Low Rank Matrix Approximation“. In Advanced Data Mining and Applications, 456–64. Berlin, Heidelberg: Springer Berlin Heidelberg, 2012. http://dx.doi.org/10.1007/978-3-642-35527-1_38.
Frammartino, Carmelina. „A Nyström Method for Solving a Boundary Value Problem on [0, ∞)“. In Approximation and Computation, 311–25. New York, NY: Springer New York, 2010. http://dx.doi.org/10.1007/978-1-4419-6594-3_20.
Jia, Hongjie, Liangjun Wang and Heping Song. „Large-Scale Spectral Clustering with Stochastic Nyström Approximation“. In IFIP Advances in Information and Communication Technology, 26–34. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-46931-3_3.
Allouch, Chafik, Ikram Hamzaoui and Driss Sbibih. „Richardson Extrapolation of Nyström Method Associated with a Sextic Spline Quasi-Interpolant“. In Mathematical and Computational Methods for Modelling, Approximation and Simulation, 105–19. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-94339-4_5.
Yun, Jeong-Min, and Seungjin Choi. „Nyström Approximations for Scalable Face Recognition: A Comparative Study“. In Neural Information Processing, 325–34. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-24958-7_38.
Der volle Inhalt der QuelleKonferenzberichte zum Thema "Approximation de Nyström"
Giffon, Luc, Stephane Ayache, Thierry Artieres and Hachem Kadri. „Deep Networks with Adaptive Nyström Approximation“. In 2019 International Joint Conference on Neural Networks (IJCNN). IEEE, 2019. http://dx.doi.org/10.1109/ijcnn.2019.8851711.
Zhang, Kai, Ivor W. Tsang and James T. Kwok. „Improved Nyström low-rank approximation and error analysis“. In Proceedings of the 25th International Conference on Machine Learning. New York, New York, USA: ACM Press, 2008. http://dx.doi.org/10.1145/1390156.1390311.
Mathur, Anant, Sarat Moka and Zdravko Botev. „Column Subset Selection and Nyström Approximation via Continuous Optimization“. In 2023 Winter Simulation Conference (WSC). IEEE, 2023. http://dx.doi.org/10.1109/wsc60868.2023.10407416.
Münch, Maximilian, Katrin Sophie Bohnsack, Alexander Engelsberger, Frank-Michael Schleif and Thomas Villmann. „Sparse Nyström Approximation for Non-Vectorial Data Using Class-informed Landmark Selection“. In ESANN 2023 - European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning. Louvain-la-Neuve (Belgium): Ciaco - i6doc.com, 2023. http://dx.doi.org/10.14428/esann/2023.es2023-136.
Patel, Raajen, Tom Goldstein, Eva Dyer, Azalia Mirhoseini and Richard Baraniuk. „Deterministic Column Sampling for Low-Rank Matrix Approximation: Nyström vs. Incomplete Cholesky Decomposition“. In Proceedings of the 2016 SIAM International Conference on Data Mining. Philadelphia, PA: Society for Industrial and Applied Mathematics, 2016. http://dx.doi.org/10.1137/1.9781611974348.67.
Lee, Jieun, and Yoonsik Choe. „Graph-Regularized Fast Low-Rank Matrix Approximation Using the Nyström Method for Clustering“. In 2018 IEEE 28th International Workshop on Machine Learning for Signal Processing (MLSP). IEEE, 2018. http://dx.doi.org/10.1109/mlsp.2018.8517034.
Dereziński, Michał, Rajiv Khanna and Michael W. Mahoney. „Improved Guarantees and a Multiple-descent Curve for Column Subset Selection and the Nystrom Method (Extended Abstract)“. In Thirtieth International Joint Conference on Artificial Intelligence (IJCAI-21). California: International Joint Conferences on Artificial Intelligence Organization, 2021. http://dx.doi.org/10.24963/ijcai.2021/647.
Ahmed, Hesham Ibrahim, Wan Qun, Ding Xue-ke and Zhou Zhi-ping. „Squared distance matrix completion through Nystrom approximation“. In 2016 22nd Asia-Pacific Conference on Communications (APCC). IEEE, 2016. http://dx.doi.org/10.1109/apcc.2016.7581449.
Hou, Bo-Jian, Lijun Zhang and Zhi-Hua Zhou. „Storage Fit Learning with Unlabeled Data“. In Twenty-Sixth International Joint Conference on Artificial Intelligence. California: International Joint Conferences on Artificial Intelligence Organization, 2017. http://dx.doi.org/10.24963/ijcai.2017/256.
Patel, Lokendra Singh, Suman Sana and S. P. Ghrera. „Efficient Nystrom method for low rank approximation and error analysis“. In 2015 Third International Conference on Image Information Processing (ICIIP). IEEE, 2015. http://dx.doi.org/10.1109/iciip.2015.7414831.