Academic literature on the topic 'Dot product kernels'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Dot product kernels.'
Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.
Journal articles on the topic "Dot product kernels"
Menegatto, V. A., C. P. Oliveira, and A. P. Peron. "Conditionally positive definite dot product kernels." Journal of Mathematical Analysis and Applications 321, no. 1 (September 2006): 223–41. http://dx.doi.org/10.1016/j.jmaa.2005.08.024.
Menegatto, V. A., C. P. Oliveira, and Ana P. Peron. "On conditionally positive definite dot product kernels." Acta Mathematica Sinica, English Series 24, no. 7 (July 2008): 1127–38. http://dx.doi.org/10.1007/s10114-007-6227-4.
Lu, Fangyan, and Hongwei Sun. "Positive definite dot product kernels in learning theory." Advances in Computational Mathematics 22, no. 2 (February 2005): 181–98. http://dx.doi.org/10.1007/s10444-004-3140-6.
Griffiths, Matthew P., Denys Grombacher, Mason A. Kass, Mathias Ø. Vang, Lichao Liu, and Jakob Juul Larsen. "A surface NMR forward in a dot product." Geophysical Journal International 234, no. 3 (April 27, 2023): 2284–90. http://dx.doi.org/10.1093/gji/ggad203.
Donini, Michele, and Fabio Aiolli. "Learning deep kernels in the space of dot product polynomials." Machine Learning 106, no. 9-10 (November 7, 2016): 1245–69. http://dx.doi.org/10.1007/s10994-016-5590-8.
Filippas, Dionysios, Chrysostomos Nicopoulos, and Giorgos Dimitrakopoulos. "Templatized Fused Vector Floating-Point Dot Product for High-Level Synthesis." Journal of Low Power Electronics and Applications 12, no. 4 (October 17, 2022): 56. http://dx.doi.org/10.3390/jlpea12040056.
Bishwas, Arit Kumar, Ashish Mani, and Vasile Palade. "Gaussian kernel in quantum learning." International Journal of Quantum Information 18, no. 3 (April 2020): 2050006. http://dx.doi.org/10.1142/s0219749920500069.
Xiao, Lechao, Hong Hu, Theodor Misiakiewicz, Yue M. Lu, and Jeffrey Pennington. "Precise learning curves and higher-order scaling limits for dot-product kernel regression." Journal of Statistical Mechanics: Theory and Experiment 2023, no. 11 (November 1, 2023): 114005. http://dx.doi.org/10.1088/1742-5468/ad01b7.
Iakymchuk, Roman, Stef Graillat, David Defour, and Enrique S. Quintana-Ortí. "Hierarchical approach for deriving a reproducible unblocked LU factorization." International Journal of High Performance Computing Applications 33, no. 5 (March 17, 2019): 791–803. http://dx.doi.org/10.1177/1094342019832968.
Azevedo, D., and V. A. Menegatto. "Sharp estimates for eigenvalues of integral operators generated by dot product kernels on the sphere." Journal of Approximation Theory 177 (January 2014): 57–68. http://dx.doi.org/10.1016/j.jat.2013.10.002.
Dissertations / Theses on the topic "Dot product kernels"
Wacker, Jonas. "Random features for dot product kernels and beyond." PhD diss., Sorbonne Université, 2022. http://www.theses.fr/2022SORUS241.
Dot product kernels, such as polynomial and exponential (softmax) kernels, are among the most widely used kernels in machine learning, as they make it possible to model interactions between input features, which is crucial in applications like computer vision, natural language processing, and recommender systems. However, a fundamental drawback of kernel-based statistical models is their limited scalability to a large number of inputs, which requires resorting to approximations. In this thesis, we study techniques for linearizing kernel-based methods by means of random feature approximations, focusing on polynomial kernels and more general dot product kernels in order to make them more useful in large-scale learning. In particular, we use a variance analysis as the main tool to study and improve the statistical efficiency of such sketches.
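To make the random-feature idea in this abstract concrete, the following short NumPy sketch (an illustration under simple assumptions, not code from the thesis) builds an unbiased random-feature map for the homogeneous polynomial kernel k(x, y) = (x · y)^p from products of independent Rademacher projections; the function name and parameter values are invented for the example.

    # Minimal sketch of random features for a homogeneous polynomial (dot product)
    # kernel k(x, y) = (x . y)**degree. Hypothetical helper, not the thesis code.
    import numpy as np

    def random_product_features(X, degree=3, n_features=4096, seed=0):
        """Rows of the returned Z satisfy E[Z(x) . Z(y)] = (x . y)**degree."""
        rng = np.random.default_rng(seed)
        d = X.shape[1]
        # 'degree' independent Rademacher (+/-1) vectors per output feature.
        W = rng.choice([-1.0, 1.0], size=(n_features, degree, d))
        # z_i(x) = prod_j <w_ij, x>; the product of independent projections is an
        # unbiased estimate of (x . y)**degree, and 1/sqrt(n_features) averages them.
        projections = np.einsum('ijd,nd->nij', W, X)  # shape (n, n_features, degree)
        Z = np.prod(projections, axis=2)
        return Z / np.sqrt(n_features)

    # Sanity check: feature inner products approach the exact kernel values.
    X = np.random.default_rng(1).normal(size=(5, 10)) / np.sqrt(10)
    Z = random_product_features(X, degree=3, n_features=20000)
    exact = (X @ X.T) ** 3
    print(np.abs(exact - Z @ Z.T).max())  # shrinks as n_features grows

Sketches of this kind trade a controllable amount of variance for linear-time kernel evaluations, which is the trade-off the variance analysis mentioned in the abstract is meant to quantify.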
Book chapters on the topic "Dot product kernels"
Chen, Degang, Qiang He, Chunru Dong, and Xizhao Wang. "A Method to Construct the Mapping to the Feature Space for the Dot Product Kernels." In Advances in Machine Learning and Cybernetics, 918–29. Berlin, Heidelberg: Springer Berlin Heidelberg, 2006. http://dx.doi.org/10.1007/11739685_96.
Lauriola, Ivano, Mirko Polato, and Fabio Aiolli. "Radius-Margin Ratio Optimization for Dot-Product Boolean Kernel Learning." In Artificial Neural Networks and Machine Learning – ICANN 2017, 183–91. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-68612-7_21.
Conference papers on the topic "Dot product kernels"
Azevedo, Douglas, and Valdir A. Menegatto. "Eigenvalues of dot-product kernels on the sphere." In XXXV CNMAC - Congresso Nacional de Matemática Aplicada e Computacional. SBMAC, 2015. http://dx.doi.org/10.5540/03.2015.003.01.0039.
Chen, G. Y., and P. Bhattacharya. "Function Dot Product Kernels for Support Vector Machine." In 18th International Conference on Pattern Recognition (ICPR'06). IEEE, 2006. http://dx.doi.org/10.1109/icpr.2006.586.
Rashed, Muhammad Rashedul Haq, Sumit Kumar Jha, and Rickard Ewetz. "Discovering the in-Memory Kernels of 3D Dot-Product Engines." In ASPDAC '23: 28th Asia and South Pacific Design Automation Conference. New York, NY, USA: ACM, 2023. http://dx.doi.org/10.1145/3566097.3567855.
Zhang, Li, Weida Zhou, Ying Lin, and Licheng Jiao. "Support vector novelty detection with dot product kernels for non-spherical data." In 2008 International Conference on Information and Automation (ICIA). IEEE, 2008. http://dx.doi.org/10.1109/icinfa.2008.4607965.
Venkatesan, Sibi, James K. Miller, Jeff Schneider, and Artur Dubrawski. "Scaling Active Search using Linear Similarity Functions." In Twenty-Sixth International Joint Conference on Artificial Intelligence. California: International Joint Conferences on Artificial Intelligence Organization, 2017. http://dx.doi.org/10.24963/ijcai.2017/401.
De Jesús Rivera, Edward, Fanny Besem-Cordova, and Jean-Charles Bonaccorsi. "Optimization of a High Pressure Industrial Fan." In ASME Turbo Expo 2021: Turbomachinery Technical Conference and Exposition. American Society of Mechanical Engineers, 2021. http://dx.doi.org/10.1115/gt2021-58967.