Selected scientific literature on the topic "Dot product kernels"
Below is a list of current articles, books, theses, conference proceedings, and other scholarly sources relevant to the topic "Dot product kernels".
Journal articles on the topic "Dot product kernels"
Menegatto, V. A., C. P. Oliveira, and A. P. Peron. "Conditionally positive definite dot product kernels". Journal of Mathematical Analysis and Applications 321, no. 1 (September 2006): 223–41. http://dx.doi.org/10.1016/j.jmaa.2005.08.024.
Menegatto, V. A., C. P. Oliveira, and Ana P. Peron. "On conditionally positive definite dot product kernels". Acta Mathematica Sinica, English Series 24, no. 7 (July 2008): 1127–38. http://dx.doi.org/10.1007/s10114-007-6227-4.
Lu, Fangyan, and Hongwei Sun. "Positive definite dot product kernels in learning theory". Advances in Computational Mathematics 22, no. 2 (February 2005): 181–98. http://dx.doi.org/10.1007/s10444-004-3140-6.
Griffiths, Matthew P., Denys Grombacher, Mason A. Kass, Mathias Ø. Vang, Lichao Liu, and Jakob Juul Larsen. "A surface NMR forward in a dot product". Geophysical Journal International 234, no. 3 (April 27, 2023): 2284–90. http://dx.doi.org/10.1093/gji/ggad203.
Donini, Michele, and Fabio Aiolli. "Learning deep kernels in the space of dot product polynomials". Machine Learning 106, no. 9–10 (November 7, 2016): 1245–69. http://dx.doi.org/10.1007/s10994-016-5590-8.
Filippas, Dionysios, Chrysostomos Nicopoulos, and Giorgos Dimitrakopoulos. "Templatized Fused Vector Floating-Point Dot Product for High-Level Synthesis". Journal of Low Power Electronics and Applications 12, no. 4 (October 17, 2022): 56. http://dx.doi.org/10.3390/jlpea12040056.
Bishwas, Arit Kumar, Ashish Mani, and Vasile Palade. "Gaussian kernel in quantum learning". International Journal of Quantum Information 18, no. 03 (April 2020): 2050006. http://dx.doi.org/10.1142/s0219749920500069.
Xiao, Lechao, Hong Hu, Theodor Misiakiewicz, Yue M. Lu, and Jeffrey Pennington. "Precise learning curves and higher-order scaling limits for dot-product kernel regression". Journal of Statistical Mechanics: Theory and Experiment 2023, no. 11 (November 1, 2023): 114005. http://dx.doi.org/10.1088/1742-5468/ad01b7.
Iakymchuk, Roman, Stef Graillat, David Defour, and Enrique S. Quintana-Ortí. "Hierarchical approach for deriving a reproducible unblocked LU factorization". International Journal of High Performance Computing Applications 33, no. 5 (March 17, 2019): 791–803. http://dx.doi.org/10.1177/1094342019832968.
Azevedo, D., and V. A. Menegatto. "Sharp estimates for eigenvalues of integral operators generated by dot product kernels on the sphere". Journal of Approximation Theory 177 (January 2014): 57–68. http://dx.doi.org/10.1016/j.jat.2013.10.002.
Theses / dissertations on the topic "Dot product kernels"
Wacker, Jonas. "Random features for dot product kernels and beyond". Electronic Thesis or Diss., Sorbonne université, 2022. http://www.theses.fr/2022SORUS241.
Dot product kernels, such as polynomial and exponential (softmax) kernels, are among the most widely used kernels in machine learning, as they model interactions between input features, which is crucial in applications like computer vision, natural language processing, and recommender systems. However, a fundamental drawback of kernel-based statistical models is their limited scalability to large numbers of inputs, which requires resorting to approximations. This thesis studies techniques for linearizing kernel-based methods by means of random feature approximations, focusing on polynomial kernels and more general dot product kernels in order to make them more useful in large-scale learning. In particular, variance analysis serves as the main tool for studying and improving the statistical efficiency of such sketches.
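The random-feature linearization the abstract describes can be sketched concretely. The snippet below is an illustrative, minimal implementation of one classical construction for dot product kernels, a Kar–Karnick-style Rademacher sketch for the homogeneous polynomial kernel (x·y)^p; the function name `polynomial_sketch` and all parameter choices are assumptions for illustration, not the thesis's own construction.

```python
import numpy as np

def polynomial_sketch(X, degree=2, n_features=512, seed=None):
    """Random features with E[phi(x) @ phi(y)] = (x @ y) ** degree.

    Each feature multiplies `degree` independent Rademacher projections,
    so the inner product of two feature maps is unbiased for the kernel.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # One set of `degree` random +/-1 vectors per output feature.
    W = rng.choice([-1.0, 1.0], size=(n_features, degree, d))
    # Project: result[n, f, j] = <W[f, j, :], X[n, :]>, then multiply over j.
    Z = np.prod(np.einsum('fjd,nd->nfj', W, X), axis=2)
    return Z / np.sqrt(n_features)

# Usage: the approximate Gram matrix converges to the exact one
# as n_features grows (variance decays like 1 / n_features).
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 3))
K_exact = (X @ X.T) ** 2
Phi = polynomial_sketch(X, degree=2, n_features=20000, seed=0)
K_approx = Phi @ Phi.T
```

A variance analysis of exactly this kind of estimator, i.e. how fast `K_approx` concentrates around `K_exact` as `n_features` grows, is the central object of study in the thesis.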
Book chapters on the topic "Dot product kernels"
Chen, Degang, Qiang He, Chunru Dong, and Xizhao Wang. "A Method to Construct the Mapping to the Feature Space for the Dot Product Kernels". In Advances in Machine Learning and Cybernetics, 918–29. Berlin, Heidelberg: Springer Berlin Heidelberg, 2006. http://dx.doi.org/10.1007/11739685_96.
Lauriola, Ivano, Mirko Polato, and Fabio Aiolli. "Radius-Margin Ratio Optimization for Dot-Product Boolean Kernel Learning". In Artificial Neural Networks and Machine Learning – ICANN 2017, 183–91. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-68612-7_21.
Texto completo da fonteTrabalhos de conferências sobre o assunto "Dot product kernels"
Azevedo, Douglas, and Valdir A. Menegatto. "Eigenvalues of dot-product kernels on the sphere". In XXXV CNMAC - Congresso Nacional de Matemática Aplicada e Computacional. SBMAC, 2015. http://dx.doi.org/10.5540/03.2015.003.01.0039.
Chen, G. Y., and P. Bhattacharya. "Function Dot Product Kernels for Support Vector Machine". In 18th International Conference on Pattern Recognition (ICPR'06). IEEE, 2006. http://dx.doi.org/10.1109/icpr.2006.586.
Rashed, Muhammad Rashedul Haq, Sumit Kumar Jha, and Rickard Ewetz. "Discovering the in-Memory Kernels of 3D Dot-Product Engines". In ASPDAC '23: 28th Asia and South Pacific Design Automation Conference. New York, NY, USA: ACM, 2023. http://dx.doi.org/10.1145/3566097.3567855.
Zhang, Li, Weida Zhou, Ying Lin, and Licheng Jiao. "Support vector novelty detection with dot product kernels for non-spherical data". In 2008 International Conference on Information and Automation (ICIA). IEEE, 2008. http://dx.doi.org/10.1109/icinfa.2008.4607965.
Venkatesan, Sibi, James K. Miller, Jeff Schneider, and Artur Dubrawski. "Scaling Active Search using Linear Similarity Functions". In Twenty-Sixth International Joint Conference on Artificial Intelligence. California: International Joint Conferences on Artificial Intelligence Organization, 2017. http://dx.doi.org/10.24963/ijcai.2017/401.
De Jesús Rivera, Edward, Fanny Besem-Cordova, and Jean-Charles Bonaccorsi. "Optimization of a High Pressure Industrial Fan". In ASME Turbo Expo 2021: Turbomachinery Technical Conference and Exposition. American Society of Mechanical Engineers, 2021. http://dx.doi.org/10.1115/gt2021-58967.