Academic literature on the topic 'Dot product kernels'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Dot product kernels.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online, whenever these are available in the metadata.

Journal articles on the topic "Dot product kernels"

1. Menegatto, V. A., C. P. Oliveira, and A. P. Peron. "Conditionally positive definite dot product kernels." Journal of Mathematical Analysis and Applications 321, no. 1 (2006): 223–41. http://dx.doi.org/10.1016/j.jmaa.2005.08.024.

2. Menegatto, V. A., C. P. Oliveira, and Ana P. Peron. "On conditionally positive definite dot product kernels." Acta Mathematica Sinica, English Series 24, no. 7 (2008): 1127–38. http://dx.doi.org/10.1007/s10114-007-6227-4.

3. Lu, Fangyan, and Hongwei Sun. "Positive definite dot product kernels in learning theory." Advances in Computational Mathematics 22, no. 2 (2005): 181–98. http://dx.doi.org/10.1007/s10444-004-3140-6.

4. Griffiths, Matthew P., Denys Grombacher, Mason A. Kass, Mathias Ø. Vang, Lichao Liu, and Jakob Juul Larsen. "A surface NMR forward in a dot product." Geophysical Journal International 234, no. 3 (2023): 2284–90. http://dx.doi.org/10.1093/gji/ggad203.

Abstract:
The computation required to simulate surface nuclear magnetic resonance (SNMR) data increases proportionally with the number of sequences and the number of pulses in each sequence. This poses a particular challenge to modelling steady-state SNMR, where suites of sequences are acquired, each of which requires modelling 10s–100s of pulses. To model such data efficiently, we have developed a reformulation of the surface NMR forward model, where the geometry of the transmit and receive fields is encapsulated into a vector (or set of vectors), which we call B1-volume-receive (BVR) curves. Projecting BV…
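
The reformulation described above collapses each forward computation into a single dot product between a precomputed geometry vector and the discretized model. A minimal sketch of the idea in Python; every name and size here is hypothetical, not taken from the paper:

import numpy as np

rng = np.random.default_rng(0)
n_cells = 1000                                        # discretized subsurface cells (hypothetical)
bvr = rng.normal(size=n_cells)                        # precomputed BVR-style kernel vector
water_content = rng.uniform(0.0, 0.4, size=n_cells)   # model vector

# Once the geometry is folded into the BVR vector, the forward
# response of a pulse sequence is just one dot product.
signal = bvr @ water_content
print(signal)
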
5. Donini, Michele, and Fabio Aiolli. "Learning deep kernels in the space of dot product polynomials." Machine Learning 106, no. 9–10 (2016): 1245–69. http://dx.doi.org/10.1007/s10994-016-5590-8.

6. Filippas, Dionysios, Chrysostomos Nicopoulos, and Giorgos Dimitrakopoulos. "Templatized Fused Vector Floating-Point Dot Product for High-Level Synthesis." Journal of Low Power Electronics and Applications 12, no. 4 (2022): 56. http://dx.doi.org/10.3390/jlpea12040056.

Abstract:
Machine-learning accelerators rely on floating-point matrix and vector multiplication kernels. To reduce their cost, customized many-term fused architectures are preferred, which improve the latency, power, and area of the designs. In this work, we design a parameterized fused many-term floating-point dot product architecture that is ready for high-level synthesis. In this way, we can exploit the efficiency offered by a well-structured fused dot-product architecture and the freedom offered by high-level synthesis in tuning the design’s pipeline to the selected floating-point format and architecture…
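
The efficiency of such designs comes from fusing the many-term dot product: partial products are accumulated in a wide internal format and rounded to the target floating-point format only once, rather than after every multiply-add. A rough software analogue of that single-rounding behaviour (exact rational arithmetic stands in for the wide hardware accumulator; this is an illustration, not the paper's architecture):

from fractions import Fraction

def fused_dot(xs, ys):
    # Accumulate the sum of products exactly, then round once at the end.
    exact = sum(Fraction(x) * Fraction(y) for x, y in zip(xs, ys))
    return float(exact)

def naive_dot(xs, ys):
    # Two roundings per term: after the multiply and after the add.
    acc = 0.0
    for x, y in zip(xs, ys):
        acc += x * y
    return acc

xs = [1e16, 1.0, -1e16]
ys = [1.0, 1.0, 1.0]
print(naive_dot(xs, ys))   # 0.0: the 1.0 is lost to intermediate rounding
print(fused_dot(xs, ys))   # 1.0: a single final rounding preserves it
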
7. Bishwas, Arit Kumar, Ashish Mani, and Vasile Palade. "Gaussian kernel in quantum learning." International Journal of Quantum Information 18, no. 3 (2020): 2050006. http://dx.doi.org/10.1142/s0219749920500069.

Abstract:
The Gaussian kernel is a very popular kernel function used in many machine learning algorithms, especially in support vector machines (SVMs). It is more often used than polynomial kernels when learning from nonlinear datasets and is usually employed in formulating the classical SVM for nonlinear problems. Rebentrost et al. discussed an elegant quantum version of a least square support vector machine using quantum polynomial kernels, which is exponentially faster than the classical counterpart. This paper demonstrates a quantum version of the Gaussian kernel and analyzes its runtime complexity…
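
One step worth making explicit, since it is why dot-product primitives suffice for Gaussian kernels: the squared distance in the exponent expands into three inner products, ‖x − y‖² = ⟨x, x⟩ − 2⟨x, y⟩ + ⟨y, y⟩. A small classical sketch of that identity (illustration only, not the quantum algorithm):

import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    # Squared distance computed purely from dot products.
    sq_dist = x @ x - 2.0 * (x @ y) + y @ y
    return np.exp(-sq_dist / (2.0 * sigma**2))

x = np.array([1.0, 2.0, 3.0])
y = np.array([0.5, 1.5, 2.5])
print(gaussian_kernel(x, y))
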
8. Iakymchuk, Roman, Stef Graillat, David Defour, and Enrique S. Quintana-Ortí. "Hierarchical approach for deriving a reproducible unblocked LU factorization." International Journal of High Performance Computing Applications 33, no. 5 (2019): 791–803. http://dx.doi.org/10.1177/1094342019832968.

Abstract:
We propose a reproducible variant of the unblocked LU factorization for graphics processor units (GPUs). For this purpose, we build upon Level-1/2 BLAS kernels that deliver correctly rounded and reproducible results for the dot (inner) product, vector scaling, and the matrix-vector product. In addition, we draw a strategy to enhance the accuracy of the triangular solve via iterative refinement. Following a bottom-up approach, we finally construct a reproducible unblocked implementation of the LU factorization for GPUs, which accommodates partial pivoting for stability and can be eventually integrated…
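
Reproducible here means bit-identical results regardless of the order in which terms are accumulated. A toy demonstration of the property using Python's correctly rounded summation as a stand-in for the paper's GPU kernels:

import math
import random

def reproducible_dot(xs, ys):
    # math.fsum returns the correctly rounded sum, so the result
    # cannot depend on the order of the terms.
    return math.fsum(x * y for x, y in zip(xs, ys))

random.seed(1)
pairs = [(random.uniform(-1e8, 1e8), random.uniform(-1.0, 1.0)) for _ in range(1000)]
a = reproducible_dot(*zip(*pairs))
random.shuffle(pairs)                 # permute the accumulation order
b = reproducible_dot(*zip(*pairs))
print(a == b)                         # True: same bits in any order
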
9. Xiao, Lechao, Hong Hu, Theodor Misiakiewicz, Yue M. Lu, and Jeffrey Pennington. "Precise learning curves and higher-order scaling limits for dot-product kernel regression." Journal of Statistical Mechanics: Theory and Experiment 2023, no. 11 (2023): 114005. http://dx.doi.org/10.1088/1742-5468/ad01b7.

Abstract:
As modern machine learning models continue to advance the computational frontier, it has become increasingly important to develop precise estimates for expected performance improvements under different model and data scaling regimes. Currently, theoretical understanding of the learning curves (LCs) that characterize how the prediction error depends on the number of samples is restricted to either large-sample asymptotics (m → ∞) or, for certain simple data distributions, to the high-dimensional asymptotics in which the number of samples scales linearly with the dimension (m ∝ d)…
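
The object being analyzed is kernel ridge regression with a dot-product kernel, i.e. k(x, y) = f(⟨x, y⟩) for a scalar function f. A toy instance of the setup, with f(t) = (1 + t)³ chosen arbitrarily for illustration:

import numpy as np

rng = np.random.default_rng(0)
m, d, lam = 200, 20, 1e-3
X = rng.normal(size=(m, d)) / np.sqrt(d)
y = np.sin(X @ rng.normal(size=d))            # toy target

K = (1.0 + X @ X.T) ** 3                      # dot-product kernel Gram matrix
alpha = np.linalg.solve(K + lam * np.eye(m), y)

x_new = rng.normal(size=d) / np.sqrt(d)
k_new = (1.0 + X @ x_new) ** 3
print(k_new @ alpha)                          # prediction at x_new
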
10. Azevedo, D., and V. A. Menegatto. "Sharp estimates for eigenvalues of integral operators generated by dot product kernels on the sphere." Journal of Approximation Theory 177 (January 2014): 57–68. http://dx.doi.org/10.1016/j.jat.2013.10.002.


Dissertations / Theses on the topic "Dot product kernels"

1. Wacker, Jonas. "Random features for dot product kernels and beyond." Electronic thesis or dissertation, Sorbonne Université, 2022. http://www.theses.fr/2022SORUS241.

Abstract:
Dot product kernels, such as polynomial and exponential (softmax) kernels, are among the most widely used kernels in machine learning, as they allow the interactions between the components of input vectors to be modeled, which is crucial in applications such as computer vision, natural language processing, and recommender systems. However, a fundamental drawback of kernel-based statistical models is their limited scalability to large numbers of input data points, which makes it necessary to resort to approximations…
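
As a concrete illustration of the random-feature idea the thesis builds on, here is the classic Rademacher construction for the homogeneous polynomial kernel k(x, y) = ⟨x, y⟩^p: with independent sign vectors w₁, …, w_p, the feature z(x) = ∏ⱼ ⟨wⱼ, x⟩ satisfies E[z(x)z(y)] = ⟨x, y⟩^p. A sketch under these assumptions, not code from the thesis:

import numpy as np

rng = np.random.default_rng(0)

def poly_random_features(X, p=2, n_features=2000):
    # E[Z @ Z.T] equals the exact kernel (X @ X.T) ** p.
    _, d = X.shape
    W = rng.choice([-1.0, 1.0], size=(n_features, p, d))   # Rademacher signs
    Z = np.prod(W @ X.T, axis=1)                           # (n_features, n)
    return Z.T / np.sqrt(n_features)

X = rng.normal(size=(5, 10))
Z = poly_random_features(X)
print(np.round(Z @ Z.T, 1))          # Monte Carlo approximation
print(np.round((X @ X.T) ** 2, 1))   # exact polynomial kernel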

Book chapters on the topic "Dot product kernels"

1. Chen, Degang, Qiang He, Chunru Dong, and Xizhao Wang. "A Method to Construct the Mapping to the Feature Space for the Dot Product Kernels." In Advances in Machine Learning and Cybernetics. Springer Berlin Heidelberg, 2006. http://dx.doi.org/10.1007/11739685_96.
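
For intuition about what such a construction yields (a generic example, not the chapter's method): the dot-product kernel k(x, y) = ⟨x, y⟩² admits the explicit feature map φ(x) = vec(xxᵀ), because ⟨φ(x), φ(y)⟩ = ⟨x, y⟩².

import numpy as np

def phi(x):
    # Vectorized outer product: an explicit feature map for ⟨x, y⟩².
    return np.outer(x, x).ravel()

x = np.array([1.0, 2.0])
y = np.array([3.0, -1.0])
print(phi(x) @ phi(y))    # 1.0
print((x @ y) ** 2)       # 1.0, identical by construction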

2. Lauriola, Ivano, Mirko Polato, and Fabio Aiolli. "Radius-Margin Ratio Optimization for Dot-Product Boolean Kernel Learning." In Artificial Neural Networks and Machine Learning – ICANN 2017. Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-68612-7_21.


Conference papers on the topic "Dot product kernels"

1. Azevedo, Douglas, and Valdir A. Menegatto. "Eigenvalues of dot-product kernels on the sphere." In XXXV CNMAC - Congresso Nacional de Matemática Aplicada e Computacional. SBMAC, 2015. http://dx.doi.org/10.5540/03.2015.003.01.0039.

2. Chen, G. Y., and P. Bhattacharya. "Function Dot Product Kernels for Support Vector Machine." In 18th International Conference on Pattern Recognition (ICPR'06). IEEE, 2006. http://dx.doi.org/10.1109/icpr.2006.586.

3. Rashed, Muhammad Rashedul Haq, Sumit Kumar Jha, and Rickard Ewetz. "Discovering the in-Memory Kernels of 3D Dot-Product Engines." In ASPDAC '23: 28th Asia and South Pacific Design Automation Conference. ACM, 2023. http://dx.doi.org/10.1145/3566097.3567855.

4. Zhang, Li, Weida Zhou, Ying Lin, and Licheng Jiao. "Support vector novelty detection with dot product kernels for non-spherical data." In 2008 International Conference on Information and Automation (ICIA). IEEE, 2008. http://dx.doi.org/10.1109/icinfa.2008.4607965.

5. Venkatesan, Sibi, James K. Miller, Jeff Schneider, and Artur Dubrawski. "Scaling Active Search using Linear Similarity Functions." In Twenty-Sixth International Joint Conference on Artificial Intelligence. International Joint Conferences on Artificial Intelligence Organization, 2017. http://dx.doi.org/10.24963/ijcai.2017/401.

Abstract:
Active Search has become an increasingly useful tool in information retrieval problems where the goal is to discover as many target elements as possible using only limited label queries. With the advent of big data, there is a growing emphasis on the scalability of such techniques to handle very large and very complex datasets. In this paper, we consider the problem of Active Search where we are given a similarity function between data points. We look at an algorithm introduced by Wang et al. [2013] known as Active Search on Graphs and propose crucial modifications which allow it…
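
The scalability argument rests on the similarity matrix having linear (dot-product) structure, K = XXᵀ: products with K never require materializing the n × n matrix. A minimal sketch of that trick, with illustrative names:

import numpy as np

rng = np.random.default_rng(0)
n, d = 100_000, 50
X = rng.normal(size=(n, d))   # data matrix; K = X @ X.T stays implicit
v = rng.normal(size=n)

Kv = X @ (X.T @ v)            # equals (X @ X.T) @ v at O(nd) cost and memory
print(Kv[:3])
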
6. De Jesús Rivera, Edward, Fanny Besem-Cordova, and Jean-Charles Bonaccorsi. "Optimization of a High Pressure Industrial Fan." In ASME Turbo Expo 2021: Turbomachinery Technical Conference and Exposition. American Society of Mechanical Engineers, 2021. http://dx.doi.org/10.1115/gt2021-58967.

Abstract:
Fans are used in industrial refineries, power generation, petrochemistry, pollution control, etc. These fans can perform in sometimes extreme, mission-critical conditions. The design of fans has historically relied on turbomachinery affinity laws, resulting in oversized machines that are expensive to manufacture and transport. With the increasingly lower CPU cost of fluid modeling, designers can now turn to CFD optimization to produce the necessary machine performance and flow conditions while respecting manufacturing constraints. The objective of this study is to maximize the pressure…