Academic literature on the topic 'Generalized cosine similarity'
Generate an accurate reference in APA, MLA, Chicago, Harvard, and other citation styles
Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Generalized cosine similarity.'
Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and the bibliographic reference to the chosen work will be generated automatically in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.
Journal articles on the topic "Generalized cosine similarity"
Ye, Jun. "Generalized Ordered Weighted Simplified Neutrosophic Cosine Similarity Measure for Multiple Attribute Group Decision Making." International Journal of Cognitive Informatics and Natural Intelligence 14, no. 1 (January 2020): 51–62. http://dx.doi.org/10.4018/ijcini.2020010104.
Liu, Donghai, Xiaohong Chen, and Dan Peng. "Interval-Valued Intuitionistic Fuzzy Ordered Weighted Cosine Similarity Measure and Its Application in Investment Decision-Making." Complexity 2017 (2017): 1–11. http://dx.doi.org/10.1155/2017/1891923.
Ye, Jun, Shigui Du, and Rui Yong. "Similarity Measures between Intuitionistic Fuzzy Credibility Sets and Their Multicriteria Decision-Making Method for the Performance Evaluation of Industrial Robots." Mathematical Problems in Engineering 2021 (January 19, 2021): 1–10. http://dx.doi.org/10.1155/2021/6630898.
Liu, Donghai, Xiaohong Chen, and Dan Peng. "Cosine Distance Measure between Neutrosophic Hesitant Fuzzy Linguistic Sets and Its Application in Multiple Criteria Decision Making." Symmetry 10, no. 11 (November 7, 2018): 602. http://dx.doi.org/10.3390/sym10110602.
Gulistan, Muhammad, Mutaz Mohammad, Faruk Karaaslan, Seifedine Kadry, Salma Khan, and Hafiz Abdul Wahab. "Neutrosophic cubic Heronian mean operators with applications in multiple attribute group decision-making using cosine similarity functions." International Journal of Distributed Sensor Networks 15, no. 9 (September 2019): 155014771987761. http://dx.doi.org/10.1177/1550147719877613.
Kate, Rohit J. "Normalizing clinical terms using learned edit distance patterns." Journal of the American Medical Informatics Association 23, no. 2 (July 31, 2015): 380–86. http://dx.doi.org/10.1093/jamia/ocv108.
Arafah, Muhammad. "Implementation of Generalized Vector Space Model Method at Automatic Assessment of Online Essay Exam." Journal of Information Technology and Its Utilization 1, no. 2 (December 17, 2018): 43. http://dx.doi.org/10.30818/jitu.1.2.1893.
Kar, Arindam, Debotosh Bhattacharjee, Dipak Kumar Basu, Mita Nasipuri, and Mahantapas Kundu. "A Gabor-Block-Based Kernel Discriminative Common Vector Approach Using Cosine Kernels for Human Face Recognition." Computational Intelligence and Neuroscience 2012 (2012): 1–12. http://dx.doi.org/10.1155/2012/421032.
Liu, Peide, Muhammad Munir, Tahir Mahmood, and Kifayat Ullah. "Some Similarity Measures for Interval-Valued Picture Fuzzy Sets and Their Applications in Decision Making." Information 10, no. 12 (November 25, 2019): 369. http://dx.doi.org/10.3390/info10120369.
Song, Wanqing, Wujin Deng, Dongdong Chen, Rong Jin, and Aleksey Kudreyko. "Hybrid Approach of Fractional Generalized Pareto Motion and Cosine Similarity Hidden Markov Model for Solar Radiation Forecasting." Fractal and Fractional 7, no. 1 (January 13, 2023): 93. http://dx.doi.org/10.3390/fractalfract7010093.
Full textDissertations / Theses on the topic "Generalized cosine similarity"
Qamar, Ali Mustafa. "Mesures de similarité et cosinus généralisé : une approche d'apprentissage supervisé fondée sur les k plus proches voisins [Similarity measures and generalized cosine: A supervised learning approach based on k-nearest neighbors]." PhD thesis, Grenoble, 2010. http://www.theses.fr/2010GRENM083.
Almost all machine learning problems depend heavily on the metric used. Many works have shown that it is far better to learn the metric structure from the data than to assume a simple geometry based on the identity matrix. This has paved the way for a research theme called metric learning. Most works in this domain have based their approaches on distance learning only, but others have shown that similarity should be preferred over distance metrics when dealing with textual as well as non-textual datasets. Being able to efficiently learn appropriate similarity measures, as opposed to distances, is thus of high importance for various collections. While several works have partially addressed this problem for different applications, no previous work is known to have fully addressed it in the context of learning similarity metrics for kNN classification, which is exactly the focus of the current study.

In information filtering systems, where the aim is to filter an incoming stream of documents into a set of predefined topics with little supervision, cosine-based, category-specific thresholds can be learned. Learning such thresholds can be seen as a first step towards learning a complete similarity measure. This strategy was used to develop online and batch algorithms for information filtering during the INFILE (Information Filtering) track of the CLEF (Cross Language Evaluation Forum) campaign in 2008 and 2009.

However, provided enough supervised information is available, as in classification settings, it is usually beneficial to learn a complete metric rather than thresholds alone. To this end, we developed several algorithms for learning complete similarity metrics for kNN classification. An unconstrained similarity learning algorithm called SiLA is developed, in which the normalization is independent of the similarity matrix. SiLA encompasses, among others, the standard cosine measure as well as the Dice and Jaccard coefficients. It is an extension of the voted perceptron algorithm and allows different types of similarity functions (based on diagonal, symmetric, or asymmetric matrices) to be learned. We then compare SiLA with RELIEF, a well-known feature re-weighting algorithm. It has recently been suggested by Sun and Wu that RELIEF can be seen as a distance metric learning algorithm optimizing a cost function that approximates the 0-1 loss. We show here that this approximation is loose, and propose a stricter version closer to the 0-1 loss, leading to a new and better RELIEF-based algorithm for classification. We then focus on a direct extension of the cosine similarity measure, defined as a normalized scalar product in a projected space; the associated algorithm is called the generalized Cosine simiLarity Algorithm (gCosLA). All of the algorithms are tested on many different datasets, and a statistical test, the s-test, is employed to assess whether the results are significantly different. gCosLA performed statistically much better than SiLA on many of the datasets. Furthermore, SiLA and gCosLA were compared with many state-of-the-art algorithms, illustrating their well-foundedness.
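To make the generalized cosine similarity from the abstract above concrete: it is a scalar product normalized in a space induced by a positive semi-definite matrix A, so that s_A(x, y) = (x^T A y) / sqrt((x^T A x)(y^T A y)), and it reduces to the standard cosine measure when A is the identity. The sketch below is only an illustrative implementation under that reading; the function and variable names are ours and do not come from the thesis, where A would be learned from labeled data rather than drawn at random.

import numpy as np

def generalized_cosine(x, y, A):
    """Generalized cosine similarity: a scalar product normalized in the
    space induced by a positive semi-definite matrix A.
    With A = I this reduces to the standard cosine measure."""
    num = x @ A @ y
    den = np.sqrt(x @ A @ x) * np.sqrt(y @ A @ y)
    return num / den if den > 0 else 0.0

# Toy usage: A = L L^T guarantees positive semi-definiteness, so it can
# stand in for a learned projection matrix.
rng = np.random.default_rng(0)
L = rng.normal(size=(5, 3))
A = L @ L.T
x, y = rng.normal(size=5), rng.normal(size=5)
print(generalized_cosine(x, y, A))          # similarity in the projected space
print(generalized_cosine(x, y, np.eye(5)))  # standard cosine similarity

In a kNN classifier of the kind studied in the thesis, such a measure would simply replace the distance when ranking neighbors, with the most similar (rather than the closest) training examples voting on the label.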
Conference papers on the topic "Generalized cosine similarity"
Kauderer, M. H. "Three-dimensional Fourier optics: the linear scalar transfer function." In OSA Annual Meeting. Washington, D.C.: Optica Publishing Group, 1988. http://dx.doi.org/10.1364/oam.1988.tuu3.