Academic literature on the topic "Generalized cosine similarity"
Contents
Below are thematic lists of journal articles, books, theses, conference proceedings, and other academic sources on the topic "Generalized cosine similarity".
Journal articles on the topic "Generalized cosine similarity"
Ye, Jun. "Generalized Ordered Weighted Simplified Neutrosophic Cosine Similarity Measure for Multiple Attribute Group Decision Making." International Journal of Cognitive Informatics and Natural Intelligence 14, no. 1 (January 2020): 51–62. http://dx.doi.org/10.4018/ijcini.2020010104.
Liu, Donghai, Xiaohong Chen, and Dan Peng. "Interval-Valued Intuitionistic Fuzzy Ordered Weighted Cosine Similarity Measure and Its Application in Investment Decision-Making." Complexity 2017 (2017): 1–11. http://dx.doi.org/10.1155/2017/1891923.
Ye, Jun, Shigui Du, and Rui Yong. "Similarity Measures between Intuitionistic Fuzzy Credibility Sets and Their Multicriteria Decision-Making Method for the Performance Evaluation of Industrial Robots." Mathematical Problems in Engineering 2021 (January 19, 2021): 1–10. http://dx.doi.org/10.1155/2021/6630898.
Liu, Donghai, Xiaohong Chen, and Dan Peng. "Cosine Distance Measure between Neutrosophic Hesitant Fuzzy Linguistic Sets and Its Application in Multiple Criteria Decision Making." Symmetry 10, no. 11 (November 7, 2018): 602. http://dx.doi.org/10.3390/sym10110602.
Gulistan, Muhammad, Mutaz Mohammad, Faruk Karaaslan, Seifedine Kadry, Salma Khan, and Hafiz Abdul Wahab. "Neutrosophic cubic Heronian mean operators with applications in multiple attribute group decision-making using cosine similarity functions." International Journal of Distributed Sensor Networks 15, no. 9 (September 2019): 155014771987761. http://dx.doi.org/10.1177/1550147719877613.
Kate, Rohit J. "Normalizing clinical terms using learned edit distance patterns." Journal of the American Medical Informatics Association 23, no. 2 (July 31, 2015): 380–86. http://dx.doi.org/10.1093/jamia/ocv108.
Arafah, Muhammad. "IMPLEMENTATION OF GENERALIZED VECTOR SPACE MODEL METHOD AT AUTOMATIC ASSESSMENT OF ONLINE ESSAY EXAM." Journal of Information Technology and Its Utilization 1, no. 2 (December 17, 2018): 43. http://dx.doi.org/10.30818/jitu.1.2.1893.
Kar, Arindam, Debotosh Bhattacharjee, Dipak Kumar Basu, Mita Nasipuri, and Mahantapas Kundu. "A Gabor-Block-Based Kernel Discriminative Common Vector Approach Using Cosine Kernels for Human Face Recognition." Computational Intelligence and Neuroscience 2012 (2012): 1–12. http://dx.doi.org/10.1155/2012/421032.
Liu, Peide, Muhammad Munir, Tahir Mahmood, and Kifayat Ullah. "Some Similarity Measures for Interval-Valued Picture Fuzzy Sets and Their Applications in Decision Making." Information 10, no. 12 (November 25, 2019): 369. http://dx.doi.org/10.3390/info10120369.
Song, Wanqing, Wujin Deng, Dongdong Chen, Rong Jin, and Aleksey Kudreyko. "Hybrid Approach of Fractional Generalized Pareto Motion and Cosine Similarity Hidden Markov Model for Solar Radiation Forecasting." Fractal and Fractional 7, no. 1 (January 13, 2023): 93. http://dx.doi.org/10.3390/fractalfract7010093.
Theses on the topic "Generalized cosine similarity"
Qamar, Ali Mustafa. "Mesures de similarité et cosinus généralisé : une approche d'apprentissage supervisé fondée sur les k plus proches voisins." PhD thesis, Grenoble, 2010. http://www.theses.fr/2010GRENM083.
Almost all machine learning problems depend heavily on the metric used. Many works have shown that it is far better to learn the metric structure from the data than to assume a simple geometry based on the identity matrix. This has paved the way for a research theme called metric learning. Most work in this domain has focused on distance learning only. However, other works have shown that similarity should be preferred over distance metrics when dealing with textual as well as non-textual datasets. Being able to efficiently learn appropriate similarity measures, as opposed to distances, is thus of high importance for various collections. While several works have partially addressed this problem for different applications, no previous work is known to have fully addressed it in the context of learning similarity metrics for kNN classification. This is exactly the focus of the present study. In information filtering systems, where the aim is to filter an incoming stream of documents into a set of predefined topics with little supervision, cosine-based category-specific thresholds can be learned. Learning such thresholds can be seen as a first step towards learning a complete similarity measure. This strategy was used to develop online and batch algorithms for information filtering during the INFILE (Information Filtering) track of the CLEF (Cross Language Evaluation Forum) campaign in 2008 and 2009. However, provided enough supervised information is available, as is the case in classification settings, it is usually beneficial to learn a complete metric rather than thresholds. To this end, we developed several algorithms for learning complete similarity metrics for kNN classification. An unconstrained similarity learning algorithm called SiLA is developed, in which the normalization is independent of the similarity matrix.
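The threshold-based filtering strategy described in the abstract can be sketched as follows. This is a minimal illustration, not the INFILE implementation: the vector representation, topic profiles, and threshold values are invented for the example, and only the core idea (route a document to each topic whose learned cosine threshold it meets) comes from the abstract.

```python
import math

def cosine(x, y):
    """Standard cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(x, y))
    norm_x = math.sqrt(sum(a * a for a in x))
    norm_y = math.sqrt(sum(b * b for b in y))
    return dot / (norm_x * norm_y)

def filter_document(doc, profiles, thresholds):
    """Route a document to every topic whose category-specific cosine
    threshold it meets; a document matching no topic is filtered out."""
    return [topic for topic, profile in profiles.items()
            if cosine(doc, profile) >= thresholds[topic]]
```

For instance, with profiles `{"ml": [1.0, 0.0, 1.0], "bio": [0.0, 1.0, 0.0]}` and thresholds of 0.5 for both topics, the document vector `[1.0, 0.2, 0.9]` is routed to "ml" only.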
SiLA encompasses, among others, the standard cosine measure, as well as the Dice and Jaccard coefficients. SiLA is an extension of the voted perceptron algorithm and allows learning different types of similarity functions (based on diagonal, symmetric, or asymmetric matrices). We then compare SiLA with RELIEF, a well-known feature re-weighting algorithm. It has recently been suggested by Sun and Wu that RELIEF can be seen as a distance metric learning algorithm optimizing a cost function which is an approximation of the 0-1 loss. We show here that this approximation is loose, and propose a stricter version closer to the 0-1 loss, leading to a new, and better, RELIEF-based algorithm for classification. We then focus on a direct extension of the cosine similarity measure, defined as a normalized scalar product in a projected space. The associated algorithm is called the generalized Cosine simiLarity Algorithm (gCosLA). All of the algorithms are tested on many different datasets. A statistical test, the s-test, is employed to assess whether the results are significantly different. gCosLA performed statistically much better than SiLA on many of the datasets. Furthermore, SiLA and gCosLA were compared with many state-of-the-art algorithms, illustrating their well-foundedness.
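The "normalized scalar product in a projected space" behind gCosLA can be sketched as below. This is a sketch of the general form only, under the assumption that the projection is a linear map A; in the thesis A is learned from labeled data, whereas here it is a hand-picked placeholder.

```python
import numpy as np

def generalized_cosine(x, y, A):
    """Cosine similarity computed in the space projected by A:
    sim_A(x, y) = <Ax, Ay> / (||Ax|| * ||Ay||).
    With A equal to the identity matrix, this reduces to the
    standard cosine measure."""
    px, py = A @ x, A @ y
    return float((px @ py) / (np.linalg.norm(px) * np.linalg.norm(py)))
```

The point of learning A rather than fixing it is that the projection stretches discriminative directions and shrinks irrelevant ones, so two vectors that look dissimilar under plain cosine can become close (or vice versa) in the projected space.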
Conference proceedings on the topic "Generalized cosine similarity"
Kauderer, M. H. "Three-dimensional Fourier optics: the linear scalar transfer function." In OSA Annual Meeting. Washington, D.C.: Optica Publishing Group, 1988. http://dx.doi.org/10.1364/oam.1988.tuu3.