A selection of scholarly literature on the topic "Dirichlet allocation"
Create a reference in APA, MLA, Chicago, Harvard, and other styles
Consult the lists of relevant articles, books, theses, conference papers, and other scholarly sources on the topic "Dirichlet allocation".
Next to every work in the list of references there is an "Add to bibliography" button. Use it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the scholarly publication in .pdf format and read its abstract online, whenever these are available in the metadata.
Journal articles on the topic "Dirichlet allocation"
Du, Lan, Wray Buntine, Huidong Jin, and Changyou Chen. "Sequential latent Dirichlet allocation." Knowledge and Information Systems 31, no. 3 (June 10, 2011): 475–503. http://dx.doi.org/10.1007/s10115-011-0425-1.
Schwarz, Carlo. "Ldagibbs: A Command for Topic Modeling in Stata Using Latent Dirichlet Allocation." Stata Journal: Promoting communications on statistics and Stata 18, no. 1 (March 2018): 101–17. http://dx.doi.org/10.1177/1536867x1801800107.
Yoshida, Takahiro, Ryohei Hisano, and Takaaki Ohnishi. "Gaussian hierarchical latent Dirichlet allocation: Bringing polysemy back." PLOS ONE 18, no. 7 (July 12, 2023): e0288274. http://dx.doi.org/10.1371/journal.pone.0288274.
Archambeau, Cedric, Balaji Lakshminarayanan, and Guillaume Bouchard. "Latent IBP Compound Dirichlet Allocation." IEEE Transactions on Pattern Analysis and Machine Intelligence 37, no. 2 (February 2015): 321–33. http://dx.doi.org/10.1109/tpami.2014.2313122.
Pion-Tonachini, Luca, Scott Makeig, and Ken Kreutz-Delgado. "Crowd labeling latent Dirichlet allocation." Knowledge and Information Systems 53, no. 3 (April 19, 2017): 749–65. http://dx.doi.org/10.1007/s10115-017-1053-1.
S.S., Ramyadharshni, and Pabitha Dr.P. "Topic Categorization on Social Network Using Latent Dirichlet Allocation." Bonfring International Journal of Software Engineering and Soft Computing 8, no. 2 (April 30, 2018): 16–20. http://dx.doi.org/10.9756/bijsesc.8390.
Syed, Shaheen, and Marco Spruit. "Exploring Symmetrical and Asymmetrical Dirichlet Priors for Latent Dirichlet Allocation." International Journal of Semantic Computing 12, no. 03 (September 2018): 399–423. http://dx.doi.org/10.1142/s1793351x18400184.
Li, Gen, and Hazri Jamil. "Teacher professional learning community and interdisciplinary collaborative teaching path under the informationization basic education model." Yugoslav Journal of Operations Research, no. 00 (2024): 29. http://dx.doi.org/10.2298/yjor2403029l.
Garg, Mohit, and Priya Rangra. "Bibliometric Analysis of Latent Dirichlet Allocation." DESIDOC Journal of Library & Information Technology 42, no. 2 (February 28, 2022): 105–13. http://dx.doi.org/10.14429/djlit.42.2.17307.
Chauhan, Uttam, and Apurva Shah. "Topic Modeling Using Latent Dirichlet allocation." ACM Computing Surveys 54, no. 7 (September 30, 2022): 1–35. http://dx.doi.org/10.1145/3462478.
Dissertations and theses on the topic "Dirichlet allocation"
Ponweiser, Martin. "Latent Dirichlet Allocation in R." WU Vienna University of Economics and Business, 2012. http://epub.wu.ac.at/3558/1/main.pdf.
Series: Theses / Institute for Statistics and Mathematics
Arnekvist, Isac, and Ludvig Ericson. "Finding competitors using Latent Dirichlet Allocation." Thesis, KTH, Skolan för datavetenskap och kommunikation (CSC), 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-186386.
There is interest in being able to identify business competitors, but this is becoming increasingly difficult in an ever-growing and increasingly global market. The purpose of this report is to investigate whether Latent Dirichlet Allocation (LDA) can be used to identify and rank competitors by comparing the distances between the LDA representations of their company descriptions. The effectiveness of LDA for this purpose was compared with that of bag-of-words and random ordering, using several common information-theoretic measures. Several distance measures were evaluated to determine which best places competing companies close to each other; cosine similarity was found to outperform the other distance measures. While both LDA and bag-of-words proved significantly better than random ordering, LDA performed qualitatively worse than bag-of-words. Computing the distance measures was, however, considerably faster with LDA representations. Converting web content into LDA representations does capture certain unspecific similarities that do not necessarily indicate competitors, so it may be advantageous to combine LDA representations with an additional data source and/or heuristic.
Choubey, Rahul. "Tag recommendation using Latent Dirichlet Allocation." Thesis, Kansas State University, 2011. http://hdl.handle.net/2097/9785.
Department of Computing and Information Sciences
Doina Caragea
The vast amount of data present on the internet calls for ways to label and organize this data according to specific categories, in order to facilitate search and browsing activities. This can be easily accomplished by making use of folksonomies and user-provided tags. However, it can be difficult for users to provide meaningful tags. Tag recommendation systems can guide users towards informative tags for online resources such as websites, pictures, etc. The aim of this thesis is to build a system for recommending tags to URLs available through a bookmark sharing service, called BibSonomy. We assume that the URLs for which we recommend tags do not have any prior tags assigned to them. Two approaches are proposed to address the tagging problem, both of them based on Latent Dirichlet Allocation (LDA) Blei et al. [2003]. LDA is a generative probabilistic topic model which aims to infer the hidden topical structure in a collection of documents. According to LDA, documents can be seen as mixtures of topics, while topics can be seen as mixtures of words (in our case, tags). The first approach that we propose, called the topic words based approach, recommends the top words in the top topics representing a resource as tags for that particular resource. The second approach, called the topic distance based approach, uses the tags of the most similar training resources (identified using the KL-divergence, Kullback and Leibler [1951]) to recommend tags for a test untagged resource. The dataset used in this work was made available through the ECML/PKDD Discovery Challenge 2009. We construct the documents that are provided as input to LDA in two ways, thus producing two different datasets. In the first dataset, we use only the description and the tags (when available) corresponding to a URL. In the second dataset, we crawl the URL content and use it to construct the document.
Experimental results show that the LDA approach is not very effective at recommending tags for new untagged resources. However, using the resource content gives better results than using the description only. Furthermore, the topic distance based approach is better than the topic words based approach, when only the descriptions are used to construct documents, while the topic words based approach works better when the contents are used to construct documents.
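The topic distance based approach described in this abstract can be sketched minimally. The snippet below is an illustrative reconstruction, not the thesis code: it assumes document-topic distributions have already been inferred by LDA, and the resource names, tags, and vectors are invented toy data.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    # KL(p || q) between two topic distributions; smoothing avoids log(0).
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log(p / q)))

def recommend_tags(test_vec, train_vecs, train_tags, n=1):
    # Topic distance based approach: recommend the tags of the n
    # training resources closest to the test resource in KL divergence.
    nearest = sorted(train_vecs,
                     key=lambda r: kl_divergence(test_vec, train_vecs[r]))
    tags = []
    for resource in nearest[:n]:
        tags.extend(train_tags[resource])
    return tags

# Hypothetical topic distributions and tags for two training resources.
train_vecs = {"url_1": [0.8, 0.1, 0.1], "url_2": [0.1, 0.1, 0.8]}
train_tags = {"url_1": ["topic-modeling"], "url_2": ["cooking"]}
print(recommend_tags([0.7, 0.2, 0.1], train_vecs, train_tags))
```

For the topic words based approach, one would instead read the highest-probability words of the test resource's dominant topics directly from the fitted topic-word matrix.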
Risch, Johan. "Detecting Twitter topics using Latent Dirichlet Allocation." Thesis, Uppsala universitet, Institutionen för informationsteknologi, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-277260.
Liu, Zelong. "High performance latent dirichlet allocation for text mining." Thesis, Brunel University, 2013. http://bura.brunel.ac.uk/handle/2438/7726.
Kulhanek, Raymond Daniel. "A Latent Dirichlet Allocation/N-gram Composite Language Model." Wright State University / OhioLINK, 2013. http://rave.ohiolink.edu/etdc/view?acc_num=wright1379520876.
Anaya, Leticia H. "Comparing Latent Dirichlet Allocation and Latent Semantic Analysis as Classifiers." Thesis, University of North Texas, 2011. https://digital.library.unt.edu/ark:/67531/metadc103284/.
Jaradat, Shatha. "OLLDA: Dynamic and Scalable Topic Modelling for Twitter : AN ONLINE SUPERVISED LATENT DIRICHLET ALLOCATION ALGORITHM." Thesis, KTH, Skolan för informations- och kommunikationsteknik (ICT), 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-177535.
Providing high-quality topic inference in today's large and dynamic corpora, such as Twitter, is a challenging task. This is especially so given that the content in this environment consists of short texts with many abbreviations. This project proposes an enhancement of a popular online topic-modelling algorithm for Latent Dirichlet Allocation (LDA) by incorporating supervision, to make it suitable for the Twitter context. The enhancement is motivated by the need for a single algorithm that achieves two goals: analysing large amounts of documents, including new documents arriving in a stream, while maintaining high-quality topic discovery in specialized environments such as Twitter. The proposed algorithm combines an online algorithm for LDA with a supervised variant of LDA, Labeled LDA. The performance and quality of the proposed algorithm are compared with those of these two algorithms. The results show that the proposed algorithm outperforms the supervised variant of LDA in both performance and quality, and achieves better quality than the online algorithm. These improvements make our algorithm an attractive option for dynamic environments such as Twitter. An environment for analysing and labelling data was designed to prepare the dataset before running the experiments. Possible applications of the proposed algorithm include tweet recommendation and trend detection.
Yalamanchili, Hima Bindu. "A Novel Approach For Cancer Characterization Using Latent Dirichlet Allocation and Disease-Specific Genomic Analysis." Wright State University / OhioLINK, 2018. http://rave.ohiolink.edu/etdc/view?acc_num=wright1527600876174758.
Sheikha, Hassan. "Text mining Twitter social media for Covid-19 : Comparing latent semantic analysis and latent Dirichlet allocation." Thesis, Högskolan i Gävle, Avdelningen för datavetenskap och samhällsbyggnad, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:hig:diva-32567.
Books on the topic "Dirichlet allocation"
Shi, Feng. Learn About Latent Dirichlet Allocation in R With Data From the News Articles Dataset (2016). 1 Oliver's Yard, 55 City Road, London EC1Y 1SP United Kingdom: SAGE Publications, Ltd., 2019. http://dx.doi.org/10.4135/9781526495693.
Shi, Feng. Learn About Latent Dirichlet Allocation in Python With Data From the News Articles Dataset (2016). 1 Oliver's Yard, 55 City Road, London EC1Y 1SP United Kingdom: SAGE Publications, Ltd., 2019. http://dx.doi.org/10.4135/9781526497727.
Augmenting Latent Dirichlet Allocation and Rank Threshold Detection with Ontologies. CreateSpace Independent Publishing Platform, 2014.
Jockers, Matthew L. Theme. University of Illinois Press, 2017. http://dx.doi.org/10.5406/illinois/9780252037528.003.0008.
Book chapters on the topic "Dirichlet allocation"
Li, Hang. "Latent Dirichlet Allocation." In Machine Learning Methods, 439–71. Singapore: Springer Nature Singapore, 2023. http://dx.doi.org/10.1007/978-981-99-3917-6_20.
Tang, Yi-Kun, Xian-Ling Mao, and Heyan Huang. "Labeled Phrase Latent Dirichlet Allocation." In Web Information Systems Engineering – WISE 2016, 525–36. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-48740-3_39.
Moon, Gordon E., Israt Nisa, Aravind Sukumaran-Rajam, Bortik Bandyopadhyay, Srinivasan Parthasarathy, and P. Sadayappan. "Parallel Latent Dirichlet Allocation on GPUs." In Lecture Notes in Computer Science, 259–72. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-93701-4_20.
Calvo, Hiram, Ángel Hernández-Castañeda, and Jorge García-Flores. "Author Identification Using Latent Dirichlet Allocation." In Computational Linguistics and Intelligent Text Processing, 303–12. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-77116-8_22.
Hao, Jing, and Hongxi Wei. "Latent Dirichlet Allocation Based Image Retrieval." In Lecture Notes in Computer Science, 211–21. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-68699-8_17.
Maanicshah, Kamal, Manar Amayri, and Nizar Bouguila. "Interactive Generalized Dirichlet Mixture Allocation Model." In Lecture Notes in Computer Science, 33–42. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-23028-8_4.
Wheeler, Jordan M., Shiyu Wang, and Allan S. Cohen. "Latent Dirichlet Allocation of Constructed Responses." In The Routledge International Handbook of Automated Essay Evaluation, 535–55. New York: Routledge, 2024. http://dx.doi.org/10.4324/9781003397618-31.
Rus, Vasile, Nobal Niraula, and Rajendra Banjade. "Similarity Measures Based on Latent Dirichlet Allocation." In Computational Linguistics and Intelligent Text Processing, 459–70. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-37247-6_37.
Bíró, István, and Jácint Szabó. "Latent Dirichlet Allocation for Automatic Document Categorization." In Machine Learning and Knowledge Discovery in Databases, 430–41. Berlin, Heidelberg: Springer Berlin Heidelberg, 2009. http://dx.doi.org/10.1007/978-3-642-04174-7_28.
Lovato, Pietro, Manuele Bicego, Vittorio Murino, and Alessandro Perina. "Robust Initialization for Learning Latent Dirichlet Allocation." In Similarity-Based Pattern Recognition, 117–32. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-24261-3_10.
Conference papers on the topic "Dirichlet allocation"
Tahsin, Faiza, Hafsa Ennajari, and Nizar Bouguila. "Author Dirichlet Multinomial Allocation Model with Generalized Distribution (ADMAGD)." In 2024 International Symposium on Networks, Computers and Communications (ISNCC), 1–7. IEEE, 2024. http://dx.doi.org/10.1109/isncc62547.2024.10758998.
Koltcov, Sergei, Olessia Koltsova, and Sergey Nikolenko. "Latent dirichlet allocation." In the 2014 ACM conference. New York, New York, USA: ACM Press, 2014. http://dx.doi.org/10.1145/2615569.2615680.
Chien, Jen-Tzung, Chao-Hsi Lee, and Zheng-Hua Tan. "Dirichlet mixture allocation." In 2016 IEEE 26th International Workshop on Machine Learning for Signal Processing (MLSP). IEEE, 2016. http://dx.doi.org/10.1109/mlsp.2016.7738866.
Shen, Zhi-Yong, Jun Sun, and Yi-Dong Shen. "Collective Latent Dirichlet Allocation." In 2008 Eighth IEEE International Conference on Data Mining (ICDM). IEEE, 2008. http://dx.doi.org/10.1109/icdm.2008.75.
Li, Shuangyin, Guan Huang, Ruiyang Tan, and Rong Pan. "Tag-Weighted Dirichlet Allocation." In 2013 IEEE International Conference on Data Mining (ICDM). IEEE, 2013. http://dx.doi.org/10.1109/icdm.2013.11.
Hsin, Wei-Cheng, and Jen-Wei Huang. "Multi-dependent Latent Dirichlet Allocation." In 2017 Conference on Technologies and Applications of Artificial Intelligence (TAAI). IEEE, 2017. http://dx.doi.org/10.1109/taai.2017.51.
Krestel, Ralf, Peter Fankhauser, and Wolfgang Nejdl. "Latent dirichlet allocation for tag recommendation." In the third ACM conference. New York, New York, USA: ACM Press, 2009. http://dx.doi.org/10.1145/1639714.1639726.
Tan, Yimin, and Zhijian Ou. "Topic-weak-correlated Latent Dirichlet allocation." In 2010 7th International Symposium on Chinese Spoken Language Processing (ISCSLP). IEEE, 2010. http://dx.doi.org/10.1109/iscslp.2010.5684906.
Xiang, Yingzhuo, Dongmei Yang, and Jikun Yan. "The Auto Annotation Latent Dirichlet Allocation." In First International Conference on Information Sciences, Machinery, Materials and Energy. Paris, France: Atlantis Press, 2015. http://dx.doi.org/10.2991/icismme-15.2015.387.
Bhutada, Sunil, V. V. S. S. S. Balaram, and Vishnu Vardhan Bulusu. "Latent Dirichlet Allocation based multilevel classification." In 2014 International Conference on Control, Instrumentation, Communication and Computational Technologies (ICCICCT). IEEE, 2014. http://dx.doi.org/10.1109/iccicct.2014.6993109.
Reports of organizations on the topic "Dirichlet allocation"
Teh, Yee W., David Newman, and Max Welling. A Collapsed Variational Bayesian Inference Algorithm for Latent Dirichlet Allocation. Fort Belvoir, VA: Defense Technical Information Center, September 2007. http://dx.doi.org/10.21236/ada629956.
Antón Sarabia, Arturo, Santiago Bazdresch, and Alejandra Lelo-de-Larrea. The Influence of Central Bank's Projections and Economic Narrative on Professional Forecasters' Expectations: Evidence from Mexico. Banco de México, December 2023. http://dx.doi.org/10.36095/banxico/di.2023.21.
Moreno Pérez, Carlos, and Marco Minozzo. "Making Text Talk": The Minutes of the Central Bank of Brazil and the Real Economy. Madrid: Banco de España, November 2022. http://dx.doi.org/10.53479/23646.
Alonso-Robisco, Andrés, and José Manuel Carbó. Machine Learning methods in climate finance: a systematic review. Madrid: Banco de España, February 2023. http://dx.doi.org/10.53479/29594.