Academic literature on the topic 'Pointwise Mutual Information'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Pointwise Mutual Information.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Pointwise Mutual Information"

1. Aji, S. "Document Summarization Using Positive Pointwise Mutual Information." International Journal of Computer Science and Information Technology 4, no. 2 (April 30, 2012): 47–55. http://dx.doi.org/10.5121/ijcsit.2012.4204.
2. Takada, Teruko. "Mining local and tail dependence structures based on pointwise mutual information." Data Mining and Knowledge Discovery 24, no. 1 (May 6, 2011): 78–102. http://dx.doi.org/10.1007/s10618-011-0220-3.
3. Torun, Orhan, and Seniha Esen Yuksel. "Unsupervised segmentation of LiDAR fused hyperspectral imagery using pointwise mutual information." International Journal of Remote Sensing 42, no. 17 (June 23, 2021): 6461–76. http://dx.doi.org/10.1080/01431161.2021.1939906.
4. Finn, Conor, and Joseph Lizier. "Pointwise Partial Information Decomposition Using the Specificity and Ambiguity Lattices." Entropy 20, no. 4 (April 18, 2018): 297. http://dx.doi.org/10.3390/e20040297.
Abstract:
What are the distinct ways in which a set of predictor variables can provide information about a target variable? When does a variable provide unique information, when do variables share redundant information, and when do variables combine synergistically to provide complementary information? The redundancy lattice from the partial information decomposition of Williams and Beer provided a promising glimpse at the answer to these questions. However, this structure was constructed using a much criticised measure of redundant information, and despite sustained research, no completely satisfactory replacement measure has been proposed. In this paper, we take a different approach, applying the axiomatic derivation of the redundancy lattice to a single realisation from a set of discrete variables. To overcome the difficulty associated with signed pointwise mutual information, we apply this decomposition separately to the unsigned entropic components of pointwise mutual information which we refer to as the specificity and ambiguity. This yields a separate redundancy lattice for each component. Then based upon an operational interpretation of redundancy, we define measures of redundant specificity and ambiguity enabling us to evaluate the partial information atoms in each lattice. These atoms can be recombined to yield the sought-after multivariate information decomposition. We apply this framework to canonical examples from the literature and discuss the results and the various properties of the decomposition. In particular, the pointwise decomposition using specificity and ambiguity satisfies a chain rule over target variables, which provides new insights into the so-called two-bit-copy example.
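As background for the decomposition described above: the signed quantity in question is the pointwise mutual information of a single realisation, i(s; t) = log2 p(t|s)/p(t), which can be written as the difference of two non-negative surprisal terms. The sketch below uses an invented toy joint distribution, and its naming of the two terms only loosely follows the paper's conventions (the authors' exact definitions may differ).

```python
import math

# Toy joint distribution p(s, t) over two binary variables (invented values).
p_joint = {
    ('s0', 't0'): 0.4, ('s0', 't1'): 0.1,
    ('s1', 't0'): 0.1, ('s1', 't1'): 0.4,
}
p_s, p_t = {}, {}
for (s, t), p in p_joint.items():
    p_s[s] = p_s.get(s, 0.0) + p
    p_t[t] = p_t.get(t, 0.0) + p

def pointwise_components(s, t):
    """Return (pmi, specificity, ambiguity) for one realisation (s, t).

    pmi = h(t) - h(t|s) is signed; the two surprisal terms are each
    non-negative, which is what makes a separate lattice per term possible.
    """
    specificity = -math.log2(p_t[t])                   # h(t)
    ambiguity = -math.log2(p_joint[(s, t)] / p_s[s])   # h(t|s)
    return specificity - ambiguity, specificity, ambiguity

pmi, spec, amb = pointwise_components('s0', 't0')
# pmi > 0 here: observing s0 makes t0 more likely than its base rate.
```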
5. C N, Pushpa, Gerard Deepak, Mohammed Zakir, Thriveni J, and Venugopal K R. "Enhanced Neighborhood Normalized Pointwise Mutual Information Algorithm for Constraint Aware Data Clustering." ICTACT Journal on Soft Computing 6, no. 4 (July 1, 2016): 1287–92. http://dx.doi.org/10.21917/ijsc.2016.0176.
6. Recchia, Gabriel, and Michael N. Jones. "More data trumps smarter algorithms: Comparing pointwise mutual information with latent semantic analysis." Behavior Research Methods 41, no. 3 (August 2009): 647–56. http://dx.doi.org/10.3758/brm.41.3.647.
7. Takada, Teruko. "Erratum to: Mining local and tail dependence structures based on pointwise mutual information." Data Mining and Knowledge Discovery 26, no. 1 (October 14, 2011): 213–15. http://dx.doi.org/10.1007/s10618-011-0241-y.
8. Chennubhotla, S. Chakra, Daniel M. Spagnolo, Rekha Gyanchandani, Yousef Al-Kofahi, Andrew M. Stern, Timothy R. Lezon, Albert Gough, et al. "Pointwise mutual information quantifies intratumor heterogeneity in tissue sections labeled with multiple fluorescent biomarkers." Journal of Pathology Informatics 7, no. 1 (2016): 47. http://dx.doi.org/10.4103/2153-3539.194839.
9. Rahman, A. "Comparison Extraction Feature Using Double Propagation and Pointwise Mutual Information to Select a Product." IOP Conference Series: Materials Science and Engineering 407 (September 26, 2018): 012147. http://dx.doi.org/10.1088/1757-899x/407/1/012147.
10. Manivannan, P., and C. S. Kanimozhiselvi. "Pointwise Mutual Information Based Integral Classifier for Sentiment Analysis in Cross Domain Opinion Mining." Journal of Computational and Theoretical Nanoscience 14, no. 11 (November 1, 2017): 5435–43. http://dx.doi.org/10.1166/jctn.2017.6967.

Dissertations / Theses on the topic "Pointwise Mutual Information"

1. Jareš, Petr. "Rychlá adaptace počítačové podpory hry Krycí jména pro nové jazyky" [Rapid adaptation of computer support for the game Codenames to new languages]. Master's thesis, Vysoké učení technické v Brně, Fakulta informačních technologií, 2021. http://www.nusl.cz/ntk/nusl-445475.
Abstract:
This thesis extends an artificial-player system for the word-association game Codenames so that support for new languages can be added easily. The system can play Codenames as a guessing player, as a clue giver, or, by combining the two roles, as a player of the Duet version. Languages are analysed with the neural toolkit Stanza, which is language independent and enables automated processing of many languages; it is used mainly for lemmatization and part-of-speech tagging when selecting clues. Several models were evaluated for scoring word associations; the best results were achieved by the Pointwise Mutual Information method and the predictive fastText model. The system supports playing Codenames in 36 languages spanning 8 different alphabets.

Book chapters on the topic "Pointwise Mutual Information"

1. Isola, Phillip, Daniel Zoran, Dilip Krishnan, and Edward H. Adelson. "Crisp Boundary Detection Using Pointwise Mutual Information." In Computer Vision – ECCV 2014, 799–814. Cham: Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-10578-9_52.
2. Ding, Yuxin, and Shengli Yan. "Topic Optimization Method Based on Pointwise Mutual Information." In Neural Information Processing, 148–55. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-26555-1_17.
3. Su, Qi, Kun Xiang, Houfeng Wang, Bin Sun, and Shiwen Yu. "Using Pointwise Mutual Information to Identify Implicit Features in Customer Reviews." In Computer Processing of Oriental Languages. Beyond the Orient: The Research Challenges Ahead, 22–30. Berlin, Heidelberg: Springer Berlin Heidelberg, 2006. http://dx.doi.org/10.1007/11940098_3.
4. Schneider, Karl-Michael. "Weighted Average Pointwise Mutual Information for Feature Selection in Text Categorization." In Knowledge Discovery in Databases: PKDD 2005, 252–63. Berlin, Heidelberg: Springer Berlin Heidelberg, 2005. http://dx.doi.org/10.1007/11564126_27.
5. Wu, Lifang, Dan Wang, Cheng Guo, Jianan Zhang, and Chang Wen Chen. "User Profiling by Combining Topic Modeling and Pointwise Mutual Information (TM-PMI)." In MultiMedia Modeling, 152–61. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-27674-8_14.
6. Vishal, K., Gerard Deepak, and A. Santhanavijayan. "An Approach for Retrieval of Text Documents by Hybridizing Structural Topic Modeling and Pointwise Mutual Information." In Lecture Notes in Electrical Engineering, 969–77. Singapore: Springer Singapore, 2021. http://dx.doi.org/10.1007/978-981-16-0749-3_74.

Conference papers on the topic "Pointwise Mutual Information"

1. Takayama, Junya, and Yuki Arase. "Relevant and Informative Response Generation using Pointwise Mutual Information." In Proceedings of the First Workshop on NLP for Conversational AI. Stroudsburg, PA, USA: Association for Computational Linguistics, 2019. http://dx.doi.org/10.18653/v1/w19-4115.
2. Fang, Yiqiu, Chunjiang Li, and Junwei Ge. "Product Attribute Extraction Based on Affinity Propagation Clustering Algorithm and Pointwise Mutual Information Pruning." In 2019 International Conference on Artificial Intelligence and Advanced Manufacturing (AIAM). IEEE, 2019. http://dx.doi.org/10.1109/aiam48774.2019.00137.
3. Chen, Zhengrong, and Yang Hu. "Two-stage Photovoltaic Power Forecasting based on Extreme Learning Machine and Improved Pointwise Mutual Information." In 2019 IEEE PES Asia-Pacific Power and Energy Engineering Conference (APPEEC). IEEE, 2019. http://dx.doi.org/10.1109/appeec45492.2019.8994387.
4. Małyszko, Jacek, and Agata Filipowska. "Lexicon-free and context-free drug names identification methods using hidden Markov models and pointwise mutual information." In the ACM sixth international workshop. New York, New York, USA: ACM Press, 2012. http://dx.doi.org/10.1145/2390068.2390072.
5. Wang, Xueyujie. "Analysis of Sentence Boundary of the Host's Spoken Language Based on Semantic Orientation Pointwise Mutual Information Algorithm." In 2020 12th International Conference on Measuring Technology and Mechatronics Automation (ICMTMA). IEEE, 2020. http://dx.doi.org/10.1109/icmtma50254.2020.00114.
6. "Handling the Impact of Low Frequency Events on Co-occurrence Based Measures of Word Similarity - A Case Study of Pointwise Mutual Information." In International Conference on Knowledge Discovery and Information Retrieval. SciTePress - Science and Technology Publications, 2011. http://dx.doi.org/10.5220/0003655102260231.
7. Pesaranghader, Ahmad, Saravanan Muthaiyah, and Ali Pesaranghader. "Improving Gloss Vector Semantic Relatedness Measure by Integrating Pointwise Mutual Information: Optimizing Second-Order Co-occurrence Vectors Computed from Biomedical Corpus and UMLS." In 2013 International Conference on Informatics and Creative Multimedia (ICICM). IEEE, 2013. http://dx.doi.org/10.1109/icicm.2013.41.
8. Bai, Tian, Brian L. Egleston, Richard Bleicher, and Slobodan Vucetic. "Medical Concept Representation Learning from Multi-source Data." In Twenty-Eighth International Joint Conference on Artificial Intelligence {IJCAI-19}. California: International Joint Conferences on Artificial Intelligence Organization, 2019. http://dx.doi.org/10.24963/ijcai.2019/680.
Abstract:
Representing words as low dimensional vectors is very useful in many natural language processing tasks. This idea has been extended to medical domain where medical codes listed in medical claims are represented as vectors to facilitate exploratory analysis and predictive modeling. However, depending on a type of a medical provider, medical claims can use medical codes from different ontologies or from a combination of ontologies, which complicates learning of the representations. To be able to properly utilize such multi-source medical claim data, we propose an approach that represents medical codes from different ontologies in the same vector space. We first modify the Pointwise Mutual Information (PMI) measure of similarity between the codes. We then develop a new negative sampling method for word2vec model that implicitly factorizes the modified PMI matrix. The new approach was evaluated on the code cross-reference problem, which aims at identifying similar codes across different ontologies. In our experiments, we evaluated cross-referencing between ICD-9 and CPT medical code ontologies. Our results indicate that vector representations of codes learned by the proposed approach provide superior cross-referencing when compared to several existing approaches.
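For context on the factorization claim in this abstract: the corpus-level object being factorized is a PMI matrix built from co-occurrence counts. The sketch below shows plain (unmodified) PMI and its positive clipping, using invented toy counts rather than the authors' modified measure or multi-source claim data.

```python
import math

# Toy word-context co-occurrence counts (invented for illustration).
counts = {
    ('aspirin', 'headache'): 10, ('aspirin', 'fever'): 3,
    ('ibuprofen', 'headache'): 8, ('ibuprofen', 'fever'): 6,
}

total = sum(counts.values())
row, col = {}, {}  # marginal counts for words and contexts
for (w, c), n in counts.items():
    row[w] = row.get(w, 0) + n
    col[c] = col.get(c, 0) + n

def pmi(w, c):
    """PMI(w, c) = log p(w, c) / (p(w) p(c)).

    Word2vec with k negative samples implicitly factorizes a k-shifted
    version of this matrix (Levy & Goldberg, 2014).
    """
    n = counts.get((w, c), 0)
    if n == 0:
        return float('-inf')  # unseen pair
    return math.log(n * total / (row[w] * col[c]))

def ppmi(w, c):
    # Positive PMI clips negative (and -inf) cells to zero.
    return max(pmi(w, c), 0.0)
```

A frequently co-occurring pair such as ('aspirin', 'headache') gets positive PMI, while a pair seen less often than chance would predict is clipped to zero by PPMI.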