Academic literature on the topic 'Visual and auditory languages'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Visual and auditory languages.'
Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.
Journal articles on the topic "Visual and auditory languages"
BURNHAM, DENIS, BENJAWAN KASISOPA, AMANDA REID, SUDAPORN LUKSANEEYANAWIN, FRANCISCO LACERDA, VIRGINIA ATTINA, NAN XU RATTANASONE, IRIS-CORINNA SCHWARZ, and DIANE WEBSTER. "Universality and language-specific experience in the perception of lexical tone and pitch." Applied Psycholinguistics 36, no. 6 (November 21, 2014): 1459–91. http://dx.doi.org/10.1017/s0142716414000496.
VÉLEZ-URIBE, IDALY, and MÓNICA ROSSELLI. "The auditory and visual appraisal of emotion-related words in Spanish–English bilinguals." Bilingualism: Language and Cognition 22, no. 1 (October 5, 2017): 30–46. http://dx.doi.org/10.1017/s1366728917000517.
Lu, Youtao, and James L. Morgan. "Homophone auditory processing in cross-linguistic perspective." Proceedings of the Linguistic Society of America 5, no. 1 (March 23, 2020): 529. http://dx.doi.org/10.3765/plsa.v5i1.4733.
Brookshire, Geoffrey, Jenny Lu, Howard C. Nusbaum, Susan Goldin-Meadow, and Daniel Casasanto. "Visual cortex entrains to sign language." Proceedings of the National Academy of Sciences 114, no. 24 (May 30, 2017): 6352–57. http://dx.doi.org/10.1073/pnas.1620350114.
Kubicek, Claudia, Anne Hillairet de Boisferon, Eve Dupierrix, Hélène Lœvenbruck, Judit Gervain, and Gudrun Schwarzer. "Face-scanning behavior to silently-talking faces in 12-month-old infants: The impact of pre-exposed auditory speech." International Journal of Behavioral Development 37, no. 2 (February 25, 2013): 106–10. http://dx.doi.org/10.1177/0165025412473016.
de la Cruz-Pavía, Irene, Janet F. Werker, Eric Vatikiotis-Bateson, and Judit Gervain. "Finding Phrases: The Interplay of Word Frequency, Phrasal Prosody and Co-speech Visual Information in Chunking Speech by Monolingual and Bilingual Adults." Language and Speech 63, no. 2 (April 19, 2019): 264–91. http://dx.doi.org/10.1177/0023830919842353.
Newman-Norlund, Roger D., Scott H. Frey, Laura-Ann Petitto, and Scott T. Grafton. "Anatomical Substrates of Visual and Auditory Miniature Second-language Learning." Journal of Cognitive Neuroscience 18, no. 12 (December 2006): 1984–97. http://dx.doi.org/10.1162/jocn.2006.18.12.1984.
Storms, Russell L., and Michael J. Zyda. "Interactions in Perceived Quality of Auditory-Visual Displays." Presence: Teleoperators and Virtual Environments 9, no. 6 (December 2000): 557–80. http://dx.doi.org/10.1162/105474600300040385.
Hasenäcker, Jana, Luianta Verra, and Sascha Schroeder. "Comparing length and frequency effects in children across modalities." Quarterly Journal of Experimental Psychology 72, no. 7 (October 20, 2018): 1682–91. http://dx.doi.org/10.1177/1747021818805063.
Lallier, Marie, Nicola Molinaro, Mikel Lizarazu, Mathieu Bourguignon, and Manuel Carreiras. "Amodal Atypical Neural Oscillatory Activity in Dyslexia." Clinical Psychological Science 5, no. 2 (December 21, 2016): 379–401. http://dx.doi.org/10.1177/2167702616670119.
Dissertations / Theses on the topic "Visual and auditory languages"
Spencer, Dawna. "Visual and auditory metalinguistic methods for Spanish second language acquisition." Connect online, 2008. http://library2.up.edu/theses/2008_spencerd.pdf.
Erdener, Vahit Dogu. "The effect of auditory, visual and orthographic information on second language acquisition." Thesis, School of Psychology, College of Arts, Education and Social Sciences, University of Western Sydney, 2002. THESIS_CAESS_PSY_Erdener_V.xml. http://handle.uws.edu.au:8081/1959.7/685.
Full textMaster of Arts (Hons)
Erdener, Vahit Doğu. "The effect of auditory, visual and orthographic information on second language acquisition /." View thesis View thesis, 2002. http://library.uws.edu.au/adt-NUWS/public/adt-NUWS20030408.114825/index.html.
"A thesis submitted in partial fulfillment of the requirements for the degree of Masters of Arts (Honours), MARCS Auditory Laboratories & School of Psychology, University of Western Sydney, May 2002" Bibliography: leaves 83-93.
Nácar, García Loreto 1988. "Language acquisition in bilingual infants : Early language discrimination in the auditory and visual domains." Doctoral thesis, Universitat Pompeu Fabra, 2016. http://hdl.handle.net/10803/511361.
Language acquisition is a fundamental piece of cognitive development during the first year of life. A key difference between infants growing up in monolingual and in bilingual environments is that the latter need to discriminate between two linguistic systems from very early in life. In order to learn two languages, bilingual infants must perceive the regularities of each of their languages while keeping them apart. In this thesis we explore the differences between monolingual and bilingual infants both in their early discrimination abilities and in the strategies each group develops as it adapts to its linguistic environment. In the second chapter, we examine the ability of bilingual and monolingual 4-month-old infants to discriminate the native/dominant language from a foreign one in the auditory domain. Our results show that, in this context, monolingual and bilingual infants display different auditory responses when listening to their native language. The results indicate that discriminating the native language carries a greater cognitive cost for bilingual infants than for monolingual ones when only auditory information is available. In chapter 3, we explore the ability of monolingual and bilingual 8-month-old infants to discriminate languages in the visual domain. Here, we presented videos of two different sign languages to infants who had never been exposed to sign language and measured their discrimination abilities using a habituation paradigm. The results show that at this age only bilingual infants are able to make the distinction, and suggest that to do so they exploit the information coming from the signer's face.
Greenwood, Toni Elspeth. "Auditory language comprehension, and sequential interference in working memory following sustained visual attention /." Title page, contents and abstract only, 2001. http://web4.library.adelaide.edu.au/theses/09ARPS/09arpsg8166.pdf.
Wroblewski, Marcin. "Developmental predictors of auditory-visual integration of speech in reverberation and noise." Diss., University of Iowa, 2017. https://ir.uiowa.edu/etd/6017.
Rybarczyk, Aubrey Rachel. "Weighting of Visual and Auditory Stimuli in Children with Autism Spectrum Disorders." The Ohio State University, 2016. http://rave.ohiolink.edu/etdc/view?acc_num=osu1459977848.
Bosworth, Rain G. "Psychophysical investigation of visual perception in deaf and hearing adults : effects of auditory deprivation and sign language experience /." Diss., Connect to a 24 p. preview or request complete full text in PDF format. Access restricted to UC IP addresses, 2001. http://wwwlib.umi.com/cr/ucsd/fullcit?p3015850.
Pénicaud, Sidonie. "Insights about age of language exposure and brain development : a voxel-based morphometry approach." Thesis, McGill University, 2009. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=111591.
Lima, Fernanda Leitão de Castro Nunes de [UNESP]. "Julgamento perceptivo-auditivo e perceptivo-visual das produções gradientes de fricativas coronais surdas." Universidade Estadual Paulista (UNESP), 2018. http://hdl.handle.net/11449/154302.
Full textApproved for entry into archive by Satie Tagara (satie@marilia.unesp.br) on 2018-06-19T14:10:24Z (GMT) No. of bitstreams: 1 lima_flcn_me_mar.pdf: 1310670 bytes, checksum: ab7f761d3d1be439f987de5d800203cd (MD5)
Made available in DSpace on 2018-06-19T14:10:24Z (GMT). No. of bitstreams: 1 lima_flcn_me_mar.pdf: 1310670 bytes, checksum: ab7f761d3d1be439f987de5d800203cd (MD5) Previous issue date: 2018-05-22
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Purpose: The purpose of this study was to analyze the percentage of judges' answers in the auditory-perceptual judgment of audio recordings and in the visual-perceptual judgment of ultrasound images for detecting gradient productions of the voiceless coronal fricatives, and to verify whether these two forms of judgment differ and whether they correlate. Methods: Twenty judges were selected who had knowledge of the speech production process and of the phonetic classification and description of the different Brazilian Portuguese (BP) phonemes. The judged stimuli were collected from a database of audio and video files (ultrasound images) of the words "sapo" (frog) and "chave" (key) produced by 11 BP-speaking children aged 6 to 12 years (9 boys and 2 girls) with atypical speech production. The collected files were encoded beforehand. After prior instruction, the judges had to choose, immediately upon presentation of a stimulus, one of three options arranged on the computer screen. The experimental procedure consisted of judging the audio files and judging the ultrasound images, run in the PERCEVAL software. For the audio files the options were: correct, incorrect, or gradient production; for the ultrasound images the options were: production of [s], production of [∫], or undifferentiated production. Presentation time, randomized stimulus selection, and reaction time were controlled automatically by the PERCEVAL software. The data were submitted to statistical analysis. Results: The judgment of images yielded greater identification of gradient stimuli (137 stimuli) and a shorter reaction time (mean = 1073.12 ms) compared to the auditory-perceptual judgment (80 stimuli, mean reaction time = 3126.26 ms), both differences being statistically significant (p < 0.00). 
Spearman's correlation test showed no statistical significance for percentage of responses or for reaction time. Conclusion: Judgment of ultrasound images is the more sensitive method for detecting gradient production in speech and can be used as a complement to auditory-perceptual judgment in speech analysis.
Books on the topic "Visual and auditory languages"
Teaching writing to visual, auditory, and kinesthetic learners. Thousand Oaks: Corwin Press, 2006.
Jabr, Yaḥyá ʻAbd al-Raʼūf. al-Lughah wa-al-ḥawāss. Nābulus: [s.n.], 1999.
Ando, Yoichi. Auditory and Visual Sensations. New York, NY: Springer New York, 2010. http://dx.doi.org/10.1007/b13253.
SpringerLink (Online service), ed. Auditory and Visual Sensations. New York, NY: Springer-Verlag New York, 2009.
Press, Leonard J. Parallels between auditory & visual processing. Santa Ana, CA: Optometric Extension Program Foundation, 2012.
Storms, Russell L. Auditory-visual cross-modal perception phenomena. Monterey, Calif: Naval Postgraduate School, 1998.
Chang, Shi-Kuo, Tadao Ichikawa, and Panos A. Ligomenides, eds. Visual Languages. Boston, MA: Springer US, 1987. http://dx.doi.org/10.1007/978-1-4613-1805-7.
Chang, S. K. (1944-), Tadao Ichikawa, and Panos A. Ligomenides, eds. Visual languages. New York: Plenum Press, 1986.
Chang, Shi-Kuo. Visual Languages. Boston, MA: Springer US, 1987.
Evamy, Barbara. Auditory & visual discrimination exercises: A teacher's aid. [Great Britain]: B. Evamy, 2003.
Book chapters on the topic "Visual and auditory languages"
Stokoe, William C. "Visual and Auditory Orientations to Language learning." In Scientific and Humanistic Dimensions of Language, 315. Amsterdam: John Benjamins Publishing Company, 1985. http://dx.doi.org/10.1075/z.22.42sto.
Ussishkin, Adam, and Alina Twist. "Auditory and visual lexical decision in Maltese." In Studies in Language Companion Series, 233–49. Amsterdam: John Benjamins Publishing Company, 2009. http://dx.doi.org/10.1075/slcs.113.16uss.
Kretschmer, Laura W., and Richard R. Kretschmer. "Intervention for Children with Auditory or Visual Sensory Impairments." In The Handbook of Language and Speech Disorders, 57–98. Oxford, UK: Wiley-Blackwell, 2010. http://dx.doi.org/10.1002/9781444318975.ch3.
Burnham, Denis, and Barbara Dodd. "Auditory-Visual Speech Perception as a Direct Process: The McGurk Effect in Infants and Across Languages." In Speechreading by Humans and Machines, 103–14. Berlin, Heidelberg: Springer Berlin Heidelberg, 1996. http://dx.doi.org/10.1007/978-3-662-13015-5_7.
Robert-Ribes, Jordi, Jean-Luc Schwartz, and Pierre Escudier. "A Comparison of Models for Fusion of the Auditory and Visual Sensors in Speech Perception." In Integration of Natural Language and Vision Processing, 81–104. Dordrecht: Springer Netherlands, 1995. http://dx.doi.org/10.1007/978-94-009-1639-5_7.
Potapova, Rodmonga, and Vsevolod Potapov. "Auditory and Visual Recognition of Emotional Behaviour of Foreign Language Subjects (by Native and Non-native Speakers)." In Speech and Computer, 62–69. Cham: Springer International Publishing, 2013. http://dx.doi.org/10.1007/978-3-319-01931-4_9.
Xu, Li, and Ning Zhou. "Tonal Languages and Cochlear Implants." In Auditory Prostheses, 341–64. New York, NY: Springer New York, 2011. http://dx.doi.org/10.1007/978-1-4419-9434-9_14.
Chang, Shi-Kuo. "Introduction: Visual Languages and Iconic Languages." In Visual Languages, 1–7. Boston, MA: Springer US, 1986. http://dx.doi.org/10.1007/978-1-4613-1805-7_1.
Hirakawa, Masahito, Noriaki Monden, Iwao Yoshimoto, Minoru Tanaka, and Tadao Ichikawa. "Hi-Visual." In Visual Languages, 233–59. Boston, MA: Springer US, 1986. http://dx.doi.org/10.1007/978-1-4613-1805-7_10.
Vatikiotis-Bateson, Eric, and Kevin G. Munhall. "Auditory-Visual Speech Processing." In The Handbook of Speech Production, 178–99. Hoboken, NJ: John Wiley & Sons, Inc, 2015. http://dx.doi.org/10.1002/9781118584156.ch9.
Conference papers on the topic "Visual and auditory languages"
Gable, Thomas M., Brianna Tomlinson, Stanley Cantrell, and Bruce N. Walker. "Spindex and Spearcons in Mandarin: Auditory Menu Enhancements Successful in A Tonal Language." In The 23rd International Conference on Auditory Display. Arlington, Virginia: The International Community for Auditory Display, 2017. http://dx.doi.org/10.21785/icad2017.025.
Stenger, I., and T. Avgustinova. "VISUAL VS. AUDITORY PERCEPTION OF BULGARIAN STIMULI BY RUSSIAN NATIVE SPEAKERS." In International Conference on Computational Linguistics and Intellectual Technologies "Dialogue". Russian State University for the Humanities, 2020. http://dx.doi.org/10.28995/2075-7182-2020-19-684-695.
Massaro, Dominic W., and Michael M. Cohen. "Auditory/visual speech in multimodal human interfaces." In 3rd International Conference on Spoken Language Processing (ICSLP 1994). ISCA: ISCA, 1994. http://dx.doi.org/10.21437/icslp.1994-135.
Mixdorff, Hansjörg, Angelika Hönemann, Albert Rilliard, Tan Lee, and Matthew Ma. "Cross-Language Perception of Audio-visual Attitudinal Expressions." In The 14th International Conference on Auditory-Visual Speech Processing. ISCA: ISCA, 2017. http://dx.doi.org/10.21437/avsp.2017-23.
Cruz, Marisa, Marc Swerts, and Sónia Frota. "Do visual cues to interrogativity vary between language modalities? Evidence from spoken Portuguese and Portuguese Sign Language." In The 15th International Conference on Auditory-Visual Speech Processing. ISCA: ISCA, 2019. http://dx.doi.org/10.21437/avsp.2019-1.
Öster, Anne-Marie. "Spoken L2 teaching with contrastive visual and auditory feedback." In 5th International Conference on Spoken Language Processing (ICSLP 1998). ISCA: ISCA, 1998. http://dx.doi.org/10.21437/icslp.1998-765.
Nunnemann, Eva Maria, Kirsten Bergmann, Helene Kreysa, and Pia Knoeferle. "Referential Gaze Makes a Difference in Spoken Language Comprehension: Human Speaker vs. Virtual Agent Listener Gaze." In The 14th International Conference on Auditory-Visual Speech Processing. ISCA: ISCA, 2017. http://dx.doi.org/10.21437/avsp.2017-4.
Sanjanaashree P, Anand Kumar M, and Soman K.P. "Language learning for visual and auditory learners using scratch toolkit." In 2014 International Conference on Computer Communication and Informatics (ICCCI). IEEE, 2014. http://dx.doi.org/10.1109/iccci.2014.6921765.
Sekiyama, Kaoru, and Yoichi Sugita. "Auditory-visual speech perception examined by brain imaging and reaction time." In 7th International Conference on Spoken Language Processing (ICSLP 2002). ISCA: ISCA, 2002. http://dx.doi.org/10.21437/icslp.2002-428.
Nouza, Jan. "Computer-aided spoken-language training with enhanced visual and auditory feedback." In 6th European Conference on Speech Communication and Technology (Eurospeech 1999). ISCA: ISCA, 1999. http://dx.doi.org/10.21437/eurospeech.1999-49.
Reports on the topic "Visual and auditory languages"
Yu, Wanchi. Implicit Learning of Children with and without Developmental Language Disorder across Auditory and Visual Categories. Portland State University Library, January 2000. http://dx.doi.org/10.15760/etd.7460.
Visram, Anisa, Iain Jackson, Ibrahim Almufarrij, Michael Stone, and Kevin Munro. Comparing visual reinforcement audiometry outcomes using different auditory stimuli and visual rewards. INPLASY - International Platform of Registered Systematic Review and Meta-analysis Protocols, January 2021. http://dx.doi.org/10.37766/inplasy2021.1.0080.
Richardson, James. Auditory and Visual Sensory Stores: a Recognition Task. Portland State University Library, January 2000. http://dx.doi.org/10.15760/etd.1557.
Driesen, Jacob. Differential Effects of Visual and Auditory Presentation on Logical Reasoning. Portland State University Library, January 2000. http://dx.doi.org/10.15760/etd.2546.
Brady-Herbst, Brenene. An Analysis of Spondee Recognition Thresholds in Auditory-only and Audio-visual Conditions. Portland State University Library, January 2000. http://dx.doi.org/10.15760/etd.7094.
Harsh, John R. Auditory and Visual Evoked Potentials as a Function of Sleep Deprivation and Irregular Sleep. Fort Belvoir, VA: Defense Technical Information Center, August 1989. http://dx.doi.org/10.21236/ada228488.
Jokeit, H., R. Goertzl, E. Kuchleri, and S. Makeig. Event-Related Changes in the 40 Hz Electroencephalogram in Auditory and Visual Reaction Time Tasks. Fort Belvoir, VA: Defense Technical Information Center, January 1994. http://dx.doi.org/10.21236/ada379543.
Davis, Bradley M. Effects of Visual, Auditory, and Tactile Navigation Cues on Navigation Performance, Situation Awareness, and Mental Workload. Fort Belvoir, VA: Defense Technical Information Center, February 2007. http://dx.doi.org/10.21236/ada463244.
Yatsymirska, Mariya. SOCIAL EXPRESSION IN MULTIMEDIA TEXTS. Ivan Franko National University of Lviv, February 2021. http://dx.doi.org/10.30970/vjo.2021.49.11072.
Beiker, Sven, ed. Unsettled Issues Regarding Visual Communication Between Automated Vehicles and Other Road Users. SAE International, July 2021. http://dx.doi.org/10.4271/epr2021016.