Selected scientific literature on the topic "Visual and auditory languages"
Cite a source in APA, MLA, Chicago, Harvard, and many other citation styles
Consult the list of current articles, books, theses, conference proceedings, and other scholarly sources on the topic "Visual and auditory languages".
Next to every source in the list of references there is an "Add to bibliography" button. Press it, and we will automatically generate the bibliographic citation of the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the scholarly publication as a .pdf file and read the abstract (summary) of the work online, if it is present in the metadata.
Journal articles on the topic "Visual and auditory languages"
Burnham, Denis, Benjawan Kasisopa, Amanda Reid, Sudaporn Luksaneeyanawin, Francisco Lacerda, Virginia Attina, Nan Xu Rattanasone, Iris-Corinna Schwarz, and Diane Webster. "Universality and language-specific experience in the perception of lexical tone and pitch". Applied Psycholinguistics 36, no. 6 (November 21, 2014): 1459–91. http://dx.doi.org/10.1017/s0142716414000496.
Vélez-Uribe, Idaly, and Mónica Rosselli. "The auditory and visual appraisal of emotion-related words in Spanish–English bilinguals". Bilingualism: Language and Cognition 22, no. 1 (October 5, 2017): 30–46. http://dx.doi.org/10.1017/s1366728917000517.
Lu, Youtao, and James L. Morgan. "Homophone auditory processing in cross-linguistic perspective". Proceedings of the Linguistic Society of America 5, no. 1 (March 23, 2020): 529. http://dx.doi.org/10.3765/plsa.v5i1.4733.
Brookshire, Geoffrey, Jenny Lu, Howard C. Nusbaum, Susan Goldin-Meadow, and Daniel Casasanto. "Visual cortex entrains to sign language". Proceedings of the National Academy of Sciences 114, no. 24 (May 30, 2017): 6352–57. http://dx.doi.org/10.1073/pnas.1620350114.
Kubicek, Claudia, Anne Hillairet de Boisferon, Eve Dupierrix, Hélène Lœvenbruck, Judit Gervain, and Gudrun Schwarzer. "Face-scanning behavior to silently-talking faces in 12-month-old infants: The impact of pre-exposed auditory speech". International Journal of Behavioral Development 37, no. 2 (February 25, 2013): 106–10. http://dx.doi.org/10.1177/0165025412473016.
de la Cruz-Pavía, Irene, Janet F. Werker, Eric Vatikiotis-Bateson, and Judit Gervain. "Finding Phrases: The Interplay of Word Frequency, Phrasal Prosody and Co-speech Visual Information in Chunking Speech by Monolingual and Bilingual Adults". Language and Speech 63, no. 2 (April 19, 2019): 264–91. http://dx.doi.org/10.1177/0023830919842353.
Newman-Norlund, Roger D., Scott H. Frey, Laura-Ann Petitto, and Scott T. Grafton. "Anatomical Substrates of Visual and Auditory Miniature Second-language Learning". Journal of Cognitive Neuroscience 18, no. 12 (December 2006): 1984–97. http://dx.doi.org/10.1162/jocn.2006.18.12.1984.
Storms, Russell L., and Michael J. Zyda. "Interactions in Perceived Quality of Auditory-Visual Displays". Presence: Teleoperators and Virtual Environments 9, no. 6 (December 2000): 557–80. http://dx.doi.org/10.1162/105474600300040385.
Hasenäcker, Jana, Luianta Verra, and Sascha Schroeder. "Comparing length and frequency effects in children across modalities". Quarterly Journal of Experimental Psychology 72, no. 7 (October 20, 2018): 1682–91. http://dx.doi.org/10.1177/1747021818805063.
Lallier, Marie, Nicola Molinaro, Mikel Lizarazu, Mathieu Bourguignon, and Manuel Carreiras. "Amodal Atypical Neural Oscillatory Activity in Dyslexia". Clinical Psychological Science 5, no. 2 (December 21, 2016): 379–401. http://dx.doi.org/10.1177/2167702616670119.
Theses on the topic "Visual and auditory languages"
Spencer, Dawna. "Visual and auditory metalinguistic methods for Spanish second language acquisition". Connect online, 2008. http://library2.up.edu/theses/2008_spencerd.pdf.
Erdener, Vahit Dogu, University of Western Sydney, College of Arts Education and Social Sciences, School of Psychology. "The effect of auditory, visual and orthographic information on second language acquisition". THESIS_CAESS_PSY_Erdener_V.xml, 2002. http://handle.uws.edu.au:8081/1959.7/685.
Master of Arts (Hons)
Erdener, Vahit Doğu. "The effect of auditory, visual and orthographic information on second language acquisition /". View thesis, 2002. http://library.uws.edu.au/adt-NUWS/public/adt-NUWS20030408.114825/index.html.
"A thesis submitted in partial fulfillment of the requirements for the degree of Masters of Arts (Honours), MARCS Auditory Laboratories & School of Psychology, University of Western Sydney, May 2002" Bibliography: leaves 83–93.
Nácar García, Loreto. "Language acquisition in bilingual infants: Early language discrimination in the auditory and visual domains". Doctoral thesis, Universitat Pompeu Fabra, 2016. http://hdl.handle.net/10803/511361.
Language acquisition is a fundamental piece of cognitive development during the first year of life. A key difference between infants growing up in monolingual and bilingual environments is that the latter need to discriminate between two linguistic systems from very early in life. In order to learn two languages, bilingual infants have to perceive the regularities of each of their languages while keeping them separate. In this thesis we explore the differences between monolingual and bilingual infants both in their early discrimination capacities and in the strategies each group develops as a consequence of adapting to its linguistic environment. In the second chapter, we examine the capacity of bilingual and monolingual infants at 4 months of age to discriminate the native/dominant language from a foreign one in the auditory domain. Our results show that, in this context, monolingual and bilingual infants present different auditory signals when listening to their native language. The results indicate that discriminating the native language carries a greater cognitive cost for bilingual infants than for monolingual infants when only auditory information is available. In chapter 3, we explore the abilities of monolingual and bilingual infants at 8 months of age to discriminate languages in the visual domain. Here, we showed infants who had never been exposed to sign language videos of two different sign languages and measured their discrimination abilities using a habituation paradigm. The results show that at this age only bilingual infants are able to make the distinction, and suggest that to do so they exploit the information coming from the signer's face.
Greenwood, Toni Elspeth. "Auditory language comprehension, and sequential interference in working memory following sustained visual attention /". Title page, contents and abstract only, 2001. http://web4.library.adelaide.edu.au/theses/09ARPS/09arpsg8166.pdf.
Wroblewski, Marcin. "Developmental predictors of auditory-visual integration of speech in reverberation and noise". Diss., University of Iowa, 2017. https://ir.uiowa.edu/etd/6017.
Rybarczyk, Aubrey Rachel. "Weighting of Visual and Auditory Stimuli in Children with Autism Spectrum Disorders". The Ohio State University, 2016. http://rave.ohiolink.edu/etdc/view?acc_num=osu1459977848.
Bosworth, Rain G. "Psychophysical investigation of visual perception in deaf and hearing adults: effects of auditory deprivation and sign language experience /". Diss., Connect to a 24 p. preview or request complete full text in PDF format. Access restricted to UC IP addresses, 2001. http://wwwlib.umi.com/cr/ucsd/fullcit?p3015850.
Pénicaud, Sidonie. "Insights about age of language exposure and brain development: a voxel-based morphometry approach". Thesis, McGill University, 2009. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=111591.
Lima, Fernanda Leitão de Castro Nunes de [UNESP]. "Julgamento perceptivo-auditivo e perceptivo-visual das produções gradientes de fricativas coronais surdas". Universidade Estadual Paulista (UNESP), 2018. http://hdl.handle.net/11449/154302.
Testo completoApproved for entry into archive by Satie Tagara (satie@marilia.unesp.br) on 2018-06-19T14:10:24Z (GMT) No. of bitstreams: 1 lima_flcn_me_mar.pdf: 1310670 bytes, checksum: ab7f761d3d1be439f987de5d800203cd (MD5)
Made available in DSpace on 2018-06-19T14:10:24Z (GMT). No. of bitstreams: 1 lima_flcn_me_mar.pdf: 1310670 bytes, checksum: ab7f761d3d1be439f987de5d800203cd (MD5) Previous issue date: 2018-05-22
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Purpose: The purpose of this study was to analyze the percentage of judges' responses in the auditory-perceptual judgment of audio recordings and in the visual-perceptual judgment of ultrasound images for detecting gradient productions of voiceless coronal fricatives, and to verify whether these two forms of judgment differ and whether they correlate. Methods: Twenty judges were selected who were knowledgeable about the speech production process as well as the phonetic classification and description of the phonemes of Brazilian Portuguese (BP). The judged stimuli were collected from a database of audio and video files (ultrasound images) of the words "sapo" (frog) and "chave" (key) produced by 11 BP-speaking children aged 6 to 12 years (9 boys and 2 girls) with atypical speech production. The collected files were coded in advance. After instruction, the judges had to choose, immediately upon presentation of a stimulus, one of three options displayed on the computer screen. The experimental procedure consisted of judging the audio files and judging the ultrasound images, run in the PERCEVAL software. For the audio files the options were: correct, incorrect, or gradient production; for the ultrasound images the options were: production of [s], production of [∫], or undifferentiated production. Presentation time, randomized stimulus selection, and reaction time were controlled automatically by PERCEVAL. The data were submitted to statistical analysis. Results: Judging the images yielded greater identification of gradient stimuli (137 stimuli) and a shorter reaction time (mean = 1073.12 ms) than the auditory-perceptual judgment (80 stimuli, mean reaction time = 3126.26 ms), both differences statistically significant (p < 0.00).
Spearman's correlation test showed no statistical significance for percentage of responses or for reaction time. Conclusion: Judging ultrasound images is the more sensitive method for detecting gradient productions in speech, and it can be used as a complement to auditory-perceptual judgment in speech analysis.
Books on the topic "Visual and auditory languages"
Teaching writing to visual, auditory, and kinesthetic learners. Thousand Oaks: Corwin Press, 2006.
Jabr, Yaḥyá ʻAbd al-Raʼūf. al-Lughah wa-al-ḥawāss. Nābulus: [s.n.], 1999.
Ando, Yoichi. Auditory and Visual Sensations. New York, NY: Springer New York, 2010. http://dx.doi.org/10.1007/b13253.
SpringerLink (Online service), ed. Auditory and Visual Sensations. New York, NY: Springer-Verlag New York, 2009.
Press, Leonard J. Parallels between auditory & visual processing. Santa Ana, CA: Optometric Extension Program Foundation, 2012.
Storms, Russell L. Auditory-visual cross-modal perception phenomena. Monterey, Calif: Naval Postgraduate School, 1998.
Chang, Shi-Kuo, Tadao Ichikawa, and Panos A. Ligomenides, eds. Visual Languages. Boston, MA: Springer US, 1987. http://dx.doi.org/10.1007/978-1-4613-1805-7.
Chang, S. K. (1944-), Tadao Ichikawa, and Panos A. Ligomenides, eds. Visual languages. New York: Plenum Press, 1986.
Chang, Shi-Kuo. Visual Languages. Boston, MA: Springer US, 1987.
Evamy, Barbara. Auditory & visual discrimination exercises: A teacher's aid. [Great Britain]: B. Evamy, 2003.
Book chapters on the topic "Visual and auditory languages"
Stokoe, William C. "Visual and Auditory Orientations to Language learning". In Scientific and Humanistic Dimensions of Language, 315. Amsterdam: John Benjamins Publishing Company, 1985. http://dx.doi.org/10.1075/z.22.42sto.
Ussishkin, Adam, and Alina Twist. "Auditory and visual lexical decision in Maltese". In Studies in Language Companion Series, 233–49. Amsterdam: John Benjamins Publishing Company, 2009. http://dx.doi.org/10.1075/slcs.113.16uss.
Kretschmer, Laura W., and Richard R. Kretschmer. "Intervention for Children with Auditory or Visual Sensory Impairments". In The Handbook of Language and Speech Disorders, 57–98. Oxford, UK: Wiley-Blackwell, 2010. http://dx.doi.org/10.1002/9781444318975.ch3.
Burnham, Denis, and Barbara Dodd. "Auditory-Visual Speech Perception as a Direct Process: The McGurk Effect in Infants and Across Languages". In Speechreading by Humans and Machines, 103–14. Berlin, Heidelberg: Springer Berlin Heidelberg, 1996. http://dx.doi.org/10.1007/978-3-662-13015-5_7.
Robert-Ribes, Jordi, Jean-Luc Schwartz, and Pierre Escudier. "A Comparison of Models for Fusion of the Auditory and Visual Sensors in Speech Perception". In Integration of Natural Language and Vision Processing, 81–104. Dordrecht: Springer Netherlands, 1995. http://dx.doi.org/10.1007/978-94-009-1639-5_7.
Potapova, Rodmonga, and Vsevolod Potapov. "Auditory and Visual Recognition of Emotional Behaviour of Foreign Language Subjects (by Native and Non-native Speakers)". In Speech and Computer, 62–69. Cham: Springer International Publishing, 2013. http://dx.doi.org/10.1007/978-3-319-01931-4_9.
Xu, Li, and Ning Zhou. "Tonal Languages and Cochlear Implants". In Auditory Prostheses, 341–64. New York, NY: Springer New York, 2011. http://dx.doi.org/10.1007/978-1-4419-9434-9_14.
Chang, Shi-Kuo. "Introduction: Visual Languages and Iconic Languages". In Visual Languages, 1–7. Boston, MA: Springer US, 1986. http://dx.doi.org/10.1007/978-1-4613-1805-7_1.
Hirakawa, Masahito, Noriaki Monden, Iwao Yoshimoto, Minoru Tanaka, and Tadao Ichikawa. "Hi-Visual". In Visual Languages, 233–59. Boston, MA: Springer US, 1986. http://dx.doi.org/10.1007/978-1-4613-1805-7_10.
Vatikiotis-Bateson, Eric, and Kevin G. Munhall. "Auditory-Visual Speech Processing". In The Handbook of Speech Production, 178–99. Hoboken, NJ: John Wiley & Sons, Inc, 2015. http://dx.doi.org/10.1002/9781118584156.ch9.
Conference papers on the topic "Visual and auditory languages"
Gable, Thomas M., Brianna Tomlinson, Stanley Cantrell, and Bruce N. Walker. "Spindex and Spearcons in Mandarin: Auditory Menu Enhancements Successful in a Tonal Language". In The 23rd International Conference on Auditory Display. Arlington, Virginia: The International Community for Auditory Display, 2017. http://dx.doi.org/10.21785/icad2017.025.
Stenger, I., and T. Avgustinova. "Visual vs. Auditory Perception of Bulgarian Stimuli by Russian Native Speakers". In International Conference on Computational Linguistics and Intellectual Technologies "Dialogue". Russian State University for the Humanities, 2020. http://dx.doi.org/10.28995/2075-7182-2020-19-684-695.
Massaro, Dominic W., and Michael M. Cohen. "Auditory/visual speech in multimodal human interfaces". In 3rd International Conference on Spoken Language Processing (ICSLP 1994). ISCA, 1994. http://dx.doi.org/10.21437/icslp.1994-135.
Mixdorff, Hansjörg, Angelika Hönemann, Albert Rilliard, Tan Lee, and Matthew Ma. "Cross-Language Perception of Audio-visual Attitudinal Expressions". In The 14th International Conference on Auditory-Visual Speech Processing. ISCA, 2017. http://dx.doi.org/10.21437/avsp.2017-23.
Cruz, Marisa, Marc Swerts, and Sónia Frota. "Do visual cues to interrogativity vary between language modalities? Evidence from spoken Portuguese and Portuguese Sign Language". In The 15th International Conference on Auditory-Visual Speech Processing. ISCA, 2019. http://dx.doi.org/10.21437/avsp.2019-1.
Öster, Anne-Marie. "Spoken L2 teaching with contrastive visual and auditory feedback". In 5th International Conference on Spoken Language Processing (ICSLP 1998). ISCA, 1998. http://dx.doi.org/10.21437/icslp.1998-765.
Nunnemann, Eva Maria, Kirsten Bergmann, Helene Kreysa, and Pia Knoeferle. "Referential Gaze Makes a Difference in Spoken Language Comprehension: Human Speaker vs. Virtual Agent Listener Gaze". In The 14th International Conference on Auditory-Visual Speech Processing. ISCA, 2017. http://dx.doi.org/10.21437/avsp.2017-4.
Sanjanaashree P., Anand Kumar M., and Soman K. P. "Language learning for visual and auditory learners using scratch toolkit". In 2014 International Conference on Computer Communication and Informatics (ICCCI). IEEE, 2014. http://dx.doi.org/10.1109/iccci.2014.6921765.
Sekiyama, Kaoru, and Yoichi Sugita. "Auditory-visual speech perception examined by brain imaging and reaction time". In 7th International Conference on Spoken Language Processing (ICSLP 2002). ISCA, 2002. http://dx.doi.org/10.21437/icslp.2002-428.
Nouza, Jan. "Computer-aided spoken-language training with enhanced visual and auditory feedback". In 6th European Conference on Speech Communication and Technology (Eurospeech 1999). ISCA, 1999. http://dx.doi.org/10.21437/eurospeech.1999-49.
Reports of organizations on the topic "Visual and auditory languages"
Yu, Wanchi. Implicit Learning of Children with and without Developmental Language Disorder across Auditory and Visual Categories. Portland State University Library, January 2000. http://dx.doi.org/10.15760/etd.7460.
Visram, Anisa, Iain Jackson, Ibrahim Almufarrij, Michael Stone, and Kevin Munro. Comparing visual reinforcement audiometry outcomes using different auditory stimuli and visual rewards. INPLASY - International Platform of Registered Systematic Review and Meta-analysis Protocols, January 2021. http://dx.doi.org/10.37766/inplasy2021.1.0080.
Richardson, James. Auditory and Visual Sensory Stores: a Recognition Task. Portland State University Library, January 2000. http://dx.doi.org/10.15760/etd.1557.
Driesen, Jacob. Differential Effects of Visual and Auditory Presentation on Logical Reasoning. Portland State University Library, January 2000. http://dx.doi.org/10.15760/etd.2546.
Brady-Herbst, Brenene. An Analysis of Spondee Recognition Thresholds in Auditory-only and Audio-visual Conditions. Portland State University Library, January 2000. http://dx.doi.org/10.15760/etd.7094.
Harsh, John R. Auditory and Visual Evoked Potentials as a Function of Sleep Deprivation and Irregular Sleep. Fort Belvoir, VA: Defense Technical Information Center, August 1989. http://dx.doi.org/10.21236/ada228488.
Jokeit, H., R. Goertz, E. Kuchler, and S. Makeig. Event-Related Changes in the 40 Hz Electroencephalogram in Auditory and Visual Reaction Time Tasks. Fort Belvoir, VA: Defense Technical Information Center, January 1994. http://dx.doi.org/10.21236/ada379543.
Davis, Bradley M. Effects of Visual, Auditory, and Tactile Navigation Cues on Navigation Performance, Situation Awareness, and Mental Workload. Fort Belvoir, VA: Defense Technical Information Center, February 2007. http://dx.doi.org/10.21236/ada463244.
Yatsymirska, Mariya. Social Expression in Multimedia Texts. Ivan Franko National University of Lviv, February 2021. http://dx.doi.org/10.30970/vjo.2021.49.11072.
Beiker, Sven, ed. Unsettled Issues Regarding Visual Communication Between Automated Vehicles and Other Road Users. SAE International, July 2021. http://dx.doi.org/10.4271/epr2021016.