Academic literature on the topic 'Gesture'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Gesture.'
Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.
Journal articles on the topic "Gesture"
PIKA, SIMONE, ELENA NICOLADIS, and PAULA F. MARENTETTE. "A cross-cultural study on the use of gestures: Evidence for cross-linguistic transfer?" Bilingualism: Language and Cognition 9, no. 3 (October 20, 2006): 319–27. http://dx.doi.org/10.1017/s1366728906002665.
Braddock, Barbara A., Christina Gabany, Meera Shah, Eric S. Armbrecht, and Kimberly A. Twyman. "Patterns of Gesture Use in Adolescents With Autism Spectrum Disorder." American Journal of Speech-Language Pathology 25, no. 3 (August 2016): 408–15. http://dx.doi.org/10.1044/2015_ajslp-14-0112.
Sekine, Kazuki, and Miranda L. Rose. "The Relationship of Aphasia Type and Gesture Production in People With Aphasia." American Journal of Speech-Language Pathology 22, no. 4 (November 2013): 662–72. http://dx.doi.org/10.1044/1058-0360(2013/12-0030).
Kong, Anthony Pak-Hin, Sam-Po Law, and Gigi Wan-Chi Chak. "A Comparison of Coverbal Gesture Use in Oral Discourse Among Speakers With Fluent and Nonfluent Aphasia." Journal of Speech, Language, and Hearing Research 60, no. 7 (July 12, 2017): 2031–46. http://dx.doi.org/10.1044/2017_jslhr-l-16-0093.
PARRILL, FEY, BRITTANY LAVANTY, AUSTIN BENNETT, ALAYNA KLCO, and OZLEM ECE DEMIR-LIRA. "The relationship between character viewpoint gesture and narrative structure in children." Language and Cognition 10, no. 3 (July 12, 2018): 408–34. http://dx.doi.org/10.1017/langcog.2018.9.
Cooperrider, Kensy. "Foreground gesture, background gesture." Gesture 16, no. 2 (December 31, 2017): 176–202. http://dx.doi.org/10.1075/gest.16.2.02coo.
CASEY, SHANNON, KAREN EMMOREY, and HEATHER LARRABEE. "The effects of learning American Sign Language on co-speech gesture." Bilingualism: Language and Cognition 15, no. 4 (January 3, 2012): 677–86. http://dx.doi.org/10.1017/s1366728911000575.
Foran, Lori, and Brenda Beverly. "Points to Ponder: Gesture and Language in Math Talk." Perspectives on Language Learning and Education 22, no. 2 (March 2015): 72–81. http://dx.doi.org/10.1044/lle22.2.71.
Jasim, Mahmood, Tao Zhang, and Md Hasanuzzaman. "A Real-Time Computer Vision-Based Static and Dynamic Hand Gesture Recognition System." International Journal of Image and Graphics 14, no. 01n02 (January 2014): 1450006. http://dx.doi.org/10.1142/s0219467814500065.
Kelly, Spencer D., Peter Creigh, and James Bartolotti. "Integrating Speech and Iconic Gestures in a Stroop-like Task: Evidence for Automatic Processing." Journal of Cognitive Neuroscience 22, no. 4 (April 2010): 683–94. http://dx.doi.org/10.1162/jocn.2009.21254.
Dissertations / Theses on the topic "Gesture"
Lindberg, Martin. "Introducing Gestures: Exploring Feedforward in Touch-Gesture Interfaces." Thesis, Malmö universitet, Fakulteten för kultur och samhälle (KS), 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:mau:diva-23555.
Campbell, Lee Winston. "Visual classification of co-verbal gestures for gesture understanding." Thesis, Massachusetts Institute of Technology, 2001. http://hdl.handle.net/1721.1/8707.
Full textIncludes bibliographical references (leaves 86-92).
A person's communicative intent can be better understood by either a human or a machine if the person's gestures are understood. This thesis project demonstrates an expansion of both the range of co-verbal gestures a machine can identify, and the range of communicative intents the machine can infer. We develop an automatic system that uses real-time video as sensory input and then segments, classifies, and responds to co-verbal gestures made by users in real time as they converse with a synthetic character known as REA, which is being developed in parallel by Justine Cassell and her students at the MIT Media Lab. A set of 670 natural gestures, videotaped and visually tracked in the course of conversational interviews and then hand segmented and annotated according to a widely used gesture classification scheme, is used in an offline training process that trains Hidden Markov Model classifiers. A number of feature sets are extracted and tested in the offline training process, and the best performer is employed in an online HMM segmenter and classifier that requires no encumbering attachments to the user. Modifications made to the REA system enable REA to respond to the user's beat and deictic gestures as well as turn-taking requests the user may convey in gesture.
The recognition results obtained are far above chance, but too low for use in a production recognition system. The results provide a measure of validity for the gesture categories chosen, and they provide positive evidence for an appealing but difficult-to-prove proposition: to the extent that a machine can recognize and use these categories of gestures to infer information not present in the words spoken, there is exploitable complementary information in the gesture stream.
Ph.D. thesis by Lee Winston Campbell.
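Campbell's pipeline trains one Hidden Markov Model per gesture category and then scores incoming feature sequences against each model. The classification step can be sketched with a discrete-observation HMM scored by the forward algorithm. Everything below is a toy illustration, not the thesis code: the two states, the two vector-quantized symbols, and all probabilities are invented for the example.

```python
import numpy as np

def log_forward(obs, log_pi, log_A, log_B):
    """Log-likelihood of a discrete observation sequence under an HMM,
    computed with the forward algorithm in log space for stability."""
    alpha = log_pi + log_B[:, obs[0]]                     # (n_states,)
    for o in obs[1:]:
        # logsumexp over previous states, then emit the current symbol
        alpha = np.logaddexp.reduce(alpha[:, None] + log_A, axis=0) + log_B[:, o]
    return np.logaddexp.reduce(alpha)

def classify(obs, models):
    """Return the gesture label whose HMM scores the sequence highest."""
    return max(models, key=lambda name: log_forward(obs, *models[name]))

def make_model(emit_probs):
    """Two-state HMM with sticky transitions and the given per-state
    emission distributions (each row sums to 1)."""
    pi = np.log(np.array([0.5, 0.5]))
    A = np.log(np.array([[0.9, 0.1], [0.1, 0.9]]))
    B = np.log(np.array(emit_probs))
    return pi, A, B

# Toy codebook: symbol 0 = small rhythmic motion, symbol 1 = extended point.
models = {
    "beat":    make_model([[0.8, 0.2], [0.7, 0.3]]),
    "deictic": make_model([[0.2, 0.8], [0.3, 0.7]]),
}
```

A real system would vector-quantize the tracked motion features into the symbol codebook and fit the transition and emission tables with Baum-Welch on the annotated gesture set, rather than fixing them by hand as above.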
Smith, Jason Alan. "Naturalistic skeletal gesture movement and rendered gesture decoding." Diss., online access via UMI, 2006.
Davis, James W. "Gesture recognition." Honors in the Major Thesis (Bachelors, Arts and Sciences, Computer Science), University of Central Florida, 1994. http://digital.library.ucf.edu/cdm/ref/collection/ETH/id/126.
Yunus, Fajrian. "Prediction of Gesture Timing and Study About Image Schema for Metaphoric Gestures." Electronic Thesis or Diss., Sorbonne université, 2021. http://www.theses.fr/2021SORUS551.
Communicative gestures and speech are tightly linked, and we want to predict gestures automatically from speech. Speech itself has two constituents, namely the acoustics and the content of the speech (i.e., the text). In one part of this dissertation, we develop a model based on a recurrent neural network with an attention mechanism to predict gesture timing, that is, when a gesture should happen and what kind of gesture it should be. We use a sequence-comparison technique to evaluate the model's performance, and we also perform a subjective study to measure how respondents judge the naturalness, time consistency, and semantic consistency of the generated gestures. In another part of the dissertation, we deal with the generation of metaphoric gestures. Metaphoric gestures carry meaning, so the relevant semantics must be extracted from the content of the speech. This is done using the concept of image schema, as demonstrated by Ravenet et al. However, to use image schemas in machine-learning techniques, they have to be converted into vectors of real numbers. We therefore investigate how to transform image schemas into vectors using word-embedding techniques. Lastly, we investigate how to represent hand-gesture shapes: the representation has to be compact, yet broad enough to cover the range of shapes needed to express the relevant semantics.
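The image-schema-to-vector step described above can be illustrated with a word-embedding lookup. This is a hedged sketch only: the dissertation uses real pretrained embeddings, whereas the 3-dimensional vectors and schema labels below are invented for the example.

```python
import numpy as np

# Toy embedding table standing in for a pretrained word-embedding model;
# these 3-d vectors are illustrative, not real embeddings.
EMBEDDINGS = {
    "container": np.array([0.9, 0.1, 0.0]),
    "inside":    np.array([0.8, 0.2, 0.1]),
    "path":      np.array([0.1, 0.9, 0.0]),
    "source":    np.array([0.2, 0.8, 0.2]),
    "up":        np.array([0.0, 0.1, 0.9]),
}

def schema_vector(label):
    """Map a (possibly multi-word) image-schema label such as
    'source_path' to one vector by averaging its tokens' embeddings;
    unknown tokens are skipped."""
    vecs = [EMBEDDINGS[t] for t in label.lower().split("_") if t in EMBEDDINGS]
    return np.mean(vecs, axis=0) if vecs else np.zeros(3)

def nearest_schema(vec, labels):
    """Cosine-similarity nearest neighbour among candidate schema labels."""
    def cos(a, b):
        na, nb = np.linalg.norm(a), np.linalg.norm(b)
        return float(a @ b / (na * nb)) if na and nb else -1.0
    return max(labels, key=lambda l: cos(vec, schema_vector(l)))
```

Once every schema label lives in the same vector space, semantically related schemas (here, "inside" and "container") end up close together, which is what makes the vectors usable as machine-learning features.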
Cheng, You-Chi. "Robust gesture recognition." Diss., Georgia Institute of Technology, 2014. http://hdl.handle.net/1853/53492.
Cometti, Jean Pierre. "The architect's gesture." Pontificia Universidad Católica del Perú - Departamento de Humanidades, 2012. http://repositorio.pucp.edu.pe/index/handle/123456789/112899.
Kaâniche, Mohamed Bécha. "Human gesture recognition." Nice, 2009. http://www.theses.fr/2009NICE4032.
In this thesis, we aim to recognize gestures (e.g., hand raising) and, more generally, short actions (e.g., falling, bending) performed by an individual. Many techniques have already been proposed for gesture recognition in specific environments (e.g., a laboratory) using the cooperation of several sensors (e.g., a camera network, individuals equipped with markers). Despite these strong hypotheses, gesture recognition is still brittle and often depends on the position of the individual relative to the cameras. We propose to relax these hypotheses in order to design a general algorithm that can recognize the gestures of an individual moving in an unconstrained environment and observed through a limited number of cameras. The goal is to estimate the likelihood of gesture recognition as a function of the observation conditions. Our method classifies a set of gestures by learning motion descriptors: local signatures of the motion of corner points, combined with a local textural description of their neighbourhood. We demonstrate the effectiveness of these motion descriptors by recognizing the actions of the public KTH database.
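The descriptor idea in the abstract above, a local motion signature for each tracked corner point paired with a texture summary of its neighbourhood, can be sketched as follows. The specific features and the 1-nearest-neighbour classifier are illustrative simplifications assumed for the example, not the thesis implementation.

```python
import numpy as np

def motion_descriptor(track, patch):
    """Toy local motion descriptor: displacement statistics of one tracked
    corner point concatenated with a crude texture summary (mean/std) of
    the image patch around it."""
    track = np.asarray(track, dtype=float)       # (n_frames, 2) positions
    disp = np.diff(track, axis=0)                # frame-to-frame motion
    motion = np.concatenate([disp.mean(axis=0), disp.std(axis=0)])
    texture = np.array([patch.mean(), patch.std()])
    return np.concatenate([motion, texture])     # 6-d signature

def nearest_label(desc, labelled_descs):
    """1-nearest-neighbour classification over stored (label, descriptor)
    pairs using Euclidean distance."""
    return min(labelled_descs, key=lambda kv: np.linalg.norm(desc - kv[1]))[0]
```

A full system would extract many such descriptors per video (one per tracked corner point) and learn a classifier over them; the point of the sketch is only that each descriptor fuses where a point moves with what its surroundings look like.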
Alon, Jonathan. "Spatiotemporal Gesture Segmentation." Boston University Computer Science Department, 2006. https://hdl.handle.net/2144/1884.
Macleod, Tracy. "Gesture signs in social interaction : how group size influences gesture communication." Thesis, University of Glasgow, 2009. http://theses.gla.ac.uk/1205/.
Books on the topic "Gesture"
Stam, Gale, and Mika Ishino. Integrating gestures: The interdisciplinary nature of gesture. Amsterdam: John Benjamins Pub., 2011.
Ormerod, Roger. Farewell gesture. New York: Doubleday, 1991.
Connors, April. Gesture Drawing. Boca Raton, FL: CRC Press, 2017. http://dx.doi.org/10.1201/9781315156385.
Church, R. Breckinridge, Martha W. Alibali, and Spencer D. Kelly, eds. Why Gesture? Amsterdam: John Benjamins Publishing Company, 2017. http://dx.doi.org/10.1075/gs.7.
Escalera, Sergio, Isabelle Guyon, and Vassilis Athitsos, eds. Gesture Recognition. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-57021-1.
Konar, Amit, and Sriparna Saha. Gesture Recognition. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-62212-5.
Linda, Corriveau, and Hatay Nona, eds. Pure gesture. Nevada City, CA: Gateways, 1990.
Winterthur, Kunstmuseum, ed. Frozen gesture: Gesten in der Malerei = gestures in painting. München: Hirmer Verlag, 2019.
McNeill, David. Gesture and Thought. Chicago: University of Chicago Press, 2008.
Cienki, Alan, and Cornelia Müller, eds. Metaphor and Gesture. Amsterdam: John Benjamins Publishing Company, 2008. http://dx.doi.org/10.1075/gs.3.
Book chapters on the topic "Gesture"
Vermeerbergen, Myriam, and Eline Demey. "Sign + Gesture = Speech + Gesture?" In Simultaneity in Signed Languages, 257–82. Amsterdam: John Benjamins Publishing Company, 2007. http://dx.doi.org/10.1075/cilt.281.12ver.
Schneider, Rebecca. "Gesture." In Critical Terms in Futures Studies, 145–49. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-28987-4_23.
Peacock, Steven. "Gesture." In Hollywood and Intimacy, 56–77. London: Palgrave Macmillan UK, 2012. http://dx.doi.org/10.1057/9780230355330_3.
Lamb, Jonathan. "Gesture." In The Routledge Handbook of Reenactment Studies, 94–96. New York: Routledge, 2019. http://dx.doi.org/10.4324/9780429445637-19.
Cartmill, Erica A., and Susan Goldin-Meadow. "Gesture." In APA handbook of nonverbal communication, 307–33. Washington: American Psychological Association, 2016. http://dx.doi.org/10.1037/14669-012.
Holme, Randal. "Gesture." In Cognitive Linguistics and Language Teaching, 54–62. London: Palgrave Macmillan UK, 2009. http://dx.doi.org/10.1057/9780230233676_4.
Reichl, Karl. "Gesture." In The Oral Epic, 85–106. New York: Routledge, 2021. http://dx.doi.org/10.4324/9781003189114-8.
Smith, Roger. "Gesture." In Kinaesthesia in the Psychology, Philosophy and Culture of Human Experience, 109–14. London: Routledge, 2023. http://dx.doi.org/10.4324/9781003368021-14.
Stanchfield, Walt. "Gesture." In Drawn to Life: 20 Golden Years of Disney Master Classes, 43–100. 2nd ed. Boca Raton: CRC Press, 2023. http://dx.doi.org/10.1201/9781003215363-2.
Wulf, Christoph. "Gesture." In Handbook of the Anthropocene, 1429–33. Cham: Springer International Publishing, 2023. http://dx.doi.org/10.1007/978-3-031-25910-4_233.
Conference papers on the topic "Gesture"
Chen, Ting, and Wencheng Tang. "Interactive Gesture of Exhibition Hall Mobile Follow Service Robot." In Human Systems Engineering and Design (IHSED 2021) Future Trends and Applications. AHFE International, 2021. http://dx.doi.org/10.54941/ahfe1001110.
DeVito, Matthew P., and Karthik Ramani. "Talking to TAD: Animating an Everyday Object for Use in Augmented Workspaces." In ASME 2014 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2014. http://dx.doi.org/10.1115/detc2014-34189.
Yang, Sicheng, Zhiyong Wu, Minglei Li, Zhensong Zhang, Lei Hao, Weihong Bao, Ming Cheng, and Long Xiao. "DiffuseStyleGesture: Stylized Audio-Driven Co-Speech Gesture Generation with Diffusion Models." In Thirty-Second International Joint Conference on Artificial Intelligence {IJCAI-23}. California: International Joint Conferences on Artificial Intelligence Organization, 2023. http://dx.doi.org/10.24963/ijcai.2023/650.
Liang, Cao. "Bridging the Cognitive Gap: Optimizing Gesture Interaction Design for the Elderly." In Intelligent Human Systems Integration (IHSI 2024) Integrating People and Intelligent Systems. AHFE International, 2024. http://dx.doi.org/10.54941/ahfe1004543.
He, Yanming, Shumeng Hou, and Peiyao Cheng. "Generating a Gesture Set Using the User-defined Method in Smart Home Contexts." In 13th International Conference on Applied Human Factors and Ergonomics (AHFE 2022). AHFE International, 2022. http://dx.doi.org/10.54941/ahfe1002181.
Jung, Euichul, Young Joo Jang, and Whang Jae Lee. "Study on Preferred Gestural Interaction of Playing Music for Wrist Wearable Devices." In Applied Human Factors and Ergonomics Conference. AHFE International, 2021. http://dx.doi.org/10.54941/ahfe100581.
Huang, Jinmiao, and Rahul Rai. "Hand Gesture Based Intuitive CAD Interface." In ASME 2014 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2014. http://dx.doi.org/10.1115/detc2014-34070.
Nyaga, Casam, and Ruth Wario. "Towards Kenyan Sign Language Hand Gesture Recognition Dataset." In 14th International Conference on Applied Human Factors and Ergonomics (AHFE 2023). AHFE International, 2023. http://dx.doi.org/10.54941/ahfe1003281.
Rafiq, Riyad Bin, Weishi Shi, and Mark V. Albert. "Wearable Sensor-Based Few-Shot Continual Learning on Hand Gestures for Motor-Impaired Individuals via Latent Embedding Exploitation." In Thirty-Third International Joint Conference on Artificial Intelligence {IJCAI-24}. California: International Joint Conferences on Artificial Intelligence Organization, 2024. http://dx.doi.org/10.24963/ijcai.2024/823.
Radkowski, Rafael, and Christian Stritzke. "Comparison Between 2D and 3D Hand Gesture Interaction for Augmented Reality Applications." In ASME 2011 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. ASMEDC, 2011. http://dx.doi.org/10.1115/detc2011-48155.
Reports on the topic "Gesture"
Yang, Jie, and Yangsheng Xu. Hidden Markov Model for Gesture Recognition. Fort Belvoir, VA: Defense Technical Information Center, May 1994. http://dx.doi.org/10.21236/ada282845.
Morton, Paul R., Edward L. Fix, and Gloria L. Calhoun. Hand Gesture Recognition Using Neural Networks. Fort Belvoir, VA: Defense Technical Information Center, May 1996. http://dx.doi.org/10.21236/ada314933.
Cassell, Justine, Matthew Stone, Brett Douville, Scott Prevost, and Brett Achorn. Modeling the Interaction between Speech and Gesture. Fort Belvoir, VA: Defense Technical Information Center, May 1994. http://dx.doi.org/10.21236/ada290549.
Vira, Naren. Gesture Recognition Development for the Interactive Datawall. Fort Belvoir, VA: Defense Technical Information Center, January 2008. http://dx.doi.org/10.21236/ada476755.
Zhao, Ruyin. CSI-based Gesture Recognition and Object Detection. Ames (Iowa): Iowa State University, January 2021. http://dx.doi.org/10.31274/cc-20240624-456.
Lampton, Donald R., Bruce W. Knerr, Bryan R. Clark, Glenn A. Martin, and Donald A. Washburn. Gesture Recognition System for Hand and Arm Signals. Fort Belvoir, VA: Defense Technical Information Center, November 2002. http://dx.doi.org/10.21236/ada408459.
Venetsky, Larry, Mark Husni, and Mark Yager. Gesture Recognition for UCAV-N Flight Deck Operations. Fort Belvoir, VA: Defense Technical Information Center, January 2003. http://dx.doi.org/10.21236/ada422629.
Yacoob, Yaser, and Larry Davis. Gesture-Based Control of Spaces and Objects in Augmented. Fort Belvoir, VA: Defense Technical Information Center, October 2002. http://dx.doi.org/10.21236/ada408623.
Perzanowski, Dennis, Alan C. Schultz, William Adams, and Elaine Marsh. Using a Natural Language and Gesture Interface for Unmanned Vehicles. Fort Belvoir, VA: Defense Technical Information Center, January 2000. http://dx.doi.org/10.21236/ada435161.
Elliott, Linda R., Susan G. Hill, and Michael Barnes. Gesture-Based Controls for Robots: Overview and Implications for Use by Soldiers. Fort Belvoir, VA: Defense Technical Information Center, July 2016. http://dx.doi.org/10.21236/ad1011904.