Journal articles on the topic 'Robot-Robot interaction'


Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 journal articles for your research on the topic 'Robot-Robot interaction.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse journal articles from a wide variety of disciplines and organise your bibliography correctly.

1

Lee, Heejin. "A Human-Robot Interaction Entertainment Pet Robot." Journal of Korean Institute of Intelligent Systems 24, no. 2 (April 25, 2014): 179–85. http://dx.doi.org/10.5391/jkiis.2014.24.2.179.

2

Mitsunaga, N., C. Smith, T. Kanda, H. Ishiguro, and N. Hagita. "Adapting Robot Behavior for Human–Robot Interaction." IEEE Transactions on Robotics 24, no. 4 (August 2008): 911–16. http://dx.doi.org/10.1109/tro.2008.926867.

3

Lai, Yujun, Gavin Paul, Yunduan Cui, and Takamitsu Matsubara. "User intent estimation during robot learning using physical human robot interaction primitives." Autonomous Robots 46, no. 2 (January 15, 2022): 421–36. http://dx.doi.org/10.1007/s10514-021-10030-9.

Abstract:
As robotic systems transition from traditional setups to collaborative work spaces, the prevalence of physical Human Robot Interaction has risen in both industrial and domestic environments. A popular representation for robot behavior is movement primitives which learn, imitate, and generalize from expert demonstrations. While there are existing works in context-aware movement primitives, they are usually limited to contact-free human robot interactions. This paper presents physical Human Robot Interaction Primitives (pHRIP), which utilize only the interaction forces between the human user and robot to estimate user intent and generate the appropriate robot response during physical human robot interactions. The efficacy of pHRIP is evaluated through multiple experiments based on target-directed reaching and obstacle avoidance tasks using a real seven degree of freedom robot arm. The results are validated against Interaction Primitives which use observations of robotic trajectories, with discussions of future pHRI applications utilizing pHRIP.
4

Takamatsu, Jun. "Human-Robot Interaction." Journal of the Robotics Society of Japan 37, no. 4 (2019): 293–96. http://dx.doi.org/10.7210/jrsj.37.293.

5

Jia, Yunyi, Biao Zhang, Miao Li, Brady King, and Ali Meghdari. "Human-Robot Interaction." Journal of Robotics 2018 (October 1, 2018): 1–2. http://dx.doi.org/10.1155/2018/3879547.

6

Murphy, Robin, Tatsuya Nomura, Aude Billard, and Jennifer Burke. "Human–Robot Interaction." IEEE Robotics & Automation Magazine 17, no. 2 (June 2010): 85–89. http://dx.doi.org/10.1109/mra.2010.936953.

7

Sethumadhavan, Arathi. "Human-Robot Interaction." Ergonomics in Design: The Quarterly of Human Factors Applications 20, no. 3 (July 2012): 27–28. http://dx.doi.org/10.1177/1064804612449796.

8

Sheridan, Thomas B. "Human–Robot Interaction." Human Factors: The Journal of the Human Factors and Ergonomics Society 58, no. 4 (April 20, 2016): 525–32. http://dx.doi.org/10.1177/0018720816644364.

9

Pearson, Yvette. "Child-Robot Interaction." American Scientist 108, no. 1 (2020): 16. http://dx.doi.org/10.1511/2020.108.1.16.

10

Jones, Keith S., and Elizabeth A. Schmidlin. "Human-Robot Interaction." Reviews of Human Factors and Ergonomics 7, no. 1 (August 25, 2011): 100–148. http://dx.doi.org/10.1177/1557234x11410388.

11

Couto, Marta, Shruti Chandra, Elmira Yadollahi, and Vicky Charisi. "Child-robot interaction." Interaction Studies 23, no. 2 (December 31, 2022): 151–56. http://dx.doi.org/10.1075/is.00014.edi.

12

Bharatharaj, Jaishankar, Loulin Huang, Ahmed Al-Jumaily, Rajesh Elara Mohan, and Chris Krägeloh. "Sociopsychological and physiological effects of a robot-assisted therapy for children with autism." International Journal of Advanced Robotic Systems 14, no. 5 (September 1, 2017): 172988141773689. http://dx.doi.org/10.1177/1729881417736895.

Abstract:
This article reports our findings from a robot-assisted therapeutic study conducted over 49 days to investigate the sociopsychological and physiological effects in children with autism spectrum disorder using a parrot-inspired robot, KiliRo, that we developed to help in therapeutic settings. We investigated the frequency of participants’ interactions among each other and assessed any changes in interaction using social network analysis. Interactions were assessed through manual observation before and after exposure to the robot. Urinary and salivary tests were performed to obtain protein and α-amylase levels, respectively, to report the physiological changes in participating children with autism spectrum disorder before and after interacting with the robot. This is a pioneering human–robot interaction study to investigate changes in stress levels using salivary samples. Systolic and diastolic blood pressure, heart rate, and arterial oxygen saturation level in blood were also monitored to investigate the physiological changes in participating children before, during, and after interacting with our parrot-inspired robot, KiliRo. The results show that the robot can help increase social interaction among children with autism spectrum disorder and assist in learning tasks. Furthermore, the clinical biochemistry test report using urinary and salivary samples indicates that the stress levels of children with autism reduced notably after interacting with the robot. Nevertheless, blood pressure, heart rate, and oxygen levels in blood did not show positive change in all participants.
13

Park, Eunil, and Jaeryoung Lee. "I am a warm robot: the effects of temperature in physical human–robot interaction." Robotica 32, no. 1 (August 2, 2013): 133–42. http://dx.doi.org/10.1017/s026357471300074x.

Abstract:
What factors affect users’ perceptions of physical human–robot interactions? To answer this question, this study examined whether the skin temperature of a social robot affected users’ perceptions of the robot during physical interaction. Results from a between-subjects experiment (warm, intermediate, cool, or no interaction) with a dinosaur robot demonstrated that skin temperature significantly affects users’ perceptions and evaluations of a socially interactive robot. Additionally, this study found that social presence had partial mediating effects on several dependent variables. Important implications and limitations for improving human–robot interactions are discussed here.
14

Kim, Yoon-Sang, Kwang-Ho Seok, Chang-Mug Lee, and Oh-Young Kwon. "A Robot Motion Authoring Using Finger-Robot Interaction." Journal of information and communication convergence engineering 8, no. 2 (April 30, 2010): 180–84. http://dx.doi.org/10.6109/jicce.2010.8.2.180.

15

Berg, Julia, Albrecht Lottermoser, Christoph Richter, and Gunther Reinhart. "Human-Robot-Interaction for mobile industrial robot teams." Procedia CIRP 79 (2019): 614–19. http://dx.doi.org/10.1016/j.procir.2019.02.080.

16

Du, Guanglong, Mingxuan Chen, Caibing Liu, Bo Zhang, and Ping Zhang. "Online Robot Teaching With Natural Human–Robot Interaction." IEEE Transactions on Industrial Electronics 65, no. 12 (December 2018): 9571–81. http://dx.doi.org/10.1109/tie.2018.2823667.

17

Su, Wei Hua, Jing Gong Sun, Fu Niu, and Xin Yue Xu. "The Human-Robot Interaction: An Investigation of Rescue Robot." Advanced Materials Research 711 (June 2013): 523–28. http://dx.doi.org/10.4028/www.scientific.net/amr.711.523.

Abstract:
This research aims to further the study of human-robot interaction (HRI) issues, especially regarding the development of rescue robots. The paper first discusses the status of rescue robots and describes the framework of human-robot interaction for search-rescue and rescue-evacuation robots. Subsequently, general HRI issues are discussed to explain how they affect the use of robots. Finally, we suggest that this multidisciplinary field of research, namely human-robot interaction, requires contributions from a variety of research fields such as robotics, human-computer interaction, and artificial intelligence.
18

Robins, Ben, Kerstin Dautenhahn, and Janek Dubowski. "Does appearance matter in the interaction of children with autism with a humanoid robot?" Interaction Studies 7, no. 3 (November 13, 2006): 479–512. http://dx.doi.org/10.1075/is.7.3.16rob.

Abstract:
This article studies the impact of a robot’s appearance on interactions involving four children with autism. This work is part of the Aurora project with the overall aim to support interaction skills in children with autism, using robots as ‘interactive toys’ that can encourage and mediate interactions. We follow an approach commonly adopted in assistive robotics and work with a small group of children with autism. This article investigates which robot appearances are suitable to encourage interactions between a robot and children with autism. The children’s levels of interaction with and response to different appearances of two types of robots are compared: a small humanoid doll, and a life-sized ‘Theatrical Robot’ (a mime artist behaving like a robot). The small humanoid robot appeared either as a human-like ‘pretty doll’ or as a ‘robot’ with plain features. The Theatrical Robot was presented either as an ordinary human, or with plain clothing and a featureless, masked face. The results of these trials clearly indicate the children’s preference in their initial response for interaction with a plain, featureless robot over the interaction with a human-like robot. In the case of the life-size Theatrical Robot, the response of children towards the plain/robotic robot was notably more social and pro-active. Implications of these results for our work on using robots as assistive technology for children with autism and their possible use in autism research are discussed.
19

Woo, Jinseok, and Naoyuki Kubota. "Human-Robot Interaction Design Using Smart Device Based Robot Partner." International Journal of Artificial Life Research 6, no. 2 (July 2016): 23–43. http://dx.doi.org/10.4018/ijalr.2016070102.

Abstract:
Nowadays, various robot partners have been developed to realize human-friendly interactions. In general, a robot system is composed of hardware modules, software modules, and application contents. It takes much time to design utterance contents and motion patterns as application contents simultaneously, yet existing design support systems mainly focus on the generation of robot motion patterns. Furthermore, a methodology is needed to easily change the specification of hardware and software according to diversified needs, along with a development environment for designing application contents for verbal and nonverbal communication with people. In this paper, the authors propose robot partners with a modularized architecture of hardware and software based on smart devices, and propose a development environment that makes it easy to design contents for verbal and nonverbal communication. In order to address the difficulty of content design, they develop a design support environment using design templates of communication application contents. Next, they apply the robot partner to guiding visitors to the robot contest of the system design forum held at Tokyo Metropolitan University. Finally, they show several examples of interaction cases and discuss interaction design for smart-device-based robot partners.
20

Regmi, Sambad, Devin Burns, and Yun Seong Song. "A robot for overground physical human-robot interaction experiments." PLOS ONE 17, no. 11 (November 10, 2022): e0276980. http://dx.doi.org/10.1371/journal.pone.0276980.

Abstract:
Many anticipated physical human-robot interaction (pHRI) applications in the near future are overground tasks such as walking assistance. For investigating the biomechanics of human movement during pHRI, this work presents Ophrie, a novel interactive robot dedicated for physical interaction tasks with a human in overground settings. Unique design requirements for pHRI were considered in implementing the one-arm mobile robot, such as the low output impedance and the ability to apply small interaction forces. The robot can measure the human arm stiffness, an important physical quantity that can reveal human biomechanics during overground pHRI, while the human walks alongside the robot. This robot is anticipated to enable novel pHRI experiments and advance our understanding of intuitive and effective overground pHRI.
21

Blevis, Eli. "Future robot." Interactions 23, no. 1 (December 28, 2015): 88. http://dx.doi.org/10.1145/2856122.

22

Xin, Hong Bing, and Qiang Huang. "Interaction and Coupling of Robot." Advanced Materials Research 383-390 (November 2011): 1299–303. http://dx.doi.org/10.4028/www.scientific.net/amr.383-390.1299.

Abstract:
Robot interaction is part of robot sociality and draws on bionics, robotic technologies, psychology, and network and system science. After the basic characteristics of robot interaction and its implementation are discussed, a design for coupling heterogeneous robots for motion control is introduced. The dynamic coupling method for heterogeneous robots has been implemented by adopting inheritance, polymorphism, and template technologies.
23

Shiomi, Masahiro, Hidenobu Sumioka, and Hiroshi Ishiguro. "Special Issue on Human-Robot Interaction in Close Distance." Journal of Robotics and Mechatronics 32, no. 1 (February 20, 2020): 7. http://dx.doi.org/10.20965/jrm.2020.p0007.

Abstract:
As social robot research is advancing, the interaction distance between people and robots is decreasing. Indeed, although we were once required to maintain a certain physical distance from traditional industrial robots for safety, we can now interact with social robots in such a close distance that we can touch them. The physical existence of social robots will be essential to realize natural and acceptable interactions with people in daily environments. Because social robots function in our daily environments, we must design scenarios where robots interact closely with humans by considering various viewpoints. Interactions that involve touching robots influence the changes in the behavior of a person strongly. Therefore, robotics researchers and developers need to design such scenarios carefully. Based on these considerations, this special issue focuses on close human-robot interactions. This special issue on “Human-Robot Interaction in Close Distance” includes a review paper and 11 other interesting papers covering various topics such as social touch interactions, non-verbal behavior design for touch interactions, child-robot interactions including physical contact, conversations with physical interactions, motion copying systems, and mobile human-robot interactions. We thank all the authors and reviewers of the papers and hope this special issue will help readers better understand human-robot interaction in close distance.
24

Hsieh, Wei-Fen, Eri Sato-Shimokawara, and Toru Yamaguchi. "Investigation of Robot Expression Style in Human-Robot Interaction." Journal of Robotics and Mechatronics 32, no. 1 (February 20, 2020): 224–35. http://dx.doi.org/10.20965/jrm.2020.p0224.

Abstract:
In our daily conversation, we obtain considerable information from our interlocutor’s non-verbal behaviors, such as gaze and gestures. Several studies have shown that nonverbal messages are prominent factors in smoothing the process of human-robot interaction. Our previous studies have shown that not only a robot’s appearance but also its gestures, tone, and other nonverbal factors influence a person’s impression of it. The paper presented an analysis of the impressions made when human motions are implemented on a humanoid robot, and experiments were conducted to evaluate impressions made by robot expressions to analyze the sensations. The results showed the relation between robot expression patterns and human preferences. To further investigate biofeedback elicited by different robot styles of expression, a scenario-based experiment was done. The results revealed that people’s emotions can definitely be affected by robot behavior, and the robot’s way of expressing itself is what most influences whether or not it is perceived as friendly. The results show that it is potentially useful to combine our concept into a robot system to meet individual needs.
25

Lee, Youngho, Young Jae Ryoo, and Jongmyung Choi. "Framework for Interaction Among Human–Robot-Environment in DigiLog Space." International Journal of Humanoid Robotics 11, no. 04 (December 2014): 1442005. http://dx.doi.org/10.1142/s0219843614420055.

Abstract:
With the development of computing technology, robots are now popular in our daily life. Human–robot interaction is not restricted to direct communication between the two; it can also involve various forms of human-to-human interaction. In this paper, we present a framework for enhancing the interaction among humans, robots, and environments. The proposed framework is composed of a robot part, a user part, and the DigiLog space. To evaluate the proposed framework, we applied it to a real-time remote robot-control platform in the smart DigiLog space. We are implementing real-time control and monitoring of a robot by using one smartphone as the robot's brain and another smartphone as the remote controller.
26

Tanaka, Ryosuke, Jinseok Woo, and Naoyuki Kubota. "Nonverbal Communication Based on Instructed Learning for Socially Embedded Robot Partners." Journal of Advanced Computational Intelligence and Intelligent Informatics 23, no. 3 (May 20, 2019): 584–91. http://dx.doi.org/10.20965/jaciii.2019.p0584.

Abstract:
Research and development of robot partners have been actively conducted to support human daily life. Human-robot interaction is one of the important research fields, in which verbal and nonverbal communication are essential elements for improving the interactions between humans and robots. Thus, the purpose of this research was to establish a method to adapt a human-robot interaction mechanism for robot partners to various situations. In the proposed system, the robot needs to analyze human gestures in order to interact with people. Humans are able to interact according to dynamically changing environmental conditions. Therefore, when robots interact with a human, they must behave appropriately by correctly judging the situation from human gestures in order to carry out natural human-robot interaction. In this paper, we propose a constructive methodology for a system that enables nonverbal communication elements for human-robot interaction. The proposed method was validated through a series of experiments.
27

Lee, Jae-Joon, Dae-Won Kim, and Bo-Yeong Kang. "Exploiting Child-Robot Aesthetic Interaction for a Social Robot." International Journal of Advanced Robotic Systems 9, no. 3 (January 2012): 81. http://dx.doi.org/10.5772/51191.

28

Jeong, Jaesik, Jeehyun Yang, and Jacky Baltes. "Robot magic show as testbed for humanoid robot interaction." Entertainment Computing 40 (January 2022): 100456. http://dx.doi.org/10.1016/j.entcom.2021.100456.

29

Giannopulu, I., and G. Pradel. "From Child-Robot Interaction to Child-Robot-Therapist Interaction: A Case Study in Autism." Applied Bionics and Biomechanics 9, no. 2 (2012): 173–79. http://dx.doi.org/10.1155/2012/682601.

Abstract:
Difficulties in social communication, as well as deficits in the cognitive processing of emotions, are considered to be a fundamental part of autism. We present a case study based on multimodal interaction between a mobile robot and a child with autism in spontaneous, free game play. This case study tells us that the robot mediates the interaction between the autistic child and the therapist once the robot-child interaction has been established. In addition, the child uses the robot as a mediator to express positive emotion when playing with the therapist. It is thought that this three-pronged interaction, i.e., child-robot-therapist, could better facilitate the transfer of social and emotional abilities to real life settings. Robot therapy has a high potential to improve the condition of brain activity in autistic children.
30

Thomaz, Andrea, Guy Hoffman, and Maya Cakmak. "Computational Human-Robot Interaction." Foundations and Trends in Robotics 4, no. 2-3 (2016): 104–223. http://dx.doi.org/10.1561/2300000049.

31

Karniel, Amir, Angelika Peer, Opher Donchin, Ferdinando A. Mussa-Ivaldi, and Gerald E. Loeb. "Haptic Human-Robot Interaction." IEEE Transactions on Haptics 5, no. 3 (2012): 193–95. http://dx.doi.org/10.1109/toh.2012.47.

32

Pook, Polly K., and Dana H. Ballard. "Deictic human/robot interaction." Robotics and Autonomous Systems 18, no. 1-2 (July 1996): 259–69. http://dx.doi.org/10.1016/0921-8890(95)00080-1.

33

Young, James E., JaYoung Sung, Amy Voida, Ehud Sharlin, Takeo Igarashi, Henrik I. Christensen, and Rebecca E. Grinter. "Evaluating Human-Robot Interaction." International Journal of Social Robotics 3, no. 1 (October 1, 2010): 53–67. http://dx.doi.org/10.1007/s12369-010-0081-8.

34

Takase, Noriko, Takahiro Takeda, Janos Botzheim, and Naoyuki Kubota. "Interaction, Communication, and Experience Design in Robot Edutainment." Abstracts of the International Conference on Advanced Mechatronics: Toward Evolutionary Fusion of IT and Mechatronics: ICAM 2015.6 (2015): 159–60. http://dx.doi.org/10.1299/jsmeicam.2015.6.159.

35

Ullrich, Daniel. "Robot Personality Insights. Designing Suitable Robot Personalities for Different Domains." i-com 16, no. 1 (April 1, 2017): 57–67. http://dx.doi.org/10.1515/icom-2017-0003.

Abstract:
With the development of social robots that are primarily designed for interacting with humans, particular facets of interaction need to be explored. One of them is the manifestation of robot personalities, which have the potential to raise acceptance and enhance user experience if done appropriately, or to ruin both if done wrong. The present paper argues for the relevance of suitable robot personalities and discusses the factors that affect suitability, in particular interaction domain and personal preferences. An experiment (N = 30) with four different interaction scenarios (goal- and experience-oriented) and three robot personalities (positive, neutral, negative) was performed to explore effects of personality and domain on personality suitability and acceptance. Results indicate that users can differentiate between different robot personalities and evaluate accordingly. In a goal-oriented, stressful situation (train-ticket purchase under time pressure), the neutral personality was rated best. In experience-oriented scenarios, the positive robot personality was preferred. In the context of strictly performance-oriented tasks, the effect of robot personality seems to be insignificant. Personal preferences for personalities seem to be influential; however, no clear pattern could be found. Lastly, directions for future research are outlined and implications for researchers and designers are discussed.
36

Tapus, Adriana, Andreea Peca, Amir Aly, Cristina A. Pop, Lavinia Jisa, Sebastian Pintea, Alina S. Rusu, and Daniel O. David. "Children with autism social engagement in interaction with Nao, an imitative robot." Interaction Studies 13, no. 3 (December 19, 2012): 315–47. http://dx.doi.org/10.1075/is.13.3.01tap.

Abstract:
This paper presents a series of 4 single subject experiments aimed to investigate whether children with autism show more social engagement when interacting with the Nao robot, compared to a human partner in a motor imitation task. The Nao robot imitates gross arm movements of the child in real-time. Different behavioral criteria (i.e. eye gaze, gaze shifting, free initiations and prompted initiations of arm movements, and smile/laughter) were analyzed based on the video data of the interaction. The results are mixed and suggest a high variability in reactions to the Nao robot. The results are as follows: For Child2 and Child3, the results indicate no effect of the Nao robot in any of the target variables. Child1 and Child4 showed more eye gaze and smile/laughter in the interaction with the Nao robot compared to the human partner and Child1 showed a higher frequency of motor initiations in the interaction with the Nao robot compared to the baselines, but not with respect to the human-interaction. The robot proved to be a better facilitator of shared attention only for Child1. Keywords: human-robot interaction; assistive robotics; autism
37

Maijala, Minna, and Maarit Mutta. "The Teacher's Role in Robot-assisted Language Learning and its Impact on Classroom Ecology." EuroCALL Review 30, no. 2 (January 26, 2024): 6–23. http://dx.doi.org/10.4995/eurocall.2023.17018.

Abstract:
In recent years, social robots have emerged as a new teaching aid in foreign language (FL) classrooms. Interaction in FL classrooms usually takes place between teachers and learners or among learners. However, this constellation of interactions changes when a robot enters the classroom. The robot’s role in the classroom has been studied previously, however, in this article we examine how initial encounters between a social robot and learners occur, focusing on the teacher’s role during these encounters. Additionally, we examine how children seek help or assurance from their teacher when interacting with the robot. Research data consists of video recorded in FL classrooms in primary schools in Finland in 2019. The primary school learners (N = 22) who participated in this study ranged in age from 10 to 13 years. The results show that during the robot-assisted language learning (RALL) interaction, the teacher had several roles: she validated children’s contributions, guided or mediated the discussion, encouraged the children to speak with the robot, and provided technical support. The results also suggest that the teacher’s role in RALL classrooms, while not necessarily central, is essential to ensure smooth interactions between the robot and learners.
38

van Maris, Anouk, Nancy Zook, Sanja Dogramadzi, Matthew Studley, Alan Winfield, and Praminda Caleb-Solly. "A New Perspective on Robot Ethics through Investigating Human–Robot Interactions with Older Adults." Applied Sciences 11, no. 21 (October 29, 2021): 10136. http://dx.doi.org/10.3390/app112110136.

Abstract:
This work explored the use of human–robot interaction research to investigate robot ethics. A longitudinal human–robot interaction study was conducted with self-reported healthy older adults to determine whether expression of artificial emotions by a social robot could result in emotional deception and emotional attachment. The findings from this study have highlighted that currently there appear to be no adequate tools, or means, to determine the ethical impact and concerns ensuing from long-term interactions between social robots and older adults. This raises the question of whether we should continue the fundamental development of social robots if we cannot determine their potential negative impact, and whether we should shift our focus to the development of human–robot interaction assessment tools that provide more objective measures of ethical impact.
39

Momen, Ali, and Eva Wiese. "Noticing Extroversion Effects Attention: How Robot and Participant Personality Affect Gaze Cueing." Proceedings of the Human Factors and Ergonomics Society Annual Meeting 62, no. 1 (September 2018): 1557–61. http://dx.doi.org/10.1177/1541931218621352.

Abstract:
Social robots with expressive gaze have positive effects on human-robot interaction. In particular, research suggests that when robots are programmed to express introverted or extroverted gaze behavior, individuals enjoy interacting more with robots that match their personality. However, how this affects social-cognitive performance during human-robot interactions has not been thoroughly examined yet. In the current paper, we examine whether the perceived match between human and robot personality positively affects the degree to which the robot’s gaze is followed (i.e., gaze cueing, as a proxy for more complex social-cognitive behavior). While social attention has been examined extensively outside of human-robot interaction, recent research shows that a robot’s gaze is attended to in a similar way as a human’s gaze. While our results did not support the hypothesis that gaze cueing would be strongest when the participant’s personality matched the robot’s personality, we did find evidence that participants followed the gaze of introverted robots more strongly than the gaze of extroverted robots. This finding suggests that agents displaying extroverted gaze behavior may hurt performance in human-robot interaction.
40

Losey, Dylan P., Andrea Bajcsy, Marcia K. O’Malley, and Anca D. Dragan. "Physical interaction as communication: Learning robot objectives online from human corrections." International Journal of Robotics Research 41, no. 1 (October 25, 2021): 20–44. http://dx.doi.org/10.1177/02783649211050958.

Abstract:
When a robot performs a task next to a human, physical interaction is inevitable: the human might push, pull, twist, or guide the robot. The state of the art treats these interactions as disturbances that the robot should reject or avoid. At best, these robots respond safely while the human interacts; but after the human lets go, these robots simply return to their original behavior. We recognize that physical human–robot interaction (pHRI) is often intentional: the human intervenes on purpose because the robot is not doing the task correctly. In this article, we argue that when pHRI is intentional it is also informative: the robot can leverage interactions to learn how it should complete the rest of its current task even after the person lets go. We formalize pHRI as a dynamical system, where the human has in mind an objective function they want the robot to optimize, but the robot does not get direct access to the parameters of this objective: they are internal to the human. Within our proposed framework human interactions become observations about the true objective. We introduce approximations to learn from and respond to pHRI in real-time. We recognize that not all human corrections are perfect: often users interact with the robot noisily, and so we improve the efficiency of robot learning from pHRI by reducing unintended learning. Finally, we conduct simulations and user studies on a robotic manipulator to compare our proposed approach with the state of the art. Our results indicate that learning from pHRI leads to better task performance and improved human satisfaction.
41

Fischer, Kerstin. "Tracking Anthropomorphizing Behavior in Human-Robot Interaction." ACM Transactions on Human-Robot Interaction 11, no. 1 (March 31, 2022): 1–28. http://dx.doi.org/10.1145/3442677.

Abstract:
Existing methodologies to describe anthropomorphism in human-robot interaction often rely either on specific one-time responses to robot behavior, such as keeping the robot's secret, or on post hoc measures, such as questionnaires. Currently, there is no method to describe the dynamics of people's behavior over the course of an interaction and in response to robot behavior. In this paper, I propose a method that allows the researcher to trace anthropomorphizing and non-anthropomorphizing responses to robots dynamically moment-by-moment over the course of human-robot interactions. I illustrate this methodology in a case study and find considerable variation between participants, but also considerable intrapersonal variation in the ways the robot is anthropomorphized. That is, people may respond to the robot as if it was another human in one moment and to its machine-like properties in the next. These findings may influence explanatory models of anthropomorphism.
42

Tyler, Neil. "Human Robot Interactions." New Electronics 51, no. 22 (December 10, 2019): 12–14. http://dx.doi.org/10.12968/s0047-9624(22)61505-0.

43

Zhao, Mengyao. "Emotion Recognition in Psychology of Human-robot Interaction." Psychomachina 1 (November 21, 2023): 1–11. http://dx.doi.org/10.59388/pm00331.

Abstract:
The field of Human-Robot Interaction (HRI) has garnered significant attention in recent years, with researchers and practitioners seeking to understand the psychological aspects underlying the interactions between humans and robots. One crucial area of focus within HRI is the psychology of emotion recognition, which plays a fundamental role in shaping the dynamics of human-robot interaction. This paper provides an overview of the background of psychology in the context of human-robot interaction, emphasizing the significance of understanding human emotions in this domain. The concept of emotion recognition, a key component of human psychology, is explored in detail, highlighting its relevance in the context of human-robot interaction. Emotion recognition allows robots to perceive and interpret human emotions, enabling them to respond appropriately and enhance the quality of interaction. The role of emotion recognition in HRI is examined from a psychological standpoint, shedding light on its implications for the design and development of effective human-robot interfaces. Furthermore, this paper delves into the application of machine learning techniques for emotion recognition in the context of human-robot interaction. Machine learning algorithms have shown promise in enabling robots to recognize and respond to human emotions, thereby contributing to more natural and intuitive interactions. The utilization of machine learning in emotion recognition reflects the intersection of psychology and technological advancements in the field of HRI. Finally, the challenges associated with emotion recognition in HRI are discussed, encompassing issues such as cross-cultural variations in emotional expression, individual differences, and the ethical implications of emotion detection. Addressing these challenges is pivotal in advancing the understanding and implementation of emotion recognition in human-robot interaction, underscoring the interdisciplinary nature of this endeavor. In conclusion, this paper underscores the critical role of emotion recognition in the psychology of human-robot interaction, emphasizing its potential to revolutionize the way humans and robots engage with each other. By integrating insights from psychology, machine learning, and technology, advancements in emotion recognition have the potential to pave the way for more empathetic and responsive human-robot interactions, offering new avenues for research and practical applications in this burgeoning field.
44

Dautenhahn, Kerstin, Chrystopher L. Nehaniv, Michael L. Walters, Ben Robins, Hatice Kose-Bagci, N. Assif Mirza, and Mike Blow. "KASPAR – A Minimally Expressive Humanoid Robot for Human–Robot Interaction Research." Applied Bionics and Biomechanics 6, no. 3-4 (2009): 369–97. http://dx.doi.org/10.1155/2009/708594.

Abstract:
This paper provides a comprehensive introduction to the design of the minimally expressive robot KASPAR, which is particularly suitable for human–robot interaction studies. A low-cost design with off-the-shelf components has been used in a novel design inspired by a multi-disciplinary viewpoint, including comics design and Japanese Noh theatre. The design rationale of the robot and its technical features are described in detail. Three research studies will be presented that have been using KASPAR extensively. Firstly, we present its application in robot-assisted play and therapy for children with autism. Secondly, we illustrate its use in human–robot interaction studies investigating the role of interaction kinesics and gestures. Lastly, we describe a study in the field of developmental robotics into computational architectures based on interaction histories for robot ontogeny. The three areas differ in how the robot is being operated and in its role in social interaction scenarios. Each will be introduced briefly and examples of the results will be presented. Reflections on the specific design features of KASPAR that were important in these studies and lessons learnt from these studies concerning the design of humanoid robots for social interaction will also be discussed. An assessment of the robot in terms of utility of the design for human–robot interaction experiments concludes the paper.
45

Mori, Yoshikazu, Koji Ota, and Tatsuya Nakamura. "Robot Motion Algorithm Based on Interaction with Human." Journal of Robotics and Mechatronics 14, no. 5 (October 20, 2002): 462–70. http://dx.doi.org/10.20965/jrm.2002.p0462.

Abstract:
In this paper, we quantitatively analyze the weariness and impressions that a human experiences when interacting with a robot through movement. A red ball and a blue ball are displayed on a simulation screen. The human moves the red ball with a mouse and the computer moves the blue ball. Using these balls, the impression that the robot's actions give to the human is examined. We analyze the relationship between the robot's interactive characteristics and the impressions it produces in human-robot interaction experiments, using methods from information theory. The difference in impression between the simulation and the actual robot is verified using an omni-directional robot.
46

Schadenberg, Bob R., Dennis Reidsma, Dirk K. J. Heylen, and Vanessa Evers. "“I See What You Did There”." ACM Transactions on Human-Robot Interaction 10, no. 3 (July 2021): 1–28. http://dx.doi.org/10.1145/3461534.

Abstract:
Unpredictability in robot behaviour can cause difficulties in interacting with robots. However, for social interactions with robots, a degree of unpredictability in robot behaviour may be desirable for facilitating engagement and increasing the attribution of mental states to the robot. To generate a better conceptual understanding of predictability, we looked at two facets of predictability, namely, the ability to predict robot actions and the association of predictability as an attribute of the robot. We carried out a video human-robot interaction study where we manipulated whether participants could either see the cause of a robot’s responsive action or could not see this, because there was no cause, or because we obstructed the visual cues. Our results indicate that when the cause of the robot’s responsive actions was not visible, participants rated the robot as more unpredictable and less competent, compared to when it was visible. The relationship between seeing the cause of the responsive actions and the attribution of competence was partially mediated by the attribution of unpredictability to the robot. We argue that the effects of unpredictability may be mitigated when the robot identifies when a person may not be aware of what the robot wants to respond to and uses additional actions to make its response predictable.
47

Daza, Marcos, Dennis Barrios-Aranibar, José Diaz-Amado, Yudith Cardinale, and João Vilasboas. "An Approach of Social Navigation Based on Proxemics for Crowded Environments of Humans and Robots." Micromachines 12, no. 2 (February 13, 2021): 193. http://dx.doi.org/10.3390/mi12020193.

Abstract:
Nowadays, mobile robots are playing an important role in different areas of science, industry, academia and even in everyday life. In this sense, their abilities and behaviours become increasingly complex. In particular, in indoor environments, such as hospitals, schools, banks and museums, where the robot coincides with people and other robots, its movement and navigation must be programmed and adapted to robot–robot and human–robot interactions. However, existing approaches are focused either on multi-robot navigation (robot–robot interaction) or social navigation with human presence (human–robot interaction), neglecting the integration of both approaches. Proxemic interaction is recently being used in this domain of research, to improve Human–Robot Interaction (HRI). In this context, we propose an autonomous navigation approach for mobile robots in indoor environments, based on the principles of proxemic theory, integrated with classical navigation algorithms, such as ORCA, Social Momentum, and A*. With this novel approach, the mobile robot adapts its behaviour, by analysing the proximity of people to each other, with respect to it, and with respect to other robots to decide and plan its respective navigation, while showing acceptable social behaviours in presence of humans. We describe our proposed approach and show how proxemics and the classical navigation algorithms are combined to provide an effective navigation, while respecting social human distances. To show the suitability of our approach, we simulate several situations of coexistence of robots and humans, demonstrating an effective social navigation.
48

Agrawal, Pramila, Changchun Liu, and Nilanjan Sarkar. "Interaction between human and robot." Interaction Studies 9, no. 2 (May 26, 2008): 230–57. http://dx.doi.org/10.1075/is.9.2.05agr.

Abstract:
This paper presents a human–robot interaction framework where a robot can infer implicit affective cues of a human and respond to them appropriately. Affective cues are inferred by the robot in real-time from physiological signals. A robot-based basketball game is designed where a robotic “coach” monitors the human participant’s anxiety to dynamically reconfigure game parameters to allow skill improvement while maintaining desired anxiety levels. The results of the above-mentioned anxiety-based sessions are compared with performance-based sessions, where, in the latter, the game is adapted only according to the player’s performance. It was observed that 79% of the participants showed lower anxiety during the anxiety-based session than in the performance-based session, 64% showed a greater improvement in performance after the anxiety-based session, and 71% of the participants reported greater overall satisfaction during the anxiety-based sessions. This is the first time, to our knowledge, that the impact of real-time affective communication between a robot and a human has been demonstrated experimentally.
49

Simmons, Reid, Maxim Makatchev, Rachel Kirby, Min Kyung Lee, Imran Fanaswala, Brett Browning, Jodi Forlizzi, and Majd Sakr. "Believable Robot Characters." AI Magazine 32, no. 4 (December 16, 2011): 39–52. http://dx.doi.org/10.1609/aimag.v32i4.2383.

Abstract:
Believability of characters has been an objective in literature, theater, film, and animation. We argue that believable robot characters are important in human-robot interaction, as well. In particular, we contend that believable characters evoke users’ social responses that, for some tasks, lead to more natural interactions and are associated with improved task performance. In a dialogue-capable robot, a key to such believability is the integration of a consistent storyline, verbal and nonverbal behaviors, and sociocultural context. We describe our work in this area and present empirical results from three robot receptionist testbeds that operate "in the wild."
50

Ling, Honson, and Elin Björling. "Sharing Stress With a Robot: What Would a Robot Say?" Human-Machine Communication 1 (February 1, 2020): 133–58. http://dx.doi.org/10.30658/hmc.1.8.

Abstract:
With the prevalence of mental health problems today, designing human-robot interaction for mental health intervention is not only possible, but critical. The current experiment examined how three types of robot disclosure (emotional, technical, and by-proxy) affect robot perception and human disclosure behavior during a stress-sharing activity. Emotional robot disclosure resulted in the lowest robot perceived safety. Post-hoc analysis revealed that increased perceived stress predicted reduced human disclosure, user satisfaction, robot likability, and future robot use. Negative attitudes toward robots also predicted reduced intention for future robot use. This work informs on the possible design of robot disclosure, as well as how individual attributes, such as perceived stress, can impact human robot interaction in a mental health context.