Journal articles on the topic "Human-robot interaction"

Below are the top 50 journal articles for research on the topic "Human-robot interaction".

1

Takamatsu, Jun. "Human-Robot Interaction". Journal of the Robotics Society of Japan 37, no. 4 (2019): 293–96. http://dx.doi.org/10.7210/jrsj.37.293.

2

Jia, Yunyi, Biao Zhang, Miao Li, Brady King and Ali Meghdari. "Human-Robot Interaction". Journal of Robotics 2018 (October 1, 2018): 1–2. http://dx.doi.org/10.1155/2018/3879547.

3

Murphy, Robin, Tatsuya Nomura, Aude Billard and Jennifer Burke. "Human–Robot Interaction". IEEE Robotics & Automation Magazine 17, no. 2 (June 2010): 85–89. http://dx.doi.org/10.1109/mra.2010.936953.

4

Sethumadhavan, Arathi. "Human-Robot Interaction". Ergonomics in Design: The Quarterly of Human Factors Applications 20, no. 3 (July 2012): 27–28. http://dx.doi.org/10.1177/1064804612449796.

5

Sheridan, Thomas B. "Human–Robot Interaction". Human Factors: The Journal of the Human Factors and Ergonomics Society 58, no. 4 (April 20, 2016): 525–32. http://dx.doi.org/10.1177/0018720816644364.

6

Jones, Keith S., and Elizabeth A. Schmidlin. "Human-Robot Interaction". Reviews of Human Factors and Ergonomics 7, no. 1 (August 25, 2011): 100–148. http://dx.doi.org/10.1177/1557234x11410388.

7

Thomaz, Andrea, Guy Hoffman and Maya Cakmak. "Computational Human-Robot Interaction". Foundations and Trends in Robotics 4, no. 2-3 (2016): 104–223. http://dx.doi.org/10.1561/2300000049.

8

Karniel, Amir, Angelika Peer, Opher Donchin, Ferdinando A. Mussa-Ivaldi and Gerald E. Loeb. "Haptic Human-Robot Interaction". IEEE Transactions on Haptics 5, no. 3 (2012): 193–95. http://dx.doi.org/10.1109/toh.2012.47.

9

Pook, Polly K., and Dana H. Ballard. "Deictic human/robot interaction". Robotics and Autonomous Systems 18, no. 1-2 (July 1996): 259–69. http://dx.doi.org/10.1016/0921-8890(95)00080-1.

10

Young, James E., JaYoung Sung, Amy Voida, Ehud Sharlin, Takeo Igarashi, Henrik I. Christensen and Rebecca E. Grinter. "Evaluating Human-Robot Interaction". International Journal of Social Robotics 3, no. 1 (October 1, 2010): 53–67. http://dx.doi.org/10.1007/s12369-010-0081-8.

11

Lee, Heejin. "A Human-Robot Interaction Entertainment Pet Robot". Journal of Korean Institute of Intelligent Systems 24, no. 2 (April 25, 2014): 179–85. http://dx.doi.org/10.5391/jkiis.2014.24.2.179.

12

Mitsunaga, N., C. Smith, T. Kanda, H. Ishiguro and N. Hagita. "Adapting Robot Behavior for Human–Robot Interaction". IEEE Transactions on Robotics 24, no. 4 (August 2008): 911–16. http://dx.doi.org/10.1109/tro.2008.926867.

13

Qu, Jingtao, Mateusz Jarosz and Bartlomiej Sniezynski. "Robot Control Platform for Multimodal Interactions with Humans Based on ChatGPT". Applied Sciences 14, no. 17 (September 7, 2024): 8011. http://dx.doi.org/10.3390/app14178011.

Abstract:
This paper presents the architecture of a multimodal human–robot interaction control platform that leverages the advanced language capabilities of ChatGPT to facilitate more natural and engaging conversations between humans and robots. Implemented on the Pepper humanoid robot, the platform aims to enhance communication by providing a richer and more intuitive interface. The motivation behind this study is to enhance robot performance in human interaction through cutting-edge natural language processing technology, thereby improving public attitudes toward robots, fostering the development and application of robotic technology, and reducing the negative attitudes often associated with human–robot interactions. To validate the system, we conducted experiments measuring negative attitude robot scale and their robot anxiety scale scores before and after interacting with the robot. Statistical analysis of the data revealed a significant improvement in the participants’ attitudes and a notable reduction in anxiety following the interaction, indicating that the system holds promise for fostering more positive human–robot relationships.
14

Lai, Yujun, Gavin Paul, Yunduan Cui and Takamitsu Matsubara. "User intent estimation during robot learning using physical human robot interaction primitives". Autonomous Robots 46, no. 2 (January 15, 2022): 421–36. http://dx.doi.org/10.1007/s10514-021-10030-9.

Abstract:
As robotic systems transition from traditional setups to collaborative work spaces, the prevalence of physical Human Robot Interaction has risen in both industrial and domestic environments. A popular representation for robot behavior is movement primitives which learn, imitate, and generalize from expert demonstrations. While there are existing works in context-aware movement primitives, they are usually limited to contact-free human robot interactions. This paper presents physical Human Robot Interaction Primitives (pHRIP), which utilize only the interaction forces between the human user and robot to estimate user intent and generate the appropriate robot response during physical human robot interactions. The efficacy of pHRIP is evaluated through multiple experiments based on target-directed reaching and obstacle avoidance tasks using a real seven degree of freedom robot arm. The results are validated against Interaction Primitives which use observations of robotic trajectories, with discussions of future pHRI applications utilizing pHRIP.
15

Tyler, Neil. "Human Robot Interactions". New Electronics 51, no. 22 (December 10, 2019): 12–14. http://dx.doi.org/10.12968/s0047-9624(22)61505-0.

16

KAMBAROV, Ikrom, Matthias BROSSOG, Jorg FRANKE, David KUNZ and Jamshid INOYATKHODJAEV. "From Human to Robot Interaction towards Human to Robot Communication in Assembly Systems". Eurasia Proceedings of Science Technology Engineering and Mathematics 23 (October 16, 2023): 241–52. http://dx.doi.org/10.55549/epstem.1365802.

Abstract:
The interaction between humans and robots has been a rapidly developing technology and a frequently discussed research topic in the last decade because current robots ensure the physical safety of humans during close proximity assembly operations. This interaction promises capability flexibility due to human dexterity skills and capacity flexibility due to robot accuracy. Nevertheless, in these interactions, the humans are marginally outside of the system, while the robots are seen as a crucial component of the assembly activities, which causes the systems to lack flexibility and efficiency. Therefore, this paper presents a study on Human to Robot communication in assembly systems. We conducted a systematic review of related literature and industrial applications involving human and robot interaction modes over the last decade to identify research gaps in the integration of collaborative robots into assembly systems. We believe that we are in a transformation phase from physical interaction mode towards cognitive interaction mode between humans and robots, where humans and robots are able to interact with each other during mutual working conditions and humans are able to guide robots. The main contribution of this paper is to propose a future mode of human-robot interaction in which a skilled operator performs not only physical cooperative tasks with robots but also work aided by smart technologies that allow communication with robots. This interaction mode allows for an increase in the flexibility and productivity of the assembly operation as well as the wellbeing of the human operator in a human-centered manufacturing environment.
17

SUWANNATHAT, Thatsaphan, Jun-ichi IMAI and Masahide KANEKO. "1P1-K06 Audio-Visual Speaker Detection in Human-Robot Interaction". Proceedings of JSME annual Conference on Robotics and Mechatronics (Robomec) 2007 (2007): _1P1—K06_1—_1P1—K06_4. http://dx.doi.org/10.1299/jsmermd.2007._1p1-k06_1.

18

Shiomi, Masahiro, Hidenobu Sumioka and Hiroshi Ishiguro. "Special Issue on Human-Robot Interaction in Close Distance". Journal of Robotics and Mechatronics 32, no. 1 (February 20, 2020): 7. http://dx.doi.org/10.20965/jrm.2020.p0007.

Abstract:
As social robot research is advancing, the interaction distance between people and robots is decreasing. Indeed, although we were once required to maintain a certain physical distance from traditional industrial robots for safety, we can now interact with social robots in such a close distance that we can touch them. The physical existence of social robots will be essential to realize natural and acceptable interactions with people in daily environments. Because social robots function in our daily environments, we must design scenarios where robots interact closely with humans by considering various viewpoints. Interactions that involve touching robots influence the changes in the behavior of a person strongly. Therefore, robotics researchers and developers need to design such scenarios carefully. Based on these considerations, this special issue focuses on close human-robot interactions. This special issue on “Human-Robot Interaction in Close Distance” includes a review paper and 11 other interesting papers covering various topics such as social touch interactions, non-verbal behavior design for touch interactions, child-robot interactions including physical contact, conversations with physical interactions, motion copying systems, and mobile human-robot interactions. We thank all the authors and reviewers of the papers and hope this special issue will help readers better understand human-robot interaction in close distance.
19

Zhao, Mengyao. "Emotion Recognition in Psychology of Human-robot Interaction". Psychomachina 1 (November 21, 2023): 1–11. http://dx.doi.org/10.59388/pm00331.

Abstract:
The field of Human-Robot Interaction (HRI) has garnered significant attention in recent years, with researchers and practitioners seeking to understand the psychological aspects underlying the interactions between humans and robots. One crucial area of focus within HRI is the psychology of emotion recognition, which plays a fundamental role in shaping the dynamics of human-robot interaction. This paper provides an overview of the background of psychology in the context of human-robot interaction, emphasizing the significance of understanding human emotions in this domain. The concept of emotion recognition, a key component of human psychology, is explored in detail, highlighting its relevance in the context of human-robot interaction. Emotion recognition allows robots to perceive and interpret human emotions, enabling them to respond appropriately and enhance the quality of interaction. The role of emotion recognition in HRI is examined from a psychological standpoint, shedding light on its implications for the design and development of effective human-robot interfaces. Furthermore, this paper delves into the application of machine learning techniques for emotion recognition in the context of human-robot interaction. Machine learning algorithms have shown promise in enabling robots to recognize and respond to human emotions, thereby contributing to more natural and intuitive interactions. The utilization of machine learning in emotion recognition reflects the intersection of psychology and technological advancements in the field of HRI. Finally, the challenges associated with emotion recognition in HRI are discussed, encompassing issues such as cross-cultural variations in emotional expression, individual differences, and the ethical implications of emotion detection. Addressing these challenges is pivotal in advancing the understanding and implementation of emotion recognition in human-robot interaction, underscoring the interdisciplinary nature of this endeavor. 
In conclusion, this paper underscores the critical role of emotion recognition in the psychology of human-robot interaction, emphasizing its potential to revolutionize the way humans and robots engage with each other. By integrating insights from psychology, machine learning, and technology, advancements in emotion recognition have the potential to pave the way for more empathetic and responsive human-robot interactions, offering new avenues for research and practical applications in this burgeoning field.
20

Goodrich, Michael A., and Alan C. Schultz. "Human-Robot Interaction: A Survey". Foundations and Trends® in Human-Computer Interaction 1, no. 3 (2007): 203–75. http://dx.doi.org/10.1561/1100000005.

21

Pieskä, Sakari, Jari Kaarela and Ossi Saukko. "Towards easier human-robot interaction". Intelligent Decision Technologies 9, no. 1 (December 10, 2014): 41–53. http://dx.doi.org/10.3233/idt-140204.

22

Archer, Susan, and Patricia L. McDermott. "Advances in Human-Robot Interaction". Proceedings of the Human Factors and Ergonomics Society Annual Meeting 49, no. 3 (September 2005): 386. http://dx.doi.org/10.1177/154193120504900336.

23

Agrawal, Pramila, Changchun Liu and Nilanjan Sarkar. "Interaction between human and robot". Interaction Studies 9, no. 2 (May 26, 2008): 230–57. http://dx.doi.org/10.1075/is.9.2.05agr.

Abstract:
This paper presents a human–robot interaction framework where a robot can infer implicit affective cues of a human and respond to them appropriately. Affective cues are inferred by the robot in real-time from physiological signals. A robot-based basketball game is designed where a robotic “coach” monitors the human participant’s anxiety to dynamically reconfigure game parameters to allow skill improvement while maintaining desired anxiety levels. The results of the above-mentioned anxiety-based sessions are compared with performance-based sessions where in the latter sessions, the game is adapted only according to the player’s performance. It was observed that 79% of the participants showed lower anxiety during anxiety-based session than in the performance-based session, 64% showed a greater improvement in performance after the anxiety-based session and 71% of the participants reported greater overall satisfaction during the anxiety-based sessions. This is the first time, to our knowledge, that the impact of real-time affective communication between a robot and a human has been demonstrated experimentally.
24

Scalzone, Franco, and Guglielmo Tamburrini. "Human-robot interaction and psychoanalysis". AI & SOCIETY 28, no. 3 (February 17, 2012): 297–307. http://dx.doi.org/10.1007/s00146-012-0413-3.

25

Berns, Karsten, and Zuhair Zafar. "Emotion based human-robot interaction". MATEC Web of Conferences 161 (2018): 01001. http://dx.doi.org/10.1051/matecconf/201816101001.

Abstract:
Human-machine interaction is a major challenge in the development of complex humanoid robots. In addition to verbal communication the use of non-verbal cues such as hand, arm and body gestures or mimics can improve the understanding of the intention of the robot. On the other hand, by perceiving such mechanisms of a human in a typical interaction scenario the humanoid robot can adapt its interaction skills in a better way. In this work, the perception system of two social robots, ROMAN and ROBIN of the RRLAB of the TU Kaiserslautern, is presented in the range of human-robot interaction.
26

Bradshaw, Melissa. "FREEpHRI Targets Human-Robot Interaction". Engineer 302, no. 7934 (March 2022): 9. http://dx.doi.org/10.12968/s0013-7758(22)90353-8.

27

Bonarini, Andrea. "Communication in Human-Robot Interaction". Current Robotics Reports 1, no. 4 (August 27, 2020): 279–85. http://dx.doi.org/10.1007/s43154-020-00026-1.

Abstract:
Purpose of Review: To present the multi-faceted aspects of communication between robot and humans (HRI), putting in evidence that it is not limited to language-based interaction, but it includes all aspects that are relevant in communication among physical beings, exploiting all the available sensor channels. Recent Findings: For specific purposes, machine learning algorithms could be exploited when data sets and appropriate algorithms are available. Summary: Together with linguistic aspects, physical aspects play an important role in HRI and make the difference with respect to the more limited human-computer interaction (HCI). A review of the recent literature about the exploitation of different interaction channels is presented. The interpretation of signals and the production of appropriate communication actions require to consider psychological, sociological, and practical aspects, which may affect the performance. Communication is just one of the functionalities of an interactive robot and, as all the others, will need to be benchmarked to support the possibility for social robots to reach a real market.
28

Marti, Patrizia, Leonardo Giusti, Alessandro Pollini and Alessia Rullo. "Expressiveness in Human-Robot Interaction". Interaction Design and Architecture(s), no. 5_6 (March 20, 2009): 93–98. http://dx.doi.org/10.55612/s-5002-005_6-015.

Abstract:
This article presents the design of Iromec, a modular robot companion tailored towards engaging in social exchanges with children with different disabilities with the aim to empower them to discover a wide rage of play styles from solitary to social and cooperative play. In particular this paper focuses on expressiveness as a fundamental feature of the robot for engaging in meaningful interaction with different typologies of disable children – Autistic children, Moderate Mentally Retarded children and Severe Motor Impaired children. Modularity and configurability of expressive traits contribute to the flexibility of the system in creating rewarding games that can be easily understood by the child and can promote fun and learning. Other key features of the system are the combination of autonomous and user-controlled behaviour and a strong emphasis on identity and expressiveness that can be dynamically adapted during play. A main contribution of this work is that the robot’s expressiveness is achieved through different channels (facial expression, gesture, pose, body language -appearance, shape, movement-) and realised through the use of both digital and mechanical components but also of smart materials and textiles.
29

Kim, Rae Yule. "Anthropomorphism and Human-Robot Interaction". Communications of the ACM 67, no. 2 (January 25, 2024): 80–85. http://dx.doi.org/10.1145/3624716.

30

Berg, Julia, Albrecht Lottermoser, Christoph Richter and Gunther Reinhart. "Human-Robot-Interaction for mobile industrial robot teams". Procedia CIRP 79 (2019): 614–19. http://dx.doi.org/10.1016/j.procir.2019.02.080.

31

Du, Guanglong, Mingxuan Chen, Caibing Liu, Bo Zhang and Ping Zhang. "Online Robot Teaching With Natural Human–Robot Interaction". IEEE Transactions on Industrial Electronics 65, no. 12 (December 2018): 9571–81. http://dx.doi.org/10.1109/tie.2018.2823667.

32

Momen, Ali, and Eva Wiese. "Noticing Extroversion Effects Attention: How Robot and Participant Personality Affect Gaze Cueing". Proceedings of the Human Factors and Ergonomics Society Annual Meeting 62, no. 1 (September 2018): 1557–61. http://dx.doi.org/10.1177/1541931218621352.

Abstract:
Social robots with expressive gaze have positive effects on human-robot interaction. In particular, research suggests that when robots are programmed to express introverted or extroverted gaze behavior, individuals enjoy interacting more with robots that match their personality. However, how this affects social-cognitive performance during human-robot interactions has not been thoroughly examined yet. In the current paper, we examine whether the perceived match between human and robot personality positively affects the degree to which the robot’s gaze is followed (i.e., gaze cueing, as a proxy for more complex social-cognitive behavior). While social attention has been examined extensively outside of human-robot interaction, recent research shows that a robot’s gaze is attended to in a similar way as a human’s gaze. While our results did not support the hypothesis that gaze cueing would be strongest when the participant’s personality matched the robot’s personality, we did find evidence that participants followed the gaze of introverted robots more strongly than the gaze of extroverted robots. This finding suggests that agent’s displaying extroverted gaze behavior may hurt performance in human-robot interaction.
33

Tanaka, Ryosuke, Jinseok Woo and Naoyuki Kubota. "Nonverbal Communication Based on Instructed Learning for Socially Embedded Robot Partners". Journal of Advanced Computational Intelligence and Intelligent Informatics 23, no. 3 (May 20, 2019): 584–91. http://dx.doi.org/10.20965/jaciii.2019.p0584.

Abstract:
The research and development of robot partners have been actively conducted to support human daily life. Human-robot interaction is one of the important research field, in which verbal and nonverbal communication are essential elements for improving the interactions between humans and robots. Thus, the purpose of this research was to establish a method to adapt a human-robot interaction mechanism for robot partners to various situations. In the proposed system, the robot needs to analyze the gestures of humans to interact with them. Humans have the ability to interact according to dynamically changing environmental conditions. Therefore, when robots interact with a human, it is necessary for robots to interact appropriately by correctly judging the situation according to human gestures to carry out natural human-robot interaction. In this paper, we propose a constructive methodology on a system that enables nonverbal communication elements for human-robot interaction. The proposed method was validated through a series of experiments.
34

Park, Eunil, and Jaeryoung Lee. "I am a warm robot: the effects of temperature in physical human–robot interaction". Robotica 32, no. 1 (August 2, 2013): 133–42. http://dx.doi.org/10.1017/s026357471300074x.

Abstract:
What factors affect users' perceptions of physical human–robot interactions? To answer this question, this study examined whether the skin temperature of a social robot affected users' perceptions of the robot during physical interaction. Results from a between-subjects experiment (warm, intermediate, cool, or no interaction) with a dinosaur robot demonstrated that skin temperature significantly affects users' perceptions and evaluations of a socially interactive robot. Additionally, this study found that social presence had partial mediating effects on several dependent variables. Important implications and limitations for improving human–robot interactions are discussed here.
35

Filippini, Chiara, David Perpetuini, Daniela Cardone and Arcangelo Merla. "Improving Human–Robot Interaction by Enhancing NAO Robot Awareness of Human Facial Expression". Sensors 21, no. 19 (September 27, 2021): 6438. http://dx.doi.org/10.3390/s21196438.

Abstract:
An intriguing challenge in the human–robot interaction field is the prospect of endowing robots with emotional intelligence to make the interaction more genuine, intuitive, and natural. A crucial aspect in achieving this goal is the robot’s capability to infer and interpret human emotions. Thanks to its design and open programming platform, the NAO humanoid robot is one of the most widely used agents for human interaction. As with person-to-person communication, facial expressions are the privileged channel for recognizing the interlocutor’s emotional expressions. Although NAO is equipped with a facial expression recognition module, specific use cases may require additional features and affective computing capabilities that are not currently available. This study proposes a highly accurate convolutional-neural-network-based facial expression recognition model that is able to further enhance the NAO robot’ awareness of human facial expressions and provide the robot with an interlocutor’s arousal level detection capability. Indeed, the model tested during human–robot interactions was 91% and 90% accurate in recognizing happy and sad facial expressions, respectively; 75% accurate in recognizing surprised and scared expressions; and less accurate in recognizing neutral and angry expressions. Finally, the model was successfully integrated into the NAO SDK, thus allowing for high-performing facial expression classification with an inference time of 0.34 ± 0.04 s.
36

Lee, Youngho, Young Jae Ryoo and Jongmyung Choi. "Framework for Interaction Among Human–Robot-Environment in DigiLog Space". International Journal of Humanoid Robotics 11, no. 04 (December 2014): 1442005. http://dx.doi.org/10.1142/s0219843614420055.

Abstract:
With the development of computing technology, robots are now popular in our daily life. Human–robot interaction is not restricted to a direct communication between them. The communication could include various different human to human interactions. In this paper, we present a framework for enhancing the interaction among human–robot-environments. The proposed framework is composed of a robot part, a user part, and the DigiLog space. To evaluate the proposed framework, we applied the framework into a real-time remote robot-control platform in the smart DigiLog space. We are implementing real time controlling and monitoring of a robot by using one smart phone as the robot brain and the other smart phone as the remote controller.
37

Nakauchi, Yasushi. "Special Issue on Human Robot Interaction". Journal of Robotics and Mechatronics 14, no. 5 (October 20, 2002): 431. http://dx.doi.org/10.20965/jrm.2002.p0431.

Abstract:
Recent advances in robotics are disseminating robots into the social living environment as humanoids, pets, and caregivers. Novel human-robot interaction techniques and interfaces must be developed, however, to ensure that such robots interact as expected in daily life and work. Unlike conventional personal computers, such robots may assume a variety of configurations, such as industrial, wheel-based, ambulatory, remotely operated, autonomous, and wearable. They may also implement different communications modalities, including voice, video, haptics, and gestures. All of these aspects require that research on human-robot interaction become interdisciplinary, combining research from such fields as robotics, ergonomics, computer science and, psychology. In the field of computer science, new directions in human-computer interaction are emerging as post graphical user interfaces (GUIs). These include wearable, ubiquitous, and real-world computing. Such advances are thereby bridging the gap between robotics and computer science. The open-ended problems that potentially face include the following: What is the most desirable type of interaction between human beings and robots? What sort of technology will enable these interactions? How will human beings accept robots in their daily life and work? We are certain that readers of this special issue will be able to find many of the answers and become open to future directions concerning these problems. Any information that readers find herein will be a great pleasure to its editors.
38

Animesh, Kumar, and Dr Srikanth V. "Enhancing Healthcare through Human-Robot Interaction using AI and Machine Learning". International Journal of Research Publication and Reviews 5, no. 3 (March 21, 2024): 184–90. http://dx.doi.org/10.55248/gengpi.5.0324.0831.

39

Fischer, Kerstin. "Tracking Anthropomorphizing Behavior in Human-Robot Interaction". ACM Transactions on Human-Robot Interaction 11, no. 1 (March 31, 2022): 1–28. http://dx.doi.org/10.1145/3442677.

Abstract:
Existing methodologies to describe anthropomorphism in human-robot interaction often rely either on specific one-time responses to robot behavior, such as keeping the robot's secret, or on post hoc measures, such as questionnaires. Currently, there is no method to describe the dynamics of people's behavior over the course of an interaction and in response to robot behavior. In this paper, I propose a method that allows the researcher to trace anthropomorphizing and non-anthropomorphizing responses to robots dynamically moment-by-moment over the course of human-robot interactions. I illustrate this methodology in a case study and find considerable variation between participants, but also considerable intrapersonal variation in the ways the robot is anthropomorphized. That is, people may respond to the robot as if it was another human in one moment and to its machine-like properties in the next. These findings may influence explanatory models of anthropomorphism.
40

Priyanayana, S., B. Jayasekara and R. Gopura. "Adapting concept of human-human multimodal interaction in human-robot applications". Bolgoda Plains 2, no. 2 (December 2022): 18–20. http://dx.doi.org/10.31705/bprm.v2(2).2022.4.

Abstract:
Human communication is multimodal in nature. In a normal environment, people use to interact with other humans and with the environment using more than one modality or medium of communication. They speak, use gestures and look at things to interact with nature and other humans. By listening to the different voice tones, looking at face gazes, and arm movements people understand communication cues. A discussion with two people will be in vocal communication, hand gestures, head gestures, and facial cues, etc. [1]. If textbook definition is considered synergistic use of these interaction methods is known as multimodal interaction [2]. For example, , a wheelchair user might instruct the smart wheelchair or the assistant to go forward, as shown in Fig. 1(a). However, with a hand gesture shown in the figure, he or she might want to go slowly. In the same way as of Fig. 1(b), a person might give someone a direction with a vocal command ‘that way’ and gesture the direction with his or her hand. In most Human-Robot Interaction (HRI) developments, there is an assumption that human interactions are unimodal. This forces the researchers to ignore the information other modalities carry with them. Therefore, it would provide an additional dimension for interpretation of human robot interactions. This article provides a concise description of how to adapt the concept of multimodal interaction in human-robot applications.
41

Su, Wei Hua, Jing Gong Sun, Fu Niu and Xin Yue Xu. "The Human-Robot Interaction: An Investigation of Rescue Robot". Advanced Materials Research 711 (June 2013): 523–28. http://dx.doi.org/10.4028/www.scientific.net/amr.711.523.

Full text
Abstract:
This research aimed to further the study of human-robot interaction (HRI) issues, especially regarding the development of rescue robots. The paper first discusses the status of rescue robots and describes the framework of human-robot interaction for search-rescue and rescue-evacuation robots. Subsequently, general HRI issues are discussed to explain how they affect the use of robots. Finally, we suggest that this multidisciplinary field of research, namely human-robot interaction, requires contributions from a variety of research fields such as robotics, human-computer interaction, and artificial intelligence.
42

Losey, Dylan P., Andrea Bajcsy, Marcia K. O’Malley and Anca D. Dragan. "Physical interaction as communication: Learning robot objectives online from human corrections". International Journal of Robotics Research 41, no. 1 (October 25, 2021): 20–44. http://dx.doi.org/10.1177/02783649211050958.

Full text
Abstract:
When a robot performs a task next to a human, physical interaction is inevitable: the human might push, pull, twist, or guide the robot. The state of the art treats these interactions as disturbances that the robot should reject or avoid. At best, these robots respond safely while the human interacts; but after the human lets go, these robots simply return to their original behavior. We recognize that physical human–robot interaction (pHRI) is often intentional: the human intervenes on purpose because the robot is not doing the task correctly. In this article, we argue that when pHRI is intentional it is also informative: the robot can leverage interactions to learn how it should complete the rest of its current task even after the person lets go. We formalize pHRI as a dynamical system, where the human has in mind an objective function they want the robot to optimize, but the robot does not get direct access to the parameters of this objective: they are internal to the human. Within our proposed framework human interactions become observations about the true objective. We introduce approximations to learn from and respond to pHRI in real-time. We recognize that not all human corrections are perfect: often users interact with the robot noisily, and so we improve the efficiency of robot learning from pHRI by reducing unintended learning. Finally, we conduct simulations and user studies on a robotic manipulator to compare our proposed approach with the state of the art. Our results indicate that learning from pHRI leads to better task performance and improved human satisfaction.
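The core idea, treating a physical correction as an observation about the human's internal objective, is commonly realized as an online update of feature weights. A minimal sketch under stated assumptions (the feature dimension, learning rate, and feature names are illustrative, not the paper's exact algorithm):

```python
import numpy as np

def update_objective(theta, phi_planned, phi_corrected, lr=0.1):
    """One online learning step: shift the estimated objective weights
    theta toward the features of the human-corrected trajectory and
    away from those of the robot's originally planned trajectory."""
    return theta + lr * (phi_corrected - phi_planned)

# After a physical correction, the robot re-weights hypothetical
# features such as "height above table" and "end-effector speed":
theta = np.zeros(2)
theta = update_objective(theta,
                         phi_planned=np.array([1.0, 0.5]),
                         phi_corrected=np.array([0.2, 0.5]))
```

Because the update acts on the objective rather than the trajectory, the robot keeps the learned preference for the rest of the task even after the person lets go.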
43

Avelino, João, Tiago Paulino, Carlos Cardoso, Ricardo Nunes, Plinio Moreno and Alexandre Bernardino. "Towards natural handshakes for social robots: human-aware hand grasps using tactile sensors". Paladyn, Journal of Behavioral Robotics 9, no. 1 (August 1, 2018): 221–34. http://dx.doi.org/10.1515/pjbr-2018-0017.

Full text
Abstract:
Handshaking is a fundamental part of human physical interaction that is transversal to various cultural backgrounds. It is also a very challenging task in the field of Physical Human-Robot Interaction (pHRI), requiring compliant force control in order to plan the arm's motion and achieve a confident, yet pleasant, grasp of the human user's hand. In this paper, we focus on the study of hand grip strength for comfortable handshakes and perform three sets of physical interaction experiments: with twenty human subjects in the first experiment, thirty-five in the second, and thirty-eight in the third. Tests are made with a social robot whose hands are instrumented with tactile sensors that provide skin-like sensation. From these experiments, we: (i) learn the preferred grip closure for each user group; (ii) analyze the tactile feedback provided by the sensors for each closure; (iii) develop and evaluate a hand grip controller based on the collected data. In addition to the robot-human interactions, we also study handshakes the robot executes with inanimate objects, in order to detect whether it is shaking hands with a human or an inanimate object. This work adds physical human-robot interaction to the repertory of social skills of our robot, fulfilling a demand previously identified by many users of the robot.
44

Yang, Shangshang, Xiao Gao, Zhao Feng and Xiaohui Xiao. "Learning Pose Dynamical System for Contact Tasks under Human Interaction". Actuators 12, no. 4 (April 20, 2023): 179. http://dx.doi.org/10.3390/act12040179.

Full text
Abstract:
Robots are expected to execute various operation tasks like a human by learning human working skills, especially for complex contact tasks. Increasing demands for human–robot interaction during task execution makes robot motion planning and control a considerable challenge, not only to reproduce demonstration motion and force in the contact space but also to resume working after interacting with a human without re-planning motion. In this article, we propose a novel framework based on a time-invariant dynamical system (DS), taking into account both human skills transfer and human–robot interaction. In the proposed framework, the human demonstration trajectory was modeled by the pose diffeomorphic DS to achieve online motion planning. Furthermore, the motion of the DS was modified by admittance control to satisfy different demands. We evaluated the method with a UR5e robot in the contact task of the composite woven layup. The experimental results show that our approach can effectively reproduce the trajectory and force learned from human demonstration, allow human–robot interaction safely during the task, and control the robot to return to work automatically after human interaction.
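The admittance-control idea mentioned above, rendering a measured interaction force as compliant motion on top of the DS-planned trajectory, can be sketched with a one-dimensional discrete admittance law. The mass and damping values are illustrative assumptions, not the paper's parameters:

```python
def admittance_step(v, f_ext, dt=0.01, m=2.0, d=10.0):
    """One Euler step of the admittance law  M*dv/dt + D*v = f_ext:
    an external force f_ext is turned into compliant velocity, and
    the velocity decays back toward zero once the human lets go,
    letting the nominal DS motion resume without re-planning."""
    a = (f_ext - d * v) / m
    return v + a * dt

# A push produces motion; releasing (f_ext = 0) lets it die out.
v = admittance_step(0.0, f_ext=5.0)   # human pushes
v = admittance_step(v,   f_ext=0.0)   # human releases
```

Because the time-invariant DS always flows toward the task target, adding this decaying admittance offset is what allows the robot to return to work automatically after the interaction ends.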
45

Oliveira, Raquel, Patrícia Arriaga and Ana Paiva. "Human-Robot Interaction in Groups: Methodological and Research Practices". Multimodal Technologies and Interaction 5, no. 10 (September 30, 2021): 59. http://dx.doi.org/10.3390/mti5100059.

Full text
Abstract:
Understanding the behavioral dynamics that underlie human-robot interactions in groups remains one of the core challenges in social robotics research. However, despite growing interest in this topic, there is still a lack of established and validated measures that allow researchers to analyze human-robot interactions in group scenarios, and very few have been developed and tested specifically for research conducted in the wild. This is a problem because it hinders the development of general models of human-robot interaction and makes the comprehension of the inner workings of the relational dynamics between humans and robots in group contexts significantly more difficult. In this paper, we aim to provide a reflection on the current state of research on human-robot interaction in small groups, as well as to outline directions for future research, with an emphasis on methodological and transversal issues.
46

Regmi, Sambad, Devin Burns and Yun Seong Song. "A robot for overground physical human-robot interaction experiments". PLOS ONE 17, no. 11 (November 10, 2022): e0276980. http://dx.doi.org/10.1371/journal.pone.0276980.

Full text
Abstract:
Many anticipated physical human-robot interaction (pHRI) applications in the near future are overground tasks such as walking assistance. For investigating the biomechanics of human movement during pHRI, this work presents Ophrie, a novel interactive robot dedicated for physical interaction tasks with a human in overground settings. Unique design requirements for pHRI were considered in implementing the one-arm mobile robot, such as the low output impedance and the ability to apply small interaction forces. The robot can measure the human arm stiffness, an important physical quantity that can reveal human biomechanics during overground pHRI, while the human walks alongside the robot. This robot is anticipated to enable novel pHRI experiments and advance our understanding of intuitive and effective overground pHRI.
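Measuring arm stiffness during interaction typically reduces to fitting a spring model F ≈ K·x to recorded force-displacement pairs. A least-squares sketch of that generic approach (the paper's actual estimation procedure may differ; the sample values are invented):

```python
def estimate_stiffness(forces, displacements):
    """Least-squares fit of the spring constant K in F = K * x,
    from forces (N) and displacements (m) recorded while the robot
    applies small perturbations to the walking human's arm."""
    num = sum(f * x for f, x in zip(forces, displacements))
    den = sum(x * x for x in displacements)
    return num / den

# Two perturbation samples consistent with K = 200 N/m:
k = estimate_stiffness([2.0, 4.0], [0.01, 0.02])
```

The requirement for low output impedance follows directly: the perturbation forces must stay small enough not to alter the natural walking behavior being measured.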
47

Woo, Jinseok, and Naoyuki Kubota. "Human-Robot Interaction Design Using Smart Device Based Robot Partner". International Journal of Artificial Life Research 6, no. 2 (July 2016): 23–43. http://dx.doi.org/10.4018/ijalr.2016070102.

Full text
Abstract:
Nowadays, various robot partners have been developed to realize human-friendly interactions. In general, a robot system is composed of hardware modules, software modules, and application contents. Designing utterance contents and motion patterns together as application contents is time-consuming, yet existing design support systems mainly focus on the generation of robot motion patterns. Furthermore, a methodology is needed to easily change the specification of hardware and software according to diversified needs, along with a development environment for designing application contents for verbal and nonverbal communication with people. In this paper, the authors propose robot partners with a modularized architecture of hardware and software based on smart devices, and propose a development environment that makes it easy to design contents for verbal and nonverbal communication. To address the difficulty of content design, they develop a design support environment using design templates of communication application contents. Next, they apply the robot partner to guiding visitors to the robot contest of the system design forum held at Tokyo Metropolitan University. Finally, they show several examples of interaction cases and discuss interaction design for smart-device-based robot partners.
48

Mori, Yoshikazu, Koji Ota and Tatsuya Nakamura. "Robot Motion Algorithm Based on Interaction with Human". Journal of Robotics and Mechatronics 14, no. 5 (October 20, 2002): 462–70. http://dx.doi.org/10.20965/jrm.2002.p0462.

Full text
Abstract:
In this paper, we quantitatively analyze the weariness and impressions that a human experiences when interacting with a robot through movement. A red ball and a blue ball are displayed on a simulation screen. The human moves the red ball with a mouse, and the computer moves the blue ball. Using these balls, the impression that the robot's actions give to the human is examined. We analyze the relationship between the robot's interactive characteristics and the impressions it produces in human-robot interaction experiments using methods from information theory. The difference between the impressions produced by the simulation and by an actual robot is verified with an omni-directional robot.
49

Mostafaoui, Ghiles, R. C. Schmidt, Syed Khursheed Hasnain, Robin Salesse and Ludovic Marin. "Human unintentional and intentional interpersonal coordination in interaction with a humanoid robot". PLOS ONE 17, no. 1 (January 19, 2022): e0261174. http://dx.doi.org/10.1371/journal.pone.0261174.

Full text
Abstract:
In order to establish natural social synchrony between two humans, two requirements need to be fulfilled. First, the coupling must be bi-directional: the two humans react to each other's actions. Second, natural social bodily synchronization can be intentional or unintentional. Assuming that these essential aspects of human-human interactions are present, the present paper investigates whether similar bodily synchrony emerges between an interacting human and an artificial agent such as a robot. More precisely, we investigate whether the same human unintentional rhythmic entrainment and synchronization is present in Human-Robot Interaction (HRI). We also evaluate which model (e.g., an adaptive vs. non-adaptive robot) better reproduces such unintentional entrainment. Finally, we compare inter-agent coordination stability in HRI under 1) unidirectional (robot with fixed frequency) versus bidirectional (robot with adaptive frequency) rhythmic entrainment and 2) human intentional versus unintentional coupling. Fifteen young adults made vertical arm movements in front of the NAO robot under five different conditions of intentional/unintentional and unidirectional/bidirectional interaction. Consistent with prior research on human-human interpersonal coordination, when humans interacted with our robot, (i) unintentional entrainment was present, (ii) bi-directional coupling produced more stable in-phase unintentional and intentional coordination, and (iii) intentional coordination was more stable than unintentional coordination. To conclude, this study provides a foundation for modeling future social robots involving unintentional and bidirectional synchronization, aspects which seem to enhance humans' willingness to interact with robots.
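The adaptive-frequency condition, a robot whose rhythm is entrained by the human's movement, is often modeled in the HRI literature with a frequency-adapting Hopf oscillator. A generic sketch of that mechanism (gains, form, and the Euler discretization are assumptions, not the authors' exact model):

```python
import math

def adaptive_hopf_step(x, y, omega, f_in, dt=0.001,
                       mu=1.0, gamma=8.0, eps=0.5):
    """One Euler step of a Hopf limit-cycle oscillator whose phase
    AND intrinsic frequency omega are pulled toward the periodic
    input f_in (e.g. the human's measured arm velocity). With
    eps = 0 the coupling vanishes and the robot keeps a fixed
    frequency (the unidirectional condition)."""
    r = math.hypot(x, y)          # current amplitude (must stay > 0)
    dx = gamma * (mu - r * r) * x - omega * y + eps * f_in
    dy = gamma * (mu - r * r) * y + omega * x
    domega = -eps * f_in * y / r  # frequency-adaptation term
    return x + dx * dt, y + dy * dt, omega + domega * dt

# With no input the oscillator simply rotates at its own frequency:
x, y, omega = adaptive_hopf_step(1.0, 0.0, 5.0, f_in=0.0)
```

Driving `f_in` from the human's motion makes the coupling bidirectional, which is the property the study associates with more stable in-phase coordination.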
50

Hsieh, Wei-Fen, Eri Sato-Shimokawara and Toru Yamaguchi. "Investigation of Robot Expression Style in Human-Robot Interaction". Journal of Robotics and Mechatronics 32, no. 1 (February 20, 2020): 224–35. http://dx.doi.org/10.20965/jrm.2020.p0224.

Full text
Abstract:
In our daily conversation, we obtain considerable information from our interlocutor's non-verbal behaviors, such as gaze and gestures. Several studies have shown that nonverbal messages are prominent factors in smoothing the process of human-robot interaction. Our previous studies have shown that not only a robot's appearance but also its gestures, tone, and other nonverbal factors influence a person's impression of it. This paper presents an analysis of the impressions made when human motions are implemented on a humanoid robot, with experiments conducted to evaluate the impressions produced by robot expressions. The results show the relation between robot expression patterns and human preferences. To further investigate the biofeedback elicited by different robot expression styles, a scenario-based experiment was conducted. The results revealed that people's emotions can be affected by robot behavior, and that the robot's way of expressing itself is what most influences whether or not it is perceived as friendly. The results suggest that it is potentially useful to incorporate our concept into a robot system to meet individual needs.