Academic literature on the topic 'Human-robot interaction'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Human-robot interaction.'
Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online, whenever these are available in the metadata.
Journal articles on the topic "Human-robot interaction"
Takamatsu, Jun. "Human-Robot Interaction." Journal of the Robotics Society of Japan 37, no. 4 (2019): 293–96. http://dx.doi.org/10.7210/jrsj.37.293.
Jia, Yunyi, Biao Zhang, Miao Li, Brady King, and Ali Meghdari. "Human-Robot Interaction." Journal of Robotics 2018 (October 1, 2018): 1–2. http://dx.doi.org/10.1155/2018/3879547.
Murphy, Robin, Tatsuya Nomura, Aude Billard, and Jennifer Burke. "Human–Robot Interaction." IEEE Robotics & Automation Magazine 17, no. 2 (June 2010): 85–89. http://dx.doi.org/10.1109/mra.2010.936953.
Sethumadhavan, Arathi. "Human-Robot Interaction." Ergonomics in Design: The Quarterly of Human Factors Applications 20, no. 3 (July 2012): 27–28. http://dx.doi.org/10.1177/1064804612449796.
Sheridan, Thomas B. "Human–Robot Interaction." Human Factors: The Journal of the Human Factors and Ergonomics Society 58, no. 4 (April 20, 2016): 525–32. http://dx.doi.org/10.1177/0018720816644364.
Jones, Keith S., and Elizabeth A. Schmidlin. "Human-Robot Interaction." Reviews of Human Factors and Ergonomics 7, no. 1 (August 25, 2011): 100–148. http://dx.doi.org/10.1177/1557234x11410388.
Thomaz, Andrea, Guy Hoffman, and Maya Cakmak. "Computational Human-Robot Interaction." Foundations and Trends in Robotics 4, no. 2-3 (2016): 104–223. http://dx.doi.org/10.1561/2300000049.
Karniel, Amir, Angelika Peer, Opher Donchin, Ferdinando A. Mussa-Ivaldi, and Gerald E. Loeb. "Haptic Human-Robot Interaction." IEEE Transactions on Haptics 5, no. 3 (2012): 193–95. http://dx.doi.org/10.1109/toh.2012.47.
Pook, Polly K., and Dana H. Ballard. "Deictic human/robot interaction." Robotics and Autonomous Systems 18, no. 1-2 (July 1996): 259–69. http://dx.doi.org/10.1016/0921-8890(95)00080-1.
Young, James E., JaYoung Sung, Amy Voida, Ehud Sharlin, Takeo Igarashi, Henrik I. Christensen, and Rebecca E. Grinter. "Evaluating Human-Robot Interaction." International Journal of Social Robotics 3, no. 1 (October 1, 2010): 53–67. http://dx.doi.org/10.1007/s12369-010-0081-8.
Dissertations / Theses on the topic "Human-robot interaction"
Kruse, Thibault. "Planning for human robot interaction." Thesis, Toulouse 3, 2015. http://www.theses.fr/2015TOU30059/document.
The recent advances in robotics inspire visions of household and service robots making our lives easier and more comfortable. Such robots will be able to perform several object manipulation tasks required for household chores, autonomously or in cooperation with humans. In that role of human companion, the robot has to satisfy many additional requirements compared to well-established fields of industrial robotics. The purpose of planning for robots is to achieve robot behavior that is goal-directed and produces correct results. In human-robot interaction, however, robot behavior cannot merely be judged in terms of correct results; it must also be agreeable to human stakeholders. This means that the robot behavior must satisfy additional quality criteria: it must be safe, comfortable for humans, and intuitively understandable. There are established practices to ensure safety and provide comfort by keeping sufficient distances between the robot and nearby persons. However, producing behavior that is intuitively understood remains a challenge. This challenge increases greatly in dynamic human-robot interactions, where the human's future actions are unpredictable and the robot needs to constantly adapt its plans to changes. This thesis provides novel approaches to improve the legibility of robot behavior in such dynamic situations. Key to the approach is to consider not merely the quality of a single plan, but the behavior of the robot as a result of replanning multiple times during an interaction. For navigation planning, this thesis introduces directional cost functions that avoid problems in conflict situations. For action planning, it proposes local replanning of transport actions based on navigational costs, to provide opportunistic behavior. Both measures help human observers understand the robot's beliefs and intentions during interactions and reduce confusion.
Bodiroža, Saša. "Gestures in human-robot interaction." Doctoral thesis, Humboldt-Universität zu Berlin, Mathematisch-Naturwissenschaftliche Fakultät, 2017. http://dx.doi.org/10.18452/17705.
Gestures consist of movements of body parts and are a means of communication that conveys information or intentions to an observer. They can therefore be used effectively in human-robot interaction, or in human-machine interaction in general, as a way for a robot or a machine to infer a meaning. In order for people to intuitively use gestures and understand robot gestures, it is necessary to define mappings between gestures and their associated meanings -- a gesture vocabulary. A human gesture vocabulary defines which gestures a group of people would intuitively use to convey information, while a robot gesture vocabulary displays which robot gestures are deemed fitting for a particular meaning. Effective use of vocabularies depends on techniques for gesture recognition, which concerns the classification of body motion into discrete gesture classes using pattern recognition and machine learning. This thesis addresses both research areas, presenting the development of gesture vocabularies as well as gesture recognition techniques, focusing on hand and arm gestures. Attentional models for humanoid robots were developed as a prerequisite for human-robot interaction and a precursor to gesture recognition. A method for defining gesture vocabularies for humans and robots, based on user observations and surveys, is explained, and experimental results are presented. As a result of the robot gesture vocabulary experiment, an evolutionary approach for the refinement of robot gestures is introduced, based on interactive genetic algorithms. A robust and well-performing gesture recognition algorithm based on dynamic time warping has been developed. Most importantly, it employs one-shot learning, meaning that it can be trained with a small number of samples and deployed in real-life scenarios, lowering the effect of environmental constraints and gesture features. Finally, an approach for learning a relation between self-motion and pointing gestures is presented.
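The combination of dynamic time warping with one-shot learning mentioned in this abstract can be sketched in a few lines. This is an illustrative toy implementation, not the thesis's code: `dtw_distance` and `classify` are hypothetical names, the sequences are 1-D for brevity, and "one-shot" here simply means storing a single recorded template per gesture class.

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic time warping distance between two 1-D sequences."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            # extend the cheapest partial alignment: step in a, in b, or in both
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return float(cost[n, m])

def classify(sample, templates):
    """One-shot nearest-template classification: one stored example per class."""
    return min(templates, key=lambda label: dtw_distance(sample, templates[label]))
```

Because DTW aligns sequences of different lengths and speeds, a single template per gesture is often enough for a usable baseline recognizer.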
Miners, William Ben. "Toward Understanding Human Expression in Human-Robot Interaction." Thesis, University of Waterloo, 2006. http://hdl.handle.net/10012/789.
An intuitive method to minimize the effort of human communication with intelligent devices is to take advantage of our existing interpersonal communication experience. Recent advances in speech, hand gesture, and facial expression recognition provide viable alternative modes of communication that are more natural than conventional tactile interfaces. Using natural human communication eliminates the need to adapt to, and invest time and effort in, the less intuitive techniques required by traditional keyboard- and mouse-based interfaces.
Although the state of the art in natural but isolated modes of communication achieves impressive results, significant hurdles must be overcome before communication with devices in our daily lives will feel natural and effortless. Research has shown that combining information between multiple noise-prone modalities improves accuracy. Leveraging this complementary and redundant content will improve communication robustness and relax current unimodal limitations.
This research presents and evaluates a novel multimodal framework to help reduce the total human effort and time required to communicate with intelligent devices. This reduction is realized by determining human intent using a knowledge-based architecture that combines and leverages conflicting information available across multiple natural communication modes and modalities. The effectiveness of this approach is demonstrated using dynamic hand gestures and simple facial expressions characterizing basic emotions. It is important to note that the framework is not restricted to these two forms of communication. The framework presented in this research provides the flexibility necessary to include additional or alternate modalities and channels of information in future research, including improving the robustness of speech understanding.
The primary contributions of this research include the leveraging of conflicts in a closed-loop multimodal framework, explicit use of uncertainty in knowledge representation and reasoning across multiple modalities, and a flexible approach for leveraging domain specific knowledge to help understand multimodal human expression. Experiments using a manually defined knowledge base demonstrate an improved average accuracy of individual concepts and an improved average accuracy of overall intents when leveraging conflicts as compared to an open-loop approach.
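The core idea above — combining uncertain, possibly conflicting evidence from several modalities to determine intent — can be illustrated with a simple log-linear fusion of per-modality intent posteriors. This is a generic sketch under assumed names (`fuse`, the per-modality reliability `weights`), not the knowledge-based closed-loop framework developed in the thesis.

```python
import math

def fuse(posteriors, weights):
    """Log-linear fusion of per-modality intent posteriors.

    posteriors: list of dicts {intent: probability}, one dict per modality
                (e.g. gesture recognizer, facial-expression recognizer).
    weights:    reliability weight per modality (illustrative assumption).
    Returns a normalized fused distribution over intents.
    """
    intents = posteriors[0].keys()
    # weighted sum of log-probabilities; the epsilon guards against log(0)
    scores = {i: sum(w * math.log(p[i] + 1e-12)
                     for p, w in zip(posteriors, weights))
              for i in intents}
    # softmax-style normalization back to probabilities
    m = max(scores.values())
    exp = {i: math.exp(s - m) for i, s in scores.items()}
    z = sum(exp.values())
    return {i: v / z for i, v in exp.items()}
```

When two modalities conflict, the higher-weighted (more reliable) one dominates the fused distribution, which is one simple way to exploit redundancy across noise-prone channels.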
Akan, Batu. "Human Robot Interaction Solutions for Intuitive Industrial Robot Programming." Licentiate thesis, Mälardalens högskola, Akademin för innovation, design och teknik, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-14315.
Robot colleague project
Topp, Elin Anna. "Human-Robot Interaction and Mapping with a Service Robot : Human Augmented Mapping." Doctoral thesis, Stockholm : School of computer science and communication, KTH, 2008. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-4899.
Huang, Chien-Ming. "Joint attention in human-robot interaction." Thesis, Georgia Institute of Technology, 2010. http://hdl.handle.net/1853/41196.
Bremner, Paul. "Conversational gestures in human-robot interaction." Thesis, University of the West of England, Bristol, 2011. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.557106.
Fiore, Michelangelo. "Decision Making in Human-Robot Interaction." Thesis, Toulouse, INSA, 2016. http://www.theses.fr/2016ISAT0049/document.
There has been increasing interest in recent years in robots that can cooperate with humans not only as simple tools, but as full agents, able to execute collaborative activities in a natural and efficient way. In this work, we have developed an architecture for human-robot interaction able to execute joint activities with humans. We have applied this architecture to three different problems, which we call the robot observer, the robot coworker, and the robot teacher. After giving a brief overview of the main aspects of human-robot cooperation and of the architecture of our system, we detail these problems. In the observer problem, the robot monitors the environment, analyzing perceptual data through geometrical reasoning to produce symbolic information. We show how the system is able to infer humans' actions and intentions by linking physical observations, obtained by reasoning on humans' motions and their relationships with the environment, with planning and humans' mental beliefs, through a framework based on Markov Decision Processes and Bayesian Networks. We show, in a user study, that this model approaches the capacity of humans to infer intentions. We also discuss the possible reactions that the robot can execute after inferring a human's intention. We identify two possible proactive behaviors: correcting the human's belief, by giving information to help him correctly accomplish his goal, and physically helping him to accomplish the goal. In the coworker problem, the robot has to execute a cooperative task with a human. In this part we introduce the Human-Aware Task Planner, used in different experiments, and detail our plan management component. The robot is able to cooperate with humans in three different modalities: robot leader, human leader, and equal partners. We introduce the problem of task monitoring, where the robot observes human activities to understand whether they are still following the shared plan.
After that, we describe how our robot is able to execute actions in a safe and robust way, taking humans into account. We present a framework used to achieve joint actions by continuously estimating the activities of the robot's partner and reacting accordingly. This framework uses hierarchical Mixed Observability Markov Decision Processes, which allow us to estimate variables such as the human's commitment to the task and to react accordingly, splitting the decision process into different levels. We present an example of a Collaborative Planner for the handover problem, and then a set of laboratory experiments for a robot coworker scenario. Additionally, we introduce a novel multi-agent probabilistic planner, based on Markov Decision Processes, and discuss how it could be used to enhance our plan management component. In the robot teacher problem, we explain how we can adapt the system's plan explanation and monitoring to the users' knowledge of the task to be performed. Using this idea, the robot explains in less detail tasks that the user has already performed several times, going more in depth on new tasks. We show, in a user study, that this adaptive behavior is perceived better by users than a system without this capacity. Finally, we present a case study for a human-aware robot guide. This robot is able to guide users with adaptive and proactive behaviors, changing its speed to adapt to their needs, proposing a new pace to better suit the task's objectives, and directly engaging users to propose help. This system was integrated with other components to deploy a robot in Amsterdam's Schiphol Airport to guide groups of passengers to their flight gates. We performed user studies both in a laboratory and in the airport, demonstrating the robot's capacities and showing that it is appreciated by users.
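Several of the decision layers described in this abstract are built on Markov Decision Processes. As a generic illustration of the underlying machinery — not code from any cited work — the sketch below solves a small MDP by value iteration; the dictionary-based `P`/`R` encoding is an assumption chosen for brevity.

```python
def value_iteration(P, R, gamma=0.9, eps=1e-6):
    """Solve a small MDP by value iteration.

    P[s][a] -- list of (probability, next_state) pairs for action a in state s.
    R[s][a] -- immediate reward for taking action a in state s.
    gamma   -- discount factor; eps -- convergence threshold.
    Returns the converged state-value function V.
    """
    V = {s: 0.0 for s in P}
    while True:
        delta = 0.0
        for s in P:
            # Bellman backup: best action value given the current estimate V
            best = max(
                R[s][a] + gamma * sum(p * V[s2] for p, s2 in P[s][a])
                for a in P[s]
            )
            delta = max(delta, abs(best - V[s]))
            V[s] = best
        if delta < eps:
            return V
```

A greedy policy read off this value function is what a planner would execute; the partially observable variants (MOMDPs) mentioned in the abstract additionally maintain a belief over hidden variables such as the human's commitment.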
Alanenpää, Madelene. "Gaze detection in human-robot interaction." Thesis, Uppsala universitet, Institutionen för informationsteknologi, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-428387.
Almeida, Luís Miguel Martins. "Human-robot interaction for object transfer." Master's thesis, Universidade de Aveiro, 2016. http://hdl.handle.net/10773/22374.
Robots come into physical contact with humans under a variety of circumstances to perform useful work. This thesis has the ambitious aim of contriving a solution for a simple case of physical human-robot interaction: an object transfer task. First, this work presents a review of current research within the field of human-robot interaction, where two approaches are distinguished but simultaneously required: pre-contact approximation and interaction by contact. To achieve the proposed objectives, this dissertation then addresses three major problems: (1) controlling the robot to perform the movements inherent in the transfer assignment, (2) human-robot pre-interaction, and (3) interaction by contact. The capabilities of a 3D sensor and of force/tactile sensors are explored in order to prepare the robot to hand over an object and to control the robot gripper actions, respectively. The complete software development is supported by the Robot Operating System (ROS) framework. Finally, experimental tests are conducted to validate the proposed solutions and to evaluate the system's performance. A working transfer task is achieved, even if refinements, improvements, and extensions are required to improve the solution's performance and range.
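The contact phase of such a handover is commonly implemented as a force-triggered release: the gripper opens once the force/tactile sensors indicate the human is pulling on the object. The sketch below is an illustrative stand-in for that logic, independent of ROS and of the thesis's actual controller; the function name and threshold values are assumptions.

```python
def handover_release(force_readings, pull_threshold=2.0, stable_count=3):
    """Decide when to open the gripper during a robot-to-human handover.

    force_readings -- sampled external force magnitudes (newtons).
    The gripper releases once the force exceeds pull_threshold for
    stable_count consecutive samples, indicating a firm human grasp
    rather than a transient bump. Returns the sample index at which
    to release, or None if the human never pulled firmly enough.
    """
    consecutive = 0
    for t, f in enumerate(force_readings):
        consecutive = consecutive + 1 if f > pull_threshold else 0
        if consecutive >= stable_count:
            return t
    return None
```

Requiring several consecutive above-threshold samples is a simple debouncing choice that trades a little release latency for robustness against sensor noise.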
Books on the topic "Human-robot interaction"
Jost, Céline, Brigitte Le Pévédic, Tony Belpaeme, Cindy Bethel, Dimitrios Chrysostomou, Nigel Crook, Marine Grandgeorge, and Nicole Mirnig, eds. Human-Robot Interaction. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-42307-0.
Rahimi, Mansour, and Waldemar Karwowski, eds. Human-Robot Interaction. London: Taylor & Francis, 1992.
Prassler, Erwin, Gisbert Lawitzky, Andreas Stopp, Gerhard Grunwald, Martin Hägele, Rüdiger Dillmann, and Ioannis Iossifidis, eds. Advances in Human-Robot Interaction. Berlin, Heidelberg: Springer Berlin Heidelberg, 2005. http://dx.doi.org/10.1007/b97960.
Goodrich, Michael A. Human-Robot Interaction: A Survey. Hanover: Now Publishers, 2007.
Xing, Bo, and Tshilidzi Marwala. Smart Maintenance for Human–Robot Interaction. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-67480-3.
Ayanoğlu, Hande, and Emília Duarte, eds. Emotional Design in Human-Robot Interaction. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-319-96722-6.
Dautenhahn, Kerstin, and Joe Saunders, eds. New Frontiers in Human–Robot Interaction. Amsterdam: John Benjamins Publishing Company, 2011. http://dx.doi.org/10.1075/ais.2.
Wang, Xiangyu, ed. Mixed Reality and Human-Robot Interaction. Dordrecht: Springer Netherlands, 2011. http://dx.doi.org/10.1007/978-94-007-0582-1.
Wang, Xiangyu. Mixed Reality and Human-Robot Interaction. Dordrecht: Springer Science+Business Media B.V., 2011.
Shiomi, Masahiro, and Hidenobu Sumioka. Social Touch in Human–Robot Interaction. Boca Raton: CRC Press, 2024. http://dx.doi.org/10.1201/9781003384274.
Book chapters on the topic "Human-robot interaction"
Sidobre, Daniel, Xavier Broquère, Jim Mainprice, Ernesto Burattini, Alberto Finzi, Silvia Rossi, and Mariacarla Staffa. "Human–Robot Interaction." In Springer Tracts in Advanced Robotics, 123–72. Berlin, Heidelberg: Springer Berlin Heidelberg, 2012. http://dx.doi.org/10.1007/978-3-642-29041-1_3.
Billard, Aude, and Daniel Grollman. "Human-Robot Interaction." In Encyclopedia of the Sciences of Learning, 1474–76. Boston, MA: Springer US, 2012. http://dx.doi.org/10.1007/978-1-4419-1428-6_760.
Ohnishi, Kouhei. "Human–Robot Interaction." In Mechatronics and Robotics, 255–64. Boca Raton: CRC Press, 2020. http://dx.doi.org/10.1201/9780429347474-12.
Ayanoğlu, Hande, and João S. Sequeira. "Human-Robot Interaction." In Human–Computer Interaction Series, 39–55. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-319-96722-6_3.
Feil-Seifer, David, and Maja J. Matarić. "Human-Robot Interaction." In Encyclopedia of Complexity and Systems Science, 1–23. Berlin, Heidelberg: Springer Berlin Heidelberg, 2020. http://dx.doi.org/10.1007/978-3-642-27737-5_274-5.
Edwards, Autumn. "Human–Robot Interaction." In The Sage Handbook of Human–Machine Communication, 167–77. London: SAGE Publications, 2023. http://dx.doi.org/10.4135/9781529782783.n21.
Esterwood, Connor, Qiaoning Zhang, X. Jessie Yang, and Lionel P. Robert. "Human–Robot Interaction." In Human-Computer Interaction in Intelligent Environments, 305–32. Boca Raton: CRC Press, 2024. http://dx.doi.org/10.1201/9781003490685-10.
Pirlet, André. "The Role of Standardization in Technical Regulations." In Human–Robot Interaction, 1–8. Boca Raton, FL: Chapman and Hall/CRC, 2019. http://dx.doi.org/10.1201/9781315213781-1.
Takács, Árpád, Imre J. Rudas, and Tamás Haidegger. "The Other End of Human–Robot Interaction." In Human–Robot Interaction, 137–70. Boca Raton, FL: Chapman and Hall/CRC, 2019. http://dx.doi.org/10.1201/9781315213781-10.
Lőrincz, Márton. "Passive Bilateral Teleoperation with Safety Considerations." In Human–Robot Interaction, 171–86. Boca Raton, FL: Chapman and Hall/CRC, 2019. http://dx.doi.org/10.1201/9781315213781-11.
Conference papers on the topic "Human-robot interaction"
Billings, Deborah R., Kristin E. Schaefer, Jessie Y. C. Chen, and Peter A. Hancock. "Human-robot interaction." In Proceedings of the Seventh Annual ACM/IEEE International Conference on Human-Robot Interaction (HRI '12). New York, NY, USA: ACM Press, 2012. http://dx.doi.org/10.1145/2157689.2157709.
"Human robot interaction." In 2016 9th International Conference on Human System Interactions (HSI). IEEE, 2016. http://dx.doi.org/10.1109/hsi.2016.7529627.
St-Onge, David, Nicolas Reeves, and Nataliya Petkova. "Robot-Human Interaction." In HRI '17: ACM/IEEE International Conference on Human-Robot Interaction. New York, NY, USA: ACM, 2017. http://dx.doi.org/10.1145/3029798.3034785.
Han, Ji, Gopika Ajaykumar, Ze Li, and Chien-Ming Huang. "Structuring Human-Robot Interactions via Interaction Conventions." In 2020 29th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN). IEEE, 2020. http://dx.doi.org/10.1109/ro-man47096.2020.9223468.
"Human, Robot and Interaction." In 2019 IEEE International Conference on Industrial Cyber Physical Systems (ICPS). IEEE, 2019. http://dx.doi.org/10.1109/icphys.2019.8780335.
Sandygulova, Anara, Abraham G. Campbell, Mauro Dragone, and G. M. P. O'Hare. "Immersive human-robot interaction." In Proceedings of the Seventh Annual ACM/IEEE International Conference on Human-Robot Interaction (HRI '12). New York, NY, USA: ACM Press, 2012. http://dx.doi.org/10.1145/2157689.2157768.
Budkov, V. Yu, M. V. Prischepa, A. L. Ronzhin, and A. A. Karpov. "Multimodal human-robot interaction." In 2010 International Congress on Ultra Modern Telecommunications and Control Systems and Workshops (ICUMT 2010). IEEE, 2010. http://dx.doi.org/10.1109/icumt.2010.5676593.
Scimeca, Luca, Fumiya Iida, Perla Maiolino, and Thrishantha Nanayakkara. "Human-Robot Medical Interaction." In HRI '20: ACM/IEEE International Conference on Human-Robot Interaction. New York, NY, USA: ACM, 2020. http://dx.doi.org/10.1145/3371382.3374847.
"Physical Human-Robot Interaction." In 2019 IEEE International Conference on Mechatronics (ICM). IEEE, 2019. http://dx.doi.org/10.1109/icmech.2019.8722848.
Bartneck, Christoph, and Jodi Forlizzi. "Shaping human-robot interaction." In CHI '04 Extended Abstracts on Human Factors in Computing Systems. New York, NY, USA: ACM Press, 2004. http://dx.doi.org/10.1145/985921.986205.
Reports on the topic "Human-robot interaction"
Arkin, Ronald C., and Lilia Moshkina. Affect in Human-Robot Interaction. Fort Belvoir, VA: Defense Technical Information Center, January 2014. http://dx.doi.org/10.21236/ada593747.
Martinson, E., and W. Lawson. Learning Speaker Recognition Models through Human-Robot Interaction. Fort Belvoir, VA: Defense Technical Information Center, May 2011. http://dx.doi.org/10.21236/ada550036.
Manring, Levi H., John Monroe Pederson, and Dillon Gabriel Potts. Improving Human-Robot Interaction and Control Through Augmented Reality. Office of Scientific and Technical Information (OSTI), August 2018. http://dx.doi.org/10.2172/1467198.
Jiang, Shu, and Ronald C. Arkin. Mixed-Initiative Human-Robot Interaction: Definition, Taxonomy, and Survey. Fort Belvoir, VA: Defense Technical Information Center, January 2015. http://dx.doi.org/10.21236/ada620347.
Scholtz, Jean, Jeff Young, Holly A. Yanco, and Jill L. Drury. Evaluation of Human-Robot Interaction Awareness in Search and Rescue. Fort Belvoir, VA: Defense Technical Information Center, January 2006. http://dx.doi.org/10.21236/ada456128.
Bagchi, Shelly, Murat Aksu, Megan Zimmerman, Jeremy A. Marvel, Brian Antonishek, Heni Ben Amor, Terry Fong, Ross Mead, and Yue Wang. Workshop Report: Test Methods and Metrics for Effective HRI in Collaborative Human-Robot Teams, ACM/IEEE Human-Robot Interaction Conference, 2019. National Institute of Standards and Technology, December 2020. http://dx.doi.org/10.6028/nist.ir.8339.
Bagchi, Shelly, Jeremy A. Marvel, Megan Zimmerman, Murat Aksu, Brian Antonishek, Heni Ben Amor, Terry Fong, Ross Mead, and Yue Wang. Workshop Report: Test Methods and Metrics for Effective HRI in Real-World Human-Robot Teams, ACM/IEEE Human-Robot Interaction Conference, 2020 (Virtual). National Institute of Standards and Technology, January 2021. http://dx.doi.org/10.6028/nist.ir.8345.
Schaefer, Kristin E., Deborah R. Billings, James L. Szalma, Jeffrey K. Adams, Tracy L. Sanders, Jessie Y. Chen, and Peter A. Hancock. A Meta-Analysis of Factors Influencing the Development of Trust in Automation: Implications for Human-Robot Interaction. Fort Belvoir, VA: Defense Technical Information Center, July 2014. http://dx.doi.org/10.21236/ada607926.
Bagchi, Shelly, Jeremy A. Marvel, Megan Zimmerman, Murat Aksu, Brian Antonishek, Xiang Li, Heni Ben Amor, Terry Fong, Ross Mead, and Yue Wang. Workshop Report: Novel and Emerging Test Methods and Metrics for Effective HRI, ACM/IEEE Conference on Human-Robot Interaction, 2021. National Institute of Standards and Technology, February 2022. http://dx.doi.org/10.6028/nist.ir.8417.
Breazeal, Cynthia, and Brian Scassellati. Infant-Like Social Interactions Between a Robot and a Human Caregiver. Fort Belvoir, VA: Defense Technical Information Center, January 2006. http://dx.doi.org/10.21236/ada450357.