Selected scientific literature on the topic "Human-robot interaction"
Cite a source in APA, MLA, Chicago, Harvard, and many other citation styles
Consult the list of current articles, books, theses, conference proceedings, and other scholarly sources on the topic "Human-robot interaction".
Next to every source in the reference list there is an "Add to bibliography" button. Press it, and we will automatically generate the bibliographic citation of the selected work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the scholarly publication as a .pdf and read the online abstract (summary) of the work, if it is present in the metadata.
Journal articles on the topic "Human-robot interaction"
Takamatsu, Jun. "Human-Robot Interaction". Journal of the Robotics Society of Japan 37, no. 4 (2019): 293–96. http://dx.doi.org/10.7210/jrsj.37.293.
Jia, Yunyi, Biao Zhang, Miao Li, Brady King, and Ali Meghdari. "Human-Robot Interaction". Journal of Robotics 2018 (October 1, 2018): 1–2. http://dx.doi.org/10.1155/2018/3879547.
Murphy, Robin, Tatsuya Nomura, Aude Billard, and Jennifer Burke. "Human–Robot Interaction". IEEE Robotics & Automation Magazine 17, no. 2 (June 2010): 85–89. http://dx.doi.org/10.1109/mra.2010.936953.
Sethumadhavan, Arathi. "Human-Robot Interaction". Ergonomics in Design: The Quarterly of Human Factors Applications 20, no. 3 (July 2012): 27–28. http://dx.doi.org/10.1177/1064804612449796.
Sheridan, Thomas B. "Human–Robot Interaction". Human Factors: The Journal of the Human Factors and Ergonomics Society 58, no. 4 (April 20, 2016): 525–32. http://dx.doi.org/10.1177/0018720816644364.
Jones, Keith S., and Elizabeth A. Schmidlin. "Human-Robot Interaction". Reviews of Human Factors and Ergonomics 7, no. 1 (August 25, 2011): 100–148. http://dx.doi.org/10.1177/1557234x11410388.
Thomaz, Andrea, Guy Hoffman, and Maya Cakmak. "Computational Human-Robot Interaction". Foundations and Trends in Robotics 4, no. 2-3 (2016): 104–223. http://dx.doi.org/10.1561/2300000049.
Karniel, Amir, Angelika Peer, Opher Donchin, Ferdinando A. Mussa-Ivaldi, and Gerald E. Loeb. "Haptic Human-Robot Interaction". IEEE Transactions on Haptics 5, no. 3 (2012): 193–95. http://dx.doi.org/10.1109/toh.2012.47.
Pook, Polly K., and Dana H. Ballard. "Deictic human/robot interaction". Robotics and Autonomous Systems 18, no. 1-2 (July 1996): 259–69. http://dx.doi.org/10.1016/0921-8890(95)00080-1.
Young, James E., JaYoung Sung, Amy Voida, Ehud Sharlin, Takeo Igarashi, Henrik I. Christensen, and Rebecca E. Grinter. "Evaluating Human-Robot Interaction". International Journal of Social Robotics 3, no. 1 (October 1, 2010): 53–67. http://dx.doi.org/10.1007/s12369-010-0081-8.
Dissertations and theses on the topic "Human-robot interaction"
Kruse, Thibault. "Planning for human robot interaction". Thesis, Toulouse 3, 2015. http://www.theses.fr/2015TOU30059/document.
The recent advances in robotics inspire visions of household and service robots making our lives easier and more comfortable. Such robots will be able to perform several object manipulation tasks required for household chores, autonomously or in cooperation with humans. In that role of human companion, the robot has to satisfy many additional requirements compared to well-established fields of industrial robotics. The purpose of planning for robots is to achieve robot behavior that is goal-directed and establishes correct results. In human-robot interaction, however, robot behavior cannot merely be judged in terms of correct results; it must also be agreeable to human stakeholders. This means that the robot behavior must satisfy additional quality criteria: it must be safe, comfortable for humans, and intuitively understood. There are established practices to ensure safety and provide comfort by keeping sufficient distances between the robot and nearby persons, but providing behavior that is intuitively understood remains a challenge. This challenge grows considerably in dynamic human-robot interactions, where the future actions of the human are unpredictable and the robot needs to constantly adapt its plans to changes. This thesis provides novel approaches to improve the legibility of robot behavior in such dynamic situations. Key to the approach is to consider not merely the quality of a single plan, but the behavior of the robot as a result of replanning multiple times during an interaction. For navigation planning, this thesis introduces directional cost functions that avoid problems in conflict situations. For action planning, it provides local replanning of transport actions based on navigational costs, to produce opportunistic behavior. Both measures help human observers understand the robot's beliefs and intentions during interactions and reduce confusion.
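As a purely illustrative aside, a directional cost function of the kind mentioned in this abstract can be sketched in a few lines of Python: positions ahead of a walking person cost more than positions beside or behind them, and the penalty decays with distance. This is an assumed toy example, not the cost function from Kruse's thesis; the function name, the frontness term, and the constants are invented for illustration.

    import math

    def directional_cost(robot_xy, human_xy, human_heading, base_cost=1.0, gain=2.0):
        # Penalize robot positions that lie ahead of a moving person more
        # than positions beside or behind them (illustrative values only).
        dx = robot_xy[0] - human_xy[0]
        dy = robot_xy[1] - human_xy[1]
        dist = math.hypot(dx, dy)
        if dist < 1e-6:
            return float("inf")  # sharing the person's cell is never acceptable
        # Angle between the person's heading and the direction toward the robot.
        bearing = math.atan2(dy, dx)
        rel = abs(math.atan2(math.sin(bearing - human_heading),
                             math.cos(bearing - human_heading)))
        frontness = (math.pi - rel) / math.pi  # 1.0 directly ahead, 0.0 directly behind
        return base_cost + gain * frontness / dist  # penalty decays with distance

A navigation planner would add such a term to its usual path-length cost for every person near a candidate path.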
Bodiroža, Saša. "Gestures in human-robot interaction". Doctoral thesis, Humboldt-Universität zu Berlin, Mathematisch-Naturwissenschaftliche Fakultät, 2017. http://dx.doi.org/10.18452/17705.
Gestures consist of movements of body parts and are a means of communication that conveys information or intentions to an observer. Therefore, they can be effectively used in human-robot interaction, or in general in human-machine interaction, as a way for a robot or a machine to infer a meaning. In order for people to intuitively use gestures and understand robot gestures, it is necessary to define mappings between gestures and their associated meanings -- a gesture vocabulary. A human gesture vocabulary defines which gestures a group of people would intuitively use to convey information, while a robot gesture vocabulary displays which robot gestures are deemed fitting for a particular meaning. Effective use of vocabularies depends on techniques for gesture recognition, which concerns the classification of body motion into discrete gesture classes, relying on pattern recognition and machine learning. This thesis addresses both research areas, presenting the development of gesture vocabularies as well as gesture recognition techniques, focusing on hand and arm gestures. Attentional models for humanoid robots were developed as a prerequisite for human-robot interaction and a precursor to gesture recognition. A method for defining gesture vocabularies for humans and robots, based on user observations and surveys, is explained and experimental results are presented. As a result of the robot gesture vocabulary experiment, an evolutionary approach for the refinement of robot gestures is introduced, based on interactive genetic algorithms. A robust and well-performing gesture recognition algorithm based on dynamic time warping has been developed. Most importantly, it employs one-shot learning, meaning that it can be trained using a low number of training samples and employed in real-life scenarios, lowering the effect of environmental constraints and gesture features. Finally, an approach for learning a relation between self-motion and pointing gestures is presented.
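The dynamic-time-warping recognizer mentioned in this abstract lends itself to a compact sketch. The following Python fragment is an assumed, minimal nearest-template classifier with one recorded example per gesture class, matching the one-shot-learning idea; it is not Bodiroža's actual algorithm, and the names and the plain O(nm) DTW are illustrative.

    import numpy as np

    def dtw_distance(a, b):
        # Dynamic time warping distance between two trajectories given as
        # (T, D) NumPy arrays of hand/arm positions sampled over time.
        n, m = len(a), len(b)
        D = np.full((n + 1, m + 1), np.inf)
        D[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                cost = np.linalg.norm(a[i - 1] - b[j - 1])
                D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
        return D[n, m]

    def classify(gesture, templates):
        # One-shot classification: 'templates' maps each gesture label to a
        # single recorded example; the closest template wins.
        return min(templates, key=lambda label: dtw_distance(gesture, templates[label]))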
Miners, William Ben. "Toward Understanding Human Expression in Human-Robot Interaction". Thesis, University of Waterloo, 2006. http://hdl.handle.net/10012/789.
An intuitive method to minimize human communication effort with intelligent devices is to take advantage of our existing interpersonal communication experience. Recent advances in speech, hand gesture, and facial expression recognition provide alternate viable modes of communication that are more natural than conventional tactile interfaces. Using natural human communication eliminates the need to adapt to, and invest time and effort in, the less intuitive techniques required for traditional keyboard- and mouse-based interfaces.
Although the state of the art in natural but isolated modes of communication achieves impressive results, significant hurdles must be conquered before communication with devices in our daily lives will feel natural and effortless. Research has shown that combining information between multiple noise-prone modalities improves accuracy. Leveraging this complementary and redundant content will improve communication robustness and relax current unimodal limitations.
This research presents and evaluates a novel multimodal framework to help reduce the total human effort and time required to communicate with intelligent devices. This reduction is realized by determining human intent using a knowledge-based architecture that combines and leverages conflicting information available across multiple natural communication modes and modalities. The effectiveness of this approach is demonstrated using dynamic hand gestures and simple facial expressions characterizing basic emotions. It is important to note that the framework is not restricted to these two forms of communication. The framework presented in this research provides the flexibility necessary to include additional or alternate modalities and channels of information in future research, including improving the robustness of speech understanding.
The primary contributions of this research include the leveraging of conflicts in a closed-loop multimodal framework, explicit use of uncertainty in knowledge representation and reasoning across multiple modalities, and a flexible approach for leveraging domain specific knowledge to help understand multimodal human expression. Experiments using a manually defined knowledge base demonstrate an improved average accuracy of individual concepts and an improved average accuracy of overall intents when leveraging conflicts as compared to an open-loop approach.
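As a loose illustration of fusing partially conflicting modalities under uncertainty, the sketch below combines per-modality probability distributions over candidate intents with a weighted log-linear product. This is an assumption for illustration, not the knowledge-based architecture of this thesis: the dictionary format, the weights, and the smoothing floor are invented, and genuine conflict handling would feed the disagreement back into the interaction rather than merely flattening the distribution.

    def fuse_intents(modality_beliefs, weights=None):
        # modality_beliefs: e.g. {"gesture": {"stop": 0.7, "wave": 0.3},
        #                         "face":    {"stop": 0.2, "wave": 0.8}}
        intents = set()
        for dist in modality_beliefs.values():
            intents.update(dist)
        weights = weights or {m: 1.0 for m in modality_beliefs}
        fused = {}
        for intent in intents:
            score = 1.0
            for modality, dist in modality_beliefs.items():
                # A small floor keeps a missing label from zeroing the product.
                score *= dist.get(intent, 1e-6) ** weights[modality]
            fused[intent] = score
        total = sum(fused.values())
        return {intent: score / total for intent, score in fused.items()}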
Akan, Batu. "Human Robot Interaction Solutions for Intuitive Industrial Robot Programming". Licentiate thesis, Mälardalens högskola, Akademin för innovation, design och teknik, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-14315.
robot colleague project
Topp, Elin Anna. "Human-Robot Interaction and Mapping with a Service Robot: Human Augmented Mapping". Doctoral thesis, Stockholm: School of Computer Science and Communication, KTH, 2008. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-4899.
Huang, Chien-Ming. "Joint attention in human-robot interaction". Thesis, Georgia Institute of Technology, 2010. http://hdl.handle.net/1853/41196.
Bremner, Paul. "Conversational gestures in human-robot interaction". Thesis, University of the West of England, Bristol, 2011. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.557106.
Fiore, Michelangelo. "Decision Making in Human-Robot Interaction". Thesis, Toulouse, INSA, 2016. http://www.theses.fr/2016ISAT0049/document.
There has been an increasing interest, in recent years, in robots that are able to cooperate with humans not only as simple tools, but as full agents, able to execute collaborative activities in a natural and efficient way. In this work, we have developed an architecture for human-robot interaction able to execute joint activities with humans. We have applied this architecture to three different problems, which we call the robot observer, the robot coworker, and the robot teacher. After giving a quick overview of the main aspects of human-robot cooperation and of the architecture of our system, we detail these problems. In the observer problem, the robot monitors the environment, analyzing perceptual data through geometrical reasoning to produce symbolic information. We show how the system is able to infer humans' actions and intentions by linking physical observations, obtained by reasoning on humans' motions and their relationships with the environment, with planning and humans' mental beliefs, through a framework based on Markov Decision Processes and Bayesian Networks. We show, in a user study, that this model approaches the capacity of humans to infer intentions. We also discuss the possible reactions that the robot can execute after inferring a human's intention. We identify two possible proactive behaviors: correcting the human's belief, by giving information to help him correctly accomplish his goal, and physically helping him to accomplish the goal. In the coworker problem, the robot has to execute a cooperative task with a human. In this part we introduce the Human-Aware Task Planner, used in different experiments, and detail our plan management component. The robot is able to cooperate with humans in three different modalities: robot leader, human leader, and equal partners. We introduce the problem of task monitoring, where the robot observes human activities to understand whether they are still following the shared plan. After that, we describe how our robot is able to execute actions in a safe and robust way, taking humans into account. We present a framework used to achieve joint actions, by continuously estimating the robot's partner activities and reacting accordingly. This framework uses hierarchical Mixed Observability Markov Decision Processes, which allow us to estimate variables, such as the human's commitment to the task, and to react accordingly, splitting the decision process into different levels. We present an example of a Collaborative Planner, for the handover problem, and then a set of laboratory experiments for a robot coworker scenario. Additionally, we introduce a novel multi-agent probabilistic planner, based on Markov Decision Processes, and discuss how we could use it to enhance our plan management component. In the robot teacher problem, we explain how we can adapt the plan explanation and monitoring of the system to the users' knowledge of the task to perform. Using this idea, the robot explains in less detail tasks that the user has already performed several times, going more in depth on new tasks. We show, in a user study, that this adaptive behavior is perceived by users as better than a system without this capacity. Finally, we present a case study for a human-aware robot guide. This robot is able to guide users with adaptive and proactive behaviors, changing its speed to adapt to their needs, proposing a new pace to better suit the task's objectives, and directly engaging users to propose help.
This system was integrated with other components to deploy a robot in Schiphol Airport in Amsterdam, to guide groups of passengers to their flight gates. We performed user studies both in a laboratory and in the airport, demonstrating the robot's capabilities and showing that it is appreciated by users.
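To give a concrete flavor of the intention-inference step described above, here is a minimal, assumed Python sketch of one Bayesian update over candidate goals, using only how well the person's walking direction points at each goal. The likelihood model, the goal representation, and the sharpness parameter are illustrative assumptions; the thesis itself combines such observations with Markov Decision Processes and Bayesian Networks.

    import math

    def update_goal_belief(belief, human_xy, human_heading, goals, sharpness=3.0):
        # One Bayes update of P(goal | observed motion): goals roughly in the
        # walking direction gain probability, the others lose it.
        posterior = {}
        for name, (gx, gy) in goals.items():
            bearing = math.atan2(gy - human_xy[1], gx - human_xy[0])
            misalignment = abs(math.atan2(math.sin(bearing - human_heading),
                                          math.cos(bearing - human_heading)))
            likelihood = math.exp(-sharpness * misalignment)
            posterior[name] = belief[name] * likelihood
        total = sum(posterior.values())
        return {name: p / total for name, p in posterior.items()}

Run on every perception update, a belief that concentrates on one goal could then trigger the proactive behaviors discussed in the abstract.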
Alanenpää, Madelene. "Gaze detection in human-robot interaction". Thesis, Uppsala universitet, Institutionen för informationsteknologi, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-428387.
Almeida, Luís Miguel Martins. "Human-robot interaction for object transfer". Master's thesis, Universidade de Aveiro, 2016. http://hdl.handle.net/10773/22374.
Robots come into physical contact with humans under a variety of circumstances to perform useful work. This thesis has the ambitious aim of contriving a solution that leads to a simple case of physical human-robot interaction, an object transfer task. Firstly, this work presents a review of the current research within the field of human-robot interaction, where two approaches are distinguished, but simultaneously required: a pre-contact approximation and an interaction by contact. Further, to achieve the proposed objectives, this dissertation addresses a possible answer to three major problems: (1) the robot control needed to perform the movements inherent in the transfer assignment, (2) the human-robot pre-interaction, and (3) the interaction by contact. The capabilities of a 3D sensor and of force/tactile sensors are explored in order to prepare the robot to hand over an object and to control the robot gripper actions, respectively. The complete software development is supported by the Robot Operating System (ROS) framework. Finally, some experimental tests are conducted to validate the proposed solutions and to evaluate the system's performance. A possible transfer task is achieved, even if some refinements, improvements, and extensions are required to improve the solution's performance and range.
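The contact phase of such a handover is often reduced to a force threshold: the gripper lets go once the receiver pulls hard enough. The ROS 1 sketch below is an assumption for illustration only; the topic names, the message choices, and the 5 N threshold are invented, not taken from this dissertation.

    # Hypothetical ROS 1 node; topic names and threshold are illustrative.
    import rospy
    from geometry_msgs.msg import WrenchStamped
    from std_msgs.msg import Bool

    PULL_THRESHOLD = 5.0  # newtons of pull felt at the wrist before releasing

    def wrench_callback(msg, release_pub):
        # Release the gripper once the human pulls on the object firmly enough.
        if abs(msg.wrench.force.z) > PULL_THRESHOLD:
            release_pub.publish(Bool(data=True))

    if __name__ == "__main__":
        rospy.init_node("handover_release")
        release_pub = rospy.Publisher("/gripper/release", Bool, queue_size=1)
        rospy.Subscriber("/wrist_ft_sensor/wrench", WrenchStamped,
                         wrench_callback, callback_args=release_pub)
        rospy.spin()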
Books on the topic "Human-robot interaction"
Jost, Céline, Brigitte Le Pévédic, Tony Belpaeme, Cindy Bethel, Dimitrios Chrysostomou, Nigel Crook, Marine Grandgeorge, and Nicole Mirnig, eds. Human-Robot Interaction. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-42307-0.
Mansour, Rahimi, and Karwowski Waldemar, eds. Human-robot interaction. London: Taylor & Francis, 1992.
Prassler, Erwin, Gisbert Lawitzky, Andreas Stopp, Gerhard Grunwald, Martin Hägele, Rüdiger Dillmann, and Ioannis Iossifidis, eds. Advances in Human-Robot Interaction. Berlin, Heidelberg: Springer Berlin Heidelberg, 2005. http://dx.doi.org/10.1007/b97960.
Goodrich, Michael A. Human-robot interaction: A survey. Hanover: Now Publishers, 2007.
Xing, Bo, and Tshilidzi Marwala. Smart Maintenance for Human–Robot Interaction. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-67480-3.
Ayanoğlu, Hande, and Emília Duarte, eds. Emotional Design in Human-Robot Interaction. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-319-96722-6.
Dautenhahn, Kerstin, and Joe Saunders, eds. New Frontiers in Human–Robot Interaction. Amsterdam: John Benjamins Publishing Company, 2011. http://dx.doi.org/10.1075/ais.2.
Wang, Xiangyu, ed. Mixed Reality and Human-Robot Interaction. Dordrecht: Springer Netherlands, 2011. http://dx.doi.org/10.1007/978-94-007-0582-1.
Wang, Xiangyu. Mixed Reality and Human-Robot Interaction. Dordrecht: Springer Science+Business Media B.V., 2011.
Shiomi, Masahiro, and Hidenobu Sumioka. Social Touch in Human–Robot Interaction. Boca Raton: CRC Press, 2024. http://dx.doi.org/10.1201/9781003384274.
Book chapters on the topic "Human-robot interaction"
Sidobre, Daniel, Xavier Broquère, Jim Mainprice, Ernesto Burattini, Alberto Finzi, Silvia Rossi, and Mariacarla Staffa. "Human–Robot Interaction". In Springer Tracts in Advanced Robotics, 123–72. Berlin, Heidelberg: Springer Berlin Heidelberg, 2012. http://dx.doi.org/10.1007/978-3-642-29041-1_3.
Billard, Aude, and Daniel Grollman. "Human-Robot Interaction". In Encyclopedia of the Sciences of Learning, 1474–76. Boston, MA: Springer US, 2012. http://dx.doi.org/10.1007/978-1-4419-1428-6_760.
Ohnishi, Kouhei. "Human–Robot Interaction". In Mechatronics and Robotics, 255–64. Boca Raton: CRC Press, 2020. http://dx.doi.org/10.1201/9780429347474-12.
Ayanoğlu, Hande, and João S. Sequeira. "Human-Robot Interaction". In Human–Computer Interaction Series, 39–55. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-319-96722-6_3.
Feil-Seifer, David, and Maja J. Matarić. "Human-Robot Interaction". In Encyclopedia of Complexity and Systems Science, 1–23. Berlin, Heidelberg: Springer Berlin Heidelberg, 2020. http://dx.doi.org/10.1007/978-3-642-27737-5_274-5.
Edwards, Autumn. "Human–Robot Interaction". In The Sage Handbook of Human–Machine Communication, 167–77. London: SAGE Publications Ltd, 2023. http://dx.doi.org/10.4135/9781529782783.n21.
Esterwood, Connor, Qiaoning Zhang, X. Jessie Yang, and Lionel P. Robert. "Human–Robot Interaction". In Human-Computer Interaction in Intelligent Environments, 305–32. Boca Raton: CRC Press, 2024. http://dx.doi.org/10.1201/9781003490685-10.
Ir, André Pirlet. "The Role of Standardization in Technical Regulations". In Human–Robot Interaction, 1–8. Boca Raton, FL: Chapman and Hall/CRC, 2019. http://dx.doi.org/10.1201/9781315213781-1.
Takács, Árpád, Imre J. Rudas, and Tamás Haidegger. "The Other End of Human–Robot Interaction". In Human–Robot Interaction, 137–70. Boca Raton, FL: Chapman and Hall/CRC, 2019. http://dx.doi.org/10.1201/9781315213781-10.
Lőrincz, Márton. "Passive Bilateral Teleoperation with Safety Considerations". In Human–Robot Interaction, 171–86. Boca Raton, FL: Chapman and Hall/CRC, 2019. http://dx.doi.org/10.1201/9781315213781-11.
Testo completoAtti di convegni sul tema "Human-robot interaction"
Billings, Deborah R., Kristin E. Schaefer, Jessie Y. C. Chen, and Peter A. Hancock. "Human-robot interaction". In the seventh annual ACM/IEEE international conference. New York, New York, USA: ACM Press, 2012. http://dx.doi.org/10.1145/2157689.2157709.
"Human robot interaction". In 2016 9th International Conference on Human System Interactions (HSI). IEEE, 2016. http://dx.doi.org/10.1109/hsi.2016.7529627.
St-Onge, David, Nicolas Reeves, and Nataliya Petkova. "Robot-Human Interaction". In HRI '17: ACM/IEEE International Conference on Human-Robot Interaction. New York, NY, USA: ACM, 2017. http://dx.doi.org/10.1145/3029798.3034785.
Han, Ji, Gopika Ajaykumar, Ze Li, and Chien-Ming Huang. "Structuring Human-Robot Interactions via Interaction Conventions". In 2020 29th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN). IEEE, 2020. http://dx.doi.org/10.1109/ro-man47096.2020.9223468.
"Human, Robot and Interaction". In 2019 IEEE International Conference on Industrial Cyber Physical Systems (ICPS). IEEE, 2019. http://dx.doi.org/10.1109/icphys.2019.8780335.
Sandygulova, Anara, Abraham G. Campbell, Mauro Dragone, and G. M. P. O'Hare. "Immersive human-robot interaction". In the seventh annual ACM/IEEE international conference. New York, New York, USA: ACM Press, 2012. http://dx.doi.org/10.1145/2157689.2157768.
Budkov, V. Yu, M. V. Prischepa, A. L. Ronzhin, and A. A. Karpov. "Multimodal human-robot interaction". In 2010 International Congress on Ultra Modern Telecommunications and Control Systems and Workshops (ICUMT 2010). IEEE, 2010. http://dx.doi.org/10.1109/icumt.2010.5676593.
Scimeca, Luca, Fumiya Iida, Perla Maiolino, and Thrishantha Nanayakkara. "Human-Robot Medical Interaction". In HRI '20: ACM/IEEE International Conference on Human-Robot Interaction. New York, NY, USA: ACM, 2020. http://dx.doi.org/10.1145/3371382.3374847.
"Physical Human-Robot Interaction". In 2019 IEEE International Conference on Mechatronics (ICM). IEEE, 2019. http://dx.doi.org/10.1109/icmech.2019.8722848.
Bartneck, Christoph, and Jodi Forlizzi. "Shaping human-robot interaction". In Extended abstracts of the 2004 conference. New York, New York, USA: ACM Press, 2004. http://dx.doi.org/10.1145/985921.986205.
Testo completoRapporti di organizzazioni sul tema "Human-robot interaction"
Arkin, Ronald C., and Lilia Moshkina. Affect in Human-Robot Interaction. Fort Belvoir, VA: Defense Technical Information Center, January 2014. http://dx.doi.org/10.21236/ada593747.
Martinson, E., and W. Lawson. Learning Speaker Recognition Models through Human-Robot Interaction. Fort Belvoir, VA: Defense Technical Information Center, May 2011. http://dx.doi.org/10.21236/ada550036.
Manring, Levi H., John Monroe Pederson, and Dillon Gabriel Potts. Improving Human-Robot Interaction and Control Through Augmented Reality. Office of Scientific and Technical Information (OSTI), August 2018. http://dx.doi.org/10.2172/1467198.
Jiang, Shu, and Ronald C. Arkin. Mixed-Initiative Human-Robot Interaction: Definition, Taxonomy, and Survey. Fort Belvoir, VA: Defense Technical Information Center, January 2015. http://dx.doi.org/10.21236/ada620347.
Scholtz, Jean, Jeff Young, Holly A. Yanco, and Jill L. Drury. Evaluation of Human-Robot Interaction Awareness in Search and Rescue. Fort Belvoir, VA: Defense Technical Information Center, January 2006. http://dx.doi.org/10.21236/ada456128.
Bagchi, Shelly, Murat Aksu, Megan Zimmerman, Jeremy A. Marvel, Brian Antonishek, Heni Ben Amor, Terry Fong, Ross Mead, and Yue Wang. Workshop Report: Test Methods and Metrics for Effective HRI in Collaborative Human-Robot Teams, ACM/IEEE Human-Robot Interaction Conference, 2019. National Institute of Standards and Technology, December 2020. http://dx.doi.org/10.6028/nist.ir.8339.
Bagchi, Shelly, Jeremy A. Marvel, Megan Zimmerman, Murat Aksu, Brian Antonishek, Heni Ben Amor, Terry Fong, Ross Mead, and Yue Wang. Workshop Report: Test Methods and Metrics for Effective HRI in Real-World Human-Robot Teams, ACM/IEEE Human-Robot Interaction Conference, 2020 (Virtual). National Institute of Standards and Technology, January 2021. http://dx.doi.org/10.6028/nist.ir.8345.
Schaefer, Kristin E., Deborah R. Billings, James L. Szalma, Jeffrey K. Adams, Tracy L. Sanders, Jessie Y. Chen, and Peter A. Hancock. A Meta-Analysis of Factors Influencing the Development of Trust in Automation: Implications for Human-Robot Interaction. Fort Belvoir, VA: Defense Technical Information Center, July 2014. http://dx.doi.org/10.21236/ada607926.
Bagchi, Shelly, Jeremy A. Marvel, Megan Zimmerman, Murat Aksu, Brian Antonishek, Xiang Li, Heni Ben Amor, Terry Fong, Ross Mead, and Yue Wang. Workshop Report: Novel and Emerging Test Methods and Metrics for Effective HRI, ACM/IEEE Conference on Human-Robot Interaction, 2021. National Institute of Standards and Technology, February 2022. http://dx.doi.org/10.6028/nist.ir.8417.
Breazael, Cynthia, and Brian Scassellati. Infant-Like Social Interactions Between a Robot and a Human Caregiver. Fort Belvoir, VA: Defense Technical Information Center, January 2006. http://dx.doi.org/10.21236/ada450357.