Academic literature on the topic 'Human-Robot motion'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Human-Robot motion.'
Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.
Journal articles on the topic "Human-Robot motion"
Liu, Hongyi, and Lihui Wang. "Human motion prediction for human-robot collaboration." Journal of Manufacturing Systems 44 (July 2017): 287–94. http://dx.doi.org/10.1016/j.jmsy.2017.04.009.
Kiguchi, Kazuo, Subrata Kumar Kundu, and Makoto Sasaki. "1P1-A24 An Inner Skeleton Robot for Human Elbow Motion Assist." Proceedings of JSME annual Conference on Robotics and Mechatronics (Robomec) 2006 (2006): _1P1-A24_1–_1P1-A24_2. http://dx.doi.org/10.1299/jsmermd.2006._1p1-a24_1.
Gopura, Ranathunga Arachchilage Ruwan Chandra, and Kazuo Kiguchi. "1207 Control of an Exoskeleton Robot for Human Wrist Motion Support." Proceedings of the Conference on Information, Intelligence and Precision Equipment : IIP 2008 (2008): 67–68. http://dx.doi.org/10.1299/jsmeiip.2008.67.
Khoramshahi, Mahdi, and Aude Billard. "A dynamical system approach for detection and reaction to human guidance in physical human–robot interaction." Autonomous Robots 44, no. 8 (July 26, 2020): 1411–29. http://dx.doi.org/10.1007/s10514-020-09934-9.
Lin, Hsien I., and Zan Sheng Chen. "Whole-Body Human-to-Humanoid Motion Imitation." Applied Mechanics and Materials 479-480 (December 2013): 617–21. http://dx.doi.org/10.4028/www.scientific.net/amm.479-480.617.
Dariush, Behzad, Michael Gienger, Arjun Arumbakkam, Youding Zhu, Bing Jian, Kikuo Fujimura, and Christian Goerick. "Online Transfer of Human Motion to Humanoids." International Journal of Humanoid Robotics 06, no. 02 (June 2009): 265–89. http://dx.doi.org/10.1142/s021984360900170x.
Tsumugiwa, Toru, Atsushi Kamiyoshi, Ryuichi Yokogawa, and Hiroshi Shibata. "1A1-M08 Robot Motion Control based on Relative Motion Information between Human and Robot in Human-Robot Dynamical Interaction." Proceedings of JSME annual Conference on Robotics and Mechatronics (Robomec) 2007 (2007): _1A1-M08_1–_1A1-M08_3. http://dx.doi.org/10.1299/jsmermd.2007._1a1-m08_1.
Kodama, Ryoji, Toru Nogai, and Katsumi Suzuki. "Effect of the Motion in Horizontal Plane on the Stability of Biped Walking." Journal of Robotics and Mechatronics 5, no. 6 (December 20, 1993): 531–36. http://dx.doi.org/10.20965/jrm.1993.p0531.
Ivancevic, Vladimir G., and Tijana T. Ivancevic. "Human versus Humanoid Robot Biodynamics." International Journal of Humanoid Robotics 05, no. 04 (December 2008): 699–713. http://dx.doi.org/10.1142/s0219843608001595.
Mori, Yoshikazu, Koji Ota, and Tatsuya Nakamura. "Robot Motion Algorithm Based on Interaction with Human." Journal of Robotics and Mechatronics 14, no. 5 (October 20, 2002): 462–70. http://dx.doi.org/10.20965/jrm.2002.p0462.
Full textDissertations / Theses on the topic "Human-Robot motion"
Paulin, Rémi. "Human-Robot Motion: An Attention-Based Approach." Thesis, Université Grenoble Alpes (ComUE), 2018. http://www.theses.fr/2018GREAM018.
Full textFor autonomous mobile robots designed to share their environment with humans, path safety and efficiency are not the only aspects guiding their motion: they must follow social rules so as not to cause discomfort to surrounding people. Most socially-aware path planners rely heavily on the concept of social spaces; however, social spaces are hard to model and they are of limited use in the context of human-robot interaction where intrusion into social spaces is necessary. In this work, a new approach for socially-aware path planning is presented that performs well in complex environments as well as in the context of human-robot interaction. Specifically, the concept of attention is used to model how the influence of the environment as a whole affects how the robot's motion is perceived by people within close proximity. A new computational model of attention is presented that estimates how our attentional resources are shared amongst the salient elements in our environment. Based on this model, the novel concept of attention field is introduced and a path planner that relies on this field is developed in order to produce socially acceptable paths. To do so, a state-of-the-art many-objective optimization algorithm is successfully applied to the path planning problem. The capacities of the proposed approach are illustrated in several case studies where the robot is assigned different tasks. Firstly, when the task is to navigate in the environment without causing distraction our approach produces promising results even in complex situations. Secondly, when the task is to attract a person's attention in view of interacting with him or her, the motion planner is able to automatically choose a destination that best conveys its desire to interact whilst keeping the motion safe, efficient and socially acceptable
Lasota, Przemyslaw A. (Przemyslaw Andrzej). "Robust human motion prediction for safe and efficient human-robot interaction." Thesis, Massachusetts Institute of Technology, 2019. https://hdl.handle.net/1721.1/122497.
Cataloged from PDF version of thesis.
Includes bibliographical references (pages 175-188).
From robotic co-workers in factories to assistive robots in homes, human-robot interaction (HRI) has the potential to revolutionize a large array of domains by enabling robotic assistance where it was previously not possible. Introducing robots into human-occupied domains, however, requires strong consideration for the safety and efficiency of the interaction. One particularly effective method of supporting safe and efficient human-robot interaction is the use of human motion prediction. By predicting where a person might reach or walk toward in the upcoming moments, a robot can adjust its motions to proactively resolve motion conflicts and avoid impeding the person's movements. Current approaches to human motion prediction, however, often lack the robustness required for real-world deployment. Many methods are designed for predicting specific types of tasks and motions, and do not necessarily generalize well to other domains.
It is also possible that no single predictor is suitable for predicting motion in a given scenario, and that multiple predictors are needed. Due to these drawbacks, without expert knowledge in the field of human motion prediction, it is difficult to deploy prediction on real robotic systems. Another key limitation of current human motion prediction approaches lies in deficiencies in partial trajectory alignment. Alignment of partially executed motions to a representative trajectory is a key enabling technology for many goal-based prediction methods. Current approaches to partial trajectory alignment, however, do not provide satisfactory alignments for many real-world trajectories. Specifically, due to reliance on Euclidean distance metrics, overlapping trajectory regions and temporary stops lead to large alignment errors.
In this thesis, I introduce two frameworks designed to improve the robustness of human motion prediction in order to facilitate its use for safe and efficient human-robot interaction. First, I introduce the Multiple-Predictor System (MPS), a data-driven approach that uses given task and motion data to synthesize a high-performing predictor by automatically identifying informative prediction features and combining the strengths of complementary prediction methods. Using three distinct human motion datasets, I show that the MPS leads to lower prediction error in a variety of HRI scenarios and allows for accurate prediction over a range of time horizons. Second, in order to address the drawbacks of prior alignment techniques, I introduce the Bayesian ESTimator for Partial Trajectory Alignment (BEST-PTA).
This Bayesian estimation framework uses a combination of optimization, supervised learning, and unsupervised learning components that are trained and synthesized based on a given set of example trajectories. Through an evaluation on three human motion datasets, I show that BEST-PTA reduces alignment error when compared to state-of-the-art baselines. Furthermore, I demonstrate that this improved alignment reduces human motion prediction error. Lastly, in order to assess the utility of the developed methods for improving safety and efficiency in HRI, I introduce an integrated framework combining prediction with robot planning in time, and describe its implementation and evaluation on a real physical system. Through this demonstration, I show that the developed approach leads to automatically derived adaptive robot behavior. Finally, I show in a simulated evaluation that the developed framework leads to improvements in quantitative metrics of safety and efficiency.
"Funded by the NASA Space Technology Research Fellowship Program and the National Science Foundation"--Page 6
by Przemyslaw A. Lasota.
Ph.D. in Autonomous Systems, Massachusetts Institute of Technology, Department of Aeronautics and Astronautics.
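For readers unfamiliar with the alignment step the abstract above refers to, the following is a minimal sketch of the conventional, Euclidean-distance alignment that BEST-PTA is designed to improve upon: an open-ended dynamic time warping that maps an observed partial motion onto a reference trajectory and returns the estimated progress index. It is a baseline illustration only, not the Bayesian estimator proposed in the thesis.

```python
import numpy as np

def dtw_partial_alignment(partial, reference):
    """Align an observed prefix of a motion to a reference trajectory using
    Euclidean-distance DTW; return the reference index matching the last observed point."""
    n, m = len(partial), len(reference)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0  # both trajectories are assumed to start together
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(partial[i - 1] - reference[j - 1])
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return int(np.argmin(cost[n, 1:]))  # open end: best-matching progress along the reference

# Example: a 2-D reference reach and a noisy recording of roughly its first 40%.
t = np.linspace(0.0, 1.0, 100)
reference = np.stack([t, np.sin(np.pi * t)], axis=1)
partial = reference[:40] + np.random.normal(scale=0.01, size=(40, 2))
print("estimated progress index:", dtw_partial_alignment(partial, reference))
```

As the abstract notes, this kind of purely Euclidean matching degrades when trajectories overlap themselves or when the person pauses, which is the failure mode BEST-PTA targets.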
Gray Cobb, Susan Valerie. "Perception and orientation issues in human control of robot motion." Thesis, University of Nottingham, 1991. http://eprints.nottingham.ac.uk/11237/.
Huang, Chien-Ming. "Joint attention in human-robot interaction." Thesis, Georgia Institute of Technology, 2010. http://hdl.handle.net/1853/41196.
Full textNarsipura, Sreenivasa Manish. "Modeling of human movement for the generation of humanoid robot motion." Thesis, Toulouse, INPT, 2010. http://www.theses.fr/2010INPT0120/document.
Full textHumanoid robotics is coming of age with faster and more agile robots. To compliment the physical complexity of humanoid robots, the robotics algorithms being developed to derive their motion have also become progressively complex. The work in this thesis spans across two research fields, human neuroscience and humanoid robotics, and brings some ideas from the former to aid the latter. By exploring the anthropological link between the structure of a human and that of a humanoid robot we aim to guide conventional robotics methods like local optimization and task-based inverse kinematics towards more realistic human-like solutions. First, we look at dynamic manipulation of human hand trajectories while playing with a yoyo. By recording human yoyo playing, we identify the control scheme used as well as a detailed dynamic model of the hand-yoyo system. Using optimization this model is then used to implement stable yoyo-playing within the kinematic and dynamic limits of the humanoid HRP-2. The thesis then extends its focus to human and humanoid locomotion. We take inspiration from human neuroscience research on the role of the head in human walking and implement a humanoid robotics analogy to this. By allowing a user to steer the head of a humanoid, we develop a control method to generate deliberative whole-body humanoid motion including stepping, purely as a consequence of the head movement. This idea of understanding locomotion as a consequence of reaching a goal is extended in the final study where we look at human motion in more detail. Here, we aim to draw to a link between “invariants” in neuroscience and “kinematic tasks” in humanoid robotics. We record and extract stereotypical characteristics of human movements during a walking and grasping task. These results are then normalized and generalized such that they can be regenerated for other anthropomorphic figures with different kinematic limits than that of humans. The final experiments show a generalized stack of tasks that can generate realistic walking and grasping motion for the humanoid HRP-2. The general contribution of this thesis is in showing that while motion planning for humanoid robots can be tackled by classical methods of robotics, the production of realistic movements necessitate the combination of these methods with the systematic and formal observation of human behavior
Umali, Antonio. "Framework For Robot-Assisted Doffing of Personal Protective Equipment." Digital WPI, 2016. https://digitalcommons.wpi.edu/etd-theses/940.
Pai, Abhishek. "Distance-Scaled Human-Robot Interaction with Hybrid Cameras." University of Cincinnati / OhioLINK, 2019. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1563872095430977.
Conte, Dean Edward. "Autonomous Robotic Escort Incorporating Motion Prediction with Human Intention." Thesis, Virginia Tech, 2021. http://hdl.handle.net/10919/102581.
Full textMaster of Science
This thesis presents a method for a mobile robot to escort a human to their destination successfully and efficiently. The proposed technique uses human intention to predict the walking path, allowing the robot to stay in front of the human while walking. Human intention is inferred from head direction, an effective and previously validated indicator of intention, and is combined with conventional motion prediction. The robot motion is then determined from the predicted human position, allowing for anticipative autonomous escorting. Experimental analysis shows that incorporating the proposed human intention reduces human position prediction error by approximately 35% when turning. Furthermore, experimental validation with a mobile robotic platform shows escorting up to 50% more accurate than conventional techniques, while achieving a 97% success rate. The proposed escorting interaction method has applications such as touch-less shopping cart robots, exercise companions, collaborative rescue robots, and sanitary transportation for hospitals.
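A minimal sketch of the core idea described here, blending conventional constant-velocity prediction with the heading implied by head direction and placing the robot a fixed lead distance in front, is given below. The blend weight, lead distance, and function names are illustrative assumptions, not the values or implementation from the thesis.

```python
import numpy as np

def predict_and_lead(pos, vel, head_yaw, dt=1.0, w_head=0.6, lead=1.2):
    """Predict the person's next position from velocity and head yaw, then return
    an escort goal a fixed distance in front of that predicted position."""
    speed = np.linalg.norm(vel)
    vel_dir = vel / speed if speed > 1e-6 else np.zeros(2)
    head_dir = np.array([np.cos(head_yaw), np.sin(head_yaw)])
    heading = (1.0 - w_head) * vel_dir + w_head * head_dir   # intention-weighted heading
    heading /= np.linalg.norm(heading) + 1e-9
    predicted_pos = pos + speed * dt * heading               # where the person is likely to be
    robot_goal = predicted_pos + lead * heading              # escort position just ahead
    return predicted_pos, robot_goal

# Example: person walking along +x at 1 m/s while looking 30 degrees to the left (likely turning).
print(predict_and_lead(np.array([0.0, 0.0]), np.array([1.0, 0.0]), np.deg2rad(30.0)))
```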
Nitz Pettersson, Hannes, and Samuel Vikström. "Vision-Based Robot Controller for Human-Robot Interaction Using Predictive Algorithms." Thesis, Mälardalens högskola, Akademin för innovation, design och teknik, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-54609.
Hayne, Rafi. "Toward Enabling Safe & Efficient Human-Robot Manipulation in Shared Workspaces." Digital WPI, 2016. https://digitalcommons.wpi.edu/etd-theses/1012.
Books on the topic "Human-Robot motion"
Martin, W. N. Motion Understanding: Robot and Human Vision. Boston, MA: Springer US, 1988.
Lenarčič, J., and M. M. Stanišić. Advances in robot kinematics: Motion in man and machine. Dordrecht: Springer, 2010.
Martin, W. N., and J. K. Aggarwal, eds. Motion understanding: Robot and human vision. Boston: Kluwer Academic Publishers, 1988.
Noceti, Nicoletta, Alessandra Sciutti, and Francesco Rea. Modelling Human Motion: From Human Perception to Robot Design. Springer, 2020.
Martin, W. N., and J. K. Aggarwal, eds. Motion Understanding: Robot and Human Vision (The International Series in Engineering and Computer Science). Springer, 1988.
Barbagli, Federico, Domenico Prattichizzo, and Kenneth Salisbury, eds. Multi-point Interaction with Real and Virtual Objects (Springer Tracts in Advanced Robotics). Springer, 2005.
Find full textMetta, Giorgio. Humans and humanoids. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780199674923.003.0047.
Full textErdem, Uğur Murat, Nicholas Roy, John J. Leonard, and Michael E. Hasselmo. Spatial and episodic memory. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780199674923.003.0029.
Full textBook chapters on the topic "Human-Robot motion"
Langer, Allison, and Shelly Levy-Tzedek. "Priming and Timing in Human-Robot Interactions." In Modelling Human Motion, 335–50. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-46732-6_16.
Liu, Changliu, Te Tang, Hsien-Chung Lin, and Masayoshi Tomizuka. "Efficiency in Real-Time Motion Planning." In Designing Robot Behavior in Human-Robot Interactions, 61–82. Boca Raton, FL: CRC Press, 2019. http://dx.doi.org/10.1201/9780429058714-4.
Illmann, Jörg, Boris Kluge, Erwin Prassler, and Matthias Strobel. "Statistical Recognition of Motion Patterns." In Advances in Human-Robot Interaction, 69–87. Berlin, Heidelberg: Springer Berlin Heidelberg, 2004. http://dx.doi.org/10.1007/978-3-540-31509-4_7.
Wösch, Thomas, Werner Neubauer, Georg v. Wichert, and Zsolt Kemény. "Motion Planning for Domestic Robot Assistants." In Advances in Human-Robot Interaction, 195–205. Berlin, Heidelberg: Springer Berlin Heidelberg, 2004. http://dx.doi.org/10.1007/978-3-540-31509-4_17.
Ge, Shuzhi Sam, and Yanan Li. "Motion Synchronization for Human-Robot Collaboration." In Social Robotics, 248–57. Berlin, Heidelberg: Springer Berlin Heidelberg, 2012. http://dx.doi.org/10.1007/978-3-642-34103-8_25.
Lohan, Katrin, Muneeb Imtiaz Ahmad, Christian Dondrup, Paola Ardón, Èric Pairet, and Alessandro Vinciarelli. "Adapting Movements and Behaviour to Favour Communication in Human-Robot Interaction." In Modelling Human Motion, 271–97. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-46732-6_13.
Gu, Edward Y. L. "Representations of Rigid Motion." In A Journey from Robot to Digital Human, 49–81. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-39047-0_3.
Prassler, Erwin, Andreas Stopp, Martin Hägele, Ioannis Iossifidis, Gisbert Lawitzky, Gerhard Grunwald, and Rüdiger Dillmann. "4 Co-existence: Physical Interaction and Coordinated Motion." In Advances in Human-Robot Interaction, 161–63. Berlin, Heidelberg: Springer Berlin Heidelberg, 2004. http://dx.doi.org/10.1007/978-3-540-31509-4_14.
Ahmed, Rehan M., Anani V. Ananiev, and Ivan G. Kalaykov. "Compliant Motion Control for Safe Human Robot Interaction." In Robot Motion and Control 2009, 265–74. London: Springer London, 2009. http://dx.doi.org/10.1007/978-1-84882-985-5_24.
Gao, Robert X., Lihui Wang, Peng Wang, Jianjing Zhang, and Hongyi Liu. "Human Motion Recognition and Prediction for Robot Control." In Advanced Human-Robot Collaboration in Manufacturing, 261–82. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-69178-3_11.
Conference papers on the topic "Human-Robot motion"
Potkonjak, Veljko, Vladimir Petrović, Kosta Jovanović, and Dragan Kostić. "Human-Robot Analogy − How Physiology Shapes Human and Robot Motion." In European Conference on Artificial Life 2013. MIT Press, 2013. http://dx.doi.org/10.7551/978-0-262-31709-2-ch021.
Dragan, Anca D., Shira Bauman, Jodi Forlizzi, and Siddhartha S. Srinivasa. "Effects of Robot Motion on Human-Robot Collaboration." In HRI '15: ACM/IEEE International Conference on Human-Robot Interaction. New York, NY, USA: ACM, 2015. http://dx.doi.org/10.1145/2696454.2696473.
Dietz, Griffin, Jane L. E, Peter Washington, Lawrence H. Kim, and Sean Follmer. "Human Perception of Swarm Robot Motion." In CHI '17: CHI Conference on Human Factors in Computing Systems. New York, NY, USA: ACM, 2017. http://dx.doi.org/10.1145/3027063.3053220.
Fukui, Kotaro, Toshihiro Kusano, Yoshikazu Mukaeda, Yuto Suzuki, Atsuo Takanishi, and Masaaki Honda. "Speech robot mimicking human articulatory motion." In Interspeech 2010. ISCA: ISCA, 2010. http://dx.doi.org/10.21437/interspeech.2010-337.
Dragan, Anca, and Siddhartha Srinivasa. "Familiarization to robot motion." In HRI'14: ACM/IEEE International Conference on Human-Robot Interaction. New York, NY, USA: ACM, 2014. http://dx.doi.org/10.1145/2559636.2559674.
Zhou, Allan, Dylan Hadfield-Menell, Anusha Nagabandi, and Anca D. Dragan. "Expressive Robot Motion Timing." In HRI '17: ACM/IEEE International Conference on Human-Robot Interaction. New York, NY, USA: ACM, 2017. http://dx.doi.org/10.1145/2909824.3020221.
Sakata, Wataru, Futoshi Kobayashi, and Hiroyuki Nakamoto. "Robot-human handover based on motion prediction of human." In 2017 6th International Conference on Informatics, Electronics and Vision & 2017 7th International Symposium in Computational Medical and Health Technology (ICIEV-ISCMHT). IEEE, 2017. http://dx.doi.org/10.1109/iciev.2017.8338592.
Molina-Tanco, L., J. P. Bandera, R. Marfil, and F. Sandoval. "Real-time human motion analysis for human-robot interaction." In 2005 IEEE/RSJ International Conference on Intelligent Robots and Systems. IEEE, 2005. http://dx.doi.org/10.1109/iros.2005.1545240.
Kang, Jie, Kai Jia, Fang Xu, Fengshan Zou, Yanan Zhang, and Hengle Ren. "Real-Time Human Motion Estimation for Human Robot Collaboration." In 2018 IEEE 8th Annual International Conference on CYBER Technology in Automation, Control, and Intelligent Systems (CYBER). IEEE, 2018. http://dx.doi.org/10.1109/cyber.2018.8688348.
Thobbi, Anand, Ye Gu, and Weihua Sheng. "Using human motion estimation for human-robot cooperative manipulation." In 2011 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2011). IEEE, 2011. http://dx.doi.org/10.1109/iros.2011.6094904.