Selection of scientific literature on the topic "Deep Learning and Perception for Grasping and Manipulation"
Cite a source in APA, MLA, Chicago, Harvard, or any other citation style
Table of contents
Browse the lists of recent articles, books, theses, reports, and other scholarly sources on the topic "Deep Learning and Perception for Grasping and Manipulation".
Next to every work in the bibliography, an "Add to bibliography" option is available. Use it, and the bibliographic reference for the selected work will be formatted automatically in the required citation style (APA, MLA, Harvard, Chicago, Vancouver, etc.).
You can also download the full text of the scholarly publication as a PDF and read its online annotation, provided the relevant parameters are available in the metadata.
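The automatic formatting described above can be sketched in a few lines of code. The following is a hypothetical illustration only: the `record` fields, the `apa` and `mla` helpers, and the exact output patterns are simplifying assumptions, not the site's actual implementation or the full APA/MLA rules.

```python
# One bibliographic record as structured data (fields are illustrative).
record = {
    "authors": ["Han, Dong", "Nie, Hong"],
    "year": 2018,
    "title": "Multi-modal haptic image recognition based on deep learning",
    "journal": "Sensor Review",
    "volume": 38,
    "issue": 4,
    "pages": "486-93",
    "doi": "10.1108/sr-08-2017-0160",
}

def apa(rec):
    # Rough APA-like journal pattern: Authors (Year). Title. Journal, Vol(Issue), Pages. DOI
    authors = ", ".join(rec["authors"])
    return (f'{authors} ({rec["year"]}). {rec["title"]}. '
            f'{rec["journal"]}, {rec["volume"]}({rec["issue"]}), {rec["pages"]}. '
            f'https://doi.org/{rec["doi"]}')

def mla(rec):
    # Rough MLA-like pattern: Authors. "Title." Journal, vol. V, no. N, Year, pp. Pages.
    authors = ", and ".join(rec["authors"])
    return (f'{authors}. "{rec["title"]}." {rec["journal"]}, '
            f'vol. {rec["volume"]}, no. {rec["issue"]}, {rec["year"]}, pp. {rec["pages"]}.')

print(apa(record))
print(mla(record))
```

The point of the sketch is that a single structured record can be rendered into any requested style; switching styles is just a different formatting function over the same fields.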
Journal articles on the topic "Deep Learning and Perception for Grasping and Manipulation"
Han, Dong, Hong Nie, Jinbao Chen, Meng Chen, Zhen Deng, and Jianwei Zhang. "Multi-modal haptic image recognition based on deep learning." Sensor Review 38, no. 4 (2018): 486–93. http://dx.doi.org/10.1108/sr-08-2017-0160.
Valarezo Añazco, Edwin, Sara Guerrero, Patricio Rivera Lopez, Ji-Heon Oh, Ga-Hyeon Ryu, and Tae-Seong Kim. "Deep Learning-Based Ensemble Approach for Autonomous Object Manipulation with an Anthropomorphic Soft Robot Hand." Electronics 13, no. 2 (2024): 379. http://dx.doi.org/10.3390/electronics13020379.
Wang, Cong, Qifeng Zhang, Qiyan Tian, Shuo Li, Xiaohui Wang, David Lane, Yvan Petillot, and Sen Wang. "Learning Mobile Manipulation through Deep Reinforcement Learning." Sensors 20, no. 3 (2020): 939. http://dx.doi.org/10.3390/s20030939.
Zhao, Wenhui, Bin Xu, and Xinzhong Wu. "Robot grasping system based on deep learning target detection." Journal of Physics: Conference Series 2450, no. 1 (2023): 012071. http://dx.doi.org/10.1088/1742-6596/2450/1/012071.
Zhou, Hongyu, Jinhui Xiao, Hanwen Kang, Xing Wang, Wesley Au, and Chao Chen. "Learning-Based Slip Detection for Robotic Fruit Grasping and Manipulation under Leaf Interference." Sensors 22, no. 15 (2022): 5483. http://dx.doi.org/10.3390/s22155483.
Zhang, Ruihua, Xujun Chen, Zhengzhong Wan, Meng Wang, and Xinqing Xiao. "Deep Learning-Based Oyster Packaging System." Applied Sciences 13, no. 24 (2023): 13105. http://dx.doi.org/10.3390/app132413105.
Liu, Ning, Cangui Guo, Rongzhao Liang, and Deping Li. "Collaborative Viewpoint Adjusting and Grasping via Deep Reinforcement Learning in Clutter Scenes." Machines 10, no. 12 (2022): 1135. http://dx.doi.org/10.3390/machines10121135.
Han, Dong, Beni Mulyana, Vladimir Stankovic, and Samuel Cheng. "A Survey on Deep Reinforcement Learning Algorithms for Robotic Manipulation." Sensors 23, no. 7 (2023): 3762. http://dx.doi.org/10.3390/s23073762.
Mohammed, Marwan Qaid, Lee Chung Kwek, Shing Chyi Chua, Abdulaziz Salamah Aljaloud, Arafat Al-Dhaqm, Zeyad Ghaleb Al-Mekhlafi, and Badiea Abdulkarem Mohammed. "Deep Reinforcement Learning-Based Robotic Grasping in Clutter and Occlusion." Sustainability 13, no. 24 (2021): 13686. http://dx.doi.org/10.3390/su132413686.
Sayour, Malak H., Sharbel E. Kozhaya, and Samer S. Saab. "Autonomous Robotic Manipulation: Real-Time, Deep-Learning Approach for Grasping of Unknown Objects." Journal of Robotics 2022 (2022): 1–14. http://dx.doi.org/10.1155/2022/2585656.
Dissertations and theses on the topic "Deep Learning and Perception for Grasping and Manipulation"
Zapata-Impata, Brayan S. "Robotic manipulation based on visual and tactile perception." Doctoral thesis, Universidad de Alicante, 2020. http://hdl.handle.net/10045/118217.
This doctoral thesis has been carried out with the support of the Spanish Ministry of Economy, Industry and Competitiveness through the grant BES-2016-078290.
Tahoun, Mohamed. "Object Shape Perception for Autonomous Dexterous Manipulation Based on Multi-Modal Learning Models." Electronic thesis or dissertation, INSA Centre Val de Loire, Bourges, 2021. http://www.theses.fr/2021ISAB0003.
This thesis proposes 3D object reconstruction methods based on multimodal deep learning strategies, targeting robotic manipulation applications. First, the thesis proposes a 3D visual reconstruction method from a single view of the object obtained by an RGB-D sensor. Then, to improve the quality of single-view 3D reconstruction, a new method combining visual and tactile information is proposed, based on a learned reconstruction model. The proposed method was validated on a visual-tactile dataset that respects the kinematic constraints of a multi-fingered robotic hand. This dataset, created as part of the PhD work, is unique in the literature and constitutes a further contribution of the thesis. The validation results show that tactile information can contribute substantially to predicting the complete shape of an object, especially the part that is not visible to the RGB-D sensor, and that the proposed model outperforms the best-performing state-of-the-art methods.
Morrison, Douglas. "Robotic grasping in unstructured and dynamic environments." Thesis, Queensland University of Technology, 2021. https://eprints.qut.edu.au/207886/1/Douglas_Morrison_Thesis.pdf.
Book chapters on the topic "Deep Learning and Perception for Grasping and Manipulation"
Blank, Andreas, Lukas Zikeli, Sebastian Reitelshöfer, Engin Karlidag, and Jörg Franke. "Augmented Virtuality Input Demonstration Refinement Improving Hybrid Manipulation Learning for Bin Picking." In Lecture Notes in Mechanical Engineering, 332–41. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-18326-3_32.
Mehman Sefat, Amir, Saad Ahmad, Alexandre Angleraud, Esa Rahtu, and Roel Pieters. "Robotic grasping in agile production." In Deep Learning for Robot Perception and Cognition, 407–33. Elsevier, 2022. http://dx.doi.org/10.1016/b978-0-32-385787-1.00021-x.
Kantor, George, and Francisco Yandun. "Advances in grasping techniques in agricultural robots." In Burleigh Dodds Series in Agricultural Science, 355–86. Burleigh Dodds Science Publishing, 2024. http://dx.doi.org/10.19103/as.2023.0124.09.
"Deep learning techniques for modelling human manipulation and its translation for autonomous robotic grasping with soft end-effe." In AI for Emerging Verticals: Human-robot computing, sensing and networking, 3–28. Institution of Engineering and Technology, 2020. http://dx.doi.org/10.1049/pbpc034e_ch1.
Der volle Inhalt der QuelleKonferenzberichte zum Thema "Deep Learning and Perception for Grasping and Manipulation"
Chu, You-Rui, Haiyue Zhu, and Zhiping Lin. "Intelligent 6-DoF Robotic Grasping and Manipulation System Using Deep Learning." In International Conference of Asian Society for Precision Engineering and Nanotechnology. Singapore: Research Publishing Services, 2022. http://dx.doi.org/10.3850/978-981-18-6021-8_or-02-0217.html.
Zhang, Chi, and Yingzhao Zhu. "A review of robot grasping tactile perception based on deep learning." In Third International Conference on Control and Intelligent Robotics (ICCIR 2023), edited by Kechao Wang and M. Vijayalakshmi. SPIE, 2023. http://dx.doi.org/10.1117/12.3011588.
Pavlichenko, Dmytro, and Sven Behnke. "Deep Reinforcement Learning of Dexterous Pre-Grasp Manipulation for Human-Like Functional Categorical Grasping." In 2023 IEEE 19th International Conference on Automation Science and Engineering (CASE). IEEE, 2023. http://dx.doi.org/10.1109/case56687.2023.10260385.
Fang, Jianhao, Weifei Hu, Chuxuan Wang, Zhenyu Liu, and Jianrong Tan. "Deep Reinforcement Learning Enhanced Convolutional Neural Networks for Robotic Grasping." In ASME 2021 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2021. http://dx.doi.org/10.1115/detc2021-67225.
Lu, Jingpei, Ambareesh Jayakumari, Florian Richter, Yang Li, and Michael C. Yip. "SuPer Deep: A Surgical Perception Framework for Robotic Tissue Manipulation using Deep Learning for Feature Extraction." In 2021 IEEE International Conference on Robotics and Automation (ICRA). IEEE, 2021. http://dx.doi.org/10.1109/icra48506.2021.9561249.
Rakhimkul, Sanzhar, Anton Kim, Askarbek Pazylbekov, and Almas Shintemirov. "Autonomous Object Detection and Grasping Using Deep Learning for Design of an Intelligent Assistive Robot Manipulation System." In 2019 IEEE International Conference on Systems, Man and Cybernetics (SMC). IEEE, 2019. http://dx.doi.org/10.1109/smc.2019.8914465.
Imran, Alishba, William Escobar, and Fred Barez. "Design of an Affordable Prosthetic Arm Equipped With Deep Learning Vision-Based Manipulation." In ASME 2021 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 2021. http://dx.doi.org/10.1115/imece2021-68714.
Chen, Zhu, Xiao Liang, and Minghui Zheng. "Including Image-Based Perception in Disturbance Observer for Warehouse Drones." In ASME 2020 Dynamic Systems and Control Conference. American Society of Mechanical Engineers, 2020. http://dx.doi.org/10.1115/dscc2020-3284.