Academic literature on the topic 'Vision-based Force Sensing'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Vision-based Force Sensing.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Vision-based Force Sensing"

1

Chawda, Vinay, and Marcia K. O'Malley. "Vision-based force sensing for nanomanipulation." IEEE/ASME Transactions on Mechatronics 16, no. 6 (December 2011): 1177–83. http://dx.doi.org/10.1109/tmech.2010.2093535.

2

Reddy, Annem Narayana, Nandan Maheshwari, Deepak Kumar Sahu, and G. K. Ananthasuresh. "Miniature Compliant Grippers With Vision-Based Force Sensing." IEEE Transactions on Robotics 26, no. 5 (October 2010): 867–77. http://dx.doi.org/10.1109/tro.2010.2056210.

3

Adam, Georges, and David J. Cappelleri. "Towards a real-time 3D vision-based micro-force sensing probe." Journal of Micro-Bio Robotics 16, no. 1 (January 11, 2020): 23–32. http://dx.doi.org/10.1007/s12213-019-00122-2.

4

Ye, X. W., C. Z. Dong, and T. Liu. "Force monitoring of steel cables using vision-based sensing technology: methodology and experimental verification." Smart Structures and Systems 18, no. 3 (September 25, 2016): 585–99. http://dx.doi.org/10.12989/sss.2016.18.3.585.

5

Huang, Xiaoqian, Rajkumar Muthusamy, Eman Hassan, Zhenwei Niu, Lakmal Seneviratne, Dongming Gan, and Yahya Zweiri. "Neuromorphic Vision Based Contact-Level Classification in Robotic Grasping Applications." Sensors 20, no. 17 (August 21, 2020): 4724. http://dx.doi.org/10.3390/s20174724.

Abstract:
In recent years, robotic sorting has become widely used in industry, driven by both necessity and opportunity. In this paper, a novel neuromorphic vision-based tactile sensing approach for robotic sorting applications is proposed. This approach has lower latency and lower power consumption than conventional vision-based tactile sensing techniques. Two Machine Learning (ML) methods, namely Support Vector Machine (SVM) and Dynamic Time Warping-K Nearest Neighbor (DTW-KNN), are developed to classify material hardness, object size, and grasping force. An Event-Based Object Grasping (EBOG) experimental setup is developed to acquire datasets, in which 243 experiments are produced to train the proposed classifiers. Based on the classifiers' predictions, objects can be automatically sorted. If the prediction accuracy is below a certain threshold, the gripper re-adjusts and re-grasps until reaching a proper grasp. The proposed ML method achieves good prediction accuracy, which shows the effectiveness and applicability of the proposed approach. The experimental results show that the developed SVM model outperforms the DTW-KNN model in terms of accuracy and efficiency for real-time contact-level classification.
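The DTW-KNN classifier described in this abstract compares time-series signals with a Dynamic Time Warping distance. As an illustrative sketch (a standard textbook DTW implementation, not the authors' code), the distance between two 1-D sequences can be computed as:

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic Time Warping distance between two 1-D sequences.

    Builds the classic cumulative-cost matrix D, where D[i, j] is the
    minimal warped alignment cost of a[:i] against b[:j].
    """
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # extend the cheapest of: insertion, deletion, or match
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]
```

In a DTW-KNN scheme, this distance replaces the Euclidean metric when voting among the K nearest training sequences.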
6

Sun, Huanbo, Katherine J. Kuchenbecker, and Georg Martius. "A soft thumb-sized vision-based sensor with accurate all-round force perception." Nature Machine Intelligence 4, no. 2 (February 2022): 135–45. http://dx.doi.org/10.1038/s42256-021-00439-3.

Abstract:
Vision-based haptic sensors have emerged as a promising approach to robotic touch due to affordable high-resolution cameras and successful computer vision techniques; however, their physical design and the information they provide do not yet meet the requirements of real applications. We present a robust, soft, low-cost, vision-based, thumb-sized three-dimensional haptic sensor named Insight, which continually provides a directional force-distribution map over its entire conical sensing surface. Constructed around an internal monocular camera, the sensor has only a single layer of elastomer over-moulded on a stiff frame to guarantee sensitivity, robustness and soft contact. Furthermore, Insight uniquely combines photometric stereo and structured light using a collimator to detect the three-dimensional deformation of its easily replaceable flexible outer shell. The force information is inferred by a deep neural network that maps images to the spatial distribution of three-dimensional contact force (normal and shear). Insight has an overall spatial resolution of 0.4 mm, a force magnitude accuracy of around 0.03 N and a force direction accuracy of around five degrees over a range of 0.03–2 N for numerous distinct contacts with varying contact area. The presented hardware and software design concepts can be transferred to a wide variety of robot parts.
7

Jo, Kensei, Yasuaki Kakehi, Kouta Minamizawa, Katsunari Sato, Hideaki Nii, Naoki Kawakami, and Susumu Tachi. "1P1-I08 A Basic Study on Vision-Based Force Vector Sensing with Movable Input Surface." Proceedings of JSME annual Conference on Robotics and Mechatronics (Robomec) 2008 (2008): _1P1-I08_1-_1P1-I08_2. http://dx.doi.org/10.1299/jsmermd.2008._1p1-i08_1.

8

Liu, Lijun, Xian Yi Li, and Hong Ming Gao. "The Friction Interaction Mechanism of Welding Seam Identifying Based on Force Sensing." Advanced Materials Research 291-294 (July 2011): 995–98. http://dx.doi.org/10.4028/www.scientific.net/amr.291-294.995.

Abstract:
Welding seam identification (WSI) is a precondition for remote welding. The welding seam is usually identified by a vision sensor, and investigations of WSI based on force sensing are rarely reported. Because the interaction mechanism of friction in the six-dimensional (6D) force of WSI is not clear, the influence of friction on the WSI feed and direction is studied. The experimental results show that friction decreases the WSI feed in the XSY plane and increases it in the Z direction, but both remain within the range permitted by WSI. Friction makes the WSI feed direction point to the middle of the welding seam track and does not otherwise influence WSI. Using these results, WSI of an S groove is achieved. The average deviation of WSI is less than ±0.5 mm when friction is present in the 6D force of WSI, which meets the precision required for WSI in remote welding.
9

Min, Kyung-Won, Seok-Jung Jang, and Junhee Kim. "A Standalone Vision Sensing System for Pseudodynamic Testing of Tuned Liquid Column Dampers." Journal of Sensors 2016 (2016): 1–11. http://dx.doi.org/10.1155/2016/8152651.

Abstract:
Experimental investigation of the tuned liquid column damper (TLCD) is a primary factory task prior to its installation at a site and is mainly undertaken by a pseudodynamic test. In this study, a noncontact standalone vision sensing system is developed to replace the series of conventional sensors installed on the TLCD under test. The fast vision sensing system is based on binary pixel counting of the portion of images streamed in a pseudodynamic test and achieves near real-time measurements of wave height, lateral motion, and control force of the TLCD. The versatile measurements of the system are theoretically and experimentally evaluated through a wide range of lab-scale dynamic tests.
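The binary pixel-counting idea in this abstract can be sketched in a few lines. The threshold value and the dark-liquid/bright-background convention below are assumptions for illustration, not details from the paper:

```python
import numpy as np

def wave_height_pixels(gray_column, threshold=128):
    """Count liquid pixels in one grayscale image column.

    Pixels darker than `threshold` are treated as liquid; their count is
    the liquid-column height in pixels. Multiplying by a calibrated
    mm-per-pixel scale would give a physical wave height.
    """
    col = np.asarray(gray_column)
    return int(np.count_nonzero(col < threshold))
```

Applied column by column across a streamed frame, this yields a height profile of the liquid surface at frame rate.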
10

Huang, Shouren, Bergström Niklas, Yuji Yamakawa, Taku Senoo, and Masatoshi Ishikawa. "Robotic Contour Tracing with High-Speed Vision and Force-Torque Sensing based on Dynamic Compensation Scheme." IFAC-PapersOnLine 50, no. 1 (July 2017): 4616–22. http://dx.doi.org/10.1016/j.ifacol.2017.08.654.


Dissertations / Theses on the topic "Vision-based Force Sensing"

1

Zhang, Zhongkai. "Vision-based calibration, position control and force sensing for soft robots." Thesis, Lille 1, 2019. http://www.theses.fr/2019LIL1I001/document.

Abstract:
The modeling of soft robots, which theoretically have infinite degrees of freedom, is extremely difficult, especially when the robots have complex configurations. This difficulty of modeling leads to new challenges for the calibration and control design of the robots, but also to new opportunities in the form of possible new force-sensing strategies. This dissertation aims to provide new and general solutions using modeling and vision. The thesis first presents a discrete-time kinematic model for soft robots based on the real-time finite element (FE) method. Then, a vision-based simultaneous calibration of the sensor-robot system and the actuators is investigated. Two closed-loop position controllers are designed. In addition, to deal with the problem of image-feature loss, a switched control strategy is proposed that combines an open-loop controller with a closed-loop controller. Because of the deformability of soft structures, a soft robot can itself be used as a force sensor: two methods (marker-based and marker-free) of external force sensing for soft robots are proposed, based on the fusion of vision-based measurements and the FE model. Using both methods, not only the intensities but also the locations of the external forces can be estimated. As a specific application, a cable-driven continuum catheter robot subject to contacts is modeled with the FE method. The robot is then controlled by a decoupled control strategy that allows insertion and bending to be controlled independently. Both the control inputs and the contact forces along the entire catheter can be computed by solving a quadratic programming problem with a linear complementarity constraint (QPCC).
2

Li, Jichun. "Tissue diagnosis probe based on stiffness measurement using vision and force sensing modalities." Thesis, King's College London (University of London), 2013. https://kclpure.kcl.ac.uk/portal/en/theses/tissue-diagnosis-probe-based-on-stiffness-measurement-using-vision-and-force-sensing-modalities(2157fcfd-4155-43c8-ab70-2d8fad7ee622).html.

Abstract:
This thesis presents the creation of a novel tissue diagnosis probe based on the measurement of stiffness and force during mechanical tool-tissue interactions. The probe, using force and vision sensing modalities, was created for tissue diagnosis in medical applications, especially for robot-assisted minimally invasive surgery (MIS), to provide the sensing modalities necessary for haptic feedback. Using the developed prototypes, the mechanical properties of ex-vivo human prostate tissues were estimated with the finite element analysis (FEA) method and the Newton-Raphson algorithm. A clinical study of prostate tumour identification was carried out on ex-vivo prostate samples, and a study on robotic palpation, using a second prototype developed as part of this project and comparing it to manual palpation, was conducted. With the aim of measuring the indentation depth and the corresponding tissue reaction force simultaneously to obtain stiffness information, a prototype of a stiffness probe was constructed, consisting of a commercial digital camera and a force sensor. The effectiveness and sensitivity of the designed probe were validated through experiments on silicone phantoms and animal organs. The results showed that the probe could perform stiffness measurements and localize tissue abnormalities when indenting or sliding over the target surfaces. In order to investigate the mechanical properties of the ex-vivo human prostate using the developed probe, a portable sliding-indenter robot integrating the probe and the Phantom Omni device was created. Based on force-displacement measurements of the probe-soft-tissue interaction, inverse finite element analysis (FEA) and the generalised Newton-Raphson algorithm were used to estimate unknown parameters, including the shear modulus of the ex-vivo human prostate.
The prostate was modelled as a nonlinear hyperelastic material (using the Arruda-Boyce model) in the finite element modelling software package ABAQUS 6.8-1. The results indicated that the proposed model can estimate the mechanical properties of the ex-vivo human prostate effectively. With the aim of identifying the stiffness of normal and cancerous prostate tissue, the prostates of 26 male patients were examined using the developed sliding-indenter robot. Three-dimensional (3D) stiffness maps of the ex-vivo human prostate were created. The stiffness maps were correlated with other clinical examinations, including Magnetic Resonance Imaging (MRI), Digital Rectal Examination (DRE), histology, and ultrasound-guided biopsy. The proposed probe proved to be a promising platform for distinguishing between cancerous and healthy prostate tissue and for discriminating pathological tissue variations. In addition, the results provided quantitative information for the diagnosis and localization of prostate cancer. To ensure the proposed probe is suitable for MIS applications, a further prototype of the stiffness probe, based on optic-fibre force sensing replacing the commercially available force sensor, was created. A study on robotic palpation using the developed probe, comparing it to manual palpation, was conducted. The results indicated that robotic palpation was more effective than manual palpation conducted by experienced surgeons.
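The stiffness measurement underlying this probe reduces, in its simplest linear form, to the slope of the force-displacement curve recorded during indentation. The thesis itself uses inverse FEA with a hyperelastic model; the linear least-squares fit below is only a simplified stand-in for illustration:

```python
import numpy as np

def estimate_stiffness(displacement_m, force_n):
    """Effective linear stiffness (N/m) from paired indentation-depth and
    reaction-force samples, via a least-squares fit of f = k*d + c.
    """
    d = np.asarray(displacement_m, dtype=float)
    f = np.asarray(force_n, dtype=float)
    k, c = np.polyfit(d, f, 1)  # slope k is the stiffness estimate
    return k
```

Repeating this fit at each probed location is one simple way to build the kind of spatial stiffness map the thesis describes, with stiffer-than-baseline regions flagged as potential abnormalities.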

Book chapters on the topic "Vision-based Force Sensing"

1

Wei, Haibin, Manjia Su, and Yisheng Guan. "Semi-autonomous Robotic Manipulation by Tele-Operation with Master-Slave Robots and Autonomy Based on Vision and Force Sensing." In Intelligent Robotics and Applications, 36–46. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-89095-7_4.

2

Evangeliou, Nikolaos, Athanasios Tsoukalas, Nikolaos Giakoumidis, Steffen Holter, and Anthony Tzes. "Development of a Versatile Modular Platform for Aerial Manipulators." In Service Robotics. IntechOpen, 2020. http://dx.doi.org/10.5772/intechopen.94027.

Abstract:
The scope of this chapter is the development of an aerial manipulator platform using an octarotor drone with an attached manipulator. An on-board spherical camera provides visual information about the drone's surroundings, while a Pan-Tilt-Zoom camera system is used to track targets. A powerful computer with a GPU offers significant on-board computational power for the visual servoing of the aerial manipulator system. This vision system, along with the Inertial Measurement Unit based controller, provides exemplary guidance in confined and outdoor spaces. Coupled with the manipulator's force sensing capabilities, the system can interact with the environment. This aerial manipulation system is modular with respect to attaching various payloads depending on the application (e.g., environmental sensing, facade cleaning, aerial netting for evader-drone geofencing, and others). Experimental studies using a motion capture system are offered to validate the system's efficiency.

Conference papers on the topic "Vision-based Force Sensing"

1

Gupta, Abhishek, Volkan Patoglu, and Marcia K. O'Malley. "Vision Based Force Sensing for Nanorobotic Manipulation." In ASME 2006 International Mechanical Engineering Congress and Exposition. ASMEDC, 2006. http://dx.doi.org/10.1115/imece2006-15111.

Abstract:
Over the last decade, considerable interest has been generated in building and manipulating nanoscale structures. Applications of nanomanipulation include study of nanoparticles, molecules, DNA and viruses, and bottom-up nanoassembly. We propose a Nanomanipulation System using the Zyvex S100 nanomanipulator, which operates within a scanning electron microscope (SEM), as its primary component. The primary advantage of the S100 setup over standard scanning probe microscopy based nanomanipulators is the ability to see the object during manipulation. Relying on visual feedback alone to control the nanomanipulator is not preferable due to perceptual limitations of depth and contact within the SEM. To improve operator performance over visual feedback alone, an impedance-controlled bilateral teleoperation setup is envisioned. Lack of on-board force sensors on the S100 system is the primary hindrance in the realization of the proposed architecture. In this paper, we present a computer vision based force sensing scheme. The advantages of this sensing strategy include its low cost and lack of requirement of hardware modification(s). Force sensing is implemented using an atomic force microscopy (AFM) probe attached to the S100 end-effector. Deformation of the cantilever probe is monitored using a Hough transform based algorithm. These deformations are mapped to corresponding end-effector forces following the Euler-Bernoulli beam mechanics model. The forces thus sensed can be used to provide force-feedback to the operator through a master manipulator.
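The deflection-to-force mapping this abstract describes follows the Euler-Bernoulli model for an end-loaded cantilever, F = 3EIδ/L³. A minimal sketch of that mapping, using hypothetical silicon AFM probe dimensions (the actual probe geometry is not given in the abstract):

```python
def tip_force_from_deflection(deflection_m, E=169e9, w=30e-6, t=2e-6, L=300e-6):
    """Map a vision-measured cantilever tip deflection (m) to tip force (N).

    Euler-Bernoulli model for an end-loaded rectangular cantilever:
        F = 3 * E * I * deflection / L**3
    E, w, t, L (Young's modulus, width, thickness, length) are assumed,
    illustrative values, not the probe used in the paper.
    """
    I = w * t**3 / 12.0        # second moment of area, rectangular section
    k = 3.0 * E * I / L**3     # equivalent cantilever stiffness (N/m)
    return k * deflection_m
```

In the pipeline the abstract describes, the deflection input would come from a Hough-transform fit of the cantilever line in each SEM image, and the resulting force would be rendered to the operator through the haptic master.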
2

Greminger, Michael A., and Bradley J. Nelson. "Vision-based force sensing at nanonewton scales." In Intelligent Systems and Advanced Manufacturing, edited by Bradley J. Nelson and Jean-Marc Breguet. SPIE, 2001. http://dx.doi.org/10.1117/12.444147.

3

Zhang, Tao, Yang Cong, Xiaomao Li, and Yan Peng. "Robot Tactile Sensing: Vision Based Tactile Sensor for Force Perception." In 2018 IEEE 8th Annual International Conference on CYBER Technology in Automation, Control, and Intelligent Systems (CYBER). IEEE, 2018. http://dx.doi.org/10.1109/cyber.2018.8688163.

4

Adam, Georges, and David J. Cappelleri. "Design of a 3D Vision-based Micro-Force Sensing Probe." In 2019 International Conference on Manipulation, Automation and Robotics at Small Scales (MARSS). IEEE, 2019. http://dx.doi.org/10.1109/marss.2019.8860957.

5

Chang, R. J., and C. C. Shiu. "Vision-based control of SMA-actuated polymer microgripper with force sensing." In 2011 IEEE International Conference on Mechatronics and Automation (ICMA). IEEE, 2011. http://dx.doi.org/10.1109/icma.2011.5986304.

6

Belharet, Karim, David Folio, and Antoine Ferreira. "Vision-based force sensing of a magnetic microrobot in a viscous flow." In 2014 IEEE International Conference on Robotics and Automation (ICRA). IEEE, 2014. http://dx.doi.org/10.1109/icra.2014.6907133.

7

Zhang, Yanxia, and Chaojun Wang. "Research on Robot Manipulator Servo Control Based on Force and Vision Sensing." In 2013 5th International Conference on Intelligent Human-Machine Systems and Cybernetics (IHMSC). IEEE, 2013. http://dx.doi.org/10.1109/ihmsc.2013.160.

8

Li, Jichun, Hongbin Liu, K. Althoefer, and L. D. Seneviratne. "A stiffness probe based on force and vision sensing for soft tissue diagnosis." In 2012 34th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC). IEEE, 2012. http://dx.doi.org/10.1109/embc.2012.6346088.

9

Zhang, Peiyu, Wen Ying, and Seongkook Heo. "Fringer: A Finger-Worn Passive Device Enabling Computer Vision Based Force Sensing Using Moiré Fringes." In UIST '22: The 35th Annual ACM Symposium on User Interface Software and Technology. New York, NY, USA: ACM, 2022. http://dx.doi.org/10.1145/3526114.3558706.

10

Adam, Georges, Mazin Hakim, Luis Solorio, and David J. Cappelleri. "Stiffness Characterization and Micromanipulation for Biomedical Applications using the Vision-based Force-Sensing Magnetic Mobile Microrobot." In 2020 International Conference on Manipulation, Automation and Robotics at Small Scales (MARSS). IEEE, 2020. http://dx.doi.org/10.1109/marss49294.2020.9307874.

