To see the other types of publications on this topic, follow the link: Robotic sensors and control.

Journal articles on the topic 'Robotic sensors and control'


Consult the top 50 journal articles for your research on the topic 'Robotic sensors and control.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse journal articles in a wide variety of disciplines and organise your bibliography correctly.

1

Kramer, Kathleen A., and Stephen C. Stubberud. "Control Loop Sensor Calibration Using Neural Networks for Robotic Control." Journal of Robotics 2011 (2011): 1–8. http://dx.doi.org/10.1155/2011/845685.

Abstract:
Whether a sensor model's inaccuracies result from poor initial modeling or from sensor damage or drift, the effects can be equally detrimental. Sensor modeling errors result in poor state estimation. This, in turn, can cause a control system that relies on the sensor's measurements to become unstable, such as in robotics where the control system is applied to allow autonomous navigation. A technique referred to as the neural extended Kalman filter (NEKF) is developed to provide state estimation in a control loop and to learn the difference between the true sensor dynamics and the sensor model. The technique requires multiple sensors on the control system so that the properly operating and modeled sensors can be used as truth. The NEKF trains a neural network on-line using the same residuals as the state estimation. The resulting sensor model can then be reincorporated fully into the system to provide added estimation capability and redundancy.
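
As a rough illustration of the NEKF idea summarized above, the sketch below augments an ordinary extended Kalman filter state with the weights of a small radial-basis network so that the same measurement residual both corrects the state and trains the sensor-model correction. The linear process model, scalar measurement, network form and all dimensions are assumptions for illustration, not the authors' formulation.

```python
import numpy as np

def rbf_features(x, centers, width=1.0):
    """Radial-basis features of the state; the correction network's hidden layer."""
    return np.exp(-np.sum((x[None, :] - centers) ** 2, axis=1) / (2 * width ** 2))

def nekf_step(x, w, P, z, F, Q, R, H_nominal, centers):
    """One predict/update cycle of an EKF whose state is augmented with the
    weights w of a network correcting the nominal sensor model.
    Q is the (n+m)x(n+m) process noise, R the 1x1 measurement noise."""
    n, m = x.size, w.size
    # predict: plant states propagate through F, network weights follow a random walk
    x_pred = F @ x
    w_pred = w
    Phi = np.block([[F, np.zeros((n, m))], [np.zeros((m, n)), np.eye(m)]])
    P_pred = Phi @ P @ Phi.T + Q
    # measurement prediction: nominal sensor model plus the learned correction
    phi = rbf_features(x_pred, centers)
    z_hat = H_nominal @ x_pred + phi @ w_pred
    # Jacobian w.r.t. [x; w]; the feature Jacobian w.r.t. x is neglected for brevity
    H_aug = np.hstack([H_nominal, phi[None, :]])
    residual = z - z_hat                      # the same residual estimates the states
    S = H_aug @ P_pred @ H_aug.T + R          # ... and trains the network weights
    K = P_pred @ H_aug.T @ np.linalg.inv(S)
    aug = np.concatenate([x_pred, w_pred]) + K @ residual
    P_new = (np.eye(n + m) - K @ H_aug) @ P_pred
    return aug[:n], aug[n:], P_new
```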
2

Kazerooni, H., Mark S. Evans, and J. Jones. "Hydrostatic Force Sensor for Robotic Applications." Journal of Dynamic Systems, Measurement, and Control 119, no. 1 (March 1, 1997): 115–19. http://dx.doi.org/10.1115/1.2801201.

Abstract:
This article presents a theoretical and experimental investigation of a new kind of force sensor which detects forces by measuring an induced pressure change in a material of large Poisson’s ratio. In this investigation, we develop mathematical expressions for the sensor’s sensitivity and bandwidth, and show that its sensitivity can be much larger and its bandwidth is usually smaller than those of existing strain-gage-type sensors. This force sensor is well-suited for measuring large but slowly varying forces. It can be installed in a space smaller than that required for existing sensors. This paper also discusses the effects of various parameters on the sensor’s performance and on failure modes. To verify the theoretical derivation, a prototype force sensor was designed and built. This prototype hydrostatic force sensor can measure the compressive forces up to 7200 lbf and tensile forces up to 3500 lbf.
3

Cheng, Teddy M., and Andrey V. Savkin. "Decentralized control for mobile robotic sensor network self-deployment: barrier and sweep coverage problems." Robotica 29, no. 2 (April 16, 2010): 283–94. http://dx.doi.org/10.1017/s0263574710000147.

Abstract:
This paper addresses the problems of barrier coverage and sweep coverage in a corridor environment with a network of self-deployed mobile autonomous robotic sensors. Using the ideas of nearest neighbor rules and information consensus, we propose a decentralized control law for the robotic sensors to solve the coverage problems. Numerical simulations illustrate the effectiveness of the proposed algorithm. The results in this paper demonstrate that such simple motion coordination rules can play a significant role in addressing the issue of coverage in a mobile robotic sensor network.
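
As a flavour of the nearest-neighbour/consensus style of rule described above (not the authors' actual control law), the sketch below spreads robotic sensors evenly between two corridor endpoints by repeatedly moving each one toward the midpoint of its neighbours; the gain, endpoints and robot count are arbitrary choices.

```python
import numpy as np

def barrier_step(x, left_end, right_end, gain=0.5):
    """Move each robotic sensor toward the midpoint of its two neighbours;
    the corridor endpoints act as fixed neighbours for the outermost robots."""
    x = np.sort(x)
    new_x = x.copy()
    for i in range(len(x)):
        left = left_end if i == 0 else x[i - 1]
        right = right_end if i == len(x) - 1 else x[i + 1]
        new_x[i] = x[i] + gain * (0.5 * (left + right) - x[i])
    return new_x

positions = np.random.uniform(0.0, 10.0, size=8)   # arbitrary initial self-deployment
for _ in range(200):
    positions = barrier_step(positions, 0.0, 10.0)
print(np.round(positions, 2))  # converges to an evenly spaced barrier across [0, 10]
```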
4

Zhu, Lingfeng, Yancheng Wang, Deqing Mei, and Chengpeng Jiang. "Development of Fully Flexible Tactile Pressure Sensor with Bilayer Interlaced Bumps for Robotic Grasping Applications." Micromachines 11, no. 8 (August 12, 2020): 770. http://dx.doi.org/10.3390/mi11080770.

Abstract:
Flexible tactile sensors have been utilized in intelligent robotics for human-machine interaction and healthcare monitoring. The relatively low flexibility, unbalanced sensitivity and limited sensing range of existing tactile sensors hinder accurate tactile perception during robotic hand grasping of different objects. This paper develops a fully flexible tactile pressure sensor, using flexible graphene and silver composites as the sensing element and stretchable electrodes, respectively. In the structural design of the tactile sensor, the proposed bilayer interlaced bumps convert external pressure into stretching of the graphene composites. The fabricated tactile sensor exhibits high sensing performance, including relatively high sensitivity (up to 3.40% kPa−1), a wide sensing range (200 kPa), good dynamic response, and considerable repeatability. The tactile sensor has then been integrated with a robotic hand finger, and the grasping results indicate its capability to detect distributed pressure during grasping. The grasping motions and the properties of the objects can be further analyzed through the acquired tactile information in the time and spatial domains, demonstrating the potential applications of the tactile sensor in intelligent robotics and human-machine interfaces.
5

Brüggenwirth, Stefan, and Fernando Rial. "Robotic control for cognitive UWB radar." Encyclopedia with Semantic Computing and Robotic Intelligence 02, no. 01 (June 2018): 1850009. http://dx.doi.org/10.1142/s2529737618500090.

Abstract:
In the paper, we describe a trajectory planning problem for a six-DoF robotic manipulator arm that carries an ultra-wideband (UWB) radar sensor with synthetic aperture (SAR). The resolution depends on the trajectory and velocity profile of the sensor head. The constraints can be modeled as an optimization problem to obtain a feasible, collision-free target trajectory of the end-effector of the manipulator arm in Cartesian coordinates that minimizes observation time. For 3D reconstruction, the target is observed in multiple height slices. For through-the-wall radar the sensor can be operated in sliding mode for scanning larger areas. For IED inspection the spotlight mode is preferred, constantly pointing the antennas towards the target to obtain maximum azimuth resolution. UWB sensors typically use a wide spectrum shared by other RF communication systems. This may become a limiting factor on system sensitivity and severely degrade the image quality. Cognitive radars can adapt dynamically their bandwidth, frequency and other transmit parameters to the radio frequency environment to avoid interference with primary users.
6

Cheng, Teddy M., and Andrey V. Savkin. "Self-deployment of mobile robotic sensor networks for multilevel barrier coverage." Robotica 30, no. 4 (August 8, 2011): 661–69. http://dx.doi.org/10.1017/s0263574711000877.

Abstract:
We study a problem of K-barrier coverage by employing a network of self-deployed, autonomous mobile robotic sensors. A decentralized coordination algorithm is proposed for the robotic sensors to address the coverage problem. The algorithm is developed based on some simple rules that only rely on local information. By applying the algorithm to the robotic sensors, K layers of sensor barriers are formed to cover the region between two given points. To illustrate the proposed algorithm, numerical simulations are carried out for a number of scenarios.
7

Setiawan, Joga Dharma, Mochammad Ariyanto, M. Munadi, Muhammad Mutoha, Adam Glowacz, and Wahyu Caesarendra. "Grasp Posture Control of Wearable Extra Robotic Fingers with Flex Sensors Based on Neural Network." Electronics 9, no. 6 (May 29, 2020): 905. http://dx.doi.org/10.3390/electronics9060905.

Abstract:
This study proposes a data-driven control method for extra robotic fingers that assist a user in bimanual object manipulation requiring two hands. The robotic system comprises two main parts, i.e., a robotic thumb (RT) and robotic fingers (RF). The RT is attached next to the user's thumb, while the RF is located next to the user's little finger. The grasp postures of the RT and RF are driven by bending-angle inputs from flex sensors attached to the thumb and other fingers of the user. A modified glove sensor is developed by attaching three flex sensors to the thumb, index, and middle fingers of the wearer. Various hand gestures are then mapped using a neural network. The inputs to the robotic system are the bending angles of the thumb and index finger, read by the flex sensors, and the outputs are the commanded servo angles for the RF and RT. The third flex sensor is attached to the middle finger to hold the extra robotic fingers' posture. Two force-sensitive resistors (FSRs) are attached to the RF and RT for haptic feedback when the robot is worn to grasp a fragile object, such as an egg. The trained neural network is embedded into the wearable extra robotic fingers to control the robotic motion and assist the human fingers in bimanual object manipulation tasks. The developed extra fingers are tested for their capacity to assist the human fingers in 10 different bimanual tasks, such as holding a large object, lifting and operating an eight-inch tablet, and lifting a bottle while opening its cap.
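
To make the glove-to-servo mapping concrete, here is a hedged sketch of a small feed-forward network that turns two flex-sensor bend angles into commanded servo angles for the RT and RF; the layer sizes, weights and angle ranges are placeholders rather than the trained network reported in the paper.

```python
import numpy as np

def forward(bend_deg, W1, b1, W2, b2):
    """Map [thumb_bend, index_bend] in degrees to [rt_servo, rf_servo] in degrees."""
    x = np.asarray(bend_deg, dtype=float) / 90.0      # normalise bend angles to ~[0, 1]
    h = np.tanh(W1 @ x + b1)                          # hidden layer
    y = W2 @ h + b2                                   # linear output layer
    return np.clip(y * 180.0, 0.0, 180.0)             # assumed 0-180 degree servo range

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(8, 2)), np.zeros(8)         # stand-ins for trained weights
W2, b2 = rng.normal(size=(2, 8)), np.zeros(2)
print(forward([35.0, 60.0], W1, b1, W2, b2))          # commanded RT/RF servo angles
```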
8

Noritsugu, Toshiro. "Special Issue on Robotics for Innovative Industry and Society." International Journal of Automation Technology 8, no. 2 (March 5, 2014): 139. http://dx.doi.org/10.20965/ijat.2014.p0139.

Abstract:
Robotics has become one of the most important automation technologies for industry and society. Robot components such as actuators and sensors, together with mechanisms and control systems, are being more and more combined with intelligent sensors in innovative industry design and fabrication. Robot technology is being applied in such fields as welfare, education, agriculture, and energy. Robot technology for welfare and nursing is being promoted by the government to increase lifestyle creativity as society ages. This special issue focuses on robotics in fields from manufacturing industries to societal needs. Papers ranging from robotics theory to robot application have been invited. Among the topics covered are robot mechanisms, robot components, actuators, sensors, and controllers, robot control theory, robotic systems, energy saving, industrial applications, automation, vehicles, entertainment, medicine, welfare and nursing applications, and robotics education. The 15 papers presented in this issue include actuators such as rubber artificial muscles or phase-change actuators, pneumatics, power assist devices such as assist glove and upper-limb assist devices, robotic suits, sensor fusion, omnidirectional locomotion, underwater robots, force display apparatuses, meal assistant robots, manufacturing applications of parallel-link mechanisms, surface polishing, and agricultural applications. These papers bring readers the latest state-of-the-art robot technologies useful in everything from analysis and design to control and applications in innovative industries. We thank the authors for their invaluable contributions and the reviewers for their advice – all of which have made this special issue both informative and entertaining.
9

Khort, Dmitriy, Alexey Kutyrev, Rostislav Filippov, and Stepan Semichev. "Development control system robotic platform for horticulture." E3S Web of Conferences 262 (2021): 01024. http://dx.doi.org/10.1051/e3sconf/202126201024.

Abstract:
The article presents a control system for a robotic platform for horticulture. The electronic control system consists of a drive engine control unit, a stepper-motor steering unit, an electronic differential control unit, an automatic power-plant on/off control unit, and battery charge balancing. The developed control system of the robotic vehicle contains a central computer that collects information from the sensors, processes it, and transmits control signals to the drives of the machine. The robotic platform can be driven either by radio signal from a remote control or autonomously on a pre-set map of the area using data from the GLONASS/GPS differential receiver of the satellite navigation system. It is also possible to control the movement of the robotic platform independently using a vision system. The platform can operate continuously for 10 hours, including in low-light and various weather conditions.
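
As a loose illustration of the autonomous waypoint-following mode described above, the sketch below projects GNSS fixes to local coordinates and steers a differential-drive platform toward the next waypoint of a pre-set map; the equirectangular projection, gains and function names are assumptions, not the authors' implementation.

```python
import math

def local_xy(lat, lon, lat0, lon0):
    """Project a GNSS fix to local metres around a reference point (flat-earth approximation)."""
    k = 111_320.0                                   # metres per degree of latitude
    return ((lon - lon0) * k * math.cos(math.radians(lat0)),
            (lat - lat0) * k)

def steer_to_waypoint(fix, heading_rad, waypoint, ref, k_turn=1.5, v=0.8):
    """Return (linear speed, turn rate) driving the platform toward the waypoint."""
    x, y = local_xy(*fix, *ref)
    wx, wy = local_xy(*waypoint, *ref)
    bearing = math.atan2(wy - y, wx - x)
    err = math.atan2(math.sin(bearing - heading_rad), math.cos(bearing - heading_rad))
    return v, k_turn * err                          # simple proportional steering
```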
10

B Lima, Glaydson Luiz, Osamu Saotome, and Ijar M. Da Fonseca. "Inspection and control system for experiments in space robotics." South Florida Journal of Development 2, no. 3 (July 11, 2021): 4094–104. http://dx.doi.org/10.46932/sfjdv2n3-023.

Abstract:
The communication subsystem is one among the various subsystems of a telerobotic space system. It is responsible for coordinating the commands received from the teleoperator control subsystem to the robotic arm, for reading signals from the sensors, and for establishing the communication of the telerobot with the ground station. The telerobotic experiment under development by the ITA space robotics research group was designed to investigate the dynamics and control of a robotic space system, including the study of the operation and integration of all subsystems involved in teleoperation control. The lab experiment consists of two identical robot manipulators, each mounted on its own floating air-supported platform. The objective is to simulate computationally the operations of rendezvous and capture in the microgravity orbital environment, emulated by the floating manipulators' dynamics. The closed circuit for this system involves real-time position detection, transmission, and data processing using a position-tracking (X, Y, and Z) computer system combined with a Kinect (RGB-D) sensor. The computer system comprises two computers capable of processing the positional images with greater accuracy. One of them receives the sensor data and sends it to a second computer, which performs the data processing with appropriate algorithms in Matlab® and Simulink and sends commands to the robotic arm over a Wi-Fi network (UDP protocol). The robot receives and executes the control signals, moving the robotic arms, whose position is again detected by the Kinect sensor and fed back to the computer system, closing the control loop and allowing the safe capture of the target. This work deals with the communication subsystem of the space robot experiment and its ability to provide integrated and efficient communication satisfying the telerobot control requirements.
11

Rahman, S. M. Mizanoor, and Ryojun Ikeura. "Weight-perception-based fixed and variable admittance control algorithms for unimanual and bimanual lifting of objects with a power assist robotic system." International Journal of Advanced Robotic Systems 15, no. 4 (July 1, 2018): 172988141667813. http://dx.doi.org/10.1177/1729881416678131.

Abstract:
Weight-perception-based fixed admittance control and variable admittance control algorithms are proposed for unimanual and bimanual lifting of objects with a power assist robotic system. To include weight perception in the controls, the mass parameter for the inertial force is hypothesized to be different from that for the gravitational force in the dynamics model for lifting objects with the system. For the bimanual lift, two alternative force-sensor arrangements are considered: a common force sensor and two separate force sensors between the object and the human hands. Computational models for power assistance, excess in load forces, and manipulation efficiency and precision are derived. The fixed admittance control algorithm is evaluated on a 1-degree-of-freedom power assist robotic system. Results show that inclusion of weight perception in the controls produces satisfactory performance in terms of power assistance, system kinematics and kinetics, human–robot interaction, and manipulation efficiency and precision. The fixed admittance control algorithm is then augmented to a variable admittance control algorithm, as a tool of active compliance, to vary the admittance with inertia instead of with gravity. The evaluation shows further improvement in performance for the variable admittance control algorithm. It also shows that bimanual lifts outperform unimanual lifts and that bimanual lifts with separate force sensors outperform bimanual lifts with a common force sensor. The results are then proposed as a basis for developing power assist robotic systems for handling heavy objects in industry.
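
A minimal sketch of the weight-perception idea described above, assuming a 1-degree-of-freedom vertical lift: the admittance model applies a small perceived mass to the inertial term and a different mass to gravity, so the operator feels a lighter object. Parameter values and names are illustrative only.

```python
import numpy as np

def admittance_lift(f_hand, m_i=1.0, m_g=0.5, g=9.81, dt=0.001, T=2.0):
    """Simulate vertical lifting under human hand force f_hand(t) [N]
    with the admittance law m_i * x'' = f_h - m_g * g."""
    x, v = 0.0, 0.0
    xs = []
    for k in range(int(T / dt)):
        a = (f_hand(k * dt) - m_g * g) / m_i     # separate inertial and gravitational masses
        v += a * dt
        x += v * dt
        xs.append(x)
    return np.array(xs)

# Example: a constant 6 N upward hand force lifts the assisted object.
traj = admittance_lift(lambda t: 6.0)
print(round(traj[-1], 3), "m after 2 s")
```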
12

Moldovan, Constantin Catalin, and Ionel Staretu. "Analysis of the Accuracy and Robustness of Kinect Sensor Used with the Scope of Robot Manipulation." Applied Mechanics and Materials 555 (June 2014): 170–77. http://dx.doi.org/10.4028/www.scientific.net/amm.555.170.

Abstract:
This paper describes in detail the evaluation procedure developed for measuring the accuracy and robustness of the Kinect sensor in detecting the user's hand and recognizing human hand gestures. The results are also transferred to a robotic gripper in a virtual environment for visualization. The research starts with a review of the current state of methods and sensors used for hand-gesture detection. The detection of the human hand and the recognition of its gestures represent an important research area in the field of robotic control and automation. Over time, different methods [7, 13] and sensors [12] have been developed for this detection task and have steadily improved, to the point where the Kinect sensor combines mature sensor technology with efficient detection algorithms. The goal of this paper is to analyse whether the XBox 360 (TM) Kinect sensor, developed by Microsoft, is accurate enough to be used as a means of controlling a robotic hand in a virtual environment. To find this out, the main objective is to measure the accuracy of the Kinect sensor in capturing gestures performed by a human hand in a repeatable manner. As a second objective, in order to visualize the results, the gestures are translated into the virtual environment. To achieve the main scope of the paper, two metrics were defined according to ISO standards for measurement: accuracy/precision, using ISO 5725:1994 [16], and repeatability, using ISO 21748:2010 [17]. The captured hand posture with the highest accuracy is then transferred to a robotic hand simulated in a virtual environment, executing an object grasping and releasing task.
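
As a generic illustration of accuracy and repeatability metrics of the kind referenced above (textbook-style definitions, not necessarily the exact ISO 5725/21748 formulas applied in the paper), the sketch below compares repeated captures of the same hand posture against a reference posture.

```python
import numpy as np

def accuracy_and_repeatability(measured, reference):
    """measured: (n_trials, n_joints) captured angles; reference: (n_joints,) ground truth.
    Returns the systematic offset per joint and the spread over repeated captures."""
    measured = np.asarray(measured, dtype=float)
    mean_capture = measured.mean(axis=0)
    accuracy = mean_capture - np.asarray(reference)          # systematic offset
    repeatability = measured.std(axis=0, ddof=1)             # scatter over repeats
    return accuracy, repeatability

trials = [[10.2, 44.8, 90.5], [9.7, 45.3, 91.0], [10.1, 45.1, 90.2]]
print(accuracy_and_repeatability(trials, [10.0, 45.0, 90.0]))
```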
13

Goryanina, Ksenia I., Aleksndr D. Lukyanov, and Oleg I. Katin. "Review of robotic manipulators and identification of the main problems." MATEC Web of Conferences 226 (2018): 02015. http://dx.doi.org/10.1051/matecconf/201822602015.

Abstract:
One of the main elements of automation in industrial enterprises is the use of robotic systems consisting of mechanical manipulators and control systems. In recent years, the market for service robotics has been actively developing. The largest part of the professional service robot market in value terms is occupied by medical devices. Agriculture and logistics are also actively developing areas. The success of implementing automation systems depends on the solution of complex scientific and technical problems, primarily in the following areas: machine vision, sensor networks, and navigation systems. Thus, one of the fundamental problems, on whose solution the success in creating truly adaptive and intelligent robots largely depends, is the use of sensor types that can provide a sufficiently large amount of information about the task environment in a short time. This is the problem of creating means of perception.
14

Vladareanu, Luige. "Advanced Intelligent Control through Versatile Intelligent Portable Platforms." Sensors 20, no. 13 (June 29, 2020): 3644. http://dx.doi.org/10.3390/s20133644.

Abstract:
The main purpose of this research is in-depth investigation and communication of new trends in the design, control and applications of real-time control of intelligent sensor systems using advanced intelligent control methods and techniques. Innovative multi-sensor fusion techniques, integrated through Versatile Intelligent Portable (VIP) platforms, are developed and combined with computer vision, virtual and augmented reality (VR&AR) and intelligent communication, including remote control, adaptive sensor networks, human-robot (H2R) interaction systems and machine-to-machine (M2M) interfaces. Intelligent decision support systems (IDSS), including remote sensing, and their integration with DSS, GA-based DSS, fuzzy-set DSS, rough-set-based DSS, intelligent agent-assisted DSS, process-mining integration into decision support, adaptive DSS, computer-vision-based DSS, and sensory and robotic DSS, are highlighted in the field of advanced intelligent control.
15

Karalekas, Georgios, Stavros Vologiannidis, and John Kalomiros. "EUROPA: A Case Study for Teaching Sensors, Data Acquisition and Robotics via a ROS-Based Educational Robot." Sensors 20, no. 9 (April 27, 2020): 2469. http://dx.doi.org/10.3390/s20092469.

Abstract:
Robots have become a popular educational tool in secondary education, introducing scientific, technological, engineering and mathematical concepts to students all around the globe. In this paper EUROPA, an extensible, open-software and open-hardware robotic platform, is presented, focusing on teaching physics, sensors, data acquisition and robotics. EUROPA's software infrastructure is based on the Robot Operating System (ROS). It includes easy-to-use interfaces for robot control and interaction with users and thus can easily be incorporated into Science, Technology, Engineering and Mathematics (STEM) and robotics classes. EUROPA was designed taking into account current trends in educational robotics. An overview of widespread robotic platforms is presented, documenting several critical parameters of interest such as their architecture, sensors, actuators and controllers, their approximate cost, etc. Finally, an introductory STEM curriculum developed for EUROPA and applied in a class of high-school students is presented.
16

Patané, Luca. "Bio-Inspired Robotic Solutions for Landslide Monitoring." Energies 12, no. 7 (April 1, 2019): 1256. http://dx.doi.org/10.3390/en12071256.

Abstract:
Bio-inspired solutions are often taken into account to solve problems that nature took millions of years to deal with. In the field of robotics, when we need to design systems able to perform in unstructured environments, bio-inspiration can be a useful instrument both for mechanical design and for the control architecture. In the proposed work the problem of landslide monitoring is addressed proposing a bio-inspired robotic structure developed to deploy a series of smart sensors on target locations with the aim of creating a sensor network capable of acquiring information on the status of the area of interest. The acquired data can be used both to create models and to generate alert signals when a landslide event is identified in the early stage. The design process of the robotic system, including dynamic simulations and robot experiments, will be presented here.
17

Gussu, Tesfaye Wakessa, and Chyi-Yeu Lin. "Geometry Based Approach to Obstacle Avoidance of Triomnidirectional Wheeled Mobile Robotic Platform." Journal of Sensors 2017 (2017): 1–10. http://dx.doi.org/10.1155/2017/2849537.

Abstract:
Mobile robots achieve collision-free autonomous motion by using information obtained from a suitable combination of multiple sensors of the same or different families. These sensors are often configured around the chassis of the robotic platform. However, little to no information is available on how these sensors should be configured on mobile robotic platforms and how many of them to place on such platforms. Instead, an empirical approach is usually adopted: the number of sensors of the same family or of any type, as well as the combination of sensors for detecting obstacles, is determined by experiment or from information obtained from external sensors. This approach is often iterative and time consuming. In this paper, an approach for determining the minimum number of sensors and their spacing on the robotic platform is proposed so that mobile robots achieve collision-free motion. The effectiveness of the developed approach is experimentally tested by examining the obstacle avoidance capability of the triomnidirectional wheeled robotic platform based on a motion-triggering signal obtained from a skirt of ultrasonic sensors only. It was observed that the newly developed approach allows this robotic platform to avoid obstacles effectively.
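
The sketch below gives one generic back-of-the-envelope version of such a sensor-count calculation for identical ultrasonic sensors arranged in a ring: adjacent beams must leave no gap wider than the smallest obstacle at the chosen detection range. The geometry and parameter values are assumptions, not the paper's derivation.

```python
import math

def min_sensor_count(chassis_radius, detect_range, beam_half_angle_deg, min_obstacle):
    """Return the minimum number of ring-mounted sensors for gap-free coverage at detect_range."""
    half = math.radians(beam_half_angle_deg)
    beam_width = 2.0 * detect_range * math.tan(half)        # footprint of one beam at range
    # allowed angular spacing so the uncovered gap between footprints stays below min_obstacle
    max_spacing = (beam_width + min_obstacle) / (chassis_radius + detect_range)
    return math.ceil(2.0 * math.pi / max_spacing)

# e.g. 0.20 m chassis radius, 0.60 m detection range, 15 deg half-beam, 5 cm smallest obstacle
print(min_sensor_count(0.20, 0.60, 15.0, 0.05))
```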
18

Friedrich, Werner E. "Robotic handling: sensors increase reliability." Industrial Robot: An International Journal 22, no. 4 (August 1995): 23–26. http://dx.doi.org/10.1108/01439919510098425.

19

Girovský, Peter, and Matúš Kundrát. "Robotic Arm Based Dynamixel Actuators Controlled by the Data Glove." International Journal of Engineering Research in Africa 18 (October 2015): 152–58. http://dx.doi.org/10.4028/www.scientific.net/jera.18.152.

Abstract:
This contribution deals with the control of a robot arm using a data glove. The data glove was constructed using bend sensors and an acceleration sensor. The robotic arm was built using modular Dynamixel actuators. The control program for controlling the robot arm with the data glove was created in Matlab.
20

RUSSELL, R. ANDREW, GEOFFREY TAYLOR, LINDSAY KLEEMAN, and ANIES H. PURNAMADJAJA. "MULTI-SENSORY SYNERGIES IN HUMANOID ROBOTICS." International Journal of Humanoid Robotics 01, no. 02 (June 2004): 289–314. http://dx.doi.org/10.1142/s0219843604000162.

Abstract:
Sensing is a key element for any intelligent robotic system. This paper describes the current progress of a project in the Intelligent Robotics Research Center at Monash University that has the aim of developing a synergistic set of sensory systems for a humanoid robot. Currently, sensing modes for colour vision, stereo vision, active range, smell and airflow are being developed in a size and form that is compatible with the humanoid appearance. Essential considerations are sensor calibration and the processing of sensor data to give reliable information about properties of the robot's environment. In order to demonstrate the synergistic use of all of the available sensory modes, a high level supervisory control scheme is being developed for the robot. All time-stamped sensor data together with derived information about the robot's environment are organized in a blackboard system. Control action sequences are then derived from the blackboard data based on a task description. The paper presents details of each of the robot's sensory systems, sensor calibration, and supervisory control. Results are also presented of a demonstration project that involves identifying and selecting mugs containing household chemicals. Proposals for future development of the humanoid robot are also presented.
21

Bogue, Robert. "Sensors for robotic perception. Part one: human interaction and intentions." Industrial Robot: An International Journal 42, no. 5 (August 17, 2015): 386–91. http://dx.doi.org/10.1108/ir-05-2015-0098.

Abstract:
Purpose – The purpose of this two-part paper is to illustrate how sensors impart robots with perceptive capabilities. This first part considers robots that interact with humans and which seek to mimic human intentions. Design/methodology/approach – Following a short introduction, this paper first discusses the sensors used in robotic prosthetics. It then considers sensor applications in recently developed service, companion and assistive robots. The final section concerns the sensors used in collaborative robots, followed by brief concluding comments. Findings – This shows that sensors play a vital role in imparting perceptive capabilities to robots which interact with people. They can interpret human intentions, control prosthetic limbs, monitor and map a robot’s environment, assist with navigation, ensure the safety of co-workers and even detect a person’s emotional state. They are based on a diversity of principles and technologies, including microelectromechanical system (MEMS)-based sensors for physical variables, myographic electrodes and electroencephalogram (EEG) sensors, lasers, infra-red and sonar systems and sophisticated cameras and imaging systems. Originality/value – This provides a timely account of how sensors confer perceptive capabilities to the growing number of robots which interact directly with people.
22

Malakar, Nabin K., Daniil Gladkov, and Kevin H. Knuth. "Modeling a Sensor to Improve Its Efficacy." Journal of Sensors 2013 (2013): 1–11. http://dx.doi.org/10.1155/2013/481054.

Abstract:
Robots rely on sensors to provide them with information about their surroundings. However, high-quality sensors can be extremely expensive and cost-prohibitive. Thus many robotic systems must make do with lower-quality sensors. Here we demonstrate via a case study how modeling a sensor can improve its efficacy when employed within a Bayesian inferential framework. As a test bed we employ a robotic arm that is designed to autonomously take its own measurements using an inexpensive LEGO light sensor to estimate the position and radius of a white circle on a black field. The light sensor integrates the light arriving from a spatially distributed region within its field of view, weighted by its spatial sensitivity function (SSF). We demonstrate that by incorporating an accurate model of the light sensor SSF into the likelihood function of a Bayesian inference engine, an autonomous system can make improved inferences about its surroundings. The method presented here is data-based, fairly general, and made with plug-and-play in mind so that it could be implemented in similar problems.
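
To make the SSF-in-the-likelihood idea concrete, here is a hedged sketch in which the forward model integrates a hypothesised white-circle scene under a Gaussian spatial sensitivity function centred on the sensor, and a Gaussian likelihood scores candidate circles; the Gaussian SSF, grid and noise level are illustrative assumptions rather than the paper's measured SSF and inference engine.

```python
import numpy as np

def predicted_reading(sensor_xy, circle, grid, ssf_sigma=0.01):
    """Expected sensor output over a white circle (intensity 1) on a black field,
    weighted by a Gaussian stand-in for the spatial sensitivity function."""
    cx, cy, r = circle
    gx, gy = grid
    scene = ((gx - cx) ** 2 + (gy - cy) ** 2 <= r ** 2).astype(float)
    w = np.exp(-((gx - sensor_xy[0]) ** 2 + (gy - sensor_xy[1]) ** 2) / (2 * ssf_sigma ** 2))
    return float((scene * w).sum() / w.sum())

def log_likelihood(circle, readings, positions, grid, noise=0.05):
    """Gaussian measurement model built on the SSF-based forward prediction."""
    pred = np.array([predicted_reading(p, circle, grid) for p in positions])
    return -0.5 * np.sum(((np.array(readings) - pred) / noise) ** 2)

xs = np.linspace(0.0, 0.3, 121)
grid = np.meshgrid(xs, xs)
positions = [(0.10, 0.10), (0.15, 0.12), (0.20, 0.18)]
readings = [predicted_reading(p, (0.15, 0.15, 0.05), grid) for p in positions]
print(log_likelihood((0.15, 0.15, 0.05), readings, positions, grid))  # best candidate scores ~0
```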
23

MAEDER, ANDREAS, HANNES BISTRY, and JIANWEI ZHANG. "INTELLIGENT VISION SYSTEMS FOR ROBOTIC APPLICATIONS." International Journal of Information Acquisition 05, no. 03 (September 2008): 259–67. http://dx.doi.org/10.1142/s0219878908001648.

Abstract:
Vision-based sensors are a key component of robot systems, where many tasks depend on image data. Real-time control constraints tie up a great deal of processing power for only a single sensor modality. Dedicated and distributed processing resources are the "natural" solution to overcome this limitation. This paper presents experiments, using embedded processors as well as dedicated hardware, to execute various image (pre)processing tasks. Architectural concepts and requirements for intelligent vision systems are derived.
24

Boyraz, Pinar, Svenja Tappe, Tobias Ortmaier, and Annika Raatz. "Design of a low-cost tactile robotic sleeve for autonomous endoscopes and catheters." Measurement and Control 53, no. 3-4 (January 24, 2020): 613–26. http://dx.doi.org/10.1177/0020294019895303.

Abstract:
Recent developments in medical robotics have been significant, supporting minimally invasive operation requirements such as smaller devices and more feedback available to surgeons. Nevertheless, the tactile feedback from a catheter or endoscope-type robotic device has mostly been restricted to the tip of the device and was not intended to support autonomous movement of the medical device during an operation. In this work, we design a robotic sheath/sleeve with a novel and more comprehensive approach, which can serve whole-body or segment-based feedback control as well as diagnostic purposes. The robotic sleeve has several types of piezo-resistive pressure and extension sensors, which are embedded at several latitudes and depths of the silicone substrate. The sleeve takes the human skin as a biological model for its structure. It has a better tactile sensation of the inner tissues in the tortuous narrow channels, such as cardiovascular or endoluminal tracts in the human body, and thus can be used to diagnose abnormalities. In addition to this capability, using the stretch sensors distributed along its body, the robotic sheath/sleeve can perceive the ego-motion of the robotic backbone of the catheter and can act as a position feedback device. Because of the silicone substrate, the sleeve passively contributes to the safety of the medical device by providing a compliant interface. As an active safety measure, the robotic sheath can sense blood clots or sudden turns inside a channel and, by modifying the local trajectory, can prevent embolisms or tissue rupture. In the future, advanced manufacturing techniques will increase the capabilities of the tactile robotic sleeve.
25

Škultéty, Emil, Elena Pivarčiová, and Ladislav Karrach. "Design of an Inertial Measuring Unit for Control of Robotic Devices." Materials Science Forum 952 (April 2019): 313–22. http://dx.doi.org/10.4028/www.scientific.net/msf.952.313.

Abstract:
Industrial robots are increasingly used to automate technological processes such as machining, welding, paint coating, assembly, etc. Automation rationalizes material flows, integrates production facilities, reduces the need for manufacturing inventory and provides cost savings on maintenance labour. Technology development and growing competition drive production growth and increases in product quality, and thus new possibilities for industrial robot innovation are being sought. One of these possibilities is applying an inertial navigation system to robot control. This article focuses on new trends in manufacturing technology: the design of an Inertial Measurement Unit (IMU) for robotic application control. The Arduino platform is used as the hardware solution for the IMU. The advantage of this platform is its low cost and the wide range of sensors and devices that are compatible with it. For sensing, the MEMS sensor MPU6050 is used, which combines a 3-axis gyroscope and an accelerometer in one chip. New trends in manufacturing facilities, especially robotics innovation and automation, will enable productivity growth in production processes.
26

Winiarski, Tomasz, and Adam Woźniak. "Indirect force control development procedure." Robotica 31, no. 3 (August 16, 2012): 465–78. http://dx.doi.org/10.1017/s0263574712000446.

Abstract:
Addition of extra sensors, especially video cameras and force sensors, under control of appropriate software makes robotic manipulators working in factories suitable for a range of new applications. This paper presents a method of manipulator indirect force control development, in which the force set values are specified in the operational space and the manipulator is equipped with a force sensor in its wrist. Standard control development methods need the estimation of parameters of the detailed model of a manipulator and position servos, which is a complicated and time-consuming task. Hence, in this work a time-efficient hybrid procedure of controller development is proposed consisting of both analytical and experimental stages: proposal of an approximate continuous model of a manipulator, experimental determination and verification of its parameter values using the resonance phenomenon, continuous regulator development, and digitization of the regulator.
27

Dai, Yanyan, and Suk Gyu Lee. "Multiple Internet of Robotic Things robots based on LiDAR and camera sensors." International Journal of Advanced Robotic Systems 17, no. 2 (March 1, 2020): 172988142091376. http://dx.doi.org/10.1177/1729881420913769.

Abstract:
A combination of the Internet of Things and multiple robots with sensors has been an attractive research topic over the past years. This article proposes an Internet of Robotic Things system structure to monitor events, fuse sensor data, use local robots to determine the best action, and then act to control multiple mobile robots. The Internet of Robotic Things system includes two main layers: the host controller layer and the multiple robots layer. The controller layer communicates with the multiple robots layer via a Wi-Fi module. The Internet of Robotic Things system helps accomplish five tasks: localizing robots, planning paths, avoiding obstacles, moving stably to waypoints, and creating a map. Based on depth data from the depth camera and the robot posture, a mapping algorithm is proposed to create a map. Simultaneous localization and mapping (SLAM) is also carried out using light detection and ranging sensor data and Google Cartographer. A fuzzy sliding mode tracking control method is proposed for each robot to guarantee that it moves stably. Simulation results show the effectiveness of the proposed algorithm and are compared with the experimental results. In the experiment, one host computer and two Kobuki mobile robots with light detection and ranging and depth camera sensors are integrated as an Internet of Robotic Things system. The two robots successfully localize themselves and avoid obstacles. The follower robot simultaneously builds a map.
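
For flavour, the sketch below shows a generic sliding-mode style trajectory-tracking law for a differential-drive robot, with a smoothed switching term standing in for the fuzzy component; the error coordinates, sliding surfaces and gains are common textbook choices, not the authors' fuzzy sliding mode design.

```python
import numpy as np

def smc_tracking(pose, ref, v_ref, w_ref, k=(1.0, 2.0), eps=0.1):
    """pose/ref = (x, y, theta); returns commanded (linear speed v, turn rate w)."""
    x, y, th = pose
    xr, yr, thr = ref
    # tracking error expressed in the robot frame
    ex = np.cos(th) * (xr - x) + np.sin(th) * (yr - y)
    ey = -np.sin(th) * (xr - x) + np.cos(th) * (yr - y)
    eth = np.arctan2(np.sin(thr - th), np.cos(thr - th))
    # sliding surfaces with a tanh-smoothed switching term to limit chattering
    s1 = ex
    s2 = eth + np.arctan2(v_ref * ey, 1.0)
    v = v_ref * np.cos(eth) + k[0] * np.tanh(s1 / eps)
    w = w_ref + k[1] * np.tanh(s2 / eps)
    return v, w

print(smc_tracking((0.0, 0.0, 0.0), (0.5, 0.2, 0.1), v_ref=0.3, w_ref=0.0))
```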
28

Tawiah, Thomas Andzi-Quainoo. "A review of algorithms and techniques for image-based recognition and inference in mobile robotic systems." International Journal of Advanced Robotic Systems 17, no. 6 (November 1, 2020): 172988142097227. http://dx.doi.org/10.1177/1729881420972278.

Abstract:
Autonomous vehicles include driverless, self-driving and robotic cars, and other platforms capable of sensing and interacting with their environment and navigating without human help. Semiautonomous vehicles, on the other hand, achieve partial autonomy with human intervention, for example, in driver-assisted vehicles. Autonomous vehicles first interact with their surroundings using mounted sensors. Typically, visual sensors are used to acquire images, and computer vision techniques, signal processing, machine learning, and other techniques are applied to acquire, process, and extract information. The control subsystem interprets sensory information to identify an appropriate navigation path to the destination and an action plan to carry out tasks. Feedback is also elicited from the environment to improve the vehicle's behavior. To increase sensing accuracy, autonomous vehicles are equipped with many sensors [light detection and ranging (LiDAR), infrared, sonar, inertial measurement units, etc.], as well as a communication subsystem. Autonomous vehicles face several challenges such as unknown environments, blind spots (unseen views), non-line-of-sight scenarios, poor sensor performance due to weather conditions, sensor errors, false alarms, limited energy, limited computational resources, algorithmic complexity, human–machine communication, size, and weight constraints. To tackle these problems, several algorithmic approaches have been implemented covering the design of sensors, processing, control, and navigation. The review seeks to provide up-to-date information on the requirements, algorithms, and main challenges in the use of machine vision-based techniques for navigation and control in autonomous vehicles. An application using a land-based vehicle as an Internet of Things-enabled platform for pedestrian detection and tracking is also presented.
29

Bogue, Robert. "Tactile sensing for surgical and collaborative robots and robotic grippers." Industrial Robot: the international journal of robotics research and application 46, no. 1 (January 21, 2019): 1–6. http://dx.doi.org/10.1108/ir-12-2018-0255.

Abstract:
Purpose This paper aims to illustrate the increasingly important role played by tactile sensing in robotics by considering three specific fields of application. Design/methodology/approach Following a short introduction, this paper first provides details of tactile sensing principles, technologies, products and research. The following sections consider tactile sensing applications in robotic surgery, collaborative robots and robotic grippers. Finally, brief conclusions are drawn. Findings Tactile sensors are the topic of an extensive and technologically diverse research effort, with sensing skins attracting particular attention. Many products are now available commercially. New generations of surgical robots are emerging which use tactile sensing to provide haptic feedback, thereby eliminating the surgeon’s total reliance on visual control. Many collaborative robots use tactile and proximity sensing as key safety mechanisms and some use sensing skins. Some skins can detect both human proximity and physical contact. Sensing skins that can be retrofitted have been developed. Commercial tactile sensors have been incorporated into robotic grippers, notably anthropomorphic types, and allow the handling of delicate objects and those with varying shapes and sizes. Tactile sensing uses will inevitably increase because of the ever-growing numbers of robots interacting with humans. Originality/value This study provides a detailed account of the growing use of tactile sensing in robotics in three key areas of application.
30

Yu, Zhiqiang, Qing Shi, Huaping Wang, Ning Yu, Qiang Huang, and Toshio Fukuda. "How to achieve precise operation of a robotic manipulator on a macro to micro/nano scale." Assembly Automation 37, no. 2 (April 3, 2017): 186–99. http://dx.doi.org/10.1108/aa-02-2017-017.

Abstract:
Purpose The purpose of this paper is to present state-of-the-art approaches for precise operation of a robotic manipulator on a macro- to micro/nanoscale. Design/methodology/approach This paper first briefly discusses fundamental issues associated with precise operation of a robotic manipulator on a macro- to micro/nanoscale. Second, it describes and compares the characteristics of the basic components (i.e. mechanical parts, actuators, sensors and control algorithms) of the robotic manipulator. Specifically, commonly used manipulator mechanisms are classified and analyzed. In addition, the intuitive meaning and applications of its actuators are explained and compared in detail. Moreover, related research on the general control algorithms and visual control used in a robotic manipulator to achieve precise operation is also discussed. Findings Remarkable achievements in dexterous mechanical design, excellent actuators, accurate perception, optimized control algorithms, etc., have been made in precise operation of robotic manipulators. Precise operation is critical for dealing with objects that need to be manufactured, modified and assembled. The operational accuracy is directly affected by the performance of the mechanical design, actuators, sensors and control algorithms. Therefore, this paper provides a categorization showing the fundamental concepts and applications of these characteristics. Originality/value This paper presents a categorization of the mechanical design, actuators, sensors and control algorithms of robotic manipulators in the macro- to micro/nanofield for precise operation.
31

Oliveira, Luiz F. P., António P. Moreira, and Manuel F. Silva. "Advances in Agriculture Robotics: A State-of-the-Art Review and Challenges Ahead." Robotics 10, no. 2 (March 24, 2021): 52. http://dx.doi.org/10.3390/robotics10020052.

Abstract:
The constant advances in agricultural robotics aim to overcome the challenges imposed by population growth, accelerated urbanization, high competitiveness of high-quality products, environmental preservation and a lack of qualified labor. In this sense, this review paper surveys the main existing applications of agricultural robotic systems for the execution of land preparation before planting, sowing, planting, plant treatment, harvesting, yield estimation and phenotyping. In general, all robots were evaluated according to the following criteria: their locomotion system, final application, whether they have sensors, a robotic arm and/or a computer vision algorithm, their development stage, and the country and continent to which they belong. After evaluating all of these characteristics, in order to expose the research trends, common pitfalls and the characteristics that hinder commercial development, and to discover which countries are investing in Research and Development (R&D) in these technologies for the future, four major areas that need future research work for enhancing the state of the art in smart agriculture were highlighted: locomotion systems, sensors, computer vision algorithms and communication technologies. The results of this research suggest that investment in agricultural robotic systems makes it possible to achieve both short-term objectives (harvest monitoring) and long-term objectives (yield estimation).
32

Yousuf, Bilal M., Asim Mehdi, Abdul Saboor Khan, Aqib Noor, and Arslan Ali. "Robust Feedback Control Design of Underactuated Robotic Hands with Selectively Lockable Switches for Amputees." Journal of Robotics 2018 (June 7, 2018): 1–9. http://dx.doi.org/10.1155/2018/6835968.

Abstract:
In recent years, the reproduction of the human hand with upgraded abilities has been a major concern. This paper addresses the problems of an underactuated robotic hand with a low-cost design, as it avoids electromyogram (EMG) sensors. The main goal is to balance the hand's capabilities, such as grabbing, speed, and power, and to provide a more robust and cost-effective solution. All fingers have some mechanical compliance so that objects can be picked up more effectively. A flex sensor is attached to each finger and interfaced with a computer using an Arduino UNO microcontroller. The sensor aids the arm in three steps: first, it senses whether an object is grasped or not; second, it determines the coefficient of friction between the object and the hand; finally, the hand grasps the object and stops. One of the primary issues of a prosthetic hand is the capacity to satisfy every requirement of torque, speed, and latency. In this research, we have developed a model of a robotic hand with some modifications. The adaptability of grasping is compared with the degrees of freedom (DOF) along with the number of fingers. The hand is controlled via a sensor-based signal control system. The idea is to design a robotic hand that is low cost, easy to use, and light in weight, which helps amputees to use it with ease in their daily lives. The efficacy of the proposed control is verified and validated using simulations.
33

Majeed, Tanveer, Mohd Atif Wahid, and Faizan Ali. "Applications of Robotics in Welding." International Journal of Emerging Research in Management and Technology 7, no. 3 (June 6, 2018): 30. http://dx.doi.org/10.23956/ijermt.v7i3.9.

Abstract:
An industrial robot is a reprogrammable, automatically controlled, multifunctional manipulator programmable in three or more axes, which may be either fixed in place or mobile, for use in industrial automation applications. Technical innovations in robotic welding have allowed manual welding processes performed in severe working conditions, with intense heat and fumes, to be replaced by robotic welding. Robotic welding offers greater capability to control robot motion and welding parameters, and enhanced error detection and correction. Major difficulties in robotic welding are joint edge inspection, weld penetration control, seam tracking of joints, and width or profile measurement of a joint. These problems can be more easily solved by the use of sensory feedback signals from the weld joint. A robotic welding system has an intelligent and effective control system that can track the joint, monitor the joint in-process, and account for variation in joint location. Sensors play an important role in robotic welding systems with adaptive and intelligent control features, which can track the joint, account for variation in joint location and geometry, and monitor the in-process quality of the weld. In this paper, various aspects of robotic welding, robot programming, and problems associated with robot welding are discussed.
34

Walker, James, Thomas Zidek, Cory Harbel, Sanghyun Yoon, F. Sterling Strickland, Srinivas Kumar, and Minchul Shin. "Soft Robotics: A Review of Recent Developments of Pneumatic Soft Actuators." Actuators 9, no. 1 (January 10, 2020): 3. http://dx.doi.org/10.3390/act9010003.

Abstract:
This paper focuses on the recent development of soft pneumatic actuators for soft robotics over the past few years, concentrating on the following four categories: control systems, material and construction, modeling, and sensors. This review work seeks to provide an accelerated entrance to new researchers in the field to encourage research and innovation. Advances in methods to accurately model soft robotic actuators have been researched, optimizing and making numerous soft robotic designs applicable to medical, manufacturing, and electronics applications. Multi-material 3D printed and fiber optic soft pneumatic actuators have been developed, which will allow for more accurate positioning and tactile feedback for soft robotic systems. Also, a variety of research teams have made improvements to soft robot control systems to utilize soft pneumatic actuators to allow for operations to move more effectively. This review work provides an accessible repository of recent information and comparisons between similar works. Future issues facing soft robotic actuators include portable and flexible power supplies, circuit boards, and drive components.
35

Georgopoulou, Antonia, Lukas Egloff, Bram Vanderborght, and Frank Clemens. "A Soft Pneumatic Actuator with Integrated Deformation Sensing Elements Produced Exclusively with Extrusion Based Additive Manufacturing." Engineering Proceedings 6, no. 1 (May 17, 2021): 11. http://dx.doi.org/10.3390/i3s2021dresden-10097.

Abstract:
In recent years, soft pneumatic actuators have come into the spotlight because of their simple control and their wide range of complex motions. To monitor the deformation of soft robotic systems, elastomer-based sensors are being used. However, embedding sensors into soft actuator modules by polymer casting is time consuming and difficult to upscale. In this study, it is shown how a pneumatic bending actuator with an integrated sensing element can be produced using an extrusion-based additive manufacturing method, e.g., fused deposition modeling (FDM). The advantage of FDM over direct printing or robocasting is its significantly higher resolution and the ability to print large objects in a short amount of time. Newly launched commercial pellet-based FDM printers are able to 3D print thermoplastic elastomers of the low Shore hardness required for soft robotic applications, avoiding the need for high activation pressures. A soft pneumatic actuator with an in situ integrated piezoresistive sensor element was successfully printed using a commercial styrene-based thermoplastic elastomer (TPS) and a developed TPS/carbon black (CB) sensor composite. It has been demonstrated that the integrated sensing elements can monitor the deformation of the pneumatic soft robotic actuator. The findings of this study contribute to extending the applicability of additive manufacturing for integrated soft sensors in large soft robotic systems.
36

Preechayasomboon, Pornthep, and Eric Rombokas. "Sensuator: A Hybrid Sensor–Actuator Approach to Soft Robotic Proprioception Using Recurrent Neural Networks." Actuators 10, no. 2 (February 7, 2021): 30. http://dx.doi.org/10.3390/act10020030.

Abstract:
Soft robotic actuators are now being used in practical applications; however, they are often limited to open-loop control that relies on the inherent compliance of the actuator. Achieving human-like manipulation and grasping with soft robotic actuators requires at least some form of sensing, which often comes at the cost of complex fabrication and purposefully built sensor structures. In this paper, we utilize the actuating fluid itself as a sensing medium to achieve high-fidelity proprioception in a soft actuator. As our sensors are somewhat unstructured, their readings are difficult to interpret using linear models. We therefore present a proof of concept of a method for deriving the pose of the soft actuator using recurrent neural networks. We present the experimental setup and our learned state estimator to show that our method is viable for achieving proprioception and is also robust to common sensor failures.
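
A hedged sketch of the general recipe described above, using PyTorch as one possible stack (an assumption; the paper's architecture, inputs and training details differ): a small recurrent network maps sequences of internal-pressure readings to an estimate of the actuator's pose.

```python
import torch
from torch import nn

class PressureToPose(nn.Module):
    """Illustrative only: a GRU mapping actuating-fluid pressure sequences to pose."""
    def __init__(self, n_sensors=2, hidden=32, pose_dim=1):
        super().__init__()
        self.rnn = nn.GRU(n_sensors, hidden, batch_first=True)
        self.head = nn.Linear(hidden, pose_dim)

    def forward(self, pressure_seq):            # (batch, time, n_sensors)
        out, _ = self.rnn(pressure_seq)
        return self.head(out)                   # pose estimate at every time step

model = PressureToPose()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# One training step on synthetic stand-in data (replace with logged pressure/pose pairs).
pressures = torch.randn(8, 100, 2)              # 8 sequences, 100 steps, 2 pressure taps
poses = torch.randn(8, 100, 1)                  # ground-truth pose, e.g. from motion capture
loss = loss_fn(model(pressures), poses)
opt.zero_grad(); loss.backward(); opt.step()
print(float(loss))
```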
37

Groot, A. W. De. "Effect of sensor size in robotic tactile sensor arrays." Robotica 6, no. 4 (October 1988): 285–87. http://dx.doi.org/10.1017/s026357470000463x.

Abstract:
The degree to which a binary tactile (or visual) image matches the original object is limited by the resolution of the sensor array. Given this fundamental limitation it is still possible to minimize the error in the image formed by the interconnection of the centers of activated sensors along the object's edge. This is achieved by a suitable choice of the physical size of each sensor within the limits of the pixel size. An empirical investigation shows that normally a sensor area of about 50% of the square of the resolution yields an optimal result.
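
As a quick worked example of the roughly 50% figure quoted above, with numbers chosen purely for illustration:

```python
import math

pitch_mm = 2.0                      # array resolution: centre-to-centre sensor spacing
pixel_area = pitch_mm ** 2          # 4.0 mm^2 per pixel cell
sensor_area = 0.5 * pixel_area      # ~2.0 mm^2 active area per the empirical result
print(sensor_area, math.sqrt(sensor_area))  # area and side length (~1.41 mm) of a square sensor
```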
38

de Mathelin, Michel, Florent Nageotte, Philippe Zanne, and Birgitta Dresp-Langley. "Sensors for Expert Grip Force Profiling: Towards Benchmarking Manual Control of a Robotic Device for Surgical Tool Movements." Sensors 19, no. 20 (October 21, 2019): 4575. http://dx.doi.org/10.3390/s19204575.

Abstract:
STRAS (Single access Transluminal Robotic Assistant for Surgeons) is a new robotic system based on the Anubis® platform of Karl Storz for application to intra-luminal surgical procedures. Pre-clinical testing of STRAS has recently demonstrated major advantages of the system in comparison with classic procedures. Benchmark methods for establishing objective criteria for ‘expertise’ now need to be worked out to effectively train surgeons on this new system in the near future. STRAS consists of three cable-driven sub-systems, one endoscope serving as guide, and two flexible instruments. The flexible instruments have three degrees of freedom and can be teleoperated by a single user via two specially designed master interfaces. In this study, small force sensors sewn into a wearable glove to ergonomically fit the master handles of the robotic system were employed for monitoring the forces applied by an expert and a trainee (complete novice) during all the steps of surgical task execution in a simulator task (4-step pick-and-drop). Analysis of grip-force profiles is performed sensor by sensor to bring to the fore specific differences in handgrip force profiles at specific sensor locations on anatomically relevant parts of the fingers and hand controlling the master/slave system.
39

Nakamoto, Hiroyuki, Futoshi Kobayashi, and Fumio Kojima. "Evaluation of Circle Diameter by Distributed Tactile Information in Active Tracing." Journal of Sensors 2013 (2013): 1–7. http://dx.doi.org/10.1155/2013/658749.

Full text
Abstract:
Active touch, with voluntary movement over the surface of an object, is important for humans to obtain local and detailed surface features. Active touch is also considered to enhance human spatial resolution. In order to improve the dexterity of multifinger robotic hands, it is necessary to study an active touch method for robotic hands. In this paper, we first define four requirements of a tactile sensor for active touch and design a distributed tactile sensor model that can measure a distribution of compressive deformation. Second, we propose a measurement process using the sensor model, which synthesizes the distributed deformations. In the experiments, a five-finger robotic hand with tactile sensors traces the surface of cylindrical objects and evaluates their diameters. We confirm that the hand obtains more information about the diameters by tracing with its fingers.
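The abstract does not give the synthesis formula, so the following sketch shows one generic way to turn distributed contact points gathered while tracing into a diameter estimate: an algebraic least-squares circle fit. The contact coordinates, noise level, and the fitting method itself are illustrative assumptions rather than the authors' procedure.

```python
# Illustrative only: estimate a cylinder's diameter from contact-point
# coordinates using an algebraic least-squares circle fit (Kasa fit).
import numpy as np

def fit_circle_diameter(points):
    """points: (N, 2) array of contact coordinates in the tracing plane."""
    x, y = points[:, 0], points[:, 1]
    # Solve x^2 + y^2 + a*x + b*y + c = 0 in the least-squares sense.
    A = np.column_stack([x, y, np.ones_like(x)])
    rhs = -(x**2 + y**2)
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    cx, cy = -a / 2.0, -b / 2.0
    radius = np.sqrt(cx**2 + cy**2 - c)
    return 2.0 * radius

# Synthetic contact points on a 40 mm diameter circle with measurement noise.
theta = np.linspace(0.2, 2.0, 30)
pts = 20.0 * np.column_stack([np.cos(theta), np.sin(theta)])
pts += np.random.normal(scale=0.2, size=pts.shape)
print(fit_circle_diameter(pts))   # close to 40.0
```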
APA, Harvard, Vancouver, ISO, and other styles
40

Ferreira, N. M. Fonseca, André Araujo, M. S. Couceiro, and David Portugal. "Intensive summer course in robotics – Robotcraft." Applied Computing and Informatics 16, no. 1/2 (May 1, 2018): 155–79. http://dx.doi.org/10.1016/j.aci.2018.04.005.

Full text
Abstract:
This paper describes a two-month intensive summer course designed to introduce participants to the hands-on technical craft of robotics and to give them experience with the low-level details of embedded systems. Attendees started the course with a brief introduction to robotics; learned to draw, design, and create a personalized 3D structure for their mobile robotic platform; and developed skills in embedded systems. They became familiar with common robotics practices, learning to connect sensors and actuators, developing a typical differential-kinematics application using Arduino, and exploring ROS features in a Raspberry Pi environment as well as Arduino–Raspberry Pi communication. Different paradigms, real applications, and programming were addressed on the topic of Artificial Intelligence. Throughout the course, participants were introduced to programming languages (including Python and C++), advanced programming concepts such as ROS, basic API development, system concepts such as the I2C and UART serial interfaces, PWM motor control, and sensor fusion to improve robotic navigation and localization. This paper describes not just the concept, layout, and methodology used in RobotCraft 2017 but also presents the participants' knowledge background and their overall opinions, focusing on lessons learned and suggestions for future editions.
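As a pointer to the differential-kinematics exercise mentioned above, here is a minimal sketch of the unicycle model such a course platform typically implements; the wheel radius, track width, and wheel speeds are made-up values, and the course itself targets Arduino/C++ rather than Python.

```python
# Minimal differential-drive (unicycle) kinematics sketch; wheel radius and
# track width are illustrative values, not those of the course platform.
import math

WHEEL_RADIUS = 0.03   # m
TRACK_WIDTH  = 0.15   # m (distance between the two wheels)

def step(x, y, theta, w_left, w_right, dt):
    """Integrate the pose one time step from wheel angular speeds (rad/s)."""
    v_l = WHEEL_RADIUS * w_left
    v_r = WHEEL_RADIUS * w_right
    v     = (v_r + v_l) / 2.0            # forward speed
    omega = (v_r - v_l) / TRACK_WIDTH    # yaw rate
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += omega * dt
    return x, y, theta

pose = (0.0, 0.0, 0.0)
for _ in range(100):                     # 1 s of motion at 10 ms steps
    pose = step(*pose, w_left=8.0, w_right=10.0, dt=0.01)
print(pose)
```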
APA, Harvard, Vancouver, ISO, and other styles
41

Tanaka, Yoshio, Tetsushi Ueta, Hiroshi Kawakami, and Takashi Sumitomo. "A Robotic Truck Crane with Vibration Sensors." Journal of Robotics and Mechatronics 7, no. 3 (June 20, 1995): 213–17. http://dx.doi.org/10.20965/jrm.1995.p0213.

Full text
Abstract:
To address the shortage and increasing age of skilled crane operators in Japan, we propose a new concept of a “robotic crane system” or “intelligent crane system” and develop a prototype under laboratory conditions. This paper describes the hardware of the robotic crane system, the control design of the flexible rotary crane using vibration sensors, and the first experimental results. An efficient dynamic simulation method is also presented in order to simulate the crane’s load swing and to develop a 3-D simulator for cranes whose load behaves as a spherical pendulum, such as a rotary crane. This method was applied to the prototype system.
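The paper's efficient simulation method is not reproduced in the abstract; as a generic stand-in, the sketch below integrates the standard spherical-pendulum equations for a load hanging from a fixed suspension point. The cable length, initial swing, and simulation horizon are assumed values.

```python
# Illustrative load-swing model only: a spherical pendulum with a fixed
# suspension point. Cable length and initial conditions are assumed values.
import numpy as np
from scipy.integrate import solve_ivp

G = 9.81   # m/s^2
L = 10.0   # cable length, m

def dynamics(t, s):
    theta, phi, dtheta, dphi = s        # polar angle from vertical, azimuth
    ddtheta = np.sin(theta) * np.cos(theta) * dphi**2 - (G / L) * np.sin(theta)
    ddphi = -2.0 * dtheta * dphi * np.cos(theta) / np.sin(theta)
    return [dtheta, dphi, ddtheta, ddphi]

# Start with a 10-degree swing and a small azimuthal velocity.
s0 = [np.radians(10.0), 0.0, 0.0, 0.3]
sol = solve_ivp(dynamics, (0.0, 20.0), s0, max_step=0.01)

x = L * np.sin(sol.y[0]) * np.cos(sol.y[1])   # horizontal load position
y = L * np.sin(sol.y[0]) * np.sin(sol.y[1])
print(x[-1], y[-1])
```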
APA, Harvard, Vancouver, ISO, and other styles
42

Li, Shang Rong, and Wei Ping Zhang. "Implementation of Synchronous Tracking Control with Multiple Hall Switch Sensors Based on PLC." Applied Mechanics and Materials 433-435 (October 2013): 178–82. http://dx.doi.org/10.4028/www.scientific.net/amm.433-435.178.

Full text
Abstract:
Positioning control is widely used in many fields of application, such as CNC machine tools and robotic motion control. This paper presents a novel method for position-offset detection using multiple Hall switch sensors, together with a PLC-based (programmable logic controller) control design for tracking a moving object. The experimental results showed that the Hall switch sensors could follow the magnets at different positions without losing them, provided the rotating speed of the magnets does not exceed 100 rpm.
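The abstract gives no implementation detail, so the fragment below only sketches the general idea of converting a row of Hall switch states into a position offset and a proportional tracking command; the sensor pitch, gain, and the Python formulation (the paper targets a PLC) are all assumptions.

```python
# Conceptual sketch only: derive a position offset from an array of Hall
# switch states and turn it into a proportional tracking command.
# Sensor spacing and gain are made-up values; the paper implements this on a PLC.

SENSOR_PITCH = 0.01   # m between adjacent Hall switches
KP = 50.0             # proportional gain

def offset_from_switches(states):
    """states: list of booleans, True where a switch senses the magnet."""
    active = [i for i, on in enumerate(states) if on]
    if not active:
        return None                           # magnet lost
    centre_index = (len(states) - 1) / 2.0
    mean_index = sum(active) / len(active)
    return (mean_index - centre_index) * SENSOR_PITCH

def tracking_command(states):
    offset = offset_from_switches(states)
    if offset is None:
        return 0.0
    return KP * offset                        # speed command toward the magnet

print(tracking_command([False, False, True, True, False]))  # small positive move
```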
APA, Harvard, Vancouver, ISO, and other styles
43

Bogue, Rob. "New technologies for robotic tactile sensing and navigation." Industrial Robot: the international journal of robotics research and application 48, no. 4 (June 4, 2021): 478–83. http://dx.doi.org/10.1108/ir-03-2021-0054.

Full text
Abstract:
Purpose This paper aims to provide details of new sensor technologies and developments with potential applications in robotic tactile sensing and navigation. Design/methodology/approach Following a short introduction, the paper provides examples of tactile sensing research. This is followed by details of research into inertial sensors and other navigation techniques. Finally, brief conclusions are drawn. Findings The paper shows that tactile sensing and navigation techniques are the topic of a technologically diverse research effort which has the prospect of imparting various classes of robots with significantly enhanced capabilities. Originality/value The paper provides a technically detailed insight into recent sensor research with applications in robotic tactile sensing and navigation.
APA, Harvard, Vancouver, ISO, and other styles
44

Li, Xiang, and M. Fikret Ercan. "Decentralized Coordination Control for a Network of Mobile Robotic Sensors." Wireless Personal Communications 102, no. 4 (January 16, 2018): 2429–42. http://dx.doi.org/10.1007/s11277-018-5263-y.

Full text
APA, Harvard, Vancouver, ISO, and other styles
45

Vijay Joseph, Arockia, Akshat Mathur, Jatin Verma, and Ankita Singh. "Gesture based wireless control for a robotic manipulator." International Journal of Engineering & Technology 7, no. 2.31 (May 29, 2018): 231. http://dx.doi.org/10.14419/ijet.v7i2.31.13449.

Full text
Abstract:
This project complements work in the industrial and automation fields. Robots are now used in many areas of engineering and manufacturing, and the systems for controlling and actuating them have advanced considerably. Gesture control has become a new trend for commanding the movement of robotic manipulators. Common methodologies include motion tracking, image processing, and Kinect sensors. All of these methods can be used like a teach pendant, where the user records the manipulator's movement as a preset that the manipulator then carries out repetitively; with motion tracking and Kinect sensors, however, the user is confined to the area that the cameras can monitor. Here, we propose a wirelessly controlled robotic arm system for tool handling (pick and place) and many other applications that are out of human reach. The result is that the gestures of the human hand are in sync with the manipulator's movement. Further, this robotic arm has been mounted beneath a drone, giving it the ability to reach heights that are inaccessible to humans or that would put a human's life in jeopardy. In this case, the user can maneuver the manipulator wherever it is used.
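The abstract does not specify the gesture-sensing hardware; assuming an accelerometer-based glove, which is a common choice for this kind of system, the sketch below shows one way to convert tilt readings into servo angle commands for a wireless arm. The axis conventions, ranges, and joint assignments are hypothetical.

```python
# Hypothetical illustration: convert accelerometer tilt (a common gesture
# sensing choice) into joint servo angles for a remote manipulator.
import math

def tilt_angles(ax, ay, az):
    """Roll and pitch in degrees from raw accelerometer axes (in g)."""
    roll = math.degrees(math.atan2(ay, az))
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    return roll, pitch

def to_servo(angle_deg, lo=-90.0, hi=90.0):
    """Clamp a tilt angle and map it to a 0-180 degree servo command."""
    angle_deg = max(lo, min(hi, angle_deg))
    return (angle_deg - lo) / (hi - lo) * 180.0

roll, pitch = tilt_angles(ax=0.1, ay=0.4, az=0.9)
base_cmd, shoulder_cmd = to_servo(roll), to_servo(pitch)
print(round(base_cmd, 1), round(shoulder_cmd, 1))  # values sent over the radio link
```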
APA, Harvard, Vancouver, ISO, and other styles
46

Weiner, Pascal, Caterina Neef, Yoshihisa Shibata, Yoshihiko Nakamura, and Tamim Asfour. "An Embedded, Multi-Modal Sensor System for Scalable Robotic and Prosthetic Hand Fingers." Sensors 20, no. 1 (December 23, 2019): 101. http://dx.doi.org/10.3390/s20010101.

Full text
Abstract:
Grasping and manipulation with anthropomorphic robotic and prosthetic hands presents a scientific challenge regarding mechanical design, sensor system, and control. Apart from the mechanical design of such hands, embedding sensors needed for closed-loop control of grasping tasks remains a hard problem due to limited space and required high level of integration of different components. In this paper we present a scalable design model of artificial fingers, which combines mechanical design and embedded electronics with a sophisticated multi-modal sensor system consisting of sensors for sensing normal and shear force, distance, acceleration, temperature, and joint angles. The design is fully parametric, allowing automated scaling of the fingers to arbitrary dimensions in the human hand spectrum. To this end, the electronic parts are composed of interchangeable modules that facilitate the mechanical scaling of the fingers and are fully enclosed by the mechanical parts of the finger. The resulting design model allows deriving freely scalable and multimodally sensorised fingers for robotic and prosthetic hands. Four physical demonstrators are assembled and tested to evaluate the approach.
APA, Harvard, Vancouver, ISO, and other styles
47

Fang, Bin, Hongxiang Xue, Fuchun Sun, Yiyong Yang, and Renxiang Zhu. "A cross-modal tactile sensor design for measuring robotic grasping forces." Industrial Robot: the international journal of robotics research and application 46, no. 3 (May 20, 2019): 337–44. http://dx.doi.org/10.1108/ir-08-2018-0175.

Full text
Abstract:
Purpose The purpose of the paper is to present a novel cross-modal sensor whose tactile output is computed from visual information. The proposed sensor can measure the forces of robotic grasping. Design/methodology/approach The proposed cross-modal tactile sensor consists of a transparent elastomer with markers, a camera, an LED circuit board, and supporting structures. The model and performance of the elastomer are analyzed. A marker recognition method is proposed to determine the movements of the markers on the surface, and a force calculation algorithm is presented to compute the three-dimensional force. Findings Experimental results demonstrate that the proposed tactile sensor can accurately measure robotic grasping forces. Originality/value The proposed cross-modal tactile sensor determines robotic grasping forces from images of the markers. It provides more force information than traditional tactile sensors, and the proposed force calculation algorithms yield superior results.
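The paper's force calculation algorithm is not reproduced in the abstract; the sketch below only illustrates the general recipe used by such sensors: track marker displacements in the camera image and map them to a three-dimensional force through a calibrated linear model. The marker count, pixel coordinates, and calibration matrix are placeholders.

```python
# Illustrative only: map tracked marker displacements to a 3-D contact force
# through a calibrated linear model. The calibration matrix here is a random
# placeholder; a real sensor would identify it from reference force data.
import numpy as np

N_MARKERS = 16
rng = np.random.default_rng(0)

def force_from_markers(rest_xy, deformed_xy, calibration):
    """rest_xy, deformed_xy: (N, 2) marker centroids; calibration: (3, 2N)."""
    displacement = (deformed_xy - rest_xy).reshape(-1)   # stacked dx, dy
    return calibration @ displacement                    # [Fx, Fy, Fz]

rest = rng.uniform(0, 640, size=(N_MARKERS, 2))          # pixel coordinates
deformed = rest + rng.normal(scale=1.5, size=rest.shape) # after contact
C = rng.normal(size=(3, 2 * N_MARKERS))                  # placeholder calibration
print(force_from_markers(rest, deformed, C))
```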
APA, Harvard, Vancouver, ISO, and other styles
48

Lin, Shifeng, and Ning Wang. "Cloud robotic grasping of Gaussian mixture model based on point cloud projection under occlusion." Assembly Automation 41, no. 3 (April 5, 2021): 312–23. http://dx.doi.org/10.1108/aa-11-2020-0170.

Full text
Abstract:
Purpose In multi-robot cooperation, the cloud can share sensor data, which helps robots perceive the environment better. For cloud robotics, grasping is an important ability that must be mastered. Usually, the information used for grasping comes mainly from visual sensors. However, due to the uncertainty of the working environment, the vision sensor's view may be blocked by unknown objects. This paper aims to propose a solution to the robot grasping problem when the vision sensor information is blocked, by sharing the information of multiple vision sensors in the cloud. Design/methodology/approach First, the random sample consensus algorithm and principal component analysis (PCA) are used to detect the desktop range. Then, the minimum bounding rectangle of the occluded area is obtained by the PCA algorithm. The candidate camera view range is obtained by plane segmentation and is combined with the manipulator workspace to obtain the camera posture and drive the arm to take pictures of the occluded desktop area. Finally, a Gaussian mixture model (GMM) is used to approximate the shape of the object projection, and for every single Gaussian component a grasping rectangle is generated and evaluated to obtain the most suitable one. Findings A variety of occlusion scenarios for cloud robots are tested. Experimental results show that the proposed algorithm can capture images of the occluded desktop and successfully grasp objects in the occluded area. Originality/value In the existing work, there are few studies that use active multi-sensor approaches to solve the occlusion problem. This paper presents a new solution to the occlusion problem. The proposed method can be applied to multi-robot cloud working environments through cloud sharing, which helps robots perceive the environment better. In addition, this paper proposes a method to obtain the object-grasping rectangle based on GMM shape approximation of the point-cloud projection. Experiments show that the proposed methods work well.
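To make the GMM step concrete, the sketch below fits a mixture to a synthetic 2-D point-cloud projection with scikit-learn and derives one candidate grasp rectangle per component from its mean and covariance eigenvectors; the rectangle evaluation used in the paper is not reproduced, and all data values are assumptions.

```python
# Sketch of the GMM shape-approximation step: fit a mixture to the 2-D
# projection of an object point cloud and derive one candidate grasp
# rectangle (centre, orientation, extent) per Gaussian component.
# The paper's rectangle evaluation/scoring step is not reproduced here.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
# Synthetic projected points standing in for a segmented object on the desktop.
points = np.vstack([
    rng.normal([0.10, 0.20], [0.03, 0.01], size=(200, 2)),
    rng.normal([0.18, 0.22], [0.01, 0.03], size=(200, 2)),
])

gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0)
gmm.fit(points)

for mean, cov in zip(gmm.means_, gmm.covariances_):
    eigvals, eigvecs = np.linalg.eigh(cov)              # principal axes of the blob
    angle = np.arctan2(eigvecs[1, -1], eigvecs[0, -1])  # major-axis orientation
    width, height = 4.0 * np.sqrt(eigvals[::-1])        # ~2-sigma extent each side
    print(f"centre={mean}, angle={np.degrees(angle):.1f} deg, "
          f"size=({width:.3f}, {height:.3f}) m")
```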
APA, Harvard, Vancouver, ISO, and other styles
49

Leung, Anderson, and Shahram Payandeh. "Application of adaptive neural network to localization of objects using pressure array transducer." Robotica 14, no. 4 (July 1996): 407–14. http://dx.doi.org/10.1017/s0263574700019809.

Full text
Abstract:
SUMMARY: Pattern recognition and object localization, using various sensors such as vision and tactile sensors, are two important areas of research in the application of robotic systems. This paper demonstrates the feasibility of using a relatively inexpensive array of pressure sensors and a neural network approach to achieve object localization and pattern recognition. The sensors used are force sensing resistors (FSRs), more specifically a 16 x 16 array of FSRs. Because of the nonlinearity associated with an FSR, three possible approaches for gathering output from the sensor array are used. The neural network consists of two 2-layer counterpropagation networks (CPNs). One of the CPNs is trained to recognize contact signatures of different objects placed at a fixed reference position on the sensor array.
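As a rough illustration of the counterpropagation idea, and not the authors' trained network, the sketch below pairs a winner-take-all Kohonen layer with a Grossberg outstar layer to classify flattened 16 x 16 contact signatures; the data, number of hidden units, and learning rates are synthetic assumptions.

```python
# Rough illustration of a counterpropagation network (CPN): a winner-take-all
# Kohonen layer followed by a Grossberg outstar layer, applied to flattened
# 16 x 16 pressure images. Data and layer size are synthetic, not the paper's.
import numpy as np

rng = np.random.default_rng(2)
N_IN, N_OUT = 16 * 16, 3                      # taxels per image, object classes

# Synthetic contact signatures: three prototype images plus sensor noise.
prototypes = rng.uniform(size=(N_OUT, N_IN))
samples = np.vstack([p + rng.normal(scale=0.05, size=N_IN)
                     for p in prototypes for _ in range(20)])
labels = np.repeat(np.eye(N_OUT), 20, axis=0)

# Initialise the Kohonen (instar) layer from spread-out training samples to
# avoid dead units; the Grossberg (outstar) layer starts at zero.
kohonen = samples[::10].copy()                # 6 hidden units
grossberg = np.zeros((N_OUT, len(kohonen)))

def winner(x):
    return int(np.argmin(np.linalg.norm(kohonen - x, axis=1)))

def train(alpha=0.3, beta=0.1, epochs=20):
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            j = winner(x)
            kohonen[j] += alpha * (x - kohonen[j])           # move winner toward input
            grossberg[:, j] += beta * (y - grossberg[:, j])  # learn its class code

def predict(x):
    return int(np.argmax(grossberg[:, winner(x)]))

train()
test = prototypes[1] + rng.normal(scale=0.05, size=N_IN)
print(predict(test))                          # classifies as object 1
```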
APA, Harvard, Vancouver, ISO, and other styles
50

Ben Abdallah, Ismail, Yassine Bouteraa, and Chokri Rekik. "Kinect-Based Sliding Mode Control for Lynxmotion Robotic Arm." Advances in Human-Computer Interaction 2016 (2016): 1–10. http://dx.doi.org/10.1155/2016/7921295.

Full text
Abstract:
Manipulator robot technology has developed very quickly in recent years and has a positive impact on human life. Its implementation offers greater efficiency and high performance for many human tasks. In practice, published efforts in this context focus on implementing control algorithms with preprogrammed desired trajectories (the passive robot case) or on trajectory generation based on feedback sensors (the active robot case). Gesture-based robot control, however, can be considered another channel of system control and is not widely discussed. This paper focuses on the implementation of a Kinect-based real-time interactive control system. Based on the LabVIEW integrated development environment (IDE), a custom human–machine interface (HMI) allows the user to control a Lynxmotion robotic arm in real time. The Kinect software development kit (SDK) provides a tool to track the human body skeleton and abstract it into three-dimensional coordinates; the Kinect sensor is therefore integrated into our control system to detect the coordinates of the user's joints. The Lynxmotion dynamics are incorporated into a real-time sliding mode control algorithm. Experiments were carried out to test the effectiveness of the system, and the results verify its tracking ability, stability, and robustness.
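The abstract does not reproduce the control law; as a generic single-joint illustration of sliding mode tracking, the sketch below drives the error onto a sliding surface s = ė + λe with a saturated switching term to limit chattering. The plant parameters, gains, and reference trajectory are assumptions, not the identified Lynxmotion dynamics.

```python
# Generic single-joint sliding mode tracking sketch (illustrative plant and
# gains, not the Lynxmotion model identified in the paper).
import numpy as np

J, B = 0.05, 0.2               # assumed inertia and viscous friction
LAM, K, PHI = 8.0, 6.0, 0.05   # surface slope, switching gain, boundary layer
DT = 0.001

def sat(x):
    return np.clip(x, -1.0, 1.0)

def smc_torque(theta, dtheta, theta_d, dtheta_d, ddtheta_d):
    e, de = theta_d - theta, dtheta_d - dtheta
    s = de + LAM * e                       # sliding surface
    # Equivalent control plus a saturated switching term (reduces chattering).
    return J * (ddtheta_d + LAM * de) + B * dtheta + J * K * sat(s / PHI)

theta, dtheta = 0.3, 0.0                   # start away from the reference
for k in range(4000):                      # 4 s simulation
    t = k * DT
    theta_d, dtheta_d, ddtheta_d = np.sin(t), np.cos(t), -np.sin(t)
    u = smc_torque(theta, dtheta, theta_d, dtheta_d, ddtheta_d)
    disturbance = 0.1 * np.sin(5.0 * t)    # unmodelled load torque
    ddtheta = (u + disturbance - B * dtheta) / J
    dtheta += ddtheta * DT
    theta += dtheta * DT

print(abs(np.sin(4.0) - theta))            # small tracking error after 4 s
```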
APA, Harvard, Vancouver, ISO, and other styles