Combining Motion Primitives and Image-Based Visual Servo Control

Author(s): Ghananeel Rotithor, Ashwin P. Dani

Abstract Combining perception-based feedback control with learning-based open-loop motion generation for the robot's end-effector is an attractive solution for many robotic manufacturing tasks. For instance, in a peg-in-hole or insertion task where the hole or the recipient part is not visible to the eye-in-hand camera, an open-loop learning-based motion primitive method can be used to generate the end-effector path. Once the recipient part enters the field of view (FOV), visual servo control can take over the motion of the robot. Inspired by such applications, this paper presents a control scheme that switches between Dynamic Movement Primitives (DMPs) and Image-Based Visual Servo (IBVS) control, thereby combining open-loop end-effector motion generation with perception-based feedback control. A simulation that switches the controller between DMP and IBVS is performed to verify the performance of the proposed control methodology.
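The abstract describes a hybrid scheme that rolls out a learned motion primitive while the recipient part is outside the camera's FOV and hands control to IBVS once image features are available. The sketch below illustrates only that switching logic, under assumed gains, a constant feature depth Z, and a zero DMP forcing term; function names such as `hybrid_step` are hypothetical and not from the paper.

```python
# Minimal sketch, not the paper's implementation: switching between a
# Dynamic Movement Primitive (DMP) rollout and an Image-Based Visual Servo
# (IBVS) law. Gains, the constant feature depth Z, and the zero forcing
# term are illustrative assumptions.
import numpy as np

def dmp_step(x, dx, g, dt, alpha=25.0, beta=6.25, tau=1.0, forcing=0.0):
    """One Euler step of a discrete DMP transformation system toward goal g."""
    ddx = (alpha * (beta * (g - x) - dx) + forcing) / tau
    dx = dx + ddx * dt
    return x + dx * dt, dx

def ibvs_twist(s, s_star, Z, lam=0.5):
    """Classical IBVS law v = -lam * L^+ (s - s_star) for normalized point features."""
    L = []
    for x, y in s.reshape(-1, 2):
        L.append([-1 / Z, 0.0, x / Z, x * y, -(1 + x**2), y])
        L.append([0.0, -1 / Z, y / Z, 1 + y**2, -x * y, -x])
    return -lam * np.linalg.pinv(np.array(L)) @ (s - s_star)

def hybrid_step(part_in_fov, s, s_star, x, dx, g, dt, Z=0.5):
    """Open-loop DMP rollout while the recipient part is outside the FOV;
    IBVS camera twist once it becomes visible."""
    if part_in_fov:
        return "ibvs", ibvs_twist(s, s_star, Z)
    return "dmp", dmp_step(x, dx, g, dt)
```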

2005, Vol. 02 (02), pp. 203-224
Author(s): Chris Gaskett, Aleš Ude, Gordon Cheng

We propose a hand-eye coordination system for a humanoid robot that supports bimanual reaching. The system combines endpoint closed-loop and open-loop visual servo control. The closed-loop component moves the eyes, head, arms, and torso, based on the position of the target and the robot's hands, as seen by the robot's head-mounted cameras. The open-loop component uses a motor-motor mapping that is learnt online to support movement when visual cues are not available.
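As a rough illustration of the architecture described above, the sketch below combines an endpoint closed-loop correction, driven by the image error between the target and the hand, with an open-loop prediction from a motor-motor map refined online. The linear map, the fixed image Jacobian `J_img`, and the gain `k` are assumptions for the sketch, not details from the paper.

```python
# Illustrative sketch, not the original system: endpoint closed-loop visual
# servoing when both target and hand are visible, with an open-loop fallback
# from an online-learned motor-motor mapping.
import numpy as np

class MotorMotorMap:
    """Online least-mean-squares map from head/gaze joint angles to arm joint angles."""
    def __init__(self, n_head, n_arm, lr=0.05):
        self.W = np.zeros((n_arm, n_head + 1))  # extra column for a bias term
        self.lr = lr

    def predict(self, head_q):
        return self.W @ np.append(head_q, 1.0)

    def update(self, head_q, arm_q):
        x = np.append(head_q, 1.0)
        self.W += self.lr * np.outer(arm_q - self.W @ x, x)

def reach_command(target_px, hand_px, head_q, arm_q, mapping, J_img, k=0.3):
    """Closed-loop image-error correction when both cues are visible;
    open-loop motor-motor prediction when visual cues are unavailable."""
    if target_px is not None and hand_px is not None:
        mapping.update(head_q, arm_q)                       # refine the map while servoing
        dq = k * np.linalg.pinv(J_img) @ (target_px - hand_px)
        return arm_q + dq
    return mapping.predict(head_q)
```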


Author(s): Magnus H. Rognvaldsson, Gary McMurray, Wayne Daley, Paul M. Griffin

Abstract The performance of a vision system for a robot under visual servo control depends strongly on the placement of the camera. Placements at robot-attached positions other than the end-effector of the manipulator have generally not been considered. In this paper, we examine the practical issues of mounting the camera at different locations on the robot manipulator for the purpose of visual servo control. An animated computer model was developed to aid in the evaluation of various vision sensor placements. We present the results of the analysis and make recommendations.
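One simple way to compare candidate placements in simulation is to score each mount by how often the target remains visible along a sample motion, as in the sketch below. The conical FOV model and the `camera_pose_fn` forward-kinematics callback are assumptions for illustration, not the paper's animated model.

```python
# Hypothetical sketch of a placement evaluation: score a candidate camera
# mount by the fraction of a sample joint trajectory over which the target
# stays inside a simple conical field of view.
import numpy as np

def in_fov(cam_pose, target_xyz, half_angle=np.radians(30.0)):
    """True if target_xyz lies inside a cone about the camera's +z optical axis.
    cam_pose is the 4x4 homogeneous transform of the camera in the world frame."""
    p = (np.linalg.inv(cam_pose) @ np.append(target_xyz, 1.0))[:3]
    return p[2] > 0 and np.arccos(p[2] / np.linalg.norm(p)) < half_angle

def placement_score(camera_pose_fn, joint_trajectory, target_xyz):
    """Fraction of trajectory samples with the target visible.
    camera_pose_fn(q) returns the candidate mount's camera pose at joint vector q."""
    return float(np.mean([in_fov(camera_pose_fn(q), target_xyz)
                          for q in joint_trajectory]))
```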


2004
Author(s): J. Chen, D. M. Dawson, W. E. Dixon, V. K. Chitrakaran

2020, Vol. 67 (3), pp. 2450-2459
Author(s): Huifeng Wu, Yi Yan, Danfeng Sun, Rene Simon

2014, Vol. 47 (3), pp. 8110-8115
Author(s): S.S. Mehta, W. MacKunis, T.F. Burks

2000, Vol. 6 (1), pp. 33-43
Author(s): Sung Ho Kim, Jong Suk Choi, Byung Kook Kim
