Chao, Fei, Zhu, Zuyuan, Lin, Chih-Min, Hu, Huosheng, Yang, Longzhi, Shang, Changjing and Zhou, Changle (2018) Enhanced Robotic Hand-eye Coordination inspired from Human-like Behavioral Patterns. IEEE Transactions on Cognitive and Developmental Systems, 10 (2). pp. 384-396. ISSN 2379-8920
Text (Full text): Zhu-THMS-Submit.pdf (Accepted Version, 1MB)
Abstract
Robotic hand-eye coordination is recognized as an important skill for dealing with complex real environments. Conventional robotic hand-eye coordination methods merely transfer stimulus signals from the robotic visual space to the hand actuator space. This paper introduces a reverse method: an additional channel is built that transfers stimulus signals from the robotic hand space to the visual space. Based on this reverse channel, a human-like behavioral pattern, "Stop-to-Fixate", is imparted to the robot, giving it an enhanced reaching ability. A visual processing system inspired by the structure of the human retina compresses visual information so as to reduce the robot's learning complexity. In addition, two constructive neural networks establish the two sensory delivery channels. The experimental results demonstrate that the robotic system gradually acquires a reaching ability. In particular, when the robotic hand touches an unseen object, the reverse channel successfully drives the visual system to notice the unseen object.
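The sketch below illustrates, in simplified form, the two-channel idea described in the abstract: a forward channel mapping compressed visual features to hand commands, a reverse channel mapping hand/tactile state back to a gaze target, and a "Stop-to-Fixate" trigger fired when the hand touches an unseen object. All dimensions, function names, and the use of plain two-layer networks (in place of the paper's constructive neural networks and retina-inspired visual compression) are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch, assuming small fixed-size networks as stand-ins for the
# paper's constructive neural networks. Dimensions and names are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

def make_net(n_in, n_hidden, n_out):
    """Return weights for a tiny two-layer network (illustrative stand-in)."""
    return {
        "W1": rng.normal(scale=0.1, size=(n_hidden, n_in)),
        "W2": rng.normal(scale=0.1, size=(n_out, n_hidden)),
    }

def forward(net, x):
    """Tanh hidden layer, linear output."""
    h = np.tanh(net["W1"] @ x)
    return net["W2"] @ h

# Forward channel: compressed retina-like visual features -> hand joint command.
eye_to_hand = make_net(n_in=16, n_hidden=8, n_out=4)

# Reverse channel: hand/tactile state -> fixation coordinates in visual space.
hand_to_eye = make_net(n_in=4, n_hidden=8, n_out=2)

def stop_to_fixate(hand_state, touch_detected):
    """If the hand touches an unseen object, stop the reach and fixate on the
    location predicted by the reverse channel (the behavioral pattern the
    abstract describes)."""
    if touch_detected:
        gaze_target = forward(hand_to_eye, hand_state)
        return {"action": "stop", "fixate_at": gaze_target}
    return {"action": "continue", "fixate_at": None}

# Example: an unexpected touch drives the visual system toward the contact point.
print(stop_to_fixate(np.array([0.2, -0.5, 0.8, 0.1]), touch_detected=True))
```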
Item Type: Article
Uncontrolled Keywords: Robotic hand-eye coordination; sensory motor mapping; human-like behavioral pattern; constructive neural network
Subjects: G400 Computer Science; G700 Artificial Intelligence
Department: Faculties > Engineering and Environment > Computer and Information Sciences
Depositing User: Becky Skoyles
Date Deposited: 01 Nov 2016 09:57
Last Modified: 01 Aug 2021 08:07
URI: http://nrl.northumbria.ac.uk/id/eprint/28129