
A method for robotic grasping based on improved Gaussian mixture model. (English) Zbl 1470.93111

Summary: This paper proposes a method for robotic grasping based on an improved Gaussian mixture model, obtained by incorporating Bayesian ideas into the Gaussian mixture model. A Gaussian mixture model is first trained through grasping demonstrations in a limited region, called the trained area, and the trained model then serves as the prior for the improved model. The proposed method refines the cumulative update rule and the evaluation of the updated models so that the robot adapts more readily to grasping in untrained areas; this self-taught grasp learning is semi-supervised. First, the observable variables of an object are measured with a camera. The robot is then dragged to grasp the object, and the mapping between these variables and the robot's joint angles is learned. New samples collected in nearby untrained areas are used to update the Gaussian model, after which the robot can grasp an object successfully from its observable variables alone. Finally, the effectiveness of the method is verified by experiments and comparative tests, both on grasping real objects and in grasping simulations of the improved Gaussian models on the virtual robot experimentation platform (V-REP).
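The core step described in the summary, learning a mapping from camera observables to joint angles with a Gaussian mixture model and then querying it for new objects, can be sketched with Gaussian mixture regression. The synthetic demonstration data, the component count, and the `gmr_predict` helper below are illustrative assumptions for a minimal sketch, not the paper's actual implementation or its Bayesian update rule:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Hypothetical demonstration data: object position (x, y) observed by a
# camera, paired with two joint angles recorded while dragging the robot
# through a grasp. A simple noisy linear relation stands in for real data.
obj_pos = rng.uniform(0.0, 1.0, size=(200, 2))
joint_angles = np.column_stack([
    0.8 * obj_pos[:, 0] + 0.1 * obj_pos[:, 1],
    0.5 * obj_pos[:, 1] - 0.2 * obj_pos[:, 0],
]) + rng.normal(scale=0.01, size=(200, 2))

# Fit a GMM over the joint space [observables | joint angles].
data = np.hstack([obj_pos, joint_angles])
gmm = GaussianMixture(n_components=3, covariance_type="full",
                      random_state=0).fit(data)

def gmr_predict(gmm, x, d_in=2):
    """Gaussian mixture regression: E[joint angles | observables = x]."""
    x = np.asarray(x, dtype=float)
    means, covs, weights = gmm.means_, gmm.covariances_, gmm.weights_
    cond_means, resp = [], []
    for k in range(gmm.n_components):
        mu_x, mu_y = means[k, :d_in], means[k, d_in:]
        s_xx = covs[k][:d_in, :d_in]
        s_yx = covs[k][d_in:, :d_in]
        # Conditional mean of component k given the observed x.
        cond_means.append(mu_y + s_yx @ np.linalg.solve(s_xx, x - mu_x))
        # Responsibility of component k under the marginal N(x; mu_x, s_xx).
        diff = x - mu_x
        lik = np.exp(-0.5 * diff @ np.linalg.solve(s_xx, diff)) / \
              np.sqrt((2.0 * np.pi) ** d_in * np.linalg.det(s_xx))
        resp.append(weights[k] * lik)
    resp = np.array(resp) / np.sum(resp)
    # Responsibility-weighted mixture of the conditional means.
    return resp @ np.array(cond_means)

# Query the learned mapping for a new object position.
pred = gmr_predict(gmm, [0.5, 0.5])
```

In the paper's setting, the prior role of the trained model would enter when new samples from an untrained area update the mixture parameters; the sketch above only covers the fit-then-condition step.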

MSC:

93C85 Automated systems (robots, etc.) in control theory

Software:

Lua

References:

[1] Y. Tao, T. Wang, H. Liu, S. Jiang, Insights and suggestions on the current situation and development trend of intelligent robots, Chin. High Technol. Lett., 29 (2019), 149-163.
[2] Z. Zhang, A flexible new technique for camera calibration, IEEE Trans. Pattern Anal. Mach. Intell., 22 (2000), 1330-1334.
[3] Y. Wang, C. Liu, X. Yang, Online calibration techniques of visual measurement systems for industrial robots, Robot, 33 (2011), 299-302.
[4] L. Zhang, X. Huang, W. Feng, Space robot vision calibration with reference objects from motion trajectories, Robot, 38 (2016), 193-199.
[5] B. Belzile, L. Birglen, A compliant self-adaptive gripper with proprioceptive haptic feedback, Auton. Rob., 36 (2014), 79-91.
[6] D. Petković, M. Issa, N. D. Pavlović, L. Zentner, Ž. Ćojbašić, Adaptive neuro fuzzy controller for adaptive compliant robotic gripper, Expert Syst. Appl., 39 (2012), 13295-13304.
[7] A. Ahrary, R. D. A. Ludena, A novel approach to design of an underactuated mechanism for grasping in agriculture application, in Applied Computing and Information Technology (ed. R. Lee), Springer, (2014), 31-45.
[8] M. Manti, T. Hassan, G. Passetti, N. d’Elia, M. Cianchetti, C. Laschi, An underactuated and adaptable soft robotic gripper, in Conference on Biomimetic and Biohybrid Systems, Springer, (2015), 64-74.
[9] C. Qian, R. Li, B. Li, M. Hu, Y. Xin, Study on bionic manipulator based on multi-sensor data Fusion, Piezoelectr. Acoustoopt., 39 (2017), 490-493.
[10] S. Liu, J. Deng, Y. Zhan, Y. Ye, Design and implementation of manipulator based on PWM technology, J. Detect. Control, 39 (2017), 19-23.
[11] A. Saxena, J. Driemeyer, A. Y. Ng, Robotic grasping of novel objects using vision, Int. J. Rob. Res., 27 (2008), 157-173.
[12] W. Dong, Research on industrial robots grasping technologies based on machine vision, Huazhong University of Science and Technology, (2011).
[13] Z. Yan, X. Du, M. Cao, Y. Cai, T. Lu, S. Wang, A method for robotic grasping position detection based on deep learning, Chin. High Technol. Lett., 28 (2018), 58-66.
[14] J. Xia, K. Qian, X. Ma, H. Liu, Fast planar grasp pose detection for robot based on cascaded deep convolutional neural networks, Robot, 40 (2018), 794-802.
[15] Y. Mollard, T. Munzer, A. Baisero, M. Toussaint, M. Lopes, Robot programming from demonstration, feedback and transfer, 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2015.
[16] K. Bousmalis, A. Irpan, P. Wohlhart, Y. Bai, M. Kelcey, M. Kalakrishnan, Using Simulation and Domain Adaptation to Improve Efficiency of Deep Robotic Grasping, 2018 IEEE International Conference on Robotics and Automation (ICRA), 2018.
[17] M. Schwarz, C. Lenz, G. M. García, S. Koo, A. S. Periyasamy, M. Schreiber, Fast Object Learning and Dual-arm Coordination for Cluttered Stowing, Picking, and Packing, 2018 IEEE International Conference on Robotics and Automation (ICRA), 2018.
[18] P. Schmidt, N. Vahrenkamp, M. Wächter, T. Asfour, Grasping of Unknown Objects Using Deep Convolutional Neural Networks Based on Depth Images, 2018 IEEE International Conference on Robotics and Automation (ICRA), 2018.
[19] G. Zhao, Y. Tao, H. Liu, X. Deng, Y. Chen, H. Xiong, A robot demonstration method based on LWR and Q-learning algorithm, J. Intell. Fuzzy Syst., 35 (2018), 35-46.
[20] Y. Chen, J. Guo, Y. Tao, Adaptive grasping strategy of robot based on Gaussian process, J. Beijing Univ. Aeronaut. Astronaut., 43 (2017), 1738-1745.
[21] E. Rohmer, S. P. N. Singh, M. Freese, V-REP: A Versatile and Scalable Robot Simulation Framework, 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems, 2013.
[22] N. Diego, Virtual robot experimentation platform user manual [EB/OL], 2016.
[23] R. Ierusalimschy, Programming in Lua.