A journal of IEEE and CAA that publishes high-quality papers in English on original theoretical/experimental research and development in all areas of automation.
Volume 9, Issue 3, Mar. 2022

IEEE/CAA Journal of Automatica Sinica

Citation: B. X. Wu, J. P. Zhong, and C. G. Yang, “A visual-based gesture prediction framework applied in social robots,” IEEE/CAA J. Autom. Sinica, vol. 9, no. 3, pp. 510–519, Mar. 2022. doi: 10.1109/JAS.2021.1004243

A Visual-Based Gesture Prediction Framework Applied in Social Robots

doi: 10.1109/JAS.2021.1004243
Funds: This work was supported in part by the National Natural Science Foundation of China (NSFC) (U20A20200, 61861136009), in part by the Guangdong Basic and Applied Basic Research Foundation (2019B1515120076, 2020B1515120054), and in part by the Industrial Key Technologies R&D Program of Foshan (2020001006308).
Abstract: In daily life, people use their hands in various ways for most activities, and many applications build on the position, orientation, and joints of the hand, including gesture recognition, gesture prediction, and robotics. This paper proposes a gesture prediction system that uses hand-joint coordinate features collected by the Leap Motion controller to predict dynamic hand gestures; the model is applied to the NAO robot to verify the effectiveness of the proposed method. First, a Kalman filter is applied to the raw data to reduce the jitter and jumps that arise during Leap Motion data acquisition. Then, new feature descriptors are introduced: length, angle, and angular-velocity features are extracted from the filtered data and fed, in different combinations, into a long short-term memory recurrent neural network (LSTM-RNN). Experimental results show that the combination of coordinate, length, and angle features achieves the highest accuracy, 99.31%, and runs in real time. Finally, the trained model is applied to the NAO robot to play the finger-guessing game; based on the predicted gesture, the NAO robot can respond in advance.
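To make the filtering step concrete, below is a minimal Python sketch (not the authors' implementation) of a Kalman filter with a constant-velocity state model smoothing one coordinate axis of a tracked joint. The frame period, noise variances, and the names make_cv_kalman and kalman_smooth are illustrative assumptions, not details from the paper.

import numpy as np

def make_cv_kalman(dt=1/60, q=1e-3, r=1e-2):
    """Constant-velocity Kalman filter matrices for one coordinate axis.

    State x = [position, velocity]^T; the measurement is the raw position
    reported by the sensor. dt is the assumed frame period (the Leap Motion
    streams at roughly 60-120 fps), q scales the process noise, and r is
    the measurement-noise variance.
    """
    F = np.array([[1.0, dt], [0.0, 1.0]])    # state transition
    H = np.array([[1.0, 0.0]])               # we observe position only
    Q = q * np.array([[dt**4 / 4, dt**3 / 2],
                      [dt**3 / 2, dt**2]])   # process-noise covariance
    R = np.array([[r]])                      # measurement-noise covariance
    return F, H, Q, R

def kalman_smooth(measurements, dt=1/60, q=1e-3, r=1e-2):
    """Filter a 1-D stream of noisy positions; return smoothed positions."""
    F, H, Q, R = make_cv_kalman(dt, q, r)
    x = np.array([[measurements[0]], [0.0]])  # initial state
    P = np.eye(2)                             # initial covariance
    out = []
    for z in measurements:
        # Predict step.
        x = F @ x
        P = F @ P @ F.T + Q
        # Update step.
        y = np.array([[z]]) - H @ x           # innovation
        S = H @ P @ H.T + R                   # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)        # Kalman gain
        x = x + K @ y
        P = (np.eye(2) - K @ H) @ P
        out.append(float(x[0, 0]))
    return np.array(out)

# Example: smooth a synthetic jittery fingertip track over 120 frames.
raw_x = np.cumsum(np.random.randn(120)) + 5 * np.random.randn(120)
smooth_x = kalman_smooth(raw_x)

The same filter would be run independently on each coordinate of each joint before the feature-extraction stage described above.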

     






    Highlights

• A method for predicting gestures based on an LSTM is proposed; the gesture data are collected by the Leap Motion controller.
• To reduce or eliminate the jitter and jumps generated while the Leap Motion acquires data, a Kalman filter is applied, which solves this problem effectively.
• We propose a reliable feature extraction method that extracts coordinate, length, angle, and angular-velocity features and combines them to predict gestures (see the sketch after this list).
• We apply the trained model to the NAO robot and make it play the finger-guessing game with players, which effectively verifies the real-time performance and accuracy of the proposed approach.
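The page does not reproduce the paper's exact feature formulas, so the following Python sketch only illustrates plausible stand-ins: length as the fingertip-to-palm distance, angle as the angle between adjacent finger directions, and angular velocity as a frame-to-frame finite difference. The function name frame_features and all shapes are assumptions for illustration.

import numpy as np

def frame_features(joints, palm, prev_angles=None, dt=1/60):
    """Illustrative per-frame descriptors from 3-D hand joints.

    joints: (N, 3) array of joint positions (e.g., five fingertips);
    palm: (3,) palm-center position. The paper's exact definitions
    may differ; these are plausible stand-ins.
    """
    vecs = joints - palm                               # palm-to-joint vectors
    lengths = np.linalg.norm(vecs, axis=1)             # "length" features
    unit = vecs / lengths[:, None]
    # "Angle" features: angle between adjacent finger directions.
    cosines = np.clip(np.sum(unit[:-1] * unit[1:], axis=1), -1.0, 1.0)
    angles = np.arccos(cosines)
    # "Angular velocity": finite difference against the previous frame.
    if prev_angles is not None:
        ang_vel = (angles - prev_angles) / dt
    else:
        ang_vel = np.zeros_like(angles)
    coords = joints.flatten()                          # raw coordinate features
    return np.concatenate([coords, lengths, angles, ang_vel]), angles

# Example: build a (T, feature_dim) sequence for an LSTM from T frames.
rng = np.random.default_rng(0)
prev, seq = None, []
for _ in range(60):                                    # 60 synthetic frames
    joints = rng.normal(size=(5, 3))                   # five fingertips
    palm = rng.normal(size=3)
    feat, prev = frame_features(joints, palm, prev)
    seq.append(feat)
seq = np.stack(seq)                                    # shape (60, 28)

Stacking such per-frame vectors over a gesture clip yields the kind of fixed-dimension sequence that an LSTM-RNN classifier, as used in the paper, consumes.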
