A journal of IEEE and CAA, publishing high-quality papers in English on original theoretical/experimental research and development in all areas of automation.
Volume 7 Issue 2
Mar. 2020

IEEE/CAA Journal of Automatica Sinica

  • JCR Impact Factor: 15.3, Top 1 (SCI Q1)
  • CiteScore: 23.5, Top 2% (Q1)
  • Google Scholar h5-index: 77, Top 5
Citation: Mykhailo Ivanov, Oleg Sergyienko, Vera Tyrsa, Lars Lindner, Wendy Flores-Fuentes, Julio Cesar Rodríguez-Quiñonez, Wilmar Hernandez and Paolo Mercorelli, "Influence of Data Clouds Fusion From 3D Real-Time Vision System on Robotic Group Dead Reckoning in Unknown Terrain," IEEE/CAA J. Autom. Sinica, vol. 7, no. 2, pp. 368-385, Mar. 2020. doi: 10.1109/JAS.2020.1003027

Influence of Data Clouds Fusion From 3D Real-Time Vision System on Robotic Group Dead Reckoning in Unknown Terrain

doi: 10.1109/JAS.2020.1003027
  • This paper proposes solutions to the set of tasks required to optimize the behavior of an autonomous robotic group during a mission over a distributed area in cluttered, hazardous terrain. The navigation scheme exploits an original real-time technical vision system (TVS) based on the dynamic triangulation principle, and processes the TVS output with fuzzy logic rules for resolution stabilization. Building on previous research, the dynamic communication network model is modified to propagate information with feedback, giving more stable data exchange inside the robotic group. Based on a comparative analysis of approximation methods, the authors propose a two-step post-processing path-planning stage aimed at producing a smooth, energy-saving trajectory (an illustrative sketch of such a smoothing step is given below). The article provides a wide range of studies and computational experiment results for different scenarios, evaluating the influence of the common point cloud on robotic motion planning.

     

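As a concrete illustration of the two-step post-processing mentioned in the abstract, the sketch below assumes step one yields a coarse waypoint path from any grid planner and step two smooths the corners with quadratic Bézier blends. The function names, the choice of quadratic blends, and the sampling parameter are illustrative assumptions and are not taken from the paper.

```python
# Illustrative sketch only: generic two-step post-processing of a planned path.
# Step 1: coarse waypoints from any grid planner; step 2: Bezier smoothing.
# Names and parameters are hypothetical; the paper's actual procedure may differ.
from typing import List, Tuple

Point = Tuple[float, float]

def de_casteljau(ctrl: List[Point], t: float) -> Point:
    """Evaluate a Bezier curve with control points `ctrl` at parameter t in [0, 1]."""
    pts = list(ctrl)
    while len(pts) > 1:
        pts = [((1 - t) * x0 + t * x1, (1 - t) * y0 + t * y1)
               for (x0, y0), (x1, y1) in zip(pts, pts[1:])]
    return pts[0]

def smooth_path(waypoints: List[Point], samples_per_segment: int = 20) -> List[Point]:
    """Step 2: replace each waypoint corner with a quadratic Bezier blend.

    The midpoints of consecutive segments serve as curve endpoints and the shared
    waypoint as the control point, which keeps the result close to the original
    collision-checked path while removing sharp turns.
    """
    if len(waypoints) < 3:
        return list(waypoints)
    mids = [((x0 + x1) / 2, (y0 + y1) / 2)
            for (x0, y0), (x1, y1) in zip(waypoints, waypoints[1:])]
    path: List[Point] = [waypoints[0]]
    for corner, (m0, m1) in zip(waypoints[1:-1], zip(mids, mids[1:])):
        for k in range(samples_per_segment + 1):
            path.append(de_casteljau([m0, corner, m1], k / samples_per_segment))
    path.append(waypoints[-1])
    return path

# Usage: the waypoints would come from step 1 (e.g., a shortest-path search on the fused map).
coarse = [(0.0, 0.0), (2.0, 0.0), (2.0, 2.0), (4.0, 3.0)]
trajectory = smooth_path(coarse)
```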
  • [1]
    J. Boes and F. Migeon, “Self-organizing multi-agent systems for the control of complex systems,” J. Systems and Software, vol. 134, pp. 12–28, 2017. doi: 10.1016/j.jss.2017.08.038
    [2]
    A. H. Bond and L. Gasser, Readings in Distributed Artificial Intelligence, San Mateo, CA: Morgan Kaufmann, 1988.
    [3]
    J. A. Shapiro, “Thinking about bacterial populations as multicellular organisms,” Annu. Rev. Microbiol, vol. 52, pp. 81–104, 1998. doi: 10.1146/annurev.micro.52.1.81
    [4]
    L. A. Parr, J. T. Winslow, W. D. Hopkins, and F. Waal, “Recognizing facial cues: individual discrimination by chimpanzees (Pan troglodytes) and rhesus monkeys (Macaca mulatta),” J. Comparative Psychology, vol. 114, pp. 47, 2000. doi: 10.1037/0735-7036.114.1.47
    [5]
    S. Nebti and A. Boukerram, “Swarm intelligence inspired classifiers for facial recognition,” Swarm and Evolutionary Computation, vol. 32, pp. 150–166, 2017. doi: 10.1016/j.swevo.2016.07.001
    [6]
    M. Mavrovouniotis, C. Li, and S. Yang, “A survey of swarm intelligence for dynamic optimization: algorithms and applications,” Swarm and Evolutionary Computation, vol. 33, pp. 1–17, 2017. doi: 10.1016/j.swevo.2016.12.005
    [7]
    A. Atyabi, S. Phon-Amnuaisuk, and C. K. Ho, “Navigating a robotic swarm in an uncharted 2D landscape,” Applied Soft Computing, vol. 10, pp. 149–169, 2010. doi: 10.1016/j.asoc.2009.06.017
    [8]
    P. Levi, E. Meister, and F. Schlachter, “Reconfigurable swarm robots produce self-assembling and self-repairing organisms,” Robot. Auton. Syst., vol. 62, pp. 1371–1376, Oct. 2014. doi: 10.1016/j.robot.2014.07.001
    [9]
    A. O. de. Sa, N. Nedjah, and L. Macedo Mourelle, “Distributed and resilient localization algorithm for swarm robotic systems,” Applied Soft Computing, vol. 57, pp. 738–750, 2017. doi: 10.1016/j.asoc.2016.07.049
    [10]
    O. Y. Sergiyenko, M. V. Ivanov, V. V. Tyrsa, V. M. Kartashov, M. Rivas Lopez, D. Hernández Balbuena, W. Flores Fuentes, J. C. Rodríguez Quiñonez, J. I. Nieto Hipólito, W. Hernandez, and A. Tchernykh, “Data transferring model determination in robotic group,” Robotics and Autonomous Systems, vol. 83, pp. 251–260, 2016. doi: 10.1016/j.robot.2016.04.003
    [11]
    Y. Tan and Z.-Y. Zheng, “Research advance in swarm robotics,” Defence Technology, vol. 9, pp. 18–39, 2013. doi: 10.1016/j.dt.2013.03.001
    [12]
    P. Suárez, A. Iglesias, and A. Gálvez, “Make robots be bats: specializing robotic swarms to the Bat algorithm,” Swarm and Evolutionary Computation, 2018.
    [13]
    P. Valdastri, P. Corradi, A. Menciassi, T. Schmickl, K. Crailsheim, J. Seyfried, and P. Dario, “Micromanipulation, communication and swarm intelligence issues in a swarm microrobotic platform,” Robotics and Autonomous Systems, vol. 54, pp. 789–804, 2006. doi: 10.1016/j.robot.2006.05.001
    [14]
    U. Lima P., M. Custódio L, Multi-Robot Systems, Patnaik S., C. Jain L., G. Tzafestas S., Resconi G., Konar A, (eds) Innovations in Robot Mobility and Control. Studies in Computational Intelligence, vol 8. Springer, Berlin, Heidelberg.
    [15]
    D. Marocco and S. Nolfi, “Origins of communication in evolving robots,” Lecture Notes in Computer Science, vol. 4095, pp. 789–803, 2006.
    [16]
    L. Básaca, O. Sergiyenko, J. C. Rodríguez-Quinonez, X. García, V. Tyrsa, M. Rivas-Lopez, D. Hernandez-Balbuena, P. Mercorelli, M. Podrygalo, A. Gurko, I. Tabakova, and O. Starostenko, “Optical 3D laser measurement system for navigation of autonomous mobile robot,” Optics and Lasers in Engineering, vol. 54, pp. 159–169, 2014. doi: 10.1016/j.optlaseng.2013.08.005
    [17]
    O. Sergiyenko, V. Kartashov, M. Ivanov, D. Hernández-Balbuena, V. Tyrsa and J. I. Nieto-Hipólito, “Transferring model in robotic group,” in Proc. IEEE 25th Int. Symp. Industrial Electronics, 2016, pp. 946–952.
    [18]
    B. Patle, D. Parhi, A. Jagadeesh, and S. K. Kashyap, “Matrix-binary codes based genetic algorithm for path planning of mobile robot,” Computers &Electrical Engineering, vol. 67, pp. 708–728, 2018.
    [19]
    M. H. B. Abadi, M. A. Oskoei, and A. Fakharian, "Side sonar vision applied to Omni-directional images to navigate mobile robots," in Proc. 5th Iranian Joint Congr. Fuzzy and Intelligent Systems, 2017, pp. 97–102.
    [20]
    L. Hou, F. Fan, J. Fu, and J. Wang, “Time-varying algorithm for swarm robotics,” IEEE/CAA J. Autom. Sinica, vol. 5, no. 1, pp. 217–222, 2018. doi: 10.1109/JAS.2017.7510685
    [21]
    D. Quoc Bao and I. Zelinka, “Obstacle avoidance for swarm robot based on self-organizing migrating algorithm,” Procedia Computer Science, vol. 150, pp. 425–432, 2019. doi: 10.1016/j.procs.2019.02.073
    [22]
    D. Ivanov, “Distribution of roles in groups of robots with limited communications based on the swarm interaction,” Procedia Computer Science, vol. 150, pp. 518–523, 2019. doi: 10.1016/j.procs.2019.02.087
    [23]
    S. Lu, Y. Zhang, and J. Su, “Mobile robot for power substation inspection: a survey,” IEEE/CAA J. Autom. Sinica, pp. 1–18, 2017.
    [24]
    P. Sun and Z. Yu, “Tracking control for a cushion robot based on fuzzy path planning with safe angular velocity,” IEEE/CAA J. Autom. Sinica, vol. 4, no. 4, pp. 610–619, 2017. doi: 10.1109/JAS.2017.7510607
    [25]
    M. Ivanov, O. Sergiyenko, V. Tyrsa, P. Mercorelli, V. Kartashov, W. Hernandez, S. Sheiko, and M. Kolendovska, “Individual scans fusion in virtual knowledge base for navigation of mobile robotic group with 3D TVS,” in Proc. IECON 2018 44th Annual IEEE Conf. Industrial Electronics Society, Washington, USA, 2018.
    [26]
    M. Ivanov, O. Sergiyenko, P. Mercorelli, W. Hernandez, V. Tyrsa, D. Hernandez-Balbuena, J. C. Quiñonez Rodriguez, V. Kartashov, M. Kolendovska, and I. Tabakova, “Effective informational entropy reduction in multi-robot systems based on real-time TVS,” in Proc. IEEE 28th Int. Symp. Industrial Electronics, Vancouver, Canada, 2019.
    [27]
    C. O. Vilão, D. H. Perico, I. J. Silva, T. P. D. Homem, F. Tonidandel, and R. A. C. Bianchi, “A single camera vision system for a Humanoid robot,” in Proc. Joint Conf. Robotics: SBR-LARS Robotics Symposium and Robocontrol, Sao Carlos, Brazil, 2014.
    [28]
    B. Katalinic, N. Gryaznov, and A. Lopota, “Computer vision for mobile on-ground robotics,” Procedia Engineering, vol. 100, pp. 1376–1380, 2015. doi: 10.1016/j.proeng.2015.01.506
    [29]
    M. C. Achtelik and D. Scaramuzza, “Vision-controlled micro flying robots: from system design to autonomous navigation and mapping in GPS-denied environments,” IEEE Robotics &Automation Magazine, vol. 2, no. 3, pp. 26–40, 2014.
    [30]
    G. Alenyà Ribas, S. Foix Salmerón, and C. Torras Genís, “ToF cameras for active vision in robotics,” Sensors and Actuators A:Physical, vol. 218, pp. 10–22, 2014. doi: 10.1016/j.sna.2014.07.014
    [31]
    Q. Fan, B. Sun, Y. Sun, Y. Wu, and X. Zhuang, “Data fusion for indoor mobile robot positioning based on tightly coupled INS/UWB,” J. Navigation, vol. 70, no. 5, pp. 1079–1097, 2017. doi: 10.1017/S0373463317000194
    [32]
    O. Y. Sergiyenko, “Optoelectronic system for mobile robot navigation,” Optoelectronics,Instrumentation and Data Processing, vol. 46, pp. 414–128, 01 10. 2010. doi: 10.3103/S8756699011050037
    [33]
    O. Sergiyenko, W. Hernandez, V. Tyrsa, L. D. Cruz, O.Starostenko, and M.Pena-Cabrera, “Remote sensor for spatial measurements by using optical scanning,” MDPI,Sensors, vol. 9, no. 7, pp. 5477–5492, 2009. doi: 10.3390/s90705477
    [34]
    L. Básaca, J. Rodríguez, O. Sergiyenko, V. Tyrsa, W. Hernández, and J. I. Nieto Hipólito, “Resolution improvement of dynamic triangulation method for 3D vision system in robot navigation task,” in Proc. IEEE 36th Annual Conf. Industrial Electronics IECON’10, 2010.
    [35]
    O. Sergiyenko, V. Tyrsa, M. Rivas López, D. Hernández Balbuena, L. C. Básaca Preciado, J. C. Rodríguez Quiñonez, and W. Flores Fuentes, “Sistema óptico de triangulación dinámica para la medición de ángulos y coordenadas en un espacio tridimensional”. Mexico Patent 344504 (MX201400647), 2016.
    [36]
    J. C. Rodriguez-Quinonez, O. Sergiyenko, F. F. Gonzalez-Navarro, L. Basaca-Preciado, and V. Tyrsa, “Surface recognition improvement in 3D medical laser scanner using Levenberg-Marquardt method,” Signal Processing, vol. 93, pp. 378–386, 2013. doi: 10.1016/j.sigpro.2012.07.001
    [37]
    J. Rodriguez-Quiñonez, O. Sergiyenko, D. Hernandez-Balbuena, M. Rivas-Lopez, W. Flores-Fuentes, and L. C. Basaca Preciado, “Improve 3D laser scanner measurements accuracy using a FFBP neural network with Widrow-Hoff weigt/bias learning function,” Opto-Electronic Review, vol. 22, no. 4, pp. 224–235, 2014.
    [38]
    J. Rodriguez-Quiñonez, O. Sergiyenko, W. Flores-Fuentes, M. Rivas-Lopez, D. Hernandez-Balbuena, R. Rascon, and P. Mercorelli, “Improve a 3D distance measurement accuracy in stereo vision systems using optimization methods. approach,” Opto-Electronics Review, vol. 25, no. 1, pp. 24–32, 2017. doi: 10.1016/j.opelre.2017.03.001
    [39]
    X. Garcia, O. Sergiyenko, V. Tyrsa, M. Rivas-Lopez, D. Hernandez-Balbuena, J. C. Rodriguez-Quiñonez, L. C. Basaca-Preciado, and P. Mercorelli, “Optimization of 3D laser scanning speed by use of combined variable step,” Optics and Lasers in Engineering, vol. 54, pp. 141–151, 2014. doi: 10.1016/j.optlaseng.2013.08.011
    [40]
    L. Lindner, O. Sergiyenko, M. Rivas-López, B. Valdez-Salas, J. C. Rodríguez-Quiñonez, D. Hernández-Balbuena, W. Flores-Fuentes, V. T. Misael, M. Barrera, F. N. Muerrieta-Rico, P. Mercorelli, A. Gurko, and V. M. Kartashov, “Machine vision system for UAV navigation,” in Proc. 2016 Int. Conf. Electrical Systems for Aircraft, Railway, Ship Propulsion and Road Vehicles Int. Transportation Electrification Conf., 2016.
    [41]
    L. Lindner, O. Sergiyenko, M. Rivas-López, D. Hernández-Balbuena, W. Flores-Fuentes, J. C. Rodríguez-Quiñonez, F. N. Murrieta-Rico, M. Ivanov, V. Tyrsa, and L. C. Básaca-Preciado, “Exact laser beam positioning for measurement of vegetation vitality,” Industrial Robot:The Int. J. Robotics Research and Application, vol. 44, pp. 532–541, 2017. doi: 10.1108/IR-11-2016-0297
    [42]
    L. Lindner, O. Sergiyenko, J. Rodríguez-Quiñonez, and V. Tyrsa, “Continuous 3D scanning mode using servomotors instead of stepping motors in dynamic laser triangulation,” in Proc. IEEE 24th Int. Symp. Industrial Electronics, Buzios, Brazil, 2015.
    [43]
    L. Lindner, O. Sergiyenko, J. C. Rodríguez-Quiñonez, M. Rivas-Lopez, D. Hernandez-Balbuena, W. Flores-Fuentes, F. N. Murrieta-Rico, and V. Tyrsa, “Mobile robot vision system using continuous laser scanning for industrial application,” Industrial Robot:An Int. J., vol. 43, no. 4, pp. 360–369, 2016. doi: 10.1108/IR-01-2016-0048
    [44]
    X. N. Guo, Y. F. Chen, J. W. Zhao, L. Yang, and W. W. Gong, “Field environment intelligent navigation system for tomato transportation robot based on Dijkstra,” in Proc. Chinese Intelligent Automation Conf., Jiangsu, China, 2019, pp. 329–336.
    [45]
    X. Chen and S. Qin, “Approach to high efficient hierarchical pathfinding of indoor mobile service robots based on grid map and Floyd-Warshall algorithm,” in Proc. 2017 Chinese Automation Congr., Jinan, China, 2017, pp. 6476–6483.
    [46]
    J. Rosenblatt, “DAMN: distributed architecture for mobile navigation,” J. Experimental and Theoretical Artificial Intelligence, vol. 9, no. 2–3, pp. 339–360, 1997. doi: 10.1080/095281397147167
    [47]
    P. E. Bezier, “How renault uses numerical control for car body design and tooling,” in Society of Automotive Engineers, Detroit, MI, USA, 1968.
    [48]
    P. Bezier, “Example of an existing system in the motor industry: the unisurf system,” in Proc. the Royal Society of London, vol. A321, pp. 207–218, 1971.
    [49]
    K. Kawabata, L. Ma, J. Xue, C. Zhu, and N. Zheng, “A path generation for automated vehicle based on Bezier curve and via-points,” Robotics and Autonomous Systems, vol. 74, no. A, pp. 243–252, 2015.
    [50]
    L. Han, H. Yashiro, T. Nejad, Q. Do, and S. Mita, “Bezier curve based path planning for autonomous vehicle in urban environment,” in IEEE Intelligent Vehicles Symp., CA, USA, 2010.
    [51]
    L. E. Dubins, “On curves of minimal length with a constraint on average curvature, and with prescribed initial and terminal positions and tangents,” American J. Mathematics, vol. 79, no. 3, pp. 497–516, 1957. doi: 10.2307/2372560
    [52]
    L. Blin, A. Cournier, and V. Villain, “An improved Snap-stabilizing PIF algorithm,” Self-Stabilizing Systems, vol. 2704, pp. 199–214, 2003.
    [53]
    H. T. Nguyen and E. A. Walker, A First Course in Fuzzy Logic, USA: Chapman and Hall/CRC, 2005, p. 440.
    [54]
    E. Rohmer, S. P. N. Singh, and M. Freese, “V-REP: a versatile and scalable robot simulation framework,” in Proc. IEEE/RSJ Int. Conf. Intelligent Robots and Systems, Tokyo, Japan, 2013.
    [55]
    O. Michael, “Webots: professional mobile robot simulation,” J. Advanced Robotics Systems, vol. 1, no. 1, pp. 39–42, 2004.
    [56]
    B. P. Gerkey, R. Vaughan, and A. Howard, “The player/stage project: tools for multi-robot and distributed sensor systems,” in Proc. Int. Conf. Advanced Robotics, University of Coimbra, Portugal, 2003.
    [57]
    J. Hocking, Unity in Action: Multiplatform Game Development in C# with Unity 5, Shelter Island, New York: Manning Publications, 2015, pp. 352.
    [58]
    Y. Deng, Y. Chen, Y. Zhang, and S. Mahadevan, “Fuzzy Dijkstra algorithm for shortest path problem under uncertain environment,” Applied Soft Computing, vol. 12, no. 3, pp. 1231–1237, 2012. doi: 10.1016/j.asoc.2011.11.011
    [59]
    E. Xidias and P. Azariadis, “Computing collision-free motions for a team of robots using formation and non-holonomic constraints,” Robotics and Autonomous Systems, vol. 82, pp. 15–23, 2016. doi: 10.1016/j.robot.2016.04.008


    Figures (13) / Tables (5)

    Article Metrics

    Article views: 3088 / PDF downloads: 113

    Highlights

    • The article covers all the tasks needed for a complete robotic behavior model.
    • The article describes a method of resolution stabilization for the laser technical vision system.
    • Robot navigation and trajectories are improved by implementing data transfer within the group (see the fusion sketch below).
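To make the data-fusion idea in the highlights concrete, the following minimal sketch merges per-robot TVS point clouds into one shared cloud, assuming each robot reports its dead-reckoned planar pose (x, y, heading) and its measured points in its own local frame. The pose convention, the grid-based de-duplication, and all names are hypothetical and do not reflect the paper's actual data model.

```python
# Illustrative sketch only: fusing per-robot point clouds into a shared map cloud.
# Frame conventions and de-duplication are assumptions, not the paper's spec.
import math
from typing import Dict, List, Tuple

Point = Tuple[float, float]
Pose = Tuple[float, float, float]  # x, y, heading (rad) in the common map frame

def to_map_frame(points: List[Point], pose: Pose) -> List[Point]:
    """Rotate and translate points from a robot's local frame into the map frame."""
    x0, y0, th = pose
    c, s = math.cos(th), math.sin(th)
    return [(x0 + c * px - s * py, y0 + s * px + c * py) for px, py in points]

def fuse_clouds(scans: Dict[str, Tuple[Pose, List[Point]]],
                cell: float = 0.1) -> List[Point]:
    """Merge all robots' scans, de-duplicating points that fall in the same grid cell."""
    occupied: Dict[Tuple[int, int], Point] = {}
    for _robot, (pose, points) in scans.items():
        for p in to_map_frame(points, pose):
            key = (round(p[0] / cell), round(p[1] / cell))
            occupied.setdefault(key, p)  # keep the first observation per cell
    return list(occupied.values())

# Usage: two robots observing the same obstacle corner from different poses.
shared = fuse_clouds({
    "r1": ((0.0, 0.0, 0.0),     [(1.0, 0.5), (1.1, 0.5)]),
    "r2": ((2.0, 0.0, math.pi), [(0.9, -0.5)]),
})
```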
