IEEE/CAA Journal of Automatica Sinica
Citation: T. H. Zhang, J. H. Xiao, L. Li, C. Wang, and G. M. Xie, "Toward Coordination Control of Multiple Fish-Like Robots: Real-Time Vision-Based Pose Estimation and Tracking via Deep Neural Networks," IEEE/CAA J. Autom. Sinica, vol. 8, no. 12, pp. 1964-1976, Dec. 2021. doi: 10.1109/JAS.2021.1004228
[1] E. W. Hawkes, L. H. Blumenschein, J. D. Greer, and A. M. Okamura, “A soft robot that navigates its environment through growth,” Science Robotics, vol. 2, no. 8, Article No. eaan3028, 2017.
[2] L. Li, A. Liu, W. Wang, S. Ravi, R. Fu, J. Yu, and G. Xie, “Bottom-level motion control for robotic fish to swim in groups: Modeling and experiments,” Bioinspiration & Biomimetics, vol. 14, no. 4, Article No. 046001, 2019.
[3] L. Li, M. Nagy, J. M. Graving, J. Bak-Coleman, G. Xie, and I. D. Couzin, “Vortex phase matching as a strategy for schooling in robots and in fish,” Nature Communications, vol. 11, no. 1, pp. 1–9, 2020. doi: 10.1038/s41467-019-13993-7
[4] S. Butail, T. Bartolini, and M. Porfiri, “Collective response of zebrafish shoals to a free-swimming robotic fish,” PLoS One, vol. 8, no. 10, Article No. e76123, 2013. doi: 10.1371/journal.pone.0076123
[5] F. Bonnet, Y. Kato, J. Halloy, and F. Mondada, “Infiltrating the zebrafish swarm: Design, implementation and experimental tests of a miniature robotic fish lure for fish–robot interaction studies,” Artificial Life and Robotics, vol. 21, no. 3, pp. 239–246, 2016. doi: 10.1007/s10015-016-0291-8
[6] C. Wang, X. Chen, G. Xie, and M. Cao, “Emergence of leadership in a robotic fish group under diverging individual personality traits,” Royal Society Open Science, vol. 4, no. 5, Article No. 161015, 2017. doi: 10.1098/rsos.161015
[7] J. Yuan, J. Yu, Z. Wu, and M. Tan, “Precise planar motion measurement of a swimming multi-joint robotic fish,” Science China Information Sciences, vol. 59, no. 9, pp. 1–15, 2016.
[8] K. Terayama, H. Habe, and M.-A. Sakagami, “Multiple fish tracking with an NACA airfoil model for collective behavior analysis,” IPSJ Trans. Computer Vision and Applications, vol. 8, no. 1, pp. 1–7, 2016. doi: 10.1186/s41074-016-0002-3
[9] Y. Ma, J. Kosecka, and S. S. Sastry, “Vision guided navigation for a nonholonomic mobile robot,” IEEE Trans. Robotics and Automation, vol. 15, no. 3, pp. 521–536, 1999. doi: 10.1109/70.768184
[10] A. Phillips, “Robot fish: Bio-inspired fishlike underwater robots,” Underwater Technology, vol. 34, no. 3, pp. 143–145, 2017.
[11] A. Pérez-Escudero, J. Vicente-Page, R. C. Hinz, S. Arganda, and G. G. De Polavieja, “idTracker: Tracking individuals in a group by automatic identification of unmarked animals,” Nature Methods, vol. 11, no. 7, pp. 743–748, 2014. doi: 10.1038/nmeth.2994
[12] R. K. Katzschmann, A. D. Marchese, and D. Rus, “Hydraulic autonomous soft robotic fish for 3D swimming,” in Experimental Robotics, Cham, Switzerland: Springer, 2016, pp. 405–420.
[13] X. Tan, D. Kim, N. Usher, D. Laboy, J. Jackson, A. Kapetanovic, J. Rapai, B. Sabadus, and X. Zhou, “An autonomous robotic fish for mobile sensing,” in Proc. IEEE/RSJ Int. Conf. Intelligent Robots and Systems, IEEE, 2006, pp. 5424–5429.
[14] W. Wang and G. Xie, “Online high-precision probabilistic localization of robotic fish using visual and inertial cues,” IEEE Trans. Industrial Electronics, vol. 62, no. 2, pp. 1113–1124, 2014.
[15] M. Penmetcha, S. Luo, A. Samantaray, J. E. Dietz, B. Yang, and B.-C. Min, “Computer vision-based algae removal planner for multi-robot teams,” in Proc. IEEE Int. Conf. Systems, Man and Cybernetics (SMC), IEEE, 2019, pp. 1575–1581.
[16] Y. Liu, Z. Meng, Y. Zou, and M. Cao, “Visual object tracking and servoing control of a nano-scale quadrotor: System, algorithms, and experiments,” IEEE/CAA Journal of Automatica Sinica, vol. 8, no. 2, pp. 344–360, 2020.
[17] Z. Zhang, T. Yang, T. Zhang, F. Zhou, N. Cen, T. Li, and G. Xie, “Global vision-based formation control of soft robotic fish swarm,” Soft Robotics, vol. 8, no. 3, pp. 310–318, 2021.
[18] S. Butail, N. Abaid, S. Macrì, and M. Porfiri, “Fish–robot interactions: Robot fish in animal behavioral studies,” in Robot Fish, Berlin, Heidelberg: Springer, 2015, pp. 359–377.
[19] M. Porez, F. Boyer, and A. J. Ijspeert, “Improved Lighthill fish swimming model for bio-inspired robots: Modeling, computational aspects and experimental comparisons,” Int. J. Robotics Research, vol. 33, no. 10, pp. 1322–1341, 2014. doi: 10.1177/0278364914525811
[20] E. Fontaine, D. Lentink, S. Kranenbarg, U. K. Müller, J. L. van Leeuwen, A. H. Barr, and J. W. Burdick, “Automated visual tracking for studying the ontogeny of zebrafish swimming,” J. Experimental Biology, vol. 211, no. 8, pp. 1305–1316, 2008. doi: 10.1242/jeb.010272
[21] L. Shao and G. Xie, “Real-time tracking of moving objects on a water surface,” in Proc. IEEE Int. Conf. Mechatronics and Automation, 2012, pp. 2114–2119.
[22] I. Ahmed, S. Din, G. Jeon, F. Piccialli, and G. Fortino, “Towards collaborative robotics in top view surveillance: A framework for multiple object tracking by detection using deep learning,” IEEE/CAA Journal of Automatica Sinica, vol. 8, no. 7, pp. 1253–1270, Jul. 2021. doi: 10.1109/JAS.2020.1003453
[23] B. Xiao, H. Wu, and Y. Wei, “Simple baselines for human pose estimation and tracking,” in Proc. European Conf. Computer Vision (ECCV), 2018, pp. 466–481.
[24] Z. Cao, T. Simon, S.-E. Wei, and Y. Sheikh, “Realtime multi-person 2D pose estimation using part affinity fields,” in Proc. IEEE Conf. Computer Vision and Pattern Recognition, 2017, pp. 7291–7299.
[25] E. Insafutdinov, L. Pishchulin, B. Andres, M. Andriluka, and B. Schiele, “DeeperCut: A deeper, stronger, and faster multi-person pose estimation model,” in Proc. European Conf. Computer Vision, Springer, 2016, pp. 34–50.
[26] A. Mathis, P. Mamidanna, K. M. Cury, T. Abe, V. N. Murthy, M. W. Mathis, and M. Bethge, “DeepLabCut: Markerless pose estimation of user-defined body parts with deep learning,” Nature Neuroscience, vol. 21, no. 9, pp. 1281–1289, 2018. doi: 10.1038/s41593-018-0209-y
[27] T. D. Pereira, D. E. Aldarondo, L. Willmore, M. Kislin, S. S.-H. Wang, M. Murthy, and J. W. Shaevitz, “Fast animal pose estimation using deep neural networks,” Nature Methods, vol. 16, no. 1, pp. 117–125, 2019. doi: 10.1038/s41592-018-0234-5
[28] J. Yu, C. Wang, and G. Xie, “Coordination of multiple robotic fish with applications to underwater robot competition,” IEEE Trans. Industrial Electronics, vol. 63, no. 2, pp. 1280–1288, 2015.
[29] C. Wang, G. Xie, L. Wang, and M. Cao, “CPG-based locomotion control of a robotic fish: Using linear oscillators and reducing control parameters via PSO,” Int. Journal of Innovative Computing, Information and Control, vol. 7, no. 7, pp. 4237–4249, 2011.
[30] J. Redmon and A. Farhadi, “YOLOv3: An incremental improvement,” arXiv preprint arXiv: 1804.02767, 2018.
[31] M. Tan and Q. Le, “EfficientNet: Rethinking model scaling for convolutional neural networks,” in Proc. Int. Conf. Machine Learning, PMLR, 2019, pp. 6105–6114.
[32] M. Sandler, A. Howard, M. Zhu, A. Zhmoginov, and L.-C. Chen, “MobileNetV2: Inverted residuals and linear bottlenecks,” in Proc. IEEE Conf. Computer Vision and Pattern Recognition, 2018, pp. 4510–4520.
[33] M. Andriluka, U. Iqbal, E. Insafutdinov, L. Pishchulin, A. Milan, J. Gall, and B. Schiele, “PoseTrack: A benchmark for human pose estimation and tracking,” in Proc. IEEE Conf. Computer Vision and Pattern Recognition, 2018, pp. 5167–5176.
[34] P. Voigtlaender, M. Krause, A. Osep, J. Luiten, B. B. G. Sekar, A. Geiger, and B. Leibe, “MOTS: Multi-object tracking and segmentation,” in Proc. IEEE/CVF Conf. Computer Vision and Pattern Recognition, 2019, pp. 7942–7951.
[35] T. Mikolov, M. Karafiát, L. Burget, J. Černocký, and S. Khudanpur, “Recurrent neural network based language model,” in Proc. 11th Annual Conf. Int. Speech Communication Association, 2010.
[36] W. Zaremba, I. Sutskever, and O. Vinyals, “Recurrent neural network regularization,” arXiv preprint arXiv: 1409.2329, 2014.
[37] A. Graves, “Generating sequences with recurrent neural networks,” arXiv preprint arXiv: 1308.0850, 2013.
[38] A. Alahi, K. Goel, V. Ramanathan, A. Robicquet, L. Fei-Fei, and S. Savarese, “Social LSTM: Human trajectory prediction in crowded spaces,” in Proc. IEEE Conf. Computer Vision and Pattern Recognition, 2016, pp. 961–971.
[39] K. He, R. Girshick, and P. Dollár, “Rethinking ImageNet pre-training,” in Proc. IEEE/CVF Int. Conf. Computer Vision, 2019, pp. 4918–4927.
[40] C. Wang and G. Xie, “Limit-cycle-based decoupled design of circle formation control with collision avoidance for anonymous agents in a plane,” IEEE Trans. Automatic Control, vol. 62, no. 12, pp. 6560–6567, 2017. doi: 10.1109/TAC.2017.2712758
[41] C. Wang, W. Xia, and G. Xie, “Limit-cycle-based design of formation control for mobile agents,” IEEE Trans. Automatic Control, vol. 65, no. 8, pp. 3530–3543, 2019.
[42] Z. Gao and G. Guo, “Fixed-time sliding mode formation control of AUVs based on a disturbance observer,” IEEE/CAA Journal of Automatica Sinica, vol. 7, no. 2, pp. 539–545, 2020. doi: 10.1109/JAS.2020.1003057
[43] Y. Zheng, Q. Zhao, J. Ma, and L. Wang, “Second-order consensus of hybrid multi-agent systems,” Systems & Control Letters, vol. 125, pp. 51–58, 2019.
[44] Y. Zheng, J. Ma, and L. Wang, “Consensus of hybrid multi-agent systems,” IEEE Trans. Neural Networks and Learning Systems, vol. 29, no. 4, pp. 1359–1365, 2017.
[45] T. Zhang, Y. Li, S. Li, Q. Ye, C. Wang, and G. Xie, “Decentralized circle formation control for fish-like robots in real-world via reinforcement learning,” in Proc. Int. Conf. Robotics and Automation (ICRA), IEEE, arXiv preprint arXiv: 2103.05293, 2021.