A journal of IEEE and CAA, publishing high-quality papers in English on original theoretical/experimental research and development in all areas of automation
Volume 11 Issue 8
Aug. 2024

IEEE/CAA Journal of Automatica Sinica

  • JCR Impact Factor: 15.3, Top 1 (SCI Q1)
    CiteScore: 23.5, Top 2% (Q1)
    Google Scholar h5-index: 77, Top 5
Citation: S. Qi, R. Wang, T. Zhang, W. Huang, F. Yu, and  L. Wang,  “Enhancing evolutionary algorithms with pattern mining for sparse large-scale multi-objective optimization problems,” IEEE/CAA J. Autom. Sinica, vol. 11, no. 8, pp. 1786–1801, Aug. 2024. doi: 10.1109/JAS.2024.124548

Enhancing Evolutionary Algorithms With Pattern Mining for Sparse Large-Scale Multi-Objective Optimization Problems

doi: 10.1109/JAS.2024.124548
Funds: This work was supported by the Open Project of Xiangjiang Laboratory (22XJ02003), the University Fundamental Research Fund (23-ZZCX-JDZ-28, ZK21-07), the National Science Fund for Outstanding Young Scholars (62122093), the National Natural Science Foundation of China (72071205), the Hunan Graduate Research Innovation Project (CX20230074), the Hunan Natural Science Foundation Regional Joint Project (2023JJ50490), the Science and Technology Project for Young and Middle-aged Talents of Hunan (2023TJ-Z03), and the Science and Technology Innovation Program of Hunan Province (2023RC1002)
Abstract
  • Sparse large-scale multi-objective optimization problems (SLMOPs) are common in science and engineering. However, the large scale of these problems corresponds to a high-dimensional decision space, requiring algorithms to traverse a vast search space with limited computational resources. Furthermore, because the problems are sparse, most variables in Pareto optimal solutions are zero, making it difficult for algorithms to identify non-zero variables efficiently. This paper addresses the challenges posed by SLMOPs. First, we introduce objective functions customized to mine maximum and minimum candidate sets, which substantially improves the efficacy of frequent pattern mining: candidate sets are selected not by the number of non-zero variables they contain but by the proportion of non-zero variables within specific dimensions. Second, we present a novel approach to association rule mining that explores the relationships between non-zero variables and helps identify sparse distributions that can expedite reductions in the objective function values. We extensively tested our algorithm on eight benchmark problems and four real-world SLMOPs. The results demonstrate that our approach achieves competitive solutions across a variety of challenges.
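As a rough illustration of the candidate-set idea, the following minimal Python sketch (not the paper's implementation) scores each decision dimension by the proportion of solutions in which it is non-zero and selects candidate dimensions by that proportion rather than by how many non-zero variables an individual solution contains. The function name mine_candidate_sets, the binary-mask representation, and the threshold values are illustrative assumptions.

import numpy as np

def mine_candidate_sets(masks, max_threshold=0.3, min_threshold=0.7):
    # masks: (N, D) 0/1 array; masks[i, j] == 1 if decision variable j is
    # non-zero in solution i. Each dimension is scored by the proportion of
    # solutions in which it is non-zero, and the candidate sets are chosen by
    # that proportion (both thresholds here are hypothetical values).
    support = masks.mean(axis=0)                                      # per-dimension non-zero frequency
    maximum_candidate_set = np.flatnonzero(support >= max_threshold)  # broad pool of likely non-zero dims
    minimum_candidate_set = np.flatnonzero(support >= min_threshold)  # high-confidence non-zero dims
    return maximum_candidate_set, minimum_candidate_set

# Toy usage: four sparse solutions over ten decision variables.
masks = np.array([
    [1, 0, 1, 0, 0, 0, 0, 0, 1, 0],
    [1, 0, 1, 0, 0, 0, 0, 0, 0, 0],
    [1, 0, 0, 0, 0, 1, 0, 0, 0, 0],
    [1, 0, 1, 0, 0, 0, 0, 0, 1, 0],
])
max_set, min_set = mine_candidate_sets(masks)
print("maximum candidate set:", max_set)   # dims 0, 2, 8
print("minimum candidate set:", min_set)   # dims 0, 2

In this spirit, the mined sets could restrict which decision variables the evolutionary operators perturb, so search effort concentrates on dimensions likely to be non-zero.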

     

    Figures (2) / Tables (8)

    Article Metrics

    Article views: 170; PDF downloads: 41

    Highlights

    • We propose a new frequent pattern mining (FPM) technique for MOEAs. This approach achieves two main objectives: first, it significantly reduces the decision space by curtailing the dimensionality of the maximal candidate set; second, it helps the minimum candidate set precisely locate the positions of non-zero variables
    • We integrate association rule mining (ARM) into SLMOEAs to investigate the interdependence between decision variables. First, ARM identifies more non-zero variable positions by extracting links to non-zero variables in the minimum candidate set. Second, by mining correlations between decision variables, ARM helps identify sparse distributions that rapidly improve the objective function values, aiding the population in finding solutions with superior objective values within limited computing resources (see the sketch after this list)
    • Building on the FPM and ARM techniques, we propose EPMEA, an algorithm for addressing SLMOPs by integrating pattern mining methodologies. Experimental results on eight benchmark SMOPs and four real-world problems demonstrate EPMEA’s competitive performance
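    As a toy illustration of the ARM highlight above, the Python sketch below mines pairwise rules i -> j over non-zero positions, keeping those whose support and confidence exceed hypothetical thresholds, and uses the mined rules to suggest additional non-zero positions starting from the minimum candidate set. Function names and threshold values are assumptions for illustration, not the paper's.

import numpy as np
from itertools import permutations

def mine_pairwise_rules(masks, min_support=0.3, min_confidence=0.8):
    # masks: (N, D) 0/1 array of non-zero indicators per solution.
    # Keep a rule i -> j when its support P(i and j non-zero) and its
    # confidence P(j non-zero | i non-zero) both meet the (hypothetical)
    # thresholds.
    rules = []
    for i, j in permutations(range(masks.shape[1]), 2):
        support_ij = np.mean(masks[:, i] & masks[:, j])
        support_i = masks[:, i].mean()
        if support_i == 0:
            continue
        confidence = support_ij / support_i
        if support_ij >= min_support and confidence >= min_confidence:
            rules.append((i, j, confidence))
    return rules

def expand_candidates(minimum_candidate_set, rules):
    # Suggest extra non-zero positions implied by rules whose antecedent is
    # already among the suggested positions (seeded with the minimum candidate set).
    suggested = set(minimum_candidate_set)
    for i, j, _ in rules:
        if i in suggested:
            suggested.add(j)
    return sorted(suggested)

# With the toy masks from the earlier sketch, this yields rules such as
# 2 -> 0 (whenever variable 2 is non-zero, variable 0 is non-zero as well).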
