A journal of IEEE and CAA, publishing high-quality papers in English on original theoretical/experimental research and development in all areas of automation
Volume 9, Issue 3, Mar. 2022

IEEE/CAA Journal of Automatica Sinica

  • JCR Impact Factor: 15.3, Top 1 (SCI Q1)
  • CiteScore: 23.5, Top 2% (Q1)
  • Google Scholar h5-index: 77, Top 5
Citation: Y. D. Wang, Z. F. Zhang, and Y. H. Lin, “Multi-cluster feature selection based on isometric mapping,” IEEE/CAA J. Autom. Sinica, vol. 9, no. 3, pp. 570–572, Mar. 2022. doi: 10.1109/JAS.2021.1004398

Multi-Cluster Feature Selection Based on Isometric Mapping

doi: 10.1109/JAS.2021.1004398
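
The full text of the letter is not reproduced on this page, but the title combines two well-known techniques: the multi-cluster feature selection (MCFS) pipeline of Cai, Zhang, and He, and the Isomap embedding of Tenenbaum, de Silva, and Langford. As a reading aid, here is a minimal sketch of that combination, assuming the standard MCFS recipe with Isomap substituted for the graph-Laplacian embedding; the function name and parameter defaults are illustrative and are not taken from the paper.

import numpy as np
from sklearn.manifold import Isomap
from sklearn.linear_model import Lars

def isomap_mcfs_scores(X, n_components=5, n_neighbors=5, n_nonzero=20):
    """Score each feature of X (samples x features); larger = more relevant.

    Illustrative sketch only: MCFS-style scoring with Isomap in place of
    the Laplacian-eigenmap embedding used by the original MCFS algorithm.
    """
    # Step 1: embed the data so that geodesic (manifold) distances are
    # approximately preserved, which is what Isomap does.
    Y = Isomap(n_neighbors=n_neighbors,
               n_components=n_components).fit_transform(X)
    # Step 2: regress each embedding dimension on the original features
    # with least angle regression, yielding a sparse coefficient vector.
    coefs = np.zeros((n_components, X.shape[1]))
    for k in range(n_components):
        coefs[k] = Lars(n_nonzero_coefs=n_nonzero).fit(X, Y[:, k]).coef_
    # Step 3: MCFS-style score = maximum absolute coefficient a feature
    # receives across all embedding dimensions.
    return np.abs(coefs).max(axis=0)

Selecting the d most relevant features then reduces to np.argsort(scores)[::-1][:d], mirroring how MCFS ranks features by their largest sparse-regression coefficient across embedding dimensions.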
