A journal of the IEEE and the CAA, publishing high-quality papers in English on original theoretical and experimental research and development in all areas of automation.

IEEE/CAA Journal of Automatica Sinica

  • JCR Impact Factor: 15.3, Top 1 (SCI Q1)
  • CiteScore: 23.5, Top 2% (Q1)
  • Google Scholar h5-index: 77, Top 5
Citation: L. Wang, K. Liu, and Y. Yuan, "GT-A2T: Graph tensor alliance attention network," IEEE/CAA J. Autom. Sinica, 2024. doi: 10.1109/JAS.2024.124863

GT-A2T: Graph Tensor Alliance Attention Network

doi: 10.1109/JAS.2024.124863


    Figures(3)  / Tables(2)
