An Empirical Study of Temporal Graph Neural Networks for Dynamic Node Forecasting
DOI: https://doi.org/10.33751/komputasi.v23i1.79

Abstract
Temporal graph modeling has become increasingly important for understanding and forecasting the dynamics of complex systems that evolve over time. A central challenge in temporal graph learning is identifying graph neural network (GNN) architectures that can effectively capture both spatial dependencies and temporal dynamics. This study presents a comprehensive benchmarking analysis of widely used GNN architectures, namely the Graph Convolutional Network (GCN), GraphSAGE, the Graph Attention Network (GAT), the Chebyshev Network (ChebNet), and the Simplified Graph Convolution Network (SGC), each integrated with a recurrent mechanism for temporal modeling. The evaluation is conducted on the WikiMaths dataset, a large-scale temporal graph dataset representing user visits to mathematics-related Wikipedia articles. Experimental results demonstrate that the choice of graph convolution operator significantly affects temporal forecasting performance, with GraphSAGE and ChebNet consistently outperforming the other architectures. This work provides empirical insights into the strengths and limitations of established temporal GNN models, contributing to a clearer understanding of their applicability to dynamic graph forecasting tasks.
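To make the "graph convolution integrated with a recurrent mechanism" idea concrete, the following is a minimal NumPy sketch of one time step of a GConvGRU-style update: a GCN-style normalized aggregation feeds the gates of a GRU cell. All names, shapes, and the toy graph are illustrative assumptions, not the authors' implementation or the actual WikiMaths pipeline.

```python
import numpy as np

def normalize_adjacency(A):
    """Symmetric GCN normalization: D^{-1/2} (A + I) D^{-1/2}."""
    A_tilde = A + np.eye(A.shape[0])
    d = A_tilde.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_tilde @ D_inv_sqrt

def gconv_gru_step(A_hat, X_t, H_prev, Wz, Wr, Wh, Uz, Ur, Uh):
    """One recurrent step with graph convolutions inside the GRU gates.

    A_hat : normalized adjacency (N x N)
    X_t   : node features at time t (N x F)
    H_prev: hidden state from time t-1 (N x H)
    W*, U*: input and hidden weight matrices (illustrative, randomly set below)
    """
    sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))
    Z = sigmoid(A_hat @ X_t @ Wz + A_hat @ H_prev @ Uz)        # update gate
    R = sigmoid(A_hat @ X_t @ Wr + A_hat @ H_prev @ Ur)        # reset gate
    H_tilde = np.tanh(A_hat @ X_t @ Wh + A_hat @ (R * H_prev) @ Uh)
    return Z * H_prev + (1.0 - Z) * H_tilde                    # new hidden state

# Toy example: a 4-node ring graph, 2 input features, hidden size 3.
rng = np.random.default_rng(0)
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
A_hat = normalize_adjacency(A)
X_t = rng.normal(size=(4, 2))
H_prev = np.zeros((4, 3))
Wz, Wr, Wh = [rng.normal(size=(2, 3)) for _ in range(3)]
Uz, Ur, Uh = [rng.normal(size=(3, 3)) for _ in range(3)]
H_new = gconv_gru_step(A_hat, X_t, H_prev, Wz, Wr, Wh, Uz, Ur, Uh)
print(H_new.shape)  # one hidden vector per node
```

Swapping the `A_hat @ X @ W` aggregation for a sampled-neighbor mean (GraphSAGE), an attention-weighted sum (GAT), or a Chebyshev polynomial filter (ChebNet) yields the other operator variants the study compares, with the recurrent gating left unchanged.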
License
Copyright (c) 2026 Komputasi: Jurnal Ilmiah Ilmu Komputer dan Matematika

This article is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.