Professor
School of Computer Science and Technology
Harbin Institute of Technology (Shenzhen)
University Town of Shenzhen, Nanshan District, Shenzhen, Guangdong, China
Authors
Adnan Zeb, Shiyao Zhang, Xuetao Wei, and James J.Q. Yu*
Publication
Expert Systems with Applications, Volume 224, June 2024, Article 122962
Abstract
Exploiting spatial-temporal correlations has long been regarded as the cornerstone of traffic state prediction. Among existing techniques, temporal graph neural networks (TGNNs) have recently emerged as a prominent solution for modeling complex spatial-temporal correlations in traffic data. Existing studies on TGNNs mainly focus on developing new building blocks that embed hidden correlations into a unified latent representation, which is then mapped to predictions at distinct horizons. However, mapping the same latent features to distinct scalar predictions makes it difficult to compute gradients that update model parameters in the relevant directions. Moreover, TGNNs are biased towards shared temporal patterns and neglect the complex dependencies within each data series, which could otherwise be captured to enrich the latent features. To address these problems jointly, we propose a novel feature projection scheme for the traffic prediction framework of TGNNs. The proposed scheme is based on spatial convolutions that first generate horizon-specific feature maps and then transform them into scalar predictions for the corresponding horizons. These horizon-specific feature maps establish interactions between the unified latent representation and the corresponding output values, bringing the predictions closer to the true values. In addition, the scheme serves as a pattern modeling phase that enhances the expressivity of TGNNs by enriching the latent features with data source-wise patterns at distinct time steps. Comprehensive experiments on two real-world traffic datasets demonstrate that the proposed scheme improves predictive performance while reducing the number of model parameters in TGNNs.
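The horizon-specific projection head described in the abstract can be pictured with a minimal PyTorch sketch, assuming the TGNN encoder outputs a unified latent representation of shape (batch, channels, nodes) and that each prediction horizon receives its own convolutional feature map followed by a scalar readout per node. The class name, layer sizes, and tensor shapes below are illustrative assumptions, not the paper's exact implementation.

```python
import torch
import torch.nn as nn

class HorizonProjection(nn.Module):
    """Hypothetical sketch of a horizon-specific feature projection head.

    Input:  unified latent representation z of shape (batch, channels, nodes),
            as produced by a TGNN encoder.
    Output: predictions of shape (batch, horizons, nodes), i.e. one scalar
            per node and prediction horizon.
    """

    def __init__(self, channels: int, horizons: int):
        super().__init__()
        # One 1x1 convolution over the node axis per horizon: each generates a
        # horizon-specific feature map from the shared latent representation.
        self.feature_maps = nn.ModuleList(
            [nn.Conv1d(channels, channels, kernel_size=1) for _ in range(horizons)]
        )
        # A second convolution collapses each feature map to a scalar per node.
        self.readouts = nn.ModuleList(
            [nn.Conv1d(channels, 1, kernel_size=1) for _ in range(horizons)]
        )

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        preds = []
        for fmap, readout in zip(self.feature_maps, self.readouts):
            h = torch.relu(fmap(z))        # horizon-specific feature map
            preds.append(readout(h))       # (batch, 1, nodes)
        return torch.cat(preds, dim=1)     # (batch, horizons, nodes)

# Example usage with assumed dimensions (e.g. 207 sensors, 12 horizons)
if __name__ == "__main__":
    batch, channels, nodes, horizons = 8, 64, 207, 12
    z = torch.randn(batch, channels, nodes)   # unified latent representation
    head = HorizonProjection(channels, horizons)
    print(head(z).shape)  # torch.Size([8, 12, 207])
```

The design point this sketch illustrates is that each horizon owns its own feature map before the scalar readout, rather than all horizons sharing one linear mapping from the same latent features.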