余剑峤
James Jianqiao Yu

Lecturer (Assistant Professor)

Department of Computer Science

University of York

CSE/139, York YO10 5GH, United Kingdom

jqyu(at)ieee.org Google Scholar
Uncertainty-Aware Temporal Graph Convolutional Network for Traffic Speed Forecasting

Authors
Weizhu Qian, Thomas Dyhre Nielsen, Yan Zhao, Kim Guldstrand Larsen, and James Jianqiao Yu

Published in
IEEE Transactions on Intelligent Transportation Systems, Volume 25, Issue 8, August 2024

Abstract
Traffic speed forecasting has been a highly active research area, as it is essential for Intelligent Transportation Systems. Although a plethora of deep learning methods have been proposed for traffic speed forecasting, the majority can only make point-wise predictions, which may not provide enough information for critical real-world scenarios where prediction confidence also needs to be estimated, e.g., route planning for ambulances and rescue vehicles. To address this issue, we propose a novel uncertainty-aware deep learning method coined the Uncertainty-Aware Temporal Graph Convolutional Network (UAT-GCN). UAT-GCN employs an architecture based on a Graph Convolutional Network and a Gated Recurrent Unit to capture spatio-temporal dependencies. In addition, UAT-GCN includes a specialized regressor for estimating both epistemic (model-related) and aleatoric (data-related) uncertainty; in particular, it utilizes Monte Carlo dropout and predictive variances to estimate epistemic and aleatoric uncertainty, respectively. We also consider the recursive dependency between predictions to further improve forecasting performance. An extensive empirical study with real datasets offers evidence that the proposed model advances the current state of the art in point-wise forecasting and quantifies prediction uncertainty with high reliability. The obtained results suggest that, compared to existing methods, the RMSE and MAE of the proposed model on the SZ-taxi dataset are reduced by 2.15% and 7.23%, respectively, and on the Los-loop dataset by 4.17% and 8.53%, respectively.
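
The uncertainty-quantification recipe described in the abstract (Monte Carlo dropout for epistemic uncertainty, a predicted variance head for aleatoric uncertainty) can be sketched generically. The following is a minimal PyTorch illustration, not the paper's UAT-GCN: the `MCDropoutRegressor` class, its layer sizes, the dropout rate, and `mc_dropout_predict` are hypothetical stand-ins for the GCN/GRU backbone and the specialized regressor.

```python
# Hypothetical sketch of MC-dropout uncertainty estimation; a plain MLP
# stands in for the paper's GCN/GRU spatio-temporal encoder.
import torch
import torch.nn as nn


class MCDropoutRegressor(nn.Module):
    def __init__(self, in_dim: int, hidden: int = 64, p: float = 0.2):
        super().__init__()
        self.body = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(), nn.Dropout(p),
            nn.Linear(hidden, hidden), nn.ReLU(), nn.Dropout(p),
        )
        self.mean_head = nn.Linear(hidden, 1)     # point prediction
        self.logvar_head = nn.Linear(hidden, 1)   # aleatoric (data) variance

    def forward(self, x: torch.Tensor):
        h = self.body(x)
        return self.mean_head(h), self.logvar_head(h)


@torch.no_grad()
def mc_dropout_predict(model: nn.Module, x: torch.Tensor, n_samples: int = 50):
    """Repeated stochastic forward passes with dropout kept active.

    Epistemic uncertainty = variance of the sampled means (model-related).
    Aleatoric uncertainty = mean of the predicted variances (data-related).
    """
    model.train()  # keep dropout stochastic at inference time
    means, variances = [], []
    for _ in range(n_samples):
        mu, logvar = model(x)
        means.append(mu)
        variances.append(logvar.exp())
    means = torch.stack(means)          # (n_samples, batch, 1)
    variances = torch.stack(variances)
    epistemic = means.var(dim=0)        # spread across dropout masks
    aleatoric = variances.mean(dim=0)   # average predicted noise variance
    return means.mean(dim=0), epistemic, aleatoric


# Usage with random features standing in for spatio-temporal embeddings.
model = MCDropoutRegressor(in_dim=16)
x = torch.randn(8, 16)
pred, epi, ale = mc_dropout_predict(model, x)
print(pred.shape, epi.shape, ale.shape)  # each torch.Size([8, 1])
```

The two uncertainty terms can then be summed to form a total predictive variance per forecast, which is what allows confidence-aware downstream use such as the route-planning scenario mentioned above.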