PGCN: Progressive Graph Convolutional Network for Spatial-Temporal Traffic Forecasting


This is a PyTorch implementation of the Progressive Graph Convolutional Network from the paper "PGCN: Progressive Graph Convolutional Network for Spatial-Temporal Traffic Forecasting". The paper is currently under review for KDD '22.

Progressive Graph Construction


Using adjusted cosine similarity values, the model constructs a progressive graph that reflects the changes in traffic states at each time step.
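As a rough illustration (not the repository's actual code), the sketch below shows one way such a similarity-based adjacency could be built from adjusted cosine similarity over a window of recent readings; the centering axis, thresholding, and normalization are assumptions.

```python
import torch

def progressive_adjacency(x, eps=1e-8):
    # x: (num_nodes, window) recent traffic readings per sensor.
    # Adjusted cosine similarity: center each time step across nodes
    # before measuring cosine similarity between node signal vectors.
    x_centered = x - x.mean(dim=0, keepdim=True)
    x_unit = x_centered / (x_centered.norm(dim=1, keepdim=True) + eps)
    sim = x_unit @ x_unit.t()                          # (N, N), values in [-1, 1]
    adj = torch.relu(sim)                              # keep positively correlated pairs
    return adj / (adj.sum(dim=1, keepdim=True) + eps)  # row-normalize
```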

Performance Comparison

Datasets

  • PeMS-Bay: Highway traffic speed data from 325 sensors in the Bay Area [1]
  • METR-LA: Highway traffic speed data from 207 sensors in Los Angeles [1]
  • Urban-core: Urban traffic speed data from 304 sensors in Seoul, South Korea [2]
  • Seattle: Highway traffic speed data from 323 sensors in the Greater Seattle area [3]

Results


Evaluation results on four real-world datasets show that our model consistently achieves state-of-the-art performance.

Code Implementation

The code for PGCN was implemented by modifying the code from Graph WaveNet (https://github.com/nnzhan/Graph-WaveNet) [4].
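For orientation, here is a minimal sketch of a Graph WaveNet-style diffusion graph convolution that accepts the progressive adjacency as an extra support; the module name, shapes, and hyperparameters are illustrative assumptions, not the repository's API.

```python
import torch
import torch.nn as nn

class ProgressiveGraphConv(nn.Module):
    """Diffusion-style graph convolution (as in Graph WaveNet) that takes
    the progressive adjacency alongside any static supports."""

    def __init__(self, c_in, c_out, num_supports, order=2):
        super().__init__()
        self.order = order
        # concatenate the input with `order` diffusion steps per support
        self.proj = nn.Linear(c_in * (num_supports * order + 1), c_out)

    def forward(self, x, supports):
        # x: (batch, num_nodes, c_in)
        # supports: list of (num_nodes, num_nodes) adjacencies, e.g. the
        # static graph plus the progressive graph built for this window
        out = [x]
        for a in supports:
            h = x
            for _ in range(self.order):
                h = torch.einsum('vw,bwc->bvc', a, h)  # propagate along edges
                out.append(h)
        return self.proj(torch.cat(out, dim=-1))
```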

References

[1] Li, Y., Yu, R., Shahabi, C., & Liu, Y. (2017). Diffusion convolutional recurrent neural network: Data-driven traffic forecasting. arXiv preprint arXiv:1707.01926.

[2] Shin, Y., & Yoon, Y. (2021). A Comparative Study on Basic Elements of Deep Learning Models for Spatial-Temporal Traffic Forecasting. arXiv preprint arXiv:2111.07513.

[3] Cui, Z., Henrickson, K., Ke, R., & Wang, Y. (2019). Traffic graph convolutional recurrent neural network: A deep learning framework for network-scale traffic learning and forecasting. IEEE Transactions on Intelligent Transportation Systems, 21(11), 4883-4894.

[4] Wu, Z., Pan, S., Long, G., Jiang, J., & Zhang, C. (2019). Graph wavenet for deep spatial-temporal graph modeling. arXiv preprint arXiv:1906.00121.