
STPC-Net: Learn Massive Geo-sensory Data as Spatio-Temporal Point Clouds

This is the implementation of STPC-Net, as described in the following paper:
Chuanpan Zheng, Cheng Wang, Xiaoliang Fan, Jianzhong Qi, and Xu Yan. "STPC-Net: Learn Massive Geo-sensory Data as Spatio-Temporal Point Clouds", IEEE Transactions on Intelligent Transportation Systems (T-ITS), 2021.

Data

The datasets are available on Baidu Yun (extraction code: eyxx) and should be placed in the corresponding data/ folder.

The GeoLife dataset contains 4,793,591 points generated by 69 users. Each point is associated with a sensor (user) id, latitude, longitude, timestamp, a 6-dimensional feature vector (the time interval and relative distance to the previous point, and the point's speed, acceleration, jerk, and heading change rate), and the corresponding transportation mode label (walk, bike, bus, drive, or train).

The PeMSD8 dataset contains a distance matrix and 3,035,520 points generated by 170 sensors. Each point is associated with a sensor id, timestamp, and a 3-dimensional feature vector (traffic flow, speed, and occupancy). This dataset does not include latitudes and longitudes, but provides the pairwise distances between sensors.
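
For clarity, the sketch below shows one way to lay out these point records in memory. The array names, dtypes, and class-label ordering are assumptions for illustration only; adapt them to however the downloaded files are actually packaged.

import numpy as np

# Hypothetical per-point arrays for GeoLife (names and dtypes are assumptions).
N = 4_793_591                                         # number of GeoLife points
geo_sensor_id = np.zeros(N, dtype=np.int32)           # user id
geo_latlon    = np.zeros((N, 2), dtype=np.float64)    # latitude, longitude
geo_timestamp = np.zeros(N, dtype=np.float64)         # timestamp of each point
geo_features  = np.zeros((N, 6), dtype=np.float32)    # time interval, relative distance, speed, acceleration, jerk, heading change rate
geo_mode      = np.zeros(N, dtype=np.int32)           # transportation mode label (walk/bike/bus/drive/train)

# Hypothetical layout for PeMSD8: no coordinates, but a sensor-to-sensor distance matrix.
M = 3_035_520                                         # number of PeMSD8 points
pems_sensor_id = np.zeros(M, dtype=np.int32)          # sensor id (170 sensors in total)
pems_timestamp = np.zeros(M, dtype=np.float64)        # timestamp of each point
pems_features  = np.zeros((M, 3), dtype=np.float32)   # traffic flow, speed, occupancy
pems_distances = np.zeros((170, 170), dtype=np.float32)  # pairwise distances between sensors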

Requirements

Python 3.7.10, tensorflow 1.14.0, numpy 1.16.4

Training

To train STPC-Net on either dataset, cd into the corresponding folder and run:

python train.py

Testing

We provide pre-trained model files for both datasets.

To evaluate STPC-Net on either dataset, cd into the corresponding folder and run:

python test.py

Results

The pre-trained models achieve the following performance:

GeoLife    Accuracy  Precision  Recall  F1-score
STPC-Net   82.87%    83.80%     82.20%  82.85%

PeMSD8     RMSE   MAE    MAPE
STPC-Net   24.62  15.09  9.62%
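
For reference, the PeMSD8 metrics above follow their standard definitions; the sketch below gives the usual formulas (the repository's own test.py may compute them with additional details, such as masking missing values).

import numpy as np

def rmse(y_true, y_pred):
    # Root mean squared error
    return np.sqrt(np.mean((y_pred - y_true) ** 2))

def mae(y_true, y_pred):
    # Mean absolute error
    return np.mean(np.abs(y_pred - y_true))

def mape(y_true, y_pred, eps=1e-8):
    # Mean absolute percentage error, in percent; eps guards against division by zero
    return np.mean(np.abs((y_pred - y_true) / (y_true + eps))) * 100.0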

Citation

If you find this repository useful in your research, please cite the following paper:

@article{STPC-Net:TITS,
  author   = "Chuanpan Zheng and Cheng Wang and Xiaoliang Fan and Jianzhong Qi and Xu Yan",
  title    = "STPC-Net: Learn Massive Geo-sensory Data as Spatio-Temporal Point Clouds",
  journal  = "IEEE Transactions on Intelligent Transportation Systems",
  year     = "2021"
}
