A Unified Spatio-Temporal Inference Network for Car-Sharing Serial Prediction

Nihad Brahimi, Huaping Zhang*, Syed Danial Asghar Zaidi, Lin Dai

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Car-sharing systems require accurate demand prediction to ensure efficient resource allocation and scheduling decisions. However, developing precise predictive models for vehicle demand remains challenging because of the complex spatio-temporal relationships involved. This paper introduces USTIN, the Unified Spatio-Temporal Inference Prediction Network, a novel neural network architecture for demand prediction. The model consists of three key components: a temporal feature unit, a spatial feature unit, and a spatio-temporal feature unit. The temporal unit utilizes historical demand data and comprises four layers, each corresponding to a different time scale (hourly, daily, weekly, and monthly). The spatial unit incorporates contextual points-of-interest data to capture geographic demand factors around parking stations. The spatio-temporal unit incorporates weather data to model meteorological impacts across locations and time. We conducted extensive experiments on real-world car-sharing data. The proposed USTIN model effectively learned intricate temporal, spatial, and spatio-temporal relationships, and outperformed existing state-of-the-art approaches. Moreover, we employed negative binomial regression with uncertainty analysis to identify the most influential factors affecting car usage.
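The temporal unit described above draws on historical demand at four time scales. As a minimal sketch (not the paper's implementation), the four inputs can be read as lagged values of an hourly demand series; the offsets of 1, 24, 168, and roughly 720 hours are illustrative assumptions, not the paper's exact configuration:

```python
# Hypothetical sketch of the four time-scale inputs to a temporal unit,
# assuming an hourly demand series; offsets are illustrative only.

def temporal_views(demand, t):
    """Return the demand value one hour, day, week, and month before t."""
    offsets = {"hourly": 1, "daily": 24, "weekly": 168, "monthly": 720}
    return {scale: demand[t - k] for scale, k in offsets.items() if t - k >= 0}

# toy series: demand[i] = i, so each lagged value is easy to verify
demand = list(range(1000))
views = temporal_views(demand, 900)
# views == {"hourly": 899, "daily": 876, "weekly": 732, "monthly": 180}
```

Each of the four values would feed the corresponding layer of the temporal unit, letting the network weigh short-term fluctuations against daily, weekly, and monthly seasonality.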

Original language: English
Article number: 1266
Journal: Sensors
Volume: 24
Issue number: 4
DOIs
Publication status: Published - Feb 2024

Keywords

  • prediction
  • spatial feature
  • spatio-temporal feature
  • spatio-temporal inference
  • temporal features
  • uncertainty analysis
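The uncertainty-analysis keyword refers to the negative binomial regression the authors use to rank influential factors. As a hedged sketch of the underlying count distribution (a mean-dispersion parameterization; `mu` and `r` are illustrative names, not the paper's notation):

```python
import math

def nb_logpmf(y, mu, r):
    """Log-pmf of a negative binomial with mean mu and dispersion r.

    As r grows large this approaches a Poisson(mu); smaller r means more
    overdispersion, which count data such as station-level car-sharing
    demand often exhibit.
    """
    return (math.lgamma(y + r) - math.lgamma(r) - math.lgamma(y + 1)
            + r * math.log(r / (r + mu)) + y * math.log(mu / (r + mu)))

# sanity check: with r = 1 and mu = 1 the distribution is geometric,
# so the probability of observing zero counts is 0.5
p0 = math.exp(nb_logpmf(0, 1.0, 1.0))
```

A regression model would tie `mu` to covariates (e.g. via a log link) and maximize this log-likelihood; the fitted coefficients and their uncertainty then indicate which factors most affect car usage.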

