Pushing AI to its limit
Reconstructed an LSTM model that predicts time-series traffic volumes from large augmented datasets. The model is trained on volume-only, time-augmented, and multivariable-augmented (weather, crashes, road conditions, etc.) datasets spanning 30,000+ rows.
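A minimal sketch of such a model, assuming PyTorch; the layer sizes, window length, and feature count are illustrative, not the study's actual hyperparameters:

```python
import torch
import torch.nn as nn

class TrafficLSTM(nn.Module):
    """Minimal LSTM regressor: a window of past feature vectors -> next volume."""
    def __init__(self, n_features: int, hidden_size: int = 64, num_layers: int = 2):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden_size, num_layers, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x):             # x: (batch, seq_len, n_features)
        out, _ = self.lstm(x)         # out: (batch, seq_len, hidden_size)
        return self.head(out[:, -1])  # regress the next-step volume

# n_features is 1 for volume-only data and grows as time, weather,
# crash, and road-condition features are appended.
model = TrafficLSTM(n_features=6)
pred = model(torch.randn(32, 24, 6))  # 32 windows of 24 past time steps
```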
Simulate traffic for the next day, week, or decade
I was part of the BYU Transportation team, collaborating on a paper assessing how an LSTM model performs at predicting traffic volumes when trained on datasets featuring pure traffic volume, time-series traffic volume, and time-series traffic volume augmented with weather data and road conditions. The results compared test accuracy, training performance, and recurrent prediction performance.
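The three training configurations can be pictured as feature subsets of one table. A hedged sketch, with hypothetical column names standing in for the study's real schema:

```python
import pandas as pd

# Hypothetical schema; the paper's actual columns and encodings may differ.
df = pd.DataFrame({
    "volume": [420, 380, 510],
    "hour": [7, 8, 9],
    "day_of_week": [1, 1, 1],
    "month": [6, 6, 6],
    "temperature": [18.0, 19.5, 21.0],
    "precipitation": [0.0, 0.0, 0.2],
    "crash_count": [0, 1, 0],
    "road_condition": [0, 0, 1],   # encoded category, e.g. 0 = dry
})

# The three configurations compared in the paper.
variants = {
    "volume_only":    ["volume"],
    "time_augmented": ["volume", "hour", "day_of_week", "month"],
    "multivariable":  ["volume", "hour", "day_of_week", "month", "temperature",
                       "precipitation", "crash_count", "road_condition"],
}

datasets = {name: df[cols].to_numpy("float32") for name, cols in variants.items()}
```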
98%
Faster training speed: from 180 s/epoch to 3 s/epoch.
40,000+ data points processed in seconds
The initial model took 2 hours to train, which made progressing our research infeasible. I improved the LSTM model's accuracy while cutting its running time using multi-GPU training, embedding layers, multi-threaded data loading, batching, and minimized CPU-GPU synchronization (sketched below).
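One plausible combination of those techniques in PyTorch; the batch size, worker count, and model shape here are assumptions, and DataLoader's num_workers uses worker processes rather than threads:

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Placeholder tensors standing in for the real windowed dataset.
x_train = torch.randn(40_000, 24, 6)   # (samples, history steps, features)
y_train = torch.randn(40_000, 1)

loader = DataLoader(
    TensorDataset(x_train, y_train),
    batch_size=512,       # large batches amortize per-step overhead
    shuffle=True,
    num_workers=4,        # parallel worker processes keep the GPUs fed
    pin_memory=True,      # page-locked memory speeds host-to-device copies
)

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.lstm = nn.LSTM(6, 64, batch_first=True)
        self.head = nn.Linear(64, 1)
    def forward(self, x):
        out, _ = self.lstm(x)
        return self.head(out[:, -1])

model = nn.DataParallel(Net()).to(device)  # splits each batch across visible GPUs
opt = torch.optim.Adam(model.parameters())
loss_fn = nn.MSELoss()

model.train()
for xb, yb in loader:
    # non_blocking copies overlap transfer with compute, cutting CPU-GPU syncs
    xb = xb.to(device, non_blocking=True)
    yb = yb.to(device, non_blocking=True)
    opt.zero_grad(set_to_none=True)  # skips an extra gradient memset
    loss = loss_fn(model(xb), yb)
    loss.backward()
    opt.step()
```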
A powerful model with an industry-level benchmark
Constructed linear models as performance baselines, achieving an MSE as low as 60 when trained on 30,000 batches of time-augmented traffic data.
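A hedged sketch of such a baseline using scikit-learn, with random placeholder arrays standing in for the real windowed features; the printed MSE will not match the reported figure of 60:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
# Placeholder data standing in for flattened history windows (seq_len * n_features).
X = rng.normal(size=(40_000, 24 * 6)).astype("float32")
y = rng.normal(size=40_000).astype("float32")

X_train, X_test = X[:30_000], X[30_000:]
y_train, y_test = y[:30_000], y[30_000:]

baseline = LinearRegression().fit(X_train, y_train)
mse = mean_squared_error(y_test, baseline.predict(X_test))
print(f"linear baseline MSE: {mse:.2f}")
```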