Traditional evaluation metrics provide numerical values but often lack comprehensibility, hindering effective differentiation of model performance. Our work addresses this challenge by introducing overlay_dx, a novel evaluation metric for measuring the performance of time series prediction models. Overlay_dx is a visual metric that represents the percentage of predictions falling within a confidence interval around the actual values. Additionally, once the evaluation results are plotted, overlay_dx computes the area under the overlay curve, providing a quantitative measure of the alignment between predicted and actual values across different thresholds and predictions. Through extensive experiments, we demonstrate that our approach offers a unified evaluation framework combining visual and numerical assessments, enabling improved model comparison and providing valuable insights for further research and optimization efforts in time series prediction.
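To make the idea concrete, the following is a minimal sketch of how a metric of this kind could be computed, not the authors' reference implementation: it measures the fraction of predictions within an absolute tolerance of the actual values, sweeps that tolerance over a grid of thresholds to form an overlay curve, and integrates the curve. The function names (`overlay_fraction`, `overlay_curve`, `overlay_dx_score`), the absolute-error tolerance, and the normalization over the threshold range are all assumptions for illustration.

```python
import numpy as np


def overlay_fraction(y_true, y_pred, tolerance):
    """Fraction of predictions within +/- tolerance of the actual values (assumed absolute error)."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return float(np.mean(np.abs(y_pred - y_true) <= tolerance))


def overlay_curve(y_true, y_pred, tolerances):
    """Overlay fraction evaluated at each tolerance threshold (the plotted 'overlay curve')."""
    return np.array([overlay_fraction(y_true, y_pred, t) for t in tolerances])


def overlay_dx_score(y_true, y_pred, tolerances):
    """Area under the overlay curve via the trapezoidal rule, normalized to [0, 1] over the threshold range."""
    tolerances = np.asarray(tolerances, dtype=float)
    fractions = overlay_curve(y_true, y_pred, tolerances)
    area = np.sum((fractions[1:] + fractions[:-1]) / 2.0 * np.diff(tolerances))
    return area / (tolerances[-1] - tolerances[0])


if __name__ == "__main__":
    # Hypothetical example: noisy predictions of a sine wave.
    rng = np.random.default_rng(0)
    actual = np.sin(np.linspace(0, 10, 200))
    predicted = actual + rng.normal(scale=0.1, size=actual.shape)
    thresholds = np.linspace(0.01, 0.5, 50)  # assumed tolerance grid
    print(f"overlay_dx score: {overlay_dx_score(actual, predicted, thresholds):.3f}")
```

A score near 1 would mean most predictions already fall within even the tightest thresholds, while a score near 0 would mean they rarely fall inside the widest one; the curve itself provides the visual comparison described above.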
2025 - International Conference on Optimization and Learning (OLA2025)