Mixture of Online and Offline Experts for Non-Stationary Time Series


Abstract

We consider a general and realistic scenario involving non-stationary time series: a fixed offline time horizon consisting of several offline intervals with different distributions, followed by an online interval that continuously receives new samples. In non-stationary time series, the data distribution in the current online interval may have already appeared in previous offline intervals, and we theoretically explore the feasibility of transferring knowledge from the offline intervals to the current online interval. To this end, we propose the Mixture of Online and Offline Experts (MOOE). MOOE learns static offline experts from the offline intervals and maintains a dynamic online expert for the current online interval; a meta expert then adaptively combines the offline and online experts to make predictions on the samples received in the online interval. Our focus is theoretical: we derive parameter convergence guarantees, regret bounds, and generalization error bounds that establish the effectiveness of the algorithm.
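The abstract does not spell out the meta expert's combination rule, but a standard way to realize an adaptive mixture of this kind is a multiplicative-weight (Hedge-style) update over per-expert losses. The following is a minimal Python sketch under that assumption; the names `LinearExpert` and `MOOESketch`, the losses, and the learning rates `lr` and `eta` are illustrative choices, not the paper's actual method.

```python
import numpy as np

class LinearExpert:
    """Toy linear predictor. 'Offline' experts are fit beforehand and
    kept frozen; the 'online' expert keeps updating via SGD."""
    def __init__(self, dim, lr=0.01):
        self.w = np.zeros(dim)
        self.lr = lr

    def predict(self, x):
        return float(self.w @ x)

    def partial_fit(self, x, y):
        # One SGD step on the squared loss.
        self.w -= self.lr * (self.predict(x) - y) * x


class MOOESketch:
    """Meta expert mixing static offline experts with one dynamic
    online expert via multiplicative-weight (Hedge-style) updates."""
    def __init__(self, offline_experts, online_expert, eta=0.5):
        self.experts = list(offline_experts) + [online_expert]
        self.online = online_expert
        self.eta = eta
        n = len(self.experts)
        self.weights = np.full(n, 1.0 / n)  # uniform initial mixture

    def predict(self, x):
        preds = np.array([e.predict(x) for e in self.experts])
        return float(self.weights @ preds), preds

    def update(self, x, preds, y):
        # Downweight experts in proportion to their squared loss,
        # then renormalize the mixture.
        self.weights *= np.exp(-self.eta * (preds - y) ** 2)
        self.weights /= self.weights.sum()
        # Only the online expert learns from the new sample;
        # the offline experts stay static.
        self.online.partial_fit(x, y)


# Illustrative usage on synthetic data:
rng = np.random.default_rng(0)
offline = [LinearExpert(dim=4) for _ in range(3)]  # stand-ins for pre-trained offline experts
mooe = MOOESketch(offline, LinearExpert(dim=4))
for _ in range(100):
    x = rng.normal(size=4)
    y = x @ np.array([1.0, -2.0, 0.5, 0.0]) + 0.1 * rng.normal()
    y_hat, preds = mooe.predict(x)
    mooe.update(x, preds, y)
```

If the current online distribution matches one seen offline, the multiplicative weights concentrate on the corresponding frozen expert; otherwise the mass shifts toward the online expert as it adapts, which is the intuition behind the regret and generalization guarantees the paper analyzes.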
