A library for learning probabilistic time-series models. The models are mostly assumed to be conditional Gaussians, whose means and covariances can depend non-linearly on latent variables. Models are learned with (approximate) expectation-maximization (EM) algorithms.
Clone the repo and install it:

```bash
git clone https://github.com/christiando/timeseries_models
cd timeseries_models
pip install .
```

If you want to run the notebooks, you also have to install the packages in `requirements.txt`:

```bash
pip install -r requirements.txt
```

It is recommended to do this in an isolated Python environment. Alternatively, one can build the Docker image that comes with the repo in `.devcontainer`.
State-space models are constructed from a state model and an observation model, which are then combined into a state-space model. The state-space model can then be learnt via an (approximate) expectation-maximization algorithm.

First, we construct the state-space model:
```python
# imports
from jax import config

config.update("jax_enable_x64", True)  # it is recommended to always run the code with double precision

from timeseries_models import state_model, observation_model, state_space_model

# load data as a jax numpy ndarray, either [T, C] or [B, T, C]
data_train = ...

# construct model
latent_dims = 2  # dimensions of the state space
data_dims = data_train.shape[-1]  # dimensions of the data space
sm = state_model.LinearStateModel(latent_dims)
om = observation_model.LinearObservationModel(data_dims, latent_dims)
ssm = state_space_model.StateSpaceModel(observation_model=om, state_model=sm)
```

Then learning the model is as simple as
```python
# fit model
ssm.fit(data_train)
```

To make predictions with the learnt model, one just needs to load the data and call `predict`:
```python
# predict
T_hist = 0
data_pred = ...  # load prediction data, [T_hist + T_pred, C] or [B, T_hist + T_pred, C]
result_pred = ssm.predict(data_pred, first_prediction_index=T_hist)
```

Note that this code relies heavily on the `gaussian_toolbox` library.
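To illustrate the expected data layout for prediction, here is a minimal sketch (the shapes and values below are made up for illustration; only the `[T_hist + T_pred, C]` convention comes from the library):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical series: 80 observed history steps followed by a
# 20-step horizon that the model should forecast.
T_hist, T_pred, C = 80, 20, 3
data_pred = rng.normal(size=(T_hist + T_pred, C))  # shape [T_hist + T_pred, C]

# With first_prediction_index = T_hist, the first T_hist rows serve as
# conditioning history and the remaining T_pred rows are forecast targets.
history, targets = data_pred[:T_hist], data_pred[T_hist:]
print(history.shape, targets.shape)  # (80, 3) (20, 3)
```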
First, check out the Tutorial notebook, which shows small examples.
The state models that are considered here have the form

$$z_{t} = \mu(z_{t-1}) + \zeta_{t},$$

where $\zeta_{t} \sim N(0, \Sigma_{z})$ is Gaussian noise with covariance $\Sigma_{z}$.
This is a linear state-transition model,

$$\mu(z_{t-1}) = A z_{t-1} + b,$$

with parameters $A$, $b$, and the noise covariance $\Sigma_{z}$.
This implements a linear + squared exponential mean (LSEM) state model,

$$\mu(z_{t-1}) = A\,\phi(z_{t-1}),$$

with the feature vector

$$\phi(z) = \left(z_{1}, \ldots, z_{D_z}, k(h_{1}(z)), \ldots, k(h_{K}(z))\right)^{\top}.$$

The kernel and linear activation function are given by

$$k(h) = \exp\left(-\tfrac{h^{2}}{2}\right), \qquad h_{i}(z) = w_{i}^{\top} z + w_{i,0}.$$

The parameters that need to be inferred are $A$, $\Sigma_{z}$, and the weights $\{w_{i}, w_{i,0}\}_{i=1}^{K}$.
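As an illustrative sketch (assuming a squared-exponential kernel $k(h)=\exp(-h^{2}/2)$ applied to linear activations; the weight values below are made up), the LSEM feature vector can be computed like this:

```python
import numpy as np

def lsem_features(z, W, w0):
    """Stack linear features z with kernel features k(h_i(z)) = exp(-h_i(z)**2 / 2),
    where h_i(z) = W[i] @ z + w0[i] is a linear activation."""
    h = W @ z + w0                 # linear activations, shape [K]
    k = np.exp(-0.5 * h**2)        # squared-exponential kernel features, shape [K]
    return np.concatenate([z, k])  # phi(z), shape [D_z + K]

# Made-up example: 2-dimensional state, 3 kernel features.
z = np.array([0.5, -1.0])
W = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
w0 = np.array([0.0, 0.5, -0.5])
phi = lsem_features(z, W, w0)
print(phi.shape)  # (5,)
```

The state mean is then obtained by a linear readout of these features, $\mu(z) = A\,\phi(z)$.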
This implements a linear + radial basis function mean (LRBF) state model,

$$\mu(z_{t-1}) = A\,\phi(z_{t-1}),$$

with the feature vector

$$\phi(z) = \left(z_{1}, \ldots, z_{D_z}, k(h_{1}(z)), \ldots, k(h_{K}(z))\right)^{\top}.$$

The kernel and linear activation function are given by

$$k(h) = \exp\left(-\sum_{d} \tfrac{h_{d}^{2}}{2}\right), \qquad h_{i}(z) = \frac{z - \mu_{i}}{l}.$$

The parameters that need to be inferred are $A$, $\Sigma_{z}$, the centers $\mu_{i}$, and the length scale $l$.
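A minimal sketch of the corresponding RBF features (assuming Gaussian bumps around learned centers; the centers and length scale below are made up):

```python
import numpy as np

def lrbf_features(z, centers, length_scale):
    """Stack linear features z with radial basis function features
    k_i(z) = exp(-||z - mu_i||**2 / (2 * length_scale**2))."""
    h = (z - centers) / length_scale          # shape [K, D_z]
    k = np.exp(-0.5 * np.sum(h**2, axis=-1))  # RBF features, shape [K]
    return np.concatenate([z, k])             # phi(z), shape [D_z + K]

# Made-up example: 2-dimensional state, 2 RBF centers.
z = np.array([0.0, 1.0])
centers = np.array([[0.0, 1.0], [1.0, -1.0]])
phi = lrbf_features(z, centers, length_scale=1.0)
print(phi.shape)  # (4,)
```

A kernel centered at the current state evaluates to 1, and decays with distance from its center.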
The observation models that are considered here have the form

$$x_{t} = g(z_{t}) + \xi_{t},$$

where $\xi_{t} \sim N(0, \Sigma_{x})$ is Gaussian observation noise. The linear observation model is

$$g(z_{t}) = C z_{t} + d,$$

with parameters $C$, $d$, and $\Sigma_{x}$.
For the LSEM observation model, the non-linearity has the same form as in `LSEMStateModel`. For the LRBF observation model, the non-linearity has the same form as in `LRBFMStateModel`.
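To make the generative structure concrete, here is a self-contained sketch that simulates from a fully linear-Gaussian state-space model (all parameter values are made up for illustration; this mirrors the model equations, not the library's internals):

```python
import numpy as np

def simulate_linear_ssm(A, b, Sigma_z, C, d, Sigma_x, T, rng):
    """Simulate z_t = A z_{t-1} + b + zeta_t and x_t = C z_t + d + xi_t,
    with zeta_t ~ N(0, Sigma_z) and xi_t ~ N(0, Sigma_x)."""
    D_z, D_x = A.shape[0], C.shape[0]
    z = np.zeros(D_z)
    zs, xs = [], []
    for _ in range(T):
        z = A @ z + b + rng.multivariate_normal(np.zeros(D_z), Sigma_z)
        x = C @ z + d + rng.multivariate_normal(np.zeros(D_x), Sigma_x)
        zs.append(z)
        xs.append(x)
    return np.stack(zs), np.stack(xs)

rng = np.random.default_rng(0)
A = np.array([[0.99, 0.1], [-0.1, 0.99]])  # slowly rotating latent dynamics
b = np.zeros(2)
Sigma_z = 0.01 * np.eye(2)
C = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])  # 2D state -> 3D observations
d = np.zeros(3)
Sigma_x = 0.1 * np.eye(3)
zs, xs = simulate_linear_ssm(A, b, Sigma_z, C, d, Sigma_x, T=100, rng=rng)
print(zs.shape, xs.shape)  # (100, 2) (100, 3)
```

Data generated this way has exactly the `[T, C]` layout expected by `ssm.fit`.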
The library was mainly developed for the publication below. If you use the library, please cite:
```bibtex
@article{DONNER2025,
  title   = {A projected nonlinear state-space model for forecasting time series signals},
  journal = {International Journal of Forecasting},
  year    = {2025},
  issn    = {0169-2070},
  doi     = {https://doi.org/10.1016/j.ijforecast.2025.01.002},
  url     = {https://www.sciencedirect.com/science/article/pii/S0169207025000020},
  author  = {Christian Donner and Anuj Mishra and Hideaki Shimazaki}
}
```