Google Research has pushed long-context time-series modeling forward with TimesFM 2.5, a compact 200M-parameter, decoder-only foundation model for forecasting and representation learning. The update expands the context window to 16k points to better capture long-range dependencies while aiming for faster, more practical inference than larger models. It also introduces an optional 30M-parameter continuous quantile head for probabilistic forecasts at horizons up to 1,000 points, alongside API and training changes such as removing the frequency indicator and adding new forecasting flags. Released openly via GitHub, Hugging Face, and BigQuery, TimesFM is positioned to broaden adoption across finance, IoT, and energy use cases.
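The continuous quantile head produces quantile forecasts rather than a single point prediction. As a minimal, self-contained illustration (plain NumPy, not TimesFM code), quantile forecasts are commonly scored with the pinball loss; the series and forecast values below are hypothetical:

```python
import numpy as np

def pinball_loss(y_true, y_pred, q):
    """Pinball (quantile) loss: under-prediction is weighted by q and
    over-prediction by (1 - q), so minimizing it yields the q-quantile."""
    diff = y_true - y_pred
    return np.mean(np.maximum(q * diff, (q - 1) * diff))

# Toy 4-step horizon: actuals vs. a hypothetical 0.9-quantile forecast.
y_true = np.array([10.0, 12.0, 11.0, 13.0])
y_pred = np.array([11.0, 13.0, 12.0, 14.0])  # forecast sits above every actual
loss = pinball_loss(y_true, y_pred, q=0.9)   # small penalty: 0.1 per step
```

Because the 0.9-quantile forecast over-predicts every point, each step is penalized only by the light (1 - 0.9) weight, which is the behavior that pushes a trained quantile output toward the upper tail.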
google-research / timesfm
TimesFM 2.5 updates the decoder-only architecture introduced in the ICML 2024 TimesFM paper: parameters drop from 500M to 200M, the maximum context length grows from 2,048 to 16k points, and an optional 30M-parameter continuous quantile head produces probabilistic forecasts at horizons up to 1,000 points. Other changes from 2.0 include removal of the frequency indicator, new forecasting flags, an upgraded inference API, and a planned Flax (JAX) version with restored covariate (XReg) support. Checkpoints, code, and usage examples are published on GitHub and Hugging Face, installation supports PyTorch or Flax backends, and the model is integrated into BigQuery.
This matters because compact long-context time-series models can improve forecasting in finance, IoT, energy, and other domains where long-range temporal patterns are crucial, and the 200M-parameter footprint may enable broader adoption and even on-device inference. The release also invites comparison and experimentation from the ML community on time-series benchmarks and practical deployments.
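To see why context length matters for long-range dependencies, consider a toy sketch (plain NumPy, not TimesFM's method) in which a series' dominant cycle is longer than a short look-back window. Even a simple seasonal-naive baseline can only exploit the cycle when the context contains at least one full period; all names and numbers here are illustrative:

```python
import numpy as np

def seasonal_naive_forecast(history, horizon, period):
    """Repeat the last full seasonal cycle observed in `history`."""
    cycle = history[-period:]
    reps = int(np.ceil(horizon / period))
    return np.tile(cycle, reps)[:horizon]

# A series whose dominant cycle (period 512) exceeds a short 128-point window.
period = 512
t = np.arange(4096)
series = np.sin(2 * np.pi * t / period)

horizon = 256
future = np.sin(2 * np.pi * np.arange(4096, 4096 + horizon) / period)

# A 128-point window cannot contain a full cycle, so the best the baseline
# can do is repeat a wrong, window-sized "period"; a 1,024-point window can
# use the true period.
short = seasonal_naive_forecast(series[-128:], horizon, period=128)
long_ = seasonal_naive_forecast(series[-1024:], horizon, period=period)

mae_short = np.mean(np.abs(short - future))  # large: cycle was never seen
mae_long = np.mean(np.abs(long_ - future))   # near zero: cycle fits in context
```

The same intuition motivates the jump from a 2,048- to a 16k-point context: patterns with very long periods or slow trends only become learnable once they fit inside the model's window.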