Tech News

OpenTSLM: Language Models That Understand Time-Series (Stanford, ETH, Google)


Time-Series Language Models (TSLMs) are multimodal foundation models that treat time series as a native modality alongside text, enabling direct reasoning, explanation, and forecasting over temporal data in natural language.
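To make the idea of time series as a "native modality" concrete, here is a minimal sketch of one common approach: splitting a raw series into patches and projecting each patch into the language model's text-embedding space, so the resulting vectors can be interleaved with ordinary token embeddings. The patch length, embedding size, and projection here are illustrative assumptions, not OpenTSLM's actual design.

```python
import numpy as np

PATCH_LEN = 16   # samples per patch (assumed)
D_MODEL = 64     # text-embedding dimension (assumed)

rng = np.random.default_rng(0)
# In a trained model this projection is learned; here it is random.
W = rng.normal(scale=0.02, size=(PATCH_LEN, D_MODEL))

def embed_series(series: np.ndarray) -> np.ndarray:
    """Patch a 1-D series and project each patch into d_model space."""
    n = len(series) // PATCH_LEN * PATCH_LEN
    patches = series[:n].reshape(-1, PATCH_LEN)       # (num_patches, PATCH_LEN)
    # Per-patch normalization, as is common for time-series tokenizers.
    patches = (patches - patches.mean(axis=1, keepdims=True)) / (
        patches.std(axis=1, keepdims=True) + 1e-6)
    return patches @ W                                # (num_patches, D_MODEL)

series = np.sin(np.linspace(0, 8 * np.pi, 128))       # example signal
ts_embeddings = embed_series(series)
print(ts_embeddings.shape)  # (8, 64): eight "time-series tokens"
```

The eight resulting vectors live in the same space as text embeddings, which is what lets a single transformer attend jointly over a sensor trace and a natural-language question about it.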

Our research shows order-of-magnitude gains in temporal reasoning while running on smaller, faster backbones. TSLMs are not an add-on. They're a new modality for AI.