
Granite Time Series:
focused models for temporal data.

Granite Time Series is IBM's specialized model family for forecasting, anomaly detection, classification, and representation learning on structured temporal signals. It is a strong fit when the task is measurable, shaped by a specific domain, and better served by compact sequence models than by general-purpose LLMs.

Forecasting & Detection
Compact Foundation Models

Available Models

Family Variants

Granite Time Series TTM-R1

TinyTimeMixer Forecasting

A tiny forecasting model for fast zero-shot baselines and lightweight adaptation.

View Specs →

Granite Time Series TTM-R2

TinyTimeMixer Forecasting

An improved tiny forecaster with broader practical coverage for business and operations data.

View Specs →

Granite Time Series TSPulse-R1

Temporal Encoder

A representation model for anomaly detection, classification, clustering, and similarity search.

View Specs →
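To make the anomaly-detection use case concrete, here is a minimal, model-free sketch of the downstream scoring step: flag points that deviate sharply from a trailing window. The `zscore_anomalies` helper is hypothetical and purely illustrative; TSPulse itself learns representations rather than relying on raw rolling statistics.

```python
import statistics

def zscore_anomalies(series, window, threshold=3.0):
    """Flag indices whose value deviates strongly from the trailing window."""
    flagged = []
    for i in range(window, len(series)):
        past = series[i - window:i]
        mean = statistics.fmean(past)
        std = statistics.pstdev(past)
        # Skip constant windows (std == 0) to avoid division by zero.
        if std > 0 and abs(series[i] - mean) / std > threshold:
            flagged.append(i)
    return flagged

series = [10.0] * 20 + [10.5] + [10.0] * 10
series[25] = 30.0                            # injected spike
print(zscore_anomalies(series, window=10))   # → [25]
```

A learned encoder replaces the rolling statistics with embeddings, but the detection boundary (score, threshold, flag) keeps this same shape.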

Granite Time Series FlowState-R1

Forecasting Foundation Model

A flexible forecasting backbone for changing horizons, granularities, and planning windows.

View Specs →

Granite Time Series PatchTST

Transformer Baseline

A patch-based transformer baseline for long-horizon and multivariate forecasting work.

View Specs →
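As a concrete illustration of the "patch-based" idea, the sketch below splits a series into fixed-length, overlapping patches, each of which would become one transformer token. `patchify` is a hypothetical helper chosen for illustration, not the model's actual API.

```python
def patchify(series, patch_length, stride):
    """Split a 1-D series into (possibly overlapping) fixed-length patches."""
    patches = []
    for start in range(0, len(series) - patch_length + 1, stride):
        patches.append(series[start:start + patch_length])
    return patches

# A 12-step series with patch_length=4 and stride=2 yields 5 patch tokens.
series = list(range(12))
patches = patchify(series, patch_length=4, stride=2)
print(len(patches))              # → 5
print(patches[0], patches[-1])   # → [0, 1, 2, 3] [8, 9, 10, 11]
```

Tokenizing patches rather than single time steps is what lets patch-based transformers cover long horizons without attention over every raw observation.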

Granite Time Series PatchTSMixer

Mixer Baseline

An efficient mixer-style baseline for forecasting with simpler compute patterns.

View Specs →

Granite Time Series PatchTST-FM-R1

Foundation Model

A reusable PatchTST-based starting point for transfer-heavy forecasting workflows.

View Specs →

Why This Family Matters

Granite Time Series lets the site illustrate a broader AI truth: specialized models often win when the data shape, evaluation target, and operational boundary are already clear.

Teach that not every forecasting problem needs an LLM.

Granite Time Series is a useful counterexample to language-first thinking: it shows how specialized temporal models can be smaller, faster, and easier to evaluate honestly for forecasting and anomaly work.

Connect model choice to data shape.

This family helps readers understand that structured temporal signals reward architectures designed for sequences, windows, and timescales rather than open-ended text generation.
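The point about sequences and windows can be grounded with a small sketch: structured forecasting reduces neatly to slicing a series into (context, horizon) training pairs. `make_windows` is an illustrative helper under assumed names, not part of any Granite API.

```python
def make_windows(series, context, horizon):
    """Slice a series into (past, future) pairs for supervised forecasting."""
    pairs = []
    for start in range(len(series) - context - horizon + 1):
        past = series[start:start + context]
        future = series[start + context:start + context + horizon]
        pairs.append((past, future))
    return pairs

series = list(range(10))
pairs = make_windows(series, context=4, horizon=2)
print(len(pairs))    # → 5
print(pairs[0])      # → ([0, 1, 2, 3], [4, 5])
```

Because the target is a fixed numeric window, evaluation is a direct error measurement over `future`, which is exactly the kind of measurable objective this family is built for.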

Make compact foundation models feel concrete.

TinyTimeMixer, TSPulse, and FlowState make it easier to explain that foundation-model ideas also apply to time-series systems, not just chat assistants.

Position it as a practical analytics family.

The page should frame Granite Time Series as a strong option for forecasting, monitoring, operations, finance, and industrial telemetry where evaluation is structured and measurable.
