Research
December 13, 2025

Towards Efficient Foundation Model: A Novel Time Series Embedding


High-performance forecasting shouldn't require a supercomputer.

Current Time Series Foundation Models (TSFMs) are powerful but resource-heavy, requiring massive datasets and expensive compute infrastructure. Traditional models are cheap but lack scalability.  

MIT and Ikigai Labs jointly present a resource-efficient TSFM architecture. This research paper introduces a novel method to embed time series of any length and scale by mapping them into a 2D image format on the unit square. The result is the best of both worlds: foundation model performance at a fraction of the cost.
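To make the idea concrete, here is a minimal sketch of the core normalization step: mapping a time series of arbitrary length and scale to points inside the unit square. The function name and the min-max normalization choice are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def embed_unit_square(series):
    """Illustrative sketch (not the paper's exact method): map a 1D time
    series of any length and scale to 2D points in [0, 1] x [0, 1].

    The time index is rescaled to [0, 1] and the values are min-max
    normalized, so series of different lengths and magnitudes land in
    the same unit square.
    """
    y = np.asarray(series, dtype=float)
    n = len(y)
    # Time axis: n evenly spaced points spanning [0, 1].
    t = np.linspace(0.0, 1.0, n)
    # Value axis: min-max normalize; a constant series maps to 0.5.
    lo, hi = y.min(), y.max()
    v = np.full(n, 0.5) if hi == lo else (y - lo) / (hi - lo)
    return np.column_stack([t, v])

# Two series with very different scales produce comparable embeddings.
pts_small = embed_unit_square([1.0, 3.0, 2.0, 4.0])
pts_large = embed_unit_square([100.0, 300.0, 200.0, 400.0])
```

Because every series is squeezed into the same bounded 2D domain, a downstream model can consume the embedding as an image-like input without caring about the original length or units.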

Key Findings:

  • Versatility: Handles variable lengths and scales seamlessly.  
  • Efficiency: Significantly reduces the computational load compared to standard TSFMs.  
  • Efficacy: In model identification tasks, our proposed embeddings perform comparably to those from resource-intensive pre-trained models.

Read the research paper to see how we are enabling the next generation of compute-efficient TSFMs.

