Google's 200M-parameter time-series foundation model with 16k context
codepawl
22 points
8 comments
March 31, 2026
Related Discussions
Found 5 related stories in 52.5ms across 3,471 title embeddings via pgvector HNSW
- Google releases Gemma 4 open models jeffmcjunkin · 1306 pts · April 02, 2026 · 45% similar
- Flash-MoE: Running a 397B Parameter Model on a Laptop mft_ · 332 pts · March 22, 2026 · 45% similar
- Show HN: Timber – Ollama for classical ML models, 336x faster than Python kossisoroyce · 85 pts · March 02, 2026 · 44% similar
- Gemini Embedding 2: natively multimodal embedding model panarky · 22 pts · March 10, 2026 · 43% similar
- FFmpeg at Meta: Media Processing at Scale sudhakaran88 · 239 pts · March 09, 2026 · 42% similar
Discussion Highlights (3 comments)
Foobar8568
Somehow I missed that one. Is there any competition in this space? I've always had difficulty with ML and time series, so I'll need to try this out.
EmilStenstrom
Here is the link that actually describes what this is: https://github.com/google-research/timesfm?tab=readme-ov-fil...
EmilStenstrom
I somehow find the concept of a general time series model strange. How can the same model reliably predict both egg prices in Italy and global inflation? And how would you even use this model, given that there are no explanations to help you trust where a prediction comes from…
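On the "how would you even use this" question: models like TimesFM are driven zero-shot — you hand them a raw history window (the context) and ask for a horizon of future points, with no per-series fitting step. The sketch below is a minimal stand-in, not the real timesfm API: `NaiveSeasonalForecaster` is a hypothetical placeholder that just repeats the last observed seasonal cycle, included only to make the call shape (context in, horizon out) concrete and runnable.

```python
import numpy as np

class NaiveSeasonalForecaster:
    """Hypothetical stand-in for a zero-shot forecaster like TimesFM.

    The real model is a 200M-parameter transformer with a 16k context
    window; here we merely repeat the last observed seasonal cycle so
    the zero-shot call shape is runnable without any model weights.
    """

    def __init__(self, context_len=512, horizon_len=64):
        self.context_len = context_len  # max trailing points to look at
        self.horizon_len = horizon_len  # number of future points to emit

    def forecast(self, series, season=12):
        # Keep at most `context_len` trailing points as the context window.
        context = np.asarray(series, dtype=float)[-self.context_len:]
        last_cycle = context[-season:]
        # Tile the last cycle until it covers the requested horizon.
        reps = int(np.ceil(self.horizon_len / season))
        return np.tile(last_cycle, reps)[: self.horizon_len]

# Zero-shot usage: no training step, just history in -> forecast out.
history = np.sin(np.arange(120) * 2 * np.pi / 12)  # monthly-like seasonality
model = NaiveSeasonalForecaster(context_len=96, horizon_len=24)
pred = model.forecast(history, season=12)
print(pred.shape)  # (24,)
```

The point of the foundation-model claim is that the same pretrained weights replace `NaiveSeasonalForecaster` for eggs, inflation, or server metrics alike — which is exactly why the interpretability concern above is fair: the forecast arrives without a per-domain explanation.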