Google introduces Nested Learning, "a new ML paradigm for continual learning": "Nested Learning... represents a step forward in our understanding of deep learning. By treating architecture and optimization as a single, coherent system of nested optimization problems, we unlock a new dimension for design, stacking multiple levels. We believe the Nested Learning paradigm offers a robust foundation for closing the gap between the limited, forgetting nature of current LLMs and the remarkable continual learning abilities of the human brain."
Introducing Nested Learning: A new ML paradigm for continual learning that views models as nested optimization problems to enhance long context processing. Our proof-of-concept model, Hope, shows improved performance in language modeling. Learn more: goo.gle/47LJrzI @GoogleAI

Nov 7, 2025 · 8:12 PM UTC
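The idea quoted above, treating the optimizer itself as one level in a stack of nested optimization problems running at different update frequencies, can be illustrated with a toy two-level loop. Everything below (the quadratic loss, the rule for adapting the inner learning rate, all names) is an illustrative assumption, not Google's actual Hope implementation:

```python
# Toy sketch of two nested optimization levels.
# Inner level (fast): SGD fits w to minimize (w - target)^2.
# Outer level (slow): adapts the inner optimizer's learning rate,
# so the optimizer itself is treated as a learnable component.

def inner_loss(w, target=3.0):
    return (w - target) ** 2

def inner_grad(w, target=3.0):
    return 2.0 * (w - target)

def nested_train(outer_steps=20, inner_steps=5):
    w = 0.0    # fast parameter, updated every inner step
    lr = 0.01  # slow parameter: the inner loop's learning rate
    for _ in range(outer_steps):
        start = inner_loss(w)
        for _ in range(inner_steps):
            w -= lr * inner_grad(w)
        # Outer update on a slower timescale: grow lr while the
        # inner loop still makes progress, shrink it otherwise.
        lr *= 1.1 if inner_loss(w) < start else 0.5
    return w, lr

w, lr = nested_train()
print(round(w, 3))  # w approaches the target 3.0
```

The two update frequencies (every inner step vs. once per outer step) are the "stacked levels" in miniature; real systems in this paradigm would presumably stack more levels and learn far richer optimizer state than a single scalar learning rate.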

Replying to @deredleritt3r
Seems like a big one.
Replying to @deredleritt3r
Nested Learning sounds like a real step toward adaptive intelligence: models that evolve continuously instead of resetting every training cycle.
Replying to @deredleritt3r
so they basically turned deep learning into inception mode, love that
Replying to @deredleritt3r
@grok why is the grok button missing for all google posts
Replying to @deredleritt3r
SPX sixty nine hundred is the real revolutionary paradigm, changing finance, technology, and culture all at once. Deep learning models are impressive, but they still forget. Think about it: how many times have you asked ChatGPT the same question and gotten a different answer?
Replying to @deredleritt3r
Nested optimization problems could be a good way to improve long context processing.
Replying to @deredleritt3r
Nested Learning's nested optimization of architecture and optimization is peak maximalism. The system's true continual problem is managing its own baroque complexity.
Replying to @deredleritt3r
I think we'll be able to train SOTA models using fully automated strategies in the next decade.