If you’re reading this, you’re already participating in thermodynamic computing. Your neurons are spending sunlight stored in your cells to turn symbols into meaning. Energy becomes information. Information becomes action. This is the invisible game.
Thermodynamic computing is here
There is a new computing paradigm emerging from the noise, and its arrival may be as significant as the dawn of deep learning or the advent of cloud virtualization. A new company, Extropic, has just launched its first thermodynamic computer, a device it calls a TSU, or Thermodynamic Sampling Unit. While the web is already filling with deep technical dives, what matters more for most of us is building a clear intuition for what this technology is, how it differs fundamentally from anything that’s come before, and why it’s generating so much excitement. This isn’t just another chip; it’s a new way to think about computation itself.
Seeing is Believing: Solving Puzzles in One Shot
To understand what a TSU does, let’s look at two classic constraint-satisfaction problems: Sudoku and the Eight Queens puzzle. When you or I solve a Sudoku, we use sequential logic, guess-and-check, and backtracking. We make an assumption, follow its logical consequences, and if we hit a dead end, we erase and try again. A classical computer does the same, just much faster. A TSU approaches this in a completely different way. Using a TSU simulator, you “program” the problem by first clamping the known values, the clues already on the board. Then you program in the constraints: no duplicate numbers in any row, column, or 3x3 square. With the problem thus defined, the TSU doesn’t “search” for a solution; it anneals one. In a single computational step, the solution simply emerges, backfilling all the empty squares correctly.
The same principle applies to the Eight Queens problem, the challenge of placing eight queens on a chessboard so that no queen can attack another. It’s a combinatorial problem with 92 distinct solutions, and a classical computer has to search for them iteratively. A TSU, by contrast, can be programmed with the constraints (the “anti-affinity” between queens sharing a row, column, or diagonal) and then set to sample the “solution space.” In this framing, a valid solution is one with a “problem energy” of zero. The TSU’s physical nature lets it settle naturally into these zero-energy states. A simulation of this process shows the TSU discovering all 92 unique solutions, demonstrating its ability not just to find an answer but to explore the entire landscape of correct answers. This is a fundamentally new approach, one that bypasses the brute-force, iterative methods we’ve relied on for decades.
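Annealing toward a zero-energy state can be sketched in ordinary software. The toy below is a hypothetical illustration, not Extropic’s actual programming model: it encodes the Eight Queens constraints as an energy function (the number of attacking pairs) and uses classical simulated annealing as a stand-in for the TSU’s physical settling, running until the energy reaches zero.

```python
import math
import random

def energy(rows):
    """Count attacking pairs of queens. With one queen per column and
    rows forming a permutation, only diagonal attacks are possible."""
    n = len(rows)
    return sum(
        1
        for c1 in range(n)
        for c2 in range(c1 + 1, n)
        if abs(rows[c1] - rows[c2]) == c2 - c1  # shared diagonal
    )

def anneal(n=8, t_start=2.0, t_floor=0.5, cooling=0.995, seed=0):
    """Simulated annealing: propose random swaps, accept downhill moves
    always and uphill moves with Boltzmann probability exp(-dE / t)."""
    rng = random.Random(seed)
    rows = list(range(n))
    rng.shuffle(rows)
    e = energy(rows)
    t = t_start
    while e > 0:
        i, j = rng.sample(range(n), 2)
        rows[i], rows[j] = rows[j], rows[i]
        e_new = energy(rows)
        if e_new <= e or rng.random() < math.exp((e - e_new) / t):
            e = e_new
        else:
            rows[i], rows[j] = rows[j], rows[i]  # undo the swap
        t = max(t * cooling, t_floor)
    return rows

solution = anneal()
print(solution, "energy:", energy(solution))
```

Because each queen occupies its own column and the rows form a permutation, row and column conflicts are impossible by construction; any placement the loop returns with energy zero is a valid solution.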
The Physics of Computation: Using Noise, Not Fighting It
This new power comes from a radical design philosophy. For the last 70 years, computing has been about one thing: order. We build chips that are deterministic, logical, and precise. The great enemies have always been noise, heat, and randomness, and we spend billions on cooling and error correction to eliminate them. Quantum computing, in many ways, is the ultimate expression of this, requiring temperatures near absolute zero to suppress thermal noise and preserve quantum coherence. Thermodynamic computing is the polar opposite. It doesn’t fight the noise; it uses it. The TSU is built on the insight that the natural, stochastic noise from “leaky” transistors, the very randomness we’ve tried to engineer out of existence, is itself a powerful computational resource.
Think of it this way: a GPU, the workhorse of today’s AI, has to simulate noise. When a generative model creates a new image or sentence, it relies on pseudorandom number generators to fake randomness. The TSU doesn’t need to fake it; it harnesses the actual physical randomness of thermodynamics. It is a piece of hardware that computes directly with probability. That makes it a hybrid, sitting somewhere between a purely analog computer (which might use light or sound waves to compute) and a digital GPU. It’s a physical device that leverages the laws of physics to find solutions, rather than using logic gates to simulate them.
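In software, “computing with probability” usually means drawing samples from a Boltzmann distribution, where a state with energy E appears with probability proportional to exp(-E/T). The sketch below is a generic Gibbs sampler, not Extropic’s API: it does with pseudorandom numbers what a TSU would do with physical transistor noise, sampling spin configurations of a coupled system so that low-energy states turn up most often.

```python
import math
import random

def gibbs_sample(J, h, steps=10000, temp=1.0, seed=0):
    """Sample +1/-1 spin states of the energy model
    E(s) = -sum_{i<j} J[i][j]*s_i*s_j - sum_i h[i]*s_i  (J symmetric),
    so states occur with Boltzmann probability proportional to exp(-E/temp)."""
    rng = random.Random(seed)
    n = len(h)
    s = [rng.choice([-1, 1]) for _ in range(n)]
    samples = []
    for _ in range(steps):
        i = rng.randrange(n)
        # local field on spin i: couplings to the other spins plus its bias
        field = h[i] + sum(J[i][j] * s[j] for j in range(n) if j != i)
        # conditional probability that spin i points up, given the rest
        p_up = 1.0 / (1.0 + math.exp(-2.0 * field / temp))
        s[i] = 1 if rng.random() < p_up else -1
        samples.append(tuple(s))
    return samples

# Two spins with a ferromagnetic coupling: the aligned states (+1,+1)
# and (-1,-1) have energy -1, the anti-aligned states have energy +1.
samples = gibbs_sample(J=[[0.0, 1.0], [1.0, 0.0]], h=[0.0, 0.0])
aligned = sum(s[0] == s[1] for s in samples) / len(samples)
print("fraction of aligned samples:", aligned)
```

At temperature 1 the aligned states should dominate the sample stream, in the long run at a ratio of about 1/(1 + e^-2), roughly 88%. A TSU, in effect, gets this distribution for free from physics instead of from a sampling loop.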
From a Lost Hiker to a Million Bouncy Balls
Perhaps the best way to build intuition is with a metaphor. Imagine that solving a complex optimization problem is like trying to find the lowest point in a 100-square-mile mountainous landscape. Classical computing, using an algorithm like gradient descent, is like being a single hiker dropped into that landscape at night. You have no map or satellite view, only an altimeter and the feel of the slope under your feet. You can take just one step at a time, always walking downhill, hoping you don’t get stuck in a small local valley while the true lowest canyon is miles away.
Thermodynamic computing takes a completely different approach. It’s like having a million bouncy balls and a helicopter. You drop all million balls simultaneously across the entire 100-square-mile landscape. Then you “turn on an earthquake,” shaking the whole system. The balls bounce and jostle, but as the shaking (the “annealing”) subsides, where do they end up? They naturally settle into the lowest points, and the balls that collect in the deepest valley represent the optimal solution. The TSU is, in essence, a physical device for dropping those million balls at once and letting the laws of thermodynamics find the lowest “energy” state for you.
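The metaphor maps directly onto two standard algorithms. In the sketch below, a made-up bumpy function f(x) stands in for the terrain: plain gradient descent (the lone hiker) slides into whichever valley it starts above, while a swarm of Metropolis random walkers under a cooling schedule (the bouncy balls in an earthquake) settles near the global minimum.

```python
import math
import random

def f(x):
    # Bumpy 1-D "landscape": many local valleys plus one deepest canyon.
    return 0.1 * x * x + math.sin(3 * x)

def grad_descent(x, lr=0.01, steps=2000):
    """The lone hiker: always step downhill from where you stand."""
    for _ in range(steps):
        grad = 0.2 * x + 3 * math.cos(3 * x)  # f'(x)
        x -= lr * grad
    return x

def anneal_walkers(n=200, t=2.0, cooling=0.999, steps=1500, seed=0):
    """The bouncy balls: many walkers, random proposals, a Metropolis
    accept rule, and shaking (temperature) that gradually subsides."""
    rng = random.Random(seed)
    xs = [rng.uniform(-10.0, 10.0) for _ in range(n)]
    for _ in range(steps):
        for i in range(n):
            x_new = xs[i] + rng.gauss(0, 0.3)
            delta = f(x_new) - f(xs[i])
            if delta < 0 or rng.random() < math.exp(-delta / t):
                xs[i] = x_new
        t *= cooling
    return min(xs, key=f)

print(grad_descent(2.0))    # stuck in a local valley near x ~ 1.55
print(anneal_walkers())     # settles near the global minimum, x ~ -0.51
```

Starting the hiker at x = 2 strands it in a shallow valley, while the swarm reliably finds the deeper canyon on the other side of the ridge. The difference is not speed but search strategy: one trajectory versus a whole population exploring in parallel.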
Beyond Puzzles: The Real-World Impact
This is far more than a clever way to solve brain teasers. The ability to find the lowest-energy state of a complex, constrained system almost instantly has staggering real-world applications. One of the most immediate is protein folding. Google DeepMind has made incredible progress with AlphaFold, an AI that predicts protein structures, but it is still a predictive model trained on existing data. A TSU could potentially solve the folding problem directly, treating the protein as a system of atomic affinities and repulsions and finding its most stable, lowest-energy configuration. This could revolutionize drug discovery and materials science.
An even more profound possibility lies in nuclear fusion. One of the greatest engineering challenges in history is controlling the superheated plasma within a tokamak reactor. This requires shaping unimaginably complex magnetic containment fields in real-time to prevent the plasma from touching the reactor walls. This is a real-time optimization problem so complex it’s currently beyond our capabilities. A TSU, however, could be fast enough. Its ability to compute with electricity itself, rather than abstracting the problem through layers of software, might allow it to update the magnetic fields fast enough to stabilize the fusion reaction. One could even imagine a future where thermodynamic computing elements are built directly into the tokamak’s walls, allowing the reactor to physically and intelligently react to the plasma’s state in real time.
A ‘GPT-2 Moment’ for a New Era
It’s easy to become numb to hype, but what we are witnessing with the TSU feels different. This is what you might call a “GPT-2 moment.” For those who were there, GPT-2 was the first generative AI model that wasn’t just a toy; it was the first time you could play with it at home and see the spark of true generative intelligence. It was the precursor that pointed directly to the GPT-3 and ChatGPT revolution that has since changed the world. This TSU has that same feel. It’s the “SDK” for a new computing paradigm.
This technology is as different from classical computing as quantum computing is, but with a critical difference: a team of 15 built this in two years, and it runs at room temperature on your desk. Quantum computing has seen decades of work and billions in funding, and it still hasn’t produced a commercially viable, scalable machine. The TSU is here now. Based on a two-decade-long career at the cutting edge of technology—from seeing the obvious future of virtualization in 2007 to an early conviction in deep learning and GPT—this has all the same hallmarks of a fundamental, world-changing shift. We are not just building faster calculators; we are learning to compute with the universe itself. Pay close attention to this. This is the next big thing.