Pinned Tweet
All vertices n+ (positive noumena) of the polynon are identical and equal, allowing the enfolding of all the properties of Consciousness as fundamental substance, from infinite autopoiesis to truth paradoxes and cognitive multi-dimensionality. philpapers.org/archive/ROITP…
Tib Roibu retweeted
If you are too confident in your representational maps of reality, you will miss seeing reality
You don't need to complete The System of German Idealism when these books alone have transcended all that is German Idealism #Schelling
Tib Roibu retweeted
important research paper from google... "LLMs don't just memorize, they build a geometric map that helps them reason"
according to the paper:
– builds a global map from only local pairs
– plans full unseen paths when knowledge is in weights; fails in context
– turns a many-step path into a 1-step pick
– comes from a natural training bias; room to make memory more geometric
Tib Roibu retweeted
By the way, one intuition I like about descent on the spectral norm is that the update is *invariant to the input distribution* for certain loss landscapes (specifically, gradients that are orthogonal transformations of the input vector). From our recent blog: kvfrans.com/matrix-whitening…
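To make that invariance concrete, here is a small numpy sketch (my own illustration, not code from the linked post): if each output gradient is an orthogonal transform Q of its input, the layer gradient over a batch X is Q X Xᵀ, and the spectral-norm steepest-descent direction U Vᵀ collapses to Q no matter how the inputs are scaled or correlated.

    import numpy as np

    rng = np.random.default_rng(0)
    d, n = 8, 64

    # Random orthogonal Q: the assumed "output gradient = Q @ x" structure.
    Q, _ = np.linalg.qr(rng.standard_normal((d, d)))

    def spectral_descent_direction(G):
        # Steepest-descent direction under the spectral norm: U @ V.T from the SVD of G.
        U, _, Vt = np.linalg.svd(G, full_matrices=False)
        return U @ Vt

    # Two very different input distributions: isotropic vs. scaled and correlated.
    X1 = rng.standard_normal((d, n))
    A = rng.standard_normal((d, d))
    X2 = 5.0 * A @ rng.standard_normal((d, n))

    # With per-example output gradients Q @ x, the layer gradient is Q @ X @ X.T.
    G1 = Q @ X1 @ X1.T
    G2 = Q @ X2 @ X2.T

    print(np.allclose(spectral_descent_direction(G1), Q))  # True
    print(np.allclose(spectral_descent_direction(G2), Q))  # True: same update, different inputs

Plain gradient descent, by contrast, would scale its update with the input covariance.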
Tib Roibu retweeted
Updated preprint: Language conveys meaning through structures that range from concrete description to abstract generalization. This study introduces a computational framework for measuring linguistic depth by quantifying the algorithmic complexity of semantic networks derived from large language model embeddings. Building on Piagetian and Vygotskian theories of cognitive development, this study proposes that abstraction, whether in thought, measurement, or language, reflects the compression of distributed information into coherent, generative structures. We operationalize this principle using Kolmogorov complexity, K(G), estimated via the compressed length of network edge lists.

Simulation studies demonstrate that networks derived from factor-structured data exhibit significantly lower K(G) than density-matched random controls, with separation accuracy reaching 94% as factor loadings strengthen. In a controlled linguistic experiment comparing 100 pairs of abstract and concrete phrases matched for syntax and length, abstract expressions produced consistently lower algorithmic complexity (M = 726) than concrete expressions (M = 784), t(194.41) = -3.28, p = .001, d = -0.46. Slot-based lexical manipulation experiments revealed that cross-category substitutions increased K(G) by 51 bytes in abstract contexts but only 27 bytes in concrete contexts, demonstrating a 1.9:1 directional asymmetry.

These findings establish that semantic abstraction manifests as network compressibility, i.e., abstract language achieves conceptual depth through structural regularity rather than elaborative detail. The framework unites psychometric network theory, complexity science, and computational linguistics under a single information-theoretic principle, offering a model-agnostic measure of abstraction applicable across psychological networks, natural language, and neural representations. By formalizing the intuition that profound expression achieves economy through structure, K(G) provides both theoretical insight into how meaning is organized and practical methodology for assessing semantic depth in text, psychological data, and artificial intelligence systems. osf.io/preprints/psyarxiv/b9…
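For intuition only, here is a minimal sketch of the core measurement step as the abstract describes it: estimate K(G) as the compressed byte length of a semantic network's edge list. The embedding source, similarity threshold, and compressor below are illustrative assumptions, not the paper's actual pipeline.

    import zlib
    import numpy as np

    def kg_estimate(embeddings, threshold=0.5):
        # Proxy for K(G): compressed length (in bytes) of the semantic network's edge list.
        # Build a cosine-similarity graph over the embeddings (threshold is illustrative).
        X = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
        sim = X @ X.T
        n = len(X)
        edges = [(i, j) for i in range(n) for j in range(i + 1, n) if sim[i, j] >= threshold]
        # Serialize the edge list and compress; the compressed length is the K(G) estimate.
        edge_list = "\n".join(f"{i} {j}" for i, j in edges)
        return len(zlib.compress(edge_list.encode("utf-8")))

    # Toy comparison: low-rank ("factor-structured") embeddings vs. unstructured noise.
    # Note the paper compares against density-matched random controls; this toy does not.
    rng = np.random.default_rng(0)
    structured = rng.standard_normal((50, 2)) @ rng.standard_normal((2, 32))
    noise = rng.standard_normal((50, 32))
    print(kg_estimate(structured), kg_estimate(noise))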
Tib Roibu retweeted
That’s like mistaking each dot in a diffraction pattern for a separate source of light, when in fact they all have the same point of origin.
Consciousness feels unified, but it draws from countless neural processes running in parallel.
Geometry is cognition: polynons.com/noumenal-ontolo…
Nice short reading: "What Is Geometry?"
1) Axioms (Euclid)
2) Coordinates (Descartes, Fermat)
3) Calculus (Newton, Leibniz)
4) Groups (Klein, Lie)
5) Manifolds (Riemann)
6) Fiber bundles (E. Cartan, Whitney)
"A property is geometric, if it does not deal directly with numbers"
Tib Roibu retweeted
We are launching a brand new journal. Please say hello to Applied & Computational Topology & Geometry (Link and more info below). Thanks to generous support from the @AMathRes, ACTG will be diamond open access --- completely free for both authors and readers!
Tib Roibu retweeted
New paper! We reverse engineered the mechanisms underlying Claude Haiku’s ability to perform a simple “perceptual” task. We discover beautiful feature families and manifolds, clean geometric transformations, and distributed attention algorithms!
Tib Roibu retweeted
[Chow Lectures 2025] (mis.mpg.de/events/series/cho…)
Geometry and Combinatorics of Scattering Amplitudes
Nima Arkani-Hamed (October 6-8, 2025)
1 - piped.video/watch?v=yDUdrx9W…
2 - piped.video/watch?v=LsOKLZV4…
3 - piped.video/watch?v=hDvc-fVv…
Tib Roibu retweeted
Geometry and the Imagination by Hilbert and Cohn-Vossen. Archive link: archive.org/details/geometry…
Perhaps it is intrinsically about approaching progress through play behaviour. Nvidia sprang out of the gaming industry, for example. Generative AI also brings a "positive" to the "if anyone builds it, everyone dies" vibe already going on.
I'm a fan of AI video and many of the things it enables, but the fact that so much compute is thrown at it (based on very real demand) also tells me that AGI / ASI is going to remain a distant fantasy for a while. Incidentally, that's good news if you're of the p(doom) persuasion.
BREAKING NEWS
The Royal Swedish Academy of Sciences has decided to award the 2025 #NobelPrize in Physics to John Clarke, Michel H. Devoret and John M. Martinis “for the discovery of macroscopic quantum mechanical tunnelling and energy quantisation in an electric circuit.”
I wrote about how cognitive gravity can be used to infer such psychological effects, via embedded and embodied principles of geometric cognition. Currently working on a similar piece about virality, which is covered quite nicely in this new paper. polynons.com/cognitive-gravi…
Why do some ideas spread widely, while others fail to catch on? Our new review paper on the PSYCHOLOGY OF VIRALITY is now out in @TrendsCognSci (it was led by @steverathje2 ) Read the full paper here: cell.com/trends/cognitive-sc…
Would have loved to see a mention of Tom Noddy here, as he was doing his "bubble magic" in the late '70s and early '80s, bringing public attention to the natural elegance of multi-bubble forms long before the proofs emerged. Good read, nonetheless. geometrymatters.com/tom-nodd…
Symmetry has long been a guide for mathematicians, but sometimes the most beautiful answer to a problem is not the best answer. In the bubble problem, beauty and symmetry have prevailed once more. (From 2022) quantamagazine.org/monumenta…
A “hidden in plain sight” science of cognitive gravity: a body of evidence that already exists across psychology, linguistics, neuroscience, and computational modeling, but that has not yet been unified under a single framework. polynons.com/cognitive-gravi…
Efficient training of neural networks is difficult. Our second Connectionism post introduces Modular Manifolds, a theoretical step toward more stable and performant training by co-designing neural net optimizers with manifold constraints on weight matrices. We explore a fundamental understanding of the geometry of neural network optimization. thinkingmachines.ai/blog/mod…
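As a rough illustration of the general idea (a numpy sketch of manifold-constrained optimization, not the construction from the post itself), the step below keeps a weight matrix on the Stiefel manifold of matrices with orthonormal columns: project the gradient onto the tangent space, take the step, then retract back onto the manifold.

    import numpy as np

    def stiefel_retract(W):
        # Retract onto the Stiefel manifold (orthonormal columns) via the polar factor U @ V.T.
        U, _, Vt = np.linalg.svd(W, full_matrices=False)
        return U @ Vt

    def manifold_sgd_step(W, grad, lr=1e-2):
        # Project the Euclidean gradient onto the tangent space at W: G - W @ sym(W.T @ G).
        sym = (W.T @ grad + grad.T @ W) / 2.0
        riem_grad = grad - W @ sym
        # Euclidean step followed by retraction keeps W.T @ W = I after every update.
        return stiefel_retract(W - lr * riem_grad)

    # Toy usage: the orthonormality constraint is preserved after the step.
    rng = np.random.default_rng(0)
    W = stiefel_retract(rng.standard_normal((16, 4)))
    g = rng.standard_normal((16, 4))
    W_next = manifold_sgd_step(W, g)
    print(np.allclose(W_next.T @ W_next, np.eye(4)))  # True

Other constraints (spectral-norm balls, unit-norm rows, and so on) swap in different projections and retractions; the point of co-design is that the update rule and the constraint are chosen together.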
ORCH-OG
Cool paper analyzing quantum consciousness and ORCH-OR theoretically from the perspective of quantum-classical systems. Very cool stuff! frontiersin.org/journals/hum… @NirvanicAI