Planckeon Labs develops high-performance, intelligent software tools that accelerate research in theoretical physics. We focus on bridging the gap between cutting-edge physical theory and practical computational implementation—making the abstract tractable and the intractable fast.
Our work spans symbolic computation, numerical methods, simulation frameworks, and AI-assisted tooling for physicists working at the frontiers of knowledge.
For nearly a century, theoretical physics has grappled with one of its greatest challenges: unifying quantum mechanics with general relativity into a consistent theory of quantum gravity. Despite decades of brilliant work across multiple approaches—string theory, loop quantum gravity, asymptotic safety, causal sets, and others—the field remains at an impasse.
The challenges are profound:
- Experimental inaccessibility: The Planck scale (10⁻³⁵ m) is roughly 16 orders of magnitude smaller than the distances current particle accelerators can probe
- Mathematical complexity: The theories require sophisticated mathematical machinery at the intersection of differential geometry, algebraic topology, and quantum field theory
- Computational barriers: Many calculations are intractable without significant computational advances
- Conceptual fragmentation: Different approaches often speak different mathematical languages, making cross-pollination difficult
We believe that intelligent, high-performance software tools can help break this impasse—not by replacing theoretical insight, but by amplifying it. Computation can explore parameter spaces humans cannot, symbolic systems can manage complexity that exceeds working memory, and AI can identify patterns across vast literature.
Our name draws from one of the most profound concepts in theoretical physics: planckeons—hypothetical entities existing at the Planck scale, where the fabric of spacetime itself becomes quantized.
At scales of approximately 10⁻³⁵ meters (Planck length) and 10⁻⁵ grams (Planck mass), we reach the Planck scale—the regime where quantum mechanics and general relativity must unify. Here, the familiar continuum of spacetime breaks down, and something more fundamental emerges.
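The Planck units quoted above are not arbitrary: they follow directly from dimensional analysis of the three fundamental constants ℏ, G, and c. A minimal sketch (constants are standard CODATA values):

```python
import math

# Fundamental constants (SI units, CODATA values)
hbar = 1.054571817e-34  # reduced Planck constant, J*s
G = 6.67430e-11         # Newton's gravitational constant, m^3 kg^-1 s^-2
c = 2.99792458e8        # speed of light, m/s

# Planck units from dimensional analysis
l_planck = math.sqrt(hbar * G / c**3)  # ~1.6e-35 m
m_planck = math.sqrt(hbar * c / G)     # ~2.2e-8 kg, i.e. ~2.2e-5 g
t_planck = l_planck / c                # ~5.4e-44 s

print(f"Planck length: {l_planck:.3e} m")
print(f"Planck mass:   {m_planck:.3e} kg")
print(f"Planck time:   {t_planck:.3e} s")
```

Note that the Planck mass (~10⁻⁵ g) is macroscopically large compared to elementary particle masses, which is part of what makes this regime so strange.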
Despite its foundational importance, the physical meaning of this scale remains elusive. Traditional approaches posit that the Planck scale introduces a minimal measurable length and a natural cutoff, yet the nature of spacetime near this threshold—and whether spacetime itself is fundamental—remains one of the deepest open questions in physics.
Recent theoretical work (2024–2026) by Licata, Tamburini, Fiscaletti, and others characterizes planckeons as the fundamental "grains" of the gravitational vacuum. In this framework:
Planckeons are modeled as the "mouths" of non-traversable Einstein-Rosen (ER) bridges—quantum wormholes at the Planck scale. They serve as holographic devices that realize the ER=EPR conjecture, proposed by Maldacena and Susskind, which posits a deep equivalence between:
- ER: Einstein-Rosen wormholes (geometric connections through spacetime)
- EPR: Einstein-Podolsky-Rosen entanglement (quantum correlations)
In this view, entanglement is geometry—quantum correlations between distant regions are geometrically realized as wormhole connections.
Perhaps most remarkably, this framework suggests that spacetime is not fundamental but emerges from a more primitive structure. The lattice of planckeons—wormhole mouths—defines the Planck scale, and spacetime crystallizes out of the quantum entanglement of their nonlocal correlations.
This is formalized through the Ryu-Takayanagi formula, which relates the entanglement entropy of a boundary region $A$ to the area of a minimal surface in the bulk geometry:

$$S_A = \frac{\operatorname{Area}(\gamma_A)}{4 G_N}$$

where $S_A$ is the entanglement entropy of region $A$, $\gamma_A$ is the minimal bulk surface anchored on the boundary of $A$, and $G_N$ is Newton's constant.
The fluctuations associated with planckeon activity deform the geometry of spacetime, leading to generalized uncertainty relations of the form:

$$\Delta x \, \Delta p \;\geq\; \frac{\hbar}{2}\left(1 + \beta \, \frac{\ell_P^2}{\hbar^2} \, (\Delta p)^2\right)$$

This implies the existence of a minimal measurable length $\Delta x_{\min} \sim \sqrt{\beta}\,\ell_P$, proportional to the Planck length—a fundamental limit to spatial resolution analogous to the role of ℏ in quantum mechanics.
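A quick numerical check of where that minimal length comes from, assuming the standard GUP bound Δx ≥ ℏ/(2Δp) + (β ℓ_P²/2ℏ) Δp with β taken as 1 purely for illustration: minimizing the bound over Δp yields Δx_min = √β · ℓ_P.

```python
import math

l_p = 1.616e-35          # Planck length, m
hbar = 1.054571817e-34   # reduced Planck constant, J*s
beta = 1.0               # model-dependent GUP parameter (illustrative choice)

def gup_bound(dp):
    # Lower bound on Delta-x from Dx*Dp >= (hbar/2)(1 + beta*l_p^2*Dp^2/hbar^2)
    return hbar / (2 * dp) + (beta * l_p**2 / (2 * hbar)) * dp

# Scan momenta logarithmically around the analytic minimizer dp* = hbar/(sqrt(beta)*l_p)
dps = [10**(k / 100) * hbar / l_p for k in range(-200, 201)]
dx_min = min(gup_bound(dp) for dp in dps)

# Analytic result: dx_min = sqrt(beta) * l_p — no resolution below ~the Planck length
print(f"minimal length ~ {dx_min:.3e} m  (Planck length = {l_p:.3e} m)")
```

The key qualitative point survives any choice of β: unlike ordinary quantum mechanics, increasing Δp eventually *increases* the position uncertainty, so resolution bottoms out near ℓ_P.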
The planckeon framework yields a rich thermodynamic structure. Using partition functions derived from entanglement entropy,

$$Z(\beta) = \operatorname{Tr}\, e^{-\beta H} = \sum_n e^{-\beta E_n},$$
one can identify a critical temperature above the Planck scale marking a phase transition from a "wormhole gas" (high-temperature, delocalized connections) to a "remnant phase" (low-temperature, localized structure).
This leads to quantum-corrected forms of the Bekenstein entropy, linking wormhole geometry with quantum information flow.
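As a generic illustration of the partition-function machinery (standard statistical mechanics only—this is *not* the planckeon entanglement spectrum, which is model-specific), the relations U = ⟨E⟩ and S = βU + ln Z for a toy two-level system:

```python
import math

# Toy two-level spectrum (energies in arbitrary units). Illustrates the
# standard Z -> (U, S) pipeline only; the planckeon spectrum is model-specific.
energies = [0.0, 1.0]

def thermodynamics(beta):
    Z = sum(math.exp(-beta * E) for E in energies)               # partition function
    U = sum(E * math.exp(-beta * E) for E in energies) / Z       # mean energy
    S = beta * U + math.log(Z)                                   # entropy (k_B = 1)
    return Z, U, S

# High temperature (beta -> 0): both levels equally populated, S -> ln 2
_, _, S_hot = thermodynamics(1e-6)
# Low temperature (beta -> inf): ground state dominates, S -> 0
_, _, S_cold = thermodynamics(50.0)
print(f"S(hot) = {S_hot:.4f} (ln 2 = {math.log(2):.4f}),  S(cold) = {S_cold:.2e}")
```

The "wormhole gas" vs. "remnant phase" transition described above is the same kind of analysis applied to a planckeon spectrum: entropy saturates at high temperature and collapses onto a localized structure at low temperature.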
Planckeons are also proposed as stable, Planck-sized remnants left over after the evaporation of primordial black holes:
- Extreme stability: With lifetimes potentially exceeding the age of the universe, planckeons are candidates for cold dark matter
- Relic abundance: If the early universe reached temperatures near the Planck scale, high-energy collisions could have produced sufficient planckeons to account for the observed dark matter density
Some emergent models use planckeons to address the hierarchy problem—the vast gap (~10¹⁷) between the electroweak scale (~10² GeV) and the Planck scale (~10¹⁹ GeV):
- Higgs mass: In these theories, the Higgs mass emerges as a property derived from the dissipative features of a planckeon-filled vacuum
- Gauge couplings: Physical constants such as the elementary charge and W/Z boson masses can be re-derived as signatures of planckeon interactions at the unified scale
| Property | Approximate Value | Significance |
|---|---|---|
| Mass | ~10⁻⁵ g (Planck mass) | Where quantum and gravitational effects balance |
| Length | ~10⁻³³ cm (Planck length) | The smallest meaningful length scale |
| Time | ~10⁻⁴³ s (Planck time) | Fundamental temporal resolution |
| Statistics | q-deformed / "infinite statistics" | Beyond ordinary bosons and fermions |
| Role | Bridging locality and non-locality | Wormhole mouths connecting entangled regions |
We chose Planckeon because it captures what we aspire to in our software:
| Physics Concept | Software Analogy |
|---|---|
| Fundamental | Building tools that address core computational challenges, not surface-level conveniences |
| Bridging | Connecting abstract theory with practical computation, just as planckeons bridge locality and non-locality |
| Emergent | Creating frameworks where complex capabilities emerge from well-designed primitives |
| At the Edge | Working where established methods break down and new approaches are needed |
| Unifying | Bringing together disparate approaches, just as planckeons unify quantum and gravitational descriptions |
Just as planckeons represent the point where our current understanding of physics must evolve, our tools aim to push the boundaries of what's computationally possible in theoretical physics research.
Speed without intelligence is noise.
Intelligence without speed is impractical.
We build tools that are both.
Theoretical physics demands software that can handle:
- Symbolic manipulation at scale (tensor calculus, differential geometry, Lie algebras)
- Numerical precision at the limits of floating-point (numerical relativity, lattice QFT)
- Algorithmic sophistication for NP-hard problems (combinatorial topology, graph theory)
- Intuitive interfaces for domain experts (physicists shouldn't need to be software engineers)
We don't believe researchers should choose between power and usability.
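As a small taste of the symbolic side, here is a sketch using SymPy (an illustrative choice, not a statement about our internal stack) that computes Christoffel symbols for the round 2-sphere:

```python
import sympy as sp

# Coordinates and metric for a 2-sphere of radius r: ds^2 = r^2 dθ^2 + r^2 sin^2θ dφ^2
theta, phi, r = sp.symbols('theta phi r', positive=True)
coords = [theta, phi]
g = sp.Matrix([[r**2, 0], [0, r**2 * sp.sin(theta)**2]])
g_inv = g.inv()

def christoffel(a, b, c):
    # Γ^a_{bc} = (1/2) g^{ad} (∂_b g_{dc} + ∂_c g_{db} - ∂_d g_{bc})
    return sp.simplify(sum(
        sp.Rational(1, 2) * g_inv[a, d] * (
            sp.diff(g[d, c], coords[b])
            + sp.diff(g[d, b], coords[c])
            - sp.diff(g[b, c], coords[d])
        ) for d in range(2)))

# Analytic values: Γ^θ_{φφ} = -sin θ cos θ  and  Γ^φ_{θφ} = cot θ
print(christoffel(0, 1, 1), christoffel(1, 0, 1))
```

Scaling this from a 2-sphere to, say, a 10-dimensional supergravity background is exactly the kind of combinatorial explosion where automation stops being a convenience and becomes a necessity.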
"Could AI develop a consistent theory of quantum gravity?" — Igor Babuschkin, xAI co-founder, CERN physicist
We share this vision. The search for new physics through traditional means—larger colliders, more precise experiments—faces diminishing returns. The Planck scale remains experimentally inaccessible by roughly 16 orders of magnitude. Meanwhile, the mathematical complexity of quantum gravity theories has grown beyond what individual researchers can tractably explore.
We believe superintelligence—not larger colliders—may be the key to unlocking the mysteries of the universe.
This isn't about replacing theoretical physicists. It's about building intelligent systems that amplify human insight, explore vast solution spaces, and verify mathematical consistency at scales impossible for unaided cognition.
Quantum gravity research is uniquely suited for AI augmentation:
| Challenge | Traditional Approach | AI-Native Approach |
|---|---|---|
| Mathematical complexity | Manual derivation, limited exploration | LLMs for symbolic reasoning, automated theorem proving |
| Solution space exploration | Intuition-guided search | RL agents exploring parameter spaces systematically |
| Cross-paradigm synthesis | Conference discussions, literature review | Embedding models connecting disparate frameworks |
| Consistency verification | Peer review, manual checking | Neurosymbolic systems with formal verification |
| Computational intractability | Approximations, toy models | Learned surrogates, neural network accelerators |
We're building AI-native infrastructure specifically designed for theoretical physics research:
Pure neural approaches lack the rigor physics demands. Pure symbolic systems lack the flexibility to explore novel structures. We build hybrid neurosymbolic architectures where:
- LLMs propose mathematical structures, conjectures, and proof strategies
- Symbolic engines verify correctness, check consistency, enforce physical constraints
- Models learn to self-verify as they reason, building verified chains of inference
This mirrors how physicists actually work: intuitive leaps followed by rigorous verification.
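A toy version of that propose-then-verify loop, with hardcoded "proposals" standing in for LLM output and SymPy as the symbolic checker (everything here is illustrative, not our production architecture):

```python
import sympy as sp

x = sp.symbols('x')

# Stand-ins for LLM-proposed identities; a real system would generate these.
# Each proposal is written as (lhs - rhs), so a true identity simplifies to 0.
proposals = [
    sp.sin(x)**2 + sp.cos(x)**2 - 1,        # true identity
    sp.cos(2*x) - (1 - 2*sp.sin(x)**2),     # true identity (double-angle)
    sp.sin(x) - x,                          # false in general
]

# Symbolic verifier: accept a proposal iff it simplifies to exactly 0.
verified = [sp.simplify(p) == 0 for p in proposals]
print(verified)
```

The neural side is free to be creative and wrong; the symbolic side guarantees that only the sound proposals survive—which is precisely the division of labor described above.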
The Model Context Protocol (MCP) enables AI models to interact with external tools through a standardized interface. We're developing MCP servers that give frontier models direct access to:
- Computer algebra systems (symbolic tensor calculus, differential geometry)
- Numerical simulation frameworks (lattice QFT, numerical relativity)
- Literature databases (arXiv, INSPIRE-HEP semantic search)
- Formal proof assistants (Lean, Coq for mathematical verification)
- Visualization tools (Penrose diagrams, spacetime embeddings)
This creates AI agents that can reason about physics and compute—not just generate plausible-sounding text.
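To make this concrete, here is a hypothetical tool descriptor in the shape an MCP server advertises via `tools/list` (a name, a description, and a JSON Schema for inputs). The tool name `christoffel_symbols` and its parameters are illustrative, not a real Planckeon Labs API:

```python
import json

# Hypothetical MCP tool descriptor for a symbolic-tensor capability.
# The name and parameters are illustrative, not an actual published API.
TOOL = {
    "name": "christoffel_symbols",
    "description": "Compute Christoffel symbols for a metric given in coordinates.",
    "inputSchema": {
        "type": "object",
        "properties": {
            "metric": {
                "type": "array",
                "description": "Metric components g_ij as nested arrays of expressions",
            },
            "coordinates": {"type": "array", "items": {"type": "string"}},
        },
        "required": ["metric", "coordinates"],
    },
}

print(json.dumps(TOOL, indent=2))
```

Because the schema is machine-readable, a frontier model can discover the tool, construct valid arguments, and ground its reasoning in an actual computation rather than a plausible-sounding guess.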
Intelligence without speed is impractical. Our tools are built in modern systems programming languages optimized for:
- Memory efficiency: Zig, Rust for zero-overhead abstractions
- Parallelism: Lock-free data structures, GPU acceleration
- Interoperability: C ABI compatibility, Python bindings for researcher accessibility
- Correctness: Strong type systems, compile-time guarantees
Languages like Zig, Rust, Odin, and Mojo let us build infrastructure that's simultaneously fast, safe, and expressive enough for complex physics abstractions.
We're developing specialized agent architectures for physics research:
- Exploration agents: RL-trained agents that explore mathematical structures, searching for consistent theories
- Verification agents: Agents that continuously check proposed theories against known physical principles and experimental constraints
- Synthesis agents: Systems that identify connections between disparate approaches (e.g., linking loop quantum gravity observables to string theory predictions)
- Literature agents: Semantic search over physics literature with citation-aware reasoning
Imagine a research environment where:
- A physicist describes a conjecture in natural language
- An agent formalizes it in rigorous mathematical notation
- Symbolic systems check consistency with known physics
- Exploration agents probe edge cases and counterexamples
- Numerical simulations validate predictions in tractable limits
- Literature agents surface relevant prior work
- The physicist focuses on insight while the system handles bookkeeping
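The loop above can be sketched as a pipeline. Every function here is a hypothetical stub standing in for a real component (LLM formalizer, CAS consistency checker, RL exploration agents, and so on):

```python
# Hypothetical pipeline sketch -- each stage is a stub standing in for a
# real component (LLM formalizer, symbolic checker, exploration agents).

def formalize(conjecture: str) -> str:
    return f"FORMAL[{conjecture}]"          # stand-in for LLM formalization

def check_consistency(statement: str) -> bool:
    return statement.startswith("FORMAL[")  # stand-in for symbolic checks

def find_counterexamples(statement: str) -> list:
    return []                               # stand-in for exploration agents

def run_pipeline(conjecture: str) -> dict:
    statement = formalize(conjecture)
    return {
        "statement": statement,
        "consistent": check_consistency(statement),
        "counterexamples": find_counterexamples(statement),
    }

result = run_pipeline("entanglement entropy scales with minimal-surface area")
print(result)
```

The point of the sketch is the shape, not the stubs: each stage has a narrow, checkable contract, so components can be upgraded independently as models and solvers improve.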
This is what we're building. Not AGI solving physics autonomously, but intelligence infrastructure that makes physicists dramatically more effective.
The convergence of several factors makes this the right moment:
- Frontier models can now engage meaningfully with mathematical reasoning
- Tool use via function calling and MCP enables grounded computation
- Systems languages (Zig, Rust) have matured for high-performance AI infrastructure
- Open models enable customization for specialized domains
- Compute costs continue falling, making large-scale exploration feasible
The gap between what AI can do and what physics needs is closing rapidly. We intend to build the bridge.
Our approach draws inspiration from researchers who recognized the potential of AI for fundamental physics early:
- Igor Babuschkin (xAI co-founder, CERN physicist) — who asked whether superintelligence could develop a consistent theory of quantum gravity
- The AlphaFold team — demonstrating that AI can solve long-standing scientific problems considered intractable
- Symbolic AI pioneers — who understood that formal reasoning requires structure, not just pattern matching
We believe the singularity is near—and we're building the tools to ensure it advances our understanding of the universe.
| Repository | Description | Status |
|---|---|---|
| autograv | Bridging numerical relativity and automatic differentiation using JAX | Active |
| pauliz | High-performance, zero-dependency quantum computing simulation in Zig | Active |
| attn-as-bilinear-form | Transformer attention via tensor calculus, statistical mechanics & differential geometry | Research |
We're also developing tools for symbolic computation, physics-specialized programming languages, and AI-assisted research workflows. Some of these remain in private development.
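As a flavor of the JAX-based direction (this is generic JAX autodiff, not autograv's actual API), differentiating the Schwarzschild lapse-like function f(r) = 1 − 2M/r:

```python
import jax

# Generic JAX autodiff example (not autograv's API): differentiate the
# Schwarzschild function f(r) = 1 - 2M/r with respect to r.
M = 1.0

def f(r):
    return 1.0 - 2.0 * M / r

df = jax.grad(f)   # analytic derivative: 2M / r^2
print(df(4.0))     # 2/16 = 0.125
```

Exact gradients of metric functions, obtained for free by the compiler, are what make "numerical relativity meets automatic differentiation" more than a slogan.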
- Licata, I., Tamburini, F., & Fiscaletti, D. (2025). Planckeons as mouths of quantum wormholes and holographic origin of spacetime. arXiv:2505.02804
- Maldacena, J., & Susskind, L. (2013). Cool horizons for entangled black holes. arXiv:1306.0533
- Van Raamsdonk, M. (2010). Building up spacetime with quantum entanglement. arXiv:1005.3035
- Ryu, S., & Takayanagi, T. (2006). Holographic derivation of entanglement entropy from AdS/CFT. arXiv:hep-th/0603001
- Buoninfante, L., et al. (2024). Visions in Quantum Gravity. arXiv:2412.08696
- Research Inquiries: Open an issue
- Collaboration: We're open to partnerships with research institutions and physics departments
- Contributing: Check individual repositories for contribution guidelines
"The universe is not only queerer than we suppose, but queerer than we can suppose." — J.B.S. Haldane
Building tools for those who explore the queerness.
