Planckeon Labs


R&D in fast | intelligent software tooling for theoretical physics



What We Build

Planckeon Labs develops high-performance, intelligent software tools that accelerate research in theoretical physics. We focus on bridging the gap between cutting-edge physical theory and practical computational implementation—making the abstract tractable and the intractable fast.

Our work spans symbolic computation, numerical methods, simulation frameworks, and AI-assisted tooling for physicists working at the frontiers of knowledge.


The Quantum Gravity Impasse

For nearly a century, theoretical physics has grappled with one of its greatest challenges: unifying quantum mechanics with general relativity into a consistent theory of quantum gravity. Despite decades of brilliant work across multiple approaches—string theory, loop quantum gravity, asymptotic safety, causal sets, and others—the field remains at an impasse.

The challenges are profound:

  • Experimental inaccessibility: The Planck scale (10⁻³⁵ m) is roughly 16 orders of magnitude below what current particle accelerators can probe
  • Mathematical complexity: The theories require sophisticated mathematical machinery at the intersection of differential geometry, algebraic topology, and quantum field theory
  • Computational barriers: Many calculations are intractable without significant computational advances
  • Conceptual fragmentation: Different approaches often speak different mathematical languages, making cross-pollination difficult

We believe that intelligent, high-performance software tools can help break this impasse—not by replacing theoretical insight, but by amplifying it. Computation can explore parameter spaces humans cannot, symbolic systems can manage complexity that exceeds working memory, and AI can identify patterns across vast literature.


Why "Planckeon"?

Our name draws from one of the most profound concepts in theoretical physics: planckeons—hypothetical entities existing at the Planck scale, where the fabric of spacetime itself becomes quantized.

The Planck Scale & Its Mysteries

At scales of approximately 10⁻³⁵ meters (Planck length) and 10⁻⁵ grams (Planck mass), we reach the Planck scale—the regime where quantum mechanics and general relativity must unify. Here, the familiar continuum of spacetime breaks down, and something more fundamental emerges.

Despite its foundational importance, the physical meaning of this scale remains elusive. Traditional approaches posit that the Planck scale introduces a minimal measurable length and a natural cutoff, yet the nature of spacetime near this threshold—and whether spacetime itself is fundamental—remains one of the deepest open questions in physics.

Planckeons as the Atoms of Spacetime

Recent theoretical work (2024–2026) by Licata, Tamburini, Fiscaletti, and others characterizes planckeons as the fundamental "grains" of the gravitational vacuum. In this framework:

1. Wormhole Mouths & the ER=EPR Conjecture

Planckeons are modeled as the "mouths" of non-traversable Einstein-Rosen (ER) bridges—quantum wormholes at the Planck scale. They serve as holographic devices that realize the ER=EPR conjecture, proposed by Maldacena and Susskind, which posits a deep equivalence between:

  • ER: Einstein-Rosen wormholes (geometric connections through spacetime)
  • EPR: Einstein-Podolsky-Rosen entanglement (quantum correlations)

In this view, entanglement is geometry—quantum correlations between distant regions are geometrically realized as wormhole connections.

2. Emergent Spacetime from Entanglement

Perhaps most remarkably, this framework suggests that spacetime is not fundamental but emerges from a more primitive structure: a lattice of planckeon wormhole mouths defines the Planck scale, and spacetime crystallizes out of the nonlocal quantum correlations that entangle them.

This is formalized through the Ryu-Takayanagi formula, which relates the entanglement entropy of a quantum system to the area of a minimal surface in the bulk geometry:

$$S_A = \frac{\text{Area}(\gamma_A^{\min})}{4G_{d+1}}$$

where $\gamma_A^{\min}$ is the minimal surface in AdS spacetime anchored to the boundary of region $A$. Applied to planckeons, this yields an entanglement entropy governing the thermodynamics of the planckeon lattice.
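As a sanity check of the formula in its best-understood setting (a standard AdS₃/CFT₂ result, not specific to the planckeon framework), take region $A$ to be an interval of length $\ell$ in the vacuum of a two-dimensional CFT with UV cutoff $\epsilon$. The minimal surface is then a bulk geodesic, and the formula reproduces the known CFT entanglement entropy:

$$S_A = \frac{\text{Length}(\gamma_A^{\min})}{4G_3} = \frac{c}{3}\ln\frac{\ell}{\epsilon}, \qquad c = \frac{3R_{\mathrm{AdS}}}{2G_3},$$

with the central charge $c$ fixed by the Brown-Henneaux relation.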

3. Generalized Uncertainty Relations

The fluctuations associated with planckeon activity deform the geometry of spacetime, leading to generalized uncertainty relations of the form:

$$\Delta x \Delta p \geq \frac{\hbar}{2}\left[1 + \beta l_p^2 \frac{\gamma^2 M^2}{\alpha \hbar^2} M_{Pl}^2 c^2\right]$$

This implies the existence of a minimal measurable length proportional to the Planck length—a fundamental limit to spatial resolution analogous to ℏ in quantum mechanics.
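To see how a relation of this type enforces a minimal length, it is easiest to use the commonly quoted single-parameter GUP (shown here purely as an illustration, not the planckeon-specific relation above):

$$\Delta x\,\Delta p \geq \frac{\hbar}{2}\left[1 + \beta\,\frac{l_p^2}{\hbar^2}(\Delta p)^2\right] \;\Longrightarrow\; \Delta x \geq \frac{\hbar}{2\Delta p} + \frac{\beta\, l_p^2}{2\hbar}\,\Delta p \;\Longrightarrow\; \Delta x_{\min} = \sqrt{\beta}\, l_p,$$

with the minimum attained at $\Delta p = \hbar/(\sqrt{\beta}\, l_p)$. No matter how large the momentum uncertainty becomes, positions cannot be resolved below $\sqrt{\beta}\, l_p$.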

4. Thermodynamics & Phase Transitions

The planckeon framework yields a rich thermodynamic structure. Using partition functions derived from entanglement entropy:

$$Z(\eta) = \sum_k e^{-\eta \frac{\alpha \hbar c N_k}{l_p \gamma^2 M^2}}$$

one can identify a critical temperature above the Planck scale marking a phase transition from a "wormhole gas" (high-temperature, delocalized connections) to a "remnant phase" (low-temperature, localized structure).

This leads to quantum-corrected forms of the Bekenstein entropy, linking wormhole geometry with quantum information flow.

Dark Matter Candidacy

Planckeons are also proposed as stable, Planck-sized remnants left over after the evaporation of primordial black holes:

  • Extreme stability: With lifetimes potentially exceeding the age of the universe, planckeons are candidates for cold dark matter
  • Relic abundance: If the early universe reached temperatures near the Planck scale, high-energy collisions could have produced sufficient planckeons to account for the observed dark matter density

The Hierarchy Problem

Some emergent models use planckeons to address the hierarchy problem—the vast gap (~10¹⁷) between the electroweak scale (~10² GeV) and the Planck scale (~10¹⁹ GeV):

  • Higgs mass: In these theories, the Higgs mass emerges as a property derived from the dissipative features of a planckeon-filled vacuum
  • Gauge couplings: Physical constants such as the elementary charge and W/Z boson masses can be re-derived as signatures of planckeon interactions at the unified scale

Key Physical Parameters

| Property | Approximate Value | Significance |
|---|---|---|
| Mass | ~10⁻⁵ g (Planck mass $M_P$) | Where quantum and gravitational effects balance |
| Length | ~10⁻³³ cm (Planck length $l_P$) | The smallest meaningful length scale |
| Time | ~10⁻⁴³ s (Planck time $t_P$) | Fundamental temporal resolution |
| Statistics | q-deformed / "infinite statistics" | Beyond ordinary bosons and fermions |
| Role | Bridging locality and non-locality | Wormhole mouths connecting entangled regions |
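The first three rows follow directly from dimensional combinations of $\hbar$, $G$, and $c$. A minimal sketch in Rust, using the CODATA constant values (nothing Planckeon-specific):

```rust
// Compute Planck units from fundamental constants.
fn main() {
    let hbar: f64 = 1.054_571_817e-34; // reduced Planck constant, J·s
    let g: f64 = 6.674_30e-11;         // Newton's constant, m^3 kg^-1 s^-2
    let c: f64 = 2.997_924_58e8;       // speed of light, m/s

    let planck_length = (hbar * g / c.powi(3)).sqrt(); // ~1.6e-35 m (~1.6e-33 cm)
    let planck_mass = (hbar * c / g).sqrt();           // ~2.2e-8 kg (~2.2e-5 g)
    let planck_time = (hbar * g / c.powi(5)).sqrt();   // ~5.4e-44 s

    println!("l_P = {:.3e} m", planck_length);
    println!("m_P = {:.3e} kg", planck_mass);
    println!("t_P = {:.3e} s", planck_time);
}
```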

The Name as Mission

We chose Planckeon because it captures what we aspire to in our software:

| Physics Concept | Software Analogy |
|---|---|
| Fundamental | Building tools that address core computational challenges, not surface-level conveniences |
| Bridging | Connecting abstract theory with practical computation, just as planckeons bridge locality and non-locality |
| Emergent | Creating frameworks where complex capabilities emerge from well-designed primitives |
| At the Edge | Working where established methods break down and new approaches are needed |
| Unifying | Bringing together disparate approaches, just as planckeons unify quantum and gravitational descriptions |

Just as planckeons represent the point where our current understanding of physics must evolve, our tools aim to push the boundaries of what's computationally possible in theoretical physics research.


Our Philosophy

Speed without intelligence is noise.
Intelligence without speed is impractical.
We build tools that are both.

Theoretical physics demands software that can handle:

  • Symbolic manipulation at scale (tensor calculus, differential geometry, Lie algebras)
  • Numerical precision at the limits of floating-point (numerical relativity, lattice QFT)
  • Algorithmic sophistication for NP-hard problems (combinatorial topology, graph theory)
  • Intuitive interfaces for domain experts (physicists shouldn't need to be software engineers)

We don't believe researchers should choose between power and usability.


AI-Native Tooling for Quantum Gravity

"Could AI develop a consistent theory of quantum gravity?" — Igor Babuschkin, xAI co-founder, CERN physicist

We share this vision. The search for new physics through traditional means—larger colliders, more precise experiments—faces diminishing returns. The Planck scale remains experimentally inaccessible by roughly 16 orders of magnitude. Meanwhile, the mathematical complexity of quantum gravity theories has grown beyond what individual researchers can tractably explore.

We believe superintelligence—not larger colliders—may be the key to unlocking the mysteries of the universe.

This isn't about replacing theoretical physicists. It's about building intelligent systems that amplify human insight, explore vast solution spaces, and verify mathematical consistency at scales impossible for unaided cognition.

The Case for AI in Fundamental Physics

Quantum gravity research is uniquely suited for AI augmentation:

| Challenge | Traditional Approach | AI-Native Approach |
|---|---|---|
| Mathematical complexity | Manual derivation, limited exploration | LLMs for symbolic reasoning, automated theorem proving |
| Solution space exploration | Intuition-guided search | RL agents exploring parameter spaces systematically |
| Cross-paradigm synthesis | Conference discussions, literature review | Embedding models connecting disparate frameworks |
| Consistency verification | Peer review, manual checking | Neurosymbolic systems with formal verification |
| Computational intractability | Approximations, toy models | Learned surrogates, neural network accelerators |

Our Technical Approach

We're building AI-native infrastructure specifically designed for theoretical physics research:

1. Neurosymbolic Systems with Verification

Pure neural approaches lack the rigor physics demands. Pure symbolic systems lack the flexibility to explore novel structures. We build hybrid neurosymbolic architectures where:

  • LLMs propose mathematical structures, conjectures, and proof strategies
  • Symbolic engines verify correctness, check consistency, enforce physical constraints
  • Models learn to self-verify as they reason, building verified chains of inference

This mirrors how physicists actually work: intuitive leaps followed by rigorous verification.
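A minimal sketch of that loop's shape in Rust, with all trait and type names hypothetical (stand-ins for wiring an LLM client to a computer algebra backend, not an existing Planckeon API):

```rust
// Propose-then-verify: nothing a proposer emits is accepted until a verifier certifies it.
trait Proposer {
    fn propose(&self, goal: &str) -> Vec<String>;
}

trait Verifier {
    fn verify(&self, candidate: &str) -> bool;
}

fn verified_search<P: Proposer, V: Verifier>(p: &P, v: &V, goal: &str) -> Option<String> {
    // Keep only candidates the symbolic engine can certify.
    p.propose(goal).into_iter().find(|c| v.verify(c))
}

// Stand-in implementations so the sketch compiles and runs.
struct StubLlm;
impl Proposer for StubLlm {
    fn propose(&self, _goal: &str) -> Vec<String> {
        vec!["d(d(omega)) = 0".into(), "R_{munu} = 0".into()]
    }
}

struct StubCas;
impl Verifier for StubCas {
    fn verify(&self, candidate: &str) -> bool {
        candidate.contains("d(d(") // placeholder check, not real symbolic logic
    }
}

fn main() {
    if let Some(hit) = verified_search(&StubLlm, &StubCas, "find an identity for omega") {
        println!("verified candidate: {hit}");
    }
}
```

The design point is the boundary: proposers may be unreliable, but only verifier-certified results flow downstream.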

2. MCP Servers for Physics

The Model Context Protocol (MCP) enables AI models to interact with external tools through a standardized interface. We're developing MCP servers that give frontier models direct access to:

  • Computer algebra systems (symbolic tensor calculus, differential geometry)
  • Numerical simulation frameworks (lattice QFT, numerical relativity)
  • Literature databases (arXiv, INSPIRE-HEP semantic search)
  • Formal proof assistants (Lean, Coq for mathematical verification)
  • Visualization tools (Penrose diagrams, spacetime embeddings)

This creates AI agents that can reason about physics and compute—not just generate plausible-sounding text.
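Schematically, an MCP server is a named registry of tools that a model can call. The sketch below shows only that dispatch shape in Rust, with hypothetical names, and omits the JSON-RPC transport that a real MCP SDK would provide:

```rust
// Illustrative tool registry: a model requests a tool by name, the server dispatches.
use std::collections::HashMap;

struct Tool {
    description: &'static str,
    handler: fn(&str) -> Result<String, String>,
}

fn simplify_tensor_expr(input: &str) -> Result<String, String> {
    // Placeholder: a real handler would call into a computer algebra backend.
    Ok(format!("simplified({input})"))
}

fn main() {
    let mut tools: HashMap<&str, Tool> = HashMap::new();
    tools.insert(
        "tensor.simplify",
        Tool {
            description: "Simplify a symbolic tensor expression",
            handler: simplify_tensor_expr,
        },
    );

    // A tool call arrives by name with a string payload; the result goes back to the model.
    let call = ("tensor.simplify", "R_{ab} g^{ab}");
    if let Some(tool) = tools.get(call.0) {
        println!("{}: {:?}", tool.description, (tool.handler)(call.1));
    }
}
```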

3. High-Performance Foundations

Intelligence without speed is impractical. Our tools are built in modern systems programming languages optimized for:

  • Memory efficiency: Zig, Rust for zero-overhead abstractions
  • Parallelism: Lock-free data structures, GPU acceleration
  • Interoperability: C ABI compatibility, Python bindings for researcher accessibility
  • Correctness: Strong type systems, compile-time guarantees

Languages like Zig, Rust, Odin, and Mojo let us build infrastructure that's simultaneously fast, safe, and expressive enough for complex physics abstractions.
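As one small example of what compile-time guarantees buy in practice, physical dimensions can be encoded in the type system so that mixing incompatible units fails to compile. This is a hedged sketch using Rust const generics; the types are illustrative, not a published Planckeon crate:

```rust
// Compile-time dimensional safety: L, M, T are exponents of length, mass, time.
use std::ops::Add;

#[derive(Debug, Clone, Copy)]
struct Quantity<const L: i32, const M: i32, const T: i32>(f64);

// Only quantities with identical dimensions can be added; unit mismatches are
// rejected by the compiler instead of surfacing mid-simulation.
impl<const L: i32, const M: i32, const T: i32> Add for Quantity<L, M, T> {
    type Output = Self;
    fn add(self, rhs: Self) -> Self {
        Quantity(self.0 + rhs.0)
    }
}

type Length = Quantity<1, 0, 0>;

fn main() {
    let a: Length = Quantity(1.616e-35); // one Planck length, in metres
    let b: Length = Quantity(2.0e-35);
    println!("{:?}", a + b);
    // let oops = a + Quantity::<0, 1, 0>(1.0); // does not compile: mass, not length
}
```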

4. Agent Frameworks for Research

We're developing specialized agent architectures for physics research:

  • Exploration agents: RL-trained agents that explore mathematical structures, searching for consistent theories
  • Verification agents: Agents that continuously check proposed theories against known physical principles and experimental constraints
  • Synthesis agents: Systems that identify connections between disparate approaches (e.g., linking loop quantum gravity observables to string theory predictions)
  • Literature agents: Semantic search over physics literature with citation-aware reasoning

The Vision: Physics Copilots

Imagine a research environment where:

  1. A physicist describes a conjecture in natural language
  2. An agent formalizes it in rigorous mathematical notation
  3. Symbolic systems check consistency with known physics
  4. Exploration agents probe edge cases and counterexamples
  5. Numerical simulations validate predictions in tractable limits
  6. Literature agents surface relevant prior work
  7. The physicist focuses on insight while the system handles bookkeeping

This is what we're building. Not AGI solving physics autonomously, but intelligence infrastructure that makes physicists dramatically more effective.

Why Now?

The convergence of several factors makes this the right moment:

  1. Frontier models can now engage meaningfully with mathematical reasoning
  2. Tool use via function calling and MCP enables grounded computation
  3. Systems languages (Zig, Rust) have matured for high-performance AI infrastructure
  4. Open models enable customization for specialized domains
  5. Compute costs continue falling, making large-scale exploration feasible

The gap between what AI can do and what physics needs is closing rapidly. We intend to build the bridge.

Inspired By

Our approach draws inspiration from researchers who recognized the potential of AI for fundamental physics early:

  • Igor Babuschkin (xAI co-founder, CERN physicist) — who asked whether superintelligence could develop a consistent theory of quantum gravity
  • The AlphaFold team — demonstrating that AI can solve long-standing scientific problems considered intractable
  • Symbolic AI pioneers — who understood that formal reasoning requires structure, not just pattern matching

We believe the singularity is near—and we're building the tools to ensure it advances our understanding of the universe.


Our Repositories

Public Projects

| Repository | Description | Status |
|---|---|---|
| autograv | Bridging numerical relativity and automatic differentiation using JAX | Active |
| pauliz | High-performance, zero-dependency quantum computing simulation in Zig | Active |
| attn-as-bilinear-form | Transformer attention via tensor calculus, statistical mechanics & differential geometry | Research |

Research & Development

We're also developing tools for symbolic computation, physics-specialized programming languages, and AI-assisted research workflows. Some of these remain in private development.


References & Further Reading

Planckeon Physics

  1. Licata, I., Tamburini, F., & Fiscaletti, D. (2025). Planckeons as mouths of quantum wormholes and holographic origin of spacetime. arXiv:2505.02804

ER=EPR & Emergent Spacetime

  1. Maldacena, J., & Susskind, L. (2013). Cool horizons for entangled black holes. arXiv:1306.0533

  2. Van Raamsdonk, M. (2010). Building up spacetime with quantum entanglement. arXiv:1005.3035

  3. Ryu, S., & Takayanagi, T. (2006). Holographic derivation of entanglement entropy from AdS/CFT. arXiv:hep-th/0603001

Quantum Gravity Landscape

  1. Buoninfante, L., et al. (2024). Visions in Quantum Gravity. arXiv:2412.08696

Connect

  • Research Inquiries: Open an issue
  • Collaboration: We're open to partnerships with research institutions and physics departments
  • Contributing: Check individual repositories for contribution guidelines

"The universe is not only queerer than we suppose, but queerer than we can suppose." — J.B.S. Haldane

Building tools for those who explore the queerness.
