Abstract
Why do simple learning rules yield AI systems that generalize far beyond their training data? I argue that this reflects an abundance of learnable structure in nature, and that this abundance motivates Nomic Liberalism, a conception of laws developed from Minimal Primitivism (Chen and Goldstein, 2022). On this view, laws can be simple, predictive, and representation-relative, existing at many scales and in domains far beyond fundamental physics. Simplicity functions as a nomic razor: an epistemic guide to discovering laws, rather than a general guide to truth (Chen, 2024, 2025).
I show how puzzling phenomena in machine learning—such as double descent, scaling laws, and the emergence of AGI from objectives as simple as next-token prediction (Chen, Belkin, Bergen, and Danks, 2026)—can be understood as learning systems discovering such liberal laws, often in domains traditionally thought to lack lawful structure, and in coordinate systems quite unlike familiar human concepts. AI success thus provides concrete evidence for this expanded conception of lawhood. Yet abundance has limits. Recent results in quantum foundations establish in-principle constraints on learning: in high-dimensional quantum systems, nearly all quantum states are observationally indistinguishable—a limit no learning algorithm can overcome (Chen and Tumulka, 2025). A satisfactory theory of induction must therefore explain both why learning works so well and why it must sometimes fail.
Short Biography
Eddy Keming Chen is Associate Professor of Philosophy at the University of California, San Diego. He works at the intersection of philosophy of physics, metaphysics, and philosophy of artificial intelligence. His research focuses on the foundations of quantum mechanics, the metaphysics of time, and the nature of laws, as well as conceptual and normative questions raised by AI systems. Chen is especially known for developing the "density matrix realism" program, which reconceptualizes the quantum state as fundamentally mixed rather than pure, and for exploring its implications for the Past Hypothesis and the arrow of time. Across these areas, his work combines close engagement with contemporary science and technology with systematic metaphysical analysis, aiming to clarify the structure of fundamental reality and its emerging artificial counterparts.