Our papers are the official record of our discoveries. They allow others to build on and apply our work. Each paper is the result of many months of research, so we take special care to make them clear, beautiful and inspirational, and publish them in leading journals.
A neural network learns to classify different types of spacetime in general relativity according to their algebraic Petrov classification.
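The Petrov classification assigns each spacetime one of six algebraic types (I, II, III, D, N, O), so the task above is, at heart, supervised classification. A minimal sketch of that idea, with synthetic feature vectors standing in for curvature data and only three types for brevity (all names and data here are illustrative, not the paper's actual setup):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in: each "spacetime" is a 4-component feature vector;
# labels are Petrov types (three of the six, purely for illustration).
types = ["I", "D", "O"]
centers = rng.normal(size=(3, 4))                      # one cluster per type
X = np.vstack([c + 0.1 * rng.normal(size=(50, 4)) for c in centers])
y = np.repeat(np.arange(3), 50)

# A nearest-centroid classifier: one of the simplest supervised learners.
centroids = np.array([X[y == k].mean(axis=0) for k in range(3)])

def classify(v):
    """Return the Petrov-type label of the nearest class centroid."""
    return types[np.argmin(np.linalg.norm(centroids - v, axis=1))]

# Training accuracy on the synthetic data.
acc = np.mean([classify(x) == types[k] for x, k in zip(X, y)])
```

The paper replaces the toy centroid rule with a neural network, but the input/output structure (geometric features in, algebraic type out) is the same.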
Certain states in quantum field theories are described by the geometry and algebra of melting crystals via properties of partition functions.
The Mahler measure is shown to lie at the intersection of number theory, algebraic geometry, combinatorics, and quantum field theory.
Neural networks find efficient ways to compute the Hilbert series, an important counting function in algebraic geometry and gauge theory.
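In the simplest case, the Hilbert series of the polynomial ring in $n$ variables is $1/(1-t)^n$, whose $d$-th coefficient counts the degree-$d$ monomials. A small self-contained check of that counting (illustrative only; the paper treats far richer geometries and learns the coefficients rather than computing them combinatorially):

```python
from math import comb

def hilbert_coeff(n, d):
    """d-th Taylor coefficient of 1/(1-t)^n: the number of
    degree-d monomials in n variables, C(n+d-1, d)."""
    return comb(n + d - 1, d)

def count_monomials(n, d):
    """Direct recursive enumeration of degree-d monomials in n variables."""
    if n == 1:
        return 1
    return sum(count_monomials(n - 1, d - k) for k in range(d + 1))

# For n = 3 the coefficients are the triangular numbers 1, 3, 6, 10, 15, ...
coeffs = [hilbert_coeff(3, d) for d in range(5)]
```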
Unsupervised machine learning of the Hodge numbers of Calabi-Yau hypersurfaces detects new patterns, including an unexpected linear dependence.
Neural networks find numerical solutions to Hermitian Yang-Mills equations, a difficult system of PDEs crucial to mathematics and physics.
Grothendieck's “children's drawings”, a type of bipartite graph, link number theory, geometry, and the physics of conformal field theory.
Machine-learning methods can distinguish between Sato-Tate groups, promoting a data-driven approach for problems involving Euler factors.
Machine learning is a powerful tool for sifting through the landscape of possible Universes that could derive from Calabi-Yau manifolds.
The few-shot machine learning technique reduces the vast geometric landscape of string theory vacua to a tiny cluster of representatives.
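Compressing a vast landscape to a tiny cluster of representatives can be caricatured with ordinary k-means clustering, a deliberately simple stand-in for the few-shot technique, run here on synthetic two-dimensional data rather than actual string-vacuum geometries:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "landscape": 3000 points around 4 hidden clusters, standing in
# for geometric data attached to string vacua (illustrative only).
true_centers = rng.uniform(-5, 5, size=(4, 2))
X = np.vstack([c + rng.normal(size=(750, 2)) for c in true_centers])

def kmeans(X, k, iters=50, seed=0):
    """Plain Lloyd's algorithm: summarize X by k representative centroids."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(
            np.linalg.norm(X[:, None] - centers[None], axis=2), axis=1)
        # Keep a centroid in place if its cluster happens to empty out.
        centers = np.array([
            X[labels == j].mean(axis=0) if np.any(labels == j) else centers[j]
            for j in range(k)])
    return centers

reps = kmeans(X, 4)  # the whole point cloud reduced to 4 representatives
```

The paper's few-shot approach is more sophisticated than vanilla k-means, but the payoff is the same in spirit: downstream analysis only needs the handful of representatives, not the full landscape.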