Our papers are the official record of our discoveries. They allow others to build on and apply our work. Each paper is the result of many months of research, so we take special care to make them clear, beautiful and inspiring, and publish them in leading journals.
Neural networks find efficient ways to compute the Hilbert series, an important counting function in algebraic geometry and gauge theory.
The ability of deep neural networks to generalize can be explained using path integral methods to compute their typical Boolean functions.
Statistical methods that normally fail for very high-dimensional data can be rescued via mathematical tools from statistical physics.
Bayesian networks describe the evolution of orthodontic features in patients receiving treatment versus no treatment for malocclusion.
We generalise neural networks into a quantum framework, demonstrating the possibility of quantum auto-encoders and teleportation.
Analysis of their internal memory dynamics shows that memristive networks preserve memory and have the ability to learn.
Exact equations of motion provide an analytical description of the evolution and relaxation properties of complex memristive circuits.
A local model of preferential attachment with short-term memory generates scale-free networks, which can be readily computed by memristors.
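For illustration, the general mechanism can be sketched with the classic (global) Barabási–Albert preferential-attachment rule; this is a minimal sketch of preferential attachment as such, not the paper's local short-term-memory variant, and the function name is our own:

```python
import random

def preferential_attachment(n_nodes, m=2, seed=0):
    """Grow a graph by Barabasi-Albert preferential attachment.

    Each new node attaches to m existing nodes chosen with
    probability proportional to their current degree.
    """
    rng = random.Random(seed)
    # Start from a small complete core of m + 1 nodes.
    edges = [(i, j) for i in range(m + 1) for j in range(i)]
    # Listing every edge endpoint makes a uniform draw from this
    # list equivalent to degree-proportional sampling.
    endpoints = [v for e in edges for v in e]
    for new in range(m + 1, n_nodes):
        targets = set()
        while len(targets) < m:
            targets.add(rng.choice(endpoints))
        for t in targets:
            edges.append((new, t))
            endpoints.extend((new, t))
    return edges
```

In the large-network limit this rule produces a scale-free degree distribution, P(k) proportional to k^-3.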
Less developed countries must first acquire simple capabilities in order to start a stable process of industrialization and development.
Real networks that contain many short loops break the assumptions of tree-like models, so their analysis requires novel methods.
Explicit formulae for the Shannon entropies of random graph ensembles provide measures to compare and reproduce their topological features.
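As a worked instance of such a formula, consider the simplest case, the Erdős–Rényi ensemble G(n, p) (our choice for illustration; the paper treats richer ensembles): each of the C(n, 2) possible edges is an independent Bernoulli(p) variable, so the ensemble entropy factorizes over edge slots.

```python
from math import comb, log

def er_ensemble_entropy(n, p):
    """Shannon entropy (in nats) of the Erdos-Renyi ensemble G(n, p).

    Each of the comb(n, 2) possible edges is present independently
    with probability p, so the total entropy is the number of edge
    slots times the binary entropy of p.
    """
    if p in (0.0, 1.0):
        return 0.0  # a single deterministic graph carries no entropy
    h = -p * log(p) - (1 - p) * log(1 - p)  # per-edge entropy
    return comb(n, 2) * h
```

Comparing this value across ensembles with matched constraints is what makes entropy a useful measure of topological features.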
Associative networks with different loads model the ability of the immune system to respond simultaneously to multiple distinct antigen invasions.
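A minimal sketch of the associative-memory mechanism behind this analogy, assuming a standard Hopfield network with Hebbian couplings (the multitasking networks of the paper generalize this setting; function names are our own):

```python
import numpy as np

def hebbian_couplings(patterns):
    """Hebbian coupling matrix from stored +/-1 patterns (one per row)."""
    _, n = patterns.shape
    J = patterns.T @ patterns / n
    np.fill_diagonal(J, 0.0)  # no self-coupling
    return J

def recall(J, state, steps=10):
    """Zero-temperature synchronous dynamics: s <- sign(J s)."""
    s = state.copy()
    for _ in range(steps):
        s = np.where(J @ s >= 0, 1, -1)
    return s
```

At low load (few stored patterns relative to the number of units), a corrupted pattern relaxes back to the stored one; the question studied in this setting is how retrieval degrades, and parallelizes across patterns, as the load grows.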
New mathematical tools quantify the topological structure of large directed networks which describe how genes interact within a cell.