Our papers are the official record of our discoveries. They allow others to build on and apply our work. Each paper is the result of many months of research, so we make a special effort to make them clear, beautiful and inspirational, and publish them in leading journals.
A neural network learns to classify different types of spacetime in general relativity according to their algebraic Petrov classification.
The structural and functional building blocks of gene regulatory networks correspond to one another, telling us how genetic computation is organised.
Certain states in quantum field theories are described by the geometry and algebra of melting crystals via properties of partition functions.
The Mahler measure is shown to be at the intersection between number theory, algebraic geometry, combinatorics, and quantum field theory.
The bipartite nature of regulatory networks means gene-gene logics are composed, which severely restricts which ones can show up in life.
Neural networks find efficient ways to compute the Hilbert series, an important counting function in algebraic geometry and gauge theory.
Unsupervised machine-learning of the Hodge numbers of Calabi-Yau hypersurfaces detects new patterns with an unexpected linear dependence.
Neural networks find numerical solutions to Hermitian Yang-Mills equations, a difficult system of PDEs crucial to mathematics and physics.
Circuits of memristors, resistors with memory, can exhibit instabilities which allow classical tunnelling through potential energy barriers.
Scale-invariant plant clusters explain the ability of a diverse range of plant species to coexist in ecosystems such as Barro Colorado.
A solution to the information paradox uses standard quantum field theory to show that black holes can evaporate in a predictable way.
A delicate balance between white blood cell protein expression and the molecules on the surface of tumour cells determines cancer prognoses.
Statistical physics contributes to new models and metrics for the study of financial network structure, dynamics, stability and instability.
The notion of quantum superposition speeds up the training process for binary neural networks and ensures that their parameters are optimal.
As the maximum age of a population decreases, it grows slower but converges faster, favouring programmed death in a changing environment.
Grothendieck's “children’s drawings”, a type of bipartite graph, link number theory, geometry, and the physics of conformal field theory.
Exact methods supersede approximations used in high-dimensional linear regression to find correlations in statistical physics problems.
An ongoing study tests the feasibility of using MRI scans to screen men for prostate cancer in place of unreliable antigen blood tests.
The fraction of logics that are biologically permitted can be bounded and shown to be tiny, which makes inferring them from experiments easier.
Networks where risky banks are mostly exposed to other risky banks have higher levels of systemic risk than those with stable bank interactions.
Cancer patients who contract and recover from SARS-CoV-2 exhibit long-term immune system weaknesses, depending on the type of cancer.
Fire sales of common asset holdings can whip through a channel of contagion between banks, insurance companies and investment funds.
Naturally occurring networks have an underlying scale-free structure that is often clouded by finite-size effects in the sample data.
The ability of deep neural networks to generalise can be unravelled using path integral methods to compute their typical Boolean functions.
Statistical methods that normally fail for very high-dimensional data can be rescued via mathematical tools from statistical physics.
Parallels between the perfect and abundant numbers and their recursive analogs point to deeper structure in the recursive divisor function.
Consistent valuation of interbank claims within an interconnected financial system can be found with a recursive update of banks' equities.
Insights from biology, physics and business shed light on the nature and costs of complexity and how to manage it in business organizations.
We optimize Bayesian data clustering by mapping the problem to the statistical physics of a gas and calculating the lowest entropy state.
A theoretical model of recursive innovation suggests that new technologies are recursively built up from new combinations of existing ones.
A mathematical model captures the temporal and steady state behaviour of networks whose two sets of nodes either generate or destroy links.
A phase transition creates the geometry of the continuum from discrete space, but it needs disorder if it is to have the right metric.
Machine learning techniques enhance the efficiency of energy harvesters by implementing reversible energy-conserving operations.
Modern portfolio theory inspires a strategy for allocating renewable energy sources which minimises the impact of production fluctuations.
The distribution of product complexity helps explain why some technology sectors tend to exhibit faster innovation rates than others.
A simple solvable model of memristive networks suggests a correspondence between the asymptotic states of memristors and the Ising model.
Statistical physics harnesses links between maximum entropy and information theory to capture null model and real-world network features.
An explicit recipe for defining the Hamiltonian in general probabilistic theories, which have the potential to generalise quantum theory.
The distributions of size and shape of a material’s grains can be constructed from a 2D slice of the material and electron diffraction data.
Exact solutions for the dynamics of interacting memristors predict whether they relax to higher or lower resistance states given random initialisations.
Network users who have access to the network’s most informative node, as quantified by a novel index, the InfoRank, have a competitive edge.
One-shot analogs of fluctuation-theorem results help unify the one-shot and fluctuation-theorem approaches to small-scale, nonequilibrium statistical physics.
Hamming balls, subgraphs of the hypercube, maximise the graph’s largest eigenvalue exactly when the dimension of the cube is large enough.
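As background to the eigenvalue claim (a generic check, not the paper's proof): the n-dimensional hypercube's adjacency spectrum is {n − 2k : k = 0, …, n}, so its largest eigenvalue is exactly n. A minimal sketch that builds Q_3 by flipping bits and verifies this numerically:

```python
import numpy as np
from itertools import product

def hypercube_adjacency(n):
    # Vertices of Q_n are binary n-tuples; edges join tuples differing in one bit.
    verts = list(product([0, 1], repeat=n))
    idx = {v: i for i, v in enumerate(verts)}
    A = np.zeros((2**n, 2**n))
    for v in verts:
        for bit in range(n):
            w = list(v)
            w[bit] ^= 1  # flip one coordinate
            A[idx[v], idx[tuple(w)]] = 1.0
    return A

A = hypercube_adjacency(3)
lam_max = np.max(np.linalg.eigvalsh(A))  # largest eigenvalue equals n = 3
```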
A novel approach to volunteer clouds outperforms traditional distributed task scheduling algorithms in the presence of intensive workloads.
Bipartite networks model the structures of ecological and economic real-world systems, enabling hypothesis testing and crisis forecasting.
Forecast errors for simple experience curve models facilitate more reliable estimates for the costs of technology deployment.
An iterative version of a method to identify hierarchies and rankings of nodes in directed networks can partly overcome its resolution limit.
The large-scale structure of the interbank network changes drastically in times of crisis due to the effect of measures from central banks.
An explicit analytical solution reproduces the main features of random graph ensembles with many short cycles under strict degree constraints.
The usefulness of components and the complexity of products inform the best strategy for innovation at different stages of the process.
In systems of innovation, the relative usefulness of different components changes as the number of components we possess increases.
Theoretical searches propose 2D borane as a new graphene-like material which is stable and semi-metallic with Dirac cone structure.
Complex networks model the links between financial institutions and how these channels can transition from diversifying to propagating risk.
Bayesian networks describe the evolution of orthodontic features on patients receiving treatment versus no treatment for malocclusion.
We generalise neural networks into a quantum framework, demonstrating the possibility of quantum auto-encoders and teleportation.
Statistical mechanics concepts reconstruct connections between financial institutions and the stock market, despite limited data disclosure.
A new algorithm unveils complicated structures in the bipartite mapping between countries and products of the international trade network.
Spectroscopy experiments show that energy shifts due to photon emission from individual molecules satisfy a fundamental quantum relation.
When people operate in echo chambers, they focus on information adhering to their system of beliefs. Debunking them is harder than it seems.
Moment-based methods provide a simple way to describe a population of spherical particles and extract 3D information from 2D measurements.
The spectral density of graph ensembles provides an exact solution to the graph partitioning problem and helps detect community structure.
Memristive networks preserve memory and have the ability to learn, according to an analysis of their internal memory dynamics.
A new equality which depends on the maximum entropy describes the worst-case amount of work done by finite-dimensional quantum systems.
Firms can harness the shifting importance of component building blocks to build better products and services and hence increase their chances of sustained success.
Processes believed to stabilize financial markets can drive them towards instability by creating cyclical structures that amplify distress.
Exact equations of motion provide an analytical description of the evolution and relaxation properties of complex memristive circuits.
Inference from single snapshots of temporal networks can misleadingly group communities if the links between snapshots are correlated.
Compact heat exchangers can be designed to run at low power if the exchange is concentrated in a crumpled surface fed by a fractal network.
Non-linear models of distress propagation in financial networks characterise key regimes where shocks are either amplified or suppressed.
Targeted immunisation policies limit distress propagation and prevent system-wide crises in financial networks according to sandpile models.
An extension of the Kelly criterion maximises the growth rate of multiplicative stochastic processes when limited resources are available.
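For context, the classical (unconstrained) Kelly rule that this work extends picks the bet fraction maximising the expected log growth rate; for a binary bet with win probability p and odds b, the closed form is f* = p − (1 − p)/b. A quick numerical sketch (standard textbook result, not the paper's extension):

```python
import numpy as np

def log_growth(f, p, b):
    # Expected log growth rate of wealth when betting a fraction f each round
    return p * np.log(1 + f * b) + (1 - p) * np.log(1 - f)

def kelly_fraction(p, b):
    # Closed-form maximiser of the expected log growth rate
    return p - (1 - p) / b

p, b = 0.6, 1.0
f_star = kelly_fraction(p, b)  # 0.2 for these parameters

# Sanity check: a grid search over f recovers the same maximiser
grid = np.linspace(0.0, 0.99, 10_000)
f_grid = grid[np.argmax(log_growth(grid, p, b))]
```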
Increasing the complexity of the network of contracts between financial institutions decreases the accuracy of estimating systemic risk.
In quantum tunnelling, a particle tunnels through a barrier that it classically could not surmount.
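The standard WKB estimate quantifies this: for a particle of energy E meeting a barrier V(x) > E, the transmission probability decays exponentially with the barrier's width and height,

```latex
T \;\approx\; \exp\!\left(-2\int_{x_1}^{x_2}\kappa(x)\,\mathrm{d}x\right),
\qquad
\kappa(x) \;=\; \frac{\sqrt{2m\,\bigl(V(x)-E\bigr)}}{\hbar},
```

where $x_1$ and $x_2$ are the classical turning points at which $V(x) = E$.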
The structural properties of a network motif predict its functional versatility and relate to gene regulatory networks.
Coupled distribution grids are more vulnerable to a cascading systemic failure but they have larger safe regions within their networks.
An adaptive network of oscillators in fragmented and incoherent states can re-organise itself into connected and synchronized states.
The community matrix of a complex ecosystem captures the population dynamics of interacting species and transitions to unstable abundances.
Percolation theory shows that the formation of giant clusters of neurons relies on a few parameters that could be measured experimentally.
A formulation of Moore’s law estimates the probability that a given technology will outperform another at a certain point in the future.
In an infinitely bouncing Universe, the scalar field driving the cosmological expansion and contraction carries information between phases.
The principal eigenvalue of small neutral networks determines their robustness, and is bounded by the logarithm of the number of vertices.
With inspiration from Maxwell’s classic thought experiment, it is possible to extract macroscopic work from microscopic measurements of photons.
News sentiment analysis and web browsing data are unilluminating alone but, inspected together, predict fluctuations in stock prices.
A subset of bootstrap percolation models, which stabilise systems of cells on infinite lattices, exhibit non-trivial phase transitions.
A new tool derived from information theory quantitatively identifies trees, hierarchies and community structures within complex networks.
When the number of tweets about an event peaks, the sentiment of those tweets correlates strongly with abnormal stock market returns.
Analysis of the hyperbolicity of real-world networks distinguishes between those which are aristocratic and those which are democratic.
Properties of protein interaction networks test the reliability of data and hint at the underlying mechanism with which proteins recruit each other.
A random analogue of the Erdős-Ko-Rado theorem sheds light on its stability in an area of parameter space which has not yet been explored.
Tweet volume is a good indicator of political parties' success in elections when considered over an optimal time window so as to minimise noise.
Single-shot information theory inspires a new formulation of statistical mechanics which measures the optimal guaranteed work of a system.
Exact equations for the thermodynamic quantities of lattices made of d-dimensional hypercubes are obtainable with the Bethe-Peierls approach.
A dynamical microscopic theory of instability for financial networks reformulates the DebtRank algorithm in terms of basic accounting principles.
The stable structures of calcium and magnesium carbonate at high pressures are crucial for understanding the Earth's deep carbon cycle.
The Yule-Simon distribution describes the diffusion of knowledge and ideas in a social network which in turn influences economic growth.
The speed of a financial crisis outbreak sets the maximum delay before intervention by central authorities is no longer effective.
A local model of preferential attachment with short-term memory generates scale-free networks, which can be readily computed by memristors.
Dynamical systems theory predicts the growth potential of countries with heterogeneous patterns of evolution where regression methods fail.
A simple formula gives the maximum time for an n x n grid to become entirely infected having undergone a bootstrap percolation process.
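For illustration (a generic simulation, not the paper's formula), 2-neighbour bootstrap percolation on an n × n grid can be run directly: a site becomes infected once at least two of its orthogonal neighbours are. Seeding the main diagonal infects the whole grid in n − 1 rounds:

```python
import numpy as np

def bootstrap_time(infected, r=2):
    # Run r-neighbour bootstrap percolation on a boolean grid;
    # return the number of rounds until no new site becomes infected.
    inf = infected.copy()
    t = 0
    while True:
        # Count infected orthogonal neighbours of every site
        nbrs = np.zeros(inf.shape, dtype=int)
        nbrs[1:, :] += inf[:-1, :]
        nbrs[:-1, :] += inf[1:, :]
        nbrs[:, 1:] += inf[:, :-1]
        nbrs[:, :-1] += inf[:, 1:]
        new = inf | (nbrs >= r)
        if np.array_equal(new, inf):
            return t
        inf = new
        t += 1

n = 6
seed = np.eye(n, dtype=bool)       # infect the main diagonal
t = bootstrap_time(seed)           # n - 1 = 5 rounds to full infection
```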
Less developed countries have to learn simple capabilities in order to start a stable industrialization and development process.
The analysis of real networks which contain many short loops requires novel methods, because they break the assumptions of tree-like models.
A quick and simple way to evaluate the packing fraction of polydisperse spheres, which is a measure of how they crowd around each other.
Time series data from networks of credit default swaps display no early warnings of financial crises without additional macroeconomic indicators.
Explicit formulae for the Shannon entropies of random graph ensembles provide measures to compare and reproduce their topological features.
When networks come under attack, a repairable architecture is superior to, and globally distinct from, an architecture that is robust.
A review of the achievements concerning typical bipartite entanglement for random quantum states involving a large number of particles.
Generating random structures in the vicinity of a material’s defect predicts the low and high energy atomic structure at the grain boundary.
The critical probability for bootstrap percolation, a process which mimics the spread of an infection in a graph, is bounded for Galton-Watson trees.
The likelihood of stock prices bouncing on specific values increases due to memory effects in the time series data of the price dynamics.
The interplay between redundancies and smart reconfiguration protocols can improve the resilience of networked infrastructures to failures.
Fractal structures need very little mass to support a load; but for current designs, this makes them vulnerable to manufacturing errors.
The optimal architecture of a financial system is only dependent on its topology when the market is illiquid, and no topology is always superior.
The immune system must simultaneously recall multiple defense strategies because many antigens can attack the host at the same time.
Lognormal distributions (and mixtures of them) are a useful model for the size distribution in emulsions and sediments.
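Part of what makes lognormals convenient for size distributions is that their moments are closed-form: with parameters μ and σ, the mean is exp(μ + σ²/2). A minimal sketch (generic background, not the paper's fitting procedure) sampling particle sizes and checking the moment formula:

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 1.0, 0.5

# Draw a synthetic population of particle sizes
sizes = rng.lognormal(mean=mu, sigma=sigma, size=200_000)

# Closed-form mean of a lognormal: exp(mu + sigma^2 / 2)
theoretical_mean = np.exp(mu + sigma**2 / 2)
sample_mean = sizes.mean()
```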
A new non-monetary metric captures diversification, a dominant effect on the globalised market, and the effective complexity of products.
Coupled non-linear maps extract information about the competitiveness of countries and the complexity of their products from trade data.
A new concept, graph temperature, enables the prediction of distinct topological properties of real-world networks simultaneously.
Information theory fixes weighted networks’ degeneracy issues with a generalisation of binary graphs and an optimal scale of link intensities.
Associative networks with different loads model the ability of the immune system to respond simultaneously to multiple distinct antigen invasions.
The Eiffel Tower is a longstanding example of hierarchical design due to its non-trivial internal structure spanning many length scales.
The most efficient load-bearing fractals are designed as big structures under gentle loads, a situation common in aerospace applications.
Complex networks detect the driver institutions of an interbank market and ascertain that intervention policies should be time-scale dependent.
New mathematical tools can help infer financial networks from partial data to understand the propagation of distress through the network.
Network-based metrics to assess systemic risk and the importance of financial institutions can help tame the financial derivatives market.
Information about 10% of the links in a complex network is sufficient to reconstruct its main features and resilience with the fitness model.
A statistical procedure identifies dominant edges within weighted networks to determine whether a network has reached its steady state.
A systematic way to vary the power-law scaling relations between loading parameters and volume of material aids the hierarchical design process.
The transition from solid to hollow beams changes the scaling of stability versus loading analogously to increasing the hierarchical order by one.
Network theory finds unexpected interactions between the number of products a country produces and the number of countries producing each product.
A quantitative assessment of the non-monetary advantage of diversification represents a country’s hidden potential for development and growth.
Network analysis of diagnostic data identifies combinations of the key factors which cause Class III malocclusion and how they evolve over time.
Analysis of web search queries about a given stock, from the seemingly uncoordinated activity of many users, can anticipate the trading peak.
Unbiased randomisation processes generate sophisticated synthetic networks for modelling and testing the properties of real-world networks.
Spectral analysis shows that disassortative networks exhibit a higher epidemiological threshold and are therefore easier to immunize.
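As background to the spectral claim (the standard mean-field SIS result, not this paper's derivation): a network's epidemic threshold is the inverse of its adjacency matrix's largest eigenvalue, so a smaller spectral radius means a higher threshold. A star with 4 leaves has λ_max = √4 = 2, giving a threshold of 0.5:

```python
import numpy as np

def epidemic_threshold(adj):
    # Mean-field SIS threshold: an outbreak spreads when the effective
    # infection rate exceeds 1 / lambda_max of the adjacency matrix.
    return 1.0 / np.max(np.linalg.eigvalsh(adj))

# Star graph: hub (node 0) connected to 4 leaves
n = 5
star = np.zeros((n, n))
star[0, 1:] = 1.0
star[1:, 0] = 1.0

threshold = epidemic_threshold(star)  # 1 / sqrt(4) = 0.5
```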
Edge multiplicity—the number of triangles attached to edges—is a powerful analytic tool to understand and generalize network properties.
Methods from tailored random graph theory reveal the relation between true biological networks and the often-biased samples taken from them.
Analysis of the linear elastic behaviour of plant cell dispersions improves our understanding of how to stabilise and texturise food products.
A transfer operator formalism solves the macroscopic dynamics of disordered Ising chain systems which are relevant for ageing phenomena.
A Monte Carlo model simulates the microstructural evolution of metallic and ceramic powders during liquid-phase sintering, a consolidation process.
New mathematical tools quantify the topological structure of large directed networks which describe how genes interact within a cell.
The information needed to self-assemble a structure quantifies its modularity and explains the prevalence of certain structures over others.
Techniques from random sphere packing predict the dimension of the Apollonian gasket, a fractal made up of non-overlapping hyperspheres.
The topological structure of tie knots categorises them by shape, size and aesthetic appeal and defines the sequence of knots to produce them.
Of the 256 elementary cellular automata, 28 of them exhibit random behaviour over time, but spatio-temporal currents still lurk underneath.
In single-elimination competition, the best indicator of success is a player's wealth: the accumulated wealth of all defeated players.
Machine-learning methods can distinguish between Sato-Tate groups, promoting a data-driven approach for problems involving Euler factors.
Machine-learning is a powerful tool for sifting through the landscape of possible Universes that could derive from Calabi-Yau manifolds.
The few-shot machine learning technique reduces the vast geometric landscape of string theory vacua to a tiny cluster of representatives.
The generation of large graphs with a controllable number of short loops paves the way for building more realistic random networks.
Rapid temperature cycling from one extreme to another affects the rate at which the mean particle size in solid or liquid solutions changes.
Recursively divisible numbers are a new kind of number that are highly divisible, whose quotients are highly divisible, and so on.
The number of particles in a higher derivative theory of gravity relates to its effective mass scale, which signals the theory’s viability.