Our papers are the official record of our discoveries. They allow others to build on and apply our work. Each paper is the result of many months of research, so we make a special effort to make them clear, beautiful and inspirational, and publish them in leading journals.
Naturally occurring networks have an underlying scale-free structure that is often clouded by finite-size effects in the sample data.
Recursively divisible numbers are a new kind of number that are highly divisible, whose quotients are highly divisible, and so on.
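As a hedged illustration (the precise definition is the paper's; the recursive count below is an assumption), one natural formalisation counts divisors recursively: set D(1) = 1 and D(n) = 1 + the sum of D(d) over the proper divisors d of n, and let record-setters of D play the role of recursively divisible numbers:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def recursive_divisor_count(n):
    # D(1) = 1; D(n) = 1 + sum of D(d) over proper divisors d of n
    return 1 + sum(recursive_divisor_count(d)
                   for d in range(1, n) if n % d == 0)

# record-setters: n whose count exceeds that of every smaller number
records, best = [], 0
for n in range(1, 200):
    c = recursive_divisor_count(n)
    if c > best:
        best = c
        records.append(n)
```

Run on the first 200 integers, the record-setters begin 1, 2, 4, 6, 8, 12.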
Complex network theory unlocks systematic understanding of financial stability and climate finance in pursuit of a more sustainable society.
The generation of large graphs with a controllable number of short loops paves the way for building more realistic random networks.
Rapid temperature cycling from one extreme to another affects the rate at which the mean particle size in solid or liquid solutions changes.
Exact methods supersede approximations used in high-dimensional linear regression to find correlations in statistical physics problems.
The ability of deep neural networks to generalize can be unraveled using path integral methods to compute their typical Boolean functions.
Parallels between the perfect and abundant numbers and their recursive analogs point to deeper structure in the recursive divisor function.
Statistical methods that normally fail for very high-dimensional data can be rescued via mathematical tools from statistical physics.
Consistent valuation of interbank claims within an interconnected financial system can be found with a recursive update of banks' equities.
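The flavour of such a recursive valuation can be sketched as a toy fixed-point iteration. Everything here is a hypothetical scheme for illustration — the function name, the all-or-nothing payment rule, and the two-bank example are assumptions, not the paper's model:

```python
def fixed_point_equities(external, liabilities, rounds=100):
    """Toy fixed-point valuation: bank i's equity = external assets
    + interbank claims (counted only if the debtor is solvent)
    - liabilities owed.  liabilities[i][j] = amount bank i owes bank j."""
    n = len(external)
    equity = list(external)
    for _ in range(rounds):
        new = []
        for i in range(n):
            owed_to_i = sum(liabilities[j][i] * (1.0 if equity[j] >= 0 else 0.0)
                            for j in range(n))
            owed_by_i = sum(liabilities[i][j] for j in range(n))
            new.append(external[i] + owed_to_i - owed_by_i)
        if new == equity:  # fixed point reached
            break
        equity = new
    return equity

# toy example: bank 1 owes bank 0 one unit but is insolvent,
# so bank 0 cannot count that claim
eq = fixed_point_equities([2.0, 0.1], [[0.0, 0.0], [1.0, 0.0]])
```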
Insights from biology, physics and business shed light on the nature and costs of complexity and how to manage it in business organizations.
We optimize Bayesian data clustering by mapping the problem to the statistical physics of a gas and calculating the lowest entropy state.
A theoretical model of recursive innovation suggests that new technologies are recursively built up from new combinations of existing ones.
A mathematical model captures the temporal and steady state behaviour of networks whose two sets of nodes either generate or destroy links.
A phase transition creates the geometry of the continuum from discrete space, but it needs disorder if it is to have the right metric.
Machine learning techniques enhance the efficiency of energy harvesters by implementing reversible energy-conserving operations.
The number of particles in a higher derivative theory of gravity relates to its effective mass scale, which signals the theory’s viability.
Modern portfolio theory inspires a strategy for allocating renewable energy sources which minimises the impact of production fluctuations.
A simple solvable model of memristive networks suggests a correspondence between the asymptotic states of memristors and the Ising model.
Statistical physics harnesses links between maximum entropy and information theory to capture null model and real-world network features.
The distribution of product complexity helps explain why some technology sectors tend to exhibit faster innovation rates than others.
An explicit recipe for defining the Hamiltonian in general probabilistic theories, which have the potential to generalise quantum theory.
The distributions of size and shape of a material’s grains can be constructed from a 2D slice of the material and electron diffraction data.
Exact solutions for the dynamics of interacting memristors predict whether they relax to higher or lower resistance states given random initialisations.
Network users who have access to the network’s most informative node, as quantified by a novel index, the InfoRank, have a competitive edge.
One-shot analogs of fluctuation-theorem results help unify these two approaches for small-scale, nonequilibrium statistical physics.
Hamming balls, subgraphs of the hypercube, maximise the graph’s largest eigenvalue exactly when the dimension of the cube is large enough.
A novel approach to volunteer clouds outperforms traditional distributed task scheduling algorithms in the presence of intensive workloads.
Bipartite networks model the structures of ecological and economic real-world systems, enabling hypothesis testing and crisis forecasting.
Quantifying the forecast errors of simple experience curve models facilitates more reliable estimates of the costs of technology deployment.
An iterative version of a method to identify hierarchies and rankings of nodes in directed networks can partly overcome its resolution limit.
The large-scale structure of the interbank network changes drastically in times of crisis due to the effect of measures from central banks.
An explicit analytical solution reproduces the main features of random graph ensembles with many short cycles under strict degree constraints.
The usefulness of components and the complexity of products inform the best strategy for innovation at different stages of the process.
In systems of innovation, the relative usefulness of different components changes as the number of components we possess increases.
Complex networks model the links between financial institutions and how these channels can transition from diversifying to propagating risk.
Bayesian networks describe the evolution of orthodontic features in patients receiving treatment versus no treatment for malocclusion.
Theoretical searches propose 2D borane as a new graphene-like material which is stable and semi-metallic with Dirac cone structure.
We generalise neural networks into a quantum framework, demonstrating the possibility of quantum auto-encoders and teleportation.
Statistical mechanics concepts reconstruct connections between financial institutions and the stock market, despite limited data disclosure.
A new algorithm unveils complicated structures in the bipartite mapping between countries and products of the international trade network.
Spectroscopy experiments show that energy shifts due to photon emission from individual molecules satisfy a fundamental quantum relation.
When people operate in echo chambers, they focus on information adhering to their system of beliefs. Debunking such beliefs is harder than it seems.
Moment-based methods provide a simple way to describe a population of spherical particles and extract 3D information from 2D measurements.
The spectral density of graph ensembles provides an exact solution to the graph partitioning problem and helps detect community structure.
Memristive networks preserve memory and have the ability to learn, according to analysis of the network's internal memory dynamics.
A new equality which depends on the maximum entropy describes the worst-case amount of work done by finite-dimensional quantum systems.
Firms can harness the shifting importance of component building blocks to build better products and services and hence increase their chances of sustained success.
Exact equations of motion provide an analytical description of the evolution and relaxation properties of complex memristive circuits.
Processes believed to stabilize financial markets can drive them towards instability by creating cyclical structures that amplify distress.
Inference from single snapshots of temporal networks can misleadingly group communities if the links between snapshots are correlated.
Compact heat exchangers can be designed to run at low power if the exchange is concentrated in a crumpled surface fed by a fractal network.
Non-linear models of distress propagation in financial networks characterise key regimes where shocks are either amplified or suppressed.
Targeted immunisation policies limit distress propagation and prevent system-wide crises in financial networks according to sandpile models.
An extension of the Kelly criterion maximises the growth rate of multiplicative stochastic processes when limited resources are available.
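For context, a minimal sketch of the classical Kelly criterion that such extensions build on (the resource-limited extension itself is not reproduced here); `kelly_fraction` and `growth_rate` are illustrative names:

```python
import math

def kelly_fraction(p, b):
    """Classical Kelly stake for a bet won with probability p at net odds b."""
    return p - (1 - p) / b

def growth_rate(f, p, b):
    """Expected log growth per bet when staking fraction f of wealth."""
    return p * math.log(1 + f * b) + (1 - p) * math.log(1 - f)

# a 60/40 coin at even odds: the optimal stake is 20% of wealth
f_star = kelly_fraction(0.6, 1.0)
```

The Kelly fraction is the stake maximising the expected log growth rate, which is what the multiplicative setting rewards.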
In quantum tunnelling, a particle tunnels through a barrier that it classically could not surmount.
Increasing the complexity of the network of contracts between financial institutions decreases the accuracy of estimating systemic risk.
The structural properties of a network motif predict its functional versatility and relate to gene regulatory networks.
Coupled distribution grids are more vulnerable to a cascading systemic failure but they have larger safe regions within their networks.
An adaptive network of oscillators in fragmented and incoherent states can re-organise itself into connected and synchronized states.
The community matrix of a complex ecosystem captures the population dynamics of interacting species and transitions to unstable abundances.
Percolation theory shows that the formation of giant clusters of neurons relies on a few parameters that could be measured experimentally.
A formulation of Moore’s law estimates the probability that a given technology will outperform another at a certain point in the future.
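A hedged sketch of the kind of calculation involved: if log-costs are modelled as independent random walks with drift (an assumption for illustration, not the paper's exact formulation), the chance that one technology is cheaper than another after t steps reduces to a Gaussian tail probability:

```python
import math

def prob_outperform(mu_a, mu_b, sigma_a, sigma_b, t):
    """Probability technology A is cheaper than B after t steps, assuming
    equal initial costs and log-costs falling as independent random walks
    with improvement rates mu and volatilities sigma (illustrative model)."""
    mean = (mu_a - mu_b) * t                       # expected log-cost gap
    sd = math.sqrt((sigma_a**2 + sigma_b**2) * t)  # gap standard deviation
    z = mean / sd
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))  # Gaussian CDF at z
```

With equal improvement rates the probability is exactly one half; a faster-improving technology pulls it above a half, more slowly as volatility grows.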
In an infinitely bouncing Universe, the scalar field driving the cosmological expansion and contraction carries information between phases.
The principal eigenvalue of small neutral networks determines their robustness, and is bounded by the logarithm of the number of vertices.
With inspiration from Maxwell’s classic thought experiment, it is possible to extract macroscopic work from microscopic measurements of photons.
News sentiment analysis and web browsing data are unilluminating alone but, inspected together, predict fluctuations in stock prices.
A subset of bootstrap percolation models, which stabilise systems of cells on infinite lattices, exhibits non-trivial phase transitions.
A new tool derived from information theory quantitatively identifies trees, hierarchies and community structures within complex networks.
When the number of tweets about an event peaks, the sentiment of those tweets correlates strongly with abnormal stock market returns.
Analysis of the hyperbolicity of real-world networks distinguishes between those which are aristocratic and those which are democratic.
Properties of protein interaction networks test the reliability of data and hint at the underlying mechanism with which proteins recruit each other.
A random analogue of the Erdős-Ko-Rado theorem sheds light on its stability in an area of parameter space which has not yet been explored.
Tweet volume is a good indicator of political parties' success in elections when considered over an optimal time window so as to minimise noise.
Single-shot information theory inspires a new formulation of statistical mechanics which measures the optimal guaranteed work of a system.
Exact equations for the thermodynamic quantities of lattices made of d-dimensional hypercubes are obtainable with the Bethe-Peierls approach.
A dynamical microscopic theory of instability for financial networks reformulates the DebtRank algorithm in terms of basic accounting principles.
The Yule-Simon distribution describes the diffusion of knowledge and ideas in a social network which in turn influences economic growth.
The stable structures of calcium and magnesium carbonate at high pressures are crucial for understanding the Earth's deep carbon cycle.
The speed of a financial crisis outbreak sets the maximum delay before intervention by central authorities is no longer effective.
A local model of preferential attachment with short-term memory generates scale-free networks, which can be readily computed by memristors.
Dynamical systems theory predicts the growth potential of countries with heterogeneous patterns of evolution where regression methods fail.
A simple formula gives the maximum time for an n x n grid to become entirely infected having undergone a bootstrap percolation process.
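The process is straightforward to simulate; a minimal sketch of 2-neighbour bootstrap percolation on an n × n grid (the closed-form formula for the maximum time is the paper's result and is not reproduced here):

```python
def bootstrap_time(n, infected):
    """2-neighbour bootstrap percolation on an n x n grid: a healthy cell
    becomes infected when at least 2 of its 4 neighbours are infected.
    Returns (rounds until the process stops, whether the grid fills up)."""
    inf = set(infected)
    t = 0
    while True:
        new = set()
        for i in range(n):
            for j in range(n):
                if (i, j) in inf:
                    continue
                nbrs = sum(((i + di, j + dj) in inf)
                           for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)))
                if nbrs >= 2:
                    new.add((i, j))
        if not new:
            return t, len(inf) == n * n
        inf |= new
        t += 1

# the diagonal infects the whole 4 x 4 grid
t, full = bootstrap_time(4, [(i, i) for i in range(4)])
```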
Less developed countries have to learn simple capabilities in order to start a stable industrialization and development process.
The analysis of real networks which contain many short loops requires novel methods, because they break the assumptions of tree-like models.
A quick and simple way to evaluate the packing fraction of polydisperse spheres, which is a measure of how they crowd around each other.
Time series data from networks of credit default swaps display no early warnings of financial crises without additional macroeconomic indicators.
Explicit formulae for the Shannon entropies of random graph ensembles provide measures to compare and reproduce their topological features.
When networks come under attack, a repairable architecture is superior to, and globally distinct from, an architecture that is robust.
A review of the achievements concerning typical bipartite entanglement for random quantum states involving a large number of particles.
Generating random structures in the vicinity of a material’s defect predicts the low and high energy atomic structure at the grain boundary.
The critical probability for bootstrap percolation, a process which mimics the spread of an infection in a graph, is bounded for Galton-Watson trees.
The likelihood of stock prices bouncing on specific values increases due to memory effects in the time series data of the price dynamics.
The interplay between redundancies and smart reconfiguration protocols can improve the resilience of networked infrastructures to failures.
Fractal structures need very little mass to support a load; but for current designs, this makes them vulnerable to manufacturing errors.
The optimal architecture of a financial system is only dependent on its topology when the market is illiquid, and no topology is always superior.
The immune system must simultaneously recall multiple defense strategies because many antigens can attack the host at the same time.
Lognormal distributions, and mixtures of them, are a useful model for the size distribution in emulsions and sediments.
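A minimal sketch of fitting such a model by the method of moments in log space; `fit_lognormal` is an illustrative helper, not from the paper:

```python
import math
import random

def fit_lognormal(sizes):
    """Method-of-moments fit in log space: returns (mu, sigma),
    the mean and standard deviation of ln(size)."""
    logs = [math.log(s) for s in sizes]
    mu = sum(logs) / len(logs)
    var = sum((x - mu) ** 2 for x in logs) / len(logs)
    return mu, math.sqrt(var)

# sanity check on a synthetic sample with known parameters
random.seed(0)
sample = [random.lognormvariate(1.0, 0.5) for _ in range(100_000)]
mu, sigma = fit_lognormal(sample)
```

On a large synthetic sample the recovered (mu, sigma) land close to the generating values, here (1.0, 0.5).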
We introduce a novel method to define a self-consistent and non-monetary metric for the competitiveness of countries.
We present a framework to define a data-driven metric to assess the level of competitiveness of a country.
We define a generalized ensemble of graphs by introducing the concept of graph temperature.
Information theory fixes weighted networks’ degeneracy issues with a generalisation of binary graphs and an optimal scale of link intensities.
An intriguing analogy exists between neural networks and immune networks.
The Eiffel Tower was never intended to be a permanent feature of the Parisian landscape.
The most efficient load-bearing fractals are designed as big structures under gentle loads, a situation common in aerospace applications.
Complex networks detect the driver institutions of an interbank market and ascertain that intervention policies should be time-scale dependent.
New mathematical tools can help infer financial networks from partial data to understand the propagation of distress through the network.
Network-based metrics to assess systemic risk and the importance of financial institutions can help tame the financial derivatives market.
Vulnerability and systemicity do not depend only on GDP but also on the complex network of financial relations.
A statistical procedure identifies dominant edges within weighted networks to determine whether a network has reached its steady state.
A material’s architecture can be controlled over an ever increasing set of length scales.
Network theory finds unexpected interactions between the number of products a country produces and the number of countries producing each product.
A snapshot of the bipartite country-product network for the most important countries, with vertex sizes indicating fitness and complexity.
Network analysis of diagnostic data identifies combinations of the key factors which cause Class III malocclusion and how they evolve over time.
Unbiased randomisation processes generate sophisticated synthetic networks for modelling and testing the properties of real-world networks.
Spectral analysis shows that disassortative networks exhibit a higher epidemiological threshold and are therefore easier to immunize.
Edge multiplicity—the number of triangles attached to edges—is a powerful analytic tool to understand and generalize network properties.
It is vital that we understand in detail how the topological characteristics of a real network relate to those of a finite random network.
Analysis of the linear elastic behaviour of plant cell dispersions improves our understanding of how to stabilise and texturise food products.
A transfer operator formalism solves the macroscopic dynamics of disordered Ising chain systems which are relevant for ageing phenomena.
Our Monte Carlo model sheds light on Ostwald ripening, namely the growth of big particles at the expense of small ones.
New mathematical tools quantify the topological structure of large directed networks which describe how genes interact within a cell.
The information needed to self-assemble a structure quantifies its modularity and explains the prevalence of certain structures over others.
Techniques from random sphere packing predict the dimension of the Apollonian gasket, a fractal made up of non-overlapping hyperspheres.
The topological structure of tie knots categorises them by shape, size and aesthetic appeal and defines the sequence of knots to produce them.
Of the 256 elementary cellular automata, 28 of them exhibit random behavior over time, but spatio-temporal currents still lurk underneath.
In single-elimination competition, the best indicator of success is a player's wealth: the accumulated wealth of all defeated players.