The space of functions computed by deep layered machines

The ability of deep neural networks to generalize can be unraveled using path-integral methods to compute their typical Boolean functions.

Submitted to Physical Review Letters (2020)

A. Mozeika, B. Li, D. Saad


We study the space of Boolean functions computed by random layered machines, including deep neural networks and Boolean circuits. Investigating recurrent and layered feed-forward architectures, we find that both realize the same space of functions. We show that, depending on the initial conditions and the computing elements used, the entropy of Boolean functions computed by deep layered machines either increases or decreases monotonically with depth, and we characterize the space of functions computed in the large-depth limit.


Network valuation in financial systems

P. Barucca, M. Bardoscia, F. Caccioli, M. D’Errico, G. Visentin, G. Caldarelli, S. Battiston

Mathematical Finance


Replica analysis of overfitting in generalized linear models

T. Coolen, M. Sheikh, A. Mozeika, F. Aguirre-Lopez, F. Antenucci

Submitted to Journal of Physics A


Taming complexity

M. Reeves, S. Levin, T. Fink, A. Levina

Harvard Business Review


Replica analysis of Bayesian data clustering

A. Mozeika, T. Coolen

Journal of Physics A


Degree-correlations in a bursting dynamic network model

F. Vanni, P. Barucca

Journal of Economic Interaction and Coordination
