The space of functions computed by deep layered machines

The ability of deep neural networks to generalize can be unraveled using path-integral methods to compute their typical Boolean functions.

A. Mozeika, B. Li, D. Saad

Submitted to Physical Review Letters (2020)

We study the space of Boolean functions computed by random layered machines, including deep neural networks and Boolean circuits. Investigating recurrent and layered feed-forward architectures, we find that both realize the same space of functions. We show that, depending on the initial conditions and computing elements used, the entropy of the Boolean functions computed by deep layered machines either increases or decreases monotonically with growing depth, and we characterize the space of functions computed in the limit of large depth.
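
A quick way to get a feel for the quantity in this abstract is to sample random layered machines directly. The sketch below (plain Python/NumPy, a minimal illustration rather than the paper's path-integral analysis) draws random networks of sign-perceptron units at several depths and estimates the entropy of the empirical distribution over the Boolean functions they compute on three inputs; the width, Gaussian weights, sample size, and depth schedule are all illustrative assumptions, not taken from the paper.

    import itertools
    from collections import Counter
    import numpy as np

    def random_layered_machine(n_in, width, depth, rng):
        # Gaussian weights: input -> hidden, (depth - 1) hidden -> hidden, hidden -> output.
        shapes = [(width, n_in)] + [(width, width)] * (depth - 1) + [(1, width)]
        return [rng.standard_normal(s) for s in shapes]

    def truth_table(weights, inputs):
        # The Boolean function computed on all 2^n inputs, as a tuple of +/-1 values.
        x = inputs.T
        for W in weights:
            x = np.sign(W @ x)
            x[x == 0] = 1  # break sign ties deterministically
        return tuple(x.ravel().astype(int))

    n_in, width, samples = 3, 8, 5000
    rng = np.random.default_rng(0)
    inputs = np.array(list(itertools.product([-1.0, 1.0], repeat=n_in)))

    for depth in (1, 2, 4, 8, 16):
        counts = Counter(
            truth_table(random_layered_machine(n_in, width, depth, rng), inputs)
            for _ in range(samples)
        )
        p = np.array(list(counts.values())) / samples
        print(f"depth {depth:2d}: {len(counts):3d} distinct functions, "
              f"entropy ~ {-(p * np.log2(p)).sum():.2f} bits")

The entropy estimate is bounded by the logarithm of the sample size, so the printed values indicate only the qualitative trend with depth.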

Replica analysis of overfitting in generalized linear models

T. Coolen, M. Sheikh, A. Mozeika, F. Aguirre-Lopez, F. Antenucci

Submitted to Journal of Physics A

Taming complexity

M. Reeves, S. Levin, T. Fink, A. Levina

Harvard Business Review

Replica analysis of Bayesian data clustering

A. Mozeika, T. Coolen

Journal of Physics A

Degree-correlations in a bursting dynamic network model

F. Vanni, P. Barucca

Journal of Economic Interaction and Coordination

Scale of non-locality for a system of n particles

S. Talaganis, I. Teimouri

Submitted to Physical Review D
