The space of functions computed by deep layered machines
The ability of deep neural networks to generalize can be unraveled using path integral methods to compute their typical Boolean functions.
Submitted to Physical Review Letters (2020)
A. Mozeika, B. Li, D. Saad
We study the space of Boolean functions computed by random layered machines, including deep neural networks and Boolean circuits. Investigating recurrent and layered feed-forward architectures, we find that the spaces of functions realized by the two architectures are identical. We show that, depending on the initial conditions and computing elements used, the entropy of Boolean functions computed by deep layered machines is either monotonically increasing or decreasing with growing depth, and we characterize the space of functions computed in the large-depth limit.
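The setting above can be illustrated with a minimal toy sketch (not from the paper): sample random layered feed-forward Boolean circuits of a given depth, read off the Boolean function each one computes as a truth table, and estimate the Shannon entropy of the empirical distribution of functions. The gate model (random 2-input gates wired to random nodes in the previous layer), the layer width, and the sampling-based entropy estimator are all illustrative assumptions, not the paper's construction.

```python
import itertools
import math
import random
from collections import Counter


def random_layered_machine(n_inputs, width, depth, rng):
    """Sample a random layered feed-forward Boolean circuit.

    Each gate is a random 2-input Boolean function (a 4-bit truth
    table) wired to two random nodes in the previous layer.
    NOTE: this gate model is an illustrative assumption.
    """
    layers, prev = [], n_inputs
    for _ in range(depth):
        layers.append([
            ([rng.randint(0, 1) for _ in range(4)],   # gate truth table
             rng.randrange(prev), rng.randrange(prev))  # input wiring
            for _ in range(width)
        ])
        prev = width
    return layers


def evaluate(machine, bits):
    """Propagate an input assignment layer by layer; read the first output node."""
    vals = list(bits)
    for layer in machine:
        vals = [tt[2 * vals[a] + vals[b]] for tt, a, b in layer]
    return vals[0]


def truth_table(machine, n_inputs):
    """The Boolean function computed by the machine, as a 2^n-entry tuple."""
    return tuple(evaluate(machine, bits)
                 for bits in itertools.product((0, 1), repeat=n_inputs))


def function_entropy(n_inputs, width, depth, samples, seed=0):
    """Shannon entropy (bits) of the empirical distribution of computed functions."""
    rng = random.Random(seed)
    counts = Counter(
        truth_table(random_layered_machine(n_inputs, width, depth, rng), n_inputs)
        for _ in range(samples)
    )
    total = sum(counts.values())
    return -sum(c / total * math.log2(c / total) for c in counts.values())


if __name__ == "__main__":
    # How the entropy of computed functions varies with depth (toy scale).
    for depth in (1, 2, 4, 8):
        print(depth, round(function_entropy(3, 8, depth, 2000), 3))
```

At this toy scale the estimate is crude (it is capped by log2 of the sample count), but it makes the object of study concrete: a distribution over Boolean functions induced by random circuits, whose entropy can be tracked as depth grows.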