AUTHOR=Krakauer David C. TITLE=Unifying complexity science and machine learning JOURNAL=Frontiers in Complex Systems VOLUME=1 YEAR=2023 URL=https://www.frontiersin.org/journals/complex-systems/articles/10.3389/fcpxs.2023.1235202 DOI=10.3389/fcpxs.2023.1235202 ISSN=2813-6187 ABSTRACT=
Complexity science and machine learning are two complementary approaches to discovering and encoding regularities in irreducibly high-dimensional phenomena. Whereas complexity science represents a coarse-grained paradigm of understanding, machine learning is a fine-grained paradigm of prediction. Both approaches seek to solve the “Wigner-Reversal,” or the unreasonable ineffectiveness of mathematics in the adaptive domain, where broken symmetries and broken ergodicity dominate. In order to integrate these paradigms, I introduce the idea of “Meta-Ockham,” which 1) moves minimality from the description of a model for a phenomenon to a description of a process for generating a model, and 2) describes low-dimensional features (schema) in these models. Reinforcement learning and natural selection are both parsimonious in this revised sense: minimal processes that parameterize arbitrarily high-dimensional inductive models containing latent, low-dimensional regularities. I describe these models as “super-Humean” and discuss the scientific value of analyzing their latent dimensions as encoding functional schema.