Structure from noise: Learning networks through errors in cognition

We seek to understand recent results in cognitive science and neuroscience suggesting that humans are adept at uncovering abstract associations in the world around them, whether between words in spoken and written language, notes in music, or concepts in classroom lectures. Researchers in statistical learning have hypothesized that inferring the higher-order structure between concepts or stimuli requires complicated mental processes, such as Bayesian inference or hierarchical learning algorithms. Here we propose a competing perspective: rather than being the result of complex inference algorithms, higher-order associations arise naturally from errors in learning and memory.

Combining ideas from information theory and reinforcement learning, we construct a novel maximum entropy model of people’s internal representations of the transitions between concepts or stimuli. The model itself derives from the free energy principle in statistical mechanics, which has recently become recognized by computational neuroscientists as a useful tool for formalizing abstract assumptions about the function of the brain. Importantly, our model (i) affords a concise analytic form, thereby aiding intuition; (ii) qualitatively explains the effects of transition network structure on human expectations; and (iii) quantitatively predicts human reaction times in probabilistic sequential motor tasks. Together, these results suggest that mental errors influence our abstract representations of the world in significant and predictable ways and have direct implications for the study and design of optimally learnable sources of information.
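The flavor of the model can be conveyed with a minimal sketch (our own simplified illustration, not the paper's exact formulation): suppose a learner's memory of how many steps back each stimulus occurred decays exponentially with rate η. Their internal estimate of the transition structure then becomes a weighted sum of powers of the true transition matrix A, with the closed form Â = (1 − η) A (I − ηA)⁻¹, so that mental errors "blur" direct transitions into higher-order associations.

```python
import numpy as np

def learned_transitions(A, eta):
    """Estimated transition matrix for a learner whose memory of how far
    back each stimulus occurred decays exponentially with rate eta:

        A_hat = (1 - eta) * sum_{k>=0} eta^k A^(k+1)
              = (1 - eta) * A @ inv(I - eta * A)

    For eta = 0 (perfect memory) this recovers the true matrix A.
    """
    n = A.shape[0]
    return (1 - eta) * A @ np.linalg.inv(np.eye(n) - eta * A)

# Toy example: a 4-state ring; true transitions link only neighbors.
A = np.array([[0.0, 0.5, 0.0, 0.5],
              [0.5, 0.0, 0.5, 0.0],
              [0.0, 0.5, 0.0, 0.5],
              [0.5, 0.0, 0.5, 0.0]])

A_hat = learned_transitions(A, eta=0.3)
# States two steps apart (e.g. 0 and 2) have zero true transition
# probability but nonzero expected probability in A_hat.
```

Because A is row-stochastic, Â remains a valid transition matrix (rows still sum to one), while probability mass leaks onto multi-step neighbors in a way that reflects the higher-order network structure.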

References:

Christopher W. Lynn, Ari E. Kahn, and Danielle S. Bassett. Structure from noise: Mental errors yield abstract representations of events. In Revision, Nature Physics. (arxiv.org/abs/1805.12491)


Collective human activity emerges from pairwise correlations

Modern life is rife with examples of correlated human behavior, from spikes in online activity to traffic jams on the highway. But where do these surges in human activity come from? Existing explanations have focused primarily on context-specific mechanisms, such as daily and weekly routines giving rise to traffic jams and natural disasters sparking increases in demand for emergency services. Here, we propose and test a more general explanation: that surges in human activity emerge organically from fine-scale correlations between pairs of individuals within a population.

To investigate the flow of information from the scale of individual humans to an entire population, we adapt and extend ideas from information theory and statistical mechanics. The result is a maximum entropy model that provides accurate quantitative predictions of large-scale human behaviors based only upon the simple pairwise correlations between individuals. The accuracy of the maximum entropy model, in turn, suggests that patterns of activity across an entire human population can be understood as emerging from an aggregation of simple pairwise correlations, fundamentally shifting the way that we perceive many collective human behaviors.
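To illustrate the general idea (a toy sketch with made-up parameters, not the paper's fitted model), a pairwise maximum entropy distribution over the binary activity of a small group can be computed exactly by enumeration. Even modest positive pairwise couplings assign far more probability to large, coordinated surges of activity than independent individuals would:

```python
import itertools
import numpy as np

def pairwise_maxent(h, J):
    """Exact pairwise maximum entropy distribution over binary activity
    states x in {0,1}^N:

        P(x) ∝ exp( sum_i h_i x_i + sum_{i<j} J_ij x_i x_j )

    Enumerates all 2^N states, so this is feasible only for small N.
    J must be symmetric with zero diagonal; the 0.5 factor corrects
    for double counting of the pairs.
    """
    N = len(h)
    states = np.array(list(itertools.product([0, 1], repeat=N)))
    energies = states @ h + 0.5 * np.einsum('si,ij,sj->s', states, J, states)
    p = np.exp(energies)
    return states, p / p.sum()

# Hypothetical population of 5 individuals: each is rarely active on
# their own (negative bias h) but weakly positively coupled to others.
N = 5
h = -1.0 * np.ones(N)
J = 0.8 * (np.ones((N, N)) - np.eye(N))

states, p = pairwise_maxent(h, J)
k = states.sum(axis=1)        # total activity in each state
p_surge = p[k == N].sum()     # probability that everyone is active at once
```

Comparing `p_surge` against the same model with couplings switched off (`J = 0`) shows the surge probability rising by orders of magnitude, purely from an aggregation of pairwise correlations.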

By harkening back to centuries-old intuitions from statistical physics, these results open the door for new predictive models of human activity on large scales, with implications for resource allocation in communication and transportation networks, understanding social organization, and preventing viral epidemics.

References:

Christopher W. Lynn, Lia Papadopoulos, Daniel D. Lee, and Danielle S. Bassett. Surges of collective human activity emerge from simple pairwise correlations. Physical Review X 9, 011022 (2019).


Control of Ising networks

The last 10 years have witnessed a dramatic increase in the use of maximum entropy models to describe a diverse range of real-world systems, including networks of neurons in the brain, flocks of birds in flight, and interactions between proteins. Broadly speaking, the maximum entropy principle allows researchers to formalize the hypothesis that large-scale patterns in complex systems emerge organically from an aggregation of simple fine-scale interactions between individual elements. Given the wealth of real-world systems that are quantitatively described by maximum entropy models, it is of fundamental practical and scientific interest to understand how external influence affects the dynamics of these systems. Fortunately, all maximum entropy models are similar, if not formally equivalent, to the Ising model, a foundational model in statistical physics.

As a first step toward understanding how to control such complex systems, we study the problem of maximizing the total activity of an Ising network given a budget of external influence. To solve this problem, which we call Ising influence maximization, we develop a gradient ascent algorithm. Interestingly, the gradient in our algorithm is equivalent to the susceptibility of the system in statistical mechanics and, in theory, can be calculated using the foundational fluctuation-dissipation theorem. From a practical perspective, however, exact calculations in the Ising model require an amount of time that is exponential in the size of the system, making a direct computation of the susceptibility infeasible in most real-world settings. To address this problem, we propose a sequence of approximations to the susceptibility, based on the Plefka expansion, that achieve progressively more accurate solutions to the Ising influence maximization problem. Moreover, we find that the structure of optimal stimulation undergoes a transition: influence focuses on central hubs in the Ising network at high temperatures and shifts to peripheral nodes at low temperatures. This final result suggests that optimal control strategies for complex systems depend critically on the strength of random fluctuations in the interactions between elements.
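The skeleton of the approach can be sketched as follows (a simplified illustration using the naive mean-field approximation, the lowest order of the Plefka expansion, with hypothetical parameters; the papers develop higher-order and exact treatments). We iterate the mean-field magnetization equations, compute the mean-field susceptibility as the gradient of total activity with respect to the control field, and run projected gradient ascent under the budget constraint:

```python
import numpy as np

def mean_field_magnetizations(J, h, beta, iters=500, damping=0.5):
    """Damped fixed-point iteration of the naive mean-field equations
    m_i = tanh(beta * (h_i + sum_j J_ij m_j))."""
    m = np.zeros(len(h))
    for _ in range(iters):
        m = damping * m + (1 - damping) * np.tanh(beta * (h + J @ m))
    return m

def total_activity_gradient(J, m, beta):
    """Mean-field susceptibility chi = (I - beta*D*J)^{-1} beta*D with
    D = diag(1 - m^2); the gradient of the total magnetization with
    respect to the field on node k is the k-th column sum of chi."""
    n = len(m)
    D = np.diag(1.0 - m**2)
    chi = np.linalg.solve(np.eye(n) - beta * D @ J, beta * D)
    return chi.sum(axis=0)

def project_to_budget(v, budget):
    """Euclidean projection onto {u : u >= 0, sum(u) = budget}."""
    u = np.sort(v)[::-1]
    css = np.cumsum(u)
    idx = np.arange(1, len(v) + 1)
    rho = np.nonzero(u * idx > (css - budget))[0][-1]
    theta = (css[rho] - budget) / (rho + 1)
    return np.maximum(v - theta, 0.0)

def ising_influence_maximization(J, h, beta, budget, steps=200, lr=0.2):
    """Projected gradient ascent on the external control field u,
    starting from a uniform allocation of the budget."""
    u = np.full(len(h), budget / len(h))
    for _ in range(steps):
        m = mean_field_magnetizations(J, h + u, beta)
        grad = total_activity_gradient(J, m, beta)
        u = project_to_budget(u + lr * grad, budget)
    return u

# Toy example: a 5-node star network (node 0 is the hub) at a
# relatively high temperature (small beta), no intrinsic fields.
n = 5
J = np.zeros((n, n))
J[0, 1:] = J[1:, 0] = 1.0
u_opt = ising_influence_maximization(J, h=np.zeros(n), beta=0.3, budget=1.0)
```

The returned allocation stays on the budget simplex by construction, and the optimized field yields at least as much total mean-field activity as spreading the budget uniformly.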

References:

Christopher W. Lynn and Daniel D. Lee. Maximizing Activity in Ising Networks via the TAP Approximation. Association for the Advancement of Artificial Intelligence (2018). (arxiv.org/abs/1803.00110)

Christopher W. Lynn and Daniel D. Lee. Statistical Mechanics of Influence Maximization with Thermal Noise. EPL (Europhysics Letters) 117.6: 66001 (2017).

Christopher W. Lynn and Daniel D. Lee. Maximizing Influence in an Ising Network: A Mean-Field Optimal Solution. Advances in Neural Information Processing Systems (2016). (arxiv.org/abs/1608.06850)


Simulating channeling radiation at Fermilab

I spent the summer of 2013 at Fermilab working with Tanaji Sen. Together, we developed simulation software to predict the results of electron-channeling experiments at the LINAC at Fermilab (previously named ASTA). In the experiments, a tight beam of electrons is fired parallel to the carbon planes of a diamond. Once the electrons enter the diamond, they become trapped in the inter-planar potential and hop between the induced quantum excited states. When electrons hop from high- to low-energy states, they release radiation that is Lorentz-boosted into the X-ray spectrum in the direction of the electron beam. Given the structure of the beam and the diamond crystal, our simulations were able to accurately predict the spectrum of the resulting channeling radiation from first principles.

References:

Tanaji Sen and Christopher Lynn. Spectral Brilliance of Channeling Radiation at the ASTA Photoinjector. International Journal of Modern Physics A 29.30: 1450179 (2014).

Ben Blomberg, Daniel Mihalcea, Harsha Panuganti, Philippe Piot, Charles Brau, Bo Choi, William Gabella, Borislav Ivanov, Marcus Mendenhall, Christopher Lynn, Tanaji Sen, Wolfgang Wagner. Planned High-brightness Channeling Radiation Experiment at Fermilab’s Advanced Superconducting Test Accelerator. International Particle Accelerator Conference (IPAC). 2014.
