Abstract: |
Neural networks are nowadays both powerful operational tools (e.g., for pattern recognition, data mining, error-correction codes) and complex theoretical models at the focus of scientific investigation. As a research subject, neural networks are handled and studied by psychologists, neurobiologists, engineers, mathematicians and theoretical physicists. In theoretical physics, in particular, the key instrument for the quantitative analysis of neural networks is statistical mechanics. From this perspective, here we review attractor networks: starting from ferromagnets and spin-glass models, we discuss the underlying philosophy and we retrace the path paved by Hopfield and by Amit, Gutfreund and Sompolinsky. As a sideline of this walk, we derive an alternative way (with respect to the original Hebb proposal) to recover the Hebbian paradigm, stemming from the mixing of ferromagnets with spin glasses. Further, as these notes are intended for an engineering audience, we also highlight the mappings between ferromagnets and operational amplifiers, hoping that such a bridge serves as a concrete prescription for capturing the beauty of robotics from the statistical-mechanics perspective.