
Deconstructing Hopfield Networks from Scratch

An exploration of the mathematics of associative memory, energy landscapes, and the deep connection to Transformer attention — with everything built from scratch in pure PyTorch.

Part 1

The Math of Associative Memory

Breaking down energy functions, Hebbian learning, storage capacity limits, and the connection between Hopfield networks and modern attention.
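As a quick preview of the math in Part 1, the standard classical Hopfield energy function and Hebbian storage rule for \(P\) bipolar patterns \(\xi^{\mu} \in \{-1, +1\}^{N}\) are:

```latex
E(\mathbf{s}) = -\tfrac{1}{2} \sum_{i \neq j} w_{ij}\, s_i s_j,
\qquad
w_{ij} = \frac{1}{N} \sum_{\mu=1}^{P} \xi_i^{\mu}\, \xi_j^{\mu}
```

Asynchronous sign updates never increase \(E\), so stored patterns sit at local minima of the energy landscape and act as attractors for noisy probes.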

Part 2

PyTorch Implementation

Building classical binary Hopfield networks and modern continuous Hopfield networks with exponential storage capacity — entirely from scratch.
View Code on GitHub
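To give a flavor of Part 2, here is a minimal sketch of a classical binary Hopfield network in pure PyTorch. The class name and API are illustrative assumptions, not the project's actual code:

```python
import torch


class BinaryHopfield:
    """Sketch of a classical binary Hopfield network (illustrative API)."""

    def __init__(self, dim: int):
        self.dim = dim
        self.W = torch.zeros(dim, dim)

    def store(self, patterns: torch.Tensor) -> None:
        # Hebbian outer-product rule: W = (1/N) * sum_mu xi^mu (xi^mu)^T
        self.W = patterns.t() @ patterns / self.dim
        self.W.fill_diagonal_(0.0)  # no self-connections

    def retrieve(self, state: torch.Tensor, steps: int = 10) -> torch.Tensor:
        # Synchronous sign updates until a fixed point (or step budget)
        s = state.clone()
        for _ in range(steps):
            new = torch.sign(self.W @ s)
            new[new == 0] = 1  # break ties toward +1
            if torch.equal(new, s):
                break
            s = new
        return s


# Demo: store one random bipolar pattern, corrupt 5 bits, retrieve it
torch.manual_seed(0)
net = BinaryHopfield(64)
xi = torch.sign(torch.randn(1, 64))
xi[xi == 0] = 1
net.store(xi)
probe = xi[0].clone()
probe[:5] *= -1  # flip 5 bits
recovered = net.retrieve(probe)
```

With a single stored pattern, one update step already recovers it exactly; the linear-capacity limit (roughly 0.14N patterns) only bites as more patterns are stored.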

Part 3

Memory Retrieval vs Attention

Pattern completion demos, a capacity comparison (linear vs exponential), and a proof that the modern Hopfield update is mathematically equivalent to softmax attention.
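The equivalence at the heart of Part 3 can be sketched in a few lines: one retrieval step of a modern continuous Hopfield network, xi_new = X^T softmax(beta X xi), is exactly a softmax attention readout with query xi and the stored patterns X as both keys and values. The variable names and the specific beta below are illustrative:

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
X = torch.randn(8, 16)              # 8 stored patterns (keys = values)
xi = X[3] + 0.1 * torch.randn(16)   # noisy query near pattern 3
beta = 4.0                          # inverse temperature (illustrative)

# Modern Hopfield retrieval step: xi_new = X^T softmax(beta * X xi)
hopfield = X.t() @ F.softmax(beta * (X @ xi), dim=0)

# The same computation phrased as softmax attention over the patterns
attn = F.softmax(beta * xi @ X.t(), dim=-1) @ X
```

The two expressions are term-by-term identical; large beta sharpens the softmax so the update snaps to the single nearest stored pattern, which is what gives the modern network its exponential storage capacity.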