Deconstructing Echo State Networks (ESNs)

Explore Reservoir Computing, the Echo State Property, closed-form linear readouts in PyTorch, and forecasting chaotic systems 1000x faster than BPTT.

Part 1

The End of BPTT

The math behind Reservoir Computing, the Echo State Property, Spectral Radius, and why we can replace gradient descent with closed-form linear algebra.
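As a taste of the spectral-radius idea Part 1 develops: the reservoir is a fixed random matrix rescaled so its largest eigenvalue magnitude sits below 1, a common sufficient heuristic for the Echo State Property. A minimal sketch (reservoir size, density, and the 0.9 target are illustrative choices, not values from the series):

```python
import torch

torch.manual_seed(0)
N = 500        # reservoir size (illustrative)
density = 0.05 # fraction of nonzero weights (illustrative)

# Random sparse reservoir: zero out most entries, keep the rest uniform in [-0.5, 0.5].
W = (torch.rand(N, N) - 0.5) * (torch.rand(N, N) < density)

# Rescale so the spectral radius (largest |eigenvalue|) equals 0.9 < 1,
# the usual heuristic condition for the Echo State Property.
rho = torch.linalg.eigvals(W).abs().max()
W = W * (0.9 / rho)
```

Because W is never trained, this scaling is the only "tuning" the recurrent weights ever receive.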

Part 2

PyTorch Implementation

Building the large, sparse, non-trainable random reservoir and a Ridge Regression (Tikhonov regularization) readout layer in pure PyTorch.
View Code on GitHub
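The key trick Part 2 implements is that the readout is trained in closed form, with one linear solve instead of gradient descent. A hedged sketch with random stand-ins for the collected reservoir states `S` and targets `Y` (shapes and the regularization strength `lam` are illustrative):

```python
import torch

torch.manual_seed(0)
N, T, out_dim = 200, 1000, 1  # reservoir size, timesteps, output dim (illustrative)
lam = 1e-6                    # Tikhonov regularization strength (illustrative)

# Stand-ins for reservoir states S (N x T) and teacher targets Y (out_dim x T).
S = torch.randn(N, T)
Y = torch.randn(out_dim, T)

# Closed-form ridge regression: W_out = Y S^T (S S^T + lam*I)^(-1).
# One linear solve replaces the entire BPTT training loop.
A = S @ S.T + lam * torch.eye(N)
W_out = torch.linalg.solve(A, S @ Y.T).T  # solve A X = S Y^T, then transpose

pred = W_out @ S  # readout predictions, shape (out_dim, T)
```

Solving `A X = S Yᵀ` with `torch.linalg.solve` is numerically preferable to forming the explicit inverse of `A`.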

Part 3

Predicting Chaos

Benchmarking our ESN against a standard PyTorch LSTM on predicting the chaotic Mackey-Glass time series to demonstrate massive training speedups.
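For context on the benchmark task: Mackey-Glass is a delay differential equation that becomes chaotic at delay τ = 17 with the standard parameters. A rough sketch of generating the series with a simple Euler step (step size and length are illustrative; the series' actual data pipeline may differ):

```python
import torch

# Mackey-Glass delay differential equation (chaotic at tau = 17):
#   dx/dt = beta * x(t - tau) / (1 + x(t - tau)**n) - gamma * x(t)
beta, gamma, n, tau = 0.2, 0.1, 10, 17  # standard parameter choices
dt, steps = 1.0, 3000                   # Euler step and length (illustrative)

x = [1.2] * (tau + 1)  # constant history as the initial condition
for _ in range(steps):
    x_tau = x[-tau - 1]  # delayed value x(t - tau)
    x.append(x[-1] + dt * (beta * x_tau / (1 + x_tau ** n) - gamma * x[-1]))

series = torch.tensor(x[tau + 1:])  # drop the warm-up history
```

The ESN is trained to predict `series[t + 1]` from `series[t]`, the same one-step-ahead setup the LSTM baseline uses.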