Deconstructing ResNets from Scratch
Explore the mathematics of residual learning, identity skip connections, and gradient highways — built entirely from scratch in pure PyTorch.
Part 1
The Math of Residual Learning
Breaking down the degradation problem, the residual formulation $y = F(x) + x$, gradient highways via skip connections, and the ensemble interpretation of deep residual networks.
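The "gradient highway" claim can be made concrete with a short derivation — a sketch of the standard backpropagation argument for a residual block, following the formulation above:

```latex
% The block learns a residual F(x); the output adds the identity back:
y = F(x) + x

% Backpropagating a loss L through the block gives
\frac{\partial L}{\partial x}
  = \frac{\partial L}{\partial y}\left(1 + \frac{\partial F}{\partial x}\right)

% The additive identity term (the "1") lets the upstream gradient
% \partial L / \partial y reach earlier layers directly, even when
% \partial F / \partial x is small -- the skip connection is the highway.
```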
Part 2
PyTorch Implementation
Building ResidualBlocks, BottleneckBlocks, and the full ResNet-18/34/50/101/152 family plus a SmallResNet variant for CIFAR-10 — entirely from scratch.
View Code on GitHub
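A minimal sketch of what a basic residual block looks like in PyTorch — the class name `ResidualBlock` comes from the series, but the details here (3×3 convs, batch norm, a 1×1 projection shortcut when shapes disagree) are the standard design and may differ from the repository's exact code:

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Basic residual block: two 3x3 convs plus a skip connection.

    A sketch of the standard design (assumed, not copied from the repo);
    normalization and downsampling choices may differ in the series.
    """
    def __init__(self, in_channels, out_channels, stride=1):
        super().__init__()
        self.conv1 = nn.Conv2d(in_channels, out_channels, 3,
                               stride=stride, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(out_channels)
        self.conv2 = nn.Conv2d(out_channels, out_channels, 3,
                               padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(out_channels)
        # 1x1 projection when x and F(x) have mismatched shapes
        self.shortcut = nn.Sequential()
        if stride != 1 or in_channels != out_channels:
            self.shortcut = nn.Sequential(
                nn.Conv2d(in_channels, out_channels, 1,
                          stride=stride, bias=False),
                nn.BatchNorm2d(out_channels),
            )

    def forward(self, x):
        out = torch.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        out = out + self.shortcut(x)  # y = F(x) + x
        return torch.relu(out)

# A downsampling block halves spatial size and doubles channels.
block = ResidualBlock(64, 128, stride=2)
y = block(torch.randn(1, 64, 32, 32))
print(y.shape)
```

The bottleneck variant used by ResNet-50/101/152 follows the same skip-connection pattern but stacks 1×1 → 3×3 → 1×1 convolutions to cut compute at large channel counts.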
Part 3
Training & Analyzing Deep Networks
Training on CIFAR-10, visualizing activation flow through residual layers, and inspecting learned feature maps to verify that skip connections preserve signal.
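One common way to inspect activation flow through residual layers is with forward hooks. The toy model and names below are illustrative stand-ins for the series' trained ResNet, not its actual code:

```python
import torch
import torch.nn as nn

class ToyBlock(nn.Module):
    """Stand-in for a trained residual block (illustrative only)."""
    def __init__(self, channels):
        super().__init__()
        self.conv = nn.Conv2d(channels, channels, 3, padding=1)

    def forward(self, x):
        return x + torch.relu(self.conv(x))  # identity skip connection

model = nn.Sequential(*[ToyBlock(16) for _ in range(4)])

# Record each block's output norm to check that signal magnitude
# survives the stack rather than decaying layer by layer.
stats = []
for i, block in enumerate(model):
    block.register_forward_hook(
        lambda mod, inp, out, i=i: stats.append((i, out.norm().item()))
    )

with torch.no_grad():
    model(torch.randn(1, 16, 32, 32))

for i, norm in stats:
    print(f"block {i}: activation norm = {norm:.2f}")
```

The same hook pattern captures full feature maps (store `out` instead of its norm) for the kind of feature-map visualization Part 3 describes.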