What Happened
In late 2015, He et al.'s ResNet paper ("Deep Residual Learning for Image Recognition") introduced residual (skip) connections, making it possible to train substantially deeper networks, over a hundred layers, by easing optimization.
Why It Matters
Residual connections became one of the most widely used design patterns in deep learning, influencing architectures in vision and well beyond (Transformers, for instance, place a residual connection around every sub-layer) and helping stabilize the training of very deep networks.
Technical Details
A residual block computes y = F(x) + x: an identity shortcut carries the input x around a small stack of transformations F (typically two or three convolutional layers), so the block only needs to learn the residual F(x) = H(x) - x rather than the full mapping H(x). Because gradients can flow through the shortcut largely unimpeded, this mitigates the degradation problem the paper identified, where deeper plain networks show higher training error than their shallower counterparts.
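To make the mechanism concrete, here is a minimal sketch of such a block. The framework choice (PyTorch) and the names ResidualBlock and channels are illustrative assumptions, not taken from the paper's own code; the structure (two 3x3 convolutions as F, with the unmodified input added back before the final activation) follows the paper's basic block.

```python
# Minimal residual block sketch, assuming PyTorch is available.
# Names (ResidualBlock, channels) are illustrative, not from the original code.
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Two 3x3 convolutions wrapped by an identity shortcut: y = F(x) + x."""
    def __init__(self, channels: int):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        identity = x                      # shortcut: carry the input forward unchanged
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))   # F(x): the learned residual
        out = out + identity              # residual addition; gradients flow through both paths
        return self.relu(out)

# Usage: a batch of shape (N, C, H, W) passes through with its shape unchanged,
# which is what lets these blocks be stacked to arbitrary depth.
x = torch.randn(2, 64, 32, 32)
y = ResidualBlock(64)(x)
assert y.shape == x.shape
```

Note that during backpropagation the addition splits the gradient across both paths, so even if the gradient through F shrinks, the shortcut path delivers it to earlier layers intact; this is the sense in which the shortcut eases optimization as depth grows.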