2015 | Kaiming He, Xiangyu Zhang, Shaoqing Ren, Jian Sun
Deep Residual Learning for Image Recognition (ResNet)
Introduces residual learning with skip connections to train very deep networks. The key insight is that learning the residual mapping $\mathcal{F}(x) := \mathcal{H}(x) - x$, where $\mathcal{H}(x)$ is the desired underlying mapping, is easier than learning $\mathcal{H}(x)$ directly; the block's output is then $\mathcal{F}(x) + x$.
The degradation problem motivated the design: adding more layers to a suitably deep model leads to higher training error. This is not caused by overfitting, but by the difficulty of optimizing deeper networks.
Residual connections provide an identity shortcut that allows gradients to flow directly through the network, enabling training of networks with 100+ layers.
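As a minimal illustration of the idea (not the paper's implementation), a residual block can be sketched in NumPy with dense layers standing in for the 3x3 convolutions:

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def residual_block(x, W1, W2):
    """Illustrative residual block: y = relu(F(x) + x), where
    F(x) = W2 @ relu(W1 @ x) is the learned residual mapping.
    Dense layers stand in here for the paper's 3x3 convolutions."""
    f = W2 @ relu(W1 @ x)   # residual mapping F(x) := H(x) - x
    return relu(f + x)      # identity shortcut adds x back

# With zero weights, F(x) = 0 and the block reduces to the
# identity for non-negative inputs -- extra layers cannot hurt.
x = np.array([1.0, 2.0, 3.0])
W = np.zeros((3, 3))
print(np.allclose(residual_block(x, W, W), x))  # → True
```

This is why the shortcut helps with degradation: a residual block can fall back to the identity by driving its weights toward zero, so a deeper network can always match a shallower one.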
Implementation Track
4 tasks to complete
Task 01
Easy: Basic Residual Block
Implement a basic residual block with two 3x3 conv layers and a skip connection
Task 02
Medium: Bottleneck Block
Implement the 1x1 → 3x3 → 1x1 bottleneck architecture for deeper networks
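The point of the bottleneck is cost: the first 1x1 conv reduces channels before the expensive 3x3, and the second 1x1 restores them. A rough parameter count (biases ignored) at the 256-channel stage used in the paper's ResNet-50 makes the savings concrete:

```python
def conv_params(c_in, c_out, k):
    """Weight count of a k x k convolution (bias ignored)."""
    return k * k * c_in * c_out

# Bottleneck at 256 channels, reduced to 64 for the 3x3:
bottleneck = (conv_params(256, 64, 1)    # 1x1 reduce
              + conv_params(64, 64, 3)   # 3x3
              + conv_params(64, 256, 1)) # 1x1 expand

# Two plain 3x3 convolutions at 256 channels:
plain = 2 * conv_params(256, 256, 3)

print(bottleneck)  # → 69632
print(plain)       # → 1179648, roughly 17x more parameters
```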
Task 03
Easy: Skip Connection
Implement identity and projection shortcuts for dimension matching
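When a block changes shape (e.g. doubling channels), the identity shortcut no longer type-checks, so the paper's option B uses a learned 1x1 projection $W_s$ to match dimensions: $y = \mathcal{F}(x) + W_s x$. A shape-level sketch with dense matrices standing in for 1x1 convolutions:

```python
import numpy as np

def projection_shortcut(x, W_s):
    """Projection shortcut: a learned 1x1 projection W_s matches
    dimensions when the block changes shape (option B in the paper).
    A dense matrix stands in here for the 1x1 convolution."""
    return W_s @ x

x = np.ones(64)            # block input with 64 channels
W_s = np.zeros((128, 64))  # projects 64 -> 128 channels (illustrative values)
f = np.ones(128)           # stand-in for the residual output F(x)

y = f + projection_shortcut(x, W_s)  # shapes now match
print(y.shape)  # → (128,)
```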
Task 04
Hard: ResNet Forward Pass
Assemble blocks into a full ResNet-18 forward pass
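Before writing the forward pass, it helps to trace the shapes. Per the paper's architecture table, ResNet-18 has a 7x7 stride-2 stem, a stride-2 max pool, then four stages of two basic blocks each (64, 128, 256, 512 channels), with the last three stages downsampling by stride 2. A shape-tracing sketch (resolutions only, no weights):

```python
def resnet18_shapes(size=224):
    """Trace the spatial resolution through ResNet-18's stages
    for a size x size input (224 in the paper)."""
    size //= 2                       # 7x7 conv, stride 2
    size //= 2                       # 3x3 max pool, stride 2
    shapes = []
    for channels, stride in [(64, 1), (128, 2), (256, 2), (512, 2)]:
        size //= stride              # first block of stage downsamples
        shapes.append((channels, size, size))
    return shapes                    # global average pool + fc follow

print(resnet18_shapes())
# → [(64, 56, 56), (128, 28, 28), (256, 14, 14), (512, 7, 7)]
```

Checking your implementation's intermediate tensor shapes against this trace catches most stride and projection-shortcut mistakes.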
