Deep Paper Pool: really deep.

Inception-v4, Inception-ResNet and the Impact of Residual Connections on Learning

Paper: arxiv

Key idea:

Here we give clear empirical evidence that training with residual connections accelerates the training of Inception networks significantly.

Some ideas:

The paper responds to the argument (from the ResNet authors, He et al.) that residual connections are inherently necessary for training very deep convolutional models. The Inception authors' findings do not support this view, at least for image recognition: in the experimental section they show that it is not very difficult to train competitive, very deep networks without residual connections. However, residual connections do seem to speed up training considerably, which alone is a strong argument for their use.
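
To make the "with vs. without residual connections" comparison concrete, here is a minimal PyTorch sketch of an Inception-style block where the shortcut can be toggled off. This is an illustrative toy, not the paper's actual Inception-v4 or Inception-ResNet blocks: the class name, branch widths, kernel sizes, and the `use_residual` flag are assumptions made for the example.

```python
import torch
import torch.nn as nn


class InceptionResidualBlock(nn.Module):
    """Toy Inception-style block with an optional residual (skip) connection.

    Minimal sketch only: real Inception-ResNet blocks use more branches and
    different widths. The point is that the residual variant differs from the
    plain variant only by the element-wise `out + x` shortcut.
    """

    def __init__(self, channels: int, use_residual: bool = True):
        super().__init__()
        self.use_residual = use_residual
        # Two simple parallel branches (real Inception blocks have more).
        self.branch1x1 = nn.Sequential(
            nn.Conv2d(channels, channels // 2, kernel_size=1),
            nn.ReLU(inplace=True),
        )
        self.branch3x3 = nn.Sequential(
            nn.Conv2d(channels, channels // 2, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // 2, channels // 2, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
        )
        # 1x1 conv maps the concatenated branches back to `channels`,
        # so the addition with the input is shape-compatible.
        self.project = nn.Conv2d(channels, channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out = torch.cat([self.branch1x1(x), self.branch3x3(x)], dim=1)
        out = self.project(out)
        if self.use_residual:
            out = out + x  # the residual shortcut: the only change vs. the plain block
        return torch.relu(out)


if __name__ == "__main__":
    x = torch.randn(2, 64, 32, 32)
    print(InceptionResidualBlock(64, use_residual=True)(x).shape)   # torch.Size([2, 64, 32, 32])
    print(InceptionResidualBlock(64, use_residual=False)(x).shape)  # same shape, no shortcut
```

Because the two configurations share everything except the `out + x` addition, a comparison of their training curves isolates the effect of the shortcut, which is the kind of controlled comparison the paper makes between its residual and non-residual Inception variants.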