
Convolutional Neural Networks. Part 2.

Convolutional Neural Networks. Andrew Ng.



Part 2: Case Studies. Practical advice for using ConvNets.

Why look at case studies?

How do you put together these basic building blocks to form effective ConvNets?
One good way: read about or study examples of ConvNets that have worked well.

It turns out that a neural network architecture that works well on one computer vision task often works well on other tasks as well, possibly including your own.

Outline

Classic networks:

  • LeNet-5
  • AlexNet
  • VGG

ResNet

Inception

Classic networks

LeNet-5

AlexNet

VGG-16

Residual Networks (ResNets)

Very deep neural networks are difficult to train because of vanishing and exploding gradient problems. You'll learn about skip connections, which allow you to take the activation from one layer and feed it directly to a layer much deeper in the network. Using these, you'll build ResNets, which enable you to train very deep networks. A sketch of one such block follows.
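To make the skip-connection idea concrete, here is a minimal sketch of a single residual block. The use of TensorFlow/Keras, the layer sizes, and the names are illustrative assumptions, not the course's reference code.

```python
# Minimal sketch of a residual (skip-connection) block, assuming TensorFlow/Keras.
# Filter counts and input shape are illustrative, not taken from the course.
import tensorflow as tf
from tensorflow.keras import layers

def residual_block(x, filters=64):
    """Two conv layers whose output is added back to the block's input (the shortcut)."""
    shortcut = x                                     # save the incoming activation a[l]
    y = layers.Conv2D(filters, 3, padding="same")(x)
    y = layers.BatchNormalization()(y)
    y = layers.ReLU()(y)
    y = layers.Conv2D(filters, 3, padding="same")(y)
    y = layers.BatchNormalization()(y)
    y = layers.Add()([y, shortcut])                  # skip connection: add a[l] before the final ReLU
    return layers.ReLU()(y)

# Usage sketch: stack a few blocks on top of an initial conv layer.
inputs = tf.keras.Input(shape=(32, 32, 3))
x = layers.Conv2D(64, 3, padding="same", activation="relu")(inputs)
for _ in range(3):
    x = residual_block(x, filters=64)
outputs = layers.Dense(10, activation="softmax")(layers.GlobalAveragePooling2D()(x))
model = tf.keras.Model(inputs, outputs)
```

Because the shortcut carries the activation forward unchanged, each block can easily learn the identity mapping, which is why adding more blocks does not hurt training the way adding plain layers can.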

Why ResNets work

Network in Network and 1 × 1 convolutions

Inception network motivation

Inception network

MobileNet

MobileNet Architecture

EfficientNet

Using open-source implementations

Transfer Learning

Data augmentation

The state of computer vision


Papers

[LeCun et al., 1998. Gradient-based learning applied to document recognition]
[Krizhevsky et al., 2012. ImageNet classification with deep convolutional neural networks]
[Simonyan & Zisserman, 2015. Very deep convolutional networks for large-scale image recognition]
[He et al., 2015. Deep residual learning for image recognition]
[Lin et al., 2013. Network in network]
[Szegedy et al., 2014. Going deeper with convolutions]
[Howard et al., 2017. MobileNets: efficient convolutional neural networks for mobile vision applications]
[Sandler et al., 2019. MobileNetV2: inverted residuals and linear bottlenecks]
[Tan & Le, 2019. EfficientNet: rethinking model scaling for convolutional neural networks]

Source

(Highly recommended) 2021 Andrew Ng Deep Learning - Convolutional Neural Networks