Refresh your knowledge of normalization and regularization. Explore neural networks and how they map to TensorFlow*. Starting with a single neuron, apply an activation function, learn about layers of neurons, and finally understand how that translates to a feed-forward network.
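The single-neuron-to-network progression above can be sketched in a few lines of plain Python (not the course's TensorFlow code; the function names and the tiny weight values are illustrative only): a neuron is a weighted sum plus a bias passed through an activation function, and a feed-forward network just chains layers of such neurons.

```python
import math

def sigmoid(z):
    # Activation function: squashes the weighted sum into (0, 1).
    return 1.0 / (1.0 + math.exp(-z))

def neuron(inputs, weights, bias):
    # A single neuron: weighted sum of inputs plus bias, then activation.
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return sigmoid(z)

def feed_forward(inputs, layers):
    # Each layer is a list of (weights, bias) pairs, one per neuron.
    # The outputs of one layer become the inputs of the next.
    activations = inputs
    for layer in layers:
        activations = [neuron(activations, w, b) for w, b in layer]
    return activations

# Two inputs -> hidden layer of two neurons -> one output neuron.
hidden = [([0.5, -0.6], 0.1), ([0.3, 0.8], -0.2)]
output = [([1.0, -1.0], 0.0)]
print(feed_forward([1.0, 2.0], [hidden, output]))
```

In TensorFlow these same pieces map onto tensors, dense layers, and built-in activation ops, so a whole layer is one matrix multiply rather than a per-neuron loop.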
Learn about batching and how to use it to help train your network. Discover ways to use full-batch, mini-batch, or stochastic gradient descent. Learn how to implement multiclass classification, use backpropagation to update network weights, and identify which activation functions to use. See how to use dropout to smooth out your solution and keep any single neuron from dominating your network.
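The three batching strategies differ only in how many samples contribute to each gradient step. A minimal pure-Python sketch (fitting a one-parameter line, with illustrative learning rate and epoch counts; not the course's TensorFlow code) shows that full-batch, mini-batch, and stochastic gradient descent share one update loop:

```python
import random

def sgd_fit(xs, ys, batch_size, lr=0.01, epochs=500, seed=0):
    """Fit y = w * x with gradient descent on mean squared error.

    batch_size == len(xs) -> full-batch gradient descent
    1 < batch_size < len  -> mini-batch gradient descent
    batch_size == 1       -> stochastic gradient descent
    """
    rng = random.Random(seed)
    w = 0.0
    data = list(zip(xs, ys))
    for _ in range(epochs):
        rng.shuffle(data)
        for i in range(0, len(data), batch_size):
            batch = data[i:i + batch_size]
            # Gradient of mean squared error w.r.t. w over this batch.
            grad = sum(2 * (w * x - y) * x for x, y in batch) / len(batch)
            w -= lr * grad
    return w

xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]                   # true slope is 2
print(sgd_fit(xs, ys, batch_size=len(xs)))  # full batch
print(sgd_fit(xs, ys, batch_size=2))        # mini batch
print(sgd_fit(xs, ys, batch_size=1))        # stochastic
```

Smaller batches give noisier but cheaper updates; larger batches give smoother gradients at more cost per step, which is the trade-off the lesson explores.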
Learn about kernels and how they apply to convolutional neural networks (CNN). Explore the different parameters in a CNN and how a pooling layer can help. Review the LeNet* topology and how it covers all the different CNN layers discussed in earlier lessons.
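The kernel and pooling ideas can be shown with a tiny pure-Python sketch (the function names, the 4x4 image, and the edge kernel are illustrative, and real CNN libraries do this with optimized tensor ops): a kernel slides over the image producing a feature map, and a pooling layer then downsamples that map.

```python
def conv2d(image, kernel):
    # "Valid" convolution (really cross-correlation, as in most CNN
    # libraries): slide the kernel over the image and sum the
    # elementwise products at each position.
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    return [[sum(image[i + di][j + dj] * kernel[di][dj]
                 for di in range(kh) for dj in range(kw))
             for j in range(out_w)]
            for i in range(out_h)]

def max_pool(feature_map, size=2):
    # Downsample by keeping the maximum in each size x size window.
    return [[max(feature_map[i + di][j + dj]
                 for di in range(size) for dj in range(size))
             for j in range(0, len(feature_map[0]) - size + 1, size)]
            for i in range(0, len(feature_map) - size + 1, size)]

image = [[1, 2, 0, 1],
         [3, 1, 1, 0],
         [0, 2, 2, 1],
         [1, 0, 1, 3]]
edge = [[1, -1]]            # simple horizontal edge-detecting kernel
fmap = conv2d(image, edge)
print(fmap)                 # [[-1, 2, -1], [2, 0, 1], [-2, 0, 1], [1, -1, -2]]
print(max_pool(fmap, 2))    # [[2], [1]]
```

Kernel size, stride, and padding are the CNN parameters that control the feature-map dimensions; pooling shrinks those dimensions further while keeping the strongest responses.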
Understand the AlexNet topology and how it compares to LeNet. See how to use a basic template for a CNN. Learn how to save and load models in TensorFlow*. Learn about momentum and optimizers such as AdaGrad (adaptive gradient), RMSProp (root mean square propagation), and Adam (adaptive moment estimation), which help a neural network converge faster and more reliably.
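The momentum and Adam update rules can be sketched on a toy one-parameter loss in plain Python (not TensorFlow's optimizer API; the function names, learning rates, and the quadratic loss are illustrative only):

```python
def grad(w):
    # Gradient of the toy loss f(w) = (w - 3)^2, minimized at w = 3.
    return 2 * (w - 3)

def momentum_step(w, v, lr=0.1, beta=0.9):
    # Momentum: accumulate a velocity so past gradients keep pushing.
    v = beta * v + grad(w)
    return w - lr * v, v

def adam_step(w, m, u, t, lr=0.1, b1=0.9, b2=0.999, eps=1e-8):
    # Adam: a momentum-style first moment plus an RMSProp-style second
    # moment, with bias correction for the early steps.
    g = grad(w)
    m = b1 * m + (1 - b1) * g
    u = b2 * u + (1 - b2) * g * g
    m_hat = m / (1 - b1 ** t)
    u_hat = u / (1 - b2 ** t)
    return w - lr * m_hat / (u_hat ** 0.5 + eps), m, u

w, v = 0.0, 0.0
for _ in range(100):
    w, v = momentum_step(w, v)
print(w)   # approaches the minimum at 3

w, m, u = 0.0, 0.0, 0.0
for t in range(1, 101):
    w, m, u = adam_step(w, m, u, t)
print(w)   # also approaches 3
```

AdaGrad and RMSProp correspond to keeping only the second-moment accumulator: AdaGrad sums squared gradients forever, while RMSProp replaces the sum with an exponential moving average, which is the `u` update above.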
Gain a basic understanding of transfer learning, tensors, and operations. See how to apply them to an existing pretrained model to accelerate your training. Learn about batch normalization, why it is important, and how to implement it in TensorFlow*. Get a brief look at the Visual Geometry Group (VGG) network and how it compares to other networks.
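The core of batch normalization is a short calculation, sketched here for one feature in plain Python (the function name and sample values are illustrative; TensorFlow's layers also track running statistics for inference and learn gamma and beta during training):

```python
def batch_norm(batch, gamma=1.0, beta=0.0, eps=1e-5):
    # Normalize one feature across the batch to zero mean and unit
    # variance, then apply the learned scale (gamma) and shift (beta).
    n = len(batch)
    mean = sum(batch) / n
    var = sum((x - mean) ** 2 for x in batch) / n
    return [gamma * (x - mean) / (var + eps) ** 0.5 + beta for x in batch]

activations = [2.0, 4.0, 6.0, 8.0]
normed = batch_norm(activations)
print(normed)                      # zero mean, unit variance (up to eps)
print(sum(normed) / len(normed))   # ~0.0
```

Keeping each layer's inputs in a stable range is what makes batch normalization important: it lets deep networks such as VGG train with higher learning rates and less careful initialization.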