Based on feedback that the course content is at graduate level (501) rather than undergraduate level (101), we have renumbered the courses to reflect their depth more accurately. The lectures and exercises are unchanged.


TensorFlow* is a popular machine learning framework and open-source library for dataflow programming. In this course, you will learn about:

  • The fundamentals of building models with TensorFlow*
  • Machine learning basics like linear regression, loss functions, and gradient descent
  • Important techniques like normalization, regularization, and mini-batching
  • Kernels and how to apply them to convolutional neural networks (CNNs)
  • The basic template for a CNN and different parameters that can be adjusted
  • TFRecord, queues, and coordinators

By the end of this course, you will have a firm understanding of:

  • Basic network construction, kernels, pooling, and multiclass classification
  • How to expand a basic network into a more complex network
  • Using transfer learning to take advantage of existing networks by building on top of them

The course is structured around eight weeks of lectures and exercises. Each week requires at least three hours to complete.

Week 1

Learn the fundamentals of TensorFlow*, as well as how to use it to define and run a computational graph.
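The define-then-run idea can be sketched without TensorFlow* itself. The following is a minimal pure-Python illustration of a computational graph (the `Node` class and its names are invented for this sketch, not the TensorFlow* API): first the graph is defined, then it is evaluated in a separate run step with concrete input values.

```python
# Minimal define-then-run computational graph: build the nodes first,
# then evaluate the whole graph in a separate "run" step.

class Node:
    def __init__(self, op, *inputs):
        self.op = op          # function combining the input values
        self.inputs = inputs  # upstream nodes

    def run(self, feed):
        # Placeholder nodes (op=None) must be supplied via the feed dict.
        if self in feed:
            return feed[self]
        vals = [n.run(feed) for n in self.inputs]
        return self.op(*vals)

# Define the graph: c = (a + b) * b.  Nothing is computed yet.
a, b = Node(None), Node(None)
c = Node(lambda x, y: (x + y) * y, a, b)

# Run it with concrete values, analogous to feeding placeholders
# and fetching an output node in a session.
print(c.run({a: 2.0, b: 3.0}))  # (2 + 3) * 3 = 15.0
```

The separation matters because the framework can inspect and optimize the whole graph before any data flows through it.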


Week 2

Review machine learning basics beginning with linear regression, loss functions, and gradient descent. Learn how to implement a basic gradient descent in TensorFlow*.
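As a framework-free preview of what the week covers, here is gradient descent for 1-D linear regression in plain Python (the data and learning rate are illustrative): compute the gradient of the mean-squared-error loss with respect to the weight and bias, then step both parameters against it.

```python
# Gradient descent for y = w*x + b with mean-squared-error loss,
# written in plain Python for illustration.

xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]   # generated by the line y = 2x + 1

w, b = 0.0, 0.0
lr = 0.05                    # learning rate

for step in range(2000):
    n = len(xs)
    # Gradients of MSE = mean((w*x + b - y)^2)
    grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
    grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
    w -= lr * grad_w
    b -= lr * grad_b

print(round(w, 2), round(b, 2))  # converges toward w ≈ 2, b ≈ 1
```

The TensorFlow* version replaces the hand-written gradients with automatic differentiation over the graph, but the update rule is the same.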


Week 3

Refresh your knowledge of normalization and regularization. Explore neural networks and how they map to TensorFlow*. Starting with a single neuron, apply an activation function, learn about layers of neurons, and finally understand how that translates to a feed-forward network.
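The single-neuron-to-network progression can be sketched in a few lines of plain Python (the weights below are arbitrary, chosen only for illustration): each neuron computes a weighted sum plus a bias and passes it through an activation function, and stacking layers of such neurons gives a feed-forward network.

```python
import math

# A single neuron: weighted sum of inputs plus a bias, passed
# through an activation function (sigmoid here).

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def neuron(inputs, weights, bias):
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return sigmoid(z)

# Two neurons form a hidden layer; a third reads their outputs:
# a tiny 2 -> 2 -> 1 feed-forward network.
x = [0.5, -1.0]
h1 = neuron(x, [0.8, -0.4], 0.1)
h2 = neuron(x, [-0.3, 0.9], 0.0)
out = neuron([h1, h2], [1.2, -0.7], 0.2)
print(out)  # a value in (0, 1), thanks to the sigmoid
```

In TensorFlow* each layer becomes a matrix multiply plus a bias vector followed by an element-wise activation, which is exactly this computation vectorized.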


Week 4

Learn about batching and how to use it to help train your network. Discover when to use full-batch, mini-batch, or stochastic gradient descent. Learn how to implement multiclass classification, use back-propagation to update network weights, and identify which activation functions to use. See how to use dropout to smooth out your solution and prevent a single neuron from dominating your network.
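The batching spectrum is easy to see in code. The sketch below (a plain-Python helper written for this illustration) shuffles a dataset and yields fixed-size chunks: a batch size equal to the dataset is full-batch gradient descent, a batch size of 1 is stochastic, and anything in between is mini-batch.

```python
import random

# Split a dataset into shuffled mini-batches.  The last batch may
# be smaller when the dataset size is not a multiple of batch_size.

def minibatches(data, batch_size, seed=0):
    idx = list(range(len(data)))
    random.Random(seed).shuffle(idx)   # reshuffle order each epoch
    for start in range(0, len(idx), batch_size):
        yield [data[i] for i in idx[start:start + batch_size]]

data = list(range(10))
batches = list(minibatches(data, batch_size=4))
print([len(b) for b in batches])  # [4, 4, 2]
```

Smaller batches give noisier but cheaper gradient estimates; mini-batching is the usual compromise between the two extremes.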


Week 5

Learn about kernels and how they apply to convolutional neural networks (CNNs). Explore the different parameters in a CNN and how a pooling layer can help. Review the LeNet* topology and how it covers all the different CNN layers discussed in earlier lessons.
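To make the kernel and pooling operations concrete, here is a plain-Python sketch of both (valid padding, stride 1, and a hand-picked vertical-edge kernel, all chosen for illustration): a CNN layer slides such kernels over the input to produce feature maps, and pooling then downsamples those maps.

```python
# 2-D convolution (valid padding, stride 1) and 2x2 max pooling.

def conv2d(image, kernel):
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    return [[sum(image[i + a][j + b] * kernel[a][b]
                 for a in range(kh) for b in range(kw))
             for j in range(out_w)]
            for i in range(out_h)]

def max_pool2x2(fmap):
    return [[max(fmap[i][j], fmap[i][j + 1],
                 fmap[i + 1][j], fmap[i + 1][j + 1])
             for j in range(0, len(fmap[0]) - 1, 2)]
            for i in range(0, len(fmap) - 1, 2)]

# A vertical-edge kernel applied to a 5x5 image whose left half is
# dark (0) and right half bright (1): the response peaks at the edge.
img = [[0, 0, 1, 1, 1]] * 5
edge = [[-1, 0, 1],
        [-1, 0, 1],
        [-1, 0, 1]]
fmap = conv2d(img, edge)   # 3x3 feature map
print(max_pool2x2(fmap))   # pooling keeps the strongest response
```

The kernel weights in a CNN are learned rather than hand-picked, but the sliding-window arithmetic is exactly this.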


Week 6

Understand the AlexNet topology and how it compares to LeNet*. See how to use a basic template for a CNN. Learn how to save and load models in TensorFlow*. Learn about momentum and adaptive optimizers, such as AdaGrad (adaptive gradient), RMSProp (root mean square propagation), and Adam (adaptive moment estimation), that help a network train faster and more reliably.
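The update rules behind these optimizers fit in a few lines each. The sketch below (plain Python, single scalar parameter, hyperparameter values chosen only for illustration) shows one update step for plain SGD, momentum, and Adam, then runs all three on the toy objective f(w) = w²:

```python
import math

# One parameter-update step for SGD, momentum, and Adam.

def sgd_step(w, g, lr=0.1):
    return w - lr * g

def momentum_step(w, g, v, lr=0.1, beta=0.9):
    v = beta * v + g                     # accumulate a velocity
    return w - lr * v, v

def adam_step(w, g, m, v, t, lr=0.1, b1=0.9, b2=0.999, eps=1e-8):
    m = b1 * m + (1 - b1) * g            # first moment (mean of grads)
    v = b2 * v + (1 - b2) * g * g        # second moment (mean of grads^2)
    m_hat = m / (1 - b1 ** t)            # bias correction for early steps
    v_hat = v / (1 - b2 ** t)
    return w - lr * m_hat / (math.sqrt(v_hat) + eps), m, v

# Minimize f(w) = w^2 (gradient 2w) from w = 5 with each method.
w_s = w_m = w_a = 5.0
vel, m, v = 0.0, 0.0, 0.0
for t in range(1, 101):
    w_s = sgd_step(w_s, 2 * w_s)
    w_m, vel = momentum_step(w_m, 2 * w_m, vel)
    w_a, m, v = adam_step(w_a, 2 * w_a, m, v, t)
print(w_s, w_m, w_a)  # each method drives w toward the minimum at 0
```

Momentum smooths the gradient direction over time, while Adam additionally rescales each step by an estimate of the gradient's magnitude; RMSProp is Adam's second-moment rescaling without the momentum term.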


Week 7

Gain a basic understanding of transfer learning, tensors, and operations. See how to apply them to an existing pretrained model to accelerate your training. Learn about batch normalization, why it is important, and how to implement it in TensorFlow*. Get a brief look at the Visual Geometry Group (VGG) network and how it compares to other networks.
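The forward pass of batch normalization is short enough to write out in full. The plain-Python sketch below (a single feature across one batch; `gamma` and `beta` are the learned scale and shift) normalizes a batch of activations to zero mean and unit variance, then applies the learned affine transform:

```python
import math

# Batch normalization, forward pass only: normalize a batch of
# activations, then apply the learned scale (gamma) and shift (beta).

def batch_norm(batch, gamma=1.0, beta=0.0, eps=1e-5):
    n = len(batch)
    mean = sum(batch) / n
    var = sum((x - mean) ** 2 for x in batch) / n
    return [gamma * (x - mean) / math.sqrt(var + eps) + beta
            for x in batch]

acts = [2.0, 4.0, 6.0, 8.0]
normed = batch_norm(acts)
print([round(x, 3) for x in normed])  # zero mean, unit variance
```

The small `eps` guards against division by zero when a batch has no variance; a full implementation also tracks running statistics for use at inference time.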


Week 8

Learn about the TFRecords format and how to create your own TFRecord files. Also learn about TensorFlow* queues and how they speed up data delivery.
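The idea behind input queues can be illustrated with Python's standard library rather than the TensorFlow* API (the sketch below is a generic bounded producer/consumer pattern, not TensorFlow* code): a producer thread loads records ahead of time into a bounded buffer while the training loop consumes them, so data delivery overlaps with computation.

```python
import queue
import threading

# A bounded buffer decouples data loading from the training loop,
# the same idea as a capacity-limited input queue.

records = [f"record-{i}" for i in range(5)]
q = queue.Queue(maxsize=2)   # small capacity to show backpressure
DONE = object()              # sentinel signalling end of data

def producer():
    for r in records:
        q.put(r)             # blocks whenever the buffer is full
    q.put(DONE)

threading.Thread(target=producer, daemon=True).start()

consumed = []
while True:
    item = q.get()
    if item is DONE:
        break
    consumed.append(item)    # in real training: run one step on item
print(consumed)
```

A coordinator in this picture is the bookkeeping that starts the producer threads and shuts them down cleanly; the sentinel object plays that role in miniature here.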