
Udacity Nanodegree Program: Deep Learning Foundation: New Syllabus

Posted: 2017-04-27 10:20:09


Program Structure

Every week, you can expect to see this content coming up:

  • Siraj's introductory video and one-hour coding session
  • Additional lesson(s) from Mat and other Udacity experts

Then, approximately every four weeks you'll get a project.

The first week's content contains a bit more than an average week, as we're covering some introductory material and two topics from Siraj, so you can expect a little less going forward. Keep in mind the program is for students of all backgrounds. Some of the material might feel easy for more experienced students, but we're covering many different topics, and there will be plenty of advanced material.

 

Weekly Syllabus

Here is the list of topics that will be taught throughout the program:

Week 1: Introduction to Deep Learning

We’ll start off with a simple introduction to linear regression and machine learning. This will give you the vocabulary you need to understand recent advancements, and make clear where deep learning fits into the broader picture of ML techniques.

Then, you'll learn how to build a simple neural network from scratch using NumPy. We'll cover the algorithms used to train networks, such as gradient descent and backpropagation.
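The training loop behind gradient descent can be sketched in a few lines of NumPy. The example below is a deliberately tiny version (one linear unit and a made-up dataset, not the course's bike-ridership network): the forward pass makes a prediction, the gradient of the mean squared error is computed, and the weight is nudged downhill.

```python
import numpy as np

# Toy setup: learn y = 2x with a single linear unit,
# trained by gradient descent on mean squared error.
rng = np.random.default_rng(0)
X = np.array([[1.0], [2.0], [3.0], [4.0]])
y = 2.0 * X

w = rng.normal(size=(1, 1))           # single weight, randomly initialized
lr = 0.05                             # learning rate

for _ in range(200):
    pred = X @ w                      # forward pass
    grad = X.T @ (pred - y) / len(X)  # gradient of MSE with respect to w
    w -= lr * grad                    # gradient descent update

print(round(float(w[0, 0]), 2))       # → 2.0
```

A real network adds a hidden layer and a nonlinearity, and backpropagation applies this same gradient step to every layer via the chain rule.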

The first project is also available this week. In this project, you'll predict bike ridership using a simple neural network.

 
 

Week 2: Graph Computations

TensorFlow is the most popular framework for building deep learning networks. It is based on graph computation, an efficient method to represent and calculate the matrix operations involved in training networks. In this lesson, you'll build your own small version of TensorFlow, called MiniFlow, to deepen your understanding of backpropagation and start your work with TensorFlow.
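To make the idea concrete, here is a minimal sketch of graph computation in the spirit of MiniFlow. The class names and the topological-order loop are illustrative, not the actual MiniFlow API: each node knows its inputs, and evaluating the graph means calling `forward` on every node after its inputs.

```python
# Minimal computation-graph sketch (illustrative, not the MiniFlow API).
class Node:
    def __init__(self, inputs=()):
        self.inputs = list(inputs)
        self.value = None

class Input(Node):
    def forward(self):
        pass  # value is set externally, not computed

class Add(Node):
    def forward(self):
        self.value = sum(n.value for n in self.inputs)

x, y = Input(), Input()
add = Add([x, y])

x.value, y.value = 4, 5
for node in (x, y, add):   # topological order: inputs before consumers
    node.forward()
print(add.value)           # → 9
```

Backpropagation works by walking the same graph in reverse order, which is why representing computation as a graph makes training mechanical.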

You'll also learn how to evaluate machine learning models such as neural networks. We do this using validation: testing the model's performance on a held-out portion of the data.
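A held-out validation set can be as simple as a shuffled index split. The 80/20 ratio below is a common convention, not a course requirement:

```python
import numpy as np

# Split a dataset into training and validation sets by shuffled indices.
rng = np.random.default_rng(42)
data = np.arange(100)                  # stand-in for real examples

indices = rng.permutation(len(data))   # shuffle before splitting
split = int(0.8 * len(data))           # hold out 20% for validation
train_idx, val_idx = indices[:split], indices[split:]

train, val = data[train_idx], data[val_idx]
print(len(train), len(val))            # → 80 20
```

The model only ever trains on `train`; its score on `val` estimates how it will do on data it has never seen.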

 

Week 3: Sentiment Analysis

This week, you'll learn about sentiment analysis from Siraj and our guest instructor, Andrew Trask. That is, you'll use neural networks to predict if some text is positive or negative. Andrew will extend the network from project one and show you how to prepare your data and network to get much more efficient performance.

 

Week 4: Intro to TensorFlow

In this lesson, you'll be learning about TensorFlow, a popular deep learning framework built by Google. You'll use it to build a simple neural network.

You'll also be introduced to cloud computing services such as AWS and FloydHub, which let you run your networks on GPUs.

 
 

Week 5: Deep Neural Networks

Deep neural networks have revolutionized multiple fields including computer vision, natural language processing, and artificial intelligence. In this lesson, you'll learn about using TensorFlow to build deep networks for classifying handwritten digits. We'll also cover common training improvements like dropout.
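Dropout itself is only a few lines: during training, randomly zero activations and rescale the survivors ("inverted dropout") so the expected activation is unchanged. The sketch below is a framework-free illustration; TensorFlow provides this as a built-in operation.

```python
import numpy as np

def dropout(activations, keep_prob, rng):
    """Inverted dropout: zero units at random during training and
    rescale the survivors so the expected activation is unchanged."""
    mask = rng.random(activations.shape) < keep_prob
    return activations * mask / keep_prob

rng = np.random.default_rng(0)
h = np.ones((4, 10))                       # a batch of activations
dropped = dropout(h, keep_prob=0.5, rng=rng)
print(dropped.shape)                       # → (4, 10)
```

Each surviving unit here becomes 2.0 and each dropped unit 0.0; at test time dropout is simply turned off, with no rescaling needed thanks to the division by `keep_prob`.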

 

Week 6: Convolutional Networks

Convolutional networks have achieved state-of-the-art results in computer vision. These types of networks can detect and identify objects in images. You'll learn how to build convolutional networks in TensorFlow.
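Under the hood, a convolutional layer slides a small kernel across the image and takes a dot product at each position. This naive NumPy sketch (assumed shapes, no padding or stride) shows the core operation on a tiny image with a vertical-edge kernel:

```python
import numpy as np

def conv2d_valid(image, kernel):
    """Naive 'valid' 2-D convolution (really cross-correlation, as in
    most deep learning libraries): slide the kernel over the image and
    take a dot product at each position."""
    ih, iw = image.shape
    kh, kw = kernel.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i+kh, j:j+kw] * kernel)
    return out

# A vertical-edge detector on a simple image: dark left half, bright right half.
image = np.array([[0, 0, 1, 1]] * 4, dtype=float)
kernel = np.array([[-1.0, 1.0]])
result = conv2d_valid(image, kernel)
print(result)
```

The output fires only at the dark-to-bright boundary. A real convolutional layer learns many such kernels, and TensorFlow replaces these loops with a single optimized operation.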

You'll also get the second project, where you'll build a convolutional network to classify images of frogs, planes, cars, and more.

 
 

Week 7: Recurrent Neural Networks

In this lesson, you'll learn about recurrent neural networks, a type of network architecture particularly well suited to data that forms sequences, like text, music, and time series. You'll build a recurrent neural network that can generate new text character by character.
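The heart of a vanilla recurrent network is a single step function that folds each input into a running hidden state, so the network carries a memory of everything seen so far. A minimal sketch with made-up sizes:

```python
import numpy as np

def rnn_step(x, h_prev, Wxh, Whh, b):
    """One step of a vanilla recurrent cell: the new hidden state
    mixes the current input with the previous hidden state."""
    return np.tanh(x @ Wxh + h_prev @ Whh + b)

rng = np.random.default_rng(0)
hidden = 8
Wxh = rng.normal(scale=0.1, size=(3, hidden))       # input-to-hidden weights
Whh = rng.normal(scale=0.1, size=(hidden, hidden))  # hidden-to-hidden weights
b = np.zeros(hidden)

h = np.zeros(hidden)
for x in np.eye(3):          # feed a short one-hot "character" sequence
    h = rnn_step(x, h, Wxh, Whh, b)

print(h.shape)               # → (8,)
```

For character-level text generation, a softmax layer on top of `h` predicts the next character, which is sampled and fed back in as the next input.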

 
 

Week 8: Word Embeddings

When dealing with natural language problems, you'll work with huge vocabularies, which is computationally inefficient. Instead, we find smaller representations for the words, called word embeddings: vectors that capture what the words actually mean semantically. To learn more about word embeddings, you'll implement a model known as Word2vec.
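Mechanically, an embedding is just a row of a matrix selected by word index, and Word2vec's skip-gram variant trains those rows to predict nearby words. A toy illustration (untrained vectors and a made-up vocabulary):

```python
import numpy as np

# A word embedding is a row in a matrix, looked up by word index.
vocab = {"the": 0, "quick": 1, "brown": 2, "fox": 3}
embed_dim = 5
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(len(vocab), embed_dim))  # one vector per word

vector = embeddings[vocab["fox"]]  # lookup replaces a huge one-hot multiply
print(vector.shape)                # → (5,)

# Skip-gram training data: (center word, context word) pairs from a window.
sentence = ["the", "quick", "brown", "fox"]
window = 1
pairs = [(sentence[i], sentence[j])
         for i in range(len(sentence))
         for j in range(max(0, i - window), min(len(sentence), i + window + 1))
         if i != j]
print(pairs[:2])                   # → [('the', 'quick'), ('quick', 'the')]
```

After training on many such pairs, words used in similar contexts end up with similar vectors, which is what gives embeddings their semantic structure.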

 
 

Week 9: Using TensorBoard

TensorBoard is a visualization tool useful for inspecting your networks. We'll show you how to use TensorBoard to visualize the graphs you build with TensorFlow, as well as find the best parameters for your models.

Week 10: Text Generation

In this lesson, you'll learn about using a recurrent neural network to predict sentiment from text. You'll also start working on the third project, generating new TV scripts using a recurrent neural network.

Week 11: Sequence to Sequence

Neural networks have been a fundamental part of the recent advancements in machine translation. The latest production versions of Google Translate and Baidu Translate both use deep learning architectures to automatically translate text from one language to another. This is done using a process known as Sequence to Sequence Learning, which we will explore in this lesson.

 
 

Week 11 (continued): Chatbot QA System with Voice (Sequence to Sequence, In Depth)

We'll further explore Sequence to Sequence learning by building our very own chatbot QA system that can answer unstructured queries from a user.

 
 

Week 12: Transfer Learning

A common technique in deep learning is using pre-trained networks on new problems. For example, you can use a convolutional network trained on a huge dataset to classify images in a much smaller dataset. This method is called transfer learning, and you'll learn how to use it to classify images of flowers without training a whole network yourself.

Week 13: Reinforcement Learning

Some of the most interesting advancements in deep learning have been in the field of Reinforcement Learning, where instead of training on a corpus of existing data, a network learns from live data it receives and adjusts accordingly. We'll see how to apply Reinforcement Learning to build simple game-playing AIs that can win at a wide variety of Atari games.
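The simplest form of this idea is tabular Q-learning, where a table of action values is nudged toward observed rewards; deep reinforcement learning replaces the table with a neural network. A one-update sketch with a made-up two-state environment:

```python
import numpy as np

def q_update(Q, s, a, reward, s_next, alpha=0.5, gamma=0.9):
    """Tabular Q-learning: move the value estimate for taking action a
    in state s toward the bootstrapped return."""
    target = reward + gamma * np.max(Q[s_next])  # reward plus discounted future value
    Q[s, a] += alpha * (target - Q[s, a])        # step toward the target

Q = np.zeros((2, 2))          # 2 states x 2 actions, all values start at 0
q_update(Q, s=0, a=1, reward=1.0, s_next=1)
print(Q[0, 1])                # → 0.5
```

Repeating this update as the agent plays gradually propagates reward information backward through the state space, which is how game-playing agents learn without any labeled data.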

In the fourth project, you'll build a network that can translate text.

 
 

Week 14: Autoencoders

As recently shown by Google, deep learning can also be used to dramatically improve compression techniques. In this lesson we’ll explore using deep learning to build autoencoders that automatically find sparse representations of data.

Week 15: Generative Adversarial Networks

Generative Adversarial Networks (GANs) are a recent major advancement in deep learning methods, producing state-of-the-art results in image generation. The inventor of GANs, Ian Goodfellow, will teach you about building the networks yourself.

 
 

Week 16: Image Generation

As echoed by Yann LeCun, Generative Adversarial Networks are one of the most fundamental recent advancements in deep learning. You'll explore this state-of-the-art technique to generate images that most humans wouldn't believe were generated by a computer.

 

Week 17: One-Shot Learning (Probabilistic Programming)

Finally, we'll look at one-shot learning, where a neural network is able to learn from just one (or a few) examples, as opposed to a large amount of data.

This curriculum is an exciting introduction to some of the most compelling advancements in deep learning. We hope you join us on this journey, and we can't wait to share more of these ideas with you.

In the fifth project, you'll use a GAN to generate new human faces.



Original source: http://www.cnblogs.com/casperwin/p/6772255.html
