<<<PE304>>> DEEP LEARNING

{{{credits}}}

L T P C
3 0 0 3

COURSE OBJECTIVES

  • To understand the basics of deep neural networks
  • To understand CNN and RNN architectures of deep neural networks
  • To comprehend advanced deep learning models
  • To learn the evaluation metrics for deep learning models.

{{{unit}}}

UNIT I  DEEP NETWORKS BASICS \hfill 9

Linear Algebra: Scalars – Vectors – Matrices and tensors; Probability Distributions – Gradient-based Optimization – Machine Learning Basics: Capacity – Overfitting and underfitting – Hyperparameters and validation sets – Estimators – Bias and variance – Stochastic gradient descent – Challenges motivating deep learning; Deep Networks: Deep feedforward networks; Regularization – Optimization.
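
An illustrative Python/NumPy sketch of the gradient-based optimization and stochastic gradient descent topics in this unit (the synthetic data, learning rate, and batch size below are assumptions chosen for illustration, not prescribed by the syllabus):

#+BEGIN_SRC python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: y = 3x + 2 plus noise (assumed for illustration).
X = rng.uniform(-1.0, 1.0, size=(200, 1))
y = 3.0 * X[:, 0] + 2.0 + 0.1 * rng.normal(size=200)

w, b = 0.0, 0.0        # model parameters
lr, batch = 0.1, 16    # hyperparameters (cf. "Hyperparameters and validation sets")

for epoch in range(50):
    idx = rng.permutation(len(X))
    for start in range(0, len(X), batch):
        j = idx[start:start + batch]
        err = (w * X[j, 0] + b) - y[j]          # minibatch residuals
        grad_w = 2.0 * np.mean(err * X[j, 0])   # d/dw of mean squared error
        grad_b = 2.0 * np.mean(err)             # d/db of mean squared error
        w -= lr * grad_w                        # stochastic gradient descent step
        b -= lr * grad_b

print(f"learned w={w:.2f}, b={b:.2f}")          # should approach w=3, b=2
#+END_SRC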

{{{unit}}}

UNIT II  CONVOLUTIONAL NEURAL NETWORKS \hfill 9

Convolution Operation – Sparse Interactions – Parameter Sharing – Equivariance – Pooling – Convolution Variants: Strided – Tiled – Transposed and dilated convolutions; CNN Learning: Nonlinearity Functions – Loss Functions – Regularization – Optimizers – Gradient Computation.
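
A minimal NumPy sketch of the convolution operation covered here, showing sparse interactions and parameter sharing, with stride as one of the listed variants (the toy image and edge-detection kernel are assumptions for illustration):

#+BEGIN_SRC python
import numpy as np

def conv2d(image, kernel, stride=1):
    """Valid cross-correlation of a 2-D image with a single kernel.

    The same kernel weights are reused at every spatial position
    (parameter sharing), and each output value depends only on a
    small input patch (sparse interactions).
    """
    kh, kw = kernel.shape
    H, W = image.shape
    out_h = (H - kh) // stride + 1
    out_w = (W - kw) // stride + 1
    out = np.zeros((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            patch = image[i * stride:i * stride + kh,
                          j * stride:j * stride + kw]
            out[i, j] = np.sum(patch * kernel)
    return out

image = np.arange(36, dtype=float).reshape(6, 6)
edge_kernel = np.array([[1.0, 0.0, -1.0]] * 3)   # simple vertical-edge filter
print(conv2d(image, edge_kernel, stride=2))      # strided convolution: 2x2 output
#+END_SRC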

{{{unit}}}

UNIT III  RECURRENT NEURAL NETWORKS \hfill 10

Unfolding Graphs – RNN Design Patterns: Acceptor – Encoder – Transducer; Gradient Computation – Sequence Modeling Conditioned on Contexts – Bidirectional RNN – Sequence to Sequence RNN – Deep Recurrent Networks – Recursive Neural Networks – Long Term Dependencies; Leaky Units: Skip connections and dropouts; Gated Architecture: LSTM.
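
A small NumPy sketch of unfolding a vanilla RNN over a sequence, in the transducer pattern (one output per time step); the layer sizes and random weights are assumed toy values:

#+BEGIN_SRC python
import numpy as np

def rnn_forward(x_seq, h0, Wxh, Whh, Why, bh, by):
    """Unfold a vanilla RNN: h_t = tanh(Wxh x_t + Whh h_{t-1} + bh),
    y_t = Why h_t + by. The same weights are reused at every time step."""
    h = h0
    outputs = []
    for x_t in x_seq:
        h = np.tanh(Wxh @ x_t + Whh @ h + bh)
        outputs.append(Why @ h + by)
    return np.stack(outputs), h

rng = np.random.default_rng(0)
n_in, n_hid, n_out, T = 4, 8, 3, 5               # assumed toy sizes
params = (rng.normal(scale=0.1, size=(n_hid, n_in)),
          rng.normal(scale=0.1, size=(n_hid, n_hid)),
          rng.normal(scale=0.1, size=(n_out, n_hid)),
          np.zeros(n_hid), np.zeros(n_out))
x_seq = rng.normal(size=(T, n_in))
ys, h_T = rnn_forward(x_seq, np.zeros(n_hid), *params)
print(ys.shape, h_T.shape)                       # (5, 3) (8,)
#+END_SRC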

{{{unit}}}

UNIT IV  MODEL EVALUATION \hfill 8

Performance metrics – Baseline models – Hyperparameters: Manual hyperparameter tuning – Automatic hyperparameter optimization – Grid search – Random search – Debugging strategies.
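
An illustrative Python sketch contrasting grid search and random search over two hyperparameters; the evaluate() function below is a hypothetical stand-in for training a model and returning a validation score:

#+BEGIN_SRC python
import itertools
import random

def evaluate(lr, reg):
    """Hypothetical surrogate for validation performance; in practice this
    would train a model with the given hyperparameters and score it on a
    held-out validation set."""
    return -(lr - 0.01) ** 2 - (reg - 0.001) ** 2

# Grid search: exhaustively try every combination of the listed values.
grid = {"lr": [0.001, 0.01, 0.1], "reg": [1e-4, 1e-3, 1e-2]}
best_grid = max(itertools.product(grid["lr"], grid["reg"]),
                key=lambda p: evaluate(*p))
print("grid search best:", best_grid)

# Random search: sample the same number of configurations from ranges.
random.seed(0)
candidates = [(10 ** random.uniform(-4, -1), 10 ** random.uniform(-5, -2))
              for _ in range(9)]
best_rand = max(candidates, key=lambda p: evaluate(*p))
print("random search best:", best_rand)
#+END_SRC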

{{{unit}}}

UNIT V  AUTOENCODERS AND GENERATIVE MODELS \hfill 9

Autoencoders: Undercomplete autoencoders – Regularized autoencoders – Stochastic encoders and decoders – Learning with autoencoders; Deep Generative Models: Variational autoencoders – Generative adversarial networks.
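
A minimal NumPy sketch of an undercomplete autoencoder's forward pass: the code layer is narrower than the input, so reconstruction forces a compressed representation (layer sizes and random weights are assumed; training is omitted):

#+BEGIN_SRC python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

n_in, n_code = 64, 8                         # code layer narrower than input
W_enc = rng.normal(scale=0.1, size=(n_code, n_in))
b_enc = np.zeros(n_code)
W_dec = rng.normal(scale=0.1, size=(n_in, n_code))
b_dec = np.zeros(n_in)

def encode(x):
    return sigmoid(W_enc @ x + b_enc)        # h = f(x): low-dimensional code

def decode(h):
    return sigmoid(W_dec @ h + b_dec)        # r = g(h): reconstruction

x = rng.random(n_in)
reconstruction = decode(encode(x))
loss = np.mean((x - reconstruction) ** 2)    # reconstruction error to minimize
print(f"code size {n_code}, reconstruction MSE before training: {loss:.3f}")
#+END_SRC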

\hfill Total Periods: 45

COURSE OUTCOMES

After the completion of this course, students will be able to:

  • Understand the basics of deep neural networks (K2)
  • Apply Convolutional Neural Networks for image processing (K3)
  • Apply Recurrent Neural Networks and their variants for text analysis (K3)
  • Apply model evaluation techniques for various applications (K3)
  • Understand the concepts in autoencoders and generative models (K2).

TEXT BOOKS

  1. Ian Goodfellow, Yoshua Bengio, Aaron Courville, “Deep Learning”, MIT Press, 2016.

REFERENCES

  1. Salman Khan, Hossein Rahmani, Syed Afaq Ali Shah, Mohammed Bennamoun, “A Guide to Convolutional Neural Networks for Computer Vision”, Synthesis Lectures on Computer Vision, Morgan & Claypool Publishers, 2018.
  2. Yoav Goldberg, “Neural Network Methods for Natural Language Processing”, Synthesis Lectures on Human Language Technologies, Morgan & Claypool Publishers, 2017.
  3. Francois Chollet, “Deep Learning with Python”, Manning Publications Co., 2018.
  4. Charu C. Aggarwal, “Neural Networks and Deep Learning: A Textbook”, Springer International Publishing, 2018.
  5. Josh Patterson, Adam Gibson, “Deep Learning: A Practitioner’s Approach”, O’Reilly Media, 2017.

** CO PO PSO MAPPING :noexport:

PO1 PO2 PO3 PO4 PO5 PO6 PO7 PO8 PO9 PO10 PO11 PO12 PSO1 PSO2 PSO3
K3 K6 K6 K6 K6 K6 K5 K6
CO1 (K2): 2 1 1
CO2 (K3): 3 2 2 2 2 2
CO3 (K3): 3 2 2 2 2 2
CO4 (K3): 3 2 2
CO5 (K2): 2 1 1
Score: 13 8 4 4 8 4
Course Mapping: 3 2 2 2 2 2