{{{credits}}}
L | T | P | C |
3 | 0 | 0 | 3 |
- To understand the basics of deep neural networks
- To understand CNN and RNN architectures of deep neural networks
- To comprehend advanced deep learning models
- To learn the evaluation metrics for deep learning models
{{{unit}}}
UNIT I | DEEP NETWORKS BASICS | 9 |
Linear Algebra: Scalars – Vectors – Matrices and tensors; Probability Distributions – Gradient-based Optimization – Machine Learning Basics: Capacity – Overfitting and underfitting – Hyperparameters and validation sets – Estimators – Bias and variance – Stochastic gradient descent – Challenges motivating deep learning; Deep Networks: Deep feedforward networks; Regularization – Optimization.
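A minimal NumPy sketch of stochastic gradient descent on a toy linear model, illustrating the gradient-based optimization and SGD topics listed above; the data, learning rate, and variable names are illustrative assumptions, not part of the syllabus.

```python
import numpy as np

# Stochastic gradient descent on a linear model y ~ Xw (toy data).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))                # 200 samples, 3 features
true_w = np.array([1.5, -2.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=200)  # targets with small noise

w = np.zeros(3)          # parameters to learn
lr = 0.1                 # learning rate (a hyperparameter)
for epoch in range(20):
    for i in rng.permutation(len(X)):        # one sample per update (SGD)
        err = X[i] @ w - y[i]                # prediction error
        w -= lr * err * X[i]                 # gradient of 0.5*err**2 w.r.t. w
print(w)                 # approaches true_w
```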
{{{unit}}}
UNIT II | CONVOLUTIONAL NEURAL NETWORKS | 9 |
Convolution Operation – Sparse Interactions – Parameter Sharing – Equivariance – Pooling – Convolution Variants: Strided – Tiled – Transposed and dilated convolutions; CNN Learning: Nonlinearity Functions – Loss Functions – Regularization – Optimizers – Gradient Computation.
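A minimal NumPy sketch of the 2-D convolution operation listed above (implemented as the cross-correlation used in most CNN libraries), showing parameter sharing and a strided variant; the image and kernel values are illustrative assumptions.

```python
import numpy as np

# Discrete 2-D convolution (cross-correlation form) with an optional stride.
def conv2d(image, kernel, stride=1):
    kh, kw = kernel.shape
    oh = (image.shape[0] - kh) // stride + 1
    ow = (image.shape[1] - kw) // stride + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            patch = image[i*stride:i*stride+kh, j*stride:j*stride+kw]
            out[i, j] = np.sum(patch * kernel)  # same kernel everywhere: parameter sharing
    return out

image = np.arange(36, dtype=float).reshape(6, 6)
edge_kernel = np.array([[1., 0., -1.]] * 3)     # simple vertical-edge detector
print(conv2d(image, edge_kernel, stride=2))     # strided convolution variant
```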
{{{unit}}}
UNIT III | RECURRENT NEURAL NETWORKS | 10 |
Unfolding Graphs – RNN Design Patterns: Acceptor – Encoder – Transducer; Gradient Computation – Sequence Modeling Conditioned on Contexts – Bidirectional RNN – Sequence-to-Sequence RNN – Deep Recurrent Networks – Recursive Neural Networks – Long-Term Dependencies; Leaky Units: Skip connections and dropouts; Gated Architecture: LSTM.
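A minimal NumPy sketch of unfolding a vanilla RNN over a short sequence, corresponding to the unfolding-graphs and acceptor topics above; the weight shapes and input sequence are illustrative assumptions.

```python
import numpy as np

# Unfolding a vanilla RNN: the same weights are applied at every time step.
rng = np.random.default_rng(0)
hidden, inp = 4, 3
W_xh = rng.normal(scale=0.5, size=(hidden, inp))     # input-to-hidden weights
W_hh = rng.normal(scale=0.5, size=(hidden, hidden))  # hidden-to-hidden weights (shared across time)
b_h = np.zeros(hidden)

def rnn_forward(xs):
    h = np.zeros(hidden)                    # initial hidden state
    states = []
    for x in xs:                            # one step per time step: the unfolded graph
        h = np.tanh(W_xh @ x + W_hh @ h + b_h)
        states.append(h)
    return states

sequence = [rng.normal(size=inp) for _ in range(5)]
print(rnn_forward(sequence)[-1])            # final state, as an acceptor would use
```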
{{{unit}}}
UNIT IV | MODEL EVALUATION | 8 |
Performance metrics – Baseline Models – Hyperparameters: Manual hyperparameter tuning – Automatic hyperparameter optimization – Grid search – Random search – Debugging strategies.
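A minimal Python sketch contrasting grid search and random search over two hyperparameters, as listed above; the validation-score function is a stand-in for a real trained model and is purely illustrative.

```python
import itertools
import random

# Toy "validation score" for a (learning rate, regularization) pair; higher is better.
def validation_score(lr, reg):
    return -(lr - 0.01) ** 2 - (reg - 0.001) ** 2

lrs  = [0.0001, 0.001, 0.01, 0.1]
regs = [0.0001, 0.001, 0.01]

# Grid search: evaluate every combination of the listed values.
grid_best = max(itertools.product(lrs, regs), key=lambda p: validation_score(*p))

# Random search: sample the same budget of configurations on a log scale.
random.seed(0)
samples = [(10 ** random.uniform(-4, -1), 10 ** random.uniform(-4, -2)) for _ in range(12)]
random_best = max(samples, key=lambda p: validation_score(*p))

print("grid search best:  ", grid_best)
print("random search best:", random_best)
```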
{{{unit}}}
UNIT V | AUTOENCODERS AND GENERATIVE MODELS | 9 |
Autoencoders: Undercomplete autoencoders – Regularized autoencoders – Stochastic encoders and decoders – Learning with autoencoders; Deep Generative Models: Variational autoencoders – Generative adversarial networks.
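A minimal NumPy sketch of an undercomplete linear autoencoder trained by gradient descent, illustrating the autoencoder topics above; the data, dimensions, and learning rate are illustrative assumptions.

```python
import numpy as np

# Undercomplete autoencoder: a 2-unit bottleneck must reconstruct 6-D inputs.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 6)) @ rng.normal(size=(6, 6))  # correlated toy data

W_enc = rng.normal(scale=0.1, size=(6, 2))   # encoder: 6 -> 2 (the bottleneck)
W_dec = rng.normal(scale=0.1, size=(2, 6))   # decoder: 2 -> 6

lr = 0.01
for step in range(500):
    Z = X @ W_enc                    # codes
    X_hat = Z @ W_dec                # reconstructions
    err = X_hat - X                  # reconstruction error
    # gradients of the squared reconstruction error, averaged over samples
    grad_dec = Z.T @ err / len(X)
    grad_enc = X.T @ (err @ W_dec.T) / len(X)
    W_dec -= lr * grad_dec
    W_enc -= lr * grad_enc

print("reconstruction MSE:", np.mean(err ** 2))
```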
\hfill Total Periods: 45
After the completion of this course, students will be able to:
- Understand the basics of deep neural networks (K2)
- Apply convolutional neural networks for image processing (K3)
- Apply recurrent neural networks and their variants for text analysis (K3)
- Apply model evaluation techniques to various applications (K3)
- Understand the concepts of autoencoders and generative models (K2)
- Ian Goodfellow, Yoshua Bengio, Aaron Courville, “Deep Learning”, MIT Press, 2016.
- Salman Khan, Hossein Rahmani, Syed Afaq Ali Shah, Mohammed Bennamoun, “A Guide to Convolutional Neural Networks for Computer Vision”, Synthesis Lectures on Computer Vision, Morgan & Claypool Publishers, 2018.
- Yoav Goldberg, “Neural Network Methods for Natural Language Processing”, Synthesis Lectures on Human Language Technologies, Morgan & Claypool Publishers, 2017.
- Francois Chollet, “Deep Learning with Python”, Manning Publications Co., 2018.
- Charu C. Aggarwal, “Neural Networks and Deep Learning: A Textbook”, Springer International Publishing, 2018.
- Josh Patterson, Adam Gibson, “Deep Learning: A Practitioner’s Approach”, O’Reilly Media, 2017.
CO | PO1 | PO2 | PO3 | PO4 | PO5 | PO6 | PO7 | PO8 | PO9 | PO10 | PO11 | PO12 | PSO1 | PSO2 | PSO3 |
CO1 | 3 | 2 | 2 | ||||||||||||
CO2 | 3 | 3 | 3 | 3 | 3 | 2 | |||||||||
CO3 | 3 | 3 | 3 | 3 | 3 | 2 | |||||||||
CO4 | 3 | 2 | 2 | ||||||||||||
CO5 | 3 | 2 | 2 | ||||||||||||
Score | 15 | 12 | 6 | 6 | 12 | 4 | |||||||||
Course Mapping | 3 | 3 | 3 | 3 | 3 | 2 |