L | T | P | C |
3 | 0 | 2 | 4 |
Course Objectives:
- To gain basic knowledge of the concepts and techniques of machine learning.
- To understand the working of various machine learning algorithms.
- To use various probability-based learning techniques and evolutionary models.
- To understand graphical models.
Unit I: Introduction (8 Periods)
Learning: Types of machine learning – Design of a learning system – Perspectives and issues in machine learning; Concept Learning Task: Concept learning as search – Finding a maximally specific hypothesis – Version spaces and Candidate elimination algorithm; Curse of dimensionality – Overfitting – Bias-variance tradeoff.
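For illustration, a minimal Python sketch of Find-S, the maximally-specific-hypothesis search named in this unit; the attribute encoding and toy data are assumptions for the example, not prescribed by the syllabus.

```python
# Minimal sketch of the Find-S algorithm: start with the most specific
# hypothesis and generalise it just enough to cover each positive example.
# The attribute names and toy data below are illustrative.

def find_s(examples):
    """examples: list of (attribute_tuple, label) pairs; label True = positive."""
    hypothesis = None
    for attributes, label in examples:
        if not label:
            continue                       # Find-S ignores negative examples
        if hypothesis is None:
            hypothesis = list(attributes)  # first positive: most specific hypothesis
        else:
            for i, value in enumerate(attributes):
                if hypothesis[i] != value:
                    hypothesis[i] = "?"    # generalise mismatching attributes
    return hypothesis

# Toy 'enjoy sport' style data: (sky, temperature, humidity), label
data = [
    (("sunny", "warm", "normal"), True),
    (("sunny", "warm", "high"), True),
    (("rainy", "cold", "high"), False),
]
print(find_s(data))  # ['sunny', 'warm', '?']
```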
Unit II: Linear and Non-Linear Models (10 Periods)
The Brain and the Neuron – Perceptron – Linear separability – Linear regression; Multi-Layer Perceptron: Going forwards – Going backwards: backpropagation of error – Multi-layer perceptron in practice – Examples of using the MLP – Deriving back-propagation; Radial Basis Functions and Splines: Concepts – RBF network; Support Vector Machines: Kernels.
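A minimal sketch of the perceptron learning rule covered in this unit, assuming NumPy and a {0, 1} label encoding; the learning rate, epoch count, and toy AND data are illustrative choices.

```python
import numpy as np

def train_perceptron(X, y, lr=0.1, epochs=20):
    """X: (n_samples, n_features) array, y: labels in {0, 1}."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, target in zip(X, y):
            pred = 1 if xi @ w + b > 0 else 0
            update = lr * (target - pred)   # zero when the prediction is correct
            w += update * xi
            b += update
    return w, b

# Linearly separable toy problem: logical AND
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])
w, b = train_perceptron(X, y)
print([(1 if xi @ w + b > 0 else 0) for xi in X])  # [0, 0, 0, 1]
```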
Unit III: Tree and Probabilistic Models (9 Periods)
Learning with Trees: Decision trees – Constructing decision trees – Classification and regression trees; Ensemble Learning: Boosting – Bagging – Different ways to combine classifiers; Probabilistic Learning: Gaussian mixture models – Nearest neighbor methods; Unsupervised Learning: The k-means algorithm.
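A short sketch of the k-means iteration named in this unit (assign points to the nearest centroid, then recompute centroids); the value of k, the iteration cap, and the toy data are illustrative.

```python
import numpy as np

def kmeans(X, k, iters=100, seed=0):
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # Assignment step: nearest centroid for every point
        labels = np.argmin(((X[:, None] - centroids) ** 2).sum(axis=2), axis=1)
        # Update step: move each centroid to the mean of its points
        new = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                        else centroids[j] for j in range(k)])
        if np.allclose(new, centroids):
            break                           # converged
        centroids = new
    return centroids, labels

X = np.array([[1.0, 1.0], [1.2, 0.8], [5.0, 5.0], [5.2, 4.8]])
centroids, labels = kmeans(X, k=2)
print(labels)  # two clusters: the points near (1,1) and the points near (5,5)
```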
Unit IV: Dimensionality Reduction and Evolutionary Models (9 Periods)
Dimensionality Reduction: Linear discriminant analysis – Principal component analysis – Independent component analysis; Evolutionary Learning: Genetic algorithms – Generating offspring – Genetic operators – Using genetic algorithms; Reinforcement Learning: 'Getting lost' example – Markov decision process.
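A compact sketch of principal component analysis as listed in this unit: centre the data, take the top eigenvectors of the covariance matrix, and project. The component count and random data are illustrative.

```python
import numpy as np

def pca(X, n_components):
    X_centred = X - X.mean(axis=0)
    cov = np.cov(X_centred, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)    # eigh: covariance is symmetric
    order = np.argsort(eigvals)[::-1]         # sort by decreasing variance
    components = eigvecs[:, order[:n_components]]
    return X_centred @ components             # projected data

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
print(pca(X, n_components=2).shape)  # (100, 2)
```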
Unit V: Graphical Models (9 Periods)
Markov Chain Monte Carlo Methods: Sampling – Proposal distribution – Markov Chain Monte Carlo; Graphical Models: Bayesian networks – Markov random fields – Hidden Markov models.
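A small sketch of the Metropolis-Hastings random-walk sampler behind the MCMC topics in this unit; the standard-normal target, step size, and chain length are illustrative assumptions.

```python
import math, random

def metropolis_hastings(log_target, n_samples, step=1.0, x0=0.0):
    x, samples = x0, []
    for _ in range(n_samples):
        proposal = x + random.gauss(0.0, step)  # symmetric proposal distribution
        # Accept with probability min(1, target(proposal) / target(x))
        if math.log(random.random()) < log_target(proposal) - log_target(x):
            x = proposal
        samples.append(x)
    return samples

log_std_normal = lambda x: -0.5 * x * x         # up to an additive constant
chain = metropolis_hastings(log_std_normal, n_samples=10_000)
mean = sum(chain) / len(chain)
print(round(mean, 1))  # close to 0, the mean of the target distribution
```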
List of Experiments:
- Perceptron and Linear Regression
- Multi-layer Perceptron
- Support Vector Machine
- Decision Tree algorithm (a starter sketch follows this list)
- k-Nearest Neighbor algorithm
- K-means clustering
- Random Forest and AdaBoost ensemble techniques
- Dimensionality reduction techniques: LDA, PCA
Total: 75 Periods
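The syllabus does not prescribe a library for the experiments above; assuming scikit-learn is available, a starter for the Decision Tree exercise might look like this (the dataset, depth, and split ratio are illustrative).

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Hold out 30% of the iris data for testing
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42)

clf = DecisionTreeClassifier(max_depth=3, random_state=42)  # shallow tree for readability
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
```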
Course Outcomes:
On successful completion of this course, the student will be able to:
- Explain the basic concepts of machine learning (K2)
- Analyze linear and non-linear techniques for classification problems (K4)
- Apply tree-based and probabilistic models to given problems (K3)
- Apply various dimensionality reduction techniques and evolutionary models (K3)
- Explain the concepts of graphical models (K2)
References:
- Stephen Marsland, “Machine Learning: An Algorithmic Perspective”, Second Edition, Chapman and Hall/CRC Machine Learning and Pattern Recognition Series, 2014.
- Tom M. Mitchell, “Machine Learning”, First Edition, McGraw Hill Education, 2013.
- Ethem Alpaydin, “Introduction to Machine Learning” (Adaptive Computation and Machine Learning Series), Third Edition, MIT Press, 2014.
- Jason Bell, “Machine Learning: Hands-On for Developers and Technical Professionals”, First Edition, Wiley, 2014.
- Peter Flach, “Machine Learning: The Art and Science of Algorithms that Make Sense of Data”, First Edition, Cambridge University Press, 2012.
CO-PO Mapping:
| CO | K Level | PO1 (K3) | PO2 (K6) | PO3 (K6) | PO4 (K6) | PO5 (K6) | PO6 | PO7 | PO8 | PO9 | PO10 | PO11 |
| CO1 | K2 | 2 | | | | | | | | | | |
| CO2 | K4 | 3 | 2 | 2 | 2 | 2 | 2 | | | | | |
| CO3 | K3 | 3 | 2 | 2 | 2 | 2 | | | | | | |
| CO4 | K3 | 3 | 2 | 2 | 2 | 2 | | | | | | |
| CO5 | K2 | 2 | | | | | | | | | | |
| Total | | 13 | 6 | 6 | 6 | 6 | 2 | | | | | |
| Course Mapping | | 3 | 2 | 2 | 2 | 2 | 2 | | | | | |