Machine Learning

Paper Code: MCA 525E
Credits: 04
Periods/week: 04
Max. Marks: 100.00
Objectives: 
  • To understand the basic concepts, approaches, and techniques in machine learning
  • To develop a deeper understanding of supervised and unsupervised learning
  • To develop the design and programming skills needed to build intelligent, adaptive artifacts
  • To develop the basic skills necessary to pursue research in machine learning
Unit I: Introduction (10 hours)

Machine Learning - Machine Learning Foundations - Overview - Applications - Types of Machine Learning - Basic Concepts in Machine Learning - Examples of Machine Learning.
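
As a quick illustration of the basic supervised workflow named above (train on labelled examples, evaluate on held-out data), a minimal sketch assuming scikit-learn is installed; the Iris dataset and nearest-neighbour classifier are illustrative choices only:

    # Minimal train/test workflow sketch; dataset and model are placeholders.
    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split
    from sklearn.neighbors import KNeighborsClassifier

    X, y = load_iris(return_X_y=True)            # features and labels
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, random_state=0)    # hold out a test set

    clf = KNeighborsClassifier(n_neighbors=3)    # a simple baseline learner
    clf.fit(X_train, y_train)                    # learn from labelled examples
    print("test accuracy:", clf.score(X_test, y_test))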

Unit II: Supervised Learning (12 hours)

Introduction - Linear Models for Classification - Linear Regression - Logistic Regression - Bayesian Logistic Regression - Probabilistic Models. Neural Networks - Feed-forward Network Functions - Error Backpropagation - Regularization - Bayesian Neural Networks - Radial Basis Function Networks. Ensemble Methods - Random Forest - Bagging - Boosting.
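
A from-scratch sketch of logistic regression trained by gradient descent on the cross-entropy loss, one of the linear classification models listed above; the synthetic data and learning rate are illustrative assumptions (NumPy only):

    # Logistic regression via gradient descent; data are synthetic.
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))                    # 200 points, 2 features
    y = (X[:, 0] + X[:, 1] > 0).astype(float)        # linearly separable labels

    w, b = np.zeros(2), 0.0
    lr = 0.1                                         # assumed learning rate
    for _ in range(500):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))       # sigmoid predictions
        grad_w = X.T @ (p - y) / len(y)              # gradient of mean loss
        grad_b = np.mean(p - y)
        w -= lr * grad_w                             # gradient-descent step
        b -= lr * grad_b

    pred = 1.0 / (1.0 + np.exp(-(X @ w + b))) > 0.5
    print("training accuracy:", np.mean(pred == y))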

Unit III: Unsupervised Learning (12 hours)

Clustering - K-Means Clustering - EM (Expectation-Maximization) - Mixtures of Gaussians - The EM Algorithm in General - The Curse of Dimensionality - Dimensionality Reduction - Factor Analysis - Principal Component Analysis - Probabilistic PCA - Independent Component Analysis. Challenges for Big Data Analytics.
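
A bare-bones k-means sketch corresponding to the clustering topics above, alternating the assignment and update steps until the centers settle; the synthetic two-blob data, k = 2, and the fixed iteration budget are illustrative assumptions:

    # K-means on two synthetic Gaussian blobs; k and data are placeholders.
    import numpy as np

    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 1, (100, 2)),       # blob around the origin
                   rng.normal(5, 1, (100, 2))])      # blob around (5, 5)

    k = 2
    centers = X[rng.choice(len(X), k, replace=False)]  # random initial centers
    for _ in range(20):
        # assignment step: each point joins its nearest center
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # update step: each center moves to the mean of its points
        centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])

    print("final centers:")
    print(centers)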

Unit IV: Probabilistic Graphical Models (12 hours)

Directed Graphical Models - Bayesian Networks - Exploiting Independence Properties - From Distributions to Graphs - Examples - Inference in Graphical Models - Learning - Naive Bayes Classifiers - Markov Models - Hidden Markov Models. Undirected Graphical Models - Markov Random Fields - Conditional Independence Properties.
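
A short sketch of the HMM forward algorithm from the topics above, which computes the probability of an observation sequence by propagating state beliefs through the transition and emission matrices; the two-state model and all its probabilities are invented for illustration:

    # HMM forward algorithm; the model parameters below are made up.
    import numpy as np

    A = np.array([[0.7, 0.3],          # state-transition probabilities
                  [0.4, 0.6]])
    B = np.array([[0.9, 0.1],          # emission probabilities per state
                  [0.2, 0.8]])
    pi = np.array([0.5, 0.5])          # initial state distribution
    obs = [0, 1, 1, 0]                 # an observed symbol sequence

    alpha = pi * B[:, obs[0]]          # initialise with the first observation
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]  # propagate, then weight by emission

    print("P(observations) =", alpha.sum())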

Unit V: Advanced Learning (14 hours)

Sampling - Basic Sampling Methods - Monte Carlo Methods. Reinforcement Learning - K-Armed Bandit - Elements of Reinforcement Learning - Model-Based Learning - Value Iteration - Policy Iteration - Temporal Difference Learning - Exploration Strategies - Deterministic and Non-deterministic Rewards and Actions - Eligibility Traces - Generalization - Partially Observable States - The Setting - Example. Semi-Supervised Learning. Computational Learning Theory.
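
An epsilon-greedy sketch of the K-armed bandit listed above, estimating each arm's value by an incremental sample average; the arm count, true reward means, and epsilon are illustrative assumptions:

    # Epsilon-greedy K-armed bandit; reward means are invented for the demo.
    import numpy as np

    rng = np.random.default_rng(0)
    true_means = np.array([0.2, 0.5, 0.8])   # hidden from the learner
    Q = np.zeros(3)                          # estimated value of each arm
    N = np.zeros(3)                          # pull counts
    eps = 0.1                                # assumed exploration rate

    for _ in range(2000):
        if rng.random() < eps:
            a = rng.integers(3)              # explore: pick a random arm
        else:
            a = Q.argmax()                   # exploit: best estimate so far
        r = rng.normal(true_means[a], 1.0)   # noisy reward
        N[a] += 1
        Q[a] += (r - Q[a]) / N[a]            # incremental sample average

    print("estimated values:", Q.round(2))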

ESSENTIAL READINGS: 
  • Christopher Bishop, “Pattern Recognition and Machine Learning”, Springer, 2006.
  • Ethem Alpaydin, “Introduction to Machine Learning”, Prentice Hall of India, 2005.
REFERENCES: 
  • Tom Mitchell, “Machine Learning”, McGraw-Hill, 1997.
  • Stephen Marsland, “Machine Learning: An Algorithmic Perspective”, CRC Press, 2009.
  • Kevin P. Murphy, “Machine Learning: A Probabilistic Perspective”, MIT Press, 2012.
Academic Year: