Credits: 3
Description
A broad introduction to the foundations of Machine Learning (ML), with hands-on experience applying ML algorithms to real-world data sets. Topics include techniques in supervised and unsupervised learning, with applications to computer vision, data mining, and speech recognition.
Prerequisite: 1 course with a minimum grade of C- from (ENEE324, STAT400); and 1 course with a minimum grade of C- from (ENEE150, CMSC216); and permission of ENGR-Electrical & Computer Engineering department.
Restriction: Permission of ENGR-Electrical & Computer Engineering department; and must be in one of the following programs (Engineering: Electrical; Engineering: Computer), or must be in the ECE Department's Machine Learning notation program.
Credit only granted for: ENEE436, ENEE439M, or CMSC422.
Formerly: ENEE439M.
Semesters Offered
Fall 2020, Spring 2021, Fall 2021, Spring 2022, Fall 2022, Spring 2023, Fall 2023, Spring 2024, Fall 2024, Spring 2025, Fall 2025
Learning Objectives
- Learn the mathematical foundations of the field of machine learning.
- Gain insight on how to pose various problems in data analysis in the framework of machine learning.
- Implement classical and state-of-the-art machine learning algorithms on real-world data sets.
Topics Covered
- Overview: Why and What of Machine Learning (Ch. 1)
- Review: Probability (Appendix A)
- Review: Linear Algebra (Appendix A)
- Bayes decision theory (Ch. 2.1 – 2.3, excluding 2.3.1 and 2.3.2)
- Bayesian classifiers: Gaussian case (Ch. 2.4 – 2.9, excluding 2.8.1 and 2.8.2)
- Maximum likelihood estimation (Ch. 3.1 – 3.2)
- Principal component analysis (PCA tutorials, and Ch. 3.8.1)
- Fisher’s linear discriminant (Ch. 3.8.2 and 3.8.3)
- Nearest neighbor rule (Ch. 4.5, excluding 4.5.5, and Ch. 4.6.2)
- The Perceptron algorithm (Ch. 5.2 – 5.5)
- Convex optimization and stochastic gradient descent (Lecture Notes and Papers)
- Support vector machines (Ch. 5.6.1 and 5.11, Lecture Notes)
- Neural networks (Ch. 6.1, 6.2 and 6.3)
- Deep learning (Lecture Notes and Papers)
- PyTorch tutorial (Lecture Notes and Papers)
- Unsupervised learning (Ch. 10.1, 10.2, 10.3, 10.4, excluding 10.4.4)
- Clustering (Ch. 10.6, 10.7, 10.9.1, and 10.9.2)
- Spectral clustering (Lecture Notes and Papers)
- Expectation maximization (Lecture Notes and Papers)
- Hidden Markov models (Lecture Notes and Papers)
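As a taste of the classical algorithms covered above, the Perceptron can be implemented in a few lines. The sketch below is an illustrative NumPy implementation (not course-provided code), assuming labels in {-1, +1} and linearly separable toy data:

```python
import numpy as np

def perceptron(X, y, epochs=20, lr=1.0):
    """Train a perceptron on labels y in {-1, +1}.

    Returns a weight vector w and bias b such that
    sign(X @ w + b) predicts y on separable data.
    """
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(epochs):
        for i in range(n):
            # Update only on misclassified (or boundary) points.
            if y[i] * (X[i] @ w + b) <= 0:
                w += lr * y[i] * X[i]
                b += lr * y[i]
    return w, b

# Toy linearly separable data: class +1 roughly above the line x1 + x2 = 1.
X = np.array([[2.0, 2.0], [1.5, 1.0], [0.0, 0.0], [-1.0, 0.5]])
y = np.array([1, 1, -1, -1])
w, b = perceptron(X, y)
preds = np.sign(X @ w + b)  # matches y once the algorithm has converged
```

The Perceptron convergence theorem (covered with Ch. 5.2 – 5.5) guarantees that this update rule terminates with a separating hyperplane whenever one exists; on non-separable data the loop simply runs for the fixed number of epochs.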