:Course title
:Tentative Course Outline
Python Programming and Essential Scientific Libraries: NumPy, pandas, matplotlib, statsmodels, scikit-learn, PyTorch (a short illustrative sketch follows this outline)
Fundamentals of Linear Algebra and Optimisation for Machine Learning in Python
A Gentle Introduction to Computational Learning Theory
Types of Learning (Supervised, Unsupervised, and Reinforcement)
Supervised Modeling: Regression vs Classification
Parametric versus Semiparametric and Nonparametric Models
Kernel Methods: VC Dimension and Support Vector Machines (SVM)
Tree-Based Models and Random Forests
Neural Networks and Deep Neural Networks
Model Selection and Feature Extraction
Model Selection and Boosting: XGBoost
Unsupervised Modeling and Clustering
k-means Clustering, Principal Component Analysis, Autoencoders, and Factor Analysis
Reinforcement Modeling
Inference and Learning
Probabilistic Machine Learning
Practical Advice for ML Projects
This is only a tentative course outline; as the course develops, some topics will likely be expanded or split into multiple sub-topics.
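To give a flavour of the tooling listed in the first topic, here is a minimal, self-contained sketch that combines NumPy, pandas, matplotlib, and scikit-learn to fit and evaluate a simple classifier. The synthetic dataset and the choice of logistic regression are assumptions made purely for illustration and are not part of the course material.

# Illustrative sketch only: the synthetic data and the logistic-regression
# model are assumptions for demonstration, not course content.
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

# Generate a small synthetic binary-classification problem.
X, y = make_classification(n_samples=200, n_features=2, n_informative=2,
                           n_redundant=0, random_state=0)

# Wrap the data in a pandas DataFrame, as one typically would with real data.
df = pd.DataFrame(X, columns=["x1", "x2"])
df["label"] = y

# Split into training and test sets, fit the classifier, and report accuracy.
X_train, X_test, y_train, y_test = train_test_split(
    df[["x1", "x2"]], df["label"], test_size=0.25, random_state=0)
model = LogisticRegression().fit(X_train, y_train)
print("test accuracy:", accuracy_score(y_test, model.predict(X_test)))

# Plot the data coloured by class label.
plt.scatter(df["x1"], df["x2"], c=df["label"], cmap="coolwarm", s=15)
plt.xlabel("x1")
plt.ylabel("x2")
plt.title("Synthetic classification data")
plt.show()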
:Main References
This is a short list of interesting and useful books that will be drawn on during the course; you should consult them occasionally.
Gilbert Strang, Linear Algebra and Learning from Data, Wellesley-Cambridge Press, 2019.
Trevor Hastie, Robert Tibshirani, and Jerome Friedman, The Elements of Statistical Learning: Data Mining, Inference, and Prediction, Springer, 2017. (Available Online)
Shai Shalev-Shwartz and Shai Ben-David, Understanding Machine Learning: From Theory to Algorithms, Cambridge University Press, 2014. (Available Online)
Christopher M. Bishop, Pattern Recognition and Machine Learning, Springer, 2006.