CS6140/4420 Machine Learning Sec 1, FALL 2023


* Schedule and materials subject to change
Week / Module | Topic / Lecture | Other Reading | Assignment
  • 9/7 - 9/11
  • Week 1 / Module 1: Intro, Decision Tree
  • Course Map

  • Topics:
  • Administrative
  • Intro to ML, Matrix Data
  • Rule-based Classifiers
  • Decision/Regression Trees
  • Linear Regression
  • HW 1
  • Due: Mon 9/18
  • 9/11 - 9/18
  • Topics:
  • Setup, Cross Validation
  • Error, Accuracy, ROC, AUC


Notes: Ridge Regression (normal equations; see the worked form below)

  • DHS ch 5 (up to 5.4.2)
  • KMPP ch 7 (up to 7.5)
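
A worked form of the ridge normal equations referenced in the note above (a minimal sketch; X is the n x d design matrix, y the target vector, lambda > 0 the regularization strength):

    \min_w \; \|Xw - y\|^2 + \lambda \|w\|^2
    \qquad\Longrightarrow\qquad
    w^{*} = (X^{\top} X + \lambda I)^{-1} X^{\top} y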



  • 9/18 - 9/25
  • Topics:
  • Gradient Descent
  • Linear Regression with GD
  • Logistic Regression (sketch below)
  • Newton Method
  • DHS ch 5
  • KMPP ch 7, 8
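
A minimal sketch of batch gradient descent applied to logistic regression, tying together this week's topics (the synthetic data, learning rate, and iteration count are illustrative assumptions, not course-prescribed values):

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def logistic_regression_gd(X, y, lr=0.1, n_iters=1000):
        # X: (n, d) features, y: (n,) labels in {0, 1}
        n, d = X.shape
        w, b = np.zeros(d), 0.0
        for _ in range(n_iters):
            p = sigmoid(X @ w + b)        # predicted probabilities
            grad_w = X.T @ (p - y) / n    # gradient of the mean cross-entropy loss
            grad_b = np.mean(p - y)
            w -= lr * grad_w              # one batch gradient-descent step
            b -= lr * grad_b
        return w, b

    # toy usage: two Gaussian blobs
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(-1, 1, (50, 2)), rng.normal(+1, 1, (50, 2))])
    y = np.array([0] * 50 + [1] * 50)
    w, b = logistic_regression_gd(X, y)
    print("train accuracy:", np.mean((sigmoid(X @ w + b) > 0.5) == y))
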
  • 9/25 - 10/2
  • Topics:
  • Perceptrons (sketch below)
  • Neural Networks


  • DHS ch 6
  • HW2B
  • Due: Fri 10/13
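
A minimal sketch of the classic perceptron update rule from this week's topics (labels are assumed to be in {-1, +1}; the epoch count is illustrative):

    import numpy as np

    def perceptron(X, y, n_epochs=10):
        # X: (n, d) features, y: (n,) labels in {-1, +1}
        n, d = X.shape
        w, b = np.zeros(d), 0.0
        for _ in range(n_epochs):
            for xi, yi in zip(X, y):
                if yi * (xi @ w + b) <= 0:   # misclassified (or on the boundary)
                    w += yi * xi             # mistake-driven update
                    b += yi
        return w, b
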
  • 10/2 - 10/11
  • Topics:
  • Probabilities as data densities
  • Maximum Likelihood: fitting parameters to data
  • Gaussian Discriminant Analysis
  • Naive Bayes (sketch below)
  • DHS ch 2, 3
  • KMPP ch 2, 3, 4
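
A minimal sketch of Gaussian Naive Bayes with maximum-likelihood parameter estimates, combining this week's MLE and Naive Bayes topics (the small variance floor 1e-9 is an illustrative numerical-stability assumption):

    import numpy as np

    class GaussianNaiveBayes:
        def fit(self, X, y):
            # Maximum-likelihood estimates: class priors, per-class feature means and variances.
            self.classes_ = np.unique(y)
            self.priors_ = np.array([np.mean(y == c) for c in self.classes_])
            self.means_ = np.array([X[y == c].mean(axis=0) for c in self.classes_])
            self.vars_ = np.array([X[y == c].var(axis=0) + 1e-9 for c in self.classes_])
            return self

        def predict(self, X):
            # Log joint = log prior + sum of per-feature Gaussian log-densities
            # (the "naive" step: features are treated as independent given the class).
            log_joint = np.log(self.priors_) + np.array([
                -0.5 * np.sum(np.log(2 * np.pi * self.vars_[k])
                              + (X - self.means_[k]) ** 2 / self.vars_[k], axis=1)
                for k in range(len(self.classes_))
            ]).T
            return self.classes_[np.argmax(log_joint, axis=1)]
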
  • 10/11 - 10/21
  • Topics:
  • EM algorithm for fitting mixtures (update equations below)
  • Graphical Models
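
A sketch of the two EM updates for a K-component Gaussian mixture, in standard notation (responsibilities gamma, mixing weights pi, means mu, covariances Sigma):

    E-step:  \gamma_{ik} = \frac{\pi_k \, \mathcal{N}(x_i \mid \mu_k, \Sigma_k)}
                                {\sum_{j=1}^{K} \pi_j \, \mathcal{N}(x_i \mid \mu_j, \Sigma_j)}

    M-step:  N_k = \sum_i \gamma_{ik}, \quad
             \pi_k = \frac{N_k}{n}, \quad
             \mu_k = \frac{1}{N_k} \sum_i \gamma_{ik} x_i, \quad
             \Sigma_k = \frac{1}{N_k} \sum_i \gamma_{ik} (x_i - \mu_k)(x_i - \mu_k)^{\top}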

  • 10/21 - 11/3
  • Topics:
  • Online Learning
  • AdaBoost Algorithm (sketch below)
  • Bagging
  • RankBoost, Gradient Boosting
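
A minimal sketch of AdaBoost with decision stumps from this week's topics (labels assumed in {-1, +1}; using scikit-learn's DecisionTreeClassifier as the weak learner is an assumption of this sketch, not a course requirement):

    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    def adaboost(X, y, n_rounds=50):
        # X: (n, d) features, y: (n,) labels in {-1, +1}
        n = len(y)
        D = np.full(n, 1.0 / n)                                  # example weights, start uniform
        stumps, alphas = [], []
        for _ in range(n_rounds):
            stump = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=D)
            pred = stump.predict(X)
            eps = np.clip(np.sum(D[pred != y]), 1e-10, 1 - 1e-10)  # weighted error
            alpha = 0.5 * np.log((1 - eps) / eps)                # weight of this round's stump
            D *= np.exp(-alpha * y * pred)                       # up-weight the mistakes
            D /= D.sum()
            stumps.append(stump)
            alphas.append(alpha)
        def predict(X_new):
            return np.sign(sum(a * s.predict(X_new) for a, s in zip(alphas, stumps)))
        return predict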


  • 11/3 - 11/13
  • Topics:
  • Active Learning and VC Dimension
  • Multiclass ECOC


  • DHS ch 9.5
  • KMPP ch 16.6
  • KMPP ch 27.6.2
  • 11/13 - 11/20
  • Topics:
  • Margins, Boosting Feature Analysis
  • PCA and LDA, Lagrange Multipliers (PCA sketch below)
  • Missing Values
  • PCA: DHS ch 10
  • KMPP ch 25
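
A minimal sketch of PCA via eigendecomposition of the covariance matrix, the solution the Lagrange-multiplier derivation above leads to (the number of components k is an illustrative parameter):

    import numpy as np

    def pca(X, k):
        # Center the data, then keep the top-k eigenvectors of the covariance matrix.
        Xc = X - X.mean(axis=0)
        cov = Xc.T @ Xc / (len(X) - 1)
        eigvals, eigvecs = np.linalg.eigh(cov)        # eigh: symmetric input, ascending eigenvalues
        order = np.argsort(eigvals)[::-1][:k]         # top-k directions by explained variance
        components = eigvecs[:, order]                # (d, k) projection matrix
        return Xc @ components, components            # projected data and principal directions
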
  • 11/20 - 11/27
  • Topics:
  • Haar Features for Images
  • Regularized Regression: Ridge and LASSO
  • Text Features

Thanksgiving Break: no class/OH
Wed 11/22 - Sun 11/26

Stanford slides

Regularized Logistic Regression - Andrew Ng (paper, screencast)

Haar features integrated with boosting




  • 11/27 - 12/2
  • Topics:
  • Support Vector Machines
  • Duality with KKT conditions (dual problem sketched below)
  • Maximizing Margins via Constrained Optimization
  • SMO Algorithm


  • DHS ch 5.11
  • KMPP ch 14
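
For reference, a sketch of the hard-margin SVM dual that the duality/KKT lecture builds toward (the SMO algorithm optimizes two \alpha coordinates at a time under these constraints):

    \max_{\alpha} \;\; \sum_{i=1}^{n} \alpha_i
        - \frac{1}{2} \sum_{i,j} \alpha_i \alpha_j \, y_i y_j \, x_i^{\top} x_j
    \qquad \text{s.t.} \qquad \alpha_i \ge 0, \quad \sum_{i=1}^{n} \alpha_i y_i = 0
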
  • 12/2 - 12/9
  • Topics:
  • Kernels (RBF sketch below)
  • Kernels for SVM

paper: Kernel Methods in Machine Learning
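
A minimal sketch of the kernel idea behind this week's topics: replace every inner product x_i^T x_j with a kernel value k(x_i, x_j), e.g. the RBF kernel below (the bandwidth gamma is an illustrative parameter):

    import numpy as np

    def rbf_kernel(X1, X2, gamma=1.0):
        # k(x, z) = exp(-gamma * ||x - z||^2), computed for all pairs at once
        sq_dists = (np.sum(X1 ** 2, axis=1)[:, None]
                    + np.sum(X2 ** 2, axis=1)[None, :]
                    - 2 * X1 @ X2.T)
        return np.exp(-gamma * sq_dists)

    # In the SVM dual of the previous week, x_i^T x_j is then replaced by K[i, j],
    # where K = rbf_kernel(X, X); the optimization itself is unchanged.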



  • 12/2 - 12/9
  • Topics:
  • Catching up
  • Recap
  • Extra Demo Days




  • 12/4 - 12/11

  • Week 14 / Module 7: Locality and Similarity
  • Course Map

  • Topics:
  • K-Nearest Neighbor (sketch below)
  • Kernel Similarity and KNN
  • Kernel Density Estimation
  • Heat Kernels, Harmonic Equation
  • Collaborative Filtering
  • KMPP ch 14, ch 1
  • DHS ch 4
  • Clustering: DHS ch 3.8, KMPP ch 12.2
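
A minimal sketch of k-nearest-neighbor classification for this module (Euclidean distance and majority vote; the value of k is illustrative):

    import numpy as np
    from collections import Counter

    def knn_predict(X_train, y_train, X_test, k=5):
        # X_train: (n, d), y_train: (n,) array of labels, X_test: (m, d)
        preds = []
        for x in X_test:
            dists = np.linalg.norm(X_train - x, axis=1)   # distance to every training point
            nearest = np.argsort(dists)[:k]               # indices of the k closest points
            preds.append(Counter(y_train[nearest]).most_common(1)[0][0])  # majority vote
        return np.array(preds)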

  • Final Exam: Mon Dec 11, 2pm-6pm, WVH rooms 210-212

Final Exam: you'll need a laptop. Expect programming problems similar to the HW, with about 1 hour to code and 15 minutes to run. You will have access to the internet, but what you implement has to be your own.


Submit a copy of your code on Gradescope together with running instructions. The TAs encourage you to use Jupyter Notebook, but this is not a requirement for the exam.