MODULE 2: GRADIENT DESCENT, LOGISTIC REGRESSION, PERCEPTRON, NEURAL NETS
STICKY POINTS

* GRADIENT DESCENT PROCEDURE - UNDERSTANDING
  It's like finding the minimum, but we can't solve the equation f' = 0 in closed form, so we move little by little in the direction of the solution (see the gradient descent sketch after this outline).

* GD LEARNING RATES
  too big: the iterates jump around and can diverge
  too small: convergence takes a long time
  shrinking: start bigger, lower the rate as we move along

* LOGISTIC REGRESSION UPDATE RULE
  It just happens to have the same form as the linear regression update, but make sure to understand that the optimization is for a different objective (see the logistic regression sketch after this outline).

* WHICH REGRESSION FOR WHICH DATASET?
  linear regression is better suited to quantitative (real-valued) labels
  logistic regression is better suited to classification, since we can interpret the predictor's output as a probability

* PERCEPTRON UPDATE RULE (see the perceptron sketch after this outline)

* PERCEPTRON CLASSIFICATION

* NEURAL NET / AUTOENCODER SETUP
  - use a physical network setup (an object for each node); see the node-object sketch after this outline

* NEURAL NET BACKPROPAGATION UPDATES
  - explain the procedure first, before getting into the mathematical details of gradient descent

* AUC AND ROC (see the ROC/AUC sketch after this outline)
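
GRADIENT DESCENT SKETCH
A minimal sketch of the gradient descent loop and the effect of the learning rate, in plain Python. The quadratic f(w) = (w - 3)^2, the step sizes, and the function names are illustrative choices, not from the notes.

# Gradient descent on f(w) = (w - 3)**2, whose derivative is f'(w) = 2 * (w - 3).
# Pretend we cannot solve f'(w) = 0 directly, so we step downhill a little at a time.

def gradient_descent(grad, w0, learning_rate, steps):
    w = w0
    for _ in range(steps):
        w = w - learning_rate * grad(w)    # small step against the gradient
    return w

def gradient_descent_shrinking(grad, w0, eta0, steps):
    w = w0
    for t in range(steps):
        eta = eta0 / (1 + t)               # shrinking schedule: start bigger, decay over time
        w = w - eta * grad(w)
    return w

grad = lambda w: 2 * (w - 3)

print(gradient_descent(grad, w0=0.0, learning_rate=0.1,  steps=100))   # settles near the minimum at 3
print(gradient_descent(grad, w0=0.0, learning_rate=1.1,  steps=100))   # too big: overshoots back and forth and diverges
print(gradient_descent(grad, w0=0.0, learning_rate=1e-4, steps=100))   # too small: barely moves from 0
print(gradient_descent_shrinking(grad, w0=0.0, eta0=1.0, steps=100))   # big early steps, fine-tuning later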
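
LOGISTIC REGRESSION SKETCH
A sketch, assuming NumPy, of why the logistic regression update "happens to look the same" as the linear regression one: both steps have the form w <- w + eta * X^T (y - prediction), but the prediction and the objective being optimized differ. The toy data, step size, and function names are illustrative.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gd_step_linear(w, X, y, eta):
    # Linear regression: minimize squared error; prediction is X @ w.
    # Shown only for comparison with the logistic step below.
    residual = y - X @ w
    return w + eta * X.T @ residual

def gd_step_logistic(w, X, y, eta):
    # Logistic regression: minimize cross-entropy (maximize log-likelihood);
    # prediction is sigmoid(X @ w), interpreted as P(y = 1 | x).
    residual = y - sigmoid(X @ w)
    return w + eta * X.T @ residual        # same algebraic form, different objective

# toy data: two features plus a constant bias column; labels in {0, 1}
X = np.array([[1.0, 0.5, 1.0],
              [2.0, 1.0, 1.0],
              [3.0, 2.5, 1.0],
              [4.0, 3.0, 1.0]])
y = np.array([0.0, 0.0, 1.0, 1.0])

w = np.zeros(3)
for _ in range(2000):
    w = gd_step_logistic(w, X, y, eta=0.1)
print(sigmoid(X @ w))   # predicted probabilities: first two below 0.5, last two above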
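
PERCEPTRON SKETCH
A sketch of the perceptron update rule and classification, assuming labels in {-1, +1} and a bias folded in as a constant feature; the toy data and function names are illustrative.

import numpy as np

def perceptron_train(X, y, epochs=10):
    # X: rows are examples (with a trailing 1 for the bias), y: labels in {-1, +1}
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for x_i, y_i in zip(X, y):
            if y_i * (w @ x_i) <= 0:   # misclassified (or on the boundary)
                w = w + y_i * x_i      # update rule: nudge w toward the correct side
    return w

def perceptron_classify(w, x):
    return 1 if w @ x > 0 else -1      # classification: the sign of the dot product

# toy linearly separable data
X = np.array([[ 2.0,  1.0, 1.0],
              [ 1.0,  2.0, 1.0],
              [-1.0, -2.0, 1.0],
              [-2.0, -1.0, 1.0]])
y = np.array([1, 1, -1, -1])

w = perceptron_train(X, y)
print([perceptron_classify(w, x) for x in X])   # reproduces y on separable data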
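
NEURAL NET / BACKPROPAGATION SKETCH
A sketch of the "object for each node" setup and the backpropagation procedure (forward pass, then pass error messages back down, then a gradient descent update at each node), assuming sigmoid activations and squared error on a tiny 2-2-1 network; the class and variable names are illustrative.

import math, random

class Neuron:
    """One node of the network: holds its own weights, last inputs, and last output."""
    def __init__(self, n_inputs):
        self.w = [random.uniform(-0.5, 0.5) for _ in range(n_inputs)]
        self.b = 0.0

    def forward(self, inputs):
        self.inputs = inputs
        z = sum(w_i * x_i for w_i, x_i in zip(self.w, inputs)) + self.b
        self.out = 1.0 / (1.0 + math.exp(-z))        # sigmoid activation
        return self.out

    def backward(self, d_out, lr):
        # d_out: dLoss/d(this neuron's output), passed down from the layer above
        d_z = d_out * self.out * (1.0 - self.out)    # chain through the sigmoid
        d_inputs = [d_z * w_i for w_i in self.w]     # messages for the layer below
        # gradient descent update on this neuron's own parameters
        self.w = [w_i - lr * d_z * x_i for w_i, x_i in zip(self.w, self.inputs)]
        self.b -= lr * d_z
        return d_inputs

random.seed(0)
hidden = [Neuron(2), Neuron(2)]
output = Neuron(2)
x, y = [1.0, 0.0], 1.0

for step in range(50):
    # forward: each node computes and remembers its output
    h = [n.forward(x) for n in hidden]
    y_hat = output.forward(h)
    loss = 0.5 * (y_hat - y) ** 2
    # backward: feed dLoss/d(output) into the top node, pass its messages down
    d_h = output.backward(y_hat - y, lr=0.5)
    for n, d in zip(hidden, d_h):
        n.backward(d, lr=0.5)

print(loss)   # should be small after repeatedly fitting this one example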
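
ROC / AUC SKETCH
A sketch of building an ROC curve by sweeping the decision threshold and computing AUC with the trapezoid rule, assuming labels in {0, 1} and that higher scores mean "more likely positive"; ties between scores are ignored for simplicity, and the toy data is illustrative.

def roc_curve_points(y_true, scores):
    # Sweep the threshold from high to low; at each threshold record
    # (false positive rate, true positive rate).
    pairs = sorted(zip(scores, y_true), reverse=True)
    n_pos = sum(y_true)
    n_neg = len(y_true) - n_pos
    tp = fp = 0
    points = [(0.0, 0.0)]
    for score, label in pairs:
        if label == 1:
            tp += 1
        else:
            fp += 1
        points.append((fp / n_neg, tp / n_pos))
    return points

def auc(points):
    # Area under the ROC curve via the trapezoid rule.
    area = 0.0
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        area += (x1 - x0) * (y0 + y1) / 2.0
    return area

y_true = [0, 0, 1, 0, 1, 1]
scores = [0.1, 0.4, 0.35, 0.8, 0.65, 0.9]
pts = roc_curve_points(y_true, scores)
print(pts)        # the ROC curve as (FPR, TPR) points
print(auc(pts))   # about 0.667 here; 1.0 = perfect ranking, 0.5 = random scoring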