CS 7150: Deep Learning - Spring 2026

Time & Location:

This is an online, asynchronous course.

Staff

Instructor: Paul Hand
Email: p.hand@northeastern.edu     Office Hours: TBD.

Course Description

Note: This differs from the official course description. Please read carefully.

Introduction to deep learning, including the statistical learning framework, empirical risk minimization, loss function selection, fully connected layers, convolutional layers, pooling layers, batch normalization, multi-layer perceptrons, convolutional neural networks, autoencoders, U-nets, residual networks, gradient descent, stochastic gradient descent, backpropagation, autograd, visualization of neural network features, robustness and adversarial examples, interpretability, continual learning, and applications in computer vision and natural language processing. Assumes students already have a basic knowledge of machine learning, optimization, linear algebra, and statistics.

Overview

By the end of this course, students should be able to:

Expected Background of Students

Students are expected to have experience with machine learning, through either coursework or applied projects; basic familiarity with the concept of a gradient from multivariable calculus and with maximum likelihood estimation from probability and statistics; and familiarity with programming in Python. The course will use PyTorch. If you do not have experience with PyTorch, a good resource is the book Deep Learning with PyTorch, Second Edition.
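As a rough self-check of the expected calculus background, here is a minimal sketch of gradient descent on a one-dimensional loss, written in plain Python. The function name, step size, and iteration count are illustrative choices, not course material; if the idea of following the negative gradient toward a minimizer is unfamiliar, plan to review it before the semester starts.

```python
def grad_descent(w0=0.0, lr=0.1, steps=100):
    """Minimize f(w) = (w - 3)^2 by repeatedly stepping against the gradient."""
    w = w0
    for _ in range(steps):
        grad = 2 * (w - 3)  # d/dw of (w - 3)^2
        w -= lr * grad      # gradient-descent update
    return w

print(grad_descent())  # approaches the minimizer w = 3
```

PyTorch automates exactly this pattern: `autograd` computes the gradient for you, and an optimizer such as `torch.optim.SGD` applies the update step.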

Specific Skills You Will Develop

As a result of this course, you will learn and become stronger in:

Other Skills You Will Develop

As a result of this course, you will learn and become stronger in:

Course Structure and Work:

For each week of the course, the instructor will share a video introducing that week's technical content. Based on the video, the instructor will provide several tasks for you to complete in order to develop your understanding of the material: questions to answer, problems to solve, and computer code to write. Each task will specify whether you submit your response in writing or as a video recording of yourself explaining it. Some tasks will feel like conventional homework problems; others will ask you to respond to questions after reading a paper on a topic in deep learning.

Once the tasks are due, the instructor will create and share another video in which he presents examples of student work and gives feedback on those specific examples. The shared student work will not always be anonymous, because it will include clips from student-made videos. Based on that feedback, you will be able to revise your work before it is graded. All registered students will have their work included in the feedback sessions multiple times throughout the semester.

At the end of the semester, you will complete a project, which may be done as part of a team. You will reproduce an empirical observation from a paper of your choice by independently recreating and executing the code for it.

There will be no exams in this class.

Course grades:

Course grades will be based on: 60% graded weekly tasks, 20% honest effort in initial task submissions, 20% project.
Letter grades will be assigned on the following scale: 93–100% A, 90–92% A-, 87–89% B+, 83–86% B, 80–82% B-, 77–79% C+, 73–76% C, 70–72% C-, 60–69% D, 0–59% F.
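To make the weighting concrete, here is a minimal sketch of how a final grade could be computed from the three components under the scale above. The function names are illustrative and the tie-breaking at cutoff boundaries is an assumption, not official policy.

```python
def final_percent(tasks, effort, project):
    """Weighted course score: 60% weekly tasks, 20% effort, 20% project (each 0-100)."""
    return 0.6 * tasks + 0.2 * effort + 0.2 * project

def letter(percent):
    """Map a percentage to a letter grade; assumes scores at a cutoff round up."""
    cutoffs = [(93, "A"), (90, "A-"), (87, "B+"), (83, "B"), (80, "B-"),
               (77, "C+"), (73, "C"), (70, "C-"), (60, "D")]
    for cut, grade in cutoffs:
        if percent >= cut:
            return grade
    return "F"

# Example: strong tasks and effort, solid project.
# 0.6*90 + 0.2*100 + 0.2*85 = 91, which falls in the A- band.
print(letter(final_percent(90, 100, 85)))
```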

Tentative Schedule (SUBJECT TO CHANGE):

Week  Date  Class Content
1 W 1/7

Machine Learning Review (Notes)

A DARPA Perspective on Artificial Intelligence

Preparation Questions for Class

2 T 1/13 Deep Learning

Understanding deep learning requires rethinking generalization

Preparation Questions for Class (pdf, tex)

3 T 1/20 Architectural Elements of Neural Networks (Notes)
4 T 1/27

Visualizing and Understanding Convolutional Networks

Visualizing Higher-Layer Features of a Deep Network

Preparation Questions for Class (pdf, tex)

5 T 2/3 Gradient Descent and Stochastic Gradient Descent (Notes)
6 T 2/10
Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift

How Does Batch Normalization Help Optimization?

Preparation Questions for Class (pdf, tex)

7 T 2/17

Deep Learning Book - Chapter 8

Adam: A Method for Stochastic Optimization

Preparation Questions for Class (pdf, tex)

8 T 2/24 Neural Network Architectures for Images (Notes)

Deep Residual Learning for Image Recognition (ResNets)

ImageNet Classification with Deep Convolutional Neural Networks (AlexNet)

Preparation Questions for Class (pdf, tex)

9 T 3/3 Spring Break
10 T 3/10

Watch DeepLearningAI videos on Object Localization and Detection: 1, 2, 3, 4, 6, 7, 8, 9, 10

You Only Look Once: Unified, Real-Time Object Detection

11 T 3/17
Adversarial Examples for Deep Neural Networks (Notes)

Explaining and Harnessing Adversarial Examples

Robust Physical-World Attacks on Deep Learning Models

Preparation Questions for Class (pdf, tex)

12 T 3/24 Continual Learning and Catastrophic Forgetting (Notes)

Overcoming catastrophic forgetting in neural networks

Preparation Questions for Class (pdf, tex)

13 T 3/31

YouTube Video: LSTM is dead. Long Live Transformers. Watch the first 23 minutes.

YouTube Video: Illustrated Guide to Transformers Neural Network: A step by step explanation

Attention Is All You Need

Language Models are Few-Shot Learners

Preparation questions for class (pdf, tex)

14 T 4/7
Automatic Differentiation, Backpropagation

Watch this video on automatic differentiation.

Watch this lecture (from start until time 38:30) on backpropagation of neural networks.

15 T 4/14 Generative Adversarial Networks (notes)
16 T 4/21 Work on Projects