Instructor: Shusen Wang
TA: Yao Xiao
Meeting Time:
Thursday, 6:30 - 9:00 PM, Peirce Complex 116
The classes on these dates are canceled: Jan 31
Office Hours:
Thursday, 3:00 - 5:00 PM, North Building 205
The office hours on these dates are canceled: Jan 31, Feb 28
Time change: the May 2 office hours (3:00 - 5:00 PM) are moved to May 1, 2:00 - 6:00 PM
Time change: the May 9 office hours are moved to both May 7 and May 8
Contact the Instructor:
For questions regarding grading, talk to the instructor during office hours or email him.
For any other questions, come to the office hours; the instructor will NOT reply to such emails.
Prerequisite:
Elementary linear algebra, e.g., matrix multiplication, eigenvalue decomposition, and matrix norms.
Elementary calculus, e.g., convex functions, differentiation of scalar functions, and first and second derivatives.
Python programming (especially the NumPy library) and Jupyter Notebook.
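As a quick self-check of the linear-algebra and NumPy background listed above, a minimal sketch (an illustration only, not course material):

```python
import numpy as np

# Matrix multiplication
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])
B = np.array([[1.0, 1.0],
              [0.0, 1.0]])
C = A @ B                              # [[2, 2], [0, 3]]

# Eigenvalue decomposition of a symmetric matrix
S = np.array([[2.0, 1.0],
              [1.0, 2.0]])
eigvals, eigvecs = np.linalg.eigh(S)   # eigenvalues 1 and 3 (ascending)

# Matrix norms
fro = np.linalg.norm(A, 'fro')         # Frobenius norm: sqrt(4 + 9)
spec = np.linalg.norm(A, 2)            # spectral norm: largest singular value, 3
```

If operations like these are unfamiliar, review them before the first lecture.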
Goal: This is a practical course; students will learn to apply deep learning (DL) methods to real-world machine learning (ML), computer vision (CV), and natural language processing (NLP) problems.
Jan 24, Lecture 1
Fundamental ML problems
Regression
Classification
Jan 24, Homework 0 is assigned.
Submission is not required.
Deadline: finish it before the first quiz. (Otherwise, you will probably fail.)
Jan 24, Homework 1 is assigned (available on Canvas).
Jan 31, CANCELED due to the instructor's conference travel
Feb 7, Lecture 2
Classification (cont.)
Regularization
Feb 14, Lecture 3
Dimensionality reduction
Matrix computations
Neural network basics
Feb 14, Homework 2 is assigned (available on Canvas).
Feb 21, Lecture 4
Clustering
Keras
Preparation for the quiz
Convolutional neural networks
Feb 24, Deadline for Homework 1
Feb 28, Quiz (No lecture).
Coverage: linear algebra, optimization, and ML basics.
Policy: No electronic device. Printed material is allowed.
Sample questions: [click here]
Mar 7, Lecture 5
Mar 7, Homework 3 is assigned
Available at the course's repo [click here]
Submission: submit to Canvas.
Extended to Mar 8 (originally Mar 7), Deadline for project proposal
Submission: Everyone must submit a proposal to Canvas, even members of a team.
The competition you ultimately participate in is expected to be the one named in your proposal. If it differs, you must provide a convincing explanation and supporting evidence in the project document to avoid a penalty.
Mar 14, Lecture 6
Convolutional neural networks (cont.)
Autoencoders
Mar 14, Deadline for Homework 2
Mar 15, Homework 4 and Homework 5 are assigned
Available at the course's repo [click here]
Submission: submit to Canvas.
Mar 21, Spring Break, no class
Mar 28, Lecture 7
Autoencoders (cont.)
Recurrent neural networks
Apr 4, Lecture 8
Apr 7, Deadline for Homework 3
Apr 11, Lecture 9
Recurrent neural networks (cont.)
Optimization [read this]
Apr 18, Lecture 10
Recommender system
Adversarial robustness
Review the Quiz
Apr 21, Deadline for signing up for project presentation
Voluntary. Up to 5 bonus points toward the total grade.
Submission: submit to Canvas.
At most 7 teams will be selected.
Selection criteria: Is the problem challenging? Is your method novel? Do you have good preliminary results? Can the audience learn anything from your presentation?
Apr 25, Lecture 11
GANs
Preparation for the final exam.
May 1, Office Hours
May 2, Final Exam
Coverage: linear algebra, optimization, ML basics, neural network basics, CNN, RNN, Python programming, Keras, and content in the textbook. [Click here] for the list.
Sample questions: [click here]
Policy: No electronic device (except for electronic calculator). Printed material is allowed.
May 4, Deadline for Homework 4 (Extended to May 11)
May 5, Deadline for Homework 5 (Extended to May 11)
May 7, 3:00 to 5:00 PM, Office Hours
May 8, 3:00 to 5:00 PM, Office Hours
May 9, Office Hours Canceled due to the faculty retreat.
May 9, Project Presentation
Time and location: the same as the class.
The selected groups are required to attend.
If you are confident that you will get an A without the bonus, you can email the instructor to cancel your presentation. The cancellation request must be made at least 48 hours before the presentation to avoid a penalty.
May 19, Deadline for Course Project
Machine learning basics. This part briefly introduces the fundamental ML problems (regression, classification, dimensionality reduction, and clustering) and the traditional ML models and numerical algorithms for solving them.
Neural network basics. This part covers the multilayer perceptron, backpropagation, and deep learning libraries, with a focus on Keras.
Convolutional neural networks (CNNs). This part focuses on CNNs and their applications to computer vision problems.
CNN basics. [slides]
Tricks for improving test accuracy. [slides]
Feature scaling and batch normalization. [slides]
Advanced topics on CNNs. [slides]
Popular CNN architectures. [slides]
Face recognition. [slides]
Further reading:
[style transfer (Section 8.1, Chollet's book)]
[visualize CNN (Section 5.4, Chollet's book)]
Autoencoders. This part introduces autoencoders for dimensionality reduction and image generation.
Recurrent neural networks (RNNs). This part introduces RNNs and their applications in natural language processing (NLP).
Text processing. [slides]
Text generation. [slides]
Machine translation. [slides]
Attention. [slides] [reference-1] [reference-2]
Recommender systems. This part focuses on the collaborative filtering approach to recommendation based on user-item rating data, covering matrix completion methods and neural network approaches.
Adversarial robustness. This part introduces how to attack neural networks using adversarial examples and how to defend against such attacks.
White-box attacks and defenses. [slides]
Further reading: [Adversarial Robustness - Theory and Practice]
Generative Adversarial Networks (GANs).
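To illustrate the "neural network basics" part above (multilayer perceptron and backpropagation), a minimal NumPy sketch of one gradient-descent step for a one-hidden-layer MLP on toy data (an illustration only; the toy data, sizes, and learning rate are arbitrary choices, not course code):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 4 samples, 3 features, scalar targets
X = rng.normal(size=(4, 3))
y = rng.normal(size=(4, 1))

# One hidden layer (ReLU) with a linear output
W1 = rng.normal(size=(3, 5)) * 0.1
b1 = np.zeros(5)
W2 = rng.normal(size=(5, 1)) * 0.1
b2 = np.zeros(1)

# Forward pass
h = np.maximum(0, X @ W1 + b1)        # hidden activations
pred = h @ W2 + b2
loss = np.mean((pred - y) ** 2)       # mean squared error

# Backpropagation: gradients of the MSE loss
g_pred = 2 * (pred - y) / len(X)
g_W2 = h.T @ g_pred
g_b2 = g_pred.sum(axis=0)
g_h = g_pred @ W2.T
g_h[h <= 0] = 0                       # ReLU gradient
g_W1 = X.T @ g_h
g_b1 = g_h.sum(axis=0)

# One gradient-descent step
lr = 0.1
W1 -= lr * g_W1; b1 -= lr * g_b1
W2 -= lr * g_W2; b2 -= lr * g_b2
```

In the lectures, models like this are built with Keras rather than by hand; the sketch only shows what the library automates.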
Every student must participate in one Kaggle competition.
Details: [click here]
Teamwork policy: You are encouraged to work on your own project. Teamwork (up to 3 students) is allowed if the competition has a heavy workload; the workload and team size will be considered in the grading.
Grading policy: See the evaluation form [click here]. An acceptable but not excellent project typically loses 3 points.
Required:
Recommended:
Y. Nesterov. Introductory Lectures on Convex Optimization. Springer, 2013. (Available online.)
D. S. Watkins. Fundamentals of Matrix Computations. John Wiley & Sons, 2004.
I. Goodfellow, Y. Bengio, and A. Courville. Deep Learning. MIT Press, 2016. (Available online.)
M. Mohri, A. Rostamizadeh, and A. Talwalkar. Foundations of Machine Learning. MIT Press, 2012.
T. Hastie, R. Tibshirani, and J. Friedman. The Elements of Statistical Learning. Springer Series in Statistics, 2001. (Available online.)
Grading percentages:
Homework 50%
Final 15%
Project 20%
Quizzes 15%
Bonus (up to 10%)
Late penalty:
Late submissions of assignments or the project document, for whatever reason, will be penalized: 1% of the score of the assignment/project will be deducted per day, and any fraction of a day counts as a full day. For example, if an assignment is submitted 15 days and 1 minute after the deadline (counted as 16 days) and receives a grade of 95%, the score after the deduction will be 95% - 16% = 79%.
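The deduction in the example above can be computed as follows (a sketch; the function name is illustrative, and `math.ceil` rounds any partial day up to a full day):

```python
import math

def late_score(raw_score_pct, days_late):
    """Deduct 1 percentage point per started day late."""
    days = math.ceil(days_late)  # any fraction of a day counts as a full day
    return raw_score_pct - 1 * days

# 15 days and 1 minute late is counted as 16 days:
print(late_score(95, 15 + 1 / (24 * 60)))  # -> 79
```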
May 20 (not June 1, as previously announced) is the firm deadline for all homework and the course project. Submissions after May 20 will not be graded.