Computer Vision Machine Learning Prerequisite Info

CSCI 476/576 - Computer Vision, Winter 2024

The last few weeks of CSCI 476 and CSCI 576 will rely on knowledge of Machine Learning, which is why DATA 471 (previously 371) or 571 are listed as prerequisites. If you would like to take CSCI 476 or 576, there are a few ways to be prepared even if you don’t technically satisfy the prerequisite:

  1. If you’ve taken (undergrad) DATA 371 or 471 but are looking to take (grad) 576, this suffices for the prerequisite, and I will issue you an override.

  2. If you are taking 471 or 571 concurrently with 476/576, our ML content is late in the quarter, so you’ll have what you need by that point. If you’re registered for ML, please send me a screenshot showing this and I’ll grant you an override.

  3. If you are coming in with prior ML experience, this may suffice. Below is a list of topics I expect to rely on. If you have experience with most or all of these (e.g., via membership in a research group, side projects, etc.), please convince me of this and I will issue an override.

  4. If you have none or only some of the topics listed below, you can learn the necessary ML content via self-study. I’d recommend starting this over winter break so you’re not overloaded once the quarter starts. I’ve included a self-study guide below. This is an at-your-own-risk endeavor: if you don’t follow through properly, you will be underprepared for parts of computer vision and your grade will likely suffer for it. Before committing to this, please introspect on whether you have the discipline to get up to speed with no external motivators (grades, assignments, etc.). If you’re motivated and can commit to follow through, please email me and I will grant you an override.

If any of the above applies to you, please email me (wehrwes@wwu.edu) with the relevant information described in the corresponding item above, and I will issue an override.

Self-Study Guide

A rough topics list is as follows (a short self-check code sketch follows the list):

  1. Linear models: linear regression and logistic regression
  2. Basic nonlinear optimization: gradient descent, stochastic gradient descent
  3. Data splits, overfitting, and generalization
  4. Neural networks
    1. Construction via layers and forward pass (inference); activation functions
    2. Training via backpropagation (conceptual understanding is OK, calculus details not necessary)
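
As a rough self-check on topics 1-3, here is a minimal sketch (my own illustration, not course material; the synthetic data, learning rate, and step count are arbitrary) of fitting a linear model with gradient descent and checking generalization on a held-out split. If every line here makes sense to you, you likely already have the background for those topics.

```python
import numpy as np

# Synthetic 1-D regression data: y = 3x + 2 plus noise.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=200)
y = 3 * x + 2 + rng.normal(scale=0.1, size=200)

# Train/test split to check generalization (topic 3).
x_train, x_test = x[:150], x[150:]
y_train, y_test = y[:150], y[150:]

# Linear model y_hat = w*x + b, fit by gradient descent on mean squared error.
w, b = 0.0, 0.0
lr = 0.1
for step in range(500):
    err = (w * x_train + b) - y_train
    # Gradients of MSE = mean(err^2) with respect to w and b.
    grad_w = 2 * np.mean(err * x_train)
    grad_b = 2 * np.mean(err)
    w -= lr * grad_w
    b -= lr * grad_b

train_mse = np.mean((w * x_train + b - y_train) ** 2)
test_mse = np.mean((w * x_test + b - y_test) ** 2)
print(f"w={w:.2f}, b={b:.2f}, train MSE={train_mse:.4f}, test MSE={test_mse:.4f}")
```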

The self-study should be completed by the start of the 7th week of the course.

My recommended approach to picking up this content is to use online resources; in particular, I recommend selected videos from Andrew Ng’s Machine Learning course on Coursera. I haven’t watched them all the way through, but I’ve taken a brief look, and his videos come highly recommended. Below, I give details on how to access the videos and which ones I think are important to study. I also recommend completing a hands-on basic machine learning tutorial using PyTorch to make sure the concepts are solid.
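
For reference, the kind of thing such a tutorial should leave you comfortable with is a small training loop like the sketch below (my own minimal example, assuming PyTorch is installed; the toy dataset and layer sizes are arbitrary): a tiny fully-connected network, a forward pass, a loss, backpropagation via .backward(), and a gradient descent update.

```python
import torch
import torch.nn as nn

# Toy binary classification data: label is 1 when the two inputs sum to > 0.
torch.manual_seed(0)
X = torch.randn(512, 2)
y = (X.sum(dim=1) > 0).float().unsqueeze(1)

# A tiny fully-connected network: one hidden layer with a ReLU activation.
model = nn.Sequential(
    nn.Linear(2, 16),
    nn.ReLU(),
    nn.Linear(16, 1),
)
loss_fn = nn.BCEWithLogitsLoss()  # logistic-regression-style loss on the output
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

for epoch in range(100):
    logits = model(X)              # forward pass
    loss = loss_fn(logits, y)
    optimizer.zero_grad()
    loss.backward()                # backpropagation
    optimizer.step()               # gradient descent update

with torch.no_grad():
    acc = ((model(X) > 0).float() == y).float().mean().item()
print(f"final loss {loss.item():.3f}, train accuracy {acc:.2f}")
```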

Accessing the Coursera Videos

To access the Coursera videos for free, you will need to create a free Coursera account. Then, head over to the relevant courses.

They make it look like you need to start a free trial or pay for a subscription, but you can access the videos for free by clicking “Enroll for Free” and then the tiny, subtle “Audit” link at the bottom of the box that pops up.

Below I list the bare minimum set of videos that you should understand to be prepared for what we’ll rely on in computer vision.

First course (regression and classification):

Week 1:
  - Supervised vs Unsupervised Machine Learning
  - Regression Model
  - Train the model with gradient descent

Week 2:
  - Multiple Linear Regression
  - Gradient descent in practice

Week 3:
  - Classification with logistic regression
  - Cost function for logistic regression
  - The problem of overfitting

Second course (neural networks):

Week 1:
  - Neural Networks intuition
  - Neural network implementation in Python

Week 2:
  - Neural Network Training
  - Activation functions
  - Multiclass Classification
  - Back Propagation

Week 3:
  - Advice for applying machine learning
  - Bias and variance