In [1]:
from lec_utils import *
import lec22_util as util
from ipywidgets import FloatSlider, interact
from IPython.display import YouTubeVideo

Lecture 22¶

Gradient Descent¶

EECS 398-003: Practical Data Science, Fall 2024¶

practicaldsc.org • github.com/practicaldsc/fa24

Announcements 📣¶

  • The Portfolio Homework has been released! Read all about it here. It has two due dates:
    • A checkpoint (worth 15 points / 100) is due on Monday, November 25th (no slip days!).
    • The full homework is due on Saturday, December 7th (no slip days!).
  • Homework 10 will be out later this week.
  • The Grade Report now includes scores and slip days through Homework 8.

Agenda¶

  • Gradient descent intuition.
  • When is gradient descent guaranteed to work?
  • Gradient descent for multivariate functions.

Question 🤔 (Answer at practicaldsc.org/q)

Remember that you can always ask questions anonymously at the link above!

Gradient descent intuition¶


Minimizing arbitrary functions¶

  • Assume $f(w)$ is some differentiable function.
    For now, we'll assume $f$ takes in a single number, $w$, as input and returns a single number as its output.
  • When tasked with minimizing $f(w)$, our general strategy has been to:
    1. Find $\frac{df}{dw}(w)$, the derivative of $f$.
    2. Find the input $w^*$ such that $\frac{df}{dw}(w^*) = 0$.
  • However, there are cases where we can find $\frac{df}{dw}(w)$, but it is difficult or impossible to solve $\frac{df}{dw}(w^*) = 0$ by hand. For example, consider:
$$f(w) = 5w^4 - w^3 - 5w^2 + 2w - 9$$
  • Then what?
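One answer, previewing the idea this lecture develops: instead of solving $\frac{df}{dw}(w^*) = 0$ symbolically, start at some guess and repeatedly step opposite the derivative until it is (numerically) zero. Below is a minimal sketch for the quartic above; the starting point `w = 0`, step size `0.01`, and iteration count are arbitrary choices for illustration, not values from the lecture.

```python
def f(w):
    # The example function from above, whose derivative has no nice closed-form roots.
    return 5 * w**4 - w**3 - 5 * w**2 + 2 * w - 9

def df(w):
    # Derivative of f, computed by hand with the power rule.
    return 20 * w**3 - 3 * w**2 - 10 * w + 2

w = 0.0            # arbitrary starting guess
step_size = 0.01   # arbitrary small step size
for _ in range(1000):
    # Move opposite the slope: downhill if df(w) > 0 means f increases to the right.
    w = w - step_size * df(w)

# After many steps, w sits at a point where the derivative is (essentially) zero,
# i.e. a critical point of f -- here, a local minimum near w ≈ -0.73.
print(w, df(w))
```

Note that this only finds *a* critical point near the starting guess; whether it is the global minimum, and when this procedure is guaranteed to work at all, is exactly what the rest of the lecture addresses.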
In [2]:
util.draw_f()