Optimization & Numerical Methods

A deck of 16 review questions on optimization and numerical methods in machine learning.

Question 1: What does optimization mean in machine learning?
Question 2: What is an objective or loss function?
Question 3: What is Gradient Descent (Batch Gradient Descent)?
Question 4: What is Stochastic Gradient Descent (SGD)?
Question 5: What are the key differences between Gradient Descent and Stochastic Gradient Descent?
Question 6: What is the learning rate?
Question 7: What happens if the learning rate is too large or too small?
Question 8: What is convex optimization?
Question 9: What is the difference between a local and global minimum?
Question 10: Why does convexity matter in optimization?
Question 11: What numerical issues can affect optimization?
Question 12: What is \(L_2\) regularization?
Question 13: What is \(L_1\) regularization?
Question 14: How do \(L_1\) and \(L_2\) regularization differ?
Question 15: How does regularization affect bias and variance?
Question 16: How is logistic regression typically optimized?
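Questions 3 through 5 contrast batch Gradient Descent with Stochastic Gradient Descent. The deck's answers are not reproduced here, so the following is a minimal illustrative sketch (the dataset and all function names are our own): batch GD makes one update per full pass over the data, while SGD updates after every single example.

```python
import random

xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]   # generated by y = 2 * x, so the optimum is w = 2

def batch_gd(lr=0.05, steps=100):
    # Batch GD: one update per pass, using the squared-error gradient
    # averaged over ALL examples.
    w = 0.0
    n = len(xs)
    for _ in range(steps):
        grad = sum((w * x - y) * x for x, y in zip(xs, ys)) * 2 / n
        w -= lr * grad
    return w

def sgd(lr=0.05, epochs=100, seed=0):
    # SGD: one update PER EXAMPLE, using only that example's gradient.
    # The trajectory is noisier, but each update is far cheaper.
    rng = random.Random(seed)
    w = 0.0
    idx = list(range(len(xs)))
    for _ in range(epochs):
        rng.shuffle(idx)
        for i in idx:
            w -= lr * 2 * (w * xs[i] - ys[i]) * xs[i]
    return w
```

On this noiseless toy data both variants recover w = 2; the practical difference (Question 5) is cost per update versus variance of each update, which is why SGD scales to datasets too large for full-batch gradients.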
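Questions 6 and 7 concern the learning rate and what goes wrong at its extremes. A small self-contained sketch (the quadratic and all names are illustrative, not from the deck): too small a rate barely moves, too large a rate overshoots and diverges.

```python
def gradient_descent(grad, w0, lr, steps):
    """Generic gradient descent: repeatedly apply w <- w - lr * grad(w)."""
    w = w0
    for _ in range(steps):
        w = w - lr * grad(w)
    return w

# Minimize f(w) = (w - 3)^2, whose gradient is 2 * (w - 3); the minimum is w = 3.
grad = lambda w: 2 * (w - 3)

good = gradient_descent(grad, w0=0.0, lr=0.1, steps=100)   # converges close to 3
tiny = gradient_descent(grad, w0=0.0, lr=1e-4, steps=100)  # barely moves from 0
big = gradient_descent(grad, w0=0.0, lr=1.1, steps=100)    # overshoots each step and diverges
```

For this quadratic the error scales by (1 - 2·lr) each step, so any lr above 1 flips the sign and grows the error, which is the divergence Question 7 asks about.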
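Questions 12 through 14 cover \(L_1\) and \(L_2\) regularization. One way to see the difference (a sketch under our own framing, using the single-weight proximal operators rather than anything from the deck): an \(L_2\) penalty shrinks a weight toward zero but never reaches it exactly, while an \(L_1\) penalty sets small weights to exactly zero, producing sparsity.

```python
def l2_prox(w, lam):
    # Solves min_v (v - w)^2 / 2 + lam * v^2 / 2.
    # L2 (ridge) shrinkage: scales the weight toward zero,
    # but a nonzero weight stays nonzero.
    return w / (1.0 + lam)

def soft_threshold(w, lam):
    # Solves min_v (v - w)^2 / 2 + lam * |v|.
    # L1 (lasso) shrinkage: weights with |w| <= lam become EXACTLY zero,
    # which is why L1 yields sparse solutions.
    if w > lam:
        return w - lam
    if w < -lam:
        return w + lam
    return 0.0
```

For example, with lam = 0.5 a weight of 0.3 is zeroed out by the \(L_1\) step but merely scaled to 0.2 by the \(L_2\) step, which ties directly into the bias/variance trade-off of Question 15: both penalties add bias to reduce variance, but only \(L_1\) also performs feature selection.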
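Question 16 asks how logistic regression is typically optimized. In practice libraries use solvers such as L-BFGS or coordinate descent; as a minimal hand-rolled sketch (the tiny dataset and all names are our own), plain gradient descent on the average cross-entropy loss already works:

```python
import math

# Tiny 1-D dataset: label 1 when x > 0.
data = [(-2.0, 0), (-1.0, 0), (1.0, 1), (2.0, 1)]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(lr=0.5, steps=500):
    # Gradient descent on the average cross-entropy loss. For one
    # example (x, y) with prediction p = sigmoid(w*x + b), the gradient
    # w.r.t. (w, b) is ((p - y) * x, (p - y)).
    w, b = 0.0, 0.0
    n = len(data)
    for _ in range(steps):
        gw = gb = 0.0
        for x, y in data:
            p = sigmoid(w * x + b)
            gw += (p - y) * x / n
            gb += (p - y) / n
        w -= lr * gw
        b -= lr * gb
    return w, b

w, b = fit_logistic()
```

Because the cross-entropy loss of logistic regression is convex (Questions 8 and 10), any of these methods converge to the global minimum; there are no spurious local minima to get stuck in.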