Lesson 9.4: Hyperparameter Tuning – Grid Search, Random Search
🔹 What is Hyperparameter Tuning?
- Hyperparameters are settings chosen before training a model (e.g., the number of trees in a Random Forest, the learning rate). Unlike model parameters, they are not learned from the data.
- Tuning selects the hyperparameter values that give the best model performance.
🔹 Common Tuning Methods
- Grid Search
  - Exhaustively searches a specified grid of hyperparameter values.
  - Evaluates every combination and keeps the best-performing set.
Example:
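A minimal sketch using scikit-learn's `GridSearchCV` on a synthetic dataset (the dataset, parameter grid, and values shown are illustrative, not prescribed by the lesson):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

# Small synthetic dataset, used here purely for illustration
X, y = make_classification(n_samples=150, n_features=10, random_state=42)

# Grid of candidate values: every combination will be tried (2 x 3 = 6 fits per fold)
param_grid = {
    "n_estimators": [50, 100],
    "max_depth": [3, 5, None],
}

# 3-fold cross-validation over all combinations
grid = GridSearchCV(RandomForestClassifier(random_state=42), param_grid, cv=3)
grid.fit(X, y)

print(grid.best_params_)  # best combination found
print(grid.best_score_)   # its mean cross-validated accuracy
```

After fitting, `best_params_` holds the winning combination and `best_estimator_` is a model refit on the full data with those settings.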
- Random Search
  - Randomly samples a fixed number of hyperparameter combinations instead of trying them all.
  - Faster than Grid Search, especially for large parameter spaces.
Example:
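A matching sketch using scikit-learn's `RandomizedSearchCV`, where only `n_iter` sampled combinations are evaluated (again, the dataset and distributions are illustrative assumptions):

```python
from scipy.stats import randint
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import RandomizedSearchCV

X, y = make_classification(n_samples=150, n_features=10, random_state=42)

# Distributions/lists to sample from, rather than a fixed grid
param_dist = {
    "n_estimators": randint(50, 200),   # sampled uniformly from [50, 200)
    "max_depth": [3, 5, 10, None],
}

# Only n_iter=8 random combinations are tried, regardless of grid size
search = RandomizedSearchCV(
    RandomForestClassifier(random_state=42),
    param_dist,
    n_iter=8,
    cv=3,
    random_state=42,
)
search.fit(X, y)

print(search.best_params_)  # best of the 8 sampled combinations
```

Because the number of fits is fixed by `n_iter`, the cost stays constant even if you widen the search space.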
🔹 Advantages
- Helps maximize model accuracy.
- Avoids manual trial and error.
- Combined with cross-validation, results are more robust and less prone to overfitting a single split.
🔹 Disadvantages
- Grid Search → computationally expensive for large grids, since the number of fits grows multiplicatively with each added parameter.
- Random Search → less exhaustive, so it may miss the optimal combination.
✅ Quick Recap:
- Hyperparameter Tuning → finds the best model settings.
- Grid Search → exhaustive search over all combinations.
- Random Search → random sampling; faster for large spaces.
