- Developed a modular Python framework to simulate and visualize gradient descent algorithms (vanilla, momentum, and NAG, i.e. Nesterov Accelerated Gradient) on loss surfaces (update rules sketched below)
- Designed support for multiple activation functions (sigmoid, linear) and loss functions (cross-entropy, squared error) for classification and regression tasks (see the activation/loss sketch below)
- Integrated adaptive learning-rate optimizers (Adam, RMSProp, AdaGrad) with dynamic batch modes (batch, mini-batch, and stochastic) for scalable optimization (see the optimizer sketch below)
- Animated 2D contour and 3D surface plots to illustrate optimizer trajectories and convergence dynamics (a minimal animation sketch follows)
Repository: Rijudey2003/Gradient_Descent_Visulaization
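The three classic update rules can be summarized in a few lines. The sketch below is illustrative only, assuming a toy quadratic loss surface; the function names are not the repository's actual API.

```python
# Minimal sketch of vanilla, momentum, and NAG updates on a toy quadratic loss.
# Names and the loss surface are illustrative assumptions, not the repo's API.
import numpy as np

def grad(w):
    # Gradient of the toy loss L(w) = w0^2 + 2*w1^2
    return np.array([2.0 * w[0], 4.0 * w[1]])

def vanilla_step(w, lr=0.1):
    return w - lr * grad(w)

def momentum_step(w, v, lr=0.1, beta=0.9):
    v = beta * v - lr * grad(w)
    return w + v, v

def nag_step(w, v, lr=0.1, beta=0.9):
    # Nesterov: evaluate the gradient at the look-ahead point w + beta*v
    v = beta * v - lr * grad(w + beta * v)
    return w + v, v

w, v = np.array([4.0, 3.0]), np.zeros(2)
for _ in range(50):
    w, v = nag_step(w, v)
print(w)  # approaches the minimum at the origin
```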
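For the supported activations and losses, a minimal sketch of the classification pairing (sigmoid + cross-entropy) and the regression pairing (linear + squared error); the function names are hypothetical, not the repository's signatures.

```python
# Illustrative activation / loss definitions (hypothetical names).
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def linear(z):
    return z

def binary_cross_entropy(y_true, y_pred, eps=1e-12):
    # Clip predictions to avoid log(0)
    y_pred = np.clip(y_pred, eps, 1 - eps)
    return -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))

def squared_error(y_true, y_pred):
    return 0.5 * np.mean((y_true - y_pred) ** 2)

# Classification: sigmoid + cross-entropy; regression: linear + squared error.
z = np.array([-1.0, 0.0, 2.0])
y = np.array([0.0, 1.0, 1.0])
print(binary_cross_entropy(y, sigmoid(z)), squared_error(y, linear(z)))
```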
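The adaptive optimizers and batch modes can be outlined as follows; the helper names and the `iterate_batches` selector are assumptions made for illustration, not the repository's classes.

```python
# Sketch of AdaGrad / RMSProp / Adam updates and batch-mode selection (hypothetical helpers).
import numpy as np

def adagrad_step(w, g, cache, lr=0.01, eps=1e-8):
    cache += g ** 2
    return w - lr * g / (np.sqrt(cache) + eps), cache

def rmsprop_step(w, g, cache, lr=0.01, beta=0.9, eps=1e-8):
    cache = beta * cache + (1 - beta) * g ** 2
    return w - lr * g / (np.sqrt(cache) + eps), cache

def adam_step(w, g, m, v, t, lr=0.01, b1=0.9, b2=0.999, eps=1e-8):
    # t is the 1-based step index, used for bias correction
    m = b1 * m + (1 - b1) * g
    v = b2 * v + (1 - b2) * g ** 2
    m_hat, v_hat = m / (1 - b1 ** t), v / (1 - b2 ** t)
    return w - lr * m_hat / (np.sqrt(v_hat) + eps), m, v

def iterate_batches(X, y, mode="mini", batch_size=32, rng=None):
    # mode: "batch" -> full dataset, "mini" -> mini-batches, "sgd" -> one sample at a time
    rng = rng or np.random.default_rng(0)
    idx = rng.permutation(len(X))
    size = {"batch": len(X), "mini": batch_size, "sgd": 1}[mode]
    for start in range(0, len(X), size):
        sel = idx[start:start + size]
        yield X[sel], y[sel]
```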
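Finally, a minimal sketch of how an optimizer trajectory might be animated over a 2D contour plot with matplotlib's `FuncAnimation`; the loss surface and trajectory here are placeholders, not the repository's plotting code.

```python
# Animate a placeholder optimizer trajectory over a 2D contour plot.
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.animation import FuncAnimation

loss = lambda w0, w1: w0 ** 2 + 2 * w1 ** 2
xs, ys = np.meshgrid(np.linspace(-5, 5, 200), np.linspace(-5, 5, 200))

# Placeholder trajectory, e.g. the points an optimizer visited
traj = np.array([[4.0 * 0.8 ** k, 3.0 * 0.6 ** k] for k in range(30)])

fig, ax = plt.subplots()
ax.contour(xs, ys, loss(xs, ys), levels=30)
line, = ax.plot([], [], "r.-")

def update(frame):
    # Reveal the trajectory one step per frame
    line.set_data(traj[:frame + 1, 0], traj[:frame + 1, 1])
    return line,

anim = FuncAnimation(fig, update, frames=len(traj), interval=100, blit=True)
plt.show()
```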