Conversation

@Advaitgaur004
Contributor

This PR enhances the existing SGD optimiser by adding support for a momentum term, effectively upgrading it to SGDM (SGD with momentum). Momentum is a standard optimisation technique that helps accelerate convergence and navigate complex loss landscapes more smoothly.

The step function is updated to use the standard momentum update rule:

v = momentum * v + grad
p = p - lr * v
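The update rule above can be sketched as a small self-contained function. This is an illustrative sketch, not the actual pocketpy API: the names `sgdm_step`, `params`, `grads`, and `velocities` are assumptions made for the example.

```python
def sgdm_step(params, grads, velocities, lr=0.1, momentum=0.9):
    """Apply one SGD-with-momentum step in place.

    Implements the rule from the PR description:
        v = momentum * v + grad
        p = p - lr * v
    """
    for i in range(len(params)):
        # Accumulate a decaying average of past gradients.
        velocities[i] = momentum * velocities[i] + grads[i]
        # Step the parameter along the velocity direction.
        params[i] -= lr * velocities[i]
    return params, velocities

# Example: two scalar parameters with zero-initialised velocity.
p = [1.0, -2.0]
g = [0.5, 0.5]
v = [0.0, 0.0]
sgdm_step(p, g, v, lr=0.1, momentum=0.9)
# First step: v = 0.9 * 0 + 0.5 = 0.5, so p[0] = 1.0 - 0.1 * 0.5 = 0.95
```

Note that with `momentum=0` this reduces exactly to plain SGD, which is why the change is a strict superset of the previous behaviour.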

@PrimedErwin PrimedErwin merged commit e5ac661 into pocketpy:test Aug 4, 2025
5 checks passed
@Advaitgaur004 Advaitgaur004 deleted the optimizer-1 branch August 4, 2025 08:23