[NEW FEATURE] Gradient Descent Algorithms and accelerations methods #194

@paquiteau

Description

With the integration of online forward-backward algorithms into pysap-mri comes the need for machine-learning-inspired gradient descent accelerations (e.g. Momentum, SAGA, etc.).

A solid draft of these algorithm implementations is available here.

This includes:

  • Vanilla Gradient Descent
  • Epoch descent (proximal step at the end of each epoch)
  • ADA
  • RMSProp
  • Momentum
  • ADAM
  • SAGA
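To make the proposal concrete, here is a minimal sketch of what a momentum-accelerated descent step could look like. The function name and signature are illustrative only, not the actual ModOpt API:

```python
import numpy as np

# Hypothetical sketch of momentum-accelerated gradient descent
# (names are illustrative, not the final ModOpt classes).
def momentum_descent(grad, x0, eta=0.1, beta=0.9, n_iter=100):
    """Minimise a function given its gradient ``grad``, starting at ``x0``."""
    x = np.asarray(x0, dtype=float)
    v = np.zeros_like(x)  # velocity: exponential moving average of gradients
    for _ in range(n_iter):
        v = beta * v + grad(x)  # accumulate past gradients
        x = x - eta * v         # descend along the smoothed direction
    return x

# Usage: minimise f(x) = ||x - 1||^2, whose gradient is 2 * (x - 1).
x_min = momentum_descent(lambda x: 2 * (x - 1), np.zeros(3), n_iter=500)
```

The other accelerations (ADA, RMSProp, ADAM, SAGA) follow the same pattern with different per-iteration state, which is why a common base class in the module layout below makes sense.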

I suggest refactoring the opt/algorithms.py file into a module, similar to what is done in pysap-mri:

   opt/algorithms/base.py -> Setup
   opt/algorithms/forward_backward.py -> All forward-backward algorithms (and POGM)
   opt/algorithms/primal_dual.py -> Condat
   opt/algorithms/gradient_descent.py -> All the gradient descent algorithms mentioned above

To avoid breaking changes downstream, all these classes will be re-exported in algorithms/__init__.py.
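The backward-compatible re-export could look like this (a sketch; the gradient-descent class names are hypothetical placeholders, not the final API):

```python
# opt/algorithms/__init__.py -- keep existing imports working, e.g.
# `from modopt.opt.algorithms import ForwardBackward`
from .forward_backward import ForwardBackward, POGM
from .primal_dual import Condat
from .gradient_descent import GradientDescent, MomentumDescent  # hypothetical names
```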

Are you planning to submit a Pull Request?

  • Yes
  • No
