With the integration of online forward-backward algorithms into pysap-mri also comes the need for machine-learning-inspired gradient descent accelerations (e.g. momentum, SAGA, etc.).
A solid draft of these algorithm implementations is available here.
This includes:
- Vanilla Gradient Descent
- Epoch descent (proximal step at the end of each epoch)
- ADA
- RMSProp
- Momentum
- ADAM
- SAGA
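To make the kind of acceleration listed above concrete, here is a minimal sketch of momentum gradient descent on a smooth objective. The function name and signature are hypothetical illustrations, not ModOpt's API; the draft implementations linked above are the actual reference.

```python
import numpy as np

def momentum_descent(grad, x0, step=0.1, beta=0.9, n_iter=100):
    """Hypothetical momentum sketch (not the ModOpt API):
    v_{k+1} = beta * v_k - step * grad(x_k); x_{k+1} = x_k + v_{k+1}."""
    x = np.asarray(x0, dtype=float)
    v = np.zeros_like(x)  # velocity accumulates past gradients
    for _ in range(n_iter):
        v = beta * v - step * grad(x)
        x = x + v
    return x

# Example: minimize f(x) = ||x - 1||^2 / 2, whose gradient is x - 1;
# the iterates converge toward the minimizer x = 1.
x_min = momentum_descent(lambda x: x - 1.0, np.zeros(3))
```

The other variants (ADA, RMSProp, ADAM, SAGA) differ mainly in how the step `v` is built from the gradient history, so they would fit the same class skeleton in the proposed module.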
I suggest refactoring the opt/algorithms.py file into a module, similarly to what is done in pysap-mri:
opt/algorithms/base.py -> setup code
opt/algorithms/forward_backward.py -> all forward-backward algorithms (and POGM)
opt/algorithms/primal_dual.py -> Condat
opt/algorithms/gradient_descent.py -> all the gradient descent algorithms mentioned above

To avoid breaking changes downstream, all these classes will be imported in opt/algorithms/__init__.py.
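The backward-compatibility shim could look like the sketch below. The submodule names follow the proposed layout; the class names (other than POGM and Condat, which exist in ModOpt today) are assumptions for illustration.

```python
# opt/algorithms/__init__.py -- sketch of the re-export shim.
# Re-exporting every class here keeps existing imports such as
#   from modopt.opt.algorithms import ForwardBackward
# working after the single-file module is split up.
from .base import SetUp
from .forward_backward import ForwardBackward, POGM
from .primal_dual import Condat
from .gradient_descent import MomentumGradOpt, ADAMGradOpt, SAGAGradOpt  # hypothetical names

__all__ = [
    "SetUp",
    "ForwardBackward",
    "POGM",
    "Condat",
    "MomentumGradOpt",
    "ADAMGradOpt",
    "SAGAGradOpt",
]
```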
Are you planning to submit a Pull Request?
- Yes
- No