Add "zero_allow_untested_optimizer" option in conf file#173

Merged
ShadenSmith merged 12 commits into deepspeedai:master from CalogeroZarbo:untested_opt_config_flag on Mar 27, 2020
Conversation

@CalogeroZarbo
Contributor

Dear all,
I've added the flag in the same style as the existing flags, by extending the DeepSpeedConfig class. I also added an assertion that raises an error message if the user wants to use a custom optimizer but the flag is not set. If both zero_allow_untested_optimizer and zero_optimization are set to true, a warning message appears and the computation starts normally.
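To illustrate the behavior described above, here is a minimal, self-contained sketch of the flag logic. The function name, the tested-optimizer set, and the message texts are illustrative assumptions, not DeepSpeed's actual code:

```python
import warnings

# Illustrative only: optimizers assumed to be validated for ZeRO.
TESTED_ZERO_OPTIMIZERS = {"Adam"}

def check_zero_optimizer(optimizer_name, zero_optimization,
                         zero_allow_untested_optimizer):
    """Sketch of the check: assert if ZeRO is enabled and the optimizer is
    untested, unless the new flag is set, in which case only warn."""
    if not zero_optimization or optimizer_name in TESTED_ZERO_OPTIMIZERS:
        return
    assert zero_allow_untested_optimizer, (
        "{} is untested with ZeRO; set 'zero_allow_untested_optimizer': true "
        "in the DeepSpeed config to proceed anyway.".format(optimizer_name))
    warnings.warn(
        "Using untested optimizer {} with ZeRO.".format(optimizer_name))
```

With the flag unset, an unrecognized optimizer under ZeRO raises an AssertionError; with it set, the same call only emits a warning and training proceeds.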

@msftclas

msftclas commented Mar 26, 2020

CLA assistant check
All CLA requirements met.

@ShadenSmith
Contributor

Thanks so much for putting this together @CalogeroZarbo !

Would you mind adding a unit test that should trigger the assertion? Here's a similar test it could be based off of: https://github.com/microsoft/DeepSpeed/blob/master/tests/unit/test_fp16.py#L251

You could catch the assertion error with pytest:

with pytest.raises(AssertionError):
    model, optim, _, _ = deepspeed.initialize(args=args,
                                              model=model,
                                              model_parameters=model.parameters())

and end the test after that point.

@ShadenSmith ShadenSmith added the enhancement New feature or request label Mar 26, 2020
@ShadenSmith ShadenSmith linked an issue Mar 26, 2020 that may be closed by this pull request
@CalogeroZarbo
Contributor Author

@ShadenSmith Sure, on it! May I add the test to the same test_fp16.py file?

Thank you!
Cal

@ShadenSmith
Contributor

That would be perfect. Just right under the other ZeRO tests.

@CalogeroZarbo
Contributor Author

@ShadenSmith pytests are done. I added a helper class to define a custom optimizer on the fly. Thank you for the suggestion.
Let me know if I can help further.
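As a rough illustration of the test pattern (everything below is a stand-in: `fake_initialize` mimics only the assertion path of `deepspeed.initialize`, and this `CustomOptimizer` is a trivial placeholder, not the PR's helper class; the real test lives in tests/unit/test_fp16.py):

```python
class CustomOptimizer:
    """Trivial placeholder for an optimizer DeepSpeed has not validated."""
    def __init__(self, params, lr=0.01):
        self.params, self.lr = list(params), lr

def fake_initialize(optimizer, zero_allow_untested_optimizer=False):
    """Stand-in mimicking the assertion described in this PR: reject an
    unrecognized optimizer under ZeRO unless the flag is set."""
    assert zero_allow_untested_optimizer, (
        "{} is untested with ZeRO; set zero_allow_untested_optimizer".format(
            type(optimizer).__name__))
    return optimizer

def test_untested_optimizer_assertion():
    # Without the flag, initialization must raise AssertionError.
    opt = CustomOptimizer([])
    try:
        fake_initialize(opt, zero_allow_untested_optimizer=False)
        raised = False
    except AssertionError:
        raised = True
    assert raised
    # With the flag, initialization proceeds normally.
    assert fake_initialize(opt, zero_allow_untested_optimizer=True) is opt
```

In the real test the try/except would be `pytest.raises(AssertionError)` around a `deepspeed.initialize` call, as suggested above.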

Cheers,
Cal

@tjruwase
Contributor

@CalogeroZarbo Thanks so much for doing this.

@ShadenSmith ShadenSmith merged commit 43f2733 into deepspeedai:master Mar 27, 2020
@ShadenSmith
Contributor

Thanks so much @CalogeroZarbo !

@CalogeroZarbo CalogeroZarbo deleted the untested_opt_config_flag branch April 1, 2020 16:45
kouml pushed a commit to kouml/DeepSpeed that referenced this pull request Apr 3, 2020

* added zero_allow_untested_optimizer flag helpers

* add zero_allow_untested_optimizer config constants

* zero_allow_untested_optimizer logic with assertion

* Added unit test and CustomOptimizer helper class



Development

Successfully merging this pull request may close these issues.

ZeRO & Custom Optmizer (RangerLars)

4 participants