Add "zero_allow_untested_optimizer" option in conf file #173

Merged: ShadenSmith merged 12 commits into deepspeedai:master from CalogeroZarbo:untested_opt_config_flag on Mar 27, 2020
Conversation
added 6 commits — March 26, 2020, 11:42
Contributor (ShadenSmith):

Thanks so much for putting this together @CalogeroZarbo! Would you mind adding a unit test that should trigger the assertion? Here's a similar test it could be based off of: https://github.com/microsoft/DeepSpeed/blob/master/tests/unit/test_fp16.py#L251

You could catch the assertion error with pytest:

```python
with pytest.raises(AssertionError):
    model, optim, _, _ = deepspeed.initialize(args=args,
                                              model=model,
                                              model_parameters=model.parameters())
```

and end the test after that point.
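The logic such a test exercises can be illustrated with a self-contained sketch. Note that `check_zero_optimizer` and `ZERO_SUPPORTED_OPTIMIZERS` below are stand-ins invented for illustration, not DeepSpeed's actual internals; the real check lives inside `deepspeed.initialize`:

```python
import pytest

# Illustrative stand-in for the set of optimizers ZeRO has been tested with.
ZERO_SUPPORTED_OPTIMIZERS = {"Adam"}


def check_zero_optimizer(config, optimizer_name):
    """Sketch of the check this PR adds: with ZeRO enabled, an optimizer
    outside the tested set is rejected unless the config explicitly sets
    'zero_allow_untested_optimizer': true."""
    if config.get("zero_optimization") and optimizer_name not in ZERO_SUPPORTED_OPTIMIZERS:
        assert config.get("zero_allow_untested_optimizer", False), \
            "Untested ZeRO optimizer; set 'zero_allow_untested_optimizer': true to proceed."


def test_untested_optimizer_rejected():
    # Flag absent: the assertion should fire for a custom optimizer.
    with pytest.raises(AssertionError):
        check_zero_optimizer({"zero_optimization": True}, "MyCustomOptimizer")


def test_untested_optimizer_allowed():
    # Flag set: the custom optimizer is allowed through (with a warning in the real code).
    check_zero_optimizer(
        {"zero_optimization": True, "zero_allow_untested_optimizer": True},
        "MyCustomOptimizer")
```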
Contributor, Author (CalogeroZarbo):

@ShadenSmith Sure, on it! May I add the test to the same file? Thank you!
Contributor (ShadenSmith):

That would be perfect. Just right under the other ZeRO tests.
Contributor, Author (CalogeroZarbo):

@ShadenSmith Cheers!
Contributor:

@CalogeroZarbo Thanks so much for doing this.
tjruwase approved these changes — Mar 27, 2020
ShadenSmith approved these changes — Mar 27, 2020
Contributor:

Thanks so much @CalogeroZarbo!
Dear all,

I've added the flag in the same style as the other flags, by extending the `DeepSpeedConfig` class. An assertion has also been added that raises an error message if the user wants to use a custom optimizer but the flag is not set. If both `zero_allow_untested_optimizer` and `zero_optimization` are set to `true`, a warning message appears and the computation starts normally.
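Based on the description above, the relevant portion of a DeepSpeed JSON config using the new flag might look like the following sketch (the `train_batch_size` value is illustrative, and other required fields are omitted):

```json
{
  "train_batch_size": 8,
  "zero_optimization": true,
  "zero_allow_untested_optimizer": true
}
```

With `zero_allow_untested_optimizer` removed (or `false`), initializing with ZeRO enabled and a custom optimizer would trigger the new assertion instead of starting training.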