
[fp32] fix default dtype #1023

Merged
jeffra merged 1 commit into deepspeedai:master from stas00:patch-1
Apr 30, 2021

Conversation

@stas00 (Collaborator) commented Apr 30, 2021

I'm not sure if this is the best approach, but even with #1004 I still have to pass `zero.Init(dtype)`, because this branch never executes: `dtype` is never `None`, so the config is ignored.

```
    def _set_dtype(self, ds_config, dtype):
        if ds_config is not None and dtype is None:
            _ds_config = DeepSpeedConfig(ds_config)
            self.dtype = torch.half if _ds_config.fp16_enabled else torch.float
```

So with `fp16.enabled=False` I get: `AssertionError: The fp16 is not enabled but dtype on parameters not fp16`

@jeffra jeffra merged commit 18a26e8 into deepspeedai:master Apr 30, 2021
@stas00 stas00 deleted the patch-1 branch April 30, 2021 16:51
