Feature/vitb16 #28

Merged
ver217 merged 2 commits into hpcaitech:feature/vitb16 from ver217:feature/vitb16
Nov 18, 2021

Conversation

@ver217 (Contributor) commented Nov 18, 2021

Add an example of ViT-B/16 and remove w_norm clipping in LAMB.

FrankLeeeee and others added 2 commits November 18, 2021 19:45
* Add gradient accumulation, fix lr scheduler

* Fix FP16 optimizer and adapt torch amp to work with tensor parallelism (#18)

* Fixed compatibility bugs between torch amp and tensor parallelism and performed some minor fixes

* fixed trainer

* Revert "fixed trainer"

This reverts commit 2e0b0b7.

* improved consistency between trainer, engine and schedule (#23)

Co-authored-by: 1SAA <c2h214748@gmail.com>

Co-authored-by: 1SAA <c2h214748@gmail.com>
Co-authored-by: ver217 <lhx0217@gmail.com>
@ver217 ver217 merged commit f734f4b into hpcaitech:feature/vitb16 Nov 18, 2021
@ver217 ver217 deleted the feature/vitb16 branch November 19, 2021 03:43