
add transformers fix #19

Merged

samsja merged 1 commit into main from fix-transformers on Mar 1, 2025

Conversation


@samsja samsja commented Mar 1, 2025

YES MFU GO UP
[Screenshot from 2025-02-28 19-14-25]

See the issue at PrimeIntellect-ai/prime#224 and the transformers PR at huggingface/transformers#36487.

Signed-off-by: Sami Jaghouar <sami.jaghouar@gmail.com>
@samsja samsja requested a review from apaz-cli March 1, 2025 03:27
@samsja samsja requested a review from Jackmin801 March 1, 2025 03:27
@samsja samsja merged commit 48a5be9 into main Mar 1, 2025
samsja added a commit that referenced this pull request Nov 12, 2025
samsja added a commit that referenced this pull request Dec 4, 2025
samsja added a commit that referenced this pull request Mar 30, 2026
* fix wandb

* add robust eval

* add eval to orch

* fix nccl ready

* deepdive: separate, explicitly named caches for train and online eval (#18)

* delete cache deepdive

* add 105 ckpt interval

* update deepdive cache

* fix eval

* fix eval

---------

Co-authored-by: sami jaghouar <sami@primeintellect.ai>
Co-authored-by: Sebastian Müller <sebastian@primeintellect.ai>
Co-authored-by: Mika Senghaas <mail@mikasenghaas.de>
