pre-training olmo code missing #24

@ShreyGanatra

Description

For reproducibility and extension of this work, I am trying out this repo. I was able to generate the SuperBPE tokenizer, but the OLMo pre-training code appears to be missing from the repo. Could you provide instructions for installing the dependencies and training the necessary models?
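For context, here is a minimal sketch of the tokenizer side of the workflow described above, using the Hugging Face `tokenizers` library. This is an assumption about the general shape of the pipeline, not the repo's actual code: it trains a tiny BPE tokenizer, saves it to a JSON file, and reloads it the way a trained SuperBPE tokenizer file would typically be consumed by a pre-training script.

```python
import os
import tempfile

from tokenizers import Tokenizer, models, pre_tokenizers, trainers

# Hypothetical sketch (not the repo's actual pipeline): train a tiny BPE
# tokenizer on a toy corpus, then save and reload it.
tokenizer = Tokenizer(models.BPE(unk_token="[UNK]"))
tokenizer.pre_tokenizer = pre_tokenizers.Whitespace()
trainer = trainers.BpeTrainer(vocab_size=60, special_tokens=["[UNK]"])
tokenizer.train_from_iterator(["hello world", "hello there"], trainer=trainer)

# Persist the tokenizer to a single JSON file, as is conventional for
# `tokenizers`-based pipelines; the file name is an assumption here.
path = os.path.join(tempfile.mkdtemp(), "tokenizer.json")
tokenizer.save(path)

# A pre-training script would reload the same file before tokenizing data.
reloaded = Tokenizer.from_file(path)
print(reloaded.encode("hello world").tokens)
```

The missing piece this issue asks about is the step after this: the OLMo pre-training setup that consumes such a tokenizer file.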

Metadata

Assignees

No one assigned

    Labels

    question (Further information is requested)

    Projects

    No projects

    Milestone

    No milestone

    Relationships

    None yet

    Development

    No branches or pull requests
