Toml-based Prompt Organization #85
Great progress, but let's make it cleaner and more usable for people.
Great job @AffectionateCurry! Hopefully this organization is a lot more configurable (letting people easily replicate all the settings in the paper) and customizable. We also fixed the indent error the old version had. I think the next step, along with #71 and #95, is to make the code that constructs the prompt and invokes the dataset object as short as possible (@pythonomar22).
Changed the prompt constructor to use a TOML file.
Files added:
prompt.toml
loader.py
loader.py does all the heavy lifting of setting up the prompt from the TOML file. prompt_constructor_multilang is now just a light API wrapper around loader.py.
Changed the generate files to use the new TOML structure:
generate_eval_single_sample.py
generate_eval_single_sample_modal.py
generate_samples.py