
[chat] remove lm model class #3653

Merged
ver217 merged 6 commits into hpcaitech:main from ver217:refactor/chat-model
Apr 27, 2023

Conversation

Contributor

@ver217 ver217 commented Apr 27, 2023

📌 Checklist before creating the PR

  • I have created an issue for this PR for traceability
  • The title follows the standard format: [doc/gemini/tensor/...]: A concise description
  • I have added relevant tags if possible for us to better distinguish different PRs

🚨 Issue number

Link this PR to your issue with words like fixed to automatically close the linked issue upon merge

e.g. fixed #1234, closed #1234, resolved #1234

📝 What does this PR do?

Summarize your work here.
If you have any plots/diagrams/screenshots/tables, please attach them here.

The LM model class is an unnecessary design: it merely wraps transformers LM models without adding any useful attributes or methods.

This PR removes the class and makes LoRA more user-friendly.
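The idea can be sketched as follows. Instead of routing models through a dedicated LM wrapper class, LoRA can be applied directly to any `nn.Module` (including a transformers LM) by swapping its `nn.Linear` submodules. Note that `LoraLinear` and `convert_to_lora` below are illustrative names, not Coati's actual API:

```python
# Hypothetical sketch: apply LoRA directly to any nn.Module,
# with no intermediate LM wrapper class required.
import torch
import torch.nn as nn


class LoraLinear(nn.Module):
    """Augments a frozen nn.Linear with a trainable low-rank update."""

    def __init__(self, base: nn.Linear, r: int = 4, alpha: float = 1.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():  # freeze pretrained weight and bias
            p.requires_grad_(False)
        # Low-rank factors: A is small random, B starts at zero so the
        # wrapped layer initially behaves exactly like the base layer.
        self.lora_a = nn.Parameter(torch.randn(base.in_features, r) * 0.01)
        self.lora_b = nn.Parameter(torch.zeros(r, base.out_features))
        self.scaling = alpha / r

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.base(x) + (x @ self.lora_a @ self.lora_b) * self.scaling


def convert_to_lora(model: nn.Module, r: int = 4) -> nn.Module:
    """Recursively replace every nn.Linear submodule with a LoraLinear."""
    for name, child in list(model.named_children()):
        if isinstance(child, nn.Linear):
            setattr(model, name, LoraLinear(child, r=r))
        else:
            convert_to_lora(child, r=r)
    return model


# Any model works -- no special LM model class needed:
model = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 8))
convert_to_lora(model, r=2)
out = model(torch.randn(3, 8))
```

With this shape, users pass a plain transformers model (e.g. `AutoModelForCausalLM`) straight into the conversion step, and only the LoRA parameters remain trainable.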

💥 Checklist before requesting a review

  • I have linked my PR to an issue (instruction)
  • My issue clearly describes the problem/feature/proposal, with diagrams/charts/table/code if possible
  • I have performed a self-review of my code
  • I have added thorough tests
  • I have added docstrings for all the functions/methods I implemented

⭐️ Do you enjoy contributing to Colossal-AI?

  • 🌝 Yes, I do.
  • 🌚 No, I don't.

Tell us more if you don't enjoy contributing to Colossal-AI.

@ver217 ver217 added the chatgpt ChatGPT Application label Apr 27, 2023
@ver217 ver217 merged commit 6ef7011 into hpcaitech:main Apr 27, 2023
@ver217 ver217 deleted the refactor/chat-model branch April 27, 2023 07:37