From 7462d25e642c92bcc0216536cf43bc8a20c5b0b7 Mon Sep 17 00:00:00 2001 From: binmakeswell Date: Wed, 29 Mar 2023 03:48:52 +0800 Subject: [PATCH 1/2] [doc] add ColossalChat news --- README.md | 5 +++-- applications/Chat/README.md | 17 +++++++++-------- docs/README-zh-Hans.md | 5 +++-- 3 files changed, 15 insertions(+), 12 deletions(-) diff --git a/README.md b/README.md index 77c3471d9d25..1a7e4e7edbc9 100644 --- a/README.md +++ b/README.md @@ -25,8 +25,9 @@ ## Latest News +* [2023/03] [ColossalChat: An Open-Source Solution for Cloning ChatGPT With a Complete RLHF Pipeline](https://medium.com/@yangyou_berkeley/colossalchat-an-open-source-solution-for-cloning-chatgpt-with-a-complete-rlhf-pipeline-5edf08fb538b) * [2023/03] [AWS and Google Fund Colossal-AI with Startup Cloud Programs](https://www.hpc-ai.tech/blog/aws-and-google-fund-colossal-ai-with-startup-cloud-programs) -* [2023/02] [Open source solution replicates ChatGPT training process! Ready to go with only 1.6GB GPU memory](https://www.hpc-ai.tech/blog/colossal-ai-chatgpt) +* [2023/02] [Open Source Solution Replicates ChatGPT Training Process! Ready to go with only 1.6GB GPU Memory](https://www.hpc-ai.tech/blog/colossal-ai-chatgpt) * [2023/01] [Hardware Savings Up to 46 Times for AIGC and Automatic Parallelism](https://medium.com/pytorch/latest-colossal-ai-boasts-novel-automatic-parallelism-and-offers-savings-up-to-46x-for-stable-1453b48f3f02) * [2022/11] [Diffusion Pretraining and Hardware Fine-Tuning Can Be Almost 7X Cheaper](https://www.hpc-ai.tech/blog/diffusion-pretraining-and-hardware-fine-tuning-can-be-almost-7x-cheaper) * [2022/10] [Use a Laptop to Analyze 90% of Proteins, With a Single-GPU Inference Sequence Exceeding 10,000](https://www.hpc-ai.tech/blog/use-a-laptop-to-analyze-90-of-proteins-with-a-single-gpu-inference-sequence-exceeding) @@ -223,7 +224,7 @@ Please visit our [documentation](https://www.colossalai.org/) and [examples](htt -[ColossalChat](https://github.com/hpcaitech/ColossalAI/tree/main/applications/Chat): An open-source solution for cloning [ChatGPT](https://openai.com/blog/chatgpt/) with a complete RLHF pipeline. [[code]](https://github.com/hpcaitech/ColossalAI/tree/main/applications/Chat) [[blog]](https://www.hpc-ai.tech/blog/colossal-ai-chatgpt) [[demo]](https://chat.colossalai.org) +[ColossalChat](https://github.com/hpcaitech/ColossalAI/tree/main/applications/Chat): An open-source solution for cloning [ChatGPT](https://openai.com/blog/chatgpt/) with a complete RLHF pipeline. [[code]](https://github.com/hpcaitech/ColossalAI/tree/main/applications/Chat) [[blog]](https://medium.com/@yangyou_berkeley/colossalchat-an-open-source-solution-for-cloning-chatgpt-with-a-complete-rlhf-pipeline-5edf08fb538b) [[demo]](https://chat.colossalai.org)

diff --git a/applications/Chat/README.md b/applications/Chat/README.md index 2f1771bc9d37..f870d35821d5 100644 --- a/applications/Chat/README.md +++ b/applications/Chat/README.md @@ -38,7 +38,7 @@ --- ## What is ColossalChat and Coati ? -ColossalChat is the project to implement LLM with RLHF, powered by the Colossal-AI project. +[ColossalChat](https://github.com/hpcaitech/ColossalAI/tree/main/applications/Chat) is the project to implement LLM with RLHF, powered by the [Colossal-AI](https://github.com/hpcaitech/ColossalAI) project. Coati stands for `ColossalAI Talking Intelligence`. It is the name for the module implemented in this project and is also the name of the large language model developed by the ColossalChat project. @@ -55,13 +55,9 @@ The Coati package provides a unified large language model framework that has imp **As Colossa-AI is undergoing some major updates, this project will be actively maintained to stay in line with the Colossal-AI project.** -More details can be found in the [blog](https://www.hpc-ai.tech/blog/colossal-ai-chatgpt). - -

- -Image source: https://openai.com/blog/chatgpt -

- +More details can be found in the latest news. +* [2023/03] [ColossalChat: An Open-Source Solution for Cloning ChatGPT With a Complete RLHF Pipeline](https://medium.com/@yangyou_berkeley/colossalchat-an-open-source-solution-for-cloning-chatgpt-with-a-complete-rlhf-pipeline-5edf08fb538b) +* [2023/02] [Open Source Solution Replicates ChatGPT Training Process! Ready to go with only 1.6GB GPU Memory](https://www.hpc-ai.tech/blog/colossal-ai-chatgpt) ## Online demo You can experience the performance of Coati7B on this page. @@ -92,6 +88,11 @@ pip install . ## How to use? +

+ +Image source: https://openai.com/blog/chatgpt +

### Supervised datasets collection we colllected 104K bilingual dataset of Chinese and English, and you can find the datasets in this repo diff --git a/docs/README-zh-Hans.md b/docs/README-zh-Hans.md index 4be923eca024..4d29ae156e5e 100644 --- a/docs/README-zh-Hans.md +++ b/docs/README-zh-Hans.md @@ -24,8 +24,9 @@ ## 新闻 +* [2023/03] [ColossalChat: An Open-Source Solution for Cloning ChatGPT With a Complete RLHF Pipeline](https://medium.com/@yangyou_berkeley/colossalchat-an-open-source-solution-for-cloning-chatgpt-with-a-complete-rlhf-pipeline-5edf08fb538b) * [2023/03] [AWS and Google Fund Colossal-AI with Startup Cloud Programs](https://www.hpc-ai.tech/blog/aws-and-google-fund-colossal-ai-with-startup-cloud-programs) -* [2023/02] [Open source solution replicates ChatGPT training process! Ready to go with only 1.6GB GPU memory](https://www.hpc-ai.tech/blog/colossal-ai-chatgpt) +* [2023/02] [Open Source Solution Replicates ChatGPT Training Process! Ready to go with only 1.6GB GPU Memory](https://www.hpc-ai.tech/blog/colossal-ai-chatgpt) * [2023/01] [Hardware Savings Up to 46 Times for AIGC and Automatic Parallelism](https://medium.com/pytorch/latest-colossal-ai-boasts-novel-automatic-parallelism-and-offers-savings-up-to-46x-for-stable-1453b48f3f02) * [2022/11] [Diffusion Pretraining and Hardware Fine-Tuning Can Be Almost 7X Cheaper](https://www.hpc-ai.tech/blog/diffusion-pretraining-and-hardware-fine-tuning-can-be-almost-7x-cheaper) * [2022/10] [Use a Laptop to Analyze 90% of Proteins, With a Single-GPU Inference Sequence Exceeding 10,000](https://www.hpc-ai.tech/blog/use-a-laptop-to-analyze-90-of-proteins-with-a-single-gpu-inference-sequence-exceeding) @@ -220,7 +221,7 @@ Colossal-AI 为您提供了一系列并行组件。我们的目标是让您的 -[ColossalChat](https://github.com/hpcaitech/ColossalAI/tree/main/applications/Chat): 完整RLHF流程0门槛克隆 [ChatGPT](https://openai.com/blog/chatgpt/) [[代码]](https://github.com/hpcaitech/ColossalAI/tree/main/applications/Chat) [[博客]](https://www.hpc-ai.tech/blog/colossal-ai-chatgpt) [[在线样例]](https://chat.colossalai.org) +[ColossalChat](https://github.com/hpcaitech/ColossalAI/tree/main/applications/Chat): 完整RLHF流程0门槛克隆 [ChatGPT](https://openai.com/blog/chatgpt/) [[代码]](https://github.com/hpcaitech/ColossalAI/tree/main/applications/Chat) [[博客]](https://medium.com/@yangyou_berkeley/colossalchat-an-open-source-solution-for-cloning-chatgpt-with-a-complete-rlhf-pipeline-5edf08fb538b) [[在线样例]](https://chat.colossalai.org)

From 4a51997bcc5ddb982b1c57b30997244ccac49020 Mon Sep 17 00:00:00 2001 From: binmakeswell Date: Wed, 29 Mar 2023 03:54:03 +0800 Subject: [PATCH 2/2] [doc] add ColossalChat news --- applications/Chat/README.md | 13 ++++++++----- 1 file changed, 8 insertions(+), 5 deletions(-) diff --git a/applications/Chat/README.md b/applications/Chat/README.md index f870d35821d5..f69f32ac4a40 100644 --- a/applications/Chat/README.md +++ b/applications/Chat/README.md @@ -52,6 +52,14 @@ The Coati package provides a unified large language model framework that has imp - Fast model deploying - Perfectly integration with the Hugging Face ecosystem, high degree of model customization +

+

+ +

+ + Image source: https://openai.com/blog/chatgpt +
+ **As Colossa-AI is undergoing some major updates, this project will be actively maintained to stay in line with the Colossal-AI project.** @@ -88,11 +96,6 @@ pip install . ## How to use? -

- -Image source: https://openai.com/blog/chatgpt -

- ### Supervised datasets collection we colllected 104K bilingual dataset of Chinese and English, and you can find the datasets in this repo