From 53449f65b4b2e89f744dd623269dcfade1354156 Mon Sep 17 00:00:00 2001
From: binmakeswell
Date: Thu, 14 Sep 2023 22:29:44 +0800
Subject: [PATCH 1/3] [doc] fix llama2 code link

---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index 0ddcdab741a4..25d3b8f83f1e 100644
--- a/README.md
+++ b/README.md
@@ -224,7 +224,7 @@ Acceleration of [AlphaFold Protein Structure](https://alphafold.ebi.ac.uk/)
 - 70 billion parameter LLaMA2 model training accelerated by 195%
-[[code]](https://github.com/hpcaitech/ColossalAI/tree/example/llama/examples/language/llama)
+[[code]](https://github.com/hpcaitech/ColossalAI/tree/main/examples/language/llama2)
 [[blog]](https://www.hpc-ai.tech/blog/70b-llama2-training)
 ### LLaMA1

From d82eadce7f0e39353b6e62e22c2d4d25cc251e28 Mon Sep 17 00:00:00 2001
From: binmakeswell
Date: Thu, 14 Sep 2023 22:31:39 +0800
Subject: [PATCH 2/3] [doc] fix llama2 code link

---
 docs/README-zh-Hans.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/README-zh-Hans.md b/docs/README-zh-Hans.md
index dda4f86a29a0..41eebc59c493 100644
--- a/docs/README-zh-Hans.md
+++ b/docs/README-zh-Hans.md
@@ -217,7 +217,7 @@ Colossal-AI 为您提供了一系列并行组件。我们的目标是让您的
 - 700亿参数LLaMA2训练加速195%
-[[code]](https://github.com/hpcaitech/ColossalAI/tree/example/llama/examples/language/llama)
+[[code]](https://github.com/hpcaitech/ColossalAI/tree/main/examples/language/llama2)
 [[blog]](https://www.hpc-ai.tech/blog/70b-llama2-training)
 ### LLaMA1

From d6f2a7cefa22159e525d1a9e1acd97d20c4eb92a Mon Sep 17 00:00:00 2001
From: binmakeswell
Date: Thu, 14 Sep 2023 22:32:29 +0800
Subject: [PATCH 3/3] [doc] fix llama2 code link

---
 examples/language/llama2/README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/examples/language/llama2/README.md b/examples/language/llama2/README.md
index 16b263c1322e..c8fc86d29d97 100644
--- a/examples/language/llama2/README.md
+++ b/examples/language/llama2/README.md
@@ -6,7 +6,7 @@
 - 70 billion parameter LLaMA2 model training accelerated by 195%
-[[code]](https://github.com/hpcaitech/ColossalAI/tree/example/llama/examples/language/llama)
+[[code]](https://github.com/hpcaitech/ColossalAI/tree/main/examples/language/llama2)
 [[blog]](https://www.hpc-ai.tech/blog/70b-llama2-training)
 ### LLaMA1