diff --git a/docs/L2/InternVL/README.md b/docs/L2/InternVL/README.md
index d21c5c2b6..9d05a65b4 100644
--- a/docs/L2/InternVL/README.md
+++ b/docs/L2/InternVL/README.md
@@ -61,7 +61,7 @@ conda activate xtuner-env
 pip install xtuner==0.1.23 timm==1.0.9
 pip install 'xtuner[deepspeed]'
 pip install torch==2.4.1 torchvision==0.19.1 torchaudio==2.4.1 --index-url https://download.pytorch.org/whl/cu121
-pip install transformers==4.39.0 peft==0.13.2
+pip install transformers==4.39.0 tokenizers==0.15.2 peft==0.13.2 datasets==3.1.0 accelerate==1.2.0 huggingface-hub==0.26.5
 ```
 
 The training environment is now installed successfully.
@@ -180,6 +180,14 @@ cd /root/xtuner
 conda activate xtuner-env  # or whatever you named your training environment
 ```
 
+If this path does not exist, you can clone the repository from GitHub:
+
+```Bash
+cd /root
+git clone https://github.com/InternLM/xtuner.git
+conda activate xtuner-env
+```
+
 The original InternVL fine-tuning config files live under `./xtuner/configs/internvl/v2`. Assuming the repository cloned above is at `/root/InternVL2-Tutorial`, copy the config file to the target directory:
 
 ```Bash