diff --git a/docs/source/en/model_doc/ernie.md b/docs/source/en/model_doc/ernie.md
index 596a7b1f4b38..8b2290ded685 100644
--- a/docs/source/en/model_doc/ernie.md
+++ b/docs/source/en/model_doc/ernie.md
@@ -14,29 +14,83 @@ rendered properly in your Markdown viewer.
 -->
+
+<div style="float: right;">
+    <div class="flex flex-wrap space-x-1">
+        <img alt="PyTorch" src="https://img.shields.io/badge/PyTorch-DE3412?style=flat&logo=pytorch&logoColor=white">
+    </div>
+</div>
+
 # ERNIE
-
-<div class="flex flex-wrap space-x-1">
-<img alt="PyTorch" src="https://img.shields.io/badge/PyTorch-DE3412?style=flat&logo=pytorch&logoColor=white">
-</div>
-
+[ERNIE1.0](https://arxiv.org/abs/1904.09223), [ERNIE2.0](https://ojs.aaai.org/index.php/AAAI/article/view/6428),
+[ERNIE3.0](https://arxiv.org/abs/2107.02137), [ERNIE-Gram](https://arxiv.org/abs/2010.12148), and [ERNIE-health](https://arxiv.org/abs/2110.07244) are a series of powerful models proposed by Baidu that perform especially well on Chinese tasks.
+
+ERNIE (Enhanced Representation through kNowledge IntEgration) is designed to learn language representations enhanced by knowledge masking strategies, which include entity-level masking and phrase-level masking.
+
+Other ERNIE models released by Baidu can be found at [Ernie 4.5](./ernie4_5.md) and [Ernie 4.5 MoE](./ernie4_5_moe.md).
+
+> [!TIP]
+> This model was contributed by [nghuyong](https://huggingface.co/nghuyong), and the official code can be found in [PaddleNLP](https://github.com/PaddlePaddle/PaddleNLP) (in PaddlePaddle).
+>
+> Click on the ERNIE models in the right sidebar for more examples of how to apply ERNIE to different language tasks.
+
+The example below demonstrates how to predict the `[MASK]` token with [`Pipeline`], [`AutoModel`], and from the command line.
-
-## Overview
-ERNIE is a series of powerful models proposed by baidu, especially in Chinese tasks,
-including [ERNIE1.0](https://huggingface.co/papers/1904.09223), [ERNIE2.0](https://ojs.aaai.org/index.php/AAAI/article/view/6428),
-[ERNIE3.0](https://huggingface.co/papers/2107.02137), [ERNIE-Gram](https://huggingface.co/papers/2010.12148), [ERNIE-health](https://huggingface.co/papers/2110.07244), etc.
-
-These models are contributed by [nghuyong](https://huggingface.co/nghuyong) and the official code can be found in [PaddleNLP](https://github.com/PaddlePaddle/PaddleNLP) (in PaddlePaddle).
+
+<hfoptions id="usage">
+<hfoption id="Pipeline">
+
+```py
+from transformers import pipeline
-### Usage example
-Take `ernie-1.0-base-zh` as an example:
+
+pipeline = pipeline(
+    task="fill-mask",
+    model="nghuyong/ernie-3.0-xbase-zh"
+)
-```Python
-from transformers import AutoTokenizer, AutoModel
-tokenizer = AutoTokenizer.from_pretrained("nghuyong/ernie-1.0-base-zh")
-model = AutoModel.from_pretrained("nghuyong/ernie-1.0-base-zh")
+pipeline("巴黎是[MASK]国的首都。")
 ```
-
-### Model checkpoints
+
+</hfoption>
+<hfoption id="AutoModel">
+
+```py
+import torch
+from transformers import AutoModelForMaskedLM, AutoTokenizer
+
+tokenizer = AutoTokenizer.from_pretrained(
+    "nghuyong/ernie-3.0-xbase-zh",
+)
+model = AutoModelForMaskedLM.from_pretrained(
+    "nghuyong/ernie-3.0-xbase-zh",
+    torch_dtype=torch.float16,
+    device_map="auto"
+)
+inputs = tokenizer("巴黎是[MASK]国的首都。", return_tensors="pt").to("cuda")
+
+with torch.no_grad():
+    outputs = model(**inputs)
+    predictions = outputs.logits
+
+masked_index = torch.where(inputs['input_ids'] == tokenizer.mask_token_id)[1]
+predicted_token_id = predictions[0, masked_index].argmax(dim=-1)
+predicted_token = tokenizer.decode(predicted_token_id)
+
+print(f"The predicted token is: {predicted_token}")
+```
+
+</hfoption>
+<hfoption id="transformers CLI">
+
+```bash
+echo -e "巴黎是[MASK]国的首都。" | transformers run --task fill-mask --model nghuyong/ernie-3.0-xbase-zh --device 0
+```
+
+</hfoption>
+</hfoptions>
+
+## Notes
+
+Model variants are available in different sizes and languages.
 | Model Name          | Language | Description                     |
 |:-------------------:|:--------:|:-------------------------------:|
@@ -51,18 +105,11 @@ model = AutoModel.from_pretrained("nghuyong/ernie-1.0-base-zh")
 | ernie-health-zh     | Chinese  | Layer:12, Heads:12, Hidden:768  |
 | ernie-gram-zh       | Chinese  | Layer:12, Heads:12, Hidden:768  |
 
-You can find all the supported models from huggingface's model hub: [huggingface.co/nghuyong](https://huggingface.co/nghuyong), and model details from paddle's official
-repo: [PaddleNLP](https://paddlenlp.readthedocs.io/zh/latest/model_zoo/transformers/ERNIE/contents.html)
-and [ERNIE](https://github.com/PaddlePaddle/ERNIE/blob/repro).
-
 ## Resources
 
-- [Text classification task guide](../tasks/sequence_classification)
-- [Token classification task guide](../tasks/token_classification)
-- [Question answering task guide](../tasks/question_answering)
-- [Causal language modeling task guide](../tasks/language_modeling)
-- [Masked language modeling task guide](../tasks/masked_language_modeling)
-- [Multiple choice task guide](../tasks/multiple_choice)
+You can find all the supported models on Hugging Face's model hub at [huggingface.co/nghuyong](https://huggingface.co/nghuyong), and model details in Paddle's official
+repo: [PaddleNLP](https://paddlenlp.readthedocs.io/zh/latest/model_zoo/transformers/ERNIE/contents.html)
+and [ERNIE's legacy branch](https://github.com/PaddlePaddle/ERNIE/tree/legacy/develop).
 
 ## ErnieConfig
@@ -116,4 +163,4 @@ and [ERNIE](https://github.com/PaddlePaddle/ERNIE/blob/repro).
 ## ErnieForQuestionAnswering
 
 [[autodoc]] ErnieForQuestionAnswering
-    - forward
\ No newline at end of file
+    - forward
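The knowledge-masking strategies named in the new introduction (entity-level and phrase-level masking) mask whole semantic units rather than independent subword tokens. The following is a minimal, self-contained sketch of that idea only; the `phrase_level_mask` helper, the example tokens, and the span indices are hypothetical illustrations, not ERNIE's actual pretraining code:

```python
import random

def phrase_level_mask(tokens, phrases, mask_token="[MASK]", seed=0):
    """Mask a whole phrase (a contiguous run of tokens) instead of single tokens.

    `phrases` is a list of (start, end) index pairs; one pair is chosen at
    random and every token inside [start, end) is replaced with `mask_token`.
    """
    rng = random.Random(seed)
    start, end = rng.choice(phrases)
    masked = list(tokens)
    for i in range(start, end):
        masked[i] = mask_token
    return masked

tokens = ["Harry", "Potter", "is", "a", "series", "of", "novels"]
# Treat "Harry Potter" (a named entity) as one maskable unit spanning tokens 0-1.
print(phrase_level_mask(tokens, [(0, 2)]))
# → ['[MASK]', '[MASK]', 'is', 'a', 'series', 'of', 'novels']
```

Entity-level masking is the same mechanism with spans supplied by a named-entity tagger rather than a phrase chunker; forcing the model to recover the whole unit is what pushes it to encode the knowledge the unit carries.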