Getting Hosted inference API working? #8030

Closed

longenbach wants to merge 1 commit into huggingface:master from longenbach:patch-2

Conversation

@longenbach
Contributor

@longenbach longenbach commented Oct 25, 2020

Trying to get Hosted inference API to work. Was following https://gist.github.com/julien-c/857ba86a6c6a895ecd90e7f7cab48046 ... is below the correct YAML syntax?

pipeline:
- fill-mask
widget:
- text: "København er [mask] i Danmark."

What does this PR do?

Fixes # (issue)

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you read the contributor guideline,
    Pull Request section?
  • Was this discussed/approved via a Github issue or the forum? Please add a link
    to it if that's the case.
  • Did you make sure to update the documentation with your changes? Here are the
    documentation guidelines, and
    here are tips on formatting docstrings.
  • Did you write any new necessary tests?

Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors which may be interested in your PR.

@julien-c julien-c added the model card Related to pretrained model cards label Oct 25, 2020
@julien-c julien-c closed this in efc4a21 Oct 26, 2020
julien-c added a commit that referenced this pull request Oct 26, 2020
@julien-c
Member

@longenbach pipeline_tag expects a single string, not an array of strings.

Note that you wouldn't need any of the tags or pipeline_tag (they would be detected automatically) if your config.json contained:

{
  ...
  "architectures": [
    "BertForMaskedLM"
  ],
  "model_type": "bert"
}

We'll try to make that clearer in a future iteration.
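The automatic-detection point above can be sketched in a few lines. This is a hypothetical illustration, not the Hub's actual detection code: the `inferred_pipeline` helper and its architecture-suffix mapping are assumptions made for this example.

```python
import json

# A config.json carrying "architectures" and "model_type", as in the
# snippet above (the "..." placeholder fields are omitted here).
config = json.loads("""
{
  "architectures": ["BertForMaskedLM"],
  "model_type": "bert"
}
""")

def inferred_pipeline(cfg):
    # Hypothetical mapping from an architecture-name suffix to a pipeline
    # tag -- an assumption for illustration, not the Hub's real logic.
    suffix_map = {
        "ForMaskedLM": "fill-mask",
        "ForSequenceClassification": "text-classification",
    }
    for arch in cfg.get("architectures", []):
        for suffix, tag in suffix_map.items():
            if arch.endswith(suffix):
                return tag
    return None

print(inferred_pipeline(config))  # fill-mask
```

With such a mapping, a `BertForMaskedLM` architecture alone is enough to pick the fill-mask widget, which is why the tags and pipeline_tag become redundant.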

@julien-c
Member

Nice!

[Screenshot: 2020-10-26 at 09:38:41]

@longenbach
Contributor Author

@julien-c it works 🤗 Thanks for the insight on the documentation. So you are saying we can avoid the tags and pipeline_tag entirely by including that JSON chunk in the uploaded config.json file?

In case others run into confusion with the Hosted inference API, below is the YAML section of my model card that works:

---
language: da
tags:
- bert
- masked-lm
- lm-head
license: cc-by-4.0
datasets:
- common_crawl
- wikipedia
pipeline_tag: fill-mask
widget:
- text: "København er [MASK] i Danmark."
---
