This requirement comes from the contract between Hugging Face and Intel.
We should give users a way to use scikit-learn-intelex at inference time. This requires a few steps:
- Check if models trained with scikit-learn can be served with scikit-learn-intelex and vice versa.
- Expose a method in `skops.hub_utils` to add a configuration flag that enables scikit-learn-intelex.
- We should probably also add a tag so that we have better visibility into models using this option.
- On the https://github.com/huggingface/api-inference-community/tree/main/docker_images/sklearn side, if the flag is set, install the `scikit-learn-intelex` package in the environment and run the script with: `python -m sklearnex my_application.py`
- Make sure the packages on conda-forge are up-to-date: https://anaconda.org/conda-forge/scikit-learn-intelex
Project homepage: https://intel.github.io/scikit-learn-intelex/
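The `skops.hub_utils` method could look roughly like this. Note this is only a sketch: the function name `enable_intelex` and the `"use_intelex"` config key are assumptions for illustration, not the actual skops API or config schema.

```python
import json
from pathlib import Path


def enable_intelex(repo_path):
    """Hypothetical helper: set a flag in the model repo's config.json
    so the inference backend knows to run under scikit-learn-intelex.

    The "use_intelex" key name is an assumption, not the real skops schema.
    """
    config_file = Path(repo_path) / "config.json"
    # Load the existing config if present, otherwise start from scratch.
    config = json.loads(config_file.read_text()) if config_file.exists() else {}
    config.setdefault("sklearn", {})["use_intelex"] = True
    config_file.write_text(json.dumps(config, indent=2))
    return config
```

Writing the flag into `config.json` (rather than, say, a separate file) keeps it next to the other model metadata that the inference side already parses.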
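On the api-inference-community side, the flag check that picks the launch command could be sketched as below; the `"use_intelex"` key and the `build_serving_command` helper are assumptions for illustration, not existing code in that repo.

```python
def build_serving_command(config, script="my_application.py"):
    """Sketch: given the parsed config.json dict, decide whether to launch
    the app via `python -m sklearnex` (which patches scikit-learn estimators
    with their intelex counterparts) or plain `python`.

    The "use_intelex" key is an assumed config flag, not an existing schema.
    """
    if config.get("sklearn", {}).get("use_intelex", False):
        return ["python", "-m", "sklearnex", script]
    return ["python", script]
```

The command list can then be handed to `subprocess.run` (or baked into the Docker entrypoint) without any change to the application script itself, since `python -m sklearnex` does the patching transparently.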