diff --git a/.github/workflows/build-test.yml b/.github/workflows/build-test.yml
index 8b44f3ef..9f08440d 100644
--- a/.github/workflows/build-test.yml
+++ b/.github/workflows/build-test.yml
@@ -46,7 +46,7 @@ jobs:
- uses: prefix-dev/setup-pixi@v0.8.1
with:
- pixi-version: v0.39.0
+ pixi-version: v0.39.5
environments: ${{ matrix.environment }}
# we can freeze the environment and manually bump the dependencies to the
# latest version time to time.
diff --git a/.github/workflows/clean-skops-user.yml b/.github/workflows/clean-skops-user.yml
deleted file mode 100644
index c32ad9be..00000000
--- a/.github/workflows/clean-skops-user.yml
+++ /dev/null
@@ -1,24 +0,0 @@
-name: clean-skops-user
-
-on:
- schedule:
- # * is a special character in YAML so you have to quote this string
- - cron: '10 1 * * *'
-
-jobs:
- clean-skops-user:
-
- runs-on: ubuntu-latest
- if: "github.repository == 'skops-dev/skops'"
-
- # Timeout: https://stackoverflow.com/a/59076067/4521646
- timeout-minutes: 35
-
- steps:
- - uses: actions/checkout@v4
- - name: Set up Python
- uses: actions/setup-python@v5
- - name: Install Requirements
- run: pip install huggingface_hub
- - name: run cleanup
- run: echo "y" | python scripts/clean_skops.py
diff --git a/CONTRIBUTING.rst b/CONTRIBUTING.rst
index 83222e6c..cd247d41 100644
--- a/CONTRIBUTING.rst
+++ b/CONTRIBUTING.rst
@@ -98,21 +98,6 @@ scikit-learn and all other required dependencies with:
pytest
-Certain tests require internet access to run, and they typically take slightly
-longer to run than other tests. If you'd like to skip those tests, you can add
-``-m not network`` to your ``pytest`` command, or ``-m network`` to only run
-those tests. For example, you can run all tests except the ones requiring
-internet with:
-
-.. code:: bash
-
- pytest -m "not network" skops
-
-Similarly, there is a flag, ``-m inference`` for tests that hit the Hugging Face
-Inference API, which can be quite slow or even hang. Skip these tests as long as
-you don't make any changes to this functionality. If you already skip network
-tests, the inference tests will also be skipped.
-
Releases
========
diff --git a/README.rst b/README.rst
index a77ac4ec..7d225121 100644
--- a/README.rst
+++ b/README.rst
@@ -31,26 +31,19 @@ SKOPS
``skops`` is a Python library helping you share your `scikit-learn
`__ based models and put them in production.
-At the moment, it includes tools to easily integrate models on the Hugging Face
-Hub, which allows you to share your models, make them discoverable, and use the
-Hub's API inference and widgets to get outputs of the model without having to
-download or load the model.
-
-- ``skops.hub_utils``: tools to create a model repository to be stored on
- `Hugging Face Hub `__, mainly through
- ``skops.hub_utils.init`` and ``skops.hub_utils.push``. You can see all the
- models uploaded to the hub using this library `here
- `__. Find out more `here
- `__.
+At the moment, it includes ``skops.io`` to securely persist sklearn estimators and
+more, without using ``pickle``. It also includes ``skops.card`` to create a model
+card explaining what the model does and how it should be used.
+
+- ``skops.io``: Secure persistence of sklearn estimators and more, without using
+ ``pickle``. Visit `the docs
+ `__ for more
+ information.
- ``skops.card``: tools to create a model card explaining what the model does
and how it should be used. The model card can then be stored as the
``README.md`` file on the Hugging Face Hub, with pre-populated metadata to
help Hub understand the model. More information can be found `here
`__.
-- ``skops.io``: Secure persistence of sklearn estimators and more, without using
- ``pickle``. Visit `the docs
- `__ for more
- information.
Please refer to our `documentation `_
on using the library as user, which includes user guides on the above topics as
diff --git a/docs/changes.rst b/docs/changes.rst
index 22c036e0..c0ddb20e 100644
--- a/docs/changes.rst
+++ b/docs/changes.rst
@@ -39,7 +39,7 @@ v0.9
estimators. :pr:`384` by :user:`Reid Johnson `.
- Fix an issue with visualizing Skops files for `scikit-learn` tree estimators.
:pr:`386` by :user:`Reid Johnson `.
-- :func:`skops.hub_utils.get_model_output` and :func:`skops.hub_utils.push` are
+- ``skops.hub_utils.get_model_output`` and ``skops.hub_utils.push`` are
deprecated and will be removed in version 0.10. :pr:`396` by `Adrin Jalali`_.
v0.8
@@ -156,7 +156,7 @@ v0.2
filesystem if it fails for some reason. :pr:`60` by `Adrin Jalali`_
- When adding figures or tables, it's now possible to set ``folded=True`` to
render the content inside a details tag. :pr:`108` by `Benjamin Bossan`_.
-- Add :meth:`skops.hub_utils.get_model_output` to get the model's output using
+- Add ``skops.hub_utils.get_model_output`` to get the model's output using
The Hugging Face Hub's inference API, and return an array with the outputs.
:pr:`105` by `Adrin Jalali`_.
@@ -167,7 +167,7 @@ This is the first release of the library. It include two main modules:
- :mod:`skops.hub_utils`: tools to create a model repository to be stored on
`Hugging Face Hub `__, mainly through
- :func:`skops.hub_utils.init` and :func:`skops.hub_utils.push`.
+ ``skops.hub_utils.init`` and ``skops.hub_utils.push``.
- :mod:`skops.card`: tools to create a model card explaining what the model does
and how it should be used. The model card can then be stored as the
``README.md`` file on the Hugging Face Hub, with pre-populated metadata to
diff --git a/docs/community.rst b/docs/community.rst
index 2f7aff3f..0cf0fd34 100644
--- a/docs/community.rst
+++ b/docs/community.rst
@@ -13,16 +13,6 @@ If you'd like to contribute to the project, please make sure you read our
`__.
-Discord
-~~~~~~~
-We also have a place on Hugging Face's discord server. We're happy to see you
-there and answer any questions you might have. You can join using this `invite
-link `__. Once you join, first you need to accept
-the rules on the server regarding respectful and harassment free communication,
-and then you can head to the ``#role-assignment`` channel where you'll find and
-``Open Source ML`` button. Clicking on that will give you access to a few
-channels and categories, including the ``skops`` category.
-
Maintainers
-----------
Current maintainers of the project are (in alphabetical order):
diff --git a/docs/conf.py b/docs/conf.py
index 4c4d99dd..b63f7467 100644
--- a/docs/conf.py
+++ b/docs/conf.py
@@ -148,6 +148,5 @@ def linkcode_resolve(domain, info):
"sklearn": ("https://scikit-learn.org/stable/", None),
"pandas": ("https://pandas.pydata.org/docs/", None),
"joblib": ("https://joblib.readthedocs.io/en/latest/", None),
- "huggingface_hub": ("https://huggingface.co/docs/huggingface_hub/main/en", None),
"fairlearn": ("https://fairlearn.org/v0.8/", None),
}
diff --git a/docs/examples.rst b/docs/examples.rst
index f7ca5d38..45fc78eb 100644
--- a/docs/examples.rst
+++ b/docs/examples.rst
@@ -6,9 +6,6 @@ Examples of interactions with the Hugging Face Hub
- Creating the Model Card:
:ref:`sphx_glr_auto_examples_plot_model_card.py` is an example of using
skops to create a model card that can be used on the Hugging Face Hub.
-- Putting the Model Card on the Hub:
- :ref:`sphx_glr_auto_examples_plot_hf_hub.py` is an example of using skops
- to put a model card on the Hugging Face Hub.
- Tabular Regression:
:ref:`sphx_glr_auto_examples_plot_tabular_regression.py` is an example of using skops to serialize a tabular
regression model and create a model card and a Hugging Face Hub repository.
diff --git a/docs/hf_hub.rst b/docs/hf_hub.rst
deleted file mode 100644
index d116ce56..00000000
--- a/docs/hf_hub.rst
+++ /dev/null
@@ -1,162 +0,0 @@
-.. _hf_hub:
-
-scikit-learn Models on Hugging Face Hub
-=======================================
-
-This library allows you to initialize and create a model repository compatible
-with `Hugging Face Hub `__, which among other
-things, gives you the following benefits:
-
-- Inference API to get model output through REST calls
-- A widget to try the model directly in the browser
-- Metadata tags for better discoverability of the model
-- Collaborating with others on a model through discussions and pull requests
-- Convenient sharing of models with the community
-
-You can see all the models uploaded to the Hugging Face Hub using this library
-`here `_.
-
-In terms of files, there are three which a scikit-learn model repo needs to
-have on the Hub:
-
-- ``README.md``: includes certain metadata on top of the file and then a
- description of the model, aka model card.
-- ``config.json``: contains the configuration needed to run the model.
-- The persisted model file. There are no constraints on the name of the file
- and the name is configured in ``config.json``. The file needs to be loadable
- by :func:`joblib.load` or :func:`pickle.load`.
-
-There are certain requirements in terms of information about the model for the
-Hub to be able to load and run the model. For scikit-learn compatible models,
-this information is stored in two places:
-
-- The metadata in ``README.md`` of the model repository, about which you can
- read `here `__.
-- The configuration stored in ``config.json``.
-
-As a user of ``skops``, you can use the tools in ``skops.hub_utils`` to create
-and persist a ``config.json`` file, and then use it to populate necessary
-metadata in the ``README.md`` file. The metadata in ``README.md`` is used by
-the Hub's backend to understand the type of the model and the kind of task
-which the model tries to solve. An example of a task can be
-``"tabular-classification"`` or ``"text-regression"``.
-
-An example ``config.json`` file looks like this::
-
- {
- "sklearn": {
- "columns": [
- "petal length (cm)",
- "petal width (cm)",
- "sepal length (cm)",
- "sepal width (cm)",
- ],
- "environment": ['scikit-learn="1.1.1"', "numpy"],
- "example_input": {
- "petal length (cm)": [1.4, 1.4, 1.3],
- "petal width (cm)": [0.2, 0.2, 0.2],
- "sepal length (cm)": [5.1, 4.9, 4.7],
- "sepal width (cm)": [3.5, 3.0, 3.2],
- },
- "model": {"file": "model.pkl"},
- "task": "tabular-classification",
- }
- }
-
-The key ``sklearn`` includes the following sub-keys:
-
-- ``columns``: An ordered list of column names. The order is important as it is
- used to make sure the input given to the model is what the model expects.
-- ``example_input``: A list of examples to the model. This is in the form of a
- dictionary of column names to list of values, and is used by the Hugging Face
- Hub backend to show them in the widget to test the model when visiting the
- model's page on the Hub.
-- ``environment``: A list of dependencies that the model requires. These
- packages must be available on conda-forge and are installed before loading
- the model.
-- ``model.file``: The file name of the persisted model.
-- ``task``: The task of the model.
-
-You almost never need to create or touch this file manually, and it's created
-when you call :func:`skops.hub_utils.init`.
-
-It is recommended to include the script itself that creates the whole output in
-the upload. This way, the results are easily reproducible for others. To achieve
-this, call :func:`skops.hub_utils.add_files`:
-
-.. code:: python
-
- # contents of train.py
- ...
- hub_utils.init(model, dst=local_repo)
- hub_utils.add_files(__file__, dst=local_repo) # adds train.py to repo
- hub_utils.push(...)
-
-You may of course add more files if they're useful.
-
-.. _hf_hub_inference:
-
-Inference without Downloading the Models
-----------------------------------------
-
-You can use the Hugging Face Hub's inference API to get model output without
-downloading the models. The :func:`skops.hub_utils.get_model_output` function
-returns the model output for a given input. It can be used as::
-
- import skops.hub_utils as hub_utils
- import pandas as pd
- data = pd.DataFrame(...)
- # Load the model from the Hub
- res = hub_utils.get_model_output("USER/MODEL_ID", data)
-
-In the above code snippet, ``res`` will be a :class:`numpy.ndarray` containing
-the model's output.
-
-.. _hf_hub_gradio:
-..
- TODO: replace gradio link once gradio provides object.inv
-Easily build user interfaces to your scikit-learn models
---------------------------------------------------------
-`gradio `__ is a python library that lets you create interfaces on your model.
-It has a class called `Interface `__ that lets you create application
-interfaces to your machine learning models. Using gradio can have some advantages over the using a plain
-model repository, e.g. the Gradio dataframe component allows uploading a csv for tabular data, unlike the
-widget in the model repository.
-
-``gradio`` is integrated with skops, so you can load an interface with only one
-line of code. During the initialization of the interface, call load method with
-your repository identifier prepended with "huggingface/" will load an
-interface for your model. The interface has a dataframe input that takes samples
-and a dataframe output to return predictions. It also takes the example in the
-repository that is previously pushed with skops.
-Calling `gr.Interface.launch() `__ will launch your application.
-
-.. code:: python
-
- import gradio as gr
- repo_id = "scikit-learn/tabular-playground"
- gr.Interface.load(f"huggingface/{repo_id}").launch()
-
-
-You can further customize your UI, add description, title, and more. If you'd
-like to share your demo, you can set ``share`` to True in `gr.Interface.launch() `__.
-
-.. code:: python
-
- title = "Supersoaker Defective Product Prediction"
- description = ("This model predicts Supersoaker production line failures."
- "Drag and drop any slice from dataset or edit values as you wish in below"
- "dataframe component.")
- gr.Interface.load(f"huggingface/{repo_id}", title = title, description = description)
-
-Sharing your local application has time limitations.
-If you want to share your application continuously, you can deploy it to
-Hugging Face Spaces. You can check out `this blog `__
-on how to do it.
-For more information, please refer to documentation of `gradio `__.
-
-It's also possible to spawn a gradio space directly from the model repository.
-To achieve this, from the model page, click on ``Deploy`` (top right corner) >
-``Spaces`` > ``Create new Space``, then follow the instructions. After
-finishing, you get a gradio space hosted on Hugging Face Hub, with all the
-benefits that brings.
diff --git a/docs/model_card.rst b/docs/model_card.rst
index 0afdb7d2..32ee1e29 100644
--- a/docs/model_card.rst
+++ b/docs/model_card.rst
@@ -8,8 +8,7 @@ which are a short documentation explaining what the model does, how it's
trained, and its limitations. `Hugging Face Hub `__
expects a ``README.md`` file containing a certain set of metadata at the
beginning of it, following with the content of the model card in markdown
-format. The metadata section is used to make models searchable on the Hub, and
-get the inference API and the widgets on the website working.
+format.
Metadata
--------
@@ -17,13 +16,7 @@ Metadata
The metadata part of the file needs to follow the specifications `here
`__. It
includes simple attributes of your models such as the task you're solving,
-dataset you trained the model with, evaluation results and more. When the model
-is hosted on the Hub, information in metadata like task name or dataset help
-your model be discovered on the `Hugging Face Hub
-`__. The task identifiers should follow the task
-taxonomy defined in Hugging Face Hub, as it enables the inference widget on the
-model page. An example to task identifier can be ``"tabular-classification"``
-or ``"text-regression"``.
+dataset you trained the model with, evaluation results and more.
Here's an example of the metadata section of the ``README.md`` file:
diff --git a/examples/plot_california_housing.py b/examples/plot_california_housing.py
index e20c067a..a77f1a5c 100644
--- a/examples/plot_california_housing.py
+++ b/examples/plot_california_housing.py
@@ -39,7 +39,6 @@
# ``python -m pip install jupyter matplotlib pandas scikit-learn skops``
# %%
-import os
from operator import itemgetter
from pathlib import Path
from tempfile import mkdtemp
@@ -47,7 +46,6 @@
import matplotlib.pyplot as plt
import numpy as np
import pandas as pd
-import sklearn
from matplotlib.patches import Rectangle
from sklearn.compose import ColumnTransformer
from sklearn.datasets import fetch_california_housing
@@ -66,8 +64,7 @@
from sklearn.preprocessing import FunctionTransformer
from sklearn.tree import DecisionTreeRegressor
-import skops
-from skops import card, hub_utils
+from skops import card
from skops import io as sio
# %%
@@ -1663,23 +1660,6 @@
# just add the same section name again and the value would be overwritten
# by the new content.
-# %%
-# Another useful thing for readers of the card to have is a small code
-# snippet that shows how to load the model. To add this, we use the
-# ``model_card.add_get_started_code`` method. We add a short
-# description, but that’s optional, as well as the model format (remember,
-# we used the skops format here) and the file name. Regarding the latter,
-# since the file is saved in a temporary directory, we should strip that
-# away from the name by calling ``file_name.name``, since another
-# user would not have the file in the exact same temporary location.
-
-# %%
-model_card.add_get_started_code(
- description="Run the code below to load the model",
- model_format="skops",
- file_name=file_name.name,
-)
-
# %%
# Another convenience method we should make use of is the
# ``model_card.add_metrics`` method. This will store the metrics
@@ -1766,103 +1746,7 @@
# %%
# Now the model card is saved as a markdown file in the temporary
# directory, together with the gradient boosting model and the figures we
-# added earlier. We could now share the contents of that folder with
-# people who might be interested in our model by sending them the contents
-# of that directory.
-
-# %%
-# Upload the model to Hugging Face Hub
-# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
-
-# %%
-# A perhaps better way to share these results is to upload it somewhere
-# for other people to discover and use. Hugging Face provides a platform
-# for doing this, the `Hugging Face
-# Hub `__. Even though it’s mostly
-# known for deep learning models, it also works with scikit-learn. So
-# let’s upload our model and model card there.
-
-# %%
-# First, let’s create yet another temporary directory. This one will
-# contain everything we want to upload to the Hugging Face Hub repository:
-
-# %%
-hub_dir = Path(mkdtemp())
-
-# %%
-# Next we use the ``hub_utils`` provided by skops to initialize and
-# push the repository. For now, let’s create the initial repo using
-# ``hub_utils.init``. Note that we pass the full path to the model
-# file here. We also pass the requirements (which are scikit-learn,
-# pandas, and skops in this case), the kind of task we’re solving,
-# i.e. tabular regression, and a sample of our data (only the first 3 rows
-# will actually be used).
-
-# %%
-requirements = [
- f"scikit-learn=={sklearn.__version__}",
- f"pandas=={pd.__version__}",
- f"skops=={skops.__version__}",
-]
-
-hub_utils.init(
- model=file_name,
- requirements=requirements,
- dst=hub_dir,
- task="tabular-regression",
- data=df_test,
-)
-
-# %%
-# When we take a look at the directory of the repo, we find the following:
-
-# %%
-os.listdir(hub_dir)
-
-# %%
-# So the model was automatically copied to the directory (which is why we
-# needed to pass the full path to it), and a ``config.json`` was
-# created, which contains useful metadata about our model.
-
-# %%
-# Remember that we can attach the metadata to our model card? Let’s do
-# this now. To achieve this, we load the metadata from the
-# ``config.json`` using the function ``metadata_from_config``
-# that is provided by skops:
-
-# %%
-metadata = card.metadata_from_config(hub_dir / "config.json")
-metadata
-
-# %%
-# Let’s attach the metadata to our model card and save it again:
-
-# %%
-model_card.metadata = metadata
-model_card.save(temp_dir / "README.md")
-
-# %%
-# So now the ``README.md`` contains the metadata. However, the model
-# card and the figures we created are not part of the repository directory
-# yet, they are still in the temporary directory we created earler. So
-# let’s fix this. We use the ``add_files`` function to do this:
-
-# %%
-hub_utils.add_files(
- temp_dir / "README.md",
- temp_dir / "geographic.png",
- "permutation-importances.png",
- dst=hub_dir,
-)
-
-# %%
-os.listdir(hub_dir)
-
-# %%
-# Creating the Repo and Pushing to Hugging Face Hub
-# You can use the tools available in ``huggingface_hub`` to create a repo and
-# push the contents of the repo folder to that repo. For more information visit
-# https://huggingface.co/docs/huggingface_hub/index
+# added earlier.
# %%
# Conclusion
@@ -1875,11 +1759,6 @@
# known features of scikit-learn, and trained a machine learning model
# that performs well.
-# %%
-# But we didn’t stop there. We also leveraged skops and the Hugging Face
-# Hub to share our results with a wider public, ensuring that the model
-# artifact is safe to use and that our task is well documented.
-
# %%
# If you have any feedback or suggestions for improvement, feel free to
# reach out to the skops team, e.g. by visiting our `discord
diff --git a/examples/plot_custom_model_card.py b/examples/plot_custom_model_card.py
index 1385836f..32548b76 100644
--- a/examples/plot_custom_model_card.py
+++ b/examples/plot_custom_model_card.py
@@ -128,17 +128,6 @@
section="Regression on California Housing dataset/Model Description/Model Diagram",
)
-# %%
-# Add getting started code
-# ------------------------
-
-model_card.add_get_started_code(
- section="Regression on California Housing dataset/Usage/Getting Started",
- description="Load the model using the snippet below.",
- file_name=model_file_name,
- model_format="skops",
-)
-
# %%
# Add partial dependence plot
# ---------------------------
diff --git a/examples/plot_model_card.py b/examples/plot_model_card.py
deleted file mode 100644
index 7f098ee6..00000000
--- a/examples/plot_model_card.py
+++ /dev/null
@@ -1,189 +0,0 @@
-"""
-scikit-learn model cards
-------------------------
-
-This guide demonstrates how you can use this package to create a model card on a
-scikit-learn compatible model and save it.
-"""
-
-# %%
-# Imports
-# =======
-# First we will import everything required for the rest of this document.
-
-import pickle
-from pathlib import Path
-from tempfile import mkdtemp, mkstemp
-
-import pandas as pd
-import sklearn
-from sklearn.datasets import load_breast_cancer
-from sklearn.ensemble import HistGradientBoostingClassifier
-from sklearn.experimental import enable_halving_search_cv # noqa
-from sklearn.inspection import permutation_importance
-from sklearn.metrics import (
- ConfusionMatrixDisplay,
- accuracy_score,
- classification_report,
- confusion_matrix,
- f1_score,
-)
-from sklearn.model_selection import HalvingGridSearchCV, train_test_split
-
-from skops import hub_utils
-from skops.card import Card, metadata_from_config
-
-# %%
-# Data
-# ====
-# We load breast cancer dataset from sklearn.
-
-X, y = load_breast_cancer(as_frame=True, return_X_y=True)
-X_train, X_test, y_train, y_test = train_test_split(
- X, y, test_size=0.3, random_state=42
-)
-print("X's summary: ", X.describe())
-print("y's summary: ", y.describe())
-
-# %%
-# Train a Model
-# =============
-# Using the above data, we train a model. To select the model, we use
-# :class:`~sklearn.model_selection.HalvingGridSearchCV` with a parameter grid
-# over :class:`~sklearn.ensemble.HistGradientBoostingClassifier`.
-
-param_grid = {
- "max_leaf_nodes": [5, 10, 15],
- "max_depth": [2, 5, 10],
-}
-
-model = HalvingGridSearchCV(
- estimator=HistGradientBoostingClassifier(),
- param_grid=param_grid,
- random_state=42,
- n_jobs=-1,
-).fit(X_train, y_train)
-model.score(X_test, y_test)
-
-
-# %%
-# Initialize a repository to save our files in
-# ============================================
-# We will now initialize a repository and save our model
-_, pkl_name = mkstemp(prefix="skops-", suffix=".pkl")
-
-with open(pkl_name, mode="bw") as f:
- pickle.dump(model, file=f)
-
-local_repo = mkdtemp(prefix="skops-")
-
-hub_utils.init(
- model=pkl_name,
- requirements=[f"scikit-learn={sklearn.__version__}"],
- dst=local_repo,
- task="tabular-classification",
- data=X_test,
-)
-
-# %%
-# Create a model card
-# ====================
-# We now create a model card, and populate its metadata with information which
-# is already provided in ``config.json``, which itself is created by the call to
-# :func:`.hub_utils.init` above. We will see below how we can populate the model
-# card with useful information.
-
-model_card = Card(model, metadata=metadata_from_config(Path(local_repo)))
-
-# %%
-# Add more information
-# ====================
-# So far, the model card does not tell viewers a lot about the model. Therefore,
-# we add more information about the model, like a description and what its
-# license is.
-
-model_card.metadata.license = "mit"
-limitations = "This model is not ready to be used in production."
-model_description = (
- "This is a `HistGradientBoostingClassifier` model trained on breast cancer "
- "dataset. It's trained with `HalvingGridSearchCV`, with parameter grids on "
- "`max_leaf_nodes` and `max_depth`."
-)
-model_card_authors = "skops_user"
-citation_bibtex = "**BibTeX**\n\n```\n@inproceedings{...,year={2020}}\n```"
-model_card.add(
- **{ # type: ignore
- "Citation": citation_bibtex,
- "Model Card Authors": model_card_authors,
- "Model description": model_description,
- "Model description/Intended uses & limitations": limitations,
- }
-)
-
-# %%
-# Add plots, metrics, and tables to our model card
-# ================================================
-# Furthermore, to better understand the model performance, we should evaluate it
-# on certain metrics and add those evaluations to the model card. In this
-# particular example, we want to calculate the accuracy and the F1 score. We
-# calculate those using sklearn and then add them to the model card by calling
-# :meth:`.Card.add_metrics`. But this is not all, we can also add matplotlib
-# figures to the model card, e.g. a plot of the confusion matrix. To achieve
-# this, we create the plot using sklearn, save it locally, and then add it using
-# :meth:`.Card.add_plot` method. Finally, we can also add some useful tables to
-# the model card, e.g. the results from the grid search and the classification
-# report. Those can be added using :meth:`.Card.add_table`
-
-y_pred = model.predict(X_test)
-eval_descr = (
- "The model is evaluated on test data using accuracy and F1-score with "
- "macro average."
-)
-model_card.add(**{"Model description/Evaluation Results": eval_descr}) # type: ignore
-
-accuracy = accuracy_score(y_test, y_pred)
-f1 = f1_score(y_test, y_pred, average="micro")
-model_card.add_metrics(**{"accuracy": accuracy, "f1 score": f1})
-
-cm = confusion_matrix(y_test, y_pred, labels=model.classes_)
-disp = ConfusionMatrixDisplay(confusion_matrix=cm, display_labels=model.classes_)
-disp.plot()
-
-disp.figure_.savefig(Path(local_repo) / "confusion_matrix.png")
-model_card.add_plot(
- **{"Model description/Evaluation Results/Confusion Matrix": "confusion_matrix.png"}
-)
-
-importances = permutation_importance(model, X_test, y_test, n_repeats=10)
-model_card.add_permutation_importances(
- importances,
- X_test.columns,
- plot_file="importance.png",
- plot_name="Permutation Importance",
-)
-
-cv_results = model.cv_results_
-clf_report = classification_report(
- y_test, y_pred, output_dict=True, target_names=["malignant", "benign"]
-)
-# The classification report has to be transformed into a DataFrame first to have
-# the correct format. This requires removing the "accuracy", which was added
-# above anyway.
-del clf_report["accuracy"]
-clf_report = pd.DataFrame(clf_report).T.reset_index()
-model_card.add_table(
- folded=True,
- **{
- "Model description/Evaluation Results/Hyperparameter search results": (
- cv_results
- ),
- "Model description/Evaluation Results/Classification report": clf_report,
- },
-)
-
-# %%
-# Save model card
-# ===============
-# We can simply save our model card by providing a path to :meth:`.Card.save`.
-
-model_card.save(Path(local_repo) / "README.md")
diff --git a/examples/plot_tabular_regression.py b/examples/plot_tabular_regression.py
index 762939b3..5ca2e8ce 100644
--- a/examples/plot_tabular_regression.py
+++ b/examples/plot_tabular_regression.py
@@ -16,7 +16,6 @@
from tempfile import mkdtemp, mkstemp
import matplotlib.pyplot as plt
-import sklearn
from sklearn.datasets import load_diabetes
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score
@@ -25,7 +24,7 @@
from sklearn.preprocessing import StandardScaler
import skops.io as sio
-from skops import card, hub_utils
+from skops import card
# %%
# Data
@@ -70,27 +69,13 @@
local_repo = mkdtemp(prefix="skops-")
-hub_utils.init(
- model=pkl_name,
- requirements=[f"scikit-learn={sklearn.__version__}"],
- dst=local_repo,
- task="tabular-regression",
- data=X_test,
-)
-
-if "__file__" in locals(): # __file__ not defined during docs built
- # Add this script itself to the files to be uploaded for reproducibility
- hub_utils.add_files(__file__, dst=local_repo)
-
# %%
# Create a model card
# ===================
-# We now create a model card, and populate its metadata with information which
-# is already provided in ``config.json``, which itself is created by the call to
-# :func:`.hub_utils.init` above. We will see below how we can populate the model
+# We now create a model card. We will see below how we can populate the model
# card with useful information.
-model_card = card.Card(model, metadata=card.metadata_from_config(Path(local_repo)))
+model_card = card.Card(model)
# %%
# Add more information
@@ -99,7 +84,6 @@
# we add more information about the model, like a description and what its
# license is.
-model_card.metadata.license = "mit"
limitations = (
"This model is made for educational purposes and is not ready to be used in"
" production."
diff --git a/examples/plot_text_classification.py b/examples/plot_text_classification.py
index e8ca8dc3..e5c6fdb3 100644
--- a/examples/plot_text_classification.py
+++ b/examples/plot_text_classification.py
@@ -17,7 +17,6 @@
from tempfile import mkdtemp, mkstemp
import pandas as pd
-import sklearn
from sklearn.datasets import fetch_20newsgroups
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.metrics import (
@@ -31,7 +30,7 @@
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import Pipeline
-from skops import card, hub_utils
+from skops import card
# %%
# Data
@@ -87,23 +86,13 @@
local_repo = mkdtemp(prefix="skops-")
-hub_utils.init(
- model=pkl_name,
- requirements=[f"scikit-learn={sklearn.__version__}"],
- dst=local_repo,
- task="text-classification",
- data=X_test,
-)
-
# %%
# Create a model card
# ===================
-# We now create a model card, and populate its metadata with information which
-# is already provided in ``config.json``, which itself is created by the call to
-# :func:`.hub_utils.init` above. We will see below how we can populate the model
+# We now create a model card. We will see below how we can populate the model
# card with useful information.
-model_card = card.Card(model, metadata=card.metadata_from_config(Path(local_repo)))
+model_card = card.Card(model)
# %%
# Add more information
@@ -112,7 +101,6 @@
# we add more information about the model, like a description and what its
# license is.
-model_card.metadata.license = "mit"
limitations = "This model is not ready to be used in production."
model_description = (
"This is a Multinomial Naive Bayes model trained on 20 news groups dataset."
diff --git a/pixi.lock b/pixi.lock
index e7a9893d..ef954d09 100644
--- a/pixi.lock
+++ b/pixi.lock
@@ -219,14 +219,7 @@ environments:
- conda: https://conda.anaconda.org/conda-forge/noarch/zipp-3.21.0-pyhd8ed1ab_1.conda
- conda: https://conda.anaconda.org/conda-forge/linux-64/zlib-1.3.1-hb9d3cd8_2.conda
- conda: https://conda.anaconda.org/conda-forge/linux-64/zstd-1.5.6-ha6fb4c9_0.conda
- - pypi: https://files.pythonhosted.org/packages/28/89/60f51ad71f63aaaa7e51a2a2ad37919985a341a1d267070f212cdf6c2d22/charset_normalizer-3.4.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
- - pypi: https://files.pythonhosted.org/packages/c6/b2/454d6e7f0158951d8a78c2e1eb4f69ae81beb8dca5fee9809c6c99e9d0d0/fsspec-2024.10.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/95/9b/3068fb3ae0b498eb66960ca5f4d92a81c91458cacd4dc17bfa6d40ce90fb/huggingface_hub-0.26.3-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/f9/9b/335f9764261e915ed497fcdeb11df5dfd6f7bf257d4a6a2a686d80da4d54/requests-2.32.3-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/40/44/4a5f08c96eb108af5cb50b41f76142f0afa346dfa99d5296fe7202a11854/tabulate-0.9.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/d0/30/dc54f88dd4a2b5dc8a0279bdd7270e735851848b762aeb1c1184ed1f6b14/tqdm-4.67.1-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/ce/d9/5f4c13cecde62396b0d3fe530a50ccea91e7dfc1ccf0e09c228841bb5ba8/urllib3-2.2.3-py3-none-any.whl
- pypi: .
osx-64:
- conda: https://conda.anaconda.org/conda-forge/osx-64/_py-xgboost-mutex-2.0-cpu_0.tar.bz2
@@ -357,14 +350,7 @@ environments:
- conda: https://conda.anaconda.org/conda-forge/noarch/zipp-3.21.0-pyhd8ed1ab_1.conda
- conda: https://conda.anaconda.org/conda-forge/osx-64/zlib-1.3.1-hd23fc13_2.conda
- conda: https://conda.anaconda.org/conda-forge/osx-64/zstd-1.5.6-h915ae27_0.conda
- - pypi: https://files.pythonhosted.org/packages/d1/18/92869d5c0057baa973a3ee2af71573be7b084b3c3d428fe6463ce71167f8/charset_normalizer-3.4.0-cp39-cp39-macosx_10_9_x86_64.whl
- - pypi: https://files.pythonhosted.org/packages/c6/b2/454d6e7f0158951d8a78c2e1eb4f69ae81beb8dca5fee9809c6c99e9d0d0/fsspec-2024.10.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/95/9b/3068fb3ae0b498eb66960ca5f4d92a81c91458cacd4dc17bfa6d40ce90fb/huggingface_hub-0.26.3-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/f9/9b/335f9764261e915ed497fcdeb11df5dfd6f7bf257d4a6a2a686d80da4d54/requests-2.32.3-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/40/44/4a5f08c96eb108af5cb50b41f76142f0afa346dfa99d5296fe7202a11854/tabulate-0.9.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/d0/30/dc54f88dd4a2b5dc8a0279bdd7270e735851848b762aeb1c1184ed1f6b14/tqdm-4.67.1-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/ce/d9/5f4c13cecde62396b0d3fe530a50ccea91e7dfc1ccf0e09c228841bb5ba8/urllib3-2.2.3-py3-none-any.whl
- pypi: .
osx-arm64:
- conda: https://conda.anaconda.org/conda-forge/osx-arm64/_py-xgboost-mutex-2.0-cpu_0.tar.bz2
@@ -495,14 +481,7 @@ environments:
- conda: https://conda.anaconda.org/conda-forge/noarch/zipp-3.21.0-pyhd8ed1ab_1.conda
- conda: https://conda.anaconda.org/conda-forge/osx-arm64/zlib-1.3.1-h8359307_2.conda
- conda: https://conda.anaconda.org/conda-forge/osx-arm64/zstd-1.5.6-hb46c0d2_0.conda
- - pypi: https://files.pythonhosted.org/packages/d6/27/327904c5a54a7796bb9f36810ec4173d2df5d88b401d2b95ef53111d214e/charset_normalizer-3.4.0-cp39-cp39-macosx_11_0_arm64.whl
- - pypi: https://files.pythonhosted.org/packages/c6/b2/454d6e7f0158951d8a78c2e1eb4f69ae81beb8dca5fee9809c6c99e9d0d0/fsspec-2024.10.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/95/9b/3068fb3ae0b498eb66960ca5f4d92a81c91458cacd4dc17bfa6d40ce90fb/huggingface_hub-0.26.3-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/f9/9b/335f9764261e915ed497fcdeb11df5dfd6f7bf257d4a6a2a686d80da4d54/requests-2.32.3-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/40/44/4a5f08c96eb108af5cb50b41f76142f0afa346dfa99d5296fe7202a11854/tabulate-0.9.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/d0/30/dc54f88dd4a2b5dc8a0279bdd7270e735851848b762aeb1c1184ed1f6b14/tqdm-4.67.1-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/ce/d9/5f4c13cecde62396b0d3fe530a50ccea91e7dfc1ccf0e09c228841bb5ba8/urllib3-2.2.3-py3-none-any.whl
- pypi: .
win-64:
- conda: https://conda.anaconda.org/conda-forge/win-64/_openmp_mutex-4.5-2_gnu.conda
@@ -657,14 +636,7 @@ environments:
- conda: https://conda.anaconda.org/conda-forge/noarch/zipp-3.21.0-pyhd8ed1ab_1.conda
- conda: https://conda.anaconda.org/conda-forge/win-64/zlib-1.3.1-h2466b09_2.conda
- conda: https://conda.anaconda.org/conda-forge/win-64/zstd-1.5.6-h0ea2cb4_0.conda
- - pypi: https://files.pythonhosted.org/packages/c5/77/3a78bf28bfaa0863f9cfef278dbeadf55efe064eafff8c7c424ae3c4c1bf/charset_normalizer-3.4.0-cp39-cp39-win_amd64.whl
- - pypi: https://files.pythonhosted.org/packages/c6/b2/454d6e7f0158951d8a78c2e1eb4f69ae81beb8dca5fee9809c6c99e9d0d0/fsspec-2024.10.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/95/9b/3068fb3ae0b498eb66960ca5f4d92a81c91458cacd4dc17bfa6d40ce90fb/huggingface_hub-0.26.3-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/f9/9b/335f9764261e915ed497fcdeb11df5dfd6f7bf257d4a6a2a686d80da4d54/requests-2.32.3-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/40/44/4a5f08c96eb108af5cb50b41f76142f0afa346dfa99d5296fe7202a11854/tabulate-0.9.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/d0/30/dc54f88dd4a2b5dc8a0279bdd7270e735851848b762aeb1c1184ed1f6b14/tqdm-4.67.1-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/ce/d9/5f4c13cecde62396b0d3fe530a50ccea91e7dfc1ccf0e09c228841bb5ba8/urllib3-2.2.3-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/70/58/2f94976df39470fb00eec2cb4f914dde44cd0df8d96483208bf7db4bc97e/xgboost-2.1.3-py3-none-win_amd64.whl
- pypi: .
ci-sklearn11:
@@ -873,15 +845,9 @@ environments:
- conda: https://conda.anaconda.org/conda-forge/linux-64/zlib-1.3.1-hb9d3cd8_2.conda
- conda: https://conda.anaconda.org/conda-forge/linux-64/zstandard-0.23.0-py39h08a7858_1.conda
- conda: https://conda.anaconda.org/conda-forge/linux-64/zstd-1.5.6-ha6fb4c9_0.conda
- - pypi: https://files.pythonhosted.org/packages/28/89/60f51ad71f63aaaa7e51a2a2ad37919985a341a1d267070f212cdf6c2d22/charset_normalizer-3.4.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
- - pypi: https://files.pythonhosted.org/packages/c6/b2/454d6e7f0158951d8a78c2e1eb4f69ae81beb8dca5fee9809c6c99e9d0d0/fsspec-2024.10.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/95/9b/3068fb3ae0b498eb66960ca5f4d92a81c91458cacd4dc17bfa6d40ce90fb/huggingface_hub-0.26.3-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/4e/19/1b928cad70a4e1a3e2c37d5417ca2182510f2451eaadb6c91cd9ec692cae/lightgbm-4.5.0-py3-none-manylinux_2_28_x86_64.whl
- pypi: https://files.pythonhosted.org/packages/ed/1f/6482380ec8dcec4894e7503490fc536d846b0d59694acad9cf99f27d0e7d/nvidia_nccl_cu12-2.23.4-py3-none-manylinux2014_x86_64.whl
- - pypi: https://files.pythonhosted.org/packages/f9/9b/335f9764261e915ed497fcdeb11df5dfd6f7bf257d4a6a2a686d80da4d54/requests-2.32.3-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/40/44/4a5f08c96eb108af5cb50b41f76142f0afa346dfa99d5296fe7202a11854/tabulate-0.9.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/d0/30/dc54f88dd4a2b5dc8a0279bdd7270e735851848b762aeb1c1184ed1f6b14/tqdm-4.67.1-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/32/93/66826e2f50cefecbb0a44bd1e667316bf0a3c8e78cd1f0cdf52f5b2c5c6f/xgboost-2.1.3-py3-none-manylinux_2_28_x86_64.whl
- pypi: .
osx-64:
@@ -1025,14 +991,8 @@ environments:
- conda: https://conda.anaconda.org/conda-forge/osx-64/zlib-1.3.1-hd23fc13_2.conda
- conda: https://conda.anaconda.org/conda-forge/osx-64/zstandard-0.23.0-py39hc23f734_1.conda
- conda: https://conda.anaconda.org/conda-forge/osx-64/zstd-1.5.6-h915ae27_0.conda
- - pypi: https://files.pythonhosted.org/packages/d1/18/92869d5c0057baa973a3ee2af71573be7b084b3c3d428fe6463ce71167f8/charset_normalizer-3.4.0-cp39-cp39-macosx_10_9_x86_64.whl
- - pypi: https://files.pythonhosted.org/packages/c6/b2/454d6e7f0158951d8a78c2e1eb4f69ae81beb8dca5fee9809c6c99e9d0d0/fsspec-2024.10.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/95/9b/3068fb3ae0b498eb66960ca5f4d92a81c91458cacd4dc17bfa6d40ce90fb/huggingface_hub-0.26.3-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/1b/d2/46520b6e255298e920df26ff6e5e4fc788c927886e1e30a96b27c2f94924/lightgbm-4.5.0-py3-none-macosx_10_15_x86_64.whl
- - pypi: https://files.pythonhosted.org/packages/f9/9b/335f9764261e915ed497fcdeb11df5dfd6f7bf257d4a6a2a686d80da4d54/requests-2.32.3-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/40/44/4a5f08c96eb108af5cb50b41f76142f0afa346dfa99d5296fe7202a11854/tabulate-0.9.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/d0/30/dc54f88dd4a2b5dc8a0279bdd7270e735851848b762aeb1c1184ed1f6b14/tqdm-4.67.1-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/cd/c6/773ebd84414879bd0566788868ae46a6574f6efaf81e694f01ea1fed3277/xgboost-2.1.3-py3-none-macosx_10_15_x86_64.macosx_11_0_x86_64.macosx_12_0_x86_64.whl
- pypi: .
osx-arm64:
@@ -1176,14 +1136,8 @@ environments:
- conda: https://conda.anaconda.org/conda-forge/osx-arm64/zlib-1.3.1-h8359307_2.conda
- conda: https://conda.anaconda.org/conda-forge/osx-arm64/zstandard-0.23.0-py39hcf1bb16_1.conda
- conda: https://conda.anaconda.org/conda-forge/osx-arm64/zstd-1.5.6-hb46c0d2_0.conda
- - pypi: https://files.pythonhosted.org/packages/d6/27/327904c5a54a7796bb9f36810ec4173d2df5d88b401d2b95ef53111d214e/charset_normalizer-3.4.0-cp39-cp39-macosx_11_0_arm64.whl
- - pypi: https://files.pythonhosted.org/packages/c6/b2/454d6e7f0158951d8a78c2e1eb4f69ae81beb8dca5fee9809c6c99e9d0d0/fsspec-2024.10.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/95/9b/3068fb3ae0b498eb66960ca5f4d92a81c91458cacd4dc17bfa6d40ce90fb/huggingface_hub-0.26.3-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/11/3f/49913ed111286e23bcc40daab54542d80924264dca8ae371514039ab83ab/lightgbm-4.5.0-py3-none-macosx_12_0_arm64.whl
- - pypi: https://files.pythonhosted.org/packages/f9/9b/335f9764261e915ed497fcdeb11df5dfd6f7bf257d4a6a2a686d80da4d54/requests-2.32.3-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/40/44/4a5f08c96eb108af5cb50b41f76142f0afa346dfa99d5296fe7202a11854/tabulate-0.9.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/d0/30/dc54f88dd4a2b5dc8a0279bdd7270e735851848b762aeb1c1184ed1f6b14/tqdm-4.67.1-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/28/3c/ddf5d9eb742cdb7fbcd5c854bce07471bad01194ac37de91db64fbef0c58/xgboost-2.1.3-py3-none-macosx_12_0_arm64.whl
- pypi: .
win-64:
@@ -1343,14 +1297,8 @@ environments:
- conda: https://conda.anaconda.org/conda-forge/win-64/zlib-1.3.1-h2466b09_2.conda
- conda: https://conda.anaconda.org/conda-forge/win-64/zstandard-0.23.0-py39h9bf74da_1.conda
- conda: https://conda.anaconda.org/conda-forge/win-64/zstd-1.5.6-h0ea2cb4_0.conda
- - pypi: https://files.pythonhosted.org/packages/c5/77/3a78bf28bfaa0863f9cfef278dbeadf55efe064eafff8c7c424ae3c4c1bf/charset_normalizer-3.4.0-cp39-cp39-win_amd64.whl
- - pypi: https://files.pythonhosted.org/packages/c6/b2/454d6e7f0158951d8a78c2e1eb4f69ae81beb8dca5fee9809c6c99e9d0d0/fsspec-2024.10.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/95/9b/3068fb3ae0b498eb66960ca5f4d92a81c91458cacd4dc17bfa6d40ce90fb/huggingface_hub-0.26.3-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/d9/28/3be76b591a2e14a031b681b8283acf1dec2ad521f6f1701b7957df68c466/lightgbm-4.5.0-py3-none-win_amd64.whl
- - pypi: https://files.pythonhosted.org/packages/f9/9b/335f9764261e915ed497fcdeb11df5dfd6f7bf257d4a6a2a686d80da4d54/requests-2.32.3-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/40/44/4a5f08c96eb108af5cb50b41f76142f0afa346dfa99d5296fe7202a11854/tabulate-0.9.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/d0/30/dc54f88dd4a2b5dc8a0279bdd7270e735851848b762aeb1c1184ed1f6b14/tqdm-4.67.1-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/70/58/2f94976df39470fb00eec2cb4f914dde44cd0df8d96483208bf7db4bc97e/xgboost-2.1.3-py3-none-win_amd64.whl
- pypi: .
ci-sklearn12:
@@ -1555,15 +1503,9 @@ environments:
- conda: https://conda.anaconda.org/conda-forge/linux-64/zlib-1.3.1-hb9d3cd8_2.conda
- conda: https://conda.anaconda.org/conda-forge/linux-64/zstandard-0.23.0-py310ha39cb0e_1.conda
- conda: https://conda.anaconda.org/conda-forge/linux-64/zstd-1.5.6-ha6fb4c9_0.conda
- - pypi: https://files.pythonhosted.org/packages/f8/01/344ec40cf5d85c1da3c1f57566c59e0c9b56bcc5566c08804a95a6cc8257/charset_normalizer-3.4.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
- - pypi: https://files.pythonhosted.org/packages/c6/b2/454d6e7f0158951d8a78c2e1eb4f69ae81beb8dca5fee9809c6c99e9d0d0/fsspec-2024.10.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/95/9b/3068fb3ae0b498eb66960ca5f4d92a81c91458cacd4dc17bfa6d40ce90fb/huggingface_hub-0.26.3-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/4e/19/1b928cad70a4e1a3e2c37d5417ca2182510f2451eaadb6c91cd9ec692cae/lightgbm-4.5.0-py3-none-manylinux_2_28_x86_64.whl
- pypi: https://files.pythonhosted.org/packages/ed/1f/6482380ec8dcec4894e7503490fc536d846b0d59694acad9cf99f27d0e7d/nvidia_nccl_cu12-2.23.4-py3-none-manylinux2014_x86_64.whl
- - pypi: https://files.pythonhosted.org/packages/f9/9b/335f9764261e915ed497fcdeb11df5dfd6f7bf257d4a6a2a686d80da4d54/requests-2.32.3-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/40/44/4a5f08c96eb108af5cb50b41f76142f0afa346dfa99d5296fe7202a11854/tabulate-0.9.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/d0/30/dc54f88dd4a2b5dc8a0279bdd7270e735851848b762aeb1c1184ed1f6b14/tqdm-4.67.1-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/32/93/66826e2f50cefecbb0a44bd1e667316bf0a3c8e78cd1f0cdf52f5b2c5c6f/xgboost-2.1.3-py3-none-manylinux_2_28_x86_64.whl
- pypi: .
osx-64:
@@ -1704,14 +1646,8 @@ environments:
- conda: https://conda.anaconda.org/conda-forge/osx-64/zlib-1.3.1-hd23fc13_2.conda
- conda: https://conda.anaconda.org/conda-forge/osx-64/zstandard-0.23.0-py310h41d873f_1.conda
- conda: https://conda.anaconda.org/conda-forge/osx-64/zstd-1.5.6-h915ae27_0.conda
- - pypi: https://files.pythonhosted.org/packages/23/81/d7eef6a99e42c77f444fdd7bc894b0ceca6c3a95c51239e74a722039521c/charset_normalizer-3.4.0-cp310-cp310-macosx_10_9_x86_64.whl
- - pypi: https://files.pythonhosted.org/packages/c6/b2/454d6e7f0158951d8a78c2e1eb4f69ae81beb8dca5fee9809c6c99e9d0d0/fsspec-2024.10.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/95/9b/3068fb3ae0b498eb66960ca5f4d92a81c91458cacd4dc17bfa6d40ce90fb/huggingface_hub-0.26.3-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/1b/d2/46520b6e255298e920df26ff6e5e4fc788c927886e1e30a96b27c2f94924/lightgbm-4.5.0-py3-none-macosx_10_15_x86_64.whl
- - pypi: https://files.pythonhosted.org/packages/f9/9b/335f9764261e915ed497fcdeb11df5dfd6f7bf257d4a6a2a686d80da4d54/requests-2.32.3-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/40/44/4a5f08c96eb108af5cb50b41f76142f0afa346dfa99d5296fe7202a11854/tabulate-0.9.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/d0/30/dc54f88dd4a2b5dc8a0279bdd7270e735851848b762aeb1c1184ed1f6b14/tqdm-4.67.1-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/cd/c6/773ebd84414879bd0566788868ae46a6574f6efaf81e694f01ea1fed3277/xgboost-2.1.3-py3-none-macosx_10_15_x86_64.macosx_11_0_x86_64.macosx_12_0_x86_64.whl
- pypi: .
osx-arm64:
@@ -1852,14 +1788,8 @@ environments:
- conda: https://conda.anaconda.org/conda-forge/osx-arm64/zlib-1.3.1-h8359307_2.conda
- conda: https://conda.anaconda.org/conda-forge/osx-arm64/zstandard-0.23.0-py310h2665a74_1.conda
- conda: https://conda.anaconda.org/conda-forge/osx-arm64/zstd-1.5.6-hb46c0d2_0.conda
- - pypi: https://files.pythonhosted.org/packages/21/67/b4564d81f48042f520c948abac7079356e94b30cb8ffb22e747532cf469d/charset_normalizer-3.4.0-cp310-cp310-macosx_11_0_arm64.whl
- - pypi: https://files.pythonhosted.org/packages/c6/b2/454d6e7f0158951d8a78c2e1eb4f69ae81beb8dca5fee9809c6c99e9d0d0/fsspec-2024.10.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/95/9b/3068fb3ae0b498eb66960ca5f4d92a81c91458cacd4dc17bfa6d40ce90fb/huggingface_hub-0.26.3-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/11/3f/49913ed111286e23bcc40daab54542d80924264dca8ae371514039ab83ab/lightgbm-4.5.0-py3-none-macosx_12_0_arm64.whl
- - pypi: https://files.pythonhosted.org/packages/f9/9b/335f9764261e915ed497fcdeb11df5dfd6f7bf257d4a6a2a686d80da4d54/requests-2.32.3-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/40/44/4a5f08c96eb108af5cb50b41f76142f0afa346dfa99d5296fe7202a11854/tabulate-0.9.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/d0/30/dc54f88dd4a2b5dc8a0279bdd7270e735851848b762aeb1c1184ed1f6b14/tqdm-4.67.1-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/28/3c/ddf5d9eb742cdb7fbcd5c854bce07471bad01194ac37de91db64fbef0c58/xgboost-2.1.3-py3-none-macosx_12_0_arm64.whl
- pypi: .
win-64:
@@ -2016,14 +1946,8 @@ environments:
- conda: https://conda.anaconda.org/conda-forge/win-64/zlib-1.3.1-h2466b09_2.conda
- conda: https://conda.anaconda.org/conda-forge/win-64/zstandard-0.23.0-py310he5e10e1_1.conda
- conda: https://conda.anaconda.org/conda-forge/win-64/zstd-1.5.6-h0ea2cb4_0.conda
- - pypi: https://files.pythonhosted.org/packages/d6/20/f1d4670a8a723c46be695dff449d86d6092916f9e99c53051954ee33a1bc/charset_normalizer-3.4.0-cp310-cp310-win_amd64.whl
- - pypi: https://files.pythonhosted.org/packages/c6/b2/454d6e7f0158951d8a78c2e1eb4f69ae81beb8dca5fee9809c6c99e9d0d0/fsspec-2024.10.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/95/9b/3068fb3ae0b498eb66960ca5f4d92a81c91458cacd4dc17bfa6d40ce90fb/huggingface_hub-0.26.3-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/d9/28/3be76b591a2e14a031b681b8283acf1dec2ad521f6f1701b7957df68c466/lightgbm-4.5.0-py3-none-win_amd64.whl
- - pypi: https://files.pythonhosted.org/packages/f9/9b/335f9764261e915ed497fcdeb11df5dfd6f7bf257d4a6a2a686d80da4d54/requests-2.32.3-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/40/44/4a5f08c96eb108af5cb50b41f76142f0afa346dfa99d5296fe7202a11854/tabulate-0.9.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/d0/30/dc54f88dd4a2b5dc8a0279bdd7270e735851848b762aeb1c1184ed1f6b14/tqdm-4.67.1-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/70/58/2f94976df39470fb00eec2cb4f914dde44cd0df8d96483208bf7db4bc97e/xgboost-2.1.3-py3-none-win_amd64.whl
- pypi: .
ci-sklearn13:
@@ -2229,15 +2153,9 @@ environments:
- conda: https://conda.anaconda.org/conda-forge/linux-64/zlib-1.3.1-hb9d3cd8_2.conda
- conda: https://conda.anaconda.org/conda-forge/linux-64/zstandard-0.23.0-py311hbc35293_1.conda
- conda: https://conda.anaconda.org/conda-forge/linux-64/zstd-1.5.6-ha6fb4c9_0.conda
- - pypi: https://files.pythonhosted.org/packages/eb/5b/6f10bad0f6461fa272bfbbdf5d0023b5fb9bc6217c92bf068fa5a99820f5/charset_normalizer-3.4.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
- - pypi: https://files.pythonhosted.org/packages/c6/b2/454d6e7f0158951d8a78c2e1eb4f69ae81beb8dca5fee9809c6c99e9d0d0/fsspec-2024.10.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/95/9b/3068fb3ae0b498eb66960ca5f4d92a81c91458cacd4dc17bfa6d40ce90fb/huggingface_hub-0.26.3-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/4e/19/1b928cad70a4e1a3e2c37d5417ca2182510f2451eaadb6c91cd9ec692cae/lightgbm-4.5.0-py3-none-manylinux_2_28_x86_64.whl
- pypi: https://files.pythonhosted.org/packages/ed/1f/6482380ec8dcec4894e7503490fc536d846b0d59694acad9cf99f27d0e7d/nvidia_nccl_cu12-2.23.4-py3-none-manylinux2014_x86_64.whl
- - pypi: https://files.pythonhosted.org/packages/f9/9b/335f9764261e915ed497fcdeb11df5dfd6f7bf257d4a6a2a686d80da4d54/requests-2.32.3-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/40/44/4a5f08c96eb108af5cb50b41f76142f0afa346dfa99d5296fe7202a11854/tabulate-0.9.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/d0/30/dc54f88dd4a2b5dc8a0279bdd7270e735851848b762aeb1c1184ed1f6b14/tqdm-4.67.1-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/32/93/66826e2f50cefecbb0a44bd1e667316bf0a3c8e78cd1f0cdf52f5b2c5c6f/xgboost-2.1.3-py3-none-manylinux_2_28_x86_64.whl
- pypi: .
osx-64:
@@ -2379,14 +2297,8 @@ environments:
- conda: https://conda.anaconda.org/conda-forge/osx-64/zlib-1.3.1-hd23fc13_2.conda
- conda: https://conda.anaconda.org/conda-forge/osx-64/zstandard-0.23.0-py311hdf6fcd6_1.conda
- conda: https://conda.anaconda.org/conda-forge/osx-64/zstd-1.5.6-h915ae27_0.conda
- - pypi: https://files.pythonhosted.org/packages/77/d5/8c982d58144de49f59571f940e329ad6e8615e1e82ef84584c5eeb5e1d72/charset_normalizer-3.4.0-cp311-cp311-macosx_10_9_x86_64.whl
- - pypi: https://files.pythonhosted.org/packages/c6/b2/454d6e7f0158951d8a78c2e1eb4f69ae81beb8dca5fee9809c6c99e9d0d0/fsspec-2024.10.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/95/9b/3068fb3ae0b498eb66960ca5f4d92a81c91458cacd4dc17bfa6d40ce90fb/huggingface_hub-0.26.3-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/1b/d2/46520b6e255298e920df26ff6e5e4fc788c927886e1e30a96b27c2f94924/lightgbm-4.5.0-py3-none-macosx_10_15_x86_64.whl
- - pypi: https://files.pythonhosted.org/packages/f9/9b/335f9764261e915ed497fcdeb11df5dfd6f7bf257d4a6a2a686d80da4d54/requests-2.32.3-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/40/44/4a5f08c96eb108af5cb50b41f76142f0afa346dfa99d5296fe7202a11854/tabulate-0.9.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/d0/30/dc54f88dd4a2b5dc8a0279bdd7270e735851848b762aeb1c1184ed1f6b14/tqdm-4.67.1-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/cd/c6/773ebd84414879bd0566788868ae46a6574f6efaf81e694f01ea1fed3277/xgboost-2.1.3-py3-none-macosx_10_15_x86_64.macosx_11_0_x86_64.macosx_12_0_x86_64.whl
- pypi: .
osx-arm64:
@@ -2528,14 +2440,8 @@ environments:
- conda: https://conda.anaconda.org/conda-forge/osx-arm64/zlib-1.3.1-h8359307_2.conda
- conda: https://conda.anaconda.org/conda-forge/osx-arm64/zstandard-0.23.0-py311ha60cc69_1.conda
- conda: https://conda.anaconda.org/conda-forge/osx-arm64/zstd-1.5.6-hb46c0d2_0.conda
- - pypi: https://files.pythonhosted.org/packages/bf/19/411a64f01ee971bed3231111b69eb56f9331a769072de479eae7de52296d/charset_normalizer-3.4.0-cp311-cp311-macosx_11_0_arm64.whl
- - pypi: https://files.pythonhosted.org/packages/c6/b2/454d6e7f0158951d8a78c2e1eb4f69ae81beb8dca5fee9809c6c99e9d0d0/fsspec-2024.10.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/95/9b/3068fb3ae0b498eb66960ca5f4d92a81c91458cacd4dc17bfa6d40ce90fb/huggingface_hub-0.26.3-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/11/3f/49913ed111286e23bcc40daab54542d80924264dca8ae371514039ab83ab/lightgbm-4.5.0-py3-none-macosx_12_0_arm64.whl
- - pypi: https://files.pythonhosted.org/packages/f9/9b/335f9764261e915ed497fcdeb11df5dfd6f7bf257d4a6a2a686d80da4d54/requests-2.32.3-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/40/44/4a5f08c96eb108af5cb50b41f76142f0afa346dfa99d5296fe7202a11854/tabulate-0.9.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/d0/30/dc54f88dd4a2b5dc8a0279bdd7270e735851848b762aeb1c1184ed1f6b14/tqdm-4.67.1-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/28/3c/ddf5d9eb742cdb7fbcd5c854bce07471bad01194ac37de91db64fbef0c58/xgboost-2.1.3-py3-none-macosx_12_0_arm64.whl
- pypi: .
win-64:
@@ -2693,14 +2599,8 @@ environments:
- conda: https://conda.anaconda.org/conda-forge/win-64/zlib-1.3.1-h2466b09_2.conda
- conda: https://conda.anaconda.org/conda-forge/win-64/zstandard-0.23.0-py311h53056dc_1.conda
- conda: https://conda.anaconda.org/conda-forge/win-64/zstd-1.5.6-h0ea2cb4_0.conda
- - pypi: https://files.pythonhosted.org/packages/0b/6e/b13bd47fa9023b3699e94abf565b5a2f0b0be6e9ddac9812182596ee62e4/charset_normalizer-3.4.0-cp311-cp311-win_amd64.whl
- - pypi: https://files.pythonhosted.org/packages/c6/b2/454d6e7f0158951d8a78c2e1eb4f69ae81beb8dca5fee9809c6c99e9d0d0/fsspec-2024.10.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/95/9b/3068fb3ae0b498eb66960ca5f4d92a81c91458cacd4dc17bfa6d40ce90fb/huggingface_hub-0.26.3-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/d9/28/3be76b591a2e14a031b681b8283acf1dec2ad521f6f1701b7957df68c466/lightgbm-4.5.0-py3-none-win_amd64.whl
- - pypi: https://files.pythonhosted.org/packages/f9/9b/335f9764261e915ed497fcdeb11df5dfd6f7bf257d4a6a2a686d80da4d54/requests-2.32.3-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/40/44/4a5f08c96eb108af5cb50b41f76142f0afa346dfa99d5296fe7202a11854/tabulate-0.9.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/d0/30/dc54f88dd4a2b5dc8a0279bdd7270e735851848b762aeb1c1184ed1f6b14/tqdm-4.67.1-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/70/58/2f94976df39470fb00eec2cb4f914dde44cd0df8d96483208bf7db4bc97e/xgboost-2.1.3-py3-none-win_amd64.whl
- pypi: .
ci-sklearn14:
@@ -2906,15 +2806,9 @@ environments:
- conda: https://conda.anaconda.org/conda-forge/linux-64/zlib-1.3.1-hb9d3cd8_2.conda
- conda: https://conda.anaconda.org/conda-forge/linux-64/zstandard-0.23.0-py312hef9b889_1.conda
- conda: https://conda.anaconda.org/conda-forge/linux-64/zstd-1.5.6-ha6fb4c9_0.conda
- - pypi: https://files.pythonhosted.org/packages/16/92/92a76dc2ff3a12e69ba94e7e05168d37d0345fa08c87e1fe24d0c2a42223/charset_normalizer-3.4.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
- - pypi: https://files.pythonhosted.org/packages/c6/b2/454d6e7f0158951d8a78c2e1eb4f69ae81beb8dca5fee9809c6c99e9d0d0/fsspec-2024.10.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/95/9b/3068fb3ae0b498eb66960ca5f4d92a81c91458cacd4dc17bfa6d40ce90fb/huggingface_hub-0.26.3-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/4e/19/1b928cad70a4e1a3e2c37d5417ca2182510f2451eaadb6c91cd9ec692cae/lightgbm-4.5.0-py3-none-manylinux_2_28_x86_64.whl
- pypi: https://files.pythonhosted.org/packages/ed/1f/6482380ec8dcec4894e7503490fc536d846b0d59694acad9cf99f27d0e7d/nvidia_nccl_cu12-2.23.4-py3-none-manylinux2014_x86_64.whl
- - pypi: https://files.pythonhosted.org/packages/f9/9b/335f9764261e915ed497fcdeb11df5dfd6f7bf257d4a6a2a686d80da4d54/requests-2.32.3-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/40/44/4a5f08c96eb108af5cb50b41f76142f0afa346dfa99d5296fe7202a11854/tabulate-0.9.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/d0/30/dc54f88dd4a2b5dc8a0279bdd7270e735851848b762aeb1c1184ed1f6b14/tqdm-4.67.1-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/32/93/66826e2f50cefecbb0a44bd1e667316bf0a3c8e78cd1f0cdf52f5b2c5c6f/xgboost-2.1.3-py3-none-manylinux_2_28_x86_64.whl
- pypi: .
osx-64:
@@ -3056,14 +2950,8 @@ environments:
- conda: https://conda.anaconda.org/conda-forge/osx-64/zlib-1.3.1-hd23fc13_2.conda
- conda: https://conda.anaconda.org/conda-forge/osx-64/zstandard-0.23.0-py312h7122b0e_1.conda
- conda: https://conda.anaconda.org/conda-forge/osx-64/zstd-1.5.6-h915ae27_0.conda
- - pypi: https://files.pythonhosted.org/packages/50/89/354cc56cf4dd2449715bc9a0f54f3aef3dc700d2d62d1fa5bbea53b13426/charset_normalizer-3.4.0-cp312-cp312-macosx_10_13_x86_64.whl
- - pypi: https://files.pythonhosted.org/packages/c6/b2/454d6e7f0158951d8a78c2e1eb4f69ae81beb8dca5fee9809c6c99e9d0d0/fsspec-2024.10.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/95/9b/3068fb3ae0b498eb66960ca5f4d92a81c91458cacd4dc17bfa6d40ce90fb/huggingface_hub-0.26.3-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/1b/d2/46520b6e255298e920df26ff6e5e4fc788c927886e1e30a96b27c2f94924/lightgbm-4.5.0-py3-none-macosx_10_15_x86_64.whl
- - pypi: https://files.pythonhosted.org/packages/f9/9b/335f9764261e915ed497fcdeb11df5dfd6f7bf257d4a6a2a686d80da4d54/requests-2.32.3-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/40/44/4a5f08c96eb108af5cb50b41f76142f0afa346dfa99d5296fe7202a11854/tabulate-0.9.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/d0/30/dc54f88dd4a2b5dc8a0279bdd7270e735851848b762aeb1c1184ed1f6b14/tqdm-4.67.1-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/cd/c6/773ebd84414879bd0566788868ae46a6574f6efaf81e694f01ea1fed3277/xgboost-2.1.3-py3-none-macosx_10_15_x86_64.macosx_11_0_x86_64.macosx_12_0_x86_64.whl
- pypi: .
osx-arm64:
@@ -3205,14 +3093,8 @@ environments:
- conda: https://conda.anaconda.org/conda-forge/osx-arm64/zlib-1.3.1-h8359307_2.conda
- conda: https://conda.anaconda.org/conda-forge/osx-arm64/zstandard-0.23.0-py312h15fbf35_1.conda
- conda: https://conda.anaconda.org/conda-forge/osx-arm64/zstd-1.5.6-hb46c0d2_0.conda
- - pypi: https://files.pythonhosted.org/packages/fa/44/b730e2a2580110ced837ac083d8ad222343c96bb6b66e9e4e706e4d0b6df/charset_normalizer-3.4.0-cp312-cp312-macosx_11_0_arm64.whl
- - pypi: https://files.pythonhosted.org/packages/c6/b2/454d6e7f0158951d8a78c2e1eb4f69ae81beb8dca5fee9809c6c99e9d0d0/fsspec-2024.10.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/95/9b/3068fb3ae0b498eb66960ca5f4d92a81c91458cacd4dc17bfa6d40ce90fb/huggingface_hub-0.26.3-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/11/3f/49913ed111286e23bcc40daab54542d80924264dca8ae371514039ab83ab/lightgbm-4.5.0-py3-none-macosx_12_0_arm64.whl
- - pypi: https://files.pythonhosted.org/packages/f9/9b/335f9764261e915ed497fcdeb11df5dfd6f7bf257d4a6a2a686d80da4d54/requests-2.32.3-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/40/44/4a5f08c96eb108af5cb50b41f76142f0afa346dfa99d5296fe7202a11854/tabulate-0.9.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/d0/30/dc54f88dd4a2b5dc8a0279bdd7270e735851848b762aeb1c1184ed1f6b14/tqdm-4.67.1-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/28/3c/ddf5d9eb742cdb7fbcd5c854bce07471bad01194ac37de91db64fbef0c58/xgboost-2.1.3-py3-none-macosx_12_0_arm64.whl
- pypi: .
win-64:
@@ -3370,14 +3252,8 @@ environments:
- conda: https://conda.anaconda.org/conda-forge/win-64/zlib-1.3.1-h2466b09_2.conda
- conda: https://conda.anaconda.org/conda-forge/win-64/zstandard-0.23.0-py312h7606c53_1.conda
- conda: https://conda.anaconda.org/conda-forge/win-64/zstd-1.5.6-h0ea2cb4_0.conda
- - pypi: https://files.pythonhosted.org/packages/3e/67/7b72b69d25b89c0b3cea583ee372c43aa24df15f0e0f8d3982c57804984b/charset_normalizer-3.4.0-cp312-cp312-win_amd64.whl
- - pypi: https://files.pythonhosted.org/packages/c6/b2/454d6e7f0158951d8a78c2e1eb4f69ae81beb8dca5fee9809c6c99e9d0d0/fsspec-2024.10.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/95/9b/3068fb3ae0b498eb66960ca5f4d92a81c91458cacd4dc17bfa6d40ce90fb/huggingface_hub-0.26.3-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/d9/28/3be76b591a2e14a031b681b8283acf1dec2ad521f6f1701b7957df68c466/lightgbm-4.5.0-py3-none-win_amd64.whl
- - pypi: https://files.pythonhosted.org/packages/f9/9b/335f9764261e915ed497fcdeb11df5dfd6f7bf257d4a6a2a686d80da4d54/requests-2.32.3-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/40/44/4a5f08c96eb108af5cb50b41f76142f0afa346dfa99d5296fe7202a11854/tabulate-0.9.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/d0/30/dc54f88dd4a2b5dc8a0279bdd7270e735851848b762aeb1c1184ed1f6b14/tqdm-4.67.1-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/70/58/2f94976df39470fb00eec2cb4f914dde44cd0df8d96483208bf7db4bc97e/xgboost-2.1.3-py3-none-win_amd64.whl
- pypi: .
ci-sklearn15:
@@ -3569,15 +3445,9 @@ environments:
- conda: https://conda.anaconda.org/conda-forge/linux-64/zlib-1.3.1-hb9d3cd8_2.conda
- conda: https://conda.anaconda.org/conda-forge/linux-64/zstandard-0.23.0-py313h80202fe_1.conda
- conda: https://conda.anaconda.org/conda-forge/linux-64/zstd-1.5.6-ha6fb4c9_0.conda
- - pypi: https://files.pythonhosted.org/packages/2b/c9/1c8fe3ce05d30c87eff498592c89015b19fade13df42850aafae09e94f35/charset_normalizer-3.4.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
- - pypi: https://files.pythonhosted.org/packages/c6/b2/454d6e7f0158951d8a78c2e1eb4f69ae81beb8dca5fee9809c6c99e9d0d0/fsspec-2024.10.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/95/9b/3068fb3ae0b498eb66960ca5f4d92a81c91458cacd4dc17bfa6d40ce90fb/huggingface_hub-0.26.3-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/4e/19/1b928cad70a4e1a3e2c37d5417ca2182510f2451eaadb6c91cd9ec692cae/lightgbm-4.5.0-py3-none-manylinux_2_28_x86_64.whl
- pypi: https://files.pythonhosted.org/packages/ed/1f/6482380ec8dcec4894e7503490fc536d846b0d59694acad9cf99f27d0e7d/nvidia_nccl_cu12-2.23.4-py3-none-manylinux2014_x86_64.whl
- - pypi: https://files.pythonhosted.org/packages/f9/9b/335f9764261e915ed497fcdeb11df5dfd6f7bf257d4a6a2a686d80da4d54/requests-2.32.3-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/40/44/4a5f08c96eb108af5cb50b41f76142f0afa346dfa99d5296fe7202a11854/tabulate-0.9.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/d0/30/dc54f88dd4a2b5dc8a0279bdd7270e735851848b762aeb1c1184ed1f6b14/tqdm-4.67.1-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/32/93/66826e2f50cefecbb0a44bd1e667316bf0a3c8e78cd1f0cdf52f5b2c5c6f/xgboost-2.1.3-py3-none-manylinux_2_28_x86_64.whl
- pypi: .
osx-64:
@@ -3689,14 +3559,8 @@ environments:
- conda: https://conda.anaconda.org/conda-forge/osx-64/yaml-0.2.5-h0d85af4_2.tar.bz2
- conda: https://conda.anaconda.org/conda-forge/osx-64/zstandard-0.23.0-py313hab0894d_1.conda
- conda: https://conda.anaconda.org/conda-forge/osx-64/zstd-1.5.6-h915ae27_0.conda
- - pypi: https://files.pythonhosted.org/packages/4f/cd/8947fe425e2ab0aa57aceb7807af13a0e4162cd21eee42ef5b053447edf5/charset_normalizer-3.4.0-cp313-cp313-macosx_10_13_x86_64.whl
- - pypi: https://files.pythonhosted.org/packages/c6/b2/454d6e7f0158951d8a78c2e1eb4f69ae81beb8dca5fee9809c6c99e9d0d0/fsspec-2024.10.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/95/9b/3068fb3ae0b498eb66960ca5f4d92a81c91458cacd4dc17bfa6d40ce90fb/huggingface_hub-0.26.3-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/1b/d2/46520b6e255298e920df26ff6e5e4fc788c927886e1e30a96b27c2f94924/lightgbm-4.5.0-py3-none-macosx_10_15_x86_64.whl
- - pypi: https://files.pythonhosted.org/packages/f9/9b/335f9764261e915ed497fcdeb11df5dfd6f7bf257d4a6a2a686d80da4d54/requests-2.32.3-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/40/44/4a5f08c96eb108af5cb50b41f76142f0afa346dfa99d5296fe7202a11854/tabulate-0.9.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/d0/30/dc54f88dd4a2b5dc8a0279bdd7270e735851848b762aeb1c1184ed1f6b14/tqdm-4.67.1-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/cd/c6/773ebd84414879bd0566788868ae46a6574f6efaf81e694f01ea1fed3277/xgboost-2.1.3-py3-none-macosx_10_15_x86_64.macosx_11_0_x86_64.macosx_12_0_x86_64.whl
- pypi: .
osx-arm64:
@@ -3808,14 +3672,8 @@ environments:
- conda: https://conda.anaconda.org/conda-forge/osx-arm64/yaml-0.2.5-h3422bc3_2.tar.bz2
- conda: https://conda.anaconda.org/conda-forge/osx-arm64/zstandard-0.23.0-py313hf2da073_1.conda
- conda: https://conda.anaconda.org/conda-forge/osx-arm64/zstd-1.5.6-hb46c0d2_0.conda
- - pypi: https://files.pythonhosted.org/packages/5b/f0/b5263e8668a4ee9becc2b451ed909e9c27058337fda5b8c49588183c267a/charset_normalizer-3.4.0-cp313-cp313-macosx_11_0_arm64.whl
- - pypi: https://files.pythonhosted.org/packages/c6/b2/454d6e7f0158951d8a78c2e1eb4f69ae81beb8dca5fee9809c6c99e9d0d0/fsspec-2024.10.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/95/9b/3068fb3ae0b498eb66960ca5f4d92a81c91458cacd4dc17bfa6d40ce90fb/huggingface_hub-0.26.3-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/11/3f/49913ed111286e23bcc40daab54542d80924264dca8ae371514039ab83ab/lightgbm-4.5.0-py3-none-macosx_12_0_arm64.whl
- - pypi: https://files.pythonhosted.org/packages/f9/9b/335f9764261e915ed497fcdeb11df5dfd6f7bf257d4a6a2a686d80da4d54/requests-2.32.3-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/40/44/4a5f08c96eb108af5cb50b41f76142f0afa346dfa99d5296fe7202a11854/tabulate-0.9.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/d0/30/dc54f88dd4a2b5dc8a0279bdd7270e735851848b762aeb1c1184ed1f6b14/tqdm-4.67.1-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/28/3c/ddf5d9eb742cdb7fbcd5c854bce07471bad01194ac37de91db64fbef0c58/xgboost-2.1.3-py3-none-macosx_12_0_arm64.whl
- pypi: .
win-64:
@@ -3957,14 +3815,8 @@ environments:
- conda: https://conda.anaconda.org/conda-forge/win-64/zlib-1.3.1-h2466b09_2.conda
- conda: https://conda.anaconda.org/conda-forge/win-64/zstandard-0.23.0-py313h574b89f_1.conda
- conda: https://conda.anaconda.org/conda-forge/win-64/zstd-1.5.6-h0ea2cb4_0.conda
- - pypi: https://files.pythonhosted.org/packages/65/97/fc9bbc54ee13d33dc54a7fcf17b26368b18505500fc01e228c27b5222d80/charset_normalizer-3.4.0-cp313-cp313-win_amd64.whl
- - pypi: https://files.pythonhosted.org/packages/c6/b2/454d6e7f0158951d8a78c2e1eb4f69ae81beb8dca5fee9809c6c99e9d0d0/fsspec-2024.10.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/95/9b/3068fb3ae0b498eb66960ca5f4d92a81c91458cacd4dc17bfa6d40ce90fb/huggingface_hub-0.26.3-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/d9/28/3be76b591a2e14a031b681b8283acf1dec2ad521f6f1701b7957df68c466/lightgbm-4.5.0-py3-none-win_amd64.whl
- - pypi: https://files.pythonhosted.org/packages/f9/9b/335f9764261e915ed497fcdeb11df5dfd6f7bf257d4a6a2a686d80da4d54/requests-2.32.3-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/40/44/4a5f08c96eb108af5cb50b41f76142f0afa346dfa99d5296fe7202a11854/tabulate-0.9.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/d0/30/dc54f88dd4a2b5dc8a0279bdd7270e735851848b762aeb1c1184ed1f6b14/tqdm-4.67.1-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/70/58/2f94976df39470fb00eec2cb4f914dde44cd0df8d96483208bf7db4bc97e/xgboost-2.1.3-py3-none-win_amd64.whl
- pypi: .
ci-sklearn16:
@@ -4156,15 +4008,9 @@ environments:
- conda: https://conda.anaconda.org/conda-forge/linux-64/zlib-1.3.1-hb9d3cd8_2.conda
- conda: https://conda.anaconda.org/conda-forge/linux-64/zstandard-0.23.0-py313h80202fe_1.conda
- conda: https://conda.anaconda.org/conda-forge/linux-64/zstd-1.5.6-ha6fb4c9_0.conda
- - pypi: https://files.pythonhosted.org/packages/2b/c9/1c8fe3ce05d30c87eff498592c89015b19fade13df42850aafae09e94f35/charset_normalizer-3.4.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
- - pypi: https://files.pythonhosted.org/packages/c6/b2/454d6e7f0158951d8a78c2e1eb4f69ae81beb8dca5fee9809c6c99e9d0d0/fsspec-2024.10.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/95/9b/3068fb3ae0b498eb66960ca5f4d92a81c91458cacd4dc17bfa6d40ce90fb/huggingface_hub-0.26.3-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/4e/19/1b928cad70a4e1a3e2c37d5417ca2182510f2451eaadb6c91cd9ec692cae/lightgbm-4.5.0-py3-none-manylinux_2_28_x86_64.whl
- pypi: https://files.pythonhosted.org/packages/ed/1f/6482380ec8dcec4894e7503490fc536d846b0d59694acad9cf99f27d0e7d/nvidia_nccl_cu12-2.23.4-py3-none-manylinux2014_x86_64.whl
- - pypi: https://files.pythonhosted.org/packages/f9/9b/335f9764261e915ed497fcdeb11df5dfd6f7bf257d4a6a2a686d80da4d54/requests-2.32.3-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/40/44/4a5f08c96eb108af5cb50b41f76142f0afa346dfa99d5296fe7202a11854/tabulate-0.9.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/d0/30/dc54f88dd4a2b5dc8a0279bdd7270e735851848b762aeb1c1184ed1f6b14/tqdm-4.67.1-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/32/93/66826e2f50cefecbb0a44bd1e667316bf0a3c8e78cd1f0cdf52f5b2c5c6f/xgboost-2.1.3-py3-none-manylinux_2_28_x86_64.whl
- pypi: .
osx-64:
@@ -4275,14 +4121,8 @@ environments:
- conda: https://conda.anaconda.org/conda-forge/osx-64/yaml-0.2.5-h0d85af4_2.tar.bz2
- conda: https://conda.anaconda.org/conda-forge/osx-64/zstandard-0.23.0-py313hab0894d_1.conda
- conda: https://conda.anaconda.org/conda-forge/osx-64/zstd-1.5.6-h915ae27_0.conda
- - pypi: https://files.pythonhosted.org/packages/4f/cd/8947fe425e2ab0aa57aceb7807af13a0e4162cd21eee42ef5b053447edf5/charset_normalizer-3.4.0-cp313-cp313-macosx_10_13_x86_64.whl
- - pypi: https://files.pythonhosted.org/packages/c6/b2/454d6e7f0158951d8a78c2e1eb4f69ae81beb8dca5fee9809c6c99e9d0d0/fsspec-2024.10.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/95/9b/3068fb3ae0b498eb66960ca5f4d92a81c91458cacd4dc17bfa6d40ce90fb/huggingface_hub-0.26.3-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/1b/d2/46520b6e255298e920df26ff6e5e4fc788c927886e1e30a96b27c2f94924/lightgbm-4.5.0-py3-none-macosx_10_15_x86_64.whl
- - pypi: https://files.pythonhosted.org/packages/f9/9b/335f9764261e915ed497fcdeb11df5dfd6f7bf257d4a6a2a686d80da4d54/requests-2.32.3-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/40/44/4a5f08c96eb108af5cb50b41f76142f0afa346dfa99d5296fe7202a11854/tabulate-0.9.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/d0/30/dc54f88dd4a2b5dc8a0279bdd7270e735851848b762aeb1c1184ed1f6b14/tqdm-4.67.1-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/cd/c6/773ebd84414879bd0566788868ae46a6574f6efaf81e694f01ea1fed3277/xgboost-2.1.3-py3-none-macosx_10_15_x86_64.macosx_11_0_x86_64.macosx_12_0_x86_64.whl
- pypi: .
osx-arm64:
@@ -4393,14 +4233,8 @@ environments:
- conda: https://conda.anaconda.org/conda-forge/osx-arm64/yaml-0.2.5-h3422bc3_2.tar.bz2
- conda: https://conda.anaconda.org/conda-forge/osx-arm64/zstandard-0.23.0-py313hf2da073_1.conda
- conda: https://conda.anaconda.org/conda-forge/osx-arm64/zstd-1.5.6-hb46c0d2_0.conda
- - pypi: https://files.pythonhosted.org/packages/5b/f0/b5263e8668a4ee9becc2b451ed909e9c27058337fda5b8c49588183c267a/charset_normalizer-3.4.0-cp313-cp313-macosx_11_0_arm64.whl
- - pypi: https://files.pythonhosted.org/packages/c6/b2/454d6e7f0158951d8a78c2e1eb4f69ae81beb8dca5fee9809c6c99e9d0d0/fsspec-2024.10.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/95/9b/3068fb3ae0b498eb66960ca5f4d92a81c91458cacd4dc17bfa6d40ce90fb/huggingface_hub-0.26.3-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/11/3f/49913ed111286e23bcc40daab54542d80924264dca8ae371514039ab83ab/lightgbm-4.5.0-py3-none-macosx_12_0_arm64.whl
- - pypi: https://files.pythonhosted.org/packages/f9/9b/335f9764261e915ed497fcdeb11df5dfd6f7bf257d4a6a2a686d80da4d54/requests-2.32.3-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/40/44/4a5f08c96eb108af5cb50b41f76142f0afa346dfa99d5296fe7202a11854/tabulate-0.9.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/d0/30/dc54f88dd4a2b5dc8a0279bdd7270e735851848b762aeb1c1184ed1f6b14/tqdm-4.67.1-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/28/3c/ddf5d9eb742cdb7fbcd5c854bce07471bad01194ac37de91db64fbef0c58/xgboost-2.1.3-py3-none-macosx_12_0_arm64.whl
- pypi: .
win-64:
@@ -4541,14 +4375,8 @@ environments:
- conda: https://conda.anaconda.org/conda-forge/win-64/zlib-1.3.1-h2466b09_2.conda
- conda: https://conda.anaconda.org/conda-forge/win-64/zstandard-0.23.0-py313h574b89f_1.conda
- conda: https://conda.anaconda.org/conda-forge/win-64/zstd-1.5.6-h0ea2cb4_0.conda
- - pypi: https://files.pythonhosted.org/packages/65/97/fc9bbc54ee13d33dc54a7fcf17b26368b18505500fc01e228c27b5222d80/charset_normalizer-3.4.0-cp313-cp313-win_amd64.whl
- - pypi: https://files.pythonhosted.org/packages/c6/b2/454d6e7f0158951d8a78c2e1eb4f69ae81beb8dca5fee9809c6c99e9d0d0/fsspec-2024.10.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/95/9b/3068fb3ae0b498eb66960ca5f4d92a81c91458cacd4dc17bfa6d40ce90fb/huggingface_hub-0.26.3-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/d9/28/3be76b591a2e14a031b681b8283acf1dec2ad521f6f1701b7957df68c466/lightgbm-4.5.0-py3-none-win_amd64.whl
- - pypi: https://files.pythonhosted.org/packages/f9/9b/335f9764261e915ed497fcdeb11df5dfd6f7bf257d4a6a2a686d80da4d54/requests-2.32.3-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/40/44/4a5f08c96eb108af5cb50b41f76142f0afa346dfa99d5296fe7202a11854/tabulate-0.9.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/d0/30/dc54f88dd4a2b5dc8a0279bdd7270e735851848b762aeb1c1184ed1f6b14/tqdm-4.67.1-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/70/58/2f94976df39470fb00eec2cb4f914dde44cd0df8d96483208bf7db4bc97e/xgboost-2.1.3-py3-none-win_amd64.whl
- pypi: .
ci-sklearn17:
@@ -4737,20 +4565,14 @@ environments:
- conda: https://conda.anaconda.org/conda-forge/linux-64/zlib-1.3.1-hb9d3cd8_2.conda
- conda: https://conda.anaconda.org/conda-forge/linux-64/zstandard-0.23.0-py312hef9b889_1.conda
- conda: https://conda.anaconda.org/conda-forge/linux-64/zstd-1.5.6-ha6fb4c9_0.conda
- - pypi: https://files.pythonhosted.org/packages/16/92/92a76dc2ff3a12e69ba94e7e05168d37d0345fa08c87e1fe24d0c2a42223/charset_normalizer-3.4.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
- pypi: https://files.pythonhosted.org/packages/ec/10/7142b64f0835958920672410c0002b3575d668db979000266e81b19eb4ac/fairlearn-0.11.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/c6/b2/454d6e7f0158951d8a78c2e1eb4f69ae81beb8dca5fee9809c6c99e9d0d0/fsspec-2024.10.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/95/9b/3068fb3ae0b498eb66960ca5f4d92a81c91458cacd4dc17bfa6d40ce90fb/huggingface_hub-0.26.3-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/91/29/df4b9b42f2be0b623cbd5e2140cafcaa2bef0759a00b7b70104dcfe2fb51/joblib-1.4.2-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/4e/19/1b928cad70a4e1a3e2c37d5417ca2182510f2451eaadb6c91cd9ec692cae/lightgbm-4.5.0-py3-none-manylinux_2_28_x86_64.whl
- pypi: https://files.pythonhosted.org/packages/ed/1f/6482380ec8dcec4894e7503490fc536d846b0d59694acad9cf99f27d0e7d/nvidia_nccl_cu12-2.23.4-py3-none-manylinux2014_x86_64.whl
- - pypi: https://files.pythonhosted.org/packages/f9/9b/335f9764261e915ed497fcdeb11df5dfd6f7bf257d4a6a2a686d80da4d54/requests-2.32.3-py3-none-any.whl
- pypi: https://pypi.anaconda.org/scientific-python-nightly-wheels/simple/scikit-learn/1.7.dev0/scikit_learn-1.7.dev0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
- - pypi: https://pypi.anaconda.org/scientific-python-nightly-wheels/simple/scipy/1.15.0.dev0/scipy-1.15.0.dev0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
+ - pypi: https://pypi.anaconda.org/scientific-python-nightly-wheels/simple/scipy/1.16.0.dev0/scipy-1.16.0.dev0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
- pypi: https://files.pythonhosted.org/packages/40/44/4a5f08c96eb108af5cb50b41f76142f0afa346dfa99d5296fe7202a11854/tabulate-0.9.0-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/4b/2c/ffbf7a134b9ab11a67b0cf0726453cedd9c5043a4fe7a35d1cefa9a1bcfb/threadpoolctl-3.5.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/d0/30/dc54f88dd4a2b5dc8a0279bdd7270e735851848b762aeb1c1184ed1f6b14/tqdm-4.67.1-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/32/93/66826e2f50cefecbb0a44bd1e667316bf0a3c8e78cd1f0cdf52f5b2c5c6f/xgboost-2.1.3-py3-none-manylinux_2_28_x86_64.whl
- pypi: .
osx-64:
@@ -4856,19 +4678,13 @@ environments:
- conda: https://conda.anaconda.org/conda-forge/osx-64/yaml-0.2.5-h0d85af4_2.tar.bz2
- conda: https://conda.anaconda.org/conda-forge/osx-64/zstandard-0.23.0-py313hab0894d_1.conda
- conda: https://conda.anaconda.org/conda-forge/osx-64/zstd-1.5.6-h915ae27_0.conda
- - pypi: https://files.pythonhosted.org/packages/4f/cd/8947fe425e2ab0aa57aceb7807af13a0e4162cd21eee42ef5b053447edf5/charset_normalizer-3.4.0-cp313-cp313-macosx_10_13_x86_64.whl
- pypi: https://files.pythonhosted.org/packages/ec/10/7142b64f0835958920672410c0002b3575d668db979000266e81b19eb4ac/fairlearn-0.11.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/c6/b2/454d6e7f0158951d8a78c2e1eb4f69ae81beb8dca5fee9809c6c99e9d0d0/fsspec-2024.10.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/95/9b/3068fb3ae0b498eb66960ca5f4d92a81c91458cacd4dc17bfa6d40ce90fb/huggingface_hub-0.26.3-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/91/29/df4b9b42f2be0b623cbd5e2140cafcaa2bef0759a00b7b70104dcfe2fb51/joblib-1.4.2-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/1b/d2/46520b6e255298e920df26ff6e5e4fc788c927886e1e30a96b27c2f94924/lightgbm-4.5.0-py3-none-macosx_10_15_x86_64.whl
- - pypi: https://files.pythonhosted.org/packages/f9/9b/335f9764261e915ed497fcdeb11df5dfd6f7bf257d4a6a2a686d80da4d54/requests-2.32.3-py3-none-any.whl
- pypi: https://pypi.anaconda.org/scientific-python-nightly-wheels/simple/scikit-learn/1.7.dev0/scikit_learn-1.7.dev0-cp313-cp313-macosx_10_13_x86_64.whl
- - pypi: https://pypi.anaconda.org/scientific-python-nightly-wheels/simple/scipy/1.15.0.dev0/scipy-1.15.0.dev0-cp313-cp313-macosx_10_13_x86_64.whl
+ - pypi: https://pypi.anaconda.org/scientific-python-nightly-wheels/simple/scipy/1.16.0.dev0/scipy-1.16.0.dev0-cp313-cp313-macosx_10_13_x86_64.whl
- pypi: https://files.pythonhosted.org/packages/40/44/4a5f08c96eb108af5cb50b41f76142f0afa346dfa99d5296fe7202a11854/tabulate-0.9.0-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/4b/2c/ffbf7a134b9ab11a67b0cf0726453cedd9c5043a4fe7a35d1cefa9a1bcfb/threadpoolctl-3.5.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/d0/30/dc54f88dd4a2b5dc8a0279bdd7270e735851848b762aeb1c1184ed1f6b14/tqdm-4.67.1-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/cd/c6/773ebd84414879bd0566788868ae46a6574f6efaf81e694f01ea1fed3277/xgboost-2.1.3-py3-none-macosx_10_15_x86_64.macosx_11_0_x86_64.macosx_12_0_x86_64.whl
- pypi: .
osx-arm64:
@@ -4974,19 +4790,13 @@ environments:
- conda: https://conda.anaconda.org/conda-forge/osx-arm64/yaml-0.2.5-h3422bc3_2.tar.bz2
- conda: https://conda.anaconda.org/conda-forge/osx-arm64/zstandard-0.23.0-py312h15fbf35_1.conda
- conda: https://conda.anaconda.org/conda-forge/osx-arm64/zstd-1.5.6-hb46c0d2_0.conda
- - pypi: https://files.pythonhosted.org/packages/fa/44/b730e2a2580110ced837ac083d8ad222343c96bb6b66e9e4e706e4d0b6df/charset_normalizer-3.4.0-cp312-cp312-macosx_11_0_arm64.whl
- pypi: https://files.pythonhosted.org/packages/ec/10/7142b64f0835958920672410c0002b3575d668db979000266e81b19eb4ac/fairlearn-0.11.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/c6/b2/454d6e7f0158951d8a78c2e1eb4f69ae81beb8dca5fee9809c6c99e9d0d0/fsspec-2024.10.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/95/9b/3068fb3ae0b498eb66960ca5f4d92a81c91458cacd4dc17bfa6d40ce90fb/huggingface_hub-0.26.3-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/91/29/df4b9b42f2be0b623cbd5e2140cafcaa2bef0759a00b7b70104dcfe2fb51/joblib-1.4.2-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/11/3f/49913ed111286e23bcc40daab54542d80924264dca8ae371514039ab83ab/lightgbm-4.5.0-py3-none-macosx_12_0_arm64.whl
- - pypi: https://files.pythonhosted.org/packages/f9/9b/335f9764261e915ed497fcdeb11df5dfd6f7bf257d4a6a2a686d80da4d54/requests-2.32.3-py3-none-any.whl
- pypi: https://pypi.anaconda.org/scientific-python-nightly-wheels/simple/scikit-learn/1.7.dev0/scikit_learn-1.7.dev0-cp312-cp312-macosx_12_0_arm64.whl
- - pypi: https://pypi.anaconda.org/scientific-python-nightly-wheels/simple/scipy/1.15.0.dev0/scipy-1.15.0.dev0-cp312-cp312-macosx_12_0_arm64.whl
+ - pypi: https://pypi.anaconda.org/scientific-python-nightly-wheels/simple/scipy/1.16.0.dev0/scipy-1.16.0.dev0-cp312-cp312-macosx_12_0_arm64.whl
- pypi: https://files.pythonhosted.org/packages/40/44/4a5f08c96eb108af5cb50b41f76142f0afa346dfa99d5296fe7202a11854/tabulate-0.9.0-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/4b/2c/ffbf7a134b9ab11a67b0cf0726453cedd9c5043a4fe7a35d1cefa9a1bcfb/threadpoolctl-3.5.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/d0/30/dc54f88dd4a2b5dc8a0279bdd7270e735851848b762aeb1c1184ed1f6b14/tqdm-4.67.1-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/28/3c/ddf5d9eb742cdb7fbcd5c854bce07471bad01194ac37de91db64fbef0c58/xgboost-2.1.3-py3-none-macosx_12_0_arm64.whl
- pypi: .
win-64:
@@ -5122,19 +4932,13 @@ environments:
- conda: https://conda.anaconda.org/conda-forge/win-64/zlib-1.3.1-h2466b09_2.conda
- conda: https://conda.anaconda.org/conda-forge/win-64/zstandard-0.23.0-py313h574b89f_1.conda
- conda: https://conda.anaconda.org/conda-forge/win-64/zstd-1.5.6-h0ea2cb4_0.conda
- - pypi: https://files.pythonhosted.org/packages/65/97/fc9bbc54ee13d33dc54a7fcf17b26368b18505500fc01e228c27b5222d80/charset_normalizer-3.4.0-cp313-cp313-win_amd64.whl
- pypi: https://files.pythonhosted.org/packages/ec/10/7142b64f0835958920672410c0002b3575d668db979000266e81b19eb4ac/fairlearn-0.11.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/c6/b2/454d6e7f0158951d8a78c2e1eb4f69ae81beb8dca5fee9809c6c99e9d0d0/fsspec-2024.10.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/95/9b/3068fb3ae0b498eb66960ca5f4d92a81c91458cacd4dc17bfa6d40ce90fb/huggingface_hub-0.26.3-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/91/29/df4b9b42f2be0b623cbd5e2140cafcaa2bef0759a00b7b70104dcfe2fb51/joblib-1.4.2-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/d9/28/3be76b591a2e14a031b681b8283acf1dec2ad521f6f1701b7957df68c466/lightgbm-4.5.0-py3-none-win_amd64.whl
- - pypi: https://files.pythonhosted.org/packages/f9/9b/335f9764261e915ed497fcdeb11df5dfd6f7bf257d4a6a2a686d80da4d54/requests-2.32.3-py3-none-any.whl
- pypi: https://pypi.anaconda.org/scientific-python-nightly-wheels/simple/scikit-learn/1.7.dev0/scikit_learn-1.7.dev0-cp313-cp313-win_amd64.whl
- - pypi: https://pypi.anaconda.org/scientific-python-nightly-wheels/simple/scipy/1.15.0.dev0/scipy-1.15.0.dev0-cp313-cp313-win_amd64.whl
+ - pypi: https://pypi.anaconda.org/scientific-python-nightly-wheels/simple/scipy/1.16.0.dev0/scipy-1.16.0.dev0-cp313-cp313-win_amd64.whl
- pypi: https://files.pythonhosted.org/packages/40/44/4a5f08c96eb108af5cb50b41f76142f0afa346dfa99d5296fe7202a11854/tabulate-0.9.0-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/4b/2c/ffbf7a134b9ab11a67b0cf0726453cedd9c5043a4fe7a35d1cefa9a1bcfb/threadpoolctl-3.5.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/d0/30/dc54f88dd4a2b5dc8a0279bdd7270e735851848b762aeb1c1184ed1f6b14/tqdm-4.67.1-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/70/58/2f94976df39470fb00eec2cb4f914dde44cd0df8d96483208bf7db4bc97e/xgboost-2.1.3-py3-none-win_amd64.whl
- pypi: .
default:
@@ -5364,15 +5168,12 @@ environments:
- conda: https://conda.anaconda.org/conda-forge/linux-64/zstandard-0.23.0-py312hef9b889_1.conda
- conda: https://conda.anaconda.org/conda-forge/linux-64/zstd-1.5.6-ha6fb4c9_0.conda
- pypi: https://files.pythonhosted.org/packages/ec/10/7142b64f0835958920672410c0002b3575d668db979000266e81b19eb4ac/fairlearn-0.11.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/c6/b2/454d6e7f0158951d8a78c2e1eb4f69ae81beb8dca5fee9809c6c99e9d0d0/fsspec-2024.10.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/95/9b/3068fb3ae0b498eb66960ca5f4d92a81c91458cacd4dc17bfa6d40ce90fb/huggingface_hub-0.26.3-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/91/29/df4b9b42f2be0b623cbd5e2140cafcaa2bef0759a00b7b70104dcfe2fb51/joblib-1.4.2-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/4e/19/1b928cad70a4e1a3e2c37d5417ca2182510f2451eaadb6c91cd9ec692cae/lightgbm-4.5.0-py3-none-manylinux_2_28_x86_64.whl
- pypi: https://files.pythonhosted.org/packages/ed/1f/6482380ec8dcec4894e7503490fc536d846b0d59694acad9cf99f27d0e7d/nvidia_nccl_cu12-2.23.4-py3-none-manylinux2014_x86_64.whl
- pypi: https://pypi.anaconda.org/scientific-python-nightly-wheels/simple/scikit-learn/1.7.dev0/scikit_learn-1.7.dev0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
- - pypi: https://pypi.anaconda.org/scientific-python-nightly-wheels/simple/scipy/1.15.0.dev0/scipy-1.15.0.dev0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
+ - pypi: https://pypi.anaconda.org/scientific-python-nightly-wheels/simple/scipy/1.16.0.dev0/scipy-1.16.0.dev0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
- pypi: https://files.pythonhosted.org/packages/4b/2c/ffbf7a134b9ab11a67b0cf0726453cedd9c5043a4fe7a35d1cefa9a1bcfb/threadpoolctl-3.5.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/d0/30/dc54f88dd4a2b5dc8a0279bdd7270e735851848b762aeb1c1184ed1f6b14/tqdm-4.67.1-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/32/93/66826e2f50cefecbb0a44bd1e667316bf0a3c8e78cd1f0cdf52f5b2c5c6f/xgboost-2.1.3-py3-none-manylinux_2_28_x86_64.whl
- pypi: .
osx-64:
@@ -5519,14 +5320,11 @@ environments:
- conda: https://conda.anaconda.org/conda-forge/osx-64/zstandard-0.23.0-py313hab0894d_1.conda
- conda: https://conda.anaconda.org/conda-forge/osx-64/zstd-1.5.6-h915ae27_0.conda
- pypi: https://files.pythonhosted.org/packages/ec/10/7142b64f0835958920672410c0002b3575d668db979000266e81b19eb4ac/fairlearn-0.11.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/c6/b2/454d6e7f0158951d8a78c2e1eb4f69ae81beb8dca5fee9809c6c99e9d0d0/fsspec-2024.10.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/95/9b/3068fb3ae0b498eb66960ca5f4d92a81c91458cacd4dc17bfa6d40ce90fb/huggingface_hub-0.26.3-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/91/29/df4b9b42f2be0b623cbd5e2140cafcaa2bef0759a00b7b70104dcfe2fb51/joblib-1.4.2-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/1b/d2/46520b6e255298e920df26ff6e5e4fc788c927886e1e30a96b27c2f94924/lightgbm-4.5.0-py3-none-macosx_10_15_x86_64.whl
- pypi: https://pypi.anaconda.org/scientific-python-nightly-wheels/simple/scikit-learn/1.7.dev0/scikit_learn-1.7.dev0-cp313-cp313-macosx_10_13_x86_64.whl
- - pypi: https://pypi.anaconda.org/scientific-python-nightly-wheels/simple/scipy/1.15.0.dev0/scipy-1.15.0.dev0-cp313-cp313-macosx_10_13_x86_64.whl
+ - pypi: https://pypi.anaconda.org/scientific-python-nightly-wheels/simple/scipy/1.16.0.dev0/scipy-1.16.0.dev0-cp313-cp313-macosx_10_13_x86_64.whl
- pypi: https://files.pythonhosted.org/packages/4b/2c/ffbf7a134b9ab11a67b0cf0726453cedd9c5043a4fe7a35d1cefa9a1bcfb/threadpoolctl-3.5.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/d0/30/dc54f88dd4a2b5dc8a0279bdd7270e735851848b762aeb1c1184ed1f6b14/tqdm-4.67.1-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/cd/c6/773ebd84414879bd0566788868ae46a6574f6efaf81e694f01ea1fed3277/xgboost-2.1.3-py3-none-macosx_10_15_x86_64.macosx_11_0_x86_64.macosx_12_0_x86_64.whl
- pypi: .
osx-arm64:
@@ -5673,14 +5471,11 @@ environments:
- conda: https://conda.anaconda.org/conda-forge/osx-arm64/zstandard-0.23.0-py313hf2da073_1.conda
- conda: https://conda.anaconda.org/conda-forge/osx-arm64/zstd-1.5.6-hb46c0d2_0.conda
- pypi: https://files.pythonhosted.org/packages/ec/10/7142b64f0835958920672410c0002b3575d668db979000266e81b19eb4ac/fairlearn-0.11.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/c6/b2/454d6e7f0158951d8a78c2e1eb4f69ae81beb8dca5fee9809c6c99e9d0d0/fsspec-2024.10.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/95/9b/3068fb3ae0b498eb66960ca5f4d92a81c91458cacd4dc17bfa6d40ce90fb/huggingface_hub-0.26.3-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/91/29/df4b9b42f2be0b623cbd5e2140cafcaa2bef0759a00b7b70104dcfe2fb51/joblib-1.4.2-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/11/3f/49913ed111286e23bcc40daab54542d80924264dca8ae371514039ab83ab/lightgbm-4.5.0-py3-none-macosx_12_0_arm64.whl
- pypi: https://pypi.anaconda.org/scientific-python-nightly-wheels/simple/scikit-learn/1.7.dev0/scikit_learn-1.7.dev0-cp313-cp313-macosx_12_0_arm64.whl
- - pypi: https://pypi.anaconda.org/scientific-python-nightly-wheels/simple/scipy/1.15.0.dev0/scipy-1.15.0.dev0-cp313-cp313-macosx_12_0_arm64.whl
+ - pypi: https://pypi.anaconda.org/scientific-python-nightly-wheels/simple/scipy/1.16.0.dev0/scipy-1.16.0.dev0-cp313-cp313-macosx_12_0_arm64.whl
- pypi: https://files.pythonhosted.org/packages/4b/2c/ffbf7a134b9ab11a67b0cf0726453cedd9c5043a4fe7a35d1cefa9a1bcfb/threadpoolctl-3.5.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/d0/30/dc54f88dd4a2b5dc8a0279bdd7270e735851848b762aeb1c1184ed1f6b14/tqdm-4.67.1-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/28/3c/ddf5d9eb742cdb7fbcd5c854bce07471bad01194ac37de91db64fbef0c58/xgboost-2.1.3-py3-none-macosx_12_0_arm64.whl
- pypi: .
win-64:
@@ -5855,14 +5650,11 @@ environments:
- conda: https://conda.anaconda.org/conda-forge/win-64/zstandard-0.23.0-py313h574b89f_1.conda
- conda: https://conda.anaconda.org/conda-forge/win-64/zstd-1.5.6-h0ea2cb4_0.conda
- pypi: https://files.pythonhosted.org/packages/ec/10/7142b64f0835958920672410c0002b3575d668db979000266e81b19eb4ac/fairlearn-0.11.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/c6/b2/454d6e7f0158951d8a78c2e1eb4f69ae81beb8dca5fee9809c6c99e9d0d0/fsspec-2024.10.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/95/9b/3068fb3ae0b498eb66960ca5f4d92a81c91458cacd4dc17bfa6d40ce90fb/huggingface_hub-0.26.3-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/91/29/df4b9b42f2be0b623cbd5e2140cafcaa2bef0759a00b7b70104dcfe2fb51/joblib-1.4.2-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/d9/28/3be76b591a2e14a031b681b8283acf1dec2ad521f6f1701b7957df68c466/lightgbm-4.5.0-py3-none-win_amd64.whl
- pypi: https://pypi.anaconda.org/scientific-python-nightly-wheels/simple/scikit-learn/1.7.dev0/scikit_learn-1.7.dev0-cp313-cp313-win_amd64.whl
- - pypi: https://pypi.anaconda.org/scientific-python-nightly-wheels/simple/scipy/1.15.0.dev0/scipy-1.15.0.dev0-cp313-cp313-win_amd64.whl
+ - pypi: https://pypi.anaconda.org/scientific-python-nightly-wheels/simple/scipy/1.16.0.dev0/scipy-1.16.0.dev0-cp313-cp313-win_amd64.whl
- pypi: https://files.pythonhosted.org/packages/4b/2c/ffbf7a134b9ab11a67b0cf0726453cedd9c5043a4fe7a35d1cefa9a1bcfb/threadpoolctl-3.5.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/d0/30/dc54f88dd4a2b5dc8a0279bdd7270e735851848b762aeb1c1184ed1f6b14/tqdm-4.67.1-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/70/58/2f94976df39470fb00eec2cb4f914dde44cd0df8d96483208bf7db4bc97e/xgboost-2.1.3-py3-none-win_amd64.whl
- pypi: .
docs:
@@ -6056,11 +5848,6 @@ environments:
- conda: https://conda.anaconda.org/conda-forge/linux-64/zlib-1.3.1-hb9d3cd8_2.conda
- conda: https://conda.anaconda.org/conda-forge/linux-64/zstandard-0.23.0-py312hef9b889_1.conda
- conda: https://conda.anaconda.org/conda-forge/linux-64/zstd-1.5.6-ha6fb4c9_0.conda
- - pypi: https://files.pythonhosted.org/packages/b9/f8/feced7779d755758a52d1f6635d990b8d98dc0a29fa568bbe0625f18fdf3/filelock-3.16.1-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/c6/b2/454d6e7f0158951d8a78c2e1eb4f69ae81beb8dca5fee9809c6c99e9d0d0/fsspec-2024.10.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/95/9b/3068fb3ae0b498eb66960ca5f4d92a81c91458cacd4dc17bfa6d40ce90fb/huggingface_hub-0.26.3-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/b9/2b/614b4752f2e127db5cc206abc23a8c19678e92b23c3db30fc86ab731d3bd/PyYAML-6.0.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
- - pypi: https://files.pythonhosted.org/packages/d0/30/dc54f88dd4a2b5dc8a0279bdd7270e735851848b762aeb1c1184ed1f6b14/tqdm-4.67.1-py3-none-any.whl
- pypi: .
osx-64:
- conda: https://conda.anaconda.org/conda-forge/noarch/alabaster-1.0.0-pyhd8ed1ab_0.conda
@@ -6171,11 +5958,6 @@ environments:
- conda: https://conda.anaconda.org/conda-forge/osx-64/xz-5.2.6-h775f41a_0.tar.bz2
- conda: https://conda.anaconda.org/conda-forge/osx-64/zstandard-0.23.0-py313hab0894d_1.conda
- conda: https://conda.anaconda.org/conda-forge/osx-64/zstd-1.5.6-h915ae27_0.conda
- - pypi: https://files.pythonhosted.org/packages/b9/f8/feced7779d755758a52d1f6635d990b8d98dc0a29fa568bbe0625f18fdf3/filelock-3.16.1-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/c6/b2/454d6e7f0158951d8a78c2e1eb4f69ae81beb8dca5fee9809c6c99e9d0d0/fsspec-2024.10.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/95/9b/3068fb3ae0b498eb66960ca5f4d92a81c91458cacd4dc17bfa6d40ce90fb/huggingface_hub-0.26.3-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/ef/e3/3af305b830494fa85d95f6d95ef7fa73f2ee1cc8ef5b495c7c3269fb835f/PyYAML-6.0.2-cp313-cp313-macosx_10_13_x86_64.whl
- - pypi: https://files.pythonhosted.org/packages/d0/30/dc54f88dd4a2b5dc8a0279bdd7270e735851848b762aeb1c1184ed1f6b14/tqdm-4.67.1-py3-none-any.whl
- pypi: .
osx-arm64:
- conda: https://conda.anaconda.org/conda-forge/noarch/alabaster-1.0.0-pyhd8ed1ab_0.conda
@@ -6286,11 +6068,6 @@ environments:
- conda: https://conda.anaconda.org/conda-forge/osx-arm64/xz-5.2.6-h57fd34a_0.tar.bz2
- conda: https://conda.anaconda.org/conda-forge/osx-arm64/zstandard-0.23.0-py312h15fbf35_1.conda
- conda: https://conda.anaconda.org/conda-forge/osx-arm64/zstd-1.5.6-hb46c0d2_0.conda
- - pypi: https://files.pythonhosted.org/packages/b9/f8/feced7779d755758a52d1f6635d990b8d98dc0a29fa568bbe0625f18fdf3/filelock-3.16.1-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/c6/b2/454d6e7f0158951d8a78c2e1eb4f69ae81beb8dca5fee9809c6c99e9d0d0/fsspec-2024.10.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/95/9b/3068fb3ae0b498eb66960ca5f4d92a81c91458cacd4dc17bfa6d40ce90fb/huggingface_hub-0.26.3-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/a8/0c/38374f5bb272c051e2a69281d71cba6fdb983413e6758b84482905e29a5d/PyYAML-6.0.2-cp312-cp312-macosx_11_0_arm64.whl
- - pypi: https://files.pythonhosted.org/packages/d0/30/dc54f88dd4a2b5dc8a0279bdd7270e735851848b762aeb1c1184ed1f6b14/tqdm-4.67.1-py3-none-any.whl
- pypi: .
win-64:
- conda: https://conda.anaconda.org/conda-forge/win-64/_openmp_mutex-4.5-2_gnu.conda
@@ -6431,11 +6208,6 @@ environments:
- conda: https://conda.anaconda.org/conda-forge/win-64/zlib-1.3.1-h2466b09_2.conda
- conda: https://conda.anaconda.org/conda-forge/win-64/zstandard-0.23.0-py313h574b89f_1.conda
- conda: https://conda.anaconda.org/conda-forge/win-64/zstd-1.5.6-h0ea2cb4_0.conda
- - pypi: https://files.pythonhosted.org/packages/b9/f8/feced7779d755758a52d1f6635d990b8d98dc0a29fa568bbe0625f18fdf3/filelock-3.16.1-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/c6/b2/454d6e7f0158951d8a78c2e1eb4f69ae81beb8dca5fee9809c6c99e9d0d0/fsspec-2024.10.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/95/9b/3068fb3ae0b498eb66960ca5f4d92a81c91458cacd4dc17bfa6d40ce90fb/huggingface_hub-0.26.3-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/fa/de/02b54f42487e3d3c6efb3f89428677074ca7bf43aae402517bc7cca949f3/PyYAML-6.0.2-cp313-cp313-win_amd64.whl
- - pypi: https://files.pythonhosted.org/packages/d0/30/dc54f88dd4a2b5dc8a0279bdd7270e735851848b762aeb1c1184ed1f6b14/tqdm-4.67.1-py3-none-any.whl
- pypi: .
lint:
channels:
@@ -6486,21 +6258,13 @@ environments:
- conda: https://conda.anaconda.org/conda-forge/noarch/virtualenv-20.28.0-pyhd8ed1ab_0.conda
- conda: https://conda.anaconda.org/conda-forge/linux-64/xz-5.2.6-h166bdaf_0.tar.bz2
- conda: https://conda.anaconda.org/conda-forge/linux-64/yaml-0.2.5-h7f98852_2.tar.bz2
- - pypi: https://files.pythonhosted.org/packages/12/90/3c9ff0512038035f59d279fddeb79f5f1eccd8859f06d6163c58798b9487/certifi-2024.8.30-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/2b/c9/1c8fe3ce05d30c87eff498592c89015b19fade13df42850aafae09e94f35/charset_normalizer-3.4.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
- - pypi: https://files.pythonhosted.org/packages/c6/b2/454d6e7f0158951d8a78c2e1eb4f69ae81beb8dca5fee9809c6c99e9d0d0/fsspec-2024.10.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/95/9b/3068fb3ae0b498eb66960ca5f4d92a81c91458cacd4dc17bfa6d40ce90fb/huggingface_hub-0.26.3-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/91/29/df4b9b42f2be0b623cbd5e2140cafcaa2bef0759a00b7b70104dcfe2fb51/joblib-1.4.2-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/70/50/73f9a5aa0810cdccda9c1d20be3cbe4a4d6ea6bfd6931464a44c95eef731/numpy-2.1.3-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
- pypi: https://files.pythonhosted.org/packages/88/ef/eb23f262cca3c0c4eb7ab1933c3b1f03d021f2c48f54763065b6f0e321be/packaging-24.2-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/f9/9b/335f9764261e915ed497fcdeb11df5dfd6f7bf257d4a6a2a686d80da4d54/requests-2.32.3-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/a7/48/fbfb4dc72bed0fe31fe045fb30e924909ad03f717c36694351612973b1a9/scikit_learn-1.5.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
- pypi: https://files.pythonhosted.org/packages/56/46/2449e6e51e0d7c3575f289f6acb7f828938eaab8874dbccfeb0cd2b71a27/scipy-1.14.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
- pypi: https://files.pythonhosted.org/packages/40/44/4a5f08c96eb108af5cb50b41f76142f0afa346dfa99d5296fe7202a11854/tabulate-0.9.0-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/4b/2c/ffbf7a134b9ab11a67b0cf0726453cedd9c5043a4fe7a35d1cefa9a1bcfb/threadpoolctl-3.5.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/d0/30/dc54f88dd4a2b5dc8a0279bdd7270e735851848b762aeb1c1184ed1f6b14/tqdm-4.67.1-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/ce/d9/5f4c13cecde62396b0d3fe530a50ccea91e7dfc1ccf0e09c228841bb5ba8/urllib3-2.2.3-py3-none-any.whl
- pypi: .
osx-64:
- conda: https://conda.anaconda.org/conda-forge/osx-64/bzip2-1.0.8-hfdf4475_7.conda
@@ -6538,21 +6302,13 @@ environments:
- conda: https://conda.anaconda.org/conda-forge/noarch/virtualenv-20.28.0-pyhd8ed1ab_0.conda
- conda: https://conda.anaconda.org/conda-forge/osx-64/xz-5.2.6-h775f41a_0.tar.bz2
- conda: https://conda.anaconda.org/conda-forge/osx-64/yaml-0.2.5-h0d85af4_2.tar.bz2
- - pypi: https://files.pythonhosted.org/packages/12/90/3c9ff0512038035f59d279fddeb79f5f1eccd8859f06d6163c58798b9487/certifi-2024.8.30-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/4f/cd/8947fe425e2ab0aa57aceb7807af13a0e4162cd21eee42ef5b053447edf5/charset_normalizer-3.4.0-cp313-cp313-macosx_10_13_x86_64.whl
- - pypi: https://files.pythonhosted.org/packages/c6/b2/454d6e7f0158951d8a78c2e1eb4f69ae81beb8dca5fee9809c6c99e9d0d0/fsspec-2024.10.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/95/9b/3068fb3ae0b498eb66960ca5f4d92a81c91458cacd4dc17bfa6d40ce90fb/huggingface_hub-0.26.3-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/91/29/df4b9b42f2be0b623cbd5e2140cafcaa2bef0759a00b7b70104dcfe2fb51/joblib-1.4.2-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/4d/0b/620591441457e25f3404c8057eb924d04f161244cb8a3680d529419aa86e/numpy-2.1.3-cp313-cp313-macosx_10_13_x86_64.whl
- pypi: https://files.pythonhosted.org/packages/88/ef/eb23f262cca3c0c4eb7ab1933c3b1f03d021f2c48f54763065b6f0e321be/packaging-24.2-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/f9/9b/335f9764261e915ed497fcdeb11df5dfd6f7bf257d4a6a2a686d80da4d54/requests-2.32.3-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/a4/50/8891028437858cc510e13578fe7046574a60c2aaaa92b02d64aac5b1b412/scikit_learn-1.5.2-cp313-cp313-macosx_10_13_x86_64.whl
- pypi: https://files.pythonhosted.org/packages/50/ef/ac98346db016ff18a6ad7626a35808f37074d25796fd0234c2bb0ed1e054/scipy-1.14.1-cp313-cp313-macosx_10_13_x86_64.whl
- pypi: https://files.pythonhosted.org/packages/40/44/4a5f08c96eb108af5cb50b41f76142f0afa346dfa99d5296fe7202a11854/tabulate-0.9.0-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/4b/2c/ffbf7a134b9ab11a67b0cf0726453cedd9c5043a4fe7a35d1cefa9a1bcfb/threadpoolctl-3.5.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/d0/30/dc54f88dd4a2b5dc8a0279bdd7270e735851848b762aeb1c1184ed1f6b14/tqdm-4.67.1-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/ce/d9/5f4c13cecde62396b0d3fe530a50ccea91e7dfc1ccf0e09c228841bb5ba8/urllib3-2.2.3-py3-none-any.whl
- pypi: .
osx-arm64:
- conda: https://conda.anaconda.org/conda-forge/osx-arm64/bzip2-1.0.8-h99b78c6_7.conda
@@ -6590,21 +6346,13 @@ environments:
- conda: https://conda.anaconda.org/conda-forge/noarch/virtualenv-20.28.0-pyhd8ed1ab_0.conda
- conda: https://conda.anaconda.org/conda-forge/osx-arm64/xz-5.2.6-h57fd34a_0.tar.bz2
- conda: https://conda.anaconda.org/conda-forge/osx-arm64/yaml-0.2.5-h3422bc3_2.tar.bz2
- - pypi: https://files.pythonhosted.org/packages/12/90/3c9ff0512038035f59d279fddeb79f5f1eccd8859f06d6163c58798b9487/certifi-2024.8.30-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/5b/f0/b5263e8668a4ee9becc2b451ed909e9c27058337fda5b8c49588183c267a/charset_normalizer-3.4.0-cp313-cp313-macosx_11_0_arm64.whl
- - pypi: https://files.pythonhosted.org/packages/c6/b2/454d6e7f0158951d8a78c2e1eb4f69ae81beb8dca5fee9809c6c99e9d0d0/fsspec-2024.10.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/95/9b/3068fb3ae0b498eb66960ca5f4d92a81c91458cacd4dc17bfa6d40ce90fb/huggingface_hub-0.26.3-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/91/29/df4b9b42f2be0b623cbd5e2140cafcaa2bef0759a00b7b70104dcfe2fb51/joblib-1.4.2-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/45/e1/210b2d8b31ce9119145433e6ea78046e30771de3fe353f313b2778142f34/numpy-2.1.3-cp313-cp313-macosx_11_0_arm64.whl
- pypi: https://files.pythonhosted.org/packages/88/ef/eb23f262cca3c0c4eb7ab1933c3b1f03d021f2c48f54763065b6f0e321be/packaging-24.2-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/f9/9b/335f9764261e915ed497fcdeb11df5dfd6f7bf257d4a6a2a686d80da4d54/requests-2.32.3-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/d2/79/17feef8a1c14149436083bec0e61d7befb4812e272d5b20f9d79ea3e9ab1/scikit_learn-1.5.2-cp313-cp313-macosx_12_0_arm64.whl
- pypi: https://files.pythonhosted.org/packages/b9/cc/70948fe9f393b911b4251e96b55bbdeaa8cca41f37c26fd1df0232933b9e/scipy-1.14.1-cp313-cp313-macosx_12_0_arm64.whl
- pypi: https://files.pythonhosted.org/packages/40/44/4a5f08c96eb108af5cb50b41f76142f0afa346dfa99d5296fe7202a11854/tabulate-0.9.0-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/4b/2c/ffbf7a134b9ab11a67b0cf0726453cedd9c5043a4fe7a35d1cefa9a1bcfb/threadpoolctl-3.5.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/d0/30/dc54f88dd4a2b5dc8a0279bdd7270e735851848b762aeb1c1184ed1f6b14/tqdm-4.67.1-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/ce/d9/5f4c13cecde62396b0d3fe530a50ccea91e7dfc1ccf0e09c228841bb5ba8/urllib3-2.2.3-py3-none-any.whl
- pypi: .
win-64:
- conda: https://conda.anaconda.org/conda-forge/win-64/bzip2-1.0.8-h2466b09_7.conda
@@ -6643,22 +6391,13 @@ environments:
- conda: https://conda.anaconda.org/conda-forge/win-64/vs2015_runtime-14.42.34433-hdffcdeb_23.conda
- conda: https://conda.anaconda.org/conda-forge/win-64/xz-5.2.6-h8d14728_0.tar.bz2
- conda: https://conda.anaconda.org/conda-forge/win-64/yaml-0.2.5-h8ffe710_2.tar.bz2
- - pypi: https://files.pythonhosted.org/packages/12/90/3c9ff0512038035f59d279fddeb79f5f1eccd8859f06d6163c58798b9487/certifi-2024.8.30-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/65/97/fc9bbc54ee13d33dc54a7fcf17b26368b18505500fc01e228c27b5222d80/charset_normalizer-3.4.0-cp313-cp313-win_amd64.whl
- - pypi: https://files.pythonhosted.org/packages/d1/d6/3965ed04c63042e047cb6a3e6ed1a63a35087b6a609aa3a15ed8ac56c221/colorama-0.4.6-py2.py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/c6/b2/454d6e7f0158951d8a78c2e1eb4f69ae81beb8dca5fee9809c6c99e9d0d0/fsspec-2024.10.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/95/9b/3068fb3ae0b498eb66960ca5f4d92a81c91458cacd4dc17bfa6d40ce90fb/huggingface_hub-0.26.3-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/91/29/df4b9b42f2be0b623cbd5e2140cafcaa2bef0759a00b7b70104dcfe2fb51/joblib-1.4.2-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/bb/f9/12297ed8d8301a401e7d8eb6b418d32547f1d700ed3c038d325a605421a4/numpy-2.1.3-cp313-cp313-win_amd64.whl
- pypi: https://files.pythonhosted.org/packages/88/ef/eb23f262cca3c0c4eb7ab1933c3b1f03d021f2c48f54763065b6f0e321be/packaging-24.2-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/f9/9b/335f9764261e915ed497fcdeb11df5dfd6f7bf257d4a6a2a686d80da4d54/requests-2.32.3-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/a5/e7/0c869f9e60d225a77af90d2aefa7a4a4c0e745b149325d1450f0f0ce5399/scikit_learn-1.5.2-cp313-cp313-win_amd64.whl
- pypi: https://files.pythonhosted.org/packages/f5/1b/6ee032251bf4cdb0cc50059374e86a9f076308c1512b61c4e003e241efb7/scipy-1.14.1-cp313-cp313-win_amd64.whl
- pypi: https://files.pythonhosted.org/packages/40/44/4a5f08c96eb108af5cb50b41f76142f0afa346dfa99d5296fe7202a11854/tabulate-0.9.0-py3-none-any.whl
- pypi: https://files.pythonhosted.org/packages/4b/2c/ffbf7a134b9ab11a67b0cf0726453cedd9c5043a4fe7a35d1cefa9a1bcfb/threadpoolctl-3.5.0-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/d0/30/dc54f88dd4a2b5dc8a0279bdd7270e735851848b762aeb1c1184ed1f6b14/tqdm-4.67.1-py3-none-any.whl
- - pypi: https://files.pythonhosted.org/packages/ce/d9/5f4c13cecde62396b0d3fe530a50ccea91e7dfc1ccf0e09c228841bb5ba8/urllib3-2.2.3-py3-none-any.whl
- pypi: .
packages:
- conda: https://conda.anaconda.org/conda-forge/linux-64/_libgcc_mutex-0.1-conda_forge.tar.bz2
@@ -7748,11 +7487,6 @@ packages:
- pkg:pypi/catboost?source=hash-mapping
size: 50942701
timestamp: 1725742477979
-- pypi: https://files.pythonhosted.org/packages/12/90/3c9ff0512038035f59d279fddeb79f5f1eccd8859f06d6163c58798b9487/certifi-2024.8.30-py3-none-any.whl
- name: certifi
- version: 2024.8.30
- sha256: 922820b53db7a7257ffbda3f597266d435245903d80737e34f8a45ff3e3230d8
- requires_python: '>=3.6'
- conda: https://conda.anaconda.org/conda-forge/noarch/certifi-2024.8.30-pyhd8ed1ab_0.conda
sha256: 7020770df338c45ac6b560185956c32f0a5abf4b76179c037f115fc7d687819f
md5: 12f7d00853807b0531775e9be891cb11
@@ -8089,106 +7823,6 @@ packages:
- pkg:pypi/cfgv?source=hash-mapping
size: 10788
timestamp: 1629909423398
-- pypi: https://files.pythonhosted.org/packages/0b/6e/b13bd47fa9023b3699e94abf565b5a2f0b0be6e9ddac9812182596ee62e4/charset_normalizer-3.4.0-cp311-cp311-win_amd64.whl
- name: charset-normalizer
- version: 3.4.0
- sha256: cee4373f4d3ad28f1ab6290684d8e2ebdb9e7a1b74fdc39e4c211995f77bec27
- requires_python: '>=3.7.0'
-- pypi: https://files.pythonhosted.org/packages/16/92/92a76dc2ff3a12e69ba94e7e05168d37d0345fa08c87e1fe24d0c2a42223/charset_normalizer-3.4.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
- name: charset-normalizer
- version: 3.4.0
- sha256: 8cda06946eac330cbe6598f77bb54e690b4ca93f593dee1568ad22b04f347c15
- requires_python: '>=3.7.0'
-- pypi: https://files.pythonhosted.org/packages/21/67/b4564d81f48042f520c948abac7079356e94b30cb8ffb22e747532cf469d/charset_normalizer-3.4.0-cp310-cp310-macosx_11_0_arm64.whl
- name: charset-normalizer
- version: 3.4.0
- sha256: 5ed2e36c3e9b4f21dd9422f6893dec0abf2cca553af509b10cd630f878d3eb99
- requires_python: '>=3.7.0'
-- pypi: https://files.pythonhosted.org/packages/23/81/d7eef6a99e42c77f444fdd7bc894b0ceca6c3a95c51239e74a722039521c/charset_normalizer-3.4.0-cp310-cp310-macosx_10_9_x86_64.whl
- name: charset-normalizer
- version: 3.4.0
- sha256: 0de7b687289d3c1b3e8660d0741874abe7888100efe14bd0f9fd7141bcbda92b
- requires_python: '>=3.7.0'
-- pypi: https://files.pythonhosted.org/packages/28/89/60f51ad71f63aaaa7e51a2a2ad37919985a341a1d267070f212cdf6c2d22/charset_normalizer-3.4.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
- name: charset-normalizer
- version: 3.4.0
- sha256: 309a7de0a0ff3040acaebb35ec45d18db4b28232f21998851cfa709eeff49d62
- requires_python: '>=3.7.0'
-- pypi: https://files.pythonhosted.org/packages/2b/c9/1c8fe3ce05d30c87eff498592c89015b19fade13df42850aafae09e94f35/charset_normalizer-3.4.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
- name: charset-normalizer
- version: 3.4.0
- sha256: 4796efc4faf6b53a18e3d46343535caed491776a22af773f366534056c4e1fbc
- requires_python: '>=3.7.0'
-- pypi: https://files.pythonhosted.org/packages/3e/67/7b72b69d25b89c0b3cea583ee372c43aa24df15f0e0f8d3982c57804984b/charset_normalizer-3.4.0-cp312-cp312-win_amd64.whl
- name: charset-normalizer
- version: 3.4.0
- sha256: b197e7094f232959f8f20541ead1d9862ac5ebea1d58e9849c1bf979255dfac9
- requires_python: '>=3.7.0'
-- pypi: https://files.pythonhosted.org/packages/4f/cd/8947fe425e2ab0aa57aceb7807af13a0e4162cd21eee42ef5b053447edf5/charset_normalizer-3.4.0-cp313-cp313-macosx_10_13_x86_64.whl
- name: charset-normalizer
- version: 3.4.0
- sha256: e9e3c4c9e1ed40ea53acf11e2a386383c3304212c965773704e4603d589343ed
- requires_python: '>=3.7.0'
-- pypi: https://files.pythonhosted.org/packages/50/89/354cc56cf4dd2449715bc9a0f54f3aef3dc700d2d62d1fa5bbea53b13426/charset_normalizer-3.4.0-cp312-cp312-macosx_10_13_x86_64.whl
- name: charset-normalizer
- version: 3.4.0
- sha256: de7376c29d95d6719048c194a9cf1a1b0393fbe8488a22008610b0361d834ecf
- requires_python: '>=3.7.0'
-- pypi: https://files.pythonhosted.org/packages/5b/f0/b5263e8668a4ee9becc2b451ed909e9c27058337fda5b8c49588183c267a/charset_normalizer-3.4.0-cp313-cp313-macosx_11_0_arm64.whl
- name: charset-normalizer
- version: 3.4.0
- sha256: 92a7e36b000bf022ef3dbb9c46bfe2d52c047d5e3f3343f43204263c5addc250
- requires_python: '>=3.7.0'
-- pypi: https://files.pythonhosted.org/packages/65/97/fc9bbc54ee13d33dc54a7fcf17b26368b18505500fc01e228c27b5222d80/charset_normalizer-3.4.0-cp313-cp313-win_amd64.whl
- name: charset-normalizer
- version: 3.4.0
- sha256: 707b82d19e65c9bd28b81dde95249b07bf9f5b90ebe1ef17d9b57473f8a64b7b
- requires_python: '>=3.7.0'
-- pypi: https://files.pythonhosted.org/packages/77/d5/8c982d58144de49f59571f940e329ad6e8615e1e82ef84584c5eeb5e1d72/charset_normalizer-3.4.0-cp311-cp311-macosx_10_9_x86_64.whl
- name: charset-normalizer
- version: 3.4.0
- sha256: c57516e58fd17d03ebe67e181a4e4e2ccab1168f8c2976c6a334d4f819fe5944
- requires_python: '>=3.7.0'
-- pypi: https://files.pythonhosted.org/packages/bf/19/411a64f01ee971bed3231111b69eb56f9331a769072de479eae7de52296d/charset_normalizer-3.4.0-cp311-cp311-macosx_11_0_arm64.whl
- name: charset-normalizer
- version: 3.4.0
- sha256: 6dba5d19c4dfab08e58d5b36304b3f92f3bd5d42c1a3fa37b5ba5cdf6dfcbcee
- requires_python: '>=3.7.0'
-- pypi: https://files.pythonhosted.org/packages/c5/77/3a78bf28bfaa0863f9cfef278dbeadf55efe064eafff8c7c424ae3c4c1bf/charset_normalizer-3.4.0-cp39-cp39-win_amd64.whl
- name: charset-normalizer
- version: 3.4.0
- sha256: 95c3c157765b031331dd4db3c775e58deaee050a3042fcad72cbc4189d7c8dca
- requires_python: '>=3.7.0'
-- pypi: https://files.pythonhosted.org/packages/d1/18/92869d5c0057baa973a3ee2af71573be7b084b3c3d428fe6463ce71167f8/charset_normalizer-3.4.0-cp39-cp39-macosx_10_9_x86_64.whl
- name: charset-normalizer
- version: 3.4.0
- sha256: f28f891ccd15c514a0981f3b9db9aa23d62fe1a99997512b0491d2ed323d229a
- requires_python: '>=3.7.0'
-- pypi: https://files.pythonhosted.org/packages/d6/20/f1d4670a8a723c46be695dff449d86d6092916f9e99c53051954ee33a1bc/charset_normalizer-3.4.0-cp310-cp310-win_amd64.whl
- name: charset-normalizer
- version: 3.4.0
- sha256: 55f56e2ebd4e3bc50442fbc0888c9d8c94e4e06a933804e2af3e89e2f9c1c749
- requires_python: '>=3.7.0'
-- pypi: https://files.pythonhosted.org/packages/d6/27/327904c5a54a7796bb9f36810ec4173d2df5d88b401d2b95ef53111d214e/charset_normalizer-3.4.0-cp39-cp39-macosx_11_0_arm64.whl
- name: charset-normalizer
- version: 3.4.0
- sha256: a8aacce6e2e1edcb6ac625fb0f8c3a9570ccc7bfba1f63419b3769ccf6a00ed0
- requires_python: '>=3.7.0'
-- pypi: https://files.pythonhosted.org/packages/eb/5b/6f10bad0f6461fa272bfbbdf5d0023b5fb9bc6217c92bf068fa5a99820f5/charset_normalizer-3.4.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
- name: charset-normalizer
- version: 3.4.0
- sha256: 3710a9751938947e6327ea9f3ea6332a09bf0ba0c09cae9cb1f250bd1f1549bc
- requires_python: '>=3.7.0'
-- pypi: https://files.pythonhosted.org/packages/f8/01/344ec40cf5d85c1da3c1f57566c59e0c9b56bcc5566c08804a95a6cc8257/charset_normalizer-3.4.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
- name: charset-normalizer
- version: 3.4.0
- sha256: 7f683ddc7eedd742e2889d2bfb96d69573fde1d92fcb811979cdb7165bb9c7d3
- requires_python: '>=3.7.0'
-- pypi: https://files.pythonhosted.org/packages/fa/44/b730e2a2580110ced837ac083d8ad222343c96bb6b66e9e4e706e4d0b6df/charset_normalizer-3.4.0-cp312-cp312-macosx_11_0_arm64.whl
- name: charset-normalizer
- version: 3.4.0
- sha256: 4a51b48f42d9358460b78725283f04bddaf44a9358197b889657deba38f329db
- requires_python: '>=3.7.0'
- conda: https://conda.anaconda.org/conda-forge/noarch/charset-normalizer-3.4.0-pyhd8ed1ab_1.conda
sha256: 63022ee2c6a157a9f980250a66f54bdcdf5abee817348d0f9a74c2441a6fbf0e
md5: 6581a17bba6b948bb60130026404a9d6
@@ -8199,11 +7833,6 @@ packages:
- pkg:pypi/charset-normalizer?source=hash-mapping
size: 47533
timestamp: 1733218182393
-- pypi: https://files.pythonhosted.org/packages/d1/d6/3965ed04c63042e047cb6a3e6ed1a63a35087b6a609aa3a15ed8ac56c221/colorama-0.4.6-py2.py3-none-any.whl
- name: colorama
- version: 0.4.6
- sha256: 4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6
- requires_python: '>=2.7,!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,!=3.6.*'
- conda: https://conda.anaconda.org/conda-forge/noarch/colorama-0.4.6-pyhd8ed1ab_1.conda
sha256: ab29d57dc70786c1269633ba3dff20288b81664d3ff8d21af995742e2bb03287
md5: 962b9857ee8e7018c22f2776ffa0b2d7
@@ -9022,25 +8651,6 @@ packages:
- pkg:pypi/fairlearn?source=hash-mapping
size: 151448
timestamp: 1730711778904
-- pypi: https://files.pythonhosted.org/packages/b9/f8/feced7779d755758a52d1f6635d990b8d98dc0a29fa568bbe0625f18fdf3/filelock-3.16.1-py3-none-any.whl
- name: filelock
- version: 3.16.1
- sha256: 2082e5703d51fbf98ea75855d9d5527e33d8ff23099bec374a134febee6946b0
- requires_dist:
- - furo>=2024.8.6 ; extra == 'docs'
- - sphinx-autodoc-typehints>=2.4.1 ; extra == 'docs'
- - sphinx>=8.0.2 ; extra == 'docs'
- - covdefaults>=2.3 ; extra == 'testing'
- - coverage>=7.6.1 ; extra == 'testing'
- - diff-cover>=9.2 ; extra == 'testing'
- - pytest-asyncio>=0.24 ; extra == 'testing'
- - pytest-cov>=5 ; extra == 'testing'
- - pytest-mock>=3.14 ; extra == 'testing'
- - pytest-timeout>=2.3.1 ; extra == 'testing'
- - pytest>=8.3.3 ; extra == 'testing'
- - virtualenv>=20.26.4 ; extra == 'testing'
- - typing-extensions>=4.12.2 ; python_full_version < '3.11' and extra == 'typing'
- requires_python: '>=3.8'
- conda: https://conda.anaconda.org/conda-forge/noarch/filelock-3.16.1-pyhd8ed1ab_0.conda
sha256: 1da766da9dba05091af87977922fe60dc7464091a9ccffb3765d403189d39be4
md5: 916f8ec5dd4128cd5f207a3c4c07b2c6
@@ -9703,114 +9313,6 @@ packages:
purls: []
size: 64567
timestamp: 1604417122064
-- pypi: https://files.pythonhosted.org/packages/c6/b2/454d6e7f0158951d8a78c2e1eb4f69ae81beb8dca5fee9809c6c99e9d0d0/fsspec-2024.10.0-py3-none-any.whl
- name: fsspec
- version: 2024.10.0
- sha256: 03b9a6785766a4de40368b88906366755e2819e758b83705c88cd7cb5fe81871
- requires_dist:
- - adlfs ; extra == 'abfs'
- - adlfs ; extra == 'adl'
- - pyarrow>=1 ; extra == 'arrow'
- - dask ; extra == 'dask'
- - distributed ; extra == 'dask'
- - pre-commit ; extra == 'dev'
- - ruff ; extra == 'dev'
- - numpydoc ; extra == 'doc'
- - sphinx ; extra == 'doc'
- - sphinx-design ; extra == 'doc'
- - sphinx-rtd-theme ; extra == 'doc'
- - yarl ; extra == 'doc'
- - dropbox ; extra == 'dropbox'
- - dropboxdrivefs ; extra == 'dropbox'
- - requests ; extra == 'dropbox'
- - adlfs ; extra == 'full'
- - aiohttp!=4.0.0a0,!=4.0.0a1 ; extra == 'full'
- - dask ; extra == 'full'
- - distributed ; extra == 'full'
- - dropbox ; extra == 'full'
- - dropboxdrivefs ; extra == 'full'
- - fusepy ; extra == 'full'
- - gcsfs ; extra == 'full'
- - libarchive-c ; extra == 'full'
- - ocifs ; extra == 'full'
- - panel ; extra == 'full'
- - paramiko ; extra == 'full'
- - pyarrow>=1 ; extra == 'full'
- - pygit2 ; extra == 'full'
- - requests ; extra == 'full'
- - s3fs ; extra == 'full'
- - smbprotocol ; extra == 'full'
- - tqdm ; extra == 'full'
- - fusepy ; extra == 'fuse'
- - gcsfs ; extra == 'gcs'
- - pygit2 ; extra == 'git'
- - requests ; extra == 'github'
- - gcsfs ; extra == 'gs'
- - panel ; extra == 'gui'
- - pyarrow>=1 ; extra == 'hdfs'
- - aiohttp!=4.0.0a0,!=4.0.0a1 ; extra == 'http'
- - libarchive-c ; extra == 'libarchive'
- - ocifs ; extra == 'oci'
- - s3fs ; extra == 's3'
- - paramiko ; extra == 'sftp'
- - smbprotocol ; extra == 'smb'
- - paramiko ; extra == 'ssh'
- - aiohttp!=4.0.0a0,!=4.0.0a1 ; extra == 'test'
- - numpy ; extra == 'test'
- - pytest ; extra == 'test'
- - pytest-asyncio!=0.22.0 ; extra == 'test'
- - pytest-benchmark ; extra == 'test'
- - pytest-cov ; extra == 'test'
- - pytest-mock ; extra == 'test'
- - pytest-recording ; extra == 'test'
- - pytest-rerunfailures ; extra == 'test'
- - requests ; extra == 'test'
- - aiobotocore>=2.5.4,<3.0.0 ; extra == 'test-downstream'
- - dask-expr ; extra == 'test-downstream'
- - dask[dataframe,test] ; extra == 'test-downstream'
- - moto[server]>4,<5 ; extra == 'test-downstream'
- - pytest-timeout ; extra == 'test-downstream'
- - xarray ; extra == 'test-downstream'
- - adlfs ; extra == 'test-full'
- - aiohttp!=4.0.0a0,!=4.0.0a1 ; extra == 'test-full'
- - cloudpickle ; extra == 'test-full'
- - dask ; extra == 'test-full'
- - distributed ; extra == 'test-full'
- - dropbox ; extra == 'test-full'
- - dropboxdrivefs ; extra == 'test-full'
- - fastparquet ; extra == 'test-full'
- - fusepy ; extra == 'test-full'
- - gcsfs ; extra == 'test-full'
- - jinja2 ; extra == 'test-full'
- - kerchunk ; extra == 'test-full'
- - libarchive-c ; extra == 'test-full'
- - lz4 ; extra == 'test-full'
- - notebook ; extra == 'test-full'
- - numpy ; extra == 'test-full'
- - ocifs ; extra == 'test-full'
- - pandas ; extra == 'test-full'
- - panel ; extra == 'test-full'
- - paramiko ; extra == 'test-full'
- - pyarrow ; extra == 'test-full'
- - pyarrow>=1 ; extra == 'test-full'
- - pyftpdlib ; extra == 'test-full'
- - pygit2 ; extra == 'test-full'
- - pytest ; extra == 'test-full'
- - pytest-asyncio!=0.22.0 ; extra == 'test-full'
- - pytest-benchmark ; extra == 'test-full'
- - pytest-cov ; extra == 'test-full'
- - pytest-mock ; extra == 'test-full'
- - pytest-recording ; extra == 'test-full'
- - pytest-rerunfailures ; extra == 'test-full'
- - python-snappy ; extra == 'test-full'
- - requests ; extra == 'test-full'
- - smbprotocol ; extra == 'test-full'
- - tqdm ; extra == 'test-full'
- - urllib3 ; extra == 'test-full'
- - zarr ; extra == 'test-full'
- - zstandard ; extra == 'test-full'
- - tqdm ; extra == 'tqdm'
- requires_python: '>=3.8'
- conda: https://conda.anaconda.org/conda-forge/noarch/future-1.0.0-pyhd8ed1ab_0.conda
sha256: 8c918a63595ae01575b738ddf0bff10dc23a5002d4af4c8b445d1179a76a8efd
md5: 650a7807e689642dddd3590eb817beed
@@ -10373,116 +9875,6 @@ packages:
- pkg:pypi/hpack?source=hash-mapping
size: 25341
timestamp: 1598856368685
-- pypi: https://files.pythonhosted.org/packages/95/9b/3068fb3ae0b498eb66960ca5f4d92a81c91458cacd4dc17bfa6d40ce90fb/huggingface_hub-0.26.3-py3-none-any.whl
- name: huggingface-hub
- version: 0.26.3
- sha256: e66aa99e569c2d5419240a9e553ad07245a5b1300350bfbc5a4945cf7432991b
- requires_dist:
- - filelock
- - fsspec>=2023.5.0
- - packaging>=20.9
- - pyyaml>=5.1
- - requests
- - tqdm>=4.42.1
- - typing-extensions>=3.7.4.3
- - inquirerpy==0.3.4 ; extra == 'all'
- - aiohttp ; extra == 'all'
- - jedi ; extra == 'all'
- - jinja2 ; extra == 'all'
- - pytest>=8.1.1,<8.2.2 ; extra == 'all'
- - pytest-cov ; extra == 'all'
- - pytest-env ; extra == 'all'
- - pytest-xdist ; extra == 'all'
- - pytest-vcr ; extra == 'all'
- - pytest-asyncio ; extra == 'all'
- - pytest-rerunfailures ; extra == 'all'
- - pytest-mock ; extra == 'all'
- - urllib3<2.0 ; extra == 'all'
- - soundfile ; extra == 'all'
- - pillow ; extra == 'all'
- - gradio>=4.0.0 ; extra == 'all'
- - numpy ; extra == 'all'
- - fastapi ; extra == 'all'
- - ruff>=0.5.0 ; extra == 'all'
- - mypy==1.5.1 ; extra == 'all'
- - libcst==1.4.0 ; extra == 'all'
- - typing-extensions>=4.8.0 ; extra == 'all'
- - types-pyyaml ; extra == 'all'
- - types-requests ; extra == 'all'
- - types-simplejson ; extra == 'all'
- - types-toml ; extra == 'all'
- - types-tqdm ; extra == 'all'
- - types-urllib3 ; extra == 'all'
- - inquirerpy==0.3.4 ; extra == 'cli'
- - inquirerpy==0.3.4 ; extra == 'dev'
- - aiohttp ; extra == 'dev'
- - jedi ; extra == 'dev'
- - jinja2 ; extra == 'dev'
- - pytest>=8.1.1,<8.2.2 ; extra == 'dev'
- - pytest-cov ; extra == 'dev'
- - pytest-env ; extra == 'dev'
- - pytest-xdist ; extra == 'dev'
- - pytest-vcr ; extra == 'dev'
- - pytest-asyncio ; extra == 'dev'
- - pytest-rerunfailures ; extra == 'dev'
- - pytest-mock ; extra == 'dev'
- - urllib3<2.0 ; extra == 'dev'
- - soundfile ; extra == 'dev'
- - pillow ; extra == 'dev'
- - gradio>=4.0.0 ; extra == 'dev'
- - numpy ; extra == 'dev'
- - fastapi ; extra == 'dev'
- - ruff>=0.5.0 ; extra == 'dev'
- - mypy==1.5.1 ; extra == 'dev'
- - libcst==1.4.0 ; extra == 'dev'
- - typing-extensions>=4.8.0 ; extra == 'dev'
- - types-pyyaml ; extra == 'dev'
- - types-requests ; extra == 'dev'
- - types-simplejson ; extra == 'dev'
- - types-toml ; extra == 'dev'
- - types-tqdm ; extra == 'dev'
- - types-urllib3 ; extra == 'dev'
- - toml ; extra == 'fastai'
- - fastai>=2.4 ; extra == 'fastai'
- - fastcore>=1.3.27 ; extra == 'fastai'
- - hf-transfer>=0.1.4 ; extra == 'hf-transfer'
- - aiohttp ; extra == 'inference'
- - ruff>=0.5.0 ; extra == 'quality'
- - mypy==1.5.1 ; extra == 'quality'
- - libcst==1.4.0 ; extra == 'quality'
- - tensorflow ; extra == 'tensorflow'
- - pydot ; extra == 'tensorflow'
- - graphviz ; extra == 'tensorflow'
- - tensorflow ; extra == 'tensorflow-testing'
- - keras<3.0 ; extra == 'tensorflow-testing'
- - inquirerpy==0.3.4 ; extra == 'testing'
- - aiohttp ; extra == 'testing'
- - jedi ; extra == 'testing'
- - jinja2 ; extra == 'testing'
- - pytest>=8.1.1,<8.2.2 ; extra == 'testing'
- - pytest-cov ; extra == 'testing'
- - pytest-env ; extra == 'testing'
- - pytest-xdist ; extra == 'testing'
- - pytest-vcr ; extra == 'testing'
- - pytest-asyncio ; extra == 'testing'
- - pytest-rerunfailures ; extra == 'testing'
- - pytest-mock ; extra == 'testing'
- - urllib3<2.0 ; extra == 'testing'
- - soundfile ; extra == 'testing'
- - pillow ; extra == 'testing'
- - gradio>=4.0.0 ; extra == 'testing'
- - numpy ; extra == 'testing'
- - fastapi ; extra == 'testing'
- - torch ; extra == 'torch'
- - safetensors[torch] ; extra == 'torch'
- - typing-extensions>=4.8.0 ; extra == 'typing'
- - types-pyyaml ; extra == 'typing'
- - types-requests ; extra == 'typing'
- - types-simplejson ; extra == 'typing'
- - types-toml ; extra == 'typing'
- - types-tqdm ; extra == 'typing'
- - types-urllib3 ; extra == 'typing'
- requires_python: '>=3.8.0'
- conda: https://conda.anaconda.org/conda-forge/noarch/hyperframe-6.0.1-pyhd8ed1ab_0.tar.bz2
sha256: e374a9d0f53149328134a8d86f5d72bca4c6dcebed3c0ecfa968c02996289330
md5: 9f765cbfab6870c8435b9eefecd7a1f4
@@ -10550,16 +9942,6 @@ packages:
- pkg:pypi/identify?source=hash-mapping
size: 78352
timestamp: 1732589463054
-- pypi: https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl
- name: idna
- version: '3.10'
- sha256: 946d195a0d259cbba61165e88e65941f16e9b36ea6ddb97f00452bae8b1287d3
- requires_dist:
- - ruff>=0.6.2 ; extra == 'all'
- - mypy>=1.11.2 ; extra == 'all'
- - pytest>=8.3.2 ; extra == 'all'
- - flake8>=7.1.1 ; extra == 'all'
- requires_python: '>=3.6'
- conda: https://conda.anaconda.org/conda-forge/noarch/idna-3.10-pyhd8ed1ab_1.conda
sha256: d7a472c9fd479e2e8dcb83fb8d433fce971ea369d704ece380e876f9c3494e87
md5: 39a4f67be3286c86d696df570b1201b7
@@ -15189,8 +14571,6 @@ packages:
- python-dateutil >=2.8.1
- python_abi 3.10.* *_cp310
- pytz >=2020.1
- arch: x86_64
- platform: linux
license: BSD-3-Clause
license_family: BSD
purls:
@@ -15304,8 +14684,6 @@ packages:
- python-dateutil >=2.8.1
- python_abi 3.10.* *_cp310
- pytz >=2020.1
- arch: x86_64
- platform: osx
license: BSD-3-Clause
license_family: BSD
purls:
@@ -15415,8 +14793,6 @@ packages:
- python-dateutil >=2.8.1
- python_abi 3.10.* *_cp310
- pytz >=2020.1
- arch: arm64
- platform: osx
license: BSD-3-Clause
license_family: BSD
purls:
@@ -15532,8 +14908,6 @@ packages:
- ucrt >=10.0.20348.0
- vc >=14.2,<15
- vs2015_runtime >=14.29.30139
- arch: x86_64
- platform: win
license: BSD-3-Clause
license_family: BSD
purls:
@@ -17766,26 +17140,6 @@ packages:
- pkg:pypi/pytz?source=hash-mapping
size: 185890
timestamp: 1733215766006
-- pypi: https://files.pythonhosted.org/packages/a8/0c/38374f5bb272c051e2a69281d71cba6fdb983413e6758b84482905e29a5d/PyYAML-6.0.2-cp312-cp312-macosx_11_0_arm64.whl
- name: pyyaml
- version: 6.0.2
- sha256: ce826d6ef20b1bc864f0a68340c8b3287705cae2f8b4b1d932177dcc76721725
- requires_python: '>=3.8'
-- pypi: https://files.pythonhosted.org/packages/b9/2b/614b4752f2e127db5cc206abc23a8c19678e92b23c3db30fc86ab731d3bd/PyYAML-6.0.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
- name: pyyaml
- version: 6.0.2
- sha256: 80bab7bfc629882493af4aa31a4cfa43a4c57c83813253626916b8c7ada83476
- requires_python: '>=3.8'
-- pypi: https://files.pythonhosted.org/packages/ef/e3/3af305b830494fa85d95f6d95ef7fa73f2ee1cc8ef5b495c7c3269fb835f/PyYAML-6.0.2-cp313-cp313-macosx_10_13_x86_64.whl
- name: pyyaml
- version: 6.0.2
- sha256: efdca5630322a10774e8e98e1af481aad470dd62c3170801852d752aa7a783ba
- requires_python: '>=3.8'
-- pypi: https://files.pythonhosted.org/packages/fa/de/02b54f42487e3d3c6efb3f89428677074ca7bf43aae402517bc7cca949f3/PyYAML-6.0.2-cp313-cp313-win_amd64.whl
- name: pyyaml
- version: 6.0.2
- sha256: 8388ee1976c416731879ac16da0aff3f63b286ffdd57cdeb95f3f2e085687563
- requires_python: '>=3.8'
- conda: https://conda.anaconda.org/conda-forge/linux-64/pyyaml-6.0.2-py310ha75aee5_1.conda
sha256: bf6002aef0fd9753fa6de54e82307b2d7e67a1d701dba018869471426078d5d1
md5: 0d4c5c76ae5f5aac6f0be419963a19dd
@@ -18486,18 +17840,6 @@ packages:
purls: []
size: 250351
timestamp: 1679532511311
-- pypi: https://files.pythonhosted.org/packages/f9/9b/335f9764261e915ed497fcdeb11df5dfd6f7bf257d4a6a2a686d80da4d54/requests-2.32.3-py3-none-any.whl
- name: requests
- version: 2.32.3
- sha256: 70761cfe03c773ceb22aa2f671b4757976145175cdfca038c02654d061d6dcc6
- requires_dist:
- - charset-normalizer>=2,<4
- - idna>=2.5,<4
- - urllib3>=1.21.1,<3
- - certifi>=2017.4.17
- - pysocks>=1.5.6,!=1.5.7 ; extra == 'socks'
- - chardet>=3.0.2,<6 ; extra == 'use-chardet-on-py3'
- requires_python: '>=3.8'
- conda: https://conda.anaconda.org/conda-forge/noarch/requests-2.32.3-pyhd8ed1ab_1.conda
sha256: d701ca1136197aa121bbbe0e8c18db6b5c94acbd041c2b43c70e5ae104e1d8ad
md5: a9b9368f3701a417eac9edbcae7cb737
@@ -18804,11 +18146,11 @@ packages:
- joblib>=1.2.0 ; extra == 'install'
- threadpoolctl>=3.1.0 ; extra == 'install'
- matplotlib>=3.3.4 ; extra == 'benchmark'
- - pandas>=1.1.5 ; extra == 'benchmark'
+ - pandas>=1.2.0 ; extra == 'benchmark'
- memory-profiler>=0.57.0 ; extra == 'benchmark'
- matplotlib>=3.3.4 ; extra == 'docs'
- scikit-image>=0.17.2 ; extra == 'docs'
- - pandas>=1.1.5 ; extra == 'docs'
+ - pandas>=1.2.0 ; extra == 'docs'
- seaborn>=0.9.0 ; extra == 'docs'
- memory-profiler>=0.57.0 ; extra == 'docs'
- sphinx>=7.3.7 ; extra == 'docs'
@@ -18829,13 +18171,13 @@ packages:
- towncrier>=24.8.0 ; extra == 'docs'
- matplotlib>=3.3.4 ; extra == 'examples'
- scikit-image>=0.17.2 ; extra == 'examples'
- - pandas>=1.1.5 ; extra == 'examples'
+ - pandas>=1.2.0 ; extra == 'examples'
- seaborn>=0.9.0 ; extra == 'examples'
- pooch>=1.6.0 ; extra == 'examples'
- plotly>=5.14.0 ; extra == 'examples'
- matplotlib>=3.3.4 ; extra == 'tests'
- scikit-image>=0.17.2 ; extra == 'tests'
- - pandas>=1.1.5 ; extra == 'tests'
+ - pandas>=1.2.0 ; extra == 'tests'
- pytest>=7.1.2 ; extra == 'tests'
- pytest-cov>=2.9.0 ; extra == 'tests'
- ruff>=0.5.1 ; extra == 'tests'
@@ -18846,7 +18188,7 @@ packages:
- pyarrow>=12.0.0 ; extra == 'tests'
- numpydoc>=1.2.0 ; extra == 'tests'
- pooch>=1.6.0 ; extra == 'tests'
- - conda-lock==2.5.6 ; extra == 'maintenance'
+ - conda-lock==2.5.7 ; extra == 'maintenance'
requires_python: '>=3.9'
- pypi: https://pypi.anaconda.org/scientific-python-nightly-wheels/simple/scikit-learn/1.7.dev0/scikit_learn-1.7.dev0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
name: scikit-learn
@@ -18865,11 +18207,11 @@ packages:
- joblib>=1.2.0 ; extra == 'install'
- threadpoolctl>=3.1.0 ; extra == 'install'
- matplotlib>=3.3.4 ; extra == 'benchmark'
- - pandas>=1.1.5 ; extra == 'benchmark'
+ - pandas>=1.2.0 ; extra == 'benchmark'
- memory-profiler>=0.57.0 ; extra == 'benchmark'
- matplotlib>=3.3.4 ; extra == 'docs'
- scikit-image>=0.17.2 ; extra == 'docs'
- - pandas>=1.1.5 ; extra == 'docs'
+ - pandas>=1.2.0 ; extra == 'docs'
- seaborn>=0.9.0 ; extra == 'docs'
- memory-profiler>=0.57.0 ; extra == 'docs'
- sphinx>=7.3.7 ; extra == 'docs'
@@ -18890,13 +18232,13 @@ packages:
- towncrier>=24.8.0 ; extra == 'docs'
- matplotlib>=3.3.4 ; extra == 'examples'
- scikit-image>=0.17.2 ; extra == 'examples'
- - pandas>=1.1.5 ; extra == 'examples'
+ - pandas>=1.2.0 ; extra == 'examples'
- seaborn>=0.9.0 ; extra == 'examples'
- pooch>=1.6.0 ; extra == 'examples'
- plotly>=5.14.0 ; extra == 'examples'
- matplotlib>=3.3.4 ; extra == 'tests'
- scikit-image>=0.17.2 ; extra == 'tests'
- - pandas>=1.1.5 ; extra == 'tests'
+ - pandas>=1.2.0 ; extra == 'tests'
- pytest>=7.1.2 ; extra == 'tests'
- pytest-cov>=2.9.0 ; extra == 'tests'
- ruff>=0.5.1 ; extra == 'tests'
@@ -18907,7 +18249,7 @@ packages:
- pyarrow>=12.0.0 ; extra == 'tests'
- numpydoc>=1.2.0 ; extra == 'tests'
- pooch>=1.6.0 ; extra == 'tests'
- - conda-lock==2.5.6 ; extra == 'maintenance'
+ - conda-lock==2.5.7 ; extra == 'maintenance'
requires_python: '>=3.9'
- pypi: https://pypi.anaconda.org/scientific-python-nightly-wheels/simple/scikit-learn/1.7.dev0/scikit_learn-1.7.dev0-cp313-cp313-macosx_10_13_x86_64.whl
name: scikit-learn
@@ -18926,11 +18268,11 @@ packages:
- joblib>=1.2.0 ; extra == 'install'
- threadpoolctl>=3.1.0 ; extra == 'install'
- matplotlib>=3.3.4 ; extra == 'benchmark'
- - pandas>=1.1.5 ; extra == 'benchmark'
+ - pandas>=1.2.0 ; extra == 'benchmark'
- memory-profiler>=0.57.0 ; extra == 'benchmark'
- matplotlib>=3.3.4 ; extra == 'docs'
- scikit-image>=0.17.2 ; extra == 'docs'
- - pandas>=1.1.5 ; extra == 'docs'
+ - pandas>=1.2.0 ; extra == 'docs'
- seaborn>=0.9.0 ; extra == 'docs'
- memory-profiler>=0.57.0 ; extra == 'docs'
- sphinx>=7.3.7 ; extra == 'docs'
@@ -18951,13 +18293,13 @@ packages:
- towncrier>=24.8.0 ; extra == 'docs'
- matplotlib>=3.3.4 ; extra == 'examples'
- scikit-image>=0.17.2 ; extra == 'examples'
- - pandas>=1.1.5 ; extra == 'examples'
+ - pandas>=1.2.0 ; extra == 'examples'
- seaborn>=0.9.0 ; extra == 'examples'
- pooch>=1.6.0 ; extra == 'examples'
- plotly>=5.14.0 ; extra == 'examples'
- matplotlib>=3.3.4 ; extra == 'tests'
- scikit-image>=0.17.2 ; extra == 'tests'
- - pandas>=1.1.5 ; extra == 'tests'
+ - pandas>=1.2.0 ; extra == 'tests'
- pytest>=7.1.2 ; extra == 'tests'
- pytest-cov>=2.9.0 ; extra == 'tests'
- ruff>=0.5.1 ; extra == 'tests'
@@ -18968,7 +18310,7 @@ packages:
- pyarrow>=12.0.0 ; extra == 'tests'
- numpydoc>=1.2.0 ; extra == 'tests'
- pooch>=1.6.0 ; extra == 'tests'
- - conda-lock==2.5.6 ; extra == 'maintenance'
+ - conda-lock==2.5.7 ; extra == 'maintenance'
requires_python: '>=3.9'
- pypi: https://pypi.anaconda.org/scientific-python-nightly-wheels/simple/scikit-learn/1.7.dev0/scikit_learn-1.7.dev0-cp313-cp313-macosx_12_0_arm64.whl
name: scikit-learn
@@ -18987,11 +18329,11 @@ packages:
- joblib>=1.2.0 ; extra == 'install'
- threadpoolctl>=3.1.0 ; extra == 'install'
- matplotlib>=3.3.4 ; extra == 'benchmark'
- - pandas>=1.1.5 ; extra == 'benchmark'
+ - pandas>=1.2.0 ; extra == 'benchmark'
- memory-profiler>=0.57.0 ; extra == 'benchmark'
- matplotlib>=3.3.4 ; extra == 'docs'
- scikit-image>=0.17.2 ; extra == 'docs'
- - pandas>=1.1.5 ; extra == 'docs'
+ - pandas>=1.2.0 ; extra == 'docs'
- seaborn>=0.9.0 ; extra == 'docs'
- memory-profiler>=0.57.0 ; extra == 'docs'
- sphinx>=7.3.7 ; extra == 'docs'
@@ -19012,13 +18354,13 @@ packages:
- towncrier>=24.8.0 ; extra == 'docs'
- matplotlib>=3.3.4 ; extra == 'examples'
- scikit-image>=0.17.2 ; extra == 'examples'
- - pandas>=1.1.5 ; extra == 'examples'
+ - pandas>=1.2.0 ; extra == 'examples'
- seaborn>=0.9.0 ; extra == 'examples'
- pooch>=1.6.0 ; extra == 'examples'
- plotly>=5.14.0 ; extra == 'examples'
- matplotlib>=3.3.4 ; extra == 'tests'
- scikit-image>=0.17.2 ; extra == 'tests'
- - pandas>=1.1.5 ; extra == 'tests'
+ - pandas>=1.2.0 ; extra == 'tests'
- pytest>=7.1.2 ; extra == 'tests'
- pytest-cov>=2.9.0 ; extra == 'tests'
- ruff>=0.5.1 ; extra == 'tests'
@@ -19029,7 +18371,7 @@ packages:
- pyarrow>=12.0.0 ; extra == 'tests'
- numpydoc>=1.2.0 ; extra == 'tests'
- pooch>=1.6.0 ; extra == 'tests'
- - conda-lock==2.5.6 ; extra == 'maintenance'
+ - conda-lock==2.5.7 ; extra == 'maintenance'
requires_python: '>=3.9'
- pypi: https://pypi.anaconda.org/scientific-python-nightly-wheels/simple/scikit-learn/1.7.dev0/scikit_learn-1.7.dev0-cp313-cp313-win_amd64.whl
name: scikit-learn
@@ -19048,11 +18390,11 @@ packages:
- joblib>=1.2.0 ; extra == 'install'
- threadpoolctl>=3.1.0 ; extra == 'install'
- matplotlib>=3.3.4 ; extra == 'benchmark'
- - pandas>=1.1.5 ; extra == 'benchmark'
+ - pandas>=1.2.0 ; extra == 'benchmark'
- memory-profiler>=0.57.0 ; extra == 'benchmark'
- matplotlib>=3.3.4 ; extra == 'docs'
- scikit-image>=0.17.2 ; extra == 'docs'
- - pandas>=1.1.5 ; extra == 'docs'
+ - pandas>=1.2.0 ; extra == 'docs'
- seaborn>=0.9.0 ; extra == 'docs'
- memory-profiler>=0.57.0 ; extra == 'docs'
- sphinx>=7.3.7 ; extra == 'docs'
@@ -19073,13 +18415,13 @@ packages:
- towncrier>=24.8.0 ; extra == 'docs'
- matplotlib>=3.3.4 ; extra == 'examples'
- scikit-image>=0.17.2 ; extra == 'examples'
- - pandas>=1.1.5 ; extra == 'examples'
+ - pandas>=1.2.0 ; extra == 'examples'
- seaborn>=0.9.0 ; extra == 'examples'
- pooch>=1.6.0 ; extra == 'examples'
- plotly>=5.14.0 ; extra == 'examples'
- matplotlib>=3.3.4 ; extra == 'tests'
- scikit-image>=0.17.2 ; extra == 'tests'
- - pandas>=1.1.5 ; extra == 'tests'
+ - pandas>=1.2.0 ; extra == 'tests'
- pytest>=7.1.2 ; extra == 'tests'
- pytest-cov>=2.9.0 ; extra == 'tests'
- ruff>=0.5.1 ; extra == 'tests'
@@ -19090,7 +18432,7 @@ packages:
- pyarrow>=12.0.0 ; extra == 'tests'
- numpydoc>=1.2.0 ; extra == 'tests'
- pooch>=1.6.0 ; extra == 'tests'
- - conda-lock==2.5.6 ; extra == 'maintenance'
+ - conda-lock==2.5.7 ; extra == 'maintenance'
requires_python: '>=3.9'
- conda: https://conda.anaconda.org/conda-forge/label/scikit-learn_rc/linux-64/scikit-learn-1.6.0rc1-py313h55ffb2e_0.conda
sha256: fcf11cc4238ef0f2430a343d1149f2abc10a77c316f7e0474a20536b830aa6eb
@@ -19759,9 +19101,9 @@ packages:
- doit>=0.36.0 ; extra == 'dev'
- pydevtool ; extra == 'dev'
requires_python: '>=3.10'
-- pypi: https://pypi.anaconda.org/scientific-python-nightly-wheels/simple/scipy/1.15.0.dev0/scipy-1.15.0.dev0-cp312-cp312-macosx_12_0_arm64.whl
+- pypi: https://pypi.anaconda.org/scientific-python-nightly-wheels/simple/scipy/1.16.0.dev0/scipy-1.16.0.dev0-cp312-cp312-macosx_12_0_arm64.whl
name: scipy
- version: 1.15.0.dev0
+ version: 1.16.0.dev0
requires_dist:
- numpy>=1.23.5
- pytest ; extra == 'test'
@@ -19789,7 +19131,7 @@ packages:
- jupytext ; extra == 'doc'
- myst-nb ; extra == 'doc'
- pooch ; extra == 'doc'
- - jupyterlite-sphinx>=0.16.5 ; extra == 'doc'
+ - jupyterlite-sphinx>=0.17.1 ; extra == 'doc'
- jupyterlite-pyodide-kernel ; extra == 'doc'
- mypy==1.10.0 ; extra == 'dev'
- typing-extensions ; extra == 'dev'
@@ -19801,9 +19143,9 @@ packages:
- doit>=0.36.0 ; extra == 'dev'
- pydevtool ; extra == 'dev'
requires_python: '>=3.10'
-- pypi: https://pypi.anaconda.org/scientific-python-nightly-wheels/simple/scipy/1.15.0.dev0/scipy-1.15.0.dev0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
+- pypi: https://pypi.anaconda.org/scientific-python-nightly-wheels/simple/scipy/1.16.0.dev0/scipy-1.16.0.dev0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
name: scipy
- version: 1.15.0.dev0
+ version: 1.16.0.dev0
requires_dist:
- numpy>=1.23.5
- pytest ; extra == 'test'
@@ -19831,7 +19173,7 @@ packages:
- jupytext ; extra == 'doc'
- myst-nb ; extra == 'doc'
- pooch ; extra == 'doc'
- - jupyterlite-sphinx>=0.16.5 ; extra == 'doc'
+ - jupyterlite-sphinx>=0.17.1 ; extra == 'doc'
- jupyterlite-pyodide-kernel ; extra == 'doc'
- mypy==1.10.0 ; extra == 'dev'
- typing-extensions ; extra == 'dev'
@@ -19843,9 +19185,9 @@ packages:
- doit>=0.36.0 ; extra == 'dev'
- pydevtool ; extra == 'dev'
requires_python: '>=3.10'
-- pypi: https://pypi.anaconda.org/scientific-python-nightly-wheels/simple/scipy/1.15.0.dev0/scipy-1.15.0.dev0-cp313-cp313-macosx_10_13_x86_64.whl
+- pypi: https://pypi.anaconda.org/scientific-python-nightly-wheels/simple/scipy/1.16.0.dev0/scipy-1.16.0.dev0-cp313-cp313-macosx_10_13_x86_64.whl
name: scipy
- version: 1.15.0.dev0
+ version: 1.16.0.dev0
requires_dist:
- numpy>=1.23.5
- pytest ; extra == 'test'
@@ -19873,7 +19215,7 @@ packages:
- jupytext ; extra == 'doc'
- myst-nb ; extra == 'doc'
- pooch ; extra == 'doc'
- - jupyterlite-sphinx>=0.16.5 ; extra == 'doc'
+ - jupyterlite-sphinx>=0.17.1 ; extra == 'doc'
- jupyterlite-pyodide-kernel ; extra == 'doc'
- mypy==1.10.0 ; extra == 'dev'
- typing-extensions ; extra == 'dev'
@@ -19885,9 +19227,9 @@ packages:
- doit>=0.36.0 ; extra == 'dev'
- pydevtool ; extra == 'dev'
requires_python: '>=3.10'
-- pypi: https://pypi.anaconda.org/scientific-python-nightly-wheels/simple/scipy/1.15.0.dev0/scipy-1.15.0.dev0-cp313-cp313-macosx_12_0_arm64.whl
+- pypi: https://pypi.anaconda.org/scientific-python-nightly-wheels/simple/scipy/1.16.0.dev0/scipy-1.16.0.dev0-cp313-cp313-macosx_12_0_arm64.whl
name: scipy
- version: 1.15.0.dev0
+ version: 1.16.0.dev0
requires_dist:
- numpy>=1.23.5
- pytest ; extra == 'test'
@@ -19915,7 +19257,7 @@ packages:
- jupytext ; extra == 'doc'
- myst-nb ; extra == 'doc'
- pooch ; extra == 'doc'
- - jupyterlite-sphinx>=0.16.5 ; extra == 'doc'
+ - jupyterlite-sphinx>=0.17.1 ; extra == 'doc'
- jupyterlite-pyodide-kernel ; extra == 'doc'
- mypy==1.10.0 ; extra == 'dev'
- typing-extensions ; extra == 'dev'
@@ -19927,9 +19269,9 @@ packages:
- doit>=0.36.0 ; extra == 'dev'
- pydevtool ; extra == 'dev'
requires_python: '>=3.10'
-- pypi: https://pypi.anaconda.org/scientific-python-nightly-wheels/simple/scipy/1.15.0.dev0/scipy-1.15.0.dev0-cp313-cp313-win_amd64.whl
+- pypi: https://pypi.anaconda.org/scientific-python-nightly-wheels/simple/scipy/1.16.0.dev0/scipy-1.16.0.dev0-cp313-cp313-win_amd64.whl
name: scipy
- version: 1.15.0.dev0
+ version: 1.16.0.dev0
requires_dist:
- numpy>=1.23.5
- pytest ; extra == 'test'
@@ -19957,7 +19299,7 @@ packages:
- jupytext ; extra == 'doc'
- myst-nb ; extra == 'doc'
- pooch ; extra == 'doc'
- - jupyterlite-sphinx>=0.16.5 ; extra == 'doc'
+ - jupyterlite-sphinx>=0.17.1 ; extra == 'doc'
- jupyterlite-pyodide-kernel ; extra == 'doc'
- mypy==1.10.0 ; extra == 'dev'
- typing-extensions ; extra == 'dev'
@@ -20470,10 +19812,9 @@ packages:
timestamp: 1733216901349
- pypi: .
name: skops
- version: 0.11.dev0
- sha256: dcbfbe445939c6a829b254426dd38dc111bdc8b60190c209c354253ffab5c76e
+ version: 0.12.dev0
+ sha256: 35c757b6b516b1ffd6c44d851692b2278f893c8df5808a2f6266d3f542616c4a
requires_dist:
- - huggingface-hub>=0.17.0
- packaging>=17.0
- scikit-learn>=1.1
- tabulate>=0.8.8
@@ -21081,22 +20422,6 @@ packages:
- pkg:pypi/tornado?source=hash-mapping
size: 645144
timestamp: 1732616217328
-- pypi: https://files.pythonhosted.org/packages/d0/30/dc54f88dd4a2b5dc8a0279bdd7270e735851848b762aeb1c1184ed1f6b14/tqdm-4.67.1-py3-none-any.whl
- name: tqdm
- version: 4.67.1
- sha256: 26445eca388f82e72884e0d580d5464cd801a3ea01e63e5601bdff9ba6a48de2
- requires_dist:
- - colorama ; platform_system == 'Windows'
- - pytest>=6 ; extra == 'dev'
- - pytest-cov ; extra == 'dev'
- - pytest-timeout ; extra == 'dev'
- - pytest-asyncio>=0.24 ; extra == 'dev'
- - nbval ; extra == 'dev'
- - requests ; extra == 'discord'
- - slack-sdk ; extra == 'slack'
- - requests ; extra == 'telegram'
- - ipywidgets>=6 ; extra == 'notebook'
- requires_python: '>=3.7'
- conda: https://conda.anaconda.org/conda-forge/noarch/traitlets-5.14.3-pyhd8ed1ab_0.conda
sha256: 8a64fa0f19022828513667c2c7176cfd125001f3f4b9bc00d33732e627dd2592
md5: 3df84416a021220d8b5700c613af2dc5
@@ -21707,17 +21032,6 @@ packages:
- pkg:pypi/unicodedata2?source=hash-mapping
size: 365699
timestamp: 1729705064037
-- pypi: https://files.pythonhosted.org/packages/ce/d9/5f4c13cecde62396b0d3fe530a50ccea91e7dfc1ccf0e09c228841bb5ba8/urllib3-2.2.3-py3-none-any.whl
- name: urllib3
- version: 2.2.3
- sha256: ca899ca043dcb1bafa3e262d73aa25c465bfb49e0bd9dd5d59f1d0acba2f8fac
- requires_dist:
- - brotli>=1.0.9 ; platform_python_implementation == 'CPython' and extra == 'brotli'
- - brotlicffi>=0.8.0 ; platform_python_implementation != 'CPython' and extra == 'brotli'
- - h2>=4,<5 ; extra == 'h2'
- - pysocks>=1.5.6,!=1.5.7,<2.0 ; extra == 'socks'
- - zstandard>=0.18.0 ; extra == 'zstd'
- requires_python: '>=3.8'
- conda: https://conda.anaconda.org/conda-forge/noarch/urllib3-2.2.3-pyhd8ed1ab_1.conda
sha256: 416e30a1c3262275f01a3e22e783118d9e9d2872a739a9ed860d06fa9c7593d5
md5: 4a2d8ef7c37b8808c5b9b750501fffce
diff --git a/pyproject.toml b/pyproject.toml
index 830e1b6b..fd69ed0a 100644
--- a/pyproject.toml
+++ b/pyproject.toml
@@ -38,7 +38,6 @@ classifiers = [
license = {file = "LICENSE"}
dependencies = [
"scikit-learn>=1.1",
- "huggingface-hub>=0.17.0",
"tabulate>=0.8.8",
"packaging>=17.0",
]
@@ -90,10 +89,6 @@ filterwarnings = [
# BaseEstimator._validate_data deprecation warning in sklearn 1.6 #TODO can be removed when a new release of quantile-forest is out
"ignore:`BaseEstimator._validate_data` is deprecated in 1.6 and will be removed in 1.7:FutureWarning",
]
-markers = [
- "network: marks tests as requiring internet (deselect with '-m \"not network\"')",
- "inference: marks tests that call inference API (deselect with '-m \"not inference\"')",
-]
addopts = "--cov=skops --cov-report=term-missing --doctest-modules"
[tool.coverage.run]
diff --git a/skops/card/__init__.py b/skops/card/__init__.py
index 0febe42a..bf1a0c47 100644
--- a/skops/card/__init__.py
+++ b/skops/card/__init__.py
@@ -1,4 +1,4 @@
-from ._model_card import Card, metadata_from_config
+from ._model_card import Card
from ._parser import parse_modelcard
-__all__ = ["Card", "metadata_from_config", "parse_modelcard"]
+__all__ = ["Card", "parse_modelcard"]
diff --git a/skops/card/_model_card.py b/skops/card/_model_card.py
index 61e09f0f..efa19b86 100644
--- a/skops/card/_model_card.py
+++ b/skops/card/_model_card.py
@@ -1,6 +1,5 @@
from __future__ import annotations
-import json
import re
import shutil
import sys
@@ -12,10 +11,9 @@
from hashlib import sha256
from pathlib import Path
from reprlib import Repr
-from typing import Any, Iterator, List, Literal, Optional, Sequence, Union
+from typing import Any, Iterator, List, Literal, Optional, Sequence
import joblib
-from huggingface_hub import ModelCardData
from sklearn.utils import estimator_html_repr
from tabulate import tabulate # type: ignore
@@ -55,63 +53,6 @@ def _clean_table(table: str) -> str:
return table
-def metadata_from_config(config_path: Union[str, Path]) -> ModelCardData:
- """Construct a ``ModelCardData`` object from a ``config.json`` file.
-
- Most information needed for the metadata section of a ``README.md`` file on
- Hugging Face Hub is included in the ``config.json`` file. This utility
- function constructs a :class:`huggingface_hub.ModelCardData` object which
- can then be passed to the :class:`~skops.card.Card` object.
-
- This method populates the following attributes of the instance:
-
- - ``library_name``: It needs to be ``"sklearn"`` for scikit-learn
- compatible models.
- - ``tags``: Set to a list, containing ``"sklearn"`` and the task of the
- model. You can then add more tags to this list.
- - ``widget``: It is populated with the example data to be used by the
- widget component of the Hugging Face Hub widget, on the model's
- repository page.
-
- Parameters
- ----------
- config_path: str, or Path
- Filepath to the ``config.json`` file, or the folder including that
- file.
-
- Returns
- -------
- card_data: huggingface_hub.ModelCardData
- :class:`huggingface_hub.ModelCardData` object.
-
- """
- config_path = Path(config_path)
- if not config_path.is_file():
- config_path = config_path / "config.json"
-
- with open(config_path) as f:
- config = json.load(f)
- card_data = ModelCardData(
- model_format=config.get("sklearn", {}).get("model_format", {})
- )
- card_data.library_name = "sklearn"
- card_data.tags = ["sklearn", "skops"]
- task = config.get("sklearn", {}).get("task", None)
- if task:
- card_data.tags += [task]
- card_data.model_file = config.get("sklearn", {}).get("model", {}).get("file") # type: ignore
-
- example_input = config.get("sklearn", {}).get("example_input", None)
- # Documentation on what the widget expects:
- # https://huggingface.co/docs/hub/models-widgets-examples
- if example_input:
- if "tabular" in task:
- card_data.widget = [{"structuredData": example_input}] # type: ignore
- # TODO: add text data example here.
-
- return card_data
-
-
def split_subsection_names(key: str) -> list[str]:
r"""Split a string containing multiple sections into a list of strings for
each.
@@ -148,32 +89,6 @@ def split_subsection_names(key: str) -> list[str]:
return [part.replace(placeholder, "/") for part in parts]
-def _getting_started_code(
- file_name: str, model_format: Literal["pickle", "skops"], indent: str = " "
-) -> list[str]:
- # get lines of code required to load the model
- lines = [
- "import json",
- "import pandas as pd",
- ]
- if model_format == "skops":
- lines += ["import skops.io as sio"]
- else:
- lines += ["import joblib"]
-
- if model_format == "skops":
- lines += [f'model = sio.load("{file_name}")']
- else: # pickle
- lines += [f'model = joblib.load("{file_name}")']
-
- lines += [
- 'with open("config.json") as f:',
- indent + "config = json.load(f)",
- 'model.predict(pd.DataFrame.from_dict(config["sklearn"]["example_input"]))',
- ]
- return lines
-
-
@dataclass
class Section:
"""Building block of the model card.
@@ -375,6 +290,12 @@ class Card:
model: pathlib.Path, str, or sklearn estimator object
``Path``/``str`` of the model or the actual model instance that will be
documented. If a ``Path`` or ``str`` is provided, model will be loaded.
+ Note that a "get started" code block will be added to the card only if
+ the model is a ``Path`` or ``str``.
+
+ model_format: Literal["pickle", "skops"] or None (default=None)
+ The format of the model file. If ``None``, the format will be inferred
+ from the file extension of the model file if possible.
model_diagram: bool or "auto" or str, default="auto"
If using the skops template, setting this to ``True`` or ``"auto"`` will
@@ -389,16 +310,6 @@ class Card:
diagram can, however, always be added later using
:meth:`Card.add_model_plot`.
- metadata: ModelCardData, optional
- :class:`huggingface_hub.ModelCardData` object. The contents of this
- object are saved as metadata at the beginning of the output file, and
- used by Hugging Face Hub.
-
- You can use :func:`~skops.card.metadata_from_config` to create an
- instance pre-populated with necessary information based on the contents
- of the ``config.json`` file, which itself is created by
- :func:`skops.hub_utils.init`.
-
template: "skops", dict, or None (default="skops")
Whether to add default sections or not. The template can be a predefined
template, which at the moment can only be the string ``"skops"``, which
@@ -419,10 +330,6 @@ class Card:
model: estimator object
The scikit-learn compatible model that will be documented.
- metadata: ModelCardData
- Metadata to be stored at the beginning of the saved model card, as
- metadata to be understood by the Hugging Face Hub.
-
Examples
--------
>>> from sklearn.metrics import (
@@ -439,7 +346,6 @@ class Card:
>>> X, y = load_iris(return_X_y=True)
>>> model = LogisticRegression(solver="liblinear", random_state=0).fit(X, y)
>>> model_card = Card(model)
- >>> model_card.metadata.license = "mit"
>>> y_pred = model.predict(X)
>>> model_card.add_metrics(**{
... "accuracy": accuracy_score(y, y_pred),
@@ -480,13 +386,13 @@ class Card:
def __init__(
self,
model,
+ model_format: Literal["pickle", "skops"] | None = None,
model_diagram: bool | Literal["auto"] | str = "auto",
- metadata: ModelCardData | None = None,
template: Literal["skops"] | dict[str, str] | None = "skops",
trusted: Optional[List[str]] = None,
) -> None:
self.model = model
- self.metadata = metadata or ModelCardData()
+ self.model_format = model_format
self.template = template
self.trusted = trusted
@@ -524,7 +430,6 @@ def _populate_template(self, model_diagram: bool | Literal["auto"] | str):
self.add(folded=False, **SKOPS_TEMPLATE)
# for the skops template, automatically add some default sections
self.add_hyperparams()
- self.add_get_started_code()
if (model_diagram is True) or (model_diagram == "auto"):
self.add_model_plot()
@@ -915,101 +820,6 @@ def _add_hyperparams(
)
self._add_single(section_name, section)
- def add_get_started_code(
- self,
- section: str = "How to Get Started with the Model",
- description: str | None = None,
- file_name: str | None = None,
- model_format: Literal["pickle", "skops"] | None = None,
- ) -> Self:
- """Add getting started code
-
- This code can be copied by users to load the model and make predictions
- with it.
-
- Parameters
- ----------
- section : str (default="How to Get Started with the Model")
- The section that the code for loading the model should be added to.
- By default, the section is set to fit the skops model card template.
- If you're using a different template, you may have to choose a
- different section name.
-
- description : str or None, default=None
- An optional description to be added before the code. If you're using
- the default skops template, a standard text is used. Pass a string
- here if you want to use your own text instead. Leave this empty to
- not add any description.
-
- file_name : str or None, default=None
- The file name of the model. If no file name is indicated, there will
- be an attempt to read the file name from the card's metadata. If
- that fails, an error is raised and you have to pass this argument
- explicitly.
-
- model_format : "skops", "pickle", or None, default=None
- The model format used to store the model. If no format is indicated,
- there will be an attempt to read the model format from the card's
- metadata. If that fails, an error is raised and you have to pass
- this argument explicitly.
-
- Returns
- -------
- self : object
- Card object.
-
- """
- if file_name is None:
- file_name = self.metadata.to_dict().get("model_file")
-
- if model_format is None:
- model_format = (
- self.metadata.to_dict().get("sklearn", {}).get("model_format")
- )
-
- if model_format and (model_format not in ("pickle", "skops")):
- msg = (
- f"Invalid model format '{model_format}', should be one of "
- "'pickle' or 'skops'"
- )
- raise ValueError(msg)
-
- if (not file_name) or (not model_format):
- return self
-
- self._add_get_started_code(
- section,
- file_name=file_name,
- model_format=model_format,
- description=description,
- )
-
- return self
-
- def _add_get_started_code(
- self,
- section_name: str,
- file_name: str,
- model_format: Literal["pickle", "skops"],
- description: str | None,
- indent: str = " ",
- ) -> None:
- """Add getting started code to the corresponding section"""
- lines = _getting_started_code(
- file_name, model_format=model_format, indent=indent
- )
- lines = ["```python"] + lines + ["```"]
- code = "\n".join(lines)
-
- if description:
- content = f"{description}\n\n{code}"
- else:
- content = code
-
- title = split_subsection_names(section_name)[-1]
- section = Section(title=title, content=content)
- self._add_single(section_name, section)
-
def add_plot(
self,
*,
@@ -1309,16 +1119,6 @@ def _add_metrics(
section = TableSection(title=title, content=description, table=table)
self._add_single(section_name, section)
- def _generate_metadata(self, metadata: ModelCardData) -> Iterator[str]:
- """Yield metadata in yaml format"""
- # Repr attributes can be used to control the behavior of repr
- aRepr = Repr()
- aRepr.maxother = 79
- aRepr.maxstring = 79
-
- for key, val in metadata.to_dict().items() if metadata else {}:
- yield aRepr.repr(f"metadata.{key}={val},").strip('"').strip("'")
-
def _generate_content(
self,
data: dict[str, Section],
@@ -1391,16 +1191,6 @@ def __repr__(self) -> str:
else:
model_repr = None
- # repr for metadata
- metadata_reprs = []
- for key, val in self.metadata.to_dict().items() if self.metadata else {}:
- if key == "widget":
- metadata_reprs.append("metadata.widget=[{...}],")
- continue
-
- metadata_reprs.append(self._format_repr(f"metadata.{key}", repr(val)))
- metadata_repr = "\n".join(metadata_reprs)
-
# repr for contents
content_reprs = []
for title, section in self._iterate_content(self._data):
@@ -1417,18 +1207,13 @@ def __repr__(self) -> str:
complete_repr = "Card(\n"
if model_repr:
complete_repr += textwrap.indent(model_repr, " ") + "\n"
- if metadata_reprs:
- complete_repr += textwrap.indent(metadata_repr, " ") + "\n"
if content_reprs:
complete_repr += textwrap.indent(content_repr, " ") + "\n"
complete_repr += ")"
return complete_repr
def _generate_card(self, destination_path: Path | None = None) -> Iterator[str]:
- """Yield sections of the model card, including the metadata."""
- if self.metadata.to_dict():
- yield f"---\n{self.metadata.to_yaml()}\n---"
-
+ """Yield sections of the model card."""
for line in self._generate_content(
self._data, destination_path=destination_path
):
@@ -1454,11 +1239,6 @@ def save(self, path: str | Path, copy_files: bool = False) -> None:
before creating the repository. Without this path the README will
have an absolute path to the plot that won't exist in the
repository.
-
- Notes
- -----
- The keys in model card metadata can be seen `here
- `__.
"""
with open(path, "w", encoding="utf-8") as f:
if not isinstance(path, Path):
diff --git a/skops/card/_parser.py b/skops/card/_parser.py
index 06fbc29f..35a0ad41 100644
--- a/skops/card/_parser.py
+++ b/skops/card/_parser.py
@@ -11,10 +11,8 @@
import json
import subprocess
from pathlib import Path
-from tempfile import mkdtemp
-from typing import Any, Literal
+from typing import Literal
-import yaml # type: ignore
from packaging.version import Version
from skops.card import Card
@@ -209,55 +207,6 @@ def check_pandoc_installed(
)
-def _card_with_detached_metainfo(path: str | Path) -> tuple[str | Path, dict[str, Any]]:
- """Detach the possibly existing yaml part of the model card
-
- Model cards always have a markdown part and optionally a yaml part at the
- head, delimited by "---". Obviously, pandoc cannot parse that. Therefore, we
- detach the yaml part and return it as a separate dict, only leaving
- (hopefully) valid markdown.
-
- path : str or pathlib.Path
- The path to the model card file.
-
- Returns
- -------
- file : path
- The path to the model card without any yaml metainfo. If the model card
- didn't contain that metainfo to begin with, this is just the path to the
- original model card. If it did contain metainfo, this is a path to a new
- temporary file with the metainfo removed.
-
- metainfo : dict
- The metainfo from the yaml part as a parsed dict. If no metainfo was
- present, the dict is empty.
- """
- with open(path, "r") as f:
- text = f.read()
-
- sep_start, sep_end = "---\n", "\n---"
-
- metainfo: dict[str, Any] = {}
- if not text.startswith(sep_start): # no metainfo:
- return path, metainfo
-
- idx_separator = text.find(sep_end)
- if idx_separator < len(sep_start): # pragma: no cover
- # separator shouldn't come earlier than this
- return path, metainfo
-
- # https://black.readthedocs.io/en/stable/faq.html#why-are-flake8-s-e203-and-w503-violated
- text_clean = text[idx_separator + len(sep_end) :] # noqa: E203
- metainfo = yaml.safe_load( # type: ignore
- text[len(sep_start) : idx_separator] # noqa: E203
- )
-
- file = Path(mkdtemp()) / "tmp-model-card.md"
- with open(file, "w") as f:
- f.write(text_clean)
- return file, metainfo
-
-
def parse_modelcard(path: str | Path) -> Card:
"""Read a model card and return a Card object
@@ -328,8 +277,6 @@ def parse_modelcard(path: str | Path) -> Card:
"""
check_pandoc_installed()
- path, metainfo = _card_with_detached_metainfo(path)
-
proc = subprocess.run(
["pandoc", "-t", "json", "-s", str(path)],
capture_output=True,
@@ -338,7 +285,4 @@ def parse_modelcard(path: str | Path) -> Card:
parser = PandocParser(source)
card = parser.generate()
- for key, val in metainfo.items():
- setattr(card.metadata, key, val)
-
return card
diff --git a/skops/card/default_template.md b/skops/card/default_template.md
index 91141dfe..4e2f4ad0 100644
--- a/skops/card/default_template.md
+++ b/skops/card/default_template.md
@@ -37,15 +37,6 @@ You can find the details about evaluation process and the evaluation results.
{{ eval_results | default("[More Information Needed]", true)}}
-# How to Get Started with the Model
-
-Use the code below to get started with the model.
-
-```python
-{{ get_started_code | default("[More Information Needed]", true)}}
-```
-
-
# Model Card Authors
This model card is written by following authors:
diff --git a/skops/card/tests/test_card.py b/skops/card/tests/test_card.py
index 750bebb4..6b07965d 100644
--- a/skops/card/tests/test_card.py
+++ b/skops/card/tests/test_card.py
@@ -9,15 +9,13 @@
import numpy as np
import pytest
import sklearn
-from huggingface_hub import ModelCardData, metadata_load
from sklearn.datasets import load_iris
from sklearn.inspection import permutation_importance
from sklearn.linear_model import LinearRegression, LogisticRegression
from sklearn.metrics import f1_score, make_scorer
from sklearn.neighbors import KNeighborsClassifier
-from skops import hub_utils
-from skops.card import Card, metadata_from_config
+from skops.card import Card
from skops.card._model_card import (
CONTENT_PLACEHOLDER,
SKOPS_TEMPLATE,
@@ -113,38 +111,20 @@ def iris_skops_file(iris_estimator):
def _create_model_card_from_saved_model(
destination_path,
iris_estimator,
- iris_data,
- save_file,
):
- X, y = iris_data
- hub_utils.init(
- model=save_file,
- requirements=[f"scikit-learn=={sklearn.__version__}"],
- dst=destination_path,
- task="tabular-classification",
- data=X,
- )
- card = Card(iris_estimator, metadata=metadata_from_config(destination_path))
+ card = Card(iris_estimator)
card.save(Path(destination_path) / "README.md")
return card
@pytest.fixture
-def skops_model_card_metadata_from_config(
- destination_path, iris_estimator, iris_skops_file, iris_data
-):
- yield _create_model_card_from_saved_model(
- destination_path, iris_estimator, iris_data, iris_skops_file
- )
+def skops_model_card(destination_path, iris_estimator):
+ yield _create_model_card_from_saved_model(destination_path, iris_estimator)
@pytest.fixture
-def pkl_model_card_metadata_from_config(
- destination_path, iris_estimator, iris_pkl_file, iris_data
-):
- yield _create_model_card_from_saved_model(
- destination_path, iris_estimator, iris_data, iris_pkl_file
- )
+def pkl_model_card(destination_path, iris_estimator):
+ yield _create_model_card_from_saved_model(destination_path, iris_estimator)
@pytest.fixture
@@ -158,15 +138,13 @@ def test_save_model_card(destination_path, model_card):
assert (Path(destination_path) / "README.md").exists()
-def test_model_caching(
- skops_model_card_metadata_from_config, iris_skops_file, destination_path
-):
+def test_model_caching(skops_model_card, iris_skops_file, destination_path):
"""Tests that the model card caches the model to avoid loading it multiple times"""
new_model = LogisticRegression(random_state=4321)
# mock _load_model, it still loads the model but we can track call count
mock_load_model = mock.Mock(side_effect=load)
- card = Card(iris_skops_file, metadata=metadata_from_config(destination_path))
+ card = Card(iris_skops_file)
with mock.patch("skops.card._model_card._load_model", mock_load_model):
model1 = card.get_model()
model2 = card.get_model()
@@ -620,177 +598,6 @@ def test_permutation_importances_with_description(
assert section.format() == expected
-class TestAddGetStartedCode:
- """Tests for getting started code"""
-
- @pytest.fixture
- def metadata(self):
- # dummy ModelCardData using pickle
- class Metadata:
- def to_dict(self):
- return {
- "model_file": "my-model.pickle",
- "sklearn": {
- "model_format": "pickle",
- },
- }
-
- return Metadata()
-
- @pytest.fixture
- def model_card(self, metadata):
- model = fit_model()
- card = Card(model, metadata=metadata)
- return card
-
- @pytest.fixture
- def metadata_skops(self):
- # dummy ModelCardData using skops
- class Metadata:
- def to_dict(self):
- return {
- "model_file": "my-model.skops",
- "sklearn": {
- "model_format": "skops",
- },
- }
-
- return Metadata()
-
- @pytest.fixture
- def model_card_skops(self, metadata_skops):
- model = fit_model()
- card = Card(model, metadata=metadata_skops)
- return card
-
- @pytest.fixture
- def text_pickle(self):
- return (
- "```python\n"
- "import json\n"
- "import pandas as pd\n"
- "import joblib\n"
- 'model = joblib.load("my-model.pickle")\n'
- 'with open("config.json") as f:\n'
- " config = json.load(f)\n"
- 'model.predict(pd.DataFrame.from_dict(config["sklearn"]["example_input"]))\n'
- "```"
- )
-
- @pytest.fixture
- def text_skops(self):
- return (
- "```python\n"
- "import json\n"
- "import pandas as pd\n"
- "import skops.io as sio\n"
- 'model = sio.load("my-model.skops")\n'
- 'with open("config.json") as f:\n'
- " config = json.load(f)\n"
- 'model.predict(pd.DataFrame.from_dict(config["sklearn"]["example_input"]))\n'
- "```"
- )
-
- def test_default_pickle(self, model_card, text_pickle):
- # by default, the get started code is added by the skops template
- result = model_card.select("How to Get Started with the Model").format()
- assert result == text_pickle
-
- def test_default_skops(self, model_card_skops, text_skops):
- # by default, the get started code is added by the skops template
- result = model_card_skops.select("How to Get Started with the Model").format()
- assert result == text_skops
-
- def test_no_metadata_file_name(self):
- model = fit_model()
- card = Card(model, metadata=None)
- card.add_get_started_code() # does not raise
-
- def test_no_metadata_file_format(self):
- class Metadata:
- def to_dict(self):
- return {
- "model_file": "my-model.skops",
- # missing file format entry
- }
-
- model = fit_model()
- card = Card(model, metadata=Metadata())
- card.add_get_started_code() # does not raise
-
- def test_other_section(self, model_card, text_pickle):
- model_card.add_get_started_code(section="Other section")
- result = model_card.select("Other section").format()
- assert result == text_pickle
-
- def test_use_description(self, model_card):
- model_card.add_get_started_code(description="Awesome code")
- result = model_card.select("How to Get Started with the Model").format()
- assert result.startswith("Awesome code")
-
- def test_other_filename(self, model_card, text_pickle):
- model_card.add_get_started_code(file_name="foobar.pkl")
- text = text_pickle.replace("my-model.pickle", "foobar.pkl")
- result = model_card.select("How to Get Started with the Model").format()
- assert result == text
-
- def test_explicitly_set_other_model_format(self, model_card, text_skops):
- model_card.add_get_started_code(model_format="skops")
- result = model_card.select("How to Get Started with the Model").format()
- # file name is still "my-model.pickle", only the loading code changes
- text = text_skops.replace(".skops", ".pickle")
- assert result == text
-
- def test_invalid_model_format_passed(self, model_card):
- # json is not a valid model format
- msg = "Invalid model format 'json', should be one of 'pickle' or 'skops'"
- with pytest.raises(ValueError, match=msg):
- model_card.add_get_started_code(model_format="json")
-
- def test_invalid_model_format_passed_via_metadata(self):
- # metadata contains invalid model format json
- class Metadata:
- def to_dict(self):
- return {
- "model_file": "my-model.skops",
- "sklearn": {
- "model_format": "json",
- },
- }
-
- model = fit_model()
-
- msg = "Invalid model format 'json', should be one of 'pickle' or 'skops'"
- with pytest.raises(ValueError, match=msg):
- Card(model, metadata=Metadata())
-
- @pytest.mark.parametrize("template", CUSTOM_TEMPLATES)
- def test_custom_template_no_section_uses_default(self, template, text_pickle):
- model = fit_model()
-
- class Metadata:
- def to_dict(self):
- return {
- "model_file": "my-model.pickle",
- "sklearn": {
- "model_format": "pickle",
- },
- }
-
- model_card = Card(model, metadata=Metadata(), template=template)
- model_card.add_get_started_code()
- result = model_card.select("How to Get Started with the Model").format()
- assert result == text_pickle
-
- def test_add_twice(self, model_card):
- # it's possible to add the section twice, even if it doesn't make a lot
- # of sense
- text1 = model_card.select("How to Get Started with the Model").format()
- model_card.add_get_started_code(section="Other section")
- text2 = model_card.select("Other section").format()
- assert text1 == text2
-
-
class TestRender:
def test_render(self, model_card, destination_path):
file_name = destination_path / "README.md"
@@ -801,18 +608,6 @@ def test_render(self, model_card, destination_path):
rendered = model_card.render()
assert loaded == rendered
- def test_render_with_metadata(self, model_card):
- model_card.metadata.foo = "something"
- model_card.metadata.bar = "something else"
- rendered = model_card.render()
- expected = textwrap.dedent("""
- ---
- foo: something
- bar: something else
- ---
- """).strip()
- assert rendered.startswith(expected)
-
class TestSelect:
"""Selecting sections from the model card"""
@@ -1167,49 +962,6 @@ def test_add_plot_with_alt_text(self, destination_path, model_card):
assert plot_content == ""
-class TestMetadata:
- def test_adding_metadata(self, model_card):
- # test if the metadata is added to the card
- model_card.metadata.tags = "dummy"
- metadata = list(model_card._generate_metadata(model_card.metadata))
- assert len(metadata) == 1
- assert metadata[0] == "metadata.tags=dummy,"
-
- def test_metadata_from_config_tabular_data(
- self, pkl_model_card_metadata_from_config, destination_path
- ):
- # test if widget data is correctly set in the README
- metadata = metadata_load(local_path=Path(destination_path) / "README.md")
- assert "widget" in metadata
-
- expected_data = [
- {
- "structuredData": {
- "petal length (cm)": [1.4, 1.4, 1.3],
- "petal width (cm)": [0.2, 0.2, 0.2],
- "sepal length (cm)": [5.1, 4.9, 4.7],
- "sepal width (cm)": [3.5, 3.0, 3.2],
- }
- },
- ]
- assert metadata["widget"] == expected_data
-
- for tag in ["sklearn", "skops", "tabular-classification"]:
- assert tag in metadata["tags"]
-
- def test_metadata_model_format_pkl(
- self, pkl_model_card_metadata_from_config, destination_path
- ):
- metadata = metadata_load(local_path=Path(destination_path) / "README.md")
- assert metadata["model_format"] == "pickle"
-
- def test_metadata_model_format_skops(
- self, skops_model_card_metadata_from_config, destination_path
- ):
- metadata = metadata_load(local_path=Path(destination_path) / "README.md")
- assert metadata["model_format"] == "skops"
-
-
@pytest.mark.xfail(reason="dynamic adjustment when model changes not implemented yet")
class TestModelDynamicUpdate:
def test_model_related_sections_updated_dynamically_skops_template(
@@ -1359,32 +1111,6 @@ def test_without_model_attribute(self, card: Card, meth, expected_lines):
result = meth(card)
assert reprs_equal(expected, result)
- @pytest.mark.parametrize("meth", [repr, str])
- def test_with_metadata(self, card: Card, meth, expected_lines):
- metadata = ModelCardData(
- language="fr",
- license="bsd",
- library_name="sklearn",
- tags=["sklearn", "tabular-classification"],
- foo={"bar": 123},
- widget=[{"something": "very-long"}],
- )
- card.metadata = metadata
-
- # metadata comes after model line, i.e. position 2
- extra_lines = [
- " metadata.language=fr,",
- " metadata.license=bsd,",
- " metadata.library_name=sklearn,",
- " metadata.tags=['sklearn', 'tabular-classification'],",
- " metadata.foo={'bar': 123},",
- " metadata.widget=[{...}],",
- ]
- expected = "\n".join(expected_lines[:2] + extra_lines + expected_lines[2:])
- result = meth(card)
-
- assert reprs_equal(expected, result)
-
class TestCardModelAttributeIsPath:
def path_to_card(self, path, suffix):
@@ -1660,15 +1386,6 @@ def test_add_metrics(self, card):
assert "accuracy" in content
assert "0.1" in content
- def test_add_get_started_code(self, card):
- card.add_get_started_code(
- section="Getting Started",
- file_name="foobar.skops",
- model_format="skops",
- )
- content = card.select("Getting Started").content
- assert "load" in content
-
def test_custom_template_all_sections_present(self, template, card):
# model_card contains all default sections
for key in template:
@@ -1853,7 +1570,6 @@ def card(self):
card.add_model_plot()
card.add_hyperparams()
card.add_metrics(accuracy=0.1)
- card.add_get_started_code()
return card
def test_toc(self, card):
diff --git a/skops/hub_utils/__init__.py b/skops/hub_utils/__init__.py
deleted file mode 100644
index c7c05aa0..00000000
--- a/skops/hub_utils/__init__.py
+++ /dev/null
@@ -1,21 +0,0 @@
-from ._hf_hub import (
- add_files,
- download,
- get_config,
- get_model_output,
- get_requirements,
- init,
- push,
- update_env,
-)
-
-__all__ = [
- "add_files",
- "download",
- "get_config",
- "get_requirements",
- "get_model_output",
- "init",
- "push",
- "update_env",
-]
diff --git a/skops/hub_utils/_hf_hub.py b/skops/hub_utils/_hf_hub.py
deleted file mode 100644
index d1d1eff6..00000000
--- a/skops/hub_utils/_hf_hub.py
+++ /dev/null
@@ -1,756 +0,0 @@
-"""
-This module contains utilities to push a model to the hub and pull from the
-hub.
-"""
-from __future__ import annotations
-
-import collections
-import itertools
-import json
-import os
-import shutil
-import warnings
-from pathlib import Path
-from typing import Any, List, Literal, MutableMapping, Optional, Sequence, Union
-
-import numpy as np
-from huggingface_hub import HfApi, InferenceClient, snapshot_download
-from sklearn.utils import check_array
-
-SUPPORTED_TASKS = [
- "tabular-classification",
- "tabular-regression",
- "text-classification",
- "text-regression",
-]
-
-
-def _validate_folder(path: Union[str, Path]) -> None:
- """Validate the contents of a folder.
-
- This function checks if the contents of a folder make a valid repo for a
- scikit-learn based repo on the Hugging Face Hub.
-
- A valid repository is one which is understood by the Hub as well as this
- library to run and use the model. Otherwise, anything can be put in a model
- repository on the Hub, which would merely be used as a `git` and `git lfs`
- server.
-
- Raises a ``TypeError`` if invalid.
-
- Parameters
- ----------
- path: str or Path
- The location of the repo.
-
- Raises
- ------
- TypeError
- Raised when the passed path is invalid.
-
- Returns
- -------
- None
- """
- path = Path(path)
- if not path.is_dir():
- raise TypeError("The given path is not a directory.")
-
- config_path = path / "config.json"
- if not config_path.exists():
- raise TypeError("Configuration file `config.json` missing.")
-
- with open(config_path, "r") as f:
- config = json.load(f)
-
- model_path = config.get("sklearn", {}).get("model", {}).get("file", None)
- if not model_path:
- raise TypeError(
- "Model file not configured in the configuration file. It should be stored"
- " in the hf_hub.sklearn.model key."
- )
-
- if not (path / model_path).exists():
- raise TypeError(f"Model file {model_path} does not exist.")
-
-
-def _get_example_input_from_tabular_data(data):
- """Returns the example input of a model for a tabular task.
-
- The input is converted into a dictionary which is then stored in the config
- file.
-
- Parameters
- ----------
- data: array-like
- The input needs to be either a ``pandas.DataFrame``, a 2D
- ``numpy.ndarray`` or a list/tuple that can be converted to a 2D
- ``numpy.ndarray``. The first 3 rows are used as example input.
-
- Returns
- -------
- example_input: dict of lists
- The example input of the model as accepted by Hugging Face's backend.
- """
- try:
- import pandas as pd
-
- if isinstance(data, pd.DataFrame):
- return {x: data[x][:3].to_list() for x in data.columns}
- except ImportError:
- # pandas is not installed, the data cannot be a pandas DataFrame
- pass
-
- # here we convert the first three rows of `data` to a dict of lists
- # to be stored in the config file
- if isinstance(data, (np.ndarray, list, tuple)):
- data_slice = data[:3]
- # This will raise a ValueError if the array is not 2D
- data_slice_array = check_array(data_slice, ensure_2d=True)
- return {
- f"x{x}": data_slice_array[:, x].tolist()
- for x in range(data_slice_array.shape[1])
- }
-
- raise ValueError(
- "The data is not a pandas.DataFrame, a 2D numpy.ndarray or a "
- "list/tuple that can be converted to a 2D numpy.ndarray."
- )
-
-
-def _get_example_input_from_text_data(data: Sequence[str]):
- """Returns the example input of a model for a text task.
-
- The input is converted into a dictionary which is then stored in the config
- file.
-
- Parameters
- ----------
- data: Sequence[str]
- A sequence of strings. The first 3 elements are used as example input.
-
- Returns
- -------
- example_input: dict of lists
- The example input of the model as accepted by Hugging Face's backend.
- """
-
- def _head(data, n):
- is_data_subscriptable = hasattr(data, "__getitem__")
- if is_data_subscriptable:
- return data[:n]
-
- return list(itertools.islice(data, n))
-
- def _is_sequence_of_strings(data):
- return not isinstance(data, str) and all(isinstance(x, str) for x in data)
-
- error_message = "The data needs to be a sequence of strings."
- try:
- data_head = _head(data, n=3)
- if _is_sequence_of_strings(data_head):
- return {"data": data_head}
- else:
- raise ValueError(error_message)
- except TypeError as e:
- raise ValueError(error_message) from e
-
-
-def _get_column_names(data):
- """Returns the column names of the input.
-
- If data is not a ``pandas.DataFrame``, column names are assumed to be
- ``x0`` to ``xn-1``, where ``n`` is the number of columns.
-
- Parameters
- ----------
- data: array-like
- The data whose columns names are to be returned. Must be a
- ``pandas.DataFrame``, a 2D ``numpy.ndarray`` or a list/tuple that can
- be converted to a 2D ``numpy.ndarray``.
-
- Returns
- -------
- columns: list of strings
- A list of strings. Each string is a column name.
- """
- try:
- import pandas as pd
-
- if isinstance(data, pd.DataFrame):
- return list(data.columns)
- except ImportError:
- # pandas is not installed, the data cannot be a pandas DataFrame
- pass
-
- # TODO: this is going to fail for Structured Arrays. We can add support for
- # them later if we see need for it.
- if isinstance(data, (np.ndarray, list, tuple)):
- # This will raise a ValueError if the array is not 2D
- data_array = check_array(data, ensure_2d=True)
- return [f"x{x}" for x in range(data_array.shape[1])]
-
- raise ValueError(
- "The data is not a pandas.DataFrame, a 2D numpy.ndarray or a "
- "list/tuple that can be converted to a 2D numpy.ndarray."
- )
-
-
-def _create_config(
- *,
- model_path: Union[str, Path],
- requirements: List[str],
- dst: Union[str, Path],
- task: Literal[
- "tabular-classification",
- "tabular-regression",
- "text-classification",
- "text-regression",
- ],
- data,
- model_format: Literal[
- "skops",
- "pickle",
- "auto",
- ] = "auto",
-) -> None:
- """Write the configuration into a ``config.json`` file.
-
- Parameters
- ----------
- model_path : str, or Path
- The relative path (from the repo root) to the model file.
-
- requirements : list of str
- A list of required packages. The versions are then extracted from the
- current environment.
-
- dst : str, or Path
- The path to an existing folder where the config file should be created.
-
- task: "tabular-classification", "tabular-regression",
- "text-classification", or "text-regression"
- The task of the model, which determines the input and output type of
- the model. It can be one of: ``tabular-classification``,
- ``tabular-regression``, ``text-classification``, ``text-regression``.
-
- data: array-like or sequence
- The input to the model. This is used for two purposes:
-
- 1. Save an example input to the model, which is used by
- HuggingFace's backend and shown in the widget of the model's
- page.
- 2. Store the columns and their order of the input, which is used by
- HuggingFace's backend to pass the data in the right form to the
- model.
-
- The first 3 input values are used as example inputs. If the task is
- ``tabular-classification`` or ``tabular-regression``, then data is
- expected to be an array-like. Otherwise, it is expected to be a
- sequence of strings.
-
- model_format: str (default="auto")
- The format used to persist the model. Can be ``"auto"``, ``"skops"``
- or ``"pickle"``. Defaults to ``"auto"``, which means:
-
- - ``"pickle"`` if the extension is one of ``{".pickle", ".pkl", ".joblib"}``
- - ``"skops"`` if the extension is ``".skops"``
- """
-
- # so that we don't have to explicitly add keys and they're added as a
- # dictionary if they are not found
- # see: https://stackoverflow.com/a/13151294/2536294
- def recursively_default_dict() -> MutableMapping:
- return collections.defaultdict(recursively_default_dict)
-
- if model_format == "auto":
- extension = Path(model_path).suffix
- if extension in [".pkl", ".pickle", ".joblib"]:
- model_format = "pickle"
- elif extension == ".skops":
- model_format = "skops"
- if model_format not in ["skops", "pickle"]:
- raise ValueError(
- "Cannot determine the input file format. Please indicate the format using"
- " the `model_format` argument."
- )
-
- config = recursively_default_dict()
- config["sklearn"]["model"]["file"] = str(model_path)
- config["sklearn"]["environment"] = requirements
- config["sklearn"]["task"] = task
- config["sklearn"]["model_format"] = model_format
-
- if "tabular" in task:
- config["sklearn"]["example_input"] = _get_example_input_from_tabular_data(data)
- config["sklearn"]["columns"] = _get_column_names(data)
- elif "text" in task:
- config["sklearn"]["example_input"] = _get_example_input_from_text_data(data)
-
- dump_json(Path(dst) / "config.json", config)
-
-
-def _check_model_file(path: str | Path) -> Path:
- """Perform sanity checks on the model file
-
- Parameters
- ----------
- path : str or Path
- The model path
-
- Returns
- -------
- path : Path
- The model path as a ``pathlib.Path``.
-
- Raises
- ------
- OSError
- If the model file does not exist.
- """
- if not os.path.exists(path):
- raise OSError(f"Model file '{path}' does not exist.")
-
- if os.path.getsize(path) == 0:
- raise RuntimeError(f"Model file '{path}' is empty.")
-
- return Path(path)
-
-
-def init(
- *,
- model: Union[str, Path],
- requirements: List[str],
- dst: Union[str, Path],
- task: Literal[
- "tabular-classification",
- "tabular-regression",
- "text-classification",
- "text-regression",
- ],
- data,
- model_format: Literal[
- "skops",
- "pickle",
- "auto",
- ] = "auto",
-) -> None:
- """Initialize a scikit-learn based Hugging Face repo.
-
- Given a pickled model and a set of required packages, this function
- initializes a folder to be a valid Hugging Face scikit-learn based repo.
-
- Parameters
- ----------
- model: str, or Path
- The path to a model pickle file.
-
- requirements: list of str
- A list of required packages. The versions are then extracted from the
- current environment.
-
- dst: str, or Path
- The path to a non-existing or empty folder which is to be initialized.
-
- task: str
- The task of the model, which determines the input and output type of
- the model. It can be one of: ``tabular-classification``,
- ``tabular-regression``, ``text-classification``, ``text-regression``.
-
- data: array-like
- The input to the model. This is used for two purposes:
-
- 1. Save an example input to the model, which is used by
- HuggingFace's backend and shown in the widget of the model's
- page.
- 2. Store the columns and their order of the input, which is used by
- HuggingFace's backend to pass the data in the right form to the
- model.
-
- The first 3 input values are used as example inputs.
-
- If ``task`` is ``"tabular-classification"`` or ``"tabular-regression"``,
- the data needs to be a :class:`pandas.DataFrame` or a
- :class:`numpy.ndarray`. If ``task`` is ``"text-classification"`` or
- ``"text-regression"``, the data needs to be a ``list`` of strings.
-
- model_format: str (default="auto")
- The format the model was persisted in. Can be ``"auto"``, ``"skops"``
- or ``"pickle"``. Defaults to ``"auto"``, which infers the format from
- the file extension.
- """
- dst = Path(dst)
- if dst.exists() and bool(next(dst.iterdir(), None)):
- raise OSError("Non-empty dst path already exists!")
-
- if task not in SUPPORTED_TASKS:
- raise ValueError(
- f"Task {task} not supported. Supported tasks are: {SUPPORTED_TASKS}"
- )
-
- model = _check_model_file(model)
-
- dst.mkdir(parents=True, exist_ok=True)
-
- try:
- shutil.copy2(src=model, dst=dst)
-
- model_name = model.name
- _create_config(
- model_path=model_name,
- requirements=requirements,
- dst=dst,
- task=task,
- data=data,
- model_format=model_format,
- )
- except Exception:
- shutil.rmtree(dst)
- raise
-
-
-def add_files(*files: str | Path, dst: str | Path, exist_ok: bool = False) -> None:
- """Add files to initialized repo.
-
- After having called :func:`.hub_utils.init`, use this function to add
- arbitrary files to be uploaded in addition to the model and model card.
-
- In particular, it can be useful to upload the script itself that produces
- those artifacts by calling ``hub_utils.add_files(__file__, dst=...)``.
-
- Parameters
- ----------
- *files : str or Path
- The files to be added.
-
- dst : str or Path
- Path to the initialized repo, same as used during
- :func:`.hub_utils.init`.
-
- exist_ok : bool (default=False)
- Whether it's okay or not to add a file that already exists. If
- ``True``, overwrite the files, otherwise raise a ``FileExistsError``.
-
- Raises
- ------
- FileNotFoundError
- When the target folder or the files to be added are not found.
-
- FileExistsError
- When a file is added that already exists at the target location and
- ``exist_ok=False``.
-
- """
- dst = Path(dst)
- # check dst exists
- if not dst.exists():
- msg = f"Could not find '{dst}', did you run 'skops.hub_utils.init' first?"
- raise FileNotFoundError(msg)
-
- src_files = [Path(file) for file in files]
- # check that source files exist
- for file in src_files:
- if not file.exists():
- msg = f"File '{file}' could not be found."
- raise FileNotFoundError(msg)
-
- dst_files = [dst / Path(file).name for file in files]
- for src_file, dst_file in zip(src_files, dst_files):
- # check if target file already exists
- if dst_file.exists() and not exist_ok:
- msg = f"File '{src_file.name}' already found at '{dst}'."
- raise FileExistsError(msg)
-
- shutil.copy2(src_file, dst_file)
-
-
-def dump_json(path, content):
- with open(Path(path), mode="w") as f:
- json.dump(content, f, sort_keys=True, indent=4)
-
-
-def update_env(
- *, path: Union[str, Path], requirements: Union[List[str], None] = None
-) -> None:
- """Update the environment requirements of a repo.
-
- This function takes the path to the repo and updates the requirements
- needed to run the scikit-learn based model in the repo.
-
- Parameters
- ----------
- path: str, or Path
- The path to an existing local repo.
-
- requirements: list of str, optional
- The list of required packages for the model. If ``None`` is passed,
- the existing list of requirements is left unchanged.
-
- """
-
- with open(Path(path) / "config.json") as f:
- config = json.load(f)
-
- # only overwrite the environment when requirements are explicitly given
- if requirements is not None:
- config["sklearn"]["environment"] = requirements
-
- dump_json(Path(path) / "config.json", config)
-
-
-def push(
- *,
- repo_id: str,
- source: Union[str, Path],
- token: str | None = None,
- commit_message: str | None = None,
- create_remote: bool = False,
- private: bool | None = None,
-) -> None:
- """Pushes the contents of a model repo to Hugging Face Hub.
-
- This function validates the contents of the folder before pushing it to the
- Hub.
-
- Parameters
- ----------
- repo_id: str
- The ID of the destination repository in the form of ``OWNER/REPO_NAME``.
-
- source: str or Path
- A folder where the contents of the model repo are located.
-
- token: str, optional
- A token to push to the Hub. If not provided, the user should already be
- logged in using ``huggingface-cli login``.
-
- commit_message: str, optional
- The commit message to be used when pushing to the repo.
-
- create_remote: bool, default=False
- Whether to create the remote repository if it doesn't exist. If
- ``False`` and the remote repository doesn't exist, an error is
- raised. If ``True``, the remote repository is created if it doesn't
- already exist.
-
- private: bool, default=None
- Whether the remote repository should be public or private. If ``True``
- or ``False`` is passed, this method will set the private/public status
- of the remote repository, regardless of whether it already exists. If
- ``None``, no change is applied.
-
- .. versionadded:: 0.3
-
- Returns
- -------
- None
-
- Raises
- ------
- TypeError
- This function raises a ``TypeError`` if the contents of the source
- folder do not make a valid Hugging Face Hub scikit-learn based repo.
- """
- warnings.warn(
- "Creating repos on hf.co is subject to strict rate limits now and therefore"
- " this feature is to be removed from this library in version 0.10. You can"
- " use tools directly available in the huggingface_hub library instead to"
- " create and push files.",
- FutureWarning,
- )
- _validate_folder(path=source)
- client = HfApi()
-
- if create_remote:
- client.create_repo(
- repo_id=repo_id, token=token, repo_type="model", exist_ok=True
- )
-
- if private is not None:
- client.update_repo_visibility(repo_id=repo_id, private=private, token=token)
-
- client.upload_folder(
- repo_id=repo_id,
- path_in_repo=".",
- folder_path=source,
- commit_message=commit_message,
- commit_description=None,
- token=token,
- repo_type=None,
- revision=None,
- create_pr=False,
- )
-
-
-def get_config(path: Union[str, Path]) -> dict[str, Any]:
- """Returns the configuration of a project.
-
- Parameters
- ----------
- path: str or Path
- The path to the directory holding the project and its ``config.json``
- configuration file.
-
- Returns
- -------
- config: dict
- A dictionary which holds the configs of the project.
- """
- with open(Path(path) / "config.json", "r") as f:
- config = json.load(f)
- return config
-
-
-def get_requirements(path: Union[str, Path]) -> List[str]:
- """Returns the requirements of a project.
-
- Parameters
- ----------
- path: str or Path
- The path to the directory holding the project and its ``config.json``
- configuration file.
-
- Returns
- -------
- requirements: list of str
- The list of requirements which can be passed to the package manager to
- be installed.
- """
- config = get_config(path)
- return config.get("sklearn", dict()).get("environment", list())
-
-
-def download(
- *,
- repo_id: str,
- dst: Union[str, Path],
- revision: str | None = None,
- token: str | None = None,
- keep_cache: bool = True,
- **kwargs: Any,
-) -> None:
- """Download a repository into a directory.
-
- The directory needs to be empty or non-existent.
-
- Parameters
- ----------
- repo_id: str
- The ID of the Hugging Face Hub repository in the form of
- ``OWNER/REPO_NAME``.
-
- dst: str, or Path
- The directory to which the files are downloaded.
-
- revision: str, optional
- The revision of the project to download. This can be a git tag, branch,
- or a git commit hash. By default the latest revision of the default
- branch is downloaded.
-
- token: str, optional
- The token to be used to download the files. Only required if the
- repository is private.
-
- keep_cache: bool, default=True
- Whether the cached data should be kept or removed after download. By
- default a copy of the cached files will be created in the ``dst``
- folder. If ``False``, the cache will be removed after the contents are
- copied. Note that the cache is git based and by default new files are
- only downloaded if there is a new revision of them on the hub. If you
- keep the cache, the old files are not removed after downloading the
- newer versions of them.
-
- kwargs: dict
- Other parameters to be passed to
- :func:`huggingface_hub.snapshot_download`.
-
- Returns
- -------
- None
- """
- dst = Path(dst)
- if dst.exists() and bool(next(dst.iterdir(), None)):
- raise OSError("None-empty dst path already exists!")
-
- # remove the folder only if it's empty and it exists
- if dst.exists():
- dst.rmdir()
-
- # TODO: Switch from use_auth_token to token once huggingface_hub<0.11 is
- # dropped. Until then, we ignore the mypy type check, because mypy doesn't
- # see that use_auth_token is handled by the decorator of snapshot_download.
- cached_folder = snapshot_download(
- repo_id=repo_id, revision=revision, use_auth_token=token, **kwargs # type: ignore
- )
- shutil.copytree(cached_folder, dst)
- if not keep_cache:
- shutil.rmtree(path=cached_folder)
-
-
-# TODO(v0.10): remove this function
-def get_model_output(repo_id: str, data: Any, token: Optional[str] = None) -> Any:
- """Returns the output of the model using Hugging Face Hub's inference API.
-
- See the :ref:`User Guide ` for more details.
-
- .. deprecated:: 0.9
- Will be removed in version 0.10. Use ``huggingface_hub.InferenceClient``
- instead.
-
- Parameters
- ----------
- repo_id: str
- The ID of the Hugging Face Hub repository in the form of
- ``OWNER/REPO_NAME``.
-
- data: Any
- The input to be given to the model. This can be a
- :class:`pandas.DataFrame` or a :class:`numpy.ndarray`. If possible, you
- should always pass a :class:`pandas.DataFrame` with correct column
- names.
-
- token: str, optional
- The token to be used to call the inference API. Only required if the
- repository is private.
-
- Returns
- -------
- output: numpy.ndarray
- The output of the model.
-
- Notes
- -----
- If there are warnings or exceptions during inference, this function raises
- a :class:`RuntimeError` including the original errors and warnings
- returned from the server.
-
- Also note that if the model repo is private, the inference API is not
- available.
- """
- warnings.warn(
- "This feature is no longer free on hf.co and therefore this function will"
- " be removed in the next release. Use `huggingface_hub.InferenceClient`"
- " instead.",
- FutureWarning,
- )
- model_info = HfApi().model_info(repo_id=repo_id, use_auth_token=token) # type: ignore
- if not model_info.pipeline_tag:
- raise ValueError(
- f"Repo {repo_id} has no pipeline tag. You should set a valid 'task' in"
- " config.json and README.md files. This is automatically done for you if"
- " you pass a valid task to the skops.hub_utils.init() and"
- " skops.card.metadata_from_util() functions to generate those files."
- )
-
- try:
- inputs = {"data": data.to_dict(orient="list")}
- except AttributeError:
- # the input is not a pandas DataFrame
- inputs = {f"x{i}": data[:, i] for i in range(data.shape[1])}
- inputs = {"data": inputs}
-
- client = InferenceClient(token=token)
- res_bytes = client.post(json={"inputs": inputs}, model=repo_id)
- res = json.loads(res_bytes.decode("utf-8"))
-
- if isinstance(res, list):
- return np.array(res)
- else:
- raise RuntimeError(f"There were errors or warnings during inference: {res}")
diff --git a/skops/hub_utils/tests/common.py b/skops/hub_utils/tests/common.py
deleted file mode 100644
index f06f3b6a..00000000
--- a/skops/hub_utils/tests/common.py
+++ /dev/null
@@ -1,2 +0,0 @@
-# This is the token for the skops user on the hub, used for the CI.
-HF_HUB_TOKEN = "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"
diff --git a/skops/hub_utils/tests/test_hf_hub.py b/skops/hub_utils/tests/test_hf_hub.py
deleted file mode 100644
index b4609dba..00000000
--- a/skops/hub_utils/tests/test_hf_hub.py
+++ /dev/null
@@ -1,654 +0,0 @@
-import json
-import os
-import pickle
-import re
-import shutil
-import tempfile
-import warnings
-from importlib import metadata
-from pathlib import Path
-from uuid import uuid4
-
-import numpy as np
-import pandas as pd
-import pytest
-import sklearn
-from flaky import flaky
-from huggingface_hub import HfApi
-from sklearn.datasets import load_diabetes, load_iris
-from sklearn.linear_model import LinearRegression, LogisticRegression
-
-from skops import card
-from skops.hub_utils import (
- add_files,
- get_config,
- get_model_output,
- get_requirements,
- init,
- push,
- update_env,
-)
-from skops.hub_utils._hf_hub import (
- _create_config,
- _get_column_names,
- _get_example_input_from_tabular_data,
- _get_example_input_from_text_data,
- _validate_folder,
-)
-from skops.hub_utils.tests.common import HF_HUB_TOKEN
-from skops.io import dump
-
-iris = load_iris(as_frame=True, return_X_y=False)
-diabetes = load_diabetes(as_frame=True, return_X_y=False)
-
-IS_SKLEARN_DEV_BUILD = "dev" in sklearn.__version__
-
-
-@pytest.fixture
-def temp_path():
- with tempfile.TemporaryDirectory(prefix="skops-test-temp-path") as temp_path:
- yield temp_path
-
-
-@pytest.fixture(scope="session")
-def repo_path():
- with tempfile.TemporaryDirectory(prefix="skops-test-sample-repo") as repo_path:
- yield Path(repo_path)
-
-
-@pytest.fixture
-def destination_path():
- with tempfile.TemporaryDirectory(prefix="skops-test") as dir_path:
- yield Path(dir_path)
-
-
-def get_classifier():
- X, y = iris.data, iris.target
- clf = LogisticRegression(solver="newton-cg").fit(X, y)
- return clf
-
-
-def get_regressor():
- X, y = diabetes.data, diabetes.target
- model = LinearRegression().fit(X, y)
- return model
-
-
-@pytest.fixture(scope="session")
-def classifier(repo_path, config_json):
- # Create a simple model file for the purpose of testing
- clf = get_classifier()
- config_path, file_format = config_json
- model_file = CONFIG[file_format]["sklearn"]["model"]["file"]
- path = repo_path / model_file
-
- try:
- if file_format == "pickle":
- with open(path, "wb") as f:
- pickle.dump(clf, f)
- elif file_format == "skops":
- dump(clf, path)
- yield path
- finally:
- path.unlink(missing_ok=True)
-
-
-CONFIG = {
- "pickle": {
- "sklearn": {
- "environment": ['scikit-learn="1.1.1"'],
- "model": {"file": "model.pickle"},
- }
- },
- "skops": {
- "sklearn": {
- "environment": ['scikit-learn="1.1.1"'],
- "model": {"file": "model.skops"},
- }
- },
-}
-
-
-@pytest.fixture(scope="session", params=["skops", "pickle"])
-def config_json(repo_path, request):
- path = repo_path / "config.json"
- try:
- with open(path, "w") as f:
- json.dump(CONFIG[request.param], f)
- yield path, request.param
- finally:
- path.unlink(missing_ok=True)
-
-
-def test_validate_format(classifier):
- dir_path = tempfile.mkdtemp()
- shutil.rmtree(dir_path)
- with pytest.raises(ValueError, match="Cannot determine the input file*"):
- init(
- model=classifier,
- requirements=["scikit-learn"],
- dst=dir_path,
- task="tabular-classification",
- data=iris.data,
- model_format="dummy",
- )
-
-
-def test_validate_folder(config_json):
- config_path, file_format = config_json
- _, file_path = tempfile.mkstemp()
- dir_path = tempfile.mkdtemp()
- with pytest.raises(TypeError, match="The given path is not a directory."):
- _validate_folder(path=file_path)
-
- with pytest.raises(TypeError, match="Configuration file `config.json` missing."):
- _validate_folder(path=dir_path)
-
- with open(Path(dir_path) / "config.json", "w") as f:
- json.dump(dict(), f)
-
- with pytest.raises(
- TypeError, match="Model file not configured in the configuration file."
- ):
- _validate_folder(path=dir_path)
-
- shutil.copy2(config_path, dir_path)
- model_file = CONFIG[file_format]["sklearn"]["model"]["file"]
- with pytest.raises(TypeError, match=f"Model file {model_file} does not exist."):
- _validate_folder(path=dir_path)
-
- (Path(dir_path) / model_file).touch()
-
- # this should now work w/o an error
- _validate_folder(path=dir_path)
-
-
-@pytest.mark.parametrize(
- "data, task, expected_config",
- [
- (
- iris.data,
- "tabular-classification",
- {
- "sklearn": {
- "columns": [
- "petal length (cm)",
- "petal width (cm)",
- "sepal length (cm)",
- "sepal width (cm)",
- ],
- "environment": ['scikit-learn="1.1.1"', "numpy"],
- "example_input": {
- "petal length (cm)": [1.4, 1.4, 1.3],
- "petal width (cm)": [0.2, 0.2, 0.2],
- "sepal length (cm)": [5.1, 4.9, 4.7],
- "sepal width (cm)": [3.5, 3.0, 3.2],
- },
- "model": {"file": "model.pkl"},
- "task": "tabular-classification",
- }
- },
- ),
- (
- ["test", "text", "problem", "random"],
- "text-classification",
- {
- "sklearn": {
- "environment": ['scikit-learn="1.1.1"', "numpy"],
- "example_input": {"data": ["test", "text", "problem"]},
- "model": {"file": "model.pkl"},
- "task": "text-classification",
- }
- },
- ),
- ],
-)
-def test_create_config(data, task, expected_config):
- dir_path = tempfile.mkdtemp()
- _create_config(
- model_path="model.pkl",
- requirements=['scikit-learn="1.1.1"', "numpy"],
- dst=dir_path,
- task=task,
- data=data,
- )
-
- with open(Path(dir_path) / "config.json") as f:
- config = json.load(f)
- for key in ["environment", "model", "task"]:
- assert config["sklearn"][key] == expected_config["sklearn"][key]
-
- keys = ["example_input"]
- if "tabular" in task:
- # text data doesn't introduce any "columns" in the configuration
- keys += ["columns"]
- for key in keys:
- assert sorted(config["sklearn"][key]) == sorted(
- expected_config["sklearn"][key]
- )
-
-
-def test_create_config_invalid_text_data(temp_path):
- with pytest.raises(ValueError, match="The data needs to be a sequence of strings."):
- _create_config(
- model_path="model.pkl",
- requirements=['scikit-learn="1.1.1"', "numpy"],
- task="text-classification",
- data=[1, 2, 3],
- dst=temp_path,
- )
-
-
-def test_atomic_init(classifier, temp_path):
- with pytest.raises(ValueError):
- # this fails since we're passing invalid data.
- init(
- model=classifier,
- requirements=["scikit-learn"],
- dst=temp_path,
- task="tabular-classification",
- data="invalid",
- )
-
- # this passes even though the above init has failed once, on the same
- # destination path.
- init(
- model=classifier,
- requirements=["scikit-learn"],
- dst=temp_path,
- task="tabular-classification",
- data=iris.data,
- )
-
-
-def test_init_invalid_task(classifier, temp_path):
- with pytest.raises(
- ValueError, match="Task invalid not supported. Supported tasks are"
- ):
- init(
- model=classifier,
- requirements=["scikit-learn"],
- dst=temp_path,
- task="invalid",
- data=iris.data,
- )
-
-
-def test_init(classifier, config_json):
- config_path, file_format = config_json
- # create a temp directory and delete it, we just need a unique name.
- dir_path = tempfile.mkdtemp()
- shutil.rmtree(dir_path)
-
- version = metadata.version("scikit-learn")
- init(
- model=classifier,
- requirements=[f'scikit-learn="{version}"'],
- dst=dir_path,
- task="tabular-classification",
- data=iris.data,
- )
- _validate_folder(path=dir_path)
-
- # it should fail a second time since the folder is no longer empty.
- with pytest.raises(OSError, match="None-empty dst path already exists!"):
- init(
- model=classifier,
- requirements=[f'scikit-learn="{version}"'],
- dst=dir_path,
- task="tabular-classification",
- data=iris.data,
- )
-
-
-def test_init_no_warning_or_error(classifier, config_json):
- config_path, file_format = config_json
- # for the happy path, there should be no warning
- dir_path = tempfile.mkdtemp()
- shutil.rmtree(dir_path)
- version = metadata.version("scikit-learn")
-
- with warnings.catch_warnings():
- warnings.simplefilter("error")
- init(
- model=classifier,
- requirements=[f'scikit-learn="{version}"'],
- dst=dir_path,
- task="tabular-classification",
- data=iris.data,
- )
-
-
-def test_model_file_does_not_exist_raises(repo_path, config_json):
- config_path, file_format = config_json
- # when the model file does not exist, raise an OSError
- model_path = repo_path / "foobar.pickle"
- dir_path = tempfile.mkdtemp()
- shutil.rmtree(dir_path)
- version = metadata.version("scikit-learn")
-
- msg = re.escape(f"Model file '{model_path}' does not exist.")
- with pytest.raises(OSError, match=msg):
- init(
- model=model_path,
- requirements=[f'scikit-learn="{version}"'],
- dst=dir_path,
- task="tabular-classification",
- data=iris.data,
- )
- model_path.unlink(missing_ok=True)
-
-
-def test_init_empty_model_file_errors(repo_path, config_json):
- config_path, file_format = config_json
- # when model file is empty, warn users
- model_path = Path(repo_path / "foobar.pickle")
- model_path.touch()
-
- dir_path = tempfile.mkdtemp()
- shutil.rmtree(dir_path)
- version = metadata.version("scikit-learn")
-
- with pytest.raises(
- RuntimeError, match=re.escape(f"Model file '{model_path}' is empty.")
- ):
- init(
- model=model_path,
- requirements=[f'scikit-learn="{version}"'],
- dst=dir_path,
- task="tabular-classification",
- data=iris.data,
- )
- model_path.unlink(missing_ok=True)
-
-
-def test_push_deprecation():
- with pytest.raises(Exception):
- with pytest.warns(FutureWarning, match="Creating repos on hf.co is subject"):
- push(repo_id="dummy", source=".")
-
-
-@pytest.fixture
-def repo_path_for_inference():
- # Create a separate path for test_inference so that the test does not have
- # any side-effect on existing tests
- with tempfile.TemporaryDirectory(prefix="skops-test-sample-repo") as repo_path:
- yield Path(repo_path)
-
-
-@pytest.mark.network
-@pytest.mark.inference
-@pytest.mark.skipif(
- IS_SKLEARN_DEV_BUILD, reason="Inference tests cannot run with sklearn dev build"
-)
-@flaky(max_runs=3)
-@pytest.mark.parametrize(
- "model_func, data, task",
- [
- (get_classifier, iris, "tabular-classification"),
- (get_regressor, diabetes, "tabular-regression"),
- ],
- ids=["classifier", "regressor"],
-)
-def test_inference(
- config_json,
- model_func,
- data,
- task,
- repo_path_for_inference,
- destination_path,
-):
- # test inference backend for classifier and regressor models. This test can
- # take a lot of time and be flaky.
- config_path, file_format = config_json
- if file_format != "pickle":
- pytest.skip(
- f"Inference only supports pickle at the moment. Given format: {file_format}"
- )
-
- client = HfApi()
- repo_path = repo_path_for_inference
- model_file = CONFIG[file_format]["sklearn"]["model"]["file"]
- model = model_func()
- model_path = repo_path / model_file
-
- with open(model_path, "wb") as f:
- pickle.dump(model, f)
-
- version = metadata.version("scikit-learn")
- init(
- model=model_path,
- requirements=[f'scikit-learn="{version}"'],
- dst=destination_path,
- task=task,
- data=data.data,
- )
-
- # TODO: remove when card init at repo init is merged
- model_card = card.Card(
- model, metadata=card.metadata_from_config(Path(destination_path))
- )
- model_card.save(Path(destination_path) / "README.md")
-
- user = client.whoami(token=HF_HUB_TOKEN)["name"]
- repo_id = f"{user}/test-{uuid4()}"
-
- with pytest.warns(FutureWarning, match="Creating repos on hf.co is subject"):
- push(
- repo_id=repo_id,
- source=destination_path,
- token=HF_HUB_TOKEN,
- commit_message="test message",
- create_remote=True,
- # api-inference doesn't support private repos for community projects.
- private=False,
- )
-
- X_test = data.data.head(5)
- y_pred = model.predict(X_test)
- with pytest.warns(FutureWarning):
- output = get_model_output(repo_id, data=X_test, token=HF_HUB_TOKEN)
-
- # cleanup
- client.delete_repo(repo_id=repo_id, token=HF_HUB_TOKEN)
- model_path.unlink(missing_ok=True)
-
- assert np.allclose(output, y_pred)
-
-
-def test_get_model_output_deprecated():
- with pytest.raises(Exception):
- with pytest.warns(FutureWarning, match="This feature is no longer free"):
- get_model_output("dummy", data=iris.data)
-
-
-def test_get_config(repo_path, config_json):
- config_path, file_format = config_json
- config = get_config(repo_path)
-
- assert config == CONFIG[file_format]
- assert get_requirements(repo_path) == ['scikit-learn="1.1.1"']
-
-
-def test_update_env(repo_path, config_json):
- # sanity check
- assert get_requirements(repo_path) == ['scikit-learn="1.1.1"']
- update_env(path=repo_path, requirements=['scikit-learn="1.1.2"'])
- assert get_requirements(repo_path) == ['scikit-learn="1.1.2"']
-
-
-def test_get_example_input_from_tabular_data():
- with pytest.raises(
- ValueError,
- match=(
- "The data is not a pandas.DataFrame, a 2D numpy.ndarray or a "
- "list/tuple that can be converted to a 2D numpy.ndarray."
- ),
- ):
- _get_example_input_from_tabular_data("random")
- with pytest.raises(ValueError):
- _get_example_input_from_tabular_data(["a", "b", "c"])
-
- examples = _get_example_input_from_tabular_data(np.ones((5, 10)))
- # the result is a dictionary of column name: list of values
- assert len(examples) == 10
- assert len(examples["x0"]) == 3
-
- examples = _get_example_input_from_tabular_data(np.ones((5, 10)).tolist())
- # the result is a dictionary of column name: list of values
- assert len(examples) == 10
- assert len(examples["x0"]) == 3
-
- examples = _get_example_input_from_tabular_data(
- pd.DataFrame(np.ones((5, 10)), columns=[f"column{x}" for x in range(10)])
- )
- # the result is a dictionary of column name: list of values
- assert len(examples) == 10
- assert len(examples["column0"]) == 3
-
-
-@pytest.mark.parametrize(
- "data, expected_length",
- [
- (["a", "b", "c", "d"], 3),
- (np.array(["a", "b", "c", "d"]), 3),
- (set(["a", "b", "c", "d"]), 3),
- (tuple(["a", "b", "c", "d"]), 3),
- (["a"], 1),
- ([], 0),
- ],
-)
-def test_get_example_input_from_text_data(data, expected_length):
- example_input = _get_example_input_from_text_data(data)
- assert len(example_input["data"]) == expected_length
-
-
-@pytest.mark.parametrize("data", ["random", [1, 2, 3], 420])
-def test_get_example_input_from_text_data_invalid_text_data(data):
- with pytest.raises(ValueError, match="The data needs to be a sequence of strings."):
- _get_example_input_from_text_data(data)
-
-
-def test_get_column_names():
- with pytest.raises(
- ValueError,
- match=(
- "The data is not a pandas.DataFrame, a 2D numpy.ndarray or a "
- "list/tuple that can be converted to a 2D numpy.ndarray."
- ),
- ):
- _get_column_names("random")
- with pytest.raises(ValueError):
- _get_column_names(["a", "b", "c"])
-
- X_array = np.ones((5, 10), dtype=np.float32)
- expected_columns = [f"x{x}" for x in range(10)]
- assert _get_column_names(X_array) == expected_columns
-
- expected_columns = [f"column{x}" for x in range(10)]
- X_df = pd.DataFrame(X_array, columns=expected_columns)
- assert _get_column_names(X_df) == expected_columns
-
-
-def test_get_example_input_from_tabular_data_pandas_not_installed(pandas_not_installed):
- # use pandas_not_installed fixture from conftest.py to pretend that pandas
- # is not installed and check that the function does not raise when pandas
- # import fails
- _get_example_input_from_tabular_data(np.ones((5, 10)))
-
-
-def test_get_column_names_pandas_not_installed(pandas_not_installed):
- # use pandas_not_installed fixture from conftest.py to pretend that pandas
- # is not installed and check that the function does not raise when pandas
- # import fails
- _get_column_names(np.ones((5, 10)))
-
-
-class TestAddFiles:
- @pytest.fixture
- def init_path(self, classifier, config_json):
- # create temporary directory
- dir_path = tempfile.mkdtemp()
- shutil.rmtree(dir_path)
-
- version = metadata.version("scikit-learn")
- init(
- model=classifier,
- requirements=[f'scikit-learn="{version}"'],
- dst=dir_path,
- task="tabular-classification",
- data=iris.data,
- )
- yield dir_path
-
- @pytest.fixture
- def some_file_0(self, temp_path):
- filename = Path(temp_path) / "file0.txt"
- with open(filename, "w") as f:
- f.write("")
- yield filename
-
- @pytest.fixture
- def some_file_1(self, temp_path):
- filename = Path(temp_path) / "file1.txt"
- with open(filename, "w") as f:
- f.write("")
- yield filename
-
- def test_adding_one_file_path(self, init_path, some_file_0):
- add_files(some_file_0, dst=init_path)
- assert os.path.exists(Path(init_path) / some_file_0.name)
-
- def test_adding_two_file_paths(self, init_path, some_file_0, some_file_1):
- add_files(some_file_0, some_file_1, dst=init_path)
- assert os.path.exists(Path(init_path) / some_file_0.name)
- assert os.path.exists(Path(init_path) / some_file_1.name)
-
- def test_adding_one_file_str(self, init_path, some_file_0):
- add_files(str(some_file_0), dst=init_path)
- assert os.path.exists(Path(init_path) / some_file_0.name)
-
- def test_adding_two_files_str(self, init_path, some_file_0, some_file_1):
- add_files(str(some_file_0), str(some_file_1), dst=init_path)
- assert os.path.exists(Path(init_path) / some_file_0.name)
- assert os.path.exists(Path(init_path) / some_file_1.name)
-
- def test_adding_str_and_path(self, init_path, some_file_0, some_file_1):
- add_files(str(some_file_0), some_file_1, dst=init_path)
- assert os.path.exists(Path(init_path) / some_file_0.name)
- assert os.path.exists(Path(init_path) / some_file_1.name)
-
- def test_dst_does_not_exist_raises(self, some_file_0):
- dst = tempfile.mkdtemp()
- shutil.rmtree(dst)
- msg = (
- rf"Could not find \'{re.escape(dst)}\', did you run "
- r"\'skops.hub_utils.init\' first\?"
- )
- with pytest.raises(FileNotFoundError, match=msg):
- add_files(some_file_0, dst=dst)
-
- def test_file_does_not_exist_raises(self, init_path, some_file_0):
- non_existing_file = "foobar.baz"
- msg = r"File \'foobar.baz\' could not be found."
- with pytest.raises(FileNotFoundError, match=msg):
- add_files(some_file_0, non_existing_file, dst=init_path)
-
- def test_adding_existing_file_works_if_exist_ok(self, init_path, some_file_0):
- add_files(some_file_0, dst=init_path)
- assert os.path.exists(Path(init_path) / some_file_0.name)
- add_files(some_file_0, dst=init_path, exist_ok=True)
- assert os.path.exists(Path(init_path) / some_file_0.name)
-
- def test_adding_existing_file_raises(self, init_path, some_file_0):
- # first time around no warning
- with warnings.catch_warnings():
- warnings.simplefilter("error")
- add_files(some_file_0, dst=init_path, exist_ok=False)
-
- msg = (
- f"File '{re.escape(some_file_0.name)}' already found "
- f"at '{re.escape(init_path)}'."
- )
- with pytest.raises(FileExistsError, match=msg):
- add_files(some_file_0, dst=init_path)
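Reviewer note: the deleted tests for `_get_example_input_from_tabular_data` pin down a small contract: for plain 2D data, the example input is a mapping from generated column names (``x0``, ``x1``, ...) to the first three values of each column. A pure-Python sketch of that behavior, under the assumption that the input is a list of rows:

```python
def example_input_from_tabular(rows):
    # Sketch of the contract the removed tests assert: a
    # {column_name: first-3-values} mapping with generated names
    # x0, x1, ... (hypothetical stand-in, not the skops implementation).
    n_cols = len(rows[0])
    return {f"x{i}": [row[i] for row in rows[:3]] for i in range(n_cols)}
```

For a 5x10 input this yields 10 keys with 3 values each, matching the ``len(examples) == 10`` / ``len(examples["x0"]) == 3`` assertions above.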