
Remove requests code#42845

Merged
ArthurZucker merged 35 commits into huggingface:main from
omkar-334:remove-requests-code
Jan 23, 2026

Conversation

@omkar-334
Contributor

@omkar-334 omkar-334 commented Dec 12, 2025

What does this PR do?

Removes requests code from setup, src and utils files.
Fixes #42817 partially

  • Quality checks: make fixup passes with no errors

cc @CoderTCY @Wauplin @Rocketknight1

notes -

  1. I've removed requests from setup/workflow files as well; you might want to check the first commit.
  2. A few formatting fixes, like spacing around operators, slipped in as well.

@guibruand

guibruand commented Dec 15, 2025

Hello, I ran into some issues with the current port to httpx.
I managed to set up huggingface_hub calls using set_factory_client() with my custom httpx Client, and it would be nice if all httpx sessions in transformers relied on the same mechanism.

One way to do it would be, for example:

from huggingface_hub import get_session

client = get_session()
result = client.post(...)

This also concerns some other portions of code (like safetensors_conversion.py).
Thank you for your support.

@Wauplin
Contributor

Wauplin commented Dec 19, 2025

Agree with @guibruand 's suggestion here! It's best to use huggingface_hub.get_session and then deal with the returned Client as if it was httpx directly. @omkar-334 would you like to give it a try? 🙏

@omkar-334
Contributor Author

Agree with @guibruand 's suggestion here! It's best to use huggingface_hub.get_session and then deal with the returned Client as if it was httpx directly. @omkar-334 would you like to give it a try? 🙏

Got it. I'll try using that and let you know!

@Wauplin
Contributor

Wauplin commented Dec 19, 2025

Thank you very much! 🤗

@omkar-334
Contributor Author

omkar-334 commented Dec 20, 2025

Hey @Wauplin, so I've used get_session instead of using httpx directly.
I've called the function once and reused it throughout the file. Is this appropriate, or should we call get_session for each request?
As far as I know, get_session already manages a global client and returns it, right?

Thanks
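The caching behavior being asked about can be illustrated with a toy sketch. This is not the real huggingface_hub implementation; the class and factory names are stand-ins chosen for illustration only:

```python
import functools

class _FakeClient:
    """Stand-in for an httpx.Client (hypothetical, for illustration)."""
    pass

@functools.lru_cache(maxsize=1)
def get_session() -> _FakeClient:
    # First call builds the client; every later call returns the same
    # cached instance, so calling the factory repeatedly is cheap.
    return _FakeClient()

a = get_session()
b = get_session()
print(a is b)  # True: both names point at the one shared client
```

If the real factory behaves like this, calling it once and reusing the result versus calling it per request makes no practical difference.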

Contributor

@Wauplin Wauplin left a comment


Hi @omkar-334 , thanks for the changes and sorry for the delay before reviewing. I'm very sorry about my previous comment regarding get_session. I should have more carefully double-checked the changes and context before answering. It turns out most (if not all) changes made in the PR would better benefit from using httpx directly for simplicity rather than huggingface_hub.get_session.

While get_session is better suited to make requests to the Hub (it has HF-specific features), plain httpx is better suited for generic cases that are not HF-related. Also, let's use httpx in all 1-file scripts and utilities (since they are mostly meant for devs/maintainers rather than end users).

Sorry again about this change of direction 🙏

Comment thread src/transformers/cli/chat.py Outdated
import requests
import tensorflow as tf
import torch
from huggingface_hub import get_session
Contributor


Same as above (and sorry for not being explicit before). For single-file scripts I think it's best to keep things as lean as possible, i.e. use httpx directly as a replacement for requests.

get_session is useful when calling the Hub. It adds features to handle request IDs and manage offline mode. When in doubt you can use get_session, but otherwise keep httpx for light calls to "generic endpoints" (like here, loading an image from the internet).

Comment thread src/transformers/models/beit/convert_beit_unilm_to_pytorch.py Outdated
Comment thread src/transformers/pipelines/image_to_image.py Outdated
Comment thread src/transformers/pipelines/zero_shot_audio_classification.py Outdated
Comment thread utils/check_bad_commit.py Outdated
@omkar-334
Contributor Author

omkar-334 commented Jan 9, 2026

Hey @Wauplin, I went through your comments and understood the changes I need to make. Thanks for clarifying.
Just to be sure, this is what we want:

  • get_session: HF Hub-specific requests

  • httpx: single-file scripts, utilities, examples, local/generic endpoints, convert scripts, loading images

Will make the changes and push soon...
Thanks!
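The split above can be sketched as a small routing helper. This is purely illustrative; the helper name and host list are assumptions, not anything in transformers or huggingface_hub:

```python
from urllib.parse import urlparse

# Hypothetical Hub host list, for illustration only.
HUB_HOSTS = {"huggingface.co", "hf.co"}

def pick_client(url: str) -> str:
    """Return which HTTP client the rule above would choose for a URL."""
    host = urlparse(url).hostname or ""
    if host in HUB_HOSTS or host.endswith(".huggingface.co"):
        return "get_session"  # HF-specific: request IDs, offline mode
    return "httpx"            # generic endpoint / one-file script

print(pick_client("https://huggingface.co/datasets/foo/bar.png"))  # get_session
print(pick_client("https://example.com/image.png"))                # httpx
```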

@omkar-334
Contributor Author

omkar-334 commented Jan 9, 2026

Hey @Wauplin, I've made the changes.

  1. Currently there are only 6 instances where get_session is used, and they are all images hosted on HF. Is this alright, or should I revert these to httpx too?
  2. Also, in src/transformers/safetensors_conversion.py, what should be used? Currently, httpx is used for this URL: "https://safetensors-convert.hf.space"
  3. There are a few docstrings where requests is used. Since the docs are user-facing, should I leave these as they are, or change them to httpx as well?

Thanks!

@Wauplin
Contributor

Wauplin commented Jan 15, 2026

@bot /style

@github-actions
Contributor

Style fix is beginning .... View the workflow run here.

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@Wauplin
Contributor

Wauplin commented Jan 15, 2026

Hi @omkar-334, sorry for getting back to this only now. A few notes:

  • Some changes are purely formatting issues. Example:
- [0.0000, 0.0000, 0.0000, 0.0000, 0.5300, 0.0000, 0.0000, 0.0000, 0.0000, 1.1221]
+            [
+                0.0000,
+                0.0000,
+                0.0000,
+                0.0000,
+                0.5300,
+                0.0000,
+                0.0000,
+                0.0000,
+                0.0000,
+                1.1221,
+            ]

Can you revert them? The PR is already huge like this so I'd like to keep it "as small as possible".

Same for the

- (f"encoder.deit.blocks.{i}.norm2.weight", f"encoder.encoder.layer.{i}.layernorm_after.weight")
+            (
+                f"encoder.deit.blocks.{i}.norm2.weight",
+                f"encoder.encoder.layer.{i}.layernorm_after.weight",
+            )

etc.

  • Same comment for the pathlib updates
-    with open(filename, "w") as f:
-        f.write("\n".join(new_file_lines))
+      Path(filename).write_text("\n".join(new_file_lines))

While I agree with this code style, I do think these changes should be made separately.

Currently there are only 6 instances where get_session is used - and they are all images hosted on HF. is this alright or revert these to httpx too?

Let's keep them as they are before this PR.

Also, in src/transformers/safetensors_conversion.py, what should be used? Currently, httpx is used for this url - "https://safetensors-convert.hf.space"

httpx is good in this case

There are a few docstrings where requests is used. Since the docs are user-facing, should i let this be as it is, or change it to httpx as well?

Let's change them to httpx as well.

Thanks in advance!

@omkar-334
Contributor Author

Hey @Wauplin, I've reverted the formatting changes and updated the docstrings. I think it's good to go now.
For some reason, the tests are failing. Could you tell me why, so that I can fix them?

Thanks!

Comment thread src/transformers/pipelines/image_to_image.py Outdated
Comment thread src/transformers/testing_utils.py Outdated
Comment thread src/transformers/models/vision_encoder_decoder/modeling_vision_encoder_decoder.py Outdated
Contributor

@Wauplin Wauplin left a comment


Thanks for this last iteration @omkar-334! I've made a few changes to fix the last remaining issues and reviewed all files manually. Took some time but definitely a good thing to clean up!

Let's wait for a last approval from an official @huggingface/transformers-core-maintainers 🤗

Comment thread src/transformers/models/idefics2/modeling_idefics2.py Outdated
Comment thread src/transformers/models/idefics3/modeling_idefics3.py Outdated
@Wauplin Wauplin requested a review from a team January 22, 2026 13:46
@Wauplin
Contributor

Wauplin commented Jan 22, 2026

The failing test seems unrelated:

FAILED tests/models/udop/test_processing_udop.py::UdopProcessorTest::test_model_input_names - TypeError: TextInputSequence must be str
FAILED tests/models/udop/test_processing_udop.py::UdopProcessorTest::test_processor_with_multiple_inputs - TypeError: TextInputSequence must be str
============ 2 failed, 225 passed, 297 skipped in 108.35s (0:01:48) ============

Exited with code exit status 1

I cannot reproduce it locally 😕

@github-actions
Contributor

[For maintainers] Suggested jobs to run (before merge)

run-slow: aimv2, align, altclip, aria, beit, bit, blip, blip_2, bridgetower, chameleon, chinese_clip, clip, clipseg

@ArthurZucker ArthurZucker merged commit db1e6f1 into huggingface:main Jan 23, 2026
14 of 25 checks passed
@omkar-334
Contributor Author

Thanks for this last iteration @omkar-334! I've made a few changes to fix the last remaining issues and reviewed all files manually. Took some time but definitely a good thing to clean up!

Thanks for the careful review and cleanup, @Wauplin! My auto-format-on-save and search-and-replace in VS Code messed things up, but it looks good in the end!

@Wauplin
Contributor

Wauplin commented Jan 23, 2026

Thanks for your work! It'll be shipped as part of the v5 release (coming very soon).

SangbumChoi pushed a commit to SangbumChoi/transformers that referenced this pull request Jan 23, 2026
* setup, workflow and utils files

* src core files

* use get_session instead of httpx

* fix code quality - remove unused imports

* utils and prevous-http-used and cli

* style changes

* use httpx for non-HF images

* get_session for HF images

* change docstrings

* revert formatting - 1

* revert formatting - 2

* docstrings fixes

* revert formatting - 3

* docstring fixes

* fix failing test

* Update src/transformers/pipelines/image_to_image.py

* Update src/transformers/testing_utils.py

* fix modular generation check

* fix consistency

* Apply suggestions from code review

* Update src/transformers/models/idefics2/modeling_idefics2.py

* Update src/transformers/models/idefics3/modeling_idefics3.py

---------

Co-authored-by: Lucain <lucainp@gmail.com>
ydshieh added a commit that referenced this pull request Jan 26, 2026
Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>

@Pediboi666 Pediboi666 left a comment


Done


@Pediboi666 Pediboi666 left a comment


Cool



Development

Successfully merging this pull request may close these issues.

Remove unnessary requests module

6 participants