Use logical or for grounding dino models #2
Open
lmarshall12 wants to merge 50 commits into main from
Conversation
* Import Sequence from collections.abc Signed-off-by: cyy <cyyever@outlook.com> * Apply ruff UP rules Signed-off-by: cyy <cyyever@outlook.com> --------- Signed-off-by: cyy <cyyever@outlook.com>
* Support MUSA (Moore Threads GPU) backend in transformers Add accelerate version check, needs accelerate>=0.33.0 * Support TF32 flag for MUSA backend * fix typo
remove flag
* docs: ko: deepseek_v3.md * feat: nmt draft * fix: manual edits * fix: glossary edits * docs: 4N3MONE recommended modified contents * Update docs/source/ko/model_doc/deepseek_v3.md Co-authored-by: Kim Juwon <81630351+Kim-Ju-won@users.noreply.github.com> * Update docs/source/ko/model_doc/deepseek_v3.md Co-authored-by: Kim Juwon <81630351+Kim-Ju-won@users.noreply.github.com> * add _toctree.yml --------- Co-authored-by: Kim Juwon <81630351+Kim-Ju-won@users.noreply.github.com>
…40623) * fix * fix * fix * fix * fix * fix * fix * fix * fix * fix * fix * fix * fix * fix * fix --------- Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>
* fix: gas for gemma fixed * feat: run fix-copies * feat: added issue label
propagate kwargs
…gface#40619) Fix attention mask validation for context parallelism Co-authored-by: Marc Sun <57196510+SunMarc@users.noreply.github.com>
huggingface#40643) * fix * fix * fix --------- Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>
Force-pushed from c164edc to 8d1f7f2
[For maintainers] Suggested jobs to run (before merge) run-slow: auto, bamba, falcon_h1, falcon_mamba, florence2, gemma3, gemma3n, granitemoehybrid, grounding_dino, mamba, mamba2, mm_grounding_dino, sam2, sam2_video, zamba2
* add DeepseekV3ForTokenClassification * fix typo --------- Co-authored-by: json.bourne <json.bourne@kakaocorp.com>
…ingface#40565) * fix MetaCLIP 2 wrong link & wrong model names in the documentation and docstrings * ruff reformatted * update files generated by modular * update meta_clip2 to metaclip_2 to match the original * _supports_flash_attn = False --------- Co-authored-by: Yung-Sung Chuang <yungsung@meta.com>
* Remove TF/Flax examples * Remove check_full_copies * Trigger CI
…ace#40655) * fix * fix --------- Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>
Signed-off-by: jiqing-feng <jiqing.feng@intel.com> Co-authored-by: Marc Sun <57196510+SunMarc@users.noreply.github.com>
* Fix Metaclip modular conversion * manually run check_copies
Signed-off-by: cyy <cyyever@outlook.com>
… args structure (huggingface#40586) * Squashed commit of the following: standardize `_get_stopping_criteria`, watch `super().generate()` usages, fix custom generate args and refactor generation-mode args, restore and deprecate beam objects and constraints, remove constrained and group beam search plus contrastive search, and drop the corresponding tests once CI is green (full per-commit squash log omitted) * ops * fix * ops * review * fix * fix dia * review
* fix * fix --------- Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>
…uggingface#40663) fix Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>
* Fix attn_implementation for output_attentions * remove setting attention, just raise warning * improve message * Update src/transformers/utils/generic.py
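The warn-instead-of-override behavior described above can be sketched as follows; this is a hypothetical helper, not the actual code in `src/transformers/utils/generic.py`:

```python
import warnings

def check_attention_support(attn_implementation: str, output_attentions: bool) -> None:
    """Warn (rather than silently switching implementations) when the caller
    requests attention weights under a fused kernel that cannot return them."""
    if output_attentions and attn_implementation != "eager":
        warnings.warn(
            f"`output_attentions=True` is not supported with "
            f"attn_implementation='{attn_implementation}'; attention weights "
            "will not be returned. Set attn_implementation='eager' to get them.",
            UserWarning,
        )
```

Leaving the user's chosen implementation in place and warning avoids surprising performance changes mid-run.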
…udio` (huggingface#40664) * Skip `test_prompt_lookup_decoding_matches_greedy_search` for `qwen2_audio` * Skip `test_prompt_lookup_decoding_matches_greedy_search` for `qwen2_audio` --------- Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>
…face#40666) fix Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>
* Start revamping benchmarking * Start refactoring benchmarking * Use Pandas for CSV * import fix * Remove benchmark files * Remove sample data * Address review comments * Benchmarking v2 * Fix llama bench parameters * Working checkpoint * Readme touchups * Remove unnecessary test * Massage the framework a bit * Small cleanup * Remove unnecessary flushes * Remove references to mock benchmark * Take commit ID from CLI * Address review comments * Use Events for thread comms * Tiny renaming
fix Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>
fix Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>
…ng to build (huggingface#40677) fix Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>
…ight_sdpa_kernels` as flaky (huggingface#40683) * fix * fix --------- Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>
* fix * fix --------- Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>
…ggingface#40562) * add seq class for gemma3 text model * add Gemma3TextForSequenceClassification to modeling file * After run make fixup * let's just check * this is why it was crashing, tests were just failing... * skip it, tested only for seq clf --------- Co-authored-by: Raushan Turganbay <raushan@huggingface.co>
…Quantize.from_latents() (huggingface#40665) * Add instance attribute to DacVectorQuantize for use in DacResidualVectorQuantize.from_latents * add from_latent tests * style fix * Fix style for test_modeling_dac.py
…40669) * fix broken offline mode when loading tokenizer from hub * formatting * make quality * fix import order
* load a tiny video to make CI faster * add video in url_to_local_path
* run * build * build * fix --------- Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>
* Gemma 3 for Embeddings * Style fixes * Rename conversion file for consistency * Default padding side emb vs gen * Corrected 270m config * style fixes * EmbeddingGemma config * TODO for built-in prompts * Resolving the sentence similarity bug and updating the architecture * code style * Add query prompt for SentenceTransformers * Code quality * Fixing or_mask_function return types * Adding placeholder prompts for document and passage * Finalizing prompt templates * Adding Retrieval to preconfigured prompts * Add Gemma 3 270M Config * Correcting num_linear_layers flag default * Export Sentence Transformer in correct dtype --------- Co-authored-by: Sindhu Raghuram <sindhuraghuram@google.com>
* feat: support request cancellation * test: add cancellation test * refactor: use existing fn to check req cancellation * feat(cb): make cancellation thread safe * refactor(serve): update test to use `requests` instead of `httpx`
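The thread-safe cancellation idea above can be sketched with `threading.Event`, whose set/is_set pair is safe across threads; the class and loop below are illustrative stand-ins, not the actual serve or continuous-batching API:

```python
import threading

class Request:
    """A request whose cancellation flag may be flipped from any thread."""

    def __init__(self):
        self._cancelled = threading.Event()  # atomic set/check, no lock needed

    def cancel(self):
        self._cancelled.set()

    @property
    def cancelled(self):
        return self._cancelled.is_set()

def generate_tokens(request, max_steps=5):
    """Toy decoding loop that polls for cancellation between steps."""
    out = []
    for step in range(max_steps):
        if request.cancelled:  # cheap check once per decoding step
            break
        out.append(step)
    return out
```

In a real server the `cancel()` call would come from the request-handling thread while `generate_tokens` runs on a worker.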
…face#40671) * Fixing bug when replacing text-audio token placeholders with audio embeddings * apply changes --------- Co-authored-by: Eustache Le Bihan <eulebihan@gmail.com> Co-authored-by: eustlb <94853470+eustlb@users.noreply.github.com>
* Change docker image to preview for the MI355 CI * Use pushed image
…gingface#40667) Fix dropout_p is not defined for SamAttention/Sam2Attention
* fix * add a test case
* Fix broken Llama4 accuracy in MoE part Llama4 accuracy is broken by a bug in huggingface#39501 . It forgot to transpose the router_scores before applying it to routed_in, causing Llama4 to generate garbage output. This PR fixes that issue by adding back the transpose() and adding some comments explaining why the transpose() is needed. Signed-off-by: Po-Han Huang <pohanh@nvidia.com> * remove comment --------- Signed-off-by: Po-Han Huang <pohanh@nvidia.com> Co-authored-by: Cyril Vallez <cyril.vallez@gmail.com>
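The layout mismatch behind that fix can be shown schematically; the shapes and variable names below are illustrative, not the exact Llama4 code. A router that emits scores as (experts, tokens) must be transposed to token-major order before flattening, or each score multiplies the wrong row:

```python
import torch

tokens, experts, hidden = 3, 2, 4
hidden_states = torch.randn(tokens, hidden)
router_scores = torch.randn(experts, tokens)  # expert-major, as a router might emit

# Routed activations laid out token-major: row t*experts + e holds token t for expert e.
routed_in = hidden_states.repeat_interleave(experts, dim=0)

# Transpose to (tokens, experts) so the flattened scores line up with routed_in rows;
# flattening router_scores directly would pair scores with the wrong tokens.
scaled = routed_in * router_scores.transpose(0, 1).reshape(-1, 1)

# Row-by-row reference confirms the pairing.
ref = torch.stack(
    [hidden_states[t] * router_scores[e, t] for t in range(tokens) for e in range(experts)]
)
assert torch.allclose(scaled, ref)
```

Skipping the transpose still broadcasts without error, which is why this kind of bug produces garbage output rather than a crash.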
…ky (huggingface#40702) fix Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>
…ggingface#40657) * Squashed previous branch * unify assisted generate to common decoding method signature * move checks to validate steps where possible * fix csm and other models that override _sample * ops dia you again * opsie * joao review
fix Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>
* initial commit * initial setup * Overriding imageGPT specific functions * imported is_torch_available and utilized it for importing torch in imageGPT fast * Created init and ImageGPTFastImageProcessorKwargs * added return_tensors, data_format, and input_data_format to ImageGPTFastImageProcessorKwargs * set up arguments and process and _preprocess definitions * Added arguments to _preprocess * Added additional optional arguments * Copied logic over from base imageGPT processor * Implemented 2nd draft of fast imageGPT preprocess using batch processing * Implemented 3rd draft of imageGPT fast _preprocessor. Pulled logic from BaseImageProcessorFast * modified imageGPT test file to properly run fast processor tests * converts images to torch.float32 from torch.uint8 * fixed a typo with self.image_processor_list in the imagegpt test file * updated more instances of image_processing = self.image_processing_class in the test file to test fast processor * standardized normalization to not use image mean or std * Merged changes from solution2 branch * Merged changes from solution2 test file * fixed testing through baseImageGPT processor file * Fixed check_code_quality test. Removed unnecessary list comprehension. * reorganized imports in image_processing_imagegpt_fast * formatted image_processing_imagegpt_fast.py * Added arg documentation * Added FastImageProcessorKwargs class + Docs for new kwargs * Reformatted previous * Added F to normalization * fixed ruff linting and cleaned up fast processor file * implemented requested changes * fixed ruff checks * fixed formatting issues * fix(ruff after merging main) * simplify logic and reuse standard equivalence tests --------- Co-authored-by: Ethan Ayaay <ayaayethan@gmail.com> Co-authored-by: chris <christine05789@gmail.com> Co-authored-by: Ethan Ayaay <98191976+ayaayethan@users.noreply.github.com> Co-authored-by: yonigozlan <yoni.gozlan@huggingface.co>
…ferent backends in grounding dino model
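A minimal illustration of the "logical or" idea in the title (generic code, not the actual Grounding DINO implementation): boolean "masked-out" masks compose exactly with `torch.logical_or`, and the boolean result can then be converted to whichever additive or boolean format a given attention backend expects:

```python
import torch

def combine_masked_out(pad_mask: torch.Tensor, token_mask: torch.Tensor) -> torch.Tensor:
    """A position is masked out if either rule masks it, so boolean masks
    compose with logical OR regardless of the attention backend in use."""
    return torch.logical_or(pad_mask, token_mask)

pad = torch.tensor([[False, False, True]])  # last position is padding
tok = torch.tensor([[False, True, False]])  # middle position masked by token-level rules
combined = combine_masked_out(pad, tok)
```

Composing in boolean form sidesteps backend-specific handling of additive masks entirely.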
Force-pushed from d9f5b79 to 789a56d
What does this PR do?
Fixes # (issue)
Before submitting
- Did you read the contributor guideline, Pull Request section?
- Was this discussed/approved via a GitHub issue or the forum? Please add a link to it if that's the case.
- Did you make sure to update the documentation with your changes? Here are the documentation guidelines, and here are tips on formatting docstrings.
Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.