[tests] run one test but got 2 test results #35159

@faaany

Description

System Info

  • transformers version: 4.47.0.dev0
  • Platform: Linux-4.18.0-425.3.1.el8.x86_64-x86_64-with-glibc2.35
  • Python version: 3.10.12
  • Huggingface_hub version: 0.26.5
  • Safetensors version: 0.4.5
  • Accelerate version: 1.1.0.dev0
  • Accelerate config: - compute_environment: LOCAL_MACHINE
    - distributed_type: MULTI_GPU
    - mixed_precision: bf16
    - use_cpu: False
    - debug: False
    - num_processes: 2
    - machine_rank: 0
    - num_machines: 1
    - gpu_ids: all
    - rdzv_backend: static
    - same_network: True
    - main_training_function: main
    - enable_cpu_affinity: False
    - downcast_bf16: no
    - tpu_use_cluster: False
    - tpu_use_sudo: False
    - tpu_env: []
  • PyTorch version (GPU?): 2.5.1+cu121 (True)
  • Tensorflow version (GPU?): not installed (NA)
  • Flax version (CPU?/GPU?/TPU?): not installed (NA)
  • Jax version: not installed
  • JaxLib version: not installed
  • Using distributed or parallel set-up in script?:
  • Using GPU in script?:
  • GPU type: NVIDIA A100 80GB PCIe

Who can help?

@ydshieh

Information

  • The official example scripts
  • My own modified scripts

Tasks

  • An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
  • My own task or dataset (give details below)

Reproduction

With the following test command:

pytest -rA tests/models/mra/test_modeling_mra.py::MraModelTest::test_load_with_mismatched_shapes

I get 2 test results:

======================================================== short test summary info =========================================================
PASSED tests/models/mra/test_modeling_mra.py::MraModelTest::test_load_with_mismatched_shapes
[Testing <class 'transformers.models.mra.modeling_mra.MraForSequenceClassification'>] SUBFAIL tests/models/mra/test_modeling_mra.py::MraModelTest::test_load_with_mismatched_shapes - ValueError: sequence length must be divisible by the block_size.
================================================ 1 failed, 1 passed, 3 warnings in 6.27s =================================================

Expected behavior

I expect that this command gives me only one test result rather than two.

Below are the experiments I tried:

  • with pytest-subtests installed and the line `with self.subTest(msg=f"Testing {model_class}"):` commented out, the test passes as 1 test case
  • with pytest-subtests uninstalled and the same line commented out, the test passes as 1 test case
  • with pytest-subtests uninstalled and the line kept, the test fails as 1 test case

I also saw your other PR #34806. Maybe this is another issue with pytest-subtests?
