
soft_fail | operator is skipped in all cases, not only on "data"-related failures #40787

@raphaelauv

Description

Apache Airflow version

2.9.2

What happened?

S3KeySensor with soft_fail does not fail on configuration errors (missing credentials or a missing Airflow connection).

What do you think should happen instead?

S3KeySensor with soft_fail should fail on missing configuration and skip only when the S3 data is missing.
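
As a hedged illustration of that expectation (this is not the provider's actual code; the class name, boto3 usage and error handling below are assumptions), only the "key not found yet" path would be eligible for a skip, while credential or connection errors would still propagate and fail the task:

# Minimal sketch of the expected soft_fail split (hypothetical, not the
# Airflow/provider implementation).
import boto3
from botocore.exceptions import ClientError, NoCredentialsError

from airflow.sensors.base import BaseSensorOperator


class ExpectedBehaviourS3KeySensor(BaseSensorOperator):
    """Hypothetical sensor illustrating the expected soft_fail semantics."""

    def __init__(self, *, bucket_name: str, key: str, **kwargs):
        super().__init__(**kwargs)
        self.bucket_name = bucket_name
        self.key = key

    def poke(self, context) -> bool:
        client = boto3.client("s3")
        try:
            client.head_object(Bucket=self.bucket_name, Key=self.key)
            return True
        except NoCredentialsError:
            # Configuration problem: let it propagate so the task is
            # marked FAILED even when soft_fail=True.
            raise
        except ClientError as err:
            if err.response["Error"]["Code"] in ("404", "NoSuchKey"):
                # Data problem: the key is simply not there yet, so keep
                # poking; with soft_fail=True a timeout on this path is
                # the only case that should end in SKIPPED.
                return False
            # Anything else (access denied, missing bucket, ...) is a real
            # failure rather than a skip.
            raise
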

How to reproduce

from airflow import DAG
from airflow.providers.amazon.aws.sensors.s3 import S3KeySensor

with DAG(
    dag_id="xx",
    schedule_interval=None,
):
    # No AWS credentials and no "aws_default" Airflow connection are
    # configured, so this poke can only fail for configuration reasons,
    # not because the data is missing.
    S3KeySensor(
        task_id="s3_sensor",
        bucket_key="a/*",
        bucket_name="toto",
        wildcard_match=True,
        soft_fail=True,
        timeout=10,
        poke_interval=15,
    )
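
With no AWS credentials and no aws_default connection configured, the task log shows the sensor ending up SKIPPED instead of FAILED:
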
[2024-07-15T10:02:01.886+0000] {baseoperator.py:400} WARNING - S3KeySensor.execute cannot be called outside TaskInstance!
[2024-07-15T10:02:01.886+0000] {s3.py:117} INFO - Poking for key : s3://toto/a/*
[2024-07-15T10:02:01.905+0000] {base_aws.py:587} WARNING - Unable to find AWS Connection ID 'aws_default', switching to empty.
[2024-07-15T10:02:01.906+0000] {base_aws.py:164} INFO - No connection ID provided. Fallback on boto3 credential strategy (region_name=None). See: https://boto3.amazonaws.com/v1/documentation/api/latest/guide/configuration.html
[2024-07-15T10:02:03.675+0000] {taskinstance.py:441} INFO - ::group::Post task execution logs
[2024-07-15T10:02:03.675+0000] {taskinstance.py:2506} INFO - Skipping due to soft_fail is set to True.
[2024-07-15T10:02:03.685+0000] {taskinstance.py:1206} INFO - Marking task as SKIPPED.
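
Not part of the original report, but as a rough sketch of a possible stop-gap until the behaviour is fixed (the task name and wiring below are assumptions): an upstream task that resolves the Airflow connection explicitly will fail hard on a missing connection, regardless of soft_fail on the downstream sensor.

# Hypothetical stop-gap (not from the report): fail fast when the Airflow
# connection is missing, so configuration problems surface as FAILED
# instead of being absorbed by the sensor's soft_fail.
from airflow.decorators import task
from airflow.hooks.base import BaseHook


@task
def assert_aws_connection(conn_id: str = "aws_default") -> None:
    # Raises AirflowNotFoundException if the connection does not exist;
    # this task has no soft_fail, so the DAG run fails visibly.
    BaseHook.get_connection(conn_id)

Chained as assert_aws_connection() >> s3_sensor inside the DAG, the missing-connection case would then fail before the sensor ever pokes.
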

Are you willing to submit a PR?

  • Yes I am willing to submit a PR!

