
Conversation

@jerrypeng (Contributor)

What changes were proposed in this pull request?

Add some additional end-to-end tests for RTM (real-time mode).

Why are the changes needed?

To have better test coverage for RTM functionality

Does this PR introduce any user-facing change?

no

How was this patch tested?

N/A. Only tests are added

Was this patch authored or co-authored using generative AI tooling?

no

@dongjoon-hyun (Member)

Thank you, @jerrypeng .

import org.apache.spark.sql.test.TestSparkSession
import org.apache.spark.sql.types.{IntegerType, StringType, StructType}

class StreamRealTimeModeE2ESuite extends StreamRealTimeModeE2ESuiteBase {
Member

Could you check the CI failure?

2025-11-04T07:01:24.2425213Z [error] Failed tests:
2025-11-04T07:01:24.2426468Z [error] 	org.apache.spark.sql.streaming.StreamRealTimeModeSuite

override protected def createSparkSession =
  new TestSparkSession(
    new SparkContext(
      "local[15]",
Member

Can we use smaller values like other tests do? According to the commit logs, RTM test suites seem to use high values like this. I'm wondering if this is required for some reason.

Contributor Author

@dongjoon-hyun when a streaming query runs in RTM, all stages of the query run concurrently, which means the cluster needs a number of available slots equal to the total number of tasks across all stages.
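The sizing constraint described above can be sketched numerically. The per-stage task counts below are hypothetical (the suite only shows that it uses a `local[15]` master), so this is an illustration of the arithmetic, not the suite's actual query plan:

```scala
// Hypothetical illustration of RTM slot sizing: in real-time mode all
// stages are scheduled at once, so the local master needs one slot per
// task, summed across every stage. The stage shape here is made up.
object RtmSlotSizing {
  def main(args: Array[String]): Unit = {
    val tasksPerStage = Seq(5, 5, 5)      // e.g. a 3-stage query, 5 tasks each
    val requiredSlots = tasksPerStage.sum // 15 slots must be free at once
    println(s"local[$requiredSlots]")     // prints local[15], matching the suite
  }
}
```

By contrast, a classic micro-batch query runs its stages one after another, so roughly `local[max(tasksPerStage)]` would suffice; it is the sum over all stages that makes RTM suites slot-hungry.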

@dongjoon-hyun (Member)

Gentle ping, @jerrypeng .

@dongjoon-hyun (Member)

Gentle ping once more, @jerrypeng .

@dongjoon-hyun (Member)

dongjoon-hyun commented Nov 14, 2025

Hi, @jerrypeng . Could you share your thoughts about this PR?

@dongjoon-hyun dongjoon-hyun changed the title [SPARK-53998] Add addition E2E tests for RTM [SPARK-53998][TESTS] Add addition E2E tests for RTM Nov 14, 2025
@dongjoon-hyun (Member)

Gentle ping, @jerrypeng .

@dongjoon-hyun (Member)

Gentle ping, @jerrypeng .

@jerrypeng (Contributor Author)

@dongjoon-hyun sorry for the delay. Let me take a look at the failure.

@dongjoon-hyun (Member)

Thank you, @jerrypeng .

@dongjoon-hyun (Member) left a comment

+1, LGTM.

Merged to master/4.1 for Apache Spark 4.1.0 RC3.

dongjoon-hyun pushed a commit that referenced this pull request Dec 10, 2025
### What changes were proposed in this pull request?

Add some additional end to end tests for RTM

### Why are the changes needed?

To have better test coverage for RTM functionality

### Does this PR introduce _any_ user-facing change?

no

### How was this patch tested?

N/A. Only tests are added

### Was this patch authored or co-authored using generative AI tooling?

no

Closes #52870 from jerrypeng/SPARK-53998-2.

Authored-by: Jerry Peng <jerry.peng@databricks.com>
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>
(cherry picked from commit 7df7dad)
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>
@jerrypeng (Contributor Author)

@dongjoon-hyun thank you!

@dongjoon-hyun (Member)

Thank YOU!

baibaichen added a commit to apache/incubator-gluten that referenced this pull request Jan 13, 2026
* Move GlutenStreamingQuerySuite to correct package

* Add Spark 4.1 new test suites for Gluten

* Enable new and existing Gluten test suites for Spark 4.1 UT

* Update workflow trigger paths to exclude Spark 4.0 and 4.1 shims directories for clickhouse backend

* Add support for Spark 4.1 in build script

* Merge Spark 4.1.0 sql-tests into Gluten Spark 4.1 (three-way merge)

Three-way merge performed using Git:
- Base: Spark 4.0.1 (29434ea766b)
- Left: Spark 4.1.0 (e221b56be7b)
- Right: Gluten Spark 4.1 backends-velox

Summary:
- Auto-merged: 165 files
- New tests added: 31 files (collations, edge cases, recursion, spatial, etc.)
- Modified tests: 134 files
- Deleted tests: 2 files (collations.sql -> split into 4 files, timestamp-ntz.sql)

Conflicts resolved:
- inputs/timestamp-ntz.sql: Right deleted + Left modified -> DELETED (per resolution rule)

New test suites from Spark 4.1.0:
- Collations (4 files): aliases, basic, padding-trim, string-functions
- Edge cases (6 files): alias-resolution, extract-value, join-resolution, etc.
- Advanced features: cte-recursion, generators, kllquantiles, thetasketch, time
- Name resolution: order-by-alias, session-variable-precedence, runtime-replaceable
- Spatial functions: st-functions (ANSI and non-ANSI variants)
- Various resolution edge cases

Total files after merge: 671 (up from 613)

* Enable additional Spark 4.1 SQL tests by resolving TODOs

* Add new Spark 4.1 test files to VeloxSQLQueryTestSettings

* [Fix] Replace `RuntimeReplaceable` with its `replacement` to fix UT.

see apache/spark#50287

* [4.1.0] Exclude "infer shredding with mixed scale"

see apache/spark#52406

* [Fix] Implement Kryo serialization for CachedColumnarBatch

see apache/spark#50599

* [4.1.0] Exclude GlutenMapStatusEndToEndSuite and configure parallelism

see apache/spark#50230

* [4.1.0] Exclude Spark Structured Streaming tests in Gluten

see
 - apache/spark#52473
 - apache/spark#52870
 - apache/spark#52891

* [4.1.0] Exclude failing SQL tests on Spark 4.1

* Replace SparkException.require with standard require in ColumnarCachedBatchSerializer to work across different spark versions

* [Fix] Replace `RuntimeReplaceable` with its `replacement` to fix UT.

see apache/spark#50287

* Exclude Spark 4.0 and 4.1 paths in clickhouse_be_trigger using `!` prefix

* [Fix] Update GlutenShowNamespacesParserSuite to use GlutenSQLTestsBaseTrait
