
[SPARK-53968][SQL] Store decimal precision loss conf in arithmetic expressions #52681

Closed
stefankandic wants to merge 4 commits into apache:master from stefankandic:fixViewDec

Conversation


@stefankandic (Contributor) commented Oct 21, 2025

What changes were proposed in this pull request?

Currently, arithmetic expressions such as Add and Multiply use the configuration spark.sql.decimalOperations.allowPrecisionLoss to determine their output type when working with decimal values. This approach is problematic because if the expression is transformed or copied, its return type could change depending on the active configuration value.

This issue can occur during view resolution: one value of the config may be in effect when the view is analyzed and a different one during query optimization. If a referenced expression changes its type and that reference is reused elsewhere in the plan, it triggers a plan validation error.
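Here's a minimal illustration of the two result types involved (per the decimal arithmetic rules from SPARK-22036; this assumes a plain spark-shell session where `spark` is the active SparkSession):

```scala
// DECIMAL(38, 18) * DECIMAL(38, 18) mathematically needs DECIMAL(77, 36),
// which exceeds the maximum precision of 38, so Spark must compromise.
spark.conf.set("spark.sql.decimalOperations.allowPrecisionLoss", "true")
spark.sql("SELECT CAST(1 AS DECIMAL(38,18)) * CAST(1 AS DECIMAL(38,18))").printSchema()
// result type DECIMAL(38, 6): scale is sacrificed to keep the integral digits

spark.conf.set("spark.sql.decimalOperations.allowPrecisionLoss", "false")
spark.sql("SELECT CAST(1 AS DECIMAL(38,18)) * CAST(1 AS DECIMAL(38,18))").printSchema()
// result type DECIMAL(38, 36): scale is kept, so large products overflow at runtime
```

The same `a * b` expression therefore resolves to two different types depending on which config value happens to be active.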

Why are the changes needed?

To address this, we should follow a similar approach to what was done for ANSI mode: store the relevant context directly within the expression as part of its state. This ensures the expression remains stable and unaffected by configuration changes when it’s copied or transformed. To make this transition smooth, I’ve generalized the existing EvalMode used for ANSI so that it can be extended to multiple configuration dimensions.
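Here's a self-contained sketch of the idea using toy stand-ins, not Spark's actual Catalyst classes (`NumericEvalContext` matches the name visible in the follow-up stack trace below, but its fields here are my assumption):

```scala
// Toy sketch: type-affecting configs are read once, when the expression is
// constructed, and stored as part of the expression's own state.
object EvalMode extends Enumeration { val LEGACY, ANSI, TRY = Value }

// Generalizes the single ANSI flag into a context with several dimensions.
case class NumericEvalContext(
    evalMode: EvalMode.Value,
    allowDecimalPrecisionLoss: Boolean)

case class DecimalCol(precision: Int, scale: Int)

// Because the context travels with the expression, copying or transforming
// it can no longer pick up a different live config value.
case class Multiply(left: DecimalCol, right: DecimalCol, ctx: NumericEvalContext) {
  def dataType: String = {
    val precision = left.precision + right.precision + 1
    val scale = left.scale + right.scale
    if (precision <= 38) {
      s"DECIMAL($precision, $scale)"
    } else if (ctx.allowDecimalPrecisionLoss) {
      // Simplified version of Spark's adjustPrecisionScale: keep the integral
      // digits, but never shrink the scale below min(scale, 6).
      s"DECIMAL(38, ${math.max(38 - (precision - scale), math.min(scale, 6))})"
    } else {
      s"DECIMAL(38, ${math.min(scale, 38)})"
    }
  }
}
```

The invariant this is after: the output type is a pure function of the expression's own state, never of the live session conf.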

Does this PR introduce any user-facing change?

No.

How was this patch tested?

Added a new unit test in SQLViewSuite that failed with a plan validation error before this change.
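A sketch of the shape of such a test (table, view, and test names are illustrative; the actual test in SQLViewSuite may differ):

```scala
// Illustrative sketch, not the PR's actual test: create a view under one
// value of the conf, then resolve it under the other.
test("view with decimal arithmetic survives allowPrecisionLoss changes") {
  withTable("t") {
    sql("CREATE TABLE t (a DECIMAL(38, 18), b DECIMAL(38, 18)) USING parquet")
    withView("v") {
      withSQLConf("spark.sql.decimalOperations.allowPrecisionLoss" -> "true") {
        sql("CREATE VIEW v AS SELECT a * b AS ab FROM t")
      }
      // Before this fix, re-resolving the view body under the flipped conf
      // changed the arithmetic expression's return type mid-plan, and the
      // stale reference to it failed plan validation.
      withSQLConf("spark.sql.decimalOperations.allowPrecisionLoss" -> "false") {
        sql("SELECT ab FROM v").collect()
      }
    }
  }
}
```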

Was this patch authored or co-authored using generative AI tooling?

No.

@github-actions bot added the SQL label Oct 21, 2025
@stefankandic marked this pull request as ready for review October 22, 2025 08:48
@cloud-fan (Contributor) commented:

thanks, merging to master!

@cloud-fan closed this in a96e9ca Oct 22, 2025
huangxiaopingRD pushed a commit to huangxiaopingRD/spark that referenced this pull request Nov 25, 2025
[SPARK-53968][SQL] Store decimal precision loss conf in arithmetic expressions

Closes apache#52681 from stefankandic/fixViewDec.

Authored-by: Stefan Kandic <stefan.kandic@databricks.com>
Signed-off-by: Wenchen Fan <wenchen@databricks.com>
dongjoon-hyun pushed a commit that referenced this pull request Feb 27, 2026
### What changes were proposed in this pull request?

Fix V2FunctionBenchmark, which was broken by SPARK-53968 (#52681):

```
$ build/sbt "sql/Test/runMain org.apache.spark.sql.connector.functions.V2FunctionBenchmark"
...
[info] running (fork) org.apache.spark.sql.connector.functions.V2FunctionBenchmark
[error] WARNING: Using incubator modules: jdk.incubator.vector
[info] 13:43:41.084 WARN org.apache.spark.util.Utils: Your hostname, H27212-MAC-01.local, resolves to a loopback address: 127.0.0.1; using 10.242.159.140 instead (on interface en0)
[info] 13:43:41.087 WARN org.apache.spark.util.Utils: Set SPARK_LOCAL_IP if you need to bind to another address
[info] 13:43:41.260 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
[info] Running benchmark: scalar function (long + long) -> long, result_nullable = true codegen = true
[info]   Running case: native_long_add
[error] Exception in thread "main" java.lang.NullPointerException: Cannot invoke "org.apache.spark.sql.catalyst.expressions.NumericEvalContext.evalMode()" because the return value of "org.apache.spark.sql.catalyst.expressions.BinaryArithmetic.evalContext()" is null
[error] 	at org.apache.spark.sql.catalyst.expressions.BinaryArithmetic.evalMode(arithmetic.scala:201)
[error] 	at org.apache.spark.sql.catalyst.expressions.BinaryArithmetic.failOnError(arithmetic.scala:209)
[error] 	at org.apache.spark.sql.catalyst.expressions.BinaryArithmetic.initQueryContext(arithmetic.scala:251)
[error] 	at org.apache.spark.sql.catalyst.expressions.SupportQueryContext.$init$(Expression.scala:670)
[error] 	at org.apache.spark.sql.catalyst.expressions.BinaryArithmetic.<init>(arithmetic.scala:193)
[error] 	at org.apache.spark.sql.connector.functions.V2FunctionBenchmark$NativeAdd.<init>(V2FunctionBenchmark.scala:111)
[error] 	at org.apache.spark.sql.connector.functions.V2FunctionBenchmark$.$anonfun$scalarFunctionBenchmark$3(V2FunctionBenchmark.scala:88)
...
```
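The failure mode here looks like the classic Scala trait-initialization-order pitfall: `SupportQueryContext`'s initializer runs before the subclass has assigned `evalContext`, so the trait's init code dereferences null. A minimal standalone reproduction of that pattern (toy names, independent of Spark):

```scala
// Toy reproduction of the initialization-order NPE.
trait SupportContext {
  initContext() // trait body runs before the subclass constructor body
  def initContext(): Unit
}

class Context(val mode: String)

class Arith(c: Context) extends SupportContext {
  // Still null while SupportContext's initializer runs above.
  val ctx: Context = c
  override def initContext(): Unit =
    println(ctx.mode) // NullPointerException: ctx is not yet assigned
}

object Demo extends App { new Arith(new Context("ANSI")) }
```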

### Why are the changes needed?

Fix the benchmark.

### Does this PR introduce _any_ user-facing change?

No.

### How was this patch tested?

Verified locally; the GHA benchmark reports for JDK 17 and 21 are updated.

### Was this patch authored or co-authored using generative AI tooling?

No.

Closes #54540 from pan3793/SPARK-55744.

Lead-authored-by: Cheng Pan <chengpan@apache.org>
Co-authored-by: pan3793 <pan3793@users.noreply.github.com>
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>