Comet fails on limit operator with negative limit parameter #287

@viirya

Description

Describe the bug

Another Spark SQL test failure encountered while working on #250.

DataFrameSuite:

- offset *** FAILED *** (36 milliseconds)
[info]   org.apache.spark.SparkException: [INTERNAL_ERROR] The Spark SQL phase planning failed with an internal error. You hit a bug in Spark or the Spark plugins you use. Please, report this bug to the corresponding communities or vendors, and provide the full stack trace.
[info]   at org.apache.spark.SparkException$.internalError(SparkException.scala:88)
...
  Cause: java.lang.AssertionError: assertion failed: limit should be greater or equal to zero
[info]   at scala.Predef$.assert(Predef.scala:223)
[info]   at org.apache.comet.serde.QueryPlanSerde$.operator2Proto(QueryPlanSerde.scala:1847)
[info]   at org.apache.comet.CometSparkSessionExtensions$CometExecRule.org$apache$comet$CometSparkSessionExtensions$CometExecRule$$transform1$1(CometSparkSessionExtensions.scala:232)
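The assertion fires because Spark can produce a limit of -1 (its sentinel for "no limit") when a query specifies only an OFFSET, and `QueryPlanSerde.operator2Proto` asserts `limit >= 0`. A minimal sketch of one possible guard, assuming a hypothetical helper (these names are not Comet's actual code): return `None` for negative limits so the rule falls back to Spark's own operator instead of crashing the planning phase.

```scala
// Hypothetical sketch only, not Comet's real implementation.
object LimitSerde {
  // Stand-in for the serialized protobuf limit operator.
  final case class LimitProto(limit: Int)

  // Spark uses limit = -1 to mean "no limit" (e.g. OFFSET without LIMIT).
  // Returning None lets the caller keep the unmodified Spark plan rather
  // than failing an assertion during planning.
  def serializeLimit(limit: Int): Option[LimitProto] =
    if (limit < 0) None else Some(LimitProto(limit))
}
```

Under this sketch, `LimitSerde.serializeLimit(-1)` yields `None`, so the `offset` test's plan would simply not be converted to a Comet operator.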

Steps to reproduce

No response

Expected behavior

No response

Additional context

No response

Labels

bug (Something isn't working)
