Spark: Implement the architecture to read default values #4547
This PR is a follow-up to #4525. It implements the default value reading semantics specified in spec #4301 for reading Avro- and ORC-backed files/tables in Spark.

The core change is in BaseDataReader, which now converts default values into the constant value objects that Spark expects.

For ORC, it extends OrcSchemaWithTypeVisitor with a default-value-aware visitor that builds the reader tree from the idToConstant map, cooperating with changes in ORCSchemaUtil.buildOrcProjection. The ORC vectorized reader path is also implemented.

For Avro, the same technique is applied by extending AvroSchemaWithTypeVisitor and modifying AvroSchemaUtil.buildAvroProjection.

@rdblue Please take a look, thanks!
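To illustrate the general idea, here is a minimal, self-contained sketch of how a reader tree can substitute constants for fields found in an idToConstant map. The `Reader`, `ConstantReader`, and `FileColumnReader` names below are hypothetical simplifications for this example, not the actual Iceberg classes:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch: a reader tree that fills missing columns from an
// idToConstant map instead of reading them from the data file.
public class ConstantReaderSketch {
  interface Reader { Object read(); }

  // Returns a fixed value, e.g. a column default for a field that is
  // absent from the data file's schema.
  static final class ConstantReader implements Reader {
    private final Object constant;
    ConstantReader(Object constant) { this.constant = constant; }
    public Object read() { return constant; }
  }

  // Stand-in for a reader backed by actual file data (stubbed here).
  static final class FileColumnReader implements Reader {
    private final Object value;
    FileColumnReader(Object value) { this.value = value; }
    public Object read() { return value; }
  }

  // Choose a reader per projected field ID: constants win over file reads.
  static Reader readerFor(int fieldId, Map<Integer, Object> idToConstant, Object fileValue) {
    if (idToConstant.containsKey(fieldId)) {
      return new ConstantReader(idToConstant.get(fieldId));
    }
    return new FileColumnReader(fileValue);
  }

  public static void main(String[] args) {
    Map<Integer, Object> idToConstant = new HashMap<>();
    idToConstant.put(2, "default-value"); // field 2 has a default, not present in the file

    Reader r1 = readerFor(1, idToConstant, 42);   // field 1 is read from the file
    Reader r2 = readerFor(2, idToConstant, null); // field 2 is filled from the constant

    System.out.println(r1.read()); // 42
    System.out.println(r2.read()); // default-value
  }
}
```

In the PR, the visitor performs this choice while walking the projected schema, so the decision is made once per column when the reader tree is built, not per row.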