[SPARK-14797] [BUILD] Spark SQL POM should not hardcode spark-sketch_2.11 dep. #12563
Spark SQL's POM hardcodes a dependency on `spark-sketch_2.11`, which causes Scala 2.10 builds to include the `_2.11` dependency. This is harmless since `spark-sketch` is a pure-Java module (see #12334 for a discussion of dropping the Scala version suffixes from these modules' artifactIds), but it's confusing to people looking at the published POMs.

This patch fixes this by using `${scala.binary.version}` to substitute the correct suffix, and also adds a set of Maven Enforcer rules to ensure that `_2.11` artifacts are not used in 2.10 builds (and vice-versa).

/cc @ahirreddy, who spotted this issue.
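For illustration, here is a minimal sketch of the two pieces described above: a dependency declared with `${scala.binary.version}` instead of a hardcoded `_2.11` suffix, and an Enforcer `bannedDependencies` rule that fails a Scala 2.10 build if any `_2.11` artifact leaks in. The profile id, execution id, and rule placement are assumptions for the example and may not match the patch exactly.

```xml
<!-- Sketch: declare the dependency with the substituted Scala suffix
     rather than a hardcoded _2.11. -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-sketch_${scala.binary.version}</artifactId>
  <version>${project.version}</version>
</dependency>

<!-- Sketch: a Maven Enforcer rule of the kind described above, shown inside a
     hypothetical scala-2.10 profile; a mirror-image rule would ban _2.10
     artifacts in the 2.11 profile. -->
<profile>
  <id>scala-2.10</id>
  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-enforcer-plugin</artifactId>
        <executions>
          <execution>
            <id>enforce-no-2.11-artifacts</id>
            <goals>
              <goal>enforce</goal>
            </goals>
            <configuration>
              <rules>
                <bannedDependencies>
                  <excludes>
                    <!-- Fail the build if any _2.11 artifact appears in a 2.10 build. -->
                    <exclude>*:*_2.11</exclude>
                  </excludes>
                </bannedDependencies>
              </rules>
            </configuration>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</profile>
```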