Conversation

@vanzin
Contributor

@vanzin vanzin commented Nov 5, 2019

SPARK-29397 added new interfaces for creating driver and executor
plugins. These were added in a new, more isolated package that does
not pollute the main o.a.s package.

The old interface is now redundant. Since it's a DeveloperApi and
we're about to have a new major release, let's remove it instead of
carrying more baggage forward.

@SparkQA

SparkQA commented Nov 5, 2019

Test build #113228 has finished for PR 26390 at commit a66af7f.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@LucaCanali
Contributor

Good idea. Indeed, the new Spark plugin implementation is a superset of the old executor plugin interface.

@SparkQA

SparkQA commented Nov 7, 2019

Test build #113393 has finished for PR 26390 at commit fc1b05e.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

DeprecatedConfig("spark.yarn.am.port", "2.0.0", "Not used anymore"),
DeprecatedConfig("spark.executor.port", "2.0.0", "Not used anymore"),
DeprecatedConfig("spark.shuffle.service.index.cache.entries", "2.3.0",
"Not used anymore. Please use spark.shuffle.service.index.cache.size"),
Member

@vanzin, not related to this PR but do you know why it's called deprecated instead of removed config?

Contributor Author

Because the name of the variable hardly matters, what you want is a message that explains what's happening with that config (which is what the user sees). It was initially created for deprecated configs but works just as well for removed ones that we want to warn about.

Member

Ah, thanks. It might be better to rename it later then :-).
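The mechanism discussed in this thread can be sketched in plain Scala. `DeprecatedConfig` and `ConfigWarnings` below are simplified stand-ins for Spark's private helpers, not the real classes; the point is that the message string, not the type name, is what tells the user whether a key is merely deprecated or removed outright:

```scala
// Simplified stand-in for Spark's private DeprecatedConfig record; the
// message, not the variable name, is what the user actually sees.
case class DeprecatedConfig(key: String, version: String, message: String)

object ConfigWarnings {
  private val entries: Map[String, DeprecatedConfig] = Seq(
    DeprecatedConfig("spark.yarn.am.port", "2.0.0", "Not used anymore"),
    DeprecatedConfig("spark.shuffle.service.index.cache.entries", "2.3.0",
      "Not used anymore. Please use spark.shuffle.service.index.cache.size")
  ).map(c => c.key -> c).toMap

  // Works equally well for deprecated and removed keys: only the
  // message text needs to say which case applies.
  def warningFor(key: String): Option[String] =
    entries.get(key).map { c =>
      s"The configuration key '${c.key}' has been deprecated as of " +
        s"Spark ${c.version}. ${c.message}"
    }
}
```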

@HyukjinKwon (Member) left a comment

Can we add a migration guide at docs/core-migration-guide.md? Otherwise looks fine to me.

@vanzin
Contributor Author

vanzin commented Nov 12, 2019

docs/core-migration-guide.md

Ah, I was looking for something like that instead of adding a release note to the bug. I'll also remove the "In Spark 3.0" prefix from the existing entries which is completely redundant.

@SparkQA

SparkQA commented Nov 12, 2019

Test build #113642 has started for PR 26390 at commit ad13257.

@vanzin
Contributor Author

vanzin commented Nov 12, 2019

retest this please

@SparkQA

SparkQA commented Nov 12, 2019

Test build #113649 has finished for PR 26390 at commit ad13257.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@HyukjinKwon
Member

Merged to master.

@vanzin vanzin deleted the SPARK-29399 branch November 22, 2019 23:00
.createWithDefault(Nil)

private[spark] val EXECUTOR_PLUGINS =
ConfigBuilder("spark.executor.plugins")
@gatorsmile (Member) commented Feb 24, 2020
Member

This was pointed out above #26390 (comment). Probably we should fix it to make it clear.
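The snippet under review uses Spark's internal `ConfigBuilder` DSL. A minimal stand-in (not the real builder from `org.apache.spark.internal.config`) illustrates the shape of such an entry and how a clarifying doc string could be attached:

```scala
// Minimal stand-in for Spark's internal ConfigBuilder DSL, for
// illustration only; the real builder lives in
// org.apache.spark.internal.config and has many more options.
case class ConfigEntry[T](key: String, default: T, doc: String)

class ConfigBuilder(key: String) {
  private var docText: String = ""

  // Attaches human-readable documentation to the entry.
  def doc(text: String): ConfigBuilder = { docText = text; this }

  // Finalizes the entry with a default value, like createWithDefault(Nil).
  def createWithDefault[T](default: T): ConfigEntry[T] =
    ConfigEntry(key, default, docText)
}

val EXECUTOR_PLUGINS =
  new ConfigBuilder("spark.executor.plugins")
    .doc("Deprecated: use the spark.plugins entry point instead.")
    .createWithDefault(Nil)
```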

## Upgrading from Core 2.4 to 3.0

- In Spark 3.0, the deprecated method `TaskContext.isRunningLocally` has been removed. Local execution support was removed and the method always returned `false`.
- The `org.apache.spark.ExecutorPlugin` interface and related configuration has been replaced with
Member

Could we explicitly specify the conf name "spark.executor.plugins"?
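For anyone following this migration entry: the replacement is `org.apache.spark.api.plugin.SparkPlugin`, enabled through `spark.plugins`, which exposes both a driver and an executor side. The traits below are simplified stand-ins with the same shape, not the real Spark interfaces, so the sketch stays self-contained:

```scala
// Simplified stand-ins mirroring the shape of the new plugin API in
// org.apache.spark.api.plugin -- NOT the real Spark interfaces.
trait DriverPlugin { def init(): Unit = () }
trait ExecutorPlugin { def init(): Unit = (); def shutdown(): Unit = () }
trait SparkPlugin {
  // Either side may be null if the plugin has nothing to run there.
  def driverPlugin(): DriverPlugin
  def executorPlugin(): ExecutorPlugin
}

// An old-style executor-only plugin moves its logic into executorPlugin()
// and is enabled via "spark.plugins" instead of "spark.executor.plugins".
class MyMetricsPlugin extends SparkPlugin {
  override def driverPlugin(): DriverPlugin = null
  override def executorPlugin(): ExecutorPlugin = new ExecutorPlugin {
    override def init(): Unit = println("executor side initialized")
  }
}
```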

// [SPARK-28091][CORE] Extend Spark metrics system with user-defined metrics using executor plugins
ProblemFilters.exclude[DirectMissingMethodProblem]("org.apache.spark.ExecutorPlugin.init"),
// [SPARK-29399][core] Remove old ExecutorPlugin interface.
ProblemFilters.exclude[MissingClassProblem]("org.apache.spark.ExecutorPlugin"),
Member

Why this removal is not reported in MIMA? This API was added in Spark 2.4

Member

cc @HyukjinKwon Do you have an idea?

Member

If you mean org.apache.spark.ExecutorPluginContext, that's not in Spark 2.4. Seems org.apache.spark.ExecutorPlugin is properly excluded here.

Member

I see. Thanks!



6 participants