[SPARK-29399][core] Remove old ExecutorPlugin interface. #26390
SPARK-29397 added new interfaces for creating driver and executor plugins. These were added in a new, more isolated package that does not pollute the main o.a.s package. The old interface is now redundant. Since it's a DeveloperApi and we're about to have a new major release, let's remove it instead of carrying more baggage forward.
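For context, the replacement API added by SPARK-29397 lets a single plugin class provide both a driver-side and an executor-side component. The sketch below mimics that shape with simplified local stand-in interfaces; the real interfaces live in `org.apache.spark.api.plugin` and receive Spark-specific arguments such as a `PluginContext`, so treat this as an illustration of the design, not the actual API surface:

```java
// Simplified stand-ins for the Spark 3.0 plugin interfaces. The real
// org.apache.spark.api.plugin.* interfaces pass SparkContext/PluginContext
// arguments and have more lifecycle hooks; these stubs keep the sketch
// self-contained.
interface DriverPlugin {
    default void init() {}
    default void shutdown() {}
}

interface ExecutorPlugin {
    default void init() {}
    default void shutdown() {}
}

// One entry point supplies both sides, unlike the removed
// org.apache.spark.ExecutorPlugin, which was executor-only.
interface SparkPlugin {
    DriverPlugin driverPlugin();
    ExecutorPlugin executorPlugin();
}

// A user plugin implements the single entry point and returns the
// component for each side of the application.
public class MyPlugin implements SparkPlugin {
    @Override
    public DriverPlugin driverPlugin() {
        return new DriverPlugin() {};
    }

    @Override
    public ExecutorPlugin executorPlugin() {
        return new ExecutorPlugin() {};
    }
}
```

This is why the new API is a superset of the old one: the old `org.apache.spark.ExecutorPlugin` covered only the executor side.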
Test build #113228 has finished for PR 26390 at commit
Good idea. Indeed, the new Spark plugin implementation is a superset of the old executor plugins.
Test build #113393 has finished for PR 26390 at commit
    DeprecatedConfig("spark.yarn.am.port", "2.0.0", "Not used anymore"),
    DeprecatedConfig("spark.executor.port", "2.0.0", "Not used anymore"),
    DeprecatedConfig("spark.shuffle.service.index.cache.entries", "2.3.0",
      "Not used anymore. Please use spark.shuffle.service.index.cache.size"),
@vanzin, not related to this PR, but do you know why it's called a deprecated config instead of a removed config?
Because the name of the variable hardly matters; what you want is a message that explains what's happening with that config (which is what the user sees). It was initially created for deprecated configs, but it works just as well for removed ones that we want to warn about.
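The mechanism described above can be sketched as a lookup table keyed by config name; the entries, class name, and message wording below are illustrative, not Spark's actual code:

```java
import java.util.Map;

// Illustrative sketch: one table serves both deprecated and removed
// configs, because all that differs is the message shown to the user.
public class DeprecatedConfigs {
    // config key -> { version it was deprecated/removed in, message }
    static final Map<String, String[]> TABLE = Map.of(
        "spark.yarn.am.port", new String[] {"2.0.0", "Not used anymore"},
        "spark.executor.port", new String[] {"2.0.0", "Not used anymore"});

    // Returns a warning to log, or null if the key is not in the table.
    static String warningFor(String key) {
        String[] entry = TABLE.get(key);
        if (entry == null) {
            return null;
        }
        return "The configuration key '" + key + "' has been deprecated as of "
            + entry[0] + ": " + entry[1];
    }
}
```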
Ah, thanks. It might be better to rename it later then :-).
HyukjinKwon left a comment
Can we add a migration guide at docs/core-migration-guide.md? Otherwise looks fine to me.
Ah, I was looking for something like that instead of adding a release note to the bug. I'll also remove the "In Spark 3.0" prefix from the existing entries, which is completely redundant.
Test build #113642 has started for PR 26390 at commit
retest this please
Test build #113649 has finished for PR 26390 at commit
Merged to master.
      .createWithDefault(Nil)

  private[spark] val EXECUTOR_PLUGINS =
    ConfigBuilder("spark.executor.plugins")
This conf is removed, but it is still part of deprecatedConfigs?
This was pointed out above in #26390 (comment). We should probably fix it to make that clear.
## Upgrading from Core 2.4 to 3.0

- In Spark 3.0, the deprecated method `TaskContext.isRunningLocally` has been removed. Local execution was removed and it always returned `false`.
- The `org.apache.spark.ExecutorPlugin` interface and related configuration have been replaced with
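For users, the migration amounts to roughly the following change in `spark-defaults.conf`; the class names are placeholders, and `spark.plugins` is the config introduced alongside the new plugin API in SPARK-29397:

```
# Before (Spark 2.4; config removed in 3.0):
spark.executor.plugins  com.example.MyOldExecutorPlugin

# After (Spark 3.0):
spark.plugins           com.example.MySparkPlugin
```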
Could we explicitly specify the conf name "spark.executor.plugins"?
// [SPARK-28091][CORE] Extend Spark metrics system with user-defined metrics using executor plugins
ProblemFilters.exclude[DirectMissingMethodProblem]("org.apache.spark.ExecutorPlugin.init"),
// [SPARK-29399][core] Remove old ExecutorPlugin interface.
ProblemFilters.exclude[MissingClassProblem]("org.apache.spark.ExecutorPlugin"),
Why is this removal not reported by MiMa? This API was added in Spark 2.4.
cc @HyukjinKwon Do you have an idea?
If you mean org.apache.spark.ExecutorPluginContext, that's not in Spark 2.4. org.apache.spark.ExecutorPlugin seems to be properly excluded here.
I see. Thanks!