Update Arrow to 0.15.1 and fix Broadcast and GroupedMapUdf Tests for Spark-3.0.0 #653
Merged
25 commits:
893433f initial commit (suhsteve)
2bddfdb Merge branch 'master' into spark3_r (suhsteve)
d1d8b8c fix tests (suhsteve)
5789846 trigger test. (suhsteve)
1ac30ef fix tests. (suhsteve)
1ec33f3 mitigate xUnit testing deadlock. (suhsteve)
37c3760 pass spark version to executor. (suhsteve)
39b31e1 Merge branch 'master' into spark3_r (suhsteve)
0253e8c remove comment (suhsteve)
5c1ce47 remove whitespace (suhsteve)
72150b3 Merge branch 'master' into spark3_r (suhsteve)
f39d3ee
dcf8196
e11b17c clean up usings. (suhsteve)
e92a7e6 skip tests in backward compat pipeline. (suhsteve)
fd79786 PR comments. (suhsteve)
d421cc0 double quotes. (suhsteve)
f4e9863 add comment. (suhsteve)
ee7782d fix flag. (suhsteve)
905fd0f write end of stream per arrow spec. (suhsteve)
7f215f5 Merge branch 'master' into spark3_r (suhsteve)
2b37a45 PR comments. (suhsteve)
247a1ab Remove backward compatibility for 3.0 (suhsteve)
f55b79f readd 3.0 backward compat tests. (suhsteve)
ce37879 remove whitespace/tabs. (suhsteve)
Conversation
Is it easy to track whether we have different backward compatibility for different Spark versions?
At the moment there is no published Worker that is backward compatible with 3.0 (the previous Workers only use Arrow 0.14.1 and aren't aware of the new spec). But I agree that this is a breaking change.
For backward compatibility, do we want to differentiate between Spark versions and test each against a different Worker version? Or have one Worker version that we declare backward compatible for all Spark versions?
This can be addressed in a separate PR if needed.
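For context on the "new spec" mentioned above: Arrow 0.15 changed the IPC streaming format so that each message is prefixed by a 4-byte 0xFFFFFFFF continuation marker plus a 4-byte metadata length, and end-of-stream is signalled by a continuation marker with a zero length (the commit "write end of stream per arrow spec" adds this marker). Older Workers built against Arrow 0.14.1 don't expect that prefix. A minimal Python sketch of the 8-byte end-of-stream marker, for illustration only (the actual change lives in the C# Worker):

```python
import struct

# Per the Arrow IPC streaming format (>= 0.15), every message is prefixed by a
# 4-byte continuation marker 0xFFFFFFFF followed by a little-endian 4-byte
# metadata length. End-of-stream is a continuation marker with length zero.
CONTINUATION_MARKER = 0xFFFFFFFF

def end_of_stream_marker() -> bytes:
    """8-byte end-of-stream marker written after the final record batch."""
    return struct.pack("<II", CONTINUATION_MARKER, 0)

# Bytes on the wire: ff ff ff ff 00 00 00 00
```

A pre-0.15 reader that interprets the first 4 bytes as a plain metadata length sees 0xFFFFFFFF and fails, which is why the old Workers cannot consume streams written under the new spec.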
I would say the latter.
Then I think we will have to wait until the next official Worker release before we can remove these extra filters.
... since we have one Worker binary to support all Spark versions.
Well, you can remove the backward compatibility test (breaking change), then add it back once the new Worker is released.
Do you want me to remove the extra filters for 3.0 and add the skip attribute in the unit tests? Or just remove the Spark 3.0.0 section in the backward compatibility tests?
Removed 3.0 backward compatibility testing.