feat(yaml): add schema validation for flatten transform #35675
Conversation
Check schema compatibility when flattening PCollections to ensure all inputs have matching schemas. This prevents runtime errors from schema mismatches.
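For context, here is a minimal Beam Python sketch of the situation the check targets. It is illustrative only, not code from this PR: the field names (ride_id, passenger_count) are taken from the test data discussed below, and the exact error the YAML provider raises is not shown.

```python
# Minimal sketch, not the PR's code: two PCollections with the same inferred
# row schema (ride_id: STRING, passenger_count: INT64) flatten cleanly.
# If the inputs' schemas disagreed (e.g. one passenger_count were a float or
# None), the new YAML flatten validation is meant to report that mismatch up
# front instead of letting the pipeline fail at runtime.
import apache_beam as beam

with beam.Pipeline() as p:
  rides_a = p | 'CreateA' >> beam.Create(
      [beam.Row(ride_id='r1', passenger_count=2)])
  rides_b = p | 'CreateB' >> beam.Create(
      [beam.Row(ride_id='r2', passenger_count=3)])

  merged = (rides_a, rides_b) | beam.Flatten()
  merged | beam.Map(print)
```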
damccorm left a comment:
Thanks!
- Split long comment line for better readability in yaml_provider.py
- Rename test method create_has_schema to test_create_has_schema
- Change expected exception type from ValueError to Exception in schema validation test
- Add new test cases for flattening compatible schemas and null values
- Fix circular reference test to actually test successful execution
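For the new "flattening compatible schemas" case mentioned above, a rough shape of such a test might look like the sketch below. This is not the PR's actual test code (the real tests live in the Beam YAML test modules); the class and method names here are assumptions.

```python
# Hypothetical sketch of a "compatible schemas flatten" test; the real tests
# added in this PR live in the Beam YAML test modules and differ in structure.
import unittest

import apache_beam as beam
from apache_beam.testing.test_pipeline import TestPipeline
from apache_beam.testing.util import assert_that, equal_to


class FlattenSchemaTest(unittest.TestCase):
  def test_flatten_compatible_schemas(self):
    with TestPipeline() as p:
      a = p | 'A' >> beam.Create([beam.Row(ride_id='r1', passenger_count=2)])
      b = p | 'B' >> beam.Create([beam.Row(ride_id='r2', passenger_count=3)])
      merged = (a, b) | beam.Flatten()
      assert_that(
          merged | beam.Map(lambda row: row.ride_id),
          equal_to(['r1', 'r2']))


if __name__ == '__main__':
  unittest.main()
```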
pickle_library='cloudpickle')) as p:
  # This should raise an error because null values create different schema types
  # (nullable logical type vs INT64)
  with self.assertRaisesRegex(
@damccorm I do not like this, but Beam treats this case as having different schemas, since passenger_count should be INT64, not None. So with this validation we now report the error.
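To illustrate the mismatch being described (illustrative only; the field names follow the test data in this thread, and how Beam YAML represents the nullable type internally is not shown):

```python
# Illustrative only: why a null value shifts the inferred schema.
import apache_beam as beam

with_value = beam.Row(ride_id='r1', passenger_count=2)    # passenger_count inferred as an integer (INT64)
with_null = beam.Row(ride_id='r2', passenger_count=None)  # passenger_count inferred from None, a different type

# The two rows no longer share a single field type for passenger_count, so a
# flatten over their PCollections is flagged as a schema mismatch by the new
# validation rather than surfacing later at runtime.
print(type(with_value.passenger_count), type(with_null.passenger_count))
```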
Oh interesting - one option would be to define our own deep equality function here (basically a consistency check), or we could try to move forward with #35672
I don't think we can move forward with this change if it regresses these use cases
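As a rough illustration of the "deep equality / consistency check" idea (entirely hypothetical, not code from Beam or this PR), such a helper could compare field names and types while treating a nullable field as compatible with its non-nullable counterpart:

```python
# Hypothetical sketch of a schema consistency check; the field representation
# and the nullable-compatibility rule are assumptions, not Beam's actual API.
from typing import NamedTuple, Tuple


class Field(NamedTuple):
  name: str
  typ: str          # e.g. 'INT64', 'STRING'
  nullable: bool


def schemas_consistent(a: Tuple[Field, ...], b: Tuple[Field, ...]) -> bool:
  """Return True if two flatten inputs could share one schema.

  Fields must match by name and type; a nullable field is treated as
  compatible with a required field of the same type.
  """
  if len(a) != len(b):
    return False
  for fa, fb in zip(a, b):
    if fa.name != fb.name or fa.typ != fb.typ:
      return False
    # Differing nullability alone is tolerated under this (assumed) rule.
  return True


# Example: passenger_count nullable in one input only -> still consistent here.
left = (Field('ride_id', 'STRING', False), Field('passenger_count', 'INT64', False))
right = (Field('ride_id', 'STRING', False), Field('passenger_count', 'INT64', True))
print(schemas_consistent(left, right))  # True
```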
Decided to do the forward fix #35672 later.
Addresses #35666
Thank you for your contribution! Follow this checklist to help us incorporate your contribution quickly and easily:
- Mention the appropriate issue in your description (for example: addresses #123), if applicable. This will automatically add a link to the pull request in the issue. If you would like the issue to automatically close on merging the pull request, comment fixes #<ISSUE NUMBER> instead.
- Update CHANGES.md with noteworthy changes.

See the Contributor Guide for more tips on how to make the review process smoother.
To check the build health, please visit https://github.com/apache/beam/blob/master/.test-infra/BUILD_STATUS.md
GitHub Actions Tests Status (on master branch)
See CI.md for more information about GitHub Actions CI or the workflows README to see a list of phrases to trigger workflows.