Spark: Add support for write-audit-publish workflows. #342
Merged

rdblue merged 1 commit into apache:master on Aug 1, 2019
Conversation
2bcb812 to 63d2a97
Parth-Brahmbhatt approved these changes on Aug 1, 2019
danielcweeks pushed a commit that referenced this pull request on Aug 1, 2019
* Add argument validation to HadoopTables#create (#298)
* Install source JAR when running install target (#310)
* Add projectStrict for Dates and Timestamps (#283)
* Correctly publish artifacts on JitPack (#321) — The Gradle install target produces invalid POM files that are missing the dependencyManagement section and versions for some dependencies. Instead, we directly tell JitPack to run the correct Gradle target.
* Add build info to README.md (#304)
* Convert Iceberg time type to Hive string type (#325)
* Add overwrite option to write builders (#318)
* Fix out of order Pig partition fields (#326)
* Add mapping to Iceberg for external name-based schemas (#338)
* Site: Fix broken link to Iceberg API (#333)
* Add forTable method for Avro WriteBuilder (#322)
* Remove multiple literal strings check rule for scala (#335)
* Fix invalid javadoc url in README.md (#336)
* Use UnicodeUtil.truncateString for Truncate transform (#340) — This truncates by unicode codepoint instead of Java chars.
* Refactor metrics tests for reuse (#331)
* Spark: Add support for write-audit-publish workflows (#342)
* Avoid write failures if metrics mode is invalid (#301)
* Fix truncateStringMax in UnicodeUtil (#334) — Fixes #328, fixes #329. Index to codePointAt should be the offset calculated by code points.
* [Vectorization] Added batch sizing, switched to BufferAllocator, other minor style fixes.
This adds support for write-audit-publish (WAP) workflows. When a WAP write is committed, the new table snapshot is added to table metadata but not made the current table snapshot. Instead, an external audit process runs to validate the new data and publishes the new snapshot (updates the table's current snapshot) only when audits pass.
This adds a table property, write.wap.enabled, that must be true for a commit to be staged. Similarly, a Spark configuration property, spark.wap.id, must be set in a job for that job's changes to be staged. The job's WAP ID is added to the commit summary so that audit processes can find the changes to audit.
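As a sketch of how a job opts into staged commits under this change (the property names come from this PR's description; the example WAP ID value is hypothetical and would normally be a unique identifier chosen by the writing job):

```properties
# Table property: when true, commits from WAP-enabled jobs are staged in
# table metadata instead of becoming the current snapshot.
write.wap.enabled=true

# Spark configuration property, set per job: identifies this job's staged
# changes. The ID is recorded in the commit summary so an audit process
# can locate the snapshot to validate before publishing it.
spark.wap.id=nightly-load-2019-08-01
```

Both properties must be set for staging to occur: the table property alone does not stage commits from jobs that have no WAP ID, and a WAP ID has no effect on tables where write.wap.enabled is not true.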