Add streaming tests to Storage Write API Python wrapper #26491
Conversation
R: @chamikaramj
Stopping reviewer notifications for this pull request: review requested by someone other than the bot, ceding control
Codecov Report
@@ Coverage Diff @@
## master #26491 +/- ##
==========================================
- Coverage 81.11% 71.92% -9.20%
==========================================
Files 469 752 +283
Lines 67438 101698 +34260
==========================================
+ Hits 54705 73142 +18437
- Misses 12733 27065 +14332
- Partials 0 1491 +1491
Flags with carried forward coverage won't be shown.
see 290 files with indirect coverage changes
Some tips to make tests run faster
Adding the flag is not yet effective, as the sdk_location is set here for this test:
instead of
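For illustration only (the option values and file names below are assumptions, not the test's actual wiring): if a suite pins sdk_location in its own pipeline options, a flag set at a higher level never takes effect.

```groovy
// Sketch: a suite that hard-codes the SDK location in its pipeline options.
def pipelineOptions = [
  '--runner=TestDataflowRunner',
  "--sdk_location=${project.buildDir}/apache-beam.tar.gz",  // pinned here for this test
]
// Any wheel/location flag passed from outside is ignored unless the line above
// is changed to point at the intended artifact (e.g. the built wheel).
```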
…abu98/beam into storage_write_python_tests
  }
}
if (config.pythonPipelineOptions.contains("--runner=TestDataflowRunner")) {
  pythonTask.configure { dependsOn ':sdks:python:test-suites:dataflow:initializeForDataflowJob' }
This actually won't work. initializeForDataflowJob exists in per-version projects like :sdks:python:test-suites:dataflow:py310, not in :sdks:python:test-suites:dataflow.
We can add a setupTasks parameter to createCrossLanguageUsingJavaExpansionTask, and then declare here
pythonTask.configure { dependsOn setupTask }
and call
createCrossLanguageUsingJavaExpansionTask(setupTask: 'initializeForDataflowJob'
...
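A minimal sketch of that suggestion, assuming createCrossLanguageUsingJavaExpansionTask takes a config map and generates the Python test task itself (names, task registration, and wiring here are illustrative, not the actual BeamModulePlugin code):

```groovy
// Sketch only: accept an optional setupTask in the config map and wire it
// into the generated Python task.
def createCrossLanguageUsingJavaExpansionTask = { Map config ->
  def pythonTask = tasks.register("crossLanguagePythonTest") {
    // ... run the Python test suite with config.pythonPipelineOptions ...
  }
  if (config.setupTask) {
    // Wait for the per-Python-version setup task, e.g.
    // :sdks:python:test-suites:dataflow:py310:initializeForDataflowJob
    pythonTask.configure { dependsOn config.setupTask }
  }
  return pythonTask
}

// Caller (e.g. the cross-language test suite's build.gradle):
createCrossLanguageUsingJavaExpansionTask(
  setupTask: 'initializeForDataflowJob',
  pythonPipelineOptions: ['--runner=TestDataflowRunner'],
)
```

Keeping setupTask optional would leave suites that don't run on Dataflow unaffected.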
chamikaramj
left a comment
Thanks.
LGTM.
Run Python_Xlang_Gcp_Dataflow PostCommit
Run Python_Xlang_Gcp_Direct PostCommit |
Abacn
left a comment
LGTM
Possible followup: move CROSS_LANGUAGE_VALIDATES_RUNNER_PYTHON_VERSIONS and the logic of picking two Python versions out of the Jenkins script and into a Gradle property of the test-suites, e.g.
dataflow_validates_runner_batch_tests_V2=3.7,3.11
Then declare a Gradle task in dataflow/build.gradle:
task validatesRunnerBatchTestsV2 {
like what we did for the validates runner tests:
The running time of the Dataflow test suite would then roughly halve, because the py37 and py311 tests would run concurrently. The same can be done for the direct runner tests.
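A rough sketch of what that could look like, assuming a gradle.properties entry like the one above and per-version projects that expose a batch ValidatesRunner task (the property, task, and project names here are assumptions, not the actual Beam build code):

```groovy
// dataflow/build.gradle (sketch only; names are assumptions)
task validatesRunnerBatchTestsV2 {
  // Read the Python versions from a Gradle property instead of the Jenkins script,
  // e.g. dataflow_validates_runner_batch_tests_V2=3.7,3.11
  def versions = project.findProperty('dataflow_validates_runner_batch_tests_V2') ?: '3.7,3.11'
  versions.split(',').each { ver ->
    def suffix = ver.trim().replace('.', '')   // "3.7" -> "37"
    // Depending on both per-version suites lets Gradle run them concurrently.
    dependsOn ":sdks:python:test-suites:dataflow:py${suffix}:validatesRunnerBatchTests"
  }
}
```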
* add autosharding and use at-least-once tests
* at-least-once, streaming, and autosharding tests
* spotless
* assign argument to parameter name
* fix
* increase timeout for dataflow tests
* use wheel distribution
* use wheels sdk location
* add dependency on initializeForDataflowJob for tests using TestDataflowRunner
* tab
* correct method call
* correct path for initializeForDataflowJob
* use python version set by applyPythonNature
* wait for initializeForDataflowJob earlier
* fetch sdkLocation in doLast
* streaming and at-least-once test
* spotless
* sanity check
* revert sanity check
* remove ref to sdkLocation from common
* fix path to initializeForDataflowJob
No description provided.