Conversation

@AnandInguva
Contributor

Please add a meaningful description for your change here


Thank you for your contribution! Follow this checklist to help us incorporate your contribution quickly and easily:

  • Choose reviewer(s) and mention them in a comment (R: @username).
  • Format the pull request title like [BEAM-XXX] Fixes bug in ApproximateQuantiles, where you replace BEAM-XXX with the appropriate JIRA issue, if applicable. This will automatically link the pull request to the issue.
  • Update CHANGES.md with noteworthy changes.
  • If this contribution is large, please file an Apache Individual Contributor License Agreement.

See the Contributor Guide for more tips on how to make the review process smoother.

To check the build health, please visit https://github.com/apache/beam/blob/master/.test-infra/BUILD_STATUS.md

GitHub Actions Tests Status (on master branch)

  • Build python source distribution and wheels
  • Python tests
  • Java tests

See CI.md for more information about GitHub Actions CI.

@asf-ci

asf-ci commented Apr 29, 2022

Can one of the admins verify this patch?


@AnandInguva
Contributor Author

Run PythonDocs PreCommit

@AnandInguva changed the title from "Init inference" to "add __Init__ to inference" on Apr 30, 2022
@AnandInguva marked this pull request as ready for review on April 30, 2022 00:23
@AnandInguva
Contributor Author

R: @tvalentyn

@codecov

codecov bot commented Apr 30, 2022

Codecov Report

Merging #17514 (c7c1520) into master (713389c) will increase coverage by 0.00%.
The diff coverage is 37.50%.

@@           Coverage Diff            @@
##           master   #17514    +/-   ##
========================================
  Coverage   73.85%   73.85%            
========================================
  Files         690      691     +1     
  Lines       90843    91053   +210     
========================================
+ Hits        67093    67251   +158     
- Misses      22537    22589    +52     
  Partials     1213     1213            
Flag Coverage Δ
python 83.62% <37.50%> (-0.03%) ⬇️

Flags with carried forward coverage won't be shown.

Impacted Files Coverage Δ
sdks/python/apache_beam/ml/inference/base.py 92.53% <0.00%> (ø)
sdks/python/apache_beam/ml/inference/pytorch.py 0.00% <0.00%> (ø)
sdks/python/apache_beam/ml/inference/api.py 90.00% <100.00%> (ø)
...thon/apache_beam/ml/inference/sklearn_inference.py 92.68% <100.00%> (ø)
sdks/python/apache_beam/io/gcp/gcsfilesystem.py 88.23% <0.00%> (-1.77%) ⬇️
sdks/python/apache_beam/dataframe/frame_base.py 89.54% <0.00%> (-0.83%) ⬇️
sdks/python/apache_beam/io/filesystem.py 88.76% <0.00%> (-0.65%) ⬇️
sdks/python/apache_beam/io/localfilesystem.py 90.97% <0.00%> (-0.50%) ⬇️
sdks/python/apache_beam/io/gcp/gcsio.py 92.26% <0.00%> (-0.36%) ⬇️
sdks/python/apache_beam/io/gcp/bigquery.py 63.60% <0.00%> (-0.17%) ⬇️
... and 14 more

Continue to review full report at Codecov.

Legend
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 713389c...c7c1520. Read the comment docs.

:param model_class: class of the Pytorch model that defines the model
structure.
:param device: the device on which you wish to run the model. If
``device = GPU`` then device will be cuda if it is avaiable. Otherwise,
Contributor

Suggested change:
- ``device = GPU`` then device will be cuda if it is avaiable. Otherwise,
+ ``device = GPU`` then a GPU device will be used if it is available. Otherwise,
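
For context, the fallback behavior this docstring describes usually comes down to a small device check. A minimal sketch, assuming a hypothetical helper name rather than the exact Beam implementation:

```python
import torch

def resolve_device(device: str) -> torch.device:
    # Use CUDA only when the caller asked for GPU and CUDA is actually
    # available; otherwise fall back to CPU, as the docstring describes.
    if device == 'GPU' and torch.cuda.is_available():
        return torch.device('cuda')
    return torch.device('cpu')

# Example usage: model.to(resolve_device('GPU'))
```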

sphinx_rtd_theme==0.4.3
docutils<0.18
Jinja2==3.0.3 # TODO(BEAM-14172): Sphinx version is too old.
torch==1.9.0
Contributor

I don't think we should hardcode a particular version of torch, since Beam doesn't have a requirement on torch.
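
Since Beam treats torch as an optional dependency, code that needs it is typically guarded at import time. A minimal sketch of that general pattern, illustrative only and not the actual Beam source:

```python
# Guarded import: the package itself stays importable without torch installed.
try:
    import torch
except ImportError:
    torch = None

def _require_torch():
    # Fail with a clear message only when torch functionality is actually used.
    if torch is None:
        raise ImportError(
            'This feature requires PyTorch; install it with `pip install torch`.')
```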

structure.
:param device: the device on which you wish to run the model. If
``device = GPU`` then device will be cuda if it is avaiable. Otherwise,
it will be cpu.
Contributor

Suggested change:
- it will be cpu.
+ it will be CPU.

# See the License for the specific language governing permissions and
# limitations under the License.
#
# mypy: ignore-errors
Contributor

Why is this necessary now?

Contributor Author

The TypeVar should have a bound. Right now we want to keep it as generic as possible, so only this file will be ignored by mypy for now; we will circle back to this later.

Jira related to this: https://issues.apache.org/jira/browse/BEAM-14217

Contributor

Please add relevant comments to BEAM-14217 and consider adding a # TODO(BEAM-14217): ... here, if applicable, in a later change.
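
For readers following this thread, the trade-off being discussed is roughly the difference between an unconstrained and a bounded TypeVar. The sketch below uses hypothetical names and is not the actual Beam code:

```python
from typing import TypeVar

# Unconstrained: maximally generic, but mypy has nothing to check calls
# against, which is why errors are ignored in this file for now.
ExampleT = TypeVar('ExampleT')

# Bounded alternative: mypy can verify attribute access, at the cost of
# restricting the type to subclasses of the (hypothetical) bound.
class SupportsInference:
    def predict(self, batch):
        raise NotImplementedError

BoundedExampleT = TypeVar('BoundedExampleT', bound=SupportsInference)
```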

@tvalentyn
Contributor

also R: @yeandy

@AnandInguva
Contributor Author

PTAL @tvalentyn

Contributor

@yeandy left a comment

lgtm

@yeandy
Contributor

yeandy commented May 3, 2022

Run Python PreCommit

@tvalentyn merged commit 7312377 into apache:master on May 4, 2022