Add support for distributed training to LinUCB #677
Closed
alexnikulkov wants to merge 3 commits into facebookresearch:main from
Conversation
This pull request was exported from Phabricator. Differential Revision: D38294567
Differential Revision: D38283370 fbshipit-source-id: 5185d3e7dd75bee372801e84e36a1c8d22830fd9
Differential Revision: D38305542 fbshipit-source-id: 4391a7f85407baa44de24e4a4da3647b9f1fd614
Summary: Pull Request resolved: facebookresearch#677. Enable distributed training by reducing the values of `A` and `b` across all trainer instances. For non-distributed training nothing changes, because `sync_ddp_if_available` returns its input unchanged when distributed training isn't used.
Differential Revision: D38294567
fbshipit-source-id: f6d131686790ee0a7145ccf6dd5ff282a4e480ef
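A minimal sketch of the idea behind this change. LinUCB's sufficient statistics are additive (`A` accumulates outer products of feature vectors, `b` accumulates reward-weighted features), so summing them across trainer replicas is equivalent to training on the union of their data. The `sync_if_available` helper below is a hypothetical stand-in for Lightning's `sync_ddp_if_available` (the `world` list simulates peer replicas); it is not the PR's actual code, and the identity-matrix regularization of `A` is an assumption for illustration.

```python
import numpy as np

def sync_if_available(value, world=None):
    # Stand-in for Lightning's sync_ddp_if_available: in a distributed run
    # it would all-reduce (sum) the tensor across trainer instances; in a
    # non-distributed run it returns the input unchanged. Here `world` is a
    # hypothetical list of peer copies used to simulate the reduction.
    if world is None:
        return value
    return sum(world)

class LinUCBSketch:
    """Toy LinUCB statistics: A = I + sum(x x^T), b = sum(r * x)."""

    def __init__(self, dim):
        self.A = np.eye(dim)        # regularized Gram matrix (assumed prior)
        self.b = np.zeros(dim)      # reward-weighted feature sum

    def update(self, x, reward):
        # Accumulate local statistics; both updates are pure sums, which is
        # why an all-reduce across replicas recovers the pooled statistics.
        self.A += np.outer(x, x)
        self.b += reward * x

    def coefficients(self, world_A=None, world_b=None):
        # Reduce A and b across replicas (identity when not distributed),
        # then solve A w = b for the ridge-regression coefficients.
        A = sync_if_available(self.A, world_A)
        b = sync_if_available(self.b, world_b)
        return np.linalg.solve(A, b)
```

In the non-distributed path the reduction is a no-op, so results are bit-identical to single-trainer training, which is what the summary claims about `sync_ddp_if_available`.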
Force-pushed from 023487b to 7b895e7 (Compare)
This pull request was exported from Phabricator. Differential Revision: D38294567
Codecov Report
@@            Coverage Diff            @@
##              main     #677    +/-   ##
=========================================
  Coverage    87.46%   87.46%
=========================================
  Files          366      366
  Lines        23351    23361    +10
  Branches        44       44
=========================================
+ Hits         20424    20433     +9
- Misses        2901     2902     +1
  Partials        26       26
xuruiyang pushed a commit that referenced this pull request on Sep 20, 2025
Summary: Pull Request resolved: #677. Enable distributed training by reducing the values of `A` and `b` across all trainer instances. For non-distributed training nothing changes, because `sync_ddp_if_available` returns its input unchanged when distributed training isn't used.
Differential Revision: D38294567
fbshipit-source-id: af1973c764f85fbc1c27b0e1395ed726c5193649