Add support for distributed training to LinUCB #677

Closed
alexnikulkov wants to merge 3 commits into facebookresearch:main from alexnikulkov:export-D38294567

Conversation

@alexnikulkov
Contributor

Summary:
Enable distributed training by reducing the values of `A` and `b` across all trainer instances.
For non-distributed training nothing changes, because `sync_ddp_if_available` returns its input unchanged when distributed training isn't in use.

Differential Revision: D38294567
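For context, the sketch below shows the general shape of this reduction. It is illustrative only and not the actual ReAgent code: the class, attribute names, and the `l2_reg` term are hypothetical, and the `sync_ddp_if_available` import path is the PyTorch Lightning 1.x location, which may differ in other versions.

```python
import torch
from pytorch_lightning.utilities.distributed import sync_ddp_if_available


class LinUCBStats:
    """Hypothetical container for the LinUCB sufficient statistics A and b."""

    def __init__(self, dim: int, l2_reg: float = 1.0) -> None:
        self.l2_reg = l2_reg
        self.A = torch.zeros(dim, dim)  # running sum of x_i x_i^T on this worker
        self.b = torch.zeros(dim)       # running sum of r_i x_i on this worker

    def update(self, x: torch.Tensor, reward: torch.Tensor) -> None:
        # x: (batch, dim) feature matrix, reward: (batch,) observed rewards.
        self.A += x.t() @ x
        self.b += x.t() @ reward

    def coefficients(self) -> torch.Tensor:
        # Sum A and b across all trainer instances. In a non-distributed run,
        # sync_ddp_if_available returns its input unchanged, so the result is
        # identical to single-process training.
        A = sync_ddp_if_available(self.A, reduce_op="sum")
        b = sync_ddp_if_available(self.b, reduce_op="sum")
        A = A + self.l2_reg * torch.eye(A.shape[0])  # add the ridge term once, after the reduce
        return torch.linalg.solve(A, b)              # theta = A^{-1} b
```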

@facebook-github-bot

This pull request was exported from Phabricator. Differential Revision: D38294567

alexnik and others added 3 commits August 23, 2022 12:42
Differential Revision: D38283370

fbshipit-source-id: 5185d3e7dd75bee372801e84e36a1c8d22830fd9
Differential Revision: D38305542

fbshipit-source-id: 4391a7f85407baa44de24e4a4da3647b9f1fd614
Summary:
Pull Request resolved: facebookresearch#677

Enable distributed training by reducing the values of `A` and `b` across all trainer instances.
For non-distributed training nothing changes, because `sync_ddp_if_available` returns its input unchanged when distributed training isn't in use.

Differential Revision: D38294567

fbshipit-source-id: f6d131686790ee0a7145ccf6dd5ff282a4e480ef
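A quick way to see why summing `A` and `b` across trainer instances matches single-process training is that the LinUCB coefficients depend on the data only through these two sums. The snippet below checks this numerically with plain PyTorch (no DDP setup required); the variable names and the `l2_reg` value are arbitrary.

```python
import torch

torch.manual_seed(0)
dim, l2_reg = 4, 1.0
x1, r1 = torch.randn(8, dim), torch.randn(8)  # worker 1's batch
x2, r2 = torch.randn(8, dim), torch.randn(8)  # worker 2's batch

# "Distributed": each worker accumulates its own A and b, then the sums are reduced.
A = (x1.t() @ x1) + (x2.t() @ x2) + l2_reg * torch.eye(dim)
b = (x1.t() @ r1) + (x2.t() @ r2)
theta_reduced = torch.linalg.solve(A, b)

# "Single process": the same data seen by one trainer.
x, r = torch.cat([x1, x2]), torch.cat([r1, r2])
theta_single = torch.linalg.solve(x.t() @ x + l2_reg * torch.eye(dim), x.t() @ r)

assert torch.allclose(theta_reduced, theta_single, atol=1e-5)
```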
@facebook-github-bot

This pull request was exported from Phabricator. Differential Revision: D38294567

@codecov-commenter

codecov-commenter commented Aug 23, 2022

Codecov Report

Merging #677 (7b895e7) into main (1fe8ba8) will increase coverage by 0.00%.
The diff coverage is 92.30%.

@@           Coverage Diff           @@
##             main     #677   +/-   ##
=======================================
  Coverage   87.46%   87.46%           
=======================================
  Files         366      366           
  Lines       23351    23361   +10     
  Branches       44       44           
=======================================
+ Hits        20424    20433    +9     
- Misses       2901     2902    +1     
  Partials       26       26           
Impacted Files                                        Coverage Δ
reagent/training/cb/linucb_trainer.py                 93.02% <50.00%> (-4.42%) ⬇️
reagent/models/deep_represent_linucb.py               96.87% <83.33%> (+2.43%) ⬆️
reagent/models/linear_regression.py                   97.61% <100.00%> (+0.74%) ⬆️
reagent/test/models/test_linear_regression_ucb.py     100.00% <100.00%> (ø)
reagent/test/training/cb/test_linucb.py               100.00% <100.00%> (ø)


xuruiyang pushed a commit that referenced this pull request Sep 20, 2025
Summary:
Pull Request resolved: #677

Enable distributed training by reducing the values of `A` and `b` across all trainer instances.
For non-distributed training nothing changes, because `sync_ddp_if_available` returns its input unchanged when distributed training isn't in use.

Differential Revision: D38294567

fbshipit-source-id: af1973c764f85fbc1c27b0e1395ed726c5193649
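As a usage note, the all-reduce only takes effect when the job actually runs under a distributed strategy. A minimal, generic Lightning launch is sketched below; `MyLinUCBModule` and `my_dataloader` are hypothetical placeholders, not names from ReAgent.

```python
# Hypothetical launch sketch: under strategy="ddp" each process accumulates its
# own A and b, and sync_ddp_if_available sums them across processes; on a
# single device the same code runs unchanged and the sync is a no-op.
import pytorch_lightning as pl

trainer = pl.Trainer(
    accelerator="cpu",   # or "gpu"
    devices=2,           # two trainer instances -> statistics are all-reduced
    strategy="ddp",
    max_epochs=1,
)
# trainer.fit(MyLinUCBModule(), train_dataloaders=my_dataloader)  # placeholder names
```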