This repository was archived by the owner on Jan 24, 2024. It is now read-only.

Conversation


@dependabot dependabot bot commented on behalf of github Aug 4, 2023

Updates the requirements on bitsandbytes to permit the latest version.

Changelog

Sourced from bitsandbytes's changelog.

0.0.21:

  • Ampere, RTX 30 series GPUs now compatible with the library.

0.0.22:

  • Fixed an error where calling reset_parameters() on StableEmbedding failed on older PyTorch versions (from 1.7.0).
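
A minimal sketch of the call in question, assuming the current bnb.nn.StableEmbedding constructor; the dimensions are illustrative:

```python
import bitsandbytes as bnb

# StableEmbedding is the layer named in the changelog; sizes here are
# illustrative. reset_parameters() is the call that previously raised
# an error on older PyTorch versions.
emb = bnb.nn.StableEmbedding(num_embeddings=1000, embedding_dim=64)
emb.reset_parameters()  # now safe from PyTorch 1.7.0 onward
```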

0.0.23:

Bugs:

  • Unified quantization API: each quantization function now returns Q, S, where Q is the quantized tensor and S is the quantization state, which may hold absolute max values, a quantization map, or more. For dequantization, all functions now accept the inputs Q, S, so that Q is dequantized with the quantization state S (a hedged round-trip sketch follows this list).
  • Fixed an issue where the CUDA 11.1 binary was not compiled with the right headers
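
A hedged sketch of the unified (Q, S) round trip, using the block-wise routines from bitsandbytes.functional; exact signatures vary across versions:

```python
import torch
import bitsandbytes.functional as F

x = torch.randn(1024, dtype=torch.float32, device="cuda")

# Quantization returns (Q, S): the quantized tensor plus its quantization
# state (e.g. absolute max values and/or a quantization map).
Q, S = F.quantize_blockwise(x)

# Dequantization accepts the same (Q, S) pair.
x_restored = F.dequantize_blockwise(Q, S)
```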

API changes:

  • Block-wise quantization for optimizers now enabled by default
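
Concretely, no flag is needed any more; a minimal sketch with the default spelled out, assuming an existing torch.nn.Module named model and the bnb.optim.Adam keyword names:

```python
import bitsandbytes as bnb

# block_wise now defaults to True; it is written out here only for clarity.
# `model` is assumed to be an existing torch.nn.Module.
optimizer = bnb.optim.Adam(
    model.parameters(), lr=1e-3, optim_bits=8, block_wise=True
)
```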

Features:

  • Block-wise quantization routines now support CPU Tensors.

0.0.24:

  • Fixed a bug where a float/half conversion led to a compilation error for CUDA 11.1 on Turing GPUs.
  • Removed the Apex dependency for bnb LAMB.

0.0.25:

Features:

  • Added skip_zeros for block-wise and 32-bit optimizers. This ensures correct updates for sparse gradients and sparse models (a usage sketch follows this list).
  • Added support for Kepler GPUs. (#4)
  • Added Analysis Adam to track 8-bit vs 32-bit quantization errors over time.
  • Made compilation more user-friendly.
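
A usage sketch for skip_zeros, assuming it is exposed as a constructor keyword on the bnb optimizers (in some versions it lives on the base optimizer class), with an existing torch.nn.Module named model:

```python
import bitsandbytes as bnb

# skip_zeros as a constructor keyword is an assumption; with sparse
# gradients, zero-valued entries are skipped so their optimizer state
# is not updated incorrectly.
optimizer = bnb.optim.Adam(model.parameters(), lr=1e-3, skip_zeros=True)
```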

Bug fixes:

  • Fixed an "undefined symbol: __fatbinwrap_38" error for P100 GPUs on CUDA 10.1. (#5)

Docs:

  • Added docs with instructions to compile from source.

0.26.0:

Features:

  • Added Adagrad (without grad clipping) as 32-bit and 8-bit block-wise optimizer.
  • Added AdamW (a copy of Adam with weight decay initialized to 1e-2). #10
  • Introduced ModuleConfig overrides which can be used seamlessly at initialization time of a module (see the sketch after this list).
  • Added bnb.nn.Embedding layer which runs at 32-bit but without the layernorm. This works well if you need to fine-tune pretrained models that do not have an embedding layer norm. #19
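
A hedged sketch of the new embedding layer together with a per-module override; bnb.nn.Embedding is named in the changelog, while routing the override through GlobalOptimManager is an assumption based on current bitsandbytes:

```python
import torch
import bitsandbytes as bnb

# bnb.nn.Embedding behaves like torch.nn.Embedding, minus the layer norm
# that StableEmbedding adds.
emb = bnb.nn.Embedding(num_embeddings=30000, embedding_dim=768)

# Per-module config override; the exact GlobalOptimManager call is an
# assumption and may differ in older releases.
mng = bnb.optim.GlobalOptimManager.get_instance()
mng.register_module_override(emb, "weight", {"optim_bits": 32})

ids = torch.randint(0, 30000, (2, 16))
out = emb(ids)  # standard embedding lookup, no layer norm applied
```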

Bug fixes:

  • Fixed a bug where weight decay was incorrectly applied to 32-bit Adam. #13

... (truncated)

Commits

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

Updates the requirements on [bitsandbytes](https://github.com/TimDettmers/bitsandbytes) to permit the latest version.
- [Release notes](https://github.com/TimDettmers/bitsandbytes/releases)
- [Changelog](https://github.com/TimDettmers/bitsandbytes/blob/main/CHANGELOG.md)
- [Commits](https://github.com/TimDettmers/bitsandbytes/commits)

---
updated-dependencies:
- dependency-name: bitsandbytes
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
@dependabot dependabot bot added the dependencies (Pull requests that update a dependency file) and python (Pull requests that update Python code) labels on Aug 4, 2023
@codecov-commenter

Codecov Report

Patch and project coverage show no change.

Comparison: base (f1b0c1e) 94.29% vs. head (8d4ad0b) 94.29%.


Additional details and impacted files
@@           Coverage Diff           @@
##           master     #235   +/-   ##
=======================================
  Coverage   94.29%   94.29%           
=======================================
  Files           7        7           
  Lines         333      333           
=======================================
  Hits          314      314           
  Misses         19       19           


@peakji peakji merged commit d5f1bdd into master Aug 6, 2023
@peakji peakji deleted the dependabot/pip/bitsandbytes-approx-eq-0.41.1 branch August 6, 2023 15:28
