A PyTorch-based distribution parametrized by the logits of CDF bins
The Cumulative Distribution Function (CDF) is a fundamental concept in probability theory and statistics: it gives the probability that a random variable takes a value less than or equal to a given threshold.
This repository uses the CDF to model and learn flexible probability distributions in machine learning tasks. By parameterizing the CDF with binned logits, it enables differentiable training and efficient sampling, making it suitable for uncertainty estimation, probabilistic prediction, and distributional modeling in neural networks.
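To build intuition for the binned-logits idea, here is a minimal, repo-independent sketch (not this repository's implementation): normalizing per-bin logits into bin masses and cumulatively summing them yields a valid, differentiable piecewise CDF.

```python
import torch

# Hypothetical sketch: num_bins logits per batch element become a
# monotone CDF over fixed bin edges via softmax + cumulative sum.
logits = torch.randn(3, 10)            # (*batch_shape, num_bins)
probs = torch.softmax(logits, dim=-1)  # nonnegative bin masses, sum to 1
cdf = torch.cumsum(probs, dim=-1)      # nondecreasing, final value is 1
```

Because every step (softmax, cumsum) is differentiable, gradients flow from any CDF-based loss back into the logits, which is what makes this parameterization trainable.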
The PiecewiseConstantBinnedCDF and PiecewiseLinearBinnedCDF classes inherit directly from
torch.distributions.Distribution, implementing all necessary methods plus some convenience functions.
They support multi-dimensional batch shapes and CUDA devices.
The bins can be initialized linearly or log-spaced.
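The two spacing schemes can be sketched with plain PyTorch (variable names here are illustrative, not the repository's API). Log spacing concentrates bin edges near the lower bound, which suits data spanning several orders of magnitude; this sketch uses positive bounds since log spacing is defined on them.

```python
import math
import torch

bound_low, bound_up, num_bins = 1e-3, 10.0, 8

# Linear spacing: equal-width bins between the bounds.
linear_edges = torch.linspace(bound_low, bound_up, num_bins + 1)

# Log spacing: edges equally spaced in log10, denser near bound_low.
log_edges = torch.logspace(
    math.log10(bound_low), math.log10(bound_up), num_bins + 1
)
```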
torch>=2.7 is the only non-dev dependency of this repo.
I recommend using PiecewiseLinearBinnedCDF for most applications.
import torch
from binned_cdf import PiecewiseLinearBinnedCDF

logits = torch.randn(32, 10)  # shape: (*batch_shape, num_bins)
distr = PiecewiseLinearBinnedCDF(
    logits=logits,
    bound_low=-5,  # adapt to your data
    bound_up=7,  # adapt to your data
    log_spacing=True,  # if False, linear spacing is used
    bin_normalization_method="sigmoid",  # "sigmoid" or "softmax"
)
# ... use it like any other torch.distributions.Distribution

👉 Please have a look at the documentation to get started.
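Since the classes inherit from torch.distributions.Distribution, they expose the standard interface. The sketch below uses torch.distributions.Normal as a stand-in to show the calls you would make on distr in the same way.

```python
import torch
from torch.distributions import Normal

# Normal stands in for a binned CDF here; the method calls are the
# shared Distribution interface, not repo-specific additions.
d = Normal(loc=torch.zeros(4), scale=torch.ones(4))
samples = d.sample((100,))        # shape: (100, 4)
logp = d.log_prob(samples)        # elementwise log-density
cdf_vals = d.cdf(torch.zeros(4))  # CDF evaluated at 0
```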