RandCraft is a Python library for object-oriented combination and manipulation of univariate random variables, built on top of the `scipy.stats` module.

Have you ever wanted to add random variables together but couldn't be bothered working out an analytical solution? RandCraft makes it simple.
```python
from randcraft import make_normal, make_coin_flip

coin_flip = make_coin_flip()
# <RandomVariable(discrete): mean=0.5, var=0.25>
norm = make_normal(mean=0, std_dev=0.2)
# <RandomVariable(normal): mean=0.0, var=0.04>
combined = coin_flip + norm
# <RandomVariable(mixture): mean=0.5, var=0.29>
combined.sample()
# 0.8678903828104276
combined.plot()
```

Features
- Object-oriented: RVs are objects; most things you need are properties or methods on the object instance
- Distribution composition: scale random variables and add them together, e.g. `rvc = rva/2 + rvb`, or apply arbitrary functions
- Plotting: quickly have a look at the distribution of any RV (including combinations) with `rv.plot()`
- Sampling and statistics: easily sample from composed distributions and access their statistics
- Distribution estimation: use KDE estimation on samples to create a distribution
- Integration with scipy.stats: use any frozen continuous distribution from `scipy.stats`
- Deterministic: pass a seed to any random variable during init to guarantee reproducible results
- Extensible: supports custom distributions via subclassing
Parametric
- Normal, Uniform, Beta, Gamma, Lognormal, plus any other parametric continuous distribution from `scipy.stats`
- Discrete
- DiracDelta

Non-parametric
- Gaussian KDE distribution from provided observations
- Distributions based on a provided sampler function

Combinations
- Mixture (random choice between distributions)
- Multi (linear combination of distributions)

When combining distributions, the library simplifies the new distribution analytically where possible, and falls back to numerical approaches otherwise.
You can also extend RandCraft with your own custom distributions.
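To illustrate what analytic simplification buys (this is a plain numpy/scipy sketch of the underlying math, not randcraft's implementation): for independent random variables, means and variances simply add, so a sum like the coin flip plus normal above can be resolved in closed form, while a numerical fallback effectively amounts to a Monte Carlo estimate.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
x = stats.norm(loc=0.0, scale=0.2)   # like make_normal(mean=0, std_dev=0.2)
y = stats.bernoulli(p=0.5)           # like make_coin_flip()

# Closed form for independent X, Y: E[X+Y] = 0.0 + 0.5, Var(X+Y) = 0.04 + 0.25
analytic_mean = x.mean() + y.mean()
analytic_var = x.var() + y.var()
print(analytic_mean, analytic_var)   # 0.5 0.29

# Numerical fallback: Monte Carlo estimate of the same quantities
samples = x.rvs(100_000, random_state=rng) + y.rvs(100_000, random_state=rng)
print(samples.mean(), samples.var())  # close to 0.5 and 0.29
```

Note how the Monte Carlo figures only approximate the exact ones, which is why `get_mean(exact=True)` and friends exist.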
Installation

```shell
# pip
pip install randcraft

# uv
uv add randcraft
```

API overview
- `make_normal()`, `make_uniform()` etc.: create a random variable
- `(rva - rv2) + 3.0`: addition and subtraction with constants or other RVs
- `rv.scale(k)`: scale the RV by a constant
- `rv.multi_sample(n)`: create a new RV representing n independent observations of the original RV
- `rv * k`: ⚠️ not allowed, due to the ambiguity between scale and multi_sample
- `rv / k`: scale the RV by a constant
- `rv.sample()`: draw 1 sample (float)
- `rv.sample(n)`: draw n samples (np.ndarray)
- `rv.get_mean()`, `rv.get_variance()`: get statistics (possibly estimated numerically)
- `rv.get_mean(exact=True)`, `rv.get_variance(exact=True)`: only accept exact values
- `rv.cdf(x)`: evaluate the cdf at points
- `rv.ppf(x)`: evaluate the inverse cdf at points
- `rv.plot()`: take a look at your distribution
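The reason `rv * k` is disallowed can be seen with a plain numpy sketch (an illustration, not randcraft code): scaling one draw by n and summing n independent draws have the same mean but very different variances, so `*` would be ambiguous between `scale` and `multi_sample`.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 3
draws = rng.uniform(0, 1, size=(100_000, n))  # iid U(0, 1) draws

scaled = draws[:, 0] * n   # like rv.scale(n):        Var = n**2 * Var(X)
summed = draws.sum(axis=1) # like rv.multi_sample(n): Var = n * Var(X)

# Var(U(0,1)) = 1/12, so the variances are roughly 9/12 = 0.75 vs 3/12 = 0.25,
# even though both means are roughly n * 0.5 = 1.5
print(scaled.var(), summed.var())
```

Forcing the explicit `scale`/`multi_sample` choice keeps this distinction visible at the call site.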
```python
from randcraft.constructors import make_die_roll

die = make_die_roll(sides=6)
# <RandomVariable(discrete): mean=3.5, var=2.92>
three_dice = die.multi_sample(3)
# <RandomVariable(discrete): mean=10.5, var=8.75>
three_dice.cdf(10.0)
# 0.5
three_dice.ppf(0.5)
# 10.0
```

```python
from scipy.stats import uniform
from randcraft.constructors import make_scipy

rv = make_scipy(uniform, loc=1, scale=2)
# <RandomVariable(scipy-uniform): mean=2.0, var=0.333>
b = rv.scale(2.0)
# <RandomVariable(scipy-uniform): mean=4.0, var=1.33>
```

You have observations of two independent random variables. You want to use kernel density estimation to create continuous random variables for each and then add them together.
```python
import numpy as np
from randcraft.observations import make_gaussian_kde

observations_a = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
observations_b = np.array([1.0, 1.0, 1.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0])
rv_a = make_gaussian_kde(observations=observations_a, bw_method=0.1)
# <RandomVariable(multi): mean=3.0, var=2.42>
rv_b = make_gaussian_kde(observations=observations_b)
# <RandomVariable(multi): mean=0.5, var=0.676>
rv_joined = rv_a + rv_b
# <RandomVariable(multi): mean=3.5, var=3.1>
```

This uses `gaussian_kde` from `scipy.stats` under the hood. You can also pass arguments through to `gaussian_kde`, or provide your own kernel as a RandomVariable.
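For intuition about what the wrapped `gaussian_kde` is doing, here is a sketch using scipy directly (not randcraft's code): a KDE places a Gaussian kernel on each observation, and the resulting density can be sampled and integrated.

```python
import numpy as np
from scipy.stats import gaussian_kde

observations = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
kde = gaussian_kde(observations, bw_method=0.1)

# Draw samples from the estimated density; shape is (1, 10000) for 1-D data
samples = kde.resample(size=10_000, seed=0)
print(samples.mean())  # close to 3.0, the mean of the observations

# The cdf at a point is the integral of the density up to it;
# by symmetry of these observations around 3.0 this is about 0.5
print(kde.integrate_box_1d(-np.inf, 3.0))
```

Anything `gaussian_kde` accepts (such as `bw_method`) can matter a lot for small samples, which is why randcraft exposes it.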
```python
from randcraft import make_uniform

rv = make_uniform(low=0, high=1)
# <RandomVariable(scipy-uniform): mean=0.5, var=0.0833>
rv_sample_mean = rv.multi_sample(n=30) / 30
# <RandomVariable(multi): mean=0.5, var=0.00278>
rv_sample_mean.plot()
```

```python
from randcraft.constructors import make_normal, make_uniform, make_discrete
from randcraft.misc import mix_rvs

rv1 = make_normal(mean=0, std_dev=1)
# <RandomVariable(scipy-norm): mean=0.0, var=1.0>
rv2 = make_uniform(low=-1, high=1)
# <RandomVariable(scipy-uniform): mean=-0.0, var=0.333>
combined = rv1 + rv2
# <RandomVariable(multi): mean=0.0, var=1.33>
discrete = make_discrete(values=[1, 2, 3])
# <RandomVariable(discrete): mean=2.0, var=0.667>
# Make a new rv which randomly draws from one of the 4 rvs above
mixed = mix_rvs([rv1, rv2, combined, discrete])
# <RandomVariable(mixture): mean=0.5, var=1.58>
mixed.plot()
```

You can apply any function of the form `np.ndarray[float] -> np.ndarray[float]` to a random variable. Statistics, plotting, etc. will then be estimated numerically.
```python
from randcraft import make_coin_flip
from randcraft.misc import apply_func_to_discrete_rv

rv = make_coin_flip()
# <RandomVariable(discrete): mean=0.5, var=0.25>
rv_2 = apply_func_to_discrete_rv(rv=rv, func=lambda x: x * 2 - 1)
# <RandomVariable(anon): mean=0.0, var=1.66>
rv_2.get_mean()
# np.float64(0.0)
values = rv_2.sample(5)
# array([-1., -1.,  1., -1.,  1.])
```

```python
import numpy as np
from randcraft.constructors import make_normal

rv1 = make_normal(mean=0.0, std_dev=1.0, seed=3)
# <RandomVariable(scipy-norm): mean=0.0, var=1.0, seeded>
rv2 = make_normal(mean=0.0, std_dev=1.0, seed=3)
# <RandomVariable(scipy-norm): mean=0.0, var=1.0, seeded>
np.array_equal(rv1.sample(10), rv2.sample(10))
# True
```

You can create custom random variable classes by subclassing the base RV class and implementing the required methods.
The library is designed to work with univariate random variables only. Multi-dimensional RVs, correlations, etc. are not supported.
MIT License
Built on scipy.stats.


