TF-Agents: A reliable, scalable, and easy-to-use TensorFlow library for Contextual Bandits and Reinforcement Learning.
Open Bandit Pipeline: a python library for bandit algorithms and off-policy evaluation
[IJAIT 2021] MABWiser: Contextual Multi-Armed Bandits Library
An easy-to-use reinforcement learning library for research and education.
A Pythonic microframework for multi-armed bandit problems
Python library for Multi-Armed Bandits
Learning Multi-Armed Bandits by Examples. Currently covering MAB, UCB, Boltzmann Exploration, Thompson Sampling, Contextual MAB, LinUCB, Deep MAB.
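For orientation, one of the methods named above, Thompson Sampling for Bernoulli rewards, can be sketched in a few lines: each arm keeps a Beta posterior over its success probability, and the policy plays the arm with the largest posterior draw. This is a generic illustration, not code from the repository above.

```python
import random

def thompson_sample(successes, failures):
    """Bernoulli Thompson sampling: draw one sample from each arm's
    Beta(successes + 1, failures + 1) posterior and play the arm
    with the largest draw."""
    draws = [random.betavariate(s + 1, f + 1)
             for s, f in zip(successes, failures)]
    return max(range(len(draws)), key=lambda a: draws[a])
```

Because the selection is a posterior sample rather than a point estimate, arms with little data keep a nonzero chance of being played, which is where the exploration comes from.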
Simulation code for the paper: [Wang2021] Wenbo Wang, Amir Leshem, Dusit Niyato and Zhu Han, "Decentralized Learning for Channel Allocation in IoT Networks over Unlicensed Bandwidth as a Contextual Multi-player Multi-armed Bandit Game", to appear in IEEE Transactions on Wireless Communications, 2021.
Python implementation of UCB, EXP3 and Epsilon greedy algorithms
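Two of the policies recurring throughout these libraries, epsilon-greedy and UCB1, are small enough to sketch directly. The snippet below is a minimal generic illustration (not code from any repository listed here); `counts` holds per-arm play counts and `values` per-arm empirical mean rewards.

```python
import math
import random

def epsilon_greedy(values, epsilon=0.1):
    """With probability epsilon explore a random arm,
    otherwise exploit the arm with the best empirical mean."""
    if random.random() < epsilon:
        return random.randrange(len(values))
    return max(range(len(values)), key=lambda a: values[a])

def ucb1(counts, values):
    """Play each arm once, then pick the arm maximizing
    mean reward plus the UCB1 exploration bonus."""
    for arm, n in enumerate(counts):
        if n == 0:  # every arm must be sampled before the bonus is defined
            return arm
    total = sum(counts)
    return max(range(len(values)),
               key=lambda a: values[a] + math.sqrt(2 * math.log(total) / counts[a]))
```

The exploration bonus shrinks as an arm accumulates plays, so under-sampled arms are retried until their uncertainty no longer outweighs the best mean.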
Bayesian Optimization for Categorical and Continuous Inputs
Online Ranking with Multi-Armed Bandits
Decentralized Intelligent Resource Allocation for LoRaWAN Networks
Implementations of basic concepts dealt under the Reinforcement Learning umbrella. This project is collection of assignments in CS747: Foundations of Intelligent and Learning Agents (Autumn 2017) at IIT Bombay
[NeurIPS 2022] Supervising the Multi-Fidelity Race of Hyperparameter Configurations
Implementation of the X-armed Bandits algorithm, as detailed in the paper "X-armed Bandits" (Bubeck et al., 2011).
A beer recommendation system using multi-armed bandit approach to solve cold start problems
Python Library for Neural Multi-Armed Bandits
A multi-armed bandit (MAB) simulation library in Python
An improved version of Turbo algorithm for the Black-box optimization competition organized by NeurIPS 2020