PyDPM is a Python library for constructing Deep Probabilistic Models (DPMs). It not only provides efficient distribution sampling functions on GPU, but also includes implementations of existing popular DPMs.
Documentation | Paper [arXiv] | Tutorials | Benchmarks | Examples
🔥Note: We have released a new version that no longer depends on PyCUDA.
The current version of PyDPM can be installed under either Windows or Linux with PyPI:

```shell
$ pip install pydpm
```

On Windows, we recommend installing Visual Studio 2019 as the compiler, equipped with the CUDA 11.5 toolkit; on Linux, we recommend installing the latest version of the CUDA toolkit.
An overview of the framework of the PyDPM library.
The workflow of applying PyDPM to downstream tasks.
| Probabilistic Model Name | Abbreviation | Paper Link |
|---|---|---|
| Latent Dirichlet Allocation | LDA | Link |
| Poisson Factor Analysis | PFA | Link |
| Poisson Gamma Belief Network | PGBN | Link |
| Convolutional Poisson Factor Analysis | CPFA | Link |
| Convolutional Poisson Gamma Belief Network | CPGBN | Link |
| Poisson Gamma Dynamical Systems | PGDS | Link |
| Deep Poisson Gamma Dynamical Systems | DPGDS | Link |
| Dirichlet Belief Networks | DirBN | Link |
| Deep Poisson Factor Analysis | DPFA | Link |
| Word Embeddings Deep Topic Model | WEDTM | Link |
| Multimodal Poisson Gamma Belief Network | MPGBN | Link |
| Graph Poisson Gamma Belief Network | GPGBN | Link |
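For intuition, several of the models above (e.g., PGBN) belong to the gamma belief network family, which generates data through stacked gamma-distributed latent layers topped by a Poisson observation layer. Below is a minimal NumPy sketch of that generative process; the layer widths, hyperparameter values, and variable names are illustrative assumptions, not PyDPM's API.

```python
import numpy as np

rng = np.random.default_rng(0)

V, K1, K2, K3 = 200, 128, 64, 32   # vocabulary size and layer widths (illustrative)
N = 10                              # number of documents

# factor loading matrices: each column is a Dirichlet-distributed topic
Phi1 = rng.dirichlet(np.ones(V), size=K1).T      # V  x K1
Phi2 = rng.dirichlet(np.ones(K1), size=K2).T     # K1 x K2
Phi3 = rng.dirichlet(np.ones(K2), size=K3).T     # K2 x K3

r, c = 1.0, 1.0                     # top-layer gamma shape and scale (illustrative)

# top-down sampling: theta^(l) ~ Gamma(Phi^(l+1) theta^(l+1), c),
# observations x ~ Poisson(Phi^(1) theta^(1))
theta3 = rng.gamma(r, c, size=(K3, N))
theta2 = rng.gamma(Phi3 @ theta3, c)
theta1 = rng.gamma(Phi2 @ theta2, c)
x = rng.poisson(Phi1 @ theta1)      # V x N count matrix (e.g., bag-of-words)

print(x.shape)  # (200, 10)
```

Training in PyDPM infers the `Phi` and `theta` variables from observed counts; this sketch only illustrates the direction of the generative story.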
Example: a few lines of code to quickly construct and evaluate a 3-layer Bayesian model, PGBN, on GPU.
```python
from pydpm.model import PGBN
from pydpm.metric import ACC

# create the model and deploy it on gpu or cpu
model = PGBN([128, 64, 32], device='gpu')
model.initial(train_data)
train_local_params = model.train(train_data, iter_all=100)
train_local_params = model.test(train_data, iter_all=100)
test_local_params = model.test(test_data, iter_all=100)

# evaluate the model with classification accuracy
# the demo accuracy can achieve 0.8549
results = ACC(train_local_params.Theta[0], test_local_params.Theta[0], train_label, test_label, 'SVM')

# save the model after training
model.save()
```

Example: a few lines of code to quickly deploy the distribution sampler of PyDPM on GPU.
```python
import numpy as np

from pydpm._sampler import Basic_Sampler

sampler = Basic_Sampler('gpu')
a = sampler.gamma(np.ones(100)*5, 1, times=10)
b = sampler.gamma(np.ones([100, 100])*5, 1, times=10)
```

Compare the distribution sampling efficiency of PyDPM with NumPy:
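The CPU baseline for such a comparison can be reproduced with a simple timing sketch; the array size and repetition count here are illustrative assumptions, with `np.random.gamma` standing in as NumPy's sampler.

```python
import time
import numpy as np

shape = np.ones([1000, 1000]) * 5   # gamma shape parameters (illustrative size)

# time NumPy's CPU gamma sampler as the baseline
start = time.time()
for _ in range(10):
    samples = np.random.gamma(shape, scale=1.0)
numpy_seconds = time.time() - start

print(f'numpy: {numpy_seconds:.3f}s, sample shape {samples.shape}')
# the PyDPM counterpart above is sampler.gamma(shape, 1, times=10) on GPU
```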
Compare the distribution sampling efficiency of PyDPM with TensorFlow and PyTorch:
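A hedged sketch of the PyTorch side of such a comparison is below; the tensor sizes are illustrative, and `torch.distributions.Gamma` is stock PyTorch, not part of PyDPM.

```python
import torch

concentration = torch.full((1000, 1000), 5.0)  # gamma shape parameters
rate = torch.ones(1000, 1000)                  # rate = 1 / scale

# stock PyTorch gamma sampler; move the tensors to 'cuda' to time the GPU path
dist = torch.distributions.Gamma(concentration, rate)
samples = dist.sample()

print(samples.shape)  # torch.Size([1000, 1000])
```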
License: Apache License Version 2.0
Contact: Chaojie Wang xd_silly@163.com, Wei Zhao 13279389260@163.com, Xinyang Liu lxy771258012@163.com, Jiawen Wu wjw19960807@163.com
Copyright (c) 2020, Chaojie Wang, Wei Zhao, Xinyang Liu, Jiawen Wu, Jie Ren, Yewen Li, Hao Zhang, Bo Chen and Mingyuan Zhou