
Run AI Workloads at Scale

Secure, high-performance AI infrastructure in Python.


Installation

pip install beam-client

Features

  • Extremely Fast: Launch containers in under a second using a custom container runtime
  • Parallelization and Concurrency: Fan out workloads to 100s of containers
  • First-Class Developer Experience: Hot-reloading, webhooks, and scheduled jobs
  • Scale-to-Zero: Workloads are serverless by default
  • Volume Storage: Mount distributed storage volumes
  • GPU Support: Run on our cloud (4090s, H100s, and more) or bring your own GPUs

Quickstart

  1. Create an account at https://beam.cloud
  2. Follow our Getting Started Guide

Creating a sandbox

Spin up isolated containers to run LLM-generated code:

from beam import Image, Sandbox

sandbox = Sandbox(image=Image()).create()
response = sandbox.process.run_code("print('I am running remotely')")

print(response.result)
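As a local analogy (not the Beam API), the same pattern of running generated code in a separate process can be sketched with the standard library; the Sandbox gives you this plus remote container isolation:

```python
import subprocess
import sys

# Local stand-in: execute generated code in a separate Python process.
# Beam's Sandbox runs the same pattern inside an isolated remote container.
generated_code = "print('I am running remotely')"

result = subprocess.run(
    [sys.executable, "-c", generated_code],
    capture_output=True,
    text=True,
    timeout=10,
)
print(result.stdout.strip())  # -> I am running remotely
```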

Deploy a serverless inference endpoint

Create an autoscaling endpoint for your custom model:

from beam import Image, endpoint
from beam import QueueDepthAutoscaler

@endpoint(
    image=Image(python_version="python3.11"),
    gpu="A10G",
    cpu=2,
    memory="16Gi",
    autoscaler=QueueDepthAutoscaler(max_containers=5, tasks_per_container=30)
)
def handler():
    return {"label": "cat", "confidence": 0.97}
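As a rough sketch (not Beam's internal implementation), a queue-depth autoscaler provisions enough containers to keep each one at or under its task target, capped at the configured maximum:

```python
import math

def desired_containers(queue_depth: int, tasks_per_container: int,
                       max_containers: int) -> int:
    """Approximate queue-depth scaling: enough containers so each stays
    under its task target, capped at the configured maximum."""
    if queue_depth <= 0:
        return 0  # scale-to-zero when the queue is empty
    return min(max_containers, math.ceil(queue_depth / tasks_per_container))

# With the settings above (tasks_per_container=30, max_containers=5):
print(desired_containers(10, 30, 5))   # -> 1
print(desired_containers(90, 30, 5))   # -> 3
print(desired_containers(500, 30, 5))  # -> 5 (capped)
```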

Run background tasks

Replace your Celery queue with a simple decorator:

from beam import Image, task_queue

@task_queue(
    image=Image(python_version="python3.11"),
    cpu=1,
    memory=1024,
)
def handler(images):
    for image in images:
        # Do something
        pass
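The fan-out pattern the task queue provides across containers can be sketched locally with a thread pool (a local stand-in, not the Beam API; on Beam, each task is queued and pulled by one of the autoscaled containers):

```python
from concurrent.futures import ThreadPoolExecutor

def process_image(image: str) -> str:
    # Stand-in for per-item work (resizing, inference, etc.)
    return image.upper()

images = ["cat.png", "dog.png", "bird.png"]

# Locally, a thread pool fans the work out across workers.
with ThreadPoolExecutor(max_workers=3) as pool:
    results = list(pool.map(process_image, images))

print(results)  # -> ['CAT.PNG', 'DOG.PNG', 'BIRD.PNG']
```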

Self-Hosting vs Cloud

Beta9 is the open-source engine powering Beam, our fully-managed cloud platform. You can self-host Beta9 for free or choose managed cloud hosting through Beam.

Contributing

We welcome contributions, big or small.

Thanks to Our Contributors
