Model Assurance Demo: a proof-of-concept suite of service tools for validating and monitoring machine learning models in a web-based SaaS environment. Repository layout:
- infra/: Securely fetch secrets from AWS and provision cloud resources.
- deepchecks_demo/: Simple Python scripts to check for data drift and model robustness, with hands-on examples.
- lakera_integration/: A lightweight proxy that applies safety, privacy, and bias checks on LLM calls.
- robust_integration/: Configuration for running adversarial and fairness tests in CI.
- arize_feedback_loop/: Code to stream predictions into Arize for observability and auto-trigger retraining when needed.
- .github/workflows/: GitHub Actions that automate the “model-test” pipeline.
- docs/: Explanations of architecture and data formats.
Load secrets (so no sensitive info ends up in code):
python infra/aws_secrets.py
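As a rough sketch of what a loader like infra/aws_secrets.py could do, assuming AWS Secrets Manager via boto3 (the secret name below is hypothetical):

```python
# Sketch of a secrets loader: pull a JSON secret from AWS Secrets Manager
# and export its fields as environment variables, so no keys are ever
# committed to the repo. The secret name is an assumption, not the repo's.
import json
import os

import boto3


def load_secrets(secret_name: str = "model-assurance/demo") -> None:
    client = boto3.client("secretsmanager")
    response = client.get_secret_value(SecretId=secret_name)
    # Secrets Manager returns the payload as a JSON string.
    for key, value in json.loads(response["SecretString"]).items():
        os.environ[key] = value


if __name__ == "__main__":
    load_secrets()
```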
Try local checks:
- Go to deepchecks_demo/
- Create the environment and run:
conda env create -f deepchecks_env.yml
conda activate deepchecks_demo
python checks/data_drift_check.py
python checks/adversarial_perturbation_check.py
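The actual scripts aren't reproduced here; the sketch below only illustrates the two underlying ideas (distribution drift and perturbation robustness) with numpy/scipy stand-ins, and a hypothetical linear rule in place of the real model:

```python
# Illustrative stand-ins for the two checks (the real scripts may use
# the deepchecks library; this only shows the underlying ideas).
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)

# --- Data drift: compare a reference feature sample to live traffic ---
reference = rng.normal(0.0, 1.0, size=1_000)  # training-time distribution
live = rng.normal(0.3, 1.0, size=1_000)       # shifted production data
stat, p_value = ks_2samp(reference, live)
print(f"KS statistic={stat:.3f}, p={p_value:.4f}",
      "-> drift detected" if p_value < 0.01 else "-> no significant drift")

# --- Robustness: predictions should be stable under small perturbations ---
def predict(x: np.ndarray) -> np.ndarray:
    """Hypothetical model: a fixed linear decision rule."""
    return (x @ np.array([0.5, -0.2]) > 0).astype(int)

X = rng.normal(size=(200, 2))
noisy = X + rng.normal(scale=0.05, size=X.shape)  # small Gaussian noise
flip_rate = np.mean(predict(X) != predict(noisy))
print(f"Prediction flip rate under noise: {flip_rate:.1%}")
```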
Pilot in the cloud:
cd lakera_integration
python policy_proxy.py
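For orientation, here is a minimal sketch of the shape such a proxy could take, assuming Flask; the regex PII check is a toy stand-in for the real safety/privacy policy call (e.g. out to Lakera), not the actual integration:

```python
# Toy policy proxy: apply the requested policy to the prompt before it
# would reach the LLM. The email regex is a deliberately naive PII check.
import re

from flask import Flask, jsonify, request

app = Flask(__name__)

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")


@app.route("/generate", methods=["POST"])
def generate():
    policy = request.args.get("policy", "none")
    prompt = (request.get_json(force=True) or {}).get("prompt", "")

    # Block the request if the privacy policy is active and PII is found.
    if policy == "privacy" and EMAIL_RE.search(prompt):
        return jsonify({"blocked": True, "reason": "PII detected"}), 403

    # A real proxy would forward to the upstream LLM here; we just echo.
    return jsonify({"blocked": False, "completion": f"echo: {prompt}"})


if __name__ == "__main__":
    app.run(port=8080)
```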
With the proxy running, exercise a policy from another terminal:
curl -X POST 'localhost:8080/generate?policy=privacy' -d '{"prompt":"Hello, world!"}'
Automate in CI/CD:
Push to GitHub and watch the model-test workflow run.
Monitor & Retrain:
cd arize_feedback_loop
python send_logs.py
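The gist of the feedback loop in send_logs.py, sketched with pandas: shape each prediction as a record an observability platform expects, then trigger retraining when live accuracy degrades. The Arize SDK call itself is omitted here, and the accuracy threshold and retraining hook are hypothetical:

```python
# Shape prediction logs for an observability backend and apply a simple
# accuracy-based retraining trigger. Threshold and trigger are made up.
import uuid
from datetime import datetime, timezone

import pandas as pd

records = pd.DataFrame({
    "prediction_id": [str(uuid.uuid4()) for _ in range(3)],
    "prediction_ts": [datetime.now(timezone.utc)] * 3,
    "feature_score": [0.12, 0.87, 0.55],
    "predicted_label": [0, 1, 1],
    "actual_label": [0, 1, 0],  # ground truth arriving via the feedback loop
})

# In the real script, `records` would be handed to the Arize client here.

accuracy = (records.predicted_label == records.actual_label).mean()
if accuracy < 0.8:  # hypothetical threshold; a real loop might kick off a training pipeline
    print(f"Accuracy {accuracy:.0%} below threshold; triggering retraining job")
```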