
AsafJosefMor/Model-Assurance-Tools-Validation-Demo


Model Assurance Demo

Model Assurance Demo is a hands-on tryout of service tools for validating and monitoring machine learning models in a web-based SaaS environment.

What's Inside?

  • infra/: Securely fetch secrets from AWS and provision cloud resources.
  • deepchecks_demo/: Simple Python scripts to check for data drift and model robustness, with hands-on examples.
  • lakera_integration/: A lightweight proxy that applies safety, privacy, and bias checks on LLM calls.
  • robust_integration/: Configuration for running adversarial and fairness tests in CI.
  • arize_feedback_loop/: Code to stream predictions into Arize for observability and auto-trigger retraining when needed.
  • .github/workflows/: GitHub Actions to automate the “model-test” pipeline.
  • docs/: Explanations of architecture and data formats.

Getting Started

Load secrets (so no sensitive info ends up in code):

python infra/aws_secrets.py
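The script above is not reproduced in this README; as a minimal sketch of what a secrets loader like infra/aws_secrets.py might do, the snippet below fetches a JSON secret from AWS Secrets Manager with boto3 and exports its keys as environment variables. The secret name "model-assurance/demo" and the key layout are assumptions, not taken from the repo.

```python
"""Hedged sketch of a secrets loader (assumed secret name and layout)."""
import json
import os


def export_secrets(secret_string: str) -> dict:
    """Parse a JSON secret payload and export each key as an env var."""
    secrets = json.loads(secret_string)
    for key, value in secrets.items():
        os.environ[key] = str(value)  # keeps credentials out of source code
    return secrets


if __name__ == "__main__":
    import boto3  # real AWS SDK call; requires configured credentials

    client = boto3.client("secretsmanager")
    response = client.get_secret_value(SecretId="model-assurance/demo")
    export_secrets(response["SecretString"])
```

Downstream scripts can then read configuration from `os.environ` instead of hard-coded values.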

Try local checks:

  • Go to deepchecks_demo/
  • Create the environment and run:
    conda env create -f deepchecks_env.yml
    conda activate deepchecks_demo
    python checks/data_drift_check.py
    python checks/adversarial_perturbation_check.py
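The Deepchecks scripts above are not shown here; to illustrate the idea behind a data drift check, the sketch below computes one common drift signal, the Population Stability Index (PSI), using only NumPy. The 0.2 threshold is a widely used rule of thumb, not a value taken from the repo.

```python
"""Simplified stand-in for a data drift check (PSI), not the Deepchecks API."""
import numpy as np


def psi(reference: np.ndarray, current: np.ndarray, bins: int = 10) -> float:
    """PSI between two samples of one feature; > 0.2 commonly flags drift."""
    edges = np.histogram_bin_edges(reference, bins=bins)
    ref_frac = np.histogram(reference, bins=edges)[0] / len(reference)
    cur_frac = np.histogram(current, bins=edges)[0] / len(current)
    # floor the fractions to avoid log(0) on empty bins
    ref_frac = np.clip(ref_frac, 1e-6, None)
    cur_frac = np.clip(cur_frac, 1e-6, None)
    return float(np.sum((cur_frac - ref_frac) * np.log(cur_frac / ref_frac)))


rng = np.random.default_rng(0)
train = rng.normal(0.0, 1.0, 5000)
drifted = rng.normal(0.5, 1.0, 5000)  # mean shift simulates serving drift
print(f"PSI, shifted distribution: {psi(train, drifted):.3f}")
```

A real check (as in the Deepchecks scripts) would run this per feature and aggregate the results into a pass/fail report.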

Pilot in the cloud:

cd lakera_integration
python policy_proxy.py
curl -X POST "localhost:8080/generate?policy=privacy" \
  -H "Content-Type: application/json" \
  -d '{"prompt":"Hello, world!"}'
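policy_proxy.py itself is not reproduced here; as a toy illustration of what a "privacy" policy check might do before a prompt reaches the LLM, the sketch below redacts email addresses and phone-like numbers. The patterns and function name are illustrative only, not the Lakera API.

```python
"""Toy pre-flight privacy filter (illustrative patterns, not Lakera's)."""
import re

EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
PHONE = re.compile(r"\+?\d[\d\s().-]{7,}\d")


def apply_privacy_policy(prompt: str) -> str:
    """Replace obvious PII spans with placeholder tokens."""
    prompt = EMAIL.sub("[REDACTED_EMAIL]", prompt)
    prompt = PHONE.sub("[REDACTED_PHONE]", prompt)
    return prompt


print(apply_privacy_policy("Contact jane@example.com or +1 555 123 4567"))
```

A proxy would run checks like this on every request, then forward the sanitized prompt to the model.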

Automate in CI/CD: push to GitHub and watch the model-test workflow run.

Monitor & Retrain:

cd arize_feedback_loop
python send_logs.py
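send_logs.py streams predictions to Arize; the SDK call itself is omitted here. As a sketch only, the snippet below shows the general shape a prediction record might take before it is sent — the field names are assumptions, not Arize's schema.

```python
"""Hedged sketch of a prediction log record (field names are assumed)."""
import json
import uuid
from datetime import datetime, timezone


def build_prediction_record(features: dict, prediction: str,
                            model_version: str) -> dict:
    return {
        "prediction_id": str(uuid.uuid4()),  # lets actuals be joined later
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "model_version": model_version,
        "features": features,
        "prediction": prediction,
    }


record = build_prediction_record({"amount": 42.0}, "approve", "v1")
print(json.dumps(record, indent=2))
```

Keeping a stable `prediction_id` is what lets an observability platform join delayed ground-truth labels back to each prediction and trigger retraining when quality drops.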
