PII Filter


A proxy server to filter Personally Identifiable Information (PII) from requests and responses to a Large Language Model (LLM).

How it works

This proxy sits between your application and an LLM API (such as OpenAI's). It intercepts each request, applies the configured replacements and banned-term checks to the prompt, and then forwards the request to the LLM. The response from the LLM is filtered the same way before it is sent back to your application.

The filtering is configured through a YAML file located at ~/.llm_filter/filters.yaml.
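The filtering step can be sketched roughly as follows. This is an illustrative helper, not the project's actual code (the real implementation lives inside the `pii_filter` package): it applies each configured replacement, then rejects any text that still contains a banned term.

```python
import re

def filter_text(text, replacements, banned, case_sensitive=False):
    """Sketch of the proxy's filtering: substitute replacements,
    then raise if any banned term remains in the text."""
    flags = 0 if case_sensitive else re.IGNORECASE
    for needle, stand_in in replacements.items():
        text = re.sub(re.escape(needle), stand_in, text, flags=flags)
    for term in banned:
        if re.search(re.escape(term), text, flags):
            raise ValueError(f"banned term found: {term!r}")
    return text

print(filter_text(
    "Mail john.doe@example.com today",
    replacements={"john.doe@example.com": "[REDACTED]"},
    banned=[],
))
# → Mail [REDACTED] today
```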

Getting Started

Installation

  1. Clone the repository:

    git clone https://github.com/doublej/pii-filter-proxy.git
    cd pii-filter-proxy

  2. Install the dependencies:

    uv sync --extra dev

Configuration

Create a file at ~/.llm_filter/filters.yaml with the following structure:

replacements:
  "John Doe": "[REDACTED]"
  "john.doe@example.com": "[REDACTED]"

banned:
  - "secret"
  - "password"

flags:
  case_sensitive: false
  log_filtered: true
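To illustrate how the flags might interact with matching (the exact semantics are defined by the proxy, so treat this as a sketch): with case_sensitive set to false, "Password" matches the banned term "password".

```python
# Illustrative only: how case_sensitive could affect banned-term matching.
config = {
    "replacements": {"John Doe": "[REDACTED]"},
    "banned": ["secret", "password"],
    "flags": {"case_sensitive": False, "log_filtered": True},
}

def contains_banned(text, config):
    banned = config["banned"]
    if not config["flags"]["case_sensitive"]:
        text = text.lower()
        banned = [term.lower() for term in banned]
    return any(term in text for term in banned)

print(contains_banned("My Password is set", config))
# → True
```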

Environment Variables

The proxy supports the following environment variables:

  • OPENAI_API_KEY: OpenAI API key for upstream requests
  • OPENAI_ENDPOINT: Custom OpenAI endpoint (defaults to https://api.openai.com)
  • PII_FILTER_DOTENV_PATH: Path to your project's .env file (optional)
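For example, to run the proxy against the default upstream with an explicit key (placeholder values shown):

```shell
export OPENAI_API_KEY="<your-api-key>"            # placeholder; substitute your real key
export OPENAI_ENDPOINT="https://api.openai.com"   # optional; this is the default
uvicorn pii_filter.proxy.server:app
```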

Using Your Project's .env File

You can configure the proxy to use your project's environment variables in two steps: start the proxy, then tell it where the file lives:

# Start the proxy
uvicorn pii_filter.proxy.server:app

# Configure it to use your project's .env file
curl -X POST http://127.0.0.1:8000/api/env \
  -H "Content-Type: application/json" \
  -d '{"path": "/path/to/your/project/.env"}'

Running the Proxy

To start the proxy server:

uvicorn pii_filter.proxy.server:app --reload

This will start the proxy on http://127.0.0.1:8000.

If no filters are configured, the proxy will return a 400 Bad Request error.
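Once the proxy is running, point your application at it instead of the upstream API. As a sketch (constructed here without a live server, so the model name is a hypothetical placeholder), a standard OpenAI-style chat payload sent to the proxy's endpoint works unchanged:

```python
import json

# The proxy exposes an OpenAI-compatible endpoint, so an ordinary
# chat-completions payload can be sent to it as-is.
proxy_url = "http://127.0.0.1:8000/v1/chat/completions"
payload = {
    "model": "gpt-4o-mini",  # hypothetical; use whatever your upstream supports
    "messages": [
        {"role": "user", "content": "Summarise this note from John Doe"},
    ],
}
body = json.dumps(payload)
# e.g. with requests (not shown):
#   requests.post(proxy_url, data=body,
#                 headers={"Content-Type": "application/json"})
```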

API Endpoints

  • POST /v1/chat/completions - OpenAI-compatible chat completions with PII filtering
  • POST /api/env - Set .env file path: {"path": "/path/to/.env"}
  • GET /api/config - Get current filtering configuration
  • POST /api/config - Update filtering configuration

Using the CLI

You can also manage the configuration through the CLI:

# Show the current configuration
python -m pii_filter_proxy.cli config --show

# Add a banned word
python -m pii_filter_proxy.cli config --add-banned "new secret"

# Add a replacement
python -m pii_filter_proxy.cli config --add-replacement "Jane Doe" "[REDACTED]"

Development

This project uses pre-commit to enforce code style and quality. To set it up, run:

pre-commit install

Repository initiated with fpgmaas/cookiecutter-uv.
