# LLM Unify

> A local-first, privacy-focused tool for managing conversations across multiple LLM platforms

LLM Unify consolidates your AI conversation history from multiple platforms into a single, searchable, offline database. Your chats with ChatGPT, Claude, Gemini, and Copilot are all accessible through one interface, stored locally on your machine, under your control.
```bash
# Import your ChatGPT export
llm-unify import chatgpt ~/Downloads/conversations.json

# Search across all providers
llm-unify search "async rust tokio"

# Browse in terminal UI
llm-unify tui
```

**The Problem:** Your AI conversations are scattered across platforms, locked in proprietary formats, subject to deletion, and not easily searchable.
**The Solution:** A unified local archive that:

- Preserves your conversations permanently on your machine
- Unifies multiple providers into one searchable database
- Protects your privacy - no data ever leaves your computer
- Empowers you to search, export, and analyze your AI interactions
| Feature | Description |
|---|---|
| Multi-Platform Import | ChatGPT, Claude, Gemini, and GitHub Copilot - all four major providers |
| Full-Text Search | SQLite FTS5-powered search with snippet highlighting |
| Terminal UI | Ratatui-based browser with vim-style bindings |
| 14 CLI Commands | `import`, `list`, `show`, `search`, `delete`, `export`, `stats`, `validate`, `backup`, `restore`, `init`, `schema`, `tui`, `version` |
| Local-First Storage | SQLite database with WAL mode and zero network dependencies |
| Type-Safe Rust | Idiomatic Rust with zero `unsafe` blocks |
| Schema Versioning | Automatic migrations with full version history |
| Backup Integrity | SHA-256 checksums with atomic operations and validation |
| Connection Pooling | Efficient SQLite access with up to 5 concurrent connections |
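The snippet highlighting that `search` returns is produced by SQLite FTS5's `snippet()` function. As a rough, dependency-free illustration of what that output looks like (the `snippet_around` helper and its `**` markers below are hypothetical, not llm-unify's API):

```rust
/// ASCII-oriented sketch of snippet highlighting: wrap the first occurrence
/// of `term` in `**` markers and keep up to `ctx` bytes of context on each
/// side, with ellipses where the snippet is truncated. Hypothetical helper;
/// llm-unify delegates this work to FTS5.
fn snippet_around(content: &str, term: &str, ctx: usize) -> Option<String> {
    let start = content.find(term)?;
    let end = start + term.len();
    let from = start.saturating_sub(ctx);
    let to = (end + ctx).min(content.len());
    Some(format!(
        "{}{}**{}**{}{}",
        if from > 0 { "..." } else { "" },
        &content[from..start],
        term,
        &content[end..to],
        if to < content.len() { "..." } else { "" },
    ))
}
```

FTS5's real `snippet()` additionally ranks which token run to excerpt; this sketch only shows the shape of the result.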
Planned next:

- Database encryption (SQLCipher)
- Export file encryption (age/GPG)
- Enhanced TUI with message viewing
- 80%+ test coverage
See ROADMAP.adoc for the full development plan.
```bash
# ChatGPT (export from Settings → Data Controls → Export)
llm-unify import chatgpt ./conversations.json

# Claude (export from Anthropic Console → Settings → Export)
llm-unify import claude ./claude-export.json

# Gemini (use a browser extension such as Gem Chat Exporter)
llm-unify import gemini ./gemini-export.json

# GitHub Copilot (VS Code: "Chat: Export Chat..." command)
llm-unify import copilot ./copilot-chat.json
```

```bash
# Search all conversations
llm-unify search "machine learning"

# Limit results
llm-unify search "rust async" --limit 5

# List all conversations
llm-unify list

# Filter by provider
llm-unify list --provider chatgpt
```

```bash
llm-unify tui
```

| Key | Action |
|---|---|
| `j` | Move down |
| `k` | Move up |
| `/` | Search mode |
| `Enter` | Select conversation |
| `q` | Quit |
LLM Unify is structured as a Rust workspace with 6 focused crates:
```
llm-unify/
├── crates/
│   ├── llm-unify-core/      # Domain models, traits, errors (179 LOC)
│   ├── llm-unify-storage/   # SQLite persistence layer (322 LOC)
│   ├── llm-unify-parser/    # Provider import parsers (246 LOC)
│   ├── llm-unify-search/    # Full-text search engine (130 LOC)
│   ├── llm-unify-cli/       # Command-line interface (261 LOC)
│   └── llm-unify-tui/       # Terminal UI (216 LOC)
├── docs/                    # Additional documentation
├── .well-known/             # RFC 9116 compliant metadata
└── ...
```

```rust
// Provider enumeration
pub enum Provider {
    ChatGpt,
    Claude,
    Gemini,
    Copilot,
    Other(String),
}

// Conversation structure
pub struct Conversation {
    pub id: String,
    pub title: String,
    pub provider: Provider,
    pub messages: Vec<Message>,
    pub created_at: DateTime<Utc>,
    pub updated_at: DateTime<Utc>,
}

// Message structure
pub struct Message {
    pub id: String,
    pub role: MessageRole, // User, Assistant, System
    pub content: String,
    pub timestamp: DateTime<Utc>,
    pub metadata: Metadata,
}
```

```sql
-- Conversations table
CREATE TABLE conversations (
    id TEXT PRIMARY KEY,
    title TEXT NOT NULL,
    provider TEXT NOT NULL,
    created_at TEXT NOT NULL,
    updated_at TEXT NOT NULL
);

-- Messages table with FTS5 index
CREATE TABLE messages (
    id TEXT PRIMARY KEY,
    conversation_id TEXT NOT NULL,
    role TEXT NOT NULL,
    content TEXT NOT NULL,
    timestamp TEXT NOT NULL,
    metadata TEXT,
    FOREIGN KEY (conversation_id) REFERENCES conversations(id)
);

-- Full-text search (external-content FTS5 table over messages.content)
CREATE VIRTUAL TABLE messages_fts USING fts5(
    content,
    content=messages,
    content_rowid=rowid
);
```

| Command | Description |
|---|---|
| `init` | Initialize the database |
| `import` | Import conversations from a provider export |
| `list` | List all conversations, optionally filtered |
| `show` | Display a specific conversation |
| `search` | Full-text search across messages |
| `delete` | Remove a conversation |
| `export` | Export a conversation to JSON |
| `stats` | Show database statistics |
| `validate` | Check database integrity (SQLite + data consistency) |
| `backup` | Create a database backup with SHA-256 checksum |
| `restore` | Restore from a backup with integrity verification |
| `schema` | Display schema version and migration history |
| `tui` | Launch the terminal UI |
| `version` | Show version information (schema, backup, export formats) |
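The `provider` column in the schema above stores the `Provider` enum as plain text, and the CLI accepts lowercase provider names (`llm-unify list --provider chatgpt`). A minimal std-only sketch of that string round-trip — the exact stored values are assumed from the CLI examples, not taken from llm-unify's source:

```rust
// Sketch of Provider <-> TEXT mapping. The lowercase slugs mirror the CLI
// examples ("chatgpt", "claude", ...); the real column values may differ.
#[derive(Debug, Clone, PartialEq)]
enum Provider {
    ChatGpt,
    Claude,
    Gemini,
    Copilot,
    Other(String),
}

impl Provider {
    /// Value written to the `provider` TEXT column.
    fn as_str(&self) -> &str {
        match self {
            Provider::ChatGpt => "chatgpt",
            Provider::Claude => "claude",
            Provider::Gemini => "gemini",
            Provider::Copilot => "copilot",
            Provider::Other(s) => s,
        }
    }

    /// Parse a stored value back; unknown names fall through to `Other`,
    /// so rows from future providers still load.
    fn from_str(s: &str) -> Provider {
        match s {
            "chatgpt" => Provider::ChatGpt,
            "claude" => Provider::Claude,
            "gemini" => Provider::Gemini,
            "copilot" => Provider::Copilot,
            other => Provider::Other(other.to_string()),
        }
    }
}
```

The `Other(String)` arm is what lets the schema stay stable when a new provider appears: the database never rejects an unknown provider string.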
**Protections:**

- Zero unsafe code - memory safety verified by the Rust compiler
- Parameterized queries - SQL injection prevention via SQLx
- Backup integrity - SHA-256 checksums with pre-restore validation
- Atomic operations - WAL mode prevents corruption during writes
- No telemetry - zero data collection or phone-home behavior
- Local-only - all data stays on your machine
- Open source - fully auditable codebase

**Known limitations:**

- Database is not encrypted at rest (use disk encryption)
- Export files are plaintext (encryption coming in v0.2)
- Relies on filesystem permissions for access control
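The backup-integrity guarantees above combine pre-restore validation with an atomic replace. A dependency-free sketch of that pattern — the `checksum` stand-in is a trivial rolling sum so the example needs no crates (the real tool uses SHA-256), and both function names are hypothetical:

```rust
use std::fs;
use std::io;
use std::path::Path;

// Stand-in checksum to keep the sketch dependency-free; llm-unify records
// a SHA-256 digest for real backups.
fn checksum(bytes: &[u8]) -> u64 {
    bytes
        .iter()
        .fold(0u64, |acc, &b| acc.wrapping_mul(31).wrapping_add(b as u64))
}

// Validate the backup against its recorded checksum, then replace the live
// database atomically: write a temporary sibling file and rename it over
// the target (POSIX rename semantics), so a crash mid-restore never leaves
// a half-written database behind.
fn restore(backup: &Path, expected: u64, target: &Path) -> io::Result<()> {
    let data = fs::read(backup)?;
    if checksum(&data) != expected {
        // Pre-restore validation: refuse to touch the live database.
        return Err(io::Error::new(io::ErrorKind::InvalidData, "checksum mismatch"));
    }
    let tmp = target.with_extension("tmp");
    fs::write(&tmp, &data)?;
    fs::rename(&tmp, target)
}
```

The key design point is ordering: the checksum is verified before any write to the target path, and the final `rename` is the only step that can make the restored file visible.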
See SECURITY.md for vulnerability reporting.
| Standard | Status |
|---|---|
| RSR (Rhodium Standard Repository) | 🥈 Silver (51/55 points) |
| TPCF (Trusted Perimeter Classification) | Perimeter 3 (Community Sandbox) |
| RFC 9116 | Compliant (`security.txt`) |
| REUSE | SPDX license headers |
| Keep a Changelog | v1.0.0 format |
| Semantic Versioning | v2.0.0 spec |
```bash
# Clone the repository
git clone https://github.com/Hyperpolymath/llm-unify.git
cd llm-unify

# Build all crates
just build

# Run tests
just test

# Run lints
just lint

# Full CI simulation
just ci

# See all recipes
just --list
```

We welcome contributions! See CONTRIBUTING.md for:
- Development workflow
- Coding standards
- PR process
- Path to maintainership
| Document | Purpose |
|---|---|
| ROADMAP.adoc | Development roadmap and feature backlog |
| CHANGELOG.md | Release history |
| CONTRIBUTING.md | How to contribute |
| SECURITY.md | Security policy and reporting |
| CODE_OF_CONDUCT.md | Community standards |
| GOVERNANCE.md | Governance model |
| | Rhodium Standard compliance report |
| CITATION.cff | Citation formats for academic use |
This project is dual-licensed:
- AGPL-3.0-or-later for the codebase - strong copyleft ensuring user freedom
- Palimpsest Protocol for data portability - your data, your rights
See LICENSE and LICENSE-PALIMPSEST for details.
- TPCF Perimeter: 3 (Community Sandbox)
- Code of Conduct: Contributor Covenant 2.1 + CCCP extensions
- Governance: consensus-seeking with voting fallback
See .well-known/humans.txt for credits and attribution.
No AI training on user data. See .well-known/ai.txt for our AI policy.