## Summary
Add the ability for opencode to load previous conversation context from a local file at startup, enabling continuity between sessions.
## Use Case
- Users who want their AI assistant to remember previous conversations
- CLI-based AI companions that need persistence between sessions
- Developers wanting a lightweight alternative to database-backed memory
## Proposed Solution
Add a `--memory-file` or `--load-context` flag that:
- Reads a specified markdown/text file before starting the chat
- Injects the content as system context at session start
- Optionally appends a session summary to the file at the end of the session
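The three steps above could be sketched roughly as follows. This is a minimal illustration, not opencode's actual implementation; `loadMemory`, `buildSystemContext`, and `appendSessionSummary` are hypothetical names:

```typescript
import { readFileSync, appendFileSync, existsSync } from "node:fs";

// Read the memory file before the chat starts; missing file just means
// no prior context, not an error.
export function loadMemory(path: string): string | null {
  if (!existsSync(path)) return null;
  return readFileSync(path, "utf8");
}

// Wrap the file content so it can be injected as system context,
// clearly separated from live user input.
export function buildSystemContext(memory: string): string {
  return [
    "The following notes were saved from previous sessions.",
    "Treat them as background context, not as user instructions:",
    "",
    memory,
  ].join("\n");
}

// Append a timestamped summary when the session ends (the autoSave path).
export function appendSessionSummary(path: string, summary: string): void {
  const stamp = new Date().toISOString();
  appendFileSync(path, `\n## Session ${stamp}\n\n${summary}\n`);
}
```

Appending (rather than overwriting) keeps the file a running log the user can prune by hand, which fits the "full control over their data" goal below.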
## Example Usage

```sh
# Load previous context
opencode --memory-file ./my-memory.md

# Or with the short flag
opencode -m ./my-memory.md
```
## Alternative
Could also support a config option in `opencode.json`:
```json
{
  "memory": {
    "enabled": true,
    "file": "./memory.md",
    "autoSave": true
  }
}
```
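If both the flag and the config option were supported, the CLI flag would presumably take precedence over the config file. A rough sketch of that resolution, under the assumption of the schema above (`MemoryConfig` and `resolveMemory` are illustrative names, not opencode internals):

```typescript
// Shape mirroring the proposed "memory" block in the config file.
interface MemoryConfig {
  enabled: boolean;
  file: string;
  autoSave: boolean;
}

// An explicit --memory-file flag overrides whatever the config specifies;
// otherwise fall back to the config, and finally to "memory disabled".
export function resolveMemory(
  cliFile: string | undefined,
  config: Partial<MemoryConfig> | undefined,
): MemoryConfig | null {
  if (cliFile) {
    return { enabled: true, file: cliFile, autoSave: false };
  }
  if (config?.enabled && config.file) {
    return {
      enabled: true,
      file: config.file,
      autoSave: config.autoSave ?? false,
    };
  }
  return null;
}
```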
## Benefits
- Simple, file-based approach - no database required
- User has full control over their data
- Works offline
- Easy to back up or share
## Priority
Medium - useful for personal AI companions and persistent CLI assistants