
Mem-LLM

PyPI · Python 3.8+ · MIT License

Mem-LLM is a privacy-first Python framework for building memory-enabled AI assistants that run locally.

What's New in v2.4.6

  • Fixed critical memory, tool parsing, and backend compatibility issues.
  • Improved SQL ordering and thread-safety behavior.
  • Added missing runtime dependencies (psutil, networkx).
  • Updated backend defaults:
    • Ollama: granite4:3b
    • LM Studio: google/gemma-3-12b

Quick Start

Install

pip install mem-llm

Ollama

from mem_llm import MemAgent

agent = MemAgent(backend="ollama", model="granite4:3b")
agent.set_user("alice")
print(agent.chat("My name is Alice."))
print(agent.chat("What is my name?"))

LM Studio

from mem_llm import MemAgent

agent = MemAgent(backend="lmstudio", model="google/gemma-3-12b")
agent.set_user("alice")
print(agent.chat("Summarize Python in one sentence."))
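In both quickstarts, set_user scopes memory to a user id. Conceptually, per-user persistent memory amounts to storing and retrieving messages keyed by that id. The following is a minimal, library-independent SQLite sketch of the idea; mem-llm's actual schema and API will differ:

```python
import sqlite3

class UserMemory:
    """Toy per-user message store backed by SQLite (illustration only)."""

    def __init__(self, path=":memory:"):
        self.db = sqlite3.connect(path)
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS messages "
            "(user_id TEXT, role TEXT, content TEXT)"
        )

    def add(self, user_id, role, content):
        # Append one message to the given user's history.
        self.db.execute(
            "INSERT INTO messages (user_id, role, content) VALUES (?, ?, ?)",
            (user_id, role, content),
        )
        self.db.commit()

    def history(self, user_id):
        # Return (role, content) pairs in insertion order.
        return self.db.execute(
            "SELECT role, content FROM messages WHERE user_id = ? ORDER BY rowid",
            (user_id,),
        ).fetchall()

mem = UserMemory()
mem.add("alice", "user", "My name is Alice.")
mem.add("bob", "user", "My name is Bob.")
print(mem.history("alice"))  # only Alice's messages come back
```

Keying every read and write by user_id is what lets two users share one agent process without their histories leaking into each other.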

Core Features

  • Persistent memory per user (JSON or SQLite)
  • Multi-backend support (Ollama, LM Studio)
  • Tool calling system (@tool, built-in tools, validation)
  • Streaming responses
  • Knowledge base integration
  • Conversation analytics
  • REST API and Web UI
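The tool-calling entry above refers to an @tool decorator. As an illustration only (mem-llm's real decorator, validation, and dispatch logic may differ), a registry-based design can be sketched as:

```python
TOOLS = {}

def tool(fn):
    # Register fn in the global tool table under its function name.
    TOOLS[fn.__name__] = fn
    return fn

@tool
def add_numbers(a: float, b: float) -> float:
    """Add two numbers."""
    return a + b

def call_tool(name, **kwargs):
    # Dispatch by name, rejecting tools that were never registered.
    if name not in TOOLS:
        raise KeyError(f"unknown tool: {name}")
    return TOOLS[name](**kwargs)

print(call_tool("add_numbers", a=2, b=3))
```

The decorator keeps registration declarative: any function marked with @tool becomes callable by name, which is the shape an LLM's structured tool-call output needs.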

Repository Layout

  • Memory LLM/ - main package source and release files
  • quickstart/ - step-by-step usage examples & tutorials

License

Mem-LLM is released under the MIT License.

About

Mem-LLM is a Python library for building memory-enabled AI assistants that run entirely on local LLMs. It combines persistent multi-user conversation history with configurable knowledge bases, storage backends, and Ollama model support, making it well suited to privacy-first, production-ready workflows.
