
ChatOllama — Local LLM Chatbot UI (Powered by Ollama + Streamlit)

ChatOllama is a simple, modular chatbot UI built with Streamlit and Ollama, intended for offline, educational, or personal experiments only.

It allows multiple users to:

  • Sign up / log in
  • Create and manage multiple chat conversations
  • Chat with a local Ollama-compatible model
  • Persist chats to disk
  • Clear or delete chats easily

Example Screenshot


Features

  • Multi-user login and signup
  • Conversation history per user
  • Streamed responses from a local Ollama model
  • Memory of the last N messages per chat
  • Clear and delete chat options
  • Modular, maintainable architecture

⚠️ Security Notice
This implementation is for educational and local use only; it is not secure for production.
Authentication is stored in plain text in a local JSON file (data/users.json).
You should integrate proper password hashing (e.g., bcrypt) and session management before any deployment.
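As a starting point for replacing the plain-text passwords, here is a minimal sketch of salted password hashing using only the Python standard library (PBKDF2 via hashlib); bcrypt from the bcrypt package works similarly. The function names are illustrative, not part of this project's code:

```python
import hashlib
import hmac
import os

def hash_password(password: str) -> str:
    """Return 'salt:digest' (hex) using PBKDF2-HMAC-SHA256."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt.hex() + ":" + digest.hex()

def verify_password(password: str, stored: str) -> bool:
    """Re-derive the digest with the stored salt and compare in constant time."""
    salt_hex, digest_hex = stored.split(":")
    salt = bytes.fromhex(salt_hex)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(digest.hex(), digest_hex)
```

Storing only `hash_password(...)` output in data/users.json (never the raw password) would address the most serious issue noted above; session management is a separate concern.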


Project Structure


chatollama/
│
├── app.py                # Main Streamlit app
├── config.py             # Configurations (model name, constants)
│
├── utils/
│   ├── storage.py        # Load/save user data
│   ├── chat_engine.py    # Model chat logic (Ollama API)
│   └── ui_components.py  # Login/signup + sidebar UI
│
└── data/
    └── users.json        # Auto-created user database


Requirements

You’ll need:

  • Python 3.10+
  • uv (modern Python environment manager)
  • Ollama installed and running locally
    (with any Ollama-compatible model)

Installation (using uv)

# Clone the repository
git clone https://github.com/<your-username>/chatollama.git
cd chatollama

# Create and sync environment
uv sync

# Run Ollama in the background
ollama serve

# Pull an Ollama-compatible model (if not already installed)
ollama pull gemma3:270m

# Launch Streamlit app
uv run streamlit run app.py

Configuration

You can customize app behavior in config.py:

USER_DB_PATH = "data/users.json"
MODEL_NAME = "gemma3:270m"
LAST_N_MESSAGES_MEMORY = 5
APP_TITLE = "ChatGemma-270m"

To change models, just modify MODEL_NAME (e.g., "llama3:8b" or "mistral:7b").
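The chat logic in utils/chat_engine.py is not shown here, but the combination of "streamed responses" and "memory of the last N messages" could be sketched as below, assuming the official ollama Python client; the function names are hypothetical:

```python
def trim_history(messages, n=5):
    """Keep only the most recent n messages as model context
    (mirrors LAST_N_MESSAGES_MEMORY in config.py)."""
    return messages[-n:]

def stream_reply(messages, model="gemma3:270m"):
    """Yield response text chunks from a locally running Ollama server."""
    import ollama  # deferred import: trim_history works without Ollama installed

    for chunk in ollama.chat(model=model, messages=trim_history(messages), stream=True):
        yield chunk["message"]["content"]
```

In Streamlit, the generator pairs naturally with st.write_stream(stream_reply(history)), which renders chunks as they arrive.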


Data Storage & Authentication

All user data and chat history are stored locally in data/users.json. Passwords are currently stored in plain text — this is not secure.

This design is intended for offline, educational, or personal experiments only. If deploying publicly, you must:

  • Use password hashing (e.g., bcrypt, passlib)
  • Add CSRF/session handling
  • Secure data with encryption or a proper database

Extend or Modify

  • Add new UI elements → utils/ui_components.py
  • Add new model integrations → utils/chat_engine.py
  • Adjust configuration or model name → config.py
  • Modify data schema → utils/storage.py
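If you modify the data schema, the load/save layer in utils/storage.py presumably reads and writes the single JSON file at USER_DB_PATH. A minimal sketch of such helpers (the exact functions in this repo may differ):

```python
import json
from pathlib import Path

def load_users(path="data/users.json"):
    """Load the user database, returning an empty dict if it doesn't exist yet."""
    p = Path(path)
    if not p.exists():
        return {}
    return json.loads(p.read_text(encoding="utf-8"))

def save_users(users, path="data/users.json"):
    """Write the user database, auto-creating the data/ directory."""
    p = Path(path)
    p.parent.mkdir(parents=True, exist_ok=True)
    p.write_text(json.dumps(users, indent=2), encoding="utf-8")
```

Keeping all persistence behind two functions like these makes it straightforward to swap the JSON file for SQLite or another database later.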

Future Ideas

  • Add retrieval-augmented generation (RAG) support
  • Deploy proper authentication
  • Add model selector (Gemma, LLaMA, Mistral, etc.)

License

MIT License
