Lightweight Ruby gem for interacting with locally running Ollama LLMs, with streaming, chat, and full offline privacy. (Ruby · updated Dec 6, 2025)
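A gem like the one described above typically wraps Ollama's local REST API. The sketch below shows the general idea using only the Ruby standard library and Ollama's documented `/api/chat` endpoint (default port 11434); the gem's actual class names and interface are not shown in the description, so everything here is an illustrative assumption, not the gem's API.

```ruby
require "json"
require "net/http"
require "uri"

# Ollama's default local endpoint for chat completions (assumption: a
# stock local install; the gem may make this configurable).
OLLAMA_URL = URI("http://localhost:11434/api/chat")

# Build the JSON payload for a chat request. With "stream" true, Ollama
# responds with newline-delimited JSON objects, one per partial token.
def chat_payload(model, messages, stream: true)
  { model: model, messages: messages, stream: stream }.to_json
end

# Stream a chat completion, yielding each partial piece of the reply.
# Note: a robust client would buffer chunks that split mid-line; this
# sketch assumes each chunk contains whole JSON lines.
def stream_chat(model, messages)
  Net::HTTP.start(OLLAMA_URL.host, OLLAMA_URL.port) do |http|
    req = Net::HTTP::Post.new(OLLAMA_URL, "Content-Type" => "application/json")
    req.body = chat_payload(model, messages)
    http.request(req) do |res|
      res.read_body do |chunk|
        chunk.each_line do |line|
          part = JSON.parse(line)
          yield part.dig("message", "content").to_s unless part["done"]
        end
      end
    end
  end
end

# Usage (requires a running local Ollama server and a pulled model):
#   stream_chat("llama3", [{ role: "user", content: "Hello" }]) { |t| print t }
```

Because everything runs against `localhost`, no prompt or completion ever leaves the machine, which is where the "full offline privacy" claim comes from.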
A unified offline AI studio for text, image, audio, and video generation — all running locally on your machine, with no internet or cloud required.
Setup guide for an AI mini PC: hosting local LLMs via LM Studio in an RDP/headless-GUI configuration. The example hardware is a Minisforum AI X1 Pro (AMD Ryzen AI 9 HX 370, 64 GB RAM).
Cross-platform desktop tool for chaining local AI models and plugins into powerful, agentic workflows. It supports prompt-driven orchestration, visual DAG editing, and full offline execution.