Karpathy’s LLM Wiki, 100% local with Ollama. Drop Markdown notes → AI extracts concepts → your Obsidian wiki auto-links and grows. Zero sharing. Your notes stay yours.
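The pipeline this blurb describes (notes in, concepts extracted, wiki auto-linked) can be sketched in a few lines. A minimal, hypothetical sketch: the concept-extraction step is assumed to have already been done by a local LLM (e.g. via Ollama), and `auto_link` is an illustrative name, not any of these repos' actual APIs. It wraps the first occurrence of each known concept in an Obsidian `[[wikilink]]`, skipping text that is already linked.

```python
import re

def auto_link(note: str, concepts: list[str]) -> str:
    """Wrap the first occurrence of each known concept in an
    Obsidian [[wikilink]], leaving existing links untouched.
    (Hypothetical sketch; concept names are assumed to come
    from an upstream LLM extraction step.)"""
    # Longest names first, so "LLM Wiki" is linked before "LLM".
    for concept in sorted(concepts, key=len, reverse=True):
        pattern = re.compile(
            r"(?<!\[\[)\b" + re.escape(concept) + r"\b(?!\]\])",
            re.IGNORECASE,
        )
        note = pattern.sub(f"[[{concept}]]", note, count=1)
    return note

print(auto_link("Ollama runs the model locally.", ["Ollama"]))
# → [[Ollama]] runs the model locally.
```

The lookaround assertions keep the function idempotent: re-running it over an already-linked note changes nothing.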
Updated Apr 17, 2026 - Python
LLM-powered knowledge base from your Claude Code, Codex CLI, Copilot, Cursor & Gemini sessions. Karpathy's LLM Wiki pattern — implemented and shipped.
A git template for building your own LLM-powered personal wiki. Training period, metadata standard, lint system included. Clone and go.
A Personal Secretary OS for Claude Code — persistent memory, rituals, dual-domain wiki. $0 extra cost on your existing subscription.
Autonomous knowledge base plugin for Claude Code - captures research, ideas, and decisions into an interlinked wiki with research-on-miss, semantic search, and a Wikipedia-style web UI. Knowledge compounds as you work.
Andrej Karpathy's LLM Wiki pattern as a Claude Code plugin — turn accumulated sources into a self-maintaining, scalable markdown knowledge base.
"Advanced" agentic dev patterns — we made the mistakes in production so you can make them in staging.
Let your AI agent read, search, and build on your Obsidian notes. MCP + WebSocket + filesystem fallback. Inspired by Karpathy's LLM Wiki.
Obsidian-first LLM Wiki skill pack with ontology-ready bootstrap, canonical JSONL truth layers, and optional graph projection.
A personal knowledge base continuously maintained by Obsidian + LLM. Inspired by Karpathy's LLM Wiki pattern.
Temporiki — Lightweight temporal Context Graph + Obsidian-native LLM Wiki with real memory (SQLite + Chroma). Zero-command daily flow.
A skill that converts PDFs to Markdown, with LLM-assisted parsing. After converting a PDF to Markdown, it also generates a detailed index entry so your llm-wiki can read the result.
Local AI agent: voice & text chat + web UI + persistent memory + file operations + shell commands. No cloud LLMs required. Can run offline. LangGraph + Ollama. Interacts with other agents via API.
Imagine having an extra digital brain that outperforms your primary one at many tasks and makes you absolutely unbeatable. That is the Second Brain!
Self-improving wiki for your codebase, inspired by Karpathy's post on X.
Agent Skill that turns any AI coding agent into a personal knowledge librarian — ingest sources, compile a wiki, query with citations.
An implementation of Karpathy's ideas about the LLM Wiki.
This project is a compiled knowledge system powered by LLMs. Instead of retrieving information from raw documents at query time (as in traditional RAG systems), this approach treats knowledge construction as a compile-time process. Raw information is ingested, structured, and incrementally compiled into a persistent, evolving wiki.
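The compile-time framing above can be illustrated with a minimal sketch. Assume each ingested source yields (concept, fact) pairs; instead of searching raw documents at query time, each pair is folded into a persistent per-concept page. `compile_note` and `render` are hypothetical names for illustration, not this project's API.

```python
def compile_note(wiki: dict[str, list[str]], concept: str, fact: str) -> None:
    """Incrementally merge one extracted fact into the wiki.
    Idempotent: re-ingesting the same source is a no-op."""
    page = wiki.setdefault(concept, [])
    if fact not in page:
        page.append(fact)

def render(wiki: dict[str, list[str]], concept: str) -> str:
    """Render one compiled concept page as Markdown."""
    return f"# {concept}\n" + "\n".join(f"- {f}" for f in wiki[concept])

wiki: dict[str, list[str]] = {}
compile_note(wiki, "RAG", "retrieves raw documents at query time")
compile_note(wiki, "RAG", "retrieves raw documents at query time")  # duplicate ingest
print(render(wiki, "RAG"))
# → # RAG
#   - retrieves raw documents at query time
```

The point of the sketch: queries read the already-compiled pages directly, so the expensive structuring work happens once at ingest rather than on every question.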