ContextGem: Effortless LLM extraction from documents
Document summarization app using a large language model (LLM) and the LangChain framework. It loads a pre-trained T5 model and its tokenizer from the Hugging Face Transformers library and builds a summarization pipeline that generates summaries with the model.
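A minimal sketch of that approach, assuming the Hugging Face Transformers library and the t5-small checkpoint (the repo may use a different T5 variant or generation settings):

```python
from transformers import AutoTokenizer, T5ForConditionalGeneration, pipeline

# Load a pre-trained T5 checkpoint and its tokenizer (t5-small is an assumed choice)
model_name = "t5-small"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)

# Wrap the model and tokenizer in a summarization pipeline
summarizer = pipeline("summarization", model=model, tokenizer=tokenizer)

document = (
    "Large language models can condense long reports into short overviews. "
    "This example feeds a document to a T5-based pipeline and prints the summary."
)
result = summarizer(document, max_length=50, min_length=10, do_sample=False)
print(result[0]["summary_text"])
```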
A powerful CLI tool for extracting text from documents using DeepSeek OCR and generating high-quality datasets with LLM assistance.
Compose, train and test fast LLM routers
A type-safe graph execution framework built on top of OpenLit for LLM pipelines
CLI tool for LLM prompt pipelines. Reusable. Shareable. Scriptable.
Sage – Prompt-Based Data Generation & Annotation Platform
Turn messy survey responses into clean research insights. A dual-model pipeline: Claude Opus 4.5 extracts themes and assigns participants to them, while GPT-5.1 writes executive summaries. Temperatures are tuned for precision where it matters.
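One way such a dual-model pipeline could be wired together, as a rough sketch assuming the anthropic and openai Python SDKs; the model identifiers, prompts, and temperature values below are illustrative assumptions, not the repo's actual configuration:

```python
import anthropic
import openai

claude = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment
gpt = openai.OpenAI()           # reads OPENAI_API_KEY from the environment

responses = ["The app crashes on upload.", "Love the new dashboard!", "Export is slow."]

# Step 1: low-temperature theme extraction for precision (model name is assumed)
themes_msg = claude.messages.create(
    model="claude-opus-4-5",
    max_tokens=1024,
    temperature=0.0,
    messages=[{
        "role": "user",
        "content": "Extract recurring themes and assign each response to one:\n"
                   + "\n".join(responses),
    }],
)
themes = themes_msg.content[0].text

# Step 2: higher-temperature executive summary (model name is assumed)
summary = gpt.chat.completions.create(
    model="gpt-5.1",
    temperature=0.7,
    messages=[{
        "role": "user",
        "content": f"Write a brief executive summary of these themes:\n{themes}",
    }],
)
print(summary.choices[0].message.content)
```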