Unified Efficient Fine-Tuning of 100+ LLMs & VLMs (ACL 2024)
[NeurIPS'23 Oral] Visual Instruction Tuning (LLaVA) built towards GPT-4V level capabilities and beyond.
The official GitHub page for the survey paper "A Survey of Large Language Models".
Data processing for and with foundation models! 🍎 🍋 🌽 ➡️ ➡️🍸 🍹 🍷
Aligning pretrained language models with instruction data generated by themselves.
Code and models for ICML 2024 paper, NExT-GPT: Any-to-Any Multimodal Large Language Model
[EMNLP 2024] Video-LLaVA: Learning United Visual Representation by Alignment Before Projection
🦦 Otter, a multi-modal model based on OpenFlamingo (open-sourced version of DeepMind's Flamingo), trained on MIMIC-IT and showcasing improved instruction-following and in-context learning ability.
InternLM-XComposer2.5-OmniLive: A Comprehensive Multimodal System for Long-term Streaming Video and Audio Interactions
mPLUG-Owl: The Powerful Multi-modal Large Language Model Family
[ECCV2024] Video Foundation Models & Data for Multimodal Understanding
Cambrian-1 is a family of multimodal LLMs with a vision-centric design.
Synthetic data curation for post-training and structured data extraction
An Open-source Knowledgeable Large Language Model Framework.
DataDreamer: Prompt. Generate Synthetic Data. Train & Align Models. 🤖💤
[ICML2024 (Oral)] Official PyTorch implementation of DoRA: Weight-Decomposed Low-Rank Adaptation
DISC-FinLLM: a Chinese financial large language model (LLM) designed to provide users with professional, intelligent, and comprehensive financial consulting services.
[SIGIR'2024] "GraphGPT: Graph Instruction Tuning for Large Language Models"
Deita: Data-Efficient Instruction Tuning for Alignment [ICLR2024]
DialogStudio: Towards Richest and Most Diverse Unified Dataset Collection and Instruction-Aware Models for Conversational AI