Mind Map Generation Using Large Language Models

This project was made as the final project for the ods.ai NLP course. For details, check out the project report.

Installation

  1. Clone the repository:
git clone https://github.com/TeoSable/llm-mind-maps
cd llm-mind-maps
  2. (Recommended) Create a clean virtual environment for the project:
python3 -m venv venv

Activate the environment:

Linux/Mac:

source venv/bin/activate

Windows:

venv\Scripts\Activate.ps1
  3. Install dependencies:
pip install -r requirements.txt

It is highly recommended to run the project on a GPU, since it performs LLM inference locally. Keep in mind that the default model for the experiment, Qwen 2.5-3B Instruct, needs at least 8 GB of GPU memory to run smoothly. You can check CUDA availability with:

python -c "import torch; print(torch.cuda.is_available())"

If needed, visit the official PyTorch website for a guide on installation with CUDA enabled.
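As a rough sanity check on that 8 GB figure, the weight memory of a 3B-parameter model can be estimated from the parameter count and bytes per parameter. This is a back-of-the-envelope sketch only (the parameter count is approximate, and real usage also includes activations and the KV cache):

```python
def weight_memory_gib(num_params: float, bytes_per_param: float) -> float:
    """Estimate model weight memory in GiB (weights only)."""
    return num_params * bytes_per_param / 2**30

# Qwen2.5-3B has roughly 3.1e9 parameters; fp16/bf16 stores 2 bytes each.
print(f"{weight_memory_gib(3.1e9, 2):.1f} GiB")  # ~5.8 GiB of weights,
# leaving some headroom within 8 GB for activations and the KV cache.
```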

Usage

For a quick run on three documents from the development subset:

python run.py \
  --data-dir data \
  --split dev \
  --model Qwen/Qwen2.5-3B-Instruct \
  --max-files 3

For the full test split experiment with 1-shot Qwen2.5-3B-Instruct:

python run.py \
  --data-dir data \
  --split test \
  --model Qwen/Qwen2.5-3B-Instruct \
  --few-shot-count 1 \
  --output-json outputs/qwen25_3b_test_1shot.json

For the full test split experiment with 1-shot Qwen3-4B-Instruct:

python run.py \
  --data-dir data \
  --split test \
  --model Qwen/Qwen3-4B-Instruct-2507 \
  --quantization 4bit \
  --few-shot-count 1 \
  --output-json outputs/qwen3_4b_test_1shot.json
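A quick back-of-the-envelope shows why the 4-bit option matters for the larger model: in fp16 each weight takes 2 bytes, while 4-bit quantization stores roughly half a byte per weight. The numbers below are illustrative (weights only, approximate parameter count; activations and the KV cache add more):

```python
params = 4.0e9                 # approximate parameter count of a 4B model
fp16_gib = params * 2 / 2**30  # 16-bit: 2 bytes per parameter
int4_gib = params / 2 / 2**30  # 4-bit: 0.5 bytes per parameter
print(f"fp16 weights: {fp16_gib:.1f} GiB, 4-bit weights: {int4_gib:.1f} GiB")
```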

For more information on the command-line arguments of run.py, run:

python run.py --help
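The authoritative flag list is whatever `--help` prints. Purely as an illustration, the arguments used in the examples above could be wired up with argparse roughly like this; the names mirror the commands shown, not the actual run.py source, so treat every default here as a hypothetical:

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    # Hypothetical sketch mirroring the flags used in the README examples;
    # the real run.py may define these arguments differently.
    p = argparse.ArgumentParser(description="Mind map generation with an LLM")
    p.add_argument("--data-dir", default="data")
    p.add_argument("--split", choices=["dev", "test"], default="dev")
    p.add_argument("--model", default="Qwen/Qwen2.5-3B-Instruct")
    p.add_argument("--max-files", type=int, default=None)
    p.add_argument("--few-shot-count", type=int, default=0)
    p.add_argument("--quantization", choices=["none", "4bit"], default="none")
    p.add_argument("--output-json", default=None)
    return p

# Parse one of the example invocations to show the resulting namespace.
args = build_parser().parse_args(
    ["--split", "test", "--few-shot-count", "1", "--quantization", "4bit"]
)
print(args.split, args.few_shot_count, args.quantization)
```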
