
Edit Transfer

Edit Transfer: Learning Image Editing via Vision In-Context Relations
Lan Chen, Qi Mao, Yuchao Gu and Mike Zheng Shou
MIPG, Communication University of China; Show Lab, National University of Singapore

Project Website | arXiv | HuggingFace

Getting Started

1. Environment setup

git clone https://github.com/CUC-MIPG/Edit-Transfer.git
cd Edit-Transfer

conda create -n EditTransfer python=3.10
conda activate EditTransfer

2. Requirements installation

pip install -r requirements.txt

3. Start training

We use the open-source AI-Toolkit to train EditTransfer. The training data and a configuration file are provided in this repo:

  • Configuration File: config/edit_transfer.yml
  • Training Data: data/edit_transfer.zip

You can start training by running:

python run.py config/edit_transfer.yml

Alternatively, you can download the trained EditTransfer checkpoint for inference: https://drive.google.com/file/d/1V4HraIjlMrbPfAPivk5vYoq4bQTzcP4L/view?usp=sharing

4. Inference

Once training is done, replace the file paths below with your own and run:

python edit_transfer.py --model_dir [your_model_dir] --model_name [your_model_name] --img_path [your_img_file_path] --prompt_file [your_prompt_file_path]
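To run inference over a whole directory of images, a small wrapper can assemble the command line above once per file. This is only a sketch: the flag names are taken from the command above, but the directory layout, file extensions, and the idea of reusing one prompt file per image are assumptions, not part of the official script.

```python
import subprocess
from pathlib import Path

def build_command(model_dir, model_name, img_path, prompt_file):
    """Assemble the edit_transfer.py invocation shown above as an argv list."""
    return [
        "python", "edit_transfer.py",
        "--model_dir", str(model_dir),
        "--model_name", str(model_name),
        "--img_path", str(img_path),
        "--prompt_file", str(prompt_file),
    ]

def run_batch(model_dir, model_name, img_dir, prompt_file):
    """Run inference for every .png/.jpg image in img_dir (assumed layout)."""
    for img in sorted(Path(img_dir).glob("*.[pj][np]g")):
        subprocess.run(
            build_command(model_dir, model_name, img, prompt_file),
            check=True,  # stop on the first failing invocation
        )
```

For example, `run_batch("checkpoints/edit_transfer", "edit_transfer", "examples/", "prompts.txt")` would invoke the script once per image in `examples/`.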

Results

Single and Compositional Edit Transfer

Generalization Performance

Citation

@article{chen2025edit,
  title={Edit Transfer: Learning Image Editing via Vision In-Context Relations},
  author={Chen, Lan and Mao, Qi and Gu, Yuchao and Shou, Mike Zheng},
  journal={arXiv preprint arXiv:2503.13327},
  year={2025}
}

About

Official code of "Edit Transfer: Learning Image Editing via Vision In-Context Relations"
