
Release CompassNav artifacts (models, dataset) on Hugging Face #1

@NielsRogge

Description

Hi @linengcs 🤗

Niels here from the open-source team at Hugging Face. I discovered your work on arXiv and your project page https://linengcs.github.io/CompassNav. Your paper "CompassNav: Steering From Path Imitation To Decision Understanding In Navigation" got featured on Hugging Face's daily papers: https://huggingface.co/papers/2510.10154.

The paper page lets people discuss your paper and find related artifacts (models, datasets, demos). You can also claim the paper as yours, which will show up on your public profile at HF, and add GitHub and project page URLs.

I noticed in your GitHub repository's TODO list that you plan to release both the CompassNav-7B model and the Compass-Data-22k dataset. I also saw that you've already set up placeholder links to Hugging Face for these! That's fantastic.

We'd love to help make sure these artifacts are fully released and discoverable on the Hugging Face Hub. Hosting them on https://huggingface.co/models and https://huggingface.co/datasets would give them much better visibility and discoverability. We can add relevant tags to the model and dataset cards, link them directly to your paper page, and ensure they are easily accessible to the community.

For CompassNav-7B, I noticed the current link points to a Hugging Face dataset repository (huggingface.co/datasets/Lineng/CompassNav7B). For better discoverability and to leverage model-specific features on the Hub (like inference widgets, download stats for models, etc.), we recommend pushing models to a dedicated model repository (e.g., huggingface.co/Lineng/CompassNav-7B).

Uploading models

See here for a guide: https://huggingface.co/docs/hub/models-uploading.

In this case, we could leverage the PyTorchModelHubMixin class, which adds from_pretrained and push_to_hub to any custom nn.Module. Alternatively, one can use the hf_hub_download one-liner to download a checkpoint from the Hub.
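As a minimal sketch of what this looks like (assuming PyTorch and huggingface_hub are installed; the wrapper class, its layers, and the repo id "Lineng/CompassNav-7B" are illustrative placeholders, not the actual CompassNav architecture):

```python
import torch
import torch.nn as nn
from huggingface_hub import PyTorchModelHubMixin


class CompassNavPolicy(nn.Module, PyTorchModelHubMixin):
    """Hypothetical wrapper; the real CompassNav-7B model will differ."""

    def __init__(self, hidden_size: int = 16, num_actions: int = 4):
        super().__init__()
        # Toy head standing in for the actual navigation policy.
        self.head = nn.Linear(hidden_size, num_actions)

    def forward(self, x):
        return self.head(x)


model = CompassNavPolicy()

# Uploading and downloading then become one-liners (both need `huggingface-cli login`):
# model.push_to_hub("Lineng/CompassNav-7B")
# model = CompassNavPolicy.from_pretrained("Lineng/CompassNav-7B")

# Or fetch a single checkpoint file directly:
# from huggingface_hub import hf_hub_download
# path = hf_hub_download("Lineng/CompassNav-7B", "model.safetensors")
```

The mixin also writes a config alongside the weights, so `from_pretrained` can re-instantiate the model with the same init arguments.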

We encourage researchers to push each model checkpoint to a separate model repository, so that things like download stats also work. We can then also link the checkpoints to the paper page.

Uploading dataset

It would be awesome to make the dataset available on 🤗, so that people can do:

from datasets import load_dataset

dataset = load_dataset("your-hf-org-or-username/your-dataset")

See here for a guide: https://huggingface.co/docs/datasets/loading.

Besides that, there's the dataset viewer, which allows people to quickly explore the first few rows of the data in the browser.

Let me know if you're interested/need any help regarding this!

Cheers,

Niels
ML Engineer @ HF 🤗
