This repository was archived by the owner on Feb 7, 2025. It is now read-only.

num_tokens in 2d_vqvae_transformer_tutorial #301

@Warvito

Description


In the tutorial, the VQVAE is defined to have 256 codes in the latent space. The problem is that the Transformer is also defined to have num_tokens=256.

num_tokens=256, # must be equal to num_embeddings input of VQVAE

In this case, I think the transformer should actually have 256 (from the VQVAE) + 1 (for the begin-of-sentence token), i.e. num_tokens=257. @Ashayp31 do you agree this is the case here?
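To illustrate the off-by-one: the transformer is trained on the flattened sequence of VQVAE codebook indices with a begin-of-sentence (BOS) token prepended, so its vocabulary must be one larger than the codebook. A minimal plain-Python sketch (not the tutorial's actual MONAI API calls; the variable names here are illustrative):

```python
num_embeddings = 256            # VQVAE codebook size: valid indices are 0..255
bos_token_id = num_embeddings   # BOS takes the next free index, 256
num_tokens = num_embeddings + 1 # transformer vocabulary size: 257, not 256

latent_sequence = [3, 17, 255]  # example flattened VQVAE latent indices
transformer_input = [bos_token_id] + latent_sequence

# With num_tokens=256, index 256 (BOS) would be out of range for the
# transformer's embedding table; with num_tokens=257 every index fits.
assert max(transformer_input) < num_tokens
print(num_tokens)  # 257
```

If num_tokens were left at 256, looking up the BOS token in an embedding table of shape (256, d_model) would raise an index-out-of-range error.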
