This repository was archived by the owner on Feb 7, 2025. It is now read-only.

Conversation

@Warvito (Collaborator) commented Feb 11, 2023

Implements #247

Signed-off-by: Walter Hugo Lopez Pinaya <ianonimato@hotmail.com>
@Warvito Warvito linked an issue Feb 11, 2023 that may be closed by this pull request
@Warvito (Collaborator, Author) commented Feb 11, 2023

@marksgraham @Ashayp31 @danieltudosiu do you think it is possible to simplify any of these parameters?

        adn_ordering: str = "NDA",                  # order of normalisation (N), dropout (D), and activation (A) layers
        dropout: tuple | str | float | None = 0.1,  # dropout specification forwarded to the layer factory
        act: tuple | str | None = "RELU",           # activation used in the intermediate blocks
        output_act: tuple | str | None = None,      # optional activation applied to the network output

@Warvito marked this pull request as ready for review February 11, 2023 18:16
@ericspod (Member) commented Feb 11, 2023

> @marksgraham @Ashayp31 @danieltudosiu do you think it is possible to simplify any of these parameters?

We could have something like `ActType = tuple | str | None` in a utilities file and `act: ActType` in the constructor. I would suggest this parameter should never be None (nor should dropout, for which 0.0 should mean no dropout), and that we use the names from the layer factories rather than string literals for the default arguments, as here. Simplifying the type annotations in MONAI itself is something I have wanted to do for a while now.
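A rough sketch of what that could look like is below. The module location, the `ActType` alias, and the constructor excerpt are illustrative assumptions rather than the final API, and the runtime `X | Y` union syntax assumes Python 3.10+.

```python
# e.g. in a shared utilities module (location hypothetical)
from monai.networks.layers.factories import Act

# Shared alias so every network annotates activation arguments the same way.
ActType = tuple | str


class VQVAE:  # constructor excerpt only, for illustration
    def __init__(
        self,
        adn_ordering: str = "NDA",
        dropout: tuple | str | float = 0.0,  # 0.0 means "no dropout", so None is not needed
        act: ActType = Act.RELU,             # layer-factory name instead of the "RELU" string literal
        output_act: ActType | None = None,   # None here still means "no output activation"
    ) -> None:
        self.act = act
        self.output_act = output_act
```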

@Warvito added the "need reviewer" label Feb 13, 2023
@marksgraham (Collaborator) left a comment


Hey, looks good apart from the typing issue. I managed to load some VQ-VAE weights from a model I trained before this PR too (with some minor weight renaming), so all good :)
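For reference, that kind of checkpoint key remapping might look something like the sketch below; the old/new prefixes are purely hypothetical, and the real mapping depends on which submodules the refactor renamed.

```python
import torch


def remap_vqvae_state_dict(old_state: dict[str, torch.Tensor]) -> dict[str, torch.Tensor]:
    """Rename state-dict keys from a pre-refactor checkpoint to the new layout.

    The prefix pairs are placeholders; compare the two models' state_dict()
    keys to build the actual mapping.
    """
    renames = {
        "encoder.blocks.": "encoder.layers.",  # hypothetical rename
        "decoder.blocks.": "decoder.layers.",  # hypothetical rename
    }
    new_state = {}
    for key, value in old_state.items():
        for old_prefix, new_prefix in renames.items():
            if key.startswith(old_prefix):
                key = new_prefix + key[len(old_prefix):]
                break
        new_state[key] = value
    return new_state


# Usage (paths and model construction are illustrative):
# model = VQVAE(...)
# old_state = torch.load("vqvae_pre_refactor.pt", map_location="cpu")
# model.load_state_dict(remap_vqvae_state_dict(old_state))
```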

@Warvito merged commit 06a57bc into main Feb 15, 2023
@Warvito deleted the 247-harmonise-vqvae-with-autoencoderkl branch February 17, 2023 17:51

Labels

need reviewer: This PR needs a reviewer


Development

Successfully merging this pull request may close these issues.

Harmonise VQVAE with AutoencoderKL

4 participants