This repository was archived by the owner on Feb 7, 2025. It is now read-only.

Conversation

@Warvito Warvito commented Feb 3, 2023

Fixes #196

Signed-off-by: Walter Hugo Lopez Pinaya <ianonimato@hotmail.com>
@Warvito Warvito linked an issue Feb 3, 2023 that may be closed by this pull request
Warvito added 2 commits March 12, 2023 10:31
@Warvito Warvito marked this pull request as ready for review March 12, 2023 11:59
@marksgraham marksgraham left a comment

Looks good, just a fix to the import and typing

        else:
            self.mask = None

    def forward(self, x):

Typing missing from forward
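
For reference, the typed signature the review is asking for could look roughly like the sketch below. This is only an illustration, not the code from this PR: the constructor arguments and the stubbed forward body are assumptions, and the real block computes attention rather than returning its input unchanged.

import torch
import torch.nn as nn


class SABlockSketch(nn.Module):
    # Illustrative stand-in for the reviewed block; only the annotations matter here.

    def __init__(self, causal: bool = False, sequence_length: int = 0) -> None:
        super().__init__()
        if causal and sequence_length > 0:
            # lower-triangular mask so a position cannot attend to later positions
            self.register_buffer("mask", torch.tril(torch.ones(sequence_length, sequence_length)))
        else:
            self.mask = None

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # a real implementation would apply (optionally masked) self-attention to x;
        # the review point is just the annotated argument and return types shown here
        return x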

from monai.networks import eval_mode
from parameterized import parameterized

from generative.networks.blocks.selfattention import SABlock
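
For context, these imports come from the new unit test for SABlock. A minimal sketch of the usual MONAI-style shape test they suggest is below; the constructor arguments and tensor shapes are assumptions for illustration, not values taken from this PR.

import unittest

import torch
from monai.networks import eval_mode
from parameterized import parameterized

from generative.networks.blocks.selfattention import SABlock

# hypothetical case: (constructor kwargs, input shape, expected output shape)
TEST_CASES = [[{"hidden_size": 16, "num_heads": 4}, (2, 8, 16), (2, 8, 16)]]


class TestSABlock(unittest.TestCase):
    @parameterized.expand(TEST_CASES)
    def test_shape(self, input_param, input_shape, expected_shape):
        net = SABlock(**input_param)
        with eval_mode(net):
            result = net(torch.randn(input_shape))
            self.assertEqual(result.shape, expected_shape)


if __name__ == "__main__":
    unittest.main()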

Do you think we should use an __init__.py file so we can have the nicer from generative.networks.blocks import SABlock?
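
If that route is taken, the re-export could be as simple as the sketch below; the package layout shown is an assumption about how it would be wired up, not something included in this PR.

# generative/networks/blocks/__init__.py
from .selfattention import SABlock

__all__ = ["SABlock"]

Re-exporting through __init__.py keeps the shorter import path stable even if the module file is later renamed or split.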

@Warvito Warvito merged commit e135f42 into main Mar 13, 2023
@Warvito Warvito deleted the 196-add-self-attention-block-for-autoregressive-transformer branch March 18, 2023 19:44
Successfully merging this pull request may close these issues.

Add self-attention block for autoregressive transformer