Conversation

@ahatamiz
Contributor

@ahatamiz ahatamiz commented Sep 16, 2021

Fixes #2775
Fixes #2776
Fixes #2777

Status

This pull request adds the full pipeline and support for multimodal (vision + language) transformers. The transformer implementation follows the Hugging Face repository.
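As a rough illustration of the pipeline described above (not the PR's actual code), a vision + language transformer typically embeds image patches and text tokens separately and concatenates them into one joint sequence for a shared encoder. A minimal, dependency-free sketch; every function and name here is hypothetical and does not match MONAI's or Hugging Face's real API:

```python
# Hypothetical sketch of a multimodal (vision + language) token pipeline.
# All names are illustrative stand-ins, not MONAI's actual API.

def patchify(image, patch_size):
    """Split a flat 1-D 'image' (list of floats) into fixed-size patches."""
    return [image[i:i + patch_size] for i in range(0, len(image), patch_size)]

def embed_patch(patch):
    """Toy patch embedding: mean pixel value (stand-in for a linear projection)."""
    return sum(patch) / len(patch)

def embed_token(token, vocab):
    """Toy token embedding: vocabulary index (stand-in for an embedding table)."""
    return float(vocab.index(token))

def multimodal_sequence(image, text, vocab, patch_size=4):
    """Concatenate vision and language embeddings into the single joint
    sequence a multimodal transformer encoder would consume."""
    vision_tokens = [embed_patch(p) for p in patchify(image, patch_size)]
    language_tokens = [embed_token(t, vocab) for t in text]
    return vision_tokens + language_tokens
```

In a real model each element would be a vector and the joint sequence would pass through self-attention layers, but the concatenation step shown here is the core of the multimodal design.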

Types of changes

  • Non-breaking change (fix or new feature that would not break existing functionality).
  • New tests added to cover the changes.
  • Integration tests passed locally by running ./runtests.sh -f -u --net --coverage.
  • Quick tests passed locally by running ./runtests.sh --quick --unittests.
  • In-line docstrings updated.
  • Documentation updated, tested make html command in the docs/ folder.

Signed-off-by: ahatamizadeh <ahatamizadeh@nvidia.com>
@ahatamiz ahatamiz requested review from Nic-Ma and wyli September 16, 2021 02:46
@ahatamiz
Contributor Author

/build

@ahatamiz
Contributor Author

/black

@ahatamiz
Contributor Author

/build

@ahatamiz
Contributor Author

/black

@ahatamiz
Contributor Author

/build

@ahatamiz
Contributor Author

/black

@ahatamiz ahatamiz added this to the Multi Modality Support milestone Sep 16, 2021
@ahatamiz ahatamiz self-assigned this Sep 16, 2021
@ahatamiz
Contributor Author

/build

@ahatamiz
Contributor Author

/black

@ahatamiz
Contributor Author

Hi @wyli, this PR allows for highly flexible specification of the BERT model configuration. Users can simply specify the parameters without any restrictions.

It has passed all the tests. Let me know if anything else is needed; otherwise it should be good to merge.

Thanks
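The "flexible BERT config" mentioned above can be pictured as a plain dictionary of hyperparameters that a model constructor merges with user overrides. This sketch is purely illustrative: the field names follow the common Hugging Face BERT convention, and the `make_config` helper is hypothetical, not MONAI's actual API:

```python
# Hypothetical BERT-style configuration. Field names follow the common
# Hugging Face BERT convention; MONAI's actual signature may differ.
DEFAULT_BERT_CONFIG = {
    "hidden_size": 768,
    "num_hidden_layers": 12,
    "num_attention_heads": 12,
    "intermediate_size": 3072,
    "hidden_dropout_prob": 0.1,
    "vocab_size": 30522,
}

def make_config(**overrides):
    """Merge user overrides into the defaults, so any field can be changed
    freely -- the 'specify them without any restrictions' idea above."""
    cfg = dict(DEFAULT_BERT_CONFIG)
    cfg.update(overrides)
    return cfg
```

For example, `make_config(num_hidden_layers=6, hidden_size=384)` would describe a smaller encoder while leaving the remaining fields at their defaults.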

@wyli
Contributor

wyli commented Sep 16, 2021

> Hi @wyli, this PR allows for highly flexible specification of the BERT model configuration. Users can simply specify the parameters without any restrictions.
>
> It has passed all the tests. Let me know if anything else is needed; otherwise it should be good to merge.
>
> Thanks

Thanks, it looks good to me. Please remove the changes in monai/_version.py; they are probably automatically generated. (Or I can do it later today.)

@ahatamiz
Contributor Author

ahatamiz commented Sep 16, 2021

> Thanks, it looks good to me. Please remove the changes in monai/_version.py; they are probably automatically generated. (Or I can do it later today.)

Thanks @wyli. Yes, I think they were automatically generated. They may be regenerated in future PRs.

@ahatamiz
Contributor Author

/build

@ahatamiz
Contributor Author

/black

@ahatamiz
Contributor Author

> Thanks, it looks good to me. Please remove the changes in monai/_version.py; they are probably automatically generated. (Or I can do it later today.)

Just submitted another PR to address this.

Signed-off-by: Wenqi Li <wenqil@nvidia.com>
@wyli
Contributor

wyli commented Sep 16, 2021

I've added some basic docs, but this module still needs better paper references and general info. This could be done in follow-up PRs.

@wyli
Contributor

wyli commented Sep 16, 2021

/build

Contributor

@wyli wyli left a comment


Looks good to me. It requires follow-up documentation/tutorials to show the usage.

@wyli wyli enabled auto-merge (squash) September 16, 2021 07:06
@wyli wyli merged commit 3b6f479 into Project-MONAI:dev Sep 16, 2021
@ahatamiz ahatamiz deleted the 2775_multimodality_v2 branch September 16, 2021 14:08

Projects

None yet

Development

Successfully merging this pull request may close these issues.

  • Add transformer-based multi-modality pipelines
  • Add transformer-based NLP pipelines
  • Add multi-modality transformers

2 participants