Conversation

@wyli
Contributor

@wyli wyli commented Oct 27, 2022

Signed-off-by: Wenqi Li wenqil@nvidia.com

Fixes #5422

Description

The kernel size and strides are hard-coded:

self.upconv = UpConv(spatial_dims=spatial_dims, in_channels=out_channels, out_channels=in_channels, strides=2)

This PR makes the values tunable.

The `level` parameter is unused and is removed in this PR.
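The refactor can be sketched in plain Python (a hypothetical stand-in, not MONAI's actual `UpConv` implementation): the previously hard-coded values become constructor arguments whose defaults reproduce the old behaviour.

```python
# Hypothetical sketch (not MONAI's real code) of the change described above:
# a block whose kernel size and strides were fixed becomes one that accepts
# them as constructor arguments, with the old hard-coded values as defaults.

class UpConv:
    """Toy stand-in for an up-sampling convolution block."""

    def __init__(self, spatial_dims, in_channels, out_channels,
                 kernel_size=3, strides=2):
        # Before the fix, the equivalents of `kernel_size` and `strides`
        # were fixed at the call site; exposing them here lets callers
        # tune the up-sampling behaviour.
        self.spatial_dims = spatial_dims
        self.in_channels = in_channels
        self.out_channels = out_channels
        self.kernel_size = kernel_size
        self.strides = strides

# Defaults reproduce the old hard-coded behaviour...
up_default = UpConv(spatial_dims=2, in_channels=64, out_channels=32)
# ...while non-default values are now accepted.
up_tuned = UpConv(spatial_dims=2, in_channels=64, out_channels=32,
                  kernel_size=5, strides=1)
```

Existing call sites that pass only the original arguments keep working unchanged, which is why the PR is listed as a non-breaking change.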

Types of changes

  - [x] Non-breaking change (fix or new feature that would not break existing functionality).
  - [ ] Breaking change (fix or new feature that would cause existing functionality to change).
  - [x] New tests added to cover the changes.
  - [ ] Integration tests passed locally by running `./runtests.sh -f -u --net --coverage`.
  - [ ] Quick tests passed locally by running `./runtests.sh --quick --unittests --disttests`.
  - [ ] In-line docstrings updated.
  - [ ] Documentation updated, tested `make html` command in the `docs/` folder.

Signed-off-by: Wenqi Li <wenqil@nvidia.com>
@wyli wyli changed the title from "update parameters" to "5422 update attentionunet parameters" Oct 27, 2022
@wyli
Contributor Author

wyli commented Oct 28, 2022

/build

@wyli wyli enabled auto-merge (squash) October 28, 2022 07:11
@wyli wyli merged commit 6e77d57 into Project-MONAI:dev Oct 28, 2022
@wyli wyli deleted the 5422-att-unet branch October 28, 2022 08:43
KumoLiu pushed a commit that referenced this pull request Nov 2, 2022


Development

Successfully merging this pull request may close these issues.

RuntimeError with AttentionUNet
