Enable any resolution for Unet #1029
Is this an intentional backwards-incompatible change? It is no longer possible to use U-Net without specifying height/width. Could we instead default to the same height/width as the input like we previously did?
Hmm, not sure I got this... this is just for a layer, and the Decoder passes height and width. Can you please specify what is broken?
Ah, the problem is that I'm directly using `smp.decoders.unet.decoder.UnetDecoder`. I got this failure in CI when I tried the main branch of SMP: https://github.com/microsoft/torchgeo/actions/runs/14042455085/job/39315632706?pr=2669. Perhaps we are relying on some undocumented implementation details in https://github.com/microsoft/torchgeo/blob/main/torchgeo/models/fcsiam.py?
Some of the failures are related to the modified decoder forward: previously the input features were unpacked with `*features` when passed into the decoder's forward, but now the star is removed and they should be provided as a list.
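To make the calling-convention change concrete, here is a toy sketch (the class below is a stand-in, not SMP's actual UnetDecoder, and the exact signature upstream may differ):

```python
from typing import List

import torch
from torch import nn


class TinyDecoder(nn.Module):
    """Toy stand-in for UnetDecoder; only the call convention matters here."""

    def forward(self, features: List[torch.Tensor]) -> torch.Tensor:
        # New-style API: the encoder features arrive as a single list argument,
        # which plays nicer with torch.jit.script / torch.compile / torch.export
        # than the old star-args form `def forward(self, *features)`.
        return features[-1]


decoder = TinyDecoder()
features = [torch.randn(1, c, 8, 8) for c in (16, 32, 64)]

out = decoder(features)      # new convention: pass the list as one argument
# out = decoder(*features)   # old convention: unpacking now raises a TypeError
print(out.shape)             # torch.Size([1, 64, 8, 8])
```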
Removing the star here should work:
https://github.com/microsoft/torchgeo/blob/091ad0927c901dd750257a3cbddb755a6ae95dee/torchgeo/models/fcsiam.py#L114
and here:
https://github.com/microsoft/torchgeo/blob/091ad0927c901dd750257a3cbddb755a6ae95dee/torchgeo/models/fcsiam.py#L153
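A minimal sketch of what that caller-side fix would look like (the class below is a simplified stand-in for the torchgeo FCSiam models, not the actual fcsiam.py code):

```python
from typing import List

import torch
from torch import nn


class TinySiamModel(nn.Module):
    """Simplified stand-in for the torchgeo FCSiam models in fcsiam.py."""

    def __init__(self, decoder: nn.Module) -> None:
        super().__init__()
        self.decoder = decoder

    def forward(self, features: List[torch.Tensor]) -> torch.Tensor:
        # Old call, which breaks against the new SMP decoder API:
        #     return self.decoder(*features)
        # Suggested one-character fix: drop the star and pass the list directly.
        return self.decoder(features)
```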
The rest can be resolved by renaming `center` -> `add_center_block` here:
https://github.com/microsoft/torchgeo/blob/091ad0927c901dd750257a3cbddb755a6ae95dee/torchgeo/models/fcsiam.py#L83
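A hedged sketch of that rename (only the keyword change is the point here; the channel values and block count are placeholders, not the real configuration from fcsiam.py, and the rest of the constructor signature may differ between SMP versions):

```python
from segmentation_models_pytorch.decoders.unet.decoder import UnetDecoder

# Placeholder channel values and block count; not copied from fcsiam.py.
decoder = UnetDecoder(
    encoder_channels=(3, 64, 256, 512, 1024, 2048),
    decoder_channels=(256, 128, 64, 32, 16),
    n_blocks=5,
    # center=True,            # old keyword, removed upstream
    add_center_block=True,    # new keyword name after the rename
)
```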
I know it's easy to support the new API, although it's harder to support both. I guess the question is whether this is intentional or not. Maybe it would help to add a "backwards-incompatible" label to PRs like this. Even better would be to deprecate the old syntax before completely removing it.
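For example, a transitional shim along these lines could keep the old keyword working for a deprecation cycle (purely a sketch of the suggestion above, not anything SMP actually ships):

```python
import warnings

from torch import nn


class UnetDecoderCompat(nn.Module):
    """Sketch of a backwards-compatibility shim; not actual SMP code."""

    def __init__(self, add_center_block: bool = False, **kwargs) -> None:
        super().__init__()
        # Accept the removed `center` keyword for one deprecation cycle,
        # map it onto the new name, and warn callers to migrate.
        if "center" in kwargs:
            warnings.warn(
                "`center` is deprecated, use `add_center_block` instead",
                DeprecationWarning,
                stacklevel=2,
            )
            add_center_block = kwargs.pop("center")
        self.add_center_block = add_center_block
```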
Yes, that was intentional; it was added in the PR that fixes support for torch.script/compile/export. These are internal modules, so I hope it will not break too many things. But you are right: I will add the label and I will also highlight it in the release notes.