
Bump the torch group in /requirements with 2 updates #1050


Merged: 2 commits, Jan 30, 2025
requirements/required.txt (4 changes: 2 additions & 2 deletions)

@@ -5,6 +5,6 @@ pretrainedmodels==0.7.4
 safetensors==0.5.2
 six==1.17.0
 timm==1.0.14
-torch==2.5.1
-torchvision==0.20.1
+torch==2.6.0
+torchvision==0.21.0
 tqdm==4.67.1
segmentation_models_pytorch/encoders/dpn.py (2 changes: 1 addition & 1 deletion)

@@ -34,7 +34,7 @@

 class DPNEncoder(DPN, EncoderMixin):
     _is_torch_scriptable = False
-    _is_torch_exportable = False
+    _is_torch_exportable = True  # since torch 2.6.0

     def __init__(
         self,
@adamjstewart (Collaborator) commented on Jan 30, 2025:
Should this have an if-statement based on PyTorch version?
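
For reference, a minimal sketch of what such a version gate could look like; the helper name and the use of the `packaging` library are assumptions for illustration, not something proposed in this PR:

```python
# Hypothetical sketch, not part of this PR: derive the export flag from the
# installed torch version instead of hard-coding it.
import torch
from packaging.version import Version  # assumed to be available in the environment


def _torch_supports_dpn_export(min_version: str = "2.6.0") -> bool:
    """True when the installed torch is new enough to export the DPN encoder."""
    installed = Version(torch.__version__.split("+")[0])  # drop local tags like "+cu124"
    return installed >= Version(min_version)


# Inside DPNEncoder this would replace the hard-coded class attribute:
#     _is_torch_exportable = _torch_supports_dpn_export()
```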

Collaborator commented:

Hmm, this way we would also need to test the other encoders against all the previous torch versions. torch.export is still in active development and it's recommended to use the latest torch, so I suppose a comment is enough unless we see an issue.

Collaborator commented:

That's why I designed the tests to run on the newest and oldest versions of all of our dependencies.

@qubvel (Collaborator) commented on Jan 30, 2025:

Yes, I understand, but some features are only introduced with the latest version of torch, while the main functionality works with previous versions. I'm not sure we should claim we are only compatible with PyTorch 2.6+ when only a very small subset of features fails. WDYT?

Collaborator commented:

I think it should be clearly stated in the docs/README that export/scripting/compilation has been tested and is expected to work with the latest versions, and that one may encounter issues with previous versions.

Collaborator commented:

We can support a wider range of PyTorch versions by using an if-statement on this one line of code. Alternatively, we can skip the export test for this specific model for certain versions of PyTorch.
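
A hedged sketch of that second option using pytest; the test name, module layout, and skip condition are illustrative and are not taken from the project's actual test suite:

```python
# Hypothetical sketch, not the project's actual test code: skip the export test
# for DPN when the installed torch predates 2.6.0.
import pytest
import torch
from packaging.version import Version  # assumed to be available

TORCH_BEFORE_2_6 = Version(torch.__version__.split("+")[0]) < Version("2.6.0")


@pytest.mark.skipif(
    TORCH_BEFORE_2_6,
    reason="torch.export support for the DPN encoder requires torch>=2.6.0",
)
def test_dpn_encoder_torch_export():
    # The real test would build the DPN encoder and run it through torch.export.export.
    ...
```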

