Error trying to export models with timm encoder to onnx format #589

Closed
rgsousa88 opened this issue Apr 18, 2022 · 3 comments

@rgsousa88

Hi,

Is there a known issue with ONNX export when using timm models as encoders? I'm trying to export a model with timm-MobileNetV3 as the encoder and FPN as the decoder.

I'm running the script in a conda environment with python=3.7, pytorch=1.8, segmentation_models_pytorch=0.2.1.

Building the network:

import os

import torch
import segmentation_models_pytorch as smp

def build_fpn_mobv3(input_shape):
    model = smp.FPN(encoder_name="timm-mobilenetv3_large_100",
                    encoder_weights=None,
                    in_channels=3,
                    classes=1,
                    activation="sigmoid")

    # Sanity-check the forward pass with a dummy batch of the target shape
    shape = (1, 3) + input_shape
    x = torch.zeros(shape, dtype=torch.float32, device=torch.device('cpu'))
    model.eval()
    model(x)

    return model

Exporting the trained model:

onnxpath = os.path.join(ckpt_path, f"{model_name}.onnx")
shape = (1, 3) + args.shape
dummy_input = torch.zeros(shape, requires_grad=True).float().to(device)
net.eval()

# torch.onnx._export is the private variant of torch.onnx.export;
# both accept the same arguments here
torch.onnx._export(net, dummy_input, onnxpath,
                   export_params=True, verbose=True,
                   input_names=['image'], output_names=['maps'],
                   keep_initializers_as_inputs=False,
                   dynamic_axes={'image': {0: 'batch'}, 'maps': {0: 'batch'}},
                   opset_version=10)

The error is:
RuntimeError: Unsupported: ONNX export of Pad in opset 9. The sizes of the padding must be constant. Please try opset version 11.

Is there a workaround that allows exporting with opset version 10? I'm able to export models with other encoders, but none of the timm ones.

Thanks for your attention and time.

@EricPHassey

Also wondering about this. Any progress, or anything we can do to help?

@github-actions

github-actions bot commented Jul 5, 2022

This issue is stale because it has been open 60 days with no activity. Remove stale label or comment or this will be closed in 7 days.

@github-actions github-actions bot added the Stale label Jul 5, 2022
@rgsousa88
Author

Hi, @EricPHassey

Sorry for the late reply... I "solved" it by building a custom encoder based on the MobileNetV3 implementation available in torchvision, whose padding is fixed at construction time rather than computed from the input.
