RuntimeError when converting onnx model using pytorch2onnx.py

Hi, when I run the script on my onnx model, the following error occurs:

/workspace/libs/ONNX_Convertor/optimizer_scripts/tools/constant_folding.py:645: RuntimeWarning: invalid value encountered in true_divide
  new_data = np.divide(np_data1, np_data2)
(op_type:Sigmoid, name:output): Inferred shape and existing shape differ in dimension 2: (-9223372036854775808) vs (320)
Traceback (most recent call last):
  File "/workspace/libs/ONNX_Convertor/optimizer_scripts/pytorch2onnx.py", line 87, in <module>
    m = combo.pytorch_constant_folding(m)
  File "/workspace/libs/ONNX_Convertor/optimizer_scripts/tools/combo.py", line 135, in pytorch_constant_folding
    m = modhelper.inference_shapes(m)
  File "/workspace/libs/ONNX_Convertor/optimizer_scripts/tools/modhelper.py", line 88, in inference_shapes
    m = onnx.shape_inference.infer_shapes(m)
  File "/workspace/miniconda/lib/python3.7/site-packages/onnx/shape_inference.py", line 36, in infer_shapes
    inferred_model_str = C.infer_shapes(model_str)
RuntimeError: Inferred shape and existing shape differ in dimension 2: (-9223372036854775808) vs (320)

My model is a segmentation model that takes a 320x240 input, and I can run it with onnxruntime without any problem (see the check below). Please let me know what could be going wrong. Thanks.
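
For reference, this is roughly how I run it under onnxruntime (a minimal sketch; the file name and the 'input'/'output' tensor names are the ones from my export script, shown further down):

    import numpy as np
    import onnxruntime as ort

    # The original model runs fine under plain onnxruntime.
    sess = ort.InferenceSession('torch_model.onnx', providers=['CPUExecutionProvider'])
    x = np.zeros((1, 3, 320, 240), dtype=np.float32)
    out = sess.run(['output'], {'input': x})[0]
    print(out.shape)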

Comments

  • Hi Max,

    It seems the Sigmoid node called 'output' in your model has the wrong output shape in dimension 2: -9223372036854775808 is derived where it should be 320.
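
    For what it's worth, -9223372036854775808 is INT64_MIN, which is typically what NumPy produces when a NaN (such as the one behind the true_divide warning above) is cast to int64. If you want to see what shape the file itself records for the outputs, here is a minimal sketch (the file name is an assumption):

    import onnx

    # Print the shape recorded for each graph output in the exported file.
    m = onnx.load('torch_model.onnx')
    for out in m.graph.output:
        dims = [d.dim_value if d.HasField('dim_value') else d.dim_param
                for d in out.type.tensor_type.shape.dim]
        print(out.name, dims)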

  • Hi Jiyuan,

    thanks for your reply. Since I can run my original onnx model on onnxruntime without any problem, I wonder what could cause this problem?

  • Maybe you didn't fix your input size? To use the NPU, you have to specify an exact input size for the model; the input cannot be left dynamic. You might have to modify your PyTorch script (see the sketch below).
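
    To illustrate: in torch.onnx.export, dynamic (symbolic) dimensions come from the dynamic_axes argument, so leaving it out keeps every traced dimension fixed. A sketch (model and dummy are placeholders):

    import torch

    # Fixed shapes: without dynamic_axes, every dimension is baked in at trace time.
    torch.onnx.export(model, dummy, 'fixed.onnx',
                      input_names=['input'], output_names=['output'])

    # Dynamic shapes: dim 0 of 'input' becomes symbolic; avoid this for the NPU.
    torch.onnx.export(model, dummy, 'dynamic.onnx',
                      input_names=['input'], output_names=['output'],
                      dynamic_axes={'input': {0: 'batch'}})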

  • Hi kidd,

    I export my onnx model using the following code:

    # Fixed-size dummy input: NCHW = (1, 3, 320, 240)
    rand_image = torch.tensor(np.zeros((1, 3, 320, 240))).type(torch.FloatTensor).to(device)
    torch.onnx.export(model, rand_image, 'torch_model.onnx', keep_initializers_as_inputs=True,
                      verbose=True, export_params=True, opset_version=9,
                      do_constant_folding=False, input_names=['input'], output_names=['output'])
    

    Didn't the rand_image force the model to have a fixed input size?
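
    One way I can double-check is to print the input shape recorded in the exported file (a sketch; with keep_initializers_as_inputs=True the weights also show up in graph.input, so they are skipped):

    import onnx

    m = onnx.load('torch_model.onnx')
    # Weights appear in graph.input because of keep_initializers_as_inputs=True;
    # skip them and print only the real network input.
    init_names = {i.name for i in m.graph.initializer}
    for inp in m.graph.input:
        if inp.name in init_names:
            continue
        dims = [d.dim_value if d.HasField('dim_value') else d.dim_param
                for d in inp.type.tensor_type.shape.dim]
        print(inp.name, dims)  # a fixed export prints: input [1, 3, 320, 240]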

  • Hi Max,

    Yes, it should have set the shape. Then it might be a problem with the export function or the converter. Could you please share the model you exported? Dummy weights are fine; we only care about the structure (for example, you can export a freshly constructed, untrained model, as sketched below). Thank you.
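
    A sketch of what I mean (MySegNet and its constructor are placeholders for your own model class):

    import torch

    # Build the network without loading the trained checkpoint:
    # randomly initialized weights are enough to reproduce the structure.
    model = MySegNet()
    model.eval()
    dummy = torch.zeros(1, 3, 320, 240)
    torch.onnx.export(model, dummy, 'torch_model_dummy.onnx',
                      keep_initializers_as_inputs=True, opset_version=9,
                      input_names=['input'], output_names=['output'])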

  • Hi Jiyuan,

    Sure, here is the original onnx model.

    Thanks again for the reply.


  • Thank you for the model. I'm still checking it; it's been a little busy recently. Sorry for the delay.

  • Hi Max,

    The bug should be fixed now. You can run `git pull` under /workspace/libs/ONNX_Convertor to update the converter. Please give it a try.

    Thank you for your patience.

The discussion has been closed due to inactivity. To continue with the topic, please feel free to post a new discussion.