How do I resolve an error when converting the ssd_mobilenet_v2 model?
In the tool_chain docker I installed https://github.com/onnx/tensorflow-onnx r1.5,
then git cloned https://github.com/kneron/ONNX_Convertor,
and ran the following in the optimizer_scripts directory:
python tensorflow2onnx.py /data1/ssdmobilenetv2_float32.pb /data1/ssdmobilenetv2_float32.onnx
ssdmobilenetv2_float32.pb is a float32 model trained with TensorFlow 1.13.1.
The following error appears:
INFO:tensorflow:Froze 0 variables.
[2021-11-02 05:56:25,197] INFO: Froze 0 variables.
INFO:tensorflow:Converted 0 variables to const ops.
[2021-11-02 05:56:25,240] INFO: Converted 0 variables to const ops.
[2021-11-02 05:56:25,832] INFO: Using tensorflow=1.15.3, onnx=1.6.0, tf2onnx=1.8.4/cd55bf
[2021-11-02 05:56:25,833] INFO: Using opset <onnx, 11>
[2021-11-02 05:56:27,281] INFO: Computed 1 values for constant folding
[2021-11-02 05:56:29,269] INFO: folding node using tf type=Select, name=Postprocessor/BatchMultiClassNonMaxSuppression/map/while/PadOrClipBoxList/Select_1
[2021-11-02 05:56:29,414] ERROR: rewriter <function rewrite_single_direction_lstm at 0x7faa2b494a70>: exception switch false branch is followed by non-Exit
Traceback (most recent call last):
...
How can I resolve this error?
Comments
Using the official pre-trained model
http://download.tensorflow.org/models/object_detection/ssd_mobilenet_v2_coco_2018_03_29.tar.gz
python tensorflow2onnx.py /data1/ssd_mobilenet_v2_coco_2018_03_29/frozen_inference_graph.pb /data1/ssdmobilenetv2_coco.onnx
also produces a similar error:
INFO:tensorflow:Froze 0 variables.
[2021-11-02 07:11:21,559] INFO: Froze 0 variables.
INFO:tensorflow:Converted 0 variables to const ops.
[2021-11-02 07:11:21,683] INFO: Converted 0 variables to const ops.
[2021-11-02 07:11:23,991] INFO: Using tensorflow=1.15.3, onnx=1.6.0, tf2onnx=1.8.4/cd55bf
[2021-11-02 07:11:23,991] INFO: Using opset <onnx, 11>
[2021-11-02 07:11:28,214] WARNING: Cannot infer shape for Postprocessor/BatchMultiClassNonMaxSuppression/map/while/PadOrClipBoxList/cond/zeros: Postprocessor/BatchMultiClassNonMaxSuppression/map/while/PadOrClipBoxList/cond/zeros:0
[2021-11-02 07:11:30,399] INFO: Computed 0 values for constant folding
[2021-11-02 07:11:38,584] ERROR: rewriter <function rewrite_single_direction_lstm at 0x7fe04bbcca70>: exception switch false branch is followed by non-Exit
Traceback (most recent call last):
...
Hello,
I suspect your model comes from the TensorFlow Object Detection API:
https://github.com/tensorflow/models/blob/master/research/object_detection/g3doc/tf1_detection_zoo.md#coco-trained-models
Models from that zoo have many problems (by default the post-processing is split into a pile of unusual operators inside the pb graph). I strongly recommend using the export script provided in the TensorFlow Object Detection API to export to tflite and remove the post-processing at the same time:
https://github.com/tensorflow/models/blob/master/research/object_detection/g3doc/running_on_mobile_tensorflowlite.md#running-on-mobile-with-tensorflow-lite
Alternatively, remove the unsupported operators at the tail end when exporting the pb model.
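Cutting the tail of a frozen pb graph amounts to naming the last supported node(s) as destinations and keeping only what they transitively depend on; in TF1 this is what `tf.compat.v1.graph_util.extract_sub_graph(graph_def, dest_nodes)` does. Below is a minimal stdlib-only sketch of that pruning, with the graph modeled as a name-to-inputs dict; the node names are hypothetical stand-ins, not the real SSD graph's names:

```python
# Sketch of cutting a frozen graph at the post-process boundary.
# The graph is modeled as {node_name: [input_node_names]}; in TF1 the same
# pruning is done by tf.compat.v1.graph_util.extract_sub_graph.

def extract_sub_graph(graph, dest_nodes):
    """Keep only the nodes that dest_nodes transitively depend on."""
    keep = set()
    stack = list(dest_nodes)
    while stack:
        name = stack.pop()
        if name in keep:
            continue
        keep.add(name)
        stack.extend(graph.get(name, []))  # walk backwards through inputs
    return {n: ins for n, ins in graph.items() if n in keep}

# Hypothetical SSD-like graph: detection heads feed a post-process subgraph.
graph = {
    "image": [],
    "backbone": ["image"],
    "box_encodings": ["backbone"],
    "class_scores": ["backbone"],
    "nms_postprocess": ["box_encodings", "class_scores"],  # unsupported tail
}

# Cut before post-processing: the raw heads become the new graph outputs.
pruned = extract_sub_graph(graph, ["box_encodings", "class_scores"])
print(sorted(pruned))  # → ['backbone', 'box_encodings', 'class_scores', 'image']
```

The NMS post-process node and everything only it needed are dropped, which is the same effect as exporting without the post-processing op.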
Hi Ethon,
When exporting to tflite with the script provided in the TensorFlow Object Detection API, should I choose floating-point output?
Also, how do I remove the post-processing?
After the steps above are done, do I then proceed to the TFLite-to-ONNX flow:
python /workspace/libs/ONNX_Convertor/tflite-onnx/onnx_tflite/tflite2onnx.py -tflite path_of_input_tflite_model -save_path path_of_output_onnx_file -release_mode True
Is that correct?
Thanks
Hello,
Yes. On the model side you only need to keep the original floating-point architecture; the toolchain covers the conversion to fixed point.
To remove the post-process ops, first convert the model to ONNX, then edit it with the editor provided in the toolchain; see the documentation for the usage details.