How can I compile an ONNX model to NEF with input format BGR565?

Hello, I am using "kneron/toolchain:v0.19.0" and trying to compile a non-YOLO model, such as a depth-estimation or deblurring model.

When I run the NEF model on the chip after a successful compilation, the results are very different.

(However, the NEF output from kneron_inference() looks good.)

I'd like to check whether the cause is the input format.

The NEF model I compiled uses RGB888 as the input image format.

So how can I compile an ONNX model to NEF with BGR565 as the input format?

I have to use toolchain version 0.19.0 and a non-YOLO model.
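One host-side workaround worth noting: if the compiled NEF expects RGB888, BGR565 frames can be unpacked to RGB888 before they are fed to inference. Below is a minimal NumPy sketch; the bit layout (blue in bits 15–11, green in 10–5, red in 4–0) is an assumption about how your source packs BGR565, so verify it against your camera/sensor documentation.

```python
import numpy as np

def bgr565_to_rgb888(frame: np.ndarray) -> np.ndarray:
    """Convert an HxW uint16 BGR565 frame to an HxWx3 uint8 RGB888 image.

    Assumed packing (verify for your source): blue = bits 15-11,
    green = bits 10-5, red = bits 4-0.
    """
    b5 = (frame >> 11) & 0x1F
    g6 = (frame >> 5) & 0x3F
    r5 = frame & 0x1F
    # Expand 5/6-bit channels to 8 bits by replicating the high bits
    # into the low bits, so 0x1F maps to 0xFF (full scale).
    r8 = ((r5 << 3) | (r5 >> 2)).astype(np.uint8)
    g8 = ((g6 << 2) | (g6 >> 4)).astype(np.uint8)
    b8 = ((b5 << 3) | (b5 >> 2)).astype(np.uint8)
    return np.stack([r8, g8, b8], axis=-1)
```

This only changes the host preprocessing; the NEF itself still compiles with RGB888 input, which matches what the toolchain produced here.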

Comments

  • Hi Hyun,

    Could you provide us with your model (including .onnx model) and your output results for us to take a look?

    Even though you compiled with RGB888 as the input image format, the input handling should be similar.

    Here is the documentation on input format: Supported NPU Data Layout Format - Document Center (kneron.com)


    If your NEF results from kneron_inference() looked good, you could run demo_generic_data_inference or demo_generic_image_inference in Kneron PLUS to bypass preprocessing and check whether the result matches your NEF result. If they differ, it might be because the post-processing was not written the same way as the one used to produce the NEF result.
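    To make that comparison concrete, you can dump the raw float output from each path (toolchain kneron_inference() vs. on-chip generic inference) and diff them numerically. The helper below is a generic sketch; the file names and tolerance are illustrative assumptions, not part of the Kneron API.

    ```python
    import numpy as np

    def compare_outputs(a, b, atol=1e-3):
        """Compare two inference output arrays and report the difference.

        a, b: array-likes holding the raw model outputs, e.g. loaded with
        np.load("toolchain_out.npy") / np.load("onchip_out.npy")
        (hypothetical file names).
        """
        a = np.asarray(a, dtype=np.float32)
        b = np.asarray(b, dtype=np.float32)
        diff = np.abs(a - b)
        return {
            "max_abs_diff": float(diff.max()),
            "mean_abs_diff": float(diff.mean()),
            "close": bool(np.allclose(a, b, atol=atol)),
        }
    ```

    A large max_abs_diff between the two raw outputs points at the input/preprocessing side; identical raw outputs but different final results point at the post-processing code instead.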

    Generic Inference API: Generic Inference API - Document Center (kneron.com)

The discussion has been closed due to inactivity. To continue with the topic, please feel free to post a new discussion.