Re-trained Tiny-YOLOv3 on KL520 AI SoC board doesn't detect any objects during inference

I'm following the documentation to re-train Tiny-YOLOv3.

After finishing the training, I can see the results when running yolo_video.py.

I have also converted the Keras model to ONNX and created the NEF model.

To support the new classes, I only edited the class definitions in the SCPU firmware code.

After flashing it to the SoC board, it doesn't detect any objects.

Is there any setting I might be missing in the application?

Any help would be appreciated.

Thanks!

Comments

  • Have you tried running inference on your ONNX model with ktc.kneron_inference() in the Toolchain before converting it to NEF format?

    If you have already confirmed the performance of the ONNX model you trained, the difference in accuracy between the ONNX model and the NEF model could be caused by incorrect preprocessing or postprocessing.

    Please try to run inference with both the ONNX and NEF models using ktc.kneron_inference().

    https://doc.kneron.com/docs/#toolchain/manual_1_overview/#14-floating-point-model-preparation

    If you are able to get correct results with the ONNX model but still encounter issues with the NEF model, please provide the corresponding materials, and we will help you check it.
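    A common cause of the ONNX-vs-NEF gap mentioned above is a preprocessing mismatch. The sketch below is a minimal, dependency-free illustration of a Tiny-YOLOv3-style preprocessing step (the 416x416 size, the [0, 1] normalization, and the nearest-neighbour resize are assumptions; the real pipeline must match exactly what was used in training):

    ```python
    import numpy as np

    # Sketch only: the exact resize/normalization must match the training
    # pipeline; values here (416x416, [0, 1] scaling) are common defaults.
    def preprocess(image_hwc: np.ndarray, size: int = 416) -> np.ndarray:
        """Scale a uint8 HWC image to float32 [0, 1] and add a batch axis."""
        # Nearest-neighbour "resize" via index sampling, to keep the sketch
        # dependency-free; a real pipeline would use letterbox resizing.
        h, w, _ = image_hwc.shape
        ys = (np.arange(size) * h // size).clip(0, h - 1)
        xs = (np.arange(size) * w // size).clip(0, w - 1)
        resized = image_hwc[ys][:, xs]
        data = resized.astype(np.float32) / 255.0
        return data[np.newaxis, ...]  # shape (1, size, size, 3), i.e. BHWC

    image = np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8)
    batch = preprocess(image)
    print(batch.shape)  # (1, 416, 416, 3)
    ```

    Running the same preprocessing for both the ONNX and NEF inference calls rules this out as the source of the difference.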

  • Hi Ethon,

    Thank you for your recommendation. I followed the example to check the ktc.kneron_inference() results and have also attached the log file.

    However, I made some modifications, specifically to the image tensor shape. The original code was:

    tensor_data = [tf.transpose(data, perm=[0, 2, 3, 1])]  # BCHW -> BHWC (channels-first to channels-last)
    

    I removed the transpose because it caused a reshape size error during inference.

    When I used the new YOLOv3-Tiny .nef file to test the KL520 AI SoC example for YOLOv3-Tiny, no objects were detected. I was using the YOLOv3-Tiny .weights file trained on the COCO dataset.
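    For reference, the transpose in question moves data from channels-first (BCHW) to channels-last (BHWC); a dependency-free NumPy equivalent (shapes are illustrative assumptions):

    ```python
    import numpy as np

    # A BCHW tensor, as ONNX models typically expect:
    # batch, channels, height, width.
    bchw = np.zeros((1, 3, 416, 416), dtype=np.float32)

    # Equivalent of tf.transpose(data, perm=[0, 2, 3, 1]): channels last.
    bhwc = np.transpose(bchw, (0, 2, 3, 1))
    print(bhwc.shape)  # (1, 416, 416, 3)

    # If the data is ALREADY channels-last, applying the same transpose
    # again scrambles the axes, which is one way a reshape size error
    # like the one described above can appear.
    wrong = np.transpose(bhwc, (0, 2, 3, 1))
    print(wrong.shape)  # (1, 416, 3, 416)
    ```

    So whether the transpose is needed depends on the layout the tensor is already in; removing it is only correct if the data was channels-last to begin with.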

  • Hi,

    Thank you for providing the log. For the code, did you mean the one below?

    If so, the YOLOv3 example in our documentation is outdated; it was written for toolchain v0.22.0, and our latest toolchain version is v0.30.0. Please refer to the main toolchain manual here: 1. Toolchain Manual Overview - Document Center


    When you used your ONNX model in ktc.kneron_inference() with the postprocess function, was it able to detect anything? As Ethon said, before moving on to the .nef file, please make sure that running ktc.kneron_inference() with both your ONNX model and your NEF model gives correct results in the Kneron Toolchain. The link above also has example code for ktc.kneron_inference().
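    Once both runs produce outputs, a quick numerical comparison helps separate a conversion problem from a postprocessing problem. The sketch below uses stand-in arrays (in practice these would come from the two ktc.kneron_inference() calls; the (1, 13, 13, 255) shape is the COCO-sized Tiny-YOLO head and is only an assumption, since a re-trained model's channel count depends on the class count):

    ```python
    import numpy as np

    np.random.seed(0)

    # Stand-ins for the two result lists; in practice these come from
    # running ktc.kneron_inference() with the ONNX model and the NEF model.
    onnx_outputs = [np.random.rand(1, 13, 13, 255).astype(np.float32)]
    nef_outputs = [o + np.random.normal(0, 1e-3, o.shape).astype(np.float32)
                   for o in onnx_outputs]

    # Quantization makes the NEF output slightly different, so compare with
    # a tolerance rather than exact equality.
    for o, n in zip(onnx_outputs, nef_outputs):
        assert o.shape == n.shape, "output shapes must match"
        max_err = float(np.abs(o - n).max())
        print(f"max abs difference: {max_err:.4f}")
    ```

    A small maximum difference with no detections on the board points at the board-side pre/postprocessing; a large difference points at the quantization/conversion step itself.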
