Inference code for testing the Docker toolchain example

Hi, I am able to write inference code for the LittleNet model using ktc.kneron_inference and test it on my Docker container, but now I want to write inference code for the KL720.

Can you please guide me on writing inference code for the same model given in the Docker toolchain?

Do I need to use kdp_wrapper to write inference code for the KL720?

Is it possible to write the inference code with the ktc APIs?

Can you please answer?
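
For context, E2E simulator inference on a compiled NEF inside the toolchain Docker looks roughly like the sketch below. The image path, the preprocessing, and the "data_out" input name are placeholders taken from the LittleNet example, and the keyword arguments are assumptions based on the toolchain's ktc.kneron_inference documentation, so verify them against your toolchain version:

    import ktc
    import numpy as np
    from PIL import Image

    # Placeholder preprocessing: resize to LittleNet's 112x112 input and normalize.
    image = Image.open("/workspace/examples/LittleNet/pytorch_imgs/Abdullah_0011.png").convert("RGB")
    img_data = np.array(image.resize((112, 112), Image.BILINEAR), dtype=np.float32) / 256.0 - 0.5
    img_data = np.expand_dims(img_data, axis=0)

    # Run the compiled NEF through the E2E simulator; the input name is model-specific.
    results = ktc.kneron_inference([img_data],
                                   nef_file="/data1/batch_compile/models_720.nef",
                                   input_names=["data_out"],
                                   platform=720)
    print(results)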

Comments

  • The easy way to test: use the KL720-yolo_public_example from https://github.com/kneron/host_lib.

    I have attached the files as a zip file.

    Abdullah_0001_112_112.bin is the input image in RGB565 format (a conversion sketch follows the console log below).


    D:\Kneron\host_lib\host_lib-master\python>py main.py -t KL720-yolo_public_example

    adding devices....

    start kdp host lib....

    Task: KL720-yolo_public_example

    [array([[[[-13.31356977,  6.86266483]]]])]

    de init kdp host lib....
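
    If you need to generate such a .bin yourself, packing an RGB image into RGB565 is straightforward. A minimal sketch (the source file name, the little-endian byte order, and the 112x112 size inferred from the file name are assumptions):

    import numpy as np
    from PIL import Image

    # Load any RGB image and resize to the 112x112 the example expects.
    img = np.array(Image.open("face.png").convert("RGB").resize((112, 112)), dtype=np.uint16)

    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    # Pack 8-8-8 RGB into 5-6-5: keep the top 5/6/5 bits of each channel.
    rgb565 = ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)

    # Two bytes per pixel, little-endian assumed.
    rgb565.astype("<u2").tofile("Abdullah_0001_112_112.bin")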


  • I was able to test successfully with your models_720.nef file, but when I replaced it with my model (/data1/batch_compile/models_720.nef), which I compiled in the toolchain, I got the results below


    will@will:~/New_project/kl720/host_lib-master/python$ python3 main.py -t KL720-littlenet_public_example

    adding devices....


    start kdp host lib....


    Task: KL720-littlenet_public_example

    []

    de init kdp host lib....




    When I run the inference code for the same model in the toolchain, I get the output below


    (base) root@edf75761b2af://workspace/william# python kl720_image_inference.py 

    /workspace/miniconda/lib/python3.7/site-packages/numpy/__init__.py:156: UserWarning: mkl-service package failed to import, therefore Intel(R) MKL initialization ensuring its correct out-of-the box operation under condition when Gnu OpenMP had already been loaded by Python process is not assured. Please install mkl-service package, see http://github.com/IntelPython/mkl-service

     from . import _distributor_init

    Using TensorFlow backend.

    Section 3 E2E simulator result:

    [array([[[[-15.372369,  8.235198]]]])]




    Please let me know what the issue is and how to fix it.

  • Could you upload your LittleNet models_720.nef?

  • This is the model I created.

  • My model_id setting is 32768.

    km = ktc.ModelConfig(32768, "0001", "720", onnx_path="/workspace/examples/LittleNet/LittleNet.onnx")

    Your model_id is 1001.

    So you need to change the model_id setting to 1001 in yolo_public_example.py (see the matching pair sketched after the snippet below):

    # Model ID is the same one generated with batch compile (1001 in this case)

    #MODEL_ID = constants.ModelType.CUSTOMER.value

    MODEL_ID = 1001
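
    In other words, the first argument to ktc.ModelConfig at batch-compile time is the ID baked into models_720.nef, and the host-side MODEL_ID has to match it, otherwise the firmware cannot find the model in the NEF and you get an empty [] result. The matching pair for this thread (sketch only):

    # Toolchain side (Docker): model ID 1001 is compiled into the NEF.
    km = ktc.ModelConfig(1001, "0001", "720",
                         onnx_path="/workspace/examples/LittleNet/LittleNet.onnx")

    # Host side (host_lib, yolo_public_example.py): must use the same ID.
    MODEL_ID = 1001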

  • It's working now, thanks!


    One thing I noticed is that when running on the KL720 I get the output [array([[[[-13.31356977,  6.86266483]]]])]


    and while running the inference code in the Docker toolchain the output is [array([[[[-15.372369,  8.235198]]]])].

    Can you tell me why there is a difference in values for the same model?

  • In the KL720-yolo_public_example, the input image uses the RGB565 color format (2 bytes per pixel).

    The input data is similar to the original picture but not the same, so this example will produce a slightly different output.
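
    To see the loss concretely, here is a single-pixel round trip through RGB565 (the bit-replication expansion back to 8 bits is one common convention; the firmware's exact conversion may differ):

    # 8-bit RGB -> RGB565 -> back to 8 bits: a few low bits shift per channel.
    def rgb565_roundtrip(r, g, b):
        r5, g6, b5 = r >> 3, g >> 2, b >> 3          # keep top 5/6/5 bits
        return ((r5 << 3) | (r5 >> 2),               # expand by replicating high bits
                (g6 << 2) | (g6 >> 4),
                (b5 << 3) | (b5 >> 2))

    print(rgb565_roundtrip(200, 100, 50))  # (206, 101, 49): close, but not identical

    Those small per-pixel differences propagate through the network, which is why the KL720 and toolchain outputs are close but not equal.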
