Facing an issue while testing Python inference code for YOLOv8-face
I have successfully converted YOLOv8-face into a .nef model, but while testing the inference code for the YOLOv8-face .nef model I am getting the error below.
(kneron) C:\Users\Mantra\Desktop\KL630\kneron_plus\python\example>python3 infer_testY8n.py
[Connect Device]
- Success
[Set Device Timeout]
- Success
[Upload Firmware]
- Success
[Upload Model]
- Success
[Read Image]
- Success
[Starting Inference Work]
- Starting inference loop 10 times
- - Error: inference failed, error = Error raised in function: generic_image_inference_receive. Error code: 103. Description: ApiReturnCode.KP_FW_INFERENCE_TIMEOUT_103
Can you please help me resolve this issue?
I have attached the .nef model and the inference code in a .txt file.
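For reference, this is roughly what the generic image inference flow behind the log above looks like, based on the generic inference examples bundled with Kneron PLUS 2.x. It is a minimal sketch, not the attached script: the firmware, model, and image paths are placeholders, and the class and enum names are assumed from the Python examples, so they may differ between SDK versions.

import kp
import cv2

# Assumed paths - adjust to your own environment.
SCPU_FW_PATH = 'res/firmware/KL630/kp_firmware.tar'
MODEL_FILE_PATH = 'res/models/KL630/yolov8n-face/models_630.nef'
IMAGE_FILE_PATH = 'res/images/face.jpg'

# [Connect Device] / [Set Device Timeout]
device_group = kp.core.connect_devices(usb_port_ids=[0])
kp.core.set_timeout(device_group=device_group, milliseconds=5000)

# [Upload Firmware] / [Upload Model]
kp.core.load_firmware_from_file(device_group=device_group,
                                scpu_fw_path=SCPU_FW_PATH,
                                ncpu_fw_path='')
model_nef_descriptor = kp.core.load_model_from_file(device_group=device_group,
                                                    file_path=MODEL_FILE_PATH)

# [Read Image] - the generic inference examples feed BGR565 images
img = cv2.imread(IMAGE_FILE_PATH)
img_bgr565 = cv2.cvtColor(src=img, code=cv2.COLOR_BGR2BGR565)

# [Starting Inference Work] - the receive call is where
# KP_FW_INFERENCE_TIMEOUT_103 is raised when the firmware never answers.
# Enum names below follow the Kneron PLUS examples and may vary by version.
inference_descriptor = kp.GenericImageInferenceDescriptor(
    model_id=model_nef_descriptor.models[0].id,
    inference_number=0,
    input_node_image_list=[
        kp.GenericInputNodeImage(
            image=img_bgr565,
            resize_mode=kp.ResizeMode.KP_RESIZE_ENABLE,
            padding_mode=kp.PaddingMode.KP_PADDING_CORNER,
            normalize_mode=kp.NormalizeMode.KP_NORMALIZE_KNERON)])

for i in range(10):
    kp.inference.generic_image_inference_send(
        device_group=device_group,
        generic_inference_input_descriptor=inference_descriptor)
    result = kp.inference.generic_image_inference_receive(device_group=device_group)
    print('[{}] inference OK'.format(i))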
The discussion has been closed due to inactivity. To continue with the topic, please feel free to post a new discussion.
Comments
This question was answered by the customer via email. The main reason for the error is that the model contains CPU nodes, and the KL630 does not support CPU nodes.
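Some background on the CPU-node problem: when the toolchain compiles an ONNX model, operators that the NPU cannot execute are emitted as CPU nodes, and the post-processing operators appended by the YOLOv8 detection head are common candidates for this fallback. A quick way to see which operators the exported model actually contains is to inspect the ONNX graph before conversion. The sketch below uses the onnx package; the model path is an assumption.

from collections import Counter
import onnx

# Assumed path to the ONNX model that was fed to the Kneron toolchain.
ONNX_PATH = 'yolov8n-face.onnx'

model = onnx.load(ONNX_PATH)

# Count every operator type in the graph; the ops appended after the
# convolutional backbone (detection decode) are the usual suspects for
# CPU-node fallback.
op_counts = Counter(node.op_type for node in model.graph.node)
for op_type, count in sorted(op_counts.items()):
    print(f'{op_type:20s} {count}')

# Cross-check this list against the toolchain's conversion log, which
# reports the nodes that were compiled as CPU nodes.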
I am facing the same issue now. How can I remove or replace the CPU nodes so that I can deploy the model on the KL630 device?
Could you please tell me as soon as possible?
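One common approach (not confirmed by this thread) is to cut the ONNX graph before the detection-decode part of the head, so that only NPU-friendly convolution layers are compiled into the .nef, and then to reimplement the box decoding and NMS on the host after generic inference returns the raw feature maps. Below is a minimal sketch using onnx.utils.extract_model; the output tensor names are hypothetical and must be replaced with the outputs of the last supported layers in your own graph (found, for example, with Netron).

import onnx
import onnx.utils

SRC = 'yolov8n-face.onnx'           # assumed: original export
DST = 'yolov8n-face_backbone.onnx'  # assumed: truncated model to convert to .nef

# Hypothetical tensor names: replace with the outputs of the last
# NPU-supported layers in your graph.
input_names = ['images']
output_names = ['/model.22/cv2.0/cv2.0.2/Conv_output_0',
                '/model.22/cv3.0/cv3.0.2/Conv_output_0']

# Extract the sub-graph between the given inputs and outputs; everything
# after the new outputs (the detection decode) is dropped and must be
# re-implemented in host-side post-processing.
onnx.utils.extract_model(SRC, DST, input_names, output_names)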
Hi,
For now, the KL630 is only available through sales channels and is not publicly available yet. Please contact Nick Wang (nick.wang@kneron.us) and FanChiang (shihchun.fanchiang@kneron.us) for technical assistance.
Update (April 23): Our team members are working on this issue via email.