Tflite to Onnx
Is there a cap on the size of the TensorFlow Lite model we are using? Also, is there a limit on the input size of a TFLite model (a larger input could lead to a larger model)?
Thanks
The discussion has been closed due to inactivity. To continue with the topic, please feel free to post a new discussion.
Comments
There is no specific size limit on the model itself, but to run model inference on a Kneron device, the model must fit within the chip's memory.
All of the following consume memory:
model data (NEF), input image, inference output data, NPU working buffer, and FIFO queue.
Together, these must fit within 32MB of memory on the KL520, or 70MB on the KL720.
In general, the NEF model is about 1/4 the size of the ONNX model.
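A rough back-of-the-envelope check based on the numbers above might look like the sketch below. The function names and all buffer sizes are illustrative assumptions, not actual Kneron tooling; the only figures taken from the answer are the 32MB/70MB budgets and the ~1/4 NEF-to-ONNX size ratio.

```python
# Hypothetical sanity check: estimate whether a model plus its runtime
# buffers fit the on-chip memory budgets described above.

DEVICE_MEMORY_MB = {"KL520": 32, "KL720": 70}

def estimated_nef_size_mb(onnx_size_mb: float) -> float:
    # Rule of thumb from the answer: NEF is roughly 1/4 of the ONNX size.
    return onnx_size_mb / 4

def fits_on_device(device: str, onnx_size_mb: float, input_mb: float,
                   output_mb: float, working_buffer_mb: float,
                   fifo_mb: float) -> bool:
    # Sum every memory consumer listed in the answer: model data (NEF),
    # input image, inference output, NPU working buffer, and FIFO queue.
    total = (estimated_nef_size_mb(onnx_size_mb) + input_mb + output_mb
             + working_buffer_mb + fifo_mb)
    return total <= DEVICE_MEMORY_MB[device]

# Example: a 40 MB ONNX model (~10 MB as NEF) with assumed buffer sizes.
print(fits_on_device("KL520", 40, 1.2, 0.5, 8, 4))   # ~23.7 MB <= 32 MB
print(fits_on_device("KL520", 200, 1.2, 0.5, 8, 4))  # ~63.7 MB > 32 MB
```

So a larger input does shrink the budget left for the model, since the input image and the NPU working buffer share the same memory pool.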