TensorFlow Lite Model Maker model works with the Python API but not on device (iOS/Android)
Hello, I hope all is well. I have recently created a MobileBERT model using the Python API of the tflite_model_maker library, following the steps described on this page. You can find the model at the link below (let me know if I should share it another way).
The good news is that I have been able to run inference with the Python API by following the steps described in this article. However, my colleague has been running into issues when trying to run the same model through the Swift and Android APIs after following this page.
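For reference, the Python-side inference goes through `tf.lite.Interpreter`. Below is a minimal, self-contained sketch of that flow; since the MobileBERT file itself can't be inlined here, it first converts a trivial stand-in function to a TFLite buffer (the `double` function, shapes, and dtypes are placeholders, not our actual model):

```python
import numpy as np
import tensorflow as tf

# Stand-in model: a trivial function converted to TFLite.
# (Placeholder for the Model Maker MobileBERT export.)
@tf.function(input_signature=[tf.TensorSpec([1, 4], tf.float32)])
def double(x):
    return x * 2.0

tflite_bytes = tf.lite.TFLiteConverter.from_concrete_functions(
    [double.get_concrete_function()]
).convert()

# Same interpreter flow as with the real model
# (there you would pass model_path="model.tflite" instead).
interpreter = tf.lite.Interpreter(model_content=tflite_bytes)
interpreter.allocate_tensors()

inp = interpreter.get_input_details()[0]
print("input shape:", inp["shape"], "dtype:", inp["dtype"])

interpreter.set_tensor(inp["index"], np.ones(inp["shape"], dtype=np.float32))
interpreter.invoke()
out = interpreter.get_tensor(interpreter.get_output_details()[0]["index"])
print(out)  # ones doubled -> all 2.0
```

With the real exported model, the same `get_input_details()` call is a quick way to confirm the exact shapes and dtypes the app must feed on device.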
After some time, he was able to run inference without trouble on a different model made available through one of the sample apps found here. We believe the issue may lie in the metadata, but after inspecting both models' metadata (attached: MetaDatas.zip), the two appear to be exactly the same.
Our issue is that the inference process is killed without a clear reason after the model is loaded, as shown in the image below:
With the following error:
tensor->bytes == bytes FATAL
Please advise, and thank you for your time.
Answer (carl-krikorian):
Hello @pkgoogle, the model was attached in the first comment here: https://drive.google.com/file/d/1jeKm7EesBZqi_lgPrCSq_HwPapX54OlL/view?usp=sharing. I would have zipped and sent it, but the last time I did that it wiped the entire metadata. Here is the training script I used: tflite_maker.zip