Describe the bug
I’m interested in using the Felladrin/Llama-160M-Chat-v1 model with react-native-transformers, which works best with ONNX models — ideally quantized to INT8 for better performance.
I used the following script to convert Felladrin/Llama-160M-Chat-v1 to ONNX:
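(The original conversion script was not captured in this report. A minimal sketch of an equivalent export, assuming Hugging Face Optimum with the onnxruntime extras is installed; the output paths are illustrative:)

```shell
# Export the model to ONNX with Optimum's CLI
optimum-cli export onnx --model Felladrin/Llama-160M-Chat-v1 llama-160m-onnx/

# Dynamic INT8 quantization of the exported model
# (--arm64 picks a quantization config suited to mobile targets)
optimum-cli onnxruntime quantize --onnx_model llama-160m-onnx/ --arm64 -o llama-160m-onnx-int8/
```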
After the conversion, the model works with ONNX/Optimum in a Python notebook. However, when I tried loading the quantized model with react-native-transformers, I encountered the following error:
x.split is not a function (it is undefined)
This error comes from the tokenizer.js file:
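(An error like this usually means tokenizer.js expected a string field in the tokenizer config and received `undefined` instead — for example, because the ONNX conversion produced a tokenizer.json whose shape differs from what the loader expects. A minimal sketch of the failure mode; the function and field names below are hypothetical, not the library's actual code:)

```javascript
// Hypothetical helper mimicking a tokenizer.js step that splits a
// config string. Names are illustrative only.
function tokenizeSpecial(text) {
  return text.split(" ");
}

// Simulated tokenizer configs: one with the expected string field,
// one where the conversion left the field out entirely.
const okConfig = { bos_token: "<s>" };
const brokenConfig = {}; // field missing after conversion

function describe(config) {
  const x = config.bos_token;
  if (typeof x !== "string") {
    // Without this guard, x.split(...) throws
    // "x.split is not a function (it is undefined)"
    return "missing";
  }
  return tokenizeSpecial(x).length;
}

console.log(describe(okConfig));     // 1
console.log(describe(brokenConfig)); // "missing"
```

Comparing the tokenizer.json produced by the conversion against the original model's tokenizer.json may show which field ends up undefined.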
My goal is to fine-tune this model on my own dataset, convert it to ONNX, and run it with react-native-transformers, but I’m blocked by this issue. The same model works perfectly in a Python notebook via ONNX/Optimum.
Could you help clarify what might be going wrong? Am I missing a step?
Thanks!
To Reproduce
Steps to reproduce the behavior:
Load the model.
Expected behavior
The model loads as expected.
Screenshots
Smartphone (please complete the following information):
Device: iPhone 16 Pro emulator