
x.split is not a function (it is undefined) #9


Open
lakpriya1s opened this issue Mar 28, 2025 · 1 comment
Labels
bug Something isn't working

Comments


lakpriya1s commented Mar 28, 2025

Describe the bug
I’d like to use the Felladrin/Llama-160M-Chat-v1 model with react-native-transformers, which works best with ONNX models, ideally quantized to INT8 for better performance.

I used the following script to convert Felladrin/Llama-160M-Chat-v1 to ONNX:

!python3 /content/convert.py \
  --quantize \
  --task text-generation \
  --model_id Felladrin/Llama-160M-Chat-v1

After the conversion, the model works with ONNX via Optimum in a Python notebook, but when I tried loading the quantized model with react-native-transformers, I encountered the following error:

x.split is not a function (it is undefined)
This error comes from the tokenizer.js file:

this.bpe_ranks = new Map(config.merges.map((x, i) => [x, i]));
this.merges = config.merges.map(x => x.split(this.BPE_SPLIT_TOKEN));
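One possible cause, offered as an assumption rather than a confirmed diagnosis: newer Hugging Face tokenizers serialize the BPE merges in tokenizer.json as arrays of string pairs (e.g. ["t", "h"]) instead of the older space-joined strings (e.g. "t h"), so calling x.split on an entry fails because arrays have no split method. A minimal sketch of a normalization that accepts both shapes (normalizeMerges and BPE_SPLIT_TOKEN are illustrative names, not part of the library):

```javascript
// Hypothetical normalization for the two merges shapes found in
// tokenizer.json files: legacy space-joined strings ("t h") and
// newer string pairs (["t", "h"]).
const BPE_SPLIT_TOKEN = " ";

function normalizeMerges(merges) {
  // Leave pair entries as-is; split legacy string entries.
  return merges.map((x) => (Array.isArray(x) ? x : x.split(BPE_SPLIT_TOKEN)));
}

// Both input shapes normalize to the same list of pairs.
console.log(normalizeMerges(["t h", "th e"]));
console.log(normalizeMerges([["t", "h"], ["th", "e"]]));
```

If this is the cause, the quoted tokenizer.js lines would work once merges are normalized this way before being split.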

My goal is to fine-tune this model on my own dataset, convert it to ONNX, and run it with react-native-transformers, but I’m blocked on this issue, even though the same model works perfectly in a Python notebook via ONNX/Optimum.

Could you help clarify what might be going wrong? Am I missing a step?

Thanks!

To Reproduce
Steps to reproduce the behavior:
Load the model.

Expected behavior
The model loads as expected.

Screenshots
[screenshot of the error]

Smartphone (please complete the following information):

  • Device: iPhone 16 Pro (simulator)
lakpriya1s added the bug (Something isn't working) label on Mar 28, 2025

an-upfeat commented Apr 30, 2025

I got the same error when trying to load:

export const MODEL_REPO = "onnx-community/Qwen3-0.6B-ONNX";
export const MODEL_FILE = "onnx/model_quantized.onnx";
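One quick diagnostic for any affected repo (a sketch under the assumption, not confirmed in this thread, that the root cause is the merges serialization format): check whether the model's tokenizer.json stores BPE merges as space-joined strings or as string pairs. merges_format below is a hypothetical helper that only relies on the standard Hugging Face tokenizer.json layout ("model" → "merges"):

```python
import json

def merges_format(tokenizer_json_path):
    """Report how tokenizer.json stores BPE merges: 'strings' for
    the legacy space-joined form ("t h"), 'pairs' for the newer
    list form (["t", "h"])."""
    with open(tokenizer_json_path) as f:
        merges = json.load(f)["model"]["merges"]
    return "pairs" if isinstance(merges[0], list) else "strings"
```

A tokenizer whose merges come back as "pairs" would trip the x.split call quoted earlier if the loading library expects strings.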


2 participants