[Bug]: Cannot register new model #16228

Closed
amogkam opened this issue Apr 8, 2025 · 7 comments
Labels
bug Something isn't working

Comments


amogkam commented Apr 8, 2025

Your current environment

The output of `python collect_env.py`
vLLM Version: 0.8.1

🐛 Describe the bug

I am trying to register a new model architecture that is currently not supported in vLLM.

from transformers import AutoConfig
from vllm import LLM, ModelRegistry

# Register the custom HF config and the vLLM model implementation
AutoConfig.register("model_name", MyModelConfig, exist_ok=True)
ModelRegistry.register_model("MyModelForCausalLM", MyModelForCausalLM)

llm = LLM(model=model_path)
llm.generate("Hello")

But this fails with

ValueError: MyModelForCausalLM has no vLLM implementation and the Transformers implementation is not compatible with vLLM. Try setting VLLM_USE_V1=0.

It seems the issue is that when I register my model architecture on the driver side, the ModelRegistry dict gets updated. But when the model is loaded on the worker side, if I print out ModelRegistry inside vllm.model_executor.model_loader.utils.get_model_architecture, the registry does not show the new model architecture that I added.

I believe the ModelRegistry on the driver side has to be serialized and explicitly passed to the workers?

Before submitting a new issue...

  • Make sure you already searched for relevant issues, and asked the chatbot living at the bottom right corner of the documentation page, which can answer lots of frequently asked questions.
amogkam added the bug (Something isn't working) label Apr 8, 2025
amogkam closed this as completed Apr 8, 2025
amogkam reopened this Apr 8, 2025

amogkam commented Apr 8, 2025

This is only an issue with the v1 engine. Everything works fine with the v0 engine.
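
(As a stopgap, the error message's suggestion of VLLM_USE_V1=0 can be applied in code; a minimal sketch, assuming the model_path from the snippet above:)

import os

# Fall back to the v0 engine; this must be set before vLLM constructs its engine.
os.environ["VLLM_USE_V1"] = "0"

from vllm import LLM

llm = LLM(model=model_path)  # model_path as in the original snippet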

@DarkLight1337
Member

Are you registering the model using the plugin interface as shown here?


amogkam commented Apr 8, 2025

I followed the instructions here: https://docs.vllm.ai/en/stable/contributing/model/registration.html#out-of-tree-models. Do I also need to register it as a plugin?


amogkam commented Apr 8, 2025

OK, this worked once I added it as an entrypoint plugin. Thanks!
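
For reference, a minimal sketch of the entrypoint-plugin approach from the vLLM plugin system docs; the package, module, and registration-function names below are hypothetical:

# vllm_add_my_model/__init__.py -- hypothetical plugin package
def register():
    from vllm import ModelRegistry

    # Registering by "module:class" string keeps the import lazy, so the
    # entry point runs cheaply in every process (driver and workers).
    if "MyModelForCausalLM" not in ModelRegistry.get_supported_archs():
        ModelRegistry.register_model(
            "MyModelForCausalLM",
            "vllm_add_my_model.my_model:MyModelForCausalLM",
        )

# setup.py -- exposes register() under vLLM's plugin entry point group
from setuptools import setup

setup(
    name="vllm_add_my_model",
    version="0.1",
    packages=["vllm_add_my_model"],
    entry_points={
        "vllm.general_plugins": [
            "register_my_model = vllm_add_my_model:register",
        ]
    },
)

Once the package is installed (e.g. pip install -e .), vLLM discovers and runs register() in each process, including the v1 engine's worker processes, so ModelRegistry is populated everywhere.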

amogkam closed this as completed Apr 8, 2025
@xxchauncey

Hi, I registered my model as an entrypoint plugin, and when loading the LLM I can see in the logs that it loaded successfully. But I still get the error. Do you have any ideas? Thanks.

@DarkLight1337
Member

Can you provide more details?

@xxchauncey

Sorry to bother you; this turned out to be due to an error in my own entrypoint plugin.
