
Local apps: add MLX LM #1443


Status: Draft (wants to merge 1 commit into base: main)
44 changes: 44 additions & 0 deletions packages/tasks/src/local-apps.ts
@@ -262,6 +262,43 @@ const snippetTgi = (model: ModelData): LocalAppSnippet[] => {
];
};

const snippetMlxLm = (model: ModelData): LocalAppSnippet[] => {
const openaiCurl = [
"# Calling the OpenAI-compatible server with curl",
`curl -X POST "http://localhost:8000/v1/chat/completions" \\`,
` -H "Content-Type: application/json" \\`,
` --data '{`,
` "model": "${model.id}",`,
` "messages": [`,
` {"role": "user", "content": "Hello"}`,
` ]`,
` }'`,
];

return [
{
title: "Generate or start a chat session",
setup: ["# Install MLX LM", "pip install mlx-lm"].join("\n"),
Review comment (Member):
or use uvx and drop the pip install?
or even better, uv tool install mlx-lm
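A minimal sketch of the suggested alternative (the uv-based install is the reviewer's suggestion, not part of this diff):

setup: ["# Install MLX LM", "uv tool install mlx-lm"].join("\n"),

With uvx instead, the setup step could be dropped entirely and each command prefixed, e.g. uvx --from mlx-lm mlx_lm.generate ... (assuming the mlx-lm package exposes these entry points, which it does on PyPI).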

content: [
"# One-shot generation",
`mlx_lm.generate --model "${model.id}" --prompt "Hello"`,
Review comment (Member): no need for this one in case the model is conversational, IMO
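A sketch of what that could look like, assuming the intent is to show only the chat REPL for conversational models and one-shot generation otherwise (hypothetical, not the merged code):

content: (model.tags.includes("conversational")
	? ["# Interactive chat REPL", `mlx_lm.chat --model "${model.id}"`]
	: ["# One-shot generation", `mlx_lm.generate --model "${model.id}" --prompt "Hello"`]
).join("\n"),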

...(model.tags.includes("conversational")
? ["# Interactive chat REPL", `mlx_lm.chat --model "${model.id}"`]
: []),
].join("\n"),
},
...(model.tags.includes("conversational")
? [
{
title: "Run an OpenAI-compatible server",
setup: ["# Install MLX LM", "pip install mlx-lm"].join("\n"),
content: ["# Start the server", `mlx_lm.server --model "${model.id}"`, ...openaiCurl].join("\n"),
},
]
: []),
];
};
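For reference, for a conversational model the first snippet above renders roughly as the following shell commands, with the model's id substituted for the <model-id> placeholder:

# Install MLX LM
pip install mlx-lm

# One-shot generation
mlx_lm.generate --model "<model-id>" --prompt "Hello"
# Interactive chat REPL
mlx_lm.chat --model "<model-id>"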

/**
* Add your new local app here.
*
@@ -302,6 +339,13 @@ export const LOCAL_APPS = {
(model.pipeline_tag === "text-generation" || model.pipeline_tag === "image-text-to-text"),
snippet: snippetVllm,
},
"mlx-lm": {
prettyLabel: "MLX LM",
docsUrl: "https://github.com/ml-explore/mlx-lm",
mainTask: "text-generation",
displayOnModelPage: isMlxModel,
Review comment (Member): i would only display it on actually supported models (so check the pipeline_tag)

Review comment (Member): (note, mainTask does not check the model's task on a model page, it's only used for the settings page)
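A sketch of the pipeline_tag check being suggested, mirroring the vllm entry above (hypothetical, not the merged code):

displayOnModelPage: (model: ModelData) => isMlxModel(model) && model.pipeline_tag === "text-generation",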

snippet: snippetMlxLm,
},
tgi: {
prettyLabel: "TGI",
docsUrl: "https://huggingface.co/docs/text-generation-inference/",