
docs: Update remote-vllm.md with AMD GPU vLLM server supported. #1858


Merged — 3 commits merged into meta-llama:main on Apr 9, 2025

Conversation

@AlexHe99 (Contributor) commented Apr 2, 2025

Add content for using an AMD GPU as the vLLM server. Split the original section into two sub-chapters (see the launch sketch after this list):

  1. AMD vLLM server
  2. NVIDIA vLLM server (original)
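
For context, here is a minimal sketch of what launching an AMD vLLM server can look like under ROCm. The image tag rocm/vllm-dev:main, the model name, and the container flags are illustrative assumptions, not necessarily the exact commands this PR adds to the docs:

```bash
# A minimal sketch: serve a Llama model with vLLM on an AMD GPU via ROCm.
# Image tag, model name, and flags are assumptions; see the updated docs for exact values.
docker run -it --rm \
  --ipc=host --shm-size 16g \
  --device=/dev/kfd \
  --device=/dev/dri \
  --group-add video \
  -p 8000:8000 \
  rocm/vllm-dev:main \
  vllm serve meta-llama/Llama-3.2-3B-Instruct --port 8000
```

/dev/kfd and /dev/dri are the AMD kernel driver and render-node devices a ROCm container needs; the rest presumably mirrors the NVIDIA sub-chapter, with only the GPU passthrough flags swapped.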

What does this PR do?

[Provide a short summary of what this PR does and why. Link to relevant issues if applicable.]

Test Plan

[Describe the tests you ran to verify your changes with result summaries. Provide clear instructions so the plan can be easily re-executed.]
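
The Test Plan above was left as template boilerplate. For a docs change like this, a plausible smoke test is simply to confirm that a server launched per the new instructions responds; a sketch, assuming the server from the snippet above is listening on localhost:8000:

```bash
# vLLM exposes an OpenAI-compatible API; listing models confirms the server is up.
curl http://localhost:8000/v1/models

# One-shot completion against the assumed model name.
curl http://localhost:8000/v1/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "meta-llama/Llama-3.2-3B-Instruct", "prompt": "Hello", "max_tokens": 16}'
```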

Add content for using an AMD GPU as the vLLM server.
Split the original section into two sub-chapters:
1. AMD vLLM server
2. NVIDIA vLLM server (original)
@facebook-github-bot

Hi @AlexHe99!

Thank you for your pull request and welcome to our community.

Action Required

In order to merge any pull request (code, docs, etc.), we require contributors to sign our Contributor License Agreement, and we don't seem to have one on file for you.

Process

In order for us to review and merge your suggested changes, please sign at https://code.facebook.com/cla. If you are contributing on behalf of someone else (e.g. your employer), the individual CLA may not be sufficient and your employer may need to sign the corporate CLA.

Once the CLA is signed, our tooling will perform checks and validations. Afterwards, the pull request will be tagged with CLA signed. The tagging process may take up to 1 hour after signing. Please give it that time before contacting us about it.

If you have received this in error or have any questions, please contact us at [email protected]. Thanks!

@facebook-github-bot

Thank you for signing our Contributor License Agreement. We can now accept your code for this (and any) Meta Open Source project. Thanks!

@facebook-github-bot added the CLA Signed label (managed by the Meta Open Source bot) on Apr 2, 2025
@leseb (Collaborator) commented Apr 7, 2025

@AlexHe99 see pre-commit errors. Thanks!

@AlexHe99 (Contributor, Author) commented Apr 8, 2025

> @AlexHe99 see pre-commit errors. Thanks!

Hi @leseb, thanks for the reminder.

It seems that pre-commit prevents direct changes to docs/source/distributions/self_hosted_distro/remote-vllm.md.

I have now moved the changes to llama_stack/templates/remote-vllm/doc_template.md, which is the template; pre-commit auto-syncs them into docs/source/distributions/self_hosted_distro/remote-vllm.md.

Thanks,
Alex
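
To make the workflow concrete: the generated doc is never edited directly; the template is the source of truth, and (per the comment above) a pre-commit hook regenerates the doc from it. A sketch of the loop, assuming the repository's pre-commit hooks are installed locally:

```bash
# Edit the template, not the generated doc.
$EDITOR llama_stack/templates/remote-vllm/doc_template.md

# Run the repo's pre-commit hooks; per the comment above, one of them
# regenerates docs/source/distributions/self_hosted_distro/remote-vllm.md.
pre-commit run --all-files

# Commit the template and the regenerated file together (signed off, as in this PR).
git add llama_stack/templates/remote-vllm/doc_template.md \
        docs/source/distributions/self_hosted_distro/remote-vllm.md
git commit -s -m "docs: update remote-vllm via template"
```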

@AlexHe99 (Contributor, Author) commented Apr 9, 2025

The pre-commit check errors have been fixed now. Please review the updated commit.

@AlexHe99 closed this Apr 9, 2025
@AlexHe99 reopened this Apr 9, 2025
@raghotham merged commit 983f6fe into meta-llama:main Apr 9, 2025
11 checks passed
Move the contents from remote-vllm.md to doc_template.md.

Signed-off-by: Alex He <[email protected]>