
Commit 49955a0

docs: Update quickstart page to structure things a little more for the novices (#1873)
# What does this PR do?

Another doc enhancement for #1818. Summary of changes:

- `docs/source/distributions/configuration.md`
  - Updated the dropdown title to include a more user-friendly description.
- `docs/_static/css/my_theme.css`
  - Added styling for `<h3>` elements to set a normal font weight.
- `docs/source/distributions/starting_llama_stack_server.md`
  - Changed section headers from bold text to proper markdown headers (e.g., `##`).
  - Improved descriptions for starting the Llama Stack server using different methods (library, container, conda, Kubernetes).
  - Enhanced clarity and structure by converting instructions into markdown headers and improving formatting.
- `docs/source/getting_started/index.md`
  - Major restructuring of the "Quick Start" guide:
    - Added a new introductory section for Llama Stack and its capabilities.
    - Reorganized steps into clearer subsections with proper markdown headers.
    - Replaced dropdowns with tabbed content for OS-specific instructions.
    - Added detailed steps for setting up and running the Llama Stack server and client.
    - Introduced new sections for running basic inference and building agents.
    - Enhanced readability and visual structure with emojis, admonitions, and examples.
- `docs/source/providers/index.md`
  - Updated the list of LLM inference providers to include "Ollama."
  - Expanded the list of vector databases to include "SQLite-Vec."

Let me know if you need further details!

## Test Plan

Renders locally; screenshot included.

# Documentation

For #1818

<img width="1332" alt="Screenshot 2025-04-09 at 11 07 12 AM" src="https://github.com/user-attachments/assets/c106efb9-076c-4059-a4e0-a30fa738585b" />

---

Signed-off-by: Francisco Javier Arceo <[email protected]>
1 parent edd9aaa commit 49955a0

File tree: 7 files changed (+633, −429 lines)

Diff for: docs/_static/css/my_theme.css (+3)

```diff
@@ -17,6 +17,9 @@
   display: none;
 }
 
+h3 {
+  font-weight: normal;
+}
 html[data-theme="dark"] .rst-content div[class^="highlight"] {
   background-color: #0b0b0b;
 }
```

Diff for: docs/source/distributions/configuration.md (+1, −1)

````diff
@@ -2,7 +2,7 @@
 
 The Llama Stack runtime configuration is specified as a YAML file. Here is a simplified version of an example configuration file for the Ollama distribution:
 
-```{dropdown} Sample Configuration File
+```{dropdown} 👋 Click here for a Sample Configuration File
 
 ```yaml
 version: 2
````
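For context on the file this dropdown wraps: a Llama Stack runtime configuration is a standard `run.yaml`. The sketch below shows the general shape of such a file for an Ollama-backed distribution; the provider entries and URL are illustrative assumptions, not content from this commit.

```yaml
# Hypothetical, simplified run.yaml sketch for an Ollama-backed distribution.
# Provider ids/types and the URL below are illustrative assumptions.
version: 2
apis:
  - inference
providers:
  inference:
    - provider_id: ollama
      provider_type: remote::ollama
      config:
        url: http://localhost:11434  # default Ollama endpoint
```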

Diff for: docs/source/distributions/starting_llama_stack_server.md (+4, −4)

```diff
@@ -2,22 +2,22 @@
 
 You can run a Llama Stack server in one of the following ways:
 
-**As a Library**:
+## As a Library:
 
 This is the simplest way to get started. Using Llama Stack as a library means you do not need to start a server. This is especially useful when you are not running inference locally and relying on an external inference service (eg. fireworks, together, groq, etc.) See [Using Llama Stack as a Library](importing_as_library)
 
 
-**Container**:
+## Container:
 
 Another simple way to start interacting with Llama Stack is to just spin up a container (via Docker or Podman) which is pre-built with all the providers you need. We provide a number of pre-built images so you can start a Llama Stack server instantly. You can also build your own custom container. Which distribution to choose depends on the hardware you have. See [Selection of a Distribution](selection) for more details.
 
 
-**Conda**:
+## Conda:
 
 If you have a custom or an advanced setup or you are developing on Llama Stack you can also build a custom Llama Stack server. Using `llama stack build` and `llama stack run` you can build/run a custom Llama Stack server containing the exact combination of providers you wish. We have also provided various templates to make getting started easier. See [Building a Custom Distribution](building_distro) for more details.
 
 
-**Kubernetes**:
+## Kubernetes:
 
 If you have built a container image and want to deploy it in a Kubernetes cluster instead of starting the Llama Stack server locally. See [Kubernetes Deployment Guide](kubernetes_deployment) for more details.
```
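To ground the Kubernetes option described in the diff above, here is a minimal, hypothetical Deployment manifest for running a pre-built Llama Stack container image in a cluster; the image name, labels, and port are assumptions for illustration, not taken from the linked deployment guide.

```yaml
# Hypothetical Kubernetes Deployment for a Llama Stack container.
# The image name and container port are illustrative assumptions.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: llama-stack-server
spec:
  replicas: 1
  selector:
    matchLabels:
      app: llama-stack
  template:
    metadata:
      labels:
        app: llama-stack
    spec:
      containers:
        - name: llama-stack
          image: llamastack/distribution-ollama:latest  # assumed image name
          ports:
            - containerPort: 8321  # assumed server port
```

Expose the Deployment with a Service (e.g. `kubectl expose deployment llama-stack-server --port 8321`) so clients can reach the server inside the cluster.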
2323
