This provides a web UI for an AI chat playground that can connect to virtually any LLM from any platform.
- .NET SDK 9
- Visual Studio Code + C# DevKit
- Docker Desktop or Podman
- PowerShell 7.4+
- Azure Developer CLI
- Azure CLI + Container Apps extension
- GitHub CLI
- Login to GitHub.

  ```bash
  gh auth login
  ```
- Check login status.

  ```bash
  gh auth status
  ```
- Fork this repository to your account and clone the forked repository to your local machine.

  ```bash
  gh repo fork aliencube/open-chat-playground --clone --default-branch-only
  ```
- Get the repository root.

  ```bash
  # bash/zsh
  REPOSITORY_ROOT=$(git rev-parse --show-toplevel)
  ```

  ```powershell
  # PowerShell
  $REPOSITORY_ROOT = git rev-parse --show-toplevel
  ```
- Run the app.

  ```bash
  dotnet run --project $REPOSITORY_ROOT/src/OpenChat.AppHost
  ```
- If you want to change the model, pass the following arguments:

  ```bash
  dotnet run --project $REPOSITORY_ROOT/src/OpenChat.AppHost -- [OPTIONS]
  ```
  Here is the list of options:

  - `--llm-provider`: Choose `openai`, `ollama` or `hface`. Default is `openai`.
  - `--openai-deployment`: Provide the deployment name, if you choose `openai` as the LLM provider. Default is `gpt-4o`.
  - `--openai-connection-string`: Provide the connection string to OpenAI. It must be provided if you choose `openai` as the LLM provider, and it must follow the format `Endpoint=xxxxx;Key=xxxxx`.
  - `--ollama-image-tag`: Provide the Ollama container version, if you choose either `ollama` or `hface` as the LLM provider. Default is `0.6.8`.
  - `--ollama-use-gpu`: Provide the value indicating whether to use GPU acceleration, if you choose either `ollama` or `hface` as the LLM provider. Default is `false`.
  - `--ollama-deployment`: Provide the deployment name, if you choose `ollama` as the LLM provider. Default is `llama`.
  - `--ollama-model`: Provide the model name, if you choose `ollama` as the LLM provider. Default is `llama3.2`.
  - `--huggingface-deployment`: Provide the deployment name, if you choose `hface` as the LLM provider. Default is `qwen3`.
  - `--huggingface-model`: Provide the model name, if you choose `hface` as the LLM provider. Default is `Qwen/Qwen3-14B-GGUF`.
  - `--help`: Display the help message.
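  For example, the options above can be combined to run the app against a local Ollama container instead of OpenAI. The values shown are the defaults listed above; substitute your own model name as needed.

  ```bash
  # Run the AppHost with Ollama as the LLM provider, using the default
  # llama3.2 model and enabling GPU acceleration for the container.
  dotnet run --project $REPOSITORY_ROOT/src/OpenChat.AppHost -- \
      --llm-provider ollama \
      --ollama-model llama3.2 \
      --ollama-use-gpu true
  ```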
- Get the repository root.

  ```bash
  # bash/zsh
  REPOSITORY_ROOT=$(git rev-parse --show-toplevel)
  ```

  ```powershell
  # PowerShell
  $REPOSITORY_ROOT = git rev-parse --show-toplevel
  ```
- Make sure you are at the repository root.

  ```bash
  cd $REPOSITORY_ROOT
  ```
- Login to Azure.

  ```bash
  # Login to Azure Dev CLI
  azd auth login

  # Login to Azure CLI
  az login
  ```
- Check login status.

  ```bash
  # Azure Dev CLI
  azd auth login --check-status

  # Azure CLI
  az account show
  ```
-
Update
appsettings.json
on theOpenChat.AppHost
project. The JSON object below shows the default values. -
Add OpenAI connection string, if you want to use OpenAI as the LLM provider.
  ```bash
  # bash/zsh
  dotnet user-secrets --project $REPOSITORY_ROOT/src/OpenChat.AppHost \
      set ConnectionStrings:openai "Endpoint={{API_ENDPOINT}};Key={{API_KEY}}"
  ```

  ```powershell
  # PowerShell
  dotnet user-secrets --project $REPOSITORY_ROOT/src/OpenChat.AppHost `
      set ConnectionStrings:openai "Endpoint={{API_ENDPOINT}};Key={{API_KEY}}"
  ```
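  To confirm the secret was stored correctly, you can list the user secrets for the project. `dotnet user-secrets list` is part of the standard .NET CLI secret manager; the endpoint and key shown in the output are whatever values you set above.

  ```bash
  # List all user secrets registered for the AppHost project;
  # the ConnectionStrings:openai entry should appear in the output.
  dotnet user-secrets --project $REPOSITORY_ROOT/src/OpenChat.AppHost list
  ```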
- Run the following command to provision and deploy the app.

  ```bash
  azd up
  ```

  NOTE: You will be asked to provide an Azure subscription and location for the deployment.
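Once you no longer need the deployment, the provisioned resources can be removed with the Azure Developer CLI. `azd down` is the standard teardown counterpart to `azd up`; the `--purge` flag additionally purges soft-deleted resources so they do not linger in your subscription.

```bash
# Delete all Azure resources provisioned by `azd up` for this environment.
azd down --purge
```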