# Open WebUI

1. Install [Docker](https://docs.docker.com/engine/install/).

2. Start the vLLM server with a supported chat completion model, e.g.

    ```bash
    vllm serve qwen/Qwen1.5-0.5B-Chat
    ```
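    To quickly confirm the server is up before wiring in Open WebUI, you can query its OpenAI-compatible API directly. A minimal check, assuming vLLM's default port of 8000:

    ```bash
    # List the models served by vLLM; the response should include
    # qwen/Qwen1.5-0.5B-Chat if the server started correctly.
    curl http://localhost:8000/v1/models
    ```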
3. Start the [Open WebUI](https://github.com/open-webui/open-webui) Docker container (replace `<vllm serve host>` and `<vllm serve port>` with your vLLM server's host and port):

    ```bash
    docker run -d -p 3000:8080 \
        --name open-webui \
        -v open-webui:/app/backend/data \
        -e OPENAI_API_BASE_URL=http://<vllm serve host>:<vllm serve port>/v1 \
        --restart always \
        ghcr.io/open-webui/open-webui:main
    ```
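    Note that from inside the container, `localhost` refers to the container itself, not to the machine running vLLM. If vLLM runs on the same host as Open WebUI, one option (a sketch, assuming your Docker version supports the `host-gateway` alias) is to point the container at `host.docker.internal`:

    ```bash
    # Map host.docker.internal to the host's gateway IP. This flag is
    # needed on Linux; Docker Desktop provides the mapping automatically.
    docker run -d -p 3000:8080 \
        --name open-webui \
        -v open-webui:/app/backend/data \
        --add-host=host.docker.internal:host-gateway \
        -e OPENAI_API_BASE_URL=http://host.docker.internal:8000/v1 \
        --restart always \
        ghcr.io/open-webui/open-webui:main
    ```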
4. Open it in the browser: <http://open-webui-host:3000/>

    At the top of the web page, you should see the model `qwen/Qwen1.5-0.5B-Chat`.
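
If the model does not appear, it can help to rule out the vLLM side by sending a chat completion request straight to its OpenAI-compatible API from the machine running the Open WebUI container. A minimal sketch, assuming vLLM listens on port 8000:

```bash
# Ask the vLLM backend directly for a chat completion; a JSON response
# here suggests the problem lies in the Open WebUI configuration instead.
curl http://<vllm serve host>:8000/v1/chat/completions \
    -H "Content-Type: application/json" \
    -d '{
        "model": "qwen/Qwen1.5-0.5B-Chat",
        "messages": [{"role": "user", "content": "Hello!"}]
    }'
```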