From 24a36d6d5f789fd2d5105174c24528fc7e659b00 Mon Sep 17 00:00:00 2001
From: Yuan Tang
Date: Wed, 11 Dec 2024 21:39:21 -0500
Subject: [PATCH] Update link to LlamaStack remote vLLM guide in
 serving_with_llamastack.rst (#11112)

Signed-off-by: Yuan Tang
---
 docs/source/serving/serving_with_llamastack.rst | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/source/serving/serving_with_llamastack.rst b/docs/source/serving/serving_with_llamastack.rst
index 8ef96c4e54369..a2acd7b39f887 100644
--- a/docs/source/serving/serving_with_llamastack.rst
+++ b/docs/source/serving/serving_with_llamastack.rst
@@ -24,7 +24,7 @@ Then start Llama Stack server pointing to your vLLM server with the following co
     config:
       url: http://127.0.0.1:8000
 
-Please refer to `this guide `_ for more details on this remote vLLM provider.
+Please refer to `this guide `_ for more details on this remote vLLM provider.
 
 Inference via Embedded vLLM
 ---------------------------