From d330558bab70ce760ea70433f3e9846a909640a1 Mon Sep 17 00:00:00 2001
From: Harry Mellor <19981378+hmellor@users.noreply.github.com>
Date: Tue, 1 Apr 2025 11:05:14 +0100
Subject: [PATCH] [Docs] Fix small error in link text (#15868)

Signed-off-by: Harry Mellor <19981378+hmellor@users.noreply.github.com>
---
 docs/source/models/supported_models.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/source/models/supported_models.md b/docs/source/models/supported_models.md
index 62274854d8bee..fb4c8bde06576 100644
--- a/docs/source/models/supported_models.md
+++ b/docs/source/models/supported_models.md
@@ -135,7 +135,7 @@ To determine whether a given model is natively supported, you can check the `con
 If the `"architectures"` field contains a model architecture listed below, then it should be natively supported.
 
 Models do not _need_ to be natively supported to be used in vLLM.
-The enables you to run models directly using their Transformers implementation (or even remote code on the Hugging Face Model Hub!).
+The [Transformers backend](#transformers-backend) enables you to run models directly using their Transformers implementation (or even remote code on the Hugging Face Model Hub!).
 
 :::{tip}
 The easiest way to check if your model is really supported at runtime is to run the program below: