From acb1bfa601c60411728b925a4b2bfc3c516310ab Mon Sep 17 00:00:00 2001
From: Chauncey
Date: Fri, 17 Oct 2025 15:53:40 +0800
Subject: [PATCH] [CI] fix docs build failed (#27082)

Signed-off-by: chaunceyjiang
---
 docs/models/supported_models.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/models/supported_models.md b/docs/models/supported_models.md
index 9ef877bf3891..64a01ec25e43 100644
--- a/docs/models/supported_models.md
+++ b/docs/models/supported_models.md
@@ -116,7 +116,7 @@ Here is what happens in the background when this model is loaded:
 
 1. The config is loaded.
 2. `MyModel` Python class is loaded from the `auto_map` in config, and we check that the model `is_backend_compatible()`.
-3. `MyModel` is loaded into one of the Transformers backend classes in [vllm/model_executor/models/transformers.py](../../vllm/model_executor/models/transformers.py) which sets `self.config._attn_implementation = "vllm"` so that vLLM's attention layer is used.
+3. `MyModel` is loaded into one of the Transformers backend classes in [vllm/model_executor/models/transformers](../../vllm/model_executor/models/transformers) which sets `self.config._attn_implementation = "vllm"` so that vLLM's attention layer is used.
 
 That's it!
 
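For context on the doc text touched above, here is a minimal usage sketch of the Transformers backend path it describes. This assumes vLLM is installed; the model id is a placeholder, and `model_impl="transformers"` with `trust_remote_code=True` is the documented way to force the Transformers backend for a model registered via `auto_map`.

```python
# A minimal sketch, assuming "your-org/custom-model" (a placeholder) is a
# Transformers model whose custom code is registered via auto_map and
# passes is_backend_compatible().
from vllm import LLM

llm = LLM(
    model="your-org/custom-model",  # hypothetical model id
    model_impl="transformers",      # force the Transformers backend path
    trust_remote_code=True,         # allow loading auto_map custom code
)

# During loading, the backend class sets
# self.config._attn_implementation = "vllm", so the model's attention
# calls are routed through vLLM's attention layer.
outputs = llm.generate("Hello, my name is")
print(outputs[0].outputs[0].text)
```

Under the same assumptions, the equivalent server-side invocation is `vllm serve your-org/custom-model --model-impl transformers --trust-remote-code`.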