From 84ab4feb7e994ee6c692957e6d80a528af072e49 Mon Sep 17 00:00:00 2001
From: Elad Segal
Date: Mon, 19 May 2025 19:05:16 +0300
Subject: [PATCH] [Doc] Fix typo (#18355)

---
 docs/source/models/supported_models.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/source/models/supported_models.md b/docs/source/models/supported_models.md
index 80eccfd034af..4d574216242b 100644
--- a/docs/source/models/supported_models.md
+++ b/docs/source/models/supported_models.md
@@ -54,7 +54,7 @@ For a model to be compatible with the Transformers backend for vLLM it must:
 
 If the compatible model is:
 
-- on the Hugging Face Model Hub, simply set `trust_remote_code=True` for or `--trust-remode-code` for the .
+- on the Hugging Face Model Hub, simply set `trust_remote_code=True` for or `--trust-remote-code` for the .
 - in a local directory, simply pass directory path to `model=` for or `vllm serve ` for the .
 
 This means that, with the Transformers backend for vLLM, new models can be used before they are officially supported in Transformers or vLLM!
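For context, the doc line touched by this patch describes two invocation styles: the Python API keyword `trust_remote_code=True` and the CLI flag `--trust-remote-code` (whose spelling the patch fixes). The sketch below only assembles the command strings to illustrate both spellings; it does not launch vLLM, and the model names are hypothetical examples, not from the patch.

```python
import shlex

def serve_command(model: str, trust_remote_code: bool = False) -> str:
    """Build a `vllm serve` command line for a Hub id or a local path.

    Purely illustrative: nothing here actually starts a server.
    """
    parts = ["vllm", "serve", model]
    if trust_remote_code:
        # CLI spelling corrected by this patch (was `--trust-remode-code`).
        parts.append("--trust-remote-code")
    return shlex.join(parts)

# Hub model that ships custom modeling code (hypothetical id):
print(serve_command("org/custom-model", trust_remote_code=True))
# Local directory: the path itself is passed as the model argument:
print(serve_command("./path/to/local-model"))
```

The equivalent offline-inference usage would pass `trust_remote_code=True` directly to the `LLM(...)` constructor rather than as a CLI flag.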