Mirror of https://git.datalinker.icu/vllm-project/vllm.git (synced 2025-12-10 06:25:01 +08:00)
[Docs] Fix broken links to docs/api/summary.md (#23637)
Signed-off-by: Harry Mellor <19981378+hmellor@users.noreply.github.com>
parent: 2b4fc9bd9b
commit: 164b2273c8
```diff
@@ -2,6 +2,6 @@
 vLLM's examples are split into three categories:
 
-- If you are using vLLM from within Python code, see [Offline Inference](./offline_inference/)
-- If you are using vLLM from an HTTP application or client, see [Online Serving](./online_serving/)
-- For examples of using some of vLLM's advanced features (e.g. LMCache or Tensorizer) which are not specific to either of the above use cases, see [Others](./others/)
+- If you are using vLLM from within Python code, see [Offline Inference](./offline_inference)
+- If you are using vLLM from an HTTP application or client, see [Online Serving](./online_serving)
+- For examples of using some of vLLM's advanced features (e.g. LMCache or Tensorizer) which are not specific to either of the above use cases, see [Others](./others)
```
```diff
@@ -19,7 +19,7 @@ Run a model in generation mode via the option `--runner generate`.
 ## Offline Inference
 
 The [LLM][vllm.LLM] class provides various methods for offline inference.
-See [configuration](../api/summary.md#configuration) for a list of options when initializing the model.
+See [configuration](../api/README.md#configuration) for a list of options when initializing the model.
 
 ### `LLM.generate`
```
```diff
@@ -81,7 +81,7 @@ which takes priority over both the model's and Sentence Transformers's defaults.
 ## Offline Inference
 
 The [LLM][vllm.LLM] class provides various methods for offline inference.
-See [configuration](../api/summary.md#configuration) for a list of options when initializing the model.
+See [configuration](../api/README.md#configuration) for a list of options when initializing the model.
 
 ### `LLM.embed`
```