mirror of https://git.datalinker.icu/vllm-project/vllm.git (synced 2025-12-09 21:55:50 +08:00)
[Doc] Add API reference for offline inference (#4710)
This commit is contained in:
parent
ac1fbf7fd2
commit
4bfa7e7f75
@@ -67,6 +67,13 @@ Documentation
    getting_started/quickstart
    getting_started/examples/examples_index
 
+.. toctree::
+   :maxdepth: 1
+   :caption: Offline Inference
+
+   offline_inference/llm
+   offline_inference/sampling_params
+
 .. toctree::
    :maxdepth: 1
    :caption: Serving
@@ -101,7 +108,6 @@ Documentation
    :maxdepth: 2
    :caption: Developer Documentation
 
-   dev/sampling_params
    dev/engine/engine_index
    dev/kernel/paged_attention
    dev/dockerfile/dockerfile
docs/source/offline_inference/llm.rst (new file, 6 lines)
@@ -0,0 +1,6 @@
+LLM Class
+==========
+
+.. autoclass:: vllm.LLM
+    :members:
+    :show-inheritance:
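The new `LLM Class` page documents the entry point for offline inference. As a rough usage sketch (assuming `vllm` is installed; the model name below is only an example), the documented `vllm.LLM` and `vllm.SamplingParams` APIs are typically driven like this:

```python
def run_offline_inference(prompts, model="facebook/opt-125m"):
    """Batched offline inference via vllm.LLM (sketch; requires vllm)."""
    # Imported lazily so the sketch can be read without vllm installed.
    from vllm import LLM, SamplingParams

    llm = LLM(model=model)  # loads the model weights; typically needs a GPU
    params = SamplingParams(temperature=0.8, top_p=0.95, max_tokens=64)
    outputs = llm.generate(prompts, params)
    # Each RequestOutput carries one or more completions; take the first.
    return [out.outputs[0].text for out in outputs]


if __name__ == "__main__":
    for text in run_offline_inference(["Hello, my name is"]):
        print(text)
```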
@@ -1,5 +1,5 @@
-Sampling Params
-===============
+Sampling Parameters
+===================
 
 .. autoclass:: vllm.SamplingParams
     :members:
@@ -48,7 +48,7 @@ completion = client.chat.completions.create(
 ```
 
 ### Extra Parameters for Chat API
-The following [sampling parameters (click through to see documentation)](../dev/sampling_params.rst) are supported.
+The following [sampling parameters (click through to see documentation)](../offline_inference/sampling_params.rst) are supported.
 
 ```{literalinclude} ../../../vllm/entrypoints/openai/protocol.py
 :language: python
@@ -65,7 +65,7 @@ The following extra parameters are supported:
 ```
 
 ### Extra Parameters for Completions API
-The following [sampling parameters (click through to see documentation)](../dev/sampling_params.rst) are supported.
+The following [sampling parameters (click through to see documentation)](../offline_inference/sampling_params.rst) are supported.
 
 ```{literalinclude} ../../../vllm/entrypoints/openai/protocol.py
 :language: python
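The sections touched above describe passing extra sampling parameters through the OpenAI-compatible server. As an illustrative sketch (assuming the `openai` Python client is installed, a vLLM server is running at the given base URL, and the model name is only an example), vLLM-specific parameters such as `top_k` can be sent via the client's `extra_body` argument:

```python
def query_vllm_server(prompt, base_url="http://localhost:8000/v1"):
    """Sketch: send vLLM-specific sampling parameters through the
    OpenAI-compatible Completions API (assumes a running server)."""
    # Imported lazily so the sketch can be read without `openai` installed.
    from openai import OpenAI

    client = OpenAI(base_url=base_url, api_key="EMPTY")
    completion = client.completions.create(
        model="facebook/opt-125m",  # example model name
        prompt=prompt,
        max_tokens=32,
        # Extra vLLM sampling parameters travel in the request body.
        extra_body={"top_k": 40},
    )
    return completion.choices[0].text
```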