[Doc] uses absolute links for structured outputs (#19582)

Signed-off-by: Aaron Pham <contact@aarnphm.xyz>
Aaron Pham 2025-06-12 23:35:17 -04:00 committed by GitHub
parent c68698b326
commit 7b3c9ff91d
GPG Key ID: B5690EEEBB952194 (no known key found for this signature in database)
2 changed files with 4 additions and 4 deletions


@@ -149,7 +149,7 @@ completion = client.chat.completions.create(
 print(completion.choices[0].message.content)
 ```
-See also: [full example](../../examples/online_serving/structured_outputs)
+See also: [full example](https://docs.vllm.ai/en/latest/examples/online_serving/structured_outputs.html)
 ## Reasoning Outputs
@@ -190,7 +190,7 @@ print("reasoning_content: ", completion.choices[0].message.reasoning_content)
 print("content: ", completion.choices[0].message.content)
 ```
-See also: [full example](../../examples/online_serving/structured_outputs)
+See also: [full example](https://docs.vllm.ai/en/latest/examples/online_serving/structured_outputs.html)
 ## Experimental Automatic Parsing (OpenAI API)
@@ -311,4 +311,4 @@ outputs = llm.generate(
 print(outputs[0].outputs[0].text)
 ```
-See also: [full example](../../examples/online_serving/structured_outputs)
+See also: [full example](https://docs.vllm.ai/en/latest/examples/online_serving/structured_outputs.html)


@@ -22,7 +22,7 @@ If you want to run this script standalone with `uv`, you can use the following:
 uvx --from git+https://github.com/vllm-project/vllm#subdirectory=examples/online_serving/structured_outputs structured-output
 ```
-See [feature docs](../../../features/structured_outputs.md) for more information.
+See [feature docs](https://docs.vllm.ai/en/latest/features/structured_outputs.html) for more information.
 !!! tip
     If vLLM is running remotely, then set `OPENAI_BASE_URL=<remote_url>` before running the script.
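The tip in the diff above can be sketched as follows; the host and port here are placeholder assumptions, not values from the commit:

```shell
# Hypothetical remote endpoint; substitute the URL of your own vLLM server.
OPENAI_BASE_URL="http://my-vllm-host:8000/v1"
export OPENAI_BASE_URL
# The structured-outputs example script now sends requests to the remote server.
echo "OPENAI_BASE_URL is set to $OPENAI_BASE_URL"
```

The OpenAI Python client reads `OPENAI_BASE_URL` from the environment, so no code change in the script is needed.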