[Doc] Update FAQ links in spec_decode.rst (#9662)

Signed-off-by: whyiug <whyiug@hotmail.com>
This commit is contained in:
whyiug 2024-11-08 12:44:58 +08:00 committed by GitHub
parent 6bb52b0f97
commit 40d0e7411d


@@ -182,7 +182,7 @@ speculative decoding, breaking down the guarantees into three key areas:
3. **vLLM Logprob Stability**
- vLLM does not currently guarantee stable token log probabilities (logprobs). This can result in different outputs for the
same request across runs. For more details, see the FAQ section
-titled *Can the output of a prompt vary across runs in vLLM?* in the `FAQs <../serving/faq.rst>`_.
+titled *Can the output of a prompt vary across runs in vLLM?* in the `FAQs <../serving/faq>`_.
**Conclusion**
@@ -197,7 +197,7 @@ can occur due to following factors:
**Mitigation Strategies**
-For mitigation strategies, please refer to the FAQ entry *Can the output of a prompt vary across runs in vLLM?* in the `FAQs <../serving/faq.rst>`_.
+For mitigation strategies, please refer to the FAQ entry *Can the output of a prompt vary across runs in vLLM?* in the `FAQs <../serving/faq>`_.
Resources for vLLM contributors
-------------------------------