From 3ed94f9d0ac81773cba52fe78fad74806592aae4 Mon Sep 17 00:00:00 2001
From: Ricardo Decal
Date: Tue, 15 Jul 2025 22:46:56 -0400
Subject: [PATCH] [Docs] Enhance Anyscale documentation, add quickstart links
 for vLLM (#21018)

Signed-off-by: Ricardo Decal
---
 docs/deployment/frameworks/anyscale.md | 13 +++++++++++--
 1 file changed, 11 insertions(+), 2 deletions(-)

diff --git a/docs/deployment/frameworks/anyscale.md b/docs/deployment/frameworks/anyscale.md
index 5604f7f96157..9957c5b14134 100644
--- a/docs/deployment/frameworks/anyscale.md
+++ b/docs/deployment/frameworks/anyscale.md
@@ -3,6 +3,15 @@
 [](){ #deployment-anyscale }
 
 [Anyscale](https://www.anyscale.com) is a managed, multi-cloud platform developed by the creators of Ray.
-It hosts Ray clusters inside your own AWS, GCP, or Azure account, delivering the flexibility of open-source Ray
-without the operational overhead of maintaining Kubernetes control planes, configuring autoscalers, or managing observability stacks.
+
+Anyscale automates the entire lifecycle of Ray clusters in your AWS, GCP, or Azure account, delivering the flexibility of open-source Ray
+without the operational overhead of maintaining Kubernetes control planes, configuring autoscalers, managing observability stacks, or manually managing head and worker nodes with helper scripts.
+
 When serving large language models with vLLM, Anyscale can rapidly provision [production-ready HTTPS endpoints](https://docs.anyscale.com/examples/deploy-ray-serve-llms) or [fault-tolerant batch inference jobs](https://docs.anyscale.com/examples/ray-data-llm).
+
+## Production-ready vLLM on Anyscale quickstarts
+
+- [Offline batch inference](https://console.anyscale.com/template-preview/llm_batch_inference?utm_source=vllm_docs)
+- [Deploy vLLM services](https://console.anyscale.com/template-preview/llm_serving?utm_source=vllm_docs)
+- [Curate a dataset](https://console.anyscale.com/template-preview/audio-dataset-curation-llm-judge?utm_source=vllm_docs)
+- [Finetune an LLM](https://console.anyscale.com/template-preview/entity-recognition-with-llms?utm_source=vllm_docs)