diff --git a/docs/source/features/quantization/fp8.md b/docs/source/features/quantization/fp8.md
index 21969bbc2b9f7..cb304d54726c8 100644
--- a/docs/source/features/quantization/fp8.md
+++ b/docs/source/features/quantization/fp8.md
@@ -117,7 +117,7 @@ Here's an example of the resulting scores:
 
 ## Troubleshooting and Support
 
-If you encounter any issues or have feature requests, please open an issue on the `vllm-project/llm-compressor` GitHub repository.
+If you encounter any issues or have feature requests, please open an issue on the [vllm-project/llm-compressor](https://github.com/vllm-project/llm-compressor/issues) GitHub repository.
 
 ## Online Dynamic Quantization
diff --git a/docs/source/features/quantization/int4.md b/docs/source/features/quantization/int4.md
index be48788a4ef60..7a0ab4ad229e6 100644
--- a/docs/source/features/quantization/int4.md
+++ b/docs/source/features/quantization/int4.md
@@ -169,4 +169,4 @@ recipe = GPTQModifier(
 
 ## Troubleshooting and Support
 
-If you encounter any issues or have feature requests, please open an issue on the [`vllm-project/llm-compressor`](https://github.com/vllm-project/llm-compressor) GitHub repository. The full INT4 quantization example in `llm-compressor` is available [here](https://github.com/vllm-project/llm-compressor/blob/main/examples/quantization_w4a16/llama3_example.py).
+If you encounter any issues or have feature requests, please open an issue on the [vllm-project/llm-compressor](https://github.com/vllm-project/llm-compressor/issues) GitHub repository. The full INT4 quantization example in `llm-compressor` is available [here](https://github.com/vllm-project/llm-compressor/blob/main/examples/quantization_w4a16/llama3_example.py).
diff --git a/docs/source/features/quantization/int8.md b/docs/source/features/quantization/int8.md
index d6ddca18e2686..1e4b01d35575c 100644
--- a/docs/source/features/quantization/int8.md
+++ b/docs/source/features/quantization/int8.md
@@ -138,4 +138,4 @@ Quantized models can be sensitive to the presence of the `bos` token. Make sure
 
 ## Troubleshooting and Support
 
-If you encounter any issues or have feature requests, please open an issue on the [`vllm-project/llm-compressor`](https://github.com/vllm-project/llm-compressor) GitHub repository.
+If you encounter any issues or have feature requests, please open an issue on the [vllm-project/llm-compressor](https://github.com/vllm-project/llm-compressor/issues) GitHub repository.