vllm / tests / quantization
Latest commit 4664ceaad6 (chenqianfzh, 2024-08-29 19:09:08 -04:00): support bitsandbytes 8-bit and FP4 quantized models (#7445)
__init__.py                  …
test_bitsandbytes.py         support bitsandbytes 8-bit and FP4 quantized models (#7445)   2024-08-29 19:09:08 -04:00
test_compressed_tensors.py   …
test_configs.py              …
test_cpu_offload.py          …
test_experts_int8.py         …
test_fp8.py                  …
test_lm_head.py              …
utils.py                     …
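
For context, test_bitsandbytes.py exercises the bitsandbytes loading path referenced in the commit above (#7445). A minimal sketch of how such a model might be loaded and run with vLLM follows; it assumes a vLLM build with bitsandbytes support installed, and the model name and sampling settings are illustrative rather than taken from the test files themselves.

# Minimal sketch of running a bitsandbytes-quantized model with vLLM.
# Assumption: vLLM built with bitsandbytes support; model name is illustrative.
from vllm import LLM, SamplingParams

llm = LLM(
    model="huggyllama/llama-7b",   # example model, not from the listing above
    quantization="bitsandbytes",   # request bitsandbytes weight quantization
    load_format="bitsandbytes",    # load weights through the bitsandbytes path
    enforce_eager=True,            # keep the sketch simple: no CUDA graphs
)

params = SamplingParams(temperature=0.0, max_tokens=32)
outputs = llm.generate(["Quantization reduces memory usage by"], params)
print(outputs[0].outputs[0].text)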