xinyun/vllm
Mirror of https://git.datalinker.icu/vllm-project/vllm.git (synced 2026-03-26 12:05:55 +08:00)
vllm/tests — latest commit 443c7cf4cf by youkaichao: [ci][distributed] fix flaky tests (#6806), 2024-07-25 17:44:09 -07:00
| Name                         | Last commit                                                                   | Date                      |
|------------------------------|-------------------------------------------------------------------------------|---------------------------|
| async_engine                 | …                                                                             |                           |
| basic_correctness            | …                                                                             |                           |
| core                         | …                                                                             |                           |
| distributed                  | [ci][distributed] fix flaky tests (#6806)                                     | 2024-07-25 17:44:09 -07:00 |
| engine                       | …                                                                             |                           |
| entrypoints                  | [Bugfix] Fix encoding_format in examples/openai_embedding_client.py (#6755)   | 2024-07-24 22:48:07 -07:00 |
| fp8_kv                       | …                                                                             |                           |
| kernels                      | …                                                                             |                           |
| lora                         | …                                                                             |                           |
| metrics                      | …                                                                             |                           |
| model_executor               | …                                                                             |                           |
| models                       | [Model] Adding support for MiniCPM-V (#4087)                                  | 2024-07-24 20:59:30 -07:00 |
| multimodal                   | …                                                                             |                           |
| prefix_caching               | …                                                                             |                           |
| prompt_adapter               | …                                                                             |                           |
| prompts                      | …                                                                             |                           |
| quantization                 | [Bugfix] Fix kv_cache_dtype=fp8 without scales for FP8 checkpoints (#6761)    | 2024-07-25 09:46:15 -07:00 |
| samplers                     | …                                                                             |                           |
| spec_decode                  | …                                                                             |                           |
| tensorizer_loader            | …                                                                             |                           |
| tokenization                 | …                                                                             |                           |
| tracing                      | …                                                                             |                           |
| worker                       | [Bugfix] Fix decode tokens w. CUDA graph (#6757)                              | 2024-07-24 22:33:56 -07:00 |
| __init__.py                  | …                                                                             |                           |
| conftest.py                  | [Model] Adding support for MiniCPM-V (#4087)                                  | 2024-07-24 20:59:30 -07:00 |
| test_cache_block_hashing.py  | …                                                                             |                           |
| test_config.py               | [Bugfix] Bump transformers to 4.43.2 (#6752)                                  | 2024-07-24 13:22:16 -07:00 |
| test_embedded_commit.py      | …                                                                             |                           |
| test_inputs.py               | …                                                                             |                           |
| test_logger.py               | …                                                                             |                           |
| test_logits_processor.py     | …                                                                             |                           |
| test_regression.py           | …                                                                             |                           |
| test_sampling_params.py      | …                                                                             |                           |
| test_sequence.py             | …                                                                             |                           |
| test_sharded_state_loader.py | …                                                                             |                           |
| test_utils.py                | …                                                                             |                           |
| utils.py                     | …                                                                             |                           |