vllm / vllm / transformers_utils
History
Latest commit: de89472897 by Lu Wang, "Fix the issue for AquilaChat2-* models (#1339)", 2023-10-13 11:51:29 -07:00
..
configs        Fix the issue for AquilaChat2-* models (#1339)                 2023-10-13 11:51:29 -07:00
__init__.py    [Tokenizer] Add an option to specify tokenizer (#284)          2023-06-28 09:46:58 -07:00
config.py      Bump up transformers version & Remove MistralConfig (#1254)    2023-10-13 10:05:26 -07:00
tokenizer.py   Improve detokenization performance (#1338)                     2023-10-13 09:59:07 -07:00