llama.cpp/llama_cpp

Latest commit b8438f70b5 by TK-Master: Added support for min_p (#921)
* Added support for min_p

My small contribution to this great project.

Ref: https://github.com/ggerganov/llama.cpp/pull/3841

Closes: https://github.com/abetlen/llama-cpp-python/issues/911

* Fix for negative temp (sample_softmax)
2023-11-20 23:21:33 -05:00
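The min_p strategy referenced above keeps only tokens whose probability is at least `min_p` times the probability of the most likely token, then renormalizes and samples from the survivors. A minimal sketch of the idea in Python (not the library's actual implementation; the function name and defaults are illustrative):

```python
import math
import random

def min_p_sample(logits, min_p=0.05, rng=random.Random(0)):
    """Sample a token index, filtering with a min_p threshold (illustrative sketch)."""
    # Softmax over the logits (shift by max for numerical stability).
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Threshold is relative to the single most likely token.
    threshold = min_p * max(probs)
    kept = [(i, p) for i, p in enumerate(probs) if p >= threshold]
    # Renormalize the surviving mass and draw from it.
    z = sum(p for _, p in kept)
    r = rng.random() * z
    acc = 0.0
    for i, p in kept:
        acc += p
        if acc >= r:
            return i
    return kept[-1][0]
```

With `min_p=1.0` only the argmax token survives the filter; with `min_p=0.0` every token is eligible, so the call degenerates to plain multinomial sampling.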
File                   Last commit message                                   Date
server                 Added support for min_p (#921)                        2023-11-20 23:21:33 -05:00
__init__.py            Bump version                                          2023-11-14 14:10:00 -05:00
_utils.py              Clean up stdout / stderr suppression                  2023-11-03 13:02:15 -04:00
llama.py               Added support for min_p (#921)                        2023-11-20 23:21:33 -05:00
llama_chat_format.py   Added support for min_p (#921)                        2023-11-20 23:21:33 -05:00
llama_cpp.py           Update llama.cpp                                      2023-11-20 14:11:33 -05:00
llama_grammar.py       Add $ref and $defs support to json schema converter   2023-11-10 02:50:46 -05:00
llama_types.py         Add missing tool_calls finish_reason                  2023-11-10 02:51:06 -05:00
llava_cpp.py           Multimodal Support (Llava 1.5) (#821)                 2023-11-07 22:48:51 -05:00
py.typed               Add py.typed                                          2023-08-11 09:58:48 +02:00