
🦙 Python Bindings for llama.cpp


Simple Python bindings for @ggerganov's llama.cpp library. This package provides:

  • Low-level access to the C API via a ctypes interface
  • High-level Python API for text completion
    • OpenAI-like API
    • LangChain compatibility

Installation

Install from PyPI:

pip install llama-cpp-python

Usage

>>> from llama_cpp import Llama
>>> llm = Llama(model_path="models/7B/...")
>>> output = llm("Q: Name the planets in the solar system? A: ", max_tokens=32, stop=["Q:", "\n"], echo=True)
>>> print(output)
{
  "id": "cmpl-xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
  "object": "text_completion",
  "created": 1679561337,
  "model": "models/7B/...",
  "choices": [
    {
      "text": "Q: Name the planets in the solar system? A: Mercury, Venus, Earth, Mars, Jupiter, Saturn, Uranus, Neptune and Pluto.",
      "index": 0,
      "logprobs": None,
      "finish_reason": "stop"
    }
  ],
  "usage": {
    "prompt_tokens": 14,
    "completion_tokens": 28,
    "total_tokens": 42
  }
}
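Since the completion is returned as an OpenAI-style dictionary, the generated text lives under the first entry of `choices`. A minimal sketch, using a trimmed copy of the example response above (the literal dict here is illustrative; a real call returns it from `llm(...)`):

```python
# A trimmed OpenAI-style completion response, as shown above.
output = {
    "object": "text_completion",
    "choices": [
        {
            "text": "Q: Name the planets in the solar system? A: Mercury, Venus, Earth, Mars, Jupiter, Saturn, Uranus, Neptune and Pluto.",
            "index": 0,
            "logprobs": None,
            "finish_reason": "stop",
        }
    ],
    "usage": {"prompt_tokens": 14, "completion_tokens": 28, "total_tokens": 42},
}

# Pull out the generated text and token accounting.
text = output["choices"][0]["text"]
total_tokens = output["usage"]["total_tokens"]
print(text)
```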

License

This project is licensed under the terms of the MIT license.