Commit graph

533 commits

| Author | SHA1 | Message | Date |
|---|---|---|---|
| Andrei Betlen | 353e18a781 | Move workaround to new sample method | 2023-04-02 00:06:34 -04:00 |
| Andrei Betlen | a4a1bbeaa9 | Update api to allow for easier interactive mode | 2023-04-02 00:02:47 -04:00 |
| Andrei Betlen | eef627c09c | Fix example documentation | 2023-04-01 17:39:35 -04:00 |
| Andrei Betlen | 1e4346307c | Add documentation for generate method | 2023-04-01 17:36:30 -04:00 |
| Andrei Betlen | 67c70cc8eb | Add static methods for beginning and end of sequence tokens. | 2023-04-01 17:29:30 -04:00 |
| Andrei Betlen | 318eae237e | Update high-level api | 2023-04-01 13:01:27 -04:00 |
| Andrei Betlen | 69e7d9f60e | Add type definitions | 2023-04-01 12:59:58 -04:00 |
| Andrei Betlen | 49c8df369a | Fix type signature of token_to_str | 2023-03-31 03:25:12 -04:00 |
| Andrei Betlen | 670d390001 | Fix ctypes typing issue for Arrays | 2023-03-31 03:20:15 -04:00 |
| Andrei Betlen | 1545b22727 | Fix array type signatures | 2023-03-31 02:08:20 -04:00 |
| Andrei Betlen | c928e0afc8 | Formatting | 2023-03-31 00:00:27 -04:00 |
| Andrei Betlen | 8908f4614c | Update llama.cpp | 2023-03-28 21:10:23 -04:00 |
| Andrei Betlen | 70b8a1ef75 | Add support to get embeddings from high-level api. Closes #4 | 2023-03-28 04:59:54 -04:00 |
| Andrei Betlen | 3dbb3fd3f6 | Add support for stream parameter. Closes #1 | 2023-03-28 04:03:57 -04:00 |
| Andrei Betlen | 30fc0f3866 | Extract generate method | 2023-03-28 02:42:22 -04:00 |
| Andrei Betlen | 1c823f6d0f | Refactor Llama class and add tokenize / detokenize methods Closes #3 | 2023-03-28 01:45:37 -04:00 |
| Andrei Betlen | 8ae3beda9c | Update Llama to add params | 2023-03-25 16:26:23 -04:00 |
| Andrei Betlen | 4525236214 | Update llama.cpp | 2023-03-25 16:26:03 -04:00 |
| Andrei Betlen | b121b7c05b | Update docstring | 2023-03-25 12:33:18 -04:00 |
| Andrei Betlen | fa92740a10 | Update llama.cpp | 2023-03-25 12:12:09 -04:00 |
| Andrei Betlen | df15caa877 | Add mkdocs | 2023-03-24 18:57:59 -04:00 |
| Andrei Betlen | 4da5faa28b | Bugfix: cross-platform method to find shared lib | 2023-03-24 18:43:29 -04:00 |
| Andrei Betlen | b93675608a | Handle errors returned by llama.cpp | 2023-03-24 15:47:17 -04:00 |
| Andrei Betlen | 7786edb0f9 | Black formatting | 2023-03-24 14:59:29 -04:00 |
| Andrei Betlen | c784d83131 | Update llama.cpp and re-organize low-level api | 2023-03-24 14:58:42 -04:00 |
| Andrei Betlen | b9c53b88a1 | Use n_ctx provided from actual context not params | 2023-03-24 14:58:10 -04:00 |
| Andrei Betlen | 2cc499512c | Black formatting | 2023-03-24 14:35:41 -04:00 |
| Andrei Betlen | e24c581b5a | Implement prompt batch processing as in main.cpp | 2023-03-24 14:33:38 -04:00 |
| Andrei Betlen | a28cb92d8f | Remove model_name param | 2023-03-24 04:04:29 -04:00 |
| Andrei Betlen | eec9256a42 | Bugfix: avoid decoding partial utf-8 characters | 2023-03-23 16:25:13 -04:00 |
| Andrei Betlen | e63ea4dbbc | Add support for logprobs | 2023-03-23 15:51:05 -04:00 |
| Andrei Betlen | 465238b179 | Updated package to build with skbuild | 2023-03-23 13:54:14 -04:00 |
| Andrei Betlen | 79b304c9d4 | Initial commit | 2023-03-23 05:33:06 -04:00 |