Commit graph

141 commits

Author SHA1 Message Date
Andrei Betlen 5502ac8876 Update llama.cpp 2024-01-15 10:12:10 -05:00
Andrei Betlen 359ae73643 Update llama.cpp 2024-01-14 08:17:22 -05:00
Andrei Betlen 7c898d5684 Update llama.cpp 2024-01-13 22:37:49 -05:00
Andrei Betlen bb610b9428 Update llama.cpp 2024-01-11 22:51:12 -05:00
Andrei Betlen 1ae05c102b Update llama.cpp 2024-01-08 14:51:29 -05:00
Andrei Betlen eb9c7d4ed8 Update llama.cpp 2024-01-03 22:04:04 -05:00
Andrei Betlen 92284f32cb Add HIP_PATH to dll search directories for windows users. 2023-12-22 15:29:56 -05:00
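The commit above is about DLL resolution for ROCm users on Windows. A minimal sketch of the idea (directory layout and variable handling are illustrative, not copied from llama_cpp.py): register the HIP runtime directory before the shared library is loaded.

```python
import os

# Sketch only: on Windows, the directory holding the HIP/ROCm runtime DLLs
# must be registered explicitly, otherwise ctypes cannot resolve libllama's
# dependencies.  os.add_dll_directory is a Windows-only API (Python 3.8+),
# hence the hasattr guard so this snippet also runs elsewhere.
hip_path = os.environ.get("HIP_PATH")
if hip_path and hasattr(os, "add_dll_directory"):
    os.add_dll_directory(os.path.join(hip_path, "bin"))
```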
Andrei Betlen 2b0d3f36fa set llama_max_devices using library function 2023-12-22 15:19:28 -05:00
Andrei Betlen 6d8bc090f9 fix: incorrect bindings for kv override. Based on #1011 2023-12-22 14:52:20 -05:00
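The kv-override fix concerns the ctypes mirror of llama.cpp's tagged-union override entry. A hedged sketch of what such a binding looks like (field names and sizes assumed from the upstream C header of that era, not taken verbatim from llama_cpp.py):

```python
import ctypes

# Illustrative only: a kv-override entry is a fixed-size key plus a tag that
# selects which member of a union carries the value.  A mismatch between the
# Python layout and the C layout is the kind of error this commit fixes.
class llama_model_kv_override_value(ctypes.Union):
    _fields_ = [
        ("int_value", ctypes.c_int64),
        ("float_value", ctypes.c_double),
        ("bool_value", ctypes.c_bool),
    ]

class llama_model_kv_override(ctypes.Structure):
    _fields_ = [
        ("key", ctypes.c_char * 128),
        ("tag", ctypes.c_int),  # enum discriminating the union member
        ("value", llama_model_kv_override_value),
    ]
```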
Andrei Betlen 6473796343 Update llama.cpp 2023-12-22 14:10:34 -05:00
Andrei Betlen 4a85442c35 Update llama.cpp 2023-12-22 00:12:37 -05:00
Andrei Betlen 7df6c32544 Fix type annotations 2023-12-18 18:14:53 -05:00
Andrei Betlen b703aad79e Fix type annotation 2023-12-18 18:13:37 -05:00
Andrei Betlen d0aedfcff6 Fix type annotation 2023-12-18 18:12:49 -05:00
Eduard Christian Dumitrescu 2993936b10 Fix ctypes definitions of llama_kv_cache_view_update and llama_kv_cache_view_free. (#1028) 2023-12-18 18:11:26 -05:00
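A minimal sketch of the corrected shape of these declarations (struct fields elided, no library handle loaded here): both C functions take a pointer to the view, so the ctypes argtypes must be POINTER(...) and callers pass ctypes.byref(view) rather than the struct by value.

```python
import ctypes

class llama_kv_cache_view(ctypes.Structure):
    pass  # real field list lives in llama_cpp.py; omitted in this sketch

llama_context_p = ctypes.c_void_p  # opaque context handle

def declare_kv_cache_view_functions(lib: ctypes.CDLL) -> None:
    # Pointer argtypes, not by-value structs -- the essence of the fix.
    lib.llama_kv_cache_view_update.argtypes = [llama_context_p, ctypes.POINTER(llama_kv_cache_view)]
    lib.llama_kv_cache_view_update.restype = None
    lib.llama_kv_cache_view_free.argtypes = [ctypes.POINTER(llama_kv_cache_view)]
    lib.llama_kv_cache_view_free.restype = None
```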
Brandon Roberts 62944df142
Bugfix: Remove f16_kv, add offload_kqv field (#1019)
F16_KV appears to have been removed here: af99c6fbfc

This addresses two issues:

 - #995 which just requests to add the KV cache offloading param
 - #1006 a NULL ptr exception when using the embeddings (introduced by
   leaving f16_kv in the fields struct)
2023-12-18 14:27:11 -05:00
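The commit above removes a stale struct field and adds a new one. A toy illustration of why the stale field caused trouble (this is not the real llama_context_params layout): when the C side drops a field but the Python ctypes Structure keeps it, every field declared after it is read at the wrong byte offset, which is how stale bindings can surface as a NULL-pointer error like the one in #1006.

```python
import ctypes

class params_stale(ctypes.Structure):
    _fields_ = [
        ("n_ctx", ctypes.c_uint32),
        ("f16_kv", ctypes.c_bool),       # removed upstream; everything below shifts
        ("logits_all", ctypes.c_bool),
        ("embedding", ctypes.c_bool),
    ]

class params_fixed(ctypes.Structure):
    _fields_ = [
        ("n_ctx", ctypes.c_uint32),
        ("logits_all", ctypes.c_bool),
        ("embedding", ctypes.c_bool),
        ("offload_kqv", ctypes.c_bool),  # new field for KV-cache offloading
    ]
```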
Andrei Betlen 534b1ea9b5 Update llama.cpp 2023-12-16 18:57:43 -05:00
Andrei Betlen c0fc0a1e82 Update llama.cpp 2023-12-13 21:43:16 -05:00
Andrei Betlen f1edc66b21 Update llama.cpp 2023-12-11 10:21:35 -05:00
Andrei Betlen 396dbf0b2b docs: Improve low-level docstrings 2023-11-27 19:03:02 -05:00
Andrei Betlen f03a38e62a Update llama.cpp 2023-11-26 15:38:22 -05:00
Andrei Betlen 36048d46af Update llama.cpp 2023-11-23 16:26:00 -05:00
Andrei Betlen be1f64d569 docs: Add docstrings from llama.cpp 2023-11-23 00:26:26 -05:00
Andrei Betlen 2c2afa320f Update llama.cpp 2023-11-20 14:11:33 -05:00
Andrei Betlen f0b30ef7dc Update llama.cpp 2023-11-05 16:57:10 -05:00
Andrei Betlen df9362eeea Update llama.cpp 2023-11-03 11:34:50 -04:00
Andrei Betlen fa83cc5f9c Update llama.cpp
Fix build examples

Exclude examples directory

Revert cmake changes

Try actions/checkout@v4

Try to update submodules

Revert
2023-11-02 14:28:15 -04:00
Sujeendran Menon 7b136bb5b1
Fix for shared library not found and compile issues in Windows (#848)
* fix windows library dll name issue

* Updated README.md Windows instructions

* Update llama_cpp.py to handle different windows dll file versions
2023-11-01 18:55:57 -04:00
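The Windows fix above boils down to probing several possible DLL names instead of assuming one. A hedged sketch of that approach (file names and fallback order are illustrative):

```python
import ctypes
import os
import sys

# Different Windows toolchains emit differently named DLLs, so try each
# candidate in turn rather than hard-coding a single file name.
def load_shared_library(base_path: str) -> ctypes.CDLL:
    if sys.platform == "win32":
        candidates = ["llama.dll", "libllama.dll"]
    elif sys.platform == "darwin":
        candidates = ["libllama.dylib"]
    else:
        candidates = ["libllama.so"]
    for name in candidates:
        path = os.path.join(base_path, name)
        if os.path.exists(path):
            return ctypes.CDLL(path)
    raise FileNotFoundError(f"No shared library found under {base_path}")
```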
Andrei Betlen d808fd436c Update llama.cpp 2023-10-31 21:29:35 -04:00
Andrei Betlen 53861c9e53 Update llama.cpp 2023-10-24 03:13:32 -04:00
Andrei Betlen ff580031d2 Update llama.cpp 2023-10-19 02:55:08 -04:00
Andrei Betlen 43dfe1e2ab Update llama.cpp 2023-10-05 16:07:49 -04:00
Andrei Betlen a7d17b8ac9 Update llama.cpp 2023-10-03 15:23:35 -04:00
Andrei Betlen 3720c739d4 Update llama.cpp 2023-09-29 19:58:21 -04:00
Andrei Betlen 1a1c3dc418 Update llama.cpp 2023-09-28 22:42:03 -04:00
Andrei Betlen 38e34c97f0 Update llama.cpp 2023-09-18 16:11:27 -04:00
Andrei Betlen 8d75016549 Install required runtime dlls to package directory on windows 2023-09-16 14:57:49 -04:00
Andrei Betlen 8474665625 Update base_path to fix issue resolving dll in windows isolation container. 2023-09-14 14:51:43 -04:00
Andrei Betlen f4090a0bb2 Add numa support, low level api users must now explicitly call llama_backend_init at the start of their programs. 2023-09-13 23:00:43 -04:00
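For the numa-support commit above, the practical change for low-level API users is the explicit backend initialization. A short sketch of the new calling convention (assumes the llama_cpp package is importable; the numa argument is shown as a plain bool):

```python
import llama_cpp

llama_cpp.llama_backend_init(numa=False)  # must run once, before loading any model
# ... create model / context and run inference here ...
llama_cpp.llama_backend_free()            # optional cleanup at program exit
```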
Andrei Betlen 517f9ed80b Convert missed llama.cpp constants into standard python types 2023-09-13 21:11:52 -04:00
Andrei Betlen 1910793f56 Merge branch 'main' into v0.2-wip 2023-09-12 16:43:32 -04:00
Andrei Betlen d3f63211ef Update llama.cpp 2023-09-09 12:12:32 -04:00
Andrei Betlen 186626d58e Update llama.cpp 2023-09-01 14:26:13 -04:00
Andrei Betlen 47de3ab104 Update llama.cpp 2023-08-29 07:36:20 -04:00
Andrei Betlen e0dcbc28a1 Update llama.cpp 2023-08-28 10:33:45 -04:00
Andrei Betlen 4887973c22 Update llama.cpp 2023-08-27 12:59:20 -04:00
Andrei Betlen ac47d55577 Merge branch 'main' into v0.2-wip 2023-08-25 15:45:22 -04:00
Andrei Betlen ef23d1e545 Update llama.cpp 2023-08-25 14:35:53 -04:00
Andrei Betlen c2d1deaa8a Update llama.cpp 2023-08-24 18:01:42 -04:00
Andrei Betlen db982a861f Fix 2023-08-24 01:01:12 -04:00