diff --git a/README.md b/README.md
index a0ef83c..c4e194b 100644
--- a/README.md
+++ b/README.md
@@ -109,9 +109,9 @@ CMAKE_ARGS="-DLLAMA_BLAS=ON -DLLAMA_BLAS_VENDOR=OpenBLAS" pip install llama-cpp-
-cuBLAS (CUDA)
+CUDA
-To install with cuBLAS, set the `LLAMA_CUDA=on` environment variable before installing:
+To install with CUDA support, set the `LLAMA_CUDA=on` environment variable before installing:
 ```bash
 CMAKE_ARGS="-DLLAMA_CUDA=on" pip install llama-cpp-python
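
As context for reviewers (not part of the diff): once the package is installed with `CMAKE_ARGS="-DLLAMA_CUDA=on"`, GPU offload is enabled per-model via the `n_gpu_layers` parameter. A minimal sketch, assuming a local GGUF model at a hypothetical path:

```python
from llama_cpp import Llama

# Hypothetical model path; substitute any local GGUF file.
llm = Llama(
    model_path="./models/example.gguf",
    n_gpu_layers=-1,  # offload all layers to the GPU; requires a CUDA-enabled build
    verbose=True,     # startup logs report how many layers were offloaded
)

# Simple completion call to confirm the model loads and generates.
out = llm("Q: Name the planets in the solar system. A:", max_tokens=32)
print(out["choices"][0]["text"])
```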