Bug fixed with n_ctx=0 (#1015)

If n_ctx is set to 0, the code should use the maximum context length of the selected model, but this did not work: the parameter was not initialized correctly, and there was a related problem with n_batch.
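
For reference, a minimal sketch of the intended behavior after this fix, using llama-cpp-python's Llama class (the model path below is a placeholder, not part of this commit):

```python
from llama_cpp import Llama

# n_ctx=0 now means "adopt the model's full training context length",
# and n_batch is clamped so it never exceeds that context.
llm = Llama(model_path="./model.gguf", n_ctx=0)  # placeholder path

print(llm.n_ctx())  # reports the model's training context, not 0
```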
Author: Daniele Morotti
Date:   2023-12-17 00:59:50 +01:00 (committed by GitHub)
Commit: f1c631dc53
Parent: 5a8944672f

@@ -923,6 +923,12 @@ class Llama:
         self._model = _LlamaModel(
             path_model=self.model_path, params=self.model_params, verbose=self.verbose
         )
+        # Set the default value for the context and correct the batch
+        if n_ctx == 0:
+            n_ctx = self._model.n_ctx_train()
+            self.n_batch = min(n_ctx, n_batch)
+            self.context_params.n_ctx = self._model.n_ctx_train()
+            self.context_params.n_batch = self.n_batch
         self._ctx = _LlamaContext(
             model=self._model,
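
To make the clamp concrete with hypothetical numbers: if the model was trained with a 4096-token context (so n_ctx_train() returns 4096) and the caller passed n_batch=8192, the fix sets n_batch = min(4096, 8192) = 4096, so the batch size can never exceed the effective context window.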