This repository was archived by the owner on Jul 4, 2025. It is now read-only.

Commit 9a46624: force no caching
1 parent 91792dd

File tree

1 file changed: +2, -1 lines changed


controllers/llamaCPP.cc

Lines changed: 2 additions & 1 deletion
@@ -205,7 +205,8 @@ void llamaCPP::InferenceImpl(
   }

   // Default values to enable auto caching
-  data["cache_prompt"] = caching_enabled;
+  //data["cache_prompt"] = caching_enabled;
+  data["cache_prompt"] = false;
   data["n_keep"] = -1;

   // Passing load value
