This repository was archived by the owner on Jul 4, 2025. It is now read-only.

Commit 91792dd

turn off caching

1 parent d1133c1 commit 91792dd

File tree

1 file changed (+1, -1)


controllers/llamaCPP.cc

Lines changed: 1 addition & 1 deletion
@@ -655,7 +655,7 @@ bool llamaCPP::LoadModelImpl(std::shared_ptr<Json::Value> jsonBody) {
   params.cont_batching = jsonBody->get("cont_batching", false).asBool();
   this->clean_cache_threshold =
       jsonBody->get("clean_cache_threshold", 5).asInt();
-  this->caching_enabled = jsonBody->get("caching_enabled", true).asBool();
+  this->caching_enabled = jsonBody->get("caching_enabled", false).asBool();
   this->user_prompt = jsonBody->get("user_prompt", "USER: ").asString();
   this->ai_prompt = jsonBody->get("ai_prompt", "ASSISTANT: ").asString();
   this->system_prompt =
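
The change flips the default of `caching_enabled` from `true` to `false`, so prompt caching becomes opt-in: it is only active when the load-model request body sets it explicitly. As a sketch, a request body that re-enables caching after this commit might look like the following; the key names and default values are taken directly from the hunk above, while the surrounding request shape is an assumption not shown in this diff:

```json
{
  "cont_batching": false,
  "clean_cache_threshold": 5,
  "caching_enabled": true,
  "user_prompt": "USER: ",
  "ai_prompt": "ASSISTANT: "
}
```

Omitting `caching_enabled` entirely now yields `false`, since `jsonBody->get("caching_enabled", false)` returns the second argument when the key is absent.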
