
Main NC proxy is not configured to support the 1800s request timeout #45

@pmarini-nc

Description

(Forked from nextcloud/context_chat_backend#214.)

Given the state below (the instance has all the standard sample docs indexed, so the prompt should return something):

# occ context_chat:stats
ContextChat statistics:
Installed time: 2025-09-14 21:13 UTC
Index complete time: 2025-09-15 09:24 UTC
Total time taken for complete index: 0 days 12:11 (hh:mm)
Total eligible files: 39
Files in indexing queue: 0
Queued documents (without files):array (
)
Files successfully sent to backend: 35
Indexed documents: array (
  'files__default' => 35,
)
Actions in queue: 0
File system events in queue: 0

# occ context_chat:prompt admin-e4zt "Nextcloud"
Error received from Context Chat Backend (ExApp) with status code 504: unknown error
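
That 504 looks like a reverse proxy in front of Nextcloud giving up before the backend answers. A minimal sketch of the change this issue is asking for, assuming an nginx proxy in front of the instance; the upstream address and the location block are placeholders, and 1800s matches the request timeout the app expects:

location / {
    proxy_pass http://127.0.0.1:8080;  # placeholder upstream for the Nextcloud instance
    # Context Chat prompts can run long; align the proxy with the app's 1800 s timeout
    proxy_read_timeout 1800s;
    proxy_send_timeout 1800s;
}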

# docker logs nc_app_context_chat_backend
decode: cannot decode batches with this context (calling encode() instead)
init: embeddings required but some input tokens were not marked as outputs -> overriding
llama_perf_context_print:        load time =     116.39 ms
llama_perf_context_print: prompt eval time =    3302.21 ms /     4 tokens (  825.55 ms per token,     1.21 tokens per second)
llama_perf_context_print:        eval time =       0.00 ms /     1 runs   (    0.00 ms per token,      inf tokens per second)
llama_perf_context_print:       total time =    3302.42 ms /     5 tokens
INFO:     127.0.0.1:54948 - "POST /v1/embeddings HTTP/1.1" 200 OK
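
A quick way to check whether the proxy is at fault, as a sketch: time the failing command. If the 504 arrives at a fixed boundary (for example nginx's 60 s default proxy_read_timeout) rather than after 1800 s, the proxy is cutting the connection while the backend is still working, which would match the successful 200 OK in the backend log above.

# time how long the request runs before the 504 comes back
time occ context_chat:prompt admin-e4zt "Nextcloud"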
