
AI Assistant message stuck in processing when upstream returns an empty response body #4710

@elias-ba

Description


What's broken

When the AI Assistant background worker handles a workflow or global chat message, it builds the assistant turn from the upstream response body. If that body is missing the response field, or the field is an empty string, the assistant ChatMessage ends up with content: nil or "", and the insert fails: the changeset requires content to be 1 to 10,000 characters.
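To make the failing constraint concrete, here is a minimal plain-Elixir sketch of the kind of content rule the changeset enforces. The module and function names are illustrative, not the actual Lightning code, and the real validation lives in an Ecto changeset rather than a bare function:

```elixir
defmodule Sketch.ContentValidation do
  # Illustrative stand-in for the changeset rule: content must be a
  # string of 1..10_000 characters. nil and "" both fail, which is
  # exactly what happens when the upstream response body is empty.
  def validate_content(nil), do: {:error, :content_required}

  def validate_content(content) when is_binary(content) do
    len = String.length(content)

    if len >= 1 and len <= 10_000 do
      :ok
    else
      {:error, :invalid_length}
    end
  end
end
```

An empty upstream response therefore can never produce an insertable assistant message; the insert is guaranteed to fail before anything reaches the user.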

The user-side chat message then stays stuck in :processing forever. The user sees no response, no error, and no retry option. A Sentry alert fires, but the originating user is never notified.

The empty response itself originates upstream in Apollo. Both workflow_chat and global_chat can return HTTP 200 with response: "" when the LLM output is unparseable (workflow_chat JSON parse failure) or contains no text block (global_chat planner exits without end_turn). Tracked separately at OpenFn/apollo#484.

How it surfaced

Caught via Sentry on v2.16.2: three events from a single chat session in production, all silent on the user side.

What to fix

Two-sided fix.

The root-cause fix is on Apollo (OpenFn/apollo#484): when the LLM output can't be parsed or produces no text, the service should raise ApolloError or signal an error explicitly, not return 200 with empty text.

On the Lightning side, defend against an empty body["response"] even after Apollo is fixed: treat an empty or missing response as an error path rather than calling the assistant message builder. MessageProcessor.handle_processing_result/2 already marks the user's message :error on its error branch, so routing an empty response through the error tuple matches the existing shape; the user then sees the message marked as errored and can retry.
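The Lightning-side guard could look something like the sketch below. The module and function names are hypothetical (the real call sites live in the message-processing pipeline); the point is only the shape: pattern-match the upstream body and return an error tuple for a missing or empty response, so it flows down the existing error branch instead of reaching the assistant message builder:

```elixir
defmodule Sketch.ResponseGuard do
  # Hypothetical helper: pull the assistant text out of the decoded
  # upstream body. A non-empty binary is the happy path; anything
  # else (missing key, nil, "") becomes an error tuple that the
  # existing error branch can handle.
  def extract_response(body) when is_map(body) do
    case Map.get(body, "response") do
      text when is_binary(text) and text != "" -> {:ok, text}
      _missing_or_empty -> {:error, :empty_response}
    end
  end
end
```

With this in place, an empty Apollo response resolves the user's message to :error with a retry option, instead of leaving it in :processing.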

Metadata

Labels

bug (Newly identified bug)


Projects

Status

In review
