
NeMo-Agent-Toolkit-UI is not compatible with agent's output #1365

@maxjeblick

Description


Is this a new feature, an improvement, or a change to existing functionality?

Improvement

How would you describe the priority of this feature request

Medium

Please provide a clear description of problem this feature solves

When using NeMo-Agent-Toolkit-UI alongside nat serve, the workflow fails if it does not register a ChatResponseChunk converter.

For example,

async def _response_fn(input_message: str) -> MyOutputClass:
    return MyOutputClass(...)

yield FunctionInfo.from_fn(
    _response_fn,
    description="dummy agent",
    converters=[my_output_class_to_str],
)

will fail with

ValueError: Cannot convert type <class 'MyOutputClass'> 
to <class 'nat.data_models.api_server.ChatResponseChunk'>. No match found.
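
For context, my_output_class_to_str referenced above is only the plain str converter; its body is an assumption here (mirroring my_output_to_str in the full example further down), since just its name appears in the snippet. Registering it alone is not enough for nat serve:

# Assumed definition of the str converter used in the snippet above
# (mirrors my_output_to_str in the full example below). nat serve still
# fails because no converter to ChatResponseChunk is registered.
def my_output_class_to_str(output: MyOutputClass) -> str:
    return output.model_dump_json()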

Describe your ideal solution

It would be better UX if NeMo-Agent-Toolkit-UI worked with any agent/function that provides a convert-to-string method.

  1. The consumer of the UI shouldn't need to know the internals of how the response is expected to be formatted.
  2. The calculator example mentioned in https://github.com/NVIDIA/NeMo-Agent-Toolkit-UI has return types float and string, which seem to work nicely.
  3. There is ChatResponseChunk.from_string(str(output)), which could be triggered automatically whenever a string converter exists (see the sketch after this list).
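
A minimal sketch of what such a fallback could look like, assuming a hypothetical helper sitting in front of the response path (the function name and the convert_to_str parameter are illustrative, not part of the nat API; only ChatResponseChunk.from_string is taken from the existing API):

from nat.data_models.api_server import ChatResponseChunk

def ensure_chat_response_chunk(output, convert_to_str=str):
    # Pass ChatResponseChunk outputs through unchanged.
    if isinstance(output, ChatResponseChunk):
        return output
    # Otherwise fall back to the registered string conversion and wrap it,
    # so any function with a str converter works with the UI.
    return ChatResponseChunk.from_string(convert_to_str(output))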

In addition, it would be helpful to check upfront whether a required converter is missing, for a better debugging experience (a possible check is sketched below).
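
One way to surface the problem at registration time rather than on the first request, sketched under the assumption that converters are plain callables with return-type annotations (the helper name is made up, not existing nat code):

from nat.data_models.api_server import ChatResponseChunk

def check_converters(converters, target=ChatResponseChunk):
    # Collect the annotated return types of the registered converters.
    return_types = {getattr(fn, "__annotations__", {}).get("return") for fn in converters}
    # A direct converter to the target, or a str converter (combined with the
    # fallback above), would both be acceptable.
    if target not in return_types and str not in return_types:
        raise ValueError(
            f"No converter to {target.__name__} (or to str) is registered; "
            "nat serve would fail at request time when streaming to the UI."
        )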

Additional context

Minimal example generated by Opus 4.5 (I haven't run it).

from pydantic import BaseModel, Field
from typing import List

from nat.builder.builder import Builder
from nat.builder.framework_enum import LLMFrameworkEnum
from nat.builder.function_info import FunctionInfo
from nat.cli.register_workflow import register_function
from nat.data_models.api_server import ChatResponseChunk
from nat.data_models.function import FunctionBaseConfig


class MyOutput(BaseModel):
    """Custom structured output from the agent."""
    items: List[str] = Field(default_factory=list)


# Converter 1: MyOutput -> str (works with nat run)
def my_output_to_str(output: MyOutput) -> str:
    return output.model_dump_json()


# Converter 2: MyOutput -> ChatResponseChunk (WORKAROUND for nat serve)
# Without this, nat serve fails with:
# ValueError: Cannot convert type <class 'MyOutput'> to <class 'ChatResponseChunk'>
def my_output_to_chat_response_chunk(output: MyOutput) -> ChatResponseChunk:
    return ChatResponseChunk.from_string(output.model_dump_json())


class MyAgentConfig(FunctionBaseConfig, name="my_agent"):  # type: ignore[call-arg]
    pass


@register_function(config_type=MyAgentConfig, framework_wrappers=[LLMFrameworkEnum.LANGCHAIN])
async def my_agent(config: MyAgentConfig, builder: Builder):
    
    async def _response_fn(query: str) -> MyOutput:
        return MyOutput(items=["result1", "result2"])

    yield FunctionInfo.from_fn(
        _response_fn,
        description="Example agent",
        # Both converters needed: str for nat run, ChatResponseChunk for nat serve
        converters=[my_output_to_str, my_output_to_chat_response_chunk],
    )

Note that I'm using NAT with #1363, as I cannot run nat serve with main.

Finally, I can provide the actual internal codebase if needed.

Code of Conduct

  • I agree to follow this project's Code of Conduct
  • I have searched the open feature requests and have found no duplicates for this feature request
