
refactor(agent): decouple ai-proxy via factory injection pattern #1455

Open
Scra3 wants to merge 9 commits into main from refactor/decouple-ai-proxy-factory-injection

Conversation


Scra3 (Member) commented Feb 7, 2026

Summary

  • Move @forestadmin/ai-proxy from hard dependency to optional peer dependency in @forestadmin/agent
  • Introduce createAiProvider() factory in ai-proxy following the existing addDataSource(createSqlDataSource(...)) pattern
  • Define shared AiRouter / AiProviderDefinition interfaces in datasource-toolkit
  • Remove all @forestadmin/ai-proxy imports from agent source code (duck-typed error handling instead of instanceof)
  • Router in ai-proxy now handles MCP OAuth token extraction/injection internally

Before:

agent.addAi({ provider: 'openai', model: 'gpt-4o', name: 'test', apiKey: '...' });

After:

import { createAiProvider } from '@forestadmin/ai-proxy';
agent.addAi(createAiProvider({ provider: 'openai', model: 'gpt-4o', name: 'test', apiKey: '...' }));
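
For context, a minimal sketch of the injection contract this pattern relies on, assuming the shared interfaces live in @forestadmin/datasource-toolkit as described above; member names beyond those mentioned in this PR are hypothetical, not the actual implementation:

```ts
// In @forestadmin/datasource-toolkit (shared contract, no langchain dependencies):
export interface AiRouter {
  route(request: unknown): Promise<unknown>;
}

export interface AiProviderDefinition {
  name: string;
  provider: string;
  createRouter(): AiRouter; // hypothetical member for illustration
}

// In @forestadmin/agent: addAi() depends only on the toolkit interfaces, so the
// agent never imports @forestadmin/ai-proxy and it can stay an optional peer.
class Agent {
  private aiProviders: AiProviderDefinition[] = [];

  addAi(definition: AiProviderDefinition): this {
    this.aiProviders.push(definition);
    return this;
  }
}
```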

This eliminates ~6 langchain packages from the default install for users who don't use AI features.

Test plan

  • All 3 packages build successfully (datasource-toolkit, ai-proxy, agent)
  • datasource-toolkit tests pass (457 tests)
  • ai-proxy tests pass (82 tests) including new createAiProvider test
  • agent tests pass (886 tests) with updated mocks
  • ai-proxy only appears in peerDependencies of agent's package.json
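
For illustration, the relevant part of packages/agent/package.json would look roughly like this; the version range and the use of peerDependenciesMeta to mark the peer as optional are assumptions, not copied from the PR:

```json
{
  "peerDependencies": {
    "@forestadmin/ai-proxy": "^1.0.0"
  },
  "peerDependenciesMeta": {
    "@forestadmin/ai-proxy": {
      "optional": true
    }
  }
}
```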

🤖 Generated with Claude Code

…ateAiProvider)

Move @forestadmin/ai-proxy from hard dependency to optional peer dependency
by introducing a factory injection pattern. Users who need AI features now
explicitly install ai-proxy and inject it via createAiProvider(), following
the same pattern as addDataSource(createSqlDataSource(...)).

This eliminates ~6 langchain packages from the default install for users
who don't use AI features.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

qltysh bot commented Feb 7, 2026

2 new issues

| Tool | Category | Rule | Count |
| --- | --- | --- | --- |
| qlty | Structure | Function with many parameters (count = 4): makeRoutes | 1 |
| qlty | Structure | Function with many returns (count = 4): route | 1 |

alban bertolini and others added 2 commits February 7, 2026 12:04
…re multi-provider support

Change AiProviderDefinition interface from single name/provider to
providers: Array<{name, provider}> so that createAiProvider can later
accept variadic configs without breaking changes.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
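
In code, the interface change described in this commit amounts to roughly the following sketch; only the providers field is taken from the commit message:

```ts
// Before (sketch): a single pair on the definition.
// export interface AiProviderDefinition {
//   name: string;
//   provider: string;
// }

// After: an array, so a future variadic createAiProvider(...configs) can return
// one definition covering several providers without breaking the interface.
export interface AiProviderDefinition {
  providers: Array<{ name: string; provider: string }>;
}
```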

Scra3 (Member, Author) commented Feb 7, 2026

Bundle size impact

By making @forestadmin/ai-proxy an optional peer dependency, users who don't use AI features no longer install the langchain ecosystem and LLM SDK dependencies.

Dependencies removed from default install:

| Package | Size |
| --- | --- |
| @langchain/* (8 packages) | ~58 MB |
| openai SDK | ~13 MB |
| @anthropic-ai SDK | ~5 MB |
| langsmith | ~3 MB |
| ai-proxy transitive deps | ~40 MB |
| Total saved | ~119 MB |

This represents a ~9% reduction of the total monorepo node_modules — but for end users doing a fresh npm install @forestadmin/agent, the savings are more significant since these AI/LLM packages represent a large chunk of the agent's own dependency tree.

Users who need AI features simply run `npm install @forestadmin/ai-proxy` and use the createAiProvider() factory.

- Remove redundant .map() identity transform in schema generator
- Add runtime validation for mcpServerConfigs before unsafe cast (see the sketch after this commit message)
- Add JSDoc documenting AiRouter error contract
- Fix lint/prettier formatting issues
- Add tests for resolveMcpConfigs path (OAuth injection, backward compat, invalid input)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
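
As an illustration of the validation bullet above, a guard of roughly this shape replaces a blind cast; the config shape used here is hypothetical:

```ts
// Hypothetical sketch: validate unknown input before treating it as the
// expected MCP server config map, instead of an unchecked `as` cast.
type McpServerConfig = { url: string };

function isMcpServerConfigs(value: unknown): value is Record<string, McpServerConfig> {
  if (typeof value !== 'object' || value === null || Array.isArray(value)) return false;

  return Object.values(value).every(
    config =>
      typeof config === 'object' &&
      config !== null &&
      typeof (config as { url?: unknown }).url === 'string',
  );
}
```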

qltysh bot commented Feb 7, 2026

Qlty

Coverage Impact

⬆️ Merging this pull request will increase total coverage on main by 0.05%.

Modified Files with Diff Coverage (8)

| Rating | File | % Diff | Uncovered Line #s |
| --- | --- | --- | --- |
| A / A | packages/agent/src/agent.ts | 100.0% | |
| A / A | packages/agent/src/routes/ai/ai-proxy.ts | 100.0% | |
| A / A | packages/ai-proxy/src/router.ts | 100.0% | |
| B / A | packages/ai-proxy/src/errors.ts | 100.0% | |
| A / A | packages/agent/src/routes/index.ts | 100.0% | |
| A / A | packages/ai-proxy/src/index.ts | 100.0% | |
| A / A | packages/agent/src/utils/forest-schema/generator.ts | 100.0% | |
| New file / A | packages/ai-proxy/src/create-ai-provider.ts | 100.0% | |
| Total | | 100.0% | |
🚦 See full report on Qlty Cloud »


Replace inline `{ name: string; provider: string }` with a named
`AiProviderMeta` interface in datasource-toolkit. Use it consistently
in AiProviderDefinition, SchemaGenerator, and ForestSchema.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Scra3 force-pushed the refactor/decouple-ai-proxy-factory-injection branch from eff659c to adb2381 on February 7, 2026 at 12:29
…r handling

AIError and subclasses now extend BusinessError from datasource-toolkit
(BadRequestError, NotFoundError, UnprocessableError). The agent's error
middleware handles HTTP status mapping natively, removing the need for
duck-typed status checks and error re-wrapping in the route handler.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Scra3 force-pushed the refactor/decouple-ai-proxy-factory-injection branch from adb2381 to b77d9af on February 7, 2026 at 12:37

Scra3 (Member, Author) commented Feb 7, 2026

Breaking change: forestadmin-server error handling

AI errors now extend BusinessError subclasses directly instead of a single AIError base class:

  • AIBadRequestError extends BadRequestError (was extends AIError)
  • AINotFoundError extends NotFoundError (was extends AIError)
  • AIUnprocessableError extends UnprocessableError (was extends AIError)
  • AIError extends UnprocessableError (unchanged, still used by McpError, AINotConfiguredError)

Impact: instanceof AIError no longer catches AIBadRequestError, AINotFoundError, AIUnprocessableError.
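
In code form, the hierarchy after this change looks roughly like the following (class names from this PR; bodies omitted, constructor details are assumptions):

```ts
import {
  BadRequestError,
  NotFoundError,
  UnprocessableError,
} from '@forestadmin/datasource-toolkit';

export class AIError extends UnprocessableError {}
export class AIBadRequestError extends BadRequestError {}
export class AINotFoundError extends NotFoundError {}
export class AIUnprocessableError extends UnprocessableError {}

// Consequence: `error instanceof AIError` is now true only for AIError itself
// and its remaining subclasses (e.g. McpError, AINotConfiguredError), not for
// the three classes above.
```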

Required change in forestadmin-server

packages/private-api/src/domain/ai/error-translator-http.ts:

 import {
-  AIError,
+  AIBadRequestError,
   AIToolNotFoundError,
   AIUnprocessableError,
   McpConfigError,
   McpConnectionError,
+  McpConflictError,
 } from '@forestadmin/ai-proxy';
 
 import {
+  BadRequestError,
   ConflictError,
   FailedDependencyError,
-  InternalServerError,
   NotFoundError,
   UnprocessableEntityError,
 } from '../../utils/errors/common-errors';

 export default function aiErrorTranslator(error: unknown) {
   if (!error || !(error instanceof Error)) return error;

   switch (true) {
+    case error instanceof AIBadRequestError:
+      return new BadRequestError(`AI: ${error.message}`);
     case error instanceof AIUnprocessableError:
       return new UnprocessableEntityError('AI command', `AI: ${error.message}`);
     case error instanceof AIToolNotFoundError:
       return new NotFoundError(`AI: ${error.message}`);
     case error instanceof McpConnectionError:
       return new FailedDependencyError('MCP Connection Error', null, error);
     case error instanceof McpConfigError:
       return new UnprocessableEntityError('MCP config', `MCP Config Error: ${error.message}`);
     case error instanceof McpConflictError:
       return new ConflictError('MCP config', `MCP Config Error: ${error.message}`);
-    case error instanceof AIError:
-      return new InternalServerError(`AI: ${error.message}`, null);
     default:
       return error;
   }
 }

This is actually more correct: AIBadRequestError was previously caught by the AIError catch-all and returned as 500 — it now correctly maps to 400.

alban bertolini and others added 3 commits February 7, 2026 13:46
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Include the model name in AiProviderMeta alongside name and provider,
so the schema metadata and agent logs accurately reflect the full AI
configuration (provider, model, name).

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Keep ai_llms schema payload consistent with main: only { name, provider }
are sent to the server. The model field remains in AiProviderMeta for
agent-side logging only.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
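
Putting the last two commits together, a sketch of the resulting shape; the field names come from the commit messages and the helper function is hypothetical:

```ts
// In @forestadmin/datasource-toolkit:
export interface AiProviderMeta {
  name: string;
  provider: string;
  model: string; // kept for agent-side logging only
}

// Hypothetical helper in the schema generator: the ai_llms payload sent to the
// server stays consistent with main and only carries { name, provider }.
export function toSchemaPayload(meta: AiProviderMeta): { name: string; provider: string } {
  return { name: meta.name, provider: meta.provider };
}
```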
