
Add optional multi-provider LLM code generation to the OpenCAD agent #13

Merged
isayahc merged 6 commits into main from copilot/allow-different-llm-providers on Mar 11, 2026

Conversation

Contributor

Copilot AI commented Mar 11, 2026

OpenCAD’s agent could only follow a built-in deterministic planning path. This change adds an optional multi-provider LLM path for generating OpenCAD scripts in the same style as the repository examples, while keeping the existing execution flow unchanged.

  • Agent API

    • Extends /chat request handling with:
      • llm_provider
      • llm_model
      • generate_code
    • Extends the response with optional generated_code
    • Keeps the default planner behavior intact when code generation is not requested
  • Provider abstraction

    • Adds a small LiteLLM-backed integration layer for provider-agnostic code generation
    • Supports provider-qualified model routing (for example openai/gpt-4o-mini)
    • Validates provider/model configuration early instead of silently accepting partial LLM config
  • Example-style code generation

    • Adds a dedicated code-generation prompt builder that includes curated scripts from examples/
    • Uses those examples to steer output toward the existing fluent OpenCAD style
    • Falls back to deterministic built-in script generation when no LLM model is configured
  • Planner compatibility

    • Preserves the current deterministic operation-planning path for feature execution
    • Adds deterministic script generation patterns for common requests such as mounting brackets and PCB carriers
  • Packaging, docs, and examples

    • Adds an optional llm extra for LiteLLM support
    • Documents the new request fields and the code-generation mode in the architecture/runtime docs
    • Adds a runnable usage example under examples/agents/
    • Documents how to run the agent example with deterministic fallback by default or with provider/model environment variables for LiteLLM-backed generation
    • Adds explicit Claude and Gemini example commands in examples/agents/README.md
    • Adds focused tests that execute the new example script and keep the provider-specific example documentation valid

Example request shape:

{
  "message": "Generate a mounting bracket script with corner fasteners and a center cutout",
  "tree_state": { "root_id": "root", "nodes": {} },
  "conversation_history": [],
  "reasoning": true,
  "generate_code": true,
  "llm_provider": "openai",
  "llm_model": "gpt-4o-mini"
}
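For reference, a client could build and serialize that payload before POSTing it to the agent's /chat endpoint with any HTTP client. A minimal sketch (the endpoint path is taken from the summary above; the surrounding client code is assumed):

```python
import json

# Build the /chat request body shown above.
payload = {
    "message": "Generate a mounting bracket script with corner fasteners and a center cutout",
    "tree_state": {"root_id": "root", "nodes": {}},
    "conversation_history": [],
    "reasoning": True,
    "generate_code": True,
    "llm_provider": "openai",
    "llm_model": "gpt-4o-mini",
}
body = json.dumps(payload)
# POST `body` to the agent's /chat endpoint; if generation succeeds,
# the response includes the optional generated_code field.
```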

Example generated code shape:

from opencad import Part, Sketch

bracket_profile = (
    Sketch(name="Generated Bracket Profile")
    .rect(80, 30)
    .circle(3, center=(8, 8), subtract=True)
    .circle(3, center=(72, 8), subtract=True)
    .circle(5, center=(40, 15), subtract=True)
)

Part(name="Generated Mounting Bracket").extrude(
    bracket_profile,
    depth=4,
    name="Bracket Body",
).fillet(edges="top", radius=0.75, name="Bracket Edge Relief")
Original prompt

This section describes the original issue you should resolve

<issue_title>Allow for different llm providers</issue_title>
<issue_description>We need a library that can handle llm calls from different providers.
They should also be able to generate code similar to the code in the examples directory.
</issue_description>

Comments on the Issue (you are @copilot in this section)



Copilot AI and others added 3 commits March 11, 2026 16:29
Co-authored-by: isayahc <18591926+isayahc@users.noreply.github.com>
Copilot AI changed the title [WIP] Add support for multiple LLM providers Add optional multi-provider LLM code generation to the OpenCAD agent Mar 11, 2026
Contributor

@isayahc isayahc left a comment


Provide an example of how it is to be used in the example/agents directory

Co-authored-by: isayahc <18591926+isayahc@users.noreply.github.com>
Contributor Author

Copilot AI commented Mar 11, 2026

Provide an example of how it is to be used in the example/agents directory

Added a runnable example under examples/agents plus README/docs references, and added a test that executes the example script to keep it valid. Addressed in cde94e3.

Contributor

@isayahc isayahc left a comment


make sure you also include a claude and gemini example for llms

Co-authored-by: isayahc <18591926+isayahc@users.noreply.github.com>
Contributor Author

Copilot AI commented Mar 11, 2026

make sure you also include a claude and gemini example for llms

Added explicit Claude and Gemini usage examples to examples/agents/README.md and a focused test to keep them in place. Addressed in 02f7e5b.

@isayahc isayahc marked this pull request as ready for review March 11, 2026 17:56
@isayahc isayahc self-requested a review March 11, 2026 17:56
Contributor

@isayahc isayahc left a comment


does this library need litellm to run?

Contributor Author

Copilot AI commented Mar 11, 2026

does this library need litellm to run?

No, litellm is optional. The library runs without it, using the default deterministic planner and the deterministic code-generation fallback. litellm is only needed for provider-backed LLM code generation, i.e. when generate_code is set together with an LLM provider/model.

@isayahc isayahc merged commit 3d25da0 into main Mar 11, 2026
1 check passed


Development

Successfully merging this pull request may close these issues.

Allow for different llm providers
