AI "BYOLLM" core integration for CQL #5

@preston

Description

In alignment with the broader product vision, the concept is:

  • Support optional generative, IDE-integrated AI assistance similar to commercial products.
  • By default, support an open-source model runner and model(s) that can be deployed either on the user's local machine or at an endpoint of their choice. Do not require any external SaaS or cloud service that could violate local information egress policy or create infosec problems. We have currently chosen Ollama (see the first sketch after this list).
  • Provide MCP support both by exposing "tools" directly from within CQL Studio and by allowing optional external MCP services (see the second sketch below). Rely on the protocol itself and do not assume a user running locally has Docker Desktop or any container runtime for MCP server purposes.
  • Support different operating modes, e.g. execution and planning-only.
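
A minimal sketch of the default local path, assuming Ollama's standard HTTP chat API on its default port. The `OLLAMA_URL` environment variable, model name, and prompt below are illustrative only, not a committed interface; the key point is that the endpoint is configurable so the model runner can live on the user's machine or any host of their choice:

```ts
// Call a locally running Ollama instance over its HTTP API.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Endpoint is configurable; defaults to Ollama's local port, no external SaaS required.
const OLLAMA_URL = process.env.OLLAMA_URL ?? "http://localhost:11434";

async function chat(model: string, messages: ChatMessage[]): Promise<string> {
  const res = await fetch(`${OLLAMA_URL}/api/chat`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model, messages, stream: false }),
  });
  if (!res.ok) {
    throw new Error(`Ollama request failed: ${res.status} ${res.statusText}`);
  }
  const data = await res.json();
  return data.message.content;
}

// Example: ask a locally hosted model to explain a CQL expression.
chat("llama3", [
  { role: "user", content: "Explain what this CQL expression does: AgeInYears() >= 18" },
]).then(console.log);
```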

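The second sketch shows how a CQL Studio capability could be exposed as an MCP "tool", assuming the official @modelcontextprotocol/sdk TypeScript package over a stdio transport (so no Docker or container runtime is needed). The `validate_cql` tool name and the `validateCql()` helper are hypothetical placeholders, and the SDK names follow its quick-start and may differ across versions:

```ts
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

const server = new McpServer({ name: "cql-studio", version: "0.1.0" });

// Hypothetical tool: syntax-check a CQL library and report any errors.
server.tool(
  "validate_cql",
  { source: z.string().describe("CQL library source text") },
  async ({ source }) => {
    const errors = validateCql(source); // placeholder for CQL Studio's own parser
    return {
      content: [
        {
          type: "text",
          text: errors.length === 0 ? "No errors found." : errors.join("\n"),
        },
      ],
    };
  }
);

// Placeholder implementation so the sketch is self-contained.
function validateCql(source: string): string[] {
  return source.trim().length === 0 ? ["Library is empty."] : [];
}

// Stdio transport keeps the tool usable without any container runtime.
await server.connect(new StdioServerTransport());
```

The same tool surface could also be reached by optional external MCP servers, since clients speak the protocol rather than a CQL Studio-specific API.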