Prerequisites
🎯 Problem Statement
Currently, IssueMatch does not support direct integration with major LLM providers like Azure OpenAI, OpenAI, or Anthropic.
I want to use IssueMatch with different LLM backends depending on cost, performance, and deployment needs. The lack of multi-provider support limits flexibility and makes it harder to adopt the tool in real-world and enterprise workflows.
This impacts:
- Enterprise users who rely on Azure OpenAI
- Developers who want to switch between providers easily
- Teams that need vendor-agnostic AI integration
💡 Proposed Solution
Add a pluggable LLM provider integration layer that supports:
- Azure OpenAI
- OpenAI
- Anthropic (Claude)
- Other popular providers, as needed
Expected behavior:
- Users can select the provider via config or environment variables
- Unified interface for prompts and responses
- Easy provider switching without code changes
Example (config-based):

```
# Select one provider
LLM_PROVIDER=azure_openai   # or: openai | anthropic

# Provider-specific settings
OPENAI_API_KEY=...
AZURE_OPENAI_ENDPOINT=...
AZURE_OPENAI_DEPLOYMENT=...
ANTHROPIC_API_KEY=...
```
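To make the idea concrete, here is a minimal sketch of what the pluggable layer could look like. All names (`LLMProvider`, `get_provider`, the provider classes) are hypothetical, not existing IssueMatch code; the `complete` methods are stubs where real API calls would go.

```python
import os
from abc import ABC, abstractmethod


class LLMProvider(ABC):
    """Unified interface for prompts and responses (hypothetical sketch)."""

    @abstractmethod
    def complete(self, prompt: str) -> str:
        """Send a prompt to the backend and return the model's reply."""


class OpenAIProvider(LLMProvider):
    def __init__(self) -> None:
        self.api_key = os.environ.get("OPENAI_API_KEY", "")

    def complete(self, prompt: str) -> str:
        raise NotImplementedError("would call the OpenAI API here")


class AzureOpenAIProvider(LLMProvider):
    def __init__(self) -> None:
        self.endpoint = os.environ.get("AZURE_OPENAI_ENDPOINT", "")
        self.deployment = os.environ.get("AZURE_OPENAI_DEPLOYMENT", "")

    def complete(self, prompt: str) -> str:
        raise NotImplementedError("would call the Azure OpenAI API here")


class AnthropicProvider(LLMProvider):
    def __init__(self) -> None:
        self.api_key = os.environ.get("ANTHROPIC_API_KEY", "")

    def complete(self, prompt: str) -> str:
        raise NotImplementedError("would call the Anthropic API here")


# Registry maps the LLM_PROVIDER config value to an implementation.
PROVIDERS = {
    "openai": OpenAIProvider,
    "azure_openai": AzureOpenAIProvider,
    "anthropic": AnthropicProvider,
}


def get_provider() -> LLMProvider:
    """Pick the backend from LLM_PROVIDER; switching needs no code changes."""
    name = os.environ.get("LLM_PROVIDER", "openai")
    if name not in PROVIDERS:
        raise ValueError(f"unknown LLM_PROVIDER: {name!r}")
    return PROVIDERS[name]()
```

With this shape, adding a new backend means registering one more class in `PROVIDERS`, and callers only ever depend on the `LLMProvider` interface.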
This would make IssueMatch more flexible, scalable, and production-ready.
🔄 Alternatives Considered
No response
📊 Priority Level
High - Significant impact on productivity
🎭 Use Cases
- Enterprise Deployment: A company using Azure OpenAI integrates IssueMatch internally.
- Cost Optimization: Developers switch between OpenAI and Anthropic based on pricing.
- Research & Experimentation: Compare outputs across different LLMs using the same IssueMatch pipeline.
📎 Additional Context
No response
🤝 Contribution