diff --git a/README.md b/README.md
index 4541b79..50502fd 100644
--- a/README.md
+++ b/README.md
@@ -39,7 +39,7 @@ Whether it’s a compilation error, test failure, or deployment hiccup, this plu
* **One-click error analysis** on any console output
* **Pipeline-ready** with a simple `explainError()` step
-* **AI-powered explanations** via OpenAI GPT models, Google Gemini or local Ollama models
+* **AI-powered explanations** via OpenAI GPT models, Google Gemini, local Ollama models, and more
* **Folder-level configuration** so teams can use project-specific settings
* **Smart provider management** — LangChain4j handles most providers automatically
* **Customizable**: set provider, model, API endpoint (enterprise-ready)[^1], log filters, and more
@@ -74,9 +74,9 @@ Whether it’s a compilation error, test failure, or deployment hiccup, this plu
| Setting | Description | Default |
|---------|-------------|---------|
| **Enable AI Error Explanation** | Toggle plugin functionality | ✅ Enabled |
-| **AI Provider** | Choose between OpenAI, Google Gemini, or Ollama | `OpenAI` |
-| **API Key** | Your AI provider API key | Get from [OpenAI](https://platform.openai.com/settings) or [Google AI Studio](https://aistudio.google.com/app/apikey) |
-| **API URL** | AI service endpoint | **Leave empty** for official APIs (OpenAI, Gemini). **Specify custom URL** for OpenAI-compatible services and air-gapped environments. |
+| **AI Provider** | Choose from OpenAI, Google Gemini, Anthropic Claude, Azure OpenAI, DeepSeek, or Ollama | `OpenAI` |
+| **API Key** | Your AI provider API key | Get from your provider's platform |
+| **API URL** | AI service endpoint | **Leave empty** for official APIs (OpenAI, Gemini, Anthropic, DeepSeek). **Required** for the Azure OpenAI endpoint and Ollama. **Optional** for custom/self-hosted services. |
| **AI Model** | Model to use for analysis | *Required*. Specify the model name offered by your selected AI provider |
| **Custom Context** | Additional instructions or context for the AI (e.g., KB article links, organization-specific troubleshooting steps) | *Optional*. Can be overridden at the job level. |
@@ -148,7 +148,7 @@ This allows you to manage the plugin configuration alongside your other Jenkins
## Supported AI Providers
### OpenAI
-- **Models**: `gpt-4`, `gpt-4-turbo`, `gpt-3.5-turbo`, etc.
+- **Models**: `gpt-5`, `gpt-5-mini`, `gpt-4.1`, `gpt-4-turbo`, `gpt-3.5-turbo`, etc.
- **API Key**: Get from [OpenAI Platform](https://platform.openai.com/settings)
- **Endpoint**: Leave empty for official OpenAI API, or specify custom URL for OpenAI-compatible services
- **Best for**: Comprehensive error analysis with excellent reasoning
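+- **Configuration Example** (the `openai` symbol and field names below mirror the other providers' CasC blocks and are assumed from the plugin's existing configuration):
+```yaml
+unclassified:
+  explainError:
+    aiProvider:
+      openai:
+        apiKey: "${AI_API_KEY}"
+        model: "gpt-4.1"
+    enableExplanation: true
+```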
@@ -156,14 +156,73 @@ This allows you to manage the plugin configuration alongside your other Jenkins
### Google Gemini
- **Models**: `gemini-2.0-flash`, `gemini-2.0-flash-lite`, `gemini-2.5-flash`, etc.
- **API Key**: Get from [Google AI Studio](https://aistudio.google.com/app/apikey)
-- **Endpoint**: Leave empty for official Google AI API, or specify custom URL for Gemini-compatible services
+- **Endpoint**: Leave empty for official Google AI API
- **Best for**: Fast, efficient analysis with competitive quality
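+- **Configuration Example** (the `gemini` symbol is assumed here, following the same CasC pattern as the other providers; verify against your plugin's CasC export):
+```yaml
+unclassified:
+  explainError:
+    aiProvider:
+      gemini:
+        apiKey: "${AI_API_KEY}"
+        model: "gemini-2.0-flash"
+    enableExplanation: true
+```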
+### Anthropic Claude
+- **Models**: `claude-3-7-sonnet`, `claude-3-5-sonnet`, `claude-3-5-haiku`, `claude-3-opus`, etc.
+- **API Key**: Get from [Anthropic Console](https://console.anthropic.com/)
+- **Endpoint**: Leave empty for official Anthropic API
+- **Best for**: Detailed, thorough error analysis with high accuracy
+- **Configuration Example**:
+```yaml
+unclassified:
+  explainError:
+    aiProvider:
+      anthropic:
+        apiKey: "${AI_API_KEY}"
+        model: "claude-3-5-sonnet-20241022"
+    enableExplanation: true
+```
+
+### Azure OpenAI
+- **Models**: Your Azure deployment names (e.g., `gpt-5`, `gpt-4.1`)
+- **API Key**: Get from Azure Portal
+- **Endpoint**: **Required** - Your Azure OpenAI endpoint (e.g., `https://your-resource.openai.azure.com`)
+- **Best for**: Enterprise environments requiring Azure integration and compliance
+- **Configuration Example**:
+```yaml
+unclassified:
+  explainError:
+    aiProvider:
+      azureOpenai:
+        apiKey: "${AZURE_API_KEY}"
+        url: "https://your-resource.openai.azure.com"
+        model: "your-deployment-name"
+    enableExplanation: true
+```
+
+### DeepSeek
+- **Models**: `deepseek-chat`, `deepseek-coder`, `deepseek-reasoner`
+- **API Key**: Get from [DeepSeek Platform](https://platform.deepseek.com/)
+- **Endpoint**: Leave empty for the official DeepSeek API (defaults to `https://api.deepseek.com`)
+- **Best for**: Cost-effective analysis, competitive performance at lower costs
+- **Configuration Example**:
+```yaml
+unclassified:
+  explainError:
+    aiProvider:
+      deepseek:
+        apiKey: "${AI_API_KEY}"
+        model: "deepseek-chat"
+    enableExplanation: true
+```
+
### Ollama (Local/Private LLM)
- **Models**: `gemma3:1b`, `gpt-oss`, `deepseek-r1`, and any model available in your Ollama instance
- **API Key**: Not required by default (unless your Ollama server is secured)
- **Endpoint**: `http://localhost:11434` (or your Ollama server URL)
- **Best for**: Private, local, or open-source LLMs; no external API usage or cost
+- **Configuration Example**:
+```yaml
+unclassified:
+  explainError:
+    aiProvider:
+      ollama:
+        model: "gemma3:1b"
+        url: "http://localhost:11434"
+    enableExplanation: true
+```
## Usage
diff --git a/pom.xml b/pom.xml
index 3cef0cd..ccd0030 100644
--- a/pom.xml
+++ b/pom.xml
@@ -70,6 +70,14 @@
         <type>pom</type>
         <scope>import</scope>
+
+      <dependency>
+        <groupId>io.netty</groupId>
+        <artifactId>netty-bom</artifactId>
+        <version>4.1.128.Final</version>
+        <type>pom</type>
+        <scope>import</scope>
+      </dependency>
@@ -208,5 +216,39 @@
+    <dependency>
+      <groupId>dev.langchain4j</groupId>
+      <artifactId>langchain4j-anthropic</artifactId>
+      <version>${langchain4j.version}</version>
+      <optional>true</optional>
+      <exclusions>
+        <exclusion>
+          <groupId>org.slf4j</groupId>
+          <artifactId>*</artifactId>
+        </exclusion>
+        <exclusion>
+          <groupId>com.fasterxml.jackson.core</groupId>
+          <artifactId>*</artifactId>
+        </exclusion>
+      </exclusions>
+    </dependency>
+
+    <dependency>
+      <groupId>dev.langchain4j</groupId>
+      <artifactId>langchain4j-azure-open-ai</artifactId>
+      <version>${langchain4j.version}</version>
+      <optional>true</optional>
+      <exclusions>
+        <exclusion>
+          <groupId>org.slf4j</groupId>
+          <artifactId>*</artifactId>
+        </exclusion>
+        <exclusion>
+          <groupId>com.fasterxml.jackson.core</groupId>
+          <artifactId>*</artifactId>
+        </exclusion>
+      </exclusions>
+    </dependency>
+
diff --git a/src/main/java/io/jenkins/plugins/explain_error/provider/AnthropicProvider.java b/src/main/java/io/jenkins/plugins/explain_error/provider/AnthropicProvider.java
new file mode 100644
index 0000000..d7a277b
--- /dev/null
+++ b/src/main/java/io/jenkins/plugins/explain_error/provider/AnthropicProvider.java
@@ -0,0 +1,125 @@
+package io.jenkins.plugins.explain_error.provider;
+
+import dev.langchain4j.model.anthropic.AnthropicChatModel;
+import dev.langchain4j.model.chat.ChatModel;
+import dev.langchain4j.service.AiServices;
+import edu.umd.cs.findbugs.annotations.CheckForNull;
+import edu.umd.cs.findbugs.annotations.NonNull;
+import hudson.Extension;
+import hudson.Util;
+import hudson.model.AutoCompletionCandidates;
+import hudson.model.TaskListener;
+import hudson.util.FormValidation;
+import hudson.util.Secret;
+import io.jenkins.plugins.explain_error.ExplanationException;
+import java.util.logging.Level;
+import java.util.logging.Logger;
+import jenkins.model.Jenkins;
+import org.jenkinsci.Symbol;
+import org.kohsuke.stapler.DataBoundConstructor;
+import org.kohsuke.stapler.QueryParameter;
+import org.kohsuke.stapler.verb.GET;
+import org.kohsuke.stapler.verb.POST;
+
+public class AnthropicProvider extends BaseAIProvider {
+
+    private static final Logger LOGGER = Logger.getLogger(AnthropicProvider.class.getName());
+
+    protected Secret apiKey;
+
+    @DataBoundConstructor
+    public AnthropicProvider(String url, String model, Secret apiKey) {
+        super(url, model);
+        this.apiKey = apiKey;
+    }
+
+    public Secret getApiKey() {
+        return apiKey;
+    }
+
+    @Override
+    public Assistant createAssistant() {
+        ChatModel model = AnthropicChatModel.builder()
+                .baseUrl(Util.fixEmptyAndTrim(getUrl())) // falls back to the official endpoint when null
+                .apiKey(getApiKey().getPlainText())
+                .modelName(getModel())
+                .temperature(0.3)
+                .logRequests(LOGGER.isLoggable(Level.FINE))
+                .logResponses(LOGGER.isLoggable(Level.FINE))
+                .build();
+
+        return AiServices.create(Assistant.class, model);
+    }
+
+    @Override
+    public boolean isNotValid(@CheckForNull TaskListener listener) {
+        if (listener != null) {
+            if (Util.fixEmptyAndTrim(Secret.toString(getApiKey())) == null) {
+                listener.getLogger().println("No API key configured for Anthropic.");
+            }
+            if (Util.fixEmptyAndTrim(getModel()) == null) {
+                listener.getLogger().println("No model configured for Anthropic.");
+            }
+        }
+        return Util.fixEmptyAndTrim(Secret.toString(getApiKey())) == null
+                || Util.fixEmptyAndTrim(getModel()) == null;
+    }
+
+    @Extension
+    @Symbol("anthropic")
+    public static class DescriptorImpl extends BaseProviderDescriptor {
+
+        private static final String[] MODELS = new String[]{
+            "claude-3-7-sonnet-20250219",
+            "claude-3-5-sonnet-20241022",
+            "claude-3-5-haiku-20241022",
+            "claude-3-opus-20240229",
+            "claude-3-sonnet-20240229",
+            "claude-3-haiku-20240307"
+        };
+
+        @NonNull
+        @Override
+        public String getDisplayName() {
+            return "Anthropic (Claude)";
+        }
+
+        @Override
+        public String getDefaultModel() {
+            return "claude-3-5-sonnet-20241022";
+        }
+
+        @GET
+        @SuppressWarnings("lgtm[jenkins/no-permission-check]")
+        public AutoCompletionCandidates doAutoCompleteModel(@QueryParameter String value) {
+            AutoCompletionCandidates c = new AutoCompletionCandidates();
+            for (String model : MODELS) {
+                if (model.toLowerCase().startsWith(value.toLowerCase())) {
+                    c.add(model);
+                }
+            }
+            return c;
+        }
+
+        /**
+         * Method to test the AI API configuration.
+         * This is called when the "Test Configuration" button is clicked.
+         */
+        @POST
+        public FormValidation doTestConfiguration(@QueryParameter("apiKey") Secret apiKey,
+                                                  @QueryParameter("url") String url,
+                                                  @QueryParameter("model") String model) throws ExplanationException {
+            Jenkins.get().checkPermission(Jenkins.ADMINISTER);
+
+            AnthropicProvider provider = new AnthropicProvider(url, model, apiKey);
+            try {
+                provider.explainError("Send 'Configuration test successful' to me.", null);
+                return FormValidation.ok("Configuration test successful! API connection is working properly.");
+            } catch (ExplanationException e) {
+                return FormValidation.error("Configuration test failed: " + e.getMessage(), e);
+            }
+        }
+
+    }
+
+}
diff --git a/src/main/java/io/jenkins/plugins/explain_error/provider/AzureOpenAIProvider.java b/src/main/java/io/jenkins/plugins/explain_error/provider/AzureOpenAIProvider.java
new file mode 100644
index 0000000..aa6541b
--- /dev/null
+++ b/src/main/java/io/jenkins/plugins/explain_error/provider/AzureOpenAIProvider.java
@@ -0,0 +1,132 @@
+package io.jenkins.plugins.explain_error.provider;
+
+import dev.langchain4j.model.azure.AzureOpenAiChatModel;
+import dev.langchain4j.model.chat.ChatModel;
+import dev.langchain4j.model.chat.request.ResponseFormat;
+import dev.langchain4j.service.AiServices;
+import edu.umd.cs.findbugs.annotations.CheckForNull;
+import edu.umd.cs.findbugs.annotations.NonNull;
+import hudson.Extension;
+import hudson.Util;
+import hudson.model.AutoCompletionCandidates;
+import hudson.model.TaskListener;
+import hudson.util.FormValidation;
+import hudson.util.Secret;
+import io.jenkins.plugins.explain_error.ExplanationException;
+import java.util.logging.Level;
+import java.util.logging.Logger;
+import jenkins.model.Jenkins;
+import org.jenkinsci.Symbol;
+import org.kohsuke.stapler.DataBoundConstructor;
+import org.kohsuke.stapler.QueryParameter;
+import org.kohsuke.stapler.verb.GET;
+import org.kohsuke.stapler.verb.POST;
+
+public class AzureOpenAIProvider extends BaseAIProvider {
+
+    private static final Logger LOGGER = Logger.getLogger(AzureOpenAIProvider.class.getName());
+
+    protected Secret apiKey;
+
+    @DataBoundConstructor
+    public AzureOpenAIProvider(String url, String model, Secret apiKey) {
+        super(url, model);
+        this.apiKey = apiKey;
+    }
+
+    public Secret getApiKey() {
+        return apiKey;
+    }
+
+    @Override
+    public Assistant createAssistant() {
+        // For Azure, the URL is the endpoint and the model is the deployment name
+        ChatModel model = AzureOpenAiChatModel.builder()
+                .endpoint(Util.fixEmptyAndTrim(getUrl())) // the Azure endpoint is required
+                .apiKey(getApiKey().getPlainText())
+                .deploymentName(getModel()) // in Azure, this is the deployment name
+                .temperature(0.3)
+                .responseFormat(ResponseFormat.JSON)
+                .strictJsonSchema(true)
+                .logRequestsAndResponses(LOGGER.isLoggable(Level.FINE))
+                .build();
+
+        return AiServices.create(Assistant.class, model);
+    }
+
+    @Override
+    public boolean isNotValid(@CheckForNull TaskListener listener) {
+        if (listener != null) {
+            if (Util.fixEmptyAndTrim(Secret.toString(getApiKey())) == null) {
+                listener.getLogger().println("No API key configured for Azure OpenAI.");
+            } else if (Util.fixEmptyAndTrim(getUrl()) == null) {
+                listener.getLogger().println("No endpoint configured for Azure OpenAI.");
+            } else if (Util.fixEmptyAndTrim(getModel()) == null) {
+                listener.getLogger().println("No deployment name configured for Azure OpenAI.");
+            }
+        }
+        return Util.fixEmptyAndTrim(Secret.toString(getApiKey())) == null
+                || Util.fixEmptyAndTrim(getUrl()) == null
+                || Util.fixEmptyAndTrim(getModel()) == null;
+    }
+
+    @Extension
+    @Symbol("azureOpenai")
+    public static class DescriptorImpl extends BaseProviderDescriptor {
+
+        // Common deployment name examples users might create in Azure OpenAI
+        // Note: these are example deployment names, not model names
+        private static final String[] COMMON_DEPLOYMENT_NAMES = new String[]{
+            "gpt-5",
+            "gpt-5-mini",
+            "gpt-4.1",
+            "gpt-4.1-mini",
+            "gpt-4-turbo",
+            "gpt-35-turbo"
+        };
+
+        @NonNull
+        @Override
+        public String getDisplayName() {
+            return "Azure OpenAI";
+        }
+
+        @Override
+        public String getDefaultModel() {
+            return "gpt-4.1";
+        }
+
+        @GET
+        @SuppressWarnings("lgtm[jenkins/no-permission-check]")
+        public AutoCompletionCandidates doAutoCompleteModel(@QueryParameter String value) {
+            AutoCompletionCandidates c = new AutoCompletionCandidates();
+            for (String model : COMMON_DEPLOYMENT_NAMES) {
+                if (model.toLowerCase().startsWith(value.toLowerCase())) {
+                    c.add(model);
+                }
+            }
+            return c;
+        }
+
+        /**
+         * Method to test the AI API configuration.
+         * This is called when the "Test Configuration" button is clicked.
+         */
+        @POST
+        public FormValidation doTestConfiguration(@QueryParameter("apiKey") Secret apiKey,
+                                                  @QueryParameter("url") String url,
+                                                  @QueryParameter("model") String model) throws ExplanationException {
+            Jenkins.get().checkPermission(Jenkins.ADMINISTER);
+
+            AzureOpenAIProvider provider = new AzureOpenAIProvider(url, model, apiKey);
+            try {
+                provider.explainError("Send 'Configuration test successful' to me.", null);
+                return FormValidation.ok("Configuration test successful! API connection is working properly.");
+            } catch (ExplanationException e) {
+                return FormValidation.error("Configuration test failed: " + e.getMessage(), e);
+            }
+        }
+
+    }
+
+}
diff --git a/src/main/java/io/jenkins/plugins/explain_error/provider/DeepSeekProvider.java b/src/main/java/io/jenkins/plugins/explain_error/provider/DeepSeekProvider.java
new file mode 100644
index 0000000..fb3a348
--- /dev/null
+++ b/src/main/java/io/jenkins/plugins/explain_error/provider/DeepSeekProvider.java
@@ -0,0 +1,131 @@
+package io.jenkins.plugins.explain_error.provider;
+
+import dev.langchain4j.model.chat.ChatModel;
+import dev.langchain4j.model.chat.request.ResponseFormat;
+import dev.langchain4j.model.openai.OpenAiChatModel;
+import dev.langchain4j.service.AiServices;
+import edu.umd.cs.findbugs.annotations.CheckForNull;
+import edu.umd.cs.findbugs.annotations.NonNull;
+import hudson.Extension;
+import hudson.Util;
+import hudson.model.AutoCompletionCandidates;
+import hudson.model.TaskListener;
+import hudson.util.FormValidation;
+import hudson.util.Secret;
+import io.jenkins.plugins.explain_error.ExplanationException;
+import java.util.logging.Level;
+import java.util.logging.Logger;
+import jenkins.model.Jenkins;
+import org.jenkinsci.Symbol;
+import org.kohsuke.stapler.DataBoundConstructor;
+import org.kohsuke.stapler.QueryParameter;
+import org.kohsuke.stapler.verb.GET;
+import org.kohsuke.stapler.verb.POST;
+
+public class DeepSeekProvider extends BaseAIProvider {
+
+    private static final Logger LOGGER = Logger.getLogger(DeepSeekProvider.class.getName());
+    private static final String DEFAULT_URL = "https://api.deepseek.com";
+
+    protected Secret apiKey;
+
+    @DataBoundConstructor
+    public DeepSeekProvider(String url, String model, Secret apiKey) {
+        super(url, model);
+        this.apiKey = apiKey;
+    }
+
+    public Secret getApiKey() {
+        return apiKey;
+    }
+
+    @Override
+    public Assistant createAssistant() {
+        // DeepSeek exposes an OpenAI-compatible API
+        String baseUrl = Util.fixEmptyAndTrim(getUrl());
+        if (baseUrl == null) {
+            baseUrl = DEFAULT_URL;
+        }
+
+        ChatModel model = OpenAiChatModel.builder()
+                .baseUrl(baseUrl)
+                .apiKey(getApiKey().getPlainText())
+                .modelName(getModel())
+                .temperature(0.3)
+                .responseFormat(ResponseFormat.JSON)
+                .strictJsonSchema(true)
+                .logRequests(LOGGER.isLoggable(Level.FINE))
+                .logResponses(LOGGER.isLoggable(Level.FINE))
+                .build();
+
+        return AiServices.create(Assistant.class, model);
+    }
+
+    @Override
+    public boolean isNotValid(@CheckForNull TaskListener listener) {
+        if (listener != null) {
+            if (Util.fixEmptyAndTrim(Secret.toString(getApiKey())) == null) {
+                listener.getLogger().println("No API key configured for DeepSeek.");
+            } else if (Util.fixEmptyAndTrim(getModel()) == null) {
+                listener.getLogger().println("No model configured for DeepSeek.");
+            }
+        }
+        return Util.fixEmptyAndTrim(Secret.toString(getApiKey())) == null
+                || Util.fixEmptyAndTrim(getModel()) == null;
+    }
+
+    @Extension
+    @Symbol("deepseek")
+    public static class DescriptorImpl extends BaseProviderDescriptor {
+
+        private static final String[] MODELS = new String[]{
+            "deepseek-chat",
+            "deepseek-coder",
+            "deepseek-reasoner"
+        };
+
+        @NonNull
+        @Override
+        public String getDisplayName() {
+            return "DeepSeek";
+        }
+
+        @Override
+        public String getDefaultModel() {
+            return "deepseek-chat";
+        }
+
+        @GET
+        @SuppressWarnings("lgtm[jenkins/no-permission-check]")
+        public AutoCompletionCandidates doAutoCompleteModel(@QueryParameter String value) {
+            AutoCompletionCandidates c = new AutoCompletionCandidates();
+            for (String model : MODELS) {
+                if (model.toLowerCase().startsWith(value.toLowerCase())) {
+                    c.add(model);
+                }
+            }
+            return c;
+        }
+
+        /**
+         * Method to test the AI API configuration.
+         * This is called when the "Test Configuration" button is clicked.
+         */
+        @POST
+        public FormValidation doTestConfiguration(@QueryParameter("apiKey") Secret apiKey,
+                                                  @QueryParameter("url") String url,
+                                                  @QueryParameter("model") String model) throws ExplanationException {
+            Jenkins.get().checkPermission(Jenkins.ADMINISTER);
+
+            DeepSeekProvider provider = new DeepSeekProvider(url, model, apiKey);
+            try {
+                provider.explainError("Send 'Configuration test successful' to me.", null);
+                return FormValidation.ok("Configuration test successful! API connection is working properly.");
+            } catch (ExplanationException e) {
+                return FormValidation.error("Configuration test failed: " + e.getMessage(), e);
+            }
+        }
+
+    }
+
+}
diff --git a/src/main/resources/io/jenkins/plugins/explain_error/provider/AnthropicProvider/config.jelly b/src/main/resources/io/jenkins/plugins/explain_error/provider/AnthropicProvider/config.jelly
new file mode 100644
index 0000000..1f74758
--- /dev/null
+++ b/src/main/resources/io/jenkins/plugins/explain_error/provider/AnthropicProvider/config.jelly
@@ -0,0 +1,17 @@
+<?jelly escape-by-default='true'?>
+<j:jelly xmlns:j="jelly:core" xmlns:f="/lib/form">
+
+    <f:entry title="${%API Key}" field="apiKey">
+        <f:password/>
+    </f:entry>
+
+    <f:entry title="${%API URL}" field="url">
+        <f:textbox/>
+    </f:entry>
+
+    <f:entry title="${%AI Model}" field="model">
+        <f:textbox/>
+    </f:entry>
+
+    <f:validateButton title="${%Test Configuration}" progress="${%Testing...}" method="testConfiguration" with="apiKey,url,model"/>
+</j:jelly>
diff --git a/src/main/resources/io/jenkins/plugins/explain_error/provider/AzureOpenAIProvider/config.jelly b/src/main/resources/io/jenkins/plugins/explain_error/provider/AzureOpenAIProvider/config.jelly
new file mode 100644
index 0000000..7a60c65
--- /dev/null
+++ b/src/main/resources/io/jenkins/plugins/explain_error/provider/AzureOpenAIProvider/config.jelly
@@ -0,0 +1,17 @@
+<?jelly escape-by-default='true'?>
+<j:jelly xmlns:j="jelly:core" xmlns:f="/lib/form">
+
+    <f:entry title="${%API Key}" field="apiKey">
+        <f:password/>
+    </f:entry>
+
+    <f:entry title="${%Endpoint}" field="url">
+        <f:textbox/>
+    </f:entry>
+
+    <f:entry title="${%Deployment Name}" field="model">
+        <f:textbox/>
+    </f:entry>
+
+    <f:validateButton title="${%Test Configuration}" progress="${%Testing...}" method="testConfiguration" with="apiKey,url,model"/>
+</j:jelly>
diff --git a/src/main/resources/io/jenkins/plugins/explain_error/provider/DeepSeekProvider/config.jelly b/src/main/resources/io/jenkins/plugins/explain_error/provider/DeepSeekProvider/config.jelly
new file mode 100644
index 0000000..504765c
--- /dev/null
+++ b/src/main/resources/io/jenkins/plugins/explain_error/provider/DeepSeekProvider/config.jelly
@@ -0,0 +1,17 @@
+<?jelly escape-by-default='true'?>
+<j:jelly xmlns:j="jelly:core" xmlns:f="/lib/form">
+
+    <f:entry title="${%API Key}" field="apiKey">
+        <f:password/>
+    </f:entry>
+
+    <f:entry title="${%API URL}" field="url">
+        <f:textbox/>
+    </f:entry>
+
+    <f:entry title="${%AI Model}" field="model">
+        <f:textbox/>
+    </f:entry>
+
+    <f:validateButton title="${%Test Configuration}" progress="${%Testing...}" method="testConfiguration" with="apiKey,url,model"/>
+</j:jelly>
diff --git a/src/test/java/io/jenkins/plugins/explain_error/CasCTest.java b/src/test/java/io/jenkins/plugins/explain_error/CasCTest.java
index 1198fe1..bf9b8e5 100644
--- a/src/test/java/io/jenkins/plugins/explain_error/CasCTest.java
+++ b/src/test/java/io/jenkins/plugins/explain_error/CasCTest.java
@@ -2,11 +2,15 @@
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertInstanceOf;
+import static org.junit.jupiter.api.Assertions.assertNotNull;
import io.jenkins.plugins.casc.misc.ConfiguredWithCode;
import io.jenkins.plugins.casc.misc.JenkinsConfiguredWithCodeRule;
import io.jenkins.plugins.casc.misc.junit.jupiter.WithJenkinsConfiguredWithCode;
+import io.jenkins.plugins.explain_error.provider.AnthropicProvider;
+import io.jenkins.plugins.explain_error.provider.AzureOpenAIProvider;
import io.jenkins.plugins.explain_error.provider.BaseAIProvider;
+import io.jenkins.plugins.explain_error.provider.DeepSeekProvider;
import io.jenkins.plugins.explain_error.provider.OllamaProvider;
import org.junit.jupiter.api.Test;
@@ -32,4 +36,44 @@ void loadNewConfig(JenkinsConfiguredWithCodeRule jcwcRule) {
         assertEquals("gemma3:1b", provider.getModel());
         assertEquals("http://localhost:11434", provider.getUrl());
     }
+
+    @Test
+    @ConfiguredWithCode("casc_anthropic.yaml")
+    void loadAnthropicConfig(JenkinsConfiguredWithCodeRule jcwcRule) {
+        GlobalConfigurationImpl config = GlobalConfigurationImpl.get();
+        BaseAIProvider provider = config.getAiProvider();
+        assertInstanceOf(AnthropicProvider.class, provider);
+        assertEquals("claude-3-5-sonnet-20241022", provider.getModel());
+
+        AnthropicProvider anthropicProvider = (AnthropicProvider) provider;
+        assertNotNull(anthropicProvider.getApiKey());
+        assertEquals("test-anthropic-key", anthropicProvider.getApiKey().getPlainText());
+    }
+
+    @Test
+    @ConfiguredWithCode("casc_azure.yaml")
+    void loadAzureConfig(JenkinsConfiguredWithCodeRule jcwcRule) {
+        GlobalConfigurationImpl config = GlobalConfigurationImpl.get();
+        BaseAIProvider provider = config.getAiProvider();
+        assertInstanceOf(AzureOpenAIProvider.class, provider);
+        assertEquals("gpt-4.1-deployment", provider.getModel());
+        assertEquals("https://test-resource.openai.azure.com", provider.getUrl());
+
+        AzureOpenAIProvider azureProvider = (AzureOpenAIProvider) provider;
+        assertNotNull(azureProvider.getApiKey());
+        assertEquals("test-azure-key", azureProvider.getApiKey().getPlainText());
+    }
+
+    @Test
+    @ConfiguredWithCode("casc_deepseek.yaml")
+    void loadDeepSeekConfig(JenkinsConfiguredWithCodeRule jcwcRule) {
+        GlobalConfigurationImpl config = GlobalConfigurationImpl.get();
+        BaseAIProvider provider = config.getAiProvider();
+        assertInstanceOf(DeepSeekProvider.class, provider);
+        assertEquals("deepseek-chat", provider.getModel());
+
+        DeepSeekProvider deepseekProvider = (DeepSeekProvider) provider;
+        assertNotNull(deepseekProvider.getApiKey());
+        assertEquals("test-deepseek-key", deepseekProvider.getApiKey().getPlainText());
+    }
 }
diff --git a/src/test/java/io/jenkins/plugins/explain_error/provider/ProviderTest.java b/src/test/java/io/jenkins/plugins/explain_error/provider/ProviderTest.java
index 03195e8..b22d75e 100644
--- a/src/test/java/io/jenkins/plugins/explain_error/provider/ProviderTest.java
+++ b/src/test/java/io/jenkins/plugins/explain_error/provider/ProviderTest.java
@@ -155,4 +155,157 @@ void testOllamaNullUrl() {
         assertEquals("The provider is not properly configured.", result.getMessage());
     }
+
+    // ============= Anthropic Provider Tests =============
+
+    @Test
+    void testAnthropicWithNullApiKey() {
+        BaseAIProvider provider = new AnthropicProvider(null, "claude-3-5-sonnet-20241022", null);
+        ExplanationException result = assertThrows(ExplanationException.class, () -> provider.explainError("Test error", null));
+
+        assertEquals("The provider is not properly configured.", result.getMessage());
+    }
+
+    @Test
+    void testAnthropicWithEmptyApiKey() {
+        BaseAIProvider provider = new AnthropicProvider(null, "claude-3-5-sonnet-20241022", Secret.fromString(""));
+        ExplanationException result = assertThrows(ExplanationException.class, () -> provider.explainError("Test error", null));
+
+        assertEquals("The provider is not properly configured.", result.getMessage());
+    }
+
+    @Test
+    void testAnthropicWithNullModel() {
+        BaseAIProvider provider = new AnthropicProvider(null, null, Secret.fromString("test-key"));
+        ExplanationException result = assertThrows(ExplanationException.class, () -> provider.explainError("Test error", null));
+
+        assertEquals("The provider is not properly configured.", result.getMessage());
+    }
+
+    @Test
+    void testAnthropicWithEmptyModel() {
+        BaseAIProvider provider = new AnthropicProvider(null, "", Secret.fromString("test-key"));
+        ExplanationException result = assertThrows(ExplanationException.class, () -> provider.explainError("Test error", null));
+
+        assertEquals("The provider is not properly configured.", result.getMessage());
+    }
+
+    @Test
+    void testAnthropicProviderCreation() {
+        BaseAIProvider provider = new AnthropicProvider(null, "claude-3-5-sonnet-20241022", Secret.fromString("test-key"));
+        assertEquals("claude-3-5-sonnet-20241022", provider.getModel());
+        assertEquals(null, provider.getUrl());
+    }
+
+    @Test
+    void testAnthropicProviderWithCustomUrl() {
+        BaseAIProvider provider = new AnthropicProvider("https://custom-anthropic.example.com", "claude-3-5-sonnet-20241022", Secret.fromString("test-key"));
+        assertEquals("claude-3-5-sonnet-20241022", provider.getModel());
+        assertEquals("https://custom-anthropic.example.com", provider.getUrl());
+    }
+
+    // ============= Azure OpenAI Provider Tests =============
+
+    @Test
+    void testAzureOpenAIWithNullApiKey() {
+        BaseAIProvider provider = new AzureOpenAIProvider("https://test.openai.azure.com", "gpt-4.1", null);
+        ExplanationException result = assertThrows(ExplanationException.class, () -> provider.explainError("Test error", null));
+
+        assertEquals("The provider is not properly configured.", result.getMessage());
+    }
+
+    @Test
+    void testAzureOpenAIWithEmptyApiKey() {
+        BaseAIProvider provider = new AzureOpenAIProvider("https://test.openai.azure.com", "gpt-4.1", Secret.fromString(""));
+        ExplanationException result = assertThrows(ExplanationException.class, () -> provider.explainError("Test error", null));
+
+        assertEquals("The provider is not properly configured.", result.getMessage());
+    }
+
+    @Test
+    void testAzureOpenAIWithNullEndpoint() {
+        BaseAIProvider provider = new AzureOpenAIProvider(null, "gpt-4.1", Secret.fromString("test-key"));
+        ExplanationException result = assertThrows(ExplanationException.class, () -> provider.explainError("Test error", null));
+
+        assertEquals("The provider is not properly configured.", result.getMessage());
+    }
+
+    @Test
+    void testAzureOpenAIWithEmptyEndpoint() {
+        BaseAIProvider provider = new AzureOpenAIProvider("", "gpt-4.1", Secret.fromString("test-key"));
+        ExplanationException result = assertThrows(ExplanationException.class, () -> provider.explainError("Test error", null));
+
+        assertEquals("The provider is not properly configured.", result.getMessage());
+    }
+
+    @Test
+    void testAzureOpenAIWithNullDeployment() {
+        BaseAIProvider provider = new AzureOpenAIProvider("https://test.openai.azure.com", null, Secret.fromString("test-key"));
+        ExplanationException result = assertThrows(ExplanationException.class, () -> provider.explainError("Test error", null));
+
+        assertEquals("The provider is not properly configured.", result.getMessage());
+    }
+
+    @Test
+    void testAzureOpenAIWithEmptyDeployment() {
+        BaseAIProvider provider = new AzureOpenAIProvider("https://test.openai.azure.com", "", Secret.fromString("test-key"));
+        ExplanationException result = assertThrows(ExplanationException.class, () -> provider.explainError("Test error", null));
+
+        assertEquals("The provider is not properly configured.", result.getMessage());
+    }
+
+    @Test
+    void testAzureOpenAIProviderCreation() {
+        BaseAIProvider provider = new AzureOpenAIProvider("https://test.openai.azure.com", "my-deployment", Secret.fromString("test-key"));
+        assertEquals("my-deployment", provider.getModel());
+        assertEquals("https://test.openai.azure.com", provider.getUrl());
+    }
+
+    // ============= DeepSeek Provider Tests =============
+
+    @Test
+    void testDeepSeekWithNullApiKey() {
+        BaseAIProvider provider = new DeepSeekProvider(null, "deepseek-chat", null);
+        ExplanationException result = assertThrows(ExplanationException.class, () -> provider.explainError("Test error", null));
+
+        assertEquals("The provider is not properly configured.", result.getMessage());
+    }
+
+    @Test
+    void testDeepSeekWithEmptyApiKey() {
+        BaseAIProvider provider = new DeepSeekProvider(null, "deepseek-chat", Secret.fromString(""));
+        ExplanationException result = assertThrows(ExplanationException.class, () -> provider.explainError("Test error", null));
+
+        assertEquals("The provider is not properly configured.", result.getMessage());
+    }
+
+    @Test
+    void testDeepSeekWithNullModel() {
+        BaseAIProvider provider = new DeepSeekProvider(null, null, Secret.fromString("test-key"));
+        ExplanationException result = assertThrows(ExplanationException.class, () -> provider.explainError("Test error", null));
+
+        assertEquals("The provider is not properly configured.", result.getMessage());
+    }
+
+    @Test
+    void testDeepSeekWithEmptyModel() {
+        BaseAIProvider provider = new DeepSeekProvider(null, "", Secret.fromString("test-key"));
+        ExplanationException result = assertThrows(ExplanationException.class, () -> provider.explainError("Test error", null));
+
+        assertEquals("The provider is not properly configured.", result.getMessage());
+    }
+
+    @Test
+    void testDeepSeekProviderCreation() {
+        BaseAIProvider provider = new DeepSeekProvider(null, "deepseek-chat", Secret.fromString("test-key"));
+        assertEquals("deepseek-chat", provider.getModel());
+        assertEquals(null, provider.getUrl());
+    }
+
+    @Test
+    void testDeepSeekProviderWithCustomUrl() {
+        BaseAIProvider provider = new DeepSeekProvider("https://custom-deepseek.example.com", "deepseek-coder", Secret.fromString("test-key"));
+        assertEquals("deepseek-coder", provider.getModel());
+        assertEquals("https://custom-deepseek.example.com", provider.getUrl());
+    }
 }
diff --git a/src/test/resources/io/jenkins/plugins/explain_error/casc_anthropic.yaml b/src/test/resources/io/jenkins/plugins/explain_error/casc_anthropic.yaml
new file mode 100644
index 0000000..d8ccb4a
--- /dev/null
+++ b/src/test/resources/io/jenkins/plugins/explain_error/casc_anthropic.yaml
@@ -0,0 +1,7 @@
+unclassified:
+  explainError:
+    aiProvider:
+      anthropic:
+        apiKey: "test-anthropic-key"
+        model: "claude-3-5-sonnet-20241022"
+    enableExplanation: true
diff --git a/src/test/resources/io/jenkins/plugins/explain_error/casc_azure.yaml b/src/test/resources/io/jenkins/plugins/explain_error/casc_azure.yaml
new file mode 100644
index 0000000..3612c34
--- /dev/null
+++ b/src/test/resources/io/jenkins/plugins/explain_error/casc_azure.yaml
@@ -0,0 +1,8 @@
+unclassified:
+  explainError:
+    aiProvider:
+      azureOpenai:
+        apiKey: "test-azure-key"
+        url: "https://test-resource.openai.azure.com"
+        model: "gpt-4.1-deployment"
+    enableExplanation: true
diff --git a/src/test/resources/io/jenkins/plugins/explain_error/casc_deepseek.yaml b/src/test/resources/io/jenkins/plugins/explain_error/casc_deepseek.yaml
new file mode 100644
index 0000000..9cff5f0
--- /dev/null
+++ b/src/test/resources/io/jenkins/plugins/explain_error/casc_deepseek.yaml
@@ -0,0 +1,7 @@
+unclassified:
+  explainError:
+    aiProvider:
+      deepseek:
+        apiKey: "test-deepseek-key"
+        model: "deepseek-chat"
+    enableExplanation: true